Compare commits
26 Commits
| SHA1 |
|---|
| 71c60ade8a |
| bf3dc635d9 |
| 8ca899aaf5 |
| 15facf48eb |
| 71fc90fcb8 |
| 856701fb6f |
| 6037b62612 |
| 8f88cb0030 |
| c224562423 |
| 5c07aef526 |
| d54d37061f |
| a6afec0e11 |
| 205112021b |
| 853f2737f1 |
| 7c61107f87 |
| 7a22474efd |
| f2cf4e0d58 |
| db4bebfa57 |
| 435b77e1df |
| 6f80e442cf |
| 0fcbf5e3ed |
| 3cf6308b79 |
| 7e5a4da659 |
| d27fcaf342 |
| 5b28a065c0 |
| e74e1eaf70 |
@@ -200,7 +200,7 @@ The Diagnostic UI has six top-level tabs:
- **Main** — live chat test, status (Brain / RVS / Proxy), end-to-end trace

- **Gehirn** (Brain) — memory management (vector DB), token/call metrics (subscription quota), bootstrap & migration, full-brain export/import

- **Skills** — list with logs, run, activate/deactivate, export/import as tar.gz

- - **Trigger** — create/view/delete timers + watchers, live variable display (disk_free, current_lat, hour_of_day, …), near(lat, lon, m) as condition function
+ - **Trigger** — create/view/delete timers + watchers, live variable display (disk_free, current_lat, hour_of_day, …), GPS functions `near() / entered_near() / left_near()` for the different geofencing modes

- **Dateien** (Files) — all files from `/shared/uploads/` with multi-select, bulk download (ZIP) + bulk delete

- **Einstellungen** (Settings) — repair (container restart), wipe, speech output, Whisper, language model, runtime config, app onboarding (QR), full reset
@@ -319,7 +319,14 @@ Reachable at `http://<VM-IP>:3001`. Shares its network with the bridge.
- **Main**: Brain/RVS/proxy status, chat test, "ARIA denkt..." ("ARIA is thinking...") indicator, end-to-end trace, container logs

- **Gehirn** (Brain): memory browser (vector DB), search with two modes (**📝 Wörtlich** = substring match, the default, + **🧠 Semantisch** = semantic with score threshold), **Advanced Search** (collapsible panel, any number of AND/OR-linked fields, + button for more rows), type + pinned filters (also apply during search), collapsible type categories (collapsed by default), add/edit/delete with category autosuggest, **📎 attachments** per memory (images/PDFs/...): upload + thumbnail preview + lightbox + delete button, 📎N badge in the list, automatic cleanup on memory delete. ℹ info modal explaining which types go PERMANENTLY into the prompt vs. into cold memory. **📄 Print view** (Ctrl+P → PDF). Conversation status with distillation trigger, **token/call metrics with subscription-quota tracking**, bootstrap & migration (3 recovery paths), brain export/import (tar.gz)

- **Skills**: list of all skills with logs per run, activate/deactivate, export/import as tar.gz, "von ARIA" ("by ARIA") badge for self-built ones

- - **Trigger**: passive wake-up sources. **Timer** (one-shot, ISO timestamp or via `in_seconds`, computed server-side) + **Watcher** (recurring, with condition + throttle). List of active triggers + logs per fire event. Modal with type dropdown, live display of all available condition variables (`disk_free_gb`, `hour_of_day`, `current_lat/lon`, `last_user_message_ago_sec`, …) and condition functions (`near(lat, lon, m)` for GPS geofencing). Safe condition parser via Python `ast` (whitelist, no `eval`). The system prompt additionally contains an `## Aktuelle Zeit` (current time) block (UTC + Europe/Berlin) so ARIA can set timer times correctly.
+ - **Trigger**: passive wake-up sources. **Timer** (one-shot, ISO timestamp or via `in_seconds`, computed server-side) + **Watcher** (recurring, with condition + throttle). List of active triggers + logs per fire event. Modal with type dropdown, live display of all available condition variables (`disk_free_gb`, `hour_of_day`, `current_lat/lon`, `last_user_message_ago_sec`, …). **Three GPS functions** with distinct semantics:

+   - `near(lat, lon, r)` — fires AS LONG AS the position is inside the radius (throttled against spam). Use case: "am I still near X?"

+   - `entered_near(lat, lon, r)` — fires ONCE on entry (outside→inside transition). Use case: a speed-camera warner with r=2000 → 2 km advance warning, or an arrival reminder with r=100

+   - `left_near(lat, lon, r)` — fires ONCE on exit (inside→outside transition). Use case: "did you forget something at parking lot X?"

+ Safe condition parser via Python `ast` (whitelist, no `eval`). The system prompt additionally contains an `## Aktuelle Zeit` (current time) block (UTC + Europe/Berlin) so ARIA can set timer times correctly.
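The safe condition parser described above can be sketched as a whitelist walk over Python's `ast` — a minimal illustration, not ARIA's actual implementation (the function name `eval_condition` and the exact operator set are assumptions):

```python
import ast
import operator

# Whitelisted arithmetic and comparison operators — anything else is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub, ast.Mult: operator.mul,
    ast.Div: operator.truediv, ast.Lt: operator.lt, ast.LtE: operator.le,
    ast.Gt: operator.gt, ast.GtE: operator.ge, ast.Eq: operator.eq,
    ast.NotEq: operator.ne,
}

def eval_condition(expr: str, variables: dict, functions: dict) -> bool:
    """Evaluate a watcher condition like 'disk_free_gb < 5 and hour_of_day >= 8'
    without eval(): walk the AST and allow only whitelisted node types."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.Name):            # condition-variable lookup
            return variables[node.id]
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Compare):         # supports chained a < b < c
            left, result = walk(node.left), True
            for op, comp in zip(node.ops, node.comparators):
                right = walk(comp)
                result = result and _OPS[type(op)](left, right)
                left = right
            return result
        if isinstance(node, ast.BoolOp):
            vals = [walk(v) for v in node.values]
            return all(vals) if isinstance(node.op, ast.And) else any(vals)
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.Not):
            return not walk(node.operand)
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            fn = functions[node.func.id]          # only whitelisted functions
            return fn(*[walk(a) for a in node.args])
        raise ValueError(f"disallowed syntax: {ast.dump(node)}")
    return bool(walk(ast.parse(expr, mode="eval")))
```

Anything outside the whitelist (attribute access, subscripts, `__import__`, …) raises instead of executing, which is the point of avoiding `eval`.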
+ **Resolution**: the background loop ticks every 8 s (previously 30 s — at 100 km/h a drive-by through a 300 m radius stays inside for only ~22 s and could be missed). Plus event-driven: after every `location_update` from the app the bridge immediately calls `/triggers/check-now` on the brain — watchers see the fresh position within milliseconds instead of at polling cadence. The `near()` functions ignore GPS data older than 5 minutes (prevents phantom fires when tracking is switched off).
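The entered/left semantics reduce to tracking the inside/outside edge per trigger, plus the 5-minute staleness guard. A minimal sketch under those assumptions (the `Geofence` class and field names are illustrative; the brain-side code itself is not part of this diff):

```python
import math
import time

EARTH_RADIUS_M = 6_371_000
MAX_FIX_AGE_SEC = 5 * 60  # ignore GPS fixes older than 5 minutes

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class Geofence:
    """Edge-triggered geofence: 'entered' fires once on the outside→inside
    transition, 'left' once on inside→outside; 'inside' is level-triggered
    (the near() case, throttled elsewhere)."""
    def __init__(self, lat, lon, radius_m):
        self.lat, self.lon, self.radius_m = lat, lon, radius_m
        self.was_inside = False

    def update(self, lat, lon, fix_ts, now=None):
        now = time.time() if now is None else now
        if now - fix_ts > MAX_FIX_AGE_SEC:   # stale fix → no phantom fires
            return {"inside": False, "entered": False, "left": False}
        inside = haversine_m(lat, lon, self.lat, self.lon) <= self.radius_m
        entered = inside and not self.was_inside
        left = (not inside) and self.was_inside
        self.was_inside = inside
        return {"inside": inside, "entered": entered, "left": left}
```

Calling `update()` on every tick and on every `location_update` gives exactly one `entered` and one `left` per pass through the radius, however fast the position stream arrives.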
- **Dateien** (Files): browser for `/shared/uploads/` with multi-select + "select all" + bulk download (ZIP for 2+) + bulk delete. Chat bubbles update live on delete.

- **Einstellungen** (Settings): repair (container restart for Brain/Bridge/Qdrant), full reset, operating modes, speech output + voice cloning + F5-TTS tuning + voice export/import, Whisper, language model (brainModel), onboarding QR, app cleanup
@@ -355,8 +362,13 @@ Reachable at `http://<VM-IP>:3001`. Shares its network with the bridge.
- **Local voice selection**: each device can use its own voice (set in Settings). A voice change in Diagnostic overrides all app selections.

- **Voice-ready toast**: on a voice change the app shows "Stimme X bereit (X.Ys)" ("voice X ready") once preloading has finished

- **Play button**: any ARIA message can be read aloud again (from cache if available, otherwise re-rendered)

- - **Chat search**: magnifier in the status bar filters messages live
+ - **Chat search**: magnifier in the status bar — highlight + next/prev jumps to the match (the bubble lands with the start of its text at the top of the viewport)

+ - **Jump-to-bottom button**: appears at the bottom right as soon as you scroll away from the newest message; one tap takes you back

+ - **Delivery status per user bubble** (WhatsApp-style): `⏱` (queued, waiting for a connection) → `⏳` (sending) → `✓` (bridge sent an ACK) → `✓✓` (ARIA has processed it). During a network outage messages are held locally as queued and flushed automatically on reconnect. After three ACK timeouts → `⚠ tippen f. Retry` ("tap to retry"). Idempotency on the bridge (LRU over `clientMsgId`) prevents duplicates on retry
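The bridge-side idempotency (an LRU window over `clientMsgId`) can be sketched like this — a hypothetical `LruDedup` helper, not the bridge's actual Node code; the 100-entry window size comes from a code comment later in this diff:

```python
from collections import OrderedDict

class LruDedup:
    """Idempotency window for message retries: the first time a clientMsgId
    is seen it is accepted, any replay of it is dropped. A bounded
    OrderedDict keeps memory constant (oldest entries evicted first)."""
    def __init__(self, capacity: int = 100):
        self.capacity = capacity
        self.seen: OrderedDict[str, bool] = OrderedDict()

    def accept(self, client_msg_id: str) -> bool:
        if client_msg_id in self.seen:
            self.seen.move_to_end(client_msg_id)  # refresh recency
            return False                          # duplicate → drop
        self.seen[client_msg_id] = True
        if len(self.seen) > self.capacity:
            self.seen.popitem(last=False)         # evict the oldest entry
        return True
```

A retry that arrives after the original was already processed hits the window and is silently ignored, so the user never sees a doubled message.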
- **Per-bubble trash can** (with confirm): delete one specific message — it disappears not only from the UI but also from `chat_backup.jsonl`, the brain's conversation window and all other clients (RVS broadcast). Important so ARIA no longer has that turn in context for the next prompt

+ - **🗂️ Notes inbox + memory editor**: next to the magnifier, `🗂️` opens a full-screen modal with all memory/trigger/skill special bubbles from the chat plus the full DB browser. Tapping a memory opens a **detail/edit modal**: edit fields, upload/download + delete attachments, delete the memory entirely. The identical editor also lives under Settings → 🧠 Gedächtnis. Special bubbles are filtered out of the chat stream (no more note bubbles hanging at the bottom forever)

+ - **Dynamic bubble header**: "ARIA hat etwas gemerkt" ("ARIA noted something") / "Notiz geändert" ("note changed", yellow) / "Notiz gelöscht" ("note deleted", red) — depending on the action in the memory_saved event

+ - **App crash reporting**: uncaught JS errors + React render errors automatically land in `/shared/logs/app.log` via RVS — no ADB needed; fetch the logs via `tools/fetch-app-logs.sh` or Diagnostic GET `/api/app-log`. An ErrorBoundary prevents the white screen and instead shows an error box in a modal with stack trace + close button

- **Multiple attachments**: collect images + files, add text, then send everything together

- **Paste support**: paste images from the clipboard (Diagnostic)

- **Attachments**: the bridge stores them in the shared volume, ARIA can access them, re-download via RVS
@@ -867,10 +879,12 @@ docker exec aria-brain curl localhost:8080/memory/stats
- [x] **Phase B item 2:** migration from `aria-data/brain-import/` → atomic memory points (identity / rule / preference / tool / skill, idempotent via migration_key) + bootstrap snapshot export/import (pinned only)

- [x] **Phase B item 3:** brain conversation loop (single-chat UI, rolling window of 50 turns, threshold 60 → automatic distillation, manual trigger)
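The rolling-window/distillation mechanic (keep 50 turns verbatim, distill once the buffer reaches 60) can be sketched as follows — `ConversationWindow` and its callback are illustrative, not the brain's actual code:

```python
WINDOW_TURNS = 50       # turns kept verbatim in the prompt window
DISTILL_THRESHOLD = 60  # once the buffer reaches this, distill the overflow

class ConversationWindow:
    """Rolling conversation window: recent turns stay verbatim; when the
    buffer hits the threshold, the oldest overflow turns are handed to a
    distill callback (e.g. summarized into memory points) and dropped."""
    def __init__(self, distill):
        self.turns = []
        self.distill = distill  # callable receiving the list of old turns

    def add(self, role: str, text: str) -> None:
        self.turns.append((role, text))
        if len(self.turns) >= DISTILL_THRESHOLD:
            overflow, self.turns = self.turns[:-WINDOW_TURNS], self.turns[-WINDOW_TURNS:]
            self.distill(overflow)
```

The 10-turn hysteresis between window size and threshold avoids running the distillation on every single new turn once the window is full.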
- [x] **Phase B item 4:** skills system (Python-only via local venv, skill_create as a tool, dynamic run_<skill> tools, Diagnostic Skills tab with logs/toggle/export/import, skill_created live notification in app + Diagnostic, hard rule "needs pip → make it a skill")

- - [x] **Phase B item 5:** triggers system (passive wake-up sources — timers + watchers with a safe condition parser, GPS `near()`, Diagnostic Trigger tab, continuous GPS tracking in the app for use cases like a speed-camera warner). Including brain → bridge HTTP push (port 8090, internal) so trigger responses reach app + Diagnostic + TTS via RVS.
+ - [x] **Phase B item 5:** triggers system (passive wake-up sources — timers + watchers with a safe condition parser, three GPS functions `near()` / `entered_near()` / `left_near()` for the different geofencing modes, Diagnostic Trigger tab, continuous GPS tracking in the app for use cases like a speed-camera warner). 8 s tick frequency + event-driven evaluation on every `location_update` (instead of 30 s polling) so that even car drive-bys at 100+ km/h through small radii are caught reliably. The `near()` functions ignore GPS data older than 5 minutes. Including brain → bridge HTTP push (port 8090, internal) so trigger responses reach app + Diagnostic + TTS via RVS.

- [x] **Pass tool use through the proxy**: claude-max-api-proxy patches the `tools`/`tool_calls` round trip via its own adapters (`proxy-patches/`) — Claude Code previously called its internal tools (Bash, sleep) instead of the ARIA brain tools (trigger_timer, skill_*, ...). Tool use now works end to end.
- [x] **Single source of truth — Qdrant**: `memory_save` tool for ARIA, Claude Code auto-memory disabled (tmpfs over `~/.claude/projects` in the proxy container), `brain-import/` demoted to a pure drop folder, cold memory with a score threshold (0.30) against embedder noise/crosstalk, Diagnostic brain UI with verbatim/semantic search, Advanced Search (AND/OR with + button), memory print view, per-chat-bubble trash can. The DB is now the single source of knowledge throughout; no parallel file memory anymore.
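The cold-memory threshold is a plain floor on semantic-search scores. A minimal sketch (the hit shape is an assumption; the 0.30 value is the one stated in the item above):

```python
COLD_MEMORY_SCORE_THRESHOLD = 0.30  # below this: embedder noise/crosstalk

def filter_cold_hits(hits, threshold=COLD_MEMORY_SCORE_THRESHOLD):
    """Keep only semantic-search hits whose similarity score clears the
    threshold — unrelated memories often score low but nonzero with small
    embedders, so a hard floor keeps that noise out of the prompt."""
    return [h for h in hits if h["score"] >= threshold]
```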
- [x] **Memory attachments with vision pipeline**: images/PDFs/arbitrary files can be attached per memory (under `/shared/memory-attachments/<id>/`, max 20 MB). Diagnostic UI with thumbnail preview + lightbox, app `memory_saved` bubble with tap-to-load via RVS, system prompt shows attachment paths. **ARIA really sees images** via Claude Code's built-in multimodal `Read` tool — no proxy patch needed. `memory_save` has an `attach_paths` parameter, so ARIA can read a user photo in the same tool call, extract information (license plates, brands, text) and persist it as memory + attachment. Images stay attached to the memory — for later detail questions ARIA simply reads the image again.

+ - [x] **Memory editor in the app** (5 stages): the notes-inbox button next to the magnifier opens a modal with all special bubbles from the current chat plus the full DB browser. Tap on a memory → detail modal with attachment preview; the pencil icon switches to edit mode (edit fields + upload/download + delete attachments). The identical editor lives under Settings → 🧠 Gedächtnis. Bubble header is dynamic per action (created/updated/deleted). RVS brain proxy as the foundation (`brain_request`/`brain_response`) so the app can address arbitrary brain HTTP endpoints. `memory_search` + `memory_update` as ARIA tools so she can actively check the DB and patch entries instead of fragmenting them.

+ - [x] **App crash reporting via RVS**: ErrorBoundary + global JS error handler + promise-rejection tracker send crashes as `app_log` events through RVS. The bridge collects them in `/shared/logs/app.log`, Diagnostic GET `/api/app-log`. `tools/fetch-app-logs.sh` pulls the logs to the dev machine (gitignored `.aria-debug/`). This lets Stefan debug on the go without ADB — the first bug (URLSearchParams in Hermes) was found this way within 5 minutes.

- [x] Language-model setting works again (brainModel in runtime.json instead of aria-core)

- [x] App chat sync: full server sync on reconnect (server = source of truth) + chat_cleared live update. Local-only bubbles (skill notifications, in-flight voice without STT) are preserved.

- [x] App: chat search with next/prev navigation instead of filtering
@@ -79,8 +79,8 @@ android {
applicationId "com.ariacockpit"
minSdkVersion rootProject.ext.minSdkVersion
targetSdkVersion rootProject.ext.targetSdkVersion
- versionCode 10306
- versionName "0.1.3.6"
+ versionCode 10407
+ versionName "0.1.4.7"
// Fallback for libraries with product flavors
missingDimensionStrategy 'react-native-camera', 'general'
}
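The two version bumps in this hunk (0.1.3.6 → 10306, 0.1.4.7 → 10407) suggest that versionCode is derived from versionName. The following sketch is an assumption inferred from just these two pairs, not a documented convention of the project:

```python
def version_code(version_name: str) -> int:
    """Derive an Android versionCode from a four-part versionName.
    Both pairs in this diff fit major*100000 + minor*10000 + patch*100 + build."""
    major, minor, patch, build = (int(p) for p in version_name.split("."))
    return major * 100_000 + minor * 10_000 + patch * 100 + build
```

Keeping versionCode mechanically derivable avoids the classic Play-Store rejection where versionName was bumped but versionCode was not.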
@@ -1,6 +1,6 @@
{
"name": "aria-cockpit",
- "version": "0.1.3.6",
+ "version": "0.1.4.7",
"private": true,
"scripts": {
"android": "react-native run-android",
@@ -169,6 +169,12 @@ export const MemoryBrowser: React.FC<Props> = ({ restrictToIds, title, flatStyle
data={filtered}
keyExtractor={m => m.id}
renderItem={renderItem}
+ // nestedScrollEnabled: required so the FlatList scrolls on Android when
+ // it is nested inside an outer ScrollView (the Settings screen is a
+ // ScrollView). Without it the outer ScrollView swallows all gestures
+ // and the inner list is dead.
+ nestedScrollEnabled={true}
+ keyboardShouldPersistTaps="handled"
ListEmptyComponent={
<Text style={{color:'#555570',textAlign:'center',padding:20,fontStyle:'italic'}}>
{items.length === 0 ? '(keine Memories in der DB)' : '(keine Treffer für diese Filter)'}
@@ -114,13 +114,36 @@ interface ChatMessage {
 * are not yet persisted (brief race) — the trash can only appears once
 * the chat_backup event comes back from the bridge. */
backupTs?: number;
+ /** Client-side unique ID for delivery tracking (offline queue, ACK from
+  * the bridge, idempotency on retry). Generated on send and mirrored
+  * back by the bridge. */
+ clientMsgId?: string;
+ /** Delivery status of the user bubble (WhatsApp-style): queued = not
+  * yet out (offline), sending = on its way to the bridge, sent = bridge
+  * sent an ACK, delivered = brain has responded, failed = retry limit. */
+ deliveryStatus?: 'queued' | 'sending' | 'sent' | 'delivered' | 'failed';
+ /** Number of send attempts so far (for the retry limit). */
+ sendAttempts?: number;
+ }
+
+ /** One entry in the thought stream — a chronological log of what ARIA
+  * does internally (brain `agent_activity` events). Persists between
+  * thinking phases, stored in AsyncStorage. */
+ interface ThoughtEntry {
+ ts: number;
+ /** Raw activity from the brain: thinking, tool, assistant, idle (= ✓ done). */
+ activity: string;
+ /** The tool name when activity='tool', otherwise empty. */
+ tool?: string;
}

// --- Constants ---

const CHAT_STORAGE_KEY = 'aria_chat_messages';
+ const THOUGHT_STORAGE_KEY = 'aria_thought_stream';
const MAX_STORED_MESSAGES = 500;
const MAX_MEMORY_MESSAGES = 500;
+ const MAX_THOUGHTS = 500;

// Helper: cap the messages array at the max (drop the oldest) — prevents
// OOM in conversation mode with very many messages.
@@ -236,11 +259,20 @@ const ChatScreen: React.FC = () => {
const [fullscreenImage, setFullscreenImage] = useState<string | null>(null);
const [memoryDetailId, setMemoryDetailId] = useState<string | null>(null);
const [inboxVisible, setInboxVisible] = useState(false);
+ const [showJumpDown, setShowJumpDown] = useState(false);
const [searchQuery, setSearchQuery] = useState('');
const [searchVisible, setSearchVisible] = useState(false);
const [searchIndex, setSearchIndex] = useState(0); // which match is active
const [pendingAttachments, setPendingAttachments] = useState<{file: any, isPhoto: boolean}[]>([]);
const [agentActivity, setAgentActivity] = useState<{activity: string, tool: string}>({activity: 'idle', tool: ''});
+ // Thought stream: chronological log of what ARIA does internally.
+ // Fed from agent_activity events and persisted in AsyncStorage.
+ const [thoughts, setThoughts] = useState<ThoughtEntry[]>([]);
+ const [thoughtsVisible, setThoughtsVisible] = useState(false);
+ // Mirror of the last activity in a ref — prevents consecutive identical
+ // events (e.g. two 'thinking' in a row) from cluttering the stream.
+ // A rare case in practice, but cheap to check.
+ const lastThoughtKeyRef = useRef<string>('');
// Service status (Gamebox: F5-TTS / Whisper load status) + banner visibility
const [serviceStatus, setServiceStatus] = useState<Record<string, {state: string, model?: string, loadSeconds?: number, error?: string}>>({});
const [serviceBannerDismissed, setServiceBannerDismissed] = useState(false);
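The duplicate suppression behind `lastThoughtKeyRef` plus the `MAX_THOUGHTS` cap amount to the following logic — sketched here in Python rather than TypeScript; the function and field names are illustrative:

```python
MAX_THOUGHTS = 500  # mirrors the app-side cap on the stored stream

def append_thought(thoughts, last_key, activity, tool=""):
    """Append an agent_activity event to the thought stream, skipping
    consecutive identical events (e.g. two 'thinking' in a row) and
    capping the list so it cannot grow without bound."""
    key = f"{activity}:{tool}"
    if key == last_key:
        return thoughts, last_key        # same as the previous event → drop
    thoughts = (thoughts + [{"activity": activity, "tool": tool}])[-MAX_THOUGHTS:]
    return thoughts, key
```

Only *consecutive* duplicates are dropped; the same activity reappearing after something else in between is logged again, which preserves the chronology.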
@@ -259,6 +291,20 @@ const ChatScreen: React.FC = () => {
const flatListRef = useRef<FlatList>(null);
const messageIdCounter = useRef(0);
+ // Mirror of the messages list in a ref — closures (e.g. the
+ // dispatchWithAck retry) need access to a bubble's current status.
+ const messagesRef = useRef<ChatMessage[]>([]);
+ // Watchdog against an "ARIA denkt" hang: re-armed on every agent_activity
+ // event with a non-idle status. If it fires, NO updates have arrived from
+ // the brain for 180s → we assume the connection is lost or the brain has
+ // crashed — timeout bubble + reset.
+ const stuckWatchdog = useRef<ReturnType<typeof setTimeout> | null>(null);
+ const clearStuckWatchdog = () => {
+   if (stuckWatchdog.current) {
+     clearTimeout(stuckWatchdog.current);
+     stuckWatchdog.current = null;
+   }
+ };
// ServerPaths the user tapped "open" for — on the file_response the file
// is opened directly with the system intent after saving (PDF viewer,
// gallery, etc.).
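The 180 s stuck watchdog is a re-armable one-shot timer. A minimal Python sketch of that pattern (the app itself uses `setTimeout`, and what the callback does — timeout bubble + reset — is omitted here):

```python
import threading

STUCK_TIMEOUT_SEC = 180  # no brain updates for this long → assume a hang

class StuckWatchdog:
    """Re-armed on every non-idle agent_activity event; if no further event
    arrives within the timeout, the on_stuck callback fires."""
    def __init__(self, on_stuck, timeout: float = STUCK_TIMEOUT_SEC):
        self.on_stuck, self.timeout, self.timer = on_stuck, timeout, None

    def arm(self):
        self.clear()  # re-arming restarts the countdown
        self.timer = threading.Timer(self.timeout, self.on_stuck)
        self.timer.daemon = True
        self.timer.start()

    def clear(self):
        if self.timer:
            self.timer.cancel()
            self.timer = None
```

Arming on every activity event means the callback can only fire after a genuinely silent window, not merely a long-running task that keeps reporting progress.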
@@ -270,6 +316,116 @@ const ChatScreen: React.FC = () => {
|
|||||||
return `msg_${Date.now()}_${messageIdCounter.current}`;
|
return `msg_${Date.now()}_${messageIdCounter.current}`;
|
||||||
};
|
};
|
||||||
|
|
||||||
|
// Eindeutige clientMsgId fuer Delivery-Tracking (Bridge-Echo, Retry,
|
||||||
|
// Idempotenz). Format: cmsg_<ms>_<rand> — eindeutig genug fuer eine
|
||||||
|
// 100er-Dedup-Window auf der Bridge.
|
||||||
|
const nextClientMsgId = (): string =>
|
||||||
|
`cmsg_${Date.now()}_${Math.floor(Math.random() * 1_000_000)}`;
|
||||||
|
|
||||||
|
// Wie lange wir auf das ACK warten bevor wir retryen. Bridge sollte
|
||||||
|
// unmittelbar zurueckmelden — 30s ist grosszuegig fuer schlechte Netze.
|
||||||
|
const ACK_TIMEOUT_MS = 30_000;
|
||||||
|
// Wie oft re-tryen wir bevor wir "failed" anzeigen.
|
||||||
|
const MAX_SEND_ATTEMPTS = 3;
|
||||||
|
// Pending ACK-Timer pro clientMsgId — fuer cancel beim ACK.
|
||||||
|
const ackTimers = useRef<Map<string, ReturnType<typeof setTimeout>>>(new Map());
|
||||||
|
const clearAckTimer = (cmid: string) => {
|
||||||
|
const t = ackTimers.current.get(cmid);
|
||||||
|
if (t) {
|
||||||
|
clearTimeout(t);
|
||||||
|
ackTimers.current.delete(cmid);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
// Pending-Payloads pro clientMsgId — wir brauchen sie fuer Retry nach
|
||||||
|
// ACK-Timeout oder nach Reconnect (offline-Queue). Liegt in einer Ref
|
||||||
|
// damit der Inhalt Closures ueberlebt.
|
||||||
|
const pendingPayloads = useRef<Map<string, { type: 'chat' | 'audio'; payload: Record<string, unknown> }>>(new Map());
|
||||||
|
|
||||||
|
// ConnectionState in Ref spiegeln — fuer Closures (onMessage, Send-Pfade)
|
||||||
|
// die sonst auf einen veralteten Wert zugreifen wuerden.
|
||||||
|
const connectionStateRef = useRef<ConnectionState>('disconnected');
|
||||||
|
|
||||||
|
// Status einer Bubble per clientMsgId aendern (Helper)
|
||||||
|
const updateMessageStatus = useCallback(
|
||||||
|
(cmid: string, patch: Partial<Pick<ChatMessage, 'deliveryStatus' | 'sendAttempts'>>) => {
|
||||||
|
setMessages(prev => prev.map(m => (m.clientMsgId === cmid ? { ...m, ...patch } : m)));
|
||||||
|
},
|
||||||
|
[],
|
||||||
|
);
|
||||||
|
|
||||||
|
// Sende eine 'chat'- oder 'audio'-Nachricht an die Bridge mit ACK-Tracking.
|
||||||
|
// - Wenn offline → status='queued', wird beim Reconnect rausgeschickt.
|
||||||
|
// - Wenn online → status='sending', Timer fuer ACK-Erwartung.
|
||||||
|
// - Bei ACK-Timeout: retry (bis MAX_SEND_ATTEMPTS) oder 'failed'.
|
||||||
|
+  // - If the bubble is already 'delivered' (e.g. ARIA answered before
+  //   the ACK came through) → abort entirely, no further retry.
+  const dispatchWithAck = useCallback(
+    (cmid: string, type: 'chat' | 'audio', payload: Record<string, unknown>, attempt = 1) => {
+      // Guard: if the bubble is delivered by now, stop the retry loop
+      // (can happen with late ACKs or a manual retry when ARIA has long
+      // since answered).
+      const current = messagesRef.current.find(m => m.clientMsgId === cmid);
+      if (current?.deliveryStatus === 'delivered') {
+        clearAckTimer(cmid);
+        pendingPayloads.current.delete(cmid);
+        return;
+      }
+      pendingPayloads.current.set(cmid, { type, payload });
+      const online = connectionStateRef.current === 'connected';
+      if (!online) {
+        updateMessageStatus(cmid, { deliveryStatus: 'queued', sendAttempts: attempt });
+        return;
+      }
+      // RVS.send with clientMsgId — the bridge mirrors it back in chat_ack
+      rvs.send(type, { ...payload, clientMsgId: cmid });
+      updateMessageStatus(cmid, { deliveryStatus: 'sending', sendAttempts: attempt });
+      clearAckTimer(cmid);
+      ackTimers.current.set(
+        cmid,
+        setTimeout(() => {
+          ackTimers.current.delete(cmid);
+          // Before retrying, check again whether the bubble has been
+          // delivered in the meantime — otherwise we spawn endless retries.
+          const fresh = messagesRef.current.find(m => m.clientMsgId === cmid);
+          if (fresh?.deliveryStatus === 'delivered') {
+            pendingPayloads.current.delete(cmid);
+            return;
+          }
+          if (attempt >= MAX_SEND_ATTEMPTS) {
+            updateMessageStatus(cmid, { deliveryStatus: 'failed', sendAttempts: attempt });
+            console.warn('[Chat] Send fehlgeschlagen nach %d Versuchen: %s', attempt, cmid);
+          } else {
+            console.warn('[Chat] kein ACK fuer %s — Retry #%d', cmid, attempt + 1);
+            dispatchWithAck(cmid, type, payload, attempt + 1);
+          }
+        }, ACK_TIMEOUT_MS),
+      );
+    },
+    [updateMessageStatus],
+  );
+
+  // Push all 'queued' messages back out on reconnect
+  const flushQueuedMessages = useCallback(() => {
+    setMessages(prev => {
+      for (const m of prev) {
+        if (m.deliveryStatus !== 'queued' || !m.clientMsgId) continue;
+        const pending = pendingPayloads.current.get(m.clientMsgId);
+        if (!pending) continue;
+        // Keep the attempt counter (or start at 1 if empty)
+        dispatchWithAck(m.clientMsgId, pending.type, pending.payload, m.sendAttempts || 1);
+      }
+      return prev;
+    });
+  }, [dispatchWithAck]);
+
+  // Manual retry after 'failed' (tap on the ⚠️ icon)
+  const retryFailedMessage = useCallback((cmid: string) => {
+    const pending = pendingPayloads.current.get(cmid);
+    if (!pending) return;
+    dispatchWithAck(cmid, pending.type, pending.payload, 1);
+  }, [dispatchWithAck]);
+
   // Reload TTS + GPS settings on mount and every 2 s (so a settings toggle
   // takes effect immediately, without a context or event system)
   useEffect(() => {
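The send/ACK/retry loop in the hunk above can be modeled without React as a small state machine. This is a hedged sketch under stated assumptions: `AckDispatcher`, its method names, and the transport callback are illustrative inventions, not the app's real API — the real code additionally tracks timers and payloads in refs.

```typescript
// Hypothetical, framework-free model of the ACK/retry protocol:
// 'sending' until the bridge ACKs ('sent'); a timeout without an ACK
// retries, and after maxAttempts the bubble flips to 'failed'.
type DeliveryStatus = 'queued' | 'sending' | 'sent' | 'delivered' | 'failed';

interface PendingSend {
  payload: Record<string, unknown>;
  attempts: number;
  status: DeliveryStatus;
}

class AckDispatcher {
  private pending = new Map<string, PendingSend>();

  constructor(
    private maxAttempts: number,
    private sendFn: (cmid: string, payload: Record<string, unknown>) => void,
  ) {}

  // Send once; the caller invokes onTimeout() when no ACK arrived in time.
  dispatch(cmid: string, payload: Record<string, unknown>): void {
    const entry: PendingSend =
      this.pending.get(cmid) ?? { payload, attempts: 0, status: 'sending' };
    if (entry.status === 'delivered') return; // late timeout after answer: no retry
    entry.attempts += 1;
    entry.status = 'sending';
    this.pending.set(cmid, entry);
    this.sendFn(cmid, { ...payload, clientMsgId: cmid });
  }

  onAck(cmid: string): void {
    const entry = this.pending.get(cmid);
    if (entry && entry.status === 'sending') entry.status = 'sent';
  }

  markDelivered(cmid: string): void {
    const entry = this.pending.get(cmid);
    if (entry) entry.status = 'delivered';
  }

  onTimeout(cmid: string): void {
    const entry = this.pending.get(cmid);
    if (!entry || entry.status !== 'sending') return; // ACK/answer already arrived
    if (entry.attempts >= this.maxAttempts) {
      entry.status = 'failed';            // surfaces the ⚠ retry button
    } else {
      this.dispatch(cmid, entry.payload); // automatic retry
    }
  }

  status(cmid: string): DeliveryStatus | undefined {
    return this.pending.get(cmid)?.status;
  }
}

// Example run: two lost ACKs, then give up after maxAttempts = 2.
const sent: string[] = [];
const d = new AckDispatcher(2, (cmid) => sent.push(cmid));
d.dispatch('m1', { text: 'hi' });
d.onTimeout('m1'); // retry #2
d.onTimeout('m1'); // attempts exhausted → 'failed'
d.dispatch('m2', { text: 'ok' });
d.onAck('m2');     // ACK in time → 'sent'
```

The key invariant matches the guards in the diff: a timeout only acts while the entry is still `'sending'`, so a late timer can never demote an already acknowledged or delivered bubble.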
@@ -375,12 +531,24 @@ const ChatScreen: React.FC = () => {
         const parsed: ChatMessage[] = JSON.parse(stored);
         if (Array.isArray(parsed) && parsed.length > 0) {
           console.log(`[Chat] ${parsed.length} Nachrichten geladen`);
-          setMessages(parsed);
+          // MERGE instead of overwrite: between mount and load-done,
+          // messages can already arrive (user types right away, WS events
+          // land before the load finishes). setMessages(parsed) used to
+          // overwrite them → "message gone without a trace". Now we merge
+          // by id; just-added local messages beat stored ones (fresher).
+          setMessages(prev => {
+            if (prev.length === 0) return parsed;
+            const byId = new Map<string, ChatMessage>();
+            for (const m of parsed) byId.set(m.id, m);
+            for (const m of prev) byId.set(m.id, m);
+            return [...byId.values()].sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0));
+          });
           const maxId = parsed.reduce((max, msg) => {
             const num = parseInt(msg.id.split('_').pop() || '0', 10);
             return num > max ? num : max;
           }, 0);
-          messageIdCounter.current = maxId;
+          messageIdCounter.current = Math.max(messageIdCounter.current, maxId);
         }
       }
     } catch (err) {
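The merge-instead-of-overwrite load path can be isolated as a pure function. A minimal sketch, assuming a reduced `Msg` shape — `mergeById` is an illustrative name; the real code inlines this inside `setMessages` with the full `ChatMessage` type:

```typescript
// Stored messages load asynchronously; anything added locally in the
// meantime (prev) must win over the stored copy with the same id, and
// the result stays chronologically sorted.
interface Msg { id: string; text: string; timestamp: number; }

function mergeById(stored: Msg[], prev: Msg[]): Msg[] {
  if (prev.length === 0) return stored;
  const byId = new Map<string, Msg>();
  for (const m of stored) byId.set(m.id, m);
  for (const m of prev) byId.set(m.id, m); // fresher local copies win
  return [...byId.values()].sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0));
}

const stored = [
  { id: 'a', text: 'old copy', timestamp: 1 },
  { id: 'b', text: 'from disk', timestamp: 2 },
];
const prevLocal = [{ id: 'a', text: 'fresh local edit', timestamp: 3 }];
const merged = mergeById(stored, prevLocal);
```

Because the `prev` pass runs second, a duplicate id keeps the in-memory version — exactly the property that fixes the "message gone without a trace" race described in the hunk.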
@@ -418,6 +586,22 @@ const ChatScreen: React.FC = () => {
   // Subscribe to RVS messages
   useEffect(() => {
     const unsubMessage = rvs.onMessage((message: RVSMessage) => {
+      // chat_ack: the bridge confirms receipt of a chat/audio message.
+      // We mark the bubble as 'sent' (✓) and stop the ACK timer.
+      if (message.type === ('chat_ack' as any)) {
+        const cmid = (message.payload as any).clientMsgId as string | undefined;
+        if (cmid) {
+          clearAckTimer(cmid);
+          pendingPayloads.current.delete(cmid);
+          setMessages(prev => prev.map(m =>
+            m.clientMsgId === cmid && m.deliveryStatus !== 'delivered'
+              ? { ...m, deliveryStatus: 'sent' }
+              : m
+          ));
+        }
+        return;
+      }
+
       // file_saved: the bridge reports the server path — store it in the attachment for re-download
       if (message.type === 'file_saved') {
         const serverPath = (message.payload.serverPath as string) || '';
@@ -473,6 +657,10 @@ const ChatScreen: React.FC = () => {
           mimeType: f.mimeType || '',
           serverPath: f.serverPath || '',
         })) as Attachment[];
+        // Pass clientMsgId along — the bridge mirrors it in chat_backup,
+        // so we can dedupe local bubbles by ID instead of only by a
+        // text/timestamp heuristic.
+        const cmid = typeof m.clientMsgId === 'string' ? m.clientMsgId : undefined;
         return {
           id: nextId(),
           sender: role as 'user' | 'aria',
@@ -480,20 +668,45 @@ const ChatScreen: React.FC = () => {
           timestamp: m.ts || Date.now(),
           attachments: attachments.length ? attachments : undefined,
           backupTs: typeof m.ts === 'number' ? m.ts : undefined,
+          ...(cmid && { clientMsgId: cmid }),
+          // Server bubble = processed by the brain → 'delivered' (✓✓)
+          ...(role === 'user' && cmid && { deliveryStatus: 'delivered' as const }),
         };
       });
       const maxTs = incoming.reduce((mx: number, m: any) => Math.max(mx, m.ts || 0), 0);
       setMessages(prev => {
+        // ClientMsgIds the server knows — local bubbles with the same ID
+        // are replaced by the server version.
+        const serverCmids = new Set(
+          fromServer.map(s => s.clientMsgId).filter((x): x is string => !!x)
+        );
         // Detect + keep local-only bubbles:
         // - skill-created notifications (skillCreated set)
         // - voice messages still in flight without an STT result
         //   (audioRequestId set AND text empty/placeholder)
-        const localOnly = prev.filter(m =>
-          m.skillCreated ||
-          m.triggerCreated ||
-          m.memorySaved ||
-          (m.audioRequestId && (!m.text || m.text === '🎙 Aufnahme...' || m.text === 'Aufnahme...'))
-        );
+        // - user bubbles whose clientMsgId the server does not know yet:
+        //   e.g. during a reconnect race, or while flushQueuedMessages is
+        //   still running. BUT: if the server has a text-equal bubble in
+        //   the same 5-minute window (an old backup entry without a
+        //   clientMsgId, written before the bridge patch), we treat that
+        //   as a hit and drop the local copy — otherwise we get a double
+        //   post: once as the server bubble (delivered) and once as a
+        //   local failed/queued one with a retry button.
+        const FIVE_MIN = 5 * 60 * 1000;
+        const localOnly = prev.filter(m => {
+          if (m.skillCreated || m.triggerCreated || m.memorySaved) return true;
+          if (m.audioRequestId && (!m.text || m.text === '🎙 Aufnahme...' || m.text === 'Aufnahme...')) return true;
+          if (m.sender === 'user' && m.clientMsgId && !serverCmids.has(m.clientMsgId)) {
+            const serverHasIt = fromServer.some(s =>
+              s.sender === 'user' &&
+              s.text === m.text &&
+              Math.abs((s.timestamp || 0) - (m.timestamp || 0)) < FIVE_MIN
+            );
+            if (serverHasIt) return false;
+            return true;
+          }
+          return false;
+        });
         // Server state + local-only (chronologically sorted)
         const merged = [...fromServer, ...localOnly].sort((a, b) => a.timestamp - b.timestamp);
         return capMessages(merged);
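The local-only filter's user-bubble branch is the subtle part of the hunk above, so here it is as a standalone predicate. A hedged sketch: `keepLocalBubble` and the reduced `Bubble` type are illustrative, and only the user-bubble rule is modeled (the skill/voice branches are omitted):

```typescript
// A local user bubble survives the history merge only if the server has
// neither its clientMsgId nor a text-equal user bubble within a 5-minute
// window (legacy backup entries were written without clientMsgIds).
interface Bubble {
  sender: 'user' | 'aria';
  text: string;
  timestamp: number;
  clientMsgId?: string;
}

const FIVE_MIN = 5 * 60 * 1000;

function keepLocalBubble(m: Bubble, fromServer: Bubble[], serverCmids: Set<string>): boolean {
  if (m.sender !== 'user' || !m.clientMsgId) return false;
  if (serverCmids.has(m.clientMsgId)) return false; // server version replaces it
  const serverHasIt = fromServer.some(s =>
    s.sender === 'user' &&
    s.text === m.text &&
    Math.abs((s.timestamp || 0) - (m.timestamp || 0)) < FIVE_MIN
  );
  return !serverHasIt; // text match in the window → drop the local copy
}

const server: Bubble[] = [{ sender: 'user', text: 'hallo', timestamp: 10_000 }];
const cmids = new Set<string>();
// Same text 10 s apart → counted as the server's copy, local one dropped.
const dup = keepLocalBubble(
  { sender: 'user', text: 'hallo', timestamp: 20_000, clientMsgId: 'x' }, server, cmids);
// Unknown text → genuinely local-only, kept for the retry path.
const fresh = keepLocalBubble(
  { sender: 'user', text: 'neu', timestamp: 20_000, clientMsgId: 'y' }, server, cmids);
```

Dropping the text-equal copy is what prevents the double post the comment describes: one delivered server bubble plus one local failed/queued bubble with a retry button.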
@@ -749,8 +962,25 @@ const ChatScreen: React.FC = () => {
           messageId: (message.payload.messageId as string) || undefined,
           backupTs: (message.payload.backupTs as number) || undefined,
         };
-        return capMessages([...prev, ariaMsg]);
+        // ARIA answered → mark all earlier user bubbles as 'delivered'
+        // (WhatsApp double tick ✓✓). The brain has processed them.
+        return capMessages([...prev, ariaMsg]).map(m =>
+          m.sender === 'user'
+            && (m.deliveryStatus === 'sent' || m.deliveryStatus === 'sending')
+            ? { ...m, deliveryStatus: 'delivered' }
+            : m
+        );
       });
+      // ARIA answered → clear the watchdog, if still armed
+      clearStuckWatchdog();
+      // Clear ALL still-running ACK timers — the bridge evidently processed
+      // our messages (otherwise there would be no ARIA answer). If an ACK
+      // got lost for network reasons, the retry must not fire afterwards
+      // and flip the bubble to 'failed'.
+      for (const cmid of Array.from(ackTimers.current.keys())) {
+        clearAckTimer(cmid);
+        pendingPayloads.current.delete(cmid);
+      }
     }

     // Play TTS audio if present — respects the device-local mute/disable
@@ -793,8 +1023,39 @@ const ChatScreen: React.FC = () => {
       const activity = (message.payload.activity as string) || 'idle';
       const tool = (message.payload.tool as string) || '';
       setAgentActivity({ activity, tool });
+      // Append to the thought stream. Dedup against identical consecutive
+      // events (e.g. two 'thinking' right after each other). NEVER dedup
+      // tool events — if ARIA calls Bash three times in a row, all three
+      // must be visible.
+      const key = `${activity}|${tool}`;
+      const isTool = activity === 'tool';
+      if (isTool || key !== lastThoughtKeyRef.current) {
+        lastThoughtKeyRef.current = key;
+        setThoughts(prev => {
+          const next = [...prev, { ts: Date.now(), activity, tool }];
+          return next.length > MAX_THOUGHTS ? next.slice(-MAX_THOUGHTS) : next;
+        });
+      }
       // Spotify may keep playing while "ARIA is thinking/typing" — it only
       // pauses when TTS starts (then _firePlaybackStarted acquires the focus).
+      // Watchdog: as long as the brain keeps sending signs of life (every
+      // new activity event), restart the timer. 21 min without an update →
+      // hang. Slightly above the brain timeout (20 min) so it only fires
+      // on real connection drops / brain crashes, not during legitimate
+      // long multi-tool sessions that the brain cuts off itself.
+      clearStuckWatchdog();
+      if (activity !== 'idle') {
+        stuckWatchdog.current = setTimeout(() => {
+          stuckWatchdog.current = null;
+          setAgentActivity({ activity: 'idle', tool: '' });
+          setMessages(prev => capMessages([...prev, {
+            id: nextId(),
+            sender: 'aria',
+            text: '⚠️ Habe gerade keine Verbindung zurueck bekommen (Timeout nach 21 Min). Deine letzte Nachricht ist evtl. nicht durchgekommen — schick sie nochmal.',
+            timestamp: Date.now(),
+          }]));
+        }, 1_260_000);
+      }
     }

     // Voice config from Diagnostic — sets the local app voice to the
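The thought-stream dedup rule above (collapse identical consecutive events, but always keep tool events) can be expressed as a pure helper. A sketch with invented names — `appendThought` and the mutable `last.key` holder stand in for the `lastThoughtKeyRef` ref and `setThoughts` updater of the real component:

```typescript
// Append an activity event to a capped thought list. Non-tool events that
// repeat the previous key are dropped; 'tool' events are always appended
// (three Bash calls in a row must show up three times).
interface Thought { activity: string; tool: string; }

function appendThought(
  list: Thought[],
  last: { key: string },       // mimics lastThoughtKeyRef.current
  activity: string,
  tool: string,
  max = 5,                     // mimics MAX_THOUGHTS
): Thought[] {
  const key = `${activity}|${tool}`;
  if (activity !== 'tool' && key === last.key) return list; // duplicate non-tool event
  last.key = key;
  const next = [...list, { activity, tool }];
  return next.length > max ? next.slice(-max) : next;       // cap the stream
}

const last = { key: '' };
let t: Thought[] = [];
t = appendThought(t, last, 'thinking', ''); // kept
t = appendThought(t, last, 'thinking', ''); // deduped
t = appendThought(t, last, 'tool', 'Bash'); // kept
t = appendThought(t, last, 'tool', 'Bash'); // kept (tool events never dedupe)
```

The `slice(-max)` cap mirrors the bounded stream in the diff so persisted thoughts cannot grow without limit.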
@@ -838,6 +1099,7 @@ const ChatScreen: React.FC = () => {

     const unsubState = rvs.onStateChange((state) => {
       setConnectionState(state);
+      connectionStateRef.current = state;
       // On (re)connect: fetch the COMPLETE server state. The server is
       // the source of truth — if it is empty (e.g. after "reset
       // conversation"), the app should mirror that, even if it was offline
@@ -845,11 +1107,26 @@ const ChatScreen: React.FC = () => {
       // messages from the server, or an empty array if the server is empty.
       if (state === 'connected') {
         rvs.send('chat_history_request' as any, { since: 0, limit: 200 });
+        // Flush the offline queue — push out all 'queued' bubbles
+        flushQueuedMessages();
+      } else if (state === 'disconnected') {
+        // Cancel ACK timers, put affected bubbles back to 'queued'
+        for (const [cmid, t] of ackTimers.current.entries()) {
+          clearTimeout(t);
+          ackTimers.current.delete(cmid);
+          setMessages(prev => prev.map(m =>
+            m.clientMsgId === cmid && m.deliveryStatus === 'sending'
+              ? { ...m, deliveryStatus: 'queued' }
+              : m
+          ));
+        }
       }
     });

     // Set the initial status
-    setConnectionState(rvs.getState());
+    const initialState = rvs.getState();
+    setConnectionState(initialState);
+    connectionStateRef.current = initialState;

     return () => {
       unsubMessage();
@@ -1026,6 +1303,40 @@ const ChatScreen: React.FC = () => {
     return () => { if (saveTimer.current) clearTimeout(saveTimer.current); };
   }, [messages]);

+  // Load the thought stream from AsyncStorage on mount
+  useEffect(() => {
+    AsyncStorage.getItem(THOUGHT_STORAGE_KEY)
+      .then(raw => {
+        if (!raw) return;
+        try {
+          const parsed = JSON.parse(raw);
+          if (Array.isArray(parsed)) setThoughts(parsed.slice(-MAX_THOUGHTS));
+        } catch {}
+      })
+      .catch(() => {});
+  }, []);
+
+  // Persist the thought stream (debounced)
+  const thoughtSaveTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
+  useEffect(() => {
+    if (thoughts.length === 0) {
+      AsyncStorage.removeItem(THOUGHT_STORAGE_KEY).catch(() => {});
+      return;
+    }
+    if (thoughtSaveTimer.current) clearTimeout(thoughtSaveTimer.current);
+    thoughtSaveTimer.current = setTimeout(() => {
+      AsyncStorage.setItem(
+        THOUGHT_STORAGE_KEY,
+        JSON.stringify(thoughts.slice(-MAX_THOUGHTS)),
+      ).catch(() => {});
+    }, 500);
+    return () => { if (thoughtSaveTimer.current) clearTimeout(thoughtSaveTimer.current); };
+  }, [thoughts]);
+
+  // Keep messagesRef up to date — read by dispatchWithAck/retry so
+  // retries can react to the current deliveryStatus.
+  useEffect(() => { messagesRef.current = messages; }, [messages]);
+
   // Inverted FlatList: newest messages at the bottom, no manual scrolling needed
   // Special bubbles (memorySaved/triggerCreated/skillCreated) should NOT
   // appear in the chat anymore — they are shown in the notes inbox.
@@ -1051,26 +1362,60 @@ const ChatScreen: React.FC = () => {
     setSearchIndex(0);
   }, [searchQuery]);

-  // On an index change, scroll to the corresponding bubble.
-  // FlatList is `inverted` → viewPosition 0.5 (middle) really is the
-  // middle of the visible area in the inverted render. We delay slightly
-  // so the layout is definitely finished.
+  // Tracking so we don't scroll to the same bubble repeatedly (e.g. when
+  // new messages arrive while search is active → invertedMessages
+  // changes, but must not retrigger the scroll).
+  const lastSearchScrollKey = useRef<string>('');
+  // Pending retry timer for onScrollToIndexFailed — cancelled as soon as
+  // a new search hit arrives, so that stale retries don't interfere with
+  // the new scroll attempt (the "jumps forever" bug).
+  const pendingScrollRetry = useRef<ReturnType<typeof setTimeout> | null>(null);
+  const clearPendingScrollRetry = () => {
+    if (pendingScrollRetry.current) {
+      clearTimeout(pendingScrollRetry.current);
+      pendingScrollRetry.current = null;
+    }
+  };
+
+  // On a search-index change, scroll to the corresponding bubble.
+  // FlatList is `inverted`. viewPosition 0 = item top at the top of the
+  // viewport → the hit bubble is visible right at the top.
+  // IMPORTANT: invertedMessages deliberately NOT in the deps — otherwise
+  // the effect fires again on every new ARIA message and scrolls amok.
+  // We take the current snapshot of invertedMessages via a ref.
+  const invertedMessagesRef = useRef(invertedMessages);
+  invertedMessagesRef.current = invertedMessages;
   useEffect(() => {
-    if (!searchMatchIds.length) return;
+    if (!searchMatchIds.length) {
+      lastSearchScrollKey.current = '';
+      clearPendingScrollRetry();
+      return;
+    }
     const id = searchMatchIds[searchIndex];
     if (!id) return;
-    const idx = invertedMessages.findIndex(m => m.id === id);
+    // A unique key per hit stop — prevents identical re-renders from
+    // scrolling again.
+    const key = `${searchIndex}:${id}`;
+    if (lastSearchScrollKey.current === key) return;
+    lastSearchScrollKey.current = key;
+    // New search → discard old retries
+    clearPendingScrollRetry();
+    const idx = invertedMessagesRef.current.findIndex(m => m.id === id);
     if (idx < 0 || !flatListRef.current) return;
-    const tryScroll = () => {
+    requestAnimationFrame(() => {
       try {
-        flatListRef.current?.scrollToIndex({ index: idx, animated: true, viewPosition: 0.5 });
+        flatListRef.current?.scrollToIndex({ index: idx, animated: true, viewPosition: 0 });
       } catch {
-        // retried again by onScrollToIndexFailed
+        // the onScrollToIndexFailed handler takes over as fallback
       }
-    };
-    // requestAnimationFrame instead of setTimeout 0 — waits for the next layout frame
-    requestAnimationFrame(tryScroll);
-  }, [searchIndex, searchMatchIds, invertedMessages]);
+    });
+  }, [searchIndex, searchMatchIds]);
+
+  // On unmount → discard pending timers, otherwise they fire into the void after navigation
+  useEffect(() => () => {
+    clearPendingScrollRetry();
+    clearStuckWatchdog();
+  }, []);

   const activeSearchId = searchMatchIds[searchIndex] || '';
   const gotoSearchPrev = () => {
@@ -1150,29 +1495,33 @@ const ChatScreen: React.FC = () => {
     const wasInterrupted = interruptAriaIfBusy();
     const location = await getCurrentLocation();
+
+    const cmid = nextClientMsgId();
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
       text,
       timestamp: Date.now(),
+      clientMsgId: cmid,
+      deliveryStatus: connectionStateRef.current === 'connected' ? 'sending' : 'queued',
+      sendAttempts: 1,
     };
     setMessages(prev => capMessages([...prev, userMsg]));

-    console.log('[Chat] sende mit voice=%s speed=%s interrupted=%s',
-      localXttsVoiceRef.current || '(default)', ttsSpeedRef.current, wasInterrupted);
-    // Send to RVS — with the device-local voice (the bridge uses it for the answer)
-    rvs.send('chat', {
+    console.log('[Chat] sende cmid=%s voice=%s speed=%s interrupted=%s',
+      cmid, localXttsVoiceRef.current || '(default)', ttsSpeedRef.current, wasInterrupted);
+    dispatchWithAck(cmid, 'chat', {
       text,
       voice: localXttsVoiceRef.current,
       speed: ttsSpeedRef.current,
       interrupted: wasInterrupted,
       ...(location && { location }),
     });
-  }, [inputText, getCurrentLocation, pendingAttachments, sendPendingAttachments, interruptAriaIfBusy]);
+  }, [inputText, getCurrentLocation, pendingAttachments, sendPendingAttachments, interruptAriaIfBusy, dispatchWithAck]);

   // Cancel the request — local indicator disappears immediately, bridge triggers doctor --fix
   const cancelRequest = useCallback(() => {
     setAgentActivity({ activity: 'idle', tool: '' });
+    clearStuckWatchdog();
     rvs.send('cancel_request' as any, {});
   }, []);

@@ -1189,6 +1538,7 @@ const ChatScreen: React.FC = () => {
       if (speaking) audioService.haltAllPlayback('user spricht (barge-in)');
       if (thinking) {
         setAgentActivity({ activity: 'idle', tool: '' });
+        clearStuckWatchdog();
         rvs.send('cancel_request' as any, {});
       }
       return true;
@@ -1201,16 +1551,20 @@ const ChatScreen: React.FC = () => {
     const location = await getCurrentLocation();
     const audioRequestId = `audio_${Date.now()}_${Math.floor(Math.random() * 100000)}`;
+
+    const cmid = nextClientMsgId();
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
       text: '🎙 Spracheingabe wird verarbeitet...',
       timestamp: Date.now(),
       audioRequestId,
+      clientMsgId: cmid,
+      deliveryStatus: connectionStateRef.current === 'connected' ? 'sending' : 'queued',
+      sendAttempts: 1,
     };
     setMessages(prev => capMessages([...prev, userMsg]));

-    rvs.send('audio', {
+    dispatchWithAck(cmid, 'audio', {
       base64: result.base64,
       durationMs: result.durationMs,
       mimeType: result.mimeType,
@@ -1271,13 +1625,20 @@ const ChatScreen: React.FC = () => {
       });
     }

-    // Chat message with all attachments
+    // Chat message with all attachments. clientMsgId only when text is
+    // included — files themselves have no ACK tracking on the bridge (yet).
+    const cmid = messageText ? nextClientMsgId() : undefined;
     const userMsg: ChatMessage = {
       id: msgId,
       sender: 'user',
       text: messageText || `${pendingAttachments.length} Anhang/Anhaenge`,
       timestamp: Date.now(),
       attachments,
+      ...(cmid && {
+        clientMsgId: cmid,
+        deliveryStatus: connectionStateRef.current === 'connected' ? 'sending' : 'queued',
+        sendAttempts: 1,
+      }),
     };
     setMessages(prev => capMessages([...prev, userMsg]));

@@ -1311,9 +1672,11 @@ const ChatScreen: React.FC = () => {
       });
     }

-    // Text as a separate message (so ARIA knows what to do)
-    if (messageText) {
-      rvs.send('chat', {
+    // Text as a separate message (so ARIA knows what to do) — with the
+    // bubble's clientMsgId, so bridge+ACK address the right bubble.
+    if (messageText && cmid) {
+      dispatchWithAck(cmid, 'chat', {
         text: messageText,
         voice: localXttsVoiceRef.current,
         speed: ttsSpeedRef.current,
@@ -1323,7 +1686,7 @@ const ChatScreen: React.FC = () => {

     setPendingAttachments([]);
     setInputText('');
-  }, [pendingAttachments, getCurrentLocation]);
+  }, [pendingAttachments, getCurrentLocation, dispatchWithAck]);

   // --- Rendering ---

@@ -1375,17 +1738,30 @@ const ChatScreen: React.FC = () => {
           <TouchableOpacity
             key={`${item.id}-att-${idx}`}
             style={styles.memoryAttachmentRow}
-            onPress={() => {
+            onPress={async () => {
               if (!a.path) return;
               if (a.localUri) {
-                if (isImage) setFullscreenImage(a.localUri);
-                else openFileWithIntent(a.localUri.replace(/^file:\/\//, ''), a.mime || '');
-              } else {
-                // Reload the file via the bridge — file_response has the
-                // memorySaved match path and caches + shows it directly
-                autoOpenPaths.current.add(a.path);
-                rvs.send('file_request' as any, { serverPath: a.path, requestId: `memAtt_${item.id}_${idx}` });
+                const localPath = a.localUri.replace(/^file:\/\//, '');
+                const exists = await RNFS.exists(localPath).catch(() => false);
+                if (exists) {
+                  if (isImage) setFullscreenImage(a.localUri);
+                  else openFileWithIntent(localPath, a.mime || '');
+                  return;
+                }
+                // Cache gone → clear localUri + reload
+                setMessages(prev => prev.map(mm => mm.id === item.id && mm.memorySaved
+                  ? { ...mm, memorySaved: { ...mm.memorySaved,
+                      attachments: mm.memorySaved.attachments?.map(x =>
+                        x.path === a.path ? { ...x, localUri: undefined } : x) } }
+                  : mm));
+                if (Platform.OS === 'android') {
+                  ToastAndroid.show('Cache leer — lade nach...', ToastAndroid.SHORT);
+                }
               }
+              // Reload the file via the bridge — file_response has the
+              // memorySaved match path and caches + shows it directly
+              autoOpenPaths.current.add(a.path);
+              rvs.send('file_request' as any, { serverPath: a.path, requestId: `memAtt_${item.id}_${idx}` });
             }}
           >
             <Text style={styles.memoryAttachmentIcon}>{icon}</Text>
@@ -1489,17 +1865,32 @@ const ChatScreen: React.FC = () => {
       ) : (
         <TouchableOpacity
           style={styles.attachmentFile}
-          onPress={() => {
-            // Locally present → open directly with a system intent
+          onPress={async () => {
+            // Locally present? The cache could have been cleared —
+            // check that the file exists before firing the intent.
             if (att.uri) {
-              openFileWithIntent(att.uri.replace(/^file:\/\//, ''), att.mimeType || '');
-              return;
+              const localPath = att.uri.replace(/^file:\/\//, '');
+              const exists = await RNFS.exists(localPath).catch(() => false);
+              if (exists) {
+                openFileWithIntent(localPath, att.mimeType || '');
+                return;
+              }
+              // Cache gone → clear uri in state so the UI shows "tap to load"
+              setMessages(prev => prev.map(m => m.id === item.id
+                ? { ...m, attachments: m.attachments?.map(a =>
+                    a.serverPath === att.serverPath ? { ...a, uri: undefined } : a) }
+                : m));
+              if (Platform.OS === 'android') {
+                ToastAndroid.show('Cache leer — lade nach...', ToastAndroid.SHORT);
+              }
             }
-            // Otherwise: file_request → on file_response the file is
-            // saved AND opened (autoOpenPaths tracking).
+            // Re-download via file_request → on file_response the file is
+            // saved AND opened (autoOpenPaths tracking).
             if (att.serverPath) {
               autoOpenPaths.current.add(att.serverPath);
               rvs.send('file_request' as any, { serverPath: att.serverPath, requestId: item.id });
+            } else if (Platform.OS === 'android') {
+              ToastAndroid.show('Datei kann nicht nachgeladen werden (kein serverPath)', ToastAndroid.LONG);
             }
           }}
         >
@@ -1562,7 +1953,31 @@ const ChatScreen: React.FC = () => {
             <Text style={styles.bubbleTrashIcon}>{'🗑'}</Text>
           </TouchableOpacity>
         ) : null}
-        <Text style={styles.timestamp}>{time}</Text>
+        <View style={styles.statusRow}>
+          <Text style={styles.timestamp}>{time}</Text>
+          {isUser && item.deliveryStatus ? (
+            item.deliveryStatus === 'failed' && item.clientMsgId ? (
+              <TouchableOpacity
+                hitSlop={{top:6,bottom:6,left:6,right:6}}
+                onPress={() => retryFailedMessage(item.clientMsgId!)}
+              >
+                <Text style={styles.statusFailed}>{'⚠ tippen f. Retry'}</Text>
+              </TouchableOpacity>
+            ) : (
+              <Text style={
+                item.deliveryStatus === 'queued' ? styles.statusQueued :
+                item.deliveryStatus === 'sending' ? styles.statusSending :
+                item.deliveryStatus === 'sent' ? styles.statusSent :
+                /* delivered */ styles.statusDelivered
+              }>
+                {item.deliveryStatus === 'queued' ? '⏱' :
+                 item.deliveryStatus === 'sending' ? '⏳' :
+                 item.deliveryStatus === 'sent' ? '✓' :
+                 /* delivered */ '✓✓'}
+              </Text>
+            )
+          ) : null}
+        </View>
       </View>
     );
   };
@@ -1605,7 +2020,13 @@ const ChatScreen: React.FC = () => {
         {connectionState === 'connected' ? 'Verbunden' :
          connectionState === 'connecting' ? 'Verbinde...' : 'Getrennt'}
       </Text>
-      <TouchableOpacity onPress={() => setInboxVisible(true)} style={{marginLeft: 'auto', paddingHorizontal: 6}} hitSlop={{top:8,bottom:8,left:6,right:6}}>
+      <TouchableOpacity onPress={() => setThoughtsVisible(true)} style={{marginLeft: 'auto', paddingHorizontal: 6, flexDirection: 'row', alignItems: 'center'}} hitSlop={{top:8,bottom:8,left:6,right:6}}>
+        <Text style={{fontSize: 16}}>{'\uD83D\uDCAD'}</Text>
+        {thoughts.length > 0 ? (
+          <Text style={{color: '#8888AA', fontSize: 11, marginLeft: 3}}>{thoughts.length}</Text>
+        ) : null}
+      </TouchableOpacity>
+      <TouchableOpacity onPress={() => setInboxVisible(true)} style={{paddingHorizontal: 6}} hitSlop={{top:8,bottom:8,left:6,right:6}}>
         <Text style={{fontSize: 18}}>{'\uD83D\uDDC2\uFE0F'}</Text>
       </TouchableOpacity>
       <TouchableOpacity onPress={() => setSearchVisible(!searchVisible)} style={{paddingHorizontal: 6}} hitSlop={{top:8,bottom:8,left:6,right:6}}>
@@ -1698,15 +2119,26 @@ const ChatScreen: React.FC = () => {
         ref={flatListRef}
         inverted
         data={invertedMessages}
+        onScroll={(e) => {
+          // With an inverted FlatList, contentOffset.y > 0 = away from the
+          // "bottom" (= scrolling into older messages). Show the
+          // jump-down button from ~250px on.
+          const y = e.nativeEvent.contentOffset.y;
+          setShowJumpDown(y > 250);
+        }}
+        scrollEventThrottle={120}
         onScrollToIndexFailed={(info) => {
-          // FlatList doesn't know the item layout yet. First scroll roughly
-          // into the vicinity (average-item-height estimate), then retry
-          // precisely after 250ms.
+          // FlatList doesn't know the item layout yet. Scroll roughly into
+          // the vicinity (average-item-height estimate) and retry precisely
+          // ONCE after 300ms. More retries → endless cascade (every failed
+          // retry triggers the handler again → 3, 9, 27 ... scrolls in the
+          // pipeline = the "keeps jumping forever" bug).
           const offset = info.averageItemLength * info.index;
           try { flatListRef.current?.scrollToOffset({ offset, animated: false }); } catch {}
-          setTimeout(() => {
-            try { flatListRef.current?.scrollToIndex({ index: info.index, animated: true, viewPosition: 0.5 }); } catch {}
-          }, 250);
+          clearPendingScrollRetry();
+          pendingScrollRetry.current = setTimeout(() => {
+            pendingScrollRetry.current = null;
+            try { flatListRef.current?.scrollToIndex({ index: info.index, animated: true, viewPosition: 0 }); } catch {}
+          }, 300);
         }}
         keyExtractor={item => item.id}
         renderItem={renderMessage}
@@ -1773,6 +2205,24 @@ const ChatScreen: React.FC = () => {
         </View>
       )}

+      {/* Jump-to-bottom button — appears once you have scrolled away from
+          the newest message. With an inverted FlatList, scrollToOffset 0
+          == newest message visually at the bottom. */}
+      {showJumpDown && (
+        <TouchableOpacity
+          style={styles.jumpDownBtn}
+          activeOpacity={0.85}
+          onPress={() => {
+            try {
+              flatListRef.current?.scrollToOffset({ offset: 0, animated: true });
+            } catch {}
+            setShowJumpDown(false);
+          }}
+        >
+          <Text style={{color:'#fff', fontSize:18, fontWeight:'700'}}>{'↓'}</Text>
+        </TouchableOpacity>
+      )}
+
       {/* Input area */}
       <View style={styles.inputContainer}>
         {/* File buttons */}
@@ -1851,6 +2301,102 @@ const ChatScreen: React.FC = () => {
           </ErrorBoundary>
         ) : null}

+      {/* Thought stream — chronological log of ARIA's internal activity.
+          Bottom sheet (slide-up), 60% of screen height. Trash can to clear. */}
+      <Modal
+        visible={thoughtsVisible}
+        animationType="slide"
+        transparent
+        onRequestClose={() => setThoughtsVisible(false)}
+      >
+        <TouchableOpacity
+          style={{flex:1, backgroundColor:'rgba(0,0,0,0.5)', justifyContent:'flex-end'}}
+          activeOpacity={1}
+          onPress={() => setThoughtsVisible(false)}
+        >
+          <TouchableOpacity activeOpacity={1} style={{height:'60%', backgroundColor:'#0D0D1A', borderTopLeftRadius:16, borderTopRightRadius:16}}>
+            {/* Drag indicator */}
+            <View style={{alignItems:'center', paddingTop:8, paddingBottom:4}}>
+              <View style={{width:40, height:4, borderRadius:2, backgroundColor:'#2A2A3E'}} />
+            </View>
+            <View style={{flexDirection:'row', alignItems:'center', padding:14, borderBottomWidth:1, borderBottomColor:'#1E1E2E'}}>
+              <Text style={{color:'#FFD60A', fontWeight:'bold', fontSize:16, flex:1}}>
+                {'💭'} Gedanken-Stream {thoughts.length > 0 ? `(${thoughts.length})` : ''}
+              </Text>
+              {thoughts.length > 0 ? (
+                <TouchableOpacity
+                  onPress={() => {
+                    Alert.alert('Gedanken-Stream leeren?', `Alle ${thoughts.length} Eintraege werden geloescht.`, [
+                      { text: 'Abbrechen', style: 'cancel' },
+                      { text: 'Leeren', style: 'destructive', onPress: () => {
+                        setThoughts([]);
+                        lastThoughtKeyRef.current = '';
+                      } },
+                    ]);
+                  }}
+                  hitSlop={{top:8,bottom:8,left:8,right:8}}
+                  style={{paddingHorizontal:8}}
+                >
+                  <Text style={{fontSize:18}}>{'🗑'}</Text>
+                </TouchableOpacity>
+              ) : null}
+              <TouchableOpacity onPress={() => setThoughtsVisible(false)} hitSlop={{top:8,bottom:8,left:8,right:8}}>
+                <Text style={{color:'#8888AA', fontSize:24}}>×</Text>
+              </TouchableOpacity>
+            </View>
+            {thoughts.length === 0 ? (
+              <View style={{flex:1, alignItems:'center', justifyContent:'center', padding:24}}>
+                <Text style={{color:'#555570', fontSize:13, fontStyle:'italic', textAlign:'center'}}>
+                  Noch keine Gedanken aufgezeichnet.{'\n'}Sobald ARIA was tut, taucht's hier auf.
+                </Text>
+              </View>
+            ) : (
+              <FlatList
+                data={thoughts}
+                keyExtractor={(_, i) => `t_${i}`}
+                contentContainerStyle={{paddingVertical:8}}
+                renderItem={({ item, index }) => {
+                  const prev = index > 0 ? thoughts[index - 1] : null;
+                  // Long pause? → separator line with a minutes hint
+                  const gapMin = prev ? Math.floor((item.ts - prev.ts) / 60000) : 0;
+                  const showGap = gapMin >= 1;
+                  const time = new Date(item.ts).toLocaleTimeString('de-DE', {hour:'2-digit', minute:'2-digit', second:'2-digit'});
+                  const icon =
+                    item.activity === 'idle' ? '✓' :
+                    item.activity === 'tool' ? '🔧' :
+                    item.activity === 'assistant' ? '✍️' :
+                    item.activity === 'thinking' ? '💭' : '•';
+                  const label =
+                    item.activity === 'idle' ? 'fertig' :
+                    item.activity === 'tool' ? (item.tool || 'tool') :
+                    item.activity === 'assistant' ? 'schreibt' :
+                    item.activity === 'thinking' ? 'denkt' : item.activity;
+                  const isIdle = item.activity === 'idle';
+                  return (
+                    <View>
+                      {showGap ? (
+                        <View style={{flexDirection:'row', alignItems:'center', paddingHorizontal:16, paddingVertical:6}}>
+                          <View style={{flex:1, height:1, backgroundColor:'#1E1E2E'}} />
+                          <Text style={{color:'#555570', fontSize:10, paddingHorizontal:8}}>
+                            {gapMin < 60 ? `${gapMin} Min` : `${Math.floor(gapMin/60)}h ${gapMin%60}m`}
+                          </Text>
+                          <View style={{flex:1, height:1, backgroundColor:'#1E1E2E'}} />
+                        </View>
+                      ) : null}
+                      <View style={{flexDirection:'row', paddingHorizontal:16, paddingVertical:5}}>
+                        <Text style={{color:'#555570', fontSize:11, width:78}}>{time}</Text>
+                        <Text style={{fontSize:13, width:24}}>{icon}</Text>
+                        <Text style={{color: isIdle ? '#34C759' : '#E0E0F0', fontSize:13, flex:1}}>{label}</Text>
+                      </View>
+                    </View>
+                  );
+                }}
+              />
+            )}
+          </TouchableOpacity>
+        </TouchableOpacity>
+      </Modal>
+
       {/* Notes inbox — lists all memories from the current chat (special bubbles).
          Best of both worlds: only the memory IDs from the memorySaved bubbles of
          the current chat, plus the full browser below when the user wants more. */}

@@ -2111,6 +2657,35 @@ const styles = StyleSheet.create({
     marginTop: 4,
     alignSelf: 'flex-end',
   },
+  statusRow: {
+    flexDirection: 'row',
+    alignItems: 'center',
+    alignSelf: 'flex-end',
+    gap: 6,
+    marginTop: 4,
+  },
+  statusQueued: {
+    color: '#FFD60A', // yellow — waiting for a connection
+    fontSize: 11,
+  },
+  statusSending: {
+    color: 'rgba(255,255,255,0.5)',
+    fontSize: 11,
+  },
+  statusSent: {
+    color: 'rgba(255,255,255,0.6)',
+    fontSize: 12,
+  },
+  statusDelivered: {
+    color: '#34C759', // green — the Brain has replied
+    fontSize: 12,
+    fontWeight: '700',
+  },
+  statusFailed: {
+    color: '#FF3B30',
+    fontSize: 11,
+    fontWeight: '700',
+  },
   emptyContainer: {
     flex: 1,
     alignItems: 'center',

@@ -2313,6 +2888,23 @@ const styles = StyleSheet.create({
     color: '#555570',
     fontSize: 10,
   },
+  jumpDownBtn: {
+    position: 'absolute',
+    right: 16,
+    bottom: 80,
+    width: 44,
+    height: 44,
+    borderRadius: 22,
+    backgroundColor: '#0096FF',
+    alignItems: 'center',
+    justifyContent: 'center',
+    shadowColor: '#000',
+    shadowOffset: { width: 0, height: 2 },
+    shadowOpacity: 0.4,
+    shadowRadius: 4,
+    elevation: 5,
+    zIndex: 100,
+  },
   bubbleTrash: {
     position: 'absolute',
     top: 4,
@@ -868,7 +868,7 @@ const SettingsScreen: React.FC = () => {
         })()}
       </View>
     </Modal>
-    <ScrollView style={styles.container} contentContainerStyle={styles.content}>
+    <ScrollView style={styles.container} contentContainerStyle={styles.content} nestedScrollEnabled={true}>

       {currentSection === null && (
         <>
@@ -54,6 +54,18 @@ function _newRequestId(): string {
   return `brain_${Date.now().toString(36)}_${_nextId}`;
 }

+/** Mini query-string builder without URLSearchParams (the Hermes polyfill
+ * doesn't know URLSearchParams.set and crashes). Accepts an object with
+ * string/number/bool values; undefined/null/empty strings are skipped. */
+function _qs(params: Record<string, unknown>): string {
+  const parts: string[] = [];
+  for (const [k, v] of Object.entries(params)) {
+    if (v === undefined || v === null || v === '') continue;
+    parts.push(`${encodeURIComponent(k)}=${encodeURIComponent(String(v))}`);
+  }
+  return parts.length ? `?${parts.join('&')}` : '';
+}
+
 interface SendOpts {
   method?: 'GET' | 'POST' | 'PATCH' | 'DELETE';
   body?: AnyJson;

@@ -119,29 +131,31 @@ export const brainApi = {

   /** List all memories, optionally filtered by type. */
   listMemories(opts: { type?: string; limit?: number } = {}): Promise<Memory[]> {
-    const qs = new URLSearchParams();
-    if (opts.type) qs.set('type', opts.type);
-    qs.set('limit', String(opts.limit || 500));
-    return _send(`/memory/list?${qs.toString()}`);
+    const qs = _qs({ type: opts.type, limit: opts.limit || 500 });
+    return _send(`/memory/list${qs}`);
   },

   /** Full-text substring search. */
   searchText(q: string, opts: { type?: string; includePinned?: boolean; k?: number } = {}): Promise<Memory[]> {
-    const qs = new URLSearchParams({ q });
-    if (opts.type) qs.set('type', opts.type);
-    qs.set('include_pinned', String(opts.includePinned !== false));
-    qs.set('k', String(opts.k || 50));
-    return _send(`/memory/search-text?${qs.toString()}`);
+    const qs = _qs({
+      q,
+      type: opts.type,
+      include_pinned: opts.includePinned !== false,
+      k: opts.k || 50,
+    });
+    return _send(`/memory/search-text${qs}`);
   },

   /** Semantic search (embedder). */
   searchSemantic(q: string, opts: { type?: string; includePinned?: boolean; k?: number; threshold?: number } = {}): Promise<Memory[]> {
-    const qs = new URLSearchParams({ q });
-    if (opts.type) qs.set('type', opts.type);
-    qs.set('include_pinned', String(opts.includePinned !== false));
-    qs.set('k', String(opts.k || 10));
-    qs.set('score_threshold', String(opts.threshold ?? 0.30));
-    return _send(`/memory/search?${qs.toString()}`);
+    const qs = _qs({
+      q,
+      type: opts.type,
+      include_pinned: opts.includePinned !== false,
+      k: opts.k || 10,
+      score_threshold: opts.threshold ?? 0.30,
+    });
+    return _send(`/memory/search${qs}`);
   },

   /** Create a memory. */
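The `_qs` helper above is TypeScript, but its skip-empties behavior is easy to state on its own. The following is an illustrative Python sketch (the `qs` function is hypothetical, not part of the repo) showing the same rule: `None` and empty-string values are dropped, everything else is percent-encoded.

```python
from urllib.parse import quote

def qs(params: dict) -> str:
    """Build a query string, skipping None/empty values — mirrors the
    behaviour of the TypeScript `_qs` helper (sketch, not repo code)."""
    parts = [f"{quote(str(k), safe='')}={quote(str(v), safe='')}"
             for k, v in params.items()
             if v is not None and v != '']
    return f"?{'&'.join(parts)}" if parts else ''

print(qs({"q": "kaffee", "type": None, "k": 50}))  # ?q=kaffee&k=50
print(qs({"type": None}))                           # (empty string)
```

The empty-string return for an all-empty dict matters: appending it to `/memory/list` then yields a URL without a dangling `?`.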
+11 -2
@@ -134,10 +134,19 @@ META_TOOLS = [
         "function": {
             "name": "trigger_watcher",
             "description": (
-                "Lege einen Watcher-Trigger an — pollt alle paar Minuten eine Condition, "
+                "Lege einen Watcher-Trigger an — pollt eine Condition, "
                 "feuert wenn sie wahr wird (mit Throttle damit's nicht spammt). "
                 "Use-Case: 'sag bescheid wenn Disk unter 5GB', 'pingt mich wenn um 8 Uhr'. "
-                "Welche Variablen verfuegbar sind und ihre Bedeutung steht im System-Prompt."
+                "Welche Variablen verfuegbar sind und ihre Bedeutung steht im System-Prompt.\n\n"
+                "Fuer GPS-Trigger gibt es DREI Modi — waehle nach Use-Case:\n"
+                "- **`near(lat, lon, r)`**: SOLANGE im Radius (mit Throttle gegen Spam). "
+                "Use-Case: 'bin ich noch in der Naehe von X?'. Empfohlener throttle 300-3600s.\n"
+                "- **`entered_near(lat, lon, r)`**: EINMAL beim Eintritt (Uebergang draussen→innen). "
+                "Use-Case: Blitzer-Warner, Ankunfts-Erinnerung. Mit grossem r (z.B. 2000) "
+                "wird's zur Vorwarnung 2 km vor dem Ziel. Empfohlener throttle: kurz (30-60s, "
+                "nur gegen GPS-Jitter).\n"
+                "- **`left_near(lat, lon, r)`**: EINMAL beim Verlassen (Uebergang innen→draussen). "
+                "Use-Case: 'Hast du am Parkplatz X was vergessen?'. Empfohlener throttle: kurz."
            ),
            "parameters": {
                "type": "object",
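The three geofencing modes described in the tool text above differ only in how they combine the previous and current "inside the radius" state. A minimal sketch (illustrative, not the repo's implementation):

```python
def geofence_events(inside_prev: bool, inside_now: bool) -> dict:
    """Reduce the three GPS watcher modes to the previous/current
    inside-radius booleans:
    - near():         true while inside (the throttle limits repeats)
    - entered_near(): true only on the outside→inside transition
    - left_near():    true only on the inside→outside transition
    """
    return {
        "near": inside_now,
        "entered_near": (not inside_prev) and inside_now,
        "left_near": inside_prev and (not inside_now),
    }

# Driving into a radius: previous tick outside, current tick inside
print(geofence_events(False, True))
# {'near': True, 'entered_near': True, 'left_near': False}
```

This is why the transition modes need state tracking across ticks while plain `near()` does not.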
+68 -19
@@ -27,7 +27,12 @@ import watcher as watcher_mod

 logger = logging.getLogger(__name__)

-TICK_SEC = 30
+# Polling frequency of the background loop. Previously 30s → a car passing
+# through a 300m radius at >50 km/h could slip entirely between two ticks.
+# At 8s even an 18-second pass-through (120 km/h through 300m) is hit at
+# least once. The loop is cheap (a few file reads + AST eval), so this
+# doesn't strain the Brain.
+TICK_SEC = 8
 BRIDGE_URL = os.environ.get("BRIDGE_URL", "http://aria-bridge:8090")


@@ -159,7 +164,12 @@ async def _fire(trigger: dict, agent_factory) -> None:


 async def _tick(agent_factory) -> None:
-    """One check pass. Walks over all triggers, fires whatever is due."""
+    """One check pass. Walks over all triggers, fires whatever is due.
+
+    near() state tracking: entered_near/left_near need to know whether a
+    near() call was true on the previous tick (to detect the transition).
+    We keep that per trigger as a near_states dict in the manifest and
+    update it after every eval — even when nothing fires."""
     try:
         all_triggers = triggers_mod.list_triggers(active_only=True)
     except Exception as e:
@@ -168,35 +178,74 @@ async def _tick(agent_factory) -> None:
     if not all_triggers:
         return
     now = datetime.now(timezone.utc)
-    # Collect variables once per tick (not per trigger — disk stat is expensive)
-    try:
-        vars_ = watcher_mod.collect_variables()
-    except Exception as e:
-        logger.warning("collect_variables: %s", e)
-        vars_ = {}
-
-    # Watchers: update last_checked_at now (even when nothing fires, so
-    # the check interval is respected)
-    for t in all_triggers:
-        if t.get("type") == "watcher":
-            try:
-                t["last_checked_at"] = _now_iso()
-                triggers_mod.write(t["name"], t)
-            except Exception:
-                pass

     for trigger in all_triggers:
+        if trigger.get("type") != "watcher":
+            continue
         try:
-            if _should_fire(trigger, vars_, now):
+            # Collect variables per trigger — because of the prev_near_states closure
+            prev = trigger.get("near_states") or {}
+            vars_ = watcher_mod.collect_variables(prev_near_states=prev)
+
+            # Evaluate the condition via _should_fire (internally calls watcher.evaluate)
+            fired = _should_fire(trigger, vars_, now)
+
+            # Always update the state, fired or not — otherwise
+            # entered_near/left_near won't detect transitions
+            new_states = vars_.get("_new_near_states") or {}
+            trigger["near_states"] = new_states
+            trigger["last_checked_at"] = _now_iso()
+            try:
+                triggers_mod.write(trigger["name"], trigger)
+            except Exception as e:
+                logger.warning("trigger.write %s: %s", trigger.get("name"), e)
+
+            if fired:
                 # Fire as a separate task — if ARIA responds slowly,
                 # the next tick must not block
                 asyncio.create_task(_fire(trigger, agent_factory))
         except Exception as e:
             logger.warning("Trigger-Check %s: %s", trigger.get("name"), e)

+    # Timers (one-shot) — handled separately, without near state
+    timer_vars = None
+    for trigger in all_triggers:
+        if trigger.get("type") != "timer":
+            continue
+        try:
+            if timer_vars is None:
+                timer_vars = watcher_mod.collect_variables()
+            if _should_fire(trigger, timer_vars, now):
+                asyncio.create_task(_fire(trigger, agent_factory))
+        except Exception as e:
+            logger.warning("Timer-Check %s: %s", trigger.get("name"), e)
+
+
+# Module-level slot for the agent_factory so on-demand ticks (e.g. from
+# POST /triggers/check-now) have access without being routed through the
+# whole lifespan path.
+_AGENT_FACTORY = None
+
+
+async def tick_now() -> dict:
+    """Immediate trigger check — don't wait for the next loop tick.
+    Used when a fresh GPS update arrives: the Bridge calls this after
+    _persist_location so watchers using near() see the fresh value right
+    away instead of waiting up to TICK_SEC seconds."""
+    if _AGENT_FACTORY is None:
+        return {"ok": False, "error": "Background-Loop noch nicht gestartet"}
+    try:
+        await _tick(_AGENT_FACTORY)
+        return {"ok": True}
+    except Exception as exc:
+        logger.exception("tick_now: %s", exc)
+        return {"ok": False, "error": str(exc)}
+
+
 async def run_loop(agent_factory) -> None:
     """Endless loop — started + stopped by the main lifespan."""
+    global _AGENT_FACTORY
+    _AGENT_FACTORY = agent_factory
     logger.info("Trigger-Loop gestartet (TICK_SEC=%d)", TICK_SEC)
     while True:
         try:
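The arithmetic behind the `TICK_SEC = 8` comment above is easy to verify: the worst-case dwell time inside a circular geofence, for a vehicle crossing straight through the center, is diameter divided by speed. A quick check (illustrative, the helper name is made up):

```python
def dwell_seconds(radius_m: float, speed_kmh: float) -> float:
    """Time spent inside a circular geofence when driving straight
    through its centre: diameter / speed (km/h converted to m/s)."""
    return (2 * radius_m) / (speed_kmh / 3.6)

print(round(dwell_seconds(300, 120)))  # 18 — still above TICK_SEC = 8
print(round(dwell_seconds(300, 50)))   # 43 — was borderline at TICK_SEC = 30
```

Any pass shorter than one tick interval can fall entirely between two polls, which is the failure mode the lower tick rate (plus the `/triggers/check-now` push path) addresses.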
@@ -657,6 +657,16 @@ def triggers_list(active_only: bool = False):
     return {"triggers": triggers_mod.list_triggers(active_only=active_only)}


+@app.post("/triggers/check-now")
+async def triggers_check_now():
+    """Immediate trigger check instead of waiting for the next background
+    tick. Called by the Bridge after every location_update so GPS watchers
+    (near()) see the fresh value IMMEDIATELY — a car passing through a
+    300m radius otherwise has only ~20s inside, which can be less than
+    TICK_SEC."""
+    return await background_mod.tick_now()
+
+
 @app.get("/triggers/conditions")
 def triggers_conditions():
     """Available variables + functions for watcher conditions
+11 -10
@@ -164,15 +164,17 @@ def build_skills_section(skills: List[dict]) -> str:
                  "static-ffmpeg, beautifulsoup4, …). Falls etwas WIRKLICH nur via apt geht: "
                  "Stefan fragen ob es ins Brain-Dockerfile soll.")
     lines.append("")
-    lines.append("**Harte Regel — IMMER Skill anlegen wenn:** die Loesung erfordert eine "
-                 "pip-Library. Begruendung: Brain-Container hat keinen persistenten State "
-                 "ausser /data/skills/. Ohne Skill wuerde der Install bei jedem "
-                 "Container-Restart wiederholt.")
+    lines.append("**Goldene Regel: NIE ungefragt Skills anlegen.** Selbst wenn die Aufgabe "
+                 "eine pip-Library braucht — erst die Aufgabe loesen (mit Bash, `pip install` "
+                 "im Brain ist ok, oder Workaround), und nur wenn Stefan EXPLIZIT sagt "
+                 "'mach daraus einen Skill' / 'leg den als Skill an' / 'dafuer einen Skill' "
+                 "rufst du `skill_create` auf. Begruendung: Skill-Setup (venv + pip install) "
+                 "blockt das Brain bis zu 12 Minuten. Ein unaufgefordert angelegter Skill "
+                 "macht ARIA stumm und nervt Stefan jedes Mal.")
     lines.append("")
-    lines.append("**Sonst — Skill nur wenn alle vier zutreffen:**")
+    lines.append("**Wenn Stefan einen Skill explizit moechte, pruef:**")
     lines.append("")
-    lines.append("1. **Wiederkehrend** — die Aufgabe wird realistisch nochmal gestellt. "
-                 "Einmal-Faelle (\"wie spaet ist es jetzt\") kein Skill.")
+    lines.append("1. **Wiederkehrend** — die Aufgabe wird realistisch nochmal gestellt.")
     lines.append("2. **Nicht-trivial** — mehrere Schritte. Ein einzelner Shell-Befehl "
                  "(`date`, `hostname`, `ls`) ist KEIN Skill — das macht Bash direkt.")
     lines.append("3. **Parametrisierbar** — der Skill nimmt Eingaben (URL, Datei, Suchbegriff) "
@@ -180,9 +182,8 @@ def build_skills_section(skills: List[dict]) -> str:
     lines.append("4. **Wiederverwendbar als ganzes** — Stefan wuerde es zukuenftig per Name "
                  "ansprechen (\"mach mir den YouTube zu MP3\") statt jedes Mal zu erklaeren.")
     lines.append("")
-    lines.append("Wenn nichts installiert werden muss UND nicht alle vier zutreffen: einfach "
-                 "die Aufgabe loesen ohne Skill anzulegen. Stefan kann jederzeit sagen "
-                 "'bau daraus einen Skill'.")
+    lines.append("Wenn auch nur EINE der vier nicht zutrifft: hoeflich nachfragen ob er "
+                 "wirklich einen permanenten Skill will oder die Aufgabe einmalig reicht.")
     return "\n".join(lines)
|||||||
@@ -25,7 +25,7 @@ logger = logging.getLogger(__name__)
|
|||||||
RUNTIME_CONFIG_FILE = Path("/shared/config/runtime.json")
|
RUNTIME_CONFIG_FILE = Path("/shared/config/runtime.json")
|
||||||
ENV_MODEL = os.environ.get("BRAIN_MODEL", "claude-sonnet-4")
|
ENV_MODEL = os.environ.get("BRAIN_MODEL", "claude-sonnet-4")
|
||||||
PROXY_URL = os.environ.get("PROXY_URL", "http://proxy:3456")
|
PROXY_URL = os.environ.get("PROXY_URL", "http://proxy:3456")
|
||||||
PROXY_TIMEOUT_SEC = float(os.environ.get("PROXY_TIMEOUT_SEC", "300"))
|
PROXY_TIMEOUT_SEC = float(os.environ.get("PROXY_TIMEOUT_SEC", "1200"))
|
||||||
|
|
||||||
|
|
||||||
def _read_model_from_runtime() -> str:
|
def _read_model_from_runtime() -> str:
|
||||||
|
|||||||
+81
-7
@@ -25,7 +25,7 @@ import shutil
|
|||||||
import time
|
import time
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Any
|
from typing import Any, Dict, Optional
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
@@ -91,6 +91,12 @@ def _cpu_load_1min() -> float:
|
|||||||
|
|
||||||
_DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]
|
_DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]
|
||||||
|
|
||||||
|
# Maximales GPS-Alter fuer near()-Auswertung. Wenn die App laenger nicht
|
||||||
|
# gepushed hat (z.B. Tracking aus, Mobilfunk weg, App geschlossen), gilt
|
||||||
|
# die Position als "unbekannt" und near() liefert False — verhindert
|
||||||
|
# Phantom-Fires basierend auf einer wochen-alten Position.
|
||||||
|
NEAR_MAX_AGE_SEC = 5 * 60
|
||||||
|
|
||||||
|
|
||||||
def _gps_state() -> dict[str, Any]:
|
def _gps_state() -> dict[str, Any]:
|
||||||
"""Letzte bekannte Position aus /shared/state/location.json.
|
"""Letzte bekannte Position aus /shared/state/location.json.
|
||||||
@@ -119,8 +125,22 @@ def _user_activity_age() -> int:
     return int(time.time() - ts)
 
 
-def collect_variables() -> dict[str, Any]:
-    """Returns a current snapshot of all built-in variables + near() helpers."""
+def _near_key(lat: float, lon: float, radius_m: float) -> str:
+    """Stable key per near() call — for entered_near/left_near state
+    tracking per trigger per call site."""
+    return f"{float(lat):.6f},{float(lon):.6f},{int(float(radius_m))}"
+
+
+def collect_variables(prev_near_states: Optional[Dict[str, bool]] = None) -> Dict[str, Any]:
+    """Returns a current snapshot of all built-in variables + near() helpers.
+
+    prev_near_states: per-trigger state saved at the last eval (for
+    entered_near/left_near). Passed in by the background loop. After the
+    eval, read `vars_['_new_near_states']` to write the updated snapshot
+    back into the trigger manifest."""
+    if prev_near_states is None:
+        prev_near_states = {}
+    new_near_states: Dict[str, bool] = {}
     free_gb, free_pct = _disk_stats()
     now = datetime.now()
     gps = _gps_state()
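The key function matters because `entered_near`/`left_near` look up last-tick state by this string: numerically equal arguments must always land on the same dictionary key, however they were spelled in the condition. The one-liner, standalone (the module-level name `near_key` is ours):

```python
def near_key(lat: float, lon: float, radius_m: float) -> str:
    # Mirrors _near_key from the diff: coordinates fixed to 6 decimal
    # places (~0.1 m), radius truncated to whole metres — so int/float
    # spellings of the same call site share one state-dict key.
    return f"{float(lat):.6f},{float(lon):.6f},{int(float(radius_m))}"

key_a = near_key(48.1, 11.5, 300.0)
key_b = near_key(48.10, 11.50, 300)
```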
@@ -176,12 +196,17 @@ def collect_variables() -> dict[str, Any]:
 
     # Function helper — recognized by the parser as an ast.Call named "near".
     # Closure over the GPS values so eval needs no extra variables.
-    def _near(lat: float, lon: float, radius_m: float) -> bool:
-        """Haversine distance: True if the current position is < radius_m from the point."""
+    def _compute_near(lat: float, lon: float, radius_m: float) -> bool:
+        """Haversine distance: True if the current position is < radius_m from the point.
+        Plus an age guard: GPS data older than NEAR_MAX_AGE_SEC counts as
+        stale → False."""
         cur_lat = vars_.get("current_lat")
         cur_lon = vars_.get("current_lon")
         if cur_lat is None or cur_lon is None:
             return False
+        age = vars_.get("location_age_sec")
+        if isinstance(age, (int, float)) and age >= 0 and age > NEAR_MAX_AGE_SEC:
+            return False
         try:
             R = 6371000.0
             phi1 = math.radians(float(cur_lat))
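The hunk only shows the opening of the haversine computation (`R`, `phi1`) before the context window ends. For reference, a complete standalone version of the same great-circle formula — the function name `haversine_m` and the completed body are our reconstruction, not necessarily the project's exact code:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius, as in the diff
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

zero = haversine_m(48.0, 11.0, 48.0, 11.0)
d_small = haversine_m(48.0, 11.0, 48.001, 11.0)  # 0.001° of latitude ≈ 111 m
```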
@@ -194,7 +219,39 @@ def collect_variables() -> dict[str, Any]:
         except Exception:
             return False
 
+    def _near(lat: float, lon: float, radius_m: float) -> bool:
+        """True for as long as we are inside the radius. Plus state
+        tracking for entered_near/left_near — we remember the last
+        result so transitions can be detected."""
+        current = _compute_near(lat, lon, radius_m)
+        new_near_states[_near_key(lat, lon, radius_m)] = current
+        return current
+
+    def _entered_near(lat: float, lon: float, radius_m: float) -> bool:
+        """True ONLY on the outside → inside transition. Use case: fire
+        once when the user drives into the radius (speed-camera warning,
+        arrival reminder). With a larger radius = early warning."""
+        current = _compute_near(lat, lon, radius_m)
+        key = _near_key(lat, lon, radius_m)
+        new_near_states[key] = current
+        prev = bool(prev_near_states.get(key, False))
+        return current and not prev
+
+    def _left_near(lat: float, lon: float, radius_m: float) -> bool:
+        """True ONLY on the inside → outside transition. Use case: 'Did
+        you forget anything at parking lot X?' when driving away."""
+        current = _compute_near(lat, lon, radius_m)
+        key = _near_key(lat, lon, radius_m)
+        new_near_states[key] = current
+        prev = bool(prev_near_states.get(key, False))
+        return prev and not current
+
     vars_["near"] = _near
+    vars_["entered_near"] = _entered_near
+    vars_["left_near"] = _left_near
+    # Updated snapshot for the caller (the background loop writes this
+    # back per trigger so prev_near_states is correct on the next tick)
+    vars_["_new_near_states"] = new_near_states
     return vars_
 
 
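The three helpers are pure edge detectors over two boolean snapshots: the previous tick's state (`prev_near_states`) and the current one (`new_near_states`), which the caller feeds back in on the next evaluation. A stripped-down sketch of that tick handshake, with `tick` standing in for one watcher evaluation (all names here are ours):

```python
from typing import Dict, List, Tuple

def tick(inside_now: bool, prev: Dict[str, bool], key: str = "k") -> Tuple[bool, bool, Dict[str, bool]]:
    """One watcher tick: returns (entered, left, new_states)."""
    new = {key: inside_now}
    was_inside = prev.get(key, False)
    entered = inside_now and not was_inside   # outside → inside edge
    left = was_inside and not inside_now      # inside → outside edge
    return entered, left, new

# Drive past a geofence: outside, inside, inside, outside.
states: Dict[str, bool] = {}
fired: List[str] = []
for inside in [False, True, True, False]:
    entered, left, states = tick(inside, states)  # new states become next prev
    if entered:
        fired.append("entered")
    if left:
        fired.append("left")
```

Each transition fires exactly once even though the position stays inside the radius for two consecutive ticks.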
@@ -236,8 +293,25 @@ def describe_functions() -> list[dict]:
     {
         "name": "near",
         "signature": "near(lat, lon, radius_m)",
-        "desc": "True wenn die aktuelle GPS-Position innerhalb von radius_m Metern "
-                "vom Punkt (lat, lon) liegt. Haversine. Bei unbekannter Position: False.",
+        "desc": "True SOLANGE die aktuelle GPS-Position innerhalb von radius_m "
+                "Metern vom Punkt (lat, lon) liegt. Feuert wiederholt (mit throttle). "
+                "Use-Case: 'bin noch in der Naehe von X?'. "
+                "Haversine. Bei unbekannter oder > 5min alter Position: False.",
+    },
+    {
+        "name": "entered_near",
+        "signature": "entered_near(lat, lon, radius_m)",
+        "desc": "True NUR im Moment des Eintritts in den Radius (Uebergang "
+                "draussen → innen). Use-Case: einmaliger Fire bei Ankunft / "
+                "Blitzer-Warnung. Mit grossem Radius (z.B. 2000) wird das zur "
+                "Vorwarnung bevor man am Punkt ist.",
+    },
+    {
+        "name": "left_near",
+        "signature": "left_near(lat, lon, radius_m)",
+        "desc": "True NUR im Moment des Verlassens des Radius (Uebergang "
+                "innen → draussen). Use-Case: 'Hast du am Parkplatz X was "
+                "vergessen?' beim Wegfahren.",
     },
 ]
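These entries document callables that live inside the variables dict, so a trigger condition is just an expression evaluated against that dict. The production code routes conditions through an ast-based parser (per the "ast.Call" comment in the hunk above); the following sketch skips that validation and uses a restricted eval purely to illustrate the shape of a condition — all names and stub values here are ours:

```python
def evaluate(condition: str, variables: dict) -> bool:
    # Sketch only: no builtins available to the expression. The real
    # system validates the expression via ast before anything runs.
    return bool(eval(condition, {"__builtins__": {}}, variables))

vars_ = {
    "hour_of_day": 8,
    "entered_near": lambda lat, lon, r: True,  # stub: pretend we just arrived
}
hit = evaluate("entered_near(48.137, 11.575, 300) and hour_of_day >= 7", vars_)
```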
+139 -15
@@ -25,6 +25,7 @@ import time
 import sys
 import tempfile
 import uuid
+from collections import OrderedDict
 from pathlib import Path
 from typing import Optional
 
@@ -475,6 +476,13 @@ class ARIABridge:
         self.current_mode = self._load_persisted_mode()
         self.running = False
 
+        # Idempotency: most recently seen clientMsgIds (generated app-side).
+        # On reconnect/retry the app sends the same id again — we answer
+        # with another ACK but do NOT forward to the brain twice.
+        # OrderedDict as a capped FIFO (insertion order).
+        self._seen_client_msg_ids: "OrderedDict[str, float]" = OrderedDict()
+        self._SEEN_CLIENT_MSG_LIMIT = 200
+
         # Components (TTS: F5-TTS remote on the gamebox, local TTS was removed)
         self.tts_enabled = True
         self.xtts_voice = ""
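The capped-FIFO idea relies on `OrderedDict` preserving insertion order, so `popitem(last=False)` always evicts the oldest id. A minimal sketch of that capping behaviour, with a limit of 3 instead of the diff's 200:

```python
import time
from collections import OrderedDict

LIMIT = 3  # the diff uses 200
seen: "OrderedDict[str, float]" = OrderedDict()

def remember(msg_id: str) -> None:
    seen[msg_id] = time.time()
    while len(seen) > LIMIT:
        seen.popitem(last=False)  # FIFO: drop the oldest id first

for mid in ["a", "b", "c", "d"]:
    remember(mid)
# "a" was inserted first, so it is the one evicted once "d" arrives.
```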
@@ -938,7 +946,12 @@ class ARIABridge:
     def _persist_location(self, location: Optional[dict]) -> None:
         """Stores the last known GPS position for watchers.
         Expects {lat, lon} or {lat, lng}. Non-dicts and missing
-        coordinates are ignored."""
+        coordinates are ignored.
+
+        Plus: immediately kicks off an on-demand trigger check in the
+        brain (POST /triggers/check-now). Without that, the watcher loop
+        waits up to TICK_SEC seconds — when a car passes through a
+        300m radius (18-43s inside), that can miss the trigger."""
         if not isinstance(location, dict):
             return
         try:
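The 18-43 s figure in the docstring follows from the chord through a 300 m circle: at most 600 m of road inside, divided by driving speed. A quick check, assuming city and motorway speeds of 50 and 120 km/h (the speeds are our assumption; they reproduce the quoted range):

```python
def seconds_inside(radius_m: float, speed_kmh: float) -> float:
    """Worst-case dwell time when driving straight through the circle's centre."""
    chord_m = 2 * radius_m        # diameter = longest possible path inside
    speed_ms = speed_kmh / 3.6    # km/h → m/s
    return chord_m / speed_ms

slow = seconds_inside(300, 50)    # city speed
fast = seconds_inside(300, 120)   # motorway speed
```

Off-centre passes are shorter still, which is why waiting a full watcher tick risks missing the fence entirely.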
@@ -950,9 +963,31 @@ class ARIABridge:
                 "lat": float(lat),
                 "lon": float(lon),
             })
+        except Exception:
+            return
+        # Fire-and-forget: brain on-demand tick. If the brain does not
+        # answer or is slow, this does not block the GPS path.
+        try:
+            asyncio.create_task(self._trigger_brain_check_now())
         except Exception:
             pass
 
+    async def _trigger_brain_check_now(self) -> None:
+        """Poke the brain endpoint POST /triggers/check-now."""
+        brain_url = os.environ.get("BRAIN_URL", "http://aria-brain:8080")
+        def _post():
+            try:
+                req = urllib.request.Request(
+                    f"{brain_url}/triggers/check-now",
+                    data=b"", method="POST",
+                    headers={"Content-Type": "application/json"},
+                )
+                with urllib.request.urlopen(req, timeout=8) as r:
+                    return r.status
+            except Exception:
+                return None
+        await asyncio.get_event_loop().run_in_executor(None, _post)
+
     def _persist_user_activity(self) -> None:
         """Marks that the user just did something (chat/voice).
         Watchers: last_user_message_ago_sec is based on this."""
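The pattern above — a coroutine that pushes the blocking urllib call onto a worker thread via `run_in_executor`, detached with `create_task` — is what keeps the GPS path responsive. A stripped-down, runnable sketch with a dummy blocking call standing in for the real POST (all names ours):

```python
import asyncio
import time

def blocking_post() -> int:
    """Stands in for the blocking urllib.request.urlopen(...) call."""
    time.sleep(0.2)
    return 200

async def poke_brain() -> int:
    # Run the blocking call on a worker thread so the event loop stays free.
    return await asyncio.get_running_loop().run_in_executor(None, blocking_post)

async def main() -> float:
    t0 = time.monotonic()
    task = asyncio.create_task(poke_brain())  # detached: caller moves on at once
    scheduled_after = time.monotonic() - t0   # near zero — nothing blocked
    assert await task == 200                  # the result can still be joined later
    return scheduled_after

scheduled_after = asyncio.run(main())
```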
@@ -1281,10 +1316,12 @@ class ARIABridge:
         self._pending_files_flush_task = None
         text = self._build_pending_files_message(user_text)
         self._pending_files = []
-        await self.send_to_core(text, source="app-file+chat")
+        # create_task instead of await — otherwise the RVS recv loop blocks
+        # until the brain is done (see the chat handler above).
+        asyncio.create_task(self.send_to_core(text, source="app-file+chat"))
         return True
 
-    async def send_to_core(self, text: str, source: str = "bridge") -> None:
+    async def send_to_core(self, text: str, source: str = "bridge", client_msg_id: Optional[str] = None) -> None:
         """Sends text to aria-brain (HTTP /chat) and broadcasts the reply.
 
         Non-streaming: we wait until the brain is done, then we push
@@ -1298,8 +1335,13 @@ class ARIABridge:
         logger.info("[brain] chat ← %s '%s'", source, text[:80])
 
         # Log the user message to chat_backup.jsonl — read as the history
-        # source on app reconnect / diagnostic reload.
-        self._append_chat_backup({"role": "user", "text": text, "source": source})
+        # source on app reconnect / diagnostic reload. Store clientMsgId
+        # so the app can dedup its local bubble on chat_history_response
+        # (otherwise it vanishes after an offline→online race).
+        entry: dict = {"role": "user", "text": text, "source": source}
+        if client_msg_id:
+            entry["clientMsgId"] = client_msg_id
+        self._append_chat_backup(entry)
 
         # agent_activity → thinking. _emit_activity instead of direct _send_to_rvs
         # so the state cache is right for the later idle dedup.
@@ -1311,8 +1353,10 @@ class ARIABridge:
                     url, data=payload, method="POST",
                     headers={"Content-Type": "application/json"},
                 )
-                # Cold start can take long, 5 min timeout
-                with urllib.request.urlopen(req, timeout=300) as resp:
+                # 20 min timeout — long multi-tool workflows (maps, PDFs,
+                # many curl calls) need it. 5 min was chronically too
+                # tight and cut ARIA off mid-work.
+                with urllib.request.urlopen(req, timeout=1200) as resp:
                     return resp.status, resp.read().decode("utf-8", errors="ignore")
             except Exception as exc:
                 return None, str(exc)
@@ -1503,6 +1547,36 @@ class ARIABridge:
             except Exception:
                 break
 
+    async def _send_chat_ack(self, client_msg_id: Optional[str]) -> None:
+        """Confirms receipt of a chat/audio message to the app.
+        The app uses this for delivery status (✓ = sent). Without the ACK
+        the app would retry after a timeout — guards against loss on
+        network hiccups.
+        """
+        if not client_msg_id:
+            return
+        await self._send_to_rvs({
+            "type": "chat_ack",
+            "payload": {"clientMsgId": client_msg_id},
+            "timestamp": int(asyncio.get_event_loop().time() * 1000),
+        })
+
+    def _is_duplicate_client_msg(self, client_msg_id: Optional[str]) -> bool:
+        """Checks whether we already processed this clientMsgId.
+        If yes → True (caller should ACK but NOT forward to the brain).
+        If no → add it to the seen cache + return False.
+        """
+        if not client_msg_id:
+            return False
+        if client_msg_id in self._seen_client_msg_ids:
+            logger.info("[rvs] Idempotency: cmid=%s already processed, ignoring",
+                        client_msg_id)
+            return True
+        self._seen_client_msg_ids[client_msg_id] = time.time()
+        # Capping: evict the oldest entries
+        while len(self._seen_client_msg_ids) > self._SEEN_CLIENT_MSG_LIMIT:
+            self._seen_client_msg_ids.popitem(last=False)
+        return False
+
     async def _handle_rvs_message(self, raw_message: str) -> None:
         """Processes messages from the app (via RVS).
 
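Taken together, ACK-always plus forward-once gives at-least-once delivery on the wire but exactly-once processing at the brain. A compact sketch of that flow — first send forwarded, retry only acknowledged (all names here are ours):

```python
from collections import OrderedDict
from typing import List

seen: "OrderedDict[str, bool]" = OrderedDict()
forwarded: List[str] = []
acks: List[str] = []

def handle_chat(client_msg_id: str, text: str) -> None:
    acks.append(client_msg_id)     # ACK always, even for retries
    if client_msg_id in seen:
        return                     # duplicate: no second forward to the brain
    seen[client_msg_id] = True
    forwarded.append(text)         # first sighting: forward once

handle_chat("cmid-1", "hello")
handle_chat("cmid-1", "hello")     # app retry after a network hiccup
```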
@@ -1527,6 +1601,13 @@ class ARIABridge:
             sender = payload.get("sender", "")
             if sender in ("aria", "stt"):
                 return
+            # Delivery ACK: always send it back (even on an idempotency
+            # hit) so the app can set the status to 'sent'. The idempotency
+            # check however PREVENTS the double forward to the brain.
+            client_msg_id = payload.get("clientMsgId") or None
+            await self._send_chat_ack(client_msg_id)
+            if self._is_duplicate_client_msg(client_msg_id):
+                return
             text = payload.get("text", "")
             # Set the voice override for follow-up messages — valid until the next
            # chat event. Empty string "" = explicitly the default voice (clear override).
@@ -1562,7 +1643,16 @@ class ARIABridge:
                         " [BARGE-IN]" if interrupted else "",
                         " [GPS]" if location else "",
                         text[:80])
-            await self.send_to_core(core_text, source="app" + (" [barge-in]" if interrupted else ""))
+            # NO await: send_to_core can take 20 min. If we await here,
+            # the `async for raw_message in ws` loop blocks that long
+            # → the RVS server drops us after ~4 min idle. As a task:
+            # the brain runs in the background, RVS recv stays serviceable,
+            # pings get answered, the connection stays alive.
+            asyncio.create_task(self.send_to_core(
+                core_text,
+                source="app" + (" [barge-in]" if interrupted else ""),
+                client_msg_id=client_msg_id,
+            ))
             return
 
         if msg_type == "cancel_request":
@@ -1738,7 +1828,8 @@ class ARIABridge:
 
         if not file_b64:
             text = f"Stefan hat eine Datei gesendet ({file_name}, {file_type}) aber die Daten sind leer angekommen."
-            await self.send_to_core(text, source="app-file")
+            # create_task instead of await — RVS recv must not block
+            asyncio.create_task(self.send_to_core(text, source="app-file"))
             return
 
         if file_type.startswith("image/"):
@@ -2126,6 +2217,12 @@ class ARIABridge:
 
         elif msg_type == "audio":
             # Audio from the app → decode → STT → to aria-core
+            # Delivery ACK + idempotency as with chat — the app also uses
+            # the ACKs for voice bubbles (status on the bubble: ✓ sent).
+            client_msg_id = payload.get("clientMsgId") or None
+            await self._send_chat_ack(client_msg_id)
+            if self._is_duplicate_client_msg(client_msg_id):
+                return
             audio_b64 = payload.get("base64", "")
             mime_type = payload.get("mimeType", "audio/mp4")
             duration_ms = payload.get("durationMs", 0)
@@ -2156,7 +2253,8 @@ class ARIABridge:
                         " [GPS]" if location else "",
                         f" reqId={audio_request_id[:16]}" if audio_request_id else "")
             asyncio.create_task(self._process_app_audio(
-                audio_b64, mime_type, interrupted, audio_request_id, location))
+                audio_b64, mime_type, interrupted, audio_request_id, location,
+                client_msg_id=client_msg_id))
 
         elif msg_type == "stt_response":
             # The whisper-bridge's answer to our stt_request
@@ -2215,7 +2313,8 @@ class ARIABridge:
     async def _process_app_audio(self, audio_b64: str, mime_type: str,
                                  interrupted: bool = False,
                                  audio_request_id: str = "",
-                                 location: Optional[dict] = None) -> None:
+                                 location: Optional[dict] = None,
+                                 client_msg_id: Optional[str] = None) -> None:
         """App audio → STT → aria-core. Primarily via whisper-bridge (RVS), local fallback.
 
         interrupted=True if the user recorded while ARIA was still speaking/thinking
@@ -2271,7 +2370,9 @@ class ARIABridge:
 
             # Then to the brain — which blocks synchronously until ARIA is done.
             core_text = self._build_core_text(text, interrupted, location)
-            await self.send_to_core(core_text, source="app-voice" + (" [barge-in]" if interrupted else ""))
+            await self.send_to_core(core_text,
+                                    source="app-voice" + (" [barge-in]" if interrupted else ""),
+                                    client_msg_id=client_msg_id)
         else:
             logger.info("[rvs] No speech detected — ignored")
@@ -2418,17 +2519,22 @@ class ARIABridge:
         status = await asyncio.get_event_loop().run_in_executor(None, _do_request)
         logger.info("[cancel] Diagnostic /api/cancel: %s", status)
 
-    async def _emit_activity(self, activity: str, tool: str = "") -> None:
+    async def _emit_activity(self, activity: str, tool: str = "", force: bool = False) -> None:
         """Sends agent_activity to the app — only when the state has changed.
 
         Trailing agent events after chat:final are suppressed for 3s
-        (only 'idle' always gets through)."""
+        (only 'idle' always gets through).
+
+        force=True: no state dedup — used by the proxy tool hook so that
+        repeated identical tool calls (e.g. 3x Bash in a row) stay
+        visible as separate entries in the thought stream."""
         if activity != "idle" and self._last_chat_final_at > 0:
             since_final = asyncio.get_event_loop().time() - self._last_chat_final_at
             if since_final < 3.0:
                 return
         state = (activity, tool)
-        if state == self._last_activity_state:
+        if not force and state == self._last_activity_state:
             return
         self._last_activity_state = state
         await self._send_to_rvs({
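The gating in `_emit_activity` is a small pure decision over (state, last state, time since final, force). Sketched as a standalone function so the three rules are visible in isolation — the name `should_emit` and the `since_final` parameter (standing in for the `_last_chat_final_at` bookkeeping) are ours:

```python
from typing import Optional, Tuple

def should_emit(activity: str, tool: str, last_state: Optional[tuple],
                since_final: float, force: bool = False) -> Tuple[bool, Optional[tuple]]:
    """Mirrors the gating in _emit_activity: returns (emit?, new_last_state)."""
    # Rule 1: suppress trailing non-idle events for 3 s after chat:final.
    if activity != "idle" and 0 <= since_final < 3.0:
        return False, last_state
    state = (activity, tool)
    # Rule 2: dedup identical consecutive states — unless force is set
    # (the proxy tool hook forces, so repeated Bash calls each show up).
    if not force and state == last_state:
        return False, last_state
    return True, state

emit1, st = should_emit("tool", "Bash", None, since_final=10.0)
emit2, st = should_emit("tool", "Bash", st, since_final=10.0)            # deduped
emit3, st = should_emit("tool", "Bash", st, since_final=10.0, force=True)  # forced through
emit4, st = should_emit("thinking", "", st, since_final=1.0)             # trailing, suppressed
```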
@@ -2576,6 +2682,24 @@ class ARIABridge:
                 self._handle_trigger_fired(reply, trigger_name, ttype, events)
             )
             await _send_response(writer, 200, {"ok": True})
+        elif method == "POST" and path == "/internal/agent-activity":
+            # Fired by the proxy on every Claude-Code tool_use event
+            # (Bash, Read, Edit, Grep, ...). We mirror it as an RVS
+            # agent_activity to app+diagnostic so the thought stream
+            # can follow along live.
+            try:
+                data = json.loads(body.decode("utf-8", "ignore"))
+            except Exception as exc:
+                await _send_response(writer, 400, {"error": f"bad json: {exc}"})
+                return
+            tool = (data.get("tool") or "").strip()
+            if not tool:
+                await _send_response(writer, 400, {"error": "tool erforderlich"})
+                return
+            # Force emit (no dedup): the user should see EVERY tool call
+            # even if the same name comes twice in a row.
+            asyncio.create_task(self._emit_activity("tool", tool, force=True))
+            await _send_response(writer, 200, {"ok": True})
         elif method == "POST" and path == "/internal/delete-chat-message":
             try:
                 data = json.loads(body.decode("utf-8", "ignore"))
@@ -301,6 +301,7 @@
                 <input type="checkbox" id="gps-debug-toggle" onchange="toggleGpsDebug()" style="margin-right:4px;vertical-align:middle;">
                 GPS-Position einblenden
             </label>
+            <button class="btn secondary" onclick="openThoughtStream()" id="btn-thoughts" title="Gedanken-Stream — was ARIA intern tut" style="padding:4px 10px;font-size:11px;">💭 Gedanken <span id="thoughts-count" style="color:#8888AA;"></span></button>
             <button class="btn secondary" onclick="toggleChatFullscreen()" id="btn-chat-fs" style="padding:4px 10px;font-size:11px;">Vollbild</button>
         </div>
     </div>
@@ -342,6 +343,22 @@
         </div>
     </div>
 
+    <!-- Thought-stream modal — chronological log of what ARIA does
+         internally. Centered modal (max 720px wide), list with auto-scroll
+         to the end when new entries come in. -->
+    <div id="thought-stream-modal" style="display:none;position:fixed;top:0;left:0;width:100vw;height:100vh;background:rgba(0,0,0,0.7);z-index:1100;align-items:center;justify-content:center;padding:24px;" onclick="if(event.target===this) closeThoughtStream();">
+        <div style="background:#0D0D1A;border:1px solid #1E1E2E;border-radius:12px;width:100%;max-width:720px;height:70vh;display:flex;flex-direction:column;">
+            <div style="display:flex;align-items:center;padding:14px;border-bottom:1px solid #1E1E2E;">
+                <h2 style="margin:0;color:#FFD60A;flex:1;font-size:16px;">💭 Gedanken-Stream <span id="thoughts-count-modal" style="color:#8888AA;font-weight:normal;"></span></h2>
+                <button class="btn secondary" onclick="clearThoughtStream()" id="btn-clear-thoughts" title="Stream leeren" style="padding:4px 10px;font-size:11px;color:#FF3B30;border-color:#FF3B30;margin-right:6px;">🗑 Leeren</button>
+                <button class="btn secondary" onclick="closeThoughtStream()" style="padding:4px 12px;">Schliessen</button>
+            </div>
+            <div id="thought-stream-list" style="flex:1;overflow-y:auto;padding:8px 0;font-size:13px;font-family:monospace;">
+                <!-- filled by renderThoughtStream() -->
+            </div>
+        </div>
+    </div>
+
     <!-- Sessions + the old brain viewer removed — memories now run
          entirely through the Gehirn tab against the vector DB in aria-brain. -->
 
@@ -2166,6 +2183,9 @@
         }
 
         function updateThinkingIndicator(msg) {
+            // Feed the thought stream — EVERY event (including idle as ✓ done)
+            pushThought(msg.activity || '', msg.tool || '');
+
             const indicators = [
                 document.getElementById('thinking-indicator'),
                 document.getElementById('thinking-indicator-fs'),
@@ -2202,6 +2222,114 @@
|
|||||||
}, 120000);
|
}, 120000);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ── Gedanken-Stream ─────────────────────────────
|
||||||
|
// Chronologisches Log von agent_activity-Events. Wird in localStorage
|
||||||
|
// persistiert (ueberlebt Page-Reload), capped auf MAX_THOUGHTS.
|
||||||
|
const THOUGHT_STORAGE_KEY = 'aria_thought_stream';
|
||||||
|
const MAX_THOUGHTS = 500;
|
||||||
|
let thoughtStream = [];
|
||||||
|
let lastThoughtKey = '';
|
||||||
|
let _thoughtSaveTimer = null;
|
||||||
|
|
||||||
|
function loadThoughtStream() {
|
||||||
|
try {
|
||||||
|
const raw = localStorage.getItem(THOUGHT_STORAGE_KEY);
|
||||||
|
if (!raw) return;
|
||||||
|
const parsed = JSON.parse(raw);
|
||||||
|
if (Array.isArray(parsed)) thoughtStream = parsed.slice(-MAX_THOUGHTS);
|
||||||
|
} catch {}
|
||||||
|
updateThoughtsBadge();
|
||||||
|
}
|
||||||
|
|
||||||
|
function persistThoughtStream() {
|
||||||
|
if (_thoughtSaveTimer) clearTimeout(_thoughtSaveTimer);
|
||||||
|
_thoughtSaveTimer = setTimeout(() => {
|
||||||
|
try {
|
||||||
|
if (thoughtStream.length === 0) localStorage.removeItem(THOUGHT_STORAGE_KEY);
|
||||||
|
else localStorage.setItem(THOUGHT_STORAGE_KEY, JSON.stringify(thoughtStream.slice(-MAX_THOUGHTS)));
|
||||||
|
} catch {}
|
||||||
|
}, 500);
|
||||||
|
}
|
||||||
|
|
||||||
|
function pushThought(activity, tool) {
|
||||||
|
// Dedup gegen direkt aufeinanderfolgende identische Events. Tool-
|
||||||
|
// Events NIE dedupen — drei Bash-Calls in Folge sollen drei Eintraege
|
||||||
|
// ergeben, nicht einen.
|
||||||
|
const key = `${activity}|${tool || ''}`;
|
||||||
|
if (activity !== 'tool' && key === lastThoughtKey) return;
|
||||||
|
lastThoughtKey = key;
|
||||||
|
thoughtStream.push({ ts: Date.now(), activity, tool: tool || '' });
|
||||||
|
if (thoughtStream.length > MAX_THOUGHTS) thoughtStream = thoughtStream.slice(-MAX_THOUGHTS);
|
||||||
|
updateThoughtsBadge();
|
||||||
|
// Wenn das Modal offen ist: live nachrendern + ans Ende scrollen
|
||||||
|
const modal = document.getElementById('thought-stream-modal');
|
||||||
|
if (modal && modal.style.display !== 'none') renderThoughtStream(true);
|
||||||
|
persistThoughtStream();
|
||||||
|
}
|
||||||
|
|
||||||
|
function updateThoughtsBadge() {
|
||||||
|
const a = document.getElementById('thoughts-count');
|
||||||
|
if (a) a.textContent = thoughtStream.length ? `(${thoughtStream.length})` : '';
|
||||||
|
const b = document.getElementById('thoughts-count-modal');
|
||||||
|
if (b) b.textContent = thoughtStream.length ? `(${thoughtStream.length})` : '';
|
||||||
|
}
|
||||||
|
|
||||||
|
function openThoughtStream() {
|
||||||
|
const modal = document.getElementById('thought-stream-modal');
|
||||||
|
if (!modal) return;
|
||||||
|
modal.style.display = 'flex';
|
||||||
|
renderThoughtStream(true);
|
||||||
|
}
|
||||||
|
|
||||||
|
function closeThoughtStream() {
|
||||||
|
const modal = document.getElementById('thought-stream-modal');
|
||||||
|
if (modal) modal.style.display = 'none';
|
||||||
|
}
|
||||||
|
|
||||||
|
function clearThoughtStream() {
|
||||||
|
if (thoughtStream.length === 0) return;
|
||||||
|
if (!confirm(`Gedanken-Stream leeren? ${thoughtStream.length} Eintraege werden geloescht.`)) return;
|
||||||
|
thoughtStream = [];
|
||||||
|
lastThoughtKey = '';
|
||||||
|
updateThoughtsBadge();
|
||||||
|
renderThoughtStream(false);
|
||||||
|
persistThoughtStream();
|
||||||
|
}
|
||||||
|
|
||||||
|
function _escapeHtml(s) {
|
||||||
|
return String(s).replace(/[&<>"']/g, c => ({'&':'&','<':'<','>':'>','"':'"',"'":'''}[c]));
|
||||||
|
}
|
||||||
|
|
||||||
|
function renderThoughtStream(autoscroll) {
  const list = document.getElementById('thought-stream-list');
  if (!list) return;
  if (thoughtStream.length === 0) {
    list.innerHTML = '<div style="padding:24px;text-align:center;color:#555570;font-style:italic;">Noch keine Gedanken aufgezeichnet.<br>Sobald ARIA was tut, taucht\'s hier auf.</div>';
    return;
  }
  const rows = [];
  let prevTs = 0;
  for (const t of thoughtStream) {
    const gapMin = prevTs ? Math.floor((t.ts - prevTs) / 60000) : 0;
    if (gapMin >= 1) {
      const label = gapMin < 60 ? `${gapMin} Min` : `${Math.floor(gapMin/60)}h ${gapMin%60}m`;
      rows.push(`<div style="display:flex;align-items:center;padding:6px 16px;gap:8px;"><div style="flex:1;height:1px;background:#1E1E2E;"></div><span style="color:#555570;font-size:10px;">${label}</span><div style="flex:1;height:1px;background:#1E1E2E;"></div></div>`);
    }
    prevTs = t.ts;
    const d = new Date(t.ts);
    const time = `${String(d.getHours()).padStart(2,'0')}:${String(d.getMinutes()).padStart(2,'0')}:${String(d.getSeconds()).padStart(2,'0')}`;
    let icon, label, color;
    if (t.activity === 'idle')          { icon = '✓';  label = 'fertig';         color = '#34C759'; }
    else if (t.activity === 'tool')     { icon = '🔧'; label = t.tool || 'tool'; color = '#E0E0F0'; }
    else if (t.activity === 'assistant'){ icon = '✍️'; label = 'schreibt';       color = '#E0E0F0'; }
    else if (t.activity === 'thinking') { icon = '💭'; label = 'denkt';          color = '#E0E0F0'; }
    else                                { icon = '•';  label = t.activity;       color = '#E0E0F0'; }
    rows.push(`<div style="display:flex;padding:4px 16px;align-items:baseline;"><span style="color:#555570;width:78px;font-size:11px;">${time}</span><span style="width:24px;">${icon}</span><span style="color:${color};flex:1;">${_escapeHtml(label)}</span></div>`);
  }
  list.innerHTML = rows.join('');
  if (autoscroll) list.scrollTop = list.scrollHeight;
}

// ── XTTS Panel ─────────────────────────────
function renderVoiceList(voices) {
  const box = document.getElementById('xtts-voice-list');

@@ -4696,6 +4824,7 @@
  });
}

loadThoughtStream();
connectWS();
</script>
</body>
@@ -12,8 +12,10 @@ services:
        DIST=$$(find /usr/local/lib -path '*/claude-max-api-proxy/dist' -type d | head -1) &&
        sed -i 's/startServer({ port })/startServer({ port, host: process.env.HOST || \"127.0.0.1\" })/' $$DIST/server/standalone.js &&
        sed -i 's/\"--no-session-persistence\",/\"--no-session-persistence\",\"--dangerously-skip-permissions\",/' $$DIST/subprocess/manager.js &&
        sed -i 's/const DEFAULT_TIMEOUT = 300000;/const DEFAULT_TIMEOUT = 1200000;/' $$DIST/subprocess/manager.js &&
        cp /proxy-patches/openai-to-cli.js $$DIST/adapter/openai-to-cli.js &&
        cp /proxy-patches/cli-to-openai.js $$DIST/adapter/cli-to-openai.js &&
        cp /proxy-patches/routes.js $$DIST/server/routes.js &&
        claude-max-api"
    volumes:
      - ~/.claude:/root/.claude  # Claude CLI auth (credentials in /root/.claude/.credentials.json)
@@ -297,6 +297,23 @@ Skills mit Tool-Use.
- [x] **Brain categories collapsed by default**: on first load all type sections are collapsed, Stefan expands exactly what he wants to see. State persists in localStorage
- [x] **Collapsible type headers + category auto-suggest + info modal**: type headers (▼/▶) fold, the category field in the new/edit modal offers `<datalist>` suggestions from all existing categories, an ℹ-button modal explains which types are FIXED in the system prompt vs. cold memory

### GPS trigger improvements (entered_near + left_near + timing fix)

- [x] **near() missed drive-bys by car — fixed**: the background loop ticked every 30 s, but crossing a 300 m radius at 50-120 km/h takes only 18-43 s → a whole pass could fall between two ticks. Fix: `TICK_SEC` 30 → 8 (the loop is cheap, the Brain does not notice). Plus event-driven: after every `location_update` the bridge POSTs `/triggers/check-now` to the Brain → watchers see the fresh position within milliseconds instead of at polling cadence. Polling keeps running in parallel as a fallback for watchers without a GPS reference
- [x] **near() age guard**: GPS data older than 5 minutes (`NEAR_MAX_AGE_SEC=300`) counts as stale → `near()` returns False. Previously a weeks-old fix would have kept counting as "nearby" → phantom fires while tracking was off
- [x] **Three GPS modes instead of one**: `near()` stays = "while inside". New: **`entered_near(lat, lon, r)`** fires ONLY on the outside→inside transition (speed-trap warner with r=2000 = 2 km advance warning, arrival with r=100), **`left_near(lat, lon, r)`** fires ONLY on the inside→outside transition ("Did you forget something at the parking lot?"). State tracking per trigger per near call (`near_states` dict in the manifest) — the background loop always writes the last evaluated value back so transition detection works on the next tick. ARIA's `trigger_watcher` tool description explains the three modes incl. recommended throttle values (short for entered/left, long for near)
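
The three modes boil down to level- vs. edge-triggered evaluation over a per-trigger state store. A minimal sketch, assuming a state object like the `near_states` dict described above — names such as `evalGeofence` and `haversineM` are illustrative, not the Brain's actual code:

```javascript
// Great-circle distance in meters between two lat/lon points.
function haversineM(lat1, lon1, lat2, lon2) {
  const R = 6371000, rad = (d) => d * Math.PI / 180;
  const dLat = rad(lat2 - lat1), dLon = rad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(lat1)) * Math.cos(rad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// mode: 'near' (level-triggered) | 'entered_near' | 'left_near' (edge-triggered).
// state persists between ticks; pos = { lat, lon, ts } is the last GPS fix.
function evalGeofence(mode, state, pos, lat, lon, r, maxAgeMs = 300000) {
  // Stale GPS (older than NEAR_MAX_AGE_SEC) never matches — avoids phantom fires.
  const fresh = pos && (Date.now() - pos.ts) <= maxAgeMs;
  const inside = fresh && haversineM(pos.lat, pos.lon, lat, lon) <= r;
  const was = state.inside === true;
  state.inside = inside;                            // written back every tick
  if (mode === 'near')         return inside;               // "while inside"
  if (mode === 'entered_near') return inside && !was;       // outside → inside
  if (mode === 'left_near')    return !inside && was;       // inside → outside
  return false;
}
```

Writing `state.inside` back unconditionally is the important part: the next tick compares against it, so each transition fires exactly once.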

### App memory editor + crash reporting

- [x] **Dynamic bubble header** (created/updated/deleted): the `🧠` bubble now shows what happened — "ARIA hat etwas gemerkt" / "Notiz geändert" / "Notiz gelöscht" (red on delete). Brain tools send an `action` field along with the memory_saved event
- [x] **Tap on memory bubble → detail modal**: the `MemoryDetailModal` component shows all fields (title, type, category, tags, full content, attachment preview with thumbnails). The pencil icon switches to edit mode with form fields + 📌 pinned toggle. **Upload/download + delete attachments** in the modal (DocumentPicker, multipart upload via the RVS brain proxy). Delete the whole memory with confirm
- [x] **Notes inbox button (`🗂️`)** next to the magnifier in the status bar: full-screen modal with two sections — "from this chat" (compact, clickable list of the special bubbles in the current history) + "all memories from the DB" via the `MemoryBrowser`. Special bubbles (memorySaved/triggerCreated/skillCreated) are filtered out of the chat stream (instead of sticking to the bottom)
- [x] **Memory editor in app settings**: new 🧠 "Gedächtnis" section in the app settings. Full CRUD UI with verbatim search, type dropdown, pinned/cold filter, "+ Neu" to create. Same `MemoryBrowser` component as in the inbox
- [x] **RVS brain proxy as the foundation**: the bridge implements a generic `brain_request` / `brain_response` channel — the app can address arbitrary Brain HTTP endpoints via RVS (GET/POST/PATCH/DELETE, JSON+Base64 body, base64-encoded binary responses). `services/brainApi.ts` is a promise-based client with request-ID routing, timeout, and automatic listener setup
- [x] **App crash reporting via RVS**: an ErrorBoundary component catches React render errors, `installGlobalCrashReporter` hooks into `ErrorUtils.setGlobalHandler` + `HermesInternal.enablePromiseRejectionTracker`. Crashes travel as `app_log` events through RVS, the bridge writes JSONL to `/shared/logs/app.log`. The diagnostic server serves GET `/api/app-log[?limit=N]` + POST `/api/app-log/clear`. **`tools/fetch-app-logs.sh`** pulls the logs to the dev machine (via `ARIA_DIAG_URL` from `.claude/aria-vm.env`), stores them in `.aria-debug/` (gitignored), and prints the stack trace compactly on stdout
- [x] **`memory_search` + `memory_update` tools**: ARIA can now actively search the DB (full-text/semantic) and patch existing entries by ID instead of creating fragmenting new ones. The tool description states explicitly "memory is truth over the conversation window" — if the user corrected something, what is in memory is what counts. Important after diagnostic edits, so ARIA sees the new truth instead of guessing from the window
- [x] **App bugfixes**: (a) URLSearchParams crashes in Hermes — replaced with a mini query builder (`brainApi._qs()`). (b) empty cache + file tap → automatic re-download via file_request instead of a toast dead end, plus state cleanup (uri/localUri set to undefined). (c) the memory list in settings now scrolls (nestedScrollEnabled on the FlatList + outer ScrollView). (d) modal-in-modal fixed on Android — MemoryBrowser takes an optional `onOpenMemory` callback, no nested DetailModal anymore. (e) Alert.prompt (iOS-only) replaced with a custom text-input modal for "create new memory"
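
The request-ID routing behind `services/brainApi.ts` can be sketched transport-agnostically. Everything here is a hedged sketch: `makeRpc` and its message shapes are hypothetical, and `send` stands in for whatever pushes a `brain_request` frame over RVS:

```javascript
// Promise-based request/response over a one-way message channel (sketch).
// Every request carries an id; a single response listener routes replies
// back to the pending promise; a timeout is the only escape hatch.
function makeRpc(send, timeoutMs = 15000) {
  const pending = new Map(); // id → { resolve, reject, timer }

  function onResponse(msg) { // wire this to the brain_response listener
    const p = pending.get(msg.id);
    if (!p) return;          // late or unknown reply — drop silently
    clearTimeout(p.timer);
    pending.delete(msg.id);
    msg.error ? p.reject(new Error(msg.error)) : p.resolve(msg.payload);
  }

  function request(payload) {
    const id = Math.random().toString(36).slice(2);
    return new Promise((resolve, reject) => {
      const timer = setTimeout(() => {
        pending.delete(id);
        reject(new Error('brain_request timeout'));
      }, timeoutMs);
      pending.set(id, { resolve, reject, timer });
      send({ id, payload });
    });
  }

  return { request, onResponse };
}
```

Deleting the pending entry on timeout matters: a reply that arrives after the timeout is then dropped instead of resolving a promise nobody awaits anymore.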

### Memory attachments with vision (stages A-E + attach_paths)

- [x] **Attachments on memory entries** — images/PDFs/arbitrary files can be attached to any memory, stored physically under `/shared/memory-attachments/<memory-id>/`. Cleanup on memory delete is automatic. Limit 20 MB per file
@@ -324,14 +341,22 @@ Skills mit Tool-Use.
- [x] Info buttons with modal explanations in the Gehirn tab
- [x] Token/call metrics + subscription quota tracking: one log entry per Claude call with a token estimate (chars/4). The Gehirn tab shows 1h/5h/24h/30d aggregates + a progress bar against the plan limit (Pro=45/5h, Max 5x=225/5h, Max 20x=900/5h, custom). Warning threshold 80 %, critical 90 %.
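
The chars/4 estimate plus the sliding-window aggregate is simple enough to sketch. This is a sketch under assumptions — the call-log shape and helper names are not taken from the actual Gehirn tab code, only the thresholds are:

```javascript
// Rough token estimate used for quota tracking: ~4 characters per token.
const estimateTokens = (text) => Math.ceil(text.length / 4);

// Aggregate calls inside a sliding window (default: the 5-hour plan window).
// calls: [{ ts, text }] — ts in ms epoch.
function windowUsage(calls, nowMs, windowMs = 5 * 3600 * 1000) {
  const recent = calls.filter((c) => nowMs - c.ts <= windowMs);
  const tokens = recent.reduce((s, c) => s + estimateTokens(c.text), 0);
  return { calls: recent.length, tokens };
}

// Map usage against the plan limit to the UI levels described above.
function quotaLevel(used, limit) {
  const pct = used / limit;
  if (pct >= 0.9) return 'critical'; // 90 % threshold
  if (pct >= 0.8) return 'warn';     // 80 % threshold
  return 'ok';
}
```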

### Chat stability: search scroll, stuck watchdog, delivery handshake

- [x] **Search scroll no longer jumps permanently**: `onScrollToIndexFailed` had 3 cascading `setTimeout`s (120/320/600 ms) — every failed retry triggered the handler again → 3, 9, 27 scrolls in the pipeline. Plus `invertedMessages` was in the useEffect deps: every new ARIA message re-triggered the search scroll. Fix: only ONE retry after 300 ms, held in a ref-tracked timer variable; a new search hit cancels the pending retry. `invertedMessages` snapshot via ref instead of dep
- [x] **Jump-to-bottom button** bottom right in the chat list — appears after ~250 px of scroll distance, scrolls to the newest message (`scrollToOffset(0)` on the inverted FlatList)
- [x] **AsyncStorage init race**: between mount and "history loaded from AsyncStorage" a user message or WS event could arrive — `setMessages(parsed)` overwrote it with the old state and the fresh message vanished without a trace. Fix: merge by `id` (fresher `prev` entries beat stored ones), sorted by `timestamp`. `messageIdCounter` is only ever incremented, never reset
- [x] **Stuck-thinking watchdog**: "ARIA denkt..." occasionally got stuck (Brain crash, WS disconnect without an idle event, cancel with a race). Fix: every `agent_activity != idle` arms a 180 s timer; without a fresh sign of life it goes auto-idle + shows the bubble "⚠ Habe gerade keine Verbindung zurueck bekommen". The watchdog is cleared on ARIA reply, on cancel/barge-in and on screen unmount
- [x] **Delivery handshake (WhatsApp style)**: per user bubble a local `clientMsgId` + `deliveryStatus` (queued/sending/sent/delivered/failed). The bridge sends back a `chat_ack` (✓ sent) and writes the ID into `chat_backup.jsonl`. An ARIA reply marks all previous user bubbles as delivered (✓✓). LRU idempotency on the bridge (200 cmids) prevents duplicates on retry. Offline queue: messages sent in flight mode stay local as ⏱-queued, `flushQueuedMessages` fires on reconnect. ACK timeout 30 s, up to 3 retries, then ⚠ + tap-to-retry
- [x] **Offline bubble vanished after reconnect (race)**: `chat_history_request` and `flushQueuedMessages` run in parallel on reconnect; the history response arrived before the bridge had persisted the bubble → the merge replaced the local state → bubble gone (though it was in Diagnostic). Fix: the bridge mirrors `clientMsgId` in `chat_backup.jsonl`, the app merge dedupes by cmid and keeps local bubbles whose ID the server does not know yet
- [x] **Double bubble after retry**: backup entries from before the cmid patch had no `clientMsgId` — the server bubble (without cmid) and the local failed bubble (with cmid) both survived the merge. Plus the ACK timer occasionally kept running although the bubble was already `delivered` → the retry pushed the status back to `sending`. Fix: the merge additionally falls back to a `text+timestamp` heuristic within a 5-minute window; `dispatchWithAck` checks via ref whether the bubble is `delivered` by now and cancels; on ARIA reply all running ACK timers are cleared
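
The handshake reduces to a small forward-only state machine per bubble — which is also why the "retry demotes delivered back to sending" bug needed a guard. A hedged sketch; names follow the bullets above, not the app's actual code, and transport/retry wiring is omitted:

```javascript
// Per-bubble delivery states, in order. 'failed' sits outside the forward
// chain and is only reachable from the ACK-timeout path.
const ORDER = ['queued', 'sending', 'sent', 'delivered'];

// Only move forward — a late retry must never push 'delivered' back.
function advance(bubble, next) {
  if (ORDER.indexOf(next) > ORDER.indexOf(bubble.deliveryStatus)) {
    bubble.deliveryStatus = next;
  }
  return bubble.deliveryStatus;
}

// ACK timeout: retry while still in flight, give up after maxRetries.
function onAckTimeout(bubble, maxRetries = 3) {
  if (bubble.deliveryStatus === 'delivered' || bubble.deliveryStatus === 'sent') return;
  bubble.retries = (bubble.retries || 0) + 1;
  bubble.deliveryStatus = bubble.retries > maxRetries ? 'failed' : 'sending';
}
```

The forward-only `advance` is the guard: even if a stale timer fires after the ARIA reply marked the bubble delivered, the status cannot regress.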

## Open

### App features
- [ ] Custom wake-word upload via Diagnostic (own .onnx files without an app rebuild)

### Architecture
- [ ] Diagnostic: system-info tab (container status, disk, RAM, CPU)
- [ ] Resolve RVS zombie connections for good
- [ ] Gamebox: small web UI for credentials/server config, or pushed centrally from Diagnostic via RVS

@@ -0,0 +1,309 @@
/**
 * ARIA-patched API route handlers
 *
 * Extends the npm version of claude-max-api-proxy:
 * - For every Claude-CLI `assistant` event with a tool_use block (Bash, Read,
 *   Edit, Grep, …) an HTTP POST is fired at the bridge
 *   (ARIA_TOOL_HOOK_URL, default http://aria-bridge:8090/internal/agent-activity).
 *   The bridge mirrors it as an RVS `agent_activity` event to app + Diagnostic →
 *   the thought stream shows live what ARIA is currently doing tool-wise.
 * - Fire-and-forget, fail-open. If the bridge does not answer, the Brain call
 *   does NOT abort.
 *
 * Written over the npm version at container start time
 * (see the proxy block in docker-compose.yml).
 */
import { v4 as uuidv4 } from "uuid";
import http from "http";
import { ClaudeSubprocess } from "../subprocess/manager.js";
import { openaiToCli } from "../adapter/openai-to-cli.js";
import { cliResultToOpenai, createDoneChunk, } from "../adapter/cli-to-openai.js";

const TOOL_HOOK_URL = process.env.ARIA_TOOL_HOOK_URL
  || "http://aria-bridge:8090/internal/agent-activity";

/**
 * Pushes a tool-use event to the bridge. Fire-and-forget — no awaits,
 * no errors propagated upward. Failures are swallowed silently.
 */
function _emitToolEvent(toolName) {
  if (!toolName) return;
  try {
    const u = new URL(TOOL_HOOK_URL);
    const body = JSON.stringify({ tool: String(toolName) });
    const req = http.request({
      method: "POST",
      hostname: u.hostname,
      port: u.port || 80,
      path: u.pathname,
      headers: { "Content-Type": "application/json", "Content-Length": Buffer.byteLength(body) },
      timeout: 2000,
    }, (res) => { res.resume(); });
    req.on("error", () => {});
    req.on("timeout", () => req.destroy());
    req.write(body);
    req.end();
  } catch (_) { /* never rethrow */ }
}

/**
 * Hooks the subprocess's `assistant` events. Each assistant message can
 * carry several content blocks — tool_use blocks are pushed live.
 */
function _attachToolHook(subprocess) {
  subprocess.on("assistant", (message) => {
    try {
      const blocks = message?.message?.content || [];
      for (const b of blocks) {
        if (b && b.type === "tool_use" && b.name) {
          _emitToolEvent(b.name);
        }
      }
    } catch (_) { /* fail-open */ }
  });
}
/**
 * Handle POST /v1/chat/completions
 *
 * Main endpoint for chat requests, supports both streaming and non-streaming
 */
export async function handleChatCompletions(req, res) {
  const requestId = uuidv4().replace(/-/g, "").slice(0, 24);
  const body = req.body;
  const stream = body.stream === true;
  try {
    // Validate request
    if (!body.messages || !Array.isArray(body.messages) || body.messages.length === 0) {
      res.status(400).json({
        error: {
          message: "messages is required and must be a non-empty array",
          type: "invalid_request_error",
          code: "invalid_messages",
        },
      });
      return;
    }
    // Convert to CLI input format
    const cliInput = openaiToCli(body);
    const subprocess = new ClaudeSubprocess();
    // ARIA patch: forward tool-use events live to the bridge.
    // Applies to both branches (stream + non-stream).
    _attachToolHook(subprocess);
    if (stream) {
      await handleStreamingResponse(req, res, subprocess, cliInput, requestId);
    }
    else {
      await handleNonStreamingResponse(res, subprocess, cliInput, requestId);
    }
  }
  catch (error) {
    const message = error instanceof Error ? error.message : "Unknown error";
    console.error("[handleChatCompletions] Error:", message);
    if (!res.headersSent) {
      res.status(500).json({
        error: {
          message,
          type: "server_error",
          code: null,
        },
      });
    }
  }
}
/**
 * Handle streaming response (SSE)
 *
 * IMPORTANT: The Express req.on("close") event fires when the request body
 * is fully received, NOT when the client disconnects. For SSE connections,
 * we use res.on("close") to detect actual client disconnection.
 */
async function handleStreamingResponse(req, res, subprocess, cliInput, requestId) {
  // Set SSE headers
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");
  res.setHeader("X-Request-Id", requestId);
  // CRITICAL: Flush headers immediately to establish SSE connection
  // Without this, headers are buffered and client times out waiting
  res.flushHeaders();
  // Send initial comment to confirm connection is alive
  res.write(":ok\n\n");
  return new Promise((resolve, reject) => {
    let isFirst = true;
    let lastModel = "claude-sonnet-4";
    let isComplete = false;
    // Handle actual client disconnect (response stream closed)
    res.on("close", () => {
      if (!isComplete) {
        // Client disconnected before response completed - kill subprocess
        subprocess.kill();
      }
      resolve();
    });
    // Handle streaming content deltas
    subprocess.on("content_delta", (event) => {
      const text = event.event.delta?.text || "";
      if (text && !res.writableEnded) {
        const chunk = {
          id: `chatcmpl-${requestId}`,
          object: "chat.completion.chunk",
          created: Math.floor(Date.now() / 1000),
          model: lastModel,
          choices: [{
            index: 0,
            delta: {
              role: isFirst ? "assistant" : undefined,
              content: text,
            },
            finish_reason: null,
          }],
        };
        res.write(`data: ${JSON.stringify(chunk)}\n\n`);
        isFirst = false;
      }
    });
    // Handle final assistant message (for model name)
    subprocess.on("assistant", (message) => {
      lastModel = message.message.model;
    });
    subprocess.on("result", (_result) => {
      isComplete = true;
      if (!res.writableEnded) {
        // Send final done chunk with finish_reason
        const doneChunk = createDoneChunk(requestId, lastModel);
        res.write(`data: ${JSON.stringify(doneChunk)}\n\n`);
        res.write("data: [DONE]\n\n");
        res.end();
      }
      resolve();
    });
    subprocess.on("error", (error) => {
      console.error("[Streaming] Error:", error.message);
      if (!res.writableEnded) {
        res.write(`data: ${JSON.stringify({
          error: { message: error.message, type: "server_error", code: null },
        })}\n\n`);
        res.end();
      }
      resolve();
    });
    subprocess.on("close", (code) => {
      // Subprocess exited - ensure response is closed
      if (!res.writableEnded) {
        if (code !== 0 && !isComplete) {
          // Abnormal exit without result - send error
          res.write(`data: ${JSON.stringify({
            error: { message: `Process exited with code ${code}`, type: "server_error", code: null },
          })}\n\n`);
        }
        res.write("data: [DONE]\n\n");
        res.end();
      }
      resolve();
    });
    // Start the subprocess
    subprocess.start(cliInput.prompt, {
      model: cliInput.model,
      sessionId: cliInput.sessionId,
    }).catch((err) => {
      console.error("[Streaming] Subprocess start error:", err);
      reject(err);
    });
  });
}
/**
 * Handle non-streaming response
 */
async function handleNonStreamingResponse(res, subprocess, cliInput, requestId) {
  return new Promise((resolve) => {
    let finalResult = null;
    subprocess.on("result", (result) => {
      finalResult = result;
    });
    subprocess.on("error", (error) => {
      console.error("[NonStreaming] Error:", error.message);
      res.status(500).json({
        error: {
          message: error.message,
          type: "server_error",
          code: null,
        },
      });
      resolve();
    });
    subprocess.on("close", (code) => {
      if (finalResult) {
        res.json(cliResultToOpenai(finalResult, requestId));
      }
      else if (!res.headersSent) {
        res.status(500).json({
          error: {
            message: `Claude CLI exited with code ${code} without response`,
            type: "server_error",
            code: null,
          },
        });
      }
      resolve();
    });
    // Start the subprocess
    subprocess
      .start(cliInput.prompt, {
        model: cliInput.model,
        sessionId: cliInput.sessionId,
      })
      .catch((error) => {
        res.status(500).json({
          error: {
            message: error.message,
            type: "server_error",
            code: null,
          },
        });
        resolve();
      });
  });
}
/**
 * Handle GET /v1/models
 *
 * Returns available models
 */
export function handleModels(_req, res) {
  res.json({
    object: "list",
    data: [
      {
        id: "claude-opus-4",
        object: "model",
        owned_by: "anthropic",
        created: Math.floor(Date.now() / 1000),
      },
      {
        id: "claude-sonnet-4",
        object: "model",
        owned_by: "anthropic",
        created: Math.floor(Date.now() / 1000),
      },
      {
        id: "claude-haiku-4",
        object: "model",
        owned_by: "anthropic",
        created: Math.floor(Date.now() / 1000),
      },
    ],
  });
}

/**
 * Handle GET /health
 *
 * Health check endpoint
 */
export function handleHealth(_req, res) {
  res.json({
    status: "ok",
    provider: "claude-code-cli",
    timestamp: new Date().toISOString(),
  });
}
//# sourceMappingURL=routes.js.map