Compare commits

25 Commits

| Author | SHA1 | Date |
|---|---|---|
| | b1796520b8 | |
| | 0ff44d99c4 | |
| | 8c74b3fed8 | |
| | c3fefc60c0 | |
| | 7107ce4fdd | |
| | fa47068d6d | |
| | 07c761fc72 | |
| | 6821eaaa38 | |
| | 31aa86a2a9 | |
| | 87cb687610 | |
| | eb4059a887 | |
| | 415706036b | |
| | e2dd47255e | |
| | 3497aa23f8 | |
| | 8491fb2af7 | |
| | f61864282e | |
| | b2f7d6dda2 | |
| | eeedcc4781 | |
| | 5cf8cab5bd | |
| | 3ae9e19524 | |
| | 0ec4b00879 | |
| | b6b4b1b4d9 | |
| | 950a9d009c | |
| | 693542ef19 | |
| | d12f356ebe | |
@@ -195,12 +195,14 @@ Show an existing token again as a QR code: `./generate-token.sh show`

 http://<VM-IP>:3001
 ```

-The Diagnostic UI has four top-level tabs:
+The Diagnostic UI has six top-level tabs:

 - **Main** — live chat test, status (Brain / RVS / Proxy), end-to-end trace
-- **Gehirn** — memory management (vector DB), skills, export/import of the complete brain as tar.gz
-- **Dateien** — all files from `/shared/uploads/` (generated by ARIA or uploaded) with download/delete
-- **Einstellungen** — repair (container restart), wipe, speech output, Whisper, runtime config, app onboarding (QR), full reset
+- **Gehirn** — memory management (vector DB), token/call metrics (subscription quota), bootstrap & migration, export/import of the complete brain
+- **Skills** — list with logs, run, activate/deactivate, export/import as tar.gz
+- **Trigger** — create/list/delete timers + watchers, live variable display (disk_free, current_lat, hour_of_day, ...), near(lat, lon, m) as a condition function
+- **Dateien** — all files from `/shared/uploads/` with multi-select, bulk download (ZIP) + bulk delete
+- **Einstellungen** — repair (container restart), wipe, speech output, Whisper, language model, runtime config, app onboarding (QR), full reset

---

@@ -311,13 +313,16 @@ Reachable at `http://<VM-IP>:3001`. Shares the network with the Bridge.

 ### Tabs

 - **Main**: Brain/RVS/Proxy status, chat test, "ARIA is thinking..." indicator, end-to-end trace, container logs
-- **Gehirn**: memory browser (vector DB), search + filter, edit/add/delete, brain export/import (tar.gz), skills (planned)
-- **Dateien**: browser for `/shared/uploads/` — download or delete files generated by ARIA or uploaded (live update of the chat bubbles)
-- **Einstellungen**: repair (container restart for Brain/Bridge/Qdrant), full reset, operating modes, speech output + voice cloning + F5-TTS tuning, Whisper, onboarding QR, app cleanup
+- **Gehirn**: memory browser (vector DB), search + filter, edit/add/delete, conversation status with a distillation trigger, **token/call metrics with subscription-quota tracking**, bootstrap & migration (3 restore paths), brain export/import (tar.gz). Info buttons (ℹ) everywhere with modal explanations.
+- **Skills**: list of all skills with logs per run, activate/deactivate, export/import as tar.gz, a "by ARIA" badge for self-built ones
+- **Trigger**: passive wake-up sources. **Timers** (one-shot, ISO timestamp or via `in_seconds`, computed server-side) + **watchers** (recurring, with condition + throttle). List of active triggers + logs per fire event. Modal with a type dropdown, live display of all available condition variables (`disk_free_gb`, `hour_of_day`, `current_lat/lon`, `last_user_message_ago_sec`, ...) and condition functions (`near(lat, lon, m)` for GPS geofencing). Safe condition parser via Python `ast` (whitelist, no `eval`). The system prompt additionally contains an `## Aktuelle Zeit` block (UTC + Europe/Berlin) so that ARIA can set timer times correctly.
+- **Dateien**: browser for `/shared/uploads/` with multi-select + "select all" + bulk download (ZIP for 2+) + bulk delete. Live update of the chat bubbles on delete.
+- **Einstellungen**: repair (container restart for Brain/Bridge/Qdrant), full reset, operating modes, speech output + voice cloning + F5-TTS tuning + voice export/import, Whisper, language model (brainModel), onboarding QR, app cleanup
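The safe condition parser described for the Trigger tab (Python `ast`, whitelist, no `eval`) can be sketched like this. A minimal illustration, not the project's actual implementation: only the variable names and the `near(lat, lon, m)` signature come from the text above; the node whitelist and helper structure are assumptions.

```python
import ast
import math

# Node types a condition may contain; anything else is rejected.
ALLOWED_NODES = (
    ast.Expression, ast.BoolOp, ast.And, ast.Or, ast.UnaryOp, ast.Not, ast.USub,
    ast.Compare, ast.Eq, ast.NotEq, ast.Lt, ast.LtE, ast.Gt, ast.GtE,
    ast.BinOp, ast.Add, ast.Sub, ast.Mult, ast.Div, ast.Mod,
    ast.Name, ast.Load, ast.Constant, ast.Call,
)

def _haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def eval_condition(expr: str, variables: dict) -> bool:
    """Evaluate a watcher condition without eval(): parse with ast, reject
    any node outside the whitelist, resolve names from `variables`, and
    allow only near(lat, lon, m) as a call target."""
    tree = ast.parse(expr, mode="eval")
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED_NODES):
            raise ValueError(f"forbidden syntax: {type(node).__name__}")
        if isinstance(node, ast.Call) and not (
            isinstance(node.func, ast.Name) and node.func.id == "near"
        ):
            raise ValueError("only near(lat, lon, m) may be called")

    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.Name):
            return variables[node.id]
        if isinstance(node, ast.UnaryOp):
            v = ev(node.operand)
            return (not v) if isinstance(node.op, ast.Not) else -v
        if isinstance(node, ast.BoolOp):
            vals = [ev(v) for v in node.values]
            return all(vals) if isinstance(node.op, ast.And) else any(vals)
        if isinstance(node, ast.BinOp):
            a, b = ev(node.left), ev(node.right)
            if isinstance(node.op, ast.Add): return a + b
            if isinstance(node.op, ast.Sub): return a - b
            if isinstance(node.op, ast.Mult): return a * b
            if isinstance(node.op, ast.Div): return a / b
            return a % b
        if isinstance(node, ast.Compare):
            left = ev(node.left)
            for op, comp in zip(node.ops, node.comparators):
                right = ev(comp)
                if isinstance(op, ast.Eq): ok = left == right
                elif isinstance(op, ast.NotEq): ok = left != right
                elif isinstance(op, ast.Lt): ok = left < right
                elif isinstance(op, ast.LtE): ok = left <= right
                elif isinstance(op, ast.Gt): ok = left > right
                else: ok = left >= right
                if not ok:
                    return False
                left = right
            return True
        # Only near() survives the whitelist walk above.
        lat, lon, m = (ev(a) for a in node.args)
        return _haversine_m(variables["current_lat"], variables["current_lon"], lat, lon) <= m

    return bool(ev(tree))
```

The whitelist-then-interpret approach means even pathological input like `__import__('os')` fails at validation time rather than reaching anything executable.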

 ### What else is included

 - **Disk-full banner** with copyable cleanup commands (safe + aggressive)
 - **Token/call metrics**: one entry per Claude call in `/data/metrics.jsonl` with ts + a token estimate. The Gehirn tab shows 1h/5h/24h/30d aggregates plus a progress bar against the plan limit (Pro / Max 5x / Max 20x / Custom). Warning threshold 80%, critical 90%.
 - **Voice cloning**: upload audio samples, Whisper transcribes the reference text automatically
 - **Voice export/import**: carry individual voices between Gameboxes as `.tar.gz`
 - **Settings export/import**: `voice_config.json` + `highlight_triggers.json` as a JSON bundle
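The metrics bullet above pins down concrete numbers (one JSONL entry per call, 1h/5h/24h/30d aggregates, 80%/90% thresholds). A minimal sketch of that aggregation, assuming only a `ts` and a `tokens` field per line (the real entry format may carry more):

```python
import json
import time

def aggregate_metrics(path, window_s, now=None):
    """Sum call count + token estimates from a JSONL metrics file over a
    trailing window (1h/5h/24h/30d in the UI)."""
    now = now if now is not None else time.time()
    tokens = 0
    calls = 0
    with open(path) as fh:
        for line in fh:
            try:
                entry = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip a partially written trailing line
            if now - entry.get("ts", 0) <= window_s:
                calls += 1
                tokens += entry.get("tokens", 0)
    return {"calls": calls, "tokens": tokens}

def quota_level(tokens, plan_limit):
    """Map usage against the plan limit onto the README's warning levels:
    >= 80% warn, >= 90% critical."""
    pct = tokens / plan_limit if plan_limit else 0.0
    if pct >= 0.90:
        return "critical"
    if pct >= 0.80:
        return "warn"
    return "ok"
```

Append-only JSONL keeps writes cheap per call; the cost of re-scanning the file is paid only when the Gehirn tab asks for an aggregate.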
@@ -353,6 +358,7 @@ Reachable at `http://<VM-IP>:3001`. Shares the network with the Bridge.

 - **Einstellungen**: TTS active, F5-TTS voice, pre-roll buffer, silence tolerance, storage location, auto-download, GPS, verbose logging
 - **Auto-update**: checks for a new version at startup + via button, download + installation through RVS (FileProvider)
 - GPS position (optional, with a runtime permission request) — included in every chat/audio payload and can be shown in Diagnostic as a debug block
+- **GPS tracking (continuous)**: toggle in Settings → Standort. When active, the app pushes a `location_update` to the Bridge every ~15s or after 30m of movement — the prerequisite for watchers with `near(lat, lon, m)` (e.g. speed-camera warnings, arrival reminders) to fire at all. ARIA itself can toggle tracking via the `request_location_tracking` tool and does so automatically when it creates a GPS watcher
 - QR code scanner for token pairing
 - **Receiving ARIA files**: when ARIA creates a PDF/image/Markdown/ZIP for you (marker `[FILE: /shared/uploads/aria_*]` in the reply), it appears as its own attachment bubble. Tap → loaded via RVS + opened with the Android intent picker (PDF viewer, image viewer, default app). Inline images from Markdown image syntax are rendered directly below the text (PNG/JPG via Image, SVG via react-native-svg)
 - **Fullscreen with pinch zoom**: images in the fullscreen modal are pinch-zoomable (1x..5x), one-finger pan when zoomed, double-tap toggles 1x↔2.5x — all without an external lib
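The `[FILE: ...]` marker handling can be illustrated with a small parser. A sketch, not the app's actual implementation; only the marker shape is taken from the bullet above, the function name is made up:

```python
import re

# Marker format as described above: [FILE: /shared/uploads/aria_*]
FILE_MARKER = re.compile(r"\[FILE:\s*(/shared/uploads/[^\]]+)\]")

def extract_file_attachments(reply: str):
    """Split an ARIA reply into visible text and attachment paths.
    Markers are stripped from the text so only the prose remains;
    each path then becomes its own attachment bubble."""
    paths = [p.strip() for p in FILE_MARKER.findall(reply)]
    text = FILE_MARKER.sub("", reply).strip()
    return text, paths
```

Keeping the marker inside the reply text (rather than a separate payload field) means any transport that carries the text also carries the attachment reference.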
@@ -842,20 +848,30 @@ docker exec aria-brain curl localhost:8080/memory/stats

 ### Phase A — Refactor: OpenClaw out, own Brain in

 - [x] aria-brain container skeleton (FastAPI, Qdrant, sentence-transformers)
 - [x] aria-core (OpenClaw) completely torn out — tag `v0.1.2.0` as the archive
 - [x] Diagnostic: Gehirn tab (memory search/filter, add/edit/delete)
 - [x] Diagnostic: brain export/import as tar.gz
-- [x] Diagnostic: file manager (list, search, download, delete with live bubble update)
-- [x] App: file manager as a modal in the settings
+- [x] Diagnostic: file manager (list, search, download, delete, multi-select + ZIP + bulk delete)
 - [x] Diagnostic: full reset (wipe all)
+- [x] Diagnostic: info buttons with modal explanations (status, conversation, memories, bootstrap)
+- [x] App: file manager as a modal in the settings (with multi-select + ZIP download)
+- [x] Voice export/import (individual voices + F5/Whisper settings as a bundle)
-- [ ] **Phase B item 2:** migration `aria-data/brain-import/` → atomic memory points
-- [ ] **Phase B item 3:** Brain conversation loop (single chat + rolling window + memory distillation)
-- [ ] **Phase B item 4:** skills system (manifest, venv, README per skill, Diagnostic tab)

 ### Phase B — Brain with memory + loop + skills

+- [x] **Phase B item 2:** migration from `aria-data/brain-import/` → atomic memory points (identity / rule / preference / tool / skill, idempotent via migration_key) + bootstrap snapshot export/import (pinned only)
+- [x] **Phase B item 3:** Brain conversation loop (single-chat UI, rolling window of 50 turns, threshold 60 → automatic distillation, manual trigger)
+- [x] **Phase B item 4:** skills system (Python-only via local venv, skill_create as a tool, dynamic run_<skill> tools, Diagnostic Skills tab with logs/toggle/export/import, skill_created live notification in app + Diagnostic, hard rule "pip → Skill")
+- [x] **Phase B item 5:** triggers system (passive wake-up sources — timers + watchers with a safe condition parser, GPS near(), Diagnostic Trigger tab, continuous GPS tracking in the app for use cases like speed-camera warnings)
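The numbers from Phase B item 3 (rolling window of 50 turns, distillation at 60) imply bookkeeping roughly like the following. An illustrative sketch; the class and method names are made up and the actual distillation step is only stubbed:

```python
from collections import deque

WINDOW = 50      # turns kept in the live context
THRESHOLD = 60   # total turns at which a distillation pass fires

class ConversationLoop:
    def __init__(self):
        self.turns = deque()   # backlog since the last distillation
        self.distilled = 0     # number of distillation passes run

    def add_turn(self, role: str, text: str) -> bool:
        """Append a turn; return True when a distillation was triggered."""
        self.turns.append((role, text))
        if len(self.turns) >= THRESHOLD:
            self.distill()
            return True
        return False

    def context_window(self):
        """The last WINDOW turns that actually go into the prompt."""
        return list(self.turns)[-WINDOW:]

    def distill(self):
        """Stub: summarize the overflow turns into memory points, then
        shrink the backlog back down to the rolling window."""
        overflow = list(self.turns)[:-WINDOW]
        # ... write memory points for `overflow` to the vector DB ...
        self.turns = deque(list(self.turns)[-WINDOW:])
        self.distilled += 1
```

The gap between THRESHOLD and WINDOW (10 turns here) is what gets distilled per pass, so distillation runs in batches instead of on every message.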
+- [x] Language-model setting functional again (brainModel in runtime.json instead of aria-core)
+- [x] App chat sync: complete server sync on reconnect (server = source of truth) + chat_cleared live update. Local-only bubbles (skill notifications, a running voice recording without an STT result) are kept.
+- [x] App: chat search with next/prev navigation instead of filtering
+- [x] Token/call metrics + subscription-quota tracking (Pro / Max 5x / Max 20x / Custom)
+- [x] File manager multi-select: bulk download as ZIP + bulk delete (Diagnostic + app)

 ### Phase 2 — ARIA becomes productive

-- [ ] Build skills (image generation, etc.)
+- [ ] Have the first skills built (yt-dlp, pdf-extract, etc. — through normal requests)
 - [ ] Gitea integration
 - [ ] Set up the VM (desktop, browser, tools)
 - [ ] Heartbeat (periodic self-checks)

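"Idempotent via migration_key" from Phase B item 2 means a re-run of the import must not duplicate memory points. A minimal sketch of that upsert pattern, with a plain dict standing in for the vector DB; the key derivation shown here is an assumption:

```python
import hashlib

def migration_key(category: str, text: str) -> str:
    """Stable key per source item, so re-imports hit the same record."""
    return hashlib.sha256(f"{category}:{text}".encode()).hexdigest()[:16]

def import_points(store: dict, items: list) -> int:
    """Upsert memory points keyed by migration_key; returns how many
    records were newly created (a re-run returns 0)."""
    created = 0
    for item in items:
        key = migration_key(item["category"], item["text"])
        if key not in store:
            created += 1
        store[key] = {**item, "migration_key": key}
    return created
```

Because the key is derived from the content rather than generated per run, repeating the migration converges on the same record set.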
@@ -79,8 +79,8 @@ android {
         applicationId "com.ariacockpit"
         minSdkVersion rootProject.ext.minSdkVersion
         targetSdkVersion rootProject.ext.targetSdkVersion
-        versionCode 10202
-        versionName "0.1.2.2"
+        versionCode 10207
+        versionName "0.1.2.7"
         // Fallback for libraries with product flavors
         missingDimensionStrategy 'react-native-camera', 'general'
     }

@@ -1,6 +1,6 @@
 {
   "name": "aria-cockpit",
-  "version": "0.1.2.2",
+  "version": "0.1.2.7",
   "private": true,
   "scripts": {
     "android": "react-native run-android",

@@ -79,6 +79,14 @@ interface ChatMessage {
     active: boolean;
     setupError?: string;
   };
+  /** Trigger-created bubble: ARIA has created a new trigger */
+  triggerCreated?: {
+    name: string;
+    type: 'timer' | 'watcher' | string;
+    message: string;
+    fires_at?: string;
+    condition?: string;
+  };
 }

 // --- Constants ---
@@ -201,6 +209,7 @@ const ChatScreen: React.FC = () => {
   const [fullscreenImage, setFullscreenImage] = useState<string | null>(null);
   const [searchQuery, setSearchQuery] = useState('');
   const [searchVisible, setSearchVisible] = useState(false);
+  const [searchIndex, setSearchIndex] = useState(0); // which match is active
   const [pendingAttachments, setPendingAttachments] = useState<{file: any, isPhoto: boolean}[]>([]);
   const [agentActivity, setAgentActivity] = useState<{activity: string, tool: string}>({activity: 'idle', tool: ''});
   // Service status (Gamebox: F5-TTS / Whisper load status) + banner visibility
@@ -396,6 +405,67 @@ const ChatScreen: React.FC = () => {
       }

+      // chat_cleared: Diagnostic has completely cleared the history
+      // → clear locally as well (visually + persistence)
+      if (message.type === 'chat_cleared') {
+        console.log('[Chat] chat_cleared — clearing local view + storage');
+        setMessages([]);
+        AsyncStorage.removeItem(CHAT_STORAGE_KEY).catch(() => {});
+        AsyncStorage.removeItem('aria_chat_last_sync').catch(() => {});
+        return;
+      }
+
+      // chat_history_response: the complete server state. The app replaces
+      // its persisted chat history with it. Local-only bubbles (running
+      // voice recordings without an STT result, skill-created events
+      // without text) are kept — they are clearly recognizable as "local"
+      // by a missing 'text' or by skillCreated/audioRequestId.
+      if (message.type === 'chat_history_response') {
+        const p = (message.payload || {}) as any;
+        const incoming = (p.messages || []) as Array<any>;
+        console.log(`[Chat] server sync: ${incoming.length} messages from the server`);
+        const fromServer: ChatMessage[] = incoming.map(m => {
+          const role = m.role === 'user' ? 'user' : 'aria';
+          const files = Array.isArray(m.files) ? m.files : [];
+          const attachments = files.map((f: any) => ({
+            type: (typeof f.mimeType === 'string' && f.mimeType.startsWith('image/')) ? 'image' : 'file',
+            name: f.name || 'datei',
+            size: f.size || 0,
+            mimeType: f.mimeType || '',
+            serverPath: f.serverPath || '',
+          })) as Attachment[];
+          return {
+            id: nextId(),
+            sender: role as 'user' | 'aria',
+            text: m.text || '',
+            timestamp: m.ts || Date.now(),
+            attachments: attachments.length ? attachments : undefined,
+          };
+        });
+        const maxTs = incoming.reduce((mx: number, m: any) => Math.max(mx, m.ts || 0), 0);
+        setMessages(prev => {
+          // Detect + keep local-only bubbles:
+          // - skill-created notifications (skillCreated set)
+          // - running voice messages without an STT result (audioRequestId
+          //   set AND text empty/placeholder)
+          const localOnly = prev.filter(m =>
+            m.skillCreated ||
+            m.triggerCreated ||
+            (m.audioRequestId && (!m.text || m.text === '🎙 Aufnahme...' || m.text === 'Aufnahme...'))
+          );
+          // Server state + local-only (sorted chronologically)
+          const merged = [...fromServer, ...localOnly].sort((a, b) => a.timestamp - b.timestamp);
+          return capMessages(merged);
+        });
+        if (maxTs > 0) {
+          AsyncStorage.setItem('aria_chat_last_sync', String(maxTs)).catch(() => {});
+        } else {
+          // Server empty → reset our lastSync as well
+          AsyncStorage.removeItem('aria_chat_last_sync').catch(() => {});
+        }
+        return;
+      }
+
       // skill_created: ARIA has created a new skill → its own bubble
       if (message.type === 'skill_created') {
         const p = (message.payload || {}) as any;
         const skillMsg: ChatMessage = {
@@ -415,6 +485,26 @@ const ChatScreen: React.FC = () => {
         return;
       }

+      // trigger_created: ARIA has created a trigger → its own bubble
+      if (message.type === 'trigger_created') {
+        const p = (message.payload || {}) as any;
+        const triggerMsg: ChatMessage = {
+          id: nextId(),
+          sender: 'aria',
+          text: '',
+          timestamp: Date.now(),
+          triggerCreated: {
+            name: String(p.name || '(unbenannt)'),
+            type: String(p.type || 'timer'),
+            message: String(p.message || ''),
+            fires_at: p.fires_at ? String(p.fires_at) : undefined,
+            condition: p.condition ? String(p.condition) : undefined,
+          },
+        };
+        setMessages(prev => capMessages([...prev, triggerMsg]));
+        return;
+      }
+
       // file_deleted: a file was deleted (by the Diagnostic user) → update the bubble
       if (message.type === 'file_deleted') {
         const p = (message.payload?.path as string) || '';
@@ -480,6 +570,13 @@ const ChatScreen: React.FC = () => {
       const dbgText = ((message.payload.text as string) || '').slice(0, 60);
       console.log('[Chat] chat-event sender=%s text=%s', sender || '(none)', dbgText);

+      // Track last-sync — so that on reconnect the same message is not
+      // loaded again from the server backup
+      if (sender === 'aria' || sender === 'user' || sender === 'stt') {
+        const ts = message.timestamp || Date.now();
+        AsyncStorage.setItem('aria_chat_last_sync', String(ts)).catch(() => {});
+      }
+
       // STT result: write the transcribed text into the voice bubble.
       // IMPORTANT: only match the FIRST still-unresolved recording — otherwise,
       // with two audios sent in quick succession, both bubbles would
@@ -647,6 +744,14 @@ const ChatScreen: React.FC = () => {

     const unsubState = rvs.onStateChange((state) => {
       setConnectionState(state);
+      // On (re)connect: fetch the COMPLETE server state. The server is the
+      // source of truth — if it is empty (e.g. after "reset conversation"),
+      // the app should mirror that, even if it was offline when that
+      // happened. since=0 + limit=200 → the last 200 messages from the
+      // server, or an empty array if the server is empty.
+      if (state === 'connected') {
+        rvs.send('chat_history_request' as any, { since: 0, limit: 200 });
+      }
     });

     // Set the initial status
@@ -830,6 +935,51 @@ const ChatScreen: React.FC = () => {
   // Inverted FlatList: newest messages at the bottom, no manual scrolling needed
   const invertedMessages = useMemo(() => [...messages].reverse(), [messages]);

+  // Search matches: all message IDs that match the query, in chronological
+  // order (oldest first). On query change we reset the index.
+  const searchMatchIds = useMemo(() => {
+    const q = searchQuery.trim().toLowerCase();
+    if (!q) return [] as string[];
+    return messages
+      .filter(m => (m.text || '').toLowerCase().includes(q))
+      .map(m => m.id);
+  }, [messages, searchQuery]);
+
+  useEffect(() => {
+    setSearchIndex(0);
+  }, [searchQuery]);
+
+  // On index change, scroll to the corresponding bubble.
+  // The FlatList is `inverted` → viewPosition 0.5 (middle) really is the
+  // middle of the visible area in the inverted render. We delay slightly
+  // so the layout is definitely finished.
+  useEffect(() => {
+    if (!searchMatchIds.length) return;
+    const id = searchMatchIds[searchIndex];
+    if (!id) return;
+    const idx = invertedMessages.findIndex(m => m.id === id);
+    if (idx < 0 || !flatListRef.current) return;
+    const tryScroll = () => {
+      try {
+        flatListRef.current?.scrollToIndex({ index: idx, animated: true, viewPosition: 0.5 });
+      } catch {
+        // retried by onScrollToIndexFailed
+      }
+    };
+    // requestAnimationFrame instead of setTimeout 0 — waits for the next layout frame
+    requestAnimationFrame(tryScroll);
+  }, [searchIndex, searchMatchIds, invertedMessages]);
+
+  const activeSearchId = searchMatchIds[searchIndex] || '';
+  const gotoSearchPrev = () => {
+    if (!searchMatchIds.length) return;
+    setSearchIndex(i => (i - 1 + searchMatchIds.length) % searchMatchIds.length);
+  };
+  const gotoSearchNext = () => {
+    if (!searchMatchIds.length) return;
+    setSearchIndex(i => (i + 1) % searchMatchIds.length);
+  };
+
   // Get the GPS position (optional)
   const getCurrentLocation = useCallback((): Promise<{ lat: number; lon: number } | null> => {
     if (!gpsEnabled) {
@@ -1081,12 +1231,38 @@ const ChatScreen: React.FC = () => {
       hour: '2-digit',
       minute: '2-digit',
     });
+    const isSearchHit = activeSearchId === item.id;
+    const searchHighlightStyle = isSearchHit
+      ? { borderWidth: 2, borderColor: '#FFD60A' }
+      : null;
+
+    // Special bubble: ARIA has created a trigger
+    if (item.triggerCreated) {
+      const t = item.triggerCreated;
+      const detailLine = t.type === 'timer'
+        ? `feuert: ${t.fires_at || '?'}`
+        : `wenn: ${t.condition || '?'}`;
+      return (
+        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}, searchHighlightStyle]}>
+          <Text style={{color: '#FFD60A', fontWeight: 'bold', fontSize: 14}}>
+            {'⏰ ARIA hat einen Trigger angelegt'}
+          </Text>
+          <Text style={{color: '#E0E0F0', marginTop: 4, fontSize: 14}}>
+            <Text style={{fontWeight: 'bold'}}>{t.name}</Text>
+            <Text style={{color: '#8888AA', fontSize: 12}}>{` (${t.type})`}</Text>
+          </Text>
+          <Text style={{color: '#8888AA', fontSize: 12, marginTop: 2, fontFamily: 'monospace'}}>{detailLine}</Text>
+          <Text style={{color: '#888', fontSize: 12, marginTop: 2}}>{`"${t.message}"`}</Text>
+          <Text style={{color: '#555570', fontSize: 10, marginTop: 6}}>ARIA-Trigger · {time}</Text>
+        </View>
+      );
+    }
+
     // Special bubble: ARIA has created a skill
     if (item.skillCreated) {
       const s = item.skillCreated;
       return (
-        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}]}>
+        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}, searchHighlightStyle]}>
           <Text style={{color: '#FFD60A', fontWeight: 'bold', fontSize: 14}}>
             {'🛠 ARIA hat einen neuen Skill erstellt'}
           </Text>
@@ -1106,7 +1282,7 @@ const ChatScreen: React.FC = () => {
     }

     return (
-      <View style={[styles.messageBubble, isUser ? styles.userBubble : styles.ariaBubble]}>
+      <View style={[styles.messageBubble, isUser ? styles.userBubble : styles.ariaBubble, searchHighlightStyle]}>
        {/* Attachment preview */}
        {item.attachments?.map((att, idx) => (
          <View key={idx}>
@@ -1280,7 +1456,7 @@ const ChatScreen: React.FC = () => {
       );
     })()}

-    {/* Search bar */}
+    {/* Search bar with match navigation */}
     {searchVisible && (
       <View style={styles.searchBar}>
         <TextInput
@@ -1291,17 +1467,47 @@ const ChatScreen: React.FC = () => {
          placeholderTextColor="#555570"
          autoFocus
        />
+        {searchQuery ? (
+          <Text style={{color: searchMatchIds.length ? '#0096FF' : '#555570', fontSize: 12, paddingHorizontal: 6}}>
+            {searchMatchIds.length ? `${searchIndex + 1}/${searchMatchIds.length}` : '0/0'}
+          </Text>
+        ) : null}
+        <TouchableOpacity
+          onPress={gotoSearchPrev}
+          disabled={!searchMatchIds.length}
+          style={{paddingHorizontal: 6, opacity: searchMatchIds.length ? 1 : 0.3}}
+        >
+          <Text style={{color: '#0096FF', fontSize: 18}}>{'▲'}</Text>
+        </TouchableOpacity>
+        <TouchableOpacity
+          onPress={gotoSearchNext}
+          disabled={!searchMatchIds.length}
+          style={{paddingHorizontal: 6, opacity: searchMatchIds.length ? 1 : 0.3}}
+        >
+          <Text style={{color: '#0096FF', fontSize: 18}}>{'▼'}</Text>
+        </TouchableOpacity>
        <TouchableOpacity onPress={() => { setSearchVisible(false); setSearchQuery(''); }}>
          <Text style={{color: '#FF3B30', fontSize: 14, paddingHorizontal: 8}}>X</Text>
        </TouchableOpacity>
      </View>
    )}

-    {/* Message list */}
+    {/* Message list — the search NO LONGER FILTERS; it highlights the
+       active match instead (see renderMessage: activeSearchId border). */}
     <FlatList
       ref={flatListRef}
       inverted
-      data={searchQuery ? messages.filter(m => m.text.toLowerCase().includes(searchQuery.toLowerCase())).reverse() : invertedMessages}
+      data={invertedMessages}
+      onScrollToIndexFailed={(info) => {
+        // The FlatList does not know the item layout yet. First scroll
+        // roughly into the vicinity (average-item-height estimate), then
+        // retry precisely after 250ms.
+        const offset = info.averageItemLength * info.index;
+        try { flatListRef.current?.scrollToOffset({ offset, animated: false }); } catch {}
+        setTimeout(() => {
+          try { flatListRef.current?.scrollToIndex({ index: info.index, animated: true, viewPosition: 0.5 }); } catch {}
+        }, 250);
+      }}
       keyExtractor={item => item.id}
       renderItem={renderMessage}
       contentContainerStyle={styles.messageList}

@@ -51,6 +51,7 @@ import {
   TTS_SPEED_STORAGE_KEY,
 } from '../services/audio';
 import audioService from '../services/audio';
+import gpsTrackingService from '../services/gpsTracking';
 import { isVerboseLogging, setVerboseLogging } from '../services/logger';
 import {
   isWakeReadySoundEnabled,
@@ -121,6 +122,7 @@ const SettingsScreen: React.FC = () => {
   const [manualPort, setManualPort] = useState('8765');
   const [currentMode, setCurrentMode] = useState('normal');
   const [gpsEnabled, setGpsEnabled] = useState(false);
+  const [gpsTracking, setGpsTracking] = useState(gpsTrackingService.isActive());
   const [scannerVisible, setScannerVisible] = useState(false);
   const [logTab, setLogTab] = useState<LogTab>('live');
   const [logs, setLogs] = useState<LogEntry[]>([]);
@@ -155,6 +157,9 @@ const SettingsScreen: React.FC = () => {
   const [fileManagerError, setFileManagerError] = useState('');
   const [fileManagerSearch, setFileManagerSearch] = useState('');
   const [fileManagerFilter, setFileManagerFilter] = useState<'all' | 'aria' | 'user'>('all');
+  const [fileManagerSelected, setFileManagerSelected] = useState<Set<string>>(new Set());
+  const fileZipPending = useRef<string | null>(null); // requestId for the ZIP response
+  const [fileZipBusy, setFileZipBusy] = useState(false);
   const [voiceCloneVisible, setVoiceCloneVisible] = useState(false);
   const [tempPath, setTempPath] = useState('');
   // Sub-screen navigation: null = main menu, otherwise one of the section IDs.
@@ -185,6 +190,11 @@ const SettingsScreen: React.FC = () => {
     AsyncStorage.getItem('aria_gps_enabled').then(saved => {
       if (saved !== null) setGpsEnabled(saved === 'true');
     });
+    // Sync the gpsTrackingService status + listen for changes
+    setGpsTracking(gpsTrackingService.isActive());
+    const offGps = gpsTrackingService.onChange(setGpsTracking);
+    // Restore the persisted status (was tracking on last time?)
+    gpsTrackingService.restoreFromStorage().catch(() => {});
     AsyncStorage.getItem(TTS_PREROLL_STORAGE_KEY).then(saved => {
       if (saved != null) {
         const n = parseFloat(saved);
@@ -242,6 +252,10 @@ const SettingsScreen: React.FC = () => {
     });
     // Fetch the voice list from the XTTS server (via RVS)
     rvs.send('xtts_list_voices' as any, {});
+    return () => {
+      // Unsubscribe the gpsTrackingService listener (offGps defined above)
+      try { offGps(); } catch {}
+    };
   }, []);

   // Calculate the storage size
@@ -395,9 +409,51 @@ const SettingsScreen: React.FC = () => {
       const p: any = message.payload || {};
       if (p.path) {
         setFileManagerFiles(prev => prev.filter(f => f.path !== p.path));
+        setFileManagerSelected(prev => {
+          if (!prev.has(p.path)) return prev;
+          const next = new Set(prev);
+          next.delete(p.path);
+          return next;
+        });
       }
     }

+    // ARIA requests GPS tracking on/off (tool request_location_tracking)
+    if (message.type === ('location_tracking' as any)) {
+      const p: any = message.payload || {};
+      const on = !!p.on;
+      const reason = (p.reason as string) || 'ARIA';
+      if (on) {
+        gpsTrackingService.start(reason).catch(() => {});
+      } else {
+        gpsTrackingService.stop(reason);
+      }
+    }
+
+    // File manager: ZIP response (multi-download)
+    if (message.type === ('file_zip_response' as any)) {
+      const p: any = message.payload || {};
+      if (p.requestId && p.requestId !== fileZipPending.current) return; // stale
+      fileZipPending.current = null;
+      setFileZipBusy(false);
+      if (!p.ok || !p.data) {
+        ToastAndroid.show('ZIP fehlgeschlagen: ' + (p.error || 'unbekannt'), ToastAndroid.LONG);
+        return;
+      }
+      // base64 → write into the Downloads folder
+      (async () => {
+        try {
+          const ts = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
+          const dir = RNFS.DownloadDirectoryPath;
+          const filePath = `${dir}/aria-files-${ts}.zip`;
+          await RNFS.writeFile(filePath, p.data, 'base64');
+          ToastAndroid.show(`ZIP gespeichert: ${filePath} (${Math.round((p.size||0)/1024)} KB)`, ToastAndroid.LONG);
+        } catch (e: any) {
+          ToastAndroid.show('ZIP speichern fehlgeschlagen: ' + e.message, ToastAndroid.LONG);
+        }
+      })();
+    }
+
     // Voice was saved → reload the list + select it if applicable
     if (message.type === ('xtts_voice_saved' as any)) {
       const name = (message.payload as any).name as string;
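The `file_zip_request`/`file_zip_response` pair used by the bulk download suggests the bridge builds the archive in memory and replies with base64. A sketch under that assumption; the field names mirror the app-side code above, but the server function itself is hypothetical:

```python
import base64
import io
import os
import zipfile

def build_zip_response(paths, request_id):
    """Zip the given files in memory and return a file_zip_response-shaped
    payload with base64 data, matching what the app reads (ok/data/size)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in paths:
            if os.path.isfile(path):  # silently skip files deleted meanwhile
                zf.write(path, arcname=os.path.basename(path))
    raw = buf.getvalue()
    return {
        "requestId": request_id,
        "ok": True,
        "size": len(raw),
        "data": base64.b64encode(raw).decode("ascii"),
    }
```

Base64 over the existing RVS message channel avoids a second transport for binary data, at the cost of ~33% payload overhead, which is acceptable for occasional bulk downloads.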
@@ -644,64 +700,170 @@ const SettingsScreen: React.FC = () => {
|
||||
<Text style={{color:'#8888AA', textAlign:'center', marginTop:20}}>Lade...</Text>
|
||||
) : fileManagerError ? (
|
||||
<Text style={{color:'#FF6B6B', textAlign:'center', marginTop:20}}>{fileManagerError}</Text>
|
||||
) : (
|
||||
<ScrollView style={{flex:1}} contentContainerStyle={{padding:12}}>
|
||||
{(() => {
|
||||
let files = fileManagerFiles;
|
||||
if (fileManagerFilter === 'aria') files = files.filter(f => f.fromAria);
|
||||
else if (fileManagerFilter === 'user') files = files.filter(f => !f.fromAria);
|
||||
if (fileManagerSearch) {
|
||||
const q = fileManagerSearch.toLowerCase();
|
||||
files = files.filter(f => f.name.toLowerCase().includes(q));
|
||||
}
|
||||
if (!files.length) {
|
||||
return <Text style={{color:'#555570', textAlign:'center', marginTop:20}}>Keine Dateien</Text>;
|
||||
}
|
||||
const fmtSize = (b: number) => b < 1024 ? `${b} B` : b < 1024*1024 ? `${(b/1024).toFixed(1)} KB` : `${(b/1024/1024).toFixed(1)} MB`;
|
||||
return files.map(f => (
|
||||
<View key={f.path} style={{
|
||||
backgroundColor:'#0D0D1A', padding:12, borderRadius:8, marginBottom:8,
|
||||
flexDirection:'row', alignItems:'center', gap:8,
|
||||
}}>
|
||||
<View style={{flex:1}}>
|
||||
<View style={{flexDirection:'row', alignItems:'center'}}>
|
||||
) : (() => {
  // Visible files (filter + search)
  let files = fileManagerFiles;
  if (fileManagerFilter === 'aria') files = files.filter(f => f.fromAria);
  else if (fileManagerFilter === 'user') files = files.filter(f => !f.fromAria);
  if (fileManagerSearch) {
    const q = fileManagerSearch.toLowerCase();
    files = files.filter(f => f.name.toLowerCase().includes(q));
  }
  const visiblePaths = files.map(f => f.path);
  const selectedHere = visiblePaths.filter(p => fileManagerSelected.has(p));
  const allSelected = visiblePaths.length > 0 && selectedHere.length === visiblePaths.length;
  const fmtSize = (b: number) => b < 1024 ? `${b} B` : b < 1024*1024 ? `${(b/1024).toFixed(1)} KB` : `${(b/1024/1024).toFixed(1)} MB`;

  const toggleSelectAll = () => {
    setFileManagerSelected(prev => {
      const next = new Set(prev);
      if (allSelected) visiblePaths.forEach(p => next.delete(p));
      else visiblePaths.forEach(p => next.add(p));
      return next;
    });
  };
  const toggleOne = (p: string) => {
    setFileManagerSelected(prev => {
      const next = new Set(prev);
      if (next.has(p)) next.delete(p);
      else next.add(p);
      return next;
    });
  };
  const bulkDelete = () => {
    const paths = [...fileManagerSelected];
    if (!paths.length) return;
    Alert.alert(
      `${paths.length} Dateien löschen?`,
      'In allen Chat-Bubbles werden sie als gelöscht markiert.',
      [
        { text: 'Abbrechen', style: 'cancel' },
        { text: 'Löschen', style: 'destructive', onPress: () => {
          rvs.send('file_delete_batch_request' as any, { paths, requestId: 'batch-' + Date.now() });
          setFileManagerSelected(new Set());
          ToastAndroid.show(`${paths.length} Lösch-Befehle gesendet…`, ToastAndroid.SHORT);
        }},
      ],
    );
  };
  const bulkDownload = () => {
    const paths = [...fileManagerSelected];
    if (!paths.length) return;
    // Single file: plain file_request (existing pattern). ZIP only for 2+.
    if (paths.length === 1) {
      rvs.send('file_request' as any, { serverPath: paths[0], requestId: 'single-' + Date.now() });
      ToastAndroid.show('Datei wird heruntergeladen…', ToastAndroid.SHORT);
      return;
    }
    const reqId = 'zip-' + Date.now();
    fileZipPending.current = reqId;
    setFileZipBusy(true);
    rvs.send('file_zip_request' as any, { paths, requestId: reqId });
    ToastAndroid.show(`ZIP wird erstellt (${paths.length} Dateien)…`, ToastAndroid.LONG);
  };

  return (
    <>
      {/* Bulk-Bar */}
      <View style={{paddingHorizontal:12, paddingBottom:8, flexDirection:'row', alignItems:'center', gap:8, flexWrap:'wrap'}}>
        <TouchableOpacity onPress={toggleSelectAll} style={{flexDirection:'row', alignItems:'center', gap:6, paddingVertical:4}}>
          <View style={{
            width:18, height:18, borderRadius:3,
            borderWidth:2, borderColor: allSelected ? '#0096FF' : '#555570',
            backgroundColor: allSelected ? '#0096FF' : 'transparent',
            alignItems:'center', justifyContent:'center',
          }}>
            {allSelected && <Text style={{color:'#fff', fontSize:11, fontWeight:'bold'}}>✓</Text>}
          </View>
          <Text style={{color:'#E0E0F0', fontSize:13}}>Alle markieren</Text>
        </TouchableOpacity>
        {fileManagerSelected.size > 0 && (
          <>
            <Text style={{color:'#555570', fontSize:13}}>·</Text>
            <Text style={{color:'#0096FF', fontSize:13, fontWeight:'600'}}>{fileManagerSelected.size} ausgewählt</Text>
            <TouchableOpacity
              onPress={bulkDownload}
              disabled={fileZipBusy}
              style={{paddingVertical:4, paddingHorizontal:10, borderRadius:6, backgroundColor:'#0096FF22', opacity: fileZipBusy ? 0.5 : 1}}
            >
              <Text style={{color:'#0096FF', fontSize:12}}>{fileZipBusy ? '⏳ ZIP…' : (fileManagerSelected.size > 1 ? '⬇ ZIP' : '⬇ Download')}</Text>
            </TouchableOpacity>
            <TouchableOpacity
              onPress={bulkDelete}
              style={{paddingVertical:4, paddingHorizontal:10, borderRadius:6, backgroundColor:'#FF6B6B22'}}
            >
              <Text style={{color:'#FF6B6B', fontSize:12}}>🗑 Löschen</Text>
            </TouchableOpacity>
          </>
        )}
      </View>

      <ScrollView style={{flex:1}} contentContainerStyle={{padding:12, paddingTop:0}}>
        {!files.length ? (
          <Text style={{color:'#555570', textAlign:'center', marginTop:20}}>Keine Dateien</Text>
        ) : files.map(f => {
          const selected = fileManagerSelected.has(f.path);
          return (
            <TouchableOpacity
              key={f.path}
              onPress={() => toggleOne(f.path)}
              activeOpacity={0.7}
              style={{
                backgroundColor: selected ? '#1E2C44' : '#0D0D1A',
                padding:12, borderRadius:8, marginBottom:8,
                flexDirection:'row', alignItems:'center', gap:8,
                borderWidth: selected ? 1 : 0, borderColor:'#0096FF',
              }}
            >
              <View style={{
                backgroundColor: f.fromAria ? '#0096FF22' : '#34C75922',
                paddingHorizontal:6, paddingVertical:1, borderRadius:3, marginRight:6,
                width:18, height:18, borderRadius:3,
                borderWidth:2, borderColor: selected ? '#0096FF' : '#555570',
                backgroundColor: selected ? '#0096FF' : 'transparent',
                alignItems:'center', justifyContent:'center',
              }}>
                <Text style={{color: f.fromAria ? '#0096FF' : '#34C759', fontSize:9}}>
                  {f.fromAria ? 'ARIA' : 'USER'}
                {selected && <Text style={{color:'#fff', fontSize:11, fontWeight:'bold'}}>✓</Text>}
              </View>
              <View style={{flex:1}}>
                <View style={{flexDirection:'row', alignItems:'center'}}>
                  <View style={{
                    backgroundColor: f.fromAria ? '#0096FF22' : '#34C75922',
                    paddingHorizontal:6, paddingVertical:1, borderRadius:3, marginRight:6,
                  }}>
                    <Text style={{color: f.fromAria ? '#0096FF' : '#34C759', fontSize:9}}>
                      {f.fromAria ? 'ARIA' : 'USER'}
                    </Text>
                  </View>
                  <Text style={{color:'#E0E0F0', fontSize:13, flex:1}} numberOfLines={1}>{f.name}</Text>
                </View>
                <Text style={{color:'#555570', fontSize:10, marginTop:2}}>
                  {fmtSize(f.size)} · {new Date(f.mtime).toLocaleString('de-DE')}
                </Text>
              </View>
              <Text style={{color:'#E0E0F0', fontSize:13, flex:1}} numberOfLines={1}>{f.name}</Text>
            </View>
            <Text style={{color:'#555570', fontSize:10, marginTop:2}}>
              {fmtSize(f.size)} · {new Date(f.mtime).toLocaleString('de-DE')}
            </Text>
            </View>
            <TouchableOpacity
              onPress={() => {
                Alert.alert(
                  'Datei löschen?',
                  `"${f.name}"\n\nIn allen Chat-Bubbles wird sie als gelöscht markiert.`,
                  [
                    { text: 'Abbrechen', style: 'cancel' },
                    { text: 'Löschen', style: 'destructive', onPress: () => {
                      rvs.send('file_delete_request' as any, { path: f.path });
                      ToastAndroid.show('Lösch-Befehl gesendet…', ToastAndroid.SHORT);
                    }},
                  ],
                );
              }}
              style={{padding:8}}
            >
              <Text style={{color:'#FF6B6B', fontSize:18}}>🗑</Text>
            </TouchableOpacity>
          </View>
        ));
        })()}
      </ScrollView>
      )}
              <TouchableOpacity
                onPress={() => {
                  Alert.alert(
                    'Datei löschen?',
                    `"${f.name}"\n\nIn allen Chat-Bubbles wird sie als gelöscht markiert.`,
                    [
                      { text: 'Abbrechen', style: 'cancel' },
                      { text: 'Löschen', style: 'destructive', onPress: () => {
                        rvs.send('file_delete_request' as any, { path: f.path });
                        ToastAndroid.show('Lösch-Befehl gesendet…', ToastAndroid.SHORT);
                      }},
                    ],
                  );
                }}
                style={{padding:8}}
              >
                <Text style={{color:'#FF6B6B', fontSize:18}}>🗑</Text>
              </TouchableOpacity>
            </TouchableOpacity>
          );
        })}
      </ScrollView>
    </>
  );
})()}
</View>
</Modal>
<ScrollView style={styles.container} contentContainerStyle={styles.content}>

@@ -865,6 +1027,29 @@ const SettingsScreen: React.FC = () => {
            thumbColor={gpsEnabled ? '#FFFFFF' : '#666680'}
          />
        </View>

        {/* Continuous GPS tracking (for near() watchers) */}
        <View style={[styles.toggleRow, {marginTop: 12, borderTopWidth: 1, borderTopColor: '#1E1E2E', paddingTop: 12}]}>
          <View style={styles.toggleInfo}>
            <Text style={styles.toggleLabel}>GPS-Tracking (kontinuierlich)</Text>
            <Text style={styles.toggleHint}>
              Sendet alle ~15s deine Position an ARIA (wenn du dich {'>'}30m bewegt
              hast). Nur noetig fuer GPS-basierte Trigger wie Blitzer-Warner
              (near()-Conditions). ARIA kann das auch selbst an-/abschalten wenn
              sie einen GPS-Watcher anlegt. Akku-Verbrauch erhoeht — bei langer
              Fahrt einplanen.
            </Text>
          </View>
          <Switch
            value={gpsTracking}
            onValueChange={(v) => {
              if (v) gpsTrackingService.start('manuell').catch(() => {});
              else gpsTrackingService.stop('manuell');
            }}
            trackColor={{ false: '#2A2A3E', true: '#FF9500' }}
            thumbColor={gpsTracking ? '#FFFFFF' : '#666680'}
          />
        </View>
      </View>
    </>)}


@@ -0,0 +1,138 @@
/**
 * GPS tracking service.
 *
 * When active: pushes the current position every few seconds as
 * `location_update {lat, lon}` to the RVS server, so that brain watchers
 * with `near()` conditions have something to compare against.
 *
 * Default: OFF. Enabled either manually by the user in Settings, or by
 * ARIA via a location_tracking RVS message (brain tool
 * `request_location_tracking`).
 *
 * Battery safeguard: distanceFilter 30m, interval 15s. Real driving
 * updates (speed) come through cleanly; when stationary almost nothing
 * is sent.
 */

import AsyncStorage from '@react-native-async-storage/async-storage';
import { PermissionsAndroid, Platform, ToastAndroid } from 'react-native';
import Geolocation from '@react-native-community/geolocation';
import rvs from './rvs';

type Listener = (active: boolean) => void;

class GpsTrackingService {
  private watchId: number | null = null;
  private active = false;
  private listeners: Set<Listener> = new Set();
  // Defensive: avoid rapid public toggling
  private lastChangeAt = 0;

  isActive(): boolean {
    return this.active;
  }

  onChange(cb: Listener): () => void {
    this.listeners.add(cb);
    return () => { this.listeners.delete(cb); };
  }

  private notify() {
    for (const cb of this.listeners) {
      try { cb(this.active); } catch {}
    }
  }

  /** On app start: restore the persisted state (default off). */
  async restoreFromStorage(): Promise<void> {
    try {
      const v = await AsyncStorage.getItem('aria_gps_tracking');
      if (v === 'true') {
        console.log('[gps-track] Restore: war an, starte wieder');
        this.start('Beim Start wiederhergestellt');
      }
    } catch {}
  }

  private async ensurePermission(): Promise<boolean> {
    if (Platform.OS !== 'android') return true;
    try {
      const granted = await PermissionsAndroid.request(
        PermissionsAndroid.PERMISSIONS.ACCESS_FINE_LOCATION,
        {
          title: 'GPS-Tracking',
          message: 'ARIA braucht laufende Standort-Updates damit GPS-Watcher (Blitzer-Warner, near()) funktionieren.',
          buttonPositive: 'Erlauben',
          buttonNegative: 'Abbrechen',
        },
      );
      return granted === PermissionsAndroid.RESULTS.GRANTED;
    } catch (e) {
      console.warn('[gps-track] Permission-Fehler:', e);
      return false;
    }
  }

  async start(reason: string = ''): Promise<boolean> {
    if (this.active) return true;
    const ok = await this.ensurePermission();
    if (!ok) {
      ToastAndroid.show('GPS-Tracking: Berechtigung abgelehnt', ToastAndroid.LONG);
      return false;
    }
    try {
      this.watchId = Geolocation.watchPosition(
        (pos) => {
          const lat = pos.coords.latitude;
          const lon = pos.coords.longitude;
          rvs.send('location_update' as any, { lat, lon });
        },
        (err) => {
          console.warn('[gps-track] watchPosition error:', err?.code, err?.message);
        },
        {
          enableHighAccuracy: true,
          distanceFilter: 30,     // only send after moving 30m
          interval: 15000,        // (Android) desired frequency
          fastestInterval: 10000, // (Android) max frequency
        } as any,
      );
      this.active = true;
      this.lastChangeAt = Date.now();
      this.notify();
      AsyncStorage.setItem('aria_gps_tracking', 'true').catch(() => {});
      ToastAndroid.show(
        reason ? `GPS-Tracking aktiv (${reason})` : 'GPS-Tracking aktiv',
        ToastAndroid.SHORT,
      );
      console.log('[gps-track] gestartet', reason ? `(${reason})` : '');
      return true;
    } catch (e: any) {
      console.warn('[gps-track] start fehlgeschlagen:', e?.message);
      return false;
    }
  }

  stop(reason: string = ''): void {
    if (!this.active) return;
    if (this.watchId !== null) {
      try { Geolocation.clearWatch(this.watchId); } catch {}
      this.watchId = null;
    }
    this.active = false;
    this.lastChangeAt = Date.now();
    this.notify();
    AsyncStorage.setItem('aria_gps_tracking', 'false').catch(() => {});
    ToastAndroid.show(
      reason ? `GPS-Tracking aus (${reason})` : 'GPS-Tracking aus',
      ToastAndroid.SHORT,
    );
    console.log('[gps-track] gestoppt', reason ? `(${reason})` : '');
  }

  async toggle(reason: string = ''): Promise<void> {
    if (this.active) this.stop(reason);
    else await this.start(reason);
  }
}

export default new GpsTrackingService();
+195 −1
@@ -25,6 +25,8 @@ from memory import Embedder, VectorStore, MemoryPoint
from prompts import build_system_prompt
from proxy_client import ProxyClient, Message as ProxyMessage
import skills as skills_mod
import triggers as triggers_mod
import watcher as watcher_mod

logger = logging.getLogger(__name__)

@@ -90,6 +92,120 @@ META_TOOLS = [
            "parameters": {"type": "object", "properties": {}},
        },
    },
    {
        "type": "function",
        "function": {
            "name": "trigger_timer",
            "description": (
                "Lege einen Timer-Trigger an — feuert EINMALIG und ruft dich dann selbst auf "
                "(Push-Nachricht an Stefan). Use-Case: 'erinnere mich in 10min', "
                "'sag mir um 14:30 Bescheid'. Genau EINES von `in_seconds` ODER `fires_at` "
                "muss gesetzt sein."
            ),
            "parameters": {
                "type": "object",
                "properties": {
                    "name": {"type": "string", "description": "kurzer kebab-case-Name, a-z 0-9 - _"},
                    "in_seconds": {
                        "type": "integer",
                        "description": (
                            "Relativ ab jetzt in Sekunden. Bevorzugt bei Angaben wie "
                            "'in 2 Minuten' (=120), 'in 1 Stunde' (=3600). "
                            "Server berechnet daraus den absoluten Feuer-Zeitpunkt."
                        ),
                    },
                    "fires_at": {
                        "type": "string",
                        "description": (
                            "Absoluter ISO-Timestamp UTC fuer feste Termine, z.B. "
                            "'2026-05-12T14:30:00Z'. Die aktuelle Zeit findest du im "
                            "System-Prompt unter '## Aktuelle Zeit'. Fuer relative Angaben "
                            "lieber `in_seconds` nutzen."
                        ),
                    },
                    "message": {"type": "string", "description": "Was soll bei der Erinnerung gesagt werden"},
                },
                "required": ["name", "message"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "trigger_watcher",
            "description": (
                "Lege einen Watcher-Trigger an — pollt alle paar Minuten eine Condition, "
                "feuert wenn sie wahr wird (mit Throttle damit's nicht spammt). "
                "Use-Case: 'sag bescheid wenn Disk unter 5GB', 'pingt mich wenn um 8 Uhr'. "
                "Welche Variablen verfuegbar sind und ihre Bedeutung steht im System-Prompt."
            ),
            "parameters": {
                "type": "object",
                "properties": {
                    "name": {"type": "string", "description": "kurzer Name"},
                    "condition": {
                        "type": "string",
                        "description": (
                            "Boolescher Ausdruck mit den erlaubten Variablen, z.B. "
                            "'disk_free_gb < 5', 'hour_of_day == 8 and day_of_week == \"mon\"'. "
                            "Operatoren: < > <= >= == != and or not"
                        ),
                    },
                    "message": {"type": "string", "description": "Was soll bei Erfuellung gesagt werden"},
                    "check_interval_sec": {
                        "type": "integer",
                        "description": "Wie oft Condition pruefen (Default 300 = alle 5min, min 30)",
                    },
                    "throttle_sec": {
                        "type": "integer",
                        "description": "Mindestabstand zwischen 2 Feuerungen (Default 3600 = max 1x/h)",
                    },
                },
                "required": ["name", "condition", "message"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "trigger_cancel",
            "description": "Loescht einen Trigger (Timer abbrechen oder Watcher entfernen).",
            "parameters": {
                "type": "object",
                "properties": {"name": {"type": "string"}},
                "required": ["name"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "trigger_list",
            "description": "Zeigt alle Trigger (active + inaktiv). Selten noetig — Stefan sieht sie im Diagnostic.",
            "parameters": {"type": "object", "properties": {}},
        },
    },
    {
        "type": "function",
        "function": {
            "name": "request_location_tracking",
            "description": (
                "Bittet die App, das kontinuierliche GPS-Tracking zu aktivieren oder zu "
                "deaktivieren. Default ist AUS (Akku-Schutz). Nutze das wenn du einen "
                "GPS-basierten Watcher anlegst (z.B. `near(...)`), sonst hat die App "
                "veraltete Position und der Watcher feuert nie. Auch wieder ausschalten "
                "wenn der letzte GPS-Watcher geloescht wurde."
            ),
            "parameters": {
                "type": "object",
                "properties": {
                    "on": {"type": "boolean", "description": "true = Tracking an, false = aus"},
                    "reason": {"type": "string", "description": "Kurzer Grund (wird in App-Notification angezeigt)"},
                },
                "required": ["on"],
            },
        },
    },
]
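The `condition` strings accepted by `trigger_watcher` are plain boolean expressions over whitelisted variables with a fixed operator set. The evaluator itself is not part of this diff; the following is a minimal sketch, under the assumption that it whitelists AST node types, of how such expressions can be evaluated safely with the stdlib `ast` module (names illustrative, not the real `watcher_mod.evaluate`):

```python
import ast
import operator as op

# Comparison operators permitted by the tool description: < > <= >= == !=
_ALLOWED_CMP = {ast.Lt: op.lt, ast.Gt: op.gt, ast.LtE: op.le,
                ast.GtE: op.ge, ast.Eq: op.eq, ast.NotEq: op.ne}

def evaluate(condition: str, variables: dict) -> bool:
    """Evaluate a watcher condition against a variable dict, rejecting
    any syntax outside names, constants, comparisons, and/or/not."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.BoolOp):  # and / or
            vals = [_eval(v) for v in node.values]
            return all(vals) if isinstance(node.op, ast.And) else any(vals)
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.Not):
            return not _eval(node.operand)
        if isinstance(node, ast.Compare):  # supports chained comparisons
            left = _eval(node.left)
            for cmp_op, comparator in zip(node.ops, node.comparators):
                right = _eval(comparator)
                if not _ALLOWED_CMP[type(cmp_op)](left, right):
                    return False
                left = right
            return True
        if isinstance(node, ast.Name):      # whitelisted variable lookup
            return variables[node.id]
        if isinstance(node, ast.Constant):  # numbers and strings
            return node.value
        raise ValueError(f"disallowed syntax: {type(node).__name__}")
    return bool(_eval(ast.parse(condition, mode="eval")))
```

Function calls such as `near(lat, lon, m)` would need one extra `ast.Call` branch dispatching to a whitelist of helpers; the sketch omits that.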

@@ -175,8 +291,16 @@ class Agent:
        active_skills = [s for s in all_skills if s.get("active", True)]
        tools = list(META_TOOLS) + [_skill_to_tool(s) for s in active_skills]

        # Trigger list + variable info for the system prompt
        all_triggers = triggers_mod.list_triggers(active_only=False)
        condition_vars = watcher_mod.describe_variables()
        condition_funcs = watcher_mod.describe_functions()

        # 5. System prompt + window messages
        system_prompt = build_system_prompt(hot, cold, skills=all_skills)
        system_prompt = build_system_prompt(hot, cold, skills=all_skills,
                                            triggers=all_triggers,
                                            condition_vars=condition_vars,
                                            condition_funcs=condition_funcs)
        messages = [ProxyMessage(role="system", content=system_prompt)]
        for t in self.conversation.window():
            messages.append(ProxyMessage(role=t.role, content=t.content))
@@ -273,6 +397,76 @@ class Agent:
                if err:
                    out += f"\nstderr:\n{err}"
                return out
            if name == "trigger_timer":
                fires_at_iso = arguments.get("fires_at")
                in_seconds = arguments.get("in_seconds")
                if not fires_at_iso and in_seconds is not None:
                    from datetime import datetime as _dt, timezone as _tz, timedelta as _td
                    try:
                        secs = int(in_seconds)
                    except (TypeError, ValueError):
                        return "FEHLER: in_seconds muss eine ganze Zahl sein."
                    if secs < 1:
                        return "FEHLER: in_seconds muss >= 1 sein."
                    fires_at_iso = (_dt.now(_tz.utc) + _td(seconds=secs)).isoformat(timespec="seconds")
                if not fires_at_iso:
                    return "FEHLER: entweder `in_seconds` ODER `fires_at` muss gesetzt sein."
                t = triggers_mod.create_timer(
                    name=arguments["name"],
                    fires_at_iso=fires_at_iso,
                    message=arguments["message"],
                    author="aria",
                )
                self._pending_events.append({
                    "type": "trigger_created",
                    "trigger": {"name": t["name"], "type": "timer",
                                "fires_at": t["fires_at"], "message": t["message"]},
                })
                return f"OK — Timer '{t['name']}' angelegt, feuert um {t['fires_at']}."
            if name == "trigger_watcher":
                t = triggers_mod.create_watcher(
                    name=arguments["name"],
                    condition=arguments["condition"],
                    message=arguments["message"],
                    check_interval_sec=int(arguments.get("check_interval_sec", 300)),
                    throttle_sec=int(arguments.get("throttle_sec", 3600)),
                    author="aria",
                )
                self._pending_events.append({
                    "type": "trigger_created",
                    "trigger": {"name": t["name"], "type": "watcher",
                                "condition": t["condition"], "message": t["message"]},
                })
                return f"OK — Watcher '{t['name']}' angelegt: feuert wenn '{t['condition']}'."
            if name == "trigger_cancel":
                try:
                    triggers_mod.delete(arguments["name"])
                    return f"OK — Trigger '{arguments['name']}' geloescht."
                except ValueError as e:
                    return f"FEHLER: {e}"
            if name == "request_location_tracking":
                on = bool(arguments.get("on", False))
                reason = (arguments.get("reason") or "").strip()
                self._pending_events.append({
                    "type": "location_tracking",
                    "on": on,
                    "reason": reason,
                })
                return f"OK — Tracking-Request gesendet (on={on}). App wird in Kuerze umschalten."
            if name == "trigger_list":
                items = triggers_mod.list_triggers(active_only=False)
                if not items:
                    return "(keine Trigger vorhanden)"
                lines = []
                for t in items:
                    state = "aktiv" if t.get("active", True) else "DEAKTIVIERT"
                    if t["type"] == "timer":
                        lines.append(f"- {t['name']} (timer, {state}): feuert {t.get('fires_at')} — \"{t.get('message','')[:50]}\"")
                    elif t["type"] == "watcher":
                        lines.append(f"- {t['name']} (watcher, {state}): cond=\"{t.get('condition')}\", throttle={t.get('throttle_sec')}s")
                    else:
                        lines.append(f"- {t['name']} ({t['type']}, {state})")
                return "\n".join(lines)
            return f"Unbekanntes Tool: {name}"
        except Exception as exc:
            logger.exception("Tool '%s' fehlgeschlagen", name)
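The `in_seconds` branch of the `trigger_timer` handler reduces to a small UTC computation. Isolated for clarity as a self-contained sketch (the helper name is illustrative and not part of the diff):

```python
from datetime import datetime, timezone, timedelta

def fires_at_from_relative(in_seconds) -> str:
    """Convert a relative 'in N seconds' into the absolute ISO-8601 UTC
    timestamp that the trigger manifest stores. Mirrors the validation
    in the trigger_timer tool handler above."""
    secs = int(in_seconds)  # raises TypeError/ValueError like the handler catches
    if secs < 1:
        raise ValueError("in_seconds must be >= 1")
    # timespec="seconds" drops microseconds, matching the handler's output
    return (datetime.now(timezone.utc) + timedelta(seconds=secs)).isoformat(timespec="seconds")
```

Example: `fires_at_from_relative(120)` yields a timestamp roughly two minutes in the future, ending in `+00:00`.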

@@ -0,0 +1,169 @@
"""
Background loop for triggers.

Runs every TICK_SEC seconds in an asyncio task, walks over all active
triggers, and decides whether they need to fire.

Firing means:
1. Update the trigger manifest (fire_count++, last_fired_at, possibly deactivate)
2. Write a log entry
3. Call agent.chat() with a system prefix (NOT as 'user'!)
   → ARIA receives it like a push message and can respond
"""

from __future__ import annotations

import asyncio
import logging
from datetime import datetime, timezone
from typing import Optional

import triggers as triggers_mod
import watcher as watcher_mod

logger = logging.getLogger(__name__)

TICK_SEC = 30


def _now_iso() -> str:
    return datetime.now(timezone.utc).isoformat()


def _parse_iso(s: str) -> Optional[datetime]:
    if not s:
        return None
    try:
        return datetime.fromisoformat(s.replace("Z", "+00:00"))
    except Exception:
        return None


def _should_fire(trigger: dict, vars_: dict, now: datetime) -> bool:
    if not trigger.get("active", True):
        return False
    t = trigger.get("type", "")

    if t == "timer":
        fires_at = _parse_iso(trigger.get("fires_at", ""))
        if not fires_at:
            return False
        if fires_at.tzinfo is None:
            fires_at = fires_at.replace(tzinfo=timezone.utc)
        return now >= fires_at

    if t == "watcher":
        # Respect the check interval (otherwise we poll too frantically)
        check_interval = int(trigger.get("check_interval_sec", 300))
        last_checked = _parse_iso(trigger.get("last_checked_at", ""))
        if last_checked:
            if last_checked.tzinfo is None:
                last_checked = last_checked.replace(tzinfo=timezone.utc)
            if (now - last_checked).total_seconds() < check_interval:
                return False
        # Throttle: only fire if last_fired is long enough ago
        last_fired = _parse_iso(trigger.get("last_fired_at", ""))
        throttle = int(trigger.get("throttle_sec", 3600))
        if last_fired:
            if last_fired.tzinfo is None:
                last_fired = last_fired.replace(tzinfo=timezone.utc)
            if (now - last_fired).total_seconds() < throttle:
                return False
        # Evaluate the condition
        cond = (trigger.get("condition") or "").strip()
        if not cond:
            return False
        try:
            return watcher_mod.evaluate(cond, vars_)
        except Exception as e:
            logger.warning("Trigger %s: condition '%s' invalid: %s",
                           trigger.get("name"), cond, e)
            return False

    if t == "cron":
        # TODO: later, once someone feels like writing a cron parser
        return False

    return False


async def _fire(trigger: dict, agent_factory) -> None:
    """Calls ARIA with a system-prefix message."""
    name = trigger.get("name", "?")
    message = trigger.get("message") or "(ohne Nachricht)"
    ttype = trigger.get("type", "?")

    # Update the manifest
    try:
        triggers_mod.mark_fired(name)
    except Exception as e:
        logger.warning("mark_fired %s: %s", name, e)

    # Log
    triggers_mod.append_log(name, {"event": "fired", "type": ttype, "message": message})

    # System message to ARIA: not as the user, but as a hint
    prompt = (
        f"[Trigger ausgelöst: '{name}', Typ: {ttype}] "
        f"Geplante Nachricht: \"{message}\". "
        f"Sage Stefan jetzt diese Information, in deinem Stil. "
        f"Wenn der Trigger ein Watcher war (Bedingung wurde erfuellt), "
        f"erwaehne kurz worum es geht. Antworte direkt, keine Rueckfrage."
    )

    try:
        agent = agent_factory()
        reply = agent.chat(prompt, source="trigger")
        logger.info("[trigger] %s gefeuert → ARIA-Reply: %s", name, reply[:80])
        triggers_mod.append_log(name, {"event": "reply", "text": reply[:500]})
    except Exception as e:
        logger.exception("Trigger %s feuern fehlgeschlagen: %s", name, e)
        triggers_mod.append_log(name, {"event": "error", "error": str(e)[:300]})


async def _tick(agent_factory) -> None:
    """One check pass. Walks over all triggers, fires whatever is due."""
    try:
        all_triggers = triggers_mod.list_triggers(active_only=True)
    except Exception as e:
        logger.warning("triggers.list: %s", e)
        return
    if not all_triggers:
        return
    now = datetime.now(timezone.utc)
    # Collect the variables once per tick (not per trigger; the disk stat is expensive)
    try:
        vars_ = watcher_mod.collect_variables()
    except Exception as e:
        logger.warning("collect_variables: %s", e)
        vars_ = {}

    # Watchers: update last_checked_at now (even when nothing fires, so the
    # check interval is respected)
    for t in all_triggers:
        if t.get("type") == "watcher":
            try:
                t["last_checked_at"] = _now_iso()
                triggers_mod.write(t["name"], t)
            except Exception:
                pass

    for trigger in all_triggers:
        try:
            if _should_fire(trigger, vars_, now):
                # Fire as its own task: if ARIA answers slowly,
                # the next tick must not block
                asyncio.create_task(_fire(trigger, agent_factory))
        except Exception as e:
            logger.warning("Trigger check %s: %s", trigger.get("name"), e)


async def run_loop(agent_factory) -> None:
    """Endless loop; started and stopped by the main lifespan."""
    logger.info("Trigger-Loop gestartet (TICK_SEC=%d)", TICK_SEC)
    while True:
        try:
            await _tick(agent_factory)
        except Exception as e:
            logger.exception("Tick-Fehler: %s", e)
        await asyncio.sleep(TICK_SEC)
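The `run_loop`/`_tick` split above follows a common asyncio pattern: a periodic driver that never dies, with each firing dispatched as its own task so a slow handler cannot block the next tick. A stripped-down, self-contained sketch of that pattern (names illustrative, bounded to a few iterations only so it can terminate in a demo):

```python
import asyncio

async def periodic(tick, interval: float, ticks: int) -> None:
    """Drive `tick()` every `interval` seconds for `ticks` iterations.
    A tick failure is logged, never fatal (mirrors run_loop); the real
    loop runs forever, `ticks` exists only for demonstration."""
    for _ in range(ticks):
        try:
            await tick()
        except Exception as e:
            print("tick error:", e)
        await asyncio.sleep(interval)

async def demo() -> list:
    fired = []
    counter = 0
    async def tick():
        nonlocal counter
        counter += 1
        if counter == 2:
            raise RuntimeError("boom")  # swallowed by periodic(), loop keeps going
        # dispatch the "firing" as a separate task so the driver never blocks on it
        asyncio.create_task(asyncio.sleep(0))
        fired.append(counter)
    await periodic(tick, 0.01, 4)
    return fired

# asyncio.run(demo()) → [1, 3, 4]  (tick 2 raised and was swallowed)
```

The key design choice mirrored here is that exceptions are contained at two levels: inside each dispatched task and around the tick itself, so one broken trigger can never kill the whole loop.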
|
||||
+135
-1
@@ -20,6 +20,9 @@ import logging
|
||||
import os
|
||||
from typing import List, Optional
|
||||
|
||||
import asyncio
|
||||
from contextlib import asynccontextmanager
|
||||
|
||||
from fastapi import FastAPI, HTTPException, BackgroundTasks, Request
|
||||
from fastapi.responses import Response
|
||||
from pydantic import BaseModel, Field
|
||||
@@ -29,6 +32,10 @@ from conversation import Conversation
|
||||
from proxy_client import ProxyClient
|
||||
from agent import Agent
|
||||
import skills as skills_mod
|
||||
import metrics as metrics_mod
|
||||
import triggers as triggers_mod
|
||||
import watcher as watcher_mod
|
||||
import background as background_mod
|
||||
|
||||
logging.basicConfig(level=logging.INFO, format="%(asctime)s [%(levelname)s] %(name)s: %(message)s")
|
||||
logger = logging.getLogger("aria-brain")
|
||||
@@ -36,7 +43,23 @@ logger = logging.getLogger("aria-brain")
|
||||
QDRANT_HOST = os.environ.get("QDRANT_HOST", "aria-qdrant")
|
||||
QDRANT_PORT = int(os.environ.get("QDRANT_PORT", "6333"))
|
||||
|
||||
app = FastAPI(title="ARIA Brain", version="0.1.0")
|
||||
@asynccontextmanager
|
||||
async def lifespan(app: FastAPI):
|
||||
"""Beim Brain-Start: Trigger-Background-Loop anwerfen. Beim Shutdown: stoppen."""
|
||||
task = asyncio.create_task(background_mod.run_loop(agent))
|
||||
logger.info("Lifespan: Trigger-Loop gestartet")
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
task.cancel()
|
||||
try:
|
||||
await task
|
||||
except asyncio.CancelledError:
|
||||
pass
|
||||
logger.info("Lifespan: Trigger-Loop gestoppt")
|
||||
|
||||
|
||||
app = FastAPI(title="ARIA Brain", version="0.1.0", lifespan=lifespan)
|
||||
|
||||
_embedder: Optional[Embedder] = None
|
||||
_store: Optional[VectorStore] = None
|
||||
@@ -404,6 +427,117 @@ def conversation_distill_now():
|
||||
return agent().distill_old_turns()
|
||||
|
||||
|
||||
# ─── Call-Metrics (Token / Quota-Monitoring) ────────────────────────
|
||||
|
||||
@app.get("/metrics/calls")
|
||||
def metrics_calls():
|
||||
"""Liefert Aggregate fuer 1h / 5h / 24h / 30d.
|
||||
Jedes Window: {window_seconds, calls, tokens_in, tokens_out, by_model}."""
|
||||
return metrics_mod.stats()
|
||||
|
||||
|
||||
# ─── Triggers (passive Aufweck-Quellen) ─────────────────────────────
|
||||
|
||||
class TriggerTimerBody(BaseModel):
|
||||
name: str
|
||||
fires_at: str # ISO timestamp
|
||||
message: str
|
||||
author: str = "stefan"
|
||||
|
||||
|
||||
class TriggerWatcherBody(BaseModel):
|
||||
name: str
|
||||
    condition: str
    message: str
    check_interval_sec: int = 300
    throttle_sec: int = 3600
    author: str = "stefan"


class TriggerPatch(BaseModel):
    active: bool | None = None
    message: str | None = None
    condition: str | None = None
    throttle_sec: int | None = None
    check_interval_sec: int | None = None
    fires_at: str | None = None


@app.get("/triggers/list")
def triggers_list(active_only: bool = False):
    return {"triggers": triggers_mod.list_triggers(active_only=active_only)}


@app.get("/triggers/conditions")
def triggers_conditions():
    """Available variables + functions for watcher conditions
    (with their current values)."""
    current = watcher_mod.collect_variables()
    # near() is a callable in vars_; filter it out for the UI
    serializable = {k: v for k, v in current.items() if not callable(v)}
    return {
        "variables": watcher_mod.describe_variables(),
        "functions": watcher_mod.describe_functions(),
        "current": serializable,
    }


@app.get("/triggers/{name}")
def triggers_get(name: str):
    t = triggers_mod.read(name)
    if t is None:
        raise HTTPException(404, f"Trigger '{name}' nicht gefunden")
    return t


@app.get("/triggers/{name}/logs")
def triggers_get_logs(name: str, limit: int = 50):
    return {"logs": triggers_mod.list_logs(name, limit=limit)}


@app.post("/triggers/timer")
def triggers_create_timer(body: TriggerTimerBody):
    try:
        return triggers_mod.create_timer(
            name=body.name, fires_at_iso=body.fires_at,
            message=body.message, author=body.author,
        )
    except ValueError as exc:
        raise HTTPException(400, str(exc))


@app.post("/triggers/watcher")
def triggers_create_watcher(body: TriggerWatcherBody):
    try:
        return triggers_mod.create_watcher(
            name=body.name, condition=body.condition,
            message=body.message,
            check_interval_sec=body.check_interval_sec,
            throttle_sec=body.throttle_sec,
            author=body.author,
        )
    except ValueError as exc:
        raise HTTPException(400, str(exc))


@app.patch("/triggers/{name}")
def triggers_patch(name: str, body: TriggerPatch):
    patch = {k: v for k, v in body.model_dump().items() if v is not None}
    try:
        return triggers_mod.update(name, patch)
    except ValueError as exc:
        raise HTTPException(404, str(exc))
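The PATCH handler drops unset (`None`) fields before handing the merge to the trigger store, which whitelists keys. A dependency-free sketch of that semantics, with plain dicts standing in for the Pydantic model dump (`build_patch` and `apply_patch` are illustrative names, not project functions):

```python
def build_patch(dump: dict) -> dict:
    # Drop unset (None) fields so a PATCH only touches what the client sent
    return {k: v for k, v in dump.items() if v is not None}

def apply_patch(trigger: dict, patch: dict, allowed: set[str]) -> dict:
    # Whitelist keys, mirroring the update() merge in the trigger store
    for k, v in patch.items():
        if k in allowed:
            trigger[k] = v
    return trigger

trigger = {"name": "demo", "active": True, "throttle_sec": 3600}
dump = {"active": False, "message": None, "condition": None,
        "throttle_sec": None, "check_interval_sec": None, "fires_at": None}
patched = apply_patch(trigger, build_patch(dump), {"active", "message", "throttle_sec"})
print(patched)  # {'name': 'demo', 'active': False, 'throttle_sec': 3600}
```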
@app.delete("/triggers/{name}")
def triggers_delete(name: str):
    try:
        triggers_mod.delete(name)
    except ValueError as exc:
        raise HTTPException(404, str(exc))
    return {"deleted": name}


# ─── Skills ─────────────────────────────────────────────────────────

class SkillCreate(BaseModel):
@@ -0,0 +1,133 @@
"""
Call metrics for the proxy client.

Each Claude call appends one entry to /data/metrics.jsonl:

{"ts": <ms>, "model": "...", "in": <tokens_in_estimate>, "out": <tokens_out_estimate>}

Token estimate: characters / 4 (Anthropic default heuristic). Not exact,
but good enough for quota monitoring. We do not sum in memory because the
brain container can be restarted; everything goes to disk.

Evaluation via aggregate(window_seconds): returns {calls, tokens_in, tokens_out}
for the last N seconds. Read lazily; no large data volumes expected
(at 1000 calls/day roughly 70 KB per day, about 2 MB per month).

Auto-rotate: above 50k lines the oldest 25k are cut off.
"""

from __future__ import annotations

import json
import logging
import os
import time
from pathlib import Path
from typing import List

logger = logging.getLogger(__name__)

METRICS_FILE = Path(os.environ.get("METRICS_FILE", "/data/metrics.jsonl"))
ROTATE_AT = 50_000
ROTATE_KEEP = 25_000
def _estimate_tokens(text: str) -> int:
    """Anthropic default: roughly 4 chars per token. Coarse but sufficient."""
    if not text:
        return 0
    return max(1, len(text) // 4)


def _messages_tokens(messages: list) -> int:
    total = 0
    for m in messages:
        # Pydantic model or dict
        if hasattr(m, "content"):
            total += _estimate_tokens(m.content or "")
        elif isinstance(m, dict):
            c = m.get("content") or ""
            if isinstance(c, str):
                total += _estimate_tokens(c)
    return total
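The duck-typed counting above (model-like objects vs. plain dicts) can be exercised without Pydantic; a trivial class with a `content` attribute stands in for the model here:

```python
def estimate_tokens(text: str) -> int:
    # ~4 characters per token (coarse heuristic), at least 1 for non-empty text
    return max(1, len(text) // 4) if text else 0

def messages_tokens(messages: list) -> int:
    total = 0
    for m in messages:
        if hasattr(m, "content"):           # model-like object
            total += estimate_tokens(m.content or "")
        elif isinstance(m, dict):           # plain dict message
            c = m.get("content") or ""
            if isinstance(c, str):
                total += estimate_tokens(c)
    return total

class Msg:
    def __init__(self, content):
        self.content = content

# 40 chars -> 10 tokens, 8 chars -> 2 tokens, total 12
print(messages_tokens([Msg("x" * 40), {"content": "y" * 8}]))  # 12
```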
def log_call(model: str, messages_in: list, reply_text: str = "") -> None:
    """Append one call metric. Robust against errors (silent fail)."""
    try:
        tokens_in = _messages_tokens(messages_in)
        tokens_out = _estimate_tokens(reply_text)
        line = json.dumps({
            "ts": int(time.time() * 1000),
            "model": model,
            "in": tokens_in,
            "out": tokens_out,
        })
        METRICS_FILE.parent.mkdir(parents=True, exist_ok=True)
        with METRICS_FILE.open("a", encoding="utf-8") as f:
            f.write(line + "\n")
        # Cheap rotate check without per-call IO: the token-sum modulo acts as
        # a pseudo-random sample, triggering the check on roughly 0.4% of calls
        if (tokens_in + tokens_out) % 1000 < 4:
            _maybe_rotate()
    except Exception as exc:
        logger.warning("metrics.log_call: %s", exc)


def _maybe_rotate() -> None:
    try:
        if not METRICS_FILE.exists():
            return
        with METRICS_FILE.open("r", encoding="utf-8") as f:
            lines = f.readlines()
        if len(lines) > ROTATE_AT:
            keep = lines[-ROTATE_KEEP:]
            METRICS_FILE.write_text("".join(keep), encoding="utf-8")
            logger.info("metrics rotated: %d → %d Zeilen", len(lines), len(keep))
    except Exception as exc:
        logger.warning("metrics rotate: %s", exc)
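The keep-last-N rotation can be sketched standalone against a temp file (thresholds shrunk for illustration; `rotate_jsonl` is an illustrative name, not the module's function):

```python
import tempfile
from pathlib import Path

def rotate_jsonl(path: Path, rotate_at: int, keep: int) -> int:
    """Truncate a line-oriented file to its newest `keep` lines once it
    exceeds `rotate_at` lines. Returns the resulting line count."""
    lines = path.read_text(encoding="utf-8").splitlines(keepends=True)
    if len(lines) > rotate_at:
        path.write_text("".join(lines[-keep:]), encoding="utf-8")
        return keep
    return len(lines)

with tempfile.TemporaryDirectory() as d:
    f = Path(d) / "metrics.jsonl"
    f.write_text("".join(f'{{"n": {i}}}\n' for i in range(120)), encoding="utf-8")
    n = rotate_jsonl(f, rotate_at=100, keep=50)
    # Oldest 70 lines dropped; newest 50 remain
    print(n)                                                      # 50
    print(f.read_text(encoding="utf-8").splitlines()[0])          # {"n": 70}
```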
def aggregate(window_seconds: int) -> dict:
    """Aggregates the calls of the last N seconds."""
    now_ms = int(time.time() * 1000)
    cutoff_ms = now_ms - (window_seconds * 1000)
    calls = 0
    tokens_in = 0
    tokens_out = 0
    by_model: dict[str, int] = {}
    if METRICS_FILE.exists():
        try:
            for raw in METRICS_FILE.read_text(encoding="utf-8").splitlines():
                raw = raw.strip()
                if not raw:
                    continue
                try:
                    obj = json.loads(raw)
                except Exception:
                    continue
                if obj.get("ts", 0) < cutoff_ms:
                    continue
                calls += 1
                tokens_in += int(obj.get("in") or 0)
                tokens_out += int(obj.get("out") or 0)
                m = obj.get("model", "?")
                by_model[m] = by_model.get(m, 0) + 1
        except Exception as exc:
            logger.warning("metrics aggregate: %s", exc)
    return {
        "window_seconds": window_seconds,
        "calls": calls,
        "tokens_in": tokens_in,
        "tokens_out": tokens_out,
        "by_model": by_model,
    }


def stats() -> dict:
    """Full snapshot with the four most-used windows."""
    return {
        "h1": aggregate(3600),
        "h5": aggregate(5 * 3600),
        "h24": aggregate(24 * 3600),
        "d30": aggregate(30 * 24 * 3600),
    }
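The windowed aggregation is a simple cutoff filter over timestamped entries. A self-contained sketch over an in-memory list (file IO stripped; `aggregate_entries` is an illustrative name):

```python
def aggregate_entries(entries: list, window_seconds: int, now_ms: int) -> dict:
    """Sum calls/tokens of all entries newer than now - window (timestamps in ms)."""
    cutoff_ms = now_ms - window_seconds * 1000
    out = {"calls": 0, "tokens_in": 0, "tokens_out": 0, "by_model": {}}
    for obj in entries:
        if obj.get("ts", 0) < cutoff_ms:
            continue
        out["calls"] += 1
        out["tokens_in"] += int(obj.get("in") or 0)
        out["tokens_out"] += int(obj.get("out") or 0)
        m = obj.get("model", "?")
        out["by_model"][m] = out["by_model"].get(m, 0) + 1
    return out

now = 1_000_000_000_000
entries = [
    {"ts": now - 10_000, "model": "a", "in": 100, "out": 50},    # 10 s old: inside 1 h window
    {"ts": now - 7_200_000, "model": "b", "in": 999, "out": 1},  # 2 h old: outside 1 h window
]
print(aggregate_entries(entries, 3600, now))
# {'calls': 1, 'tokens_in': 100, 'tokens_out': 50, 'by_model': {'a': 1}}
```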
@@ -15,10 +15,34 @@ mit dem Conversation-Loop in spaeteren Phasen.

from __future__ import annotations

from datetime import datetime, timezone, timedelta
from typing import List

from memory import MemoryPoint


def build_time_section() -> str:
    """Current timestamp, so ARIA can create timers correctly and
    watcher conditions with hour_of_day etc. stay interpretable."""
    now_utc = datetime.now(timezone.utc)
    # Europe/Berlin: summer time CEST = UTC+2, winter time CET = UTC+1.
    # We take the simple case (no zoneinfo import needed in the brain image);
    # Stefan's VM runs on UTC, the bridge at home, so display accuracy is enough.
    # Note: this month-based rule is only approximate around the DST switches.
    local_offset_h = 2 if 3 <= now_utc.month <= 10 else 1
    local = now_utc + timedelta(hours=local_offset_h)
    lines = [
        "## Aktuelle Zeit",
        f"- UTC: {now_utc.isoformat(timespec='seconds')}",
        f"- Lokal (Europa/Berlin, UTC+{local_offset_h}): "
        f"{local.strftime('%Y-%m-%d %H:%M:%S')} ({local.strftime('%A')})",
        "",
        "Nutze das fuer Trigger-Timestamps und um Watcher-Conditions wie "
        "`hour_of_day == 8` einzuordnen. Fuer relative Angaben "
        "('in 10min', 'in 2 Stunden') nutze beim `trigger_timer` den "
        "`in_seconds`-Parameter — Server rechnet dann selbst.",
    ]
    return "\n".join(lines)


TYPE_HEADINGS = {
    "identity": "## Wer du bist",
    "rule": "## Sicherheitsregeln & Prinzipien",

@@ -115,16 +139,75 @@ def build_skills_section(skills: List[dict]) -> str:
    return "\n".join(lines)


def build_triggers_section(
    triggers: List[dict],
    condition_vars: List[dict],
    condition_funcs: List[dict] | None = None,
) -> str:
    """Triggers (passive wake-up sources) + available condition variables + functions."""
    lines = ["## Trigger (passive Aufweck-Quellen)"]
    lines.append("")
    lines.append("Trigger sind ANDERS als Skills: das System ruft DICH wenn ein Event passiert. "
                 "Du legst sie an wenn Stefan sagt 'erinner mich an X' oder 'sag bescheid wenn Y'.")
    lines.append("")
    if triggers:
        lines.append("### Aktuelle Trigger")
        for t in triggers:
            active = t.get("active", True)
            mark = "" if active else " [INAKTIV]"
            if t["type"] == "timer":
                lines.append(f"- **{t['name']}**{mark} (timer) feuert {t.get('fires_at')}: \"{t.get('message','')[:80]}\"")
            elif t["type"] == "watcher":
                lines.append(f"- **{t['name']}**{mark} (watcher) cond=`{t.get('condition')}`: \"{t.get('message','')[:80]}\"")
        lines.append("")
    lines.append("### Verfuegbare Condition-Variablen (fuer Watcher)")
    for v in condition_vars:
        lines.append(f"- `{v['name']}` ({v['type']}) — {v['desc']}")
    if condition_funcs:
        lines.append("")
        lines.append("### Verfuegbare Funktionen in Conditions")
        for fn in condition_funcs:
            lines.append(f"- `{fn['signature']}` — {fn['desc']}")
    lines.append("")
    lines.append("Operatoren in Conditions: `<` `>` `<=` `>=` `==` `!=` `and` `or` `not`. "
                 "Beispiele: `disk_free_gb < 5 and hour_of_day >= 8`, "
                 "`day_of_week == \"mon\"`, `near(53.123, 7.456, 500)`. "
                 "Funktionen nur mit Konstanten als Argumenten (keine Variablen, "
                 "keine geschachtelten Funktionen).")
    lines.append("")
    lines.append("### Wann welcher Typ?")
    lines.append("- **Timer** fuer einmalige Erinnerungen mit konkreter Zeit ('in 10min', 'um 14:30').")
    lines.append("- **Watcher** fuer 'wenn X passiert' (Disk voll, bestimmte Tageszeit, GPS-Naehe).")
    lines.append("- ARIA legt Trigger NUR auf Stefan-Wunsch an, nicht eigenmaechtig.")
    lines.append("")
    lines.append("### GPS-Watcher mit near()")
    lines.append(
        "Wenn du einen Watcher mit `near()` anlegst: die App sendet GPS-Position "
        "nur kontinuierlich wenn Tracking AN ist (Default: AUS, Akku-Schutz). "
        "Rufe dafuer `request_location_tracking(on=true, reason=\"...\")` auf "
        "bevor oder gleich nach dem trigger_watcher. Sonst hat current_lat/lon "
        "veraltete Werte und der Watcher feuert nie. "
        "Beim Loeschen des letzten GPS-Watchers (trigger_cancel) wieder "
        "`request_location_tracking(on=false)` aufrufen.")
    return "\n".join(lines)


def build_system_prompt(
    pinned: List[MemoryPoint],
    cold: List[MemoryPoint] | None = None,
    skills: List[dict] | None = None,
    triggers: List[dict] | None = None,
    condition_vars: List[dict] | None = None,
    condition_funcs: List[dict] | None = None,
) -> str:
    """Complete system prompt: Hot + Cold + Skills + Triggers."""
    parts = [build_hot_memory_section(pinned), "", build_time_section()]
    if skills:
        parts.append("")
        parts.append(build_skills_section(skills))
    if condition_vars:
        parts.append("")
        parts.append(build_triggers_section(triggers or [], condition_vars, condition_funcs))
    if cold:
        parts.append("")
        parts.append(build_cold_memory_section(cold))
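The Europe/Berlin offset rule in `build_time_section` is a deliberate shortcut. Isolated below, with the caveat stated in the code that real DST switches on the last Sundays of March and October, so the month-only rule is off for parts of those two months (`berlin_offset_hours` is an illustrative name):

```python
from datetime import datetime, timedelta, timezone

def berlin_offset_hours(month: int) -> int:
    # Crude heuristic from the prompt builder: Mar..Oct -> CEST (UTC+2),
    # otherwise CET (UTC+1). The March/October edges are approximate, since
    # the real switch happens on the last Sunday of those months.
    return 2 if 3 <= month <= 10 else 1

now_utc = datetime(2026, 7, 1, 12, 0, tzinfo=timezone.utc)
local = now_utc + timedelta(hours=berlin_offset_hours(now_utc.month))
print(local.strftime("%Y-%m-%d %H:%M"))  # 2026-07-01 14:00
```

For exact switchover handling the stdlib `zoneinfo.ZoneInfo("Europe/Berlin")` would be the robust choice; the module avoids it only to keep the brain image dependency-free.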
@@ -18,6 +18,8 @@ from typing import List, Optional
import httpx
from pydantic import BaseModel

import metrics

logger = logging.getLogger(__name__)

RUNTIME_CONFIG_FILE = Path("/shared/config/runtime.json")

@@ -135,6 +137,9 @@ class ProxyClient:
                "arguments": args,
            })

        # Append a call metric: token estimate for quota monitoring
        metrics.log_call(payload["model"], messages, content or "")

        return ProxyResult(content=content or "", tool_calls=tool_calls, finish_reason=finish_reason)

    def close(self):
@@ -0,0 +1,229 @@
"""
Triggers: passive wake-up sources for ARIA.

Skills are active (ARIA calls them). Triggers are passive: the system calls
ARIA when an event happens. Three types:

timer     One-shot, fires at a fixed point in time
watcher   Recurring: check a condition, fire on True (with throttle)
cron      Cron expression (not implemented yet, placeholder)

Layout:
/data/triggers/<name>.json        one manifest per trigger
/data/triggers/logs/<name>.jsonl  append-only log per firing

Polling cost: brain-internal background polling (no LLM call).
ARIA is only woken up when a trigger actually fires.
"""

from __future__ import annotations

import json
import logging
import os
import re
import shutil
import time
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional

logger = logging.getLogger(__name__)

TRIGGERS_DIR = Path(os.environ.get("TRIGGERS_DIR", "/data/triggers"))
LOGS_DIR = TRIGGERS_DIR / "logs"
NAME_RE = re.compile(r"^[a-zA-Z0-9_-]{2,60}$")
VALID_TYPES = {"timer", "watcher", "cron"}


def _now_iso() -> str:
    return datetime.now(timezone.utc).isoformat()


def _safe_name(name: str) -> str:
    if not isinstance(name, str) or not NAME_RE.match(name):
        raise ValueError(f"Ungueltiger Trigger-Name: {name!r}")
    return name


def _path(name: str) -> Path:
    return TRIGGERS_DIR / f"{_safe_name(name)}.json"


def _ensure_dirs():
    TRIGGERS_DIR.mkdir(parents=True, exist_ok=True)
    LOGS_DIR.mkdir(parents=True, exist_ok=True)


# ─── CRUD ───────────────────────────────────────────────────────────

def list_triggers(active_only: bool = False) -> list[dict]:
    if not TRIGGERS_DIR.exists():
        return []
    out: list[dict] = []
    for f in sorted(TRIGGERS_DIR.glob("*.json")):
        try:
            data = json.loads(f.read_text(encoding="utf-8"))
            if active_only and not data.get("active", True):
                continue
            out.append(data)
        except Exception as e:
            logger.warning("Trigger lesen %s: %s", f, e)
    return out


def read(name: str) -> Optional[dict]:
    p = _path(name)
    if not p.exists():
        return None
    try:
        return json.loads(p.read_text(encoding="utf-8"))
    except Exception as e:
        logger.warning("Trigger %s lesen: %s", name, e)
        return None


def write(name: str, data: dict) -> None:
    _ensure_dirs()
    data["updated_at"] = _now_iso()
    p = _path(name)
    tmp = p.with_suffix(".tmp")
    tmp.write_text(json.dumps(data, indent=2, ensure_ascii=False), encoding="utf-8")
    tmp.replace(p)
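`write()` uses the write-to-temp-then-rename pattern: `Path.replace` maps to an atomic rename on POSIX filesystems, so a concurrent reader sees either the old or the new manifest, never a half-written one. Standalone sketch (`atomic_write_json` is an illustrative name):

```python
import json
import tempfile
from pathlib import Path

def atomic_write_json(target: Path, data: dict) -> None:
    # Write the full payload to a sibling temp file first, then rename over
    # the target; a rename within one filesystem is atomic on POSIX.
    tmp = target.with_suffix(".tmp")
    tmp.write_text(json.dumps(data, indent=2, ensure_ascii=False), encoding="utf-8")
    tmp.replace(target)

with tempfile.TemporaryDirectory() as d:
    p = Path(d) / "demo.json"
    atomic_write_json(p, {"name": "demo", "active": True})
    print(json.loads(p.read_text(encoding="utf-8"))["active"])  # True
    print(p.with_suffix(".tmp").exists())  # False: temp file was renamed away
```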


def delete(name: str) -> None:
    p = _path(name)
    if not p.exists():
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    p.unlink()
    # Clean up the logs as well
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    if log_file.exists():
        log_file.unlink()


def update(name: str, patch: dict) -> dict:
    data = read(name)
    if data is None:
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    allowed = {"active", "message", "condition", "throttle_sec",
               "check_interval_sec", "fires_at"}
    for k, v in patch.items():
        if k in allowed:
            data[k] = v
    write(name, data)
    return data


# ─── Create helpers (type-specific) ─────────────────────────────────

def create_timer(
    name: str,
    fires_at_iso: str,
    message: str,
    author: str = "aria",
) -> dict:
    _safe_name(name)
    if _path(name).exists():
        raise ValueError(f"Trigger '{name}' existiert schon")
    # Validate the ISO timestamp
    try:
        datetime.fromisoformat(fires_at_iso.replace("Z", "+00:00"))
    except Exception:
        raise ValueError(f"fires_at_iso ungueltig: {fires_at_iso}")
    data = {
        "name": name,
        "type": "timer",
        "active": True,
        "author": author,
        "created_at": _now_iso(),
        "fires_at": fires_at_iso,
        "message": message,
        "fire_count": 0,
        "last_fired_at": None,
    }
    write(name, data)
    logger.info("Trigger angelegt: %s (timer, fires_at=%s)", name, fires_at_iso)
    return data


def create_watcher(
    name: str,
    condition: str,
    message: str,
    check_interval_sec: int = 300,
    throttle_sec: int = 3600,
    author: str = "aria",
) -> dict:
    _safe_name(name)
    if _path(name).exists():
        raise ValueError(f"Trigger '{name}' existiert schon")
    # Parse-check the condition (raises on syntax errors)
    from watcher import parse_condition
    parse_condition(condition)  # validate only
    if check_interval_sec < 30:
        check_interval_sec = 30  # never check more often than every 30s
    if throttle_sec < 0:
        throttle_sec = 0
    data = {
        "name": name,
        "type": "watcher",
        "active": True,
        "author": author,
        "created_at": _now_iso(),
        "condition": condition,
        "check_interval_sec": int(check_interval_sec),
        "throttle_sec": int(throttle_sec),
        "message": message,
        "fire_count": 0,
        "last_fired_at": None,
        "last_checked_at": None,
    }
    write(name, data)
    logger.info("Trigger angelegt: %s (watcher, cond='%s')", name, condition)
    return data


# ─── Firing + log ───────────────────────────────────────────────────

def mark_fired(name: str) -> dict:
    data = read(name)
    if data is None:
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    data["fire_count"] = int(data.get("fire_count", 0)) + 1
    data["last_fired_at"] = _now_iso()
    # Timers auto-deactivate after firing (one-shot)
    if data.get("type") == "timer":
        data["active"] = False
    write(name, data)
    return data


def append_log(name: str, entry: dict) -> None:
    _ensure_dirs()
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    record = {"ts": _now_iso()}
    record.update(entry)
    try:
        with log_file.open("a", encoding="utf-8") as f:
            f.write(json.dumps(record, ensure_ascii=False) + "\n")
    except Exception as e:
        logger.warning("Trigger-Log append %s: %s", name, e)


def list_logs(name: str, limit: int = 50) -> list[dict]:
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    if not log_file.exists():
        return []
    try:
        lines = log_file.read_text(encoding="utf-8").splitlines()
        out: list[dict] = []
        for line in lines[-limit:]:
            try:
                out.append(json.loads(line))
            except Exception:
                pass
        return out
    except Exception:
        return []
@@ -0,0 +1,310 @@
"""
Built-in condition variables + a safe mini-parser for watcher triggers.

The allowed variables and the ONE function `near(lat, lon, radius_m)` come
from this module. A condition expression is a safe subset of Python: no
unrestricted eval/exec; the expression is AST-validated against a whitelist
and only then evaluated with empty builtins. Allowed: comparisons, boolean
operators, whitelisted functions, variables from describe_variables(), and
constants (number/bool/string).

Examples:
    disk_free_gb < 5
    hour_of_day == 8 and day_of_week == "mon"
    is_weekend and minute_of_hour == 0
    near(53.123, 7.456, 500)
    current_lat and location_age_sec < 120
"""

from __future__ import annotations

import ast
import json
import logging
import math
import os
import shutil
import time
from datetime import datetime
from pathlib import Path
from typing import Any

logger = logging.getLogger(__name__)

STATE_DIR = Path("/shared/state")


# ─── State helpers (shared with the bridge: /shared/state/*.json) ───

def _read_state(name: str) -> dict | None:
    f = STATE_DIR / f"{name}.json"
    if not f.exists():
        return None
    try:
        return json.loads(f.read_text(encoding="utf-8"))
    except Exception:
        return None
# ─── Variable sources ───────────────────────────────────────────────

def _disk_stats() -> tuple[float, float]:
    """Returns (free_gb, free_pct). Looks at /shared (shared volume), else /."""
    target = "/shared" if os.path.exists("/shared") else "/"
    try:
        st = shutil.disk_usage(target)
        free_gb = st.free / (1024 ** 3)
        free_pct = 100.0 * st.free / st.total if st.total else 0.0
        return free_gb, free_pct
    except Exception as e:
        logger.warning("disk_usage: %s", e)
        return 0.0, 0.0


def _uptime_sec() -> int:
    try:
        with open("/proc/uptime", "r") as f:
            return int(float(f.read().split()[0]))
    except Exception:
        return 0


def _ram_free_mb() -> int:
    """Container RAM: MemAvailable from /proc/meminfo (kB → MB)."""
    try:
        with open("/proc/meminfo", "r") as f:
            for line in f:
                if line.startswith("MemAvailable:"):
                    return int(line.split()[1]) // 1024
    except Exception:
        pass
    return 0


def _cpu_load_1min() -> float:
    """Load average over 1 minute (Linux). Caveat: this is the HOST load,
    not container-specific."""
    try:
        with open("/proc/loadavg", "r") as f:
            return float(f.read().split()[0])
    except Exception:
        return 0.0


_DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]


def _gps_state() -> dict[str, Any]:
    """Last known position from /shared/state/location.json.
    Returns a dict with current_lat, current_lon (or None), location_age_sec."""
    data = _read_state("location") or {}
    now = int(time.time())
    age = -1
    lat = data.get("lat")
    lon = data.get("lon")
    ts = data.get("ts_unix")
    if isinstance(ts, (int, float)):
        age = int(now - ts)
    return {
        "current_lat": float(lat) if isinstance(lat, (int, float)) else None,
        "current_lon": float(lon) if isinstance(lon, (int, float)) else None,
        "location_age_sec": age,
    }


def _user_activity_age() -> int:
    """Seconds since the last user action (chat or voice). -1 if never."""
    data = _read_state("activity") or {}
    ts = data.get("last_user_ts")
    if not isinstance(ts, (int, float)):
        return -1
    return int(time.time() - ts)


def collect_variables() -> dict[str, Any]:
    """Returns a current snapshot of all built-in variables + the near() helper."""
    free_gb, free_pct = _disk_stats()
    now = datetime.now()
    gps = _gps_state()

    # Memory counts from the vector DB (lazy import to avoid a circular import)
    memory_count = 0
    pinned_count = 0
    try:
        from main import store  # type: ignore
        s = store()
        memory_count = s.count()
        try:
            pinned_count = len(s.list_pinned())
        except Exception:
            pass
    except Exception:
        pass

    vars_: dict[str, Any] = {
        # Disk + system
        "disk_free_gb": round(free_gb, 2),
        "disk_free_pct": round(free_pct, 1),
        "ram_free_mb": _ram_free_mb(),
        "cpu_load_1min": round(_cpu_load_1min(), 2),
        "uptime_sec": _uptime_sec(),

        # Time
        "hour_of_day": now.hour,
        "minute_of_hour": now.minute,
        "day_of_month": now.day,
        "month": now.month,
        "year": now.year,
        "day_of_week": _DAYS[now.weekday()],
        "is_weekend": now.weekday() >= 5,
        "unix_timestamp": int(time.time()),

        # GPS
        "current_lat": gps["current_lat"],
        "current_lon": gps["current_lon"],
        "location_age_sec": gps["location_age_sec"],

        # Activity
        "last_user_message_ago_sec": _user_activity_age(),

        # Memory
        "memory_count": memory_count,
        "pinned_count": pinned_count,

        # rvs_connected: the brain cannot determine this reliably yet
        # (the bridge would have to write its own heartbeat state; comes later)
        "rvs_connected": False,
    }

    # Function helper; recognized by the parser as an ast.Call named "near".
    # Closure over the GPS values, so eval needs no extra variables.
    def _near(lat: float, lon: float, radius_m: float) -> bool:
        """Haversine distance: True if the current position is < radius_m from the point."""
        cur_lat = vars_.get("current_lat")
        cur_lon = vars_.get("current_lon")
        if cur_lat is None or cur_lon is None:
            return False
        try:
            R = 6371000.0
            phi1 = math.radians(float(cur_lat))
            phi2 = math.radians(float(lat))
            dphi = math.radians(float(lat) - float(cur_lat))
            dlam = math.radians(float(lon) - float(cur_lon))
            a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
            distance = 2 * R * math.asin(math.sqrt(a))
            return distance < float(radius_m)
        except Exception:
            return False

    vars_["near"] = _near
    return vars_
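The `near()` closure is a plain haversine check. Standalone sketch with the current position passed in explicitly instead of captured from `vars_` (the coordinates below are arbitrary test points):

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS84 points."""
    R = 6371000.0  # mean earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def near(cur_lat, cur_lon, lat, lon, radius_m):
    # Unknown position -> never "near" (same defensive default as the module)
    if cur_lat is None or cur_lon is None:
        return False
    return haversine_m(cur_lat, cur_lon, lat, lon) < radius_m

# 0.001 degrees of longitude at the equator is roughly 111 m:
print(round(haversine_m(0.0, 0.0, 0.0, 0.001)))  # 111
print(near(0.0, 0.0, 0.0, 0.001, 500))           # True
print(near(None, None, 0.0, 0.001, 500))         # False
```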
def describe_variables() -> list[dict]:
    """Descriptions, for the system prompt + UI."""
    return [
        # Disk / system
        {"name": "disk_free_gb", "type": "number", "desc": "freier Plattenplatz in GB (auf /shared)"},
        {"name": "disk_free_pct", "type": "number", "desc": "freier Plattenplatz in Prozent"},
        {"name": "ram_free_mb", "type": "number", "desc": "freier RAM im Brain-Container (MB)"},
        {"name": "cpu_load_1min", "type": "number", "desc": "Load-Avg 1min (Host)"},
        {"name": "uptime_sec", "type": "number", "desc": "Sekunden seit System-Boot (/proc/uptime)"},
        # Time
        {"name": "hour_of_day", "type": "number", "desc": "0..23, lokale Zeit"},
        {"name": "minute_of_hour", "type": "number", "desc": "0..59"},
        {"name": "day_of_month", "type": "number", "desc": "1..31"},
        {"name": "month", "type": "number", "desc": "1..12"},
        {"name": "year", "type": "number", "desc": "z.B. 2026"},
        {"name": "day_of_week", "type": "string", "desc": "mon|tue|wed|thu|fri|sat|sun"},
        {"name": "is_weekend", "type": "bool", "desc": "True wenn Samstag oder Sonntag"},
        {"name": "unix_timestamp", "type": "number", "desc": "Sekunden seit Epoche (UTC)"},
        # GPS
        {"name": "current_lat", "type": "number", "desc": "letzter bekannter Breitengrad (oder None)"},
        {"name": "current_lon", "type": "number", "desc": "letzter bekannter Laengengrad (oder None)"},
        {"name": "location_age_sec", "type": "number", "desc": "Sekunden seit letzter Position (-1 = nie)"},
        # Activity
        {"name": "last_user_message_ago_sec", "type": "number",
         "desc": "Sekunden seit letztem User-Input (-1 = nie)"},
        # Memory
        {"name": "memory_count", "type": "number", "desc": "Anzahl Memories total"},
        {"name": "pinned_count", "type": "number", "desc": "Anzahl pinned (Hot Memory)"},
        {"name": "rvs_connected", "type": "bool", "desc": "RVS-Verbindung (z.Zt. immer False)"},
    ]


def describe_functions() -> list[dict]:
    """Whitelisted functions for conditions."""
    return [
        {
            "name": "near",
            "signature": "near(lat, lon, radius_m)",
            "desc": "True wenn die aktuelle GPS-Position innerhalb von radius_m Metern "
                    "vom Punkt (lat, lon) liegt. Haversine. Bei unbekannter Position: False.",
        },
    ]


_ALLOWED_FUNCTIONS = {f["name"] for f in describe_functions()}
# ─── Safe condition parser ──────────────────────────────────────────

_ALLOWED_NODES = (
    ast.Expression, ast.BoolOp, ast.UnaryOp, ast.Compare,
    ast.Name, ast.Constant, ast.Load,
    ast.And, ast.Or, ast.Not,
    ast.USub,  # unary minus, so negative constants like near(-33.9, 18.4, 500) parse
    ast.Eq, ast.NotEq, ast.Lt, ast.LtE, ast.Gt, ast.GtE,
    ast.Call,
)


def parse_condition(expr: str) -> ast.Expression:
    """Parses a condition expression and validates it against the safe subset.
    Raises ValueError on forbidden constructs."""
    expr = (expr or "").strip()
    if not expr:
        raise ValueError("Leere Condition")
    if len(expr) > 500:
        raise ValueError("Condition zu lang (>500 Zeichen)")
    try:
        tree = ast.parse(expr, mode="eval")
    except SyntaxError as e:
        raise ValueError(f"Condition Syntax-Fehler: {e}")
    allowed_names = {v["name"] for v in describe_variables()}
    for node in ast.walk(tree):
        if not isinstance(node, _ALLOWED_NODES):
            raise ValueError(f"Verbotener Ausdruck: {type(node).__name__}")
        if isinstance(node, ast.Call):
            # Only a direct function name, no attribute access (foo.bar())
            if not isinstance(node.func, ast.Name):
                raise ValueError("Funktionsaufruf nur ueber direkten Namen erlaubt")
            if node.func.id not in _ALLOWED_FUNCTIONS:
                raise ValueError(f"Verbotene Funktion: {node.func.id}")
            # Args must be constants or single names
            for a in node.args:
                if not isinstance(a, (ast.Constant, ast.Name, ast.UnaryOp)):
                    raise ValueError(f"Argument-Typ in {node.func.id}() nicht erlaubt")
            if node.keywords:
                raise ValueError("Keyword-Argumente in Funktionen nicht erlaubt")
        if isinstance(node, ast.Name):
            if (node.id not in allowed_names
                    and node.id not in _ALLOWED_FUNCTIONS
                    and node.id not in ("True", "False")):
                raise ValueError(f"Unbekannte Variable: {node.id}")
        if isinstance(node, ast.Constant):
            if not isinstance(node.value, (int, float, str, bool)) and node.value is not None:
                raise ValueError(f"Verbotener Konstant-Typ: {type(node.value).__name__}")
    return tree


def evaluate(expr: str, variables: dict[str, Any] | None = None) -> bool:
    """Evaluates the condition against the current variables.
    Returns bool. On errors inside the expression → False (defensive)."""
    tree = parse_condition(expr)
    vars_ = variables if variables is not None else collect_variables()
    code = compile(tree, "<condition>", "eval")
    # Empty globals; locals hold the variables + near() → no builtin access
    try:
        result = eval(code, {"__builtins__": {}}, vars_)
    except Exception as e:
        logger.warning("Condition '%s' eval-Fehler: %s", expr, e)
        return False
    return bool(result)
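The parse-then-eval pattern in miniature: AST-whitelist the expression, then evaluate with empty builtins so names resolve only against the supplied dict. This sketch uses a reduced whitelist (no function calls) and a fixed variable dict, not the module's full rule set (`safe_eval` is an illustrative name):

```python
import ast

_OK = (ast.Expression, ast.BoolOp, ast.UnaryOp, ast.Compare, ast.Name,
       ast.Constant, ast.Load, ast.And, ast.Or, ast.Not,
       ast.Eq, ast.NotEq, ast.Lt, ast.LtE, ast.Gt, ast.GtE)

def safe_eval(expr: str, variables: dict) -> bool:
    tree = ast.parse(expr, mode="eval")
    for node in ast.walk(tree):
        if not isinstance(node, _OK):
            raise ValueError(f"forbidden: {type(node).__name__}")
        if isinstance(node, ast.Name) and node.id not in variables:
            raise ValueError(f"unknown variable: {node.id}")
    # Empty builtins: the expression can only see the provided variables
    return bool(eval(compile(tree, "<cond>", "eval"), {"__builtins__": {}}, variables))

v = {"disk_free_gb": 3.2, "hour_of_day": 9}
print(safe_eval("disk_free_gb < 5 and hour_of_day >= 8", v))  # True
try:
    safe_eval("__import__('os').system('true')", v)
except ValueError as e:
    print(e)  # forbidden: Call
```

The AST validation happens before `compile`, so an attacker-controlled condition is rejected as data; the guarded `eval` then only ever runs expressions built from whitelisted node types.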
@@ -21,6 +21,7 @@ import os
|
||||
import re
|
||||
import signal
|
||||
import ssl
|
||||
import time
|
||||
import sys
|
||||
import tempfile
|
||||
import uuid
|
||||
@@ -919,6 +920,94 @@ class ARIABridge:
|
||||
except Exception as e:
|
||||
logger.warning("[rvs] file_from_aria broadcast fehlgeschlagen: %s", e)
|
||||
|
||||
def _persist_state(self, key: str, data: dict) -> None:
|
||||
"""Atomic-Write in /shared/state/<key>.json — fuer Brain-Watcher.
|
||||
Wird genutzt fuer location + activity-Tracking."""
|
||||
try:
|
||||
import time as _time
|
||||
data = dict(data)
|
||||
data["ts_unix"] = int(_time.time())
|
||||
Path("/shared/state").mkdir(parents=True, exist_ok=True)
|
||||
target = Path(f"/shared/state/{key}.json")
|
||||
tmp = target.with_suffix(".tmp")
|
||||
tmp.write_text(json.dumps(data), encoding="utf-8")
|
||||
tmp.replace(target)
|
||||
except Exception as e:
|
||||
logger.warning("[state] %s schreiben fehlgeschlagen: %s", key, e)
|
||||
|
||||
    def _persist_location(self, location: Optional[dict]) -> None:
        """Stores the last known GPS position for watchers.
        Expects {lat, lon} or {lat, lng}. Non-dicts and missing
        coordinates are ignored."""
        if not isinstance(location, dict):
            return
        try:
            lat = location.get("lat")
            lon = location.get("lon") or location.get("lng")
            if lat is None or lon is None:
                return
            self._persist_state("location", {
                "lat": float(lat),
                "lon": float(lon),
            })
        except Exception:
            pass

    def _persist_user_activity(self) -> None:
        """Marks that the user just did something (chat/voice).
        Watchers: last_user_message_ago_sec is based on this."""
        self._persist_state("activity", {"last_user_ts": int(time.time())})

    def _append_chat_backup(self, entry: dict) -> None:
        """Appends one line to /shared/config/chat_backup.jsonl.
        Read by Diagnostic + app as the history source.
        entry needs at least {role, text}; ts is added."""
        try:
            line = {"ts": int(asyncio.get_event_loop().time() * 1000)}
            line.update(entry)
            Path("/shared/config").mkdir(parents=True, exist_ok=True)
            with open("/shared/config/chat_backup.jsonl", "a", encoding="utf-8") as f:
                f.write(json.dumps(line, ensure_ascii=False) + "\n")
        except Exception as e:
            logger.warning("[backup] chat_backup write failed: %s", e)

    def _read_chat_backup_since(self, since_ms: int, limit: int = 100) -> list[dict]:
        """Reads chat_backup.jsonl, returns entries > since_ms, at most the `limit` newest.
        File-deleted markers are honored: entries with the same path that precede
        a file_deleted marker are flagged as deleted."""
        path = Path("/shared/config/chat_backup.jsonl")
        if not path.exists():
            return []
        try:
            lines = path.read_text(encoding="utf-8").splitlines()
        except Exception as e:
            logger.warning("[backup] read failed: %s", e)
            return []
        out: list[dict] = []
        for raw in lines:
            raw = raw.strip()
            if not raw:
                continue
            try:
                obj = json.loads(raw)
            except Exception:
                continue
            ts = obj.get("ts") or 0
            if ts <= since_ms:
                continue
            # file_deleted markers: not delivered as chat, but sent to the app
            # so it can update its bubbles (a separate path already exists)
            if obj.get("type") == "file_deleted":
                continue
            role = obj.get("role")
            if role not in ("user", "assistant"):
                continue
            out.append(obj)
        # Cap to the `limit` newest
        if len(out) > limit:
            out = out[-limit:]
        return out
    async def _process_core_response(self, text: str, payload: dict) -> None:
        """Processes a finished reply from aria-core.

@@ -933,6 +1022,9 @@ class ARIABridge:
            logger.info("[core] NO_REPLY received — reply silently discarded")
            return

        # Log the reply to chat_backup.jsonl (cleaned text, without file markers)
        # — happens further down, after extract_file_markers

        # Extract file markers `[FILE: /shared/uploads/aria_xyz.pdf]` —
        # ARIA uses them to stage files for the user (images, PDFs, etc.).
        # The marker is removed from the reply text (TTS must not read it
@@ -949,6 +1041,15 @@ class ARIABridge:
                f"but not created:\n{missing_list}\n"
                "Please ask ARIA to actually write them.").strip()

        # Log the reply to chat_backup.jsonl (cleaned text, without file markers)
        # File markers are delivered separately as file_from_aria events.
        self._append_chat_backup({
            "role": "assistant",
            "text": text,
            "files": [{"serverPath": f["serverPath"], "name": f["name"],
                       "mimeType": f["mimeType"], "size": f["size"]} for f in aria_files],
        })

        metadata = payload.get("metadata", {})
        is_critical = metadata.get("critical", False)
        requested_voice = metadata.get("voice")
@@ -1024,6 +1125,12 @@ class ARIABridge:
        except Exception as e:
            logger.error("[core] XTTS request failed: %s — no audio", e)

        # ARIA is done — reset the app's "ARIA is thinking..." indicator to idle.
        # _last_chat_final_at deliberately NOT set: the 3s cooldown was for
        # trailing OpenClaw activity events; for voice chat it would suppress
        # the next thinking wave.
        await self._emit_activity("idle", "")

    # ── Mode persistence (global, not per device) ──────
    _MODE_FILE = "/shared/config/mode.json"
@@ -1184,12 +1291,13 @@ class ARIABridge:
        payload = json.dumps({"message": text, "source": source}).encode("utf-8")
        logger.info("[brain] chat ← %s '%s'", source, text[:80])

        # Broadcast agent_activity (app + Diagnostic "ARIA is thinking..." indicator)
        await self._send_to_rvs({
            "type": "agent_activity",
            "payload": {"activity": "thinking"},
            "timestamp": int(asyncio.get_event_loop().time() * 1000),
        })
        # Log the user message to chat_backup.jsonl — read as the history
        # source on app reconnect / Diagnostic reload.
        self._append_chat_backup({"role": "user", "text": text, "source": source})

        # agent_activity → thinking. _emit_activity instead of a direct _send_to_rvs
        # so the state cache is correct for the later idle dedup.
        await self._emit_activity("thinking", "")

        def _do_call():
            try:
@@ -1206,11 +1314,7 @@ class ARIABridge:
        status, body = await asyncio.get_event_loop().run_in_executor(None, _do_call)
        if status != 200:
            logger.error("[brain] /chat failed: status=%s body=%s", status, body[:200])
            await self._send_to_rvs({
                "type": "agent_activity",
                "payload": {"activity": "idle"},
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            await self._emit_activity("idle", "")
            await self._send_to_rvs({
                "type": "chat",
                "payload": {
@@ -1225,21 +1329,13 @@ class ARIABridge:
            data = json.loads(body)
        except Exception:
            logger.error("[brain] /chat returned invalid JSON: %s", body[:200])
            await self._send_to_rvs({
                "type": "agent_activity",
                "payload": {"activity": "idle"},
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            await self._emit_activity("idle", "")
            return

        reply = (data.get("reply") or "").strip()
        if not reply:
            logger.warning("[brain] /chat: empty reply")
            await self._send_to_rvs({
                "type": "agent_activity",
                "payload": {"activity": "idle"},
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            await self._emit_activity("idle", "")
            return

        # Broadcast side-channel events BEFORE the chat bubble (e.g. skill_created)
@@ -1254,6 +1350,26 @@ class ARIABridge:
            })
            logger.info("[brain] ARIA created a skill: %s",
                        event.get("skill", {}).get("name"))
        elif etype == "trigger_created":
            await self._send_to_rvs({
                "type": "trigger_created",
                "payload": event.get("trigger", {}),
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            logger.info("[brain] ARIA created a trigger: %s",
                        event.get("trigger", {}).get("name"))
        elif etype == "location_tracking":
            # ARIA asks the app to switch GPS tracking on/off
            await self._send_to_rvs({
                "type": "location_tracking",
                "payload": {
                    "on": bool(event.get("on")),
                    "reason": event.get("reason") or "",
                },
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            logger.info("[brain] location_tracking request: on=%s (%s)",
                        event.get("on"), event.get("reason", ""))

        # _process_core_response takes over everything else:
        # extract + broadcast file markers, NO_REPLY check, chat
@@ -1265,6 +1381,8 @@ class ARIABridge:
            await self._process_core_response(reply, {})
        except Exception:
            logger.exception("[brain] _process_core_response error")
            await self._emit_activity("idle", "")
            # Original fallback send (dead code, _emit_activity handles this now)
            await self._send_to_rvs({
                "type": "agent_activity",
                "payload": {"activity": "idle"},
@@ -1412,6 +1530,9 @@ class ARIABridge:
        if text:
            interrupted = bool(payload.get("interrupted", False))
            location = payload.get("location") or None
            # Persist state for brain watchers (current_lat, ..., last_user_ts)
            self._persist_location(location)
            self._persist_user_activity()
            # If files are currently buffered (image + text sent at the
            # same time), we merge them into a single request instead of
            # two separate send_to_core calls.
@@ -1657,6 +1778,20 @@ class ARIABridge:
            except Exception as e:
                logger.warning("[rvs] file_saved could not be sent to the app: %s", e)

        elif msg_type == "chat_history_request":
            # App fetches missed messages on reconnect.
            # payload: {since: <ts_ms>}, default 0 = everything
            since = int(payload.get("since") or 0)
            limit = int(payload.get("limit") or 100)
            logger.info("[rvs] chat_history_request since=%d limit=%d", since, limit)
            messages = self._read_chat_backup_since(since, limit=limit)
            await self._send_to_rvs({
                "type": "chat_history_response",
                "payload": {"messages": messages, "since": since},
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            return

        elif msg_type == "file_list_request":
            # App requests the list of all /shared/uploads/ files.
            logger.info("[rvs] file_list_request from app")
@@ -1681,6 +1816,89 @@ class ARIABridge:
                logger.warning("[rvs] file_list_request: %s", e)
            return

        elif msg_type == "file_delete_batch_request":
            # App wants to delete several files at once.
            paths = payload.get("paths") or []
            req_id = payload.get("requestId", "")
            logger.warning("[rvs] file_delete_batch_request: %d paths", len(paths))
            try:
                body_bytes = json.dumps({"paths": paths}).encode("utf-8")
                req = urllib.request.Request(
                    "http://localhost:3001/api/files-delete-batch",
                    data=body_bytes, method="POST",
                    headers={"Content-Type": "application/json"},
                )
                def _do_delete():
                    try:
                        with urllib.request.urlopen(req, timeout=30) as resp:
                            return resp.status, resp.read().decode("utf-8", errors="ignore")
                    except Exception as e:
                        return None, str(e)
                status, body = await asyncio.get_event_loop().run_in_executor(None, _do_delete)
                logger.info("[rvs] file_delete_batch result: status=%s", status)
                # The server broadcasts file_deleted per path — the app gets that
                # via the persistent RVS. We additionally confirm with counts.
                try:
                    d = json.loads(body or "{}")
                except Exception:
                    d = {}
                await self._send_to_rvs({
                    "type": "file_delete_batch_response",
                    "payload": {
                        "requestId": req_id,
                        "deleted": len(d.get("deleted", [])),
                        "errors": d.get("errors", []),
                    },
                    "timestamp": int(asyncio.get_event_loop().time() * 1000),
                })
            except Exception as e:
                logger.warning("[rvs] file_delete_batch_request: %s", e)
            return
        elif msg_type == "file_zip_request":
            # App wants several files as a ZIP. The bridge fetches the ZIP from
            # Diagnostic via HTTP, base64-encodes it and sends it back. ZIP size
            # is capped at 30 MB so RVS does not choke.
            paths = payload.get("paths") or []
            req_id = payload.get("requestId", "")
            logger.warning("[rvs] file_zip_request: %d paths (req=%s)", len(paths), req_id)

            def _do_zip():
                try:
                    body_bytes = json.dumps({"paths": paths}).encode("utf-8")
                    req = urllib.request.Request(
                        "http://localhost:3001/api/files-download-zip",
                        data=body_bytes, method="POST",
                        headers={"Content-Type": "application/json"},
                    )
                    with urllib.request.urlopen(req, timeout=120) as resp:
                        if resp.status != 200:
                            return None, f"HTTP {resp.status}"
                        data = resp.read()
                        if len(data) > 30 * 1024 * 1024:
                            return None, f"ZIP too large ({len(data) // (1024*1024)} MB > 30 MB)"
                        return data, None
                except Exception as e:
                    return None, str(e)

            data, err = await asyncio.get_event_loop().run_in_executor(None, _do_zip)
            if err or data is None:
                await self._send_to_rvs({
                    "type": "file_zip_response",
                    "payload": {"requestId": req_id, "ok": False, "error": err or "empty"},
                    "timestamp": int(asyncio.get_event_loop().time() * 1000),
                })
                return
            import base64 as _b64
            await self._send_to_rvs({
                "type": "file_zip_response",
                "payload": {
                    "requestId": req_id, "ok": True,
                    "size": len(data),
                    "data": _b64.b64encode(data).decode("ascii"),
                },
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            return
        elif msg_type == "file_delete_request":
            # App wants to delete a file — forward to Diagnostic.
            p = payload.get("path", "")
@@ -1708,6 +1926,17 @@ class ARIABridge:
                logger.warning("[rvs] file_delete_request: %s", e)
            return

        elif msg_type == "location_update":
            # Live GPS update from the app (not tied to chat). Written to
            # /shared/state/location.json so watcher triggers can evaluate
            # near() conditions.
            lat = payload.get("lat")
            lon = payload.get("lon") or payload.get("lng")
            if lat is not None and lon is not None:
                self._persist_location({"lat": lat, "lon": lon})
                logger.debug("[gps] location_update: %.5f, %.5f", float(lat), float(lon))
            return

        elif msg_type == "container_restart":
            # App button "Restart container" — forwarded generically to
            # Diagnostic. The whitelist lives in the Diagnostic server.
@@ -1797,6 +2026,9 @@ class ARIABridge:
            interrupted = bool(payload.get("interrupted", False))
            audio_request_id = payload.get("audioRequestId", "") or ""
            location = payload.get("location") or None
            # Persist state for brain watchers (current_lat etc.)
            self._persist_location(location)
            self._persist_user_activity()
            logger.info("[rvs] audio received: %s, %dms, %dKB%s%s%s",
                        mime_type, duration_ms, len(audio_b64) // 1365,
                        " [BARGE-IN]" if interrupted else "",
@@ -1887,13 +2119,11 @@ class ARIABridge:

            if text.strip():
                logger.info("[rvs] STT result: '%s'", text[:80])
                # Prepend hints (barge-in, GPS) as a prefix — shared helper
                # with the chat path so the behavior stays consistent.
                core_text = self._build_core_text(text, interrupted, location)
                # Send to aria-core FIRST (most important step)
                await self.send_to_core(core_text, source="app-voice" + (" [barge-in]" if interrupted else ""))
                # Send the STT text to RVS (for display in app + Diagnostic)
                # sender="stt" so the bridge ignores it (no loop)

                # Order matters: broadcast the STT text FIRST so the app can
                # update the voice bubble with the recognized text right away —
                # send_to_core afterwards blocks synchronously on the brain
                # (can take a while) and would otherwise delay the display.
                try:
                    stt_payload = {
                        "text": text,
@@ -1917,6 +2147,10 @@ class ARIABridge:
                    logger.warning("[rvs] STT text NOT broadcast — _send_to_rvs returned False")
                except Exception as e:
                    logger.warning("[rvs] STT text could not be sent to RVS: %s", e)

                # Then to the brain — it blocks synchronously until ARIA is done.
                core_text = self._build_core_text(text, interrupted, location)
                await self.send_to_core(core_text, source="app-voice" + (" [barge-in]" if interrupted else ""))
            else:
                logger.info("[rvs] no speech recognized — ignored")
@@ -1,5 +1,7 @@
FROM node:22-alpine
WORKDIR /app
# zip for multi-file downloads (the brain export uses tar.gz, the file manager zip)
RUN apk add --no-cache zip
COPY package.json ./
RUN npm install --production
COPY . .
+764 −87 — file diff suppressed because it is too large
@@ -1361,6 +1361,77 @@ const server = http.createServer((req, res) => {
    });
    fs.createReadStream(safe).pipe(res);
    return;
  } else if (req.url === "/api/files-download-zip" && req.method === "POST") {
    // Multi-file download as a ZIP. Body: {paths: ["/shared/uploads/...", ...]}.
    // Streams zip stdout directly into the response.
    let body = "";
    req.on("data", c => { body += c; if (body.length > 65536) req.destroy(); });
    req.on("end", () => {
      let paths = [];
      try { paths = (JSON.parse(body || "{}").paths || []); } catch { paths = []; }
      // Whitelist: /shared/uploads/ only, and the files must exist
      paths = paths
        .map(p => path.resolve(String(p)))
        .filter(p => p.startsWith("/shared/uploads/") && fs.existsSync(p));
      if (!paths.length) {
        res.writeHead(400, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ ok: false, error: "No valid paths" }));
        return;
      }
      const ts = new Date().toISOString().replace(/[:.]/g, "-").slice(0, 19);
      const fname = `aria-files-${ts}.zip`;
      res.writeHead(200, {
        "Content-Type": "application/zip",
        "Content-Disposition": `attachment; filename="${fname}"`,
      });
      // zip -j: junk paths (store the files without directory structure)
      const { spawn } = require("child_process");
      const zip = spawn("zip", ["-j", "-q", "-", ...paths]);
      zip.stdout.pipe(res);
      let stderr = "";
      zip.stderr.on("data", d => stderr += d.toString());
      zip.on("close", code => {
        if (code !== 0 && code !== 12) {
          log("error", "server", `zip exit ${code}: ${stderr.slice(0, 200)}`);
        }
      });
      req.on("close", () => { if (!zip.killed) zip.kill("SIGTERM"); });
    });
    return;
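For comparison, the same whitelist-then-zip behavior can be sketched in Python with the stdlib `zipfile` module instead of shelling out to `zip -j` (a hypothetical sketch, not the endpoint's actual implementation; the junk-paths behavior is mimicked by using each file's basename as the archive name):

```python
import io
import os
import zipfile

def zip_uploads(paths: list[str], root: str = "/shared/uploads/") -> bytes:
    # Whitelist: resolve each path and keep only existing files under root.
    safe = [p for p in (os.path.realpath(str(p)) for p in paths)
            if p.startswith(root) and os.path.isfile(p)]
    if not safe:
        raise ValueError("no valid paths")
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in safe:
            # basename only — same effect as zip -j (junk the directory part)
            zf.write(p, arcname=os.path.basename(p))
    return buf.getvalue()
```

Unlike the streaming `spawn("zip", ...)` variant, this buffers the whole archive in memory, which is fine for the bridge's 30 MB cap but would not scale to arbitrary sizes.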
  } else if (req.url === "/api/files-delete-batch" && req.method === "POST") {
    let body = "";
    req.on("data", c => { body += c; if (body.length > 65536) req.destroy(); });
    req.on("end", () => {
      try {
        let paths = (JSON.parse(body || "{}").paths || []);
        paths = paths
          .map(p => path.resolve(String(p)))
          .filter(p => p.startsWith("/shared/uploads/"));
        const deleted = [];
        const errors = [];
        for (const p of paths) {
          try {
            if (fs.existsSync(p)) fs.unlinkSync(p);
            deleted.push(p);
            broadcast({ type: "file_deleted", path: p });
            sendToRVS_raw({ type: "file_deleted", payload: { path: p }, timestamp: Date.now() });
            try {
              fs.appendFileSync("/shared/config/chat_backup.jsonl",
                JSON.stringify({ type: "file_deleted", path: p, ts: Date.now(), by: "user" }) + "\n");
            } catch {}
          } catch (e) {
            errors.push({ path: p, error: e.message });
          }
        }
        log("info", "server", `Bulk delete: ${deleted.length} OK, ${errors.length} errors`);
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ ok: true, deleted, errors }));
      } catch (err) {
        res.writeHead(500, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ ok: false, error: err.message }));
      }
    });
    return;
  } else if (req.url === "/api/files-delete" && req.method === "POST") {
    let body = "";
    req.on("data", c => { body += c; if (body.length > 4096) req.destroy(); });
@@ -1448,6 +1519,30 @@ const server = http.createServer((req, res) => {
      }
    });
    return;
  } else if (req.url === "/api/chat-history-clear" && req.method === "POST") {
    // Clears the Diagnostic display history (chat_backup.jsonl) AND broadcasts
    // chat_cleared to all RVS clients (the app clears locally). The brain's
    // rolling window (conversation.jsonl) is independent of this — the caller
    // should additionally trigger /api/brain/conversation/reset.
    log("warn", "server", "HTTP /api/chat-history-clear");
    try {
      const file = "/shared/config/chat_backup.jsonl";
      if (fs.existsSync(file)) fs.unlinkSync(file);
      // Browser clients: empty chat_history
      broadcast({ type: "chat_history", messages: [] });
      // App via RVS: chat_cleared
      sendToRVS_raw({
        type: "chat_cleared",
        payload: { ts: Date.now() },
        timestamp: Date.now(),
      });
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ ok: true }));
    } catch (err) {
      res.writeHead(500, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ ok: false, error: err.message }));
    }
    return;
  } else if (req.url === "/api/wipe-all" && req.method === "POST") {
    // Full reset — memory, voices, config all gone. SSH keys
    // and .env stay, the RVS connection stays. Brain + Qdrant are
@@ -55,6 +55,15 @@ Wichtige Mechanismen:

### Bugs / Fixes

- [x] **Timer "in 2 minutes" is created correctly again**: ARIA had no way to know the current time — no bash tool, no time tool, no timestamp in the system prompt. The tool description of `trigger_timer` even recommended `date -u -d '+10 minutes'` via bash, but there was no bash. Consequence: the LLM either dropped the tool call or guessed a cutoff timestamp (in the past) → the background loop fired on the next 30s tick immediately instead of in 2 minutes. Fix: (1) `build_time_section()` in `prompts.py` injects UTC + local Europe/Berlin time as an `## Aktuelle Zeit` block at the top of the system prompt. (2) `trigger_timer` now accepts `in_seconds` as an alternative to `fires_at` — the server computes the absolute timestamp, ARIA does not have to do ISO arithmetic
- [x] **"ARIA is thinking..." stuck after a brain reply** (app + Diagnostic): `send_to_core` sent `thinking` directly via `_send_to_rvs` but did not maintain `_last_activity_state` — the later `_emit_activity("idle")` was deduplicated and swallowed. Fix: use `_emit_activity` consistently for both states
- [x] **Search scroll in the app chat now jumps to the matching bubble**: `scrollToIndex` was called too early + `viewPosition: 0.4` overshot. Fix: `requestAnimationFrame` + `viewPosition: 0.5` + an `onScrollToIndexFailed` fallback with an averageItemLength estimate + 250ms retry
- [x] **The STT bubble now gets the text immediately** (not only with ARIA's reply): `_process_app_audio` first called `send_to_core` (blocks synchronously) and only THEN the STT broadcast. Fix: order swapped — STT out first, then the core call
- [x] **ARIA replies show up in Diagnostic again**: `if (sender === 'aria') return;` in the `rvs_chat` handler was an OpenClaw leftover and filtered out the new brain replies. Fix: aria → received bubble
- [x] **The brain card in the Main tab now shows live status**: `updateState` overwrote the card with stale `state.gateway` text from OpenClaw days. Fix: `updateState` leaves the brain card alone, `loadBrainStatus` syncs both cards (Main + Gehirn tab) every 15s
- [x] **App chat sync showed a stale state**: `since:lastSync` was diff-only — if the server had been cleared, the app history stayed put. Fix: `since:0, limit:200` full replace (server = source of truth). Local-only bubbles (skill notifications, running voice without STT) are kept
- [x] **Conversation reset now clears both**: previously the button only cleared the brain memory, `chat_backup.jsonl` stayed. Fix: one button fires `Promise.all` on `/api/brain/conversation/reset` + `/api/chat-history-clear`, plus a `chat_cleared` broadcast via RVS so app + Diagnostic clear themselves live
- [x] **JS crashes on Diagnostic load fixed**: ghost IDs from OpenClaw days (`gw-dot`, `openclaw-config`, `btn-core-term`, `core-auth`, `perms-status`, `rc-compact-after`) were null-referenced. Fix: null-safe or code removed
- [x] Diagnostic: "ARIA is thinking..." no longer gets stuck
- [x] App: "ARIA is thinking..." indicator + cancel button (the bridge mirrors agent_activity via RVS)
- [x] Text messages are answered by ARIA (bridge chat handler fix)
@@ -212,22 +221,75 @@ Wichtige Mechanismen:
- [x] RVS messages from the smartphone go through
- [x] SSH volume read-write for the proxy (no more -F workaround)

## Open
## Brain — Phase B (complete)

### Brain (Phase B — the big refactor in progress)
The big refactor away from OpenClaw to our own brain architecture — all 4 items
done. ARIA now has its own memory (vector DB), its own loop, its own
skills with tool use.

- [x] aria-brain container skeleton (FastAPI + Qdrant + sentence-transformers)
- [x] Memory CRUD via the Diagnostic Gehirn tab (add/edit/delete + search + filter)
- [x] Brain export/import as tar.gz (complete: memories + skills + Qdrant)
- [x] Voice bridge: aria-core-specific logic removed (doctor_fix, aria_restart, aria_session_reset, compact_after)
- [x] aria-core fully removed from docker-compose.yml, watchdog removed
- [x] Diagnostic: wipe-all button (memory + voices + settings)
- [x] Voice export/import (Diagnostic + XTTS bridge on the gaming PC)
### Infrastructure

- [x] aria-brain container (FastAPI + Qdrant + sentence-transformers, MiniLM multilingual)
- [x] aria-core (OpenClaw) torn down — tag `v0.1.2.0` kept as an archive
- [x] docker-compose fully rebuilt: brain + qdrant + bridge + diagnostic + proxy
- [x] Voice bridge: aria-core logic removed (doctor_fix, aria_restart, compact_after) → replaced by a brain HTTP call
- [x] Language-model setting in runtime.json (brainModel) — Diagnostic can switch the model live, brain restart required

### Memory / vector DB

- [x] Memory CRUD via the Diagnostic Gehirn tab (add/edit/delete + search + type/pinned filter)
- [x] **Migration from brain-import/** (Phase B item 2) — parser for AGENT.md/USER.md/TOOLING.md, atomic items with migration_key (idempotent)
- [x] **Bootstrap snapshot** (Phase B item 2) — export/import of pinned memories only, as JSON
- [x] **Complete brain** export/import as tar.gz (memories + skills + Qdrant)

### Conversation loop (Phase B item 3)

- [x] Single-chat UI + rolling window (50 turns)
- [x] Memory distillation: at >60 turns, the 30 oldest automatically → fact memories via a Claude call
- [x] Hot memory (pinned) + cold memory (top 5 semantic) in the system prompt
- [x] Manual distillation trigger + conversation reset (brain + Diagnostic chat_backup at the same time)
- [x] The bridge writes chat_backup.jsonl on every turn (user + ARIA + ARIA files)
- [x] App chat sync: full server sync on reconnect (server = source of truth). If the server is empty → the app clears too. Local-only bubbles (skill notifications, running voice without STT) are kept. Plus a chat_cleared live update when Diagnostic wipes the history.
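The rolling-window plus distillation mechanics in the list above can be sketched as follows (a simplified sketch with names and thresholds taken from the bullet; `summarize_to_facts` stands in for the real Claude call):

```python
WINDOW_MAX = 60      # distill once the window exceeds this many turns
DISTILL_COUNT = 30   # how many of the oldest turns to distill away

def maybe_distill(turns: list[dict], summarize_to_facts) -> tuple[list[dict], list[str]]:
    # Keep the newest turns in the rolling window; hand the oldest
    # DISTILL_COUNT to the summarizer, which returns fact memories.
    if len(turns) <= WINDOW_MAX:
        return turns, []
    oldest, rest = turns[:DISTILL_COUNT], turns[DISTILL_COUNT:]
    facts = summarize_to_facts(oldest)
    return rest, facts
```

With these numbers, a 70-turn conversation shrinks back to 40 turns and yields one batch of fact memories; below 60 turns nothing happens.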
### Skills system (Phase B item 4)

- [x] Python-only skills (local venv per skill, its own pip packages)
- [x] Tool use in the brain: skill_create as a meta tool, dynamic run_<skill> per active skill
- [x] Hard threshold documented: pip install → ALWAYS a skill (the brain has no persistence besides /data/skills/)
- [x] Diagnostic Skills tab: list, README, logs per run, activate/deactivate/delete, export/import as tar.gz
- [x] skill_created live notification: yellow bubble in app + Diagnostic as soon as ARIA creates a skill itself

### Triggers system (Phase B item 5)

- [x] **Filesystem layer** under `/data/triggers/<name>.json` + `logs/<name>.jsonl` per trigger
- [x] **Timer** (one-shot, ISO timestamp) — "remind me about X in 10 minutes" → ARIA creates it via the `trigger_timer` tool, the background loop fires once at the due time
- [x] **Watcher** (recurring) — fires when `condition` becomes true, with a throttle (min_seconds_between_fires) against spam. Checks every 30s
- [x] **Safe condition parser** via Python's `ast` module (whitelist instead of raw `eval`): only `<` `>` `<=` `>=` `==` `!=` `and` `or` `not`, constants + variable names from a whitelist
- [x] **Built-in variables**: `disk_free_gb`, `disk_free_pct`, `ram_free_mb`, `cpu_load_1min`, `uptime_sec`, `hour_of_day`, `minute_of_hour`, `day_of_month`, `month`, `year`, `day_of_week`, `is_weekend`, `unix_timestamp`, `current_lat`, `current_lon`, `location_age_sec`, `last_user_message_ago_sec`, `memory_count`, `pinned_count`, `rvs_connected`
- [x] **near(lat, lon, radius_m) function** in the parser (haversine) — GPS geofencing for speed-camera warnings / arrival reminders
- [x] **Background loop** in the brain container (lifespan async task): runs every 30s, checks all active triggers, on a match calls `agent.chat(prompt, source="trigger")` with a system prefix → ARIA reacts as if to a question from Stefan, can speak via TTS / start skills / create further triggers
- [x] **Diagnostic Trigger tab**: list of active triggers with logs, create modal with type dropdown, live display of all available variables + functions, examples
- [x] **App live notification**: `trigger_created` bubble (yellow) as soon as ARIA creates a trigger itself — the user immediately sees that the request arrived
- [x] **GPS tracking via the app** (`@react-native-community/geolocation` watchPosition, distanceFilter 30m, interval 15s) — singleton service in `gpsTracking.ts`, toggle in Settings → Standort, persisted to AsyncStorage, restored on app start
- [x] **`request_location_tracking` tool**: ARIA can switch tracking on/off via a `location_tracking` event — the bridge forwards it to the app, the app starts/stops watchPosition. ARIA does this automatically when it creates a watcher with `near()`
- [x] **`location_update` forwarding**: the app sends a `location_update {lat,lon}` every 15s/30m, the bridge persists it to `/shared/state/location.json`, watchers read it on each check
- [x] **Activity persistence**: `/shared/state/activity.json` carries the user-message timestamp so `last_user_message_ago_sec` is available as a variable
- [x] **`trigger_cancel`** + **`trigger_list`** as tools — ARIA can manage its own triggers
- [x] **Triggers block in the system prompt**: active triggers + available variables + functions are injected on every chat turn, plus a hint that GPS watchers should also call `request_location_tracking`
- [x] **Current-time block in the system prompt**: UTC + local Europe/Berlin time (summer/winter heuristic) is injected at the top on every chat turn, so that timer `fires_at` and watchers using `hour_of_day` make sense at all. `trigger_timer` additionally accepts `in_seconds` (the server does the math) — ARIA does not have to do ISO arithmetic for relative requests ('in 2 minutes')
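The `near()` condition function is described as a haversine check; a minimal sketch of that idea (the parser wiring is not shown — in the real system `current_lat`/`current_lon` come from `/shared/state/location.json`, here they are explicit parameters, and `EARTH_R` is the usual mean-radius constant):

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in meters

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    # Great-circle distance between two lat/lon points in meters.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_R * math.asin(math.sqrt(a))

def near(lat: float, lon: float, radius_m: float,
         current_lat: float, current_lon: float) -> bool:
    return haversine_m(lat, lon, current_lat, current_lon) <= radius_m
```

One degree of latitude is roughly 111 km, so a 100 m geofence distinguishes positions that differ by well under a thousandth of a degree.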
### Diagnostic / app features (surrounding work)

- [x] File manager (Diagnostic + app modal): manage /shared/uploads/, multi-select + select-all + bulk download as ZIP + bulk delete
- [x] Wipe-all button (memory + voices + settings)
- [x] Voice export/import per voice (Diagnostic + XTTS bridge on the Gamebox)
- [x] F5/Whisper settings as a JSON-bundle export/import
- [x] File manager (Diagnostic + app modal): manage /shared/uploads/, deletions are mirrored live in the chat bubbles
- [ ] **Phase B item 2:** migration `aria-data/brain-import/` → atomic memory items (identity / rules / preferences / tools)
- [ ] **Phase B item 3:** brain conversation loop (single-chat UI + rolling window + memory distillation)
- [ ] **Phase B item 4:** skills system (manifest, venv/local-bin, README per skill, Diagnostic Skills tab, export/import)
- [x] App chat search rebuilt: highlight + next/prev instead of filtering
- [x] App pinch zoom in images rewritten (multi-touch race bugs)
- [x] Info buttons with modal explanations in the Gehirn tab
- [x] Token/call metrics + subscription-quota tracking: one log entry per Claude call with a token estimate (chars/4). The Gehirn tab shows 1h/5h/24h/30d aggregates + a progress bar against the plan limit (Pro=45/5h, Max 5x=225/5h, Max 20x=900/5h, custom). Warning threshold 80%, critical 90%.
||||
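The chars/4 estimate and the 5h quota window from the last item can be sketched as follows. The plan limits and thresholds come from the list above; the log shape and function names are illustrative assumptions.

```javascript
// Sketch of the token estimate (chars/4) and the rolling 5h quota window.
// PLAN_LIMITS and the 80%/90% thresholds come from the roadmap item above;
// the call-log shape ({ ts }) is an assumption.

const PLAN_LIMITS = { pro: 45, max5x: 225, max20x: 900 }; // calls per 5h

const estimateTokens = (text) => Math.ceil(text.length / 4);

function quotaStatus(callLog, plan, now = Date.now()) {
  const windowMs = 5 * 60 * 60 * 1000;
  const calls = callLog.filter((c) => now - c.ts <= windowMs).length;
  const limit = PLAN_LIMITS[plan];
  const ratio = calls / limit;
  // Warning at 80% of the plan limit, critical at 90%.
  const level = ratio >= 0.9 ? "critical" : ratio >= 0.8 ? "warn" : "ok";
  return { calls, limit, ratio, level };
}

const now = Date.now();
// 40 calls spread over the last 40 minutes, all inside the 5h window.
const log = Array.from({ length: 40 }, (_, i) => ({ ts: now - i * 60000 }));
console.log(estimateTokens("Hello ARIA")); // 3
console.log(quotaStatus(log, "pro", now).level); // "warn" (40 of 45 calls)
```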
## Open

### App Features

- [ ] Load chat history more reliably (AsyncStorage race condition)
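One common fix for this kind of AsyncStorage race is to serialize all history reads and writes through a single promise chain, so read-modify-write sequences can no longer interleave. A minimal sketch, with an in-memory stand-in for AsyncStorage and illustrative names (`saveMessage`, `loadHistory`):

```javascript
// Sketch: serialize history writes to avoid interleaved read-modify-write
// races. The in-memory "storage" stands in for AsyncStorage.

const storage = new Map();
const AsyncStorage = {
  getItem: async (k) => storage.get(k) ?? null,
  setItem: async (k, v) => void storage.set(k, v),
};

let queue = Promise.resolve();
// Every operation waits for the previous one before touching storage.
const serialized = (op) => (queue = queue.then(op, op));

function saveMessage(msg) {
  return serialized(async () => {
    const raw = await AsyncStorage.getItem("chat_history");
    const history = raw ? JSON.parse(raw) : [];
    history.push(msg);
    await AsyncStorage.setItem("chat_history", JSON.stringify(history));
  });
}

async function loadHistory() {
  await queue; // wait for all pending writes first
  const raw = await AsyncStorage.getItem("chat_history");
  return raw ? JSON.parse(raw) : [];
}

(async () => {
  // Without serialization, these three writes could each read an empty
  // history and overwrite one another; chained, all three survive.
  await Promise.all([saveMessage("a"), saveMessage("b"), saveMessage("c")]);
  console.log(await loadHistory()); // ["a", "b", "c"]
})();
```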
- [ ] Diagnostic: system-info tab (container status, disk, RAM, CPU)
- [ ] Solve RVS zombie connections for good
- [ ] Gamebox: a small web UI for credentials/server config, or managed centrally from Diagnostic via RVS push
- [ ] Have the first skills built (yt-dlp, pdf-extract, image-resize, etc.) through normal requests; ARIA creates them itself
- [ ] Tool-use verification: live test of whether claude-max-api-proxy passes `tools` and `tool_calls` through cleanly
- [ ] Heartbeat (periodic self-checks)
- [ ] Local LLM as a gatekeeper (triage before the Claude call)
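The heartbeat item above is still open; a starting point could look like the sketch below. Everything here is an assumption, not existing code: the check names, the result shape, and the interval wiring.

```javascript
// Sketch of a periodic heartbeat: run a set of self-checks and report
// which ones failed. Check names and the result shape are assumptions.

const checks = {
  disk_free: async () => true, // e.g. compare free disk space to a threshold
  rvs_connected: async () => true, // e.g. ping the RVS socket
};

async function heartbeat() {
  const failed = [];
  for (const [name, check] of Object.entries(checks)) {
    try {
      if (!(await check())) failed.push(name);
    } catch {
      failed.push(name); // a throwing check counts as failed
    }
  }
  return { ok: failed.length === 0, failed, at: new Date().toISOString() };
}

// In the bridge this would run on an interval, e.g.:
// setInterval(() => heartbeat().then(report), 60_000);
heartbeat().then((r) => console.log(r.ok)); // true
```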
```diff
@@ -25,6 +25,11 @@ const ALLOWED_TYPES = new Set([
  "xtts_export_voice", "xtts_voice_exported",
  "xtts_import_voice", "xtts_voice_imported",
  "skill_created",
  "trigger_created",
  "location_update", "location_tracking",
  "chat_history_request", "chat_history_response", "chat_cleared",
  "file_delete_batch_request", "file_delete_batch_response",
  "file_zip_request", "file_zip_response",
  "xtts_delete_voice",
  "voice_preload", "voice_ready",
  "stt_request", "stt_response",
```
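The `ALLOWED_TYPES` set in the diff above suggests the bridge whitelists message types; a gate of the following shape would be the typical pattern. Only a subset of the types is repeated here, and the `accept` function is an assumption, not the bridge's actual code.

```javascript
// Sketch: gate incoming bridge messages against the ALLOWED_TYPES
// whitelist from the diff above (subset shown; accept() is assumed).

const ALLOWED_TYPES = new Set([
  "trigger_created",
  "location_update", "location_tracking",
  "file_zip_request", "file_zip_response",
  "stt_request", "stt_response",
]);

function accept(msg) {
  // Reject anything that is not an object carrying a whitelisted type.
  if (typeof msg !== "object" || msg === null) return false;
  return ALLOWED_TYPES.has(msg.type);
}

console.log(accept({ type: "location_update", lat: 52.52, lon: 13.405 })); // true
console.log(accept({ type: "rm_rf_root" })); // false
```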