Compare commits

18 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 6821eaaa38 | |
| | 31aa86a2a9 | |
| | 87cb687610 | |
| | eb4059a887 | |
| | 415706036b | |
| | e2dd47255e | |
| | 3497aa23f8 | |
| | 8491fb2af7 | |
| | f61864282e | |
| | b2f7d6dda2 | |
| | eeedcc4781 | |
| | 5cf8cab5bd | |
| | 3ae9e19524 | |
| | 0ec4b00879 | |
| | b6b4b1b4d9 | |
| | 950a9d009c | |
| | 693542ef19 | |
| | d12f356ebe | |
````diff
@@ -195,12 +195,13 @@ Show an existing token as a QR code again: `./generate-token.sh show`
 http://<VM-IP>:3001
 ```
 
-The Diagnostic UI has four top-level tabs:
+The Diagnostic UI has five top-level tabs:
 
 - **Main** — live chat test, status (Brain / RVS / Proxy), end-to-end trace
-- **Gehirn** — memory management (vector DB), skills, export/import of the complete brain as tar.gz
-- **Dateien** — all files from `/shared/uploads/` (generated by ARIA or uploaded) with download/delete
-- **Einstellungen** — repair (container restart), wipe, speech output, Whisper, runtime config, app onboarding (QR), full reset
+- **Gehirn** — memory management (vector DB), token/call metrics (subscription quota), bootstrap & migration, full-brain export/import
+- **Skills** — list with logs, run, activate/deactivate, export/import as tar.gz
+- **Dateien** — all files from `/shared/uploads/` with multi-select, bulk download (ZIP) + bulk delete
+- **Einstellungen** — repair (container restart), wipe, speech output, Whisper, language model, runtime config, app onboarding (QR), full reset
 
 ---
 
````
```diff
@@ -311,13 +312,15 @@ Reachable at `http://<VM-IP>:3001`. Shares the network with the bridge.
 ### Tabs
 
 - **Main**: Brain/RVS/proxy status, chat test, "ARIA denkt..." indicator, end-to-end trace, container logs
-- **Gehirn**: memory browser (vector DB), search + filter, edit/add/delete, brain export/import (tar.gz), skills (planned)
-- **Dateien**: browser for `/shared/uploads/` — download or delete files generated by ARIA or uploaded (live update of the chat bubbles)
-- **Einstellungen**: repair (container restart for Brain/Bridge/Qdrant), full reset, operating modes, speech output + voice cloning + F5-TTS tuning, Whisper, onboarding QR, app cleanup
+- **Gehirn**: memory browser (vector DB), search + filter, edit/add/delete, conversation status with distillation trigger, **token/call metrics with subscription quota tracking**, bootstrap & migration (3 recovery paths), brain export/import (tar.gz). Info buttons (ℹ) everywhere with modal explanations.
+- **Skills**: list of all skills with logs per run, activate/deactivate, export/import as tar.gz, "von ARIA" badge for self-built ones
+- **Dateien**: browser for `/shared/uploads/` with multi-select + "select all" + bulk download (ZIP for 2+) + bulk delete. Live update of the chat bubbles on delete.
+- **Einstellungen**: repair (container restart for Brain/Bridge/Qdrant), full reset, operating modes, speech output + voice cloning + F5-TTS tuning + voice export/import, Whisper, language model (brainModel), onboarding QR, app cleanup
 
 ### What else is in there
 
 - **Disk-full banner** with copyable cleanup commands (safe + aggressive)
+- **Token/call metrics**: one entry per Claude call in `/data/metrics.jsonl` with ts + token estimate. The Gehirn tab shows 1h/5h/24h/30d aggregates plus a progress bar against the plan limit (Pro / Max 5x / Max 20x / Custom). Warning threshold 80%, critical 90%.
 - **Voice cloning**: upload audio samples; Whisper transcribes the reference text automatically
 - **Voice export/import**: carry individual voices between Gameboxes as `.tar.gz`
 - **Settings export/import**: `voice_config.json` + `highlight_triggers.json` as a JSON bundle
```
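The token/call metrics bullet above describes per-call JSONL entries aggregated over time windows and compared against a plan limit with 80%/90% thresholds. A minimal sketch of that aggregation, assuming a simplified entry shape `{ts, tokens}` (the field names are assumptions, not the project's actual schema):

```typescript
// Sketch: aggregate JSONL metric entries over a time window and classify
// quota usage against a plan limit. Entry shape {ts, tokens} is assumed.
interface MetricEntry { ts: number; tokens: number; }

function parseMetricsJsonl(text: string): MetricEntry[] {
  return text
    .split('\n')
    .filter(line => line.trim().length > 0)
    .map(line => JSON.parse(line) as MetricEntry);
}

function tokensInWindow(entries: MetricEntry[], windowMs: number, now: number): number {
  const cutoff = now - windowMs;
  return entries
    .filter(e => e.ts >= cutoff)
    .reduce((sum, e) => sum + e.tokens, 0);
}

// Warning at 80%, critical at 90% — the thresholds named in the bullet above.
function quotaLevel(used: number, limit: number): 'ok' | 'warn' | 'critical' {
  const ratio = used / limit;
  if (ratio >= 0.9) return 'critical';
  if (ratio >= 0.8) return 'warn';
  return 'ok';
}
```

The same `tokensInWindow` call with four different `windowMs` values would produce the 1h/5h/24h/30d aggregates the tab displays.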
```diff
@@ -842,20 +845,29 @@ docker exec aria-brain curl localhost:8080/memory/stats
 ### Phase A — Refactor: OpenClaw out, own Brain in
 
 - [x] aria-brain container skeleton (FastAPI, Qdrant, sentence-transformers)
+- [x] aria-core (OpenClaw) completely torn out — tag `v0.1.2.0` as archive
 - [x] Diagnostic: Gehirn tab (memory search/filter, add/edit/delete)
 - [x] Diagnostic: brain export/import as tar.gz
-- [x] Diagnostic: file manager (list, search, download, delete with live bubble update)
-- [x] App: file manager as a modal in the settings
+- [x] Diagnostic: file manager (list, search, download, delete, multi-select + ZIP + bulk delete)
 - [x] Diagnostic: full reset (wipe all)
+- [x] Diagnostic: info buttons with modal explanations (status, conversation, memories, bootstrap)
+- [x] App: file manager as a modal in the settings (with multi-select + ZIP download)
 - [x] Voice export/import (individual voices + F5/Whisper settings as a bundle)
-- [x] aria-core (OpenClaw) completely torn out — tag `v0.1.2.0` as archive
-- [ ] **Phase B item 2:** migration `aria-data/brain-import/` → atomic memory points
-- [ ] **Phase B item 3:** Brain conversation loop (single chat + rolling window + memory distillation)
-- [ ] **Phase B item 4:** skills system (manifest, venv, README per skill, Diagnostic tab)
+
+### Phase B — Brain with memory + loop + skills
+
+- [x] **Phase B item 2:** migration from `aria-data/brain-import/` → atomic memory points (identity / rule / preference / tool / skill, idempotent via migration_key) + bootstrap snapshot export/import (pinned only)
+- [x] **Phase B item 3:** Brain conversation loop (single-chat UI, rolling window of 50 turns, threshold 60 → automatic distillation, manual trigger)
+- [x] **Phase B item 4:** skills system (Python-only via local venv, skill_create as a tool, dynamic run_<skill> tools, Diagnostic Skills tab with logs/toggle/export/import, skill_created live notification in app + Diagnostic, hard threshold "pip → skill")
+- [x] Language-model setting functional again (brainModel in runtime.json instead of aria-core)
+- [x] App chat sync: full server sync on reconnect (server = source of truth) + chat_cleared live update. Local-only bubbles (skill notifications, in-flight voice without STT) are kept.
+- [x] App: chat search with next/prev navigation instead of filtering
+- [x] Token/call metrics + subscription quota tracking (Pro / Max 5x / Max 20x / Custom)
+- [x] File manager multi-select: bulk download as ZIP + bulk delete (Diagnostic + app)
 
 ### Phase 2 — ARIA becomes productive
 
-- [ ] Build skills (image generation, etc.)
+- [ ] Have first skills built (yt-dlp, pdf-extract, etc. — via normal requests)
 - [ ] Gitea integration
 - [ ] Set up the VM (desktop, browser, tools)
 - [ ] Heartbeat (periodic self-checks)
```
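The Phase B item 2 entry above calls the migration "idempotent via migration_key", meaning a re-run never duplicates memory points. A sketch of that pattern under stated assumptions (an in-memory `Map` stands in for the real vector store, and the point shape is invented for illustration):

```typescript
// Sketch: idempotent import keyed by migration_key — re-running the same
// batch creates nothing new. The Map is a stand-in for the actual store.
interface MemoryPoint { migrationKey: string; kind: string; text: string; }

function importPoints(store: Map<string, MemoryPoint>, incoming: MemoryPoint[]): number {
  let created = 0;
  for (const p of incoming) {
    if (!store.has(p.migrationKey)) {  // already imported → skip silently
      store.set(p.migrationKey, p);
      created++;
    }
  }
  return created;  // number of newly created points
}
```

The key property: running `importPoints` twice with the same batch returns 0 the second time, which is what makes repeated migrations safe.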
```diff
@@ -79,8 +79,8 @@ android {
         applicationId "com.ariacockpit"
         minSdkVersion rootProject.ext.minSdkVersion
         targetSdkVersion rootProject.ext.targetSdkVersion
-        versionCode 10202
-        versionName "0.1.2.2"
+        versionCode 10205
+        versionName "0.1.2.5"
         // Fallback for libraries with product flavors
         missingDimensionStrategy 'react-native-camera', 'general'
     }
```
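The Gradle hunk above bumps versionName and versionCode together. The two pairs visible in the diff (0.1.2.2 → 10202 and 0.1.2.5 → 10205) are consistent with encoding the last three version segments as `x*10000 + y*100 + z`, though that convention is inferred from the diff, not stated anywhere:

```typescript
// Sketch: derive versionCode from a four-part versionName "0.x.y.z".
// The mapping x*10000 + y*100 + z matches both pairs in the diff but is
// an assumed convention, not documented project policy.
function versionCodeFor(versionName: string): number {
  const parts = versionName.split('.').map(Number);
  if (parts.length !== 4 || parts.some(Number.isNaN)) {
    throw new Error(`unexpected versionName: ${versionName}`);
  }
  const [, x, y, z] = parts;
  return x * 10000 + y * 100 + z;
}
```

Keeping the two values derivable from one another avoids the classic mistake of bumping versionName while forgetting versionCode, which Android uses for upgrade ordering.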
```diff
@@ -1,6 +1,6 @@
 {
   "name": "aria-cockpit",
-  "version": "0.1.2.2",
+  "version": "0.1.2.5",
   "private": true,
   "scripts": {
     "android": "react-native run-android",
```
```diff
@@ -79,6 +79,14 @@ interface ChatMessage {
     active: boolean;
     setupError?: string;
   };
+  /** Trigger-created bubble: ARIA has created a new trigger */
+  triggerCreated?: {
+    name: string;
+    type: 'timer' | 'watcher' | string;
+    message: string;
+    fires_at?: string;
+    condition?: string;
+  };
 }
 
 // --- Constants ---
```
```diff
@@ -201,6 +209,7 @@ const ChatScreen: React.FC = () => {
   const [fullscreenImage, setFullscreenImage] = useState<string | null>(null);
   const [searchQuery, setSearchQuery] = useState('');
   const [searchVisible, setSearchVisible] = useState(false);
+  const [searchIndex, setSearchIndex] = useState(0); // which match is active
   const [pendingAttachments, setPendingAttachments] = useState<{file: any, isPhoto: boolean}[]>([]);
   const [agentActivity, setAgentActivity] = useState<{activity: string, tool: string}>({activity: 'idle', tool: ''});
   // Service status (Gamebox: F5-TTS / Whisper load status) + banner visibility
```
```diff
@@ -396,6 +405,67 @@ const ChatScreen: React.FC = () => {
       }
 
       // skill_created: ARIA has created a new skill → its own bubble
+      // chat_cleared: Diagnostic has completely emptied the history
+      // → clear it locally as well (visually + persistence)
+      if (message.type === 'chat_cleared') {
+        console.log('[Chat] chat_cleared — leere lokale Anzeige + Storage');
+        setMessages([]);
+        AsyncStorage.removeItem(CHAT_STORAGE_KEY).catch(() => {});
+        AsyncStorage.removeItem('aria_chat_last_sync').catch(() => {});
+        return;
+      }
+
+      // chat_history_response: the complete server state. The app replaces
+      // its persisted chat history with it. Local-only bubbles (in-flight
+      // voice recordings without an STT result, skill-created events
+      // without text) are kept — they are clearly recognizable as "local"
+      // by the missing 'text' or by skillCreated/audioRequestId.
+      if (message.type === 'chat_history_response') {
+        const p = (message.payload || {}) as any;
+        const incoming = (p.messages || []) as Array<any>;
+        console.log(`[Chat] Server-Sync: ${incoming.length} Nachrichten vom Server`);
+        const fromServer: ChatMessage[] = incoming.map(m => {
+          const role = m.role === 'user' ? 'user' : 'aria';
+          const files = Array.isArray(m.files) ? m.files : [];
+          const attachments = files.map((f: any) => ({
+            type: (typeof f.mimeType === 'string' && f.mimeType.startsWith('image/')) ? 'image' : 'file',
+            name: f.name || 'datei',
+            size: f.size || 0,
+            mimeType: f.mimeType || '',
+            serverPath: f.serverPath || '',
+          })) as Attachment[];
+          return {
+            id: nextId(),
+            sender: role as 'user' | 'aria',
+            text: m.text || '',
+            timestamp: m.ts || Date.now(),
+            attachments: attachments.length ? attachments : undefined,
+          };
+        });
+        const maxTs = incoming.reduce((mx: number, m: any) => Math.max(mx, m.ts || 0), 0);
+        setMessages(prev => {
+          // Detect + keep local-only bubbles:
+          // - skill-created notifications (skillCreated set)
+          // - in-flight voice messages without an STT result
+          //   (audioRequestId set AND text empty/placeholder)
+          const localOnly = prev.filter(m =>
+            m.skillCreated ||
+            m.triggerCreated ||
+            (m.audioRequestId && (!m.text || m.text === '🎙 Aufnahme...' || m.text === 'Aufnahme...'))
+          );
+          // Server state + local-only (sorted chronologically)
+          const merged = [...fromServer, ...localOnly].sort((a, b) => a.timestamp - b.timestamp);
+          return capMessages(merged);
+        });
+        if (maxTs > 0) {
+          AsyncStorage.setItem('aria_chat_last_sync', String(maxTs)).catch(() => {});
+        } else {
+          // Server empty → reset our lastSync as well
+          AsyncStorage.removeItem('aria_chat_last_sync').catch(() => {});
+        }
+        return;
+      }
+
       if (message.type === 'skill_created') {
         const p = (message.payload || {}) as any;
         const skillMsg: ChatMessage = {
```
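The merge rule in the `chat_history_response` handler above (server messages win; local-only bubbles survive; everything re-sorted by timestamp) can be sketched as a pure function. The message shape here is simplified for illustration and is not the app's full `ChatMessage` interface:

```typescript
// Sketch of the merge rule: the server is the source of truth, but bubbles
// that only exist locally (skill/trigger notifications, voice recordings
// still waiting for an STT result) are kept and interleaved by timestamp.
interface Msg {
  text: string;
  timestamp: number;
  skillCreated?: boolean;
  triggerCreated?: boolean;
  audioRequestId?: string;
}

function isLocalOnly(m: Msg): boolean {
  return Boolean(
    m.skillCreated ||
    m.triggerCreated ||
    (m.audioRequestId && (!m.text || m.text === 'Aufnahme...'))
  );
}

function mergeHistory(fromServer: Msg[], prev: Msg[]): Msg[] {
  const localOnly = prev.filter(isLocalOnly);
  return [...fromServer, ...localOnly].sort((a, b) => a.timestamp - b.timestamp);
}
```

Note that ordinary previous messages are deliberately dropped: if the server no longer has them (e.g. after a conversation reset), the app mirrors that.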
```diff
@@ -415,6 +485,26 @@ const ChatScreen: React.FC = () => {
         return;
       }
 
+      // trigger_created: ARIA has created a trigger → its own bubble
+      if (message.type === 'trigger_created') {
+        const p = (message.payload || {}) as any;
+        const triggerMsg: ChatMessage = {
+          id: nextId(),
+          sender: 'aria',
+          text: '',
+          timestamp: Date.now(),
+          triggerCreated: {
+            name: String(p.name || '(unbenannt)'),
+            type: String(p.type || 'timer'),
+            message: String(p.message || ''),
+            fires_at: p.fires_at ? String(p.fires_at) : undefined,
+            condition: p.condition ? String(p.condition) : undefined,
+          },
+        };
+        setMessages(prev => capMessages([...prev, triggerMsg]));
+        return;
+      }
+
       // file_deleted: a file was deleted (by the Diagnostic user) → update the bubble
       if (message.type === 'file_deleted') {
         const p = (message.payload?.path as string) || '';
```
```diff
@@ -480,6 +570,13 @@ const ChatScreen: React.FC = () => {
         const dbgText = ((message.payload.text as string) || '').slice(0, 60);
         console.log('[Chat] chat-event sender=%s text=%s', sender || '(none)', dbgText);
 
+        // Track last-sync — so that on reconnect the same message is not
+        // loaded again from the server backup
+        if (sender === 'aria' || sender === 'user' || sender === 'stt') {
+          const ts = message.timestamp || Date.now();
+          AsyncStorage.setItem('aria_chat_last_sync', String(ts)).catch(() => {});
+        }
+
         // STT result: write the transcribed text into the voice bubble.
         // IMPORTANT: only match the FIRST still-unresolved recording — otherwise,
         // with two audios sent shortly after each other, both bubbles would
```
```diff
@@ -647,6 +744,14 @@ const ChatScreen: React.FC = () => {
 
     const unsubState = rvs.onStateChange((state) => {
       setConnectionState(state);
+      // On (re)connect: fetch the COMPLETE server state. The server is the
+      // source of truth — if it is empty (e.g. after "reset conversation"),
+      // the app should mirror that, even if it was offline when that
+      // happened. since=0 + limit=200 → the last 200 messages from the
+      // server, or an empty array if the server is empty.
+      if (state === 'connected') {
+        rvs.send('chat_history_request' as any, { since: 0, limit: 200 });
+      }
     });
 
     // Set the initial status
```
```diff
@@ -830,6 +935,51 @@ const ChatScreen: React.FC = () => {
   // Inverted FlatList: newest messages at the bottom, no manual scrolling needed
   const invertedMessages = useMemo(() => [...messages].reverse(), [messages]);
 
+  // Search matches: all message IDs that match the query, in chronological
+  // order (oldest first). On query change we reset the index.
+  const searchMatchIds = useMemo(() => {
+    const q = searchQuery.trim().toLowerCase();
+    if (!q) return [] as string[];
+    return messages
+      .filter(m => (m.text || '').toLowerCase().includes(q))
+      .map(m => m.id);
+  }, [messages, searchQuery]);
+
+  useEffect(() => {
+    setSearchIndex(0);
+  }, [searchQuery]);
+
+  // On index change, scroll to the corresponding bubble.
+  // The FlatList is `inverted` → viewPosition 0.5 (middle) is, with the
+  // inverted render, actually the middle of the visible area. We delay
+  // minimally so the layout is definitely finished.
+  useEffect(() => {
+    if (!searchMatchIds.length) return;
+    const id = searchMatchIds[searchIndex];
+    if (!id) return;
+    const idx = invertedMessages.findIndex(m => m.id === id);
+    if (idx < 0 || !flatListRef.current) return;
+    const tryScroll = () => {
+      try {
+        flatListRef.current?.scrollToIndex({ index: idx, animated: true, viewPosition: 0.5 });
+      } catch {
+        // retried again by onScrollToIndexFailed
+      }
+    };
+    // requestAnimationFrame instead of setTimeout 0 — waits for the next layout frame
+    requestAnimationFrame(tryScroll);
+  }, [searchIndex, searchMatchIds, invertedMessages]);
+
+  const activeSearchId = searchMatchIds[searchIndex] || '';
+  const gotoSearchPrev = () => {
+    if (!searchMatchIds.length) return;
+    setSearchIndex(i => (i - 1 + searchMatchIds.length) % searchMatchIds.length);
+  };
+  const gotoSearchNext = () => {
+    if (!searchMatchIds.length) return;
+    setSearchIndex(i => (i + 1) % searchMatchIds.length);
+  };
+
   // Get GPS position (optional)
   const getCurrentLocation = useCallback((): Promise<{ lat: number; lon: number } | null> => {
     if (!gpsEnabled) {
```
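The prev/next handlers above cycle through the match list with modular arithmetic; the `+ length` term keeps the prev step non-negative before the modulo, since JavaScript's `%` can return negative values. Extracted as standalone functions:

```typescript
// Sketch of the wrap-around navigation used by gotoSearchNext/gotoSearchPrev:
// indices cycle through [0, total). The "+ total" before the modulo avoids a
// negative result when stepping back from index 0 (JS % keeps the sign of
// the dividend).
function nextIndex(i: number, total: number): number {
  return (i + 1) % total;
}

function prevIndex(i: number, total: number): number {
  return (i - 1 + total) % total;
}
```

Without the `+ total` guard, `(-1) % 3` would evaluate to `-1` in JavaScript and the scroll target lookup would silently fail.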
```diff
@@ -1081,12 +1231,38 @@ const ChatScreen: React.FC = () => {
       hour: '2-digit',
       minute: '2-digit',
     });
+    const isSearchHit = activeSearchId === item.id;
+    const searchHighlightStyle = isSearchHit
+      ? { borderWidth: 2, borderColor: '#FFD60A' }
+      : null;
+
+    // Special bubble: ARIA has created a trigger
+    if (item.triggerCreated) {
+      const t = item.triggerCreated;
+      const detailLine = t.type === 'timer'
+        ? `feuert: ${t.fires_at || '?'}`
+        : `wenn: ${t.condition || '?'}`;
+      return (
+        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}, searchHighlightStyle]}>
+          <Text style={{color: '#FFD60A', fontWeight: 'bold', fontSize: 14}}>
+            {'⏰ ARIA hat einen Trigger angelegt'}
+          </Text>
+          <Text style={{color: '#E0E0F0', marginTop: 4, fontSize: 14}}>
+            <Text style={{fontWeight: 'bold'}}>{t.name}</Text>
+            <Text style={{color: '#8888AA', fontSize: 12}}>{` (${t.type})`}</Text>
+          </Text>
+          <Text style={{color: '#8888AA', fontSize: 12, marginTop: 2, fontFamily: 'monospace'}}>{detailLine}</Text>
+          <Text style={{color: '#888', fontSize: 12, marginTop: 2}}>{`"${t.message}"`}</Text>
+          <Text style={{color: '#555570', fontSize: 10, marginTop: 6}}>ARIA-Trigger · {time}</Text>
+        </View>
+      );
+    }
+
     // Special bubble: ARIA has created a skill
     if (item.skillCreated) {
       const s = item.skillCreated;
       return (
-        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}]}>
+        <View style={[styles.messageBubble, styles.ariaBubble, {borderLeftWidth: 3, borderLeftColor: '#FFD60A'}, searchHighlightStyle]}>
           <Text style={{color: '#FFD60A', fontWeight: 'bold', fontSize: 14}}>
             {'🛠 ARIA hat einen neuen Skill erstellt'}
           </Text>
```
```diff
@@ -1106,7 +1282,7 @@ const ChatScreen: React.FC = () => {
     }
 
     return (
-      <View style={[styles.messageBubble, isUser ? styles.userBubble : styles.ariaBubble]}>
+      <View style={[styles.messageBubble, isUser ? styles.userBubble : styles.ariaBubble, searchHighlightStyle]}>
        {/* Attachment preview */}
        {item.attachments?.map((att, idx) => (
          <View key={idx}>
```
```diff
@@ -1280,7 +1456,7 @@ const ChatScreen: React.FC = () => {
       );
     })()}
 
-    {/* Search bar */}
+    {/* Search bar with match navigation */}
     {searchVisible && (
       <View style={styles.searchBar}>
         <TextInput
```
```diff
@@ -1291,17 +1467,47 @@ const ChatScreen: React.FC = () => {
           placeholderTextColor="#555570"
           autoFocus
         />
+        {searchQuery ? (
+          <Text style={{color: searchMatchIds.length ? '#0096FF' : '#555570', fontSize: 12, paddingHorizontal: 6}}>
+            {searchMatchIds.length ? `${searchIndex + 1}/${searchMatchIds.length}` : '0/0'}
+          </Text>
+        ) : null}
+        <TouchableOpacity
+          onPress={gotoSearchPrev}
+          disabled={!searchMatchIds.length}
+          style={{paddingHorizontal: 6, opacity: searchMatchIds.length ? 1 : 0.3}}
+        >
+          <Text style={{color: '#0096FF', fontSize: 18}}>{'▲'}</Text>
+        </TouchableOpacity>
+        <TouchableOpacity
+          onPress={gotoSearchNext}
+          disabled={!searchMatchIds.length}
+          style={{paddingHorizontal: 6, opacity: searchMatchIds.length ? 1 : 0.3}}
+        >
+          <Text style={{color: '#0096FF', fontSize: 18}}>{'▼'}</Text>
+        </TouchableOpacity>
         <TouchableOpacity onPress={() => { setSearchVisible(false); setSearchQuery(''); }}>
           <Text style={{color: '#FF3B30', fontSize: 14, paddingHorizontal: 8}}>X</Text>
         </TouchableOpacity>
       </View>
     )}
 
-    {/* Message list */}
+    {/* Message list — the search NO LONGER filters; it highlights the active
+        match instead (see renderMessage: activeSearchId border). */}
     <FlatList
       ref={flatListRef}
       inverted
-      data={searchQuery ? messages.filter(m => m.text.toLowerCase().includes(searchQuery.toLowerCase())).reverse() : invertedMessages}
+      data={invertedMessages}
+      onScrollToIndexFailed={(info) => {
+        // The FlatList does not know the item layout yet. First scroll
+        // roughly into the vicinity (average item height estimate), then
+        // retry precisely after 250ms.
+        const offset = info.averageItemLength * info.index;
+        try { flatListRef.current?.scrollToOffset({ offset, animated: false }); } catch {}
+        setTimeout(() => {
+          try { flatListRef.current?.scrollToIndex({ index: info.index, animated: true, viewPosition: 0.5 }); } catch {}
+        }, 250);
+      }}
       keyExtractor={item => item.id}
       renderItem={renderMessage}
       contentContainerStyle={styles.messageList}
```
```diff
@@ -155,6 +155,9 @@ const SettingsScreen: React.FC = () => {
   const [fileManagerError, setFileManagerError] = useState('');
   const [fileManagerSearch, setFileManagerSearch] = useState('');
   const [fileManagerFilter, setFileManagerFilter] = useState<'all' | 'aria' | 'user'>('all');
+  const [fileManagerSelected, setFileManagerSelected] = useState<Set<string>>(new Set());
+  const fileZipPending = useRef<string | null>(null); // requestId for the ZIP response
+  const [fileZipBusy, setFileZipBusy] = useState(false);
  const [voiceCloneVisible, setVoiceCloneVisible] = useState(false);
  const [tempPath, setTempPath] = useState('');
  // Sub-screen navigation: null = main menu, otherwise one of the section IDs.
```
```diff
@@ -395,9 +398,39 @@ const SettingsScreen: React.FC = () => {
       const p: any = message.payload || {};
       if (p.path) {
         setFileManagerFiles(prev => prev.filter(f => f.path !== p.path));
+        setFileManagerSelected(prev => {
+          if (!prev.has(p.path)) return prev;
+          const next = new Set(prev);
+          next.delete(p.path);
+          return next;
+        });
       }
     }
+
+    // File manager: ZIP response (multi-download)
+    if (message.type === ('file_zip_response' as any)) {
+      const p: any = message.payload || {};
+      if (p.requestId && p.requestId !== fileZipPending.current) return; // stale
+      fileZipPending.current = null;
+      setFileZipBusy(false);
+      if (!p.ok || !p.data) {
+        ToastAndroid.show('ZIP fehlgeschlagen: ' + (p.error || 'unbekannt'), ToastAndroid.LONG);
+        return;
+      }
+      // base64 → write into the Downloads folder
+      (async () => {
+        try {
+          const ts = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
+          const dir = RNFS.DownloadDirectoryPath;
+          const filePath = `${dir}/aria-files-${ts}.zip`;
+          await RNFS.writeFile(filePath, p.data, 'base64');
+          ToastAndroid.show(`ZIP gespeichert: ${filePath} (${Math.round((p.size||0)/1024)} KB)`, ToastAndroid.LONG);
+        } catch (e: any) {
+          ToastAndroid.show('ZIP speichern fehlgeschlagen: ' + e.message, ToastAndroid.LONG);
+        }
+      })();
+    }
 
     // Voice was saved → reload the list + select it if applicable
     if (message.type === ('xtts_voice_saved' as any)) {
       const name = (message.payload as any).name as string;
```
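The ZIP handler above drops any response whose requestId no longer matches the pending one, so a slow, superseded response cannot overwrite the state of a newer request. That guard generalizes to a small helper, sketched here with assumed names (`PendingRequest`, `issue`, `accept` are illustration, not project API):

```typescript
// Sketch of the requestId staleness guard: only the response matching the
// most recently issued request is accepted, and only once; everything else
// is ignored as stale.
class PendingRequest {
  private current: string | null = null;

  issue(id: string): void {
    this.current = id;  // a newer request supersedes any older one
  }

  // Returns true exactly once for the matching response, then clears.
  accept(id: string): boolean {
    if (this.current === null || id !== this.current) return false;
    this.current = null;
    return true;
  }
}
```

Note the actual handler is slightly laxer: it also accepts responses that carry no requestId at all, presumably for backward compatibility with an older bridge.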
```diff
@@ -644,9 +677,8 @@ const SettingsScreen: React.FC = () => {
         <Text style={{color:'#8888AA', textAlign:'center', marginTop:20}}>Lade...</Text>
       ) : fileManagerError ? (
         <Text style={{color:'#FF6B6B', textAlign:'center', marginTop:20}}>{fileManagerError}</Text>
-      ) : (
-        <ScrollView style={{flex:1}} contentContainerStyle={{padding:12}}>
-          {(() => {
+      ) : (() => {
+        // Visible files (filter + search)
         let files = fileManagerFiles;
         if (fileManagerFilter === 'aria') files = files.filter(f => f.fromAria);
         else if (fileManagerFilter === 'user') files = files.filter(f => !f.fromAria);
```
```diff
@@ -654,15 +686,120 @@ const SettingsScreen: React.FC = () => {
           const q = fileManagerSearch.toLowerCase();
           files = files.filter(f => f.name.toLowerCase().includes(q));
         }
-        if (!files.length) {
-          return <Text style={{color:'#555570', textAlign:'center', marginTop:20}}>Keine Dateien</Text>;
-        }
+        const visiblePaths = files.map(f => f.path);
+        const selectedHere = visiblePaths.filter(p => fileManagerSelected.has(p));
+        const allSelected = visiblePaths.length > 0 && selectedHere.length === visiblePaths.length;
         const fmtSize = (b: number) => b < 1024 ? `${b} B` : b < 1024*1024 ? `${(b/1024).toFixed(1)} KB` : `${(b/1024/1024).toFixed(1)} MB`;
-        return files.map(f => (
-          <View key={f.path} style={{
-            backgroundColor:'#0D0D1A', padding:12, borderRadius:8, marginBottom:8,
-            flexDirection:'row', alignItems:'center', gap:8,
+
+        const toggleSelectAll = () => {
+          setFileManagerSelected(prev => {
+            const next = new Set(prev);
+            if (allSelected) visiblePaths.forEach(p => next.delete(p));
+            else visiblePaths.forEach(p => next.add(p));
+            return next;
+          });
+        };
+        const toggleOne = (p: string) => {
+          setFileManagerSelected(prev => {
+            const next = new Set(prev);
+            if (next.has(p)) next.delete(p);
+            else next.add(p);
+            return next;
+          });
+        };
+        const bulkDelete = () => {
+          const paths = [...fileManagerSelected];
+          if (!paths.length) return;
+          Alert.alert(
+            `${paths.length} Dateien löschen?`,
+            'In allen Chat-Bubbles werden sie als gelöscht markiert.',
+            [
+              { text: 'Abbrechen', style: 'cancel' },
+              { text: 'Löschen', style: 'destructive', onPress: () => {
+                rvs.send('file_delete_batch_request' as any, { paths, requestId: 'batch-' + Date.now() });
+                setFileManagerSelected(new Set());
+                ToastAndroid.show(`${paths.length} Lösch-Befehle gesendet…`, ToastAndroid.SHORT);
+              }},
+            ],
+          );
+        };
+        const bulkDownload = () => {
+          const paths = [...fileManagerSelected];
+          if (!paths.length) return;
+          // 1 file: simply via file_request (existing pattern). ZIP only for 2+.
+          if (paths.length === 1) {
+            rvs.send('file_request' as any, { serverPath: paths[0], requestId: 'single-' + Date.now() });
+            ToastAndroid.show('Datei wird heruntergeladen…', ToastAndroid.SHORT);
+            return;
+          }
+          const reqId = 'zip-' + Date.now();
+          fileZipPending.current = reqId;
```
|
||||||
|
setFileZipBusy(true);
|
||||||
|
rvs.send('file_zip_request' as any, { paths, requestId: reqId });
|
||||||
|
ToastAndroid.show(`ZIP wird erstellt (${paths.length} Dateien)…`, ToastAndroid.LONG);
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<>
|
||||||
|
{/* Bulk-Bar */}
|
||||||
|
<View style={{paddingHorizontal:12, paddingBottom:8, flexDirection:'row', alignItems:'center', gap:8, flexWrap:'wrap'}}>
|
||||||
|
<TouchableOpacity onPress={toggleSelectAll} style={{flexDirection:'row', alignItems:'center', gap:6, paddingVertical:4}}>
|
||||||
|
<View style={{
|
||||||
|
width:18, height:18, borderRadius:3,
|
||||||
|
borderWidth:2, borderColor: allSelected ? '#0096FF' : '#555570',
|
||||||
|
backgroundColor: allSelected ? '#0096FF' : 'transparent',
|
||||||
|
alignItems:'center', justifyContent:'center',
|
||||||
}}>
|
}}>
|
||||||
|
{allSelected && <Text style={{color:'#fff', fontSize:11, fontWeight:'bold'}}>✓</Text>}
|
||||||
|
</View>
|
||||||
|
<Text style={{color:'#E0E0F0', fontSize:13}}>Alle markieren</Text>
|
||||||
|
</TouchableOpacity>
|
||||||
|
{fileManagerSelected.size > 0 && (
|
||||||
|
<>
|
||||||
|
<Text style={{color:'#555570', fontSize:13}}>·</Text>
|
||||||
|
<Text style={{color:'#0096FF', fontSize:13, fontWeight:'600'}}>{fileManagerSelected.size} ausgewählt</Text>
|
||||||
|
<TouchableOpacity
|
||||||
|
onPress={bulkDownload}
|
||||||
|
disabled={fileZipBusy}
|
||||||
|
style={{paddingVertical:4, paddingHorizontal:10, borderRadius:6, backgroundColor:'#0096FF22', opacity: fileZipBusy ? 0.5 : 1}}
|
||||||
|
>
|
||||||
|
<Text style={{color:'#0096FF', fontSize:12}}>{fileZipBusy ? '⏳ ZIP…' : (fileManagerSelected.size > 1 ? '⬇ ZIP' : '⬇ Download')}</Text>
|
||||||
|
</TouchableOpacity>
|
||||||
|
<TouchableOpacity
|
||||||
|
onPress={bulkDelete}
|
||||||
|
style={{paddingVertical:4, paddingHorizontal:10, borderRadius:6, backgroundColor:'#FF6B6B22'}}
|
||||||
|
>
|
||||||
|
<Text style={{color:'#FF6B6B', fontSize:12}}>🗑 Löschen</Text>
|
||||||
|
</TouchableOpacity>
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
</View>
|
||||||
|
|
||||||
|
<ScrollView style={{flex:1}} contentContainerStyle={{padding:12, paddingTop:0}}>
|
||||||
|
{!files.length ? (
|
||||||
|
<Text style={{color:'#555570', textAlign:'center', marginTop:20}}>Keine Dateien</Text>
|
||||||
|
) : files.map(f => {
|
||||||
|
const selected = fileManagerSelected.has(f.path);
|
||||||
|
return (
|
||||||
|
<TouchableOpacity
|
||||||
|
key={f.path}
|
||||||
|
onPress={() => toggleOne(f.path)}
|
||||||
|
activeOpacity={0.7}
|
||||||
|
style={{
|
||||||
|
backgroundColor: selected ? '#1E2C44' : '#0D0D1A',
|
||||||
|
padding:12, borderRadius:8, marginBottom:8,
|
||||||
|
flexDirection:'row', alignItems:'center', gap:8,
|
||||||
|
borderWidth: selected ? 1 : 0, borderColor:'#0096FF',
|
||||||
|
}}
|
||||||
|
>
|
||||||
|
<View style={{
|
||||||
|
width:18, height:18, borderRadius:3,
|
||||||
|
borderWidth:2, borderColor: selected ? '#0096FF' : '#555570',
|
||||||
|
backgroundColor: selected ? '#0096FF' : 'transparent',
|
||||||
|
alignItems:'center', justifyContent:'center',
|
||||||
|
}}>
|
||||||
|
{selected && <Text style={{color:'#fff', fontSize:11, fontWeight:'bold'}}>✓</Text>}
|
||||||
|
</View>
|
||||||
<View style={{flex:1}}>
|
<View style={{flex:1}}>
|
||||||
<View style={{flexDirection:'row', alignItems:'center'}}>
|
<View style={{flexDirection:'row', alignItems:'center'}}>
|
||||||
<View style={{
|
<View style={{
|
||||||
@@ -697,11 +834,13 @@ const SettingsScreen: React.FC = () => {
|
|||||||
>
|
>
|
||||||
<Text style={{color:'#FF6B6B', fontSize:18}}>🗑</Text>
|
<Text style={{color:'#FF6B6B', fontSize:18}}>🗑</Text>
|
||||||
</TouchableOpacity>
|
</TouchableOpacity>
|
||||||
</View>
|
</TouchableOpacity>
|
||||||
));
|
);
|
||||||
})()}
|
})}
|
||||||
</ScrollView>
|
</ScrollView>
|
||||||
)}
|
</>
|
||||||
|
);
|
||||||
|
})()}
|
||||||
</View>
|
</View>
|
||||||
</Modal>
|
</Modal>
|
||||||
<ScrollView style={styles.container} contentContainerStyle={styles.content}>
|
<ScrollView style={styles.container} contentContainerStyle={styles.content}>
|
||||||
|
|||||||
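The select-all toggle in the new bulk bar deliberately operates only on the currently *visible* (filtered) paths: deselecting "all" leaves hidden-but-selected files untouched. A minimal Python sketch of the same set semantics (illustrative names only, not code from the app, and Python rather than TSX to match the backend files below):

```python
def toggle_select_all(selected: set[str], visible: list[str]) -> set[str]:
    """Mirror of toggleSelectAll: if every visible path is selected,
    deselect only the visible ones; otherwise select them all."""
    next_sel = set(selected)  # copy, like `new Set(prev)` in the TSX
    all_selected = bool(visible) and all(p in next_sel for p in visible)
    if all_selected:
        next_sel.difference_update(visible)
    else:
        next_sel.update(visible)
    return next_sel


def toggle_one(selected: set[str], path: str) -> set[str]:
    """Mirror of toggleOne: flip membership of a single path."""
    next_sel = set(selected)
    next_sel.symmetric_difference_update({path})  # add if absent, remove if present
    return next_sel
```

Note that `toggle_select_all({"a", "x"}, ["a"])` yields `{"x"}`: a selection made under a different filter survives "select all" being toggled off.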
+141 -1
@@ -25,6 +25,8 @@ from memory import Embedder, VectorStore, MemoryPoint
 from prompts import build_system_prompt
 from proxy_client import ProxyClient, Message as ProxyMessage
 import skills as skills_mod
+import triggers as triggers_mod
+import watcher as watcher_mod
 
 logger = logging.getLogger(__name__)
 
@@ -90,6 +92,90 @@ META_TOOLS = [
             "parameters": {"type": "object", "properties": {}},
         },
     },
+    {
+        "type": "function",
+        "function": {
+            "name": "trigger_timer",
+            "description": (
+                "Lege einen Timer-Trigger an — feuert EINMALIG zum angegebenen Zeitpunkt "
+                "und ruft dich selbst auf (Push-Nachricht an Stefan). "
+                "Use-Case: 'erinnere mich in 10min', 'sag mir um 14:30 Bescheid'."
+            ),
+            "parameters": {
+                "type": "object",
+                "properties": {
+                    "name": {"type": "string", "description": "kurzer kebab-case-Name, a-z 0-9 - _"},
+                    "fires_at": {
+                        "type": "string",
+                        "description": (
+                            "Absoluter ISO-Timestamp UTC, z.B. '2026-05-12T14:30:00Z'. "
+                            "Berechne aus relativer Angabe ('in 10min') selbst — die "
+                            "aktuelle Zeit findest du im System-Prompt nicht, also nutze "
+                            "Bash: `date -u -d '+10 minutes' --iso-8601=seconds`."
+                        ),
+                    },
+                    "message": {"type": "string", "description": "Was soll bei der Erinnerung gesagt werden"},
+                },
+                "required": ["name", "fires_at", "message"],
+            },
+        },
+    },
+    {
+        "type": "function",
+        "function": {
+            "name": "trigger_watcher",
+            "description": (
+                "Lege einen Watcher-Trigger an — pollt alle paar Minuten eine Condition, "
+                "feuert wenn sie wahr wird (mit Throttle damit's nicht spammt). "
+                "Use-Case: 'sag bescheid wenn Disk unter 5GB', 'ping mich wenn es 8 Uhr ist'. "
+                "Welche Variablen verfuegbar sind und ihre Bedeutung steht im System-Prompt."
+            ),
+            "parameters": {
+                "type": "object",
+                "properties": {
+                    "name": {"type": "string", "description": "kurzer Name"},
+                    "condition": {
+                        "type": "string",
+                        "description": (
+                            "Boolescher Ausdruck mit den erlaubten Variablen, z.B. "
+                            "'disk_free_gb < 5', 'hour_of_day == 8 and day_of_week == \"mon\"'. "
+                            "Operatoren: < > <= >= == != and or not"
+                        ),
+                    },
+                    "message": {"type": "string", "description": "Was soll bei Erfuellung gesagt werden"},
+                    "check_interval_sec": {
+                        "type": "integer",
+                        "description": "Wie oft Condition pruefen (Default 300 = alle 5min, min 30)",
+                    },
+                    "throttle_sec": {
+                        "type": "integer",
+                        "description": "Mindestabstand zwischen 2 Feuerungen (Default 3600 = max 1x/h)",
+                    },
+                },
+                "required": ["name", "condition", "message"],
+            },
+        },
+    },
+    {
+        "type": "function",
+        "function": {
+            "name": "trigger_cancel",
+            "description": "Loescht einen Trigger (Timer abbrechen oder Watcher entfernen).",
+            "parameters": {
+                "type": "object",
+                "properties": {"name": {"type": "string"}},
+                "required": ["name"],
+            },
+        },
+    },
+    {
+        "type": "function",
+        "function": {
+            "name": "trigger_list",
+            "description": "Zeigt alle Trigger (active + inaktiv). Selten noetig — Stefan sieht sie im Diagnostic.",
+            "parameters": {"type": "object", "properties": {}},
+        },
+    },
 ]
 
 
@@ -175,8 +261,14 @@ class Agent:
         active_skills = [s for s in all_skills if s.get("active", True)]
         tools = list(META_TOOLS) + [_skill_to_tool(s) for s in active_skills]
 
+        # Trigger-Liste + Variablen-Info fuer den System-Prompt
+        all_triggers = triggers_mod.list_triggers(active_only=False)
+        condition_vars = watcher_mod.describe_variables()
+
         # 5. System-Prompt + Window-Messages
-        system_prompt = build_system_prompt(hot, cold, skills=all_skills)
+        system_prompt = build_system_prompt(hot, cold, skills=all_skills,
+                                            triggers=all_triggers,
+                                            condition_vars=condition_vars)
         messages = [ProxyMessage(role="system", content=system_prompt)]
         for t in self.conversation.window():
             messages.append(ProxyMessage(role=t.role, content=t.content))
@@ -273,6 +365,54 @@ class Agent:
                 if err:
                     out += f"\nstderr:\n{err}"
                 return out
+            if name == "trigger_timer":
+                t = triggers_mod.create_timer(
+                    name=arguments["name"],
+                    fires_at_iso=arguments["fires_at"],
+                    message=arguments["message"],
+                    author="aria",
+                )
+                self._pending_events.append({
+                    "type": "trigger_created",
+                    "trigger": {"name": t["name"], "type": "timer",
+                                "fires_at": t["fires_at"], "message": t["message"]},
+                })
+                return f"OK — Timer '{t['name']}' angelegt, feuert um {t['fires_at']}."
+            if name == "trigger_watcher":
+                t = triggers_mod.create_watcher(
+                    name=arguments["name"],
+                    condition=arguments["condition"],
+                    message=arguments["message"],
+                    check_interval_sec=int(arguments.get("check_interval_sec", 300)),
+                    throttle_sec=int(arguments.get("throttle_sec", 3600)),
+                    author="aria",
+                )
+                self._pending_events.append({
+                    "type": "trigger_created",
+                    "trigger": {"name": t["name"], "type": "watcher",
+                                "condition": t["condition"], "message": t["message"]},
+                })
+                return f"OK — Watcher '{t['name']}' angelegt: feuert wenn '{t['condition']}'."
+            if name == "trigger_cancel":
+                try:
+                    triggers_mod.delete(arguments["name"])
+                    return f"OK — Trigger '{arguments['name']}' geloescht."
+                except ValueError as e:
+                    return f"FEHLER: {e}"
+            if name == "trigger_list":
+                items = triggers_mod.list_triggers(active_only=False)
+                if not items:
+                    return "(keine Trigger vorhanden)"
+                lines = []
+                for t in items:
+                    state = "aktiv" if t.get("active", True) else "DEAKTIVIERT"
+                    if t["type"] == "timer":
+                        lines.append(f"- {t['name']} (timer, {state}): feuert {t.get('fires_at')} — \"{t.get('message','')[:50]}\"")
+                    elif t["type"] == "watcher":
+                        lines.append(f"- {t['name']} (watcher, {state}): cond=\"{t.get('condition')}\", throttle={t.get('throttle_sec')}s")
+                    else:
+                        lines.append(f"- {t['name']} ({t['type']}, {state})")
+                return "\n".join(lines)
             return f"Unbekanntes Tool: {name}"
         except Exception as exc:
             logger.exception("Tool '%s' fehlgeschlagen", name)
@@ -0,0 +1,169 @@
+"""
+Background-Loop fuer Triggers.
+
+Laeuft alle TICK_SEC Sekunden in einem asyncio Task, geht ueber alle
+active Triggers und entscheidet ob sie feuern muessen.
+
+Feuern bedeutet:
+1. Trigger-Manifest update (fire_count++, last_fired_at, ggf. deaktivieren)
+2. Log-Eintrag schreiben
+3. agent.chat() mit einem system-Praefix aufrufen (NICHT als 'user'!)
+   → ARIA bekommt das wie eine Push-Nachricht und kann antworten
+"""
+
+from __future__ import annotations
+
+import asyncio
+import logging
+from datetime import datetime, timezone
+from typing import Optional
+
+import triggers as triggers_mod
+import watcher as watcher_mod
+
+logger = logging.getLogger(__name__)
+
+TICK_SEC = 30
+
+
+def _now_iso() -> str:
+    return datetime.now(timezone.utc).isoformat()
+
+
+def _parse_iso(s: str) -> Optional[datetime]:
+    if not s:
+        return None
+    try:
+        return datetime.fromisoformat(s.replace("Z", "+00:00"))
+    except Exception:
+        return None
+
+
+def _should_fire(trigger: dict, vars_: dict, now: datetime) -> bool:
+    if not trigger.get("active", True):
+        return False
+    t = trigger.get("type", "")
+
+    if t == "timer":
+        fires_at = _parse_iso(trigger.get("fires_at", ""))
+        if not fires_at:
+            return False
+        if fires_at.tzinfo is None:
+            fires_at = fires_at.replace(tzinfo=timezone.utc)
+        return now >= fires_at
+
+    if t == "watcher":
+        # Check-Interval respektieren (sonst pollen wir zu hektisch)
+        check_interval = int(trigger.get("check_interval_sec", 300))
+        last_checked = _parse_iso(trigger.get("last_checked_at", ""))
+        if last_checked:
+            if last_checked.tzinfo is None:
+                last_checked = last_checked.replace(tzinfo=timezone.utc)
+            if (now - last_checked).total_seconds() < check_interval:
+                return False
+        # Interval abgelaufen → Condition wird jetzt evaluiert. last_checked_at
+        # erst HIER setzen — wuerde man es vor dem Interval-Check schreiben,
+        # blockiert der frische Timestamp den Check und die Condition wuerde
+        # nie evaluiert
+        try:
+            trigger["last_checked_at"] = _now_iso()
+            triggers_mod.write(trigger["name"], trigger)
+        except Exception:
+            pass
+        # Throttle: erst feuern wenn last_fired lange genug her ist
+        last_fired = _parse_iso(trigger.get("last_fired_at", ""))
+        throttle = int(trigger.get("throttle_sec", 3600))
+        if last_fired:
+            if last_fired.tzinfo is None:
+                last_fired = last_fired.replace(tzinfo=timezone.utc)
+            if (now - last_fired).total_seconds() < throttle:
+                return False
+        # Condition pruefen
+        cond = (trigger.get("condition") or "").strip()
+        if not cond:
+            return False
+        try:
+            return watcher_mod.evaluate(cond, vars_)
+        except Exception as e:
+            logger.warning("Trigger %s: Condition '%s' fehlerhaft: %s",
+                           trigger.get("name"), cond, e)
+            return False
+
+    if t == "cron":
+        # TODO: später, wenn jemand Bock auf Cron-Parser hat
+        return False
+
+    return False
+
+
+async def _fire(trigger: dict, agent_factory) -> None:
+    """Ruft ARIA mit einer System-Praefix-Nachricht auf."""
+    name = trigger.get("name", "?")
+    message = trigger.get("message") or "(ohne Nachricht)"
+    ttype = trigger.get("type", "?")
+
+    # Manifest updaten
+    try:
+        triggers_mod.mark_fired(name)
+    except Exception as e:
+        logger.warning("mark_fired %s: %s", name, e)
+
+    # Log
+    triggers_mod.append_log(name, {"event": "fired", "type": ttype, "message": message})
+
+    # System-Nachricht an ARIA: nicht als User, sondern als Hinweis
+    prompt = (
+        f"[Trigger ausgelöst: '{name}', Typ: {ttype}] "
+        f"Geplante Nachricht: \"{message}\". "
+        f"Sage Stefan jetzt diese Information, in deinem Stil. "
+        f"Wenn der Trigger ein Watcher war (Bedingung wurde erfuellt), "
+        f"erwaehne kurz worum es geht. Antworte direkt, keine Rueckfrage."
+    )
+
+    try:
+        agent = agent_factory()
+        reply = agent.chat(prompt, source="trigger")
+        logger.info("[trigger] %s gefeuert → ARIA-Reply: %s", name, reply[:80])
+        triggers_mod.append_log(name, {"event": "reply", "text": reply[:500]})
+    except Exception as e:
+        logger.exception("Trigger %s feuern fehlgeschlagen: %s", name, e)
+        triggers_mod.append_log(name, {"event": "error", "error": str(e)[:300]})
+
+
+async def _tick(agent_factory) -> None:
+    """Ein Pruefdurchlauf. Geht ueber alle Triggers, feuert was zu feuern ist."""
+    try:
+        all_triggers = triggers_mod.list_triggers(active_only=True)
+    except Exception as e:
+        logger.warning("triggers.list: %s", e)
+        return
+    if not all_triggers:
+        return
+    now = datetime.now(timezone.utc)
+    # Variablen einmal pro Tick sammeln (nicht pro Trigger — Disk-Stat ist teuer)
+    try:
+        vars_ = watcher_mod.collect_variables()
+    except Exception as e:
+        logger.warning("collect_variables: %s", e)
+        vars_ = {}
+
+    for trigger in all_triggers:
+        try:
+            if _should_fire(trigger, vars_, now):
+                # Feuern als eigener Task — wenn ARIA langsam antwortet,
+                # darf der naechste Tick nicht blockieren
+                asyncio.create_task(_fire(trigger, agent_factory))
+        except Exception as e:
+            logger.warning("Trigger-Check %s: %s", trigger.get("name"), e)
+
+
+async def run_loop(agent_factory) -> None:
+    """Endlosschleife — wird vom main lifespan gestartet + gestoppt."""
+    logger.info("Trigger-Loop gestartet (TICK_SEC=%d)", TICK_SEC)
+    while True:
+        try:
+            await _tick(agent_factory)
+        except Exception as e:
+            logger.exception("Tick-Fehler: %s", e)
+        await asyncio.sleep(TICK_SEC)
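The loop's ISO parsing normalises a trailing `Z` to `+00:00` because `datetime.fromisoformat` only accepts the `Z` suffix from Python 3.11 on. A quick standalone check of that normalisation and the timer due-condition (same logic as `_parse_iso`, reproduced here so it runs without the repo's modules):

```python
from datetime import datetime, timezone


def parse_iso(s: str):
    # same normalisation as _parse_iso in the background loop
    if not s:
        return None
    try:
        return datetime.fromisoformat(s.replace("Z", "+00:00"))
    except Exception:
        return None


now = datetime(2026, 5, 12, 14, 30, 0, tzinfo=timezone.utc)
fires_at = parse_iso("2026-05-12T14:30:00Z")
print(now >= fires_at)  # → True: a timer is due exactly at fires_at
```

Garbage input (`parse_iso("kaputt")`) comes back as `None`, which the loop treats as "never fires" rather than crashing the tick.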
+130 -1
@@ -20,6 +20,9 @@ import logging
 import os
 from typing import List, Optional
 
+import asyncio
+from contextlib import asynccontextmanager
+
 from fastapi import FastAPI, HTTPException, BackgroundTasks, Request
 from fastapi.responses import Response
 from pydantic import BaseModel, Field
@@ -29,6 +32,10 @@ from conversation import Conversation
 from proxy_client import ProxyClient
 from agent import Agent
 import skills as skills_mod
+import metrics as metrics_mod
+import triggers as triggers_mod
+import watcher as watcher_mod
+import background as background_mod
 
 logging.basicConfig(level=logging.INFO, format="%(asctime)s [%(levelname)s] %(name)s: %(message)s")
 logger = logging.getLogger("aria-brain")
@@ -36,7 +43,23 @@ logger = logging.getLogger("aria-brain")
 QDRANT_HOST = os.environ.get("QDRANT_HOST", "aria-qdrant")
 QDRANT_PORT = int(os.environ.get("QDRANT_PORT", "6333"))
 
-app = FastAPI(title="ARIA Brain", version="0.1.0")
+@asynccontextmanager
+async def lifespan(app: FastAPI):
+    """Beim Brain-Start: Trigger-Background-Loop anwerfen. Beim Shutdown: stoppen."""
+    task = asyncio.create_task(background_mod.run_loop(agent))
+    logger.info("Lifespan: Trigger-Loop gestartet")
+    try:
+        yield
+    finally:
+        task.cancel()
+        try:
+            await task
+        except asyncio.CancelledError:
+            pass
+        logger.info("Lifespan: Trigger-Loop gestoppt")
+
+
+app = FastAPI(title="ARIA Brain", version="0.1.0", lifespan=lifespan)
+
 _embedder: Optional[Embedder] = None
 _store: Optional[VectorStore] = None
@@ -404,6 +427,112 @@ def conversation_distill_now():
     return agent().distill_old_turns()
 
 
+# ─── Call-Metrics (Token / Quota-Monitoring) ────────────────────────
+
+@app.get("/metrics/calls")
+def metrics_calls():
+    """Liefert Aggregate fuer 1h / 5h / 24h / 30d.
+    Jedes Window: {window_seconds, calls, tokens_in, tokens_out, by_model}."""
+    return metrics_mod.stats()
+
+
+# ─── Triggers (passive Aufweck-Quellen) ─────────────────────────────
+
+class TriggerTimerBody(BaseModel):
+    name: str
+    fires_at: str  # ISO timestamp
+    message: str
+    author: str = "stefan"
+
+
+class TriggerWatcherBody(BaseModel):
+    name: str
+    condition: str
+    message: str
+    check_interval_sec: int = 300
+    throttle_sec: int = 3600
+    author: str = "stefan"
+
+
+class TriggerPatch(BaseModel):
+    active: bool | None = None
+    message: str | None = None
+    condition: str | None = None
+    throttle_sec: int | None = None
+    check_interval_sec: int | None = None
+    fires_at: str | None = None
+
+
+@app.get("/triggers/list")
+def triggers_list(active_only: bool = False):
+    return {"triggers": triggers_mod.list_triggers(active_only=active_only)}
+
+
+@app.get("/triggers/conditions")
+def triggers_conditions():
+    """Verfuegbare Variablen fuer Watcher-Conditions (mit aktuellen Werten)."""
+    return {
+        "variables": watcher_mod.describe_variables(),
+        "current": watcher_mod.collect_variables(),
+    }
+
+
+@app.get("/triggers/{name}")
+def triggers_get(name: str):
+    t = triggers_mod.read(name)
+    if t is None:
+        raise HTTPException(404, f"Trigger '{name}' nicht gefunden")
+    return t
+
+
+@app.get("/triggers/{name}/logs")
+def triggers_get_logs(name: str, limit: int = 50):
+    return {"logs": triggers_mod.list_logs(name, limit=limit)}
+
+
+@app.post("/triggers/timer")
+def triggers_create_timer(body: TriggerTimerBody):
+    try:
+        return triggers_mod.create_timer(
+            name=body.name, fires_at_iso=body.fires_at,
+            message=body.message, author=body.author,
+        )
+    except ValueError as exc:
+        raise HTTPException(400, str(exc))
+
+
+@app.post("/triggers/watcher")
+def triggers_create_watcher(body: TriggerWatcherBody):
+    try:
+        return triggers_mod.create_watcher(
+            name=body.name, condition=body.condition,
+            message=body.message,
+            check_interval_sec=body.check_interval_sec,
+            throttle_sec=body.throttle_sec,
+            author=body.author,
+        )
+    except ValueError as exc:
+        raise HTTPException(400, str(exc))
+
+
+@app.patch("/triggers/{name}")
+def triggers_patch(name: str, body: TriggerPatch):
+    patch = {k: v for k, v in body.model_dump().items() if v is not None}
+    try:
+        return triggers_mod.update(name, patch)
+    except ValueError as exc:
+        raise HTTPException(404, str(exc))
+
+
+@app.delete("/triggers/{name}")
+def triggers_delete(name: str):
+    try:
+        triggers_mod.delete(name)
+    except ValueError as exc:
+        raise HTTPException(404, str(exc))
+    return {"deleted": name}
+
+
 # ─── Skills ─────────────────────────────────────────────────────────
 
 class SkillCreate(BaseModel):
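The PATCH endpoint builds its update dict by dropping every unset field (`v is not None`), so a partial body only touches the keys it actually names. The same filtering on a plain dict, standing in for `body.model_dump()`:

```python
# Simulated model_dump() of a TriggerPatch where only two fields were sent
body = {
    "active": False,
    "message": None,
    "condition": None,
    "throttle_sec": 7200,
    "check_interval_sec": None,
    "fires_at": None,
}
patch = {k: v for k, v in body.items() if v is not None}
print(patch)  # → {'active': False, 'throttle_sec': 7200}
```

Note the trade-off of this pattern: falsy-but-set values like `active: False` survive the filter, but a client can never explicitly set a field back to `null`, since `None` is indistinguishable from "not sent".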
@@ -0,0 +1,133 @@
|
|||||||
|
"""
|
||||||
|
Call-Metrics fuer den Proxy-Client.
|
||||||
|
|
||||||
|
Pro Claude-Call wird ein Eintrag in /data/metrics.jsonl angehaengt:
|
||||||
|
|
||||||
|
{"ts": <ms>, "model": "...", "in": <tokens_in_estimate>, "out": <tokens_out_estimate>}
|
||||||
|
|
||||||
|
Tokens-Schaetzung: characters / 4 (Anthropic-Default-Heuristik). Nicht exakt
|
||||||
|
aber gut genug fuer Quota-Monitoring. Wir summieren nicht in-memory weil
|
||||||
|
der Brain-Container neugestartet werden kann — alles auf Disk.
|
||||||
|
|
||||||
|
Auswertung via aggregate(window_seconds) — liefert {calls, tokens_in, tokens_out}
|
||||||
|
fuer die letzten N Sekunden. Lazy gelesen, keine grossen Datenmengen erwartet
|
||||||
|
(bei 1000 Calls/Tag ~70 KB pro Monat).
|
||||||
|
|
||||||
|
Auto-Rotate: bei > 50k Zeilen werden die aeltesten 25k weggeschnitten.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import json
|
||||||
|
import logging
|
||||||
|
import os
|
||||||
|
import time
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import List
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
METRICS_FILE = Path(os.environ.get("METRICS_FILE", "/data/metrics.jsonl"))
|
||||||
|
ROTATE_AT = 50_000
|
||||||
|
ROTATE_KEEP = 25_000
|
||||||
|
|
||||||
|
|
||||||
|
def _estimate_tokens(text: str) -> int:
|
||||||
|
"""Anthropic-Default: ~4 chars pro Token. Grob genug."""
|
||||||
|
if not text:
|
||||||
|
return 0
|
||||||
|
return max(1, len(text) // 4)
|
||||||
|
|
||||||
|
|
||||||
|
def _messages_tokens(messages: list) -> int:
|
||||||
|
total = 0
|
||||||
|
for m in messages:
|
||||||
|
# Pydantic-Model oder dict
|
||||||
|
if hasattr(m, "content"):
|
||||||
|
total += _estimate_tokens(m.content or "")
|
||||||
|
elif isinstance(m, dict):
|
||||||
|
c = m.get("content") or ""
|
||||||
|
if isinstance(c, str):
|
||||||
|
total += _estimate_tokens(c)
|
||||||
|
return total
|
||||||
|
|
||||||
|
|
||||||
|
def log_call(model: str, messages_in: list, reply_text: str = "") -> None:
|
||||||
|
"""Eine Call-Metric anhaengen. Robust gegen Fehler (silent fail)."""
|
||||||
|
try:
|
||||||
|
tokens_in = _messages_tokens(messages_in)
|
||||||
|
tokens_out = _estimate_tokens(reply_text)
|
||||||
|
line = json.dumps({
|
||||||
|
"ts": int(time.time() * 1000),
|
||||||
|
"model": model,
|
||||||
|
"in": tokens_in,
|
||||||
|
"out": tokens_out,
|
||||||
|
})
|
||||||
|
METRICS_FILE.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
with METRICS_FILE.open("a", encoding="utf-8") as f:
|
||||||
|
f.write(line + "\n")
|
||||||
|
# Sanftes Rotate ohne hohe IO-Kosten — nur alle 1000 Calls checken
|
||||||
|
if (tokens_in + tokens_out) % 1000 < 4:
|
||||||
|
_maybe_rotate()
|
||||||
|
except Exception as exc:
|
||||||
|
logger.warning("metrics.log_call: %s", exc)
|
||||||
|
|
||||||
|
|
||||||
|
def _maybe_rotate() -> None:
|
||||||
|
try:
|
||||||
|
if not METRICS_FILE.exists():
|
||||||
|
return
|
||||||
|
with METRICS_FILE.open("r", encoding="utf-8") as f:
|
||||||
|
lines = f.readlines()
|
||||||
|
if len(lines) > ROTATE_AT:
|
||||||
|
keep = lines[-ROTATE_KEEP:]
|
||||||
|
METRICS_FILE.write_text("".join(keep), encoding="utf-8")
|
||||||
|
logger.info("metrics rotated: %d → %d Zeilen", len(lines), len(keep))
|
||||||
|
except Exception as exc:
|
||||||
|
logger.warning("metrics rotate: %s", exc)
|
||||||
|
|
||||||
|
|
||||||
|
def aggregate(window_seconds: int) -> dict:
|
||||||
|
"""Aggregiert die Calls der letzten N Sekunden."""
|
||||||
|
now_ms = int(time.time() * 1000)
|
||||||
|
cutoff_ms = now_ms - (window_seconds * 1000)
|
||||||
|
calls = 0
|
||||||
|
tokens_in = 0
|
||||||
|
tokens_out = 0
|
||||||
|
by_model: dict[str, int] = {}
|
||||||
|
if METRICS_FILE.exists():
|
||||||
|
try:
|
||||||
|
for raw in METRICS_FILE.read_text(encoding="utf-8").splitlines():
|
||||||
|
raw = raw.strip()
|
||||||
|
if not raw:
|
||||||
|
continue
|
||||||
|
try:
|
||||||
|
obj = json.loads(raw)
|
||||||
|
except Exception:
|
||||||
|
continue
|
||||||
|
if obj.get("ts", 0) < cutoff_ms:
|
||||||
|
continue
|
||||||
|
calls += 1
|
||||||
|
tokens_in += int(obj.get("in") or 0)
|
||||||
|
tokens_out += int(obj.get("out") or 0)
|
||||||
|
m = obj.get("model", "?")
|
||||||
|
by_model[m] = by_model.get(m, 0) + 1
|
||||||
|
except Exception as exc:
|
||||||
|
logger.warning("metrics aggregate: %s", exc)
|
||||||
|
return {
|
||||||
|
"window_seconds": window_seconds,
|
||||||
|
"calls": calls,
|
||||||
|
"tokens_in": tokens_in,
|
||||||
|
"tokens_out": tokens_out,
|
||||||
|
"by_model": by_model,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def stats() -> dict:
|
||||||
|
"""Komplett-Snapshot mit den drei wichtigsten Fenstern."""
|
||||||
|
return {
|
||||||
|
"h1": aggregate(3600),
|
||||||
|
"h5": aggregate(5 * 3600),
|
||||||
|
"h24": aggregate(24 * 3600),
|
||||||
|
"d30": aggregate(30 * 24 * 3600),
|
||||||
|
}
|
||||||
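To make the windowing in `aggregate()` concrete, here is a minimal, self-contained sketch over in-memory JSONL records (the record shape with `ts`/`model`/`in`/`out` is the one written by `log_call`; the file handling is omitted, and the model names are made up):

```python
import json
import time

now_ms = int(time.time() * 1000)
lines = [
    json.dumps({"ts": now_ms - 10_000, "model": "gpt-x", "in": 120, "out": 40}),
    json.dumps({"ts": now_ms - 2 * 3600 * 1000, "model": "gpt-x", "in": 500, "out": 80}),
    json.dumps({"ts": now_ms - 30_000, "model": "mini", "in": 30, "out": 10}),
]

def aggregate_lines(raw_lines, window_seconds, now_ms):
    """Same windowing as metrics.aggregate(), but over in-memory lines."""
    cutoff = now_ms - window_seconds * 1000
    calls, t_in, t_out, by_model = 0, 0, 0, {}
    for raw in raw_lines:
        obj = json.loads(raw)
        if obj.get("ts", 0) < cutoff:   # drop records older than the window
            continue
        calls += 1
        t_in += int(obj.get("in") or 0)
        t_out += int(obj.get("out") or 0)
        by_model[obj["model"]] = by_model.get(obj["model"], 0) + 1
    return {"calls": calls, "tokens_in": t_in, "tokens_out": t_out, "by_model": by_model}

h1 = aggregate_lines(lines, 3600, now_ms)   # 1h window: drops the 2h-old call
print(h1)   # {'calls': 2, 'tokens_in': 150, 'tokens_out': 50, 'by_model': {'gpt-x': 1, 'mini': 1}}
```

The append-only JSONL format keeps `log_call` cheap; all aggregation cost is deferred to the read side.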
@@ -115,16 +115,53 @@ def build_skills_section(skills: List[dict]) -> str:
    return "\n".join(lines)


def build_triggers_section(triggers: List[dict], condition_vars: List[dict]) -> str:
    """Triggers (passive wake-up sources) plus the available condition variables."""
    lines = ["## Trigger (passive Aufweck-Quellen)"]
    lines.append("")
    lines.append("Trigger sind ANDERS als Skills: das System ruft DICH wenn ein Event passiert. "
                 "Du legst sie an wenn Stefan sagt 'erinner mich an X' oder 'sag bescheid wenn Y'.")
    lines.append("")
    if triggers:
        lines.append("### Aktuelle Trigger")
        for t in triggers:
            active = t.get("active", True)
            mark = "" if active else " [INAKTIV]"
            if t["type"] == "timer":
                lines.append(f"- **{t['name']}**{mark} (timer) feuert {t.get('fires_at')}: \"{t.get('message','')[:80]}\"")
            elif t["type"] == "watcher":
                lines.append(f"- **{t['name']}**{mark} (watcher) cond=`{t.get('condition')}`: \"{t.get('message','')[:80]}\"")
        lines.append("")
    lines.append("### Verfuegbare Condition-Variablen (fuer Watcher)")
    for v in condition_vars:
        lines.append(f"- `{v['name']}` ({v['type']}) — {v['desc']}")
    lines.append("")
    lines.append("Operatoren in Conditions: `<` `>` `<=` `>=` `==` `!=` `and` `or` `not`. "
                 "Beispiel: `disk_free_gb < 5 and hour_of_day >= 8`. "
                 "String-Werte in Quotes: `day_of_week == \"mon\"`.")
    lines.append("")
    lines.append("### Wann welcher Typ?")
    lines.append("- **Timer** fuer einmalige Erinnerungen mit konkreter Zeit ('in 10min', 'um 14:30').")
    lines.append("- **Watcher** fuer 'wenn X passiert' (Disk voll, bestimmte Tageszeit).")
    lines.append("- ARIA legt Trigger NUR auf Stefan-Wunsch an, nicht eigenmaechtig.")
    return "\n".join(lines)


def build_system_prompt(
    pinned: List[MemoryPoint],
    cold: List[MemoryPoint] | None = None,
    skills: List[dict] | None = None,
    triggers: List[dict] | None = None,
    condition_vars: List[dict] | None = None,
) -> str:
    """Complete system prompt: hot + cold + skills + triggers."""
    parts = [build_hot_memory_section(pinned)]
    if skills:
        parts.append("")
        parts.append(build_skills_section(skills))
    if condition_vars:
        parts.append("")
        parts.append(build_triggers_section(triggers or [], condition_vars))
    if cold:
        parts.append("")
        parts.append(build_cold_memory_section(cold))
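For illustration, running the bullet-rendering loop from `build_triggers_section` over two sample triggers (hypothetical names and messages) produces the prompt lines ARIA sees:

```python
triggers = [
    {"name": "tee", "type": "timer", "active": True, "fires_at": "14:30", "message": "Tee!"},
    {"name": "disk", "type": "watcher", "active": False, "condition": "disk_free_gb < 5", "message": "Platte voll"},
]
lines = []
for t in triggers:
    # Same rendering rules as in build_triggers_section()
    mark = "" if t.get("active", True) else " [INAKTIV]"
    if t["type"] == "timer":
        lines.append(f"- **{t['name']}**{mark} (timer) feuert {t.get('fires_at')}: \"{t.get('message','')[:80]}\"")
    elif t["type"] == "watcher":
        lines.append(f"- **{t['name']}**{mark} (watcher) cond=`{t.get('condition')}`: \"{t.get('message','')[:80]}\"")
print("\n".join(lines))
```

The `[:80]` slice keeps long trigger messages from blowing up the system prompt.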
@@ -18,6 +18,8 @@ from typing import List, Optional
import httpx
from pydantic import BaseModel

import metrics

logger = logging.getLogger(__name__)

RUNTIME_CONFIG_FILE = Path("/shared/config/runtime.json")
@@ -135,6 +137,9 @@ class ProxyClient:
                "arguments": args,
            })

        # Append a call metric (token estimate for quota monitoring)
        metrics.log_call(payload["model"], messages, content or "")

        return ProxyResult(content=content or "", tool_calls=tool_calls, finish_reason=finish_reason)

    def close(self):
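The token estimator inside `metrics.log_call` is not part of this diff; a common rough heuristic is about four characters per token. A hypothetical sketch of such an estimate (names and the chars/4 ratio are assumptions, not the project's actual code):

```python
def estimate_tokens(messages: list[dict], completion: str) -> tuple[int, int]:
    """Rough ~4-chars-per-token estimate. Hypothetical helper: the real
    estimator inside metrics.log_call is not shown in this diff."""
    chars_in = sum(len(m.get("content") or "") for m in messages)
    return max(1, chars_in // 4), max(1, len(completion) // 4)

t_in, t_out = estimate_tokens([{"role": "user", "content": "x" * 400}], "y" * 80)
print(t_in, t_out)   # 100 20
```

An estimate of this kind is adequate for quota monitoring, where trends matter more than exact counts.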
@@ -0,0 +1,229 @@
"""
Triggers - passive wake-up sources for ARIA.

Skills are active (ARIA calls them). Triggers are passive: the system calls
ARIA when an event happens. Three types:

    timer    fires once at a fixed point in time
    watcher  recurring: check a condition, fire when it is True (with throttle)
    cron     cron expression (not implemented yet, placeholder)

Layout:
    /data/triggers/<name>.json        one manifest per trigger
    /data/triggers/logs/<name>.jsonl  append-only log, one entry per firing

Polling cost: brain-internal background polling (no LLM call).
ARIA is only woken up when a trigger actually fires.
"""

from __future__ import annotations

import json
import logging
import os
import re
import shutil
import time
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional

logger = logging.getLogger(__name__)

TRIGGERS_DIR = Path(os.environ.get("TRIGGERS_DIR", "/data/triggers"))
LOGS_DIR = TRIGGERS_DIR / "logs"
NAME_RE = re.compile(r"^[a-zA-Z0-9_-]{2,60}$")
VALID_TYPES = {"timer", "watcher", "cron"}


def _now_iso() -> str:
    return datetime.now(timezone.utc).isoformat()


def _safe_name(name: str) -> str:
    if not isinstance(name, str) or not NAME_RE.match(name):
        raise ValueError(f"Ungueltiger Trigger-Name: {name!r}")
    return name


def _path(name: str) -> Path:
    return TRIGGERS_DIR / f"{_safe_name(name)}.json"


def _ensure_dirs():
    TRIGGERS_DIR.mkdir(parents=True, exist_ok=True)
    LOGS_DIR.mkdir(parents=True, exist_ok=True)


# ─── CRUD ───────────────────────────────────────────────────────────

def list_triggers(active_only: bool = False) -> list[dict]:
    if not TRIGGERS_DIR.exists():
        return []
    out: list[dict] = []
    for f in sorted(TRIGGERS_DIR.glob("*.json")):
        try:
            data = json.loads(f.read_text(encoding="utf-8"))
            if active_only and not data.get("active", True):
                continue
            out.append(data)
        except Exception as e:
            logger.warning("Trigger lesen %s: %s", f, e)
    return out


def read(name: str) -> Optional[dict]:
    p = _path(name)
    if not p.exists():
        return None
    try:
        return json.loads(p.read_text(encoding="utf-8"))
    except Exception as e:
        logger.warning("Trigger %s lesen: %s", name, e)
        return None


def write(name: str, data: dict) -> None:
    _ensure_dirs()
    data["updated_at"] = _now_iso()
    p = _path(name)
    tmp = p.with_suffix(".tmp")
    tmp.write_text(json.dumps(data, indent=2, ensure_ascii=False), encoding="utf-8")
    tmp.replace(p)


def delete(name: str) -> None:
    p = _path(name)
    if not p.exists():
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    p.unlink()
    # Clean up the logs as well
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    if log_file.exists():
        log_file.unlink()


def update(name: str, patch: dict) -> dict:
    data = read(name)
    if data is None:
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    allowed = {"active", "message", "condition", "throttle_sec",
               "check_interval_sec", "fires_at"}
    for k, v in patch.items():
        if k in allowed:
            data[k] = v
    write(name, data)
    return data


# ─── Create helpers (type-specific) ────────────────────────────────

def create_timer(
    name: str,
    fires_at_iso: str,
    message: str,
    author: str = "aria",
) -> dict:
    _safe_name(name)
    if _path(name).exists():
        raise ValueError(f"Trigger '{name}' existiert schon")
    # Validate the ISO timestamp
    try:
        datetime.fromisoformat(fires_at_iso.replace("Z", "+00:00"))
    except Exception:
        raise ValueError(f"fires_at_iso ungueltig: {fires_at_iso}")
    data = {
        "name": name,
        "type": "timer",
        "active": True,
        "author": author,
        "created_at": _now_iso(),
        "fires_at": fires_at_iso,
        "message": message,
        "fire_count": 0,
        "last_fired_at": None,
    }
    write(name, data)
    logger.info("Trigger angelegt: %s (timer, fires_at=%s)", name, fires_at_iso)
    return data


def create_watcher(
    name: str,
    condition: str,
    message: str,
    check_interval_sec: int = 300,
    throttle_sec: int = 3600,
    author: str = "aria",
) -> dict:
    _safe_name(name)
    if _path(name).exists():
        raise ValueError(f"Trigger '{name}' existiert schon")
    # Parse-check the condition (raises on syntax errors)
    from watcher import parse_condition
    parse_condition(condition)  # validate only
    if check_interval_sec < 30:
        check_interval_sec = 30  # never check more often than every 30s
    if throttle_sec < 0:
        throttle_sec = 0
    data = {
        "name": name,
        "type": "watcher",
        "active": True,
        "author": author,
        "created_at": _now_iso(),
        "condition": condition,
        "check_interval_sec": int(check_interval_sec),
        "throttle_sec": int(throttle_sec),
        "message": message,
        "fire_count": 0,
        "last_fired_at": None,
        "last_checked_at": None,
    }
    write(name, data)
    logger.info("Trigger angelegt: %s (watcher, cond='%s')", name, condition)
    return data


# ─── Firing + log ───────────────────────────────────────────────────

def mark_fired(name: str) -> dict:
    data = read(name)
    if data is None:
        raise ValueError(f"Trigger '{name}' nicht gefunden")
    data["fire_count"] = int(data.get("fire_count", 0)) + 1
    data["last_fired_at"] = _now_iso()
    # Timers auto-deactivate after firing (one-shot)
    if data.get("type") == "timer":
        data["active"] = False
    write(name, data)
    return data


def append_log(name: str, entry: dict) -> None:
    _ensure_dirs()
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    record = {"ts": _now_iso()}
    record.update(entry)
    try:
        with log_file.open("a", encoding="utf-8") as f:
            f.write(json.dumps(record, ensure_ascii=False) + "\n")
    except Exception as e:
        logger.warning("Trigger-Log append %s: %s", name, e)


def list_logs(name: str, limit: int = 50) -> list[dict]:
    log_file = LOGS_DIR / f"{_safe_name(name)}.jsonl"
    if not log_file.exists():
        return []
    try:
        lines = log_file.read_text(encoding="utf-8").splitlines()
        out: list[dict] = []
        for line in lines[-limit:]:
            try:
                out.append(json.loads(line))
            except Exception:
                pass
        return out
    except Exception:
        return []
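The one-shot semantics of `mark_fired()` can be sketched self-contained (the manifest dict mirrors the shape written by `create_timer`; name and message are made up):

```python
from datetime import datetime, timezone

# Hypothetical timer manifest, mirroring the shape written by create_timer().
manifest = {
    "name": "tee-erinnerung", "type": "timer", "active": True,
    "author": "aria", "fires_at": "2025-01-01T14:30:00+00:00",
    "message": "Tee ist fertig!", "fire_count": 0, "last_fired_at": None,
}

def mark_fired(data: dict) -> dict:
    """Same one-shot rule as triggers.mark_fired(): timers deactivate after firing."""
    data["fire_count"] = int(data.get("fire_count", 0)) + 1
    data["last_fired_at"] = datetime.now(timezone.utc).isoformat()
    if data.get("type") == "timer":
        data["active"] = False   # one-shot: a timer never fires twice
    return data

fired = mark_fired(dict(manifest))
print(fired["active"], fired["fire_count"])   # False 1
```

Watchers keep `active: True` after firing; their repeat rate is bounded by `throttle_sec` instead.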
@@ -0,0 +1,150 @@
"""
Built-in condition variables plus a safe mini-parser for watcher triggers.

Allowed variables come from this module. A condition expression is a safe
subset of Python, validated against an AST whitelist before evaluation
(no builtins, no calls, no attribute access): only comparisons and boolean
operators, only the variables declared here, only numbers and string
literals as operands.

Examples:
    disk_free_gb < 5
    hour_of_day == 8 and day_of_week == "mon"
    rvs_connected == False
    (disk_free_pct < 10 and uptime_sec > 3600)
"""

from __future__ import annotations

import ast
import logging
import os
import shutil
import time
from datetime import datetime
from pathlib import Path
from typing import Any

logger = logging.getLogger(__name__)


# ─── Variable sources ───────────────────────────────────────────────

def _disk_stats() -> tuple[float, float]:
    """Returns (free_gb, free_pct). Looks at /shared (the shared volume), otherwise /."""
    target = "/shared" if os.path.exists("/shared") else "/"
    try:
        st = shutil.disk_usage(target)
        free_gb = st.free / (1024 ** 3)
        free_pct = 100.0 * st.free / st.total if st.total else 0.0
        return free_gb, free_pct
    except Exception as e:
        logger.warning("disk_usage: %s", e)
        return 0.0, 0.0


def _uptime_sec() -> int:
    try:
        with open("/proc/uptime", "r") as f:
            return int(float(f.read().split()[0]))
    except Exception:
        return 0


def _rvs_connected() -> bool:
    """Reads /shared/config/runtime.json or a bridge state file.
    Currently this cannot be determined reliably from inside the brain
    container, so False is returned as the safe default.
    Later: the bridge writes a heartbeat file that is read here."""
    return False


_DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]


def collect_variables() -> dict[str, Any]:
    """Returns a current snapshot of all built-in variables."""
    free_gb, free_pct = _disk_stats()
    now = datetime.now()
    # Memory count from the vector DB (imported lazily to avoid circular
    # imports; there is no store yet at module load time)
    memory_count = 0
    try:
        from main import store  # type: ignore
        s = store()
        memory_count = s.count()
    except Exception:
        pass

    return {
        "disk_free_gb": round(free_gb, 2),
        "disk_free_pct": round(free_pct, 1),
        "uptime_sec": _uptime_sec(),
        "hour_of_day": now.hour,
        "day_of_week": _DAYS[now.weekday()],
        "rvs_connected": _rvs_connected(),
        "memory_count": memory_count,
    }


def describe_variables() -> list[dict]:
    """List of the available variables plus descriptions, for the system prompt and the UI."""
    return [
        {"name": "disk_free_gb", "type": "number", "desc": "freier Plattenplatz in GB (auf /shared)"},
        {"name": "disk_free_pct", "type": "number", "desc": "freier Plattenplatz in Prozent"},
        {"name": "uptime_sec", "type": "number", "desc": "Sekunden seit Brain-Start"},
        {"name": "hour_of_day", "type": "number", "desc": "0..23, lokale Zeit"},
        {"name": "day_of_week", "type": "string", "desc": "mon|tue|wed|thu|fri|sat|sun"},
        {"name": "rvs_connected", "type": "bool", "desc": "True wenn RVS-Verbindung steht"},
        {"name": "memory_count", "type": "number", "desc": "Anzahl Memories in der Vector-DB"},
    ]


# ─── Safe condition parser ──────────────────────────────────────────

_ALLOWED_NODES = (
    ast.Expression, ast.BoolOp, ast.UnaryOp, ast.Compare,
    ast.Name, ast.Constant, ast.Load,
    ast.And, ast.Or, ast.Not,
    ast.Eq, ast.NotEq, ast.Lt, ast.LtE, ast.Gt, ast.GtE,
)


def parse_condition(expr: str) -> ast.Expression:
    """Parses a condition expression and validates it against the safe subset.
    Raises ValueError on forbidden constructs."""
    expr = (expr or "").strip()
    if not expr:
        raise ValueError("Leere Condition")
    if len(expr) > 500:
        raise ValueError("Condition zu lang (>500 Zeichen)")
    try:
        tree = ast.parse(expr, mode="eval")
    except SyntaxError as e:
        raise ValueError(f"Condition Syntax-Fehler: {e}")
    # Walk the tree against the whitelist
    allowed_names = {v["name"] for v in describe_variables()}
    for node in ast.walk(tree):
        if not isinstance(node, _ALLOWED_NODES):
            raise ValueError(f"Verbotener Ausdruck: {type(node).__name__}")
        if isinstance(node, ast.Name):
            if node.id not in allowed_names and node.id not in ("True", "False"):
                raise ValueError(f"Unbekannte Variable: {node.id}")
        if isinstance(node, ast.Constant):
            if not isinstance(node.value, (int, float, str, bool)) and node.value is not None:
                raise ValueError(f"Verbotener Konstant-Typ: {type(node.value).__name__}")
    return tree


def evaluate(expr: str, variables: dict[str, Any] | None = None) -> bool:
    """Evaluates the condition against the current variables.
    Returns bool; on evaluation errors False (defensive)."""
    tree = parse_condition(expr)
    vars_ = variables if variables is not None else collect_variables()
    code = compile(tree, "<condition>", "eval")
    # Empty globals, locals limited to the allowed variables: no builtin access
    try:
        result = eval(code, {"__builtins__": {}}, vars_)
    except Exception as e:
        logger.warning("Condition '%s' eval-Fehler: %s", expr, e)
        return False
    return bool(result)
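The parse-then-evaluate approach above condenses into a self-contained sketch (same AST whitelist as `_ALLOWED_NODES`, with a simplified name check against the passed-in variables):

```python
import ast

# Same whitelist as _ALLOWED_NODES in the diff.
ALLOWED = (ast.Expression, ast.BoolOp, ast.UnaryOp, ast.Compare,
           ast.Name, ast.Constant, ast.Load,
           ast.And, ast.Or, ast.Not,
           ast.Eq, ast.NotEq, ast.Lt, ast.LtE, ast.Gt, ast.GtE)

def safe_eval(expr: str, variables: dict) -> bool:
    tree = ast.parse(expr, mode="eval")
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED):
            raise ValueError(f"forbidden node: {type(node).__name__}")
        if isinstance(node, ast.Name) and node.id not in variables:
            raise ValueError(f"unknown variable: {node.id}")
    # Empty builtins: the compiled expression can only see `variables`.
    return bool(eval(compile(tree, "<cond>", "eval"), {"__builtins__": {}}, variables))

vars_ = {"disk_free_gb": 3.2, "hour_of_day": 9, "day_of_week": "mon"}
print(safe_eval("disk_free_gb < 5 and hour_of_day >= 8", vars_))   # True
```

Anything outside the whitelist (a call, an attribute access, a subscript) hits the `isinstance` check before the expression is ever compiled, so `__import__('os')`-style payloads are rejected as a `Call` node.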
@@ -919,6 +919,56 @@ class ARIABridge:
        except Exception as e:
            logger.warning("[rvs] file_from_aria broadcast fehlgeschlagen: %s", e)

    def _append_chat_backup(self, entry: dict) -> None:
        """Appends one line to /shared/config/chat_backup.jsonl.
        Read by Diagnostic and the app as the history source.
        entry needs at least {role, text}; ts is added here."""
        try:
            line = {"ts": int(asyncio.get_event_loop().time() * 1000)}
            line.update(entry)
            Path("/shared/config").mkdir(parents=True, exist_ok=True)
            with open("/shared/config/chat_backup.jsonl", "a", encoding="utf-8") as f:
                f.write(json.dumps(line, ensure_ascii=False) + "\n")
        except Exception as e:
            logger.warning("[backup] chat_backup-Write fehlgeschlagen: %s", e)

    def _read_chat_backup_since(self, since_ms: int, limit: int = 100) -> list[dict]:
        """Reads chat_backup.jsonl and returns entries with ts > since_ms,
        capped to the newest `limit`. file_deleted markers are skipped here;
        deletions reach the app through a separate event path."""
        path = Path("/shared/config/chat_backup.jsonl")
        if not path.exists():
            return []
        try:
            lines = path.read_text(encoding="utf-8").splitlines()
        except Exception as e:
            logger.warning("[backup] Lesen fehlgeschlagen: %s", e)
            return []
        out: list[dict] = []
        for raw in lines:
            raw = raw.strip()
            if not raw:
                continue
            try:
                obj = json.loads(raw)
            except Exception:
                continue
            ts = obj.get("ts") or 0
            if ts <= since_ms:
                continue
            # file_deleted markers: not delivered as chat entries, but the app
            # still receives deletions via the separate file_deleted event
            # path so it can update its bubbles.
            if obj.get("type") == "file_deleted":
                continue
            role = obj.get("role")
            if role not in ("user", "assistant"):
                continue
            out.append(obj)
        # Cap to the newest `limit` entries
        if len(out) > limit:
            out = out[-limit:]
        return out

    async def _process_core_response(self, text: str, payload: dict) -> None:
        """Processes a finished reply from aria-core.
@@ -933,6 +983,9 @@ class ARIABridge:
            logger.info("[core] NO_REPLY empfangen — Antwort still verworfen")
            return

        # The reply is logged to chat_backup.jsonl (cleaned text, without
        # file markers) further down, after extract_file_markers.

        # Extract file markers `[FILE: /shared/uploads/aria_xyz.pdf]`;
        # ARIA uses them to hand files to the user (images, PDFs, etc.).
        # The marker is removed from the reply text (TTS must not
@@ -949,6 +1002,15 @@ class ARIABridge:
                        f"aber nicht erstellt:\n{missing_list}\n"
                        "Bitte ARIA bitten, sie wirklich zu schreiben.").strip()

        # Log the reply to chat_backup.jsonl (cleaned text, without file
        # markers). File markers are delivered separately as file_from_aria events.
        self._append_chat_backup({
            "role": "assistant",
            "text": text,
            "files": [{"serverPath": f["serverPath"], "name": f["name"],
                       "mimeType": f["mimeType"], "size": f["size"]} for f in aria_files],
        })

        metadata = payload.get("metadata", {})
        is_critical = metadata.get("critical", False)
        requested_voice = metadata.get("voice")
@@ -1024,6 +1086,12 @@ class ARIABridge:
        except Exception as e:
            logger.error("[core] XTTS-Request fehlgeschlagen: %s — kein Audio", e)

        # ARIA is done: reset the app's "ARIA denkt..." indicator to idle.
        # _last_chat_final_at is deliberately NOT set here: the 3s cooldown
        # was for trailing OpenClaw activity events; in voice chat it would
        # suppress the next thinking wave.
        await self._emit_activity("idle", "")

    # ── Mode persistence (global, not per device) ──────
    _MODE_FILE = "/shared/config/mode.json"
@@ -1184,12 +1252,13 @@ class ARIABridge:
        payload = json.dumps({"message": text, "source": source}).encode("utf-8")
        logger.info("[brain] chat ← %s '%s'", source, text[:80])

        # Log the user message to chat_backup.jsonl; it is read as the history
        # source on app reconnect / Diagnostic reload.
        self._append_chat_backup({"role": "user", "text": text, "source": source})

        # agent_activity → thinking. _emit_activity instead of a direct
        # _send_to_rvs so the state cache is correct for the later idle dedup.
        await self._emit_activity("thinking", "")

        def _do_call():
            try:
@@ -1206,11 +1275,7 @@ class ARIABridge:
        status, body = await asyncio.get_event_loop().run_in_executor(None, _do_call)
        if status != 200:
            logger.error("[brain] /chat fehlgeschlagen: status=%s body=%s", status, body[:200])
            await self._emit_activity("idle", "")
            await self._send_to_rvs({
                "type": "chat",
                "payload": {
@@ -1225,21 +1290,13 @@ class ARIABridge:
            data = json.loads(body)
        except Exception:
            logger.error("[brain] /chat lieferte ungueltiges JSON: %s", body[:200])
            await self._emit_activity("idle", "")
            return

        reply = (data.get("reply") or "").strip()
        if not reply:
            logger.warning("[brain] /chat: leerer Reply")
            await self._emit_activity("idle", "")
            return

        # Broadcast side-channel events BEFORE the chat bubble (e.g. skill_created)
@@ -1254,6 +1311,14 @@ class ARIABridge:
            })
            logger.info("[brain] ARIA hat einen Skill erstellt: %s",
                        event.get("skill", {}).get("name"))
        elif etype == "trigger_created":
            await self._send_to_rvs({
                "type": "trigger_created",
                "payload": event.get("trigger", {}),
                "timestamp": int(asyncio.get_event_loop().time() * 1000),
            })
            logger.info("[brain] ARIA hat einen Trigger angelegt: %s",
                        event.get("trigger", {}).get("name"))

        # _process_core_response handles everything else:
        # extracting + broadcasting file markers, the NO_REPLY check, chat
@@ -1265,6 +1330,8 @@ class ARIABridge:
            await self._process_core_response(reply, {})
        except Exception:
            logger.exception("[brain] _process_core_response Fehler")
            await self._emit_activity("idle", "")
            # Original fallback send (dead code, _emit_activity handles this now)
            await self._send_to_rvs({
                "type": "agent_activity",
                "payload": {"activity": "idle"},
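The filtering rules of `_read_chat_backup_since()` can be sketched self-contained over in-memory lines (the sample entries are made up; the record shape follows `_append_chat_backup`):

```python
import json

# Synthetic chat_backup.jsonl lines, shaped like _append_chat_backup writes them.
lines = [
    json.dumps({"ts": 100, "role": "user", "text": "Hi"}),
    json.dumps({"ts": 200, "role": "assistant", "text": "Hallo Stefan"}),
    json.dumps({"ts": 300, "type": "file_deleted", "path": "/shared/uploads/x.pdf"}),
    json.dumps({"ts": 400, "role": "assistant", "text": "Datei geloescht."}),
]

def read_since(raw_lines, since_ms, limit=100):
    """Mirrors _read_chat_backup_since(): ts filter, marker skip, role filter, cap."""
    out = []
    for raw in raw_lines:
        obj = json.loads(raw)
        if (obj.get("ts") or 0) <= since_ms:
            continue
        if obj.get("type") == "file_deleted":   # delivered via a separate event path
            continue
        if obj.get("role") not in ("user", "assistant"):
            continue
        out.append(obj)
    return out[-limit:]

msgs = read_since(lines, since_ms=150)
print([m["text"] for m in msgs])   # ['Hallo Stefan', 'Datei geloescht.']
```

This is what backs the `chat_history_request`/`chat_history_response` exchange in the hunk below: the app sends its last seen `ts` and gets only the messages it missed.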
@@ -1657,6 +1724,20 @@ class ARIABridge:
|
|||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.warning("[rvs] file_saved konnte nicht an App gesendet werden: %s", e)
|
logger.warning("[rvs] file_saved konnte nicht an App gesendet werden: %s", e)
|
||||||
|
|
||||||
|
elif msg_type == "chat_history_request":
|
||||||
|
# App holt verpasste Nachrichten beim Reconnect.
|
||||||
|
# payload: {since: <ts_ms>}, default 0 = alles
|
||||||
|
since = int(payload.get("since") or 0)
|
||||||
|
limit = int(payload.get("limit") or 100)
|
||||||
|
logger.info("[rvs] chat_history_request since=%d limit=%d", since, limit)
|
||||||
|
messages = self._read_chat_backup_since(since, limit=limit)
|
||||||
|
await self._send_to_rvs({
|
||||||
|
"type": "chat_history_response",
|
||||||
|
"payload": {"messages": messages, "since": since},
|
||||||
|
"timestamp": int(asyncio.get_event_loop().time() * 1000),
|
||||||
|
})
|
||||||
|
return
|
||||||
|
|
||||||
elif msg_type == "file_list_request":
|
elif msg_type == "file_list_request":
|
||||||
# App fragt die Liste aller /shared/uploads/-Dateien an.
|
# App fragt die Liste aller /shared/uploads/-Dateien an.
|
||||||
logger.info("[rvs] file_list_request von App")
|
logger.info("[rvs] file_list_request von App")
|
||||||
```diff
@@ -1681,6 +1762,89 @@ class ARIABridge:
         logger.warning("[rvs] file_list_request: %s", e)
     return

+elif msg_type == "file_delete_batch_request":
+    # App wants to delete several files at once.
+    paths = payload.get("paths") or []
+    req_id = payload.get("requestId", "")
+    logger.warning("[rvs] file_delete_batch_request: %d Pfade", len(paths))
+    try:
+        body_bytes = json.dumps({"paths": paths}).encode("utf-8")
+        req = urllib.request.Request(
+            "http://localhost:3001/api/files-delete-batch",
+            data=body_bytes, method="POST",
+            headers={"Content-Type": "application/json"},
+        )
+        def _do_delete():
+            try:
+                with urllib.request.urlopen(req, timeout=30) as resp:
+                    return resp.status, resp.read().decode("utf-8", errors="ignore")
+            except Exception as e:
+                return None, str(e)
+        status, body = await asyncio.get_event_loop().run_in_executor(None, _do_delete)
+        logger.info("[rvs] file_delete_batch result: status=%s", status)
+        # The server broadcasts file_deleted per path — the app receives that
+        # over the persistent RVS connection. We additionally confirm with counts.
+        try: d = json.loads(body or "{}")
+        except Exception: d = {}
+        await self._send_to_rvs({
+            "type": "file_delete_batch_response",
+            "payload": {
+                "requestId": req_id,
+                "deleted": len(d.get("deleted", [])),
+                "errors": d.get("errors", []),
+            },
+            "timestamp": int(asyncio.get_event_loop().time() * 1000),
+        })
+    except Exception as e:
+        logger.warning("[rvs] file_delete_batch_request: %s", e)
+    return
```
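The batch-delete handler answers the app with counts derived from the diagnostic server's JSON reply, deliberately treating an empty or unparsable body as an empty result rather than an error. That fallback can be sketched in isolation (the function name is illustrative, not from the codebase):

```python
import json

def summarize_delete_reply(body):
    # Same tolerance as the handler: empty or malformed JSON counts as "{}".
    try:
        d = json.loads(body or "{}")
    except ValueError:
        d = {}
    return {"deleted": len(d.get("deleted", [])), "errors": d.get("errors", [])}
```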
```diff
+elif msg_type == "file_zip_request":
+    # App wants several files as a ZIP. The bridge fetches the ZIP from
+    # Diagnostic via HTTP, base64-encodes it and sends it back. Capped at
+    # 30 MB ZIP size so RVS doesn't choke.
+    paths = payload.get("paths") or []
+    req_id = payload.get("requestId", "")
+    logger.warning("[rvs] file_zip_request: %d Pfade (req=%s)", len(paths), req_id)
+
+    def _do_zip():
+        try:
+            body_bytes = json.dumps({"paths": paths}).encode("utf-8")
+            req = urllib.request.Request(
+                "http://localhost:3001/api/files-download-zip",
+                data=body_bytes, method="POST",
+                headers={"Content-Type": "application/json"},
+            )
+            with urllib.request.urlopen(req, timeout=120) as resp:
+                if resp.status != 200:
+                    return None, f"HTTP {resp.status}"
+                data = resp.read()
+                if len(data) > 30 * 1024 * 1024:
+                    return None, f"ZIP zu gross ({len(data) // (1024*1024)} MB > 30 MB)"
+                return data, None
+        except Exception as e:
+            return None, str(e)
+
+    data, err = await asyncio.get_event_loop().run_in_executor(None, _do_zip)
+    if err or data is None:
+        await self._send_to_rvs({
+            "type": "file_zip_response",
+            "payload": {"requestId": req_id, "ok": False, "error": err or "leer"},
+            "timestamp": int(asyncio.get_event_loop().time() * 1000),
+        })
+        return
+    import base64 as _b64
+    await self._send_to_rvs({
+        "type": "file_zip_response",
+        "payload": {
+            "requestId": req_id, "ok": True,
+            "size": len(data),
+            "data": _b64.b64encode(data).decode("ascii"),
+        },
+        "timestamp": int(asyncio.get_event_loop().time() * 1000),
+    })
+    return
```
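Note that the 30 MB cap applies to the raw ZIP bytes before base64 encoding; base64 emits 4 output characters per 3 input bytes (rounded up to a full quantum with padding), so a maximal response payload lands at roughly 40 MB on the wire. A quick check of that arithmetic:

```python
import base64
import math

def b64_len(n_bytes):
    # Base64 output length: 4 chars per 3 input bytes, rounded up (padding included).
    return 4 * math.ceil(n_bytes / 3)

CAP = 30 * 1024 * 1024  # the handler's 30 MB ZIP-size cap
```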
```diff
 elif msg_type == "file_delete_request":
     # App wants to delete a single file — forward to Diagnostic.
     p = payload.get("path", "")
```
```diff
@@ -1887,13 +2051,11 @@ class ARIABridge:
 if text.strip():
     logger.info("[rvs] STT Ergebnis: '%s'", text[:80])
-    # Prepend hints (barge-in, GPS) as a prefix — shared helper with
-    # the chat path so the behavior stays consistent.
-    core_text = self._build_core_text(text, interrupted, location)
-    # Send to aria-core FIRST (the most important step)
-    await self.send_to_core(core_text, source="app-voice" + (" [barge-in]" if interrupted else ""))
-    # Send the STT text to RVS (for display in app + Diagnostic)
-    # sender="stt" so the bridge ignores it (no loop)
+    # Order matters: broadcast the STT text FIRST so the app can
+    # update the voice bubble with the recognized text immediately —
+    # send_to_core afterwards blocks synchronously on Brain (which can
+    # take a while) and would otherwise delay the display.
     try:
         stt_payload = {
             "text": text,
@@ -1917,6 +2079,10 @@ class ARIABridge:
         logger.warning("[rvs] STT-Text NICHT broadcastet — _send_to_rvs lieferte False")
     except Exception as e:
         logger.warning("[rvs] STT-Text konnte nicht an RVS gesendet werden: %s", e)
+
+    # Then to Brain — it blocks synchronously until ARIA is done.
+    core_text = self._build_core_text(text, interrupted, location)
+    await self.send_to_core(core_text, source="app-voice" + (" [barge-in]" if interrupted else ""))
 else:
     logger.info("[rvs] Keine Sprache erkannt — ignoriert")
```
```diff
@@ -1,5 +1,7 @@
 FROM node:22-alpine
 WORKDIR /app
+# zip for multi-file downloads (brain export uses tar.gz, the file manager uses zip)
+RUN apk add --no-cache zip
 COPY package.json ./
 RUN npm install --production
 COPY . .
```
*+753 −82 — file diff suppressed because it is too large.*
```diff
@@ -1361,6 +1361,77 @@ const server = http.createServer((req, res) => {
   });
   fs.createReadStream(safe).pipe(res);
   return;
+} else if (req.url === "/api/files-download-zip" && req.method === "POST") {
+  // Multi-file download as a ZIP. Body: {paths: ["/shared/uploads/...", ...]}.
+  // Streams zip's stdout directly into the response.
+  let body = "";
+  req.on("data", c => { body += c; if (body.length > 65536) req.destroy(); });
+  req.on("end", () => {
+    let paths = [];
+    try { paths = (JSON.parse(body || "{}").paths || []); } catch { paths = []; }
+    // Whitelist: only /shared/uploads/, and the files must exist
+    paths = paths
+      .map(p => path.resolve(String(p)))
+      .filter(p => p.startsWith("/shared/uploads/") && fs.existsSync(p));
+    if (!paths.length) {
+      res.writeHead(400, { "Content-Type": "application/json" });
+      res.end(JSON.stringify({ ok: false, error: "Keine gueltigen Pfade" }));
+      return;
+    }
+    const ts = new Date().toISOString().replace(/[:.]/g, "-").slice(0, 19);
+    const fname = `aria-files-${ts}.zip`;
+    res.writeHead(200, {
+      "Content-Type": "application/zip",
+      "Content-Disposition": `attachment; filename="${fname}"`,
+    });
+    // zip -j: junk paths (store files without their directory structure)
+    const { spawn } = require("child_process");
+    const zip = spawn("zip", ["-j", "-q", "-", ...paths]);
+    zip.stdout.pipe(res);
+    let stderr = "";
+    zip.stderr.on("data", d => stderr += d.toString());
+    zip.on("close", code => {
+      if (code !== 0 && code !== 12) {
+        log("error", "server", `zip exit ${code}: ${stderr.slice(0, 200)}`);
+      }
+    });
+    req.on("close", () => { if (!zip.killed) zip.kill("SIGTERM"); });
+  });
+  return;
```
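Both new endpoints whitelist paths by resolving them first and only then checking the `/shared/uploads/` prefix, which neutralizes `../` traversal. The same check expressed in Python for illustration (the server does it with Node's `path.resolve`):

```python
import os.path

UPLOADS = "/shared/uploads/"

def is_allowed(p):
    # Normalize before the prefix check so "../" segments cannot
    # escape the uploads directory.
    return os.path.abspath(str(p)).startswith(UPLOADS)
```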
```diff
+} else if (req.url === "/api/files-delete-batch" && req.method === "POST") {
+  let body = "";
+  req.on("data", c => { body += c; if (body.length > 65536) req.destroy(); });
+  req.on("end", () => {
+    try {
+      let paths = (JSON.parse(body || "{}").paths || []);
+      paths = paths
+        .map(p => path.resolve(String(p)))
+        .filter(p => p.startsWith("/shared/uploads/"));
+      const deleted = [];
+      const errors = [];
+      for (const p of paths) {
+        try {
+          if (fs.existsSync(p)) fs.unlinkSync(p);
+          deleted.push(p);
+          broadcast({ type: "file_deleted", path: p });
+          sendToRVS_raw({ type: "file_deleted", payload: { path: p }, timestamp: Date.now() });
+          try {
+            fs.appendFileSync("/shared/config/chat_backup.jsonl",
+              JSON.stringify({ type: "file_deleted", path: p, ts: Date.now(), by: "user" }) + "\n");
+          } catch {}
+        } catch (e) {
+          errors.push({ path: p, error: e.message });
+        }
+      }
+      log("info", "server", `Bulk-Delete: ${deleted.length} OK, ${errors.length} Fehler`);
+      res.writeHead(200, { "Content-Type": "application/json" });
+      res.end(JSON.stringify({ ok: true, deleted, errors }));
+    } catch (err) {
+      res.writeHead(500, { "Content-Type": "application/json" });
+      res.end(JSON.stringify({ ok: false, error: err.message }));
+    }
+  });
+  return;
```
```diff
 } else if (req.url === "/api/files-delete" && req.method === "POST") {
   let body = "";
   req.on("data", c => { body += c; if (body.length > 4096) req.destroy(); });
@@ -1448,6 +1519,30 @@ const server = http.createServer((req, res) => {
     }
   });
   return;
```
```diff
+} else if (req.url === "/api/chat-history-clear" && req.method === "POST") {
+  // Clears the Diagnostic display history (chat_backup.jsonl) AND broadcasts
+  // chat_cleared to all RVS clients (the app clears locally). Brain's
+  // rolling window (conversation.jsonl) is independent of this — the caller
+  // should additionally trigger /api/brain/conversation/reset.
+  log("warn", "server", "HTTP /api/chat-history-clear");
+  try {
+    const file = "/shared/config/chat_backup.jsonl";
+    if (fs.existsSync(file)) fs.unlinkSync(file);
+    // Browser clients: empty chat_history
+    broadcast({ type: "chat_history", messages: [] });
+    // App via RVS: chat_cleared
+    sendToRVS_raw({
+      type: "chat_cleared",
+      payload: { ts: Date.now() },
+      timestamp: Date.now(),
+    });
+    res.writeHead(200, { "Content-Type": "application/json" });
+    res.end(JSON.stringify({ ok: true }));
+  } catch (err) {
+    res.writeHead(500, { "Content-Type": "application/json" });
+    res.end(JSON.stringify({ ok: false, error: err.message }));
+  }
+  return;
 } else if (req.url === "/api/wipe-all" && req.method === "POST") {
   // Full reset — memory, voices, config all gone. SSH keys
   // and .env stay, the RVS connection stays. Brain + Qdrant are
```
```diff
@@ -212,22 +212,56 @@ Wichtige Mechanismen:
 - [x] RVS messages from the smartphone go through
 - [x] SSH volume read-write for the proxy (no more -F workaround)

-## Open
-
-### Brain (Phase B — the big refactor is underway)
-
-- [x] aria-brain container skeleton (FastAPI + Qdrant + sentence-transformers)
-- [x] Memory CRUD via the Diagnostic "Gehirn" tab (add/edit/delete + search + filter)
-- [x] Brain export/import as tar.gz (complete: memories + skills + Qdrant)
-- [x] Voice-Bridge: aria-core-specific logic removed (doctor_fix, aria_restart, aria_session_reset, compact_after)
-- [x] aria-core removed entirely from docker-compose.yml, watchdog removed
-- [x] Diagnostic: wipe-all button (memory + voices + settings)
-- [x] Voice export/import (Diagnostic + XTTS bridge on the gaming PC)
+## Brain — Phase B (complete)
+
+The big refactor away from OpenClaw to our own Brain architecture — all 4 items
+done. ARIA now has its own memory (vector DB), its own loop, and its own
+skills with tool use.
+
+### Infrastructure
+
+- [x] aria-brain container (FastAPI + Qdrant + sentence-transformers, MiniLM multilingual)
+- [x] aria-core (OpenClaw) torn down — tag `v0.1.2.0` kept as an archive
+- [x] docker-compose rebuilt completely: brain + qdrant + bridge + diagnostic + proxy
+- [x] Voice-Bridge: aria-core logic removed (doctor_fix, aria_restart, compact_after) → replaced by a Brain HTTP call
+- [x] Language-model setting in runtime.json (brainModel) — Diagnostic can switch the model live, Brain restart required
+
+### Memory / vector DB
+
+- [x] Memory CRUD via the Diagnostic "Gehirn" tab (add/edit/delete + search + type/pinned filters)
+- [x] **Migration from brain-import/** (Phase B item 2) — parsers for AGENT.md/USER.md/TOOLING.md, atomic items with migration_key (idempotent)
+- [x] **Bootstrap snapshot** (Phase B item 2) — export/import of pinned memories only, as JSON
+- [x] **Complete brain** export/import as tar.gz (memories + skills + Qdrant)
+
+### Conversation loop (Phase B item 3)
+
+- [x] Single-chat UI + rolling window (50 turns)
+- [x] Memory distillate: at >60 turns, the 30 oldest automatically become fact memories via a Claude call
+- [x] Hot memory (pinned) + cold memory (top-5 semantic) in the system prompt
+- [x] Manual distillate trigger + conversation reset (Brain + Diagnostic chat_backup at the same time)
+- [x] Bridge writes chat_backup.jsonl on every turn (user + ARIA + ARIA files)
+- [x] App chat sync: full server sync on reconnect (server = source of truth). If the server is empty → the app clears too. Local-only bubbles (skill notifications, running voice without STT) are kept. Plus a chat_cleared live update when Diagnostic wipes the history.
+
+### Skills system (Phase B item 4)
+
+- [x] Python-only skills (local venv per skill, own pip packages)
+- [x] Tool use in the Brain: skill_create as a meta tool, dynamic run_<skill> per active skill
+- [x] Hard threshold documented: pip install → ALWAYS a skill (the Brain has no persistence other than /data/skills/)
+- [x] Diagnostic Skills tab: list, README, logs per run, activate/deactivate/delete, export/import as tar.gz
+- [x] skill_created live notification: yellow bubble in app + Diagnostic as soon as ARIA creates a skill herself
+
+### Diagnostic / app features (around it)
+
+- [x] File manager (Diagnostic + app modal): manage /shared/uploads/, multi-select + select-all + bulk download as ZIP + bulk delete
+- [x] Wipe-all button (memory + voices + settings)
+- [x] Voice export/import per voice (Diagnostic + XTTS bridge on the Gamebox)
 - [x] F5/Whisper settings as a JSON bundle export/import
-- [x] File manager (Diagnostic + app modal): manage /shared/uploads/, deletes mirror live into the chat bubbles
-- [ ] **Phase B item 2:** migration `aria-data/brain-import/` → atomic memory items (identity / rules / preferences / tools)
-- [ ] **Phase B item 3:** Brain conversation loop (single-chat UI + rolling window + memory distillate)
-- [ ] **Phase B item 4:** skills system (manifest, venv/local-bin, README per skill, Diagnostic Skills tab, export/import)
+- [x] App chat search rebuilt: highlight + next/prev instead of filtering
+- [x] App pinch zoom on images rewritten (multi-touch race bugs)
+- [x] Info buttons with modal explanations in the "Gehirn" tab
+- [x] Token/call metrics + subscription quota tracking: one log entry per Claude call with a token estimate (chars/4). The "Gehirn" tab shows a 1h/5h/24h/30d aggregate + a progress bar against the plan limit (Pro=45/5h, Max 5x=225/5h, Max 20x=900/5h, custom). Warning threshold 80%, critical 90%.
+
+## Open
+
 ### App features
 - [ ] Load the chat history more reliably (AsyncStorage race condition)
```
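The token/call metrics entry estimates tokens as characters divided by four and flags the 5-hour call window at 80% (warn) and 90% (critical) of the plan limit. A sketch of that arithmetic — the function and dict names are illustrative, not from the codebase:

```python
PLAN_LIMITS_5H = {"pro": 45, "max5x": 225, "max20x": 900}  # calls per 5 h window

def estimate_tokens(text):
    # Heuristic from the log entries above: roughly 4 characters per token.
    return len(text) // 4

def quota_level(calls, plan="pro"):
    # Map call count in the current window to the UI's warning levels.
    ratio = calls / PLAN_LIMITS_5H[plan]
    if ratio >= 0.90:
        return "critical"
    if ratio >= 0.80:
        return "warn"
    return "ok"
```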
```diff
@@ -238,3 +272,7 @@ Wichtige Mechanismen:
 - [ ] Diagnostic: system-info tab (container status, disk, RAM, CPU)
 - [ ] Finally solve RVS zombie connections
 - [ ] Gamebox: small web UI for credentials/server config, or pushed centrally from Diagnostic via RVS
+- [ ] Have the first skills built (yt-dlp, pdf-extract, image-resize, etc.) — through normal requests, ARIA creates them herself
+- [ ] Tool-use verification: live test whether claude-max-api-proxy passes `tools` and `tool_calls` through cleanly
+- [ ] Heartbeat (periodic self-checks)
+- [ ] Local LLM as a gatekeeper (triage before the Claude call)
```
```diff
@@ -25,6 +25,10 @@ const ALLOWED_TYPES = new Set([
   "xtts_export_voice", "xtts_voice_exported",
   "xtts_import_voice", "xtts_voice_imported",
   "skill_created",
+  "trigger_created",
+  "chat_history_request", "chat_history_response", "chat_cleared",
+  "file_delete_batch_request", "file_delete_batch_response",
+  "file_zip_request", "file_zip_response",
   "xtts_delete_voice",
   "voice_preload", "voice_ready",
   "stt_request", "stt_response",
```