Compare commits
3 Commits

| Author | SHA1 | Date |
|---|---|---|
| | db053c2dbd | |
| | 8c1dac86d5 | |
| | 8fb95b884f | |
@@ -271,9 +271,13 @@ The Bridge connects the Android app with ARIA and provides local speech processing

 **Message flow:**

 ```
-App → RVS → Bridge → aria-core
-aria-core → Bridge → RVS → App
-          → speaker (TTS)
+Text:  App → RVS → Bridge → chat.send → aria-core
+Audio: App → RVS → Bridge → FFmpeg → Whisper STT → chat.send → aria-core
+File:  App → RVS → Bridge → /shared/uploads/ → chat.send (with path) → aria-core
+
+aria-core → answer → Gateway → Diagnostic → RVS → App
+          → Bridge → Piper TTS → RVS → App (audio)
+          → Bridge → speaker (local)
 ```

 ### Features
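Every message in the flow above travels in the same RVS envelope seen throughout this diff: a `type`, a `payload` object, and a millisecond `timestamp`. A minimal sketch, assuming only that shape (the helper name is hypothetical, not part of the codebase):

```python
import json
import time

def make_rvs_message(msg_type: str, payload: dict) -> str:
    """Build an RVS envelope: type + payload + millisecond timestamp."""
    return json.dumps({
        "type": msg_type,
        "payload": payload,
        "timestamp": int(time.time() * 1000),
    })

# A chat message as the app would send it toward the Bridge
raw = make_rvs_message("chat", {"text": "Hallo ARIA", "sender": "user"})
msg = json.loads(raw)
```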
@@ -335,9 +339,11 @@ API endpoint for other services: `GET http://localhost:3001/api/session`
 - Text chat with ARIA
 - **Voice recording**: push-to-talk (hold) or tap-to-talk (tap, auto-stop on silence)
 - **VAD (Voice Activity Detection)**: detects 1.8 s of silence and stops automatically
-- **Wake word**: toggle button enables continuous microphone monitoring
+- **STT (Speech-to-Text)**: audio is transcribed in the Bridge via Whisper; the transcribed text appears in the chat
+- **Wake word**: toggle button (ear icon) enables continuous microphone monitoring
 - **TTS playback**: ARIA answers via the speaker (Ramona/Thorsten)
-- File and camera upload
+- **File and image upload**: images inline in the chat, files shown as icon + name + size
+- **Attachments**: the Bridge stores files in the shared volume (`/shared/uploads/`), ARIA can access them
 - GPS position (optional)
 - QR code scanner for token pairing

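The VAD feature above (auto-stop after 1.8 s of silence) can be sketched with a simple energy threshold. This is an illustrative sketch only; the threshold and frame length are hypothetical values, not taken from the app:

```python
def should_stop(frame_energies, threshold=500.0, silence_s=1.8, frame_s=0.1):
    """Energy-based VAD sketch: stop recording once the most recent
    frames together cover at least `silence_s` seconds below threshold."""
    needed = int(silence_s / frame_s)  # e.g. 18 consecutive quiet frames
    if len(frame_energies) < needed:
        return False
    return all(e < threshold for e in frame_energies[-needed:])

# Speech followed by 1.8 s of silence triggers the stop
stopped = should_stop([900.0] * 10 + [12.0] * 18)
```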
@@ -381,15 +387,28 @@ GITEA_REPO=stefan/aria-agent
 GITEA_USER=stefan
 ```

-### Audio pipeline
+### Audio pipeline (voice input)

 ```
 App (microphone) → AAC/MP4 recording → Base64 → RVS → Bridge
 Bridge: FFmpeg (16 kHz PCM) → Whisper STT → text → aria-core
+Bridge: STT result → RVS → App (placeholder is replaced with the transcribed text)
 aria-core → answer → Bridge → Piper TTS (WAV) → Base64 → RVS → App
 App: Base64 → WAV → speaker
 ```

+### File pipeline (images & attachments)
+
+```
+App (camera/file manager) → Base64 → RVS → Bridge
+Bridge: stores in /shared/uploads/ (shared volume, visible to aria-core)
+Bridge: chat.send → "Stefan hat ein Bild geschickt: foto.jpg — liegt unter /shared/uploads/..."
+ARIA: can open and analyze the file via the Bash/Read tool
+```
+
+**Supported formats:** images (JPG, PNG), documents (PDF, DOCX, TXT), arbitrary files.
+Images are displayed inline in the app, other files as icon + file name.
+
 ---

 ## Data directory — aria-data/
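The file pipeline above can be sketched end to end on the Bridge side. This is a simplified stand-in for the real handler; the function name and uploads directory argument are illustrative:

```python
import base64
import os
import time

def store_upload(b64: str, name: str, uploads_dir: str) -> tuple[str, int]:
    """Decode a Base64 upload from the app, write it under a
    collision-safe name, and return the path plus decoded size in KB."""
    os.makedirs(uploads_dir, exist_ok=True)
    # Prefix with a timestamp and strip path separators, like the Bridge does
    safe_name = f"file_{int(time.time())}_{name.replace('/', '_')}"
    path = os.path.join(uploads_dir, safe_name)
    data = base64.b64decode(b64)
    with open(path, "wb") as f:
        f.write(data)
    return path, len(data) // 1024

path, kb = store_upload(base64.b64encode(b"demo").decode(), "foto.jpg", "/tmp/aria-uploads")
```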
@@ -453,6 +472,8 @@ docker compose up -d
 | `./aria-data/ssh` (bind) | `/root/.ssh`, `/home/node/.ssh` | SSH keys |
 | `./aria-data/brain` (bind) | `/home/node/.openclaw/workspace/memory` | Memory |
 | `./aria-data/skills` (bind) | `/home/node/.openclaw/workspace/skills` | Skills |
+| `aria-shared` | `/shared` (Core + Bridge) | File exchange (uploads from the app) |
+| `./aria-data/config/diag-state` (bind) | `/data` (Diagnostic) | Persistent state (active session) |

 ---

@@ -507,8 +528,13 @@ docker exec aria-core ssh aria-wohnung hostname
 This makes ARIA slower than the direct Claude CLI. The timeout is 900 s (15 min).
 - **No streaming to the app**: the app only shows the finished answer, no streaming tokens.
 - **Wake word only on the VM**: the Bridge listens for "ARIA" via the VM's local microphone.
-  The app has energy-based detection (phase 1).
+  The app has energy-based detection (phase 1). On-device "ARIA" keyword spotting (Porcupine) is phase 2.
 - **Audio format**: the app records AAC/MP4, the Bridge converts it to 16 kHz PCM via FFmpeg.
+- **Image analysis limited**: images are stored in `/shared/uploads/`. ARIA can
+  open them via the Bash/Read tool, but Claude Vision (direct image analysis) is not yet
+  possible via the proxy path (`claude --print`). ARIA sees the file path, not the image.
+- **File size**: large files (>5 MB) can exceed WebSocket limits.
+  Images are compressed in the app to at most 1920x1920 px @ 80% quality.

 ---

Binary file not shown. (repeated for several binary build artifacts)
+1 -1 File diff suppressed because one or more lines are too long
@@ -1,4 +1,4 @@
-#Sun Mar 29 11:40:22 CEST 2026
+#Sun Mar 29 12:31:28 CEST 2026
 base.2=/home/duffy/Dokumente/programmierung/ARIA-AGENT/android/android/app/build/intermediates/dex/release/mergeDexRelease/classes2.dex
 path.2=classes2.dex
 base.1=/home/duffy/Dokumente/programmierung/ARIA-AGENT/android/android/app/build/intermediates/global_synthetics_dex/release/classes.dex
+1 -1 File diff suppressed because one or more lines are too long
@@ -17,6 +17,7 @@ import {
 import DocumentPicker, {
   DocumentPickerResponse,
 } from 'react-native-document-picker';
+import RNFS from 'react-native-fs';

 // --- Types ---

@@ -74,15 +75,17 @@ const FileUpload: React.FC<FileUploadProps> = ({ onFileSelected, onCancel }) =>

     setLoading(true);
     try {
-      // In production: read the file and convert it to Base64
-      // const base64 = await RNFS.readFile(selectedFile.fileCopyUri || selectedFile.uri, 'base64');
-      const base64Placeholder = '';
+      // Read the file and convert it to Base64
+      const filePath = selectedFile.fileCopyUri || selectedFile.uri;
+      // Strip the URI scheme for RNFS (file:// → absolute path)
+      const cleanPath = filePath.replace('file://', '');
+      const base64 = await RNFS.readFile(cleanPath, 'base64');

       const fileData: FileData = {
         name: selectedFile.name || 'unbenannt',
         type: selectedFile.type || 'application/octet-stream',
         size: selectedFile.size || 0,
-        base64: base64Placeholder,
+        base64,
         uri: selectedFile.uri,
       };

@@ -15,6 +15,7 @@ import {
   KeyboardAvoidingView,
   Platform,
   StyleSheet,
+  Image,
   Modal,
 } from 'react-native';
 import AsyncStorage from '@react-native-async-storage/async-storage';
@@ -33,6 +34,8 @@ interface Attachment {
   type: 'image' | 'file' | 'audio';
   name: string;
   size?: number;
+  uri?: string;       // local path or data URI for display
+  mimeType?: string;
 }

 interface ChatMessage {
@@ -93,6 +96,24 @@ const ChatScreen: React.FC = () => {
   // Subscribe to RVS messages
   useEffect(() => {
     const unsubMessage = rvs.onMessage((message: RVSMessage) => {
+      // STT result: replace the voice-input placeholder with the transcribed text
+      if (message.type === 'stt_result') {
+        const sttText = (message.payload.text as string) || '';
+        if (sttText) {
+          setMessages(prev => prev.map(m =>
+            m.sender === 'user' && m.text.includes('Spracheingabe wird verarbeitet')
+              ? { ...m, text: sttText }
+              : m
+          ));
+        } else {
+          // No speech recognized: remove the placeholder
+          setMessages(prev => prev.filter(m =>
+            !(m.sender === 'user' && m.text.includes('Spracheingabe wird verarbeitet'))
+          ));
+        }
+        return;
+      }
+
       if (message.type === 'chat') {
         // Only show messages from ARIA; own messages are added locally
         const sender = (message.payload.sender as string) || '';
@@ -159,7 +180,7 @@ const ChatScreen: React.FC = () => {
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
-      text: '[Sprachnachricht]',
+      text: '🎙 Spracheingabe wird verarbeitet...',
       timestamp: Date.now(),
       attachments: [{ type: 'audio', name: 'Sprachaufnahme' }],
     };
@@ -201,14 +222,22 @@ const ChatScreen: React.FC = () => {
     );
   }, [messages]);

-  // Auto-scroll on new messages
-  useEffect(() => {
-    if (messages.length > 0) {
-      setTimeout(() => {
-        flatListRef.current?.scrollToEnd({ animated: true });
-      }, 100);
-    }
-  }, [messages]);
+  // Auto-scroll is driven via the FlatList's onContentSizeChange
+  const shouldAutoScroll = useRef(true);
+  const handleContentSizeChange = useCallback(() => {
+    if (shouldAutoScroll.current) {
+      flatListRef.current?.scrollToEnd({ animated: false });
+    }
+  }, []);
+  const handleScrollBeginDrag = useCallback(() => {
+    shouldAutoScroll.current = false;
+  }, []);
+  const handleScrollEndDrag = useCallback((e: any) => {
+    // Re-enable auto-scroll once the user is back at the bottom
+    const { contentOffset, contentSize, layoutMeasurement } = e.nativeEvent;
+    const isAtBottom = contentOffset.y + layoutMeasurement.height >= contentSize.height - 50;
+    shouldAutoScroll.current = isAtBottom;
+  }, []);

   // Get the GPS position (optional)
   const getCurrentLocation = useCallback((): Promise<{ lat: number; lon: number } | null> => {
@@ -262,9 +291,8 @@ const ChatScreen: React.FC = () => {
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
-      text: '[Sprachnachricht]',
+      text: '🎙 Spracheingabe wird verarbeitet...',
       timestamp: Date.now(),
-      attachments: [{ type: 'audio', name: 'Sprachaufnahme' }],
     };
     setMessages(prev => [...prev, userMsg]);

@@ -281,12 +309,19 @@ const ChatScreen: React.FC = () => {
     setShowFileUpload(false);
     const location = await getCurrentLocation();

+    const isImage = file.type.startsWith('image/');
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
-      text: `[Datei: ${file.name}]`,
+      text: 'Anhang empfangen',
       timestamp: Date.now(),
-      attachments: [{ type: 'file', name: file.name, size: file.size }],
+      attachments: [{
+        type: isImage ? 'image' : 'file',
+        name: file.name,
+        size: file.size,
+        uri: isImage && file.base64 ? `data:${file.type};base64,${file.base64}` : file.uri,
+        mimeType: file.type,
+      }],
     };
     setMessages(prev => [...prev, userMsg]);

@@ -307,9 +342,14 @@ const ChatScreen: React.FC = () => {
     const userMsg: ChatMessage = {
       id: nextId(),
       sender: 'user',
-      text: `[Foto: ${photo.fileName}]`,
+      text: 'Anhang empfangen',
       timestamp: Date.now(),
-      attachments: [{ type: 'image', name: photo.fileName }],
+      attachments: [{
+        type: 'image',
+        name: photo.fileName,
+        uri: photo.base64 ? `data:${photo.type};base64,${photo.base64}` : undefined,
+        mimeType: photo.type,
+      }],
     };
     setMessages(prev => [...prev, userMsg]);

@@ -334,16 +374,35 @@ const ChatScreen: React.FC = () => {

     return (
       <View style={[styles.messageBubble, isUser ? styles.userBubble : styles.ariaBubble]}>
-        <Text style={[styles.messageText, isUser ? styles.userText : styles.ariaText]}>
-          {item.text}
-        </Text>
+        {/* Attachment preview */}
         {item.attachments?.map((att, idx) => (
-          <View key={idx} style={styles.attachmentBadge}>
-            <Text style={styles.attachmentText}>
-              {att.type === 'image' ? '\uD83D\uDDBC\uFE0F' : att.type === 'audio' ? '\uD83C\uDFA4' : '\uD83D\uDCC4'} {att.name}
-            </Text>
+          <View key={idx}>
+            {att.type === 'image' && att.uri ? (
+              <Image
+                source={{ uri: att.uri }}
+                style={styles.attachmentImage}
+                resizeMode="contain"
+              />
+            ) : (
+              <View style={styles.attachmentFile}>
+                <Text style={styles.attachmentFileIcon}>
+                  {att.mimeType?.includes('pdf') ? '\uD83D\uDCC4' :
+                   att.mimeType?.includes('word') || att.mimeType?.includes('document') ? '\uD83D\uDCC3' :
+                   att.mimeType?.includes('sheet') || att.mimeType?.includes('excel') ? '\uD83D\uDCC8' :
+                   '\uD83D\uDCC1'}
+                </Text>
+                <Text style={styles.attachmentFileName} numberOfLines={1}>{att.name}</Text>
+                {att.size ? <Text style={styles.attachmentFileSize}>{Math.round(att.size / 1024)}KB</Text> : null}
+              </View>
+            )}
           </View>
         ))}
+        {/* Text (hidden when it is just "Anhang empfangen" and an image is shown) */}
+        {!(item.text === 'Anhang empfangen' && item.attachments?.some(a => a.type === 'image' && a.uri)) && (
+          <Text style={[styles.messageText, isUser ? styles.userText : styles.ariaText]}>
+            {item.text}
+          </Text>
+        )}
         <Text style={styles.timestamp}>{time}</Text>
       </View>
     );
@@ -376,6 +435,9 @@ const ChatScreen: React.FC = () => {
       renderItem={renderMessage}
       contentContainerStyle={styles.messageList}
       showsVerticalScrollIndicator={false}
+      onContentSizeChange={handleContentSizeChange}
+      onScrollBeginDrag={handleScrollBeginDrag}
+      onScrollEndDrag={handleScrollEndDrag}
       ListEmptyComponent={
         <View style={styles.emptyContainer}>
           <Text style={styles.emptyIcon}>{'\uD83E\uDD16'}</Text>
@@ -517,17 +579,34 @@ const styles = StyleSheet.create({
   ariaText: {
     color: '#E0E0F0',
   },
-  attachmentBadge: {
-    backgroundColor: 'rgba(255,255,255,0.1)',
-    borderRadius: 6,
-    paddingHorizontal: 8,
-    paddingVertical: 4,
-    marginTop: 6,
-    alignSelf: 'flex-start',
-  },
-  attachmentText: {
-    color: '#CCCCDD',
-    fontSize: 12,
+  attachmentImage: {
+    width: '100%',
+    height: 200,
+    borderRadius: 8,
+    marginBottom: 6,
+    backgroundColor: '#0D0D1A',
+  },
+  attachmentFile: {
+    flexDirection: 'row',
+    alignItems: 'center',
+    backgroundColor: 'rgba(255,255,255,0.1)',
+    borderRadius: 8,
+    padding: 10,
+    marginBottom: 6,
+  },
+  attachmentFileIcon: {
+    fontSize: 24,
+    marginRight: 8,
+  },
+  attachmentFileName: {
+    flex: 1,
+    color: '#E0E0F0',
+    fontSize: 13,
+  },
+  attachmentFileSize: {
+    color: '#8888AA',
+    fontSize: 11,
+    marginLeft: 8,
   },
   timestamp: {
     color: 'rgba(255,255,255,0.4)',
+68 -4
@@ -954,10 +954,49 @@ class ARIABridge:
                 await self.ws_core.send(raw_message)

         elif msg_type == "file":
-            # File from the app → to aria-core
-            logger.info("[rvs] Datei empfangen: %s", payload.get("name", "?"))
-            if self.ws_core:
-                await self.ws_core.send(raw_message)
+            # File from the app → sent to aria-core as a text message
+            file_name = payload.get("name", "unbekannt")
+            file_type = payload.get("type", "")
+            file_b64 = payload.get("base64", "")
+            file_size = payload.get("size", 0)
+            width = payload.get("width", 0)
+            height = payload.get("height", 0)
+            logger.info("[rvs] Datei empfangen: %s (%s, %dKB)",
+                        file_name, file_type, len(file_b64) // 1365 if file_b64 else 0)
+
+            # Shared volume: /shared/ is mounted in the Bridge AND in aria-core
+            SHARED_DIR = "/shared/uploads"
+            os.makedirs(SHARED_DIR, exist_ok=True)
+
+            if file_b64 and file_type.startswith("image/"):
+                # Save the image to the shared volume
+                ext = ".jpg" if "jpeg" in file_type or "jpg" in file_type else ".png"
+                safe_name = f"img_{int(asyncio.get_event_loop().time())}_{file_name.replace('/', '_')}"
+                file_path = os.path.join(SHARED_DIR, safe_name if safe_name.endswith(ext) else safe_name + ext)
+                with open(file_path, "wb") as f:
+                    f.write(base64.b64decode(file_b64))
+                size_kb = len(file_b64) // 1365
+                logger.info("[rvs] Bild gespeichert: %s (%dKB)", file_path, size_kb)
+                text = (f"Stefan hat dir ein Bild geschickt: {file_name}"
+                        f"{f' ({width}x{height}px)' if width else ''}"
+                        f", {size_kb}KB."
+                        f" Das Bild liegt unter: {file_path}")
+                await self.send_to_core(text, source="app-file")
+            elif file_b64:
+                # Save any other file to the shared volume
+                safe_name = f"file_{int(asyncio.get_event_loop().time())}_{file_name.replace('/', '_')}"
+                file_path = os.path.join(SHARED_DIR, safe_name)
+                with open(file_path, "wb") as f:
+                    f.write(base64.b64decode(file_b64))
+                size_kb = len(file_b64) // 1365
+                logger.info("[rvs] Datei gespeichert: %s (%dKB)", file_path, size_kb)
+                text = (f"Stefan hat dir eine Datei geschickt: {file_name}"
+                        f" ({file_type}, {size_kb}KB)."
+                        f" Die Datei liegt unter: {file_path}")
+                await self.send_to_core(text, source="app-file")
+            else:
+                text = f"Stefan hat eine Datei gesendet ({file_name}, {file_type}) aber die Daten sind leer angekommen."
+                await self.send_to_core(text, source="app-file")

         elif msg_type == "audio":
             # Audio from the app → decode → STT → to aria-core
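The `len(file_b64) // 1365` shortcut in this handler estimates the decoded size from the Base64 length: Base64 encodes 3 bytes as 4 characters, so one decoded KB corresponds to roughly 4096/3 ≈ 1365 Base64 characters. A quick check of the approximation:

```python
import base64

payload = b"\x00" * 300_000                 # 300 kB of raw data
b64 = base64.b64encode(payload).decode()

exact_kb = len(base64.b64decode(b64)) // 1024
approx_kb = len(b64) // 1365                # the Bridge's shortcut
```

The shortcut stays within about 1 KB of the exact value and avoids decoding the payload just for logging.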
@@ -1017,9 +1056,34 @@ class ARIABridge:

             if text.strip():
                 logger.info("[rvs] STT Ergebnis: '%s'", text[:80])
+                # Send the STT result back to the app (for display, not reprocessed)
+                try:
+                    await self._send_to_rvs({
+                        "type": "stt_result",
+                        "payload": {
+                            "text": text,
+                            "sender": "user",
+                        },
+                        "timestamp": int(asyncio.get_event_loop().time() * 1000),
+                    })
+                except Exception as e:
+                    logger.warning("[rvs] STT-Ergebnis konnte nicht an App gesendet werden: %s", e)
+                # Send the text to aria-core regardless
                 await self.send_to_core(text, source="app-voice")
             else:
                 logger.info("[rvs] Keine Sprache erkannt — ignoriert")
+                try:
+                    await self._send_to_rvs({
+                        "type": "stt_result",
+                        "payload": {
+                            "text": "",
+                            "error": "Keine Sprache erkannt",
+                            "sender": "user",
+                        },
+                        "timestamp": int(asyncio.get_event_loop().time() * 1000),
+                    })
+                except Exception as e:
+                    logger.warning("[rvs] STT-Fehler konnte nicht an App gesendet werden: %s", e)

         except Exception:
             logger.exception("[rvs] Audio-Verarbeitung fehlgeschlagen")
@@ -501,7 +501,9 @@
         }
         if (msg.type === 'rvs_chat') {
           const p = msg.msg.payload || {};
-          addChat('received', p.text || '?', `via RVS (${p.sender || '?'})`);
+          const sender = p.sender || '?';
+          const chatType = (sender === 'aria') ? 'received' : 'sent';
+          addChat(chatType, p.text || '?', `via RVS (${sender})`);
           return;
         }
         if (msg.type === 'proxy_result') {
@@ -338,6 +338,14 @@ function handleGatewayMessage(msg) {
     log("info", "gateway", `ANTWORT: "${text.slice(0, 200)}"`);
     if (pipelineActive) pipelineEnd(true, `"${text.slice(0, 120)}"`);
     broadcast({ type: "chat_final", text, payload });
+    // Forward the answer to RVS as well → the app receives ARIA's answers
+    if (rvsWs && rvsWs.readyState === WebSocket.OPEN && text) {
+      rvsWs.send(JSON.stringify({
+        type: "chat",
+        payload: { text, sender: "aria" },
+        timestamp: Date.now(),
+      }));
+    }
     return;
   }

@@ -58,6 +58,7 @@ services:
       - ./aria-data/ssh:/home/node/.ssh            # SSH keys for VM access
       - /tmp/.X11-unix:/tmp/.X11-unix
       - /var/run/docker.sock:/var/run/docker.sock  # manage the VM from inside
+      - aria-shared:/shared                        # shared volume for file exchange (Bridge <> Core)
     restart: unless-stopped
     networks:
       - aria-net
@@ -72,6 +73,7 @@ services:
     volumes:
       - ./aria-data/voices:/voices:ro              # TTS voices
       - ./aria-data/config/aria.env:/config/aria.env
+      - aria-shared:/shared                        # shared volume for file exchange (Bridge <> Core)
       # Audio access
       - /run/user/1000/pulse:/run/user/1000/pulse
       - /dev/snd:/dev/snd
@@ -110,6 +112,7 @@ services:
 volumes:
   openclaw-config:   # persists ~/.openclaw (model, auth, sessions)
   claude-config:     # persists ~/.claude (permissions, settings)
+  aria-shared:       # file exchange between Bridge and Core

 networks:
   aria-net: