Wire the chat interface in ~/src/mosaic-stack to the backend API.

## Context

The chat page (/chat) exists with full UI components but is NOT connected to the backend. Everything is stubbed. The old jarvis codebase had a working implementation — use it as the reference.

## Architecture Decision

Use `/api/chat/stream` (chat-proxy → OpenClaw) for LLM responses. But ALSO wire conversation persistence using the existing brain + conversation-archive endpoints.

## Backend Endpoints Available

All endpoints are authenticated (cookie-based auth).

### Chat/LLM:

- POST /api/chat/stream — SSE stream proxied from OpenClaw
  - Request body: `{ messages: [{role, content}], model?: string }`
  - Response: SSE stream (pass-through from OpenClaw — handle both the OpenAI format and the custom format)
  - OpenAI streaming format: `data: {"choices":[{"delta":{"content":"token"},"finish_reason":null}]}\n\n`
  - Final: `data: [DONE]\n\n` OR `data: {"choices":[{"finish_reason":"stop"}]}\n\n`
  - Error: `event: error\ndata: {"error":"message"}\n\n`

### Conversation Archive (for persistence):

- GET /api/conversation-archives — list the user's archived conversations
- POST /api/conversation-archives — create/save a conversation
  - Body: `{ title: string, summary?: string, messages: [{role, content, timestamp?}], metadata?: object }`
- GET /api/conversation-archives/:id — get a specific conversation
- DELETE /api/conversation-archives/:id — delete a conversation

### Brain (for context):

- POST /api/brain/query — query tasks, projects, events
- GET /api/brain/context — get summary context

## What to Change

### 1. apps/web/src/lib/api/chat.ts

Rewrite `streamChatMessage` to:

- Call `POST /api/chat/stream` (not /api/llm/chat)
- Use `credentials: "include"` (cookie auth)
- Add the X-Workspace-Id header from the workspace context
- Handle the OpenAI SSE format: parse `data.choices[0].delta.content` for tokens
- Also handle `data: [DONE]` as completion
- Keep the same callback signature: `onChunk(token)`, `onComplete()`, `onError(err)`

Add new functions:

- `saveConversation(title, messages)` → POST /api/conversation-archives
- `listConversations()` → GET /api/conversation-archives
- `getConversation(id)` → GET /api/conversation-archives/:id
- `deleteConversation(id)` → DELETE /api/conversation-archives/:id

### 2. apps/web/src/hooks/useChat.ts

- Fix `loadConversation`: load from the conversation-archives API instead of ideas/brain
- Fix `sendMessage`: use `streamChatMessage` correctly with the messages array
- Add `saveCurrentConversation()`: save to conversation-archives after each assistant reply
- Wire up auto-save after each successful assistant response

### 3. apps/web/src/components/chat/ConversationSidebar.tsx

Check whether it loads from the conversations API — if it is stubbed or uses the wrong endpoint, fix it to use `listConversations()`.

## Key Reference: Old Jarvis Implementation

The old jarvis at ~/src/jarvis-old/apps/web/src had:

- components/Chat.tsx (1616 lines — working SSE stream handler)
- lib/api.ts — working API client
- hooks/useConversations.ts — conversation list management

Read these for reference on SSE parsing, error handling, and conversation state management patterns.

## Constraints

- Keep auth cookie-based (`credentials: "include"`) — NOT a Bearer token for the web client
- The X-Workspace-Id header must be included on all API calls
- Use the `useWorkspaceId()` hook to get the workspace ID
- Keep TypeScript strict — no `any` unless unavoidable
- Match the existing chat.ts style (callbacks, not promises, for streaming)

## Process

1. `git checkout main && git pull --ff-only origin main`
2. `git checkout -b feat/wire-chat-interface`
3. Read the existing files: apps/web/src/lib/api/chat.ts, apps/web/src/hooks/useChat.ts, apps/web/src/components/chat/*.tsx
4. Read the reference: ~/src/jarvis-old/apps/web/src/components/Chat.tsx (lines 400-550 for SSE handling)
5. Implement the changes
6. Run: `pnpm turbo lint typecheck --filter=@mosaic/web`
7. Commit with --no-verify: "feat(web): wire chat interface to /api/chat/stream and conversation-archives"
8. Push and open a PR: `~/.config/mosaic/tools/git/pr-create.sh -t "feat(web): wire chat interface to backend" -b "Wires the stubbed chat page to /api/chat/stream (OpenClaw proxy) for LLM responses and /api/conversation-archives for persistence. References old jarvis implementation for SSE parsing patterns."`

When done: `openclaw system event --text "Done: chat wiring PR ready" --mode now`
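For orientation, here is a minimal sketch of what the rewritten `streamChatMessage` could look like, combining the SSE formats listed under "Chat/LLM" with the auth constraints above. The `ChatMessage` type, the `workspaceId` parameter (the real code would get it from `useWorkspaceId()` in the calling hook), and the `parseSseData` helper are assumptions for illustration, not names from the existing codebase.

```typescript
type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

type SseEvent =
  | { kind: "token"; token: string }
  | { kind: "done" }
  | { kind: "skip" };

// Parse one `data: ...` payload: handles the OpenAI delta format, the
// `finish_reason: "stop"` terminator, and the `[DONE]` sentinel.
export function parseSseData(payload: string): SseEvent {
  if (payload === "[DONE]") return { kind: "done" };
  try {
    const choice = JSON.parse(payload).choices?.[0];
    if (choice?.finish_reason === "stop") return { kind: "done" };
    const token = choice?.delta?.content;
    return typeof token === "string" ? { kind: "token", token } : { kind: "skip" };
  } catch {
    return { kind: "skip" }; // ignore non-JSON keep-alives
  }
}

export async function streamChatMessage(
  messages: ChatMessage[],
  workspaceId: string,
  onChunk: (token: string) => void,
  onComplete: () => void,
  onError: (err: Error) => void,
): Promise<void> {
  try {
    const res = await fetch("/api/chat/stream", {
      method: "POST",
      credentials: "include", // cookie auth, per the constraints
      headers: {
        "Content-Type": "application/json",
        "X-Workspace-Id": workspaceId,
      },
      body: JSON.stringify({ messages }),
    });
    if (!res.ok || !res.body) throw new Error(`stream failed: ${res.status}`);

    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = "";
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      const events = buffer.split("\n\n"); // SSE events end with a blank line
      buffer = events.pop() ?? "";         // keep the incomplete tail
      for (const event of events) {
        if (event.startsWith("event: error")) {
          const data = event.split("\n").find((l) => l.startsWith("data: "));
          throw new Error(data ? JSON.parse(data.slice(6)).error : "stream error");
        }
        for (const line of event.split("\n")) {
          if (!line.startsWith("data: ")) continue;
          const parsed = parseSseData(line.slice(6));
          if (parsed.kind === "token") onChunk(parsed.token);
          if (parsed.kind === "done") { onComplete(); return; }
        }
      }
    }
    onComplete(); // stream closed without an explicit [DONE]
  } catch (err) {
    onError(err instanceof Error ? err : new Error(String(err)));
  }
}
```

Keeping `parseSseData` as a pure function makes the format handling unit-testable without mocking `fetch`, and mirrors the callback-based style the constraints require.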
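The conversation-archive functions from section 1 can be thin wrappers over a shared fetch helper so the cookie auth and X-Workspace-Id rules hold everywhere. This is a sketch under the same assumptions as above (workspaceId passed in by the caller); `deriveTitle` is a purely hypothetical helper for the auto-save flow in `useChat.ts`, not an existing API.

```typescript
type ChatMessage = { role: string; content: string; timestamp?: string };

// Hypothetical helper for auto-save: derive an archive title from the
// first user message, truncated so the sidebar stays readable.
export function deriveTitle(messages: ChatMessage[], maxLen = 60): string {
  const first = messages.find((m) => m.role === "user")?.content ?? "Untitled chat";
  return first.length > maxLen ? first.slice(0, maxLen - 1) + "…" : first;
}

// Shared wrapper enforcing cookie auth + workspace header on every call.
async function archiveFetch(
  path: string,
  workspaceId: string,
  init: RequestInit = {},
): Promise<Response> {
  const res = await fetch(path, {
    ...init,
    credentials: "include",
    headers: {
      "Content-Type": "application/json",
      "X-Workspace-Id": workspaceId,
    },
  });
  if (!res.ok) throw new Error(`${path} failed: ${res.status}`);
  return res;
}

export const saveConversation = (workspaceId: string, title: string, messages: ChatMessage[]) =>
  archiveFetch("/api/conversation-archives", workspaceId, {
    method: "POST",
    body: JSON.stringify({ title, messages }),
  }).then((r) => r.json());

export const listConversations = (workspaceId: string) =>
  archiveFetch("/api/conversation-archives", workspaceId).then((r) => r.json());

export const getConversation = (workspaceId: string, id: string) =>
  archiveFetch(`/api/conversation-archives/${id}`, workspaceId).then((r) => r.json());

export const deleteConversation = (workspaceId: string, id: string) =>
  archiveFetch(`/api/conversation-archives/${id}`, workspaceId, { method: "DELETE" });
```

In the real hook the `workspaceId` argument would come from `useWorkspaceId()`; the exact argument order and return shapes should follow the existing chat.ts conventions rather than this sketch.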