Chat UI to Backend Integration - Completion Report
Overview
Successfully wired the migrated Chat UI components to the Mosaic Stack backend APIs, implementing full conversation persistence, real-time updates, and authentication.
Changes Made
1. API Client Layer
Created apps/web/src/lib/api/chat.ts
- Purpose: Client for LLM chat interactions
- Endpoints: POST /api/llm/chat
- Features:
- Type-safe request/response interfaces
- Non-streaming chat message sending
- Placeholder for future streaming support
- TypeScript: Strict typing, no any types
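A minimal sketch of what this client might look like. Only the sendChatMessage name, the POST /api/llm/chat endpoint, and cookie-based auth come from this report; the ChatMessageInput/ChatResponse shapes and the NEXT_PUBLIC_API_URL fallback are assumptions for illustration:

```typescript
// Hypothetical sketch of apps/web/src/lib/api/chat.ts (shapes assumed).
export interface ChatMessageInput {
  role: "user" | "assistant" | "system";
  content: string;
}

export interface ChatResponse {
  content: string;
  model?: string;
  provider?: string;
}

const API_URL = process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:3001";

// Non-streaming send: POSTs the message history and resolves with the full reply.
export async function sendChatMessage(
  messages: ChatMessageInput[],
): Promise<ChatResponse> {
  const res = await fetch(`${API_URL}/api/llm/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include", // cookie-based auth
    body: JSON.stringify({ messages }),
  });
  if (!res.ok) {
    throw new Error(`Chat request failed: ${res.status}`);
  }
  return (await res.json()) as ChatResponse;
}
```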
Created apps/web/src/lib/api/ideas.ts
- Purpose: Client for conversation persistence via Ideas API
- Endpoints:
- GET /api/ideas - query conversations
- POST /api/ideas - create new idea/conversation
- POST /api/ideas/capture - quick capture
- GET /api/ideas/:id - get single conversation
- PATCH /api/ideas/:id - update conversation
- Features:
- Full CRUD operations for conversations
- Helper functions for conversation-specific operations
- Type-safe DTOs matching backend Prisma schema
- TypeScript: Strict typing, explicit return types
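A hedged sketch of the conversation helpers. The function names (getConversations, createConversation, updateConversation), the endpoints, and the category/tags values come from this report; the Idea shape and the shared request helper are assumptions:

```typescript
// Hypothetical sketch of apps/web/src/lib/api/ideas.ts (Idea shape assumed).
export interface Idea {
  id: string;
  title: string;
  content: string; // JSON-serialized Message[] for conversations
  category: string;
  tags: string[];
  projectId: string | null;
  updatedAt: string;
}

const API_URL = process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:3001";

// Shared fetch wrapper: cookie auth, JSON body, error on non-2xx.
async function request<T>(path: string, init?: RequestInit): Promise<T> {
  const res = await fetch(`${API_URL}${path}`, {
    credentials: "include",
    headers: { "Content-Type": "application/json" },
    ...init,
  });
  if (!res.ok) throw new Error(`Ideas API error: ${res.status}`);
  return (await res.json()) as T;
}

// GET /api/ideas filtered to chat conversations.
export function getConversations(): Promise<Idea[]> {
  return request<Idea[]>("/api/ideas?category=conversation");
}

// POST /api/ideas to create a new conversation.
export function createConversation(title: string, content: string): Promise<Idea> {
  return request<Idea>("/api/ideas", {
    method: "POST",
    body: JSON.stringify({ title, content, category: "conversation", tags: ["chat"] }),
  });
}

// PATCH /api/ideas/:id to update an existing conversation.
export function updateConversation(id: string, content: string): Promise<Idea> {
  return request<Idea>(`/api/ideas/${id}`, {
    method: "PATCH",
    body: JSON.stringify({ content }),
  });
}
```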
Created apps/web/src/lib/api/index.ts
- Central export point for all API client modules
- Clean re-export pattern for library consumers
2. Custom Hook - useChat
Created apps/web/src/hooks/useChat.ts
- Purpose: Stateful hook managing chat conversations end-to-end
- Features:
- Message state management
- LLM API integration (via /api/llm/chat)
- Automatic conversation persistence (via /api/ideas)
- Loading states and error handling
- Conversation loading and creation
- Automatic title generation from first message
- Message serialization/deserialization
- Type Safety:
- Explicit Message interface
- No any types
- Proper error handling with type narrowing
- Integration:
- Calls sendChatMessage() for LLM responses
- Calls createConversation() and updateConversation() for persistence
- Stores full message history as JSON in the idea.content field
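The automatic title generation mentioned above could be as simple as the following pure helper (the function name, truncation length, and fallback text are assumptions, not taken from the actual hook):

```typescript
// Hypothetical title generator: derive a conversation title from the
// first user message, collapsing whitespace and truncating long input.
export function generateTitle(firstMessage: string, maxLen = 50): string {
  const collapsed = firstMessage.replace(/\s+/g, " ").trim();
  if (collapsed.length === 0) return "New Conversation";
  if (collapsed.length <= maxLen) return collapsed;
  return `${collapsed.slice(0, maxLen - 1).trimEnd()}…`;
}
```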
3. Updated Components
apps/web/src/components/chat/Chat.tsx
Before: Placeholder implementation with mock data
After: Fully integrated with backend
- Uses the useChat hook for state management
- Uses useAuth for authentication
- Uses useWebSocket for real-time connection status
- Removed all placeholder comments and TODOs
- Implemented:
- Real message sending via LLM API
- Conversation persistence on every message
- Loading quips during LLM requests
- Error handling with user-friendly messages
- Connection status indicator
- Keyboard shortcuts (Ctrl+/ to focus input)
apps/web/src/components/chat/ConversationSidebar.tsx
Before: Placeholder data, no backend integration
After: Fetches conversations from backend
- Fetches conversations via the getConversations() API
- Displays conversation list with titles, timestamps, message counts
- Search/filter functionality
- Loading and error states
- Real-time refresh capability via imperative ref
- Maps Ideas to ConversationSummary format
- Parses message count from stored JSON
apps/web/src/components/chat/MessageList.tsx
- Updated import to use the Message type from the useChat hook
- No functional changes (already properly implemented)
apps/web/src/components/chat/index.ts
- Updated exports to reference Message type from hook
- Maintains clean component export API
apps/web/src/app/chat/page.tsx
- Updated handleSelectConversation to actually load conversations
- Integrated with the Chat component's loadConversation() method
4. Authentication Integration
- Uses the existing useAuth() hook from @/lib/auth/auth-context
- Uses the existing authClient from @/lib/auth-client.ts
- API client uses credentials: 'include' for cookie-based auth
- Backend automatically applies workspaceId from the session (no need to pass it explicitly)
5. WebSocket Integration
- Connected the useWebSocket hook in the Chat component
- Displays a connection status indicator when disconnected
- Ready for future real-time chat events
- Uses existing WebSocket gateway infrastructure
API Flow
Sending a Message
User types message
↓
Chat.tsx → useChat.sendMessage()
↓
useChat hook:
1. Adds user message to state (instant UI update)
2. Calls sendChatMessage() → POST /api/llm/chat
3. Receives assistant response
4. Adds assistant message to state
5. Generates title (if first message)
6. Calls saveConversation():
- If new: createConversation() → POST /api/ideas
- If existing: updateConversation() → PATCH /api/ideas/:id
7. Updates conversationId state
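The steps above can be sketched as a single pure async function, outside React. The types are simplified, and sendChat/save stand in for the real sendChatMessage and createConversation/updateConversation clients described in this report:

```typescript
// Pure sketch of the send-and-persist sequence (simplified types).
interface Message {
  id: string;
  role: "user" | "assistant";
  content: string;
  createdAt: string;
}

type SendFn = (messages: Message[]) => Promise<Message>;
// save(null, …) creates a conversation, save(id, …) updates one; returns the id.
type SaveFn = (id: string | null, content: string, title?: string) => Promise<string>;

export async function sendAndPersist(
  prior: Message[],
  userText: string,
  conversationId: string | null,
  sendChat: SendFn,
  save: SaveFn,
): Promise<{ messages: Message[]; conversationId: string }> {
  const userMsg: Message = {
    id: Math.random().toString(36).slice(2),
    role: "user",
    content: userText,
    createdAt: new Date().toISOString(),
  };
  const withUser = [...prior, userMsg];                 // 1. instant UI update
  const assistant = await sendChat(withUser);           // 2-3. LLM round trip
  const messages = [...withUser, assistant];            // 4. append assistant reply
  const title = prior.length === 0 ? userText.slice(0, 50) : undefined; // 5. first-message title
  const id = await save(conversationId, JSON.stringify(messages), title); // 6. create or update
  return { messages, conversationId: id };              // 7. updated conversationId
}
```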
Loading a Conversation
User clicks conversation in sidebar
↓
ConversationSidebar → onSelectConversation(id)
↓
ChatPage → chatRef.current.loadConversation(id)
↓
Chat → useChat.loadConversation(id)
↓
useChat hook:
1. Calls getIdea(id) → GET /api/ideas/:id
2. Deserializes JSON from idea.content
3. Sets messages state
4. Sets conversationId and title
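Step 2 of this flow (deserializing the stored JSON) might look like the following sketch; the function name and the fallback-to-empty behavior are assumptions:

```typescript
// Hypothetical deserializer for a stored conversation (simplified shapes).
interface StoredIdea {
  id: string;
  title: string;
  content: string; // JSON-serialized message array
}

interface Message {
  id: string;
  role: string;
  content: string;
  createdAt: string;
}

export function deserializeConversation(
  idea: StoredIdea,
): { id: string; title: string; messages: Message[] } {
  let messages: Message[] = [];
  try {
    const parsed: unknown = JSON.parse(idea.content);
    if (Array.isArray(parsed)) messages = parsed as Message[];
  } catch {
    // Malformed content degrades to an empty conversation rather than crashing the UI.
  }
  return { id: idea.id, title: idea.title, messages };
}
```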
Fetching Conversation List
ConversationSidebar mounts
↓
useEffect → fetchConversations()
↓
Calls getConversations() → GET /api/ideas?category=conversation
↓
Maps Idea[] to ConversationSummary[]
↓
Parses message count from JSON content
↓
Updates conversations state
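The Idea-to-ConversationSummary mapping above, including the message count parsed from JSON content, could be sketched as (field names on ConversationSummary are assumed):

```typescript
// Hypothetical sidebar mapping: Idea → ConversationSummary (shapes assumed).
interface Idea {
  id: string;
  title: string;
  content: string;
  updatedAt: string;
}

interface ConversationSummary {
  id: string;
  title: string;
  updatedAt: string;
  messageCount: number;
}

export function toSummary(idea: Idea): ConversationSummary {
  let messageCount = 0;
  try {
    const parsed: unknown = JSON.parse(idea.content);
    if (Array.isArray(parsed)) messageCount = parsed.length;
  } catch {
    // Unparseable content is reported as zero messages.
  }
  return {
    id: idea.id,
    title: idea.title,
    updatedAt: idea.updatedAt,
    messageCount,
  };
}
```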
Data Model
Message Storage
Conversations are stored as Ideas with:
- category: "conversation"
- tags: ["chat"]
- content: JSON.stringify(Message[]) - full message history
- title: string - auto-generated from first user message
- projectId: string | null - optional project association
Message Format
interface Message {
id: string;
role: "user" | "assistant" | "system";
content: string;
thinking?: string; // Chain of thought (for thinking models)
createdAt: string;
model?: string; // LLM model used
provider?: string; // LLM provider (ollama, etc.)
promptTokens?: number;
completionTokens?: number;
totalTokens?: number;
}
Type Safety Compliance
All code follows ~/.claude/agent-guides/typescript.md:
✅ NO any types - All functions explicitly typed
✅ Explicit return types - All exported functions have return types
✅ Proper error handling - Error type narrowing (unknown → Error)
✅ Interface definitions - All DTOs and props have interfaces
✅ Strict null checking - All nullable types properly handled
✅ Type imports - Using import type for type-only imports
✅ Clean dependencies - No circular imports
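The unknown-to-Error narrowing called out above typically reduces to a small helper like this (the helper name is illustrative, not taken from the codebase):

```typescript
// Narrow an unknown caught value to a proper Error instance.
export function toError(err: unknown): Error {
  if (err instanceof Error) return err;
  return new Error(typeof err === "string" ? err : JSON.stringify(err));
}

// Typical usage in a catch block:
//   try { await sendMessage(text); }
//   catch (err) { setError(toError(err).message); }
```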
Testing Recommendations
Manual Testing Checklist
- Authentication: Log in, verify chat loads
- New Conversation: Click "New Conversation", send message
- Message Sending: Send message, verify LLM response
- Persistence: Refresh page, verify conversation still exists
- Load Conversation: Click conversation in sidebar, verify messages load
- Search: Search conversations, verify filtering works
- Error Handling: Disconnect API, verify error messages display
- Loading States: Verify loading indicators during API calls
- WebSocket Status: Disconnect/reconnect, verify status indicator
Integration Tests Needed
// apps/web/src/hooks/__tests__/useChat.test.ts
- Test message sending
- Test conversation persistence
- Test conversation loading
- Test error handling
- Test title generation
// apps/web/src/lib/api/__tests__/chat.test.ts
- Test API request formatting
- Test response parsing
- Test error handling
// apps/web/src/lib/api/__tests__/ideas.test.ts
- Test CRUD operations
- Test query parameter serialization
- Test conversation helpers
Known Limitations
- Streaming Not Implemented: Chat messages are non-streaming (blocks until full response)
- Future: Implement SSE streaming for progressive response rendering
- Workspace ID Inference: Frontend doesn't explicitly pass workspaceId
- Backend infers from user session
- Works but could be more explicit
- No Message Pagination: Loads full conversation history
- Future: Paginate messages for very long conversations
- No Conversation Deletion: UI doesn't support deleting conversations
- Future: Add delete button with confirmation
- No Model Selection: Hardcoded to "llama3.2"
- Future: Add model picker in UI
- No Real-time Collaboration: WebSocket connected but no chat-specific events
- Future: Broadcast typing indicators, new messages
Environment Variables
Required in .env (already configured):
NEXT_PUBLIC_API_URL=http://localhost:3001 # Backend API URL
Dependencies
No new dependencies added. Uses existing:
- better-auth/react - authentication
- socket.io-client - WebSocket
- React hooks - state management
File Structure
apps/web/src/
├── app/chat/
│ └── page.tsx (updated)
├── components/chat/
│ ├── Chat.tsx (updated)
│ ├── ConversationSidebar.tsx (updated)
│ ├── MessageList.tsx (updated)
│ └── index.ts (updated)
├── hooks/
│ ├── useChat.ts (new)
│ └── useWebSocket.ts (existing)
├── lib/
│ ├── api/
│ │ ├── chat.ts (new)
│ │ ├── ideas.ts (new)
│ │ ├── index.ts (new)
│ │ └── client.ts (existing)
│ ├── auth/
│ │ └── auth-context.tsx (existing)
│ └── auth-client.ts (existing)
Next Steps
Immediate (Post-Merge)
- Test Authentication Flow
  - Verify session handling
  - Test expired session behavior
- Test Conversation Persistence
  - Create conversations
  - Verify database storage
  - Load conversations after refresh
- Monitor Performance
  - Check LLM response times
  - Monitor API latency
  - Optimize if needed
Future Enhancements
- Streaming Responses
  - Implement Server-Sent Events
  - Progressive message rendering
  - Cancel in-flight requests
- Advanced Features
  - Model selection UI
  - Temperature/parameter controls
  - Conversation export (JSON, Markdown)
  - Conversation sharing
- Real-time Collaboration
  - Typing indicators
  - Live message updates
  - Presence indicators
- Performance Optimizations
  - Message pagination
  - Conversation caching
  - Lazy loading
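For the streaming enhancement listed above, a client-side SSE reader might look like the following sketch. Nothing here exists yet: the /api/llm/chat/stream endpoint name and the plain `data:` frame format are assumptions, and a real implementation would depend on how the backend formats its event stream:

```typescript
// Hedged sketch of a future SSE streaming client (endpoint and frame format assumed).
export async function streamChat(
  body: unknown,
  onToken: (token: string) => void,
  signal?: AbortSignal,
): Promise<void> {
  const res = await fetch("/api/llm/chat/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",
    body: JSON.stringify(body),
    signal, // enables cancelling in-flight requests
  });
  if (!res.ok || !res.body) throw new Error(`Stream failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value) buffer += decoder.decode(value, { stream: true });
    // SSE frames are separated by a blank line; each data line carries a token.
    const frames = buffer.split("\n\n");
    buffer = frames.pop() ?? "";
    for (const frame of frames) {
      for (const line of frame.split("\n")) {
        if (line.startsWith("data: ")) onToken(line.slice(6));
      }
    }
  }
}
```

Pairing the AbortSignal with an AbortController in the hook would cover the "cancel in-flight requests" item as well.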
Conclusion
The Chat UI is now fully integrated with the Mosaic Stack backend:
✅ LLM chat via /api/llm/chat
✅ Conversation persistence via /api/ideas
✅ WebSocket connection for real-time updates
✅ Authentication via better-auth
✅ Clean TypeScript (no errors)
✅ Type-safe API clients
✅ Stateful React hooks
✅ Loading and error states
✅ User-friendly UX
The chat feature is ready for QA testing and can be merged to develop.