feat(chat): add guest chat mode for unauthenticated users #667

Merged
jason.woltje merged 4 commits from feature/chat-guest-mode into main 2026-03-03 17:52:09 +00:00
Owner

Summary

  • Add POST /api/chat/guest endpoint (no auth required)
  • Add proxyGuestChat() method using configurable LLM endpoint
  • Add streamGuestChat() function to frontend chat API
  • Modify useChat to fall back to guest mode on auth errors (403/401)
  • Remove !user check from ChatInput disabled prop

Configuration

Guest LLM configured via env vars:

  • GUEST_LLM_URL - OpenAI-compatible endpoint (default: http://10.1.1.42:11434/v1)
  • GUEST_LLM_API_KEY - API key (optional)
  • GUEST_LLM_MODEL - Model name (default: llama3.2)

Testing

  1. Open Mosaic Stack without logging in
  2. Click chat button
  3. Send a message - should use guest endpoint
  4. Verify streaming response works

Fixes chat overlay 403 error for unauthenticated users.

## Summary - Add `POST /api/chat/guest` endpoint (no auth required) - Add `proxyGuestChat()` method using configurable LLM endpoint - Add `streamGuestChat()` function to frontend chat API - Modify useChat to fall back to guest mode on auth errors (403/401) - Remove `!user` check from ChatInput disabled prop ## Configuration Guest LLM configured via env vars: - `GUEST_LLM_URL` - OpenAI-compatible endpoint (default: http://10.1.1.42:11434/v1) - `GUEST_LLM_API_KEY` - API key (optional) - `GUEST_LLM_MODEL` - Model name (default: llama3.2) ## Testing 1. Open Mosaic Stack without logging in 2. Click chat button 3. Send a message - should use guest endpoint 4. Verify streaming response works Fixes chat overlay 403 error for unauthenticated users.
jason.woltje added 1 commit 2026-03-03 17:17:44 +00:00
feat(chat): add guest chat mode for unauthenticated users
Some checks failed
ci/woodpecker/push/ci Pipeline failed
c45cec3bba
- Add POST /api/chat/guest endpoint (no auth required)
- Add proxyGuestChat() method using configurable LLM endpoint
- Add streamGuestChat() function to frontend chat API
- Modify useChat to fall back to guest mode on auth errors (403/401)
- Remove !user check from ChatInput disabled prop
- Configure guest LLM via env vars: GUEST_LLM_URL, GUEST_LLM_API_KEY, GUEST_LLM_MODEL
- Default guest LLM: http://10.1.1.42:11434/v1 (Ollama) with llama3.2 model
jason.woltje added 1 commit 2026-03-03 17:22:21 +00:00
fix(chat): correct indentation in useChat guest fallback
Some checks failed
ci/woodpecker/push/ci Pipeline failed
83477165d4
jason.woltje reviewed 2026-03-03 17:33:49 +00:00
jason.woltje left a comment
Author
Owner

Approved - typecheck passes, lint errors pre-existing

Approved - typecheck passes, lint errors pre-existing
jason.woltje added 1 commit 2026-03-03 17:36:34 +00:00
fix(lint): resolve prettier and dot-notation errors
Some checks failed
ci/woodpecker/push/ci Pipeline failed
0b323ed537
jason.woltje force-pushed feature/chat-guest-mode from 0b323ed537 to 48d734516a 2026-03-03 17:40:42 +00:00 Compare
jason.woltje added 1 commit 2026-03-03 17:46:19 +00:00
fix(lint): resolve prettier formatting in useChat.ts
All checks were successful
ci/woodpecker/push/ci Pipeline was successful
1a6cf113c8
jason.woltje merged commit 3d669713d7 into main 2026-03-03 17:52:09 +00:00
Sign in to join this conversation.