feat(chat): add guest chat mode for unauthenticated users
Some checks failed
ci/woodpecker/push/ci Pipeline failed
- Add POST /api/chat/guest endpoint (no auth required)
- Add proxyGuestChat() method using configurable LLM endpoint
- Add streamGuestChat() function to frontend chat API
- Modify useChat to fall back to guest mode on auth errors (403/401)
- Remove !user check from ChatInput disabled prop
- Configure guest LLM via env vars: GUEST_LLM_URL, GUEST_LLM_API_KEY, GUEST_LLM_MODEL
- Default guest LLM: http://10.1.1.42:11434/v1 (Ollama) with llama3.2 model
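A minimal sketch of how the guest LLM configuration described above might be resolved from environment variables. The function name `resolveGuestLlmConfig` and the `env` parameter shape are illustrative assumptions; only the variable names and defaults come from the commit message.

```typescript
// Hypothetical config resolver for guest chat mode.
// GUEST_LLM_URL / GUEST_LLM_API_KEY / GUEST_LLM_MODEL and their
// defaults are taken from the commit message; everything else is a sketch.
interface GuestLlmConfig {
  url: string;
  apiKey: string;
  model: string;
}

function resolveGuestLlmConfig(
  env: Record<string, string | undefined>,
): GuestLlmConfig {
  return {
    // Default points at a local Ollama instance exposing the
    // OpenAI-compatible /v1 API.
    url: env.GUEST_LLM_URL ?? "http://10.1.1.42:11434/v1",
    // Ollama typically needs no key, so an empty string is assumed here.
    apiKey: env.GUEST_LLM_API_KEY ?? "",
    model: env.GUEST_LLM_MODEL ?? "llama3.2",
  };
}
```

Environment overrides take precedence, so a deployment can point guest traffic at any OpenAI-compatible endpoint without a code change.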
@@ -352,7 +352,7 @@ export const Chat = forwardRef<ChatRef, ChatProps>(function Chat(
   <div className="mx-auto max-w-4xl px-4 py-4 lg:px-8">
     <ChatInput
       onSend={handleSendMessage}
-      disabled={isChatLoading || !user}
+      disabled={isChatLoading}
       inputRef={inputRef}
       isStreaming={isStreaming}
       onStopStreaming={abortStream}
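The diff above only removes the `!user` gate on the input; the actual fallback lives in `useChat`. A hedged sketch of that fallback logic, assuming the authenticated and guest endpoints both return a `Response` — the function names and wrapper are illustrative, not taken from the diff:

```typescript
// Hypothetical sketch of the useChat fallback: try the authenticated
// chat stream first, and if the server answers 401/403, retry through
// the unauthenticated guest endpoint (POST /api/chat/guest).
type StreamFn = (prompt: string) => Promise<Response>;

async function streamWithGuestFallback(
  prompt: string,
  streamChat: StreamFn,      // authenticated chat request
  streamGuestChat: StreamFn, // guest chat request, no auth required
): Promise<Response> {
  const res = await streamChat(prompt);
  // 401 (no session) or 403 (no chat permission) triggers guest mode,
  // matching the auth errors listed in the commit message.
  if (res.status === 401 || res.status === 403) {
    return streamGuestChat(prompt);
  }
  return res;
}
```

Handling the fallback at this layer means the input no longer needs to know whether a user is signed in, which is why the `!user` check can be dropped from `ChatInput`'s `disabled` prop.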