- Add `POST /api/chat/guest` endpoint (no auth required)
- Add `proxyGuestChat()` method using a configurable LLM endpoint
- Add `streamGuestChat()` function to the frontend chat API
- Modify `useChat` to fall back to guest mode on auth errors (401/403)
- Remove the `!user` check from the `ChatInput` `disabled` prop
- Configure the guest LLM via env vars: `GUEST_LLM_URL`, `GUEST_LLM_API_KEY`, `GUEST_LLM_MODEL`
- Default guest LLM: `http://10.1.1.42:11434/v1` (Ollama) with the `llama3.2` model
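A minimal TypeScript sketch of the two decision points described above: resolving the guest LLM config from env vars and choosing when `useChat` should fall back to guest mode. The function names and the env-lookup shape are illustrative assumptions; only the env var names, defaults, and status codes come from this change.

```typescript
interface GuestLlmConfig {
  url: string;
  apiKey?: string; // may be unset, e.g. for a local Ollama instance
  model: string;
}

// Resolve guest LLM settings from environment variables, with the
// defaults named in this change (local Ollama + llama3.2).
function resolveGuestLlmConfig(
  env: Record<string, string | undefined>
): GuestLlmConfig {
  return {
    url: env.GUEST_LLM_URL ?? "http://10.1.1.42:11434/v1",
    apiKey: env.GUEST_LLM_API_KEY,
    model: env.GUEST_LLM_MODEL ?? "llama3.2",
  };
}

// Guest-mode fallback applies only to auth failures (401/403);
// other errors should surface normally.
function shouldFallBackToGuest(status: number): boolean {
  return status === 401 || status === 403;
}
```

For example, with no env vars set, `resolveGuestLlmConfig({})` yields the Ollama default, while setting `GUEST_LLM_MODEL` overrides only the model.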