Implement REST API endpoints for managing LLM provider instances.
Changes:
- Created DTOs for provider CRUD operations (CreateLlmProviderDto, UpdateLlmProviderDto, LlmProviderResponseDto)
- Implemented LlmProviderAdminController with full CRUD endpoints:
  - GET /llm/admin/providers - List all providers
  - GET /llm/admin/providers/:id - Get provider details
  - POST /llm/admin/providers - Create a new provider
  - PATCH /llm/admin/providers/:id - Update a provider
  - DELETE /llm/admin/providers/:id - Delete a provider
  - POST /llm/admin/providers/:id/test - Test the provider connection
  - POST /llm/admin/reload - Reload provider configuration from the database
- Updated llm-manager.service.ts to support OpenAI and Claude providers
- Added comprehensive test suite with 97.95% coverage
- Ensured proper validation, error handling, and type safety
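The multi-provider support could be dispatched roughly along these lines (a plain-TypeScript sketch only; the `ProviderConfig` shape and `createClient` name are illustrative assumptions, not the actual llm-manager.service.ts code):

```typescript
// Illustrative sketch; the real service code may differ.
// Provider kinds named in the commit.
type ProviderType = "openai" | "claude";

// Hypothetical shape of a stored provider row (field names are assumptions).
interface ProviderConfig {
  id: string;
  type: ProviderType;
  apiKey: string;
  baseUrl?: string;
}

// Minimal client description both providers satisfy in this sketch.
interface LlmClient {
  providerType: ProviderType;
  chatEndpoint: string;
}

// Dispatch on the provider type; unknown types fail fast, which is
// one way to realize "proper validation and error handling".
function createClient(config: ProviderConfig): LlmClient {
  switch (config.type) {
    case "openai":
      return {
        providerType: "openai",
        chatEndpoint:
          (config.baseUrl ?? "https://api.openai.com/v1") + "/chat/completions",
      };
    case "claude":
      return {
        providerType: "claude",
        chatEndpoint:
          (config.baseUrl ?? "https://api.anthropic.com/v1") + "/messages",
      };
    default: {
      // Exhaustiveness check: adding a new ProviderType without
      // handling it here becomes a compile-time error.
      const unknown: never = config.type;
      throw new Error(`Unsupported provider type: ${unknown}`);
    }
  }
}
```

The `never`-typed default branch is a common TypeScript idiom for keeping such a switch in sync with the union type.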
All tests and pre-commit hooks pass.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add Ollama client library (ollama npm package)
- Create LlmService for chat completion and embeddings
- Support streaming responses via Server-Sent Events
- Add configuration via env vars (OLLAMA_HOST, OLLAMA_TIMEOUT)
- Create endpoints: GET /llm/health, GET /llm/models, POST /llm/chat, POST /llm/embed
- Replace old OllamaModule with new LlmModule
- Add comprehensive tests with >85% coverage
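Reading the env vars named above might look like this (a sketch; the defaults and the `loadOllamaConfig` helper are assumptions, though `11434` is Ollama's standard port):

```typescript
// Illustrative sketch of env-var configuration for the Ollama client.
// Default values here are assumptions, not the project's actual settings.
interface OllamaConfig {
  host: string;
  timeoutMs: number;
}

function loadOllamaConfig(env: Record<string, string | undefined>): OllamaConfig {
  const timeout = Number(env.OLLAMA_TIMEOUT ?? "30000");
  // Reject non-numeric or non-positive timeouts early rather than
  // letting a NaN propagate into the HTTP client.
  if (!Number.isFinite(timeout) || timeout <= 0) {
    throw new Error(`Invalid OLLAMA_TIMEOUT: ${env.OLLAMA_TIMEOUT}`);
  }
  return {
    host: env.OLLAMA_HOST ?? "http://localhost:11434",
    timeoutMs: timeout,
  };
}
```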
Closes #21
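The Server-Sent Events framing used for streaming responses can be sketched as follows (plain TypeScript; the chunk shape, helper names, and `[DONE]` sentinel are illustrative assumptions, not the actual LlmService code):

```typescript
// Illustrative sketch of SSE framing for streamed chat chunks.
// Field names and the sentinel are assumptions, not the real service code.
interface ChatChunk {
  model: string;
  content: string;
  done: boolean;
}

// Per the SSE format, each event is a "data:" line followed by a blank
// line; clients such as EventSource split events on the double newline.
function toSseFrame(chunk: ChatChunk): string {
  return `data: ${JSON.stringify(chunk)}\n\n`;
}

// A stream of chunks becomes a concatenation of frames, here terminated
// by a sentinel frame so the client knows the stream is complete.
function streamToSse(chunks: ChatChunk[]): string {
  return chunks.map(toSseFrame).join("") + "data: [DONE]\n\n";
}
```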