feat(#125): add Claude (Anthropic) LLM provider
Implement Anthropic Claude provider for Claude Opus, Sonnet, and Haiku models.

Implementation details:
- Created ClaudeProvider class implementing LlmProviderInterface
- Added @anthropic-ai/sdk npm package integration
- Implemented chat completion with streaming support
- Claude-specific message format (system prompt separate from messages)
- Static model list (the Claude API doesn't provide a list-models endpoint)
- Embeddings throw an error since Claude doesn't support native embeddings
- Added OpenTelemetry tracing with the @TraceLlmCall decorator
- 100% statement, function, and line coverage (79% branch coverage)

Tests:
- Created comprehensive test suite with 20 tests
- All tests follow the TDD pattern (written before implementation)
- Tests cover initialization, health checks, chat, streaming, and error handling
- Mocked the Anthropic SDK client for isolated unit testing

Quality checks:
- All tests pass (1131 total tests across the project)
- ESLint passes with no errors
- TypeScript type checking passes
- Follows existing code patterns from the OpenAI and Ollama providers

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
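One implementation detail above deserves illustration: unlike OpenAI-style APIs, Claude's Messages API takes the system prompt as a separate `system` field rather than as a message with role "system". A minimal sketch of that conversion, with hypothetical type and function names (the repository's actual `ClaudeProvider` internals may differ):

```typescript
// Hypothetical message shape; the project's LlmProviderInterface may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ClaudeRequestShape {
  system?: string;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Split system-role messages out into Claude's separate `system` field,
// keeping user/assistant turns in order for the `messages` array.
function toClaudeRequest(messages: ChatMessage[]): ClaudeRequestShape {
  const systemParts: string[] = [];
  const turns: { role: "user" | "assistant"; content: string }[] = [];
  for (const m of messages) {
    if (m.role === "system") {
      systemParts.push(m.content);
    } else {
      turns.push({ role: m.role, content: m.content });
    }
  }
  return {
    system: systemParts.length > 0 ? systemParts.join("\n") : undefined,
    messages: turns,
  };
}
```

The provider would run this conversion before calling the Anthropic SDK, so callers can keep passing a uniform OpenAI-style message list regardless of backend.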
@@ -24,6 +24,7 @@
     "prisma:reset": "prisma migrate reset"
   },
   "dependencies": {
+    "@anthropic-ai/sdk": "^0.72.1",
     "@mosaic/shared": "workspace:*",
     "@nestjs/common": "^11.1.12",
     "@nestjs/core": "^11.1.12",