feat: multi-provider support — Anthropic + Ollama (P2-002)

Add ProviderService, which wraps the Pi SDK's ModelRegistry to enable
multi-provider LLM support. Built-in providers (Anthropic, OpenAI,
Google, xAI, etc.) are auto-discovered; Ollama is registered via the
OLLAMA_BASE_URL env var; additional custom providers are read from the
MOSAIC_CUSTOM_PROVIDERS JSON env var.
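As a rough sketch of the env-var handling described above (the names `CustomProvider` and `loadProviderEnv` are illustrative, not the actual ProviderService API):

```typescript
// Hypothetical shape of one entry in the MOSAIC_CUSTOM_PROVIDERS JSON array.
interface CustomProvider {
  name: string;
  baseUrl: string;
  apiKeyEnv?: string; // name of the env var holding the key, if any
}

interface ProviderEnvConfig {
  ollamaBaseUrl?: string;
  customProviders: CustomProvider[];
}

// Read both env vars once at startup; malformed JSON is ignored rather
// than allowed to crash the process.
function loadProviderEnv(env: Record<string, string | undefined>): ProviderEnvConfig {
  let customProviders: CustomProvider[] = [];
  const raw = env.MOSAIC_CUSTOM_PROVIDERS;
  if (raw) {
    try {
      customProviders = JSON.parse(raw) as CustomProvider[];
    } catch {
      customProviders = [];
    }
  }
  return { ollamaBaseUrl: env.OLLAMA_BASE_URL, customProviders };
}
```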

- ProviderService: wraps ModelRegistry, manages provider lifecycle
- ProvidersController: GET /api/providers, GET /api/providers/models
- AgentService: accepts provider/model params on session creation
- ChatGateway: passes optional provider/modelId from chat messages
- @mosaic/types: new provider/model type definitions
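
A minimal sketch of what a consumer of GET /api/providers/models might do with the response; the `ModelInfo` shape is assumed here and may differ from the actual @mosaic/types definitions:

```typescript
// Assumed response item shape for GET /api/providers/models.
interface ModelInfo {
  id: string;
  provider: string;
  displayName?: string;
}

// Group the flat model list by provider, e.g. to render a model picker
// with one section per provider.
function groupModelsByProvider(models: ModelInfo[]): Map<string, ModelInfo[]> {
  const grouped = new Map<string, ModelInfo[]>();
  for (const m of models) {
    const bucket = grouped.get(m.provider);
    if (bucket) {
      bucket.push(m);
    } else {
      grouped.set(m.provider, [m]);
    }
  }
  return grouped;
}
```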

Closes #20

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 22:10:18 -05:00
parent aa9ee75a2a
commit 94d6624c01
9 changed files with 287 additions and 11 deletions


@@ -17,6 +17,8 @@ import { v4 as uuid } from 'uuid';
 interface ChatMessage {
   conversationId?: string;
   content: string;
+  provider?: string;
+  modelId?: string;
 }
 @WebSocketGateway({
@@ -65,7 +67,10 @@ export class ChatGateway implements OnGatewayInit, OnGatewayConnection, OnGatewayDisconnect {
     try {
       let agentSession = this.agentService.getSession(conversationId);
       if (!agentSession) {
-        agentSession = await this.agentService.createSession(conversationId);
+        agentSession = await this.agentService.createSession(conversationId, {
+          provider: data.provider,
+          modelId: data.modelId,
+        });
       }
     } catch (err) {
       this.logger.error(
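
A client-side sketch of the payload the updated ChatGateway accepts; the field names mirror the `ChatMessage` interface in the diff, while the `buildChatMessage` helper itself is illustrative, not part of this commit:

```typescript
interface ChatMessage {
  conversationId?: string;
  content: string;
  provider?: string; // optional: routed to AgentService.createSession
  modelId?: string;  // optional: specific model within that provider
}

// Build a chat payload; omitting provider/modelId falls back to the
// gateway's defaults, matching the optional params in the diff above.
function buildChatMessage(
  content: string,
  opts: { conversationId?: string; provider?: string; modelId?: string } = {},
): ChatMessage {
  return { content, ...opts };
}
```

A message built with `buildChatMessage("hello", { provider: "ollama", modelId: "llama3" })` would cause the gateway to create a session bound to that provider/model pair on first contact; later messages in the same conversation reuse the existing session.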