Port Ollama LLM Provider #123

Closed
opened 2026-01-30 21:28:03 +00:00 by jason.woltje · 0 comments
Owner

Port the Ollama provider from jarvis r1, including its retry/backoff logic.

Objective: Move Ollama-specific code out of LlmService into a dedicated OllamaProvider class.

Tasks:

  • Create ollama.provider.ts implementing base provider interface
  • Port retry logic with exponential backoff
  • Add connection pooling
  • Implement streaming support
  • Add comprehensive error handling
  • Write unit tests
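
A possible shape for the first task, assuming a minimal `LlmProvider` interface from #122 (the interface name, method names, and defaults here are hypothetical sketches, not the final API). Ollama's non-streaming completion endpoint is `POST /api/generate` with `model`, `prompt`, and `stream: false`:

```typescript
// Hypothetical base interface from #122; the real shape may differ.
interface LlmProvider {
  readonly name: string;
  complete(prompt: string): Promise<string>;
}

// Sketch of OllamaProvider against Ollama's /api/generate endpoint.
class OllamaProvider implements LlmProvider {
  readonly name = "ollama";

  constructor(
    private readonly baseUrl = "http://localhost:11434", // Ollama default port
    private readonly model = "llama3",                   // placeholder model name
  ) {}

  async complete(prompt: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: this.model, prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.response; // Ollama returns the completion text in `response`
  }
}
```

The constructor defaults keep the class testable without a running Ollama instance; real configuration would come from whatever config mechanism LlmService already uses.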

Acceptance Criteria:

  • All existing Ollama functionality works
  • Provider implements interface correctly
  • Tests pass with 85%+ coverage
  • No breaking changes to existing API
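
The retry/backoff task could be a small generic helper rather than logic baked into the provider; a minimal sketch (function name and parameters are illustrative, not the jarvis r1 implementation):

```typescript
// Retry an async operation with exponential backoff plus jitter.
// baseDelayMs doubles each attempt: base, 2*base, 4*base, ...
async function withRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (attempt === maxRetries) break; // out of attempts
      const delay = baseDelayMs * 2 ** attempt + Math.random() * baseDelayMs;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastErr;
}
```

Keeping the helper generic means other providers in the epic can reuse it instead of each re-implementing backoff.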

Related: Epic #121, Phase 1 LLM Abstraction
Depends on: #122 (provider interface)
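
For the streaming task: Ollama streams responses as newline-delimited JSON objects (each with a `response` fragment and a `done` flag), so the provider needs a parser that tolerates JSON lines split across network chunks. A sketch, assuming the transport is exposed as an async iterable of string chunks (that adapter is not shown here):

```typescript
// Parse Ollama's NDJSON stream into completion-text fragments.
// Buffers partial lines until a newline completes each JSON object.
async function* parseOllamaStream(
  chunks: AsyncIterable<string>,
): AsyncGenerator<string> {
  let buf = "";
  for await (const chunk of chunks) {
    buf += chunk;
    let idx: number;
    while ((idx = buf.indexOf("\n")) >= 0) {
      const line = buf.slice(0, idx).trim();
      buf = buf.slice(idx + 1);
      if (!line) continue;
      const obj = JSON.parse(line);
      if (obj.response) yield obj.response; // emit the text fragment
      if (obj.done) return;                 // final object ends the stream
    }
  }
}
```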

jason.woltje added the p0, api, phase-1 labels 2026-01-30 21:28:03 +00:00
jason.woltje added this to the M4-LLM (0.0.4) milestone 2026-01-30 23:40:48 +00:00

Reference: mosaic/stack#123