fix(agent): register Ollama directly with api: openai-completions
All checks were successful
ci/woodpecker/push/ci Pipeline was successful
The Pi SDK requires an api field on provider/model registration, and Ollama speaks the OpenAI-compatible completions API. Register the provider directly via registry.registerProvider instead of through the generic wrapper so the correct api type is passed.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@@ -92,15 +92,16 @@ export class ProviderService implements OnModuleInit {
       .map((modelId: string) => modelId.trim())
       .filter(Boolean);
 
-    this.registerCustomProvider({
-      id: 'ollama',
-      name: 'Ollama',
+    this.registry.registerProvider('ollama', {
       baseUrl: `${ollamaUrl}/v1`,
       apiKey: 'ollama',
+      api: 'openai-completions' as never,
       models: modelIds.map((id) => ({
         id,
         name: id,
         reasoning: false,
+        input: ['text'] as ('text' | 'image')[],
+        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
         contextWindow: 8192,
         maxTokens: 4096,
       })),
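To illustrate the shape of the fix, here is a minimal self-contained sketch of a provider registry whose registerProvider(id, config) call requires an api field, with Ollama registered the way the diff does. The ProviderRegistry class, the ApiType union, the model ids, and the ollamaUrl value are all hypothetical stand-ins for illustration; they are not the Pi SDK API.

```typescript
// Hypothetical api types; the diff only confirms 'openai-completions'.
type ApiType = 'openai-completions' | 'anthropic-messages';

interface ModelConfig {
  id: string;
  name: string;
  reasoning: boolean;
  input: ('text' | 'image')[];
  cost: { input: number; output: number; cacheRead: number; cacheWrite: number };
  contextWindow: number;
  maxTokens: number;
}

interface ProviderConfig {
  baseUrl: string;
  apiKey: string;
  api: ApiType; // required field; omitting it was the bug the commit fixes
  models: ModelConfig[];
}

// Illustrative mock of a registry keyed by provider id.
class ProviderRegistry {
  private providers = new Map<string, ProviderConfig>();

  registerProvider(id: string, config: ProviderConfig): void {
    if (!config.api) {
      throw new Error(`provider ${id} is missing the required api field`);
    }
    this.providers.set(id, config);
  }

  get(id: string): ProviderConfig | undefined {
    return this.providers.get(id);
  }
}

const registry = new ProviderRegistry();
const ollamaUrl = 'http://localhost:11434'; // Ollama's default local address

registry.registerProvider('ollama', {
  baseUrl: `${ollamaUrl}/v1`,
  apiKey: 'ollama', // Ollama ignores the key, but OpenAI-style clients require one
  api: 'openai-completions',
  models: ['llama3', 'mistral'].map((id) => ({
    // hypothetical model ids
    id,
    name: id,
    reasoning: false,
    input: ['text'],
    cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 }, // local models are free
    contextWindow: 8192,
    maxTokens: 4096,
  })),
});

console.log(registry.get('ollama')?.api);
```

Registering directly against the registry lets the api discriminator reach the SDK; a generic wrapper that does not forward it would leave the provider unusable, which matches the commit's rationale.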