- Add Ollama client library (`ollama` npm package)
- Create LlmService for chat completion and embeddings
- Support streaming responses via Server-Sent Events
- Add configuration via env vars (OLLAMA_HOST, OLLAMA_TIMEOUT)
- Create endpoints: GET /llm/health, GET /llm/models, POST /llm/chat, POST /llm/embed
- Replace old OllamaModule with new LlmModule
- Add comprehensive tests with >85% coverage

Closes #21
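As a rough illustration of the env-var configuration bullet above, the service might resolve OLLAMA_HOST and OLLAMA_TIMEOUT with defaults at startup. This is a hypothetical sketch, not the actual implementation: the `LlmConfig` shape, the `loadLlmConfig` helper, and the 30-second default timeout are assumptions (only the 11434 default port is standard Ollama behavior).

```typescript
// Hypothetical config loader for the LlmModule; names and defaults
// are illustrative assumptions, not taken from the actual code.
interface LlmConfig {
  host: string;
  timeoutMs: number;
}

function loadLlmConfig(env: Record<string, string | undefined>): LlmConfig {
  // Ollama serves on port 11434 by default; the 30 s timeout is assumed.
  const host = env.OLLAMA_HOST ?? "http://localhost:11434";
  const timeoutMs = Number(env.OLLAMA_TIMEOUT ?? 30000);
  if (!Number.isFinite(timeoutMs) || timeoutMs <= 0) {
    // Fail fast on a malformed OLLAMA_TIMEOUT rather than hanging requests.
    throw new Error(`Invalid OLLAMA_TIMEOUT: ${env.OLLAMA_TIMEOUT}`);
  }
  return { host, timeoutMs };
}
```

Validating the timeout eagerly keeps a typo in the env var from surfacing later as a confusing request failure.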
5 lines · 118 B · TypeScript
export * from "./llm.module";
export * from "./llm.service";
export * from "./llm.controller";
export * from "./dto";
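The commit message mentions streaming chat responses via Server-Sent Events. One plausible shape for that path is a helper that turns the model's chunk stream into SSE `data:` frames; this is a sketch under assumptions, and `ChatChunk`, `toSseFrames`, and the `[DONE]` sentinel are illustrative names, not the actual API.

```typescript
// Hypothetical SSE framing for the streaming POST /llm/chat path.
// The chunk shape and sentinel are assumptions for illustration.
interface ChatChunk {
  content: string;
  done: boolean;
}

async function* toSseFrames(
  chunks: AsyncIterable<ChatChunk>
): AsyncGenerator<string> {
  for await (const chunk of chunks) {
    // Each SSE event is a "data: <payload>" line followed by a blank line.
    yield `data: ${JSON.stringify(chunk)}\n\n`;
  }
  // A terminal sentinel lets clients distinguish completion from a dropped connection.
  yield "data: [DONE]\n\n";
}
```

Keeping the framing separate from the model client makes it easy to unit-test the SSE layer without a live Ollama server.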