feat(#131): add OpenTelemetry tracing infrastructure

Implement comprehensive distributed tracing for HTTP requests and LLM
operations using OpenTelemetry with GenAI semantic conventions.

Features:
- TelemetryService: SDK initialization with OTLP HTTP exporter
- TelemetryInterceptor: Automatic HTTP request spans
- @TraceLlmCall decorator: LLM operation tracing
- GenAI semantic conventions for model/token tracking
- Graceful degradation when tracing is disabled
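The graceful-degradation behavior can be sketched as a wrap-and-trace helper; the names below (`withLlmSpan`, the `Span` shape) are illustrative stand-ins, not the module's actual code:

```typescript
// Illustrative sketch (assumed names, not the module's actual implementation)
// of wrap-and-trace with graceful degradation: when tracing is disabled,
// the operation runs without creating any span objects.
type Span = { name: string; attrs: Record<string, string | number>; ended: boolean };

function startSpan(name: string): Span {
  return { name, attrs: {}, ended: false };
}

async function withLlmSpan<T>(
  enabled: boolean,
  name: string,
  fn: () => Promise<T>,
): Promise<{ result: T; span?: Span }> {
  if (!enabled) {
    // Graceful degradation: no span is created when tracing is off.
    return { result: await fn() };
  }
  const span = startSpan(name);
  try {
    return { result: await fn(), span };
  } finally {
    span.ended = true; // the span always ends, even if fn throws
  }
}
```

The `finally` block mirrors the usual span-lifecycle rule: a started span must be ended on both success and error paths.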

Instrumented:
- All HTTP requests (automatic spans)
- OllamaProvider chat/chatStream/embed operations
- Token counts, model names, durations
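For the token/model attributes, the GenAI semantic-convention attribute names below are the standard OpenTelemetry ones; the helper itself is a hypothetical sketch, not the module's actual `recordLlmUsage` implementation:

```typescript
// Sketch: map usage data onto OpenTelemetry GenAI semantic-convention
// attribute keys. The LlmUsage shape is assumed for illustration.
interface LlmUsage {
  model: string;
  inputTokens: number;
  outputTokens: number;
}

function llmUsageAttributes(usage: LlmUsage): Record<string, string | number> {
  return {
    "gen_ai.request.model": usage.model,
    "gen_ai.usage.input_tokens": usage.inputTokens,
    "gen_ai.usage.output_tokens": usage.outputTokens,
  };
}
```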

Environment:
- OTEL_ENABLED (default: true)
- OTEL_SERVICE_NAME (default: mosaic-api)
- OTEL_EXPORTER_OTLP_ENDPOINT (default: localhost:4318)
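The defaults above could be resolved roughly as follows (the helper name and config shape are assumptions for illustration; the real wiring lives in TelemetryService):

```typescript
// Sketch: read the three environment variables with the documented defaults.
interface TelemetryConfig {
  enabled: boolean;
  serviceName: string;
  endpoint: string;
}

function loadTelemetryConfig(env: Record<string, string | undefined>): TelemetryConfig {
  return {
    enabled: env.OTEL_ENABLED !== "false",                         // default: true
    serviceName: env.OTEL_SERVICE_NAME ?? "mosaic-api",            // default service name
    endpoint: env.OTEL_EXPORTER_OTLP_ENDPOINT ?? "localhost:4318", // default OTLP HTTP port
  };
}
```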

Tests: 23 passing with full coverage

Fixes #131

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-31 12:55:11 -06:00
parent 64cb5c1edd
commit 51e6ad0792
13 changed files with 2838 additions and 26 deletions


@@ -0,0 +1,17 @@
/**
* OpenTelemetry distributed tracing module.
* Provides HTTP request tracing and LLM operation instrumentation.
*
* @module telemetry
*/
export { TelemetryModule } from "./telemetry.module";
export { TelemetryService } from "./telemetry.service";
export { TelemetryInterceptor } from "./telemetry.interceptor";
export { SpanContextService } from "./span-context.service";
export {
TraceLlmCall,
createLlmSpan,
recordLlmUsage,
type LlmTraceMetadata,
} from "./llm-telemetry.decorator";