Implement comprehensive distributed tracing for HTTP requests and LLM operations using OpenTelemetry with GenAI semantic conventions.

Features:
- TelemetryService: SDK initialization with OTLP HTTP exporter
- TelemetryInterceptor: Automatic HTTP request spans
- @TraceLlmCall decorator: LLM operation tracing
- GenAI semantic conventions for model/token tracking
- Graceful degradation when tracing disabled

Instrumented:
- All HTTP requests (automatic spans)
- OllamaProvider chat/chatStream/embed operations
- Token counts, model names, durations

Environment:
- OTEL_ENABLED (default: true)
- OTEL_SERVICE_NAME (default: mosaic-api)
- OTEL_EXPORTER_OTLP_ENDPOINT (default: localhost:4318)

Tests: 23 passing with full coverage

Fixes #131

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
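The environment variables above each fall back to a default when unset. A minimal sketch of how that resolution could look (function and type names here are hypothetical illustrations, not taken from the repo; a real TelemetryService would pass the resolved values to the OTLP exporter):

```typescript
// Hypothetical sketch: resolve the documented OTEL_* defaults from an
// environment map. Not the repo's actual implementation.
interface OtelConfig {
  enabled: boolean;
  serviceName: string;
  endpoint: string;
}

function loadOtelConfig(env: Record<string, string | undefined>): OtelConfig {
  return {
    // OTEL_ENABLED defaults to true; only an explicit "false" disables tracing,
    // which is what allows graceful degradation when tracing is turned off.
    enabled: (env.OTEL_ENABLED ?? "true").toLowerCase() !== "false",
    serviceName: env.OTEL_SERVICE_NAME ?? "mosaic-api",
    // 4318 is the standard OTLP/HTTP receiver port.
    endpoint: env.OTEL_EXPORTER_OTLP_ENDPOINT ?? "http://localhost:4318",
  };
}
```

At startup this would be called with `process.env`, and the `enabled` flag would gate whether the SDK and exporter are initialized at all.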
/**
 * OpenTelemetry distributed tracing module.
 * Provides HTTP request tracing and LLM operation instrumentation.
 *
 * @module telemetry
 */

export { TelemetryModule } from "./telemetry.module";
export { TelemetryService } from "./telemetry.service";
export { TelemetryInterceptor } from "./telemetry.interceptor";
export { SpanContextService } from "./span-context.service";
export {
  TraceLlmCall,
  createLlmSpan,
  recordLlmUsage,
  type LlmTraceMetadata,
} from "./llm-telemetry.decorator";
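The exported `TraceLlmCall` decorator presumably wraps async provider methods (chat/chatStream/embed) so each call is timed and recorded on a span. A self-contained sketch of that wrap-and-measure pattern, with no real OpenTelemetry calls and all names hypothetical:

```typescript
// Hypothetical sketch of the pattern behind a @TraceLlmCall-style wrapper:
// time an async LLM operation and record the result, re-throwing errors.
// This is an illustration, not the repo's implementation.
type AsyncFn<T> = (...args: unknown[]) => Promise<T>;

function traceLlmCall<T>(operation: string, fn: AsyncFn<T>): AsyncFn<T> {
  return async (...args: unknown[]): Promise<T> => {
    const start = Date.now();
    try {
      // A real implementation would start a span here and attach GenAI
      // semantic-convention attributes (model name, token counts).
      return await fn(...args);
    } finally {
      const durationMs = Date.now() - start;
      // Stand-in for span.end(): record how long the operation took.
      console.log(`llm.${operation} finished in ${durationMs}ms`);
    }
  };
}
```

Applied as a method decorator, the same logic replaces the method on the class prototype, so every `OllamaProvider.chat` call is traced without changing call sites.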