Overview
core-ai provides observability support through its middleware system. Observability middleware hooks into every model operation to capture traces, usage metrics, and error details without changing your application code. You wrap your model with an observability middleware, and all calls through that model are automatically traced.
Available integrations
OpenTelemetry
Export traces to any OpenTelemetry-compatible backend (Jaeger, Grafana, Datadog, etc.)
Langfuse
Track generations, token usage, and costs in Langfuse
How it works
Observability integrations are standard middleware applied via wrapChatModel, wrapEmbeddingModel, or wrapImageModel. Each factory function returns a middleware object that hooks into model operations to record spans or observations.
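The pattern can be sketched in miniature. core-ai's actual middleware interface isn't reproduced here; the ChatModel shape, the wrapGenerate hook, the tracingMiddleware factory, and this toy wrapChatModel are all illustrative assumptions, not the library's API:

```typescript
// A minimal model interface (assumption for illustration).
interface ChatModel {
  generate(prompt: string): Promise<string>;
}

// A middleware hooks the underlying call; here it can record a span
// around it. (Illustrative shape, not core-ai's real interface.)
interface ChatMiddleware {
  wrapGenerate(
    next: (prompt: string) => Promise<string>,
    prompt: string
  ): Promise<string>;
}

// Stand-in for an observability factory such as the OpenTelemetry
// integration: it appends span markers to a log instead of exporting
// real spans to a backend.
function tracingMiddleware(log: string[]): ChatMiddleware {
  return {
    async wrapGenerate(next, prompt) {
      log.push("span:start");
      try {
        return await next(prompt);
      } finally {
        // Runs even when the model call throws, so errors are captured.
        log.push("span:end");
      }
    },
  };
}

// Toy wrapper in the spirit of wrapChatModel: each middleware wraps the
// next, built innermost-first so array order is outermost-first.
function wrapChatModel(
  model: ChatModel,
  middleware: ChatMiddleware[]
): ChatModel {
  return {
    generate(prompt) {
      let call = (p: string) => model.generate(p);
      for (const mw of [...middleware].reverse()) {
        const next = call;
        call = (p) => mw.wrapGenerate(next, p);
      }
      return call(prompt);
    },
  };
}
```

Once wrapped, callers invoke the model exactly as before; the span markers are recorded around every generate call with no change to application code.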
You can combine observability middleware with other middleware. The order in the array controls execution order — place observability middleware first to capture the full duration of the call including any other middleware processing.
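The ordering rule can be seen in a small self-contained sketch. The compose helper and the middleware names below are assumptions for illustration, not part of core-ai; the point is only that the first entry in the array ends up outermost, so it observes everything the inner middleware does:

```typescript
type Call = () => string;

// Wrap a call so it records enter/exit markers into a shared trace.
function wrap(next: Call, trace: string[], name: string): Call {
  return () => {
    trace.push(`${name}:enter`);
    const result = next();
    trace.push(`${name}:exit`);
    return result;
  };
}

// Compose in array order: reversing before reduce makes the first
// name the outermost wrapper, mirroring the ordering rule above.
function compose(inner: Call, names: string[], trace: string[]): Call {
  return [...names]
    .reverse()
    .reduce((next, name) => wrap(next, trace, name), inner);
}

const trace: string[] = [];
const call = compose(() => "ok", ["observability", "retry"], trace);
call();
// trace: observability:enter, retry:enter, retry:exit, observability:exit
```

Because "observability" is first in the array, its enter/exit markers bracket the retry middleware entirely, so its measured duration includes any retries or other middleware processing.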