Trace LLM calls in production with the AgentMark SDK and OpenTelemetry.
`index.ts` shows how to initialize the SDK, wrap LLM calls in traces, and use child spans for sub-operations. Every traced call is recorded with model, tokens, cost, and latency.
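A minimal sketch of the pattern `index.ts` follows. The `initTracing()` / `trace()` / `ctx.span()` shape comes from this README, but the import path, client construction, and the two helper functions are illustrative assumptions, not the SDK's verified API:

```typescript
// Sketch only: the import path, client options, and helpers below are
// assumptions for illustration; consult the SDK docs for the exact API.
import { AgentMark } from "@agentmark-ai/sdk";

const sdk = new AgentMark({
  apiKey: process.env.AGENTMARK_API_KEY!,
  appId: process.env.AGENTMARK_APP_ID!,
});

sdk.initTracing(); // start the OpenTelemetry exporter

// Hypothetical sub-operation and LLM helper, defined so the sketch
// reads end to end; in a real app these would hit a database and a model.
async function loadOrder(id: string) {
  return { id, items: 3 };
}
async function summarizeWithLLM(order: { id: string; items: number }) {
  return `Order ${order.id} has ${order.items} items.`;
}

await sdk.trace("summarize-order", async (ctx) => {
  // Child span for a sub-operation such as a database call.
  const order = await ctx.span("load-order", () => loadOrder("ord_123"));

  // The wrapped LLM call is recorded with model, tokens, cost, and latency.
  return summarizeWithLLM(order);
});
```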
Install the dependencies:

```bash
npm install @agentmark-ai/sdk @agentmark-ai/ai-sdk-v5-adapter
```

Set environment variables:

```bash
AGENTMARK_API_KEY=your-api-key
AGENTMARK_APP_ID=your-app-id
```

Run the example:

```bash
npx tsx index.ts
```

Key points:

- `sdk.initTracing()` starts the OpenTelemetry exporter — all traces are sent to AgentMark Cloud
- `trace()` creates a root span. The callback receives a `TraceContext` with `traceId`, `spanId`, and `span()` for nesting
- Child spans (via `ctx.span()`) let you trace sub-operations like database calls or post-processing
- The `traceId` is returned immediately — use it to link traces to your application logs
- Traces are viewable locally (via `agentmark dev`) or in AgentMark Cloud
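A hedged sketch of linking a trace to your application logs via its `traceId`. It assumes the `TraceContext` exposes `traceId` as described above, and uses `console.log` as a stand-in for your structured logger:

```typescript
// Assumption: ctx.traceId is available as soon as the callback starts,
// so it can be attached to every log line emitted during the operation.
await sdk.trace("handle-request", async (ctx) => {
  console.log(JSON.stringify({ traceId: ctx.traceId, msg: "request traced" }));
  // ... traced LLM call and sub-operations go here ...
});
```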