Anthropic Integration
Add tracing and observability to your Anthropic Claude API calls.
Overview
The Observability SDK provides automatic tracing for Anthropic Claude API calls. Track every message, measure latency, monitor costs, and debug issues with full visibility.
Installation
```bash
npm install @transactional/observability @anthropic-ai/sdk
```
Setup
Basic Integration
Wrap your Anthropic client with the observability wrapper:
```typescript
import Anthropic from '@anthropic-ai/sdk';
import { initObservability, wrapAnthropic } from '@transactional/observability';

// Initialize observability
initObservability({
  dsn: process.env.TRANSACTIONAL_OBSERVABILITY_DSN!,
});

// Create and wrap the Anthropic client
const anthropic = wrapAnthropic(new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
}));
```
With Custom Options
```typescript
const anthropic = wrapAnthropic(
  new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY }),
  {
    // Default user for all traces
    userId: 'system',

    // Default metadata
    metadata: {
      environment: process.env.NODE_ENV,
      version: process.env.APP_VERSION,
    },

    // Enable error tracking
    captureErrors: true,
  }
);
```
Usage Examples
Messages API
All messages are automatically traced:
```typescript
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'Explain quantum computing' },
  ],
});

// Trace automatically captures:
// - Model name
// - Input messages
// - Output response
// - Token usage (input, output)
// - Latency
// - Cost
```
With System Prompt
```typescript
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  system: 'You are a helpful AI assistant.',
  messages: [
    { role: 'user', content: 'Hello!' },
  ],
});

// System prompt is captured in the trace
```
Streaming
Streaming messages are fully traced:
```typescript
const stream = await anthropic.messages.stream({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Tell me a story' }],
});

for await (const event of stream) {
  // Only text deltas carry a `text` field
  if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
    process.stdout.write(event.delta.text);
  }
}

// Trace captures full streamed response and token counts
```
Tool Use
Tool usage is automatically captured:
```typescript
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  tools: [
    {
      name: 'get_weather',
      description: 'Get current weather for a location',
      input_schema: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City name' },
        },
        required: ['location'],
      },
    },
  ],
  messages: [
    { role: 'user', content: 'What is the weather in Paris?' },
  ],
});

// Trace includes tool calls and their inputs
```
Vision (Images)
Image inputs are traced:
```typescript
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [
    {
      role: 'user',
      content: [
        {
          type: 'image',
          source: {
            type: 'base64',
            media_type: 'image/png',
            data: imageBase64,
          },
        },
        {
          type: 'text',
          text: 'Describe this image',
        },
      ],
    },
  ],
});

// Trace captures image metadata (not raw data) and response
```
Adding Context
Per-Request Context
Add context to individual requests:
```typescript
const response = await anthropic.messages.create(
  {
    model: 'claude-3-5-sonnet-20241022',
    max_tokens: 1024,
    messages: [/* ... */],
  },
  {
    // Observability context
    observability: {
      name: 'code-review',
      userId: 'user-123',
      sessionId: 'session-456',
      metadata: {
        repository: 'my-app',
        feature: 'code-review',
      },
      tags: ['code-review', 'engineering'],
    },
  }
);
```
Global Context
Set context that applies to all requests:
```typescript
import { getObservability } from '@transactional/observability';

const obs = getObservability();

// Set user context
obs.setUser({ id: 'user-123', email: 'user@example.com' });

// Set tags
obs.setTags({ environment: 'production', team: 'ai' });
```
Manual Tracing
For complex workflows, use manual tracing:
```typescript
import Anthropic from '@anthropic-ai/sdk';
import { getObservability } from '@transactional/observability';

const obs = getObservability();

// `anthropic` is the wrapped client from the setup above
async function multiTurnConversation(messages: Anthropic.MessageParam[]) {
  const trace = obs.trace({
    name: 'multi-turn-conversation',
    input: { messageCount: messages.length },
    sessionId: 'conversation-123',
  });

  try {
    const generation = obs.generation({
      name: 'claude-response',
      modelName: 'claude-3-5-sonnet-20241022',
      input: { messages },
    });

    const response = await anthropic.messages.create({
      model: 'claude-3-5-sonnet-20241022',
      max_tokens: 1024,
      messages,
    });

    await generation.end({
      output: response.content,
      promptTokens: response.usage.input_tokens,
      completionTokens: response.usage.output_tokens,
    });

    await trace.end({ output: response.content });
    return response;
  } catch (error) {
    await trace.error(error as Error);
    throw error;
  }
}
```
Error Tracking
Capture Anthropic errors for debugging:
```typescript
import { getObservability } from '@transactional/observability';

const obs = getObservability();

try {
  const response = await anthropic.messages.create({ /* ... */ });
} catch (error) {
  // Capture the error with context
  obs.captureException(error as Error, {
    tags: {
      provider: 'anthropic',
      model: 'claude-3-5-sonnet-20241022',
    },
    extra: {
      messageCount: messages.length,
      maxTokens: 1024,
    },
  });
  throw error;
}
```
What Gets Traced
| API Method | Traced | Details Captured |
|---|---|---|
| messages.create | Yes | Model, messages, response, tokens, cost |
| messages.stream | Yes | Full streamed response, tokens |
| Tool use | Yes | Tool calls, inputs |
| Vision (images) | Yes | Image metadata, response |
Viewing Traces
Dashboard
1. Go to the Observability Dashboard
2. Select your project
3. Click Traces to see all Anthropic calls
Trace Details
Each trace shows:
- Model used
- Input messages (including system prompt)
- Output response
- Token breakdown (input, output)
- Cost calculation
- Latency
- Tool usage (if any)
- Any errors
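The cost shown on a trace is derived from the recorded token counts and per-model pricing. As a rough sketch of the arithmetic only — the rates and the estimateCostUSD helper below are illustrative assumptions, not authoritative Anthropic pricing or part of the SDK:

```typescript
// Illustrative per-million-token rates (assumed for this example only)
const RATES = { inputPerMTok: 3.0, outputPerMTok: 15.0 };

// Estimate cost in USD from the token counts a trace records
function estimateCostUSD(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * RATES.inputPerMTok +
    (outputTokens / 1_000_000) * RATES.outputPerMTok
  );
}

// e.g. 1,000 input tokens and 500 output tokens
console.log(estimateCostUSD(1000, 500).toFixed(4)); // "0.0105"
```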
Multi-Turn Conversations
Track entire conversations with sessions:
```typescript
class ConversationManager {
  private messages: Anthropic.MessageParam[] = [];
  private sessionId: string;
  private userId: string;

  constructor(userId: string) {
    this.userId = userId;
    this.sessionId = `conv-${Date.now()}`;
  }

  async sendMessage(userMessage: string) {
    this.messages.push({ role: 'user', content: userMessage });

    const response = await anthropic.messages.create(
      {
        model: 'claude-3-5-sonnet-20241022',
        max_tokens: 1024,
        messages: this.messages,
      },
      {
        observability: {
          name: 'conversation-turn',
          sessionId: this.sessionId,
          userId: this.userId,
          metadata: {
            turnNumber: this.messages.length,
          },
        },
      }
    );

    // Content blocks are typed; only text blocks carry a `text` field
    const block = response.content[0];
    const assistantMessage = block.type === 'text' ? block.text : '';
    this.messages.push({ role: 'assistant', content: assistantMessage });
    return assistantMessage;
  }
}
```
Best Practices
1. Wrap Once, Use Everywhere
```typescript
// lib/anthropic.ts
import Anthropic from '@anthropic-ai/sdk';
import { wrapAnthropic } from '@transactional/observability';

export const anthropic = wrapAnthropic(new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
}));
```
```typescript
// Use the wrapped client everywhere
import { anthropic } from '@/lib/anthropic';
```
2. Use Sessions for Conversations
```typescript
const response = await anthropic.messages.create(
  { model: 'claude-3-5-sonnet-20241022', max_tokens: 1024, messages },
  {
    observability: {
      sessionId: `chat-${conversationId}`,
      userId: user.id,
    },
  }
);
```
3. Add Meaningful Names
```typescript
const response = await anthropic.messages.create(
  { model: 'claude-3-5-sonnet-20241022', max_tokens: 1024, messages },
  {
    observability: {
      name: 'document-analysis', // Descriptive name
    },
  }
);
```
4. Track Feature Usage
```typescript
const response = await anthropic.messages.create(
  { model: 'claude-3-5-sonnet-20241022', max_tokens: 1024, messages },
  {
    observability: {
      tags: ['feature:analysis', 'tier:enterprise'],
    },
  }
);
```
Troubleshooting
Traces Not Appearing
- Verify initObservability() is called before creating the client
- Check that wrapAnthropic() is used
- Confirm the DSN is correct
Missing Token Counts
Ensure your Anthropic API responses include usage data. This is standard in the Messages API.
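If counts are still missing, a quick structural check can confirm whether a given response actually carries usage data before the wrapper records it. This is a minimal sketch; the hasUsage helper is hypothetical, but the field names match the Messages API usage object:

```typescript
// Shape of the usage object on a Messages API response
interface Usage {
  input_tokens: number;
  output_tokens: number;
}

// Returns true when a response carries complete token counts
function hasUsage(response: { usage?: Usage }): boolean {
  return (
    typeof response.usage?.input_tokens === 'number' &&
    typeof response.usage?.output_tokens === 'number'
  );
}
```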
Streaming Not Traced Correctly
Make sure you're using the SDK's messages.stream() helper rather than messages.create() with stream: true and manual stream handling.
Next Steps