Migrate from LangChain
Add observability to your existing LangChain application with callback handlers.
Overview
Already using LangChain? Add full observability with a single callback handler. All your chains, agents, and LLM calls are automatically traced.
Quick Migration
Before (Direct LangChain)
import { ChatOpenAI } from '@langchain/openai';
const model = new ChatOpenAI({ modelName: 'gpt-4o' });
const response = await model.invoke('Explain quantum computing');
After (With Observability)
import { ChatOpenAI } from '@langchain/openai';
+ import { initObservability } from '@transactional/observability';
+ import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
+ initObservability({ dsn: process.env.TRANSACTIONAL_OBSERVABILITY_DSN! });
const model = new ChatOpenAI({ modelName: 'gpt-4o' });
+ const handler = new TransactionalCallbackHandler();
- const response = await model.invoke('Explain quantum computing');
+ const response = await model.invoke('Explain quantum computing', {
+ callbacks: [handler],
+ });
Step-by-Step Migration
1. Install the SDK
npm install @transactional/observability
2. Get Your DSN
- Go to Observability Dashboard
- Create or select a project
- Copy the DSN from project settings
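Since the DSN comes from an environment variable in the examples below, it helps to fail fast at startup rather than run silently without observability. A minimal sketch (the `requireEnv` helper is illustrative, not part of the SDK):

```typescript
// Illustrative helper: throw at startup if a required env var is missing,
// instead of passing undefined to initObservability().
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// At the application entry point:
// initObservability({ dsn: requireEnv('TRANSACTIONAL_OBSERVABILITY_DSN') });
```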
3. Initialize Observability
Add initialization at your application entry point:
// app.ts or index.ts
import { initObservability } from '@transactional/observability';
initObservability({
dsn: process.env.TRANSACTIONAL_OBSERVABILITY_DSN!,
});
4. Create a Callback Handler
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
// Create a handler for each request/conversation
const handler = new TransactionalCallbackHandler({
sessionId: 'conversation-123', // Optional
userId: 'user-456', // Optional
});
5. Pass to LangChain Calls
// Chat models
const response = await model.invoke('Hello', {
callbacks: [handler],
});
// Chains
const result = await chain.invoke({ input: 'Hello' }, {
callbacks: [handler],
});
// Agents
const agentResult = await executor.invoke({ input: 'Hello' }, {
callbacks: [handler],
});
Migration Examples
Chat Models
import { ChatOpenAI } from '@langchain/openai';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
const model = new ChatOpenAI({ modelName: 'gpt-4o' });
const handler = new TransactionalCallbackHandler({ userId: 'user-123' });
const response = await model.invoke(
[{ role: 'user', content: 'Hello!' }],
{ callbacks: [handler] }
);
Chains
import { LLMChain } from 'langchain/chains';
import { PromptTemplate } from '@langchain/core/prompts';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
const chain = new LLMChain({
llm: model,
prompt: PromptTemplate.fromTemplate('Summarize: {text}'),
});
const handler = new TransactionalCallbackHandler({
sessionId: 'summarization-session',
});
const result = await chain.invoke(
{ text: 'Long article here...' },
{ callbacks: [handler] }
);
Retrieval QA
import { RetrievalQAChain } from 'langchain/chains';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
const handler = new TransactionalCallbackHandler({
userId: user.id,
sessionId: `qa-${conversationId}`,
metadata: { type: 'rag' },
});
// Pass to both retriever and LLM for full tracing
const result = await qaChain.invoke(
{ query: 'What is the refund policy?' },
{ callbacks: [handler] }
);
Agents
import { AgentExecutor, createOpenAIFunctionsAgent } from 'langchain/agents';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
const handler = new TransactionalCallbackHandler({
userId: user.id,
metadata: { agentType: 'functions' },
});
const result = await executor.invoke(
{ input: 'What is the weather in Paris?' },
{ callbacks: [handler] }
);
// Trace captures:
// - Agent planning steps
// - Tool calls and results
// - Final response
LCEL (LangChain Expression Language)
import { ChatOpenAI } from '@langchain/openai';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
const chain = model.pipe(new StringOutputParser());
const handler = new TransactionalCallbackHandler();
const result = await chain.invoke('Hello', {
callbacks: [handler],
});
Handler Configuration
Per-Request Options
const handler = new TransactionalCallbackHandler({
// Group related traces
sessionId: 'conversation-123',
// Track which user
userId: 'user-456',
// Custom metadata
metadata: {
environment: 'production',
feature: 'chat',
version: '1.0.0',
},
});
Reusing Handlers
Create a new handler for each independent request:
// API route - new handler per request
app.post('/chat', async (req, res) => {
const handler = new TransactionalCallbackHandler({
userId: req.user?.id,
sessionId: req.body.conversationId,
});
const result = await chain.invoke(
{ input: req.body.message },
{ callbacks: [handler] }
);
res.json(result);
});
What Gets Traced
The callback handler automatically captures:
| LangChain Event | Trace Type | Details Captured |
|---|---|---|
| LLM start | Generation | Model name, prompts |
| LLM end | - | Response, tokens, cost |
| Chain start | Span | Chain name, inputs |
| Chain end | - | Outputs |
| Tool start | Span | Tool name, inputs |
| Tool end | - | Outputs |
| Retriever start | Span | Query |
| Retriever end | - | Documents |
| Agent action | Span | Action, inputs |
| Error | - | Error message, stack |
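The table above can be pictured as a simple lookup from callback events to trace record types. A sketch of that mapping, assuming LangChain's standard handler method names (`handleLLMStart`, `handleChainStart`, and so on) and an illustrative `TraceType` that is not part of the SDK's public API:

```typescript
// Illustrative mapping from LangChain callback events to trace record types,
// mirroring the "What Gets Traced" table. Names are for illustration only.
type TraceType = 'generation' | 'span';

const eventToTraceType: Record<string, TraceType> = {
  handleLLMStart: 'generation',   // LLM start -> Generation
  handleChainStart: 'span',       // Chain start -> Span
  handleToolStart: 'span',        // Tool start -> Span
  handleRetrieverStart: 'span',   // Retriever start -> Span
  handleAgentAction: 'span',      // Agent action -> Span
};

function traceTypeFor(event: string): TraceType | undefined {
  return eventToTraceType[event];
}
```

The corresponding `*End` events don't open new records; they attach outputs, token usage, and cost to the record opened by the matching start event.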
Trace Structure Example
For a RAG chain, the trace looks like:
Trace: qa-chain
├── Span: retrieval
│ └── Generation: embedding (text-embedding-3-small)
├── Span: format-documents
└── Generation: llm-response (gpt-4o)
├── Input: 1,234 tokens
├── Output: 456 tokens
└── Cost: $0.0234
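The cost line is derived from the token counts and per-model pricing. The SDK's actual price table isn't shown here, but the arithmetic is straightforward; a sketch with placeholder rates (expressed in USD per million tokens, not real gpt-4o pricing):

```typescript
// Illustrative cost arithmetic from token counts.
// The rates are hypothetical placeholders, not actual model pricing.
interface Pricing {
  inputPerMillion: number;  // USD per 1M input tokens
  outputPerMillion: number; // USD per 1M output tokens
}

function estimateCost(
  inputTokens: number,
  outputTokens: number,
  pricing: Pricing,
): number {
  return (
    (inputTokens / 1_000_000) * pricing.inputPerMillion +
    (outputTokens / 1_000_000) * pricing.outputPerMillion
  );
}

// e.g. estimateCost(1234, 456, { inputPerMillion: 2.5, outputPerMillion: 10 })
```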
Framework Examples
Next.js API Route
// app/api/chat/route.ts
import { ChatOpenAI } from '@langchain/openai';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
export async function POST(request: Request) {
const { message, userId, sessionId } = await request.json();
const model = new ChatOpenAI({ modelName: 'gpt-4o' });
const handler = new TransactionalCallbackHandler({
userId,
sessionId,
});
const response = await model.invoke(message, {
callbacks: [handler],
});
return Response.json({ message: response.content });
}
Express
import { ChatOpenAI } from '@langchain/openai';
import { TransactionalCallbackHandler } from '@transactional/observability/langchain';
app.post('/chat', async (req, res) => {
const handler = new TransactionalCallbackHandler({
userId: req.user?.id,
sessionId: req.body.sessionId,
});
const model = new ChatOpenAI({ modelName: 'gpt-4o' });
const response = await model.invoke(req.body.message, {
callbacks: [handler],
});
res.json({ message: response.content });
});
Error Tracking
Capture LangChain errors:
import { getObservability } from '@transactional/observability';
const obs = getObservability();
try {
const result = await chain.invoke({ input }, { callbacks: [handler] });
} catch (error) {
obs.captureException(error as Error, {
tags: { framework: 'langchain' },
extra: { input },
});
throw error;
}
Verifying Migration
- Make a LangChain call with the callback handler
- Go to Observability Dashboard
- Select your project and click Traces
- You should see the full trace hierarchy
Common Issues
Traces Not Appearing
- Ensure initObservability() is called at startup
- Verify the callback handler is passed in the callbacks array
- Check that the DSN is correct
Nested Chains Not Showing
Pass the callback handler to all nested components:
// Pass to the chain, not just the top-level invoke
const chain = new LLMChain({
llm: model,
prompt,
callbacks: [handler], // Handler at chain level
});
// OR pass at invoke time
await chain.invoke({ input }, { callbacks: [handler] });
Missing Token Counts
Some models don't return token counts. The SDK estimates based on text length when actual counts aren't available.
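The SDK's exact estimator isn't documented here, but a common fallback is a character-count heuristic, on the order of four characters per token for English text. A minimal sketch of that approach:

```typescript
// Rough token estimate for when the provider returns no usage data.
// ~4 characters per token is a common heuristic for English text;
// actual tokenizer output will differ, especially for code or non-English text.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}
```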
Next Steps
- LangChain Integration Guide - Full documentation
- Sessions - Group conversation traces
- Error Tracking - Capture and manage errors