Migrate from OpenAI SDK

Add observability to your existing OpenAI integration with minimal code changes.

Overview

Already using the OpenAI SDK? Add full observability with just three lines of code, without changing any of your existing API calls.

Quick Migration

Before (Direct OpenAI)

import OpenAI from 'openai';
 
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
 
const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});

After (With Observability)

import OpenAI from 'openai';
+ import { initObservability, wrapOpenAI } from '@transactional/observability';
 
+ initObservability({ dsn: process.env.TRANSACTIONAL_OBSERVABILITY_DSN! });
 
- const openai = new OpenAI({
+ const openai = wrapOpenAI(new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
- });
+ }));
 
// All existing code works unchanged!
const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});

Step-by-Step Migration

1. Install the SDK

npm install @transactional/observability

2. Get Your DSN

  1. Go to Observability Dashboard
  2. Create or select a project
  3. Copy the DSN from project settings
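The DSN is read from an environment variable in the snippets below. One way to supply it locally is a `.env` file; the value shown here is a placeholder, not a real DSN:

```shell
# .env (do not commit this file)
# Replace the placeholder with the DSN copied from your project settings.
TRANSACTIONAL_OBSERVABILITY_DSN="your-project-dsn"
OPENAI_API_KEY="your-openai-api-key"
```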

3. Initialize Observability

Add initialization at your application entry point:

// app.ts or index.ts
import { initObservability } from '@transactional/observability';
 
initObservability({
  dsn: process.env.TRANSACTIONAL_OBSERVABILITY_DSN!,
  // Optional: only enable in production
  enabled: process.env.NODE_ENV === 'production',
});

4. Wrap Your OpenAI Client

Update where you create your OpenAI client:

// lib/openai.ts
import OpenAI from 'openai';
import { wrapOpenAI } from '@transactional/observability';
 
export const openai = wrapOpenAI(new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
}));

5. Use As Normal

All your existing code continues to work:

import { openai } from '@/lib/openai';
 
// Chat completions
const chat = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});
 
// Streaming
const stream = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});
 
// Function calling
const tools = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [...],
  tools: [...],
});
 
// Embeddings
const embedding = await openai.embeddings.create({
  model: 'text-embedding-3-small',
  input: 'Hello world',
});

Adding Context (Optional)

Enhance your traces with additional context:

User Tracking

import { getObservability } from '@transactional/observability';
 
// After user authentication
const obs = getObservability();
obs.setUser({
  id: user.id,
  email: user.email,
});

Per-Request Context

const response = await openai.chat.completions.create(
  {
    model: 'gpt-4o',
    messages: [...],
  },
  {
    observability: {
      name: 'customer-support',
      userId: user.id,
      sessionId: conversationId,
      tags: ['support', 'chat'],
    },
  }
);

Grouping Conversations

// Use the same sessionId for all messages in a conversation
const sessionId = `chat-${conversationId}`;
 
// First message
await openai.chat.completions.create(
  { model: 'gpt-4o', messages: [...] },
  { observability: { sessionId } }
);
 
// Follow-up message (same session)
await openai.chat.completions.create(
  { model: 'gpt-4o', messages: [...] },
  { observability: { sessionId } }
);

Framework Examples

Next.js App Router

// lib/openai.ts
import OpenAI from 'openai';
import { wrapOpenAI } from '@transactional/observability';
 
export const openai = wrapOpenAI(new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
}));

// app/api/chat/route.ts
import { openai } from '@/lib/openai';
 
export async function POST(request: Request) {
  const { messages, userId } = await request.json();
 
  const response = await openai.chat.completions.create(
    {
      model: 'gpt-4o',
      messages,
    },
    {
      observability: {
        name: 'api-chat',
        userId,
      },
    }
  );
 
  return Response.json(response.choices[0].message);
}

Express

// lib/openai.ts
import OpenAI from 'openai';
import { wrapOpenAI } from '@transactional/observability';
 
export const openai = wrapOpenAI(new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
}));

// routes/chat.ts
import { openai } from '../lib/openai';
 
app.post('/chat', async (req, res) => {
  const { messages } = req.body;
 
  const response = await openai.chat.completions.create(
    {
      model: 'gpt-4o',
      messages,
    },
    {
      observability: {
        userId: req.user?.id,
        sessionId: req.session?.id,
      },
    }
  );
 
  res.json(response.choices[0].message);
});

Error Tracking

Enable error tracking for OpenAI failures:

import { getObservability } from '@transactional/observability';
 
const obs = getObservability();
 
try {
  const response = await openai.chat.completions.create({...});
} catch (error) {
  obs.captureException(error as Error, {
    tags: { provider: 'openai' },
    extra: { model: 'gpt-4o' },
  });
  throw error;
}
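If you make OpenAI calls in several places, the capture-and-rethrow pattern above can be factored into a small helper. The sketch below is illustrative, not part of the SDK: `withErrorCapture` and its `Capture` type are hypothetical names, and the capture function is injected so the example stays self-contained (in real code you would pass `obs.captureException` or a closure around it).

```typescript
// Hypothetical helper: report a failure, then rethrow so callers still see it.
// `Capture` stands in for a reporting function such as obs.captureException.
type Capture = (error: Error, context: { tags: Record<string, string> }) => void;

async function withErrorCapture<T>(
  capture: Capture,
  tags: Record<string, string>,
  fn: () => Promise<T>,
): Promise<T> {
  try {
    return await fn();
  } catch (error) {
    // Report once with the provided tags, then propagate the original error.
    capture(error as Error, { tags });
    throw error;
  }
}

// Usage: the error is reported exactly once and still reaches the caller.
const reported: Error[] = [];
const capture: Capture = (err) => { reported.push(err); };

withErrorCapture(capture, { provider: "openai" }, async () => {
  throw new Error("rate limited");
}).catch((err: Error) => {
  console.log(reported.length); // 1
  console.log(err.message);     // rate limited
});
```

Wrapping the call site rather than each try/catch keeps the tagging consistent across routes while preserving normal error propagation.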

What You Get

After migration, you automatically get:

| Feature | Description |
| --- | --- |
| Full Tracing | Every API call is traced |
| Token Tracking | Prompt and completion tokens |
| Cost Calculation | Automatic cost per request |
| Latency Metrics | Response time tracking |
| Error Tracking | API errors captured |
| Session Grouping | Group related calls |
| Dashboard | Visual analytics and debugging |

Verifying Migration

  1. Make an API call with the wrapped client
  2. Go to Observability Dashboard
  3. Select your project and click Traces
  4. You should see your API call with full details

Rollback

If you need to roll back, simply remove the wrapper:

- import { wrapOpenAI } from '@transactional/observability';
 
- const openai = wrapOpenAI(new OpenAI({
+ const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
- }));
+ });

Next Steps