# Migrate from Anthropic
Switch from direct Anthropic API to AI Gateway for unified API access.
## Overview
Migrating from the Anthropic SDK to AI Gateway lets you use a unified OpenAI-compatible API for all providers. You gain caching, rate limiting, cost tracking, and easy provider switching.
## Migration Options
You have two options:
- OpenAI SDK (recommended) - Use the OpenAI SDK with AI Gateway for a unified API
- Anthropic SDK - Continue using the Anthropic SDK with AI Gateway
### Option 1: Switch to OpenAI SDK (Recommended)
Use the OpenAI SDK for all providers through AI Gateway:
#### Before (Direct Anthropic)

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

#### After (AI Gateway with OpenAI SDK)

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.transactional.dev/ai/v1',
  apiKey: process.env.GATEWAY_API_KEY, // gw_sk_xxx
});

const response = await client.chat.completions.create({
  model: 'claude-3-5-sonnet', // Anthropic model via Gateway
  messages: [{ role: 'user', content: 'Hello!' }],
  max_tokens: 1024,
});
```

#### Benefits of OpenAI SDK
- Unified API - Same code for OpenAI, Anthropic, Google, etc.
- Easy switching - Change providers by changing the model name
- Familiar syntax - Most developers know the OpenAI API
### Option 2: Continue with Anthropic SDK
Use the Anthropic SDK with AI Gateway:
#### Before (Direct Anthropic)

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
```

#### After (AI Gateway)

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  baseURL: 'https://api.transactional.dev/ai/anthropic/v1',
  apiKey: process.env.GATEWAY_API_KEY, // gw_sk_xxx
});

// All existing code works unchanged!
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

## Step-by-Step Migration
### 1. Get Your Gateway API Key

- Go to the AI Gateway Dashboard
- Click API Keys
- Create a new key (it starts with `gw_sk_`)
### 2. Add Your Anthropic Provider Key
- Go to Provider Keys in the dashboard
- Add your Anthropic API key
- AI Gateway will use this to call Anthropic on your behalf
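Before wiring the new key into your app, it can help to sanity-check that you are passing a Gateway key rather than a raw provider key. A minimal sketch; the `isGatewayKey` and `assertGatewayKey` helpers are hypothetical, based only on the `gw_sk_` prefix noted above:

```typescript
// Hypothetical helper: Gateway keys start with "gw_sk_" (see step 1),
// while direct provider keys use other prefixes (e.g. "sk-ant-").
function isGatewayKey(key: string | undefined): boolean {
  return typeof key === 'string' && key.startsWith('gw_sk_');
}

// Hypothetical helper: throw early if the wrong key is configured,
// so the failure happens at startup rather than at the first API call.
function assertGatewayKey(key: string | undefined): string {
  if (!isGatewayKey(key)) {
    throw new Error('Expected a Gateway API key starting with gw_sk_');
  }
  return key as string;
}

// Usage sketch:
// const apiKey = assertGatewayKey(process.env.GATEWAY_API_KEY);
```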
### 3. Update Your Code
Choose your preferred SDK and update:
OpenAI SDK (Recommended):

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.transactional.dev/ai/v1',
  apiKey: process.env.GATEWAY_API_KEY,
});
```

Anthropic SDK:

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  baseURL: 'https://api.transactional.dev/ai/anthropic/v1',
  apiKey: process.env.GATEWAY_API_KEY,
});
```

## API Mapping
When using the OpenAI SDK with Anthropic models:
| Anthropic API | OpenAI API (via Gateway) |
|---|---|
| `messages.create()` | `chat.completions.create()` |
| `model: 'claude-3-5-sonnet-20241022'` | `model: 'claude-3-5-sonnet'` |
| `system: 'You are...'` | `messages: [{ role: 'system', content: '...' }]` |
| `max_tokens: 1024` | `max_tokens: 1024` |
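The mapping above is mechanical enough to script if you have many call sites. A sketch of a converter for the common fields; the `anthropicToOpenAI` helper and its types are illustrative, not part of either SDK:

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

interface AnthropicParams {
  model: string;
  max_tokens: number;
  system?: string;
  messages: { role: 'user' | 'assistant'; content: string }[];
}

// Illustrative helper: convert Anthropic-style params into
// OpenAI chat.completions params, following the table above.
function anthropicToOpenAI(params: AnthropicParams) {
  const messages: ChatMessage[] = [];
  // Anthropic's top-level `system` becomes a leading system message.
  if (params.system) {
    messages.push({ role: 'system', content: params.system });
  }
  messages.push(...params.messages);
  return {
    // Date-pinned Anthropic model names map to the shorter Gateway names.
    model: params.model.replace(/^(claude-3-5-sonnet)-\d{8}$/, '$1'),
    max_tokens: params.max_tokens,
    messages,
  };
}
```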
## Example Conversion

Anthropic SDK:

```typescript
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  system: 'You are a helpful assistant.',
  messages: [
    { role: 'user', content: 'Hello!' },
  ],
});

console.log(response.content[0].text);
```

OpenAI SDK via Gateway:

```typescript
const response = await client.chat.completions.create({
  model: 'claude-3-5-sonnet',
  max_tokens: 1024,
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' },
  ],
});

console.log(response.choices[0].message.content);
```

## Feature Compatibility
| Feature | Anthropic SDK | OpenAI SDK via Gateway |
|---|---|---|
| Chat messages | Yes | Yes |
| Streaming | Yes | Yes |
| Tool use | Yes | Yes |
| Vision (images) | Yes | Yes |
| System prompts | Yes | Yes |
| JSON mode | - | Yes (via `response_format`) |
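JSON mode is requested with the OpenAI-style `response_format: { type: 'json_object' }` parameter; the reply's `message.content` is then a JSON string that you parse yourself. A sketch, assuming the Gateway forwards `response_format` as shown in the table; the `parseJsonCompletion` helper is illustrative:

```typescript
// With JSON mode, the request gains one extra parameter:
//   response_format: { type: 'json_object' }
// and the reply's message.content is a JSON string.

// Illustrative helper: parse a JSON-mode completion defensively.
function parseJsonCompletion(response: {
  choices: { message: { content: string | null } }[];
}): unknown {
  const content = response.choices[0]?.message?.content;
  if (!content) {
    throw new Error('JSON-mode response had no content');
  }
  return JSON.parse(content);
}

// Example with a canned response shape (not live output):
const data = parseJsonCompletion({
  choices: [{ message: { content: '{"colors":["red","green","blue"]}' } }],
});
```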
## Python Migration
### Before (Direct Anthropic)

```python
import os

from anthropic import Anthropic

client = Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"]
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
```

### After (OpenAI SDK via Gateway)

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.transactional.dev/ai/v1",
    api_key=os.environ["GATEWAY_API_KEY"]
)

response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
```

## What You Get
After migration, you automatically get:
| Feature | Description |
|---|---|
| Unified API | Same code for all providers |
| Semantic Caching | Automatic response caching |
| Cost Tracking | Real-time cost monitoring |
| Rate Limiting | Configurable request limits |
| Fallback | Automatic failover to backup providers |
| Analytics | Usage dashboards |
| Easy Switching | Change providers by changing model name |
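Fallback itself is handled by the Gateway, but the easy-switching row means an explicit client-side fallback is also trivial, since only the model name changes between providers. A sketch under that assumption; the `withFallback` helper is illustrative, and the model names are the ones used elsewhere in this guide:

```typescript
// Illustrative client-side fallback: try the preferred model first,
// then retry the same request with a backup model name.
// (The Gateway can also fail over automatically; see the table above.)
type ChatFn = (model: string) => Promise<string>;

async function withFallback(models: string[], call: ChatFn): Promise<string> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await call(model);
    } catch (err) {
      lastError = err; // try the next model in the list
    }
  }
  throw lastError;
}

// Usage sketch:
// const text = await withFallback(['claude-3-5-sonnet', 'gpt-4o'], async (model) => {
//   const r = await client.chat.completions.create({
//     model,
//     messages: [{ role: 'user', content: 'Hello!' }],
//   });
//   return r.choices[0].message.content ?? '';
// });
```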
## Multi-Provider Example
With AI Gateway, switch providers without changing your code:
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.transactional.dev/ai/v1',
  apiKey: process.env.GATEWAY_API_KEY,
});

// Anthropic Claude
const claude = await client.chat.completions.create({
  model: 'claude-3-5-sonnet',
  messages: [{ role: 'user', content: 'Hello!' }],
});

// OpenAI GPT-4
const gpt4 = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});

// Google Gemini
const gemini = await client.chat.completions.create({
  model: 'gemini-1.5-pro',
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

## Streaming
Streaming works with both approaches:
OpenAI SDK:
```typescript
const stream = await client.chat.completions.create({
  model: 'claude-3-5-sonnet',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```

## Tool Use
Tool use is supported:
```typescript
const response = await client.chat.completions.create({
  model: 'claude-3-5-sonnet',
  messages: [{ role: 'user', content: 'What is the weather?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get weather for a location',
        parameters: {
          type: 'object',
          properties: {
            location: { type: 'string' },
          },
        },
      },
    },
  ],
});
```

## Rollback
If you need to roll back, revert to the direct Anthropic configuration:
```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
```

## Troubleshooting
### Model Not Found

- Use the Gateway model name: `claude-3-5-sonnet` (not `claude-3-5-sonnet-20241022`)
- Check available models in the dashboard
### Response Format Different

When switching from the Anthropic SDK to the OpenAI SDK:

- Anthropic: `response.content[0].text`
- OpenAI: `response.choices[0].message.content`
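If you are migrating call sites gradually, a small accessor can hide the difference. A sketch; the `getText` helper and the response types here are illustrative, reduced to only the fields this guide uses:

```typescript
// Minimal shapes for the two response formats (text-only replies).
type AnthropicResponse = { content: { text: string }[] };
type OpenAIResponse = { choices: { message: { content: string | null } }[] };

// Illustrative helper: extract the reply text from either shape.
function getText(response: AnthropicResponse | OpenAIResponse): string {
  if ('content' in response) {
    return response.content[0]?.text ?? ''; // Anthropic SDK shape
  }
  return response.choices[0]?.message?.content ?? ''; // OpenAI SDK shape
}
```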
### Tool Call Format Different
Tool calls have slightly different formats between SDKs. See the API documentation for details.
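On the OpenAI side, tool invocations arrive in `message.tool_calls`, with the arguments JSON-encoded as a string. A sketch of reading them; the `decodeToolCalls` helper is illustrative, and the response object is a canned example rather than live output:

```typescript
// OpenAI-style tool calls carry the function name plus a JSON string
// of arguments that you parse yourself.
type ToolCall = {
  id: string;
  type: 'function';
  function: { name: string; arguments: string };
};

// Illustrative helper: decode tool calls into (name, args) pairs.
function decodeToolCalls(toolCalls: ToolCall[]) {
  return toolCalls.map((tc) => ({
    id: tc.id,
    name: tc.function.name,
    args: JSON.parse(tc.function.arguments) as Record<string, unknown>,
  }));
}

// Canned example matching the get_weather tool defined above:
const calls = decodeToolCalls([
  {
    id: 'call_1',
    type: 'function',
    function: { name: 'get_weather', arguments: '{"location":"Paris"}' },
  },
]);
```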
## Next Steps
- AI Gateway Overview - Full feature documentation
- OpenAI SDK Guide - Using the OpenAI SDK
- Caching - Configure semantic caching
- Fallback - Set up provider fallbacks