
Seamless Migration from OpenAI: Step-by-Step Guide


This guide outlines the simple steps required to migrate any application from OpenAI to APIpie, leveraging our compatible API structure.

Introduction - Migrate from OpenAI

If your application currently uses OpenAI, transitioning to APIpie is straightforward. Our API accepts the same structured requests, so in most cases you only need to update the base URL and API key. For more information on OpenAI's API structure, see the OpenAI API Reference.

Getting Started - Migration Steps

1. Create an Account

  • Link: Register here
    • Follow the link and fill out the form to create your account.

2. Add Credit

  • Link: Add Credit
    • Access the subscription section after logging in to add credits to your account.

3. Generate an API Key

  • Link: Generate API Key
    • Navigate to the API keys section and create a new key. This key is necessary for API requests.

4. Update Your API Configuration

  • Locate where your application is configured to communicate with OpenAI, typically where the base URL for OpenAI is set.
  • Change the base URL from OpenAI's URL (e.g., https://api.openai.com/v1) to APIpie's URL https://apipie.ai/v1.
  • Update your configured API key to your new APIpie API key (see the request sketch below).
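
Because the request structure is identical, the same swap works for direct REST calls as well as the SDKs. The following is a minimal sketch using Python's requests library, assuming the standard OpenAI-compatible chat completions route (https://apipie.ai/v1/chat/completions):

import os
import requests

# Same request body you would send to OpenAI; only the base URL and API key change.
response = requests.post(
    "https://apipie.ai/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['APIPIE_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from APIpie!"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])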

Configuring OpenAI SDK to use APIpie

APIpie's API endpoints for chat completions, vision, images, embeddings, and speech are fully compatible with OpenAI's API.

If you have an application that uses one of OpenAI's libraries, you can quickly change it to point to APIpie, and start running your existing applications with our service.

Python Configuration

import os
import openai

client = openai.OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),  # Your APIpie API key
    base_url="https://apipie.ai/v1",  # APIpie base URL
)

JavaScript/TypeScript Configuration

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY, // Your APIpie API key
  baseURL: "https://apipie.ai/v1", // APIpie base URL
});

Querying Language Models

Chat Completions Example

Python:

import os
import openai

client = openai.OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # APIpie supports various models, including OpenAI ones
    messages=[
        {"role": "system", "content": "You are a personal assistant"},
        {"role": "user", "content": "Who won the 2015 NRL Grand Final?"},
    ],
)

print(response.choices[0].message.content)

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

const response = await client.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [
    { role: 'system', content: 'You are a personal assistant' },
    { role: 'user', content: 'Who won the 2015 NRL Grand Final?' },
  ],
});

console.log(response.choices[0].message.content);

Streaming Responses

Python:

import os
import openai

client = openai.OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a personal assistant"},
        {"role": "user", "content": "Who won the 2015 NRL Grand Final?"},
    ],
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

async function run() {
  const stream = await client.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: 'You are a personal assistant' },
      { role: 'user', content: 'Who won the 2015 NRL Grand Final?' },
    ],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

run();

Multimodal Vision Models

Python:

import os
import openai

client = openai.OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

response = client.chat.completions.create(
    model="gpt-4-vision",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What's in this image?"},
            {
                "type": "image_url",
                "image_url": {
                    "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg",
                },
            },
        ],
    }],
)

print(response.choices[0].message.content)

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

const response = await client.chat.completions.create({
  model: "gpt-4.1-2025-04-14",
  messages: [{
    role: "user",
    content: [
      { type: "text", text: "What is in this image?" },
      {
        type: "image_url",
        image_url: {
          url: "https://en.wikipedia.org/wiki/Pie#/media/File:Arial_view_of_peach_pie_(722379748).jpg",
        },
      },
    ],
  }],
});

console.log(response.choices[0].message.content);

Output:

This image captures a top-down view of a freshly baked peach pie, its golden-brown crust slightly uneven and beautifully rustic, hinting at a homemade charm.

The filling is generous with thick, juicy slices of ripe peaches, their sunset-orange color peeking through the open gaps of the crust.

The peaches glisten slightly, likely coated in a thin glaze of syrup or natural juices caramelized during baking.

The pie's edges are rough and natural, not overly polished, giving it an inviting, cozy, farm-to-table feel.

The background is plain and neutral, keeping full focus on the warm, delicious simplicity of the peach pie itself.

Image Generation

Python:

from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

prompt = """
A cheerful illustration of a fox and a rabbit painting a giant rainbow together in a sunny meadow.
"""

result = client.images.generate(
    model="dall-e-3",
    prompt=prompt,
)

print(result.data[0].url)

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

const prompt = `
A cheerful illustration of a fox and a rabbit painting a giant rainbow together in a sunny meadow.
`;

async function main() {
  const response = await client.images.generate({
    model: "dall-e-3",
    prompt: prompt,
  });

  console.log(response.data[0].url);
}

main();
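
To keep a local copy of the generated image, a minimal follow-up sketch (assuming the endpoint returns a hosted image URL rather than base64 data, as in the Python example above) is:

import urllib.request

# Download the generated image from the URL returned by client.images.generate(...)
image_url = result.data[0].url
urllib.request.urlretrieve(image_url, "rainbow.png")
print("Image saved to rainbow.png")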

Text-to-Speech

Python:

from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

speech_file_path = "speech.mp3"

response = client.audio.speech.create(
    model="tts-1",
    input="Every great idea starts with a single step!",
    voice="alloy",
)

response.stream_to_file(speech_file_path)

JavaScript/TypeScript:

import OpenAI from 'openai';
import * as fs from 'fs';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

async function main() {
  const speechFile = 'speech.mp3';

  const mp3 = await client.audio.speech.create({
    model: "tts-1",
    voice: "alloy",
    input: "Every great idea starts with a single step!",
  });

  const buffer = Buffer.from(await mp3.arrayBuffer());
  await fs.promises.writeFile(speechFile, buffer);

  console.log(`Audio content written to ${speechFile}`);
}

main();

Vector Embeddings

Python:

import os
import openai

client = openai.OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

response = client.embeddings.create(
    model="text-embedding-ada-002",
    input="Sky is blue because air scatters sunlight’s blue wavelengths most.",
)

print(response.data[0].embedding)

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

const response = await client.embeddings.create({
  model: 'text-embedding-3-large',
  input: 'Sky is blue because air scatters sunlight’s blue wavelengths most.',
});

console.log(response.data[0].embedding);

Output:

[0.8594738, 0.5930284, 0.90693754,... ]
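
Embeddings are most often compared with cosine similarity. As an illustration (not an APIpie-specific API), here is a minimal sketch that reuses the client from the Python example above and scores how similar two sentences are; it assumes the endpoint accepts a list of inputs, as OpenAI's does:

import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Embed both sentences with the same model so the vectors are comparable.
texts = ["The sky is blue.", "Why does the sky look blue?"]
result = client.embeddings.create(model="text-embedding-ada-002", input=texts)

print(f"Cosine similarity: {cosine_similarity(result.data[0].embedding, result.data[1].embedding):.3f}")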

Structured Outputs (JSON Mode)

Python:

from pydantic import BaseModel
from openai import OpenAI
import os, json

client = OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Tom and Collin are going to a concert on Saturday. Answer in JSON"},
    ],
    response_format={
        "type": "json_object",
        "schema": CalendarEvent.model_json_schema(),
    },
)

output = json.loads(completion.choices[0].message.content)
print(json.dumps(output, indent=2))

JavaScript/TypeScript:

import OpenAI from 'openai';
import { z } from 'zod';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

// Define the expected schema with Zod
const calendarEventSchema = z.object({
  name: z.string(),
  date: z.string(),
  participants: z.array(z.string()),
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "Extract the event information." },
      { role: "user", content: "Tom and Collin are going to a concert on Saturday. Answer in JSON" },
    ],
    response_format: {
      type: "json_object",
    },
  });

  // Parse and validate the result
  const output = JSON.parse(completion.choices[0].message.content);
  const validatedOutput = calendarEventSchema.parse(output);

  console.log(JSON.stringify(validatedOutput, null, 2));
}

main();

Output:

{
  "name": "Concert",
  "date": "Saturday",
  "participants": [
    "Tom",
    "Collin"
  ]
}
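
On the Python side, the parsed output can also be validated against the same CalendarEvent model that was used to build the schema. A minimal sketch (Pydantic v2):

# Validate the parsed JSON so malformed responses fail loudly.
event = CalendarEvent.model_validate(output)
print(event.name, event.date, event.participants)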

Tools (Function Calling)

Python:

from openai import OpenAI
import os, json

client = OpenAI(
    api_key=os.environ.get("APIPIE_API_KEY"),
    base_url="https://apipie.ai/v1",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                }
            },
            "required": ["location"],
            "additionalProperties": False
        },
        "strict": True
    }
}]

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is the weather like in Dallas?"}],
    tools=tools,
    tool_choice="auto"
)

print(json.dumps(completion.choices[0].message.model_dump()['tool_calls'], indent=2))

JavaScript/TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

const tools = [{
  "type": "function",
  "function": {
    "name": "get_weather",
    "description": "Get current temperature for a given location.",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type": "string",
          "description": "City and country e.g. Bogotá, Colombia"
        }
      },
      "required": ["location"],
      "additionalProperties": false
    },
    "strict": true
  }
}];

async function main() {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "What is the weather like in Paris today?" }],
    tools,
    tool_choice: "auto"
  });

  console.log(JSON.stringify(completion.choices[0].message.tool_calls, null, 2));
}

main();
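
The examples above only print the tool call that the model requests. In a real application you would execute the matching function and send its result back as a tool message so the model can produce a final answer. A minimal Python sketch, using a hypothetical local get_weather implementation:

# Hypothetical local implementation of the get_weather tool declared above.
def get_weather(location: str) -> str:
    return f"It is 24°C and sunny in {location}."  # stand-in for a real weather lookup

tool_call = completion.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Send the tool result back to the model in a follow-up request.
follow_up = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "What is the weather like in Dallas?"},
        completion.choices[0].message,  # assistant message containing the tool call
        {
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": get_weather(**args),
        },
    ],
    tools=tools,
)

print(follow_up.choices[0].message.content)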

Benefits of Migrating to APIpie

Migrating to APIpie provides several benefits:

  • Full OpenAI SDK Compatibility: Seamless transition with minimal code changes required.
  • Multi-Provider Support: Access to models from OpenAI, Anthropic, Google, Meta, and many more providers.
  • Redundancy and Advanced Routing: Never lose connectivity, with redundant providers for all major models and advanced price- or speed-based routing.
  • Cost Optimization: Automatically route to the most cost-effective provider for your needs.
  • Enhanced Performance: Optimize for speed when needed with intelligent model routing.
  • Simplified Billing: One account, one API key, one invoice, regardless of how many AI providers you use.

Need Help?

If you encounter any issues during your migration or have further questions, reach out to us via Discord.