# Vercel AI SDK Integration Guide



This guide walks you through integrating the Vercel AI SDK with APIpie, enabling you to build AI-powered applications with streaming and tool-calling capabilities.

## What is Vercel AI SDK?

The Vercel AI SDK is a library designed to help developers build AI-powered user interfaces. It provides tools and components for:

- Streaming text generation from various language models
- Built-in React hooks for chat interfaces and completions
- Tool calling capabilities for executing functions based on AI requests
- Streaming UI patterns for implementing typewriter effects and more
- Optimized performance with edge runtime support

By connecting the Vercel AI SDK to APIpie, you unlock access to a wide range of models while leveraging Vercel's optimized UI components and streaming capabilities.
## Integration Steps

### 1. Create an APIpie Account

- Register here: APIpie Registration
- Complete the sign-up process.

### 2. Add Credit

- Add credit here: APIpie Subscription
- Add credits to your account to enable API access.

### 3. Generate an API Key

- Visit API key management: APIpie API Keys
- Create a new API key for use with the Vercel AI SDK.
### 4. Install Vercel AI SDK

Install the OpenAI-compatible provider for the Vercel AI SDK:

```bash
npm install @ai-sdk/openai-compatible
# or
yarn add @ai-sdk/openai-compatible
# or
pnpm add @ai-sdk/openai-compatible
```
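With the package installed, your API key can live in an environment file. A minimal `.env.local` for Next.js (the variable name `APIPIE_API_KEY` matches the snippets in this guide; the value shown is a placeholder):

```bash
# .env.local — loaded automatically by Next.js; keep this file out of version control
APIPIE_API_KEY=your-apipie-key-here
```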
### 5. Configure Vercel AI SDK for APIpie

Create a provider instance with your APIpie API key:

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

// Create a provider instance pointed at APIpie's OpenAI-compatible endpoint
const provider = createOpenAICompatible({
  name: 'apipie',
  apiKey: process.env.APIPIE_API_KEY,
  baseURL: 'https://apipie.ai/v1',
});

// Use the provider with a specific model
const model = provider('gpt-4o-mini'); // or any model available on APIpie
```
## Key Features

- **Streaming Responses**: Get real-time tokens as they're generated for responsive UIs
- **React Hooks**: Pre-built hooks for chat and completion interfaces
- **Tool Calling**: Define and execute functions based on AI-driven decisions
- **UI Components**: Build elegant AI interfaces with minimal code
- **Edge Compatible**: Optimized for deployment on Vercel's Edge runtime
## Example Workflows

| Application Type | What Vercel AI SDK Helps You Build |
|---|---|
| Chat Interfaces | Interactive conversational applications with streaming UIs |
| Text Generation | Applications that generate and stream content to users |
| AI Function Calling | AI agents that can make API calls, query databases, etc. |
| Next.js AI Applications | Seamless integration of AI into your Next.js projects |
| Multi-Modal Applications | Applications that handle both text and image inputs/outputs |
## Using Vercel AI SDK with APIpie

### Basic Text Generation

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { streamText } from 'ai';

export async function generateRecipe() {
  const apipie = createOpenAICompatible({
    name: 'apipie',
    apiKey: process.env.APIPIE_API_KEY,
    baseURL: 'https://apipie.ai/v1',
  });

  const response = await streamText({
    model: apipie('gpt-4o-mini'),
    prompt: 'Write a vegetarian lasagna recipe for 4 people.',
  });

  // Option 1: process the streaming response chunk by chunk
  for await (const chunk of response.textStream) {
    console.log(chunk);
  }

  // Option 2: the full text resolves once the stream has finished
  // (if you skip the loop above, call `await response.consumeStream()` first)
  return response.text;
}
```
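The `for await` loop works because `textStream` is an async iterable of string chunks. A self-contained sketch of that consumption pattern, with a mock generator standing in for the SDK's stream (the generator and its chunks are illustrative, not part of the SDK):

```typescript
// Stand-in for response.textStream: any AsyncIterable<string> behaves the same way.
async function* mockTextStream(): AsyncIterable<string> {
  for (const chunk of ['Preheat ', 'the oven ', 'to 180°C.']) {
    yield chunk; // the SDK yields text chunks as the model produces them
  }
}

// Accumulate chunks exactly as the recipe example does with response.textStream.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk; // in a UI, append each chunk to the rendered output instead
  }
  return text;
}
```

The same loop drives typewriter-style UIs: render each chunk as it arrives rather than concatenating silently.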
### Tool Calling with Zod Schemas

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { streamText } from 'ai';
import { z } from 'zod';

export async function getWeatherWithAI() {
  const apipie = createOpenAICompatible({
    name: 'apipie',
    apiKey: process.env.APIPIE_API_KEY,
    baseURL: 'https://apipie.ai/v1',
  });

  const response = await streamText({
    model: apipie('gpt-4o-mini'),
    prompt: 'What is the weather in San Francisco, CA in Fahrenheit?',
    tools: {
      getCurrentWeather: {
        description: 'Get the current weather in a given location',
        parameters: z.object({
          location: z.string().describe('The city and state, e.g. San Francisco, CA'),
          unit: z.enum(['celsius', 'fahrenheit']).optional(),
        }),
        execute: async ({ location, unit = 'celsius' }) => {
          // In a real application, this would call your weather API
          console.log(`Fetching weather for ${location} in ${unit}`);
          // Mock response
          return `The current weather in ${location} is 64°F.`;
        },
      },
    },
  });

  await response.consumeStream();
  return response.text;
}
```
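Conceptually, the SDK parses the model's tool call (a tool name plus JSON arguments), validates the arguments against the Zod schema, and invokes the matching `execute` function. A simplified, dependency-free sketch of that dispatch step; the `runToolCall` helper and the hard-coded weather string are illustrative, not SDK API:

```typescript
// A tool registry mirroring the shape used in the streamText example above.
type Tool = { execute: (args: Record<string, unknown>) => Promise<string> };

const tools: Record<string, Tool> = {
  getCurrentWeather: {
    execute: async (args) => {
      const location = String(args.location);
      const unit = (args.unit as string) ?? 'celsius'; // same default as the example
      return `The current weather in ${location} is 18°${unit === 'celsius' ? 'C' : 'F'}.`;
    },
  },
};

// Dispatch a parsed tool call the way the SDK does after schema validation.
async function runToolCall(call: { name: string; args: Record<string, unknown> }) {
  const tool = tools[call.name];
  if (!tool) throw new Error(`Unknown tool: ${call.name}`);
  return tool.execute(call.args);
}
```

In the real SDK this dispatch happens inside `streamText`; the tool result is fed back to the model so it can compose a final answer.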
### Using with React Hooks

```tsx
'use client';

import { useChat } from 'ai/react';

// The APIpie provider is configured server-side in the API route,
// so the API key never reaches the browser.
export default function ChatComponent() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat', // API route for server-side processing
    streamProtocol: 'text', // match the route's toTextStreamResponse()
  });

  return (
    <div>
      <div className="messages">
        {messages.map(m => (
          <div key={m.id} className={m.role}>
            {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```
API route implementation (`app/api/chat/route.js` in the Next.js App Router):

```javascript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { streamText } from 'ai';

export const runtime = 'edge';

export async function POST(req) {
  const { messages } = await req.json();

  const apipie = createOpenAICompatible({
    name: 'apipie',
    apiKey: process.env.APIPIE_API_KEY,
    baseURL: 'https://apipie.ai/v1',
  });

  const response = await streamText({
    model: apipie('gpt-4o-mini'),
    messages,
  });

  return response.toTextStreamResponse();
}
```
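On the client, the `Response` returned by `toTextStreamResponse()` can be read incrementally with a standard stream reader. A self-contained sketch using a locally constructed `Response` in place of a real `fetch('/api/chat', ...)` (the mock stream and its chunks are stand-ins; requires Node 18+ or a browser):

```typescript
// Read a streamed Response body chunk by chunk, as a client of the /api/chat route would.
async function readTextResponse(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true }); // render incrementally in a real UI
  }
  return text;
}

// Stand-in for the fetched Response: a Response backed by a chunked byte stream.
const encoder = new TextEncoder();
const body = new ReadableStream({
  start(controller) {
    for (const chunk of ['Hello', ', ', 'world']) controller.enqueue(encoder.encode(chunk));
    controller.close();
  },
});
const mockResponse = new Response(body);
```

`useChat` performs this reading for you; the manual loop is only needed when calling the route without the hook.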
## Troubleshooting & FAQ

- **Which models are supported?**
  Any model available via APIpie's OpenAI-compatible endpoint.
- **How do I handle environment variables securely?**
  Store your API key in environment variables and never expose it in client-side code. For Next.js, use `.env.local` for development and Vercel environment variables for production.
- **Can I use streaming responses with server components?**
  Yes, streaming responses work with React Server Components in the Next.js App Router.
- **How do I add custom headers to requests?**
  Use the `headers` option when creating your provider:

  ```typescript
  const provider = createOpenAICompatible({
    name: 'apipie',
    apiKey: process.env.APIPIE_API_KEY,
    baseURL: 'https://apipie.ai/v1',
    headers: { 'Custom-Header': 'value' },
  });
  ```
For more information, see the Vercel AI SDK documentation or its GitHub repository.

## Support

If you encounter any issues during the integration process, reach out on the APIpie Discord for assistance.