
Using any model with the Vercel AI SDK

Out of the box, the Agents SDK works with OpenAI models through the Responses API or the Chat Completions API. If you would like to use another model, the Vercel AI SDK supports a wide range of providers that can be brought into the Agents SDK through this adapter.

  1. Install the AI SDK adapter by adding the extensions package:

    npm install @openai/agents-extensions
  2. Choose your desired model package from the Vercel AI SDK and install it:

    npm install @ai-sdk/openai
  3. Import the adapter and the model package so they can be connected to your agent:

    import { openai } from '@ai-sdk/openai';
    import { aisdk } from '@openai/agents-extensions';
  4. Initialize an instance of the model to be used by the agent:

    const model = aisdk(openai('o4-mini'));
AI SDK Setup
import { Agent, run } from '@openai/agents';
// Import the model package you installed
import { openai } from '@ai-sdk/openai';
// Import the adapter
import { aisdk } from '@openai/agents-extensions';

// Create a model instance to be used by the agent
const model = aisdk(openai('o4-mini'));

// Create an agent with the model
const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model,
});

// Run the agent with the new model and print the final output
const result = await run(agent, 'What is the capital of Germany?');
console.log(result.finalOutput);
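
The same pattern applies to any provider supported by the AI SDK. As a minimal sketch, assuming you have installed @ai-sdk/anthropic and configured its API key (the model name below is illustrative), you can swap the wrapped model without changing the rest of the agent:

import { Agent, run } from '@openai/agents';
import { anthropic } from '@ai-sdk/anthropic';
import { aisdk } from '@openai/agents-extensions';

// Wrap an AI SDK model from a different provider with the same adapter
const model = aisdk(anthropic('claude-3-5-sonnet-latest'));

const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model,
});

const result = await run(agent, 'What is the capital of Germany?');
console.log(result.finalOutput);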