Using any model with the Vercel AI SDK
Out of the box, the Agents SDK works with OpenAI models through the Responses API or Chat Completions API. If you would like to use another model, the Vercel AI SDK offers a range of supported models that can be brought into the Agents SDK through this adapter.
1. Install the AI SDK adapter via the extensions package:

   ```bash
   npm install @openai/agents-extensions
   ```

2. Choose your desired model package from the Vercel AI SDK and install it:

   ```bash
   npm install @ai-sdk/openai
   ```

3. Import the adapter and the model package to connect them to your agent:

   ```typescript
   import { openai } from '@ai-sdk/openai';
   import { aisdk } from '@openai/agents-extensions';
   ```

4. Initialize an instance of the model to be used by the agent:

   ```typescript
   const model = aisdk(openai('o4-mini'));
   ```
Example

```typescript
import { Agent, run } from '@openai/agents';
// Import the model package you installed
import { openai } from '@ai-sdk/openai';
// Import the adapter
import { aisdk } from '@openai/agents-extensions';

// Create a model instance to be used by the agent
const model = aisdk(openai('o4-mini'));

// Create an agent with the model
const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model,
});

// Run the agent with the new model and print the final output
const result = await run(agent, 'What is the capital of Germany?');
console.log(result.finalOutput);
```
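The same pattern works with any Vercel AI SDK provider package: only the model instance changes. As a sketch (assuming the `@ai-sdk/anthropic` package is installed and an `ANTHROPIC_API_KEY` is set in your environment; the model id is illustrative):

```typescript
import { Agent, run } from '@openai/agents';
import { anthropic } from '@ai-sdk/anthropic';
import { aisdk } from '@openai/agents-extensions';

// Wrap an Anthropic model from the AI SDK in the adapter.
// The model id below is an example; use any id your provider supports.
const model = aisdk(anthropic('claude-3-5-sonnet-latest'));

const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model,
});

const result = await run(agent, 'What is the capital of Germany?');
console.log(result.finalOutput);
```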
Passing provider metadata

If you need to send provider-specific options with a message, pass them through `providerMetadata`. The values are forwarded directly to the underlying AI SDK model. For example, the following `providerData` in the Agents SDK:

```typescript
providerData: {
  anthropic: {
    cacheControl: {
      type: 'ephemeral',
    },
  },
}
```

becomes:

```typescript
providerMetadata: {
  anthropic: {
    cacheControl: {
      type: 'ephemeral',
    },
  },
}
```

when using the AI SDK integration.
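To make this concrete, here is a minimal sketch of where `providerData` sits in practice: attached to an individual input item passed to `run`. The message shape follows the Agents SDK's input item format; treat the exact fields as an assumption to verify against your SDK version.

```typescript
import { Agent, run } from '@openai/agents';
import { anthropic } from '@ai-sdk/anthropic';
import { aisdk } from '@openai/agents-extensions';

const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model: aisdk(anthropic('claude-3-5-sonnet-latest')),
});

// Attach provider-specific options to a single message via providerData.
// The adapter forwards them to the AI SDK model as providerMetadata.
const result = await run(agent, [
  {
    role: 'user',
    content: 'What is the capital of Germany?',
    providerData: {
      anthropic: { cacheControl: { type: 'ephemeral' } },
    },
  },
]);
console.log(result.finalOutput);
```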