# AI SDK Integration

Out of the box, the Agents SDK works with OpenAI models through the Responses API or Chat Completions API. If you would like to use another model, the Vercel AI SDK offers a range of supported models that can be brought into the Agents SDK through this adapter.
1. Install the adapter by installing the extensions package:

   ```bash
   npm install @openai/agents-extensions
   ```

2. Choose your desired model package from Vercel's AI SDK and install it:

   ```bash
   npm install @ai-sdk/openai
   ```

3. Import the adapter and the model package to connect to your agent:

   ```typescript
   import { openai } from '@ai-sdk/openai';
   import { aisdk } from '@openai/agents-extensions/ai-sdk';
   ```

4. Initialize an instance of the model to be used by the agent:

   ```typescript
   import { openai } from '@ai-sdk/openai';
   import { aisdk } from '@openai/agents-extensions/ai-sdk';

   const model = aisdk(openai('gpt-5.4'));
   ```
## Code examples

```typescript
import { Agent, run } from '@openai/agents';
// Import the model package you installed
import { openai } from '@ai-sdk/openai';
// Import the adapter
import { aisdk } from '@openai/agents-extensions/ai-sdk';

// Create a model instance to be used by the agent
const model = aisdk(openai('gpt-5.4'));

// Create an agent with the model
const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model,
});

// Run the agent with the new model
run(agent, 'What is the capital of Germany?');
```

## Passing provider metadata
If you need to send provider-specific options with a message, pass them through `providerMetadata`. The values are forwarded directly to the underlying AI SDK model. For example, the following `providerData` in the Agents SDK:

```typescript
const providerData = {
  anthropic: {
    cacheControl: {
      type: 'ephemeral',
    },
  },
};
```

would become the following `providerMetadata` when using the AI SDK integration:

```typescript
const providerMetadata = {
  anthropic: {
    cacheControl: {
      type: 'ephemeral',
    },
  },
};
```
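As a standalone sketch of this mapping (plain TypeScript, no SDK calls — the helper name here is hypothetical; the real adapter does this internally), note that only the field name changes while the values pass through untouched:

```typescript
// Hypothetical helper illustrating the mapping described above: the
// Agents SDK's `providerData` is forwarded unchanged to the AI SDK
// model under the `providerMetadata` key.
type ProviderData = Record<string, Record<string, unknown>>;

function toProviderMetadata(providerData: ProviderData): ProviderData {
  // Values are forwarded as-is; the adapter does not transform them.
  return providerData;
}

const providerMetadata = toProviderMetadata({
  anthropic: { cacheControl: { type: 'ephemeral' } },
});

console.log(JSON.stringify(providerMetadata));
```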
## Normalizing finalized output text

Some providers return structured output as plain text with extra wrapping, such as JSON code fences. If you need provider-specific cleanup before the Agents runtime validates the final output, pass `transformOutputText` when creating the adapter:

```typescript
import { openai } from '@ai-sdk/openai';
import { aisdk } from '@openai/agents-extensions/ai-sdk';

const model = aisdk(openai('gpt-5.4'), {
  transformOutputText(text) {
    return text.match(/```(?:json)?\s*([\s\S]*?)\s*```/)?.[1]?.trim() ?? text;
  },
});
```

`transformOutputText` runs on finalized assistant text for non-streamed responses and on the final `response_done` event for streamed responses. It does not modify incremental `output_text_delta` events.
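As a standalone illustration of what that transform does (the function name here is illustrative; the regex is the one from the example above), it unwraps a JSON code fence when one is present and leaves anything else alone:

```typescript
// Illustrative standalone version of the transform above (no SDK
// required). It extracts the body of a fenced code block tagged
// `json` (or untagged), and otherwise returns the text unchanged.
function stripJsonFence(text: string): string {
  return text.match(/```(?:json)?\s*([\s\S]*?)\s*```/)?.[1]?.trim() ?? text;
}

// Fenced provider output is unwrapped before validation:
const fenced = '```json\n{"capital":"Berlin"}\n```';
console.log(stripJsonFence(fenced)); // {"capital":"Berlin"}

// Plain text passes through untouched:
console.log(stripJsonFence('{"capital":"Berlin"}')); // {"capital":"Berlin"}
```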
## Retries

`modelSettings.retry` works with AI SDK-backed models too, because retries are implemented by the Agents runtime rather than only by the default OpenAI provider. That means you can attach the same retry configuration you would use elsewhere:

- Set `modelSettings.retry` on the `Agent`, the `Runner`, or both.
- Compose `retryPolicies` such as `networkError()`, `httpStatus([...])`, or `providerSuggested()`.
- Keep in mind that `providerSuggested()` only helps when the wrapped AI SDK model can surface retry advice through the adapter.

For a complete example using `aisdk(openai(...))`, see `examples/ai-sdk/retry.ts`. For the retry API itself, including safety boundaries for streaming and stateful follow-up requests, see the Models guide.
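Pulling those points together, a hedged sketch of where the configuration attaches might look like the following — the shape of the `retry` object itself is left as a comment because its exact fields are not shown here; consult the Models guide and `examples/ai-sdk/retry.ts` for the authoritative API:

```typescript
// SKETCH ONLY: the contents of `retry` below are omitted because the
// exact option shape is documented in the Models guide, not here.
import { Agent } from '@openai/agents';
import { openai } from '@ai-sdk/openai';
import { aisdk } from '@openai/agents-extensions/ai-sdk';

const agent = new Agent({
  name: 'My Agent',
  instructions: 'You are a helpful assistant.',
  model: aisdk(openai('gpt-5.4')),
  modelSettings: {
    // Retries run in the Agents runtime, so the same configuration
    // applies whether the model is OpenAI-native or AI SDK-backed.
    retry: {
      // e.g. compose retryPolicies such as networkError(),
      // httpStatus([...]), or providerSuggested()
    },
  },
});
```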
## Picking the right integration

There are two related integrations in `@openai/agents-extensions`:

- `@openai/agents-extensions/ai-sdk` adapts an AI SDK model so an `Agent` can run on it.
- `@openai/agents-extensions/ai-sdk-ui` adapts a streamed Agents SDK run so AI SDK UI routes can return a standard streaming `Response`.
## Notes for AI SDK models

- The `@openai/agents-extensions/ai-sdk` adapter is still in beta, so it is worth testing carefully with your chosen provider, especially smaller ones.
- If you are using OpenAI models, prefer the default OpenAI model provider instead of this adapter.
- Supported AI SDK providers must expose `specificationVersion` `v2` or `v3`. If you need the older v1 provider style, copy the module from `examples/ai-sdk-v1` into your project.
- Deferred Responses tool-loading flows are not supported here. That includes `toolNamespace()`, function tools with `deferLoading: true`, and `toolSearchTool()`. If you need tool search, use an OpenAI Responses model directly. See the Tools guide and Models guide.
## AI SDK UI stream helpers

`@openai/agents-extensions/ai-sdk-ui` provides response helpers for wiring Agents SDK streams into AI SDK UI routes:

- `createAiSdkTextStreamResponse(source, options?)` for plain text streaming responses.
- `createAiSdkUiMessageStreamResponse(source, options?)` for `UIMessageChunk` streaming responses.

Both helpers accept a `StreamedRunResult`, a stream-like source, or a compatible wrapper object, and return a `Response` with streaming-friendly headers. Use `createAiSdkUiMessageStreamResponse(...)` when your UI needs structured chunks such as tool calls or reasoning parts; use `createAiSdkTextStreamResponse(...)` when you only want plain text.
Example Next.js route for UI message streaming:

```typescript
import { Agent, run } from '@openai/agents';
import { createAiSdkUiMessageStreamResponse } from '@openai/agents-extensions/ai-sdk-ui';

const agent = new Agent({
  name: 'Assistant',
  instructions: 'Reply with a short answer.',
});

export async function POST() {
  const stream = await run(agent, 'Hello there.', { stream: true });
  return createAiSdkUiMessageStreamResponse(stream);
}
```

Example Next.js route for text-only streaming:

```typescript
import { Agent, run } from '@openai/agents';
import { createAiSdkTextStreamResponse } from '@openai/agents-extensions/ai-sdk-ui';

const agent = new Agent({
  name: 'Assistant',
  instructions: 'Reply with a short answer.',
});

export async function POST() {
  const stream = await run(agent, 'Hello there.', { stream: true });
  return createAiSdkTextStreamResponse(stream);
}
```

For end-to-end usage, see the `examples/ai-sdk-ui` app in this repository.