AiSdkModel
Wraps a model from the AI SDK that adheres to the LanguageModelV1 spec so it can be used as a model in the OpenAI Agents SDK, letting you use models from other providers.
While you can use this with OpenAI models, it is recommended to use the default OpenAI model provider instead.
If tracing is enabled, the model will send generation spans to your traces processor.
```typescript
import { Agent } from '@openai/agents';
import { aisdk } from '@openai/agents-extensions';
import { openai } from '@ai-sdk/openai';

const model = aisdk(openai('gpt-4o'));

const agent = new Agent({ name: 'My Agent', model });
```
Implements

Model

Constructors

Constructor

`new AiSdkModel(model): AiSdkModel`

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `model` | `LanguageModelV1` | The Vercel AI SDK model to wrap. |

Returns

AiSdkModel
Methods

getResponse()

`getResponse(request): Promise<{ output: AgentOutputItem[]; responseId: string; usage: Usage; }>`
Get a response from the model.
Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `request` | | The request to get a response for. |
Returns

`Promise<{ output: AgentOutputItem[]; responseId: string; usage: Usage; }>`
Implementation of

Model.getResponse
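As a sketch of the resolved shape, the snippet below destructures a mocked response standing in for a real `getResponse` call. The field names follow the signature above; the id and token counts are purely illustrative values, not real API output.

```typescript
// Hypothetical mock standing in for: await model.getResponse(request).
// Field names match the documented return type; values are illustrative.
const response = {
  output: [{ type: 'message', role: 'assistant', content: 'Hi!' }],
  responseId: 'resp_123', // illustrative id
  usage: { inputTokens: 10, outputTokens: 2, totalTokens: 12 }, // illustrative counts
};

// Destructure the three documented fields.
const { output, responseId, usage } = response;

console.log(responseId); // "resp_123"
console.log(usage.totalTokens); // 12
console.log(output.length); // 1
```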
getStreamedResponse()
`getStreamedResponse(request): AsyncIterable<StreamEvent>`
Get a streamed response from the model.
Parameters

| Parameter | Type |
| --- | --- |
| `request` | |
Returns

`AsyncIterable<StreamEvent>`
Implementation of

Model.getStreamedResponse
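A minimal sketch of consuming such a stream with `for await`. The event shape and the `output_text_delta` type name here are illustrative placeholders (the real `StreamEvent` union is defined by the Agents SDK), and the mock generator stands in for `model.getStreamedResponse(request)`:

```typescript
// Illustrative event shape; the real StreamEvent union is richer.
type Event = { type: string; delta?: string };

// Accumulate text deltas from an async stream of events.
async function collectText(stream: AsyncIterable<Event>): Promise<string> {
  let text = '';
  for await (const event of stream) {
    if (event.type === 'output_text_delta' && event.delta) {
      text += event.delta;
    }
  }
  return text;
}

// Mock stream standing in for model.getStreamedResponse(request).
async function* mockStream(): AsyncIterable<Event> {
  yield { type: 'response_started' };
  yield { type: 'output_text_delta', delta: 'Hello, ' };
  yield { type: 'output_text_delta', delta: 'world!' };
  yield { type: 'response_done' };
}

collectText(mockStream()).then((text) => console.log(text)); // logs "Hello, world!"
```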