
OpenAIRealtimeWebRTC

Transport layer that handles the connection between the client and OpenAI’s Realtime API via WebRTC. While this transport layer is designed to be used within a RealtimeSession, it can also be used standalone if you want a direct connection to the Realtime API.

Unless you specify a mediaStream or audioElement option, the transport layer will automatically configure the microphone and audio output to be used by the session.

new OpenAIRealtimeWebRTC(options): OpenAIRealtimeWebRTC
Parameter Type

options

OpenAIRealtimeWebRTCOptions

OpenAIRealtimeWebRTC

OpenAIRealtimeBase.constructor
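
Example (illustrative sketch): construct the transport standalone with your own audio element, assuming the class is exported from the @openai/agents/realtime entry point as in the Agents SDK.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

// Supply your own output element so the transport does not create
// and manage one itself (see the mediaStream / audioElement note above).
const audioElement = document.createElement('audio');
audioElement.autoplay = true;
document.body.appendChild(audioElement);

const transport = new OpenAIRealtimeWebRTC({ audioElement });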

set _tracingConfig(tracingConfig): void

Sets the internal tracing config. This is used to track the tracing config that has been set during the session.created event.

Parameter Type

tracingConfig

null | RealtimeTracingConfig

void

OpenAIRealtimeBase._tracingConfig


get connectionState(): WebRTCState

The current state of the WebRTC connection, including the peer connection and data channel.

WebRTCState


get currentModel(): OpenAIRealtimeModels

The current model that is being used by the transport layer.

OpenAIRealtimeModels

set currentModel(model): void

The current model that is being used by the transport layer. Note: The model cannot be changed mid-conversation.

Parameter Type

model

OpenAIRealtimeModels

void

OpenAIRealtimeBase.currentModel


get muted(): boolean

Whether the session is muted.

boolean

Whether the input audio track is currently muted; null if muting is not handled by the transport layer.

RealtimeTransportLayer.muted

OpenAIRealtimeBase.muted


get status(): "connecting" | "connected" | "disconnected"

The current status of the WebRTC connection.

"connecting" | "connected" | "disconnected"

RealtimeTransportLayer.status

OpenAIRealtimeBase.status

close(): void

Close the connection to the Realtime API and disconnect the underlying WebRTC connection.

void

RealtimeTransportLayer.close

OpenAIRealtimeBase.close


connect(options): Promise<void>

Connect to the Realtime API. This will establish the connection to the OpenAI Realtime API via WebRTC.

If you are using a browser, the transport layer will also automatically configure the microphone and audio output to be used by the session.

Parameter Type Description

options

RealtimeTransportLayerConnectOptions

The options for the connection.

Promise<void>

RealtimeTransportLayer.connect

OpenAIRealtimeBase.connect
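
Example (illustrative sketch): connect the transport created above and later close it. The apiKey field on the connect options is an assumption; check RealtimeTransportLayerConnectOptions, and prefer a short-lived client key minted by your backend over a long-lived API key in the browser.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

// Hypothetical helper that asks your backend for a short-lived client key.
declare function fetchEphemeralKey(): Promise<string>;

async function start(transport: OpenAIRealtimeWebRTC) {
  const apiKey = await fetchEphemeralKey();

  await transport.connect({ apiKey });
  console.log(transport.status);          // 'connected' once the session is up
  console.log(transport.connectionState); // peer connection and data channel state
}

// Later, tear the connection down.
function stop(transport: OpenAIRealtimeWebRTC) {
  transport.close();
}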


emit<K>(type, ...args): boolean
Type Parameter

K extends keyof RealtimeTranportEventTypes

Parameter Type

type

K

args

OpenAIRealtimeEventTypes[K]

boolean

RealtimeTransportLayer.emit

OpenAIRealtimeBase.emit


interrupt(): void

Interrupt the current response if one is ongoing and clear the audio buffer so that the agent stops talking.

void

RealtimeTransportLayer.interrupt

OpenAIRealtimeBase.interrupt
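
Example (illustrative sketch): wire interrupt() to a hypothetical Stop button so the user can cut the agent off mid-response.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function wireStopButton(transport: OpenAIRealtimeWebRTC) {
  const stopButton = document.querySelector<HTMLButtonElement>('#stop');
  stopButton?.addEventListener('click', () => {
    // Cancels the in-flight response and clears buffered audio output.
    transport.interrupt();
  });
}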


mute(muted): void

Mute or unmute the session.

Parameter Type Description

muted

boolean

Whether to mute the session.

void

RealtimeTransportLayer.mute

OpenAIRealtimeBase.mute
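
Example (illustrative sketch): toggle the microphone from a hypothetical mute button, using the muted getter to flip the current state.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function wireMuteButton(transport: OpenAIRealtimeWebRTC) {
  const muteButton = document.querySelector<HTMLButtonElement>('#mute');
  muteButton?.addEventListener('click', () => {
    transport.mute(!transport.muted);
  });
}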


off<K>(type, listener): EventEmitter<EventTypes>
Type Parameter

K extends keyof RealtimeTranportEventTypes

Parameter Type

type

K

listener

(...args) => void

EventEmitter<EventTypes>

RealtimeTransportLayer.off

OpenAIRealtimeBase.off


on<K>(type, listener): EventEmitter<EventTypes>
Type Parameter

K extends keyof RealtimeTranportEventTypes

Parameter Type

type

K

listener

(...args) => void

EventEmitter<EventTypes>

RealtimeTransportLayer.on

OpenAIRealtimeBase.on


once<K>(type, listener): EventEmitter<EventTypes>
Type Parameter

K extends keyof RealtimeTranportEventTypes

Parameter Type

type

K

listener

(...args) => void

EventEmitter<EventTypes>

RealtimeTransportLayer.once

OpenAIRealtimeBase.once
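
on, once, and off are the standard EventEmitter registration methods. Example (illustrative sketch): the '*' and 'error' event names are assumptions here; check RealtimeTranportEventTypes for the exact event names and payloads.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function attachLogging(transport: OpenAIRealtimeWebRTC) {
  // Log every raw server event the transport passes through.
  const logEvent = (event: unknown) => console.log('server event', event);
  transport.on('*', logEvent);

  // Handle only the first error.
  transport.once('error', (error: unknown) => console.error(error));

  // Return a cleanup function that removes the catch-all listener.
  return () => transport.off('*', logEvent);
}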


resetHistory(oldHistory, newHistory): void

Reset the history of the conversation. This will create a diff between the old and new history and send the necessary events to the Realtime API to update the history.

Parameter Type Description

oldHistory

RealtimeItem[]

The old history of the conversation.

newHistory

RealtimeItem[]

The new history of the conversation.

void

RealtimeTransportLayer.resetHistory

OpenAIRealtimeBase.resetHistory
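
Example (illustrative sketch): remove a single item by rebuilding the desired history and letting the transport compute the diff. This assumes you keep a copy of the current RealtimeItem[] on your side and that items expose an itemId field.

import { OpenAIRealtimeWebRTC, type RealtimeItem } from '@openai/agents/realtime';

function removeItem(transport: OpenAIRealtimeWebRTC, history: RealtimeItem[], itemId: string): RealtimeItem[] {
  const newHistory = history.filter((item) => item.itemId !== itemId);
  // The transport diffs the old and new history and sends the necessary update events.
  transport.resetHistory(history, newHistory);
  return newHistory;
}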


sendAudio(audio, options): void

Send an audio buffer to the Realtime API. If { commit: true } is passed, the audio buffer will be committed and the model will start processing it. This is necessary if you have disabled turn detection / voice activity detection (VAD).

Parameter Type Description

audio

ArrayBuffer

The audio buffer to send.

options

{ commit: boolean; }

The options for the audio buffer.

options.commit?

boolean

void

RealtimeTransportLayer.sendAudio

OpenAIRealtimeBase.sendAudio
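
Example (illustrative sketch): a push-to-talk flow with turn detection disabled, streaming audio chunks and committing the buffer on release. Capturing and encoding the audio is up to you.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

// While the push-to-talk key is held, forward each encoded chunk.
function pushChunk(transport: OpenAIRealtimeWebRTC, chunk: ArrayBuffer) {
  transport.sendAudio(chunk, {});
}

// On release, send the final chunk and commit so the model starts processing.
function releasePushToTalk(transport: OpenAIRealtimeWebRTC, lastChunk: ArrayBuffer) {
  transport.sendAudio(lastChunk, { commit: true });
}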


sendEvent(event): void

Send an event to the Realtime API. This will stringify the event and send it directly to the API. This can be used if you want to take control of the connection and send events manually.

Parameter Type Description

event

RealtimeClientMessage

The event to send.

void

RealtimeTransportLayer.sendEvent

OpenAIRealtimeBase.sendEvent
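
Example (illustrative sketch): manually request a response instead of relying on the higher-level helpers; event shapes follow the Realtime API client events.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function requestResponse(transport: OpenAIRealtimeWebRTC) {
  // The transport stringifies the event and sends it over the data channel.
  transport.sendEvent({ type: 'response.create' });
}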


sendFunctionCallOutput(toolCall, output, startResponse): void

Send the output of a function call to the Realtime API.

Parameter Type Default value Description

toolCall

TransportToolCallEvent

undefined

The tool call to send the output for.

output

string

undefined

The output of the function call.

startResponse

boolean

true

Whether to start a new response after sending the output.

void

RealtimeTransportLayer.sendFunctionCallOutput

OpenAIRealtimeBase.sendFunctionCallOutput
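
Example (illustrative sketch): reply to a tool call received from a transport event. A RealtimeSession normally manages this flow for you; the handler wiring and lookupWeather() are hypothetical, and the arguments field is assumed to contain a JSON string.

import { OpenAIRealtimeWebRTC, type TransportToolCallEvent } from '@openai/agents/realtime';

// Hypothetical local implementation of the tool being called.
declare function lookupWeather(args: unknown): Promise<unknown>;

async function handleToolCall(transport: OpenAIRealtimeWebRTC, toolCall: TransportToolCallEvent) {
  const result = await lookupWeather(JSON.parse(toolCall.arguments));

  // Send the output back and start a new response so the model can use it.
  transport.sendFunctionCallOutput(toolCall, JSON.stringify(result), true);
}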


sendMessage(message, otherEventData): void

Send a message to the Realtime API. This will create a new item in the conversation and trigger a response.

Parameter Type Description

message

RealtimeUserInput

The message to send.

otherEventData

Record<string, any>

Additional event data to send.

void

RealtimeTransportLayer.sendMessage

OpenAIRealtimeBase.sendMessage
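
Example (illustrative sketch): send a plain text user message. This assumes RealtimeUserInput accepts a string and passes an empty object for the additional event data.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function askQuestion(transport: OpenAIRealtimeWebRTC) {
  // Creates a user message item in the conversation and triggers a response.
  transport.sendMessage('What is the weather like in Berlin right now?', {});
}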


updateSessionConfig(config): void

Updates the session config. This will merge it with the current session config and the default values, and send the result to the Realtime API.

Parameter Type Description

config

Partial<RealtimeSessionConfig>

The session config to update.

void

RealtimeTransportLayer.updateSessionConfig

OpenAIRealtimeBase.updateSessionConfig
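
Example (illustrative sketch): change the instructions mid-session; the instructions field is assumed to be part of RealtimeSessionConfig.

import { OpenAIRealtimeWebRTC } from '@openai/agents/realtime';

function makeTerse(transport: OpenAIRealtimeWebRTC) {
  // Only the fields you pass change; the transport merges them with the
  // current config and defaults before sending the update to the API.
  transport.updateSessionConfig({
    instructions: 'You are a terse assistant. Answer in one short sentence.',
  });
}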