Responses WebSocket Session
ResponsesWebSocketSession
dataclass
Helper that pins Runner calls to a shared, websocket-capable OpenAI provider.
Source code in src/agents/responses_websocket_session.py
aclose
async
run
async
run(
starting_agent: Agent[Any],
input: str | list[TResponseInputItem] | RunState[Any],
**kwargs: Any,
) -> RunResult
Call Runner.run with the session's shared RunConfig.
Source code in src/agents/responses_websocket_session.py
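A minimal sketch of calling `run` on a session (the import path, the agent's name, and the `final_output` attribute are assumptions for illustration, not part of the signature documented above):

```python
# Illustrative sketch only; assumes Agent is importable from the
# agents package and that `session` was obtained from the
# responses_websocket_session context manager documented below.
from agents import Agent


async def ask(session, question: str) -> str:
    agent = Agent(name="Assistant", model="openai/gpt-4.1")
    # Awaiting session.run reuses the session's shared RunConfig,
    # and with it the warm websocket connection.
    result = await session.run(agent, question)
    return result.final_output
```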
run_streamed
run_streamed(
starting_agent: Agent[Any],
input: str | list[TResponseInputItem] | RunState[Any],
**kwargs: Any,
) -> RunResultStreaming
Call Runner.run_streamed with the session's shared RunConfig.
Source code in src/agents/responses_websocket_session.py
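Note that `run_streamed` is not a coroutine: it returns a `RunResultStreaming` immediately, which is then consumed asynchronously. A hedged sketch (the `stream_events()` iterator is an assumption about `RunResultStreaming`, not documented above):

```python
# Illustrative sketch only; `session` and `agent` come from the
# surrounding documented API, stream_events() is an assumption.
async def stream_answer(session, agent, question: str) -> None:
    result = session.run_streamed(agent, question)  # not awaited
    # Drain the stream fully before the session context exits, per
    # the guidance in responses_websocket_session below.
    async for event in result.stream_events():
        print(event)
```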
responses_websocket_session
async
responses_websocket_session(
*,
api_key: str | None = None,
base_url: str | None = None,
websocket_base_url: str | None = None,
organization: str | None = None,
project: str | None = None,
) -> AsyncIterator[ResponsesWebSocketSession]
Create a shared OpenAI Responses websocket session for multiple Runner calls.
The helper yields a session object that injects a single shared RunConfig backed by a
websocket-configured MultiProvider wrapping one shared OpenAIProvider. This preserves
prefix-based model routing (for example openai/gpt-4.1) while keeping websocket
connections warm across turns and across nested agent-as-tool runs that inherit the
same run_config.
Drain or close streamed iterators before the context exits. Exiting the context while a websocket request is still in flight may force-close the shared connection.