Agents module
ResponsesWebSocketSession (dataclass)
Helper that pins runs to a shared OpenAI websocket-capable provider.
Source code in src/agents/responses_websocket_session.py
aclose (async)

run (async)

```python
run(
    starting_agent: Agent[Any],
    input: str | list[TResponseInputItem] | RunState[Any],
    **kwargs: Any,
) -> RunResult
```
Call Runner.run with the session's shared RunConfig.
Source code in src/agents/responses_websocket_session.py
run_streamed

```python
run_streamed(
    starting_agent: Agent[Any],
    input: str | list[TResponseInputItem] | RunState[Any],
    **kwargs: Any,
) -> RunResultStreaming
```
Call Runner.run_streamed with the session's shared RunConfig.
Source code in src/agents/responses_websocket_session.py
set_default_openai_key
Set the default OpenAI API key to use for LLM requests (and, optionally, for tracing). This is only necessary if the OPENAI_API_KEY environment variable is not already set.
If provided, this key will be used instead of the OPENAI_API_KEY environment variable.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `key` | `str` | The OpenAI key to use. | *required* |
| `use_for_tracing` | `bool` | Whether to also use this key to send traces to OpenAI. Defaults to `True`. If `False`, you'll either need to set the `OPENAI_API_KEY` environment variable or call `set_tracing_export_api_key()` with the API key you want to use for tracing. | `True` |
Source code in src/agents/__init__.py
set_default_openai_client
Set the default OpenAI client to use for LLM requests and/or tracing. If provided, this client will be used instead of the default OpenAI client.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `client` | `AsyncOpenAI` | The OpenAI client to use. | *required* |
| `use_for_tracing` | `bool` | Whether to use the API key from this client for uploading traces. If `False`, you'll either need to set the `OPENAI_API_KEY` environment variable or call `set_tracing_export_api_key()` with the API key you want to use for tracing. | `True` |
Source code in src/agents/__init__.py
set_default_openai_api
Set the default API to use for OpenAI LLM requests. By default, we use the Responses API, but you can set this to use the Chat Completions API instead.
Source code in src/agents/__init__.py
set_default_openai_responses_transport
Set the default transport for OpenAI Responses API requests.
By default, the Responses API uses the HTTP transport. Set this to "websocket" to use
websocket transport when the OpenAI provider resolves a Responses model.
Source code in src/agents/__init__.py
set_tracing_export_api_key

Set the API key used to export traces to OpenAI, separate from the key used for LLM requests.

set_tracing_disabled

Globally disable (or re-enable) tracing.

set_trace_processors

```python
set_trace_processors(
    processors: list[TracingProcessor],
) -> None
```

Set the list of trace processors. This will replace the current list of processors.
enable_verbose_stdout_logging
Enables verbose logging to stdout. This is useful for debugging.