Configuration
This page covers SDK-wide defaults that you usually set once during app startup, such as the default OpenAI client, transport, tracing export key, and debug logging behavior. These settings apply process-wide by default, so this is the right place for universal configuration rather than per-agent or per-run tuning.
If you need to configure a specific Agent, Runner, or run() call instead, see:
- Running Agents for Runner and per-run options.
- Models for agent-level and runner-level model settings.
- Tracing for run-specific tracing configuration and exporter behavior.
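Because these defaults apply process-wide, it can help to collect them in a single startup module that runs before any agents are constructed. A minimal sketch using setters covered on this page (the TRACING_API_KEY variable name is just an illustration, not an SDK convention):

```typescript
// bootstrap.ts — process-wide SDK defaults, imported once at app startup.
import { setDefaultOpenAIKey, setTracingExportApiKey } from '@openai/agents';

// Set the default key explicitly instead of relying on lazy
// OPENAI_API_KEY resolution.
setDefaultOpenAIKey(process.env.OPENAI_API_KEY!);

// Optional: export traces with a different key than the one used for model calls.
if (process.env.TRACING_API_KEY) {
  setTracingExportApiKey(process.env.TRACING_API_KEY);
}
```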
OpenAI client and transport
API keys and clients

By default the SDK resolves OPENAI_API_KEY lazily when it needs to create an OpenAI client. If setting the environment variable is not possible, call setDefaultOpenAIKey() manually.
```typescript
import { setDefaultOpenAIKey } from '@openai/agents';

setDefaultOpenAIKey(process.env.OPENAI_API_KEY!); // sk-...
```

You may also pass your own OpenAI client instance. The SDK will otherwise create one automatically using the default key.
```typescript
import { OpenAI } from 'openai';
import { setDefaultOpenAIClient } from '@openai/agents';

const customClient = new OpenAI({ baseURL: '...', apiKey: '...' });
setDefaultOpenAIClient(customClient);
```

API selection

Finally, you can switch between the Responses API and the Chat Completions API.
```typescript
import { setOpenAIAPI } from '@openai/agents';

setOpenAIAPI('chat_completions');
```

Responses transport

If you are using the Responses API, you can also choose the OpenAI provider transport. The default is HTTP.
```typescript
import { setOpenAIAPI, setOpenAIResponsesTransport } from '@openai/agents';

setOpenAIAPI('responses');
setOpenAIResponsesTransport('websocket');
```

Use setOpenAIResponsesTransport('websocket') to enable the WebSocket transport and setOpenAIResponsesTransport('http') to switch back. If you route WebSocket traffic through a proxy or gateway, set OPENAI_WEBSOCKET_BASE_URL (or configure websocketBaseURL on your OpenAIProvider).
This process-wide default only affects models that are later resolved through the default OpenAI provider. If you pass a concrete Model instance or a custom modelProvider, configure the transport there instead. See the Models guide.
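If you'd rather scope the transport to a single runner instead of the whole process, the provider-level route mentioned above might look roughly like the following sketch. Treat the exact option names (an OpenAIProvider constructor taking apiKey and websocketBaseURL, and a Runner accepting modelProvider) as assumptions to verify against your SDK version and the Models guide; the gateway URL is hypothetical:

```typescript
import { Runner, OpenAIProvider } from '@openai/agents';

// Provider-level configuration overrides the process-wide default for any
// model resolved through this provider. websocketBaseURL is the option this
// page names for routing WebSocket traffic through a proxy or gateway.
const provider = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY,
  websocketBaseURL: 'wss://gateway.example.internal/v1', // hypothetical gateway
});

// Only runs started through this Runner use the custom provider.
const runner = new Runner({ modelProvider: provider });
```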
Observability and debugging
Tracing

Tracing is enabled by default in supported server runtimes. It is disabled by default in browsers and when NODE_ENV=test.
By default, trace export uses the same OpenAI key configured above. You can set a separate key with setTracingExportApiKey():
```typescript
import { setTracingExportApiKey } from '@openai/agents';

setTracingExportApiKey('sk-...');
```

Tracing can also be disabled entirely:
```typescript
import { setTracingDisabled } from '@openai/agents';

setTracingDisabled(true);
```

To learn more about the tracing feature, see the Tracing guide.
Debug logging
The SDK uses the debug package for debug logging. Set the DEBUG environment variable to openai-agents* to see verbose logs.

```sh
export DEBUG=openai-agents*
```

To log session persistence activity, set OPENAI_AGENTS__DEBUG_SAVE_SESSION=1.
You can obtain a namespaced logger for your own modules using getLogger(namespace) from @openai/agents.
```typescript
import { getLogger } from '@openai/agents';

const logger = getLogger('my-app');
logger.debug('something happened');
```

Sensitive data in logs

Certain logs may contain user data. Disable them by setting these environment variables.
To disable logging LLM inputs and outputs:
```sh
export OPENAI_AGENTS_DONT_LOG_MODEL_DATA=1
```

To disable logging tool inputs and outputs:

```sh
export OPENAI_AGENTS_DONT_LOG_TOOL_DATA=1
```