
Commit e7c80ec

Update model docs with common issues (#148)
These are common issues faced by devs, so adding docs to help.
2 parents: e069279 + 8a6967b

File tree

1 file changed: +27 −0


Diff for: docs/models.md

@@ -64,3 +64,30 @@ In cases where you do not have an API key from `platform.openai.com`, we recomme
!!! note

    In these examples, we use the Chat Completions API/model, because most LLM providers don't yet support the Responses API. If your LLM provider does support it, we recommend using Responses.
## Common issues with using other LLM providers

### Tracing client error 401
If you get errors related to tracing, this is because traces are uploaded to OpenAI servers, and you don't have an OpenAI API key. You have three options to resolve this:
1. Disable tracing entirely: [`set_tracing_disabled(True)`][agents.set_tracing_disabled].
2. Set an OpenAI key for tracing: [`set_tracing_export_api_key(...)`][agents.set_tracing_export_api_key]. This API key will only be used for uploading traces, and must be from [platform.openai.com](https://platform.openai.com/).
3. Use a non-OpenAI trace processor. See the [tracing docs](tracing.md#custom-tracing-processors).
### Responses API support

The SDK uses the Responses API by default, but most other LLM providers don't yet support it. You may see 404s or similar issues as a result. To resolve, you have two options:
1. Call [`set_default_openai_api("chat_completions")`][agents.set_default_openai_api]. This works if you are setting `OPENAI_API_KEY` and `OPENAI_BASE_URL` via environment variables.
2. Use [`OpenAIChatCompletionsModel`][agents.models.openai_chatcompletions.OpenAIChatCompletionsModel]. There are examples [here](https://github.com/openai/openai-agents-python/tree/main/examples/model_providers/).
### Structured outputs support

Some model providers don't have support for [structured outputs](https://platform.openai.com/docs/guides/structured-outputs). This sometimes results in an error that looks something like this:
```
BadRequestError: Error code: 400 - {'error': {'message': "'response_format.type' : value is not one of the allowed values ['text','json_object']", 'type': 'invalid_request_error'}}
```
This is a shortcoming of some model providers: they support JSON outputs, but don't allow you to specify the `json_schema` to use for the output. We are working on a fix, but until then we suggest relying on providers that do support JSON schema output, since otherwise your app will often break on malformed JSON.
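For context, these are the `response_format` shapes on the Chat Completions API (a sketch following OpenAI's documented field names). A provider that raises the 400 above accepts only the first two; structured outputs require the third:

```json
{"response_format": {"type": "text"}}

{"response_format": {"type": "json_object"}}

{"response_format": {"type": "json_schema", "json_schema": {"name": "output", "strict": true, "schema": {"type": "object", "properties": {"answer": {"type": "string"}}, "required": ["answer"], "additionalProperties": false}}}}
```

The `json_object` type only forces syntactically valid JSON; it does not enforce your schema, which is why apps relying on it often break on output that parses but has the wrong shape.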
