AzureChatOpenAI Deployment “o3-mini” Returns 400 Error Due to Unsupported ‘temperature’ Parameter #39938
Labels
- bug: This issue requires a change to an existing behavior in the product in order to be resolved.
- Client: This issue points to a problem in the data-plane of the library.
- customer-reported: Issues that are reported by GitHub users external to the Azure organization.
- issue-addressed: Workflow: The Azure SDK team believes it to be addressed and ready to close.
- needs-team-attention: Workflow: This issue needs attention from the Azure service team or SDK team.
- OpenAI
- Service Attention: Workflow: This issue is the responsibility of the Azure service team.
Describe the bug
When using the o3-mini deployment for evaluation, the API returns a 400 error stating:
Unsupported parameter: 'temperature' is not supported with this model.
Despite conditional logic in the code intended to omit the temperature parameter for o3-mini, the parameter is still being sent in some cases. This leads to repeated HTTP POST failures during evaluation.
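One way to confirm what is actually in the outgoing request body is to turn up the openai client's logging (a diagnostic sketch; the OPENAI_LOG environment variable and the "openai"/"httpx" logger names belong to the openai Python package and httpx, which produce the log lines shown further below):

```python
import logging
import os

# The openai Python client honors OPENAI_LOG; set it before openai is imported.
os.environ["OPENAI_LOG"] = "debug"

# Alternatively, raise the logger levels directly. At DEBUG, the openai
# client logs "Request options: ..." including the JSON payload, which
# shows whether 'temperature' is actually being sent.
logging.basicConfig(level=logging.INFO)
logging.getLogger("openai").setLevel(logging.DEBUG)
logging.getLogger("httpx").setLevel(logging.INFO)
```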
To Reproduce
Steps to reproduce the behavior:
1. Run python evaluate.py with the evaluation configured against the o3-mini deployment.
2. Observe repeated HTTP 400 responses complaining about the unsupported "temperature" parameter.
Below is a minimal sample code snippet that demonstrates how the client is instantiated (a representative sketch; the endpoint and environment variable names are illustrative). Note that the conditional check should omit the temperature parameter when the model name includes "o3-mini". Despite this, the parameter appears to be sent, triggering the error:
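```python
import os

from langchain_openai import AzureChatOpenAI

deployment = "o3-mini"

# Base keyword arguments; the endpoint/key env var names are illustrative.
kwargs = {
    "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
    "api_key": os.environ["AZURE_OPENAI_API_KEY"],
    "azure_deployment": deployment,
    "api_version": "2025-01-01-preview",
}

# Reasoning deployments such as o3-mini reject 'temperature',
# so it is only set for other deployments.
if "o3-mini" not in deployment:
    kwargs["temperature"] = 0.0

llm = AzureChatOpenAI(**kwargs)
```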
Expected behavior
The temperature parameter is omitted from requests to the o3-mini deployment, and the evaluation completes without 400 model_error responses.
Screenshots
Relevant log output, for instance:
2025-03-04 12:52:21,701 - httpx - INFO - HTTP Request: POST https://swedencentral.api.cognitive.microsoft.com/openai/deployments/o3-mini/chat/completions?api-version=2025-01-01-preview "HTTP/1.1 400 model_error"
2025-03-04 12:52:21,703 - ragas.executor - ERROR - Exception raised in Job[2]: BadRequestError(Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}})
Additional context
The evaluation is driven by ragas, and the failing chat-completions calls are issued through httpx (see the ragas.executor error in the log above), so the temperature may be injected by the wrapper rather than by the client instantiation itself.
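A possible workaround (assuming a recent langchain-openai version, whose ChatOpenAI/AzureChatOpenAI classes expose a disabled_params option for stripping request parameters) is to filter temperature out of every payload, even if a downstream wrapper sets it:

```python
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_deployment="o3-mini",
    api_version="2025-01-01-preview",
    # Remove 'temperature' from every request payload, even when an
    # evaluation framework or other wrapper injects it downstream.
    disabled_params={"temperature": None},
)
```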