AzureChatOpenAI Deployment “o3-mini” Returns 400 Error Due to Unsupported ‘temperature’ Parameter #39938

Closed
keyoumao opened this issue Mar 4, 2025 · 5 comments
Labels

  • bug: This issue requires a change to an existing behavior in the product in order to be resolved.
  • Client: This issue points to a problem in the data plane of the library.
  • customer-reported: Issues that are reported by GitHub users external to the Azure organization.
  • issue-addressed: Workflow: the Azure SDK team believes it to be addressed and ready to close.
  • needs-team-attention: Workflow: this issue needs attention from the Azure service team or SDK team.
  • OpenAI
  • Service Attention: Workflow: this issue is the responsibility of the Azure service team.

Comments

keyoumao commented Mar 4, 2025

  • Package Name: Azure Cognitive Services OpenAI (via AzureChatOpenAI and AzureOpenAIEmbeddings)
  • Package Version: langchain-openai = "==0.1.6"
  • Operating System: macOS (Apple Silicon, using Homebrew)
  • Python Version: 3.10.16

Describe the bug
When using the o3-mini deployment for evaluation, the API returns a 400 error stating:

Unsupported parameter: 'temperature' is not supported with this model.

Despite conditional logic in the code intended to omit the temperature parameter for o3-mini, the parameter is still being sent in some cases. This leads to repeated HTTP POST failures during evaluation.

To Reproduce
Steps to reproduce the behavior:

  1. Configure the evaluation to use an o3-mini deployment (ensure that your settings’ model name for either generator or evaluator includes "o3-mini" as a substring).
  2. Load a valid checkpoint file when prompted.
  3. Run the evaluation command: python evaluate.py
  4. Observe the logs: HTTP POST requests to the Azure endpoint return 400 errors complaining about the unsupported "temperature" parameter.

Below is a minimal code snippet that demonstrates how the client is instantiated. The conditional check should omit the temperature parameter when the model name includes "o3-mini"; despite this, the parameter appears to be sent, triggering the error:

from langchain_openai import AzureChatOpenAI

# Sample settings for an o3-mini deployment
settings = {
    "GENERATOR_LLM_ENDPOINT": "https://example.cognitive.microsoft.com/",
    "GENERATOR_LLM_API_KEY": "****",  # Masked API key
    "GENERATOR_LLM_DEPLOYMENT_NAME": "o3-mini",
    "GENERATOR_LLM_MODEL_NAME": "o3-mini",
    "GENERATOR_LLM_API_VERSION": "2025-01-01-preview",
    "GENERATOR_LLM_API_TYPE": "azure",
    "GENERATOR_LLM_TEMPERATURE": 0.7,
}

params = {
    "azure_endpoint": settings["GENERATOR_LLM_ENDPOINT"],
    "deployment_name": settings["GENERATOR_LLM_DEPLOYMENT_NAME"],
    "openai_api_version": settings["GENERATOR_LLM_API_VERSION"],
    "openai_api_key": settings["GENERATOR_LLM_API_KEY"],
    "model_name": settings["GENERATOR_LLM_MODEL_NAME"],
}

# Conditional check to omit 'temperature' for o3-mini
if "o3-mini" not in settings["GENERATOR_LLM_MODEL_NAME"].lower():
    params["temperature"] = settings["GENERATOR_LLM_TEMPERATURE"]

llm = AzureChatOpenAI(**params)
print("LLM instantiated:", llm)

Expected behavior

  • The client should not include the temperature parameter when using an o3-mini model deployment.
  • The evaluation should proceed without the API returning a 400 error; requests should succeed with a 200 instead.
  • Alternatively, if the parameter is necessary, the API should accept it or provide clear documentation on how to handle such requests. We did not have any issues with our current code using gpt-4o or gpt-4o-mini.

Screenshots
For instance:

2025-03-04 12:52:21,701 - httpx - INFO - HTTP Request: POST https://swedencentral.api.cognitive.microsoft.com/openai/deployments/o3-mini/chat/completions?api-version=2025-01-01-preview "HTTP/1.1 400 model_error"
2025-03-04 12:52:21,703 - ragas.executor - ERROR - Exception raised in Job[2]: BadRequestError(Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}})

Additional context

  • The error occurs during one-by-one Q&A evaluation when sending requests to the endpoint.
  • This issue blocks the evaluation process and may affect other workflows using the o3-mini deployment.
  • Any guidance on either a fix in the API or recommendations for adjusting client requests would be greatly appreciated.
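
Until an upstream fix is in place, one stopgap on the caller's side is to strip the parameters that reasoning-model deployments reject before building the client. This is a hypothetical helper, not part of langchain-openai; the set of rejected parameters and the o1/o3 name check are assumptions based on the error above:

```python
# Hypothetical workaround: drop parameters that o-series (reasoning)
# deployments reject before passing kwargs to the client.
# The parameter set and name check are assumptions, not an official list.
REJECTED_BY_REASONING_MODELS = {"temperature", "top_p", "presence_penalty"}

def is_reasoning_model(model_name: str) -> bool:
    """Assume reasoning deployments are named with an o1/o3 prefix."""
    return model_name.lower().startswith(("o1", "o3"))

def safe_params(model_name: str, params: dict) -> dict:
    """Return a copy of params without keys the model would reject."""
    if not is_reasoning_model(model_name):
        return dict(params)
    return {k: v for k, v in params.items()
            if k not in REJECTED_BY_REASONING_MODELS}

print(safe_params("o3-mini", {"temperature": 0.7, "max_retries": 2}))
# -> {'max_retries': 2}
```

The filtered dict can then be splatted into AzureChatOpenAI(**...) as in the snippet above, so the rejected key never reaches the request body.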
@github-actions bot added labels: customer-reported, needs-triage, question (Mar 4, 2025)
@xiangyan99 added labels: bug, Service Attention, Client, OpenAI; removed labels: question, needs-triage (Mar 4, 2025)

github-actions bot commented Mar 4, 2025

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @trrwilson.

@github-actions github-actions bot added the needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team label Mar 4, 2025
kristapratico (Member) commented:

Hi @keyoumao can you move this issue to the langchain repo? https://github.com/langchain-ai/langchain/issues

Also, I see you are using an older version of langchain-openai. It looks like this was fixed in 0.2.3 or greater: langchain-ai/langchain@ce33c4f. Are you able to upgrade the package?
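
Whether the installed package already contains that fix can be checked at runtime with only the standard library; the 0.2.3 threshold below is taken from the comment above:

```python
from importlib import metadata

def version_tuple(v: str) -> tuple:
    """Naive x.y.z parser; enough for simple release versions
    (pre-release suffixes like 'rc1' would need a real parser)."""
    return tuple(int(part) for part in v.split("."))

def has_o_series_fix(min_version: str = "0.2.3") -> bool:
    """True if langchain-openai is installed at or above min_version."""
    try:
        installed = metadata.version("langchain-openai")
    except metadata.PackageNotFoundError:
        return False
    return version_tuple(installed) >= version_tuple(min_version)

print(has_o_series_fix())
```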

@kristapratico kristapratico added the needs-author-feedback Workflow: More information is needed from author to address the issue. label Mar 5, 2025
@github-actions github-actions bot removed the needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team label Mar 5, 2025

github-actions bot commented Mar 5, 2025

Hi @keyoumao. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario please provide a response to the question asked above or the information requested above. This will help us more accurately address your issue.


keyoumao commented Mar 5, 2025

Thank you @kristapratico! I ran into dependency conflicts when upgrading langchain-openai. I moved this to https://github.com/langchain-ai/langchain/issues/30126 and found a similar issue in RAGAS: https://github.com/explodinggradients/ragas/issues/1945

@github-actions github-actions bot added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Mar 5, 2025
kristapratico (Member) commented:

Thanks @keyoumao! I'm going to go ahead and close this; let's follow up with the maintainers on the issues you opened in langchain and ragas.

@xiangyan99 xiangyan99 added the issue-addressed Workflow: The Azure SDK team believes it to be addressed and ready to close. label Mar 18, 2025