
temperature is not supported with this model(o3-mini) #2104

Closed
1 task done
gautamjajoo opened this issue Feb 9, 2025 · 3 comments
Labels: question (Further information is requested)

Comments

@gautamjajoo

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

The temperature parameter is not supported by the o3-mini model. A similar issue was reported earlier (#2072) and was supposed to be fixed in the 1.61.1 release (#2078).

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}

To Reproduce

import os

from openai import OpenAI

messages = [
    {"role": "system", "content": "You are an expert"},
    {"role": "user", "content": "What is the capital of France"},
]

# The key was held in a local variable in the original snippet;
# reading it from the environment makes the example self-contained.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="o3-mini",
    messages=messages,
    temperature=0,
)

print(response.choices[0].message.content)

Running this code produces the error above.
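For anyone hitting the same 400, one defensive pattern (a sketch, not part of the original report) is to catch openai.BadRequestError and retry the same request without the rejected parameter:

import os

import openai
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
messages = [{"role": "user", "content": "What is the capital of France"}]

try:
    response = client.chat.completions.create(
        model="o3-mini",
        messages=messages,
        temperature=0,
    )
except openai.BadRequestError as exc:
    # The API returns a 400 with code 'unsupported_parameter' for
    # reasoning models; retry without the sampling parameter.
    if getattr(exc, "code", None) == "unsupported_parameter":
        response = client.chat.completions.create(model="o3-mini", messages=messages)
    else:
        raise

print(response.choices[0].message.content)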


OS: macOS
Python version: Python 3.13.1
Library version: openai 1.61.1

@gautamjajoo added the bug label on Feb 9, 2025
@RobertCraigie added the question label and removed the bug label on Feb 9, 2025
@RobertCraigie (Collaborator)

#2072 was only about the CLI sending temperature when the user doesn't specify the flag. In your snippet you're using the client API, where we send temperature whenever you specify it. If you're using a model that doesn't support temperature, you shouldn't specify it.
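In other words, the fix on the caller's side is simply to drop the parameter. A minimal sketch of the corrected call (the reasoning_effort argument is optional and shown only because o3-mini accepts it in place of sampling controls):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# No temperature argument: for models that reject it, just omit it.
response = client.chat.completions.create(
    model="o3-mini",
    messages=[
        {"role": "system", "content": "You are an expert"},
        {"role": "user", "content": "What is the capital of France"},
    ],
    reasoning_effort="medium",  # supported by o3-mini; optional
)
print(response.choices[0].message.content)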

@RobertCraigie closed this as not planned on Feb 9, 2025
@David2020-udec

Good afternoon,

Has the error finally been corrected?

Five days ago the error still persisted: even if the variable is not specified, it is passed along under the hood, and the only way to find out is to print the configuration when running the model.

from os import environ

from langchain_openai import ChatOpenAI  # LangChain's wrapper, not this SDK

CHAT_MODELS = {
    "openai": ChatOpenAI(
        model="o3-mini-2025-01-31",
        model_kwargs={"reasoning_effort": "high"},
        api_key=environ.get("OPENAI_API_KEY"),
    ),
}

print("=== MODEL CONFIGURATION o3 ===")
print(CHAT_MODELS["openai"].dict())  # check that 'temperature' is no longer present
print("===================================================")

@RobertCraigie (Collaborator)

That interface does not come from this SDK; please report this to the library you're using.
