LangchainLLMWrapper sets a default temperature even with models that don't support it (ex o3-mini) #1945

Open
khalilacheche opened this issue Mar 4, 2025 · 3 comments · May be fixed by #1951
Labels
bug Something isn't working

Comments

khalilacheche commented Mar 4, 2025

[x] I have checked the documentation and related resources and couldn't resolve my bug.

Describe the bug
When I try to create an LLM instance using LangchainLLMWrapper and specify a model that doesn't support the "temperature" parameter, I cannot get any responses from the API.

Ragas version: 0.2.13
Python version: 3.10.16

Code to Reproduce

from langchain_openai import ChatOpenAI
from ragas.dataset_schema import SingleTurnSample
from ragas.llms import LangchainLLMWrapper
from ragas.metrics import FactualCorrectness

evaluator_llm = LangchainLLMWrapper(ChatOpenAI(model_name="o3-mini"))
sample = SingleTurnSample(
    response="The Eiffel Tower is located in Paris.",
    reference="The Eiffel Tower is located in Paris. It has a height of 1000ft.",
)

scorer = FactualCorrectness(llm=evaluator_llm)
await scorer.single_turn_ascore(sample)

Error trace
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}

Expected behavior
The wrapper should not force a temperature value the model rejects: for models like o3-mini it should omit the parameter (or use a supported value), so the scorer returns a result instead of a 400 error.


@khalilacheche khalilacheche added the bug Something isn't working label Mar 4, 2025

mukul1609 commented Mar 6, 2025

I suspect the solution provided won't work when using AzureChatOpenAI (from langchain_openai.chat_models import AzureChatOpenAI):

  1. If we pass temperature as None, AzureChatOpenAI throws an error ('temperature' must be a valid float).
  2. If we remove the temperature attribute from the LangchainLLMWrapper object, AzureChatOpenAI throws AttributeError: 'AzureChatOpenAI' object has no attribute 'temperature'.

We need a generic solution that works with all existing clients for reasoning models (o1, o3-mini, etc.); one possible user-side approach is sketched below.
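A minimal user-side sketch (not anything ragas ships): subclass LangchainLLMWrapper and override get_temperature so it always returns a valid float. That sidesteps both Azure failure modes above, since we never pass None and never delete the attribute. The class name and prefix list are made up, and whether an explicit temperature of 1.0 is accepted is an assumption to verify per provider:

    from langchain_openai import ChatOpenAI
    from ragas.llms import LangchainLLMWrapper

    class ReasoningAwareLLMWrapper(LangchainLLMWrapper):  # hypothetical name
        REASONING_PREFIXES = ("o1", "o3")  # assumed family list; extend as needed

        def get_temperature(self, n: int) -> float:
            # Always return a valid float, so neither Azure failure mode above
            # is triggered (no None, no missing attribute).
            model = getattr(self.langchain_llm, "model_name", "") or ""
            if model.startswith(self.REASONING_PREFIXES):
                return 1.0  # assumed: the default value reasoning models accept
            return 0.3 if n > 1 else 1e-8  # ragas' existing defaults

    evaluator_llm = ReasoningAwareLLMWrapper(ChatOpenAI(model_name="o3-mini"))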

@mukul1609

Quick fix: in .tesla_rasa_eval/lib/python3.11/site-packages/ragas/llms/base.py, change the get_temperature function:

    def get_temperature(self, n: int) -> float:
        """Return the temperature to use for completion based on n."""
        return 0.3 if n > 1 else n
  • Change: replaced 1e-8 with n.

and pass temperature = 0 for reasoning models (o1, o3-mini, etc.).
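If editing site-packages is not an option, the same quick fix can be applied at runtime by monkeypatching (a sketch; it assumes get_temperature is defined on BaseRagasLLM in ragas/llms/base.py, as it appears to be in 0.2.13):

    from ragas.llms.base import BaseRagasLLM

    def _patched_get_temperature(self, n: int) -> float:
        """Mirror of the quick fix above: 1e-8 replaced with n."""
        return 0.3 if n > 1 else n

    # Apply before constructing any LangchainLLMWrapper instances.
    BaseRagasLLM.get_temperature = _patched_get_temperature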


keyoumao commented Mar 6, 2025

I am encountering similar issues, related to these two:

  1. https://github.com/Azure/azure-sdk-for-python/issues/39938
  2. https://github.com/langchain-ai/langchain/issues/30126

Maybe we could just add a conditional statement here specifically for the o1 and o3 families, like you suggested @mukul1609 (a sketch below). There is also an article experimenting with reasoning models in an evaluation pipeline: https://www.reddit.com/r/Rag/comments/1ixs5wx/we_evaluated_if_reasoning_models_like_o3mini_can/?rdt=60212.
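For reference, the conditional could look something like this inside LangchainLLMWrapper.get_temperature (a sketch only; the model_name lookup and prefix list are assumptions, and this is the same idea as the subclass sketch above, just as an in-library change):

    def get_temperature(self, n: int) -> float:
        """Return the temperature to use for completion based on n."""
        model = getattr(self.langchain_llm, "model_name", "") or ""
        if model.startswith(("o1", "o3")):  # reasoning model families
            return 1.0  # assumed default these models accept
        return 0.3 if n > 1 else 1e-8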
