
LangchainLLMWrapper sets a default temperature even with models that don't support it (e.g. o3-mini) #1945

Open
@khalilacheche

Description


[x] I have checked the documentation and related resources and couldn't resolve my bug.

Describe the bug
When I create an LLM instance with LangchainLLMWrapper and pass a model that doesn't support the "temperature" parameter (such as o3-mini), every evaluation request fails with a 400 error because the wrapper always sends a temperature value, so I cannot get any responses from the API.

Ragas version: 0.2.13
Python version: 3.10.16

Code to Reproduce

evaluator_llm = LangchainLLMWrapper(ChatOpenAI(model_name="o3-mini"))
sample = SingleTurnSample(
response="The Eiffel Tower is located in Paris.",
reference="The Eiffel Tower is located in Paris. I has a height of 1000ft."
)

scorer = FactualCorrectness(llm = evaluator_llm)
await scorer.single_turn_ascore(sample)

Error trace
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
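Possible workaround
For now I can work around it with a small sketch like the one below. It is untested against the ragas internals and assumes the default temperature is chosen through the wrapper's get_temperature() method; the FixedTemperatureWrapper name is just something I made up for illustration.

from langchain_openai import ChatOpenAI
from ragas.llms import LangchainLLMWrapper


class FixedTemperatureWrapper(LangchainLLMWrapper):
    # Assumption: ragas asks this method for the temperature it attaches
    # to each request. Overriding it avoids the unsupported 0.3 / 1e-8 defaults.
    def get_temperature(self, n: int) -> float:
        # o-series models only accept the API default temperature of 1
        return 1.0


evaluator_llm = FixedTemperatureWrapper(ChatOpenAI(model_name="o3-mini"))

If the default is actually injected somewhere else in the wrapper, this override would have no effect, so a proper fix in LangchainLLMWrapper itself would still be needed.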

Expected behavior
LangchainLLMWrapper should not send a default temperature (or should let the caller disable it) when the underlying model doesn't support the parameter, so that metrics such as FactualCorrectness can be scored with models like o3-mini.

