[x] I have checked the documentation and related resources and couldn't resolve my bug.
Describe the bug
When I create an LLM instance using LangchainLLMWrapper and specify a model that does not support the "temperature" parameter (such as OpenAI's o3-mini), every request fails with a 400 error and I cannot get any responses from the API.
Ragas version: 0.2.13
Python version: 3.10.16
Code to Reproduce

```python
from langchain_openai import ChatOpenAI
from ragas.dataset_schema import SingleTurnSample
from ragas.llms import LangchainLLMWrapper
from ragas.metrics import FactualCorrectness

# o3-mini is a reasoning model that rejects the "temperature" parameter
evaluator_llm = LangchainLLMWrapper(ChatOpenAI(model_name="o3-mini"))
sample = SingleTurnSample(
    response="The Eiffel Tower is located in Paris.",
    reference="The Eiffel Tower is located in Paris. It has a height of 1000ft.",
)
scorer = FactualCorrectness(llm=evaluator_llm)
await scorer.single_turn_ascore(sample)  # run inside an async context (e.g. a notebook)
```
Error trace

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
```
Expected behavior
The evaluation should complete and return a factual-correctness score; Ragas should not send the "temperature" parameter to models that do not accept it.
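To make the expectation concrete, here is a minimal sketch of what such a fix might look like inside the wrapper. This is purely illustrative, not ragas source code: `model_supports_temperature` and `build_generation_kwargs` are hypothetical helpers, and the prefix list is an assumption about which OpenAI models reject the parameter.

```python
# Hypothetical sketch of the desired behavior; NOT ragas source code.
REASONING_MODEL_PREFIXES = ("o1", "o3")  # assumption: o-series models reject temperature

def model_supports_temperature(model_name: str) -> bool:
    # Hypothetical helper: OpenAI reasoning models reject the parameter outright.
    return not model_name.startswith(REASONING_MODEL_PREFIXES)

def build_generation_kwargs(model_name: str, n: int) -> dict:
    # Forward temperature only when the target model accepts it.
    kwargs: dict = {"n": n}
    if model_supports_temperature(model_name):
        kwargs["temperature"] = 0.3 if n > 1 else 1e-8  # ragas-style defaults (assumption)
    return kwargs

build_generation_kwargs("o3-mini", 1)  # -> {"n": 1}
build_generation_kwargs("gpt-4o", 1)   # -> {"n": 1, "temperature": 1e-08}
```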
---

I suspect the proposed solution won't work when using AzureChatOpenAI (`from langchain_openai.chat_models import AzureChatOpenAI`):

- If we pass temperature as None, AzureChatOpenAI throws ValueError: 'temperature' must be a valid float.
- If we remove the temperature attribute from the LangchainLLMWrapper object, AzureChatOpenAI throws AttributeError: 'AzureChatOpenAI' object has no attribute 'temperature'.

We need a generic solution that works with all existing clients for reasoning models (o1, o3-mini, etc.).
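Until there is a proper fix, one possible client-side stopgap is sketched below. It overrides `_get_request_payload`, a private hook in recent langchain-openai versions that builds the final request body for both sync and async calls; AzureChatOpenAI shares the same base class, so the same override applies to it. Because the hook is private and may change between releases, treat this as an unverified workaround, not a recommended pattern.

```python
# Stopgap sketch (unverified against every ragas code path): strip "temperature"
# from the outgoing request so whatever value ragas injects never reaches the API.
# Relies on the private _get_request_payload hook shared by ChatOpenAI and
# AzureChatOpenAI in langchain-openai.
from langchain_openai import AzureChatOpenAI, ChatOpenAI
from ragas.llms import LangchainLLMWrapper

class ChatOpenAINoTemperature(ChatOpenAI):
    def _get_request_payload(self, input_, *, stop=None, **kwargs):
        payload = super()._get_request_payload(input_, stop=stop, **kwargs)
        payload.pop("temperature", None)  # o-series models reject this key
        return payload

class AzureChatOpenAINoTemperature(AzureChatOpenAI):
    def _get_request_payload(self, input_, *, stop=None, **kwargs):
        payload = super()._get_request_payload(input_, stop=stop, **kwargs)
        payload.pop("temperature", None)
        return payload

evaluator_llm = LangchainLLMWrapper(ChatOpenAINoTemperature(model_name="o3-mini"))
```

If your installed langchain-openai predates this hook, overriding the `_default_params` property and popping "temperature" there is an alternative with the same caveat.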