
fix: avoid FaithfulnessEvaluator and ContextRelevanceEvaluator return Nan #7685

Merged
merged 7 commits into main from LLM-based-eval-fix-NaN on May 14, 2024

Conversation

davidsbatista
Contributor

@davidsbatista davidsbatista commented May 13, 2024

Related Issues

How did you test it?

  • unit tests
  • integration tests
  • manual verification

@davidsbatista davidsbatista changed the title initial import fix: avoid FaithfulnessEvaluator return Nan May 13, 2024
@davidsbatista davidsbatista added the ignore-for-release-notes PRs with this flag won't be included in the release notes. label May 13, 2024
@coveralls
Collaborator

coveralls commented May 13, 2024

Pull Request Test Coverage Report for Build 9078293946

Details

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.005%) to 90.434%

Totals Coverage Status

  • Change from base Build 9077836136: +0.005%
  • Covered Lines: 6542
  • Relevant Lines: 7234

💛 - Coveralls

@davidsbatista davidsbatista marked this pull request as ready for review May 13, 2024 11:11
@davidsbatista davidsbatista requested a review from a team as a code owner May 13, 2024 11:11
@davidsbatista davidsbatista requested review from masci and julian-risch and removed request for a team and masci May 13, 2024 11:11
Member

@julian-risch julian-risch left a comment


The fix for FaithfulnessEvaluator and the newly added test look good to me. 👍 Let's add release notes, and I'd suggest also adding the same fix for the ContextRelevanceEvaluator as part of this PR:

`res["score"] = np_mean(res["statement_scores"])`

If you want to put that fix in a separate PR, we cannot close the issue yet and thus need to change the "fixes #..." line in the PR description.
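The line quoted above is the root of the bug: `np.mean` of an empty list returns `NaN` (with a `RuntimeWarning`), which happens when the LLM extracts no statements. A minimal sketch of the kind of safeguard discussed — `aggregate_score` is a hypothetical helper for illustration, not Haystack's actual API:

```python
import numpy as np

def aggregate_score(statement_scores):
    # np.mean([]) returns NaN (and emits a RuntimeWarning), which can
    # happen when the LLM extracts no statements from the response.
    # Guard against the empty case and fall back to 0.0 instead.
    if len(statement_scores) == 0:
        return 0.0
    return float(np.mean(statement_scores))
```

With this guard, an evaluation run where no statements were extracted scores 0.0 rather than propagating `NaN` into the aggregated results.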

@davidsbatista
Copy link
Contributor Author

you are right, I forgot that one - the error only occurred for me on the FaithfulnessEvaluator, but I will add code + tests for the ContextRelevanceEvaluator
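A regression test for this kind of fix might look like the sketch below (hypothetical helper and test names, not the actual tests added in this PR): it checks that a result with no extracted statements yields 0.0 rather than `NaN`.

```python
import math

import numpy as np

def safe_score(statement_scores):
    # Hypothetical helper mirroring the safeguard: fall back to 0.0
    # when the statement list is empty, instead of np.mean([]) -> NaN.
    if len(statement_scores) == 0:
        return 0.0
    return float(np.mean(statement_scores))

def test_no_statements_returns_zero_not_nan():
    # Simulates the failure mode: the LLM extracted no statements.
    score = safe_score([])
    assert not math.isnan(score)
    assert score == 0.0
```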

@davidsbatista davidsbatista requested a review from a team as a code owner May 13, 2024 14:08
@davidsbatista davidsbatista requested review from dfokina and removed request for a team May 13, 2024 14:08
@julian-risch julian-risch self-requested a review May 14, 2024 15:04
@julian-risch julian-risch changed the title fix: avoid FaithfulnessEvaluator return Nan fix: avoid FaithfulnessEvaluator and ContextRelevanceEvaluator return Nan May 14, 2024
Member

@julian-risch julian-risch left a comment


LGTM! 👍 Just updated the PR title to include ContextRelevanceEvaluator too.

@davidsbatista davidsbatista merged commit 798dc4a into main May 14, 2024
25 checks passed
@davidsbatista davidsbatista deleted the LLM-based-eval-fix-NaN branch May 14, 2024 15:08
davidsbatista added a commit that referenced this pull request May 16, 2024
… `Nan` (#7685)

* initial import

* fixing tests

* relaxing condition

* adding safeguard for ContextRelevanceEvaluator as well

* adding release notes
Labels
ignore-for-release-notes PRs with this flag won't be included in the release notes. topic:tests
Development

Successfully merging this pull request may close these issues.

LLM-based evaluators shouldn't return NaN
3 participants