[ENH] Replace prts metrics #2400
Conversation
Thank you for contributing to aeon!
Some of these functions look like they should be private/protected. In the tests, please run both the new and current functions to ensure the outputs are the same (using pytest).
aeon/benchmarking/metrics/anomaly_detection/tests/test_metrics.py
Changes made from the previous commit:
@MatthewMiddlehurst Please let me know if the test cases are fine; I can separate them into different functions if necessary.
Thanks for this, it's really helpful. We will look again next week.
@MatthewMiddlehurst @SebastianSchmidl I've updated the test cases so that they now compare the current implementation with the prts package for all parameter configurations.
Please add a check for the soft dependency "prts" to those tests where you call into the existing range-based metrics, to keep the CI happy; similar to the existing tests in aeon/benchmarking/metrics/anomaly_detection/tests/test_ad_metrics.py, lines 228 to 232 in 8770d2f.
Done. Let me know if there's anything else to be changed.
Thanks a lot for addressing my concerns. I'm happy for this to go in with the testing done.
I would not close the issue yet; the next step is to remove the old functions and prts as a dependency. Unfortunately, this would involve changing a lot of the testing you have done here again, but the main purpose of the output comparison is to smooth the transition anyway.
Just some small documentation changes
@MatthewMiddlehurst @SebastianSchmidl
This should be everything?
The functions in
Could you add this information to the corresponding issue? This should be addressed in separate PRs.
Three different PRs for the three mentioned changes?
@MatthewMiddlehurst Please merge this PR so I can pull the changes and update the test cases to hard-coded ones.
@MatthewMiddlehurst A gentle reminder.
@MatthewMiddlehurst @SebastianSchmidl Anything else to do here?
Reference Issues/PRs
Addresses #2066
What does this implement/fix? Explain your changes.
Implementation of range-based Precision, Recall, and F1-score metrics, replacing the prts-backed versions.
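For reference, the point-wise definitions that the PR's range-based variants generalise can be sketched as below; the range-based versions score overlaps between anomalous ranges rather than individual points, but reduce to these formulas for single-point anomalies.

```python
import numpy as np


def precision_recall_f1(y_true, y_pred):
    """Point-wise precision, recall, and F1 for binary anomaly labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))  # correctly flagged anomalies
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false alarms
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed anomalies
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (
        2 * precision * recall / (precision + recall)
        if precision + recall
        else 0.0
    )
    return precision, recall, f1
```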
PR checklist
For all contributions