Wrap gradient_free_optimizers (local) #624
base: main
Conversation
Codecov Report

❌ Patch coverage is 74.71%. Your patch check has failed because the patch coverage (74.71%) is below the target coverage (80.00%). You can increase the patch coverage or adjust the target coverage.

... and 5 files with indirect coverage changes
|
Hi @janosg
fix docs, add tests
Hi @janosg, Changes:

- add a new example problem with converter in internal_optimization_problem
- add functions and a converter dealing with dict input (a sketch of the idea follows below)
- refactor test_many_algorithms (this is minimal and is just for passing tests on this one)
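For illustration, here is a minimal sketch of the kind of dict converter described above. The helper names `params_to_dict` and `dict_to_params` are hypothetical, not the PR's actual functions, and the positional keys are an assumption about the internal dict format:

```python
import numpy as np


def params_to_dict(x):
    # Hypothetical helper: map a flat parameter vector to a dict
    # keyed by position, the format dict-based objectives expect.
    return {f"x{i}": v for i, v in enumerate(x)}


def dict_to_params(d):
    # Hypothetical inverse: recover the flat vector in positional order.
    return np.array([d[f"x{i}"] for i in range(len(d))])


x = np.array([1.0, 2.0, 3.0])
assert np.array_equal(dict_to_params(params_to_dict(x)), x)
```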
gfo
stopping_funval: float | None = None
"""Stop the optimization if the objective function is less than this value."""

convergence_iter_noimprove: PositiveInt = 1000  # need to set high
Should we set this to None or a really high value instead? Is there another convergence criterion we could set to a non-None value instead? We don't want all optimizers to just run until max_iter, but of course we also don't want premature stopping.
I am a bit confused about this. Most of the time convergence_iter_noimprove behaves like stopping_maxiter; the other convergence criteria don't seem to be respected. Even after more than 100,000 iterations the algorithm does not converge to a good solution. I might be missing something, but I hope to find a solution here.
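For reference, here is a minimal sketch of how these criteria seem to map onto gradient_free_optimizers' `search` call, as far as I understand the library. GFO maximizes, so a minimization objective is negated and stopping_funval would correspond to max_score; the mapping comments are my assumption, not confirmed behavior:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer


def objective(para):
    # GFO maximizes, so negate the sphere function to minimize it.
    return -(para["x"] ** 2 + para["y"] ** 2)


search_space = {
    "x": np.arange(-5, 5, 0.01),
    "y": np.arange(-5, 5, 0.01),
}

opt = HillClimbingOptimizer(search_space)
opt.search(
    objective,
    n_iter=100_000,                             # stopping_maxiter
    max_score=-1e-6,                            # stopping_funval (negated)
    early_stopping={"n_iter_no_change": 1000},  # convergence_iter_noimprove
)
print(opt.best_para, opt.best_score)
```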
Thanks for creating the issue over there.
…st_gfo_optimizers
…okup dict for failing tests, move nag_dfols test to nag_optimizers, move test_pygmo_optimizers
PR Description
This PR adds optimizers from the gradient_free_optimizers library.
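As an illustration of how a wrapped optimizer might be called through optimagic, here is a minimal sketch. The algorithm name "gfo_hill_climbing" is an assumption for illustration only, not necessarily a name defined in this PR:

```python
import numpy as np
import optimagic as om


def sphere(x):
    return x @ x


res = om.minimize(
    fun=sphere,
    params=np.arange(5, dtype=float),
    # "gfo_hill_climbing" is a hypothetical algorithm name, used
    # here only to illustrate the calling convention.
    algorithm="gfo_hill_climbing",
    # GFO algorithms search on bounded grids, so bounds are assumed required.
    bounds=om.Bounds(lower=np.full(5, -10.0), upper=np.full(5, 10.0)),
)
print(res.params)
```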
The following optimizers are now available:
Helper Functions