Remove event based logging pattern #557
base: master
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master     #557      +/-  ##
==========================================
+ Coverage   96.43%   98.01%   +1.58%
==========================================
  Files          12       10       -2
  Lines        1235     1161      -74
==========================================
- Hits         1191     1138      -53
+ Misses         44       23      -21
```
I think it's generally great that we're moving on from the observer pattern. However, I think it would be better not to pass the whole `BayesianOptimization` object into these steps, but rather the specific relevant information. E.g., `log_optimization_step` should receive the `params` of the point, the `target` value, the `constraint` value (if applicable), and an `is_new_max` boolean. Similarly, `log_optimization_start` should accept the parameter names, etc.

The only argument I can see against this is if we want to allow people to write their own loggers, but I don't really see a use case for that.
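To make the suggestion concrete, here is a minimal sketch of what such a narrowed signature might look like. All names and defaults below are illustrative assumptions, not the actual bayes_opt API:

```python
from __future__ import annotations


class Logger:
    """Hypothetical logger that receives only the relevant step data,
    rather than the whole BayesianOptimization object."""

    def log_optimization_step(
        self,
        params: dict[str, float],          # the evaluated point
        target: float,                     # objective value at that point
        constraint: float | None = None,   # constraint value, if applicable
        is_new_max: bool = False,          # whether this point is a new best
    ) -> str:
        # Build a one-line summary, print it, and return it.
        tag = " (new max)" if is_new_max else ""
        line = f"params={params} target={target:.4f}{tag}"
        if constraint is not None:
            line += f" constraint={constraint:.4f}"
        print(line)
        return line
```

With this shape the logger never needs to know the optimizer's internals, which keeps the coupling one-directional: the optimizer calls the logger, never the reverse.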
Hey all, I'm back with another pull request! This one removes some of the old functionality for saving logs, since I've created a better way of saving state. In doing so, I've tried to simplify the file structure, mostly by deleting unnecessary files, but also by moving all the logging methods to a `Logger` class in `logger.py`.

This class is now instantiated when the `BayesianOptimization` object is initialized, and it exposes the steps previously invoked through the Observable pattern: `optimization_start`, `optimization_step`, and `optimization_end`. I think this unification is cleaner. Changes to the format of the `Logger` can still be made; they are just accessed under `optimizer.logger.whateveryouwant`.
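The overall shape of the change can be sketched as follows. This is a simplified illustration of the pattern described above, not the actual bayes_opt implementation; the class bodies and the `pbounds` handling are assumptions:

```python
class Logger:
    """Stands in for the Logger class described in the PR; the real one
    lives in logger.py and formats output differently."""

    def log_optimization_start(self, param_names):
        print(f"start: {', '.join(param_names)}")

    def log_optimization_step(self, params, target):
        print(f"step: {params} -> {target}")

    def log_optimization_end(self):
        print("end")


class BayesianOptimization:
    """Sketch of the optimizer owning its logger directly, replacing
    the old Observable event dispatch with plain method calls."""

    def __init__(self, pbounds):
        self._pbounds = pbounds
        # The logger is created on initialization, so callers can reach
        # (and customize) it as optimizer.logger.
        self.logger = Logger()

    def maximize(self):
        self.logger.log_optimization_start(list(self._pbounds))
        # ...the real optimization loop would call
        # self.logger.log_optimization_step(...) once per iteration...
        self.logger.log_optimization_end()
```

Compared with the observer pattern, there is no subscription machinery: the three lifecycle hooks are ordinary methods on an attribute the optimizer owns.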