Hi,

I'm experimenting with the client learning rate, but I can't find a clean way to modify it. Basically, I need to change the learning rate on a schedule based on the current training round. Is that possible?

Thanks
If you want a different client learning rate for each round of federated averaging, here are a few suggestions:

1. Pass the round number as part of the `server_state` in the federated algorithm, and create a new client optimizer each round with a learning rate based on it. (Caveat: potentially slow, since the client optimizer's apply function is effectively recompiled every round.) See the first sketch below.
2. Optax has built-in support for learning rate decay, e.g. `optax.exponential_decay`. Some of these schedules can be used directly or tweaked to support your use case. See the second sketch below.
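A minimal sketch of the first suggestion, assuming a plain optax SGD client optimizer; the function name `client_optimizer_for_round` and the schedule shape (halving every 100 rounds) are hypothetical placeholders:

```python
import optax


def client_optimizer_for_round(round_num: int) -> optax.GradientTransformation:
  """Builds a fresh client optimizer whose LR depends on the round number.

  Note: constructing a new optimizer each round means its apply function
  is effectively recompiled every round, which can be slow.
  """
  # Hypothetical schedule: halve the initial LR every 100 rounds.
  lr = 0.1 * (0.5 ** (round_num // 100))
  return optax.sgd(learning_rate=lr)
```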
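And a sketch of the second suggestion. An optax schedule is just a callable from a step count to a value, so one tweak is to evaluate it at the round number and pass the resulting scalar as the learning rate; the decay settings below are placeholders. (Passing the schedule directly to `optax.sgd` would instead decay per local gradient step, and that count restarts whenever the client optimizer state is re-initialized.)

```python
import optax

# Hypothetical decay settings; tune init_value, transition_steps,
# and decay_rate for your task.
schedule = optax.exponential_decay(
    init_value=0.1,      # starting client learning rate
    transition_steps=1,  # decay once per "step" (here, per round)
    decay_rate=0.99,     # multiplicative decay factor
)

round_num = 42  # current round of federated training
# Evaluate the schedule at the round number to get a scalar LR,
# then build the client optimizer with it.
client_optimizer = optax.sgd(learning_rate=schedule(round_num))
```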