
Commit 84ca81b

Change:
- Learning rate decay calculated dynamically from the training-set size.
1 parent 0196051 commit 84ca81b

File tree

1 file changed: +4 -1 lines changed


train.py

Lines changed: 4 additions & 1 deletion
@@ -106,9 +106,12 @@ def train(
 
     # Compile model
     loss = tf.keras.losses.BinaryCrossentropy(from_logits=False)
+    # Lower the learning rate every 5th epoch.
+    # One step means the model is optimized on one mini-batch, i.e. one iteration.
+    decay_steps = int(5 * (len(trn_gen) / bs))
     lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
         initial_learning_rate=lr,
-        decay_steps=int(5 * 2940),  # 5 Epochs
+        decay_steps=int(decay_steps),
         decay_rate=0.8,
     )
     opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
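
A minimal, self-contained sketch of the schedule this change sets up. The values n_train, bs and lr below are hypothetical stand-ins for what train.py derives from its data generator and arguments (they are not taken from the commit); the point is only to show how decay_steps scales with the training-set size and how ExponentialDecay consumes it.

import tensorflow as tf

# Hypothetical stand-ins; in train.py these come from the generator and CLI args.
n_train = 94080   # number of training samples (assumed value)
bs = 32           # mini-batch size (assumed value)
lr = 1e-3         # initial learning rate (assumed value)

# One step = one mini-batch, so 5 epochs correspond to 5 * (n_train / bs) steps.
decay_steps = int(5 * (n_train / bs))

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=lr,
    decay_steps=decay_steps,
    decay_rate=0.8,
)
opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

# After 5 epochs' worth of steps the learning rate has decayed by a factor of 0.8.
print(float(lr_schedule(decay_steps)))  # ~= lr * 0.8

With the default staircase=False the decay is continuous, so the rate shrinks smoothly and reaches lr * 0.8 exactly at decay_steps; the hard-coded int(5 * 2940) from before behaved the same way but only for one fixed dataset size.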
