from keras.callbacks import LearningRateScheduler
import math

def step_decay(epoch):
    # Start from an initial rate of 0.1 and halve it every 10 epochs
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

lr_scheduler = LearningRateScheduler(step_decay)
model.fit(x, y, callbacks=[lr_scheduler])

In this example, we define a learning rate scheduler function, step_decay(), which starts from an initial rate of 0.1 and multiplies it by a factor of 0.5 every 10 epochs. Wrapping the function in a LearningRateScheduler callback and passing it to model.fit() integrates the schedule into the model-training loop. Package Library: Keras.
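Because the schedule is just a plain Python function, you can evaluate it outside of training to confirm it produces the values you expect. The snippet below is a minimal sketch that reuses the step_decay() definition from above and prints the rate the callback would set for the first 30 epochs; it needs no model or data, only the standard library.

import math

def step_decay(epoch):
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10
    return initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

# The rate stays at 0.1 for roughly the first 10 epochs, then halves for each subsequent 10-epoch block
for epoch in range(30):
    print(f"epoch {epoch:2d}: lr = {step_decay(epoch):.4f}")

During actual training, LearningRateScheduler also accepts a verbose=1 argument, which makes Keras log the learning rate it sets at the start of each epoch, giving the same kind of confirmation without extra code.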