Learning Rate Scheduler
Implement a learning rate scheduler that adjusts the learning rate during training using exponential decay. The learning rate is one of the most important hyperparameters in training neural networks. Starting with a high learning rate allows fast initial convergence, while gradually decreasing it enables fine-tuning and helps the model converge to a better minimum.
The exponential decay formula is:
lr = initial_lr * decay_rate ^ epoch
Your function learning_rate_scheduler(initial_lr, decay_rate, epoch) should compute the decayed learning rate at the given epoch.
Example:
Input: initial_lr = 0.1, decay_rate = 0.5, epoch = 2
Calculation: lr = 0.1 * (0.5 ^ 2) = 0.1 * 0.25 = 0.025
Output: 0.025
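A direct implementation of the formula is a one-liner; the sketch below uses the function name and signature given in the problem statement, exponentiating the decay rate by the epoch number:

```python
def learning_rate_scheduler(initial_lr, decay_rate, epoch):
    """Exponential decay: lr = initial_lr * decay_rate ** epoch."""
    return initial_lr * decay_rate ** epoch

print(learning_rate_scheduler(0.1, 0.5, 2))  # → 0.025
```

Note that at epoch 0 the function simply returns `initial_lr`, since any decay rate raised to the power 0 is 1.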
A good learning rate schedule prevents the optimizer from overshooting the minimum in the loss landscape. Early in training, a larger learning rate helps traverse the loss surface quickly. As training progresses, reducing the learning rate allows the optimizer to settle into a precise minimum rather than bouncing around it.
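To see this behavior concretely, one can print the schedule over a few epochs; the illustrative values below (initial_lr = 0.1, decay_rate = 0.5) are chosen for demonstration, not taken from the problem:

```python
def learning_rate_scheduler(initial_lr, decay_rate, epoch):
    # Exponential decay: lr = initial_lr * decay_rate ** epoch
    return initial_lr * decay_rate ** epoch

# The learning rate halves every epoch when decay_rate = 0.5
for epoch in range(5):
    lr = learning_rate_scheduler(0.1, 0.5, epoch)
    print(f"epoch {epoch}: lr = {lr:.4f}")
```

Each successive epoch multiplies the learning rate by the decay rate, so the steps the optimizer takes shrink geometrically as training progresses.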
Test Cases
Input: initial_lr = 0.1, decay_rate = 0.5, epoch = 2 → Output: 0.025
Input: initial_lr = 0.05, decay_rate = 0.8, epoch = 3 → Output: 0.0256