107.

Easy

Learning Rate Scheduler

Implement a learning rate scheduler that adjusts the learning rate during training using exponential decay. The learning rate is one of the most important hyperparameters in training neural networks. Starting with a high learning rate allows fast initial convergence, while gradually decreasing it enables fine-tuning and helps the model converge to a better minimum.

The exponential decay formula is:

lr = initial_lr * decay_rate ^ epoch

Your function learning_rate_scheduler(initial_lr, decay_rate, epoch) should compute the decayed learning rate at the given epoch.

Example:

Input: initial_lr = 0.1, decay_rate = 0.5, epoch = 2

Calculation: lr = 0.1 * (0.5 ^ 2) = 0.1 * 0.25 = 0.025

Output: 0.025
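A minimal sketch of the function described above, following the name and signature given in the prompt (the rounding in the demo call is only for display):

```python
def learning_rate_scheduler(initial_lr: float, decay_rate: float, epoch: int) -> float:
    """Exponential decay: lr = initial_lr * decay_rate ** epoch."""
    return initial_lr * decay_rate ** epoch

# The worked example: 0.1 * 0.5**2 = 0.025
print(round(learning_rate_scheduler(0.1, 0.5, 2), 6))  # 0.025
```

Since the decay is a single power-and-multiply, the epoch can be evaluated directly without iterating over earlier epochs.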

A good learning rate schedule prevents the optimizer from overshooting the minimum in the loss landscape. Early in training, a larger learning rate helps traverse the loss surface quickly. As training progresses, reducing the learning rate allows the optimizer to settle into a precise minimum rather than bouncing around it.
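To see this behavior concretely, the example hyperparameters (initial_lr = 0.1, decay_rate = 0.5) halve the learning rate every epoch. A short sketch, reusing the function the prompt defines:

```python
def learning_rate_scheduler(initial_lr, decay_rate, epoch):
    return initial_lr * decay_rate ** epoch

# With decay_rate = 0.5, the learning rate halves each epoch:
# large early steps, progressively finer steps later in training.
for epoch in range(5):
    print(f"epoch {epoch}: lr = {learning_rate_scheduler(0.1, 0.5, epoch):.5f}")
```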

Constraints:

  • The initial learning rate should be between 0 and 1
  • The decay rate should be between 0 and 1
  • The epoch should be a non-negative integer
  • When decay_rate is 1, the learning rate remains constant
Test Cases:

  Test Case 1
  Input: [0.1, 0.5, 2]
  Expected: 0.025

  Test Case 2
  Input: [0.05, 0.8, 3]
  Expected: 0.0256

  + 3 hidden test cases