
143. Focal Loss (Medium)

Implement the Focal Loss function, designed to address class imbalance in classification tasks. Focal Loss down-weights the loss contribution from easy, well-classified examples so the model focuses its learning on hard, misclassified examples.

Formula:

FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)

Where:

  • p_t = predicted probability for the true class: p if y = 1, 1 - p if y = 0
  • alpha = balancing factor (default 0.25)
  • gamma = focusing parameter (default 2); controls the strength of down-weighting

Example:

Input: predicted_prob = 0.7, true_label = 1, gamma = 2, alpha = 0.25

p_t = 0.7 (since true_label = 1)

FL = -0.25 * (1 - 0.7)^2 * log(0.7)
   = -0.25 * 0.09 * (-0.3567)
   ≈ 0.0080

Output: 0.008 (approximately)

When the model is already confident (p_t close to 1), the factor (1 - p_t)^gamma becomes very small, reducing the loss. When the model is uncertain (p_t close to 0), the modulating factor is close to 1, preserving the full loss signal. This mechanism steers training toward harder examples.
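The formula above can be sketched as a small NumPy function. This is a minimal sketch, not a reference solution: the function name and signature are illustrative, and clipping with a small `eps` is an added safeguard so `log` never receives 0.

```python
import numpy as np

def focal_loss(predicted_prob, true_label, gamma=2.0, alpha=0.25, eps=1e-12):
    """Focal loss for a single binary prediction.

    FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t),
    where p_t is the probability assigned to the true class.
    """
    # Probability assigned to the true class
    p_t = predicted_prob if true_label == 1 else 1.0 - predicted_prob
    # Clip away from 0 so the logarithm stays finite
    p_t = np.clip(p_t, eps, 1.0)
    return float(-alpha * (1.0 - p_t) ** gamma * np.log(p_t))

print(round(focal_loss(0.7, 1), 4))  # matches the worked example: 0.008
```

Note how the `(1 - p_t) ** gamma` factor implements the down-weighting described above: for a confident prediction like `focal_loss(0.95, 1)` the loss is orders of magnitude smaller than for an uncertain one like `focal_loss(0.6, 1)`.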

Constraints:

  • `predicted_prob` is a float in (0, 1).
  • `true_label` is either 0 or 1.
  • `gamma` >= 0 (typically 2).
  • `alpha` is a float in (0, 1) (typically 0.25).
  • Use NumPy for logarithm and power operations.
Test Cases

Test Case 1
Input: [0.7, 1]
Expected: 0.23738376160374376

Test Case 2
Input: [0.4, 0]
Expected: 0.2231435513142097

+ 3 hidden test cases
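In practice the loss is usually computed over a batch rather than a single example. A hedged sketch of a vectorized variant, using `np.where` to select p_t per sample (the function name and batch-mean reduction are assumptions, not part of the problem statement):

```python
import numpy as np

def focal_loss_batch(probs, labels, gamma=2.0, alpha=0.25, eps=1e-12):
    """Mean focal loss over a batch of binary predictions."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # p_t: probability assigned to the true class for each sample
    p_t = np.where(labels == 1, probs, 1.0 - probs)
    # Clip away from 0 so np.log stays finite
    p_t = np.clip(p_t, eps, 1.0)
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t)))
```

For example, `focal_loss_batch([0.7, 0.4], [1, 0])` averages the per-sample losses of the two predictions.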