Focal Loss
Implement the Focal Loss function, designed to address class imbalance in classification tasks. Focal Loss down-weights the loss contribution from easy, well-classified examples so the model focuses its learning on hard, misclassified examples.
Formula:
FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)    (log is the natural logarithm)
Where:
p_t = predicted probability for the true class:
      p_t = p      if y = 1
      p_t = 1 - p  if y = 0
alpha = balancing factor (default 0.25)
gamma = focusing parameter (default 2), controls down-weighting strength
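The formula above can be sketched directly in Python. This is a minimal reference implementation for a single binary prediction; the function name `focal_loss` and its signature are assumptions of this sketch, not part of the problem statement:

```python
import math

def focal_loss(predicted_prob, true_label, gamma=2.0, alpha=0.25):
    """Focal loss for one binary prediction.

    predicted_prob: model probability for the positive class, 0 < p < 1
    true_label: 0 or 1
    """
    # p_t is the probability the model assigned to the true class
    p_t = predicted_prob if true_label == 1 else 1.0 - predicted_prob
    # (1 - p_t)^gamma down-weights easy examples; math.log is the natural log
    return -alpha * (1.0 - p_t) ** gamma * math.log(p_t)
```

For the worked example below, `focal_loss(0.7, 1)` evaluates to about 0.008, matching the hand calculation.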
Example:
Input: predicted_prob = 0.7, true_label = 1, gamma = 2, alpha = 0.25
p_t = 0.7 (since true_label = 1)
FL = -0.25 * (1 - 0.7)^2 * log(0.7)
   = -0.25 * 0.09 * (-0.3567)
   ≈ 0.008025
Output: 0.008025 (approximately)
When the model is already confident (p_t close to 1), the factor (1 - p_t)^gamma becomes very small, reducing the loss. When the model is uncertain (p_t close to 0), the modulating factor is close to 1, preserving the full loss signal. This mechanism steers training toward harder examples.
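The down-weighting behavior described above can be made concrete by tabulating the modulating factor (1 - p_t)^gamma at the default gamma = 2; this loop is purely illustrative:

```python
# How strongly gamma = 2 suppresses the loss as confidence in the true class grows
gamma = 2
for p_t in (0.1, 0.5, 0.9, 0.99):
    factor = (1 - p_t) ** gamma
    # e.g. p_t = 0.9 keeps only 1% of the full loss signal
    print(f"p_t = {p_t:0.2f} -> modulating factor = {factor:.4f}")
```

An uncertain prediction (p_t = 0.1) keeps 81% of the loss, while a confident one (p_t = 0.99) keeps only 0.01%.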
Constraints:
0 < predicted_prob < 1 (log(p_t) must be defined)
true_label is 0 or 1
gamma >= 0, 0 <= alpha <= 1
Test Cases
Input: [0.7, 1] -> Output: 0.008025 (approximately)
Input: [0.4, 0] -> Output: 0.020433 (approximately)