Dice Loss
Implement the Dice Loss function, widely used in image segmentation tasks. The Dice coefficient measures the overlap between two binary sets, and the Dice Loss is defined as 1 minus the Dice coefficient. It is particularly effective when dealing with imbalanced foreground/background ratios.
Formula:
Dice Coefficient = 2 * |X intersection Y| / (|X| + |Y|)
Dice Loss = 1 - Dice Coefficient
Where:
|X intersection Y| = sum of element-wise product of predicted and true arrays
|X| = sum of predicted array
|Y| = sum of true array
Example:
Input: predicted = [1, 0, 1, 0], true = [1, 1, 0, 0]
intersection = 1*1 + 0*1 + 1*0 + 0*0 = 1
|X| = 2, |Y| = 2
Dice = 2 * 1 / (2 + 2) = 0.5
Dice Loss = 1 - 0.5 = 0.5
Output: 0.5
A Dice Loss of 0 indicates perfect overlap between the predicted and true masks, while a Dice Loss of 1 indicates no overlap at all. The metric naturally handles class imbalance because it focuses on the intersection relative to the total number of positive predictions and ground truth positives.
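The definition above can be sketched directly with NumPy. This is a minimal illustrative implementation, not part of the original problem statement; the small `eps` term is an assumption added to guard against division by zero when both masks are empty.

```python
import numpy as np

def dice_loss(predicted, true, eps=1e-7):
    """Compute Dice Loss = 1 - Dice Coefficient for two binary arrays.

    eps guards against a zero denominator when both arrays are all zeros
    (an added assumption, not specified in the problem statement).
    """
    predicted = np.asarray(predicted, dtype=float)
    true = np.asarray(true, dtype=float)
    # |X intersection Y| = sum of element-wise product
    intersection = np.sum(predicted * true)
    # Dice Coefficient = 2 * |X intersection Y| / (|X| + |Y|)
    dice = 2.0 * intersection / (np.sum(predicted) + np.sum(true) + eps)
    return 1.0 - dice
```

Running it on the worked example, `dice_loss([1, 0, 1, 0], [1, 1, 0, 0])` gives 0.5 (up to the tiny `eps` perturbation), and identical masks give a loss of 0.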
Constraints:
Test Cases
Input: predicted = [1, 0, 1, 0], true = [1, 1, 0, 0] -> Output: 0.5
Input: predicted = [1, 1, 1, 1], true = [1, 1, 1, 1] -> Output: 0.0