Implement **Categorical Cross-Entropy** loss for multi-class classification.
Formula:
CCE = -(1/n) * sum_over_samples( sum_over_classes( y_true_ij * log(y_pred_ij) ) )
Write a function categorical_cross_entropy(y_true, y_pred) where y_true is a list of one-hot encoded label rows, y_pred is a same-shaped list of predicted probability rows, and the return value is the mean loss as a float (expected values below are shown to four decimal places).
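A direct transcription of the formula into plain Python (a minimal sketch; skipping zero targets is my assumption, made so rows containing exact 0.0 predictions, as in Test Case 2, never reach log(0)):

```python
import math

def categorical_cross_entropy(y_true, y_pred):
    """Mean categorical cross-entropy over all samples (natural log)."""
    n = len(y_true)
    total = 0.0
    for true_row, pred_row in zip(y_true, y_pred):
        for t, p in zip(true_row, pred_row):
            if t != 0:  # zero targets contribute nothing; skipping them also avoids log(0)
                total -= t * math.log(p)
    return total / n
```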
Example:
y_true = [[1,0,0], [0,1,0], [0,0,1]]
y_pred = [[0.9,0.05,0.05], [0.1,0.8,0.1], [0.1,0.1,0.8]]
categorical_cross_entropy(y_true, y_pred) → 0.1839
**Explanation:** Only the predicted probability for the true class matters in each row (because the one-hot vector zeros out all other terms). Lower predicted probability for the correct class means higher loss.
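To make the arithmetic concrete, the example's loss comes from just the three true-class probabilities:

```python
import math
# Sample 1: true class has p = 0.9 → -ln(0.9) ≈ 0.1054
# Sample 2: true class has p = 0.8 → -ln(0.8) ≈ 0.2231
# Sample 3: true class has p = 0.8 → -ln(0.8) ≈ 0.2231
losses = [-math.log(0.9), -math.log(0.8), -math.log(0.8)]
print(round(sum(losses) / 3, 4))  # 0.1839
```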
Constraints:
- y_true and y_pred have the same shape, and each row of y_true is one-hot.
- Each entry of y_pred is a probability in [0, 1]; exact 0.0 and 1.0 can appear (see Test Case 2).
Test Cases
Test Case 1
Input:
y_true=[[1,0,0],[0,1,0],[0,0,1]], y_pred=[[0.9,0.05,0.05],[0.1,0.8,0.1],[0.1,0.1,0.8]]
Expected:
0.1839

Test Case 2
Input:
y_true=[[1,0],[0,1]], y_pred=[[1.0,0.0],[0.0,1.0]]
Expected:
0.0

Test Case 3
Input:
y_true=[[1,0],[0,1]], y_pred=[[0.5,0.5],[0.5,0.5]]
Expected:
0.6931

+ 2 hidden test cases
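For reference, a vectorized NumPy variant that reproduces the public test cases (the function name and the 1e-12 clip are my own choices; the clip makes a true-class probability of exactly 0 yield a large finite loss instead of a math error):

```python
import numpy as np

def categorical_cross_entropy_np(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), 1e-12, 1.0)  # guard against log(0)
    # Sum over classes for each sample, then average over samples.
    # "+ 0.0" normalizes a possible -0.0 result to 0.0.
    return float(-np.mean(np.sum(y_true * np.log(y_pred), axis=1)) + 0.0)

# Public test cases:
print(round(categorical_cross_entropy_np([[1,0,0],[0,1,0],[0,0,1]],
                                          [[0.9,0.05,0.05],[0.1,0.8,0.1],[0.1,0.1,0.8]]), 4))  # 0.1839
print(round(categorical_cross_entropy_np([[1,0],[0,1]], [[1.0,0.0],[0.0,1.0]]), 4))            # 0.0
print(round(categorical_cross_entropy_np([[1,0],[0,1]], [[0.5,0.5],[0.5,0.5]]), 4))            # 0.6931
```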