Problem 65 (Medium)

Implement **Categorical Cross-Entropy** loss for multi-class classification.

Formula:

CCE = -(1/n) * sum_over_samples( sum_over_classes( y_true_ij * log(y_pred_ij) ) )

where n is the number of samples and log is the natural logarithm.

Write a function `categorical_cross_entropy(y_true, y_pred)` where:

  • `y_true`: One-hot encoded true labels, shape (n_samples, n_classes)
  • `y_pred`: Predicted probabilities (softmax output), shape (n_samples, n_classes)
  • Example:

    y_true = [[1,0,0], [0,1,0], [0,0,1]]

    y_pred = [[0.9,0.05,0.05], [0.1,0.8,0.1], [0.1,0.1,0.8]]

    categorical_cross_entropy(y_true, y_pred) → 0.1839

    **Explanation:** Only the predicted probability for the true class matters in each row (because the one-hot vector zeros out all other terms). Lower predicted probability for the correct class means higher loss.
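    Working the example through the formula (natural log): only the true-class probabilities 0.9, 0.8, and 0.8 survive the one-hot masking, so

        CCE = -(1/3) * (log(0.9) + log(0.8) + log(0.8))
            = -(1/3) * (-0.1054 - 0.2231 - 0.2231)
            ≈ 0.1839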

Constraints:

  • Clip y_pred to [1e-15, 1-1e-15] to avoid log(0)
  • Return a single float (average loss across samples)
Test Cases:

Test Case 1
Input: y_true=[[1,0,0],[0,1,0],[0,0,1]], y_pred=[[0.9,0.05,0.05],[0.1,0.8,0.1],[0.1,0.1,0.8]]
Expected: 0.1839

Test Case 2
Input: y_true=[[1,0],[0,1]], y_pred=[[1.0,0.0],[0.0,1.0]]
Expected: 0.0

Test Case 3
Input: y_true=[[1,0],[0,1]], y_pred=[[0.5,0.5],[0.5,0.5]]
Expected: 0.6931

+ 2 hidden test cases
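
A minimal reference sketch, assuming NumPy is available (plain Python lists plus `math.log` would work just as well):

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred):
    """Average categorical cross-entropy across samples."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)

    # Clip predictions so log() never sees exactly 0 or 1 (see Test Case 2).
    eps = 1e-15
    y_pred = np.clip(y_pred, eps, 1 - eps)

    # The one-hot y_true zeros out every term except the true class,
    # leaving -log(probability assigned to the correct class) per sample.
    per_sample_loss = -np.sum(y_true * np.log(y_pred), axis=1)

    # Average the per-sample losses into a single float.
    return float(np.mean(per_sample_loss))

# Example from the problem statement:
y_true = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
y_pred = [[0.9, 0.05, 0.05], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]]
print(round(categorical_cross_entropy(y_true, y_pred), 4))  # 0.1839
```

The vectorized `y_true * np.log(y_pred)` form implements the double sum from the formula directly; because the labels are one-hot, it is equivalent to indexing out the true-class probability in each row.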