
Problem 64 (Easy)

Implement the **Binary Cross-Entropy (Log Loss)** function, the standard loss function for binary classification.

Formula:

BCE = -(1/n) * sum(y_true * log(y_pred) + (1 - y_true) * log(1 - y_pred))

Write a function `binary_cross_entropy(y_true, y_pred)` that takes an array of true labels (0 or 1) and an array of predicted probabilities (each between 0 and 1).

Example:

y_true = [1, 0, 1, 1, 0]

y_pred = [0.9, 0.1, 0.8, 0.7, 0.2]

binary_cross_entropy(y_true, y_pred) → 0.2027

**Explanation:** For each sample, if the true label is 1, we penalize low predicted probabilities; if the true label is 0, we penalize high predicted probabilities. The log function makes confident wrong predictions very costly.

Constraints:

  • Use np.log (natural logarithm)
  • Clip y_pred to [1e-15, 1-1e-15] to avoid log(0)
  • Return a single float (the average loss)
Test Cases:

    Test Case 1
    Input: y_true=[1,0,1,1,0], y_pred=[0.9,0.1,0.8,0.7,0.2]
    Expected: 0.2027
    Test Case 2
    Input: y_true=[1,1,1], y_pred=[1.0,1.0,1.0]
    Expected: 0.0
    Test Case 3
    Input: y_true=[0,0,0], y_pred=[0.0,0.0,0.0]
    Expected: 0.0
    + 2 hidden test cases
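One way to satisfy the constraints above is a minimal NumPy sketch (not necessarily the judge's reference solution): clip the predictions away from 0 and 1, apply the formula elementwise, and average.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred):
    """Average binary cross-entropy (log loss) over all samples."""
    y_true = np.asarray(y_true, dtype=float)
    # Clip predictions to [1e-15, 1 - 1e-15] so np.log never sees 0
    y_pred = np.clip(np.asarray(y_pred, dtype=float), 1e-15, 1 - 1e-15)
    # Per-sample log likelihood of the true label
    per_sample = y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)
    # Negate and average, returning a plain Python float
    return float(-np.mean(per_sample))

# Example from the problem statement:
# binary_cross_entropy([1, 0, 1, 1, 0], [0.9, 0.1, 0.8, 0.7, 0.2]) ≈ 0.2027
```

Note that the clipping is what makes Test Cases 2 and 3 return (effectively) 0.0 instead of evaluating log(0): a perfectly confident correct prediction contributes roughly 1e-15 to the loss rather than crashing.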