60.

Easy

Implement the **R-Squared (R^2) Score** (Coefficient of Determination).

Formula:

SS_res = sum((y_true - y_pred)^2) # Residual sum of squares

SS_tot = sum((y_true - mean(y_true))^2) # Total sum of squares

R^2 = 1 - SS_res / SS_tot

Write a function r2_score(y_true, y_pred) that returns the R^2 value.

Example:

y_true = [3, 5, 2, 7, 9]

y_pred = [2.8, 5.2, 2.1, 6.8, 9.1]

mean(y_true) = 5.2

SS_res = (0.2^2 + 0.2^2 + 0.1^2 + 0.2^2 + 0.1^2) = 0.14

SS_tot = (2.2^2 + 0.2^2 + 3.2^2 + 1.8^2 + 3.8^2) = 32.8

R^2 = 1 - 0.14/32.8 = 0.9957 (approx)

**Explanation:** R^2 measures how well predictions match actual values. A score of 1.0 means perfect prediction. A score of 0.0 means the model is no better than predicting the mean. Negative values mean the model is worse than the mean.
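One way the function could be implemented from scratch (a minimal numpy sketch following the formula above; the input-coercion step is an added convenience, not required by the problem):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Residual sum of squares: squared error of the predictions
    ss_res = np.sum((y_true - y_pred) ** 2)
    # Total sum of squares: squared error of always predicting the mean
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot
```

Note that when `y_true` is constant, `ss_tot` is zero and the division is undefined; the constraints here do not cover that case, so the sketch leaves it unhandled.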

Constraints:

  • y_true and y_pred are 1D numpy arrays of equal length
  • Return a single float value
  • Do not use sklearn; implement from scratch
Test Cases:

    Test Case 1
    Input: y_true=[3,5,2,7,9], y_pred=[2.8,5.2,2.1,6.8,9.1]
    Expected: 0.9957317073170732
    Test Case 2
    Input: y_true=[1,2,3,4,5], y_pred=[1,2,3,4,5]
    Expected: 1.0
    Test Case 3
    Input: y_true=[1,2,3], y_pred=[3,3,3]
    Expected: -1.5
    + 2 hidden test cases