Implement the **R-Squared (R^2) Score** (Coefficient of Determination).
Formula:
SS_res = sum((y_true - y_pred)^2) # Residual sum of squares
SS_tot = sum((y_true - mean(y_true))^2) # Total sum of squares
R^2 = 1 - SS_res / SS_tot
Write a function r2_score(y_true, y_pred) that returns the R^2 value.
Example:
y_true = [3, 5, 2, 7, 9]
y_pred = [2.8, 5.2, 2.1, 6.8, 9.1]
mean(y_true) = 5.2
SS_res = (0.2^2 + 0.2^2 + 0.1^2 + 0.2^2 + 0.1^2) = 0.14
SS_tot = (2.2^2 + 0.2^2 + 3.2^2 + 1.8^2 + 3.8^2) = 4.84 + 0.04 + 10.24 + 3.24 + 14.44 = 32.8
R^2 = 1 - 0.14/32.8 = 0.9957 (approx)
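The worked arithmetic above can be checked with a short Python snippet (variable names here are illustrative, not part of the required API):

```python
y_true = [3, 5, 2, 7, 9]
y_pred = [2.8, 5.2, 2.1, 6.8, 9.1]

# Mean of the true values: 26 / 5 = 5.2
mean_true = sum(y_true) / len(y_true)

# Residual sum of squares: sum of squared prediction errors
ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))   # ~0.14

# Total sum of squares: squared deviations from the mean
ss_tot = sum((t - mean_true) ** 2 for t in y_true)           # ~32.8

r2 = 1 - ss_res / ss_tot                                     # ~0.9957
```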
**Explanation:** R^2 measures how well predictions match actual values. A score of 1.0 means perfect prediction. A score of 0.0 means the model is no better than predicting the mean. Negative values mean the model is worse than the mean.
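A minimal pure-Python sketch of the required function, assuming `y_true` and `y_pred` are equal-length, non-empty numeric sequences and `y_true` is not constant (so SS_tot > 0 and the division is safe):

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_true = sum(y_true) / len(y_true)
    # Residual sum of squares: how far predictions miss the true values
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    # Total sum of squares: variance of the true values around their mean
    ss_tot = sum((t - mean_true) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

Note that when `y_true` is constant, SS_tot is zero and R^2 is undefined; how to handle that edge case (raise, return 0.0, etc.) is a design choice not specified by the problem.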
Test Cases:
- y_true=[3,5,2,7,9], y_pred=[2.8,5.2,2.1,6.8,9.1] -> R^2 ≈ 0.9957
- y_true=[1,2,3,4,5], y_pred=[1,2,3,4,5] -> R^2 = 1.0
- y_true=[1,2,3], y_pred=[3,3,3] -> R^2 = -1.5