Implement the **F1 Score**, the harmonic mean of precision and recall.
Formula:
F1 = 2 * (Precision * Recall) / (Precision + Recall)
The F1 score balances precision and recall into a single metric. It is especially useful when you have imbalanced classes.
Write a function f1_score(y_true, y_pred) that computes precision and recall internally, then returns the F1 score. If precision + recall is zero (no true positives), return 0.0.
Example:
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1, 0, 1]
f1_score(y_true, y_pred) → 0.8
**Explanation:** Precision = 4/5 = 0.8, Recall = 4/5 = 0.8. F1 = 2 * 0.8 * 0.8 / (0.8 + 0.8) = 0.8.
Test Cases
Test Case 1
Input: y_true=[1,0,1,1,0,1,0,1], y_pred=[1,0,0,1,1,1,0,1]
Expected: 0.8

Test Case 2
Input: y_true=[1,1,1], y_pred=[1,1,1]
Expected: 1.0

Test Case 3
Input: y_true=[1,1,0,0], y_pred=[0,0,1,1]
Expected: 0.0

+ 2 hidden test cases
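One possible solution sketch in plain Python: count true positives, false positives, and false negatives with a single pass over the paired labels, derive precision and recall, and guard the zero-denominator case (exercised by Test Case 3, where precision and recall are both 0).

```python
def f1_score(y_true, y_pred):
    # Count true positives, false positives, and false negatives
    # for the positive class (label 1).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

    # Guard against division by zero when there are no predicted
    # positives or no actual positives.
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0

    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

The same result is available via `sklearn.metrics.f1_score`, but computing the counts directly is the point of the exercise.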