Implement **Binary Logistic Regression** from scratch using gradient descent.
Sigmoid Function:
sigmoid(z) = 1 / (1 + exp(-z))
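A note on the sigmoid: a naive `1 / (1 + exp(-z))` overflows for large negative `z`. One numerically stable sketch (assuming NumPy; the branching trick here is a common convention, not required by the problem):

```python
import numpy as np

def sigmoid(z):
    # Stable sigmoid: for z >= 0 use 1 / (1 + exp(-z));
    # for z < 0 use the equivalent exp(z) / (1 + exp(z)) to avoid overflow.
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    pos = z >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-z[pos]))
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1.0 + ez)
    return out
```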
Gradient Descent Update Rules:
z = X @ w + b
y_pred = sigmoid(z)
dw = (1/n) * (X^T @ (y_pred - y))
db = (1/n) * sum(y_pred - y)
w = w - learning_rate * dw
b = b - learning_rate * db
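The update rules above can be sketched as a complete class. This is one minimal reference implementation, assuming NumPy and the constructor/method names used in the example below; it is a sketch, not the only valid solution:

```python
import numpy as np

class LogisticRegression:
    """Binary logistic regression trained with batch gradient descent."""

    def __init__(self, learning_rate=0.01, n_iterations=1000):
        self.learning_rate = learning_rate
        self.n_iterations = n_iterations
        self.w = None
        self.b = 0.0

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        n, d = X.shape
        self.w = np.zeros(d)       # start from zero weights
        self.b = 0.0
        for _ in range(self.n_iterations):
            y_pred = self._sigmoid(X @ self.w + self.b)
            dw = (X.T @ (y_pred - y)) / n   # gradient w.r.t. weights
            db = np.sum(y_pred - y) / n     # gradient w.r.t. bias
            self.w -= self.learning_rate * dw
            self.b -= self.learning_rate * db
        return self

    def predict_proba(self, X):
        X = np.asarray(X, dtype=float)
        return self._sigmoid(X @ self.w + self.b)

    def predict(self, X, threshold=0.5):
        # Threshold probabilities at 0.5 (as in Test Case 3) for hard labels.
        return (self.predict_proba(X) >= threshold).astype(int)
```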
Your class should have:
- `__init__(self, learning_rate=0.01, n_iterations=1000)` — store the hyperparameters
- `fit(X, y)` — learn the weights `w` and bias `b` via gradient descent
- `predict_proba(X)` — return sigmoid probabilities in [0, 1]
- `predict(X)` — return hard labels (0 or 1) using a 0.5 threshold
Example:
# Linearly separable data
X = [[1, 2], [2, 3], [3, 4], [5, 6], [6, 7], [7, 8]]
y = [0, 0, 0, 1, 1, 1]
model = LogisticRegression(learning_rate=0.01, n_iterations=1000)
model.fit(X, y)
model.predict([[4, 5]]) # Should predict 0 or 1
model.predict_proba([[4, 5]]) # Should return probability ~0.5
**Explanation:** Logistic regression uses the sigmoid function to map linear outputs to probabilities [0, 1], then applies a threshold to make binary decisions.
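To illustrate the thresholding step in isolation (the probability values here are hypothetical, chosen only to show the cutoff):

```python
import numpy as np

probs = np.array([0.12, 0.48, 0.51, 0.97])  # hypothetical predicted probabilities
labels = (probs >= 0.5).astype(int)          # apply the 0.5 decision threshold
# labels -> [0, 0, 1, 1]
```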
Constraints:
Test Cases
Test Case 1
Input:
X=[[1,2],[2,3],[3,4],[5,6],[6,7],[7,8]], y=[0,0,0,1,1,1]
Expected:
model classifies low values as 0, high values as 1
Test Case 2
Input:
X=[[0],[1],[2],[3],[4],[5]], y=[0,0,0,1,1,1]
Expected:
decision boundary around x=2.5; predict_proba returns values in [0,1]
Test Case 3
Input:
X=[[1],[2],[3],[4]], y=[0,0,1,1], threshold=0.5
Expected:
predict returns array of 0s and 1s

+ 2 hidden test cases