Implement **Simple Linear Regression** using gradient descent to fit a model of the form **y = wx + b**.
**Gradient Descent Update Rules:**

```
dw = (1/n) * sum(2 * x * (w*x + b - y))
db = (1/n) * sum(2 * (w*x + b - y))
w = w - learning_rate * dw
b = b - learning_rate * db
```
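As a sketch, the update rules above translate directly into one plain-Python gradient step (the helper name `gradient_step` is illustrative, not part of the required interface):

```python
def gradient_step(X, y, w, b, learning_rate):
    """Perform one gradient-descent update for y = w*x + b under MSE loss."""
    n = len(X)
    # Residuals of the current fit: (w*x + b) - y
    errors = [w * x + b - yi for x, yi in zip(X, y)]
    # Gradients of the mean squared error with respect to w and b
    dw = (2 / n) * sum(x * e for x, e in zip(X, errors))
    db = (2 / n) * sum(errors)
    # Step against the gradient
    return w - learning_rate * dw, b - learning_rate * db

# One step from w = b = 0 on y = 2x already moves toward the solution
w, b = gradient_step([1, 2, 3, 4, 5], [2, 4, 6, 8, 10], 0.0, 0.0, 0.01)
# → w = 0.44, b = 0.12
```

Repeating this step `n_iterations` times is exactly what `fit` should do.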
Your class should have:
- `__init__(self, learning_rate, n_iterations)` — store the hyperparameters and initialize `w` and `b`
- `fit(X, y)` — run gradient descent for `n_iterations` steps to learn `w` and `b`
- `predict(X)` — return `w * x + b` for each input value
**Example:**

```python
X = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]  # i.e., y = 2x

model = SimpleLinearRegression(learning_rate=0.01, n_iterations=1000)
model.fit(X, y)
model.predict([6, 7])  # Should return values close to [12, 14]
```
**Explanation:** The model learns weight ~ 2.0 and bias ~ 0.0, fitting the line y = 2x.
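For reference, a minimal pure-Python implementation consistent with the example above might look like the following sketch (no NumPy required; the update math mirrors the rules stated earlier, and this is one possible solution, not the only acceptable one):

```python
class SimpleLinearRegression:
    def __init__(self, learning_rate=0.01, n_iterations=1000):
        self.learning_rate = learning_rate
        self.n_iterations = n_iterations
        self.w = 0.0  # weight, initialized to zero
        self.b = 0.0  # bias, initialized to zero

    def fit(self, X, y):
        n = len(X)
        for _ in range(self.n_iterations):
            # Residuals of the current fit: (w*x + b) - y
            errors = [self.w * x + self.b - yi for x, yi in zip(X, y)]
            # MSE gradients from the update rules
            dw = (2 / n) * sum(x * e for x, e in zip(X, errors))
            db = (2 / n) * sum(errors)
            self.w -= self.learning_rate * dw
            self.b -= self.learning_rate * db
        return self

    def predict(self, X):
        return [self.w * x + self.b for x in X]

model = SimpleLinearRegression(learning_rate=0.01, n_iterations=1000)
model.fit([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])
preds = model.predict([6, 7])  # close to [12, 14]
```

With these hyperparameters the learned parameters land within a few hundredths of `w = 2.0`, `b = 0.0`; a larger learning rate converges faster but can diverge if set too high.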
**Constraints:**
**Test Cases**

Test Case 1
Input: X=[1,2,3,4,5], y=[2,4,6,8,10]
Expected: predictions close to y=2x (weight~2.0, bias~0.0)

Test Case 2
Input: X=[1,2,3,4,5], y=[3,5,7,9,11]
Expected: predictions close to y=2x+1 (weight~2.0, bias~1.0)

Test Case 3
Input: X=[0,1,2,3], y=[1,1,1,1]
Expected: predictions close to y=1 (weight~0.0, bias~1.0)

\+ 2 hidden test cases