
52. Multiple Linear Regression (Medium)

Implement **Multiple Linear Regression** using gradient descent for models with multiple features: **y = Xw + b**.

Gradient Descent Update Rules:

y_pred = X @ w + b

dw = (2/n) * (X^T @ (y_pred - y))

db = (2/n) * sum(y_pred - y)

w = w - learning_rate * dw

b = b - learning_rate * db

Where X is a matrix of shape (n_samples, n_features) and w is a vector of shape (n_features,).
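The update rules above can be sketched as a single NumPy step (variable names mirror the formulas; `gradient_step` is a hypothetical helper, not part of the required class):

```python
import numpy as np

def gradient_step(X, y, w, b, learning_rate):
    """One gradient-descent update for MSE loss on y = Xw + b."""
    n = X.shape[0]
    y_pred = X @ w + b            # shape (n_samples,)
    error = y_pred - y            # shape (n_samples,)
    dw = (2 / n) * (X.T @ error)  # shape (n_features,): one partial per feature
    db = (2 / n) * np.sum(error)  # scalar
    return w - learning_rate * dw, b - learning_rate * db
```

Note that `X.T @ error` sums each feature column against the residuals in one matrix product, which is why no explicit loop over features is needed.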

Your class should have:

  • `fit(X, y)`: Train using gradient descent. Initialize weights to zeros.
  • `predict(X)`: Return predicted values using learned weights and bias.
  • Example:

    X = [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]]

    y = [5, 8, 11, 14, 17] (i.e., approximately y = 2*x1 + x2 + 1)

    model = MultipleLinearRegression(learning_rate=0.01, n_iterations=1000)

    model.fit(X, y)

    model.predict([[6, 7]]) # Should return value close to 20

    **Explanation:** With multiple features, the model learns a weight for each feature and a single bias term.
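One way the class could look (a sketch under the stated constraints, not a reference solution; it uses the zero initialization and update rules given above):

```python
import numpy as np

class MultipleLinearRegression:
    def __init__(self, learning_rate=0.01, n_iterations=1000):
        self.learning_rate = learning_rate
        self.n_iterations = n_iterations
        self.w = None
        self.b = 0.0

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.w = np.zeros(n_features)  # all weights start at 0 (per constraints)
        self.b = 0.0                   # bias starts at 0
        for _ in range(self.n_iterations):
            y_pred = X @ self.w + self.b
            error = y_pred - y
            dw = (2 / n_samples) * (X.T @ error)
            db = (2 / n_samples) * np.sum(error)
            self.w -= self.learning_rate * dw
            self.b -= self.learning_rate * db
        return self

    def predict(self, X):
        return X @ self.w + self.b
```

With the example data passed as NumPy arrays (per the constraints), `predict` on `[[6, 7]]` comes out very close to 20.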

    Constraints:

  • Initialize all weights to 0 and bias to 0
  • X is a 2D numpy array of shape (n_samples, n_features)
  • y is a 1D numpy array of shape (n_samples,)
Test Cases:

    Test Case 1
    Input: X=[[1,2],[2,3],[3,4],[4,5],[5,6]], y=[5,8,11,14,17]
    Expected: predictions close to y=2*x1+x2+1
    Test Case 2
    Input: X=[[1,0],[0,1],[1,1]], y=[2,3,5]
    Expected: predictions close to y=2*x1+3*x2
    Test Case 3
    Input: X=[[1,1],[2,2],[3,3]], y=[3,6,9]
    Expected: predictions close to y=x1+2*x2 or y=2*x1+x2
    + 2 hidden test cases