Implement a **Single-Layer Perceptron** classifier, the simplest neural network model.
Algorithm:
1. Initialize weights to zeros and bias to 0
2. For each training sample:
   a. Compute the linear output: z = w @ x + b
   b. Apply the step activation: y_hat = 1 if z >= 0 else 0
   c. Apply the update rule: w = w + lr * (y - y_hat) * x, b = b + lr * (y - y_hat)
3. Repeat for n_iterations epochs
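The steps above could be sketched as follows. This is a minimal NumPy implementation: the class name and constructor parameters mirror the usage example in this document, while the internal attribute names (`w`, `b`) are one reasonable choice, not prescribed by the problem.

```python
import numpy as np

class Perceptron:
    """Single-layer perceptron with a step activation (sketch)."""

    def __init__(self, learning_rate=0.01, n_iterations=1000):
        self.lr = learning_rate
        self.n_iterations = n_iterations
        self.w = None
        self.b = 0.0

    def _activation(self, z):
        # Step function: 1 if z >= 0 else 0
        return np.where(z >= 0, 1, 0)

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        self.w = np.zeros(X.shape[1])  # weights start at zero
        self.b = 0.0
        for _ in range(self.n_iterations):       # repeat for n_iterations epochs
            for xi, yi in zip(X, y):             # one pass over the samples
                y_hat = self._activation(self.w @ xi + self.b)
                update = self.lr * (yi - y_hat)  # nonzero only on a mistake
                self.w += update * xi
                self.b += update
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        return self._activation(X @ self.w + self.b)

# Usage on the AND gate from the test cases below:
p = Perceptron(learning_rate=0.1, n_iterations=100)
p.fit([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 0, 0, 1])
print(p.predict([[0, 0], [0, 1], [1, 0], [1, 1]]))  # -> [0 0 0 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the loop stops making updates after finitely many mistakes; 100 epochs is more than enough here.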
Example:
```python
p = Perceptron(learning_rate=0.01, n_iterations=1000)
p.fit(X_train, y_train)          # y must be 0 or 1
predictions = p.predict(X_test)
```
**Explanation:** The Perceptron learns a linear decision boundary. It updates weights only when a prediction is wrong. It converges for linearly separable data but cannot solve non-linear problems (like XOR).
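The XOR limitation is easy to demonstrate: no line separates XOR's classes, so at least one of the four points stays misclassified no matter how long training runs. A quick self-contained check (plain-Python version of the same update rule, written for illustration):

```python
# Train the perceptron update rule on XOR and measure training accuracy.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_xor = [0, 1, 1, 0]

w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(1000):  # far more epochs than AND/OR would need
    for xi, yi in zip(X, y_xor):
        y_hat = 1 if w[0] * xi[0] + w[1] * xi[1] + b >= 0 else 0
        w[0] += lr * (yi - y_hat) * xi[0]
        w[1] += lr * (yi - y_hat) * xi[1]
        b += lr * (yi - y_hat)

preds = [1 if w[0] * a + w[1] * c + b >= 0 else 0 for a, c in X]
accuracy = sum(p == t for p, t in zip(preds, y_xor)) / 4
print(accuracy)  # stays below 1.0: XOR is not linearly separable
```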
Constraints:
Test Cases

Test Case 1
Input: AND gate: [[0,0],[0,1],[1,0],[1,1]], y=[0,0,0,1]
Expected: Learns the AND function correctly

Test Case 2
Input: OR gate: [[0,0],[0,1],[1,0],[1,1]], y=[0,1,1,1]
Expected: Learns the OR function correctly

Test Case 3
Input: activation(5)=1, activation(-3)=0, activation(0)=1
Expected: Correct step function outputs

+ 2 hidden test cases