Implement a **Dense (Fully Connected) Layer**, the most common layer type in neural networks.
Forward Pass:
output = inputs @ weights + bias
Where:
- `inputs`: matrix of shape `(batch_size, n_inputs)`, one sample per row
- `weights`: matrix of shape `(n_inputs, n_neurons)`
- `bias`: vector of shape `(n_neurons,)`, one bias per neuron
Example:
layer = DenseLayer(n_inputs=3, n_neurons=2)
inputs = [[1.0, 2.0, 3.0],
          [4.0, 5.0, 6.0]]
output = layer.forward(inputs)  # shape (2, 2)
**Explanation:** Each neuron computes a weighted sum of ALL inputs plus a bias. The "dense" name comes from every input being connected to every neuron. Stacking the neurons' weight vectors as columns turns the whole layer into a single matrix multiplication: many neurons computed in parallel.
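As a concrete sketch of the class the example assumes, here is one possible `DenseLayer` in NumPy. The small-random-weights and zero-bias initialization is an assumption (a common convention), not something the task mandates:

```python
import numpy as np

class DenseLayer:
    """A fully connected layer: output = inputs @ weights + bias."""

    def __init__(self, n_inputs, n_neurons):
        # Assumed initialization: small random weights, zero biases.
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.bias = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # Weighted sum of all inputs plus bias, for every sample in the batch.
        # Shapes: (batch, n_inputs) @ (n_inputs, n_neurons) -> (batch, n_neurons)
        self.output = np.asarray(inputs) @ self.weights + self.bias
        return self.output
```

Storing `self.output` on the instance is a design choice that later makes the backward pass easier to wire up; returning it as well keeps the `output = layer.forward(inputs)` call style from the example working.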
Constraints:
Test Cases

Test Case 1
Input: inputs shape (1, 3), layer DenseLayer(3, 4)
Expected: output shape (1, 4)

Test Case 2
Input: inputs shape (5, 3), layer DenseLayer(3, 2)
Expected: output shape (5, 2)

Test Case 3
Input: bias=0, weights=identity, input=[1, 2, 3]
Expected: [1, 2, 3]

+ 2 hidden test cases
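Test Case 3 is worth checking by hand: with an identity weight matrix and zero bias, the layer is the identity map. A minimal verification with raw NumPy (no layer class needed):

```python
import numpy as np

inputs = np.array([[1.0, 2.0, 3.0]])
weights = np.eye(3)    # identity: neuron i just passes input i through
bias = np.zeros(3)     # zero bias adds nothing

output = inputs @ weights + bias
print(output)          # [[1. 2. 3.]] -- the input, unchanged
```

This gives a quick sanity check of the matrix-multiplication form before testing with random weights.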