
99. Single Neuron (Easy)

Implement a **Single Neuron** with a forward pass (the fundamental building block of neural networks).

Forward Pass:

z = w1*x1 + w2*x2 + ... + wn*xn + bias (linear combination)

output = sigmoid(z) (activation)

Example:

neuron = Neuron(n_inputs=3)

neuron.weights = [0.5, -0.3, 0.8]

neuron.bias = 0.1

neuron.forward([1.0, 2.0, 3.0])

# z = 0.5*1 + (-0.3)*2 + 0.8*3 + 0.1 = 0.5 - 0.6 + 2.4 + 0.1 = 2.4

# output = sigmoid(2.4) ≈ 0.917

**Explanation:** A neuron computes a weighted sum of inputs plus a bias, then passes the result through an activation function. This is the most basic computation unit in a neural network.
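The forward pass above can be sketched as a small class. This is one possible reference solution, not the only accepted one: the class and method names follow the example (`Neuron`, `forward`), and the weight-initialization range of (-1, 1) is an assumption, since the constraints only say "randomly".

```python
import math
import random

class Neuron:
    """A single neuron: weighted sum of inputs plus bias, then sigmoid."""

    def __init__(self, n_inputs):
        # Per the constraints: random weights, bias initialized to 0.
        # The (-1, 1) range is an assumption; the problem does not specify one.
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.bias = 0.0

    @staticmethod
    def sigmoid(z):
        # Activation: 1 / (1 + exp(-z)), maps any real z into (0, 1).
        return 1.0 / (1.0 + math.exp(-z))

    def forward(self, inputs):
        # Linear combination: z = w1*x1 + w2*x2 + ... + wn*xn + bias
        z = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return self.sigmoid(z)

neuron = Neuron(n_inputs=3)
neuron.weights = [0.5, -0.3, 0.8]
neuron.bias = 0.1
print(round(neuron.forward([1.0, 2.0, 3.0]), 3))  # z = 2.4, sigmoid(2.4) ≈ 0.917
```

Using `zip` keeps the weighted sum a single scalar expression and naturally handles any `n_inputs`; `sigmoid` is a static method so it can also be checked in isolation.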

Constraints:

  • Use sigmoid activation: 1 / (1 + exp(-z))
  • Initialize weights randomly, bias to 0
  • Return a single scalar output
Test Cases:

    Test Case 1
    Input: weights=[1,1,1], bias=0, inputs=[0,0,0]
    Expected: 0.5 (sigmoid(0))

    Test Case 2
    Input: weights=[0.5,-0.3,0.8], bias=0.1, inputs=[1,2,3]
    Expected: sigmoid(2.4) ≈ 0.917

    Test Case 3
    Property: the output is always between 0 and 1
    Expected: True (sigmoid range)

    + 2 hidden test cases