
93.

Easy

Implement the **Sigmoid** activation function and its derivative.

Sigmoid:

sigmoid(x) = 1 / (1 + exp(-x))

Derivative:

sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))

Example:

sigmoid(0) → 0.5

sigmoid(2) → 0.8808

sigmoid_derivative(0) → 0.25

**Explanation:** Sigmoid squashes any input to the range (0, 1). It was historically popular but causes vanishing gradients for large |x|. The derivative peaks at x=0 (value 0.25) and approaches 0 for large |x|.

Constraints:

  • Input can be a scalar or numpy array (use np.exp for vectorized computation)
  • sigmoid_derivative should compute the derivative at the given input x
Test Cases:

  Test Case 1
  Input: x=0
  Expected: sigmoid=0.5, derivative=0.25

  Test Case 2
  Input: x=[-2,-1,0,1,2]
  Expected: sigmoid=[0.1192,0.2689,0.5,0.7311,0.8808]

  Test Case 3
  Input: x=100
  Expected: sigmoid≈1.0, derivative≈0.0

  + 2 hidden test cases
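A minimal reference sketch of one possible solution, using np.exp so both scalars and numpy arrays are handled, and checking it against the visible test cases (the hidden cases are not known here):

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: squashes any input to the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative sigmoid(x) * (1 - sigmoid(x)); peaks at 0.25 when x = 0."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Test Case 1: the derivative's maximum sits at x = 0
print(sigmoid(0))                # 0.5
print(sigmoid_derivative(0))     # 0.25

# Test Case 2: vectorized over a numpy array
print(np.round(sigmoid(np.array([-2, -1, 0, 1, 2])), 4))

# Test Case 3: saturation for large |x| -- the vanishing-gradient regime
print(sigmoid(100))              # ~1.0
print(sigmoid_derivative(100))   # ~0.0
```

Note that sigmoid_derivative reuses sigmoid(x) rather than re-deriving the expression, mirroring the identity sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) given above.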