
96. Tanh (Hyperbolic Tangent)

Difficulty: Easy

Implement the **Tanh (Hyperbolic Tangent)** activation function and its derivative.

Tanh:

tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

Derivative:

tanh'(x) = 1 - tanh(x)^2
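This identity follows from the quotient rule applied to the definition above; a one-line sketch:

```latex
\frac{d}{dx}\tanh(x)
  = \frac{(e^{x}+e^{-x})^{2} - (e^{x}-e^{-x})^{2}}{(e^{x}+e^{-x})^{2}}
  = 1 - \tanh^{2}(x)
```

Expressing the derivative in terms of tanh(x) itself is convenient in practice: a forward pass that has already computed tanh(x) can reuse that value for the backward pass.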

Example:

tanh(0) → 0.0

tanh(1) → 0.7616

tanh_derivative(0) → 1.0

**Explanation:** Tanh squashes input to (-1, 1) and is zero-centered (unlike sigmoid which outputs in (0,1)). This makes it preferred over sigmoid for hidden layers. However, it still suffers from vanishing gradients at extreme values.

Constraints:

  • Input can be a scalar or a numpy array
  • You may use np.tanh or implement it directly from the formula

Test Cases:

  Test Case 1
  Input: x=0
  Expected: tanh=0.0, derivative=1.0

  Test Case 2
  Input: x=[-2,-1,0,1,2]
  Expected: tanh=[-0.9640,-0.7616,0,0.7616,0.9640]

  Test Case 3
  Input: x=100
  Expected: tanh≈1.0, derivative≈0.0

  + 2 hidden test cases
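One possible solution, assuming numpy is available (as the constraints suggest). Note that evaluating the textbook formula naively would overflow for large inputs such as x=100, since exp(100) exceeds float64 range; np.tanh avoids this and handles scalars and arrays alike:

```python
import numpy as np

def tanh(x):
    # np.tanh accepts scalars and arrays, and is numerically stable
    # at extreme values (tanh(100) -> 1.0 without overflow warnings).
    return np.tanh(x)

def tanh_derivative(x):
    # Uses the identity tanh'(x) = 1 - tanh(x)^2, reusing the
    # tanh value rather than recomputing exponentials.
    t = np.tanh(x)
    return 1.0 - t ** 2
```

Usage matching the examples: tanh(0) gives 0.0, tanh(1) is about 0.7616, and tanh_derivative(0) gives 1.0; at x=100 the output saturates to 1.0 with a derivative of essentially 0, illustrating the vanishing-gradient behavior mentioned above.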