
Problem 95 (Easy)

Implement the **Leaky ReLU** activation function and its derivative.

Leaky ReLU:

    leaky_relu(x) = x          if x > 0
                  = alpha * x  if x <= 0

Derivative:

    leaky_relu'(x) = 1      if x > 0
                   = alpha  if x <= 0

Example:

leaky_relu([-2, -1, 0, 1, 2], alpha=0.01) → [-0.02, -0.01, 0, 1, 2]

**Explanation:** Leaky ReLU fixes the "dying ReLU" problem by allowing a small gradient (alpha) for negative inputs instead of zero. This ensures neurons can still learn even when their input is negative.

Constraints:

  • Default alpha = 0.01
  • Input can be a scalar or a numpy array

Test Cases:

  Test Case 1
  Input: x=[-2,-1,0,1,2], alpha=0.01
  Expected: [-0.02,-0.01,0,1,2]

  Test Case 2
  Input: x=5, alpha=0.01
  Expected: 5

  Test Case 3
  Input: x=-5, alpha=0.1
  Expected: -0.5

  + 2 hidden test cases
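One possible NumPy sketch of the piecewise definitions above (the helper name `leaky_relu_derivative` and the scalar-unwrapping detail are my choices, not specified by the problem):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: x for x > 0, alpha * x for x <= 0."""
    arr = np.asarray(x, dtype=float)
    out = np.where(arr > 0, arr, alpha * arr)
    # Return a plain scalar when the input was a scalar.
    return out.item() if np.isscalar(x) else out

def leaky_relu_derivative(x, alpha=0.01):
    """Derivative: 1 for x > 0, alpha for x <= 0."""
    arr = np.asarray(x, dtype=float)
    out = np.where(arr > 0, 1.0, alpha)
    return out.item() if np.isscalar(x) else out

print(leaky_relu([-2, -1, 0, 1, 2], alpha=0.01))  # [-0.02 -0.01  0.    1.    2.  ]
```

`np.where` evaluates the condition elementwise, so the same code handles scalars and arrays; note that at x = 0 the derivative takes the alpha branch, matching the `x <= 0` case in the definition.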