Click4Ai

94. Easy

Implement the **ReLU (Rectified Linear Unit)** activation function and its derivative.

ReLU:

relu(x) = max(0, x)

Derivative:

relu'(x) = 1 if x > 0, else 0

Example:

relu([-2, -1, 0, 1, 2]) → [0, 0, 0, 1, 2]

relu_derivative([-2, -1, 0, 1, 2]) → [0, 0, 0, 1, 1]

**Explanation:** ReLU is the most widely used activation function in modern deep learning. It is computationally cheap (just a threshold) and does not suffer from vanishing gradients for positive inputs. However, a neuron whose pre-activation stays negative receives zero gradient and can stop learning entirely (the "dying ReLU" problem).

Constraints:

  • Input can be a scalar or a numpy array
  • At x = 0, the derivative is conventionally defined as 0

Test Cases:

    Test Case 1
    Input: x=[-2,-1,0,1,2]
    Expected: relu=[0,0,0,1,2], derivative=[0,0,0,1,1]

    Test Case 2
    Input: x=5
    Expected: relu=5, derivative=1

    Test Case 3
    Input: x=-5
    Expected: relu=0, derivative=0

    + 2 hidden test cases
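
A minimal sketch of one possible solution, using NumPy so that scalars and arrays are handled uniformly (the function names `relu` and `relu_derivative` follow the examples above):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x); np.maximum broadcasts over scalars and arrays alike.
    return np.maximum(0, x)

def relu_derivative(x):
    # 1 where x > 0, else 0 — including at x = 0, per the stated convention.
    return np.where(np.asarray(x) > 0, 1, 0)
```

Using `np.where` on the boolean mask `x > 0` keeps the derivative vectorized and makes the x = 0 convention explicit: zero falls into the `else` branch, matching the expected output `[0, 0, 0, 1, 1]`.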