Click4Ai

104. Dropout Layer

Difficulty: Medium

Implement a Dropout layer for a deep learning model. Dropout is a regularization technique that randomly sets a fraction of the input neurons to zero during training. This prevents neurons from co-adapting and forces the network to learn more robust features.

The Dropout operation is computed as follows:

# During training:

mask = random_values > dropout_rate # random_values drawn uniformly from [0, 1); mask is 1 (keep) or 0 (drop)

output = input * mask / (1 - dropout_rate) # Inverted dropout scaling

# During inference:

output = input # No dropout applied

Your function dropout_layer(input_array, dropout_rate) should generate a random binary mask where each element has probability dropout_rate of being zeroed out, apply that mask to the input array, and scale the surviving elements by 1 / (1 - dropout_rate).

Example:

Input: input_array = [[1, 2], [3, 4]], dropout_rate = 0.5

Mask (random): [[1, 0], [1, 0]]

Masked input: [[1, 0], [3, 0]] (elements where mask is 0 are dropped)

Output (after scaling by 1 / (1 - 0.5) = 2): [[2, 0], [6, 0]]

Inverted dropout scales the remaining activations by 1 / (1 - dropout_rate) during training so that the expected value of each neuron remains the same. This means no adjustment is needed during inference, simplifying deployment.
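A minimal NumPy sketch of the function described above (the function name and signature follow the spec; the uniform-random mask and inverted scaling follow the formulas given):

```python
import numpy as np

def dropout_layer(input_array, dropout_rate):
    """Apply inverted dropout to a 2D array during training."""
    # Uniform draws in [0, 1); an element survives when its draw exceeds dropout_rate,
    # so each element is dropped with probability dropout_rate
    mask = np.random.rand(*input_array.shape) > dropout_rate
    # Scale survivors by 1 / (1 - dropout_rate) so the expected activation is unchanged
    return input_array * mask / (1.0 - dropout_rate)
```

With dropout_rate = 0 the function returns the input unchanged; a rate of exactly 1 would divide by zero, which is why the rate must stay strictly below 1.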

Constraints:

  • The dropout rate should be in the range [0, 1) (a rate of 1 would divide by zero in the scaling step)
  • The input should be a 2D numpy array
  • Use numpy's random module to generate the dropout mask
  • Apply the mask using element-wise multiplication
Test Cases:

    Test Case 1
    Input: [[1, 2], [3, 4]], dropout_rate = 0.5
    Expected: [[2, 0], [6, 0]] (mask [[1, 0], [1, 0]], survivors scaled by 2)
    Test Case 2
    Input: [[5, 6], [7, 8]], dropout_rate = 0.5
    Expected: [[0, 12], [0, 16]] (mask [[0, 1], [0, 1]], survivors scaled by 2)
    + 3 hidden test cases
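As a quick sanity check of the expected-value claim above, averaging many training-mode passes should recover the original input, which is exactly why inference needs no rescaling (a sketch; the seed and trial count are arbitrary choices):

```python
import numpy as np

def dropout_layer(input_array, dropout_rate):
    # Same inverted-dropout rule as in the problem statement
    mask = np.random.rand(*input_array.shape) > dropout_rate
    return input_array * mask / (1.0 - dropout_rate)

np.random.seed(42)  # arbitrary seed, for reproducibility only
x = np.array([[1.0, 2.0], [3.0, 4.0]])
# The mean over many dropout passes converges to x itself
mean_out = np.mean([dropout_layer(x, 0.5) for _ in range(20000)], axis=0)
```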