Click4Ai

103. Batch Normalization

Difficulty: Medium

Implement Batch Normalization to normalize the input data across a mini-batch. Batch Normalization is a technique that stabilizes and accelerates the training of deep neural networks by normalizing the inputs of each layer to have zero mean and unit variance.

The Batch Normalization formula is:

mean = (1/n) * sum(x_i)

variance = (1/n) * sum((x_i - mean)^2)

x_norm = (x - mean) / sqrt(variance + epsilon)

output = gamma * x_norm + beta

Your function batch_normalization(X) should normalize the input data along axis 0 (across samples for each feature) by subtracting the per-feature mean and dividing by the per-feature standard deviation. For this problem, no scale or shift is applied afterward (equivalently, gamma = 1 and beta = 0), so the function returns x_norm directly.
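A minimal NumPy sketch of this function follows; the epsilon default of 1e-8 is an assumption (the problem statement includes epsilon in the formula but does not fix its value):

```python
import numpy as np

def batch_normalization(X, epsilon=1e-8):
    """Normalize each feature (column) of X to zero mean and unit variance."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)       # per-feature mean across the batch (axis 0)
    variance = X.var(axis=0)    # per-feature biased variance, matching (1/n) * sum
    return (X - mean) / np.sqrt(variance + epsilon)
```

Note that np.var computes the biased (divide-by-n) variance by default, which matches the formula above; np.std with ddof=0 would give the same denominator.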

Example:

Input: X = [[1, 2], [3, 4]]

Mean per feature: [2.0, 3.0]

Std per feature: [1.0, 1.0]

Output: [[-1.0, -1.0], [1.0, 1.0]]

Batch Normalization addresses the problem of internal covariate shift, where the distribution of each layer's inputs changes during training as the parameters of the previous layers change. By normalizing the inputs, it allows higher learning rates and reduces the sensitivity to weight initialization.
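For reference, the complete formula with the learnable scale (gamma) and shift (beta) parameters could be sketched as below. Passing gamma and beta as per-feature arrays is an assumption for illustration; the graded function in this problem takes only X:

```python
import numpy as np

def batch_norm_full(X, gamma, beta, epsilon=1e-8):
    """Full batch normalization: normalize, then apply per-feature scale and shift."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    variance = X.var(axis=0)
    x_norm = (X - mean) / np.sqrt(variance + epsilon)
    return gamma * x_norm + beta  # broadcasts gamma/beta across the batch
```

With gamma = 1 and beta = 0 this reduces to the plain normalization required here; in a trained network, gamma and beta let each layer recover any scale and offset the normalization removed.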

Constraints:

  • Input shape: (n_samples, n_features)
  • Normalize along axis 0 (per feature across the batch)
  • Return the normalized array with zero mean and unit variance per feature
Test Cases:

Test Case 1
Input: [[1, 2], [3, 4]]
Expected: [[-1.0, -1.0], [1.0, 1.0]]

Test Case 2
Input: [[5, 6], [7, 8]]
Expected: [[-1.0, -1.0], [1.0, 1.0]]

+ 3 hidden test cases