Click4Ai

456.

Hard

Wasserstein GAN (WGAN) is a variant of the Generative Adversarial Network (GAN) that replaces the Jensen-Shannon divergence implicit in the standard GAN objective with the Wasserstein-1 (Earth Mover's) distance as its loss. Because this distance provides useful gradients even when the real and generated distributions barely overlap, it tends to stabilize training and improve the quality of generated samples. In this problem, you will implement a WGAN whose critic network uses the Leaky ReLU activation function and whose generator network uses the Tanh activation function on its output.
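The WGAN objective described above can be sketched as two small loss functions: the critic is trained to maximize the score gap between real and generated samples (so its loss is the negated gap), and the generator is trained to raise the critic's score on its samples. This is a minimal PyTorch sketch, assuming the critic's raw scalar scores for a batch are already available; the function names are illustrative, not part of the problem statement.

```python
import torch

def critic_loss(real_scores: torch.Tensor, fake_scores: torch.Tensor) -> torch.Tensor:
    # Critic maximizes E[f(real)] - E[f(fake)]; as a loss to minimize,
    # we negate it: E[f(fake)] - E[f(real)].
    return fake_scores.mean() - real_scores.mean()

def generator_loss(fake_scores: torch.Tensor) -> torch.Tensor:
    # Generator minimizes -E[f(fake)], i.e. pushes fake scores up.
    return -fake_scores.mean()

real = torch.tensor([1.0, 2.0, 3.0])  # toy critic scores on real images
fake = torch.tensor([0.0, 1.0, 2.0])  # toy critic scores on generated images
print(critic_loss(real, fake).item())    # -1.0
print(generator_loss(fake).item())       # -1.0
```

Note that, unlike the standard GAN discriminator, the critic's output is an unbounded score, not a probability, so no sigmoid or log terms appear in these losses.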

Example:

Suppose we have a dataset of 1000 images of size 64x64. We want to train a WGAN to generate new images that are similar to the training data.

Constraints:

  • The critic network should have 5 layers with 128 units each.
  • The generator network should have 5 layers with 128 units each.
  • The Leaky ReLU activation function should have a negative slope of 0.2.
  • The Tanh activation function should be used for the generator output.
  • The Adam optimizer should be used with a learning rate of 0.001.
Test Cases:

    Test Case 1
    Input: [[1, 2], [3, 4]]
    Expected: None
    Test Case 2
    Input: [[5, 6], [7, 8]]
    Expected: None
    + 3 hidden test cases
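A possible shape for the networks under the constraints above is sketched below. The constraints fix the layer count (5), width (128), activations, and optimizer, but not the data path widths, so the flattened 64x64 image size, the latent dimension of 100, and the use of ReLU in the generator's hidden layers are assumptions made here for illustration; a full training loop would also enforce the critic's Lipschitz constraint (e.g. by weight clipping), which is omitted.

```python
import torch
import torch.nn as nn

IMG_DIM = 64 * 64     # assumption: 64x64 grayscale images, flattened
LATENT_DIM = 100      # assumption: latent dimension is not specified

def make_critic() -> nn.Sequential:
    layers, in_dim = [], IMG_DIM
    for _ in range(5):                                  # 5 layers, 128 units each
        layers += [nn.Linear(in_dim, 128), nn.LeakyReLU(0.2)]
        in_dim = 128
    layers.append(nn.Linear(128, 1))                    # unbounded Wasserstein score
    return nn.Sequential(*layers)

def make_generator() -> nn.Sequential:
    layers, in_dim = [], LATENT_DIM
    for _ in range(5):                                  # 5 layers, 128 units each
        layers += [nn.Linear(in_dim, 128), nn.ReLU()]   # hidden activation assumed
        in_dim = 128
    layers += [nn.Linear(128, IMG_DIM), nn.Tanh()]      # Tanh output in [-1, 1]
    return nn.Sequential(*layers)

critic, gen = make_critic(), make_generator()
opt_c = torch.optim.Adam(critic.parameters(), lr=0.001)
opt_g = torch.optim.Adam(gen.parameters(), lr=0.001)

# One illustrative critic step on random data.
real = torch.randn(8, IMG_DIM)
fake = gen(torch.randn(8, LATENT_DIM)).detach()
loss_c = critic(fake).mean() - critic(real).mean()
opt_c.zero_grad()
loss_c.backward()
opt_c.step()
print(gen(torch.randn(8, LATENT_DIM)).shape)  # torch.Size([8, 4096])
```

Since the expected outputs of the visible test cases are `None`, the solution likely only needs to construct and train the networks for their side effects rather than return a value.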