Click4Ai

150.

Hard

Neural Architecture Search (NAS)

Implement a simple Neural Architecture Search (NAS) that explores the space of possible network architectures by evaluating every combination of layer types. NAS automates the design of neural network architectures by systematically searching over configurations of layers, neurons, and activation functions.

Algorithm:

1. Define the search space: layer_types = ['convolutional', 'fully_connected']

2. For a given num_layers, generate all possible architectures:

architectures = product(layer_types, repeat=num_layers)

3. For each candidate architecture:

- Simulate training and evaluate accuracy (or use a proxy metric)

4. Return the architecture with the highest accuracy.

Search space size = |layer_types|^num_layers

e.g., 2 types, 3 layers -> 2^3 = 8 candidate architectures
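The steps above can be sketched as follows (the function name is an assumption; accuracy is simulated with `np.random.rand()` as the constraints suggest, so results vary between runs):

```python
import itertools
import numpy as np

def neural_architecture_search(num_layers):
    """Exhaustively enumerate all layer-type combinations and return
    the one with the highest (simulated) accuracy."""
    layer_types = ['convolutional', 'fully_connected']
    best_arch, best_acc = None, -1.0
    # |layer_types|^num_layers candidates, e.g. 2^3 = 8 for num_layers=3.
    for arch in itertools.product(layer_types, repeat=num_layers):
        acc = np.random.rand()  # simulated accuracy in [0, 1)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return list(best_arch)
```

Because the accuracy is random, any candidate can win; a real NAS would replace `np.random.rand()` with actual training and validation of each architecture.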

Example:

Input: num_layers = 2

Candidate architectures:

('convolutional', 'convolutional')

('convolutional', 'fully_connected')

('fully_connected', 'convolutional')

('fully_connected', 'fully_connected')

After evaluating each candidate with a simulated accuracy score, the best might be:

Output: ['convolutional', 'fully_connected']

The search exhaustively evaluates every possible combination of layer types for the given number of layers. In practice, NAS methods use more sophisticated search strategies (reinforcement learning, evolutionary algorithms, or differentiable search) to handle much larger search spaces efficiently.
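As a minimal illustration of one such strategy, a random-search baseline samples a fixed number of architectures instead of enumerating all of them (the function name, sample count, and seed parameter are assumptions for this sketch; accuracy is still simulated):

```python
import random

def random_search(num_layers, n_samples=10, seed=0):
    """Sample n_samples random architectures and return the best-scoring
    one, avoiding exhaustive enumeration of the search space."""
    layer_types = ['convolutional', 'fully_connected']
    rng = random.Random(seed)
    best_arch, best_acc = None, -1.0
    for _ in range(n_samples):
        # Draw one layer type per position instead of enumerating all combos.
        arch = [rng.choice(layer_types) for _ in range(num_layers)]
        acc = rng.random()  # simulated evaluation
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch
```

Sampling keeps the cost fixed at `n_samples` evaluations regardless of how large `|layer_types|^num_layers` grows, which is why random search is a common NAS baseline.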

Constraints:

  • `num_layers` is a positive integer (1 to 5).
  • Layer types are: 'convolutional' and 'fully_connected'.
  • Use `itertools.product` to generate all combinations.
  • Evaluate each architecture using a simulated accuracy score (`np.random.rand()`).
  • Return the highest-accuracy architecture as a list of layer types.
Test Cases:

  Test Case 1
  Input: 2
  Expected: ['convolutional', 'fully_connected']

  Test Case 2
  Input: 3
  Expected: ['convolutional', 'convolutional', 'fully_connected']

  + 3 hidden test cases