I am using Keras Tuner to tune the hyperparameters of my neural network.
I want to search for the optimal number of hidden layers and the optimal number of units in each layer. To avoid over-parameterizing the model, I want to impose the following condition:
- if the model has two layers, choose the best number of units for each layer, up to 64 units per layer
- if the model has one layer, choose the best number of units for that layer, up to 128 units
How can this condition be imposed?
I have tried this:
for i in range(hp.Choice('num_layers', [1, 2])):
    max_units = 128 if i == 1 else 64
    hp_units = hp.Int(f'units_{i}', min_value=16, max_value=max_units, step=16)
    model.add(tf.keras.layers.Dense(units=hp_units, activation='relu', use_bias=True))
But this just conditions the cap on the layer index rather than on the total number of layers:
- the first layer (i == 0) is always limited to 64 units
- the second layer (i == 1), when present, is allowed up to 128 units
regardless of whether the model ends up with one or two layers.
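I suspect the cap needs to depend on the sampled value of num_layers itself rather than on the loop index. Below is a rough sketch of what I mean (untested; the output layer, loss, and tuner settings are just placeholders for my real setup, and I am not sure whether Keras Tuner will honor a max_value that changes between trials once units_0 has been registered):

import tensorflow as tf
import keras_tuner as kt

def build_model(hp):
    model = tf.keras.Sequential()
    # Sample the number of layers once and reuse the value.
    num_layers = hp.Choice('num_layers', [1, 2])
    # 128 units for a one-layer model, 64 units per layer for a two-layer model.
    max_units = 128 if num_layers == 1 else 64
    for i in range(num_layers):
        hp_units = hp.Int(f'units_{i}', min_value=16, max_value=max_units, step=16)
        model.add(tf.keras.layers.Dense(units=hp_units, activation='relu', use_bias=True))
    # Placeholder output layer and loss; my actual task differs.
    model.add(tf.keras.layers.Dense(1))
    model.compile(optimizer='adam', loss='mse')
    return model

tuner = kt.RandomSearch(build_model, objective='val_loss', max_trials=10)

Is something along these lines the right way to express the condition, or does Keras Tuner expect a different mechanism for conditional hyperparameters?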