Keras Tuner: select number of units conditional on number of layers
I am using Keras Tuner to tune the hyperparameters of my neural network.

I want to search the optimal number of hidden layers and the optimal number of units in each layer. To avoid overparametrizing the model, I want to impose the following condition:

  • if the model has two layers, choose the best number of units, up to 64 per layer
  • if the model has one layer, choose the best number of units, up to 128 for that layer

How can this condition be imposed?

I have tried this:

for i in range(hp.Choice('num_layers', [1, 2])):
    max_units = 128 if i == 1 else 64
    hp_units = hp.Int(f'units_{i}', min_value=16, max_value=max_units, step=16)
    model.add(tf.keras.layers.Dense(units=hp_units, activation='relu', use_bias=True))

But this just results in the following condition:

  • when building the second layer (i == 1), the number of units is capped at 128
  • when building the first layer (i == 0), the number of units is capped at 64

In other words, the cap depends on the layer index rather than on the total number of layers.
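For reference, the cap can be derived from the sampled num_layers value instead of the loop index. The sketch below uses a hypothetical, simplified HP stand-in for Keras Tuner's HyperParameters object (real Keras Tuner registers each named hyperparameter once and reuses its value on later calls, which is why branch-specific names are used); it only illustrates the sampling logic, not a full tuner run:

```python
import random

class HP:
    """Hypothetical stand-in for Keras Tuner's HyperParameters:
    each named hyperparameter is registered once and its value is
    reused on later calls."""
    def __init__(self):
        self.values = {}

    def Choice(self, name, options):
        return self.values.setdefault(name, random.choice(options))

    def Int(self, name, min_value, max_value, step):
        options = range(min_value, max_value + 1, step)
        return self.values.setdefault(name, random.choice(options))

def sample_layer_units(hp):
    # Sample the layer count first, then derive the per-layer cap
    # from it -- not from the loop index i as in the snippet above.
    num_layers = hp.Choice('num_layers', [1, 2])
    max_units = 128 if num_layers == 1 else 64
    # Branch-specific names so the 1-layer and 2-layer ranges never
    # collide under the register-once rule.
    return [hp.Int(f'units_{num_layers}layers_{i}', 16, max_units, 16)
            for i in range(num_layers)]

units = sample_layer_units(HP())
cap = 128 if len(units) == 1 else 64
assert all(16 <= u <= cap for u in units)
```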
Lisa answered 16/8, 2022 at 7:10
I think it is better to just create two hyperparameter Choice variables, one for the unit count of each layer. If the second layer has zero units, it is omitted entirely.

neurons_first_layer = hp.Choice('neurons_first_layer', [16, 32, 64, 128])
neurons_second_layer = hp.Choice('neurons_second_layer', [0, 16, 32, 64])
model.add(tf.keras.layers.Dense(units=neurons_first_layer, activation='relu', use_bias=True))
if neurons_second_layer:  # 0 units means no second layer
    model.add(tf.keras.layers.Dense(units=neurons_second_layer, activation='relu', use_bias=True))

That way you get 16 combinations:

[(16, 0), (16, 16), (16, 32), (16, 64), (32, 0), (32, 16), (32, 32), 
(32, 64), (64, 0), (64, 16), (64, 32), (64, 64), (128, 0), (128, 16), ...]
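As a quick cross-check of that count, the grid can be enumerated directly with plain Python (no tuner needed), using the same option lists as above:

```python
from itertools import product

first = [16, 32, 64, 128]
second = [0, 16, 32, 64]  # 0 -> the second layer is omitted

# Cartesian product of the two Choice spaces: 4 x 4 = 16 pairs,
# in the same (first, second) order as the list above.
combos = list(product(first, second))
print(len(combos))  # 16
```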
Fancier answered 1/2, 2023 at 23:0
