Keras tuner: mismatch between number of layers used and number of layers reported
Using the example from the Keras Tuner website, I wrote this simple tuning script:

import tensorflow as tf
from kerastuner.tuners import RandomSearch

base_model = tf.keras.applications.vgg16.VGG16(input_shape=IMG_SHAPE,
                                               include_top=False,
                                               weights='imagenet')
base_model.trainable = False

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(base_model)

    # num_layers controls how many Conv2D/Dropout blocks are added
    for i in range(hp.Int('num_layers', 1, 2)):
        model.add(tf.keras.layers.Conv2D(filters=hp.Int('Conv2D_' + str(i),
            min_value=32,
            max_value=512,
            step=32),
            kernel_size=3, activation='relu'))
        model.add(tf.keras.layers.Dropout(hp.Choice('rate', [0.3, 0.5])))

    model.add(tf.keras.layers.GlobalAveragePooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dropout(0.2))
    model.add(tf.keras.layers.Dense(5, activation='softmax'))

    model.compile(optimizer=tf.keras.optimizers.RMSprop(hp.Choice('learning_rate', [1e-4, 1e-5])),
        loss='categorical_crossentropy',
        metrics=['accuracy'])

    return model


epochs = 2
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=3)

tuner = RandomSearch(
    build_model,
    objective='val_accuracy',
    max_trials=24,
    executions_per_trial=1,
    directory=LOG_DIR)

tuner.search_space_summary()

tuner.search(train_generator,
             callbacks=[callback],
             epochs=epochs,
             steps_per_epoch=train_generator.samples // BATCH_SIZE,
             validation_data=valid_generator,
             validation_steps=valid_generator.samples // BATCH_SIZE,
             verbose=1)

tuner.results_summary()
models = tuner.get_best_models(num_models=2)

However, when I run it, the number of layers reported does not match the value of num_layers. For example, the trial below reports three Conv2D layers and yet shows num_layers as 1. Why?

[Trial summary]
 |-Trial ID: 79cd7bb6146b4c243eb2bc51f19985de
 |-Score: 0.8444444537162781
 |-Best step: 0
 > Hyperparameters:
 |-Conv2D_0: 448
 |-Conv2D_1: 448
 |-Conv2D_2: 512
 |-learning_rate: 0.0001
 |-num_layers: 1
 |-rate: 0.5
Streaming answered 22/4, 2020 at 21:6 Comment(0)
Any hyperparameter seen so far is displayed in the summary, meaning that once a trial containing three layers has been run, all subsequent summaries will list three layer sizes. That does not mean all three layers were used in this trial, as indicated by the num_layers: 1 line for this particular trial.

See omalleyt12's post here for more details: https://github.com/keras-team/keras-tuner/issues/66#issuecomment-525923517
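To make the mechanism concrete, here is a minimal pure-Python sketch (no keras-tuner required; `registry` and `run_trial` are hypothetical names standing in for the tuner's internal hyperparameter bookkeeping). The tuner keeps one global registry of every hyperparameter name it has ever sampled, and each trial's summary prints the whole registry, not just the values that were active in that trial:

```python
# Simulated hyperparameter registry: once a name is recorded,
# it appears in every later trial's summary, active or not.
registry = {}

def run_trial(num_layers, filters):
    """Record num_layers and one Conv2D_<i> entry per ACTIVE layer,
    then return a snapshot of the full registry (the 'summary')."""
    registry['num_layers'] = num_layers
    for i, f in enumerate(filters[:num_layers]):
        registry[f'Conv2D_{i}'] = f
    return dict(registry)

# An earlier trial that used three conv layers...
deep = run_trial(3, [448, 448, 512])

# ...followed by a trial that used only one.
shallow = run_trial(1, [64])

# The later summary still lists Conv2D_1 and Conv2D_2, left over
# from the deeper trial, even though num_layers is 1.
print(shallow)
# {'num_layers': 1, 'Conv2D_0': 64, 'Conv2D_1': 448, 'Conv2D_2': 512}
```

So in your trial summary, only Conv2D_0 was actually used; Conv2D_1 and Conv2D_2 are stale entries from earlier trials that sampled a larger num_layers.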

Garboard answered 3/5, 2020 at 11:3 Comment(1)
Please see my follow-up question about this: #63101199 – Cordon
