I have trained a model in Keras. The model contains dropout layers, and I want to be absolutely sure nothing is dropped when I extract intermediate layer outputs from it.
According to the documentation, a layer's output can be extracted like this:
from keras.models import Model

layer_name = 'my_layer'
intermediate_layer_model = Model(inputs=model.input,
                                 outputs=model.get_layer(layer_name).output)
intermediate_output = intermediate_layer_model.predict(data)
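For concreteness, here is a minimal, self-contained sketch of what I have in mind for the first approach (the toy architecture, layer names, and data are made up). If dropout were still active during predict(), two calls on the same data would give different results:

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense, Dropout

# Toy model: 'my_layer' sits after a Dropout layer, so its output would
# change between calls if dropout were still applied at prediction time.
inputs = Input(shape=(20,))
h = Dense(64, activation='relu')(inputs)
h = Dropout(0.5)(h)
h = Dense(32, activation='relu', name='my_layer')(h)
outputs = Dense(1)(h)
model = Model(inputs=inputs, outputs=outputs)

intermediate_layer_model = Model(inputs=model.input,
                                 outputs=model.get_layer('my_layer').output)

data = np.random.rand(5, 20)
out1 = intermediate_layer_model.predict(data)
out2 = intermediate_layer_model.predict(data)
print(np.allclose(out1, out2))  # True would suggest dropout is not applied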
However, the docs also show how to do this with a Keras backend function:
from keras import backend as K

get_3rd_layer_output = K.function([model.layers[0].input, K.learning_phase()],
                                  [model.layers[3].output])

# output in test mode = 0
layer_output = get_3rd_layer_output([x, 0])[0]

# output in train mode = 1
layer_output = get_3rd_layer_output([x, 1])[0]
Here, the K.learning_phase() flag tells Keras whether to actually apply dropout and other operations that are only used during training.
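Building on the toy model above, here is how I could cross-check with the second approach (again just a sketch; it assumes a backend where K.learning_phase() is available):

from keras import backend as K

# Same layer output, but with the learning phase passed in explicitly.
get_my_layer_output = K.function([model.input, K.learning_phase()],
                                 [model.get_layer('my_layer').output])

out_test = get_my_layer_output([data, 0])[0]   # learning phase 0 = test mode
out_train = get_my_layer_output([data, 1])[0]  # learning phase 1 = train mode

# If predict() really runs in test mode, out_test should match it exactly,
# while out_train should differ because dropout is applied there.
print(np.allclose(out_test, intermediate_layer_model.predict(data)))
print(np.allclose(out_test, out_train))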
My question is: if I use the first approach, is dropout automatically deactivated at prediction time, or do I need to set the learning phase flag explicitly (as is done in the second approach)?