Keras - All layer names should be unique
I combined two VGG nets in Keras to perform a classification task. When I run the program, it shows this error:

RuntimeError: The name "predictions" is used 2 times in the model. All layer names should be unique.

I was confused because I only use the prediction layer once in my code:

from keras.layers import Dense
import keras
from keras.models import  Model
model1 = keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)
model1.layers.pop()

model2 =  keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)
model2.layers.pop()
for layer in model2.layers:
    layer.name = layer.name + str("two")
model1.summary()
model2.summary()
featureLayer1 = model1.output
featureLayer2 = model2.output
combineFeatureLayer = keras.layers.concatenate([featureLayer1, featureLayer2])
prediction = Dense(1, activation='sigmoid', name='main_output')(combineFeatureLayer)

model = Model(inputs=[model1.input, model2.input], outputs= prediction)
model.summary()

Update: Thanks to @putonspectacles' help, I followed his instructions and found an interesting detail. If you use model2.layers.pop() and combine the last layers of the two models with keras.layers.concatenate([model1.output, model2.output]), you will find that the popped layer is still shown by model.summary(), even though it no longer exists in the structure. So instead, use keras.layers.concatenate([model1.layers[-1].output, model2.layers[-1].output]). It looks tricky, but it works. I think it is a synchronization problem between the log and the structure.
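To make the pop()/summary() mismatch concrete, here is a minimal sketch. It uses a tiny stand-in model rather than VGG16 (so no weights need downloading) and assumes tensorflow.keras: instead of popping the last layer, you can rebuild a model that ends at the penultimate layer, whose output tensor is guaranteed to match the actual structure.

```python
# Minimal sketch; a tiny model stands in for VGG16 (assumption: tensorflow.keras).
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(4,))
x = Dense(8, name='fc')(inp)
out = Dense(2, name='predictions')(x)
model = Model(inp, out)

# Rather than model.layers.pop() (which can leave model.output pointing at
# the removed layer's tensor), build a new model ending at the penultimate
# layer. Its output is the 'fc' features, not 'predictions'.
headless = Model(model.input, model.layers[-2].output)
```

This sidesteps the desynchronization entirely, since the new model's output and summary are derived from the same layer list.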

Transfinite answered 17/4, 2017 at 13:27 Comment(0)

First, based on the code you posted, you have no layers with the name attribute 'predictions', so this error has nothing to do with your Dense layer prediction, i.e.:

prediction = Dense(1, activation='sigmoid', 
             name='main_output')(combineFeatureLayer)

The VGG16 model itself has a Dense layer named predictions, in particular this line:

x = Dense(classes, activation='softmax', name='predictions')(x)

And since you're using two of these models, you have layers with duplicate names.

What you could do is rename the layer in the second model to something other than predictions, maybe predictions_1, like so:

model2 =  keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)

# now change the name of the layer in place.
model2.get_layer(name='predictions').name='predictions_1'
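As a minimal sketch of why the rename matters, here is the same situation with small stand-in models instead of VGG16 (assumption: tensorflow.keras, where the name attribute is read-only and the private _name attribute must be set instead, as a comment below notes):

```python
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

def make_branch():
    # Both branches deliberately contain a layer named 'predictions'.
    inp = Input(shape=(4,))
    out = Dense(2, name='predictions')(inp)
    return Model(inp, out)

m1, m2 = make_branch(), make_branch()

# Rename the duplicate before combining: with two layers named
# 'predictions', Model() would raise the uniqueness error.
# (In newer tensorflow.keras, .name is read-only, so set ._name.)
m2.get_layer('predictions')._name = 'predictions_1'

merged = concatenate([m1.output, m2.output])
combined = Model([m1.input, m2.input], merged)
```

After the rename, every layer name in the combined model is unique, so construction succeeds.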
Springy answered 17/4, 2017 at 13:51 Comment(5)
Thank you for the explanation. I printed all the layers for both model1 and model2, but there is no layer named "predictions". I think you mean the last layer of VGG, but I have already popped the original last layer. Could you explain this? – Transfinite
If you're using the latest Keras, this link references the layer named 'predictions': github.com/fchollet/keras/blob/… – Springy
And based on the error, you definitely have layers named predictions. Did you try my suggestion? – Springy
Hi, your suggestion is right, but I popped the last layer before combining the two VGGs. I have just updated the solution and edited my question; you will see it there. – Transfinite
AttributeError: Can't set the attribute "name", likely because it conflicts with an existing read-only @property of the object. Please choose a different name. – Cribbage

You can change the layer's name in keras; don't use 'tensorflow.python.keras'.

Here is my sample code:

from keras.layers import Dense, concatenate
from keras.models import Model
from keras.applications import vgg16

num_classes = 10

model = vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=(64,64,3), pooling='avg')
inp = model.input
out = model.output

model2 = vgg16.VGG16(include_top=False,weights='imagenet', input_tensor=None, input_shape=(64,64,3), pooling='avg')

for layer in model2.layers:
    layer.name = layer.name + str("_2")

inp2 = model2.input
out2 = model2.output

merged = concatenate([out, out2])
merged = Dense(1024, activation='relu')(merged)
merged = Dense(num_classes, activation='softmax')(merged)

model_fusion = Model([inp, inp2], merged)
model_fusion.summary()
Quartile answered 29/7, 2018 at 13:32 Comment(3)
I created a model like this, but when I add layer.name = layer.name + str("_2"), the performance of the second model changes. I don't know why. – Munguia
Confirmed that this approach fails for TF 2.0 when doing model.get_config(), because the corresponding layer names aren't renamed in model._network_nodes. – Apprehension
@NicholasLeonard What if you use this method? https://mcmap.net/q/677032/-error-when-trying-to-rename-a-pretrained-model-on-tf-keras – Halfhour

Example:

# Imports assumed for this example; config and mae_loss_masked are
# user-defined elsewhere.
from keras.applications import MobileNet
from keras.layers import Input, MaxPooling2D, add
from keras.models import Model
from keras.optimizers import Adadelta

# Network for affine transform estimation
affine_transform_estimator = MobileNet(
                            input_tensor=None,
                            input_shape=(config.IMAGE_H // 2, config.IMAGE_W //2, config.N_CHANNELS),
                            alpha=1.0,
                            depth_multiplier=1,
                            include_top=False,
                            weights='imagenet'
                            )
affine_transform_estimator.name = 'affine_transform_estimator'
for layer in affine_transform_estimator.layers:
    layer.name = layer.name + str("_1")

# Network for landmarks regression
landmarks_regressor = MobileNet(
                        input_tensor=None,
                        input_shape=(config.IMAGE_H // 2, config.IMAGE_W // 2, config.N_CHANNELS),
                        alpha=1.0,
                        depth_multiplier=1,
                        include_top=False,
                        weights='imagenet'
                        )
landmarks_regressor.name = 'landmarks_regressor'
for layer in landmarks_regressor.layers:
    layer.name = layer.name + str("_2")

input_image = Input(shape=(config.IMAGE_H, config.IMAGE_W, config.N_CHANNELS))
downsampled_image = MaxPooling2D(pool_size=(2,2))(input_image)
x1 = affine_transform_estimator(downsampled_image)
x2 = landmarks_regressor(downsampled_image)
x3 = add([x1,x2])

model = Model(inputs=input_image, outputs=x3)
optimizer = Adadelta()
model.compile(optimizer=optimizer, loss=mae_loss_masked)
Salesin answered 26/11, 2018 at 14:38 Comment(0)

You can use layer._name instead of layer.name; this worked for me.
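A minimal sketch of what this refers to (assumption: tensorflow.keras, where name is a read-only property backed by the private _name attribute):

```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(4,))
out = Dense(2, name='predictions')(inp)
model = Model(inp, out)

# layer.name = ... raises AttributeError here; writing the private
# backing attribute works, though it may not survive serialization
# (e.g. model.get_config()), as noted in the comments above.
for layer in model.layers:
    layer._name = layer._name + '_2'
```

Note that _name is a private attribute, so this is a workaround rather than a supported API.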

Sinister answered 7/6, 2023 at 10:26 Comment(1)
As it’s currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center. – Hyden
