Keras : How to merge a dense layer and an embedding layer

I am using Keras and I am trying to concatenate two different layers into one vector (the first part of the vector would be the values of the first layer, and the second part would be the values of the second layer). One of these layers is a Dense layer and the other is an Embedding layer.

I know how to merge two Embedding layers or two Dense layers, but I don't know how to merge an Embedding layer and a Dense layer (there is a dimensionality problem).

A simple example would be like this:

L_branch = Sequential()
L_branch.add(Dense(10, input_shape=(4,), activation='relu'))
L_branch.add(BatchNormalization())

R_branch = Sequential()
R_branch.add(Embedding(1000, 64, input_length=5))

final_branch.add(Merge([L_branch, R_branch], mode='concat'))

But this does not work, because you can't merge outputs with different dimensionalities: the Dense branch produces a 2D tensor of shape (batch, 10), while the Embedding branch produces a 3D tensor of shape (batch, 5, 64).
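As a side check (this snippet is not from the original post), printing the output shapes of the two branches makes the mismatch visible:

L_branch.output_shape  # (None, 10)     -> 2D: (batch, features)
R_branch.output_shape  # (None, 5, 64)  -> 3D: (batch, input_length, output_dim)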

PS: Sorry, English is not my native language; I hope you will understand my problem.

Best regards.

Crespi answered 30/1, 2017 at 10:39 Comment(1)
What is the shape of the expected output from this merged layer? – Tamer

Use a Flatten layer. The Embedding layer outputs a 3D tensor of shape (batch, input_length, output_dim), while the Dense branch outputs a 2D tensor, so flatten the embedding output to shape (batch, input_length * output_dim) before concatenating the two branches.

# Imports assume the Keras 1.x API, where the Merge layer is still available
# (it is deprecated in Keras 2; see the comment below).
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization, Embedding, Flatten, Merge

L_branch = Sequential()
L_branch.add(Dense(10, input_shape=(4,), activation='relu'))
L_branch.add(BatchNormalization())

R_branch = Sequential()
R_branch.add(Embedding(1000, 64, input_length=5))
R_branch.add(Flatten())  # <-- flattens (batch, 5, 64) to (batch, 320)

final_branch = Sequential()  # <--
final_branch.add(Merge([L_branch, R_branch], mode='concat'))
Prefer answered 31/1, 2017 at 8:13 Comment(1)
For the line final_branch.add(Merge([L_branch, R_branch], mode='concat')), Keras 2.0.1 throws this warning: UserWarning: The Merge layer is deprecated and will be removed after 08/2017. Use instead layers from keras.layers.merge, e.g. add, concatenate, etc. – Dragelin
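Since the Merge layer is deprecated, a minimal sketch of the same model with the Keras 2 functional API and keras.layers.concatenate could look like this (layer sizes follow the example above; input and variable names are illustrative):

from keras.layers import Input, Dense, BatchNormalization, Embedding, Flatten, concatenate
from keras.models import Model

dense_in = Input(shape=(4,))                        # 4 numeric features
x = Dense(10, activation='relu')(dense_in)
x = BatchNormalization()(x)                         # shape: (batch, 10)

embed_in = Input(shape=(5,))                        # 5 integer token ids
e = Embedding(1000, 64, input_length=5)(embed_in)   # shape: (batch, 5, 64)
e = Flatten()(e)                                    # shape: (batch, 320)

merged = concatenate([x, e])                        # shape: (batch, 330)
model = Model(inputs=[dense_in, embed_in], outputs=merged)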
