merging recurrent layers with dense layer in Keras
I want to build a neural network where the first two layers are feedforward and the last one is recurrent. Here is my code:

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(SimpleRNN(2, init='normal'))
adam = OP.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer="rmsprop")

and I get this error :

Exception: Input 0 is incompatible with layer simplernn_11: expected ndim=3, found ndim=2.
Bor answered 22/6, 2016 at 10:20 Comment(0)

It is correct that in Keras an RNN layer expects input shaped (nb_samples, time_steps, input_dim). However, if you want to add an RNN layer after a Dense layer, you can still do so by reshaping the Dense layer's output before it reaches the RNN. Reshape can be used both as the first layer and as an intermediate layer in a Sequential model. Examples are given below:

Reshape as first layer in a Sequential model

model = Sequential()
model.add(Reshape((3, 4), input_shape=(12,)))
# now: model.output_shape == (None, 3, 4)
# note: `None` is the batch dimension

Reshape as an intermediate layer in a Sequential model

model.add(Reshape((6, 2)))
# now: model.output_shape == (None, 6, 2)

For example, if you change your code as follows, there will be no error. I have checked it, and the model compiled without any error. You can change the dimensions to suit your needs.

from keras.models import Sequential
from keras.layers import Dense, SimpleRNN, Reshape
from keras.optimizers import Adam

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(Reshape((1, 80)))  # add a time axis so the RNN sees (None, 1, 80)
model.add(SimpleRNN(2, init='normal'))
adam = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer=adam)
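If you are on a newer Keras version (2.x or later under TensorFlow), the `init` and `lr` arguments from the code above were renamed; a sketch of the equivalent model, assuming the `tf.keras` API:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, SimpleRNN, Reshape
from tensorflow.keras.optimizers import Adam

model = Sequential()
model.add(Dense(150, input_shape=(23,), kernel_initializer='normal', activation='relu'))
model.add(Dense(80, activation='relu', kernel_initializer='normal'))
model.add(Reshape((1, 80)))  # add a time axis: output becomes (None, 1, 80)
model.add(SimpleRNN(2, kernel_initializer='normal'))
model.compile(loss='mean_squared_error',
              optimizer=Adam(learning_rate=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08))
```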
Badalona answered 22/11, 2016 at 21:52 Comment(0)

In Keras, you cannot put a Recurrent layer directly after a Dense layer, because a Dense layer produces output shaped (nb_samples, output_dim) while a Recurrent layer expects input shaped (nb_samples, time_steps, input_dim). In other words, a Dense layer gives a 2-D output, but a Recurrent layer expects a 3-D input. You can, however, do the reverse, i.e., put a Dense layer after a Recurrent layer.
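The reverse ordering can be sketched as follows (using the `tf.keras` API; the layer sizes here are illustrative, not from the question):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, SimpleRNN

# RNN first: it consumes the required 3-D input (batch, time_steps, input_dim)
# and emits a 2-D output (batch, units), which Dense accepts directly.
model = Sequential()
model.add(SimpleRNN(32, input_shape=(10, 23)))
model.add(Dense(2))
model.compile(loss='mse', optimizer='adam')
```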

Spiny answered 22/6, 2016 at 11:27 Comment(1)
Thank you for your response. In fact, the inputs I use here are correlated, so I want to create a short memory in the last layer: the output of the system at time t has to take into account the output at time t-1 (hence I want the last layer to be recurrent, preceded by two feedforward layers). Do you know how I can make the number of sequences (time_steps) variable? Bor
