Default activation function in Keras
Does anyone know the default activation function used in the recurrent layers in Keras? https://keras.io/layers/recurrent/

It says the default activation function is linear, but what about the default recurrent activation function? Nothing is mentioned about that. Any help would be highly appreciated. Thanks in advance.

Shorttempered answered 18/3, 2017 at 18:18 Comment(1)
Has my answer helped you? - Stanislas
Keras Recurrent is an abstract class for recurrent layers. In Keras 2.0 the default activation is linear for all implemented RNNs (LSTM, GRU and SimpleRNN). In previous versions you had:

  • linear for SimpleRNN,
  • tanh for LSTM and GRU.
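If you want to see what your own installation actually uses, here is a minimal sketch (assuming the TensorFlow-bundled tf.keras; standalone Keras 2.x exposes the same layer classes and config keys) that reads each layer's serialized config:

    # Print the default activation and the default recurrent (gate) activation
    # of each built-in RNN layer, as reported by the installed Keras version.
    from tensorflow.keras.layers import SimpleRNN, GRU, LSTM

    for cls in (SimpleRNN, GRU, LSTM):
        cfg = cls(units=4).get_config()
        # SimpleRNN has no gates, so it has no recurrent_activation entry.
        print(cls.__name__,
              "activation =", cfg["activation"],
              "| recurrent_activation =", cfg.get("recurrent_activation", "n/a"))

On a recent install this typically reports tanh for the activation, and sigmoid (hard_sigmoid on older versions) for the recurrent activation of LSTM and GRU.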
Lipolysis answered 18/3, 2017 at 18:44 Comment(1)
Seems this answer is outdated. In Keras 2.2.4 the default activation for LSTM is tanh. - Chiffonier
https://github.com/keras-team/keras/blob/master/keras/layers/recurrent.py#L2081

It mentions tanh here for version 2.3.0 :-)
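If you would rather not dig through the source, a small sketch along the same lines (assuming standalone keras is importable; with tf.keras swap the import for tensorflow.keras.layers) reads the constructor defaults of whatever version you have installed:

    # Inspect the LSTM constructor defaults of the installed Keras version.
    import inspect
    from keras.layers import LSTM

    params = inspect.signature(LSTM.__init__).parameters
    print(params["activation"].default)            # 'tanh' in recent versions
    print(params["recurrent_activation"].default)  # 'hard_sigmoid' or 'sigmoid', depending on the version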

Unlearn answered 31/1, 2020 at 9:3 Comment(0)
