The Keras LSTM documentation contains a high-level explanation:
dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.
recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state.
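
If you just want to see where the two arguments go, here is a minimal sketch assuming the tf.keras API; the layer size, dropout rates, and input shapes are arbitrary placeholders:

    import numpy as np
    import tensorflow as tf

    # dropout masks the inputs x_t, recurrent_dropout masks the previous
    # hidden state h_{t-1}; both are only active when training=True.
    lstm = tf.keras.layers.LSTM(32, dropout=0.2, recurrent_dropout=0.2)

    x = np.random.rand(4, 10, 8).astype("float32")  # (batch, timesteps, features)
    out_train = lstm(x, training=True)   # dropout applied
    out_infer = lstm(x, training=False)  # no dropout at inference
    print(out_train.shape)               # (4, 32)
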
But this corresponds exactly to the answer you refer to:
Regular dropout is applied on the inputs and/or the outputs, meaning the vertical arrows from x_t and to h_t. ...
Recurrent dropout masks (or "drops") the connections between the recurrent units; that would be the horizontal arrows in your picture.
If you're interested in the details at the formula level, the best way is to inspect the source code: keras/layers/recurrent.py. Look for rec_dp_mask (the recurrent dropout mask) and dp_mask (the input dropout mask). One affects h_tm1 (the previous hidden state), the other affects the inputs.
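
To make that concrete, here is a simplified numpy sketch of a single LSTM step that only shows which tensor each mask multiplies. It is not the actual Keras code (the real implementation builds one mask per gate and differs between implementation modes), and the weight names W, U, b are placeholders:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step_sketch(x_t, h_tm1, c_tm1, W, U, b, dp_mask, rec_dp_mask):
        """One LSTM step with input dropout (dp_mask) and recurrent dropout (rec_dp_mask)."""
        x_t = x_t * dp_mask          # `dropout`: mask on the inputs
        h_tm1 = h_tm1 * rec_dp_mask  # `recurrent_dropout`: mask on the previous hidden state
        z = x_t @ W + h_tm1 @ U + b  # pre-activations of the four gates
        i, f, g, o = np.split(z, 4, axis=-1)
        c_t = sigmoid(f) * c_tm1 + sigmoid(i) * np.tanh(g)  # new cell state
        h_t = sigmoid(o) * np.tanh(c_t)                     # new hidden state
        return h_t, c_t

If I read the source correctly, both masks are sampled once per batch and then reused at every timestep, rather than being redrawn at each step.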