The CuDNNGRU in TensorFlow 1.0 is really fast. But when I shifted to TensorFlow 2.0, I am unable to find CuDNNGRU, and the plain GRU is really slow in TensorFlow 2.0.
Is there any way to use CuDNNGRU in TensorFlow 2.0?
The importable implementations have been deprecated; instead, LSTM and GRU will default to CuDNNLSTM and CuDNNGRU if all of the following conditions are met:
activation = 'tanh'
recurrent_activation = 'sigmoid'
recurrent_dropout = 0
unroll = False
use_bias = True
reset_after = True (GRU only)
Also ensure TensorFlow uses the GPU:
import tensorflow as tf
from tensorflow.python.client import device_lib
print(device_lib.list_local_devices())
print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
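As an illustration of the conditions listed above, here is a minimal sketch (the unit count and input shape are arbitrary placeholders, not from the original answer): a GRU built with the Keras defaults stays on the cuDNN path, while changing any one of the listed arguments forces the generic kernel.
import tensorflow as tf

# The Keras defaults already satisfy every condition above, so this GRU
# dispatches to the cuDNN kernel when a GPU is available.
fast_gru = tf.keras.layers.GRU(128, return_sequences=True)

# Breaking any one condition (here: recurrent_dropout != 0) makes the layer
# fall back to the much slower generic kernel.
slow_gru = tf.keras.layers.GRU(128, return_sequences=True, recurrent_dropout=0.2)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 32)),  # (timesteps, features) -- placeholder shape
    fast_gru,
])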
Update: there appears to be a problem with TF 2.0.0 getting cuDNN to work when running on Colab; try !pip install tensorflow==2.1.0 instead.
I tried pre-padding, but it took around the same time as with post-padding. No, I am not doing masking. – Adept
!pip install tensorflow==2.1.0 – Marquand
It takes 10 times more time than CuDNNGRU. – Adept