I'm trying to reshape a tensor from `[A, B, C, D]` into `[A, B, C * D]` and feed it into a `dynamic_rnn`. Assume that I don't know B, C, and D in advance (they're the result of a convolutional network).
I think in Theano such a reshape would look like this:

```python
x = x.flatten(ndim=3)
```
It seems that in TensorFlow there's no easy way to do this, and so far here's what I came up with:

```python
x_shape = tf.shape(x)
x = tf.reshape(x, [batch_size, x_shape[1], tf.reduce_prod(x_shape[2:])])
```
Even when the shape of `x` is known at graph-building time (i.e. `print(x.get_shape())` prints absolute values, like `[10, 20, 30, 40]`), after the reshape `get_shape()` becomes `[10, None, None]`. Again, assume the initial shape isn't known, so I can't operate with absolute values.
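For reference, here's a self-contained sketch of the behaviour I'm describing (assuming a TF 2.x setup, where `tf.function` traces a graph the way graph building used to work; `flatten_dynamic` is just my own name):

```python
import tensorflow as tf

# Sketch of the problem, assuming TF 2.x: tf.function traces a graph,
# so static shapes behave like they did during graph construction.
@tf.function(input_signature=[tf.TensorSpec([None, None, None, None], tf.float32)])
def flatten_dynamic(x):
    s = tf.shape(x)  # dynamic shape, values only known at run time
    # Merge the trailing dims; shape inference can't see the resulting depth.
    return tf.reshape(x, [s[0], s[1], tf.reduce_prod(s[2:])])

cf = flatten_dynamic.get_concrete_function()
# In the traced graph the output's static shape is rank 3 with all dims None.
print(cf.outputs[0].shape)
```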
And when I pass `x` to a `dynamic_rnn`, it fails:

```
ValueError: Input size (depth of inputs) must be accessible via shape inference, but saw value None.
```
Why is `reshape` unable to handle this case? What is the right way to replicate Theano's `flatten(ndim=n)` in TensorFlow with tensors of rank 4 and higher?
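For what it's worth, the best workaround I've come up with so far is a sketch like the following (my own helper, written under TF 2.x assumptions, and I'm not sure it's idiomatic): do the reshape with the dynamic shape, then restore whatever static information is available via `set_shape`.

```python
import tensorflow as tf

def flatten_to_ndim(x, ndim):
    """Merge all trailing dims of x into one, like Theano's flatten(ndim=...).

    Uses the dynamic shape for the reshape itself, then restores any
    statically known dims so downstream ops (e.g. an RNN) can see the depth.
    (My own helper name, sketched under TF 2.x assumptions.)
    """
    static = x.get_shape().as_list()   # e.g. [None, 20, 30, 40]
    dynamic = tf.shape(x)              # run-time shape values
    keep = ndim - 1                    # number of leading dims to preserve
    merged_dynamic = tf.reduce_prod(dynamic[keep:])
    out = tf.reshape(x, tf.concat([dynamic[:keep], [merged_dynamic]], axis=0))

    # If every trailing dim is statically known, their product is too.
    trailing = static[keep:]
    merged_static = None
    if all(d is not None for d in trailing):
        merged_static = 1
        for d in trailing:
            merged_static *= d
    out.set_shape(static[:keep] + [merged_static])
    return out
```

With a fully known input this keeps the merged dim static: a `[10, 20, 30, 40]` tensor comes out with static shape `[10, 20, 1200]`, which I'd expect to satisfy the RNN's shape-inference check on the depth, even when the batch and time dims stay `None`. But is there a cleaner built-in way?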