Change device placement of an existing TensorFlow variable
How do I change the device placement of an existing tf.Variable? I tried two approaches.

a = tf.Variable(1, name='a')  # a's device is not set
with tf.device('/gpu:0'):
    a = tf.get_variable('a', 1)

This creates a new, unrelated variable on the GPU; it does not change the device assignment of the original a.
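A quick way to see that the two a's are distinct variables (a sketch using the tf.compat.v1 API, since the question predates TF 2; the variable names are mine):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

a = tf.compat.v1.Variable(1, name='a')   # device not set
with tf.device('/gpu:0'):
    b = tf.compat.v1.get_variable('a', shape=(),
                                  initializer=tf.compat.v1.zeros_initializer())

# get_variable did not touch the original: these are two separate variables,
# and b's op name is uniquified (e.g. 'a_1:0') because 'a' was already taken.
assert a is not b
```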

I tried enforcing variable reuse by using

tf.get_variable_scope().reuse_variables()

Here is the code:

a = tf.Variable(1, name='a')  # a's device is not set
tf.get_variable_scope().reuse_variables()
with tf.device('/gpu:0'):
    a = tf.get_variable('a', 1)

This time, I get an error saying that variable a does not exist.
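For what it's worth, the error comes from the variable store: only variables created through tf.get_variable are registered there, so reuse cannot find one made with tf.Variable. And even when reuse succeeds, it hands back the original variable object with its original placement, so the tf.device block would not move it anyway. A minimal sketch (tf.compat.v1 API assumed, as the question predates TF 2):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Created via get_variable, so it IS registered in the variable store.
with tf.compat.v1.variable_scope('s'):
    a = tf.compat.v1.get_variable('a', shape=(),
                                  initializer=tf.compat.v1.ones_initializer())

# Reuse returns the same variable object, ignoring the device context.
with tf.compat.v1.variable_scope('s', reuse=True):
    with tf.device('/gpu:0'):
        b = tf.compat.v1.get_variable('a')

assert a is b  # same variable, still on its original device
```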

Any help on changing device placement, or on lazy device assignment, would be appreciated. Thanks!

Overbear asked 17/2, 2017 at 14:33 — Comments (3)
It's not supported, and I can't think of a hacky approach either. You could put several variables on several devices, and then copy values between them to simulate moving the variable. — Mulford
We've been using tf.identity to copy variables over, I think. — Drachma
Have a look at github.com/tensorflow/tensorflow/blob/master/tensorflow/python/… — there they remove the device placement flag from graph nodes. — Wysocki
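The workaround the commenters describe might look like the following sketch: keep one variable per device and copy values across instead of moving the variable. (The variable names, initial values, and tf.compat.v1 usage are my own assumptions, not from the thread.)

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# One variable per device; values are copied, the variables never move.
with tf.device('/cpu:0'):
    a_cpu = tf.compat.v1.get_variable('a_cpu', initializer=1.0)
with tf.device('/gpu:0'):
    a_gpu = tf.compat.v1.get_variable('a_gpu', initializer=0.0)

# tf.identity reads a_cpu's current value; assign writes it into the GPU copy.
copy_to_gpu = a_gpu.assign(tf.identity(a_cpu))

# allow_soft_placement lets this also run on machines without a GPU.
config = tf.compat.v1.ConfigProto(allow_soft_placement=True)
with tf.compat.v1.Session(config=config) as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    value = sess.run(copy_to_gpu)
```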

© 2022 - 2024 — McMap. All rights reserved.