Zero initialiser for biases using get_variable in tensorflow
The code I'm modifying uses tf.get_variable for weight variables and tf.Variable for bias initialisation. After some searching, it seems that get_variable should always be favoured because of its portability with regard to variable sharing. So I tried to change the bias variable to get_variable, but I can't get it to work.

Original: tf.Variable(tf.zeros([128]), trainable=True, name="b1")

My attempt: tf.get_variable(name="b1", shape=[128], initializer=tf.zeros_initializer(shape=[128]))

I get an error saying that the shape should not be specified for constants, but removing the shape argument then throws an error about missing arguments.

I'm very new to tf so I'm probably misunderstanding something fundamental here. Thanks for the help in advance :)

Undermost answered 24/1, 2017 at 6:34 Comment(1)
tf.get_variable(name="b1", initializer=tf.zeros_initializer(shape=[128])) like this? – Teresitateressa
The following should work: tf.get_variable(name="b1", shape=[128], initializer=tf.zeros_initializer()). Note that the initializer itself takes no shape argument; get_variable passes its own shape argument to the initializer when the variable is created.
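A minimal sketch of that pattern, written against the TF 1.x API (on TF 2 installs the same calls are available under tf.compat.v1); the variable scope name "layer1" is illustrative, not from the thread:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x, plain `import tensorflow as tf` works

tf.disable_eager_execution()  # get_variable is a graph-mode (TF 1.x) API

# zeros_initializer() is called with no arguments; get_variable supplies
# the shape [128] to the initializer when the variable is created.
with tf.variable_scope("layer1", reuse=tf.AUTO_REUSE):
    b1 = tf.get_variable(name="b1", shape=[128],
                         initializer=tf.zeros_initializer())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    vals = sess.run(b1)
    print(vals.shape)      # (128,)
    print(vals.max())      # 0.0 -- every entry initialised to zero
```

Because the scope uses tf.AUTO_REUSE, a second call with the same scope and name returns the same variable rather than raising, which is the sharing behaviour that makes get_variable preferable to tf.Variable here.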

Pycno answered 26/1, 2017 at 1:18 Comment(1)
Instead of zeros_initializer(), can I use a numpy array with values inside? – Ternopol
