I am trying to use tf.train.exponential_decay with the premade estimators, and this is proving to be surprisingly difficult. Am I missing something here?
Here is my old code with a constant learning_rate:
classifier = tf.estimator.DNNRegressor(
    feature_columns=f_columns,
    model_dir='./TF',
    hidden_units=[2, 2],
    optimizer=tf.train.ProximalAdagradOptimizer(
        learning_rate=0.50,
        l1_regularization_strength=0.001,
    ))
Now I tried adding this:
starter_learning_rate = 0.50
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           10000, 0.96, staircase=True)
but now what?
- estimator.predict() does not accept global_step, so won't it be stuck at 0?
- Even if I pass learning_rate to tf.train.ProximalAdagradOptimizer(), I get an error saying
"ValueError: Tensor("ExponentialDecay:0", shape=(), dtype=float32) must be from the same graph as Tensor("dnn/hiddenlayer_0/kernel/part_0:0", shape=(62, 2), dtype=float32_ref)."
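For reference, here is the workaround I am considering but have not confirmed: the optimizer argument of the canned estimators also accepts a callable, which the estimator invokes inside its own graph, so building the decay there via tf.train.get_global_step() should avoid the cross-graph error. (f_columns below is a placeholder; my real model has 62 input features.)

```python
import tensorflow as tf

# Placeholder feature columns for illustration only.
f_columns = [tf.feature_column.numeric_column('x', shape=(62,))]

def make_optimizer():
    # Called inside the estimator's graph, so get_global_step() returns
    # the step variable the estimator itself creates and increments.
    learning_rate = tf.train.exponential_decay(
        learning_rate=0.50,       # starter learning rate from above
        global_step=tf.train.get_global_step(),
        decay_steps=10000,
        decay_rate=0.96,
        staircase=True)
    return tf.train.ProximalAdagradOptimizer(
        learning_rate=learning_rate,
        l1_regularization_strength=0.001)

classifier = tf.estimator.DNNRegressor(
    feature_columns=f_columns,
    model_dir='./TF',
    hidden_units=[2, 2],
    optimizer=make_optimizer)     # pass the callable itself, not a call
```

With staircase=True this should give 0.50 for steps 0–9999, then 0.48, 0.4608, and so on, every 10000 steps.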
Your help is greatly appreciated. I am using TF1.6 btw.