Using tf.train.exponential_decay with predefined estimator?
I am trying to use tf.train.exponential_decay with predefined estimators, and this is proving to be surprisingly difficult. Am I missing something here?

Here is my old code with constant learning_rate:

classifier = tf.estimator.DNNRegressor(
    feature_columns=f_columns,
    model_dir='./TF',
    hidden_units=[2, 2],
    optimizer=tf.train.ProximalAdagradOptimizer(
      learning_rate=0.50,
      l1_regularization_strength=0.001,
    ))

Now I tried adding this:

starter_learning_rate = 0.50
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           10000, 0.96, staircase=True)

but now what?

  • estimator.predict() does not accept global_step, so will it be stuck at 0?
  • Even if I pass learning_rate to tf.train.ProximalAdagradOptimizer(), I get an error saying:

"ValueError: Tensor("ExponentialDecay:0", shape=(), dtype=float32) must be from the same graph as Tensor("dnn/hiddenlayer_0/kernel/part_0:0", shape=(62, 2), dtype=float32_ref)."

Your help is greatly appreciated. I am using TF1.6 btw.

Cryptogenic answered 11/3, 2018 at 19:27
You should create the optimizer inside a custom model_fn, under the mode == tf.estimator.ModeKeys.TRAIN branch, so that it is built in the same graph as the model.

Here is sample code:

def _model_fn(features, labels, mode, config):

    # ... build the network and compute `loss` here ...

    assert mode == tf.estimator.ModeKeys.TRAIN

    # Build the decayed learning rate in the same graph as the model,
    # driven by the estimator's own global step.
    global_step = tf.train.get_global_step()
    decay_learning_rate = tf.train.exponential_decay(
        learning_rate, global_step, 100, 0.98, staircase=True)
    optimizer = tf.train.AdagradOptimizer(decay_learning_rate)

    # Passing global_step to minimize() increments it on every train step,
    # which is what advances the decay schedule.
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    with tf.control_dependencies(update_ops):
        train_op = optimizer.minimize(loss, global_step=global_step)
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op,
                                      training_chief_hooks=chief_hooks,
                                      eval_metric_ops=metrics)
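To see what schedule this actually produces, note that with staircase=True, tf.train.exponential_decay computes learning_rate * decay_rate ** floor(global_step / decay_steps). A minimal sketch of that formula in plain Python (no TensorFlow required; the function name is mine, and the parameters 100 and 0.98 are taken from the snippet above):

```python
import math

def staircase_exponential_decay(initial_lr, global_step, decay_steps, decay_rate):
    """Mirrors tf.train.exponential_decay with staircase=True:
    the rate is multiplied by decay_rate once every decay_steps steps."""
    return initial_lr * decay_rate ** math.floor(global_step / decay_steps)

# Decay every 100 steps by a factor of 0.98, starting from 0.5:
for step in (0, 99, 100, 1000):
    print(step, staircase_exponential_decay(0.5, step, 100, 0.98))
```

The rate stays flat within each 100-step window and drops at the window boundaries; with staircase=False the exponent would be the real-valued ratio global_step / decay_steps, giving a smooth curve instead.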
Leonoreleonsis answered 6/3, 2020 at 3:6