I am using TensorFlow 1.12 in eager execution, and I want to inspect the values of my gradients and my weights at different points during training for debugging purposes. This answer uses TensorBoard to get nice graphs of weight and gradient distribution over epochs, which is what I would like. However, when I use Keras' TensorBoard callback, I get this:
WARNING:tensorflow:Weight and gradient histograms not supported for eager execution, setting `histogram_freq` to `0`.
In other words, this is not compatible with eager execution. Is there any other way to print gradients and/or weights? Most non-TensorBoard answers seem to rely on graph-based execution.
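In eager mode you can sidestep the callback entirely by computing gradients yourself with `tf.GradientTape` and printing whatever you want. Below is a minimal sketch; the model, data, and shapes are hypothetical placeholders, and the `hasattr` guard is just so the snippet also runs on TF 2.x, where eager execution is already the default:

```python
import numpy as np
import tensorflow as tf

# TF 1.x needs eager execution enabled explicitly; TF 2.x is eager by default.
if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

# Hypothetical tiny model and data, just to illustrate the pattern.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(1),
])
x = np.random.randn(8, 3).astype("float32")
y = np.random.randn(8, 1).astype("float32")

# Record the forward pass so the tape can differentiate the loss.
with tf.GradientTape() as tape:
    pred = model(x)
    loss = tf.reduce_mean(tf.square(pred - y))

# One gradient tensor per trainable variable, in matching order.
grads = tape.gradient(loss, model.trainable_variables)

# Inspect weights and gradients side by side at any point in training.
for var, grad in zip(model.trainable_variables, grads):
    print(var.name, var.numpy().shape, "grad mean:", grad.numpy().mean())
```

Wrapping this in your own epoch/batch loop (and applying `grads` with an optimizer's `apply_gradients`) replaces `model.fit()`, which is what gives you access to the gradients in the first place.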
Is there a way to use this with `fit()`, in order to print gradients over multiple epochs? Since that function abstracts away the loss computation and the weight updates, I wouldn't know how to get (the kernel of), say, my input layer and the loss "in the same place" to feed them to the `t.gradient` function. – Estey