How do I display a list of all gradients in TensorBoard?
I'm using TensorBoard 1.5 and I want to see how my gradients look.
Here is an example of the layer I’m using:
net = tf.layers.dense(features, 40, activation=tf.nn.relu, kernel_regularizer=regularizer,
                      kernel_initializer=tf.contrib.layers.xavier_initializer())
Here is my optimizer:
train_op = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(loss)
For my model parameters, I create the summary this way:
for var in tf.trainable_variables():
    tf.summary.histogram(var.name, var)
Is there a similar way to get all gradients in the for loop to create my summary?
Solution
You should first use the optimizer's compute_gradients to get the gradients, and then pass each one to tf.summary.histogram:
opt = tf.train.AdamOptimizer(learning_rate=learning_rate)

# Calculate the gradients for the batch of data.
grads = opt.compute_gradients(loss)

# Add a histogram summary for every gradient.
summaries = []
for grad, var in grads:
    if grad is not None:
        summaries.append(tf.summary.histogram(var.op.name + '/gradients', grad))
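To evaluate them in one run call, the histogram summaries can be merged into a single op. A small sketch (tf.summary.merge_all() would work as well and would also pick up the per-variable histograms you already create in your loop):

# Merge the gradient histograms into a single summary op.
merged_summary = tf.summary.merge(summaries)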
Then, for training, you can call the optimizer's apply_gradients:
# Apply the gradients to adjust the shared variables.
train_op = opt.apply_gradients(grads, global_step=global_step)
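For the histograms to actually show up in TensorBoard, the merged summary has to be evaluated and written to an event file during training. Below is a minimal sketch, assuming global_step has been created beforehand (e.g. with tf.train.get_or_create_global_step()) and that the log directory, step count, and input feeding are placeholders to adapt to your setup:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter('./logs', sess.graph)  # assumed log directory

    for step in range(1000):  # assumed number of training steps
        # Add a feed_dict here if features/labels are placeholders.
        _, summary = sess.run([train_op, merged_summary])
        writer.add_summary(summary, global_step=step)

    writer.close()

Then start TensorBoard with tensorboard --logdir ./logs; each gradient appears in the Distributions/Histograms tabs under <variable name>/gradients.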
More details can be found in the TensorFlow CIFAR-10 tutorial.