ssh config example

an example reminder


Host <name of ssh config>
	HostName <ip address or domain name>
	Port 22
	User <username>
	IdentityFile <path to private key>
	RemoteForward 52698 localhost:52698

The indent is actually a tab.

The remote forwarding is for rmate/rsub: it tunnels the remote port 52698 back to the local machine so rmate on the server can talk to the local editor.
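As a concrete sketch, a filled-in entry might look like this (host name, IP, user, and key path are all made up for illustration); afterwards a plain `ssh myserver` connects with the key and sets up the rmate tunnel:

```
Host myserver
	HostName 192.0.2.10
	Port 22
	User alice
	IdentityFile ~/.ssh/id_rsa
	RemoteForward 52698 localhost:52698
```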

lazy summary adding in tensorflow

When writing a training script in TensorFlow, there sometimes arises the need to add summary protobufs later on, at the same step. For example, let's say a training session is running with a metric-calculation step included. Periodically, I want to run predictions on validation/test data and record the metrics for those predictions with the same summary writer used to log the training steps. In other words, a TensorBoard image like the following is desired:

Also available on Medium


no idea why gradient is not applied..

import tensorflow as tf
import numpy as np

a = np.zeros((2, 2), dtype="float32")
b = np.array([[6, 7], [8, 9]], dtype="float32")

t1 = tf.placeholder(tf.float32, shape=(2, 2))
label_t = tf.placeholder(tf.float32, shape=(2, 2))
t2 = tf.layers.dense(t1, 2, activation=tf.nn.relu)
# loss1 = tf.losses.mean_squared_error(label_t, t2)
loss1 = tf.reduce_sum(tf.square(label_t - t2))
optimizer = tf.train.AdamOptimizer(0.1)
train_op22 = optimizer.minimize(loss1)
# grad = optimizer.compute_gradients(loss1)
# train_op = optimizer.apply_gradients(grad)

init_op = tf.global_variables_initializer()  # variables must be initialized

steps = 100  # number of training iterations (value not given in the original)
g = tf.get_default_graph()
with tf.Session(graph=g) as sess:
    sess.run(init_op)
    for i in range(steps):
        pred2, loss_val, _ = sess.run([t2, loss1, train_op22],
                                      feed_dict={t1: a, label_t: b})
        print("loss", loss_val)
        # print("pred2", pred2)
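One plausible explanation for the stalled training, sketched below in plain numpy under the setup above: the feed input `a` is all zeros and `tf.layers.dense` initializes its bias to zero, so the pre-activation is exactly 0; TensorFlow takes the derivative of ReLU at 0 to be 0, so every backpropagated gradient vanishes and Adam never updates anything. The kernel values here are random stand-ins for the layer's initializer.

```python
import numpy as np

a = np.zeros((2, 2), dtype="float32")            # the all-zero feed input
W = np.random.randn(2, 2).astype("float32")      # stand-in random kernel
bias = np.zeros(2, dtype="float32")              # default zero bias

pre = a @ W + bias                               # exactly zero everywhere
relu_grad = (pre > 0).astype("float32")          # ReLU'(0) taken as 0

# upstream gradient of sum((relu(pre) - label)^2) w.r.t. the activation
label = np.array([[6, 7], [8, 9]], dtype="float32")
upstream = 2 * (np.maximum(pre, 0) - label)      # nonzero
grad_pre = upstream * relu_grad                  # zeroed by the dead ReLU

grad_W = a.T @ grad_pre                          # zero (input is zero too)
grad_bias = grad_pre.sum(axis=0)                 # zero
```

With both gradients identically zero, the loss printed in the training loop stays constant, which matches the "gradient is not applied" symptom; feeding a nonzero input (or a nonzero bias initializer) would break the dead-ReLU condition.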