I've googled and learned that TensorFlow's `constant()` function generates a constant Tensor (big surprise!) that cannot be modified.
But when I do:
>>> a = tf.constant(0.0)
>>> a = a + 1.0
I don't see any error from TensorFlow.
I understand the reason: `a` now refers to the output of a new `Add` operation (`<tf.Tensor 'add_1:0' shape=() dtype=float32>`), not to the original constant.
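To check my understanding, I believe this is just ordinary Python name rebinding, not mutation of the constant. Here's a minimal pure-Python sketch of the same behavior (no TensorFlow involved; the analogy to TF tensors is my assumption):

```python
# `a = a + 1.0` never mutates the original object -- it creates a new
# object and rebinds the name `a` to it, leaving the original untouched.
a = 1.0
original = a          # keep a second reference to the original object
a = a + 1.0           # builds a new object; only the name `a` is rebound
print(a, original)    # 2.0 1.0
print(a is original)  # False -- `a` now points at a different object
```

If that analogy holds, the tensor created by `tf.constant(0.0)` still exists unchanged in the graph; the Python variable just stops pointing at it.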
My question is: what is the use of a TensorFlow constant if we can seemingly modify it? Does it have anything to do with graph optimization? Am I missing something trivial here?
Thanks in advance.