No, you cannot see the content of the tensor without running the graph (doing session.run()). The only things you can see are:
- the dimensionality of the tensor (I assume it is not hard for TF to infer it for each of the operations it provides)
- the name of the operation that will be used to generate the tensor (e.g. transpose_1:0, random_uniform:0)
- the type of the elements in the tensor (e.g. float32)
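All three can be read off the Tensor object itself without a session; here is a minimal sketch (the tensor t is only for illustration):
import tensorflow as tf
t = tf.random_uniform([3, 4], dtype=tf.float32)
print(t)              # Tensor("random_uniform:0", shape=(3, 4), dtype=float32)
print(t.get_shape())  # (3, 4) -- the static shape, known without running the graph
print(t.dtype)        # <dtype: 'float32'>
print(t.op.type)      # RandomUniform -- the op that will produce the values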
I have not found this in the documentation, but I believe that the values of the variables (and of some of the constants) are not calculated at the time of assignment.
Take a look at this example:
import tensorflow as tf
from datetime import datetime
dim = 7000
The first example, where I just define a constant tensor of random numbers, runs in approximately the same time irrespective of dim (0:00:00.003261):
startTime = datetime.now()
m1 = tf.truncated_normal([dim, dim], mean=0.0, stddev=0.02, dtype=tf.float32, seed=1)
print(datetime.now() - startTime)
In the second case, where the tensor actually gets evaluated and the values are assigned, the time clearly depends on dim (0:00:01.244642):
startTime = datetime.now()
m1 = tf.truncated_normal([dim, dim], mean=0.0, stddev=0.02, dtype=tf.float32, seed=1)
sess = tf.Session()
sess.run(m1)
print(datetime.now() - startTime)
And you can make it even clearer by computing something from the tensor (e.g. d = tf.matrix_determinant(m1), keeping in mind that the time will be O(dim^2.8)).
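For example, a sketch of the same timing experiment with the determinant added (the exact numbers will of course vary by machine):
startTime = datetime.now()
m1 = tf.truncated_normal([dim, dim], mean=0.0, stddev=0.02, dtype=tf.float32, seed=1)
d = tf.matrix_determinant(m1)   # still only adds a node to the graph
sess = tf.Session()
sess.run(d)                     # the actual computation happens here
print(datetime.now() - startTime)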
P.S. I found where it is explained in the documentation:
A Tensor object is a symbolic handle to the result of an operation,
  but does not actually hold the values of the operation's output.
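In other words, the Tensor only carries metadata; the concrete values exist only once you run it (continuing the session from the example above):
value = sess.run(m1)   # returns a concrete numpy array
print(type(value))     # numpy.ndarray
print(value.shape)     # (7000, 7000)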