TensorFlow programs use a tensor data structure to represent all data — only tensors are passed between operations in the computation graph. You can think of a TensorFlow tensor as an n-dimensional array or list.

For example, a scalar is a tensor, a vector is a tensor, and a matrix is a tensor. A tensor has a rank, a shape, and a static type, so it can be represented as a multidimensional array of numbers.

1) Rank

Tensor rank is the number of dimensions of the tensor. For example, a rank-0 tensor is a scalar, a rank-1 tensor is a vector, and a rank-2 tensor is a matrix.
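As a minimal sketch of this idea (plain Python, no TensorFlow), rank corresponds to the nesting depth of a value; the `rank` helper below is hypothetical, not a TensorFlow API:

```python
# A hypothetical helper illustrating rank as nesting depth.
def rank(tensor):
    """Return the rank (number of dimensions) of a nested-list 'tensor'."""
    if isinstance(tensor, list):
        # Assumes non-empty, uniformly nested lists.
        return 1 + rank(tensor[0])
    return 0  # a scalar has rank 0

print(rank(7))                 # 0 -> scalar
print(rank([1, 2, 3]))         # 1 -> vector
print(rank([[1, 2], [3, 4]]))  # 2 -> matrix
```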

In [1]: import tensorflow as tf
# Create a Variable that will be initialized to the scalar value 100
In [2]: var = tf.Variable(100, name="variable_counter")
# Create an Op to multiply `var` by ten.
In [3]: ten = tf.constant(10)
In [4]: new_var = tf.mul(var, ten)  # tf.multiply in TensorFlow >= 1.0
In [5]: update = tf.assign(var, new_var)
# Variables must be initialized by running an `init` Op after having
# launched the graph. We first have to add the `init` Op to the graph.
In [6]: init_op = tf.initialize_all_variables()  # tf.global_variables_initializer in TensorFlow >= 1.0
# Launch the graph and run the ops.
In [7]: with tf.Session() as sess:
   ...:     sess.run(init_op)
   ...:     print(sess.run(var))
   ...:     for _ in range(5):
   ...:         sess.run(update)
   ...:         print(sess.run(var))
   ...:
# output
100
1000
10000
100000
1000000
10000000

Variables are in-memory buffers containing tensors. When you train a model with TensorFlow, variables hold and update the model's parameters. Variables must be explicitly initialized, and can be saved to disk during and after training.
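A minimal sketch of saving a variable to disk with tf.train.Saver; it is written against the TF 1.x graph API via tf.compat.v1 so it also runs under TensorFlow 2.x, and the checkpoint path is just an illustrative temporary directory:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf  # TF 1.x graph API, available in TF 2.x
tf.disable_eager_execution()

v = tf.Variable(100, name="variable_counter")
saver = tf.train.Saver()  # by default, saves all variables in the graph

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Write the variable's current value to a checkpoint file on disk.
    ckpt = saver.save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))
    print("saved to", ckpt)
```

The same Saver can later call `saver.restore(sess, ckpt)` to load the saved values back into the graph's variables.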

# Create a variable with a random value.
In [10]: weights = tf.Variable(tf.random_normal([4096, 500], stddev=0.25), name="weights")
# Create another variable with the same value as 'weights'.
In [11]: weights2 = tf.Variable(weights.initialized_value(), name="weights2")
# Create another variable with twice the value of 'weights'
In [12]: weights_twice = tf.Variable(weights.initialized_value() * 2.0, name="weights_twice")

Fetches

To fetch the outputs of operations, execute the graph with a run() call on the Session object and pass in the tensors to retrieve. Besides fetching a single tensor node, you can also fetch multiple tensors in one call:
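A minimal sketch of fetching several tensors in a single run() call, written against the TF 1.x graph API via tf.compat.v1 so it also runs under TensorFlow 2.x; the tensor names are illustrative:

```python
import tensorflow.compat.v1 as tf  # TF 1.x graph API, available in TF 2.x
tf.disable_eager_execution()

a = tf.constant(3.0)
b = tf.constant(2.0)
c = tf.constant(5.0)
intermed = tf.add(b, c)         # 2.0 + 5.0
mul = tf.multiply(a, intermed)  # 3.0 * 7.0

with tf.Session() as sess:
    # Pass a list of tensors to retrieve them all in one graph execution.
    result = sess.run([mul, intermed])
    print(result)  # [21.0, 7.0]
```

All requested tensors are produced in one execution of the graph, not once per fetched tensor.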

TensorFlow provides a feed mechanism for patching a tensor directly into any operation in the graph.

A feed temporarily replaces the output of an operation with a tensor value. You supply feed data as an argument to a run() call; the feed is used only for the run call to which it is passed. The most common use case designates specific operations as "feed" operations by creating them with tf.placeholder():
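A minimal sketch of the feed mechanism, again using the TF 1.x graph API via tf.compat.v1 so it also runs under TensorFlow 2.x; the placeholder names and fed values are illustrative:

```python
import tensorflow.compat.v1 as tf  # TF 1.x graph API, available in TF 2.x
tf.disable_eager_execution()

# Placeholders are graph inputs that must be fed at run time.
x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
out = tf.multiply(x, y)

with tf.Session() as sess:
    # feed_dict patches concrete values into the placeholders
    # for this run() call only.
    result = sess.run(out, feed_dict={x: 7.0, y: 2.0})
    print(result)  # 14.0
```

Running `out` without feeding `x` and `y` raises an error, since a placeholder has no value of its own.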