TensorFlow Eager for language generation

Text generation is the task of generating new sentences from the probability distribution over words that a model learns from its training data. We trained our model on Leo Tolstoy’s War and Peace so that it would pick up the style of his writing and try to generate new sentences in that style.
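As a rough sketch of what "generating from a probability distribution over words" means, here is a minimal sampling step in plain Python; the vocabulary and probabilities are made up purely for illustration:

```python
import random

# Hypothetical next-word distribution the model might predict after "the"
next_words = ["prince", "army", "countess", "battle"]
probs = [0.4, 0.3, 0.2, 0.1]

random.seed(0)
# Sample the next word in proportion to the predicted probabilities
word = random.choices(next_words, weights=probs, k=1)[0]
print(word)
```

Repeating this step, feeding each sampled word back into the model, is what produces whole sentences.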

War and Peace is a difficult text to model: of its roughly 38,000-word vocabulary, around 18,000 words appear only once.
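That long tail can be measured with a quick frequency count; the snippet below uses a tiny stand-in string rather than the actual corpus:

```python
from collections import Counter

# Stand-in corpus; in practice this would be the full text of War and Peace
text = "the prince spoke to the prince and the countess listened"
counts = Counter(text.split())

vocab_size = len(counts)                             # number of distinct words
hapaxes = [w for w, c in counts.items() if c == 1]   # words seen exactly once
print(vocab_size, len(hapaxes))
```

Words that occur only once give the model almost no signal, which is why such a heavy tail makes the text hard to learn.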

TensorFlow Eager lets us use TensorFlow without building static graphs ahead of time: operations are evaluated immediately as they are called, at run time. It offers several advantages; below are a few as listed on the TensorFlow website.

An intuitive interface — Structure your code naturally and use Python data structures. Quickly iterate on small models and small data.
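The difference is easiest to see by analogy, in plain Python rather than the TensorFlow API: graph mode first records a description of the computation and runs it later, while eager mode evaluates each expression the moment it is written.

```python
# Graph-style: describe the computation first, run it later
graph = lambda x: x * 2 + 1      # nothing is computed yet
result_graph = graph(3)          # the "run" happens as a separate step

# Eager-style: every expression is evaluated immediately
x = 3
result_eager = x * 2 + 1         # computed right away, inspectable at once

print(result_graph, result_eager)
```

In eager mode you can print or debug any intermediate tensor with ordinary Python tools, which is what makes iteration on small models so quick.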

model.main_model(tf.zeros([batch_size, 60, word_embedding_size]), state="train")  # run once so the LSTM weights are created and can be restored
saver = tf.contrib.eager.Saver(var_list=list(model.variables))
saver.restore(file_prefix='jai_model_v2/weights61')
global_step = tf.train.get_or_create_global_step()
global_step.assign(50)

Below is the training loop. Once the iterator has consumed all the batches it raises an OutOfRangeError; we catch the exception and re-create the iterator, since initializable iterators are not supported in TensorFlow eager mode.
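The restart pattern itself can be mimicked in plain Python; here StopIteration plays the role of TensorFlow's OutOfRangeError, and the batch data is made up for the sketch:

```python
batches = [[1, 2], [3, 4], [5, 6]]  # stand-in for a tf.data dataset

def run_epochs(num_epochs):
    processed = []
    it = iter(batches)
    epoch = 0
    while epoch < num_epochs:
        try:
            batch = next(it)          # like get_next() on the TF iterator
            processed.append(batch)   # the train step would go here
        except StopIteration:         # TF raises OutOfRangeError instead
            epoch += 1
            it = iter(batches)        # re-create the iterator to restart
    return processed

print(len(run_epochs(2)))  # 3 batches per epoch over 2 epochs
```

Re-creating the iterator on each exhaustion is what substitutes for calling an initializable iterator's initializer in graph mode.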