LSTM regression using TensorFlow.

This post demonstrates how to approximate a sequence of vectors using a recurrent neural network, in particular the LSTM architecture. The complete code used for this post can be found here. Most of the examples I found on the internet apply the LSTM architecture to natural language processing problems, and I couldn't find an example where this architecture is used to predict continuous values.

Update: code compatible with tensorflow 1.1.0

The same model can be achieved with the LSTM layer from Polyaxon; here's an experiment configuration that reproduces the results from this post:

Original Post:

So the task here is to predict a sequence of real numbers based on previous observations. Traditional neural network architectures can't do this; recurrent neural networks were designed to address this issue, as they can store previous information and use it to predict future events.

In this example we will try to predict a couple of functions:

- sin
- sin and cos at the same time
- x*sin(x)
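The three target signals above are easy to generate with NumPy; the `make_series` helper below is an illustrative sketch, not code from the original post:

```python
import numpy as np

def make_series(fn, n_points=1000, step=0.01):
    """Sample fn on an evenly spaced grid; returns (x, y) arrays."""
    x = np.arange(n_points) * step
    return x, fn(x)

# The three targets discussed above.
x, y_sin = make_series(np.sin)
_, y_sincos = make_series(lambda t: np.vstack([np.sin(t), np.cos(t)]).T)  # two outputs per step
_, y_xsinx = make_series(lambda t: t * np.sin(t))
```

The sin/cos case produces two values per time step, which is why the model's output layer must match the dimensionality of the target.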

First of all, let's build our model, lstm_model. The model is a list of stacked LSTM cells of different time steps, followed by a dense layer.
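To make the architecture concrete, here is a minimal NumPy sketch of what a single LSTM cell followed by a dense readout computes; this is an illustrative forward pass with made-up names (`lstm_step`, `lstm_regress`) and random weights, not the TensorFlow code from the post:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM cell step: the four gates are computed from [x, h],
    then the cell state c and hidden state h are updated."""
    z = np.concatenate([x, h]) @ W + b          # shape (4 * hidden,)
    i, f, g, o = np.split(z, 4)                 # input, forget, candidate, output
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h_new = sigmoid(o) * np.tanh(c_new)
    return h_new, c_new

def lstm_regress(seq, W, b, W_out, b_out, hidden=8):
    """Run the cell over a sequence, then apply a dense layer
    to the last hidden state to get a continuous prediction."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x_t in seq:
        h, c = lstm_step(np.atleast_1d(x_t), h, c, W, b)
    return h @ W_out + b_out                    # dense readout, shape (1,)

rng = np.random.default_rng(0)
hidden, n_in = 8, 1
W = rng.normal(scale=0.1, size=(n_in + hidden, 4 * hidden))
b = np.zeros(4 * hidden)
W_out = rng.normal(scale=0.1, size=(hidden, 1))
b_out = np.zeros(1)

pred = lstm_regress(np.sin(np.arange(10) * 0.1), W, b, W_out, b_out)
```

The key difference from the NLP examples is only the readout: instead of a softmax over a vocabulary, the dense layer produces a real value, trained with a regression loss such as mean squared error.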

This will create data that allows our model to look time_steps number of steps back in the past in order to make a prediction. For example, if our first cell is a 10 time_steps cell, then for each prediction we want to make we need to feed the cell 10 historical data points. The y values then correspond to the tenth value of the sequence we want to predict.
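The windowing described above can be sketched as follows; `make_windows` is a hypothetical helper, using the common convention that each window's label is the point that immediately follows it:

```python
import numpy as np

def make_windows(series, time_steps=10):
    """Slice a 1-D series into overlapping windows of length time_steps;
    each window's label is the value that immediately follows it."""
    X = np.array([series[i:i + time_steps]
                  for i in range(len(series) - time_steps)])
    y = series[time_steps:]
    return X, y

series = np.sin(np.arange(100) * 0.1)
X, y = make_windows(series, time_steps=10)
# X[0] holds the first 10 points; y[0] is the point the model must predict from them.
```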

[Figure: model loss]

N.B. I am not completely sure this is the right way to train an LSTM on regression problems; I am still experimenting with the RNN sequence-to-sequence model, and I will update this post or write a new one using that model.