This paper describes a simple and efficient neural language model approach to text classification that relies only on unsupervised word representations as input. Our model applies a Recurrent Neural Network with Long Short-Term Memory units (RNN-LSTM) on top of pre-trained word vectors for sentence-level classification tasks. We hypothesize that word vectors obtained from an unsupervised neural language model, used as extra features in an RNN-LSTM Natural Language Processing (NLP) system, can increase the system's performance. This technique improves the model's ability to capture syntactic and semantic word relationships. We show that a simple RNN-LSTM with word2vec achieves excellent results on the Stanford IMDB benchmark for sentiment analysis.
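The architecture described above can be sketched minimally: look up each token in a pre-trained embedding matrix, run an LSTM over the embedded sequence, and apply a sigmoid to the final hidden state for binary sentiment. The following is an illustrative numpy sketch, not the paper's implementation; the vocabulary, dimensions, and randomly initialized weights (standing in for trained word2vec vectors and learned LSTM parameters) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny vocabulary; E stands in for a pre-trained word2vec
# embedding matrix (random values here, real vectors in practice).
vocab = {"<pad>": 0, "the": 1, "movie": 2, "was": 3, "great": 4, "awful": 5}
embed_dim, hidden_dim = 8, 16
E = rng.normal(scale=0.1, size=(len(vocab), embed_dim))


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class LSTMCell:
    """Single LSTM cell; all four gates computed from [h_prev, x_t]."""

    def __init__(self, input_dim, hidden_dim):
        z = hidden_dim + input_dim
        self.W = rng.normal(scale=0.1, size=(4 * hidden_dim, z))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        # Gate pre-activations: input (i), forget (f), output (o), update (u).
        g = self.W @ np.concatenate([h, x]) + self.b
        i, f, o, u = np.split(g, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(u)   # new cell state
        h = o * np.tanh(c)           # new hidden state
        return h, c


def classify(tokens, cell, w_out, b_out):
    """Embed tokens, run the LSTM, and score the final hidden state."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for tok in tokens:
        h, c = cell.step(E[vocab[tok]], h, c)
    return sigmoid(w_out @ h + b_out)  # P(positive sentiment)


cell = LSTMCell(embed_dim, hidden_dim)
w_out = rng.normal(scale=0.1, size=hidden_dim)
p = classify(["the", "movie", "was", "great"], cell, w_out, 0.0)
print(p)
```

In a trained system, `E` would hold word2vec vectors learned on a large unlabeled corpus, and the LSTM and output weights would be fit on the labeled IMDB training set.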