A New Approach and Its Applications for Time Series Analysis and Prediction Based on Moving Average of nth-Order Difference

Abstract

Time series prediction is a typical data mining problem, widely applied across domains. The approach focuses on a series of observations, with the aim of analyzing and processing them using mathematical, statistical, and artificial intelligence methods, and of predicting the next most probable value from a number of previous values. We propose an algorithm that uses the moving average of the nth-order differences of series terms within limited range margins, in order to predict the next series term from both the original data set and a negligible error. The algorithm's performance is evaluated on measurement data sets of the monthly average Sunspot Number, Earthquakes, and a Pseudo-Periodical Synthetic Time Series.
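The core idea described above can be illustrated with a minimal sketch: compute the nth-order differences of the series, average the most recent ones, and invert the difference operator to obtain the next term. This is an assumption-laden illustration, not the paper's exact algorithm; the function name, the `window` parameter, and the use of a simple mean are illustrative choices.

```python
import numpy as np
from math import comb

def predict_next(x, n=1, window=5):
    """Illustrative sketch: predict the next term of series x from the
    moving average of its n-th order differences over `window` values.
    (Hypothetical helper, not the authors' published implementation.)"""
    d = np.diff(x, n=n)                # n-th order differences of the series
    pred_diff = d[-window:].mean()     # moving average of the recent differences
    # Invert the difference operator:
    #   diff^n x_{t+1} = sum_{k=0}^{n} (-1)^k C(n,k) x_{t+1-k}
    # and solve for x_{t+1} given the predicted n-th order difference.
    correction = sum((-1) ** k * comb(n, k) * x[-k] for k in range(1, n + 1))
    return pred_diff - correction
```

For a linear series the first-order differences are constant, so `predict_next([1, 2, 3, 4, 5], n=1)` continues the trend; for a quadratic series the second-order differences are constant, and `n=2` continues it exactly.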

Keywords

Time Series, Time Series Analysis, Sunspot Number, Mean Absolute Error, Time Series Prediction
Lan, Y., Neagu, D.: A New Time Series Prediction Algorithm Based on Moving Average of nth-Order Difference. In: Proceedings of the Sixth International Conference on Machine Learning and Applications, pp. 248–253. IEEE Computer Society, Washington, DC, USA (2007)