Time-Series Prediction Algorithm
A Time-Series Prediction Algorithm is a sequential prediction algorithm that can be applied by a temporal prediction system to solve a temporal prediction task.
- Context:
- It can range from being a Univariate Forecasting Algorithm to being a Multi-Predictor Forecasting Algorithm.
- It can range from (typically) being a Data-Driven Forecasting Algorithm (e.g. a supervised forecasting algorithm) to being a Heuristic Forecasting Algorithm.
- It can range from being a Statistics-based Forecasting Algorithm to being an ML-based Forecasting Algorithm.
- It can range from being a Temporal Point Prediction Algorithm to being a Temporal Series Prediction Algorithm (such as a time-series pattern classification algorithm).
- It can range from being a Time-Series Categorical Prediction Algorithm to being a Time-Series Ranking Algorithm to being a Time-Series Numeric Prediction Algorithm.
- Example(s):
- a Naive Forecasting Algorithm (this and other simple baselines are sketched in code below, after the See item).
- a Mean-based Forecasting Algorithm.
- a Random Walk Forecasting Algorithm, such as a random-walk-without-drift algorithm.
- a Linear Trend Forecasting Algorithm.
- a Moving Average-based Forecasting Algorithm.
- an Exponential Smoothing-based Forecasting Algorithm, such as a Holt-Winters forecasting algorithm.
- a Vector Autoregression Algorithm, such as Bayesian vector autoregression.
- an ARIMA-based Algorithm.
- an SVM-based Forecasting Algorithm.
- a Neural Network-based Forecasting Algorithm.
- Counter-Example(s):
- See: Ranking Algorithm.
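The simplest of the example algorithms above can each be stated in a few lines. Below is a minimal sketch, assuming a univariate series held in a NumPy array; the function names (naive_forecast, mean_forecast, linear_trend_forecast) are illustrative and not taken from any particular library.

```python
import numpy as np

def naive_forecast(y, h):
    """Naive method: repeat the last observed value for h future steps."""
    return np.repeat(y[-1], h)

def mean_forecast(y, h):
    """Mean model: forecast every future step with the historical mean."""
    return np.repeat(y.mean(), h)

def linear_trend_forecast(y, h):
    """Linear trend model: fit y_t = a + b*t by least squares and extrapolate."""
    t = np.arange(len(y))
    b, a = np.polyfit(t, y, deg=1)   # slope, intercept
    future_t = np.arange(len(y), len(y) + h)
    return a + b * future_t

# Example usage on a short synthetic series.
y = np.array([10.0, 12.0, 13.0, 15.0, 16.0, 18.0])
print(naive_forecast(y, 3))         # repeats 18.0
print(mean_forecast(y, 3))          # repeats the overall mean, 14.0
print(linear_trend_forecast(y, 3))  # extrapolates the fitted upward trend
```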
References
2018a
- (Makridakis et al., 2018) ⇒ Spyros Makridakis, Evangelos Spiliotis, and Vassilios Assimakopoulos. (2018). “Statistical and Machine Learning Forecasting Methods: Concerns and Ways Forward.” In: PloS one, 13(3).
- QUOTE: Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. ...
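The accuracy comparison described in the quote is typically reported with a horizon-indexed error measure. As a minimal sketch (illustrative code, not the paper's actual evaluation pipeline), the symmetric MAPE used in the M-competitions can be computed separately for each forecasting horizon:

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric MAPE (in percent), as used in the M-competitions."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(2.0 * np.abs(forecast - actual) /
                           (np.abs(actual) + np.abs(forecast)))

def smape_by_horizon(actuals, forecasts):
    """actuals, forecasts: arrays of shape (n_series, max_horizon).
    Returns one sMAPE value per horizon 1..max_horizon."""
    actuals, forecasts = np.asarray(actuals, float), np.asarray(forecasts, float)
    return np.array([smape(actuals[:, h], forecasts[:, h])
                     for h in range(actuals.shape[1])])
```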
2018b
- https://towardsdatascience.com/using-lstms-to-forecast-time-series-4ab688386b1f
- QUOTE: There are several time-series forecasting techniques like auto regression (AR) models, moving average (MA) models, Holt-winters, ARIMA etc., to name a few. So, what is the need for yet another model like LSTM-RNN to forecast time-series? This is quite a valid question to begin with and here are the reasons that I could come up with (respond below if you are aware of more, I will be curious to know)—
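As a minimal sketch of the LSTM approach the post describes (assuming TensorFlow/Keras is available; the window size, layer sizes, and training settings below are illustrative, not the post's actual configuration):

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

series = np.sin(np.linspace(0, 20, 500))          # toy series
X, y = make_windows(series, window=24)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(24, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# One-step-ahead forecast from the last observed window.
next_value = model.predict(series[-24:].reshape(1, 24, 1), verbose=0)
```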
2018c
- https://machinelearningmastery.com/time-series-forecasting-methods-in-python-cheat-sheet/
- QUOTE:
- Autoregression (AR).
- Moving Average (MA).
- Autoregressive Moving Average (ARMA).
- Autoregressive Integrated Moving Average (ARIMA).
- Seasonal Autoregressive Integrated Moving-Average (SARIMA).
- Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX).
- Vector Autoregression (VAR).
- Vector Autoregression Moving-Average (VARMA).
- Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX).
- Simple Exponential Smoothing (SES).
- Holt-Winters Exponential Smoothing (HWES).
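Several of the methods in this cheat sheet are implemented in the statsmodels package; the following is a minimal sketch (the series, model orders, and seasonal settings are illustrative, not recommendations):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Toy monthly series: a random walk shifted to a positive level.
y = np.random.default_rng(0).normal(size=120).cumsum() + 50

# ARIMA(p, d, q): here ARIMA(1, 1, 1), forecasting 12 steps ahead.
arima_fit = ARIMA(y, order=(1, 1, 1)).fit()
arima_forecast = arima_fit.forecast(steps=12)

# Holt-Winters exponential smoothing (HWES) with additive trend and seasonality.
hwes_fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                                seasonal_periods=12).fit()
hwes_forecast = hwes_fit.forecast(12)
```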
2014
- http://people.duke.edu/~rnau/411avg.htm
- QUOTE: As a first step in moving beyond mean models, random walk models, and linear trend models, nonseasonal patterns and trends can be extrapolated using a moving-average or smoothing model. The basic assumption behind averaging and smoothing models is that the time series is locally stationary with a slowly varying mean. Hence, we take a moving (local) average to estimate the current value of the mean and then use that as the forecast for the near future. This can be considered as a compromise between the mean model and the random-walk-without-drift-model. The same strategy can be used to estimate and extrapolate a local trend. A moving average is often called a "smoothed" version of the original series because short-term averaging has the effect of smoothing out the bumps in the original series. By adjusting the degree of smoothing (the width of the moving average), we can hope to strike some kind of optimal balance between the performance of the mean and random walk models. The simplest kind of averaging model is the....
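A minimal sketch of the compromise described in the quote (the helper name moving_average_forecast is illustrative): with a window of 1 the forecast reduces to the random walk without drift (repeat the last value), and with a window spanning the full history it reduces to the mean model.

```python
import numpy as np

def moving_average_forecast(y, window, h):
    """Forecast the next h values with the mean of the last `window` observations."""
    return np.repeat(np.mean(y[-window:]), h)

y = np.array([20.0, 22.0, 19.0, 24.0, 23.0, 25.0])
print(moving_average_forecast(y, window=1, h=3))       # random walk w/o drift: last value
print(moving_average_forecast(y, window=3, h=3))       # local average of the last 3 points
print(moving_average_forecast(y, window=len(y), h=3))  # mean model: overall average
```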
2013
- http://www.businessdictionary.com/definition/forecasting-system.html
- QUOTE: Set of techniques or tools required for analysis of historical data, selection of most appropriate modeling structure, model validation, development of forecasts, and monitoring and adjustment of forecasts, etc.
2006
- (De Gooijer & Hyndman, 2006) ⇒ Jan G. De Gooijer, and Rob J. Hyndman. (2006). “25 years of time series forecasting.” In: International Journal of Forecasting, 22(3). doi:10.1016/j.ijforecast.2006.01.001
- AUTHOR KEYWORDS: Accuracy measures; ARCH; ARIMA; Combining; Count data; Densities; Exponential smoothing; Kalman filter; Long memory; Multivariate; Neural nets; Nonlinearity; Prediction intervals; Regime-switching; Robustness; Seasonality; State space; Structural models; Transfer function; Univariate; VAR.