What is Autoregressive?

Andrew Burger

"Autoregressive" is a statistical term used when working with time series data that refers to a variable quantity or value of interest that is correlated to, or dependent upon, previous values of that same variable. The related term "autoregression" is a form of regression analysis that uses time series data as input to find out whether a variable of interest is indeed autoregressive, that is, dependent upon previous values of itself. A variable of interest that turns out to be autoregressive suggests, but doesn't in and of itself prove, that there is a cause-and-effect relationship between the current and past values. Hence, time series of known or suspected autoregressive quantities or values are often analyzed using predictive analytic methods to forecast future values of such variables.


Variables that exhibit a significant degree of autoregression arise in many human and natural processes. Stock market prices, foreign exchange rates, digital signals and the number of individuals in a population, for example, are all considered autoregressive, at least to some degree. Moreover, there are many forms of autoregression analysis, each better or worse suited to particular types of autoregressive data sets. Among its applications, autoregression is used in health care to improve the resolution and interpretation of ultrasound diagnostic tests; in telecommunications to improve the transmission, reception and processing of digital signals; in economics to forecast macroeconomic and business performance; and in financial services to calculate personal credit scores, detect fraud, and calculate insurance risk profiles and premiums.

Autoregressive moving average (ARMA) models combine autoregression with moving average models, averages whose constituent elements shift as time progresses. Also known as Box-Jenkins models, after George Box and Gwilym Jenkins, the statisticians who improved upon the original formulations and popularized their use, they are typically used to model and test time series that are a function both of exogenous, or external, shocks and of their own past values. ARMA models are "fit" to actual observations over time of some known or suspected autoregressive variable or variables of interest in order to better understand the processes that generate them. In contrast to strictly autoregressive models, they are considered a means of establishing causality, that is, the existence of a cause-and-effect relationship between the independent and dependent variables. Hence, they are commonly used in time series forecasting and other forms of predictive analytics.
