What Is Exponential Smoothing?

Matt Hubbard

Exponential smoothing is a technique for manipulating data from a series of chronological observations to downplay the effects of random variation. Mathematical modeling, the creation of a numerical simulation for a data set, often treats observed data as the sum of two or more components, one of which is random error: the difference between the observed value and the underlying true value. When properly applied, smoothing techniques minimize the effect of this random variation, making it easier to see the underlying phenomenon, which is a benefit both in presenting the data and in forecasting future values. They are called "smoothing" techniques because they remove the jagged ups and downs associated with random variation, leaving a smoother line or curve when the data is graphed. The disadvantage of smoothing techniques is that, when improperly used, they can smooth away important trends or cyclical changes along with the random variation, thereby distorting any predictions they offer.
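To make that decomposition concrete, here is a minimal Python sketch, using made-up numbers rather than anything from the article, in which each observation is an underlying true value plus random error:

```python
import random

# Illustrative decomposition: observation = underlying true value + random error.
true_values = [10 + 0.5 * n for n in range(20)]             # the underlying phenomenon
observed = [t + random.gauss(0, 1.5) for t in true_values]  # observations with random error
```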

The simplest smoothing technique is to take an average of all past values. Unfortunately, this completely obscures any trends, changes, or cycles within the data. More sophisticated averages, such as a moving average that uses only the most recent observations or a weighted average that counts some observations more heavily than others, obscure less but still tend to lag as forecasters, not responding to a change in trend until several observations after it has occurred. Exponential smoothing is an attempt to improve on these defects.
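As a point of comparison, here is a minimal sketch of those two averaging approaches; the window size and weights are illustrative choices, not values specified in the article:

```python
def moving_average(data, window=3):
    """Average of the most recent `window` observations at each point."""
    out = []
    for i in range(len(data)):
        chunk = data[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def weighted_average(data, weights=(0.2, 0.3, 0.5)):
    """Weighted average of the most recent observations, with weights
    running oldest to newest so that earlier points count for less."""
    out = []
    for i in range(len(data)):
        chunk = data[max(0, i - len(weights) + 1): i + 1]
        w = weights[-len(chunk):]
        out.append(sum(wi * xi for wi, xi in zip(w, chunk)) / sum(w))
    return out
```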

[Image: Exponential smoothing may be used to downplay random variation in data gathered from a series of chronological observations. Credit: FotolEdhar]

Simple exponential smoothing is the most basic form, using a simple recursive formula to transform the data. S_1, the first smoothed point, is simply set equal to O_1, the first observation. Each subsequent smoothed point is an interpolation between the previous smoothed value and the current observation: S_n = aO_n + (1-a)S_{n-1}. The constant "a" is known as the smoothing constant; its value lies between zero and one and determines how much weight is given to the raw data and how much to the smoothed data. The optimal value for a given series of data is generally determined by statistical analysis that minimizes the random error.
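A direct translation of that recursion into Python might look like the following; the observations and the value a = 0.3 are made up for illustration:

```python
def exponential_smoothing(observed, a):
    """Simple exponential smoothing: S_1 = O_1, then S_n = a*O_n + (1-a)*S_{n-1}."""
    if not 0 < a <= 1:
        raise ValueError("the smoothing constant must lie between zero and one")
    smoothed = [observed[0]]                           # S_1 = O_1
    for o in observed[1:]:
        smoothed.append(a * o + (1 - a) * smoothed[-1])
    return smoothed

data = [3.0, 4.1, 3.8, 5.2, 4.9, 6.3]   # made-up observations
print(exponential_smoothing(data, a=0.3))
```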

If the recursive formula for S_n is rewritten only in terms of the observed data, it yields the formula S_n = aO_n + a(1-a)O_{n-1} + a(1-a)^2 O_{n-2} + ..., revealing that the smoothed data is a weighted average of all the data, with the weights decreasing exponentially in a geometric progression. This is the source of the "exponential" in the phrase "exponential smoothing." The closer the value of "a" is to one, the more responsive the smoothed data will be to changes in trend, but at the expense of being more subject to the random variation in the data.
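A quick numerical check, again with made-up numbers, confirms that the recursive and expanded forms agree; note that when the sum is carried all the way back, the first observation retains the leftover weight (1-a)^(n-1), a detail the ellipsis glosses over:

```python
a = 0.3
obs = [3.0, 4.1, 3.8, 5.2]
n = len(obs)

# Recursive form: S_n = a*O_n + (1 - a)*S_{n-1}, starting from S_1 = O_1.
recursive = obs[0]
for o in obs[1:]:
    recursive = a * o + (1 - a) * recursive

# Expanded form: a weighted average with geometrically decreasing weights,
# the first observation keeping the leftover weight (1 - a)**(n - 1).
expanded = (1 - a) ** (n - 1) * obs[0] + sum(
    a * (1 - a) ** (n - 1 - k) * obs[k] for k in range(1, n)
)

assert abs(recursive - expanded) < 1e-12
```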

The benefit of simple exponential smoothing over a plain average is that it can follow a trend in how the data is changing. It does poorly, however, at separating genuine changes in the trend from the random variation inherent in the data. For that reason, double and triple exponential smoothing are also used, introducing additional smoothing constants and more complicated recursions in order to account for trend and cyclical change in the data.
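The article does not commit to a particular formulation of double smoothing; a common one is Holt's linear method, sketched below under that assumption, with a separate constant "b" for the trend and illustrative values for both constants:

```python
def double_exponential_smoothing(observed, a=0.5, b=0.3):
    """Holt's linear method: smooth a level and a trend together."""
    level = observed[0]
    trend = observed[1] - observed[0]                      # simple initial trend estimate
    smoothed = [level]
    for o in observed[1:]:
        new_level = a * o + (1 - a) * (level + trend)      # smooth the level
        trend = b * (new_level - level) + (1 - b) * trend  # smooth the trend
        level = new_level
        smoothed.append(level)
    return smoothed
```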

Unemployment data is an excellent example of data that benefits from triple exponential smoothing. Triple smoothing allows the unemployment data to be viewed as the sum of four factors: the unavoidable random error in collecting the data, a base level of unemployment, the cyclical seasonal variation that affects many industries, and a changing trend that reflects the health of the economy. By assigning smoothing constants to the base, the trend, and the seasonal variation, triple smoothing makes it easier for a layman to see how unemployment is varying over time. The choice of different constants will alter the appearance of the smoothed data, however, which is one of the reasons economists can sometimes differ greatly in their forecasts.
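Again, the article gives no formulas for triple smoothing; the sketch below assumes the additive Holt-Winters formulation, which matches the base/trend/seasonal decomposition just described, and runs on synthetic seasonal data rather than real unemployment figures:

```python
import math
import random

def triple_exponential_smoothing(observed, period, a=0.4, b=0.2, g=0.3):
    """Additive Holt-Winters: smooth a base level, a trend, and a seasonal
    component that repeats with the given period."""
    # Naive initialization from the first two seasons (illustrative, not robust).
    level = sum(observed[:period]) / period
    trend = (sum(observed[period:2 * period]) - sum(observed[:period])) / period ** 2
    seasonal = [observed[i] - level for i in range(period)]
    smoothed = []
    for n, o in enumerate(observed):
        s = seasonal[n % period]
        new_level = a * (o - s) + (1 - a) * (level + trend)
        trend = b * (new_level - level) + (1 - b) * trend
        seasonal[n % period] = g * (o - new_level) + (1 - g) * s
        level = new_level
        smoothed.append(level + seasonal[n % period])
    return smoothed

# Synthetic data: base level + slow trend + yearly cycle + random error.
data = [100 + 0.5 * n + 8 * math.sin(2 * math.pi * n / 12) + random.gauss(0, 2)
        for n in range(48)]
print(triple_exponential_smoothing(data, period=12)[-4:])
```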

Exponential smoothing is one of many methods for mathematically transforming data to make more sense of the phenomenon that generated it. The computations can be performed in commonly available office software, making it an easily accessible technique. Properly used, it is an invaluable tool for presenting data and for making predictions. Improperly applied, it can obscure important information along with the random variation, so care should be taken with smoothed data.
