Moving Average
1. Moving Average Methods
Edward L. Boone
Department of Statistical Sciences and Operations Research
Virginia Commonwealth University
November 11, 2013
2. Simple Moving Average
We are considering time series data x_t where t = 1, 2, ..., T.
The order of the observations matters.
A simple moving average attempts to find a local mean.
This can be done simply by taking the average of the points around the time of interest.
For example, if we are interested in a window of half-width k centered at time t, we simply take x_{t-k}, ..., x_{t-1}, x_t, x_{t+1}, ..., x_{t+k} and compute their average.
3. Example
Consider the following example:

 t  :  1    2    3    4    5    6    7    8    9    10
x_t : 1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

If we want the moving average at time t = 3 with window k = 2:

x̄_{3,2} = (x_1 + x_2 + x_3 + x_4 + x_5)/5 = (1.2 + 1.3 + 1.1 + 1.2 + 1.4)/5 = 1.24

If we want the moving average at time t = 7 with window k = 2:

x̄_{7,2} = (x_5 + x_6 + x_7 + x_8 + x_9)/5 = (1.4 + 1.7 + 1.6 + 1.8 + 1.5)/5 = 1.6

Notice that the "local" means are not similar.
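The two averages above can be checked with a few lines of code. This is a Python sketch (the deck's own computations later use R); the function name `centered_ma` is illustrative, not from the slides:

```python
def centered_ma(x, t, k):
    """Centered moving average at (1-indexed) time t with half-width k:
    the mean of the 2k+1 points x_{t-k}, ..., x_t, ..., x_{t+k}."""
    window = x[t - k - 1 : t + k]
    return sum(window) / len(window)

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
print(round(centered_ma(x, 3, 2), 2))  # 1.24
print(round(centered_ma(x, 7, 2), 2))  # 1.6
```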
4. Trailing Moving Average
The problem with a centered moving average is that the mean at time t requires x_{t+1}, x_{t+2}, ..., x_{t+k}, which lie in the future.
In many useful cases we don't know the future.
We want to use only past values.
This leads to the idea of the trailing moving average.
Only take the average of x_{t-k}, x_{t-k+1}, ..., x_{t-1}, x_t.
x̄_{t,k} = (1/(k+1)) Σ_{i=t-k}^{t} x_i
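The trailing average uses only the current point and the k preceding ones. A minimal Python sketch (function name is illustrative):

```python
def trailing_ma(x, t, k):
    """Trailing moving average at (1-indexed) time t:
    the mean of the k+1 points x_{t-k}, ..., x_t."""
    window = x[t - k - 1 : t]
    return sum(window) / len(window)

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
print(round(trailing_ma(x, 7, 2), 4))  # mean of x_5, x_6, x_7 only
```

Unlike the centered version, this can be computed at the final time point, since no future values are needed.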
6. Simple vs. Trailing Moving Average
There are some issues that we will have to confront with all
time series methods.
How to handle the starting values?
Outliers?
Gaps?
Prediction?
Some of these are easier to deal with than others.
7. Simple vs. Trailing Moving Average
Consider the example in the figure: a simulated series with its centered moving average and its trailing moving average.

[Figure: x versus t for t = 0 to 100, with three lines labeled True, Center (red, centered moving average), and Trail (blue, trailing moving average).]

Notice that the red (centered) line is "smoother" than the blue (trailing) line.
8. Issues with Moving Averages
Problems with simple moving average techniques.
In the previous methods, all observations in the window get the same weight.
We may wish to downweight observations as they recede farther into the past, while still using all observations.
Gaps?
Prediction?
Some of these are easier to deal with than others.
9. Exponentially Weighted Moving Average
A "simple" way to address the weighting problem is to use a weighted moving average.
There are several versions of these.
Each attempts to model the components of a time series dataset.
If we just want to model the level, then the Exponentially Weighted Moving Average may be reasonable.

S_1 = x_1
S_t = α x_t + (1 - α) S_{t-1}

These downweight the previous observations but still use all observations.
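The EWMA recursion S_t = α x_t + (1 - α) S_{t-1} can be sketched directly (Python here; the slides' worked examples use R):

```python
def ewma(x, alpha):
    """Exponentially weighted moving average:
    S_1 = x_1, then S_t = alpha*x_t + (1 - alpha)*S_{t-1}."""
    s = [x[0]]
    for xt in x[1:]:
        s.append(alpha * xt + (1 - alpha) * s[-1])
    return s

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
smoothed = ewma(x, 0.3)  # each S_t weights older points geometrically less
```

Note that every past observation contributes to S_t, just with geometrically decaying weight, which is exactly the contrast with the fixed-window methods above.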
12. Exponentially Weighted Moving Average
What we have looked at so far is concerned only with estimating a level (mean).
It only models the level.
We also want to model the trend.
We want to model the seasonality as well.
In order to do this we will need to build the model up from these basic components.
13. Double Exponential Smoothing
Now we can add in a trend term b_t.

Starting values:
S_1 = x_1
b_1 = x_2 - x_1

Process smoothing:
S_t = α x_t + (1 - α)(S_{t-1} + b_{t-1})
b_t = β(S_t - S_{t-1}) + (1 - β) b_{t-1}
15. Example
Again consider the following example using α = 0.3 and β = 0.2:

 t  :  1    2    3    4    5    6    7    8    9    10
x_t : 1.2  1.3  1.1  1.2  1.4  1.7  1.6  1.8  1.5  1.6

Predict x_11 and x_12. Running the recursions gives

S_10 = 1.7343
b_10 = 0.0576

x̂_11 = S_10 + b_10 = 1.7343 + 0.0576 ≈ 1.7919
x̂_12 = S_10 + 2 b_10 = 1.7343 + 2(0.0576) ≈ 1.8496
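These numbers can be reproduced by coding the recursions with the starting values S_1 = x_1 and b_1 = x_2 - x_1 from the previous slide (a Python sketch; the slides themselves use R, and `double_smooth` is an illustrative name):

```python
def double_smooth(x, alpha, beta):
    """Double exponential (Holt) smoothing with S_1 = x_1, b_1 = x_2 - x_1.
    Returns the level path S and trend path b."""
    s, b = [x[0]], [x[1] - x[0]]
    for xt in x[1:]:
        s_new = alpha * xt + (1 - alpha) * (s[-1] + b[-1])
        b.append(beta * (s_new - s[-1]) + (1 - beta) * b[-1])
        s.append(s_new)
    return s, b

x = [1.2, 1.3, 1.1, 1.2, 1.4, 1.7, 1.6, 1.8, 1.5, 1.6]
s, b = double_smooth(x, 0.3, 0.2)
print(round(s[-1], 4), round(b[-1], 4))  # 1.7343 0.0576
x11_hat = s[-1] + b[-1]       # one-step-ahead forecast
x12_hat = s[-1] + 2 * b[-1]   # two-step-ahead forecast
```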
16. Triple Exponential Smoothing
Modeling a level, trend and season together can get a bit complicated.
We need to know the period at which the seasonality manifests.
For quarterly data the seasonal "lag" L may be 4.
For monthly data the seasonal "lag" L may be 12.
For weekly data the seasonal "lag" L may be 52.
For daily data the seasonal "lag" L may be 365.
Think how complicated hourly data would be.
For simplicity we will consider quarterly data with L = 4.
17. Triple Exponential Smoothing
This is also known as the Holt-Winters method.
Process smoothing:
S_t = α (x_t / C_{t-L}) + (1 - α)(S_{t-1} + b_{t-1})
b_t = β(S_t - S_{t-1}) + (1 - β) b_{t-1}
C_t = γ (x_t / S_t) + (1 - γ) C_{t-L}

Starting values:
These are more difficult to get.
Some use the first few cycles and take means.
Some use regression to get S_0 and b_0, then use those to get the initial C's.
We will let R do this for us, so we don't have to worry about it.
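A Python sketch of these recursions for quarterly data (L = 4). The start-up choice here is one of the simple options mentioned above, assumed for illustration: level = mean of the first cycle, trend = average per-step change between the first two cycle means, seasonal factors = first-cycle ratios to the cycle mean:

```python
def holt_winters(x, alpha, beta, gamma, L):
    """Multiplicative Holt-Winters smoothing: level S, trend b, seasonals C.
    Start-up (an assumption, not the slides' R defaults): first-cycle mean
    for S, cycle-mean difference for b, first-cycle ratios for C."""
    s = sum(x[:L]) / L
    b = (sum(x[L:2 * L]) / L - s) / L
    C = [x[i] / s for i in range(L)]
    for t in range(L, len(x)):
        s_prev = s
        s = alpha * x[t] / C[t - L] + (1 - alpha) * (s + b)
        b = beta * (s - s_prev) + (1 - beta) * b
        C.append(gamma * x[t] / s + (1 - gamma) * C[t - L])
    return s, b, C

x = [10, 20, 15, 5, 12, 24, 18, 6]  # two "years" of quarterly data
S, b, C = holt_winters(x, 0.3, 0.2, 0.1, 4)
forecast = (S + b) * C[len(x) - 4]  # one-step-ahead: (S_T + b_T) * C_{T+1-L}
```

The division by C_{t-L} deseasonalizes the observation before updating the level, and the seasonal factor is refreshed only once per cycle, matching the equations above.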
19. HoltWinters Function in R
Using the HoltWinters function in R we can estimate:
smoothing parameters
fitted values
the fitted values also contain the level, trend and seasonal components
25. Conclusion
Moving average and "smoothing" methods are ad hoc methods for analyzing time series data.
MA methods require the user to choose the window width k.
Smoothing methods downweight past observations.
These methods can directly model the level, trend and season components.
Since they are ad hoc, they can produce odd results at times.
While these methods are useful, we need to be careful because they have no clear theory to back them up.