MBMG-7104/ ITHS-2202/ IMAS-3101/ IMHS-3101
Forecasting
 Managers require good forecasts of future events to
make good decisions.
 For example, forecasts of interest rates, energy prices,
and other economic indicators are needed for financial
planning;
 Sales forecasts are needed to plan production and
workforce capacity; and
 forecasts of trends in demographics, consumer behavior,
and technological innovation are needed for long-term
strategic planning.
Forecasting
 Many of us have faced the challenge of selecting the
best option when buying a new product or trying out a
new technique.
 This can be a challenging task as most options tend to
sound similar to one another, making it difficult to
determine the best choice.
Forecasting
 Business analysts may choose from a wide range of
forecasting techniques to support decision making.
 Three major categories of forecasting approaches are:
1. Qualitative and judgmental techniques
2. Statistical time-series models
3. Explanatory/causal methods
1. Qualitative and Judgmental Forecasting
Qualitative and judgmental techniques rely on experience
and intuition;
they are necessary –
 when historical data are not available or
 when the decision maker needs to forecast far into the
future.
For example, a forecast of when the next generation of a microprocessor will
be available and what capabilities it might have will depend greatly on the
opinions and expertise of individuals who understand the technology.
Judgmental techniques range from simple methods, such as a manager's opinion or a group-based jury of executive opinion, to more structured approaches such as:
 Historical analogy and
 The Delphi method.
Historical Analogy
In historical analogy a forecast is obtained through
a comparative analysis with a previous situation.
For example, if a new product is being introduced, the response of
consumers to marketing campaigns to similar, previous products
can be used as a basis to predict how the new marketing campaign
might fare.
 Analogies often provide good forecasts,
 but you need to be careful to recognize new or different
circumstances.
 Another analogy is international conflict relative to the
price of oil.
 Should war break out, the price would be expected to
rise, analogous to what it has done in the past.
The Delphi Method
The Delphi method uses a panel of experts, whose identities are typically kept confidential from one another, to respond to a sequence of questionnaires.
Characteristics of the Delphi
 Participants are experts in their field.
 The technique uses a series of rounds or iterations where
information is given back to the participants for review.
 Participants work anonymously. They do not know who the
other participants might be.
 Future focused
 The Delphi technique is a “consensus” research method.
 In most cases, the goal is to approach a consensus among the
expert panel as to future “best” solutions.
Advantages and Limitations of Delphi
2. Statistical time-series models
 A time series is a stream of historical data, such as weekly sales.
 Time series generally have one or more of the following components:
 Trends (T),
 Seasonal effects (S),
 Cyclical effects (C), or
 Random behavior (R)
 Time-series models assume that whatever forces have influenced the dependent variable in the recent past will continue into the near future. Thus, the forecasts are developed by extrapolating these data into the future.
 Statistical time-series models find greater applicability for short-
range forecasting problems.
 We characterize the values of a time series over T periods as At, t = 1, 2, …, T.
 Time series are tabulated or graphed to show the nature of the
time dependence.
 The forecast value (Ye) is commonly expressed as a multiplicative or additive function of its components:
 Additive model: Ye = T + S + C + R
 Multiplicative model: Ye = T × S × C × R
Components of Time-series
 T : Trend is a gradual long-term directional movement in the
data (growth or decline).
 S: Seasonal effects are similar variations occurring during corresponding periods, e.g., December retail sales.
 Seasonal effects can be captured with quarterly, monthly, weekly, daily, or even hourly indexes.
 C: Cyclical factors are the long-term swings about the trend line.
 They are often associated with business cycles and may extend
out to several years in length.
 R: The random component consists of sporadic (unpredictable) effects due to chance and unusual occurrences.
 Time series that do not have trend, seasonal, or cyclical effects
but are relatively constant and exhibit only random behavior are
called stationary time series.
Different types of Time-series models
A. Naive Approach
 It is the simplest way to forecast.
 It is a technique that assumes the dependent variable (such as demand or sales) in the next period will equal its value in the most recent period.
 E.g., the February sales forecast for Big basket is simply the actual January sales, as in the sketch below.
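As a minimal sketch (the sales figures here are illustrative, not from the slides), the naive forecast is simply the most recent observation:

```python
# Naive forecast: the forecast for the next period equals the most recent actual value.
sales = [100, 120, 110, 130]   # hypothetical monthly sales, oldest to newest

naive_forecast = sales[-1]     # forecast for the next month
print(naive_forecast)          # -> 130
```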
B. Moving Averages model
 The moving average method assumes that future observations
will be similar to the recent past.
 It is most useful as a short-range forecasting method.
 Although this method is very simple, it has proven to be quite
useful in stable environments, such as inventory management,
demand forecast etc.
 Simple Moving Average :
 Mathematically, the simple moving average forecast is
SMA: Ft+1 = (Yt + Yt-1 + … + Yt-N+1) / N,
where Ft+1 = forecast for period t+1, N = number of periods in the average, and Yk = value of the dependent variable in period k.
Moving Averages model
 For example, suppose we have monthly sales data for the past 7 months, as follows.
 To forecast sales from March onward, we can use a 2-month SMA, which averages the two preceding months (a computational sketch follows the chart below):
Month | Actual Sales | 2-Month SMA Calculation | Sales Forecast
Jan   | 100          |                         |
Feb   | 120          |                         |
Mar.  | 110          | (100 + 120)/2 = 110     | 110
Apr.  | 130          | (120 + 110)/2 = 115     | 115
May   | 140          | (110 + 130)/2 = 120     | 120
Jun.  | 120          | (130 + 140)/2 = 135     | 135
Jul.  | 110          | (140 + 120)/2 = 130     | 130
Aug.  | –            | (120 + 110)/2 = 115     | 115
[Chart: SMA forecast – actual sales vs. 2-month SMA sales forecast, Jan through Jul]
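A minimal Python sketch of the 2-month simple moving average, reproducing the forecasts in the table above:

```python
# Simple moving average: the forecast for the next period is the mean of the last N actuals.
sales = {"Jan": 100, "Feb": 120, "Mar": 110, "Apr": 130,
         "May": 140, "Jun": 120, "Jul": 110}   # actual sales from the table above
N = 2                                          # number of periods in the moving average

months = list(sales)
values = list(sales.values())
for t in range(N, len(values) + 1):
    window = values[t - N:t]                   # the N most recent actual values
    forecast = sum(window) / N
    target = months[t] if t < len(months) else "Aug (next month)"
    print(f"SMA forecast for {target}: {forecast:.0f}")
# -> Mar 110, Apr 115, May 120, Jun 135, Jul 130, Aug 115 (matching the table)
```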
 Weighted Moving Average :
 In a weighted moving average, past observations are given different weights (usually the weights decrease as the data become older).
 The weighted moving average forecast is given by
WMA = Σ (Wk × Yk) / Σ Wk,
where Wk is the weight given to the value of Y at time k (Yk); if the weights sum to 1, the denominator is simply 1.
 Here's an example to illustrate how WMA works.
 Suppose you are tracking the sales of a particular product over a period of 5 months; the sales figures and weights for each month are as follows (a computational sketch follows the table):
Month | Actual Sales (units) | Weight | WMA
Jan.  | 100 | 0.5  | 100
Feb.  | 120 | 0.3  | 95
Mar.  | 150 | 0.1  | 102.67
Apr.  | 180 | 0.05 | 110.47
May   | 200 | 0.05 | 120

WMA for May = (100 × 0.5) + (120 × 0.3) + (150 × 0.1) + (180 × 0.05) + (200 × 0.05) = 50 + 36 + 15 + 9 + 10 = 120
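A minimal Python sketch of the weighted moving average for May, using the sales and weights from the table above (these weights already sum to 1, so the division changes nothing):

```python
# Weighted moving average: WMA = sum(W_k * Y_k) / sum(W_k).
sales   = [100, 120, 150, 180, 200]       # Jan..May actual sales from the table above
weights = [0.5, 0.3, 0.1, 0.05, 0.05]     # weights from the table (they sum to 1)

wma = sum(w * y for w, y in zip(weights, sales)) / sum(weights)
print(round(wma, 2))                      # -> 120.0, the WMA for May shown above
```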
Error Metrics and Forecast Accuracy
 The quality of a forecast depends on how accurate it is in predicting
future values of a time series.
 In simple moving average and smoothing models, different parameter choices (such as the number of periods averaged) will produce different forecasts.
 To analyze the effectiveness of different forecasting models, we can
define error metrics, which compare quantitatively the forecast with the
actual observations.
 Three metrics that are commonly used are :
• Mean Absolute Deviation,
• Mean Square Error, and
• Mean Absolute Percentage Error.
Mean Absolute Deviation :
 The mean absolute deviation (MAD) is the absolute difference between the actual value and the forecast, averaged over a range of forecast values:
MAD = ( Σ |At − Ft| ) / n,
where At = actual value of the time series at time t, Ft = forecast value for time t, and n = number of forecast values (not the number of data points, since we do not have a forecast value associated with the first k data points).
 MAD provides a robust measure of error and is less affected by extreme observations.
Mean Square Error :
 Mean square error (MSE) is probably the most commonly used error metric.
 It penalizes larger errors, because squaring larger numbers has a greater impact than squaring smaller numbers. The formula for MSE is
MSE = ( Σ (At − Ft)² ) / n.
 Again, n represents the number of forecast values used in computing the average.
 The square root of MSE, called the root mean square error (RMSE), is also used:
RMSE = √MSE = √[ ( Σ (At − Ft)² ) / n ].
 RMSE is expressed in the same units as the data, allowing for more practical comparisons.
Mean Absolute Percentage Error:
 The mean absolute percentage error (MAPE) is the average of the absolute errors divided by the actual observation values:
MAPE = [ Σ ( |At − Ft| / At ) ] / n × 100%.
 MAPE eliminates the measurement scale by dividing the absolute error by the time-series data value.
 This allows a better relative comparison.
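A minimal Python sketch of the three metrics above, applied to the March through July actuals and 2-month SMA forecasts from the earlier moving-average example (pairing those two series here is our own illustration, not a calculation shown on the slides):

```python
import math

# Actual values (A_t) and the corresponding forecasts (F_t), Mar..Jul from the SMA table.
actual   = [110, 130, 140, 120, 110]
forecast = [110, 115, 120, 135, 130]

n = len(actual)
errors = [a - f for a, f in zip(actual, forecast)]

mad  = sum(abs(e) for e in errors) / n                             # mean absolute deviation
mse  = sum(e ** 2 for e in errors) / n                             # mean square error
rmse = math.sqrt(mse)                                              # root mean square error
mape = sum(abs(e) / a for e, a in zip(errors, actual)) / n * 100   # in percent

print(f"MAD = {mad:.2f}, MSE = {mse:.2f}, RMSE = {rmse:.2f}, MAPE = {mape:.2f}%")
# -> MAD = 14.00, MSE = 250.00, RMSE = 15.81, MAPE = 11.30%
```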
C. Exponential Smoothing
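In its simplest (single-parameter) form, exponential smoothing forecasts the next period as a weighted average of the most recent observation and the previous forecast: Ft+1 = αAt + (1 − α)Ft, where the smoothing constant α lies between 0 and 1. Below is a minimal Python sketch, assuming an illustrative α = 0.3 and reusing the Jan through Jul sales from the moving-average example (both of these choices are ours, not from the slides):

```python
# Simple exponential smoothing: F_{t+1} = alpha * A_t + (1 - alpha) * F_t.
sales = [100, 120, 110, 130, 140, 120, 110]   # Jan..Jul actuals from the earlier example
alpha = 0.3                                   # illustrative smoothing constant

forecast = sales[0]        # one common convention: initialize the first forecast with the first actual
for actual in sales:
    forecast = alpha * actual + (1 - alpha) * forecast
print(round(forecast, 2))  # smoothed forecast for the next (August) period -> 117.9
```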
D. Trend Line
 This technique fits a trend line to a series of historical data points and then projects the line into the future for medium- to long-range forecasts.
 If the time series exhibits an increasing or decreasing trend, then trend analysis is appropriate.
 A trend line defines the relationship between the forecast value and the time period by the following equation:
 ŷ = a + bX, where ŷ is the forecast value (the computed value of the variable to be predicted, called the dependent variable) and X is the time period.
 X is the independent variable and Y is the dependent
variable since the forecast value depends on the time
period.
 The least-squares method is used to develop the trend line.
 a = Y -axis intercept.
 b = slope of the regression line (or the rate of change in Y
for given changes in X)
 X = the independent variable (which in this case is time).
 Now we calculate the values of the constants a and b as:
b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)² = Cov(X, Y) / Var(X)
and
a = Ȳ − b · X̄
 In the equation, Y = a + bX, a is the intercept on the Y-
axis. a gives the value of variable Y, when X = 0.
 The slope of the line is b which gives the change in the
value of dependent variable Y for a unit change in the
value of X.
 The “Intercept” and “Slope” functions in Excel are used to
calculate a and b respectively.
Example: Consider the demand data given in the table below. Project the trendline.

Time, X (independent variable):   1   2   3   4   5   6   7   8   9   10
Demand, Y (dependent variable):   9  15  32  48  52  60  39  65  90  93
Solution (manual method):
Time (X) | Demand (Y) | X − X̄ | Y − Ȳ | (X − X̄)(Y − Ȳ) | (X − X̄)²
1  | 9  | -4.5 | -41.3 | 185.85 | 20.25
2  | 15 | -3.5 | -35.3 | 123.55 | 12.25
3  | 32 | -2.5 | -18.3 | 45.75  | 6.25
4  | 48 | -1.5 | -2.3  | 3.45   | 2.25
5  | 52 | -0.5 | 1.7   | -0.85  | 0.25
6  | 60 | 0.5  | 9.7   | 4.85   | 0.25
7  | 39 | 1.5  | -11.3 | -16.95 | 2.25
8  | 65 | 2.5  | 14.7  | 36.75  | 6.25
9  | 90 | 3.5  | 39.7  | 138.95 | 12.25
10 | 93 | 4.5  | 42.7  | 192.15 | 20.25
X̄ = 5.5 | Ȳ = 50.3 | | | Σ = 713.5 | Σ = 82.5
b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)² = 713.5 / 82.5 = 8.6485

a = Ȳ − b · X̄ = 50.3 − 8.6485 × 5.5 = 2.733

Trendline: Y = a + bX = 2.733 + 8.6485 X
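A minimal Python sketch of the least-squares computation above; it reproduces b ≈ 8.6485 and a ≈ 2.733 and, as a usage example, projects period 11 (equivalent to Excel's SLOPE and INTERCEPT for these data):

```python
# Least-squares trend line: b = Cov(X, Y) / Var(X), a = Ybar - b * Xbar.
X = list(range(1, 11))                          # time periods 1..10
Y = [9, 15, 32, 48, 52, 60, 39, 65, 90, 93]     # demand from the example

x_bar = sum(X) / len(X)
y_bar = sum(Y) / len(Y)

b = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / sum((x - x_bar) ** 2 for x in X)
a = y_bar - b * x_bar

print(round(b, 4), round(a, 4))        # -> 8.6485 2.7333
print(round(a + b * 11, 2))            # forecast for period 11 -> 97.87
```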
Solution (manual method), continued:
 For any given time period, the difference between the forecast (a value on the dashed trend line) and the actual demand (the zigzag line of data points) gives the error in that period.
 The trend-analysis method minimizes the sum of the squares of these errors in calculating the values of a and b.
Solution (Excel):
Using the same demand data (X = 1, 2, …, 10; Y = 9, 15, 32, 48, 52, 60, 39, 65, 90, 93), Excel's SLOPE and INTERCEPT functions give:
Slope (b) = 8.65
Intercept (a) = 2.73
Fitted trendline: y = 8.6485x + 2.7333
[Chart: actual demand (Y) for periods 1 to 10 with the fitted linear trendline]
 The Excel functions give b = 8.65 and a = 2.73.
 Use them in the equation Y = a + bX to make a forecast.
 For example, for period 11 (X = 11): Forecast = 2.7333 + 8.6485 × 11 = 97.87.
 Similarly, for period 12: Forecast = 2.7333 + 8.6485 × 12 = 106.52.
3. Explanatory/causal methods.
Explanatory/causal models, often called econometric
models, seek to identify factors that explain statistically
the patterns observed in the variable being forecast,
usually with regression analysis.
Regression Analysis
 Regression Analysis establishes a relationship between two sets of numbers
that are time series.
 For example, when a series of Y numbers (such as the monthly sales of
cameras over a period of years) is causally connected with the series of X
numbers (the monthly advertising budget), then it is beneficial to establish
a relationship between X and Y in order to forecast Y.
 In regression analysis X is the independent variable and Y is the dependent
variable.
 The regression analysis gives the relationship between X
and Y by the following equation.
Y = a + bX,
where, a is the intercept on the Y-axis
(value of the variable Y when X = 0); and b is the slope of the line
which gives the change in the value of variable Y for a unit change
in the value of X.
 The “Intercept” function in Excel calculates a and the
“Slope” function in Excel is used to find the value of b.
Example: Use the data given in the following table for ten pairs of X and Y.

Observation number:         1    2    3    4    5    6    7    8    9   10
Independent variable (X):  10   12   11    9   10   12   10   13   14   12
Dependent variable (Y):   400  600  700  500  800  700  500  700  800  600

o The Excel functions give b = 50.23 and a = 62.44.
o Use them in the equation Y = a + bX to forecast.
o Suppose X = 15; then Forecast = 62.44 + 50.23 × 15 = 815.84.
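A minimal Python sketch of the same least-squares computation for this example; it reproduces b ≈ 50.23 and a ≈ 62.44 and the forecast for X = 15:

```python
# Simple linear regression of Y on X using the least-squares formulas.
X = [10, 12, 11, 9, 10, 12, 10, 13, 14, 12]
Y = [400, 600, 700, 500, 800, 700, 500, 700, 800, 600]

x_bar = sum(X) / len(X)
y_bar = sum(Y) / len(Y)

b = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / sum((x - x_bar) ** 2 for x in X)
a = y_bar - b * x_bar

print(round(b, 2), round(a, 2))    # -> 50.23 62.44
print(round(a + b * 15, 2))        # forecast for X = 15 -> 815.84
```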
The forecasts (values on the fitted straight line) and the actual demand data points can be plotted together, as in the figure below.
[Figure: actual Y values for the ten observations with the fitted regression line]
 For any given time period, the difference between the
forecast values and the actual demand gives the error in that
period.
 The regression analysis minimizes the sum of the squares of
these errors in calculating the values of a and b.
 An assumption that is generally made in regression analysis is that the relationship between the paired variables is linear.
 However, if a nonlinear relationship is hypothesized, there are powerful, but more complex, methods for performing nonlinear regression analysis.
Coefficient of Determination
 The coefficient of determination (r²), where r is the value of the coefficient of correlation, is a measure of the variability in the dependent variable that is accounted for by the regression line.
 The coefficient of determination always falls between 0 and 1.
 For example, if r = 0.8, the coefficient of determination is r² = 0.64, meaning that 64% of the variation in Y is explained by variation in X.
 The remaining 36% of the variation in Y is due to other variables.
 If the coefficient of determination is low, multiple regression analysis may be used to account for additional variables affecting the dependent variable Y.
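A minimal Python sketch that computes r and r² for the regression example above using the standard correlation formula (the slides do not state this value; the result is our own calculation):

```python
import math

# Coefficient of correlation r and coefficient of determination r^2.
X = [10, 12, 11, 9, 10, 12, 10, 13, 14, 12]
Y = [400, 600, 700, 500, 800, 700, 500, 700, 800, 600]

x_bar = sum(X) / len(X)
y_bar = sum(Y) / len(Y)

sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
sxx = sum((x - x_bar) ** 2 for x in X)
syy = sum((y - y_bar) ** 2 for y in Y)

r = sxy / math.sqrt(sxx * syy)
print(round(r, 3), round(r ** 2, 3))   # -> approximately 0.588 and 0.346
```

For these particular data r² comes out fairly low (about 0.35), which is the situation the final bullet above describes.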