Managers require accurate forecasts to make good decisions. There are three main categories of forecasting approaches: qualitative and judgmental techniques, which rely on experience and intuition; statistical time-series models, which extrapolate patterns in historical data; and explanatory/causal methods, which model the factors that influence the variable being forecast. Common forecasting techniques include moving averages, exponential smoothing, and trend-line analysis, with error metrics such as the mean absolute deviation used to evaluate accuracy.
Forecasting
• Managers require good forecasts of future events to make good decisions.
• For example, forecasts of interest rates, energy prices, and other economic indicators are needed for financial planning;
• sales forecasts are needed to plan production and workforce capacity; and
• forecasts of trends in demographics, consumer behavior, and technological innovation are needed for long-term strategic planning.
@Ravindra Nath Shukla (PhD Scholar) ABV-IIITM
• Many of us have faced the challenge of selecting the best option when buying a new product or trying out a new technique.
• This can be challenging because most options sound similar to one another, making it difficult to determine the best choice.
• Business analysts may choose from a wide range of forecasting techniques to support decision making.
• The three major categories of forecasting approaches are:
  1. Qualitative and judgmental techniques
  2. Statistical time-series models
  3. Explanatory/causal methods
1. Qualitative and Judgmental Forecasting
Qualitative and judgmental techniques rely on experience and intuition. They are necessary:
• when historical data are not available, or
• when the decision maker needs to forecast far into the future.
For example, a forecast of when the next generation of a microprocessor will be available, and what capabilities it might have, will depend greatly on the opinions and expertise of individuals who understand the technology.
Judgmental techniques range from simple methods, such as:
• a manager's opinion, or
• a group-based jury of executive opinion,
to more structured approaches, such as:
• historical analogy and
• the Delphi method.
Historical Analogy
In historical analogy, a forecast is obtained through comparative analysis with a previous situation. For example, if a new product is being introduced, the response of consumers to marketing campaigns for similar, previous products can be used as a basis to predict how the new marketing campaign might fare.
• Analogies often provide good forecasts, but you need to be careful to recognize new or different circumstances.
• Another analogy relates international conflict to the price of oil: should war break out, the price would be expected to rise, as it has in the past.
The Delphi Method
The Delphi method uses a panel of experts, whose identities are typically kept confidential from one another, to respond to a sequence of questionnaires.
Characteristics of the Delphi Method
• Participants are experts in their field.
• The technique uses a series of rounds or iterations in which information is fed back to the participants for review.
• Participants work anonymously; they do not know who the other participants are.
• It is future focused.
• The Delphi technique is a "consensus" research method: in most cases, the goal is to approach a consensus among the expert panel as to future "best" solutions.
2. Statistical Time-Series Models
• A time series is a stream of historical data, such as weekly sales.
• Time series generally have one or more of the following components:
  - Trend (T),
  - Seasonal effects (S),
  - Cyclical effects (C), or
  - Random behavior (R)
• Time-series models assume that whatever forces have influenced the dependent variable in the recent past will continue into the near future. Thus, forecasts are developed by extrapolating these data into the future.
• Statistical time-series models are most applicable to short-range forecasting problems.
• We characterize the values of a time series over T periods as A_t, t = 1, 2, …, T.
• Time series are tabulated or graphed to show the nature of the time dependence.
• The forecast value is commonly expressed as an additive or multiplicative function of the components:
  - Additive model: Y = T + S + C + R
  - Multiplicative model: Y = T × S × C × R
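The two composition models above can be illustrated with a short sketch. All component values below are invented for illustration, not data from these slides; in the multiplicative form, S, C, and R are expressed as indexes centered on 1.

```python
# Illustrative decomposition components (invented numbers, not slide data)
trend = [100, 102, 104, 106]      # T: gradual growth
seasonal = [10, -5, 8, -13]       # S: repeating seasonal swings
cyclical = [2, 2, -1, -1]         # C: slow swing about the trend
random_part = [1, -2, 0, 3]       # R: unpredictable noise

# Additive model: Y = T + S + C + R
additive = [t + s + c + r
            for t, s, c, r in zip(trend, seasonal, cyclical, random_part)]
print(additive)  # [113, 97, 111, 95]

# Multiplicative model: Y = T x S x C x R, with S, C, R as indexes around 1
s_idx = [1.10, 0.95, 1.08, 0.87]
c_idx = [1.02, 1.02, 0.99, 0.99]
r_idx = [1.01, 0.98, 1.00, 1.03]
multiplicative = [t * s * c * r
                  for t, s, c, r in zip(trend, s_idx, c_idx, r_idx)]
print([round(y, 1) for y in multiplicative])
```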
• T: Trend is a gradual long-term directional movement in the data (growth or decline).
• S: Seasonal effects are similar variations occurring during corresponding periods, e.g., December retail sales.
• Seasonal indexes can be quarterly, monthly, weekly, daily, or even hourly.
• C: Cyclical factors are long-term swings about the trend line. They are often associated with business cycles and may extend over several years.
• R: The random component comprises sporadic (unpredictable) effects due to chance and unusual occurrences.
• Time series that have no trend, seasonal, or cyclical effects but are relatively constant and exhibit only random behavior are called stationary time series.
Types of Time-Series Models
A. Naive Approach
• This is the simplest way to forecast.
• It assumes that the dependent variable (such as demand or sales) in the next period will equal the dependent variable in the most recent period.
• E.g., February sales for Big basket are forecast to equal January sales.
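A minimal sketch of the naive approach (the function name and sales figures are illustrative assumptions, not from the slides):

```python
def naive_forecast(history):
    """Naive approach: the forecast for the next period is simply
    the value observed in the most recent period."""
    if not history:
        raise ValueError("need at least one observation")
    return history[-1]

# Hypothetical monthly sales: Jan, Feb, Mar
sales = [100, 120, 110]
print(naive_forecast(sales))  # forecast for April = March's value = 110
```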
B. Moving Averages Model
• The moving average method assumes that future observations will be similar to the recent past.
• It is most useful as a short-range forecasting method.
• Although very simple, it has proven quite useful in stable environments, such as inventory management and demand forecasting.
• Simple moving average (SMA): the forecast for the next period is the average of the dependent variable over the last N periods:

  F(t+1) = (Y_t + Y_(t-1) + … + Y_(t-N+1)) / N

  where F(t+1) = forecast for period t+1, N = number of periods in the average, and Y_k = value of the dependent variable in period k.
Moving Averages Model: Example
• Suppose we have monthly sales data for the past seven months.
• To forecast sales for the next month, we can use a 2-month SMA, which averages the two most recent months:

  Month   Actual Sales   2-Month SMA Forecast
  Jan     100            -
  Feb     120            -
  Mar     110            (100 + 120)/2 = 110
  Apr     130            (120 + 110)/2 = 115
  May     140            (110 + 130)/2 = 120
  Jun     120            (130 + 140)/2 = 135
  Jul     110            (140 + 120)/2 = 130
  Aug     -              (120 + 110)/2 = 115

[Chart: actual sales vs. 2-month SMA forecast, Jan-Jul]
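The 2-month SMA column in the table can be reproduced with a short sketch (the function name is an illustrative choice, not from the slides):

```python
def simple_moving_average(history, n):
    """SMA forecast for the next period: mean of the last n observations."""
    if len(history) < n:
        raise ValueError(f"need at least {n} observations")
    return sum(history[-n:]) / n

# Monthly sales Jan-Jul from the example above
sales = [100, 120, 110, 130, 140, 120, 110]
# Forecasts for Mar through Aug, each using the two preceding months
forecasts = [simple_moving_average(sales[:t], 2)
             for t in range(2, len(sales) + 1)]
print(forecasts)  # [110.0, 115.0, 120.0, 135.0, 130.0, 115.0]
```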
Weighted Moving Average
• In a weighted moving average (WMA), past observations are given different weights (usually the weights decrease as the data become older).
• The weighted moving average is given by:

  WMA = Σ (w_k × Y_k) / Σ w_k

  where w_k is the weight given to the value of Y at time k (Y_k).
Here's an example to illustrate how WMA works. Suppose you are tracking the sales of a particular product over five months:

  Month   Actual Sales (units)   Weight   WMA
  Jan     100                    0.5      100
  Feb     120                    0.3      95
  Mar     150                    0.1      102.67
  Apr     180                    0.05     110.47
  May     200                    0.05     120

  WMA for May = (100 × 0.5) + (120 × 0.3) + (150 × 0.1) + (180 × 0.05) + (200 × 0.05)
              = 50 + 36 + 15 + 9 + 10 = 120
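The May calculation above can be sketched as follows (the function name is illustrative; dividing by the weight total means the formula also works when the weights do not sum to 1):

```python
def weighted_moving_average(values, weights):
    """Weighted moving average: sum of weight-times-value, divided by
    the total weight (which is 1.0 when the weights already sum to 1)."""
    if len(values) != len(weights):
        raise ValueError("values and weights must have the same length")
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Jan-May sales and the weights from the table above
sales = [100, 120, 150, 180, 200]
weights = [0.5, 0.3, 0.1, 0.05, 0.05]
print(weighted_moving_average(sales, weights))  # matches the table: 120
```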
Error Metrics and Forecast Accuracy
• The quality of a forecast depends on how accurately it predicts future values of a time series.
• In the simple moving average and smoothing models, different parameter choices (such as the number of periods averaged) produce different forecasts.
• To analyze the effectiveness of different forecasting models, we can define error metrics, which compare the forecasts quantitatively with the actual observations.
• Three commonly used metrics are:
  - Mean Absolute Deviation (MAD),
  - Mean Square Error (MSE), and
  - Mean Absolute Percentage Error (MAPE).
Mean Absolute Deviation
• The mean absolute deviation (MAD) is the absolute difference between the actual value and the forecast, averaged over a range of forecast values:

  MAD = Σ |A_t − F_t| / n

  where A_t = actual value of the time series at time t, F_t = forecast value for time t, and n = number of forecast values (not the number of data points, since we do not have a forecast value for the first k data points).
• MAD provides a robust measure of error and is less affected by extreme observations.
Mean Square Error
• Mean square error (MSE) is probably the most commonly used error metric.
• It penalizes larger errors, because squaring larger numbers has a greater impact than squaring smaller numbers. The formula for MSE is:

  MSE = Σ (A_t − F_t)² / n

• Again, n represents the number of forecast values used in computing the average.
• The square root of MSE, called the root mean square error (RMSE), is also used:

  RMSE = √MSE

• RMSE is expressed in the same units as the data, allowing for more practical comparisons.
Mean Absolute Percentage Error
• The mean absolute percentage error (MAPE) is the average of the absolute errors divided by the actual observation values:

  MAPE = (Σ |A_t − F_t| / A_t) × 100 / n

• MAPE eliminates the measurement scale by dividing each absolute error by the time-series value, allowing a better relative comparison.
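The three metrics can be sketched directly from their definitions. The sample data pairs the actual sales with the 2-month SMA forecasts from the moving-averages example earlier; the function names are illustrative choices.

```python
import math

def mad(actual, forecast):
    """Mean absolute deviation: average of |A_t - F_t|."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    """Mean square error: average of (A_t - F_t)^2."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean square error: square root of MSE, in the data's own units."""
    return math.sqrt(mse(actual, forecast))

def mape(actual, forecast):
    """Mean absolute percentage error: average of |A_t - F_t| / A_t, in %."""
    return 100 * sum(abs(a - f) / a
                     for a, f in zip(actual, forecast)) / len(actual)

# Mar-Jul actual sales vs. the 2-month SMA forecasts from the earlier example
actual   = [110, 130, 140, 120, 110]
forecast = [110, 115, 120, 135, 130]
print(mad(actual, forecast))              # 14.0
print(mse(actual, forecast))              # 250.0
print(round(rmse(actual, forecast), 2))   # 15.81
print(round(mape(actual, forecast), 1))   # 11.3
```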
D. Trend Line
• This technique fits a trend line to a series of historical data points and then projects the line into the future for medium- to long-range forecasts.
• If the time series exhibits an increasing or decreasing trend, trend analysis is appropriate.
• A trend line defines the relationship between the forecast value and the time period by the equation:

  ŷ = a + bX

  where ŷ is the computed value of the variable to be predicted (the dependent variable) and X is the time period.
• X is the independent variable and Y is the dependent variable, since the forecast value depends on the time period.
• The least-squares method is used to develop the trend line, where:
  - a = Y-axis intercept,
  - b = slope of the regression line (the rate of change in Y for a unit change in X), and
  - X = the independent variable (in this case, time).
Now we calculate the values of the constants a and b as:

  b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)² = Cov(X, Y) / Var(X)

and

  a = Ȳ − b·X̄
• In the equation Y = a + bX, a is the intercept on the Y-axis; it gives the value of Y when X = 0.
• The slope of the line is b, which gives the change in the value of the dependent variable Y for a unit change in X.
• The INTERCEPT and SLOPE functions in Excel can be used to calculate a and b, respectively.
Example: Consider the demand data given in the table below and project the trend line.

  Time: Independent Variable (X)    1   2   3   4   5   6   7   8   9   10
  Demand: Dependent Variable (Y)    9  15  32  48  52  60  39  65  90   93
Solution (manual method):
• For any given time period, the difference between the forecast (values on the dashed line) and the actual demand (values on the zigzag line) gives the error in that period.
• The trend analysis method minimizes the sum of the squares of these errors in calculating the values of a and b.
• The Excel functions give b = 8.65 and a = 2.73.
• Use them in the equation Y = a + bX to make a forecast.
• For example, for period 11 (X = 11): Forecast = 2.73 + 8.65 × 11 = 97.87.
• Similarly, for period 12: Forecast = 2.73 + 8.65 × 12 = 106.52.
• (The results quoted here use the unrounded values of a and b.)
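The coefficients in the example above can be verified with a short least-squares sketch (the helper name is illustrative; the arithmetic mirrors what Excel's SLOPE and INTERCEPT compute):

```python
def least_squares(x, y):
    """Fit y = a + b*x by least squares; returns (a, b)."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
         / sum((xi - x_bar) ** 2 for xi in x))
    a = y_bar - b * x_bar
    return a, b

periods = list(range(1, 11))                      # X = 1..10
demand = [9, 15, 32, 48, 52, 60, 39, 65, 90, 93]  # Y from the table
a, b = least_squares(periods, demand)
print(round(a, 2), round(b, 2))   # 2.73 8.65
print(round(a + b * 11, 2))       # period-11 forecast: 97.87
print(round(a + b * 12, 2))       # period-12 forecast: 106.52
```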
3. Explanatory/Causal Methods
Explanatory/causal models, often called econometric models, seek to identify factors that statistically explain the patterns observed in the variable being forecast, usually via regression analysis.
Regression Analysis
• Regression analysis establishes a relationship between two sets of numbers that are time series.
• For example, when a series of Y values (such as the monthly sales of cameras over a period of years) is causally connected with a series of X values (the monthly advertising budget), it is beneficial to establish a relationship between X and Y in order to forecast Y.
• In regression analysis, X is the independent variable and Y is the dependent variable.
• Regression analysis gives the relationship between X and Y by the equation:

  Y = a + bX

  where a is the intercept on the Y-axis (the value of Y when X = 0), and b is the slope of the line, which gives the change in Y for a unit change in X.
• The INTERCEPT function in Excel calculates a, and the SLOPE function finds b.
Example: Use the data given in the following table for ten pairs of X and Y.

  Observation Number          1    2    3    4    5    6    7    8    9   10
  Independent Variable (X)   10   12   11    9   10   12   10   13   14   12
  Dependent Variable (Y)    400  600  700  500  800  700  500  700  800  600

• The Excel functions give b = 50.23 and a = 62.44.
• Use them in the equation Y = a + bX to forecast.
• Suppose X = 15; then Forecast = 62.44 + 50.23 × 15 = 815.84 (using the unrounded values of a and b).
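The same least-squares arithmetic reproduces this example's coefficients (the helper name is an illustrative choice; it mirrors Excel's INTERCEPT and SLOPE):

```python
def fit_line(x, y):
    """Least-squares fit of Y = a + b*X; returns (a, b)."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
         / sum((xi - x_bar) ** 2 for xi in x))
    return y_bar - b * x_bar, b

x = [10, 12, 11, 9, 10, 12, 10, 13, 14, 12]
y = [400, 600, 700, 500, 800, 700, 500, 700, 800, 600]
a, b = fit_line(x, y)
print(round(a, 2), round(b, 2))   # 62.44 50.23
print(round(a + b * 15, 2))       # forecast at X = 15: 815.84
```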
The forecasts (values on the straight line) and the actual demand data points are plotted in the figure below.

[Figure: actual demand vs. fitted regression line]
• For any given time period, the difference between the forecast value and the actual demand gives the error in that period.
• Regression analysis minimizes the sum of the squares of these errors in calculating the values of a and b.
• An assumption generally made in regression analysis is that the relationship between the correlated pairs is linear.
• However, if nonlinear relations are hypothesized, there are strong but more complex methods for performing nonlinear regression analyses.
Coefficient of Determination
• The coefficient of determination (r²), where r is the coefficient of correlation, measures the proportion of the variability in the dependent variable that is accounted for by the regression line.
• The coefficient of determination always falls between 0 and 1.
• For example, if r = 0.8, then r² = 0.64, meaning that 64% of the variation in Y is explained by variation in X.
• The remaining 36% of the variation in Y is due to other variables.
• If the coefficient of determination is low, multiple regression may be used to account for additional variables affecting the dependent variable Y.
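Under the assumption of a simple linear fit, r² can be computed directly from its definition. The sketch below applies it to the ten X-Y pairs from the regression example (the function name is illustrative):

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x:
    r^2 = Sxy^2 / (Sxx * Syy), i.e., the squared correlation coefficient."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    syy = sum((yi - y_bar) ** 2 for yi in y)
    return sxy ** 2 / (sxx * syy)

x = [10, 12, 11, 9, 10, 12, 10, 13, 14, 12]
y = [400, 600, 700, 500, 800, 700, 500, 700, 800, 600]
print(round(r_squared(x, y), 2))  # 0.35 -- only ~35% of the variation
                                  # in Y is explained by X here
```

With r² this low, the multiple-regression extension mentioned above would be worth considering for this data set.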