
Simple linear regression determines how one variable of interest (the response variable) is affected by changes in another variable (the explanatory variable). The terms "response" and "explanatory" mean the same thing as "dependent" and "independent", but the former terminology is preferred because the "independent" variable may actually be interdependent with many other variables as well.

Any line fitted through a cloud of data will deviate from each data point to a greater or lesser degree. The vertical distance between a data point and the fitted line is termed a "residual". This distance is a measure of prediction error, in the sense that it is the discrepancy between the actual value of the response variable and the value predicted by the line. Linear regression determines the best-fit line through a scatterplot of data such that the sum of squared residuals is minimized; equivalently, it minimizes the error variance. The fit is "best" in precisely that sense: the sum of squared errors is as small as possible. That is why it is also termed "Ordinary Least Squares" regression.

Suppose we believe that the value of y tends to increase or decrease in a linear manner as x increases. Then we could select a model relating y to x by drawing a line that is well fitted to a given data set. Such a deterministic model – one that does not allow for errors of prediction – might be adequate if all of the data points fell on the fitted line. However, you can see that this idealistic situation will not occur.

The solution to the preceding problem is to construct a probabilistic model relating y to x, one that acknowledges the random variation of the data points about a line. One type of probabilistic model, the simple linear regression model, makes the assumption that the mean value of y for a given value of x graphs as a straight line, and that points deviate about this line of means by a random amount equal to e, i.e.

y = A + B x + e,

where A and B are unknown parameters of the deterministic (nonrandom) portion of the model.

If we suppose that the points deviate above or below the line of means with expected value E(e) = 0, then the mean value of y is

y = A + B x.

Therefore, the mean value of y for a given value of x, represented by the symbol E(y), graphs as a straight line with y-intercept A and slope B.
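As a quick illustration, a small simulation in Python shows that when E(e) = 0 the average of y = A + Bx + e at a fixed x settles near the line of means A + Bx. The values of A, B, x and the noise scale below are made-up choices for the sketch, not from the text:

```python
# Made-up simulation (A, B, x and the noise scale are illustrative choices)
# showing that with E(e) = 0 the average of y = A + B*x + e at a fixed x
# settles near the line of means A + B*x.
import random

random.seed(0)                      # for reproducibility
A, B, x = 2.0, 1.5, 4.0
samples = [A + B * x + random.gauss(0, 1) for _ in range(100_000)]
mean_y = sum(samples) / len(samples)
print(round(mean_y, 1))  # close to A + B*x = 8.0
```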

- 1. By Srikavya, Ravi Teja, Arun Nayak, Karthik, Amit Nayak
- 2. Forecasting lays the groundwork for reducing risk in all decision making, because many decisions need to be made under uncertainty. In business applications, forecasting serves as a starting point for major decisions in finance, marketing, production, and purchasing.
- 3. Forecasting is a method or technique for estimating many future aspects of a business or other operation. It is used in the practice of customer demand planning, in everyday business forecasting for manufacturing companies.
- 4. There are three types of forecasting: 1. Qualitative or judgmental methods; 2. Extrapolative or time series methods; 3. Causal or explanatory methods.
- 5. A time series is a sequence of data points, measured typically at successive points in time spaced at uniform time intervals. Time series forecasting is the use of a model to predict future values based on previously observed values; measurements are taken at successive points in time or over successive periods.
- 6. Components of a time series: trend component, seasonal component, cyclical component, and random component.
- 7. Trend component: a long-run increase or decrease over time (an overall upward or downward movement), observed in data taken over a long period of time. (Chart: sales over time showing an upward trend.)
- 8. A trend can be upward or downward, and linear or non-linear. (Charts: sales over time showing a downward linear trend and an upward non-linear trend.)
- 9. Seasonal component: variation dependent on the time of year; each year shows the same pattern, often monthly or quarterly. (Chart: quarterly sales across winter, spring, summer and fall.)
- 10. Cyclical component: up-and-down movement repeating over a long time frame. Cycles occur regularly but may vary in length, and each year does not show the same pattern. (Chart: sales over one multi-year cycle.)
- 11. Random component: unpredictable, random, "residual" fluctuations due to random variations of nature, accidents or unusual events; the "noise" in the time series.
- 12. Moving average method: the average of demands occurring in several of the most recent periods. Weighted moving average: allows for varying weighting of old demands. Exponential smoothing: exponentially decreases the weighting of old demands. Linear regression method.
- 13. The simple moving average smooths past data by arithmetically averaging over a specified period and projecting forward in time. It is normally considered a smoothing algorithm and has poor forecasting results in most cases. A moving average is commonly used with time series data to smooth out short-term fluctuations and highlight longer-term trends or cycles.
- 14. Question: What are the 3-week and 6-week moving average forecasts for demand? Assume you only have 3 weeks and 6 weeks of actual demand data for the respective forecasts. The n-period moving average forecast is Ft = (At-1 + At-2 + At-3 + ... + At-n) / n. Demand data: Week 1: 650, Week 2: 678, Week 3: 720, Week 4: 785, Week 5: 859, Week 6: 920, Week 7: 850, Week 8: 758, Week 9: 892, Week 10: 920, Week 11: 789, Week 12: 844.
- 15. Calculating the moving averages gives us:
      Week  Demand  3-Week  6-Week
      1     650
      2     678
      3     720
      4     785     682.67
      5     859     727.67
      6     920     788.00
      7     850     854.67  768.67
      8     758     876.33  802.00
      9     892     842.67  815.33
      10    920     833.33  844.00
      11    789     856.67  866.50
      12    844     867.00  854.83
      For example, F4 = (650 + 678 + 720)/3 = 682.67 and F7 = (650 + 678 + 720 + 785 + 859 + 920)/6 = 768.67.
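The moving-average forecasts above can be sketched in Python (the function name is an illustrative choice, not from the slides):

```python
# Minimal sketch of the n-period simple moving average forecast.
def moving_average_forecast(history, n):
    """Forecast for the next period: mean of the last n actual demands."""
    if len(history) < n:
        raise ValueError("not enough history")
    return sum(history[-n:]) / n

demand = [650, 678, 720, 785, 859, 920]

f4 = moving_average_forecast(demand[:3], 3)  # 3-week forecast for week 4
f7 = moving_average_forecast(demand[:6], 6)  # 6-week forecast for week 7
print(round(f4, 2), round(f7, 2))  # 682.67 768.67
```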
- 16. While the moving average formula implies an equal weight being placed on each value that is being averaged, the weighted moving average permits an unequal weighting on prior time periods. The formula for the weighted moving average is Ft = w1·At-1 + w2·At-2 + w3·At-3 + ... + wn·At-n, where wt is the weight given to the time period "t" occurrence and the weights must add to one: Σwi = 1 (i = 1, ..., n).
- 17. Question: Given the weekly demand and weights, what is the forecast for the 4th period (Week 4)? Demand: Week 1: 650, Week 2: 678, Week 3: 720. Weights: t-1: 0.5, t-2: 0.3, t-3: 0.2. Note that the weights place more emphasis on the most recent data, that is, time period "t-1".
- 18. Answer: F4 = 0.5(720) + 0.3(678) + 0.2(650) = 693.4, so the forecast for Week 4 is 693.4.
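The weighted moving average calculation can be sketched in Python (function and variable names are illustrative):

```python
# Minimal sketch of the weighted moving average; weights are listed
# most-recent-period first and must sum to one.
def weighted_moving_average(history, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    # Pair the most recent demand with the first weight (the one for t-1).
    return sum(w * a for w, a in zip(weights, reversed(history)))

demand = [650, 678, 720]        # weeks 1-3
weights = [0.5, 0.3, 0.2]       # weights for t-1, t-2, t-3
f4 = weighted_moving_average(demand, weights)
print(round(f4, 1))  # 693.4
```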
- 19. A drawback of the previous models is carrying a large amount of data. As new data is added, the oldest observation is dropped and a new forecast is calculated. In many applications the most recent occurrences are more indicative of the future than those in the more distant past. If this premise is valid, then exponential smoothing may be the most logical method to use. It is widely used in retail firms, wholesale companies and service agencies.
- 20. Exponential smoothing is a technique that can be applied to time series data, either to produce smoothed data for presentation, or to make forecasts. Exponential smoothing methods give larger weights to more recent observations, and the weights decrease exponentially as the observations become more distant.
- 21. Premise: the most recent observations might have the highest predictive value, so we should give more weight to the more recent time periods when forecasting: Ft = Ft-1 + α(At-1 - Ft-1), where Ft is the forecast value for the coming time period t, Ft-1 is the forecast value in the past time period, At-1 is the actual occurrence in the past time period, and α is the alpha smoothing constant.
- 22. Question: What are the exponential smoothing forecasts for periods 2-5 using α = 0.5? Assume F1 = D1. Demand: Week 1: 820, Week 2: 775, Week 3: 680, Week 4: 655.
- 23. Week  Demand  Forecast (α = 0.5)
      1     820     820.00
      2     775     820.00
      3     680     797.50
      4     655     738.75
      5             696.88
      For example, F2 = 820 + (0.5)(820 - 820) = 820 and F3 = 820 + (0.5)(775 - 820) = 797.50.
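The smoothing recursion Ft = Ft-1 + α(At-1 - Ft-1), seeded with F1 = D1 as in the example, can be sketched in Python (the function name is an illustrative choice):

```python
# Minimal sketch of simple exponential smoothing,
# F_t = F_(t-1) + alpha * (A_(t-1) - F_(t-1)), seeded with F1 = D1.
def exponential_smoothing(demand, alpha):
    forecasts = [demand[0]]            # F1 = D1
    for actual in demand:
        prev = forecasts[-1]
        forecasts.append(prev + alpha * (actual - prev))
    return forecasts                   # [F1, F2, ..., F(n+1)]

f = exponential_smoothing([820, 775, 680, 655], alpha=0.5)
print(f)  # [820, 820.0, 797.5, 738.75, 696.875]
```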
- 24. Simple linear regression is the most commonly used technique for determining how one variable of interest is affected by changes in another variable. It is used for three main purposes: 1. to describe the linear dependence of one variable on another; 2. to predict values of one variable from values of another, for which more data are available; 3. to correct for the linear dependence of one variable on another, in order to clarify other features of its variability.
- 25. The simple linear regression model seeks to fit a line through various data over time: Yt = a + bx, where Yt is the regressed forecast value (the dependent variable in the model), a is the intercept of the regression line, and b is the slope of the regression line. (Chart: the fitted line plotted against x, i.e. time.)
- 26. The slope and intercept are computed as b = (Σxy - n(ȳ)(x̄)) / (Σx² - n(x̄²)) and a = ȳ - b(x̄), where x̄ and ȳ are the means of x and y and n is the number of observations.
- 27. Question: Given the data below, what is the simple linear regression model that can be used to predict sales in future weeks? Sales: Week 1: 150, Week 2: 157, Week 3: 162, Week 4: 166, Week 5: 177.
- 28. Answer: First, using the linear regression formulas, we can compute "a" and "b":
      Week  Week*Week  Sales  Week*Sales
      1     1          150    150
      2     4          157    314
      3     9          162    486
      4     16         166    664
      5     25         177    885
      Average: 3   Sum: 55   Average: 162.4   Sum: 2499
      b = (Σxy - n(ȳ)(x̄)) / (Σx² - n(x̄²)) = (2499 - 5(162.4)(3)) / (55 - 5(9)) = 63 / 10 = 6.3
      a = ȳ - b(x̄) = 162.4 - (6.3)(3) = 143.5
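The slope and intercept formulas can be sketched in Python and checked against the worked example (function and variable names are illustrative):

```python
# Minimal sketch of the least-squares formulas used on the slide:
# b = (sum(xy) - n*xbar*ybar) / (sum(x^2) - n*xbar^2) and a = ybar - b*xbar.
def fit_line(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum(xi * yi for xi, yi in zip(x, y)) - n * xbar * ybar) / \
        (sum(xi * xi for xi in x) - n * xbar * xbar)
    a = ybar - b * xbar
    return a, b

weeks = [1, 2, 3, 4, 5]
sales = [150, 157, 162, 166, 177]
a, b = fit_line(weeks, sales)
print(round(a, 1), round(b, 1))  # 143.5 6.3
print(round(a + b * 6, 1))       # forecast for week 6: 181.3
```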
- 29. The resulting regression model is Yt = 143.5 + 6.3x. Plotting the regression-generated forecasts against the actual sales shows the fitted line tracking the data. (Chart: sales and sales forecast over periods 1-5.)
