This document summarizes a study that evaluates the forecast rationality of food price inflation in Pakistan using quarterly data from 1975 to 2008. VAR models were used to generate forecasts of food price inflation and the consumer price index. Four accuracy metrics (RMSE, MAE, MAPE, TIC) were calculated to select the best forecasts, which were then evaluated with tests of weak, sufficient, strong, and strict rationality. Weak rationality requires unbiasedness and weak efficiency; sufficient rationality requires passing orthogonality tests; strong rationality requires conditional efficiency, assessed by comparing individual forecasts to forecast combinations; strict rationality requires passing the strong-rationality tests in sub-periods. The results of these tests for the food price forecasts are then reported.
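The four accuracy metrics named above are standard and easy to compute. A minimal Python sketch (the series below are illustrative, not the study's data):

```python
import math

def rmse(actual, forecast):
    # Root mean squared error
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mae(actual, forecast):
    # Mean absolute error
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    # Mean absolute percentage error (undefined if any actual value is zero)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def tic(actual, forecast):
    # Theil inequality coefficient: 0 for a perfect forecast, bounded by 1
    num = math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))
    den = (math.sqrt(sum(f * f for f in forecast) / len(forecast))
           + math.sqrt(sum(a * a for a in actual) / len(actual)))
    return num / den

actual = [2.0, 4.0, 6.0]    # illustrative observed values
forecast = [2.5, 3.5, 6.5]  # illustrative model forecasts
print(rmse(actual, forecast), mae(actual, forecast), mape(actual, forecast), tic(actual, forecast))
```

Whichever candidate forecast minimizes these metrics (they need not agree) is carried forward to the rationality tests.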
This document summarizes a study that evaluates the rationality of ARIMA forecasts for several economic indicators in Pakistan using annual data from 1975 to 2008. The study develops ARIMA models to forecast food price inflation, consumer price index, GDP per capita, and money supply. It then assesses the forecasts based on criteria for weak, sufficient, strong, and strict rationality proposed by previous research. Food price inflation forecasts were found to meet the criteria for weak and strong rationality, indicating they are reliable for policymaking. The other forecasts did not fully meet the rationality criteria except for money supply.
Application of consistency and efficiency test for forecasts (Alexander Decker)
This document evaluates the forecasting efficiency of food price inflation, consumer price index, GDP per capita, and money supply data from Pakistan from 1975 to 2008. It uses ARIMA models to generate forecasts, which are then evaluated for consistency and efficiency. Consistency tests whether the actual and forecasted values are cointegrated and have the same order of integration. Efficiency tests examine whether forecasts minimize forecast errors and fully incorporate available information. The study finds that food price forecasts are consistent and efficient based on these criteria.
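One common way to operationalize the unbiasedness side of such efficiency tests is a Mincer-Zarnowitz regression of actuals on forecasts, checking that the intercept is near 0 and the slope near 1. This is a generic sketch of that idea, not the study's exact procedure, and the data are invented:

```python
def mincer_zarnowitz(actual, forecast):
    # OLS of actual_t = a + b * forecast_t + e_t; an unbiased, efficient
    # forecast should give intercept a ~ 0 and slope b ~ 1.
    n = len(actual)
    mf = sum(forecast) / n
    ma = sum(actual) / n
    cov = sum((f - mf) * (a - ma) for f, a in zip(forecast, actual)) / n
    var = sum((f - mf) ** 2 for f in forecast) / n
    slope = cov / var
    intercept = ma - slope * mf
    return intercept, slope

forecast = [1.0, 2.0, 3.0, 4.0, 5.0]  # illustrative forecasts
actual = [1.1, 1.9, 3.2, 3.8, 5.0]    # illustrative outcomes
a, b = mincer_zarnowitz(actual, forecast)
print(a, b)  # close to (0, 1) for a nearly unbiased forecast
```

In practice one would also test the joint hypothesis (a, b) = (0, 1) formally rather than eyeballing the estimates.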
This document presents a novel approach for combining individual realized volatility measures to form new estimators of asset price variability. It analyzes 30 different realized measures estimated from high frequency IBM stock price data from 1996-2007. It finds that a simple equally-weighted average of the realized measures is not outperformed by any individual measure and that combining measures provides benefits by incorporating information from different estimators. Optimal linear and multiplicative combination estimators are estimated and none of the individual measures are found to encompass all the information in other measures, further supporting the use of combination estimators.
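The equally weighted combination the paper favors is trivial to form: average the measures point by point. A toy sketch in which the 30 IBM realized measures are replaced by three made-up series:

```python
def equal_weight_combine(measures):
    # measures: list of equally long series of realized-variance estimates.
    # Returns the pointwise simple average, i.e. the 1/N combination.
    n = len(measures)
    return [sum(vals) / n for vals in zip(*measures)]

rv_a = [1.0, 1.2, 0.9]  # hypothetical realized kernel series
rv_b = [1.1, 1.0, 1.0]  # hypothetical subsampled RV series
rv_c = [0.9, 1.1, 1.1]  # hypothetical bipower variation series
print(equal_weight_combine([rv_a, rv_b, rv_c]))  # averages near [1.0, 1.1, 1.0]
```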
This document describes a factor analysis conducted on survey data from 12 individuals evaluating news anchors on 6 scales measuring credibility. The factor analysis identified 2 underlying factors accounting for most of the variance in the data: Resourcefulness (knowledge, intelligence, believability) and External Appearances (likeability, attractiveness, appearance). Varimax rotation was used to interpret the 2 factors, which provide a simplified way for the news director to develop strategies around anchor qualities compared to using all 6 individual scales.
This document outlines the methodology used to analyze the relationship between company performance and risk disclosure in annual reports. Performance is measured as the change in net income and stock price. Risk disclosure is measured using proxies like number of pages/words about risk and number of times "risk" is mentioned. The study examines 24 Dutch AEX companies' annual reports from 2006-2007. It tests if 1) change in performance correlates with change in risk disclosure and 2) stability of performance correlates with risk disclosure. Multiple steps are taken, including calculating correlations both with and without financial companies.
This document analyzes demand forecasting methods for four pharmaceutical products. Four forecasting methods - naive, cumulative mean, simple moving average, and exponential smoothing - were evaluated based on mean error, mean absolute percentage error, and mean squared error. Visual Basic for Applications was used to optimize parameters for simple moving average and exponential smoothing. The best method for each product was determined to be the one with the lowest mean squared error. Forecasts and 90% confidence intervals are presented for next-month demand.
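The four methods and the MSE-based selection can be sketched in a few lines of Python. The demand figures, smoothing constant, and window below are assumptions for illustration, not the paper's values:

```python
def naive(history):
    # Forecast = last observed value
    return history[-1]

def cumulative_mean(history):
    # Forecast = mean of all observations so far
    return sum(history) / len(history)

def sma(history, window=3):
    # Simple moving average over the last `window` observations
    w = history[-window:]
    return sum(w) / len(w)

def ses(history, alpha=0.3):
    # Simple exponential smoothing, initialized at the first observation
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def one_step_mse(series, method):
    # Mean squared error of one-step-ahead forecasts over the series
    errs = [(series[t] - method(series[:t])) ** 2 for t in range(1, len(series))]
    return sum(errs) / len(errs)

demand = [10, 12, 11, 13, 12, 14]  # hypothetical monthly demand for one product
methods = {"naive": naive, "cumulative mean": cumulative_mean, "SMA(3)": sma, "SES(0.3)": ses}
best = min(methods, key=lambda name: one_step_mse(demand, methods[name]))
print(best)
```

The paper additionally tunes the SMA window and SES alpha via VBA; here they are fixed for brevity.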
On Confidence Intervals Construction for Measurement System Capability Indica... (IRJESJOURNAL)
Abstract: There are many criteria that have been proposed to determine the capability of a measurement system, all based on estimates of variance components. Some of them are the Precision to Tolerance Ratio, the Signal to Noise Ratio and the probabilities of misclassification. For most of these indicators, there are no exact confidence intervals, since the exact distributions of the point estimators are not known. In such situations, two approaches are widely used to obtain approximate confidence intervals: the Modified Large Samples (MLS) methods initially proposed by Graybill and Wang, and the construction of Generalized Confidence Intervals (GCI) introduced by Weerahandi. In this work we focus on the construction of the confidence intervals by the generalized approach in the context of Gauge repeatability and reproducibility studies. Since GCI are obtained by simulation procedures, we analyze the effect of the number of simulations on the variability of the confidence limits as well as the effect of the size of the experiment designed to collect data on the precision of the estimates. Both studies allowed deriving some practical implementation guidelines in the use of the GCI approach. We finally present a real case study in which this technique was applied to evaluate the capability of a destructive measurement system.
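To show the simulation idea behind GCIs in the simplest possible setting, here is a generalized pivotal quantity for a single normal variance. A real gauge R&R study would build analogous quantities for each variance component; the sample data, seed, and number of draws below are arbitrary:

```python
import random

def gci_variance(data, n_draws=4000, level=0.95, seed=42):
    # Generalized confidence interval for sigma^2 of a normal sample.
    # GPQ: (n-1) * s^2 / W, where W ~ chi-square(n-1) is simulated.
    rng = random.Random(seed)
    n = len(data)
    mean = sum(data) / n
    s2 = sum((x - mean) ** 2 for x in data) / (n - 1)  # sample variance
    draws = []
    for _ in range(n_draws):
        # A chi-square(n-1) draw as a sum of squared standard normals
        w = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n - 1))
        draws.append((n - 1) * s2 / w)
    draws.sort()
    lo = draws[int(n_draws * (1 - level) / 2)]
    hi = draws[int(n_draws * (1 + level) / 2) - 1]
    return lo, hi

sample = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]  # hypothetical measurements
print(gci_variance(sample))
```

The abstract's point about the number of simulations is visible here: the interval endpoints are themselves random, and their variability shrinks as `n_draws` grows.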
This document discusses an integrated model for sensitivity analysis and scenario analysis using breakeven analysis for operational and investment risk analysis. It was developed by Prof. Sreedhara Ramesh Chandra and Dr. Krishna Banana. The model aims to address limitations in existing sensitivity, scenario, and breakeven analysis models by integrating the three approaches. It introduces proportions and percentages to more precisely determine variable values. It also establishes relationships between scenario values and measures sensitivity through changes from a predetermined relational constant value (sales revenue). The model allows consideration of all cash flow determinants and provides a direct link between operational and investment risk measurements to improve investment decisions.
This document summarizes a journal article about modeling economic growth under uncertainty. It introduces a one-sector stochastic growth model where production depends on capital, labor, and a random variable. It maximizes the expected discounted utility of consumption to determine optimal policies. The model generalizes previous deterministic growth models by incorporating uncertainty. It analyzes the long-run properties of the stochastic growth process, showing properties like the existence of a unique stationary distribution are analogous to the steady state in deterministic models. The techniques used differ from previous work and help unify different approaches to modeling growth under uncertainty.
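In standard notation, a Brock-Mirman-style one-sector formulation of the kind the article describes (this is the textbook form, not copied from the article) is:

```latex
\max_{\{c_t\}} \; \mathbb{E}\!\left[\sum_{t=0}^{\infty} \beta^{t}\, u(c_t)\right]
\quad \text{subject to} \quad
y_{t+1} = f(y_t - c_t,\, r_{t+1}), \qquad 0 \le c_t \le y_t,
```

where $y_t$ is output available at $t$, $c_t$ is consumption, $y_t - c_t$ is capital carried forward, $r_t$ is the random shock, and $\beta \in (0,1)$ is the discount factor. The associated Bellman equation is

```latex
V(y) \;=\; \max_{0 \le c \le y}
\Big\{\, u(c) + \beta\, \mathbb{E}\big[\, V\!\big(f(y - c,\, r)\big) \,\big] \Big\}.
```

The unique stationary distribution mentioned in the summary plays the role that the steady-state capital stock plays in the deterministic model.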
Statistics is the systematic collection, organization, analysis, and interpretation of data. It plays an important role in decision making by helping extract meaningful information from raw data. There are two main types of statistics - descriptive statistics which summarizes and presents data, and inferential statistics which makes inferences, tests hypotheses, and determines relationships in the data. Statistics has many applications in fields like business, medicine, economics and more. It helps simplify complex data, enable comparisons, identify trends, and aid decision making. Common statistical terms include population, sample, variables, attributes, and parameters. Data can be collected through various methods including direct observation, interviews, questionnaires, and more.
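The descriptive side of the distinction above is covered directly by Python's standard library; a tiny example with made-up exam scores:

```python
import statistics

scores = [72, 85, 90, 66, 78, 85, 94]  # hypothetical sample data

print(statistics.mean(scores))    # central tendency
print(statistics.median(scores))  # robust center
print(statistics.stdev(scores))   # sample dispersion
```

Inferential statistics would go further, e.g. treating `scores` as a sample and testing a hypothesis about the population mean.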
Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods (IJERA Editor)
This paper focuses on the analysis of forecasting sales using quantitative and qualitative methods. The forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should indicate whether goals have been met with respect to measures, targets, and initiatives.
This document discusses estimating stochastic relative risk aversion from interest rates. It first introduces a model for deriving relative risk aversion from interest rates using a time inhomogeneous single factor short rate model. It then details the estimation methodology used, which calibrates the model to US LIBOR data to estimate a time series for the market price of risk and ex-ante bond Sharpe ratio. This allows deducing a stochastic process for relative risk aversion under a power utility function. Estimated mean relative risk aversion is 49.89. The document then introduces modifying a Real Business Cycle model to allow time-varying relative risk aversion, finding it better matches empirical consumption volatility than a baseline model.
Sensitivity analysis is the study of how uncertainty in the inputs of a mathematical model propagates to uncertainty in the model's outputs. It is useful for understanding relationships between inputs and outputs, identifying important inputs, and reducing uncertainty. Sensitivity analysis typically involves running the model many times while varying inputs, and calculating sensitivity measures from the resulting outputs to determine which inputs most influence uncertainty in the outputs. Common methods include variance-based approaches and screening methods.
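A minimal one-at-a-time (OAT) screening sketch of the idea above, using a made-up model whose coefficients are pure assumptions:

```python
def model(a, b, c):
    # Hypothetical model; in practice this is the simulation being studied.
    return 10 * a + 2 * b + 0.5 * c

baseline = {"a": 0.5, "b": 0.5, "c": 0.5}
bounds = {"a": (0.0, 1.0), "b": (0.0, 1.0), "c": (0.0, 1.0)}

def oat_effects(model, baseline, bounds):
    # Swing each input across its range with the others held at baseline;
    # the resulting output swing is a crude importance measure.
    effects = {}
    for name, (lo, hi) in bounds.items():
        args_lo = dict(baseline); args_lo[name] = lo
        args_hi = dict(baseline); args_hi[name] = hi
        effects[name] = abs(model(**args_hi) - model(**args_lo))
    return effects

effects = oat_effects(model, baseline, bounds)
print(max(effects, key=effects.get))  # the most influential input
```

OAT screening misses interactions between inputs, which is exactly why the variance-based methods mentioned above (e.g. Sobol indices) exist.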
To Do Macroeconomics, Do We Need Microeconomic Foundations? (PAD Ancash)
The document argues that macroeconomics does not need microeconomic foundations, and instead it is microeconomics that needs foundations built from neuroscience and biology. It challenges five common arguments for microfoundations in macroeconomics. Specifically, it argues that 1) the Lucas critique does not necessarily require microfoundations, 2) there are no policy-invariant micro principles, 3) new Keynesian models are not truly founded on first principles, 4) there are no well-established micro principles given limitations of current choice theory, and 5) aggregate behavior cannot be built up from individual behavior in a straightforward way.
This PowerPoint presentation was prepared as part of the course STAT 591, titled Master's Seminar, during the third semester of the MSc in Agricultural Statistics at Agricultural College, Bapatla, under ANGRAU, Andhra Pradesh.
The document discusses fundamentals of data analysis for marketing research. It covers preparing data for analysis through editing, coding, weighting and variable transformation. Simple tabulation, frequency distributions, descriptive statistics and cross tabulation techniques are presented. An overview of statistical techniques includes univariate, bivariate and multivariate methods. Factors influencing statistical technique choices and concepts of hypothesis testing like null hypotheses and significance levels are also summarized.
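Simple tabulation and cross tabulation of the kind described need nothing beyond the standard library; a sketch with invented coded survey records:

```python
from collections import Counter

# Hypothetical coded survey responses: (gender, preferred_brand)
responses = [("F", "A"), ("M", "B"), ("F", "A"), ("F", "B"), ("M", "A"), ("M", "B")]

# Frequency distribution of one variable (simple tabulation)
brand_freq = Counter(brand for _, brand in responses)

# Cross tabulation of the two variables
crosstab = Counter(responses)

print(brand_freq)
print(crosstab[("F", "A")])  # count of females preferring brand A
```

A chi-square test on such a crosstab is the usual next step when moving from description to hypothesis testing.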
10th Alex Marketing Club (Forecasting) by Dr. Haitham Maraei, 6 Jan 2018 (Mahmoud Bahgat)
Forecasting is an important part of marketing and business planning. There are many techniques for forecasting, including both qualitative and quantitative methods. Qualitative methods include surveys, expert opinions, and market experiments, while quantitative time series methods analyze past trends and patterns to predict the future. Effective forecasting requires understanding factors like demand trends, seasonality, elasticity, and uncertainty. The summary provides an overview of key concepts and challenges in forecasting for marketing and business.
This document is a thesis submitted by Jai Kedia for a degree in mathematics and business economics. It examines alternative risk measures to the traditional beta measure in predicting stock returns. The thesis provides an introduction and acknowledges the contributions of the advisors. It then presents an abstract that outlines the goal of analyzing if alternative risk measures such as higher moments, size, leverage, and price-to-book ratio can improve predictions of stock returns beyond just beta. Finally, it presents a table of contents that outlines the various chapters covering the return/risk relationship, modern portfolio theory, mathematical analysis of stock prices, a literature review on previous empirical studies, the empirical analysis conducted, and a conclusion.
The document summarizes a simulation study that examined the effects of using raw scores versus IRT-derived scores when operationalizing latent constructs in moderated multiple regression analyses. The study found that using raw scores can inflate Type 1 error rates for interaction terms under conditions of assessment inappropriateness. However, rescaling the scores using the Graded Response Model, a polytomous IRT model, mitigated these effects. The study supports the idea that IRT scores provide a more robust metric than raw scores in moderated regression analyses, especially under suboptimal assessment conditions.
Apoorva Javadekar - Conditional Correlations of Macro Variables and Implica... (Apoorva Javadekar)
This presentation by Apoorva Javadekar examines the structure of cross-country correlations of macro variables and their implications for asset pricing and risk sharing.
A dynamic Nelson-Siegel model with forward-looking indicators for the yield c... (FGV Brazil)
This paper proposes a Factor-Augmented Dynamic Nelson-Siegel (FADNS) model to predict the yield curve in the US that relies on a large data set of weekly financial and macroeconomic variables. The FADNS model significantly improves interest rate forecasts relative to extant models in the literature. For longer horizons, it beats autoregressive alternatives, with a reduction in mean absolute error of up to 40%. For shorter horizons, it is competitive with autoregressive forecasting models, outperforming them for the 7- and 10-year yields. The out-of-sample analysis shows that the good performance comes mostly from the forward-looking nature of the variables employed: including them reduces the mean absolute error by 5 basis points on average relative to models that reflect only past macroeconomic events.
Date: 2017-03
Authors:
Vieira, Fausto José Araújo
Chague, Fernando Daniel
Fernandes, Marcelo
This paper examines how disagreement among investors about macroeconomic factors affects stock returns. The author finds that following periods of high disagreement, stocks highly sensitive to macroeconomic factors ("high macro beta stocks") earn lower future returns compared to less sensitive stocks ("low macro beta stocks"). This suggests high macro beta stocks become overvalued during high disagreement periods due to investors' differing views. Regression analyses show macroeconomic factor risk premiums are negatively related to lagged macro disagreement measures. The findings support theories that macroeconomic factors price risk but also show how disagreement can lead to speculative stock prices.
A logistic regression was conducted to predict if homeowners would accept or decline a solar panel subsidy offer based on household income and monthly mortgage payment. The full model was a good fit to the data and correctly classified 83.3% of cases. While the predictors were not statistically significant individually, they distinguished between acceptors and decliners as a set. Other factors may provide a better fitting model.
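A self-contained sketch of such a two-predictor logistic regression, fit by plain gradient descent. The households, units (income in $10k, mortgage in $1k), and learning settings are all invented; the study's actual data and coefficients are not reproduced here:

```python
import math

# Hypothetical households: (income in $10k, monthly mortgage in $1k), accepted offer?
X = [(3.0, 2.0), (4.0, 1.0), (5.0, 2.0), (8.0, 1.0), (9.0, 2.0), (10.0, 1.0)]
y = [0, 0, 0, 1, 1, 1]

def fit_logistic(X, y, lr=0.5, iters=2000):
    # Stochastic gradient ascent on the logistic log-likelihood,
    # with features centered for stable convergence.
    means = [sum(col) / len(col) for col in zip(*X)]
    Xc = [[x - m for x, m in zip(row, means)] for row in X]
    w = [0.0] * len(means)
    b = 0.0
    for _ in range(iters):
        for row, label in zip(Xc, y):
            p = 1.0 / (1.0 + math.exp(-(b + sum(wi * xi for wi, xi in zip(w, row)))))
            g = label - p  # gradient of the log-likelihood w.r.t. the linear score
            b += lr * g
            w = [wi + lr * g * xi for wi, xi in zip(w, row)]
    return means, w, b

def predict(model, row):
    means, w, b = model
    z = b + sum(wi * (xi - m) for wi, xi, m in zip(w, row, means))
    return 1 if z > 0 else 0  # probability threshold of 0.5

model = fit_logistic(X, y)
accuracy = sum(predict(model, row) == label for row, label in zip(X, y)) / len(y)
print(accuracy)
```

The "83.3% correctly classified" figure in the summary is exactly this kind of in-sample classification rate, computed on the fitted model.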
This document discusses quantitative approaches to forecasting, including time series analysis and forecasting techniques. It covers the components of a time series, including trends, cycles, seasonality, and irregular components. Specific quantitative forecasting approaches covered include smoothing methods like moving averages, weighted moving averages, and exponential smoothing. Examples are provided to demonstrate how to perform moving averages and exponential smoothing on time series data for sales of headache medicine. The document aims to teach readers how to analyze time series data and select appropriate forecasting techniques.
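Of the smoothing methods listed, the weighted moving average is the easiest to illustrate compactly. This sketch uses invented sales figures for the headache-medicine example, and the weights are assumptions:

```python
def weighted_moving_average(series, weights=(0.5, 0.3, 0.2)):
    # Forecast next period as a weighted mean of the most recent values,
    # heaviest weight on the newest observation; weights must sum to 1.
    recent = series[-len(weights):]
    return sum(w * x for w, x in zip(weights, reversed(recent)))

sales = [110, 115, 125, 120, 125, 120]  # hypothetical weekly sales of headache medicine
print(weighted_moving_average(sales))
```

Setting all weights equal recovers the plain moving average, and letting the weights decay geometrically approaches exponential smoothing, which is why the document treats the three as one family.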
This document discusses operations management and forecasting. It explains that operations management deals with designing and managing processes, products, services and supply chains to deliver goods and services customers want. Forecasting helps managers reduce uncertainty by predicting future demand to match supply. The document then discusses various forecasting methods including qualitative judgmental methods and quantitative mathematical modeling methods. It covers short, medium and long-range forecasting as well as different time series and causal modeling techniques.
We use data on twins matched to register-based information on earnings to examine the longstanding puzzle of non-existent compensating wage differentials. The use of twin data allows us to remove otherwise unobserved productivity differences that were a prominent reason for estimation bias in earlier studies. Using twin differences, we find evidence of positive compensation for adverse working conditions in the labor market.
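The twin-differences idea reduces to regressing within-pair earnings differences on within-pair differences in working conditions, so the shared family and genetic component cancels. A toy sketch with fabricated pairs (not the study's register data):

```python
def twin_diff_slope(pairs):
    # pairs: (d_earnings, d_conditions) within each twin pair; differencing
    # removes the pair-specific productivity component.
    # Returns the no-intercept OLS slope of d_earnings on d_conditions.
    num = sum(de * dc for de, dc in pairs)
    den = sum(dc * dc for _, dc in pairs)
    return num / den

# Hypothetical pairs: (earnings gap, adverse-working-conditions gap between twins)
pairs = [(200.0, 1.0), (150.0, 1.0), (-180.0, -1.0), (90.0, 0.5), (-100.0, -0.5)]
print(twin_diff_slope(pairs))  # positive: compensation for adverse conditions
```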
This document discusses human impacts on the planet that have increased dramatically in recent centuries and can now be measured and observed from space. It describes how human activities like industrial fishing, waste disposal, and deforestation are negatively impacting oceans, atmosphere, and forests. Satellite images taken over time reveal changes to landscapes and ecosystems from human influences and show the need for more sustainable environmental practices.
This document describes a series of chemical assays for identifying different families of compounds in a sample of plant material. A preliminary chemical study was carried out using color reactions that make the chemical content visible. Compounds such as tannins, proanthocyanidins, reducing carbohydrates, anthocyanins, betacyanins, saponins, and flavonoids were identified.
The document addresses the importance of the sociocultural phenomenon that cooperativism represents in all its forms, and above all its significance as one of the most effective means of improving and democratizing economic processes, based on the personal effort and mutual aid of cooperative members.
This document summarizes a journal article about modeling economic growth under uncertainty. It introduces a one-sector stochastic growth model where production depends on capital, labor, and a random variable. It maximizes the expected discounted utility of consumption to determine optimal policies. The model generalizes previous deterministic growth models by incorporating uncertainty. It analyzes the long-run properties of the stochastic growth process, showing properties like the existence of a unique stationary distribution are analogous to the steady state in deterministic models. The techniques used differ from previous work and help unify different approaches to modeling growth under uncertainty.
Statistics is the systematic collection, organization, analysis, and interpretation of data. It plays an important role in decision making by helping extract meaningful information from raw data. There are two main types of statistics - descriptive statistics which summarizes and presents data, and inferential statistics which makes inferences, tests hypotheses, and determines relationships in the data. Statistics has many applications in fields like business, medicine, economics and more. It helps simplify complex data, enable comparisons, identify trends, and aid decision making. Common statistical terms include population, sample, variables, attributes, and parameters. Data can be collected through various methods including direct observation, interviews, questionnaires, and more.
Analysis of Forecasting Sales By Using Quantitative And Qualitative MethodsIJERA Editor
This paper focuses on analysis of forecasting sales using quantitative and qualitative methods. This forecast should be able to help create a model for measuring a successes and setting goals from financial and operational view points. The resulting model should tell if we have met our goals with respect to measures, targets, initiatives
This document discusses estimating stochastic relative risk aversion from interest rates. It first introduces a model for deriving relative risk aversion from interest rates using a time inhomogeneous single factor short rate model. It then details the estimation methodology used, which calibrates the model to US LIBOR data to estimate a time series for the market price of risk and ex-ante bond Sharpe ratio. This allows deducing a stochastic process for relative risk aversion under a power utility function. Estimated mean relative risk aversion is 49.89. The document then introduces modifying a Real Business Cycle model to allow time-varying relative risk aversion, finding it better matches empirical consumption volatility than a baseline model.
Sensitivity analysis is the study of how uncertainty in the inputs of a mathematical model propagates to uncertainty in the model's outputs. It is useful for understanding relationships between inputs and outputs, identifying important inputs, and reducing uncertainty. Sensitivity analysis typically involves running the model many times while varying inputs, and calculating sensitivity measures from the resulting outputs to determine which inputs most influence uncertainty in the outputs. Common methods include variance-based approaches and screening methods.
Parac hacer Macroeconomia Necesitamos Fundamentos MicroeconomicosPAD Ancash
The document argues that macroeconomics does not need microeconomic foundations, and instead it is microeconomics that needs foundations built from neuroscience and biology. It challenges five common arguments for microfoundations in macroeconomics. Specifically, it argues that 1) the Lucas critique does not necessarily require microfoundations, 2) there are no policy-invariant micro principles, 3) new Keynesian models are not truly founded on first principles, 4) there are no well-established micro principles given limitations of current choice theory, and 5) aggregate behavior cannot be built up from individual behavior in a straightforward way.
This powerpoint presentation was done as part of the course STAT 591 titled Mater's Seminar during Third semester of MSc. Agricultural Statistics at Agricultural College, Bapatla under ANGRAU, Andhra Pradesh.
The document discusses fundamentals of data analysis for marketing research. It covers preparing data for analysis through editing, coding, weighting and variable transformation. Simple tabulation, frequency distributions, descriptive statistics and cross tabulation techniques are presented. An overview of statistical techniques includes univariate, bivariate and multivariate methods. Factors influencing statistical technique choices and concepts of hypothesis testing like null hypotheses and significance levels are also summarized.
10th Alex Marketing Club (Forecasting) by Dr. Haitham Maraei 6 Jan-2018Mahmoud Bahgat
Forecasting is an important part of marketing and business planning. There are many techniques for forecasting, including both qualitative and quantitative methods. Qualitative methods include surveys, expert opinions, and market experiments, while quantitative time series methods analyze past trends and patterns to predict the future. Effective forecasting requires understanding factors like demand trends, seasonality, elasticity, and uncertainty. The summary provides an overview of key concepts and challenges in forecasting for marketing and business.
This document is a thesis submitted by Jai Kedia for a degree in mathematics and business economics. It examines alternative risk measures to the traditional beta measure in predicting stock returns. The thesis provides an introduction and acknowledges the contributions of the advisors. It then presents an abstract that outlines the goal of analyzing if alternative risk measures such as higher moments, size, leverage, and price-to-book ratio can improve predictions of stock returns beyond just beta. Finally, it presents a table of contents that outlines the various chapters covering the return/risk relationship, modern portfolio theory, mathematical analysis of stock prices, a literature review on previous empirical studies, the empirical analysis conducted, and a conclusion.
The document summarizes a simulation study that examined the effects of using raw scores versus IRT-derived scores when operationalizing latent constructs in moderated multiple regression analyses. The study found that using raw scores can inflate Type 1 error rates for interaction terms under conditions of assessment inappropriateness. However, rescaling the scores using the Graded Response Model, a polytomous IRT model, mitigated these effects. The study supports the idea that IRT scores provide a more robust metric than raw scores in moderated regression analyses, especially under suboptimal assessment conditions.
Apoorva Javadekar - Conditional Correlations of Macro Variables and Implica...Apoorva Javadekar
This ppt By Apoorva Javadekar is all about Understanding the structure of the Cross Country Correlation for Macro Variables: and Asset Pricing and Risk Sharing Implications
A dynamic Nelson-Siegel model with forward-looking indicators for the yield c... - FGV Brazil
This paper proposes a Factor-Augmented Dynamic Nelson-Siegel (FADNS) model to predict the yield curve in the US that relies on a large data set of weekly financial and macroeconomic variables. The FADNS model significantly improves interest rate forecasts relative to the extant models in the literature. For longer horizons, it beats autoregressive alternatives, with a reduction in mean absolute error of up to 40%. For shorter horizons, it offers a good challenge to autoregressive forecasting models, outperforming them for the 7- and 10-year yields. The out-of-sample analysis shows that the good performance comes mostly from the forward-looking nature of the variables we employ. Including them reduces the mean absolute error by 5 basis points on average with respect to models that reflect only past macroeconomic events.
Date: 2017-03
Authors:
Vieira, Fausto José Araújo
Chague, Fernando Daniel
Fernandes, Marcelo
This paper examines how disagreement among investors about macroeconomic factors affects stock returns. The author finds that following periods of high disagreement, stocks highly sensitive to macroeconomic factors ("high macro beta stocks") earn lower future returns compared to less sensitive stocks ("low macro beta stocks"). This suggests high macro beta stocks become overvalued during high disagreement periods due to investors' differing views. Regression analyses show macroeconomic factor risk premiums are negatively related to lagged macro disagreement measures. The findings support theories that macroeconomic factors price risk but also show how disagreement can lead to speculative stock prices.
A logistic regression was conducted to predict if homeowners would accept or decline a solar panel subsidy offer based on household income and monthly mortgage payment. The full model was a good fit to the data and correctly classified 83.3% of cases. While the predictors were not statistically significant individually, they distinguished between acceptors and decliners as a set. Other factors may provide a better fitting model.
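As a hedged illustration (with invented numbers, not the study's data), a figure like the 83.3% "correctly classified" rate is obtained by thresholding the model's predicted probabilities at a cutoff and comparing them with the observed outcomes:

```python
# Hypothetical illustration: computing a classification rate from a fitted
# logistic model's predicted probabilities (all values are made up).
def classification_rate(probs, actual, cutoff=0.5):
    """Share of cases where the thresholded probability matches the outcome."""
    preds = [1 if p >= cutoff else 0 for p in probs]
    return sum(1 for p, a in zip(preds, actual) if p == a) / len(actual)

probs = [0.9, 0.8, 0.35, 0.6, 0.2, 0.7]   # fitted P(accept) per homeowner
actual = [1, 1, 0, 1, 0, 0]               # observed accept (1) / decline (0)
print(round(classification_rate(probs, actual), 3))  # → 0.833
```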
This document discusses quantitative approaches to forecasting, including time series analysis and forecasting techniques. It covers the components of a time series, including trends, cycles, seasonality, and irregular components. Specific quantitative forecasting approaches covered include smoothing methods like moving averages, weighted moving averages, and exponential smoothing. Examples are provided to demonstrate how to perform moving averages and exponential smoothing on time series data for sales of headache medicine. The document aims to teach readers how to analyze time series data and select appropriate forecasting techniques.
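The smoothing methods described above can be sketched in a few lines; the sales figures here are invented for illustration:

```python
def moving_average(series, window):
    """Forecast for the next period: mean of the last `window` observations."""
    return sum(series[-window:]) / window

def exponential_smoothing(series, alpha):
    """Single exponential smoothing: s_t = alpha*y_t + (1 - alpha)*s_{t-1};
    the final smoothed value serves as the next-period forecast."""
    s = series[0]
    for y in series[1:]:
        s = alpha * y + (1 - alpha) * s
    return s

sales = [120, 110, 115, 125, 130, 128]  # invented weekly sales figures
print(round(moving_average(sales, 3), 2))           # → 127.67
print(round(exponential_smoothing(sales, 0.2), 2))  # → 122.51
```

A larger `alpha` weights recent observations more heavily, so the exponentially smoothed forecast reacts faster to level shifts than a long moving average.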
This document discusses operations management and forecasting. It explains that operations management deals with designing and managing processes, products, services and supply chains to deliver goods and services customers want. Forecasting helps managers reduce uncertainty by predicting future demand to match supply. The document then discusses various forecasting methods including qualitative judgmental methods and quantitative mathematical modeling methods. It covers short, medium and long-range forecasting as well as different time series and causal modeling techniques.
We use data on twins matched to register-based information on earnings to examine the longstanding puzzle of non-existent compensating wage differentials. The use of twin data allows us to remove otherwise unobserved productivity differences that were a prominent reason for estimation bias in earlier studies. Using twin differences, we find evidence for positive compensation of adverse working conditions in the labor market.
This document discusses human impacts on the planet that have increased dramatically in recent centuries and can now be measured and observed from space. It describes how human activities like industrial fishing, waste disposal, and deforestation are negatively impacting oceans, atmosphere, and forests. Satellite images taken over time reveal changes to landscapes and ecosystems from human influences and show the need for more sustainable environmental practices.
The document describes a series of chemical assays for identifying different families of compounds in a sample of plant material. A preliminary chemical study was carried out using color reactions that reveal the chemical content. Compounds such as tannins, proanthocyanidins, reducing carbohydrates, anthocyanins, betacyanins, saponins and flavonoids were identified.
The document addresses the importance of the sociocultural phenomenon that cooperativism represents in all its forms, and above all its significance as one of the most effective means of improving and democratizing economic processes, based on the personal effort and mutual aid of cooperative members.
The document discusses the present perfect and present perfect progressive tenses in English. It provides examples of how each tense is used to express actions that began in the past and continue to the present, with the present perfect focusing on completed actions and the present perfect progressive emphasizing ongoing or repeated actions over time. Key uses of the present perfect include expressing unspecified past times, repetition, and ongoing situations. The present perfect progressive typically takes time expressions like "for" or "since" and implies duration.
The passive voice is used to give more importance to the action than to whoever performs it. It is formed by turning the direct object of the active voice into the subject of the passive sentence and converting the active subject into the agent complement. To transform a sentence from active to passive voice, the verbs and word positions are changed, and subject pronouns are replaced with object pronouns.
This document summarizes the main rules on vacation leave in the Portuguese Labor Code, including: (1) workers are entitled to 22 working days of annual leave; (2) leave should preferably be taken consecutively, but may be split by agreement; (3) workers with a good attendance record are entitled to additional days of leave.
This book is informative in nature, and its application requires the approval of a naturopathic doctor. The book deals with natural, ancestral methods used by our grandparents and forebears, from our history before 1900, before the industrialization and modernization of medicine. These are remedies we have heard of but have forgotten over the years. Small towns have their secrets and their popular culture; based on their natural products, the people there know what each one is for and, above all, which illness it relieves, since there are no doctors there. We should learn from them; this valuable information is theirs. All of it has now been forgotten in the world's large cities and capitals because of the communications revolution, television, and mass advertising about this or that miracle product that cures one thing, yet five years later the product is gone.
Thus we have foods we use daily without knowing their medicinal properties, as is the case with fruits and vegetables, which help relieve many illnesses. We should also remember and use the famous medicinal baths, which together with medicinal plants create a synergy that relaxes us after a comforting soak. The use of mud poultices is ancient and historical; even animals use them today: pigs, when they have wounds, simply roll in the mud. Finally, something pleasant for those of us in old age: drinking liqueurs that benefit our health. I reiterate: do not forget to use this information under the supervision of a naturopathic doctor.
Finally, this work is dedicated to the memory of my daughters KISSIE and NADIA, buried more than 10 years ago because of respiratory problems. At age 5 they already showed problems from a simple, poorly treated flu; then they moved up to the big leagues, joining the group of girls with bronchitis, also poorly treated; after about age 10 they graduated from the university of life as asthmatics, thanks to the fine effects of the industrialization of health care. They died at 15 and 16, but two years earlier they had already been given up as hopeless; on that the doctors were accurate, for their hearts were failing and they both died of heart attacks. May God keep my daughters in heaven. So that tragic stories like these are never repeated, you must be the messengers of this natural knowledge that has served for thousands of years, and avoid substances that harm us and kill us slowly. Just share the information. THANK YOU.
This is the presentation with the final data and conclusions about the results of our questionnaires about sex, filled in by mothers and their teenage daughters, developed under the Grundtvig Project "True Interaction" (European Commission, 2009-2011).
The document discusses breeding value and the methodologies for estimating it. It explains that an individual's breeding value depends on the additive effects of all its genes and on the gene frequencies of the population. It then describes different methods for estimating breeding value, such as using an individual's own records, those of relatives, or the averages of siblings and offspring. Finally, it explains that the BLUP (Best Linear Unbiased Prediction) method uses predictive models that account for fixed effects and the additive genetic value.
The Kruskal-Wallis H test is a nonparametric procedure used to compare more than two populations in a completely randomized design. It ranks all measurements jointly and uses the sum of ranks for each sample to compare distributions. The document provides steps to conduct a Kruskal-Wallis test: state the null and alternative hypotheses, rank all measurements jointly, calculate rank sums for each sample, use a test statistic to determine if there are differences between distributions, and reject the null hypothesis if the test statistic exceeds the critical value. An example compares achievement test scores across four teaching techniques using this procedure.
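The ranking steps above can be sketched directly; this toy, stdlib-only implementation computes the H statistic, which is then compared against a chi-square critical value with k-1 degrees of freedom:

```python
def kruskal_wallis_h(samples):
    """Kruskal-Wallis H: rank all observations jointly (average ranks for
    ties), then H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)."""
    all_vals = sorted(v for s in samples for v in s)
    ranks, i = {}, 0
    while i < len(all_vals):
        j = i
        while j < len(all_vals) and all_vals[j] == all_vals[i]:
            j += 1
        ranks[all_vals[i]] = (i + 1 + j) / 2  # average of 1-based ranks i+1..j
        i = j
    n = len(all_vals)
    total = sum(sum(ranks[v] for v in s) ** 2 / len(s) for s in samples)
    return 12 * total / (n * (n + 1)) - 3 * (n + 1)

# Two toy samples with completely separated values:
print(round(kruskal_wallis_h([[1, 2, 3], [4, 5, 6]]), 3))  # → 3.857
```

With two groups of three and no overlap, H reaches its maximum for N=6; the null is rejected when H exceeds the chi-square critical value for k-1 degrees of freedom.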
The document describes the basic principles of antivirals and the different steps of viral replication at which they can act. It explains the drugs used to treat herpes simplex and varicella zoster virus infections, such as acyclovir, valacyclovir and famciclovir. It also covers antivirals for cytomegalovirus, such as ganciclovir, valganciclovir, foscarnet and cidofovir.
Communication is the process of expressing and receiving ideas through language, speech, and other means. Typical speech and language development follows predictable patterns through childhood. Speech disorders involve difficulties producing sounds, while language disorders involve challenges with comprehension, expression, or formulation of ideas. Communication disorders can be caused by brain injury, disease, lack of early stimulation, or other factors. Students with communication disorders are evaluated and teachers adapt instruction to support their needs through techniques like repetition, visual aids, and social skills training. Alternative communication systems can also help those unable to communicate verbally.
Crisis, chaos and sustainability: P. Kotler's Chaotics - Hugo Brunetta
This document discusses strategies for achieving corporate sustainability in the face of chaos. The authors Philip Kotler and John Caslione argue that organizations must improve their responsiveness to turbulent times through more dynamic strategic planning and faster decisions at lower levels. They must also divide themselves into smaller, simpler units. Sustainability requires recognizing the social, economic and environmental factors that affect long-term strategy, preserving...
Strategic cost management as a recession survival tool in the nigerian manufa... - Alexander Decker
The document discusses strategic cost management as a recession survival tool in Nigeria. It aims to determine if Nigerian companies use strategic cost management techniques, the extent of their use in manufacturing and financial services, factors influencing adoption, and if it can be used as a competitive strategy for survival in recessions. The research found that while Nigerian companies are receptive to strategic cost management philosophies, challenges inhibit adoption and implementation. Manufacturing concerns utilize the tools more than financial services. Companies were encouraged to adopt strategic cost management and the government to create an enabling environment for adoption.
This document presents an index and summary of several chapters on surveying. Chapter I introduces surveying: its definition, objective and historical background. It explains how surveying is subdivided for study and the equipment employed. Chapters II and III cover theoretical foundations such as angle measurement and leveling, with worked calculation examples. Chapter IV presents conclusions.
The Orinoquía Region lies between the Guaviare and Arauca rivers, the eastern cordillera and the Orinoco river. It is an immense plain with temperatures between 24° and 28°C that favors the growth of pasture and cattle ranching. It also holds large oil reserves that are attracting population. The region's typical dish is "ternera a la llanera" or "mamona", prepared by roasting a whole calf for several hours.
1. The teacher scores the student's pre-test passages by converting the rubric scores (+, -, x) to point values and calculating a total score for each passage.
2. The teacher compares the total scores to determine if the student scored higher on the below-grade or on-grade passage.
3. If the student scored higher on the below-grade passage, that passage level becomes their IPL. If they scored higher on the on-grade or above-grade passage, the teacher assigns additional RPA passages to identify the student's instructional level.
This document describes the principles of social responsibility according to the ISO 26000 standard. It explains that social responsibility is a voluntary commitment by organizations to behave ethically and contribute to sustainable development. It also notes that ISO 26000 is the most widely recognized global standard for social responsibility, developed through a broad consensus process. Finally, it summarizes the main subjects an organization must consider in order to implement social responsibility in accordance with this standard.
This document discusses validating risk models using intraday value-at-risk (VaR) and expected shortfall (ES) approaches with the Multiplicative Component GARCH (MC-GARCH) model. The study assesses different distributional assumptions for innovations in the MC-GARCH model and evaluates their effects on modeling and forecasting performance. Backtesting procedures are used to validate the models' predictive power for VaR and ES. Results show non-normal distributions best fit the intraday data and forecast ES, while an asymmetric distribution best forecasts VaR.
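For orientation only, here is a minimal historical-simulation baseline for VaR and ES; this is not the paper's MC-GARCH approach, just the simple empirical-quantile estimator such models are commonly backtested against, with invented returns:

```python
def var_es(returns, alpha=0.05):
    """Historical-simulation VaR and ES at level alpha: VaR is the loss
    threshold exceeded on the worst alpha share of days, and ES is the
    average loss beyond it. Losses are negative returns."""
    srt = sorted(returns)
    k = max(1, int(len(srt) * alpha))
    var = -srt[k - 1]           # alpha-quantile of the loss distribution
    es = -sum(srt[:k]) / k      # mean loss in the tail beyond VaR
    return var, es

returns = [-0.05, -0.03, -0.01, 0.0, 0.01, 0.02, 0.02, 0.03, 0.04, 0.05]
v, e = var_es(returns, alpha=0.2)
print(round(v, 3), round(e, 3))  # → 0.03 0.04
```

Note that ES is always at least as large as VaR, since it averages only the losses beyond the VaR threshold.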
This document summarizes John Sneed's research on developing an earnings forecasting model based on theoretical factors rather than statistical selection of variables. It begins by describing Ou's existing model and its limitations in relying on statistical techniques without theoretical justification. It then discusses theories from economics literature on factors that could lead to differential profits across firms/industries: improper measurement of intangible capital like R&D/advertising, differential returns on such investments, and existence of market power. Based on these theories, Sneed develops a model incorporating variables like average R&D over 5 years, prior year's advertising, and 5-year average capital expenditures to test if it improves upon Ou's model.
Forecasting is the process of making predictions about events that have not yet occurred based on past data and other information. There are many different forecasting methods that can be qualitative or quantitative, including time series analysis, causal modeling, judgmental approaches, and more recently artificial intelligence techniques. Accuracy is important in forecasting and is typically measured using values like mean absolute error or mean squared error. Forecasting has wide applications in domains like business, economics, weather, earthquakes, and more. Limitations to forecasting accuracy exist, such as the chaotic nature of systems like the weather beyond two weeks.
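The accuracy measures named above are simple averages of forecast errors; a minimal sketch with invented values:

```python
def mae(actual, forecast):
    """Mean absolute error: the average size of the forecast misses."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    """Mean squared error: penalizes large misses more heavily."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

actual = [100, 102, 101, 105]   # hypothetical observed values
forecast = [98, 103, 104, 105]  # hypothetical forecasts
print(mae(actual, forecast))  # → 1.5
print(mse(actual, forecast))  # → 3.5
```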
MODELING THE AUTOREGRESSIVE CAPITAL ASSET PRICING MODEL FOR TOP 10 SELECTED... - IAEME Publication
Systematic risk is the uncertainty inherent to the entire market or market segment, while unsystematic risk is the uncertainty that comes with the specific company or industry in which we invest; the latter can be reduced through diversification. The study selects a non-linear capital asset pricing model for top securities on the BSE and attempts to identify the marketable and non-marketable risk faced by investors in top companies. The analysis was conducted in several stages, including vector autoregression of systematic and unsystematic risk.
This summary analyzes a study that compares the forecast accuracy of healthcare product demand using linear and non-linear regression models.
The study finds that for fluctuating demand with small historical values, non-linear regression using a polynomial function provides better forecast accuracy than linear regression. However, for steady demand, linear regression is more accurate. Two factors - percentage of elderly people and regular exercise - are found to have the most significant impact on healthcare demand forecasts. When these factors are varied in the linear model, forecast accuracy is highest. For some products, a 5th or 6th degree polynomial function in non-linear regression also improves accuracy over the linear model.
How to assess the reliability of measurements in rehabilitation - analisedecurvas
This document discusses methods for assessing the reliability of measurements in rehabilitation research. It summarizes several statistical methods that can be used to evaluate reliability, including correlation coefficients, changes in mean values between tests, measurement variability, and clinically important changes. The document uses an example of measurements of isokinetic muscle strength to demonstrate how to calculate and interpret various reliability indices. Overall, it provides rehabilitation researchers with guidance on comprehensively assessing reliability using multiple statistical approaches.
https://utilitasmathematica.com/index.php/Index/
Utilitas Mathematica is committed to strengthening our professional community. The journal is published by the Utilitas Mathematica Academy and serves researchers worldwide at an international level.
Forecasting Economic Activity using Asset Prices - Panos Kouvelis
This dissertation evaluates how well asset prices, in particular the term spread, the short rate and real stock returns, forecast GDP growth and industrial production. The study uses data for seven countries (Canada, France, Germany, Italy, Japan, United Kingdom and United States) covering the period from 1966 to the present. The research finds that asset prices have forecasting power at the one-quarter/one-month horizon but lose that power as the forecasting horizon increases. Moreover, the paper finds that the real stock return is the best predictor of GDP growth and that the short rate has more predictive content than the term spread.
Keywords: Term spread, short rate, stock returns, output growth, forecasting horizon, out-of-sample statistics
This paper examines the "variance premium", which is the difference between the squared VIX index and expected realized variance. The authors show that the variance premium captures attitudes toward economic uncertainty and predicts future stock returns over short horizons. They develop a generalized long-run risks model that generates a time-varying variance premium consistent with market return and risk-free rate levels. The model requires extensions to match the large size, volatility and skewness of the variance premium and its short-term return predictability. Calibrating the model to cash flow and asset pricing targets allows it to generate the variance premium and its return predictability features.
This document presents an estimated arbitrage-free model that jointly models nominal and real US Treasury yields. It estimates separate arbitrage-free Nelson-Siegel models for nominal and real yields, finding a three-factor model fits nominal yields well and a two-factor model fits real yields. It then estimates a four-factor joint model that fits both yield curves. The joint model is used to decompose breakeven inflation rates into inflation expectations and inflation risk premium components.
SUITABILITY OF COINTEGRATION TESTS ON DATA STRUCTURE OF DIFFERENT ORDERS - BRNSS Publication Hub
This document summarizes research investigating the suitability of cointegration tests on time series data of different orders. The researchers used simulated time series data from normal and gamma distributions at sample sizes of 30, 60, and 90. Three cointegration tests (Engle-Granger, Johansen, and Phillips-Ouliaris) were applied to the data. The tests were assessed based on type 1 error rates and power to determine which test was most robust for different distributions and sample sizes. The results indicated the Phillips-Ouliaris test was generally the most effective at determining cointegration across different sample sizes and distributions.
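As a toy, stdlib-only sketch of the Engle-Granger two-step logic (in practice one would use statsmodels' `coint`, which also supplies the non-standard critical values), with purely illustrative simulated data:

```python
import random

def ols(x, y):
    """Least-squares intercept and slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def engle_granger_tstat(y, x):
    """Step 1: regress y on x and take residuals e_t.
    Step 2: regress (e_t - e_{t-1}) on e_{t-1} (no constant, no lags) and
    return the t-statistic on the coefficient. Strongly negative values are
    evidence the residuals are stationary, i.e. cointegration. Note: the
    correct critical values are the Engle-Granger ones, not the t-table."""
    a, b = ols(x, y)
    e = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    lag = e[:-1]
    diff = [e[t] - e[t - 1] for t in range(1, len(e))]
    rho = sum(l * d for l, d in zip(lag, diff)) / sum(l * l for l in lag)
    resid = [d - rho * l for l, d in zip(lag, diff)]
    s2 = sum(r * r for r in resid) / (len(diff) - 1)
    se = (s2 / sum(l * l for l in lag)) ** 0.5
    return rho / se

# Demo: x is a simulated random walk and y = 2x + stationary noise, so the
# pair is cointegrated and the t-statistic comes out strongly negative.
random.seed(42)
x, s = [], 0.0
for _ in range(200):
    s += random.gauss(0, 1)
    x.append(s)
y = [2 * xi + random.gauss(0, 0.5) for xi in x]
print(engle_granger_tstat(y, x) < -3)  # expect True: evidence of cointegration
```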
Relationship between macroeconomic variables and malaysia available shariah i... - Azrul Abdullah
This paper studies the relationship between local and foreign macroeconomic variables and Malaysia's available Shariah indices. We use the Vector Error Correction (VEC) framework, first examining the long-run and short-run relationships between Malaysia's available Shariah indices (KLSI, FTSE Bursa Malaysia EMAS Shariah Index and FTSE Bursa Malaysia Hijrah Shariah Index) and the macroeconomic variables via the Johansen cointegration technique. Monthly data over the twenty-two-year period from January 1990 to December 2011 were collected from DataStream and tested. The findings show a positive relationship between the variables from 1990 to 2006; however, mixed results were found for the period up to 2011. The study concludes that the standardized set of macroeconomic variables specified by earlier researchers can still be relied upon, but with careful policy formulation.
A LINEAR REGRESSION APPROACH TO PREDICTION OF STOCK MARKET TRADING VOLUME: A ... - ijmvsc
Predicting the daily behavior of the stock market is a serious challenge for investors and corporate stockholders, and it can help them invest with more confidence by taking risks and fluctuations into consideration. In this paper, by applying linear regression to predict the behavior of the S&P 500 index, we show that our proposed method performs well, producing predictions close to real volumes, so that stockholders can invest confidently based on it.
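The core mechanic of such an approach, stripped to a minimal sketch with invented index levels (not real S&P 500 data), is an ordinary least-squares fit followed by extrapolation:

```python
def fit_line(x, y):
    """Ordinary least squares for y ≈ a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical index levels: fit a trend on days 1-5, extrapolate to day 6.
days = [1, 2, 3, 4, 5]
index = [1500, 1510, 1508, 1520, 1525]
a, b = fit_line(days, index)
print(round(a + b * 6, 1))  # → 1530.6
```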
This document summarizes methods for establishing meaningful performance expectations across different test forms by setting invariant latent standards along the underlying competence continuum, rather than cutscores that vary by test content. It describes how Angoff ratings can be analyzed using item response curves to identify the latent threshold (θ*) representing each performance level. Preliminary analyses of expert ratings for a licensure exam show ratings better differentiated item difficulties and performance levels after aligning with item curves, and several methods for deriving θ* from the ratings are demonstrated and compared.
Combining forecasts from different models has been shown to perform better than a single forecast for most time series, so combining forecasts is one way to improve forecast quality. We study the effect of decomposing a series into multiple components and forecasting each component separately... The original series is decomposed into trend, seasonality and an irregular component for each series. Statistical methods such as ARIMA and Holt-Winters have been used to forecast these components. In this paper we focus on how the best models of one series can be applied, via association mining, to series with similar frequency patterns. The proposed method's forecasted values have been compared with the Holt-Winters method and shown to be better.
A NOVEL PERFORMANCE MEASURE FOR MACHINE LEARNING CLASSIFICATION - IJMIT JOURNAL
Machine learning models have been widely used in numerous classification problems, and performance measures play a critical role in machine learning model development, selection, and evaluation. This paper gives a comprehensive overview of performance measures in machine learning classification. In addition, we propose a framework for constructing a novel evaluation metric based on the voting results of three performance measures, each of which has strengths and limitations. The new metric can be shown to be better than accuracy in terms of consistency and discriminancy.
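A hedged sketch of the voting idea (the paper's actual construction may differ): three common measures each score two classifiers' confusion-matrix counts, and the majority preference selects the winner.

```python
# Illustrative voting combiner, not the paper's exact metric: each of three
# measures votes for the classifier on which it scores higher.
def accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)

def precision(tp, fp, fn, tn):
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fp, fn, tn):
    return tp / (tp + fn) if tp + fn else 0.0

def majority_vote(conf_a, conf_b, measures=(accuracy, precision, recall)):
    """conf_* are (tp, fp, fn, tn) counts; returns the majority winner."""
    votes_a = sum(1 for m in measures if m(*conf_a) > m(*conf_b))
    return "A" if votes_a * 2 > len(measures) else "B"

# A wins on accuracy and recall, B wins on precision, so A takes the vote.
print(majority_vote((40, 10, 5, 45), (30, 5, 15, 50)))  # → A
```

Voting makes the combined judgment less sensitive to any single measure's known blind spots (e.g. accuracy under class imbalance).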
Similar to Forecast analysis of food price inflation in Pakistan (20)
Abnormalities of hormones and inflammatory cytokines in women affected with p... - Alexander Decker
Women with polycystic ovary syndrome (PCOS) have elevated levels of hormones like luteinizing hormone and testosterone, as well as higher levels of insulin and insulin resistance compared to healthy women. They also have increased levels of inflammatory markers like C-reactive protein, interleukin-6, and leptin. This study found these abnormalities in the hormones and inflammatory cytokines of women with PCOS ages 23-40, indicating that hormone imbalances associated with insulin resistance and elevated inflammatory markers may worsen infertility in women with PCOS.
A usability evaluation framework for b2 c e commerce websites - Alexander Decker
This document presents a framework for evaluating the usability of B2C e-commerce websites. It involves user testing methods like usability testing and interviews to identify usability problems in areas like navigation, design, purchasing processes, and customer service. The framework specifies goals for the evaluation, determines which website aspects to evaluate, and identifies target users. It then describes collecting data through user testing and analyzing the results to identify usability problems and suggest improvements.
A universal model for managing the marketing executives in nigerian banks - Alexander Decker
This document discusses a study that aimed to synthesize motivation theories into a universal model for managing marketing executives in Nigerian banks. The study was guided by Maslow and McGregor's theories. A sample of 303 marketing executives was used. The results showed that managers will be most effective at motivating marketing executives if they consider individual needs and create challenging but attainable goals. The emerged model suggests managers should provide job satisfaction by tailoring assignments to abilities and monitoring performance with feedback. This addresses confusion faced by Nigerian bank managers in determining effective motivation strategies.
A unique common fixed point theorems in generalized d - Alexander Decker
This document presents definitions and properties related to generalized D*-metric spaces and establishes some common fixed point theorems for contractive type mappings in these spaces. It begins by introducing D*-metric spaces and generalized D*-metric spaces, defines concepts like convergence and Cauchy sequences. It presents lemmas showing the uniqueness of limits in these spaces and the equivalence of different definitions of convergence. The goal of the paper is then stated as obtaining a unique common fixed point theorem for generalized D*-metric spaces.
A trends of salmonella and antibiotic resistance - Alexander Decker
This document provides a review of trends in Salmonella and antibiotic resistance. It begins with an introduction to Salmonella as a facultative anaerobe that causes nontyphoidal salmonellosis. The emergence of antimicrobial-resistant Salmonella is then discussed. The document proceeds to cover the historical perspective and classification of Salmonella, definitions of antimicrobials and antibiotic resistance, and mechanisms of antibiotic resistance in Salmonella including modification or destruction of antimicrobial agents, efflux pumps, modification of antibiotic targets, and decreased membrane permeability. Specific resistance mechanisms are discussed for several classes of antimicrobials.
A transformational generative approach towards understanding al-istifham - Alexander Decker
This document discusses a transformational-generative approach to understanding Al-Istifham, which refers to interrogative sentences in Arabic. It begins with an introduction to the origin and development of Arabic grammar. The paper then explains the theoretical framework of transformational-generative grammar that is used. Basic linguistic concepts and terms related to Arabic grammar are defined. The document analyzes how interrogative sentences in Arabic can be derived and transformed via tools from transformational-generative grammar, categorizing Al-Istifham into linguistic and literary questions.
A time series analysis of the determinants of savings in namibia - Alexander Decker
This document summarizes a study on the determinants of savings in Namibia from 1991 to 2012. It reviews previous literature on savings determinants in developing countries. The study uses time series analysis including unit root tests, cointegration, and error correction models to analyze the relationship between savings and variables like income, inflation, population growth, deposit rates, and financial deepening in Namibia. The results found inflation and income have a positive impact on savings, while population growth negatively impacts savings. Deposit rates and financial deepening were found to have no significant impact. The study reinforces previous work and emphasizes the importance of improving income levels to achieve higher savings rates in Namibia.
A therapy for physical and mental fitness of school children - Alexander Decker
This document summarizes a study on the importance of exercise in maintaining physical and mental fitness for school children. It discusses how physical and mental fitness are developed through participation in regular physical exercises and cannot be achieved solely through classroom learning. The document outlines different types and components of fitness and argues that developing fitness should be a key objective of education systems. It recommends that schools ensure pupils engage in graded physical activities and exercises to support their overall development.
A theory of efficiency for managing the marketing executives in nigerian banks - Alexander Decker
This document summarizes a study examining efficiency in managing marketing executives in Nigerian banks. The study was examined through the lenses of Kaizen theory (continuous improvement) and efficiency theory. A survey of 303 marketing executives from Nigerian banks found that management plays a key role in identifying and implementing efficiency improvements. The document recommends adopting a "3H grand strategy" to improve the heads, hearts, and hands of management and marketing executives by enhancing their knowledge, attitudes, and tools.
This document discusses evaluating the link budget for effective 900MHz GSM communication. It describes the basic parameters needed for a high-level link budget calculation, including transmitter power, antenna gains, path loss, and propagation models. Common propagation models for 900MHz that are described include Okumura model for urban areas and Hata model for urban, suburban, and open areas. Rain attenuation is also incorporated using the updated ITU model to improve communication during rainfall.
A synthetic review of contraceptive supplies in punjabAlexander Decker
This document discusses contraceptive use in Punjab, Pakistan. It begins by providing background on the benefits of family planning and contraceptive use for maternal and child health. It then analyzes contraceptive commodity data from Punjab, finding that use is still low despite efforts to improve access. The document concludes by emphasizing the need for strategies to bridge gaps and meet the unmet need for effective and affordable contraceptive methods and supplies in Punjab in order to improve health outcomes.
A synthesis of taylor’s and fayol’s management approaches for managing market...Alexander Decker
1) The document discusses synthesizing Taylor's scientific management approach and Fayol's process management approach to identify an effective way to manage marketing executives in Nigerian banks.
2) It reviews Taylor's emphasis on efficiency and breaking tasks into small parts, and Fayol's focus on developing general management principles.
3) The study administered a survey to 303 marketing executives in Nigerian banks to test if combining elements of Taylor and Fayol's approaches would help manage their performance through clear roles, accountability, and motivation. Statistical analysis supported combining the two approaches.
A survey paper on sequence pattern mining with incrementalAlexander Decker
This document summarizes four algorithms for sequential pattern mining: GSP, ISM, FreeSpan, and PrefixSpan. GSP is an Apriori-based algorithm that incorporates time constraints. ISM extends SPADE to incrementally update patterns after database changes. FreeSpan uses frequent items to recursively project databases and grow subsequences. PrefixSpan also uses projection but claims to not require candidate generation. It recursively projects databases based on short prefix patterns. The document concludes by stating the goal was to find an efficient scheme for extracting sequential patterns from transactional datasets.
A survey on live virtual machine migrations and its techniquesAlexander Decker
This document summarizes several techniques for live virtual machine migration in cloud computing. It discusses works that have proposed affinity-aware migration models to improve resource utilization, energy efficient migration approaches using storage migration and live VM migration, and a dynamic consolidation technique using migration control to avoid unnecessary migrations. The document also summarizes works that have designed methods to minimize migration downtime and network traffic, proposed a resource reservation framework for efficient migration of multiple VMs, and addressed real-time issues in live migration. Finally, it provides a table summarizing the techniques, tools used, and potential future work or gaps identified for each discussed work.
A survey on data mining and analysis in hadoop and mongo dbAlexander Decker
This document discusses data mining of big data using Hadoop and MongoDB. It provides an overview of Hadoop and MongoDB and their uses in big data analysis. Specifically, it proposes using Hadoop for distributed processing and MongoDB for data storage and input. The document reviews several related works that discuss big data analysis using these tools, as well as their capabilities for scalable data storage and mining. It aims to improve computational time and fault tolerance for big data analysis by mining data stored in Hadoop using MongoDB and MapReduce.
1. The document discusses several challenges for integrating media with cloud computing including media content convergence, scalability and expandability, finding appropriate applications, and reliability.
2. Media content convergence challenges include dealing with the heterogeneity of media types, services, networks, devices, and quality of service requirements as well as integrating technologies used by media providers and consumers.
3. Scalability and expandability challenges involve adapting to the increasing volume of media content and being able to support new media formats and outlets over time.
This document surveys trust architectures that leverage provenance in wireless sensor networks. It begins with background on provenance, which refers to the documented history or derivation of data. Provenance can be used to assess trust by providing metadata about how data was processed. The document then discusses challenges for using provenance to establish trust in wireless sensor networks, which have constraints on energy and computation. Finally, it provides background on trust, which is the subjective probability that a node will behave dependably. Trust architectures need to be lightweight to account for the constraints of wireless sensor networks.
This document discusses private equity investments in Kenya. It provides background on private equity and discusses trends in various regions. The objectives of the study discussed are to establish the extent of private equity adoption in Kenya, identify common forms of private equity utilized, and determine typical exit strategies. Private equity can involve venture capital, leveraged buyouts, or mezzanine financing. Exits allow recycling of capital into new opportunities. The document provides context on private equity globally and in developing markets like Africa to frame the goals of the study.
This document discusses a study that analyzes the financial health of the Indian logistics industry from 2005-2012 using Altman's Z-score model. The study finds that the average Z-score for selected logistics firms was in the healthy to very healthy range during the study period. The average Z-score increased from 2006 to 2010 when the Indian economy was hit by the global recession, indicating the overall performance of the Indian logistics industry was good. The document reviews previous literature on measuring financial performance and distress using ratios and Z-scores, and outlines the objectives and methodology used in the current study.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
FREE A4 Cyber Security Awareness Posters-Social Engineering part 3Data Hops
Free A4 downloadable and printable Cyber Security, Social Engineering Safety and security Training Posters . Promote security awareness in the home or workplace. Lock them Out From training providers datahops.com
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Digital Marketing Trends in 2024 | Guide for Staying AheadWask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
Forecast Analysis of Food Price Inflation in Pakistan
Developing Country Studies www.iiste.org
ISSN 2224-607X (Paper) ISSN 2225-0565 (Online)
Vol 2, No.1, 2012
Forecast Analysis of Food Price Inflation in Pakistan: Applying
Rationality Criterion for VAR Forecast
Ms. Madiha Riaz
1. Department of Economics, The Islamia University of Bahawalpur, Pakistan
*E-mail of the corresponding author: madihatarar@hotmail.com
Abstract
Forecast performance is considered a stringent test of an econometric model. An accurate forecasting system is necessary for every industry to be able to take appropriate actions for future planning, and planning creates a substantial need for forecasts. The purpose of this study is to evaluate forecast efficiency using the rationality criterion for forecasts. It is therefore designed to analyze the forecasting efficiency of food price inflation and the consumer price index using thirty-three years of quarterly data for Pakistan covering the period 1975 to 2008. Forecasts are obtained from a VAR model specification. Four forecast-accuracy measures, namely Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Theil's Inequality Coefficient (TIC), are used to select the most accurate forecast from the VAR. These forecasts are then evaluated on the basis of the rationality criteria defined. We find that the food price forecasts are consistent, efficient and fulfil the given criteria of weak and strong rationality. We propose that assessing forecasts against the different criteria applied here makes them more reliable and better suited for use in policymaking and management decisions.
Keywords: Food Price Forecasts, Weak Rationality, Strong Rationality, Strict Rationality
1. Introduction
Forecast performance is considered a stringent test of an econometric model, particularly when that model is based on well-designed economic theory, since forecast performance is assumed to provide support for the theory. It is a common notion that good forecasting performance constitutes a 'seal of approval' for the empirical model, and therefore for the theory on which the model is based. An accurate forecasting system is necessary for every industry to be able to take appropriate actions for the future. It is widely recognized that one of the most important functions of managers at all levels in an organization is planning, and planning creates a substantial need for forecasts.
The analysis of time series has a long history, dating back to Yule (1927), and forecasting is often the goal of a time series analysis. Time series analysis is generally used in business and economics to investigate the dynamic structure of a process, to find the dynamic relationships between variables, to perform seasonal adjustment of economic data, to improve regression analysis when the errors are serially correlated, and to produce point and interval forecasts for both level and volatile data series. The accuracy of a forecast is important to policymakers.
The traditional measure of forecast efficiency was a comparison of RMSEs: a forecast with a lower RMSE was considered the best among competing forecasts with higher RMSEs. A thorough criticism of RMSE was made by Armstrong et al. (1995). After the rejection of the conventional tools for analyzing forecast efficiency, the cointegration approach known as consistency was introduced; this technique was used by Liu et al. (1992) and Aggerwal et al. (1995) to assess the unbiasedness, integration and cointegration characteristics of macroeconomic data and their respective forecasts. Hafer et al. (1985), McNees (1986), Pearce (1987) and Zarnowitz (1984, 1985, 1993) place great weight on minimum mean square error (MSE) but do not incorporate accuracy analysis convincingly in their tests of forecasts.
Many researchers have contributed to rationality testing, such as Carlson (1977), Figlewski et al. (1981), Friedman
(1980), Gramlich (1983), Mullineaux (1978), Pearce (1979) and Pesando (1975). Many studies examine the rationality of IMF and OECD forecasts, including Holden et al. (1987), Ash et al. (1990, 1998), Artis (1996), Pons (1999, 2000, 2001), Kreinin (2000), Oller et al. (2000) and Batchelor (2001); these studies show that the IMF and OECD forecasts pass most of the tests of rationality. The doctrine of rationality is defined by Lee (1991): expectations are said to be rational if they fully incorporate all of the information available to the agents at the time the forecast is made. The efficiency of forecasts has been analyzed through different approaches, e.g. consistent forecasts, efficient forecasts and rational forecasts. Bonham et al. (1991) include a test for conditional efficiency in the definition of strong rationality. In order to analyze the rationality of price forecasts, Bonham et al. (1991) define a hierarchy of rationality tests, starting from 'weak rationality' and extending to 'strict rationality':
• Weak rationality
• Sufficient rationality
• Strong rationality
• Strict rationality
2. Rationality
2.1 Weak Rationality
Most of the applied work, such as Evans et al. (1984), Friedman (1980), Pearce (1987) and Zarnowitz (1984, 1985), viewed rationality in terms of the necessary conditions of unbiasedness and information efficiency [2]. The same notion of weak rationality was adopted by Bonham et al. (1991): the forecast must be unbiased and meet the tests of weak information efficiency. Ruoss (2002) stated that unbiasedness is often tested using the Theil-Mincer-Zarnowitz equation, a regression of the actual values on a constant and the forecast values. Clement (1998) suggested running a regression of the forecast error on a constant; if the constant deviates from zero, the hypothesis that the forecast is unbiased is rejected.
2.2 Sufficient Rationality
The forecast must be weakly rational and must pass a more demanding test of sufficient orthogonality, namely, that the forecast errors are not correlated with any variable in the information set available at the time of prediction.
2.3 Strong Rationality
The forecast must be sufficiently rational and pass tests of conditional efficiency. Conditional efficiency requires a comparison of forecasts [3]. Call some sufficiently rational forecast a benchmark, and combine the benchmark with some competing forecast. Conditional efficiency refers to the concept of Granger et al. (1973), which measures the reduction in RMSE that occurs when a forecast is combined with one of its competitors. In line with this notion, Granger (1989) suggests that combining often produces a forecast superior to both components. A similar idea appears in Timmermann (2006), who asks whether forecasts can be improved by combining WEO forecasts with the Consensus forecasts. Stock et al. (2001) reported broad support for simple combinations of forecasts in a study of a large cross-section of macroeconomic and financial variables. If the combination produces an RMSE that is significantly smaller than the benchmark RMSE, the latter fails the test for conditional efficiency because it has not efficiently utilized some information contained in the competing forecast.
[2] The same kind of unbiasedness and efficiency notion was built by Eichenbaum et al. (1988) and Razzak (1997).
[3] Emanating from the classic study by Bates et al. (1969), a long literature on forecast combination, summarized by Clemen (1989), Diebold et al. (1996) and Timmermann (2005), has found evidence that combined forecasts tend to produce better out-of-sample performance than individual forecasting models.
2.4 Strict Rationality
Bonham et al. (1991) explained in their study that a statement about rationality should not depend on an arbitrary selection of time periods. A forecast is strictly rational if it passes tests of strong rationality in a variety of sub-periods. Regarding empirical results on the rationality of forecasts, Lee (1991) explained that forecasts fail to be rational in the strong sense even though they are not rejected by the conventional tests of weak-form rationality. Ruoss (2002) examined the rationality of forecasts for the Swiss economy and found that the GDP forecasts in the sample do not pass the most stringent test, i.e., the test of strong informational efficiency, because in some cases forecast errors correlate with the forecasts of the other institutes.
Similar results are shown by Bonham et al. (1991): the most stringent criteria for testing rationality will not be useful for empirical work, since on these criteria there might not be a rational forecast of inflation. Thus there is a tension between what econometricians would like to conclude about rationality and the imperative that agents act on the information they have. This tension might be eliminated by relaxing the criterion that defines strict rationality.
Razzak (1997) and Rich (1989) test the rationality of the National Bank of New Zealand's survey data on inflation expectations and the SRC expected-price-change data, respectively. Both studies reach the same conclusion: the results do not reject the null hypotheses of unbiasedness, efficiency and orthogonality for a sample from their particular survey data series. A study of the US and Sweden by Bryan et al. (2005) concludes that the US data seem very unsupportive of near-rationality [4], whereas the Swedish evidence is more inconclusive. From all this discussion it can be inferred that the central goal is to produce unbiased and efficient forecasts with uncorrelated forecast errors. Typically, as mentioned by Yin-Wong Cheung and Menzie David Chinn (1997), when examining forecast accuracy researchers examine the mean, variance and serial-correlation properties of the forecast error. Following the basic principles of economic forecasting, the performance of a forecast can be evaluated in terms of being unbiased, being efficient and having uncorrelated errors.
3. Plan of Study
The aim of this study is to assess forecast accuracy by means of rationality tests applied to forecasts of food price inflation and consumer price index data for Pakistan, which are essential for efficient planning by farmers and other industries connected to food production. Such forecasts are also of interest to governments and other organizations. The study uses 33 years of quarterly data covering the period 1975-2008. We obtain forecasts from a VAR model and select a number of alternative criteria (Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Theil's Inequality Coefficient (TIC)) for measuring forecast accuracy when selecting the best forecasts. To test whether the forecasts are biased, erratic and unreliable, or instead use existing information in a reasonably effective manner, we apply rationality tests; these issues of forecast efficiency are rarely addressed. The criteria give different rankings, so there is no guarantee that a forecast that performs well under one criterion is satisfactory under the others. Any conclusion from a given data set should be regarded only as an indicator of forecasting ability and not as proof of the correctness of the underlying model and criterion for that data.
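For concreteness, the four accuracy measures named above can be computed as follows. This is a minimal NumPy sketch, not code from the paper; the two short series are purely illustrative, and TIC is written in its standard Theil-U form:

```python
import numpy as np

def accuracy_metrics(actual, forecast):
    """RMSE, MAE, MAPE (in percent) and Theil's Inequality Coefficient.

    The forecast error is defined, as in the text, as forecast minus actual.
    """
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = forecast - actual
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mape = 100.0 * np.mean(np.abs(err / actual))   # undefined if actual == 0
    tic = rmse / (np.sqrt(np.mean(forecast ** 2)) + np.sqrt(np.mean(actual ** 2)))
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "TIC": tic}

# Illustrative series: a forecast that is off by one unit each quarter
actual = np.array([100.0, 102.0, 104.0, 106.0])
forecast = np.array([101.0, 101.0, 105.0, 105.0])
print(accuracy_metrics(actual, forecast))
```

TIC lies between 0 (perfect forecast) and 1, which is what makes it convenient for ranking competing forecasts on a common scale.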
In order to test the performance of the food price inflation forecast, we forecast two data series, namely food price inflation (CPI food, as a proxy for food price inflation) and the general consumer price index (CPIG). These two series were selected because of their strong causality with each other.
Quarterly figures are taken from the IMF's International Financial Statistics (2009) and the World Bank's World Development Indicators (2009). Data are taken on a quarterly basis for the period 1974-75 to 2007-08, comprising 133 observations from 1974Q2 to 2008Q2. The following sections explain the framework of analysis and discuss the results.
3.1 Framework of Analysis
[4] The proposition of near-rationality of inflation expectations was suggested by the work of Akerlof et al. (2000).
We use the VAR approach presented by Sims (1980) for multivariate analysis. In the estimation of the VAR we use food price inflation alternately with four other variables: real GDP, M2, the interest rate and the exchange rate. A VAR model consists of a set of seemingly unrelated regression (SUR) equations. To tackle autocorrelation, a sufficient lag structure has to be considered in the specification of the VAR model. However, to preserve parsimony, the lag length needs to be justified; we therefore started with a lag of eight periods and then followed a 'general to specific' diagnostic/specification procedure. We applied a Wald test to the restriction that all the coefficients at the eighth lag are equal to zero. If this restriction was accepted, the model was re-estimated with a seven-period lag, and the same procedure was repeated until the test results supported rejection of the null hypothesis. Once the VAR model was estimated, we used the selected VAR specification to obtain forecasts for the further application of the rationality tests. Performance tests of the forecasts were based on OLS.
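The 'general to specific' lag search described above can be sketched as follows. This is not the study's code: it is a single-equation illustration on simulated data, using an F-type statistic for the top-lag restriction as a small-sample analogue of the Wald test, and the critical value of 3.0 is a rough placeholder rather than a tabulated value:

```python
import numpy as np

def lagged_design(data, p):
    """Stack y_t on an intercept and p lags of all k series."""
    T, k = data.shape
    Y = data[p:]
    X = np.hstack([np.ones((T - p, 1))] +
                  [data[p - i:T - i] for i in range(1, p + 1)])
    return Y, X

def top_lag_f_stat(data, p, eq=0):
    """F statistic for H0: all coefficients at lag p are zero in equation
    `eq`, comparing the VAR(p) equation with the VAR(p-1) restriction on
    the same estimation sample."""
    Y, Xu = lagged_design(data, p)
    Xr = Xu[:, : 1 + data.shape[1] * (p - 1)]   # drop the lag-p block
    y = Y[:, eq]

    def ssr(X):
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        e = y - X @ b
        return e @ e

    q = data.shape[1]                           # restrictions per equation
    dfu = len(y) - Xu.shape[1]
    return ((ssr(Xr) - ssr(Xu)) / q) / (ssr(Xu) / dfu)

# Simulated bivariate system in which only the first lag matters
rng = np.random.default_rng(0)
T = 200
data = np.zeros((T, 2))
for t in range(1, T):
    data[t] = 0.8 * data[t - 1] + rng.normal(size=2)

# General to specific: start long, drop the top lag while it is insignificant
p = 8
while p > 1 and top_lag_f_stat(data, p) < 3.0:
    p -= 1
print("selected lag length:", p)
```

In the study the same logic is applied system-wide with a proper Wald statistic; here a per-equation F comparison of restricted and unrestricted residual sums of squares conveys the idea.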
3.1.1 Weak Rationality Test
A forecast must be unbiased and meet tests of weak information efficiency to be weakly rational.
In the following equation:

P^o_t = α_0 + α_1 P^e_t + ε_t                                   (1)

P^e is an unbiased forecast of P^o if ε_t is serially uncorrelated and the coefficients α_0 and α_1 are insignificantly different from zero and one, respectively. Weak information efficiency means that the forecast errors E_t = P^e_t − P^o_t are uncorrelated with the past values of the predicted variable. To test the weak efficiency hypothesis we estimate the following regression:

E_t = α_0 + Σ_{i=1}^{m} α_i P^o_{t−i} + ε_t                      (2)

and test the joint hypothesis

H_0: α_0 = α_j = 0 for all j = 1, …, m                           (3)

Rejection of this hypothesis implies that past values help to explain the forecast errors, whereas failure to reject it indicates that the forecast error at time t is independent of the past information contained in the relevant observed price index.
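As a hedged illustration of how equations 1-3 might be estimated, the following NumPy sketch runs the Mincer-Zarnowitz regression and the weak-efficiency regression by OLS. Variable names are illustrative, and a full application would add standard errors and the joint F or Wald statistic:

```python
import numpy as np

def mincer_zarnowitz(actual, forecast):
    """Equation 1: regress P^o_t on a constant and P^e_t.
    Unbiasedness requires (alpha_0, alpha_1) close to (0, 1)."""
    X = np.column_stack([np.ones_like(forecast), forecast])
    coef, *_ = np.linalg.lstsq(X, actual, rcond=None)
    return coef  # (alpha_0_hat, alpha_1_hat)

def weak_efficiency_coefs(errors, actual, m=4):
    """Equation 2: regress forecast errors E_t on a constant and m lags
    of the observed series. Under H0 (eq. 3) all coefficients are zero."""
    T = len(errors)
    X = np.column_stack([np.ones(T - m)] +
                        [actual[m - i:T - i] for i in range(1, m + 1)])
    coef, *_ = np.linalg.lstsq(X, errors[m:], rcond=None)
    return coef
```

With a forecast that reproduces the actual series exactly, the Mincer-Zarnowitz estimates are (0, 1) and the efficiency coefficients are all zero; with real data one would test the joint restriction rather than inspect the point estimates.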
3.1.2 Sufficient Rationality Test
The doctrine of sufficient rationality states that the forecast errors are not correlated with any variable in the information set available at the time of the forecast. If Z_t is a variable, or a vector of variables, used to build our forecast model, then Z_t is the exogenous variable in the following equation:

E_t = α_0 + Σ_{i=1}^{m} α_i Z_{t−i} + ε_t                        (4)

After estimating equation 4 we test the following hypothesis:

H_0: α_0 = α_j = 0 for all j = 1, …, m                           (5)

Rejection of this hypothesis indicates that the information contained in the past values of the related series has not been used efficiently in forming the forecast.
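The orthogonality test of equations 4-5 can be sketched the same way. The code below is illustrative rather than the paper's own: Z stands for any information-set variable (e.g. M2 or the interest rate), and the two simulated error series show how the F statistic separates errors that are unrelated to Z from errors that are predictable from it:

```python
import numpy as np

def orthogonality_f(errors, z, m=4):
    """Equation 4: regress forecast errors on a constant and m lags of Z,
    then form the F statistic for H0: alpha_0 = ... = alpha_m = 0 (eq. 5)."""
    T = len(errors)
    y = np.asarray(errors, dtype=float)[m:]
    X = np.column_stack([np.ones(T - m)] +
                        [np.asarray(z, dtype=float)[m - i:T - i]
                         for i in range(1, m + 1)])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    ssr_u = resid @ resid
    ssr_r = y @ y                   # restricted model: all coefficients zero
    q = X.shape[1]
    df = len(y) - q
    return ((ssr_r - ssr_u) / q) / (ssr_u / df)

rng = np.random.default_rng(2)
z = rng.normal(size=120)
clean = rng.normal(size=120)                     # errors unrelated to Z
sloppy = np.r_[0.0, 0.9 * z[:-1]] + 0.1 * rng.normal(size=120)  # predictable from Z
print("F (unrelated):  ", orthogonality_f(clean, z))
print("F (predictable):", orthogonality_f(sloppy, z))
```

A large F statistic for the second series signals that information in Z was available but not exploited, i.e. a failure of sufficient rationality.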
3.1.3 Strong Rationality Test
A forecast is said to be strongly rational if it passes the test of conditional efficiency suggested by Granger et al. (1973). Conditional efficiency requires a comparison of forecasts. Call some sufficiently rational forecast the benchmark and combine the benchmark with some competing forecast. Then estimate the following regression:

D_t = α + β [S_t − S̄] + ε_t                                      (6)
Where D_t and S_t are the difference and the sum of the benchmark and combination forecast errors, respectively,
and S̄ is the mean of the sum. The null hypothesis of conditional efficiency, that the combination does not
produce a lower RMSE, is α = β = 0. An F test is appropriate if β > 0 and the mean errors of both forecasts have
the same sign as α. If the mean errors of the two forecasts do not have the same sign, then α cannot be interpreted
as an indicator of the relative bias of the two forecasts.
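The regression in equation 6 is straightforward to set up once the two error series are in hand. A sketch, assuming ordinary least squares for α and β (the helper name is ours, not the study's):

```python
import numpy as np

def conditional_efficiency(bench_err, comb_err):
    """Estimate eq. 6: D_t = alpha + beta * (S_t - S_bar) + eps_t, where
    D_t and S_t are the difference and sum of the benchmark and
    combination forecast errors and S_bar is the mean of S_t."""
    e1 = np.asarray(bench_err, dtype=float)   # benchmark forecast errors
    e2 = np.asarray(comb_err, dtype=float)    # combination forecast errors
    d = e1 - e2
    s = e1 + e2
    x = s - s.mean()                          # demeaned sum
    X = np.column_stack([np.ones(len(d)), x])
    (alpha, beta), *_ = np.linalg.lstsq(X, d, rcond=None)
    return alpha, beta
```

A positive β indicates that the benchmark has the larger error variance, since β is proportional to the difference between the two error variances.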
3.1.4 Strict Rationality Test
A forecast is strictly rational if it passes tests of strong rationality in a variety of sub-periods. If a strongly
rational forecast passes the same test, based on equation 6, in sub-periods, then according to Bonham (1991) that
particular forecast is deemed strictly rational.
4. Results and Discussions
Food price inflation (CPIF) and the general consumer price index (CPIG) are each modelled as a VAR(1, 2) for
our data series. Four variables are included in each model, i.e., real GDP, M2, the interest rate, and the exchange
rate, to estimate the VAR.
Table 1.1 in the appendix reports the forecast statistics: Root Mean Squared Error (RMSE), Mean Absolute Error
(MAE), Mean Absolute Percentage Error (MAPE), and the Theil Inequality Coefficient (TIC). In every case the
forecast error is defined as the forecast value minus the actual value. The MAE is a measure of overall accuracy
that gives an indication of the degree of spread, with all errors assigned equal weights. The MSE is also a
measure of overall accuracy that indicates the degree of spread, but large errors are given additional weight; it is
the most common measure of forecasting accuracy. Often the square root of the MSE, the RMSE, is considered,
since the seriousness of the forecast error is then expressed in the same units as the actual and forecast values
themselves. The MAPE is a relative measure that corresponds to the MAE. The MAPE is the most useful
measure for comparing the accuracy of forecasts across different items or products, since it measures relative
performance. A MAPE below 10 percent is interpreted as highly accurate forecasting, between 10 and 20 percent
as good forecasting, between 20 and 50 percent as reasonable forecasting, and over 50 percent as inaccurate
forecasting. Theil's Inequality Coefficient (TIC) is another statistical measure of forecast accuracy: a Theil's U
greater than 1.0 indicates that the forecast model performs worse than a naïve forecast, while a value less than
1.0 indicates that it performs better; the closer U is to 0, the better the model. In sum, the VAR model yields the
best forecasts for our data series (see the statistics in table 1.1 for details).
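The four accuracy statistics can be reproduced directly from the error definition used here (forecast minus actual). A minimal sketch, with an illustrative function name of our own and the Theil coefficient computed in its usual normalized form:

```python
import numpy as np

def forecast_accuracy(actual, forecast):
    """Compute RMSE, MAE, MAPE, and Theil's inequality coefficient (TIC).
    The forecast error is forecast minus actual, as in the text.
    MAPE assumes the actual series contains no zeros."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = forecast - actual
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))                 # equal weight to all errors
    mape = np.mean(np.abs(err / actual)) * 100.0
    # TIC normalizes RMSE so the result lies between 0 and 1.
    tic = rmse / (np.sqrt(np.mean(forecast ** 2)) + np.sqrt(np.mean(actual ** 2)))
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "TIC": tic}
```
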
4.1 Rationality Test for Forecasts
Bonham and Dacy (1991) classify the rationality of time series forecasts as (1) weakly rational, (2) sufficiently
rational, (3) strongly rational, and (4) strictly rational.
4.1.1 Weak Rationality
To be weakly rational, a forecast must be (a) unbiased and (b) pass tests of weak informational efficiency.
In this part we test unbiasedness: we regress the observed data series on the forecast and obtain the forecast errors.
CPIF = 0.5539311996 + 1.001230912*F1
(0.191) (296.822)***
CPIG = 1.591295003 + 1.000070449*F2
(0.681) (400.601)***
The forecasts are significant in explaining the observed series; the t-values in parentheses indicate their significance.
Unbiasedness Tests
Breusch-Godfrey Serial Correlation LM Test
Table 1.2    H0: serially uncorrelated errors

Forecast    F-statistic    Probability    Lag length
CPIF        0.605          0.438          1
CPIG        5.751          0.004          2
Table 1.2 reports the serial-correlation tests on the forecast errors. The CPIG errors are serially correlated,
whereas the CPIF errors are serially uncorrelated, which supports the unbiasedness of the CPIF forecasts; it
remains to verify that the coefficients are insignificantly different from zero and one.
Table 1.3    H0: C(1) = 0, C(2) = 1

Forecast    F-statistic    Probability    Chi-square    Probability
CPIF        0.539          0.585          1.077         0.583
CPIG        0.754          0.473          1.507         0.471
Table 1.3 shows that the CPIF and CPIG forecast coefficients are insignificantly different from zero and one, as
the null hypothesis is not rejected.
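The Table 1.3 restriction, a zero intercept and unit slope in the regression of the observed series on the forecast, is the classic Mincer-Zarnowitz unbiasedness test. A numpy sketch with the F statistic built from the restricted and unrestricted sums of squares (the function name is our own, not the study's):

```python
import numpy as np

def mincer_zarnowitz_f(actual, forecast):
    """Regress actual on [1, forecast] and return the F statistic for
    H0: intercept = 0 and slope = 1 (an unbiased forecast)."""
    y = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    X = np.column_stack([np.ones(len(y)), f])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ssr_u = np.sum((y - X @ beta) ** 2)    # unrestricted SSR
    ssr_r = np.sum((y - f) ** 2)           # restricted: y = 0 + 1*f + e
    q, n, k = 2, len(y), 2                 # 2 restrictions, 2 parameters
    return ((ssr_r - ssr_u) / q) / (ssr_u / (n - k))
```

A large F rejects unbiasedness; the small F statistics in table 1.3 (0.539 and 0.754) fail to reject it.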
To test the weak informational efficiency of the forecasts, we regress the forecast errors on past values of the
observed series and find that the past values are uncorrelated with the forecast errors.
E1 = -0.3925389311 - 0.001473105748*CPIF (-1)
(-0.136) (-0.431)
E2 = -1.415973065 - 0.0003045008285*CPIG (-1)
(-0.608) (-0.120)
Weak Informational Efficiency Tests
Table 1.4    H0: C(1) = C(2) = 0

Forecast    F-statistic    Probability    Chi-square    Probability
CPIF        0.566          0.569          1.131         0.568
CPIG        0.761          0.470          1.521         0.467
We fail to reject the joint hypothesis reported in table 1.4, which implies that past values do not help to explain
the forecast errors. Thus both CPIF and CPIG pass the test of weak informational efficiency, as is evident from
the table statistics. Acceptance of the null hypothesis indicates that the forecast error at time t is independent of
the past information contained in the relevant observed price index.
4.1.2 Sufficient Rationality
We regress the forecast errors on the information set used to estimate the VAR model, namely lags of real GDP,
M2, the interest rate, and the exchange rate. The doctrine of sufficient rationality states that the forecast errors
are not correlated with any variable in the information set available at the time of the forecast.
E1 = -4.977+ 0.0848*CPIF (-1) + 4.2e-06*RGDP (-1)-9.5e-05*M2 (-1)-1.404*R(-1)-0.816*ER(-1)
(-0.75) (2.59)*** (0.25) (-2.03) ** (-1.71)* (-1.64)*
E2 = 0.025+0.26*CPIG (-1)-1.4e-06*RGDP (-1)-7.9e-06*M2 (-1) -0.02*R (-1)-0.017*ER (-1)
(0.14) (5.091) *** (-2.86) *** (-4.00) *** (-1.17) (-1.98) **
Sufficient Rationality Tests
Table 1.5    H0: all coefficients are zero

Forecast    F-statistic    Probability    Chi-square    Probability
CPIF        1.470          0.194          8.823         0.184
CPIG        5.000          0.180          29.999        0.196
Table 1.5 reports the results for the sufficient rationality criterion. Rejection of the above hypothesis would
indicate that the information contained in the past values of the related series has not been used efficiently in
forming the forecast; since the null hypothesis is not rejected here, the available information has been used in
making these forecasts. Therefore both CPIF and CPIG fulfil the sufficient rationality criterion.
4.1.3 Strong Rationality
A forecast is said to be strongly rational if it passes the test of conditional efficiency suggested by Granger et al.
(1973). Conditional efficiency requires a comparison of forecasts. To carry out this comparison
we need a benchmark forecast; for this purpose we obtain forecasts of CPIF from an ARIMA(1, 1, 1)
(autoregressive integrated moving average) model5 and combine this benchmark (ARIMA) with the competing
(VAR) forecast. We compute the forecast errors, form the difference and the sum of the benchmark and
combination forecast errors, and obtain the mean of the sum in order to estimate α and β. The results, reported in
table 1.6 in the appendix, indicate that the forecast of CPIF obtained from the VAR is strongly efficient when
combined with an ARIMA forecast of CPIF.
4.1.4 Strict Rationality
A forecast is strictly rational if it passes tests of strong rationality in a variety of sub-periods. In this study the
forecasts of CPIF met the strong efficiency criterion, so we estimated equation 6 in the sub-periods given below
and found that CPIF does not satisfy the strict rationality criterion.
We break the sample into the following sub-periods:
• 1975Q3 to 1980Q2
• 1980Q3 to 1985Q4
• 1986Q1 to 1990Q1
• 1990Q2 to 1995Q2
• 1995Q3 to 2000Q4
• 2001Q1 to 2005Q2
• 2005Q3 to 2008Q2
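Strict rationality simply re-runs the equation 6 regression inside each of these windows. A sketch with synthetic error series (the data, seed, and helper code are ours for illustration; only the sub-period boundaries follow the study):

```python
import numpy as np

# Stand-in error series over the 132 quarters from 1975Q3 to 2008Q2
# (synthetic data for illustration only, not the study's errors).
rng = np.random.default_rng(7)
bench_err = rng.normal(size=132)
comb_err = 0.8 * bench_err + 0.2 * rng.normal(size=132)

# The study's sub-periods expressed as quarter offsets from 1975Q3.
windows = [(0, 20), (20, 42), (42, 59), (59, 80),
           (80, 102), (102, 120), (120, 132)]

results = []
for start, stop in windows:
    e1, e2 = bench_err[start:stop], comb_err[start:stop]
    d, s = e1 - e2, e1 + e2                     # eq. 6 ingredients
    X = np.column_stack([np.ones(len(d)), s - s.mean()])
    (alpha, beta), *_ = np.linalg.lstsq(X, d, rcond=None)
    results.append((start, stop, alpha, beta))
# Strict rationality holds only if alpha = beta = 0 cannot be rejected
# in every one of the seven windows.
```
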
Conclusion
Result summary, quarterly data ("1" for meeting the criterion, "0" otherwise, NA = not applicable):

                                 4.1.1   4.1.2   4.2   4.3   4.4
Food price inflation               1       1      1     1    NA
Consumer price index general       0       1     NA    NA    NA

4.1.1 Unbiasedness Test
4.1.2 Weak Informational Efficiency Test
4.2 Sufficient Rationality
4.3 Strong Rationality
4.4 Strict Rationality
The result summary makes clear that the food price inflation forecasts qualify under the rationality criteria used
to check the accuracy of forecasts: they are unbiased and fulfil the criteria of weak, sufficient, and strong
rationality. The general consumer price index forecasts are only weakly rational. We infer from our analysis that
the food price forecasts are reliable for further application. Forecast rationality tests reduce the range of
uncertainty within which management judgment must be exercised, so they can be used in decision-making to
the benefit of organisations and policy makers. The food price inflation forecasts satisfy the criteria used to
check the performance of the VAR forecasts for the given data series. We suggest that policy makers and
planning authorities rely on these criteria to obtain better forecasts for further application. If such criteria are
applied to every forecast, more consistent and reliable results can be expected.
5. For more detail see Box, G. E. P. and G. M. Jenkins (1976), "Time Series Analysis, Forecasting and Control",
Holden-Day, San Francisco.
References
Aggarwal, R. Mohanty, S. and Song, F. (1995), “Are Survey Forecasts of Macroeconomic Variables Rational?”,
Journal of Business, 68, (1), 99-119.
Akerlof, G. A., William, T. D. and George, L. P. (2000), “Near-Rational Wage and Price Setting and the Long-
Run Phillips Curve”, Brookings Papers on Economics Activity, 1-60.
Armstrong, J. S. and Fildes R. (1995), “On the Selection of Error Measures for Comparisons among Forecasting
Methods”, Journal of Forecasting, Vol. 14, 67-71.
Artis, M. J. (1996), “How Accurate are the IMF’s Short-term Forecasts? Another Examination of the World
Economic Outlook”, International Monetary Fund, Working Paper No. 96/89.
Ash, J. C. K., Smyth, D. J. and Heravi, S. M. (1990), “The Accuracy of OECD Forecasts of the International
Economy”, International Journal of Forecasting, 6, 379-392.
Ash, J. C. K., Smyth, D. J. and Heravi, S. M. (1998), “Are OECD Forecasts Rational and Useful?: A Directional
Analysis”, International Journal of Forecasting, 14, 381-391.
Bakhshi, H., George, K. and Anthony, Y. (2003), “Rational Expectations and Fixed-Event Forecasts: an
Application to UK Inflation”, Bank of England, UK, Working Paper No. 176.
Batchelor, R. (2001), “How Useful are the Forecasts of Intergovernmental Agencies? The OECD and IMF
versus the Consensus”, Applied Economics, 33, 225-235.
Bonham, C. S. and Douglas, C. D. (1991), “In Search of a “Strictly Rational” Forecast”, The Review of
Economics and Statistics, Vol. 73, No. 2, 245-253.
Bonham, C. S. and Cohen, R. (1995), “Testing the Rationality of Price Forecasts: Comment”, The American
Economic Review, Vol. 85, 284-289.
Box, G. E. P. and G. M. Jenkins (1976), “Time Series Analysis, Forecasting and Control”, Holden-Day, San
Francisco.
Bryan, M. F. and Stefan, P. (2005), “Testing Near-Rationality Using Detailed Survey Data”, Federal Reserve
Bank of Cleveland, Working Paper No. 05-02.
Carlson, J. A. (1977), “A Study of Price Forecast”, Annals of Economic and Social Measurement, 6, 27-56.
Engle, R. F. and C. W. J. Granger, (1987), “Co-integration and Error Correction: Representation, Estimation and
Testing”, Econometrica, 55, 251-276.
Evans, George, and Gulmani, R. (1984), “Tests for Rationality of the Carlson-Parkin Inflation Expectation
Data”, Oxford Bulletin of Economics and Statistics, 46, 1-19.
Figlewski, Stephen and Paul W. (1981), “The Formation of Inflationary Expectations”, Review of Economics
and Statistics, 63, 1-10.
Friedman, Benjamin, M. (1980), “Survey Evidence on the ‘Rationality’ of Interest Rate Expectations”, Journal
of Monetary Economics, 6, 453-465.
Gramlich, Edward, M. (1983), “Models of Inflation Expectations Formation: A comparison of Households and
Economist Forecasts”, Journal of Money, Credit and Banking, 15, 155-173.
Granger, C. W. J., (1981), “Some Properties of Time Series Data and Their Use in Econometric Model
Specification”, Journal of Econometrics, 16, 121-130.
Granger, C. W. J., (1989), “Forecasting In Business and Economics”, Second edition Academic Press, London,
page 194.
Granger, C. W. J., (1996), “Can We Improve the Perceived Quality of Economic Forecast?”, Journal of Applied
Econometrics, Vol. 11, No. 5, 455-473.
Government of Pakistan, Economic survey (various issues), Islamabad, Ministry of Finance.
Hafer, R. W. and Hein, S. E. (1985), “On the Accuracy of Time Series, Interest Rate, and Survey Forecast of
Inflation”, Journal of Business, 5, 377-398.
Holden, K. and Peel, D. A. (1990), “On Testing for Unbiasedness and Efficiency of Forecasts”, Manchester
School, 58, 120-127.
Lee, Bong-soo (1991), “On the Rationality of Forecasts”, The Review of Economics and Statistics, Vol. 73, No.
2, 365-370.
Liu, P. and G.S. Maddala (1992), “Rationality of Survey Data and Tests for Market Efficiency in the Foreign
Exchange Markets”, Journal of International Money and Finance, 11, 366-381
McNees, Stephen, K. (1986), “The Accuracy of Two Forecasting Techniques: Some Evidence and
Interpretations”, New England Economics Review, April, 20-31.
Mullineaux, D. J. (1978), “On Testing for Rationality: Another Look at the Livingston Price Expectations Data”,
Journal of Political Economy, 86, 329-336.
Pesando, J. E. (1975), “A Note on the Rationality of Livingston Price Expectations”, Journal of Political
Economy, 83, 849-858.
Pons, J. (1999), “Evaluating the OECD’s Forecasts for Economic Growth”, Applied Economics, 31, 893-902.
Pons, J. (2000), "The Accuracy of IMF and OECD Forecasts for G7 Countries", Journal of Forecasting, 19, 56-63.
Razzak, W. A. (1997), “Testing the Rationality of the National Bank of New Zealand’s Survey Data”, National
Bank of New Zealand, G97/5
Rich, R. W. (1989), “Testing the Rationality of Inflation from Survey Data: Another Look at the SRC Expected
Price Change Data”, The Review of Economics and Statistics, Vol. 71, No. 4, 682-686.
Ruoss, E. and Marcel, S. (2002), “How accurate are GDP Forecast? An Empirical Study for Switzerland”,
Quarterly Bulletin, Swiss National Bank, Zurich, 3, 42-63.
Timmermann, A. (2005), “Forecast Combinations” forthcoming in Handbook of Economic Forecasting,
Amsterdam, North Holland.
Yule, G. U. (1927), “On a Method of Investigating Periodicities in Disturbed Series with Special Reference to
Wolfer’s Sunspot Numbers”, Philosophical Transactions of the Royal Society London, Ser. A, 226, 267-298.
Zarnowitz, V. (1985), “Rational Expectations and Macroeconomic Forecasts”, Journal of Business and
Economic Statistics, 3, 293-311.
Zarnowitz, V. and Phillip, B. (1993), “Twenty-Two Years of the NBER-ASA Quarterly Economic Outlook
Surveys: Aspects and Comparisons of Forecasting Performance”, Business Cycles, Indicators and Forecasting,
University of Chicago Press, 11-84
Appendix
Table 1.1
Forecast Statistics of Quarterly Data with the VAR Model

                                   CPIF     CPIG
Included observations              129      127
Root Mean Squared Error            5.644    5.025
Mean Absolute Error                3.276    3.291
Mean Absolute Percentage Error     1.874    1.405
Theil Inequality Coefficient       0.010    0.008
Bias Proportion                    0.74%    1.19%
Variance Proportion                0.26%    0.03%
Covariance Proportion              99.00%   98.78%
Table 1.6
Strong Rationality Test Results

                      Benchmark Forecast    When Combined With
Panel A               CPIF ARIMA            CPIF VAR
Sign of mean error    −ve                   −ve
α                     0.386856
β                     −0.042682
Prob.                 Bias
Conclusion            Cannot Reject

Panel B               CPIF VAR              CPIF ARIMA
Sign of mean error    −ve                   −ve
α                     −0.387
β                     0.043
Prob.                 0.7267
Conclusion            Cannot Reject

Sample: 1975Q3 to 2008Q2
The results in table 1.6 show that in Panel A the benchmark forecast is the ARIMA and in Panel B the benchmark
is the VAR, combined with an ARIMA forecast of CPIF. The sign of α matches the sign of the mean forecast
error in Panel B, so the test applies there.