This document summarizes a study that evaluates the forecast rationality of food price inflation in Pakistan using VAR models. The study obtains forecasts of food price inflation and the consumer price index from 1975 to 2008 using VAR models. It then assesses forecast rationality based on criteria including weak rationality, sufficient rationality, strong rationality, and strict rationality. Weak rationality requires unbiasedness and weak information efficiency. Sufficient rationality requires uncorrelated forecast errors and information variables. Strong rationality requires conditional efficiency through combining forecasts. Strict rationality requires passing tests of strong rationality in sub-periods. The study finds that the food price forecasts are consistent and efficient and meet the criteria of weak and strong rationality.
This document summarizes a study that evaluates the rationality of ARIMA forecasts for several economic indicators in Pakistan using annual data from 1975 to 2008. The study develops ARIMA models to forecast food price inflation, consumer price index, GDP per capita, and money supply. It then assesses the forecasts based on criteria for weak, sufficient, strong, and strict rationality proposed by previous research. Food price inflation forecasts were found to meet the criteria for weak and strong rationality, indicating they are reliable for policymaking. The other forecasts did not fully meet the rationality criteria except for money supply.
Application of consistency and efficiency test for forecasts (Alexander Decker)
This document evaluates the forecasting efficiency of food price inflation, consumer price index, GDP per capita, and money supply data from Pakistan from 1975 to 2008. It uses ARIMA models to generate forecasts, which are then evaluated for consistency and efficiency. Consistency tests whether the actual and forecasted values are cointegrated and have the same order of integration. Efficiency tests examine whether forecasts minimize forecast errors and fully incorporate available information. The study finds that food price forecasts are consistent and efficient based on these criteria.
This document presents a novel approach for combining individual realized volatility measures to form new estimators of asset price variability. It analyzes 30 different realized measures estimated from high frequency IBM stock price data from 1996-2007. It finds that a simple equally-weighted average of the realized measures is not outperformed by any individual measure and that combining measures provides benefits by incorporating information from different estimators. Optimal linear and multiplicative combination estimators are estimated and none of the individual measures are found to encompass all the information in other measures, further supporting the use of combination estimators.
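The core finding above, that a simple equally-weighted average of realized measures is competitive with any single measure, can be illustrated with a minimal sketch. The data below are simulated, not the IBM series from the paper: each hypothetical "measure" is the true variance plus independent noise, and averaging them reduces mean squared error, which is the intuition behind combination estimators.

```python
import random
random.seed(42)

true_var = 1.0
n_days, n_measures = 500, 5

# Each hypothetical "realized measure" = truth + independent noise.
# The noise level (0.3) is an illustrative assumption, not from the paper.
measures = [[true_var + random.gauss(0, 0.3) for _ in range(n_days)]
            for _ in range(n_measures)]

# Equally-weighted combination across measures, day by day.
combined = [sum(day) / n_measures for day in zip(*measures)]

def mse(est):
    return sum((x - true_var) ** 2 for x in est) / len(est)

individual_mses = [mse(m) for m in measures]
combined_mse = mse(combined)
```

With independent, unbiased measures the combination's error variance shrinks roughly in proportion to the number of measures, so the average beats each individual measure; the paper's point is that this benefit survives even with real, correlated estimators.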
This document describes a factor analysis conducted on survey data from 12 individuals evaluating news anchors on 6 scales measuring credibility. The factor analysis identified 2 underlying factors accounting for most of the variance in the data: Resourcefulness (knowledge, intelligence, believability) and External Appearances (likeability, attractiveness, appearance). Varimax rotation was used to interpret the 2 factors, which provide a simplified way for the news director to develop strategies around anchor qualities compared to using all 6 individual scales.
This document outlines the methodology used to analyze the relationship between company performance and risk disclosure in annual reports. Performance is measured as the change in net income and stock price. Risk disclosure is measured using proxies like number of pages/words about risk and number of times "risk" is mentioned. The study examines 24 Dutch AEX companies' annual reports from 2006-2007. It tests if 1) change in performance correlates with change in risk disclosure and 2) stability of performance correlates with risk disclosure. Multiple steps are taken, including calculating correlations both with and without financial companies.
This document analyzes demand forecasting methods for four pharmaceutical products. Four forecasting methods - naive, cumulative mean, simple moving average, and exponential smoothing - were evaluated based on mean error, mean absolute percentage error, and mean squared error. Visual Basic for Applications was used to optimize parameters for simple moving average and exponential smoothing. The best method for each product was determined to be the one with the lowest mean squared error. Forecasts and 90% confidence intervals are presented for next-month demand.
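The four methods named above can be sketched and compared by mean squared error as follows. This is a minimal illustration on made-up demand figures, not the pharmaceutical data or the VBA parameter optimization from the study; the window and smoothing constant are arbitrary assumptions.

```python
# Hypothetical monthly demand series (illustrative values only).
demand = [120, 132, 101, 134, 140, 128, 150, 145, 139, 160]

def naive(series):
    # Forecast for period t is the actual value of period t-1.
    return [series[t - 1] for t in range(1, len(series))]

def cumulative_mean(series):
    # Forecast for period t is the mean of all observations so far.
    return [sum(series[:t]) / t for t in range(1, len(series))]

def moving_average(series, window=3):
    # Forecast for period t is the mean of the last `window` observations.
    return [sum(series[t - window:t]) / window for t in range(window, len(series))]

def exponential_smoothing(series, alpha=0.3):
    # Forecast for period t is the smoothed level carried from t-1.
    level, forecasts = series[0], []
    for t in range(1, len(series)):
        forecasts.append(level)
        level = alpha * series[t] + (1 - alpha) * level
    return forecasts

def mse(actual, forecast):
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(forecast)

scores = {
    "naive": mse(demand[1:], naive(demand)),
    "cumulative mean": mse(demand[1:], cumulative_mean(demand)),
    "moving average": mse(demand[3:], moving_average(demand)),
    "exp. smoothing": mse(demand[1:], exponential_smoothing(demand)),
}
best = min(scores, key=scores.get)
```

Selecting the method with the lowest MSE, as the study does per product, is then a one-line `min` over the score table.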
On Confidence Intervals Construction for Measurement System Capability Indica...IRJESJOURNAL
Abstract: There are many criteria that have been proposed to determine the capability of a measurement system, all based on estimates of variance components. Some of them are the Precision to Tolerance Ratio, the Signal to Noise Ratio and the probabilities of misclassification. For most of these indicators, there are no exact confidence intervals, since the exact distributions of the point estimators are not known. In such situations, two approaches are widely used to obtain approximate confidence intervals: the Modified Large Samples (MLS) methods initially proposed by Graybill and Wang, and the construction of Generalized Confidence Intervals (GCI) introduced by Weerahandi. In this work we focus on the construction of the confidence intervals by the generalized approach in the context of Gauge repeatability and reproducibility studies. Since GCI are obtained by simulation procedures, we analyze the effect of the number of simulations on the variability of the confidence limits, as well as the effect of the size of the experiment designed to collect data on the precision of the estimates. Both studies allowed us to derive some practical implementation guidelines for the use of the GCI approach. We finally present a real case study in which this technique was applied to evaluate the capability of a destructive measurement system.
This document discusses an integrated model for sensitivity analysis and scenario analysis using breakeven analysis for operational and investment risk analysis. It was developed by Prof. Sreedhara Ramesh Chandra and Dr. Krishna Banana. The model aims to address limitations in existing sensitivity, scenario, and breakeven analysis models by integrating the three approaches. It introduces proportions and percentages to more precisely determine variable values. It also establishes relationships between scenario values and measures sensitivity through changes from a predetermined relational constant value (sales revenue). The model allows consideration of all cash flow determinants and provides a direct link between operational and investment risk measurements to improve investment decisions.
Statistics is the systematic collection, organization, analysis, and interpretation of data. It plays an important role in decision making by helping extract meaningful information from raw data. There are two main types of statistics - descriptive statistics which summarizes and presents data, and inferential statistics which makes inferences, tests hypotheses, and determines relationships in the data. Statistics has many applications in fields like business, medicine, economics and more. It helps simplify complex data, enable comparisons, identify trends, and aid decision making. Common statistical terms include population, sample, variables, attributes, and parameters. Data can be collected through various methods including direct observation, interviews, questionnaires, and more.
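The descriptive/inferential split described above can be made concrete with a short sketch. The sample values are hypothetical, and the confidence interval uses the normal approximation as a simplification for illustration.

```python
import math
import statistics

# Hypothetical sample of monthly sales figures (illustrative values only).
sample = [23, 29, 20, 32, 27, 25, 30, 28, 26, 24]

# Descriptive statistics: summarize and present the sample itself.
mean = statistics.mean(sample)
stdev = statistics.stdev(sample)

# Inferential statistics: infer something about the wider population,
# here a rough 95% confidence interval for the population mean
# (normal approximation, a simplification for small samples).
margin = 1.96 * stdev / math.sqrt(len(sample))
ci = (mean - margin, mean + margin)
```

The first two lines of computation describe only the ten observed values; the interval makes a claim, with stated uncertainty, about values not observed, which is the defining feature of inference.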
This document summarizes a journal article about modeling economic growth under uncertainty. It introduces a one-sector stochastic growth model where production depends on capital, labor, and a random variable. It maximizes the expected discounted utility of consumption to determine optimal policies. The model generalizes previous deterministic growth models by incorporating uncertainty. It analyzes the long-run properties of the stochastic growth process, showing properties like the existence of a unique stationary distribution are analogous to the steady state in deterministic models. The techniques used differ from previous work and help unify different approaches to modeling growth under uncertainty.
Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods (IJERA Editor)
This paper focuses on the analysis of forecasting sales using quantitative and qualitative methods. The forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether goals have been met with respect to measures, targets, and initiatives.
This document discusses estimating stochastic relative risk aversion from interest rates. It first introduces a model for deriving relative risk aversion from interest rates using a time inhomogeneous single factor short rate model. It then details the estimation methodology used, which calibrates the model to US LIBOR data to estimate a time series for the market price of risk and ex-ante bond Sharpe ratio. This allows deducing a stochastic process for relative risk aversion under a power utility function. Estimated mean relative risk aversion is 49.89. The document then introduces modifying a Real Business Cycle model to allow time-varying relative risk aversion, finding it better matches empirical consumption volatility than a baseline model.
Sensitivity analysis is the study of how uncertainty in the inputs of a mathematical model propagates to uncertainty in the model's outputs. It is useful for understanding relationships between inputs and outputs, identifying important inputs, and reducing uncertainty. Sensitivity analysis typically involves running the model many times while varying inputs, and calculating sensitivity measures from the resulting outputs to determine which inputs most influence uncertainty in the outputs. Common methods include variance-based approaches and screening methods.
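The Monte Carlo workflow described above, running the model many times with varied inputs and computing sensitivity measures from the outputs, can be sketched minimally. The toy model and its input ranges are assumptions for illustration; the sensitivity measure here is the input–output correlation, one of the simplest sampling-based measures.

```python
import random
random.seed(0)

# Toy model with two uncertain inputs (hypothetical): the output is far
# more sensitive to x1 than to x2 by construction.
def model(x1, x2):
    return 5.0 * x1 + 0.5 * x2

# Run the model many times while varying the inputs.
runs = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(2000)]
outputs = [model(x1, x2) for x1, x2 in runs]

def corr(xs, ys):
    # Pearson correlation, used here as a simple sensitivity measure.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

s1 = corr([r[0] for r in runs], outputs)
s2 = corr([r[1] for r in runs], outputs)
```

A large `s1` relative to `s2` flags `x1` as the input driving most of the output uncertainty; variance-based methods such as Sobol indices generalize this idea to nonlinear models and interactions.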
To Do Macroeconomics, Do We Need Microeconomic Foundations? (PAD Ancash)
The document argues that macroeconomics does not need microeconomic foundations, and instead it is microeconomics that needs foundations built from neuroscience and biology. It challenges five common arguments for microfoundations in macroeconomics. Specifically, it argues that 1) the Lucas critique does not necessarily require microfoundations, 2) there are no policy-invariant micro principles, 3) new Keynesian models are not truly founded on first principles, 4) there are no well-established micro principles given limitations of current choice theory, and 5) aggregate behavior cannot be built up from individual behavior in a straightforward way.
This PowerPoint presentation was prepared as part of the course STAT 591, titled Master's Seminar, during the third semester of the MSc in Agricultural Statistics at Agricultural College, Bapatla, under ANGRAU, Andhra Pradesh.
The document discusses fundamentals of data analysis for marketing research. It covers preparing data for analysis through editing, coding, weighting and variable transformation. Simple tabulation, frequency distributions, descriptive statistics and cross tabulation techniques are presented. An overview of statistical techniques includes univariate, bivariate and multivariate methods. Factors influencing statistical technique choices and concepts of hypothesis testing like null hypotheses and significance levels are also summarized.
10th Alex Marketing Club (Forecasting) by Dr. Haitham Maraei, 6 Jan 2018 (Mahmoud Bahgat)
Forecasting is an important part of marketing and business planning. There are many techniques for forecasting, including both qualitative and quantitative methods. Qualitative methods include surveys, expert opinions, and market experiments, while quantitative time series methods analyze past trends and patterns to predict the future. Effective forecasting requires understanding factors like demand trends, seasonality, elasticity, and uncertainty. The summary provides an overview of key concepts and challenges in forecasting for marketing and business.
The document summarizes a simulation study that examined the effects of using raw scores versus IRT-derived scores when operationalizing latent constructs in moderated multiple regression analyses. The study found that using raw scores can inflate Type 1 error rates for interaction terms under conditions of assessment inappropriateness. However, rescaling the scores using the Graded Response Model, a polytomous IRT model, mitigated these effects. The study supports the idea that IRT scores provide a more robust metric than raw scores in moderated regression analyses, especially under suboptimal assessment conditions.
This document is a thesis submitted by Jai Kedia for a degree in mathematics and business economics. It examines alternative risk measures to the traditional beta measure in predicting stock returns. The thesis provides an introduction and acknowledges the contributions of the advisors. It then presents an abstract that outlines the goal of analyzing if alternative risk measures such as higher moments, size, leverage, and price-to-book ratio can improve predictions of stock returns beyond just beta. Finally, it presents a table of contents that outlines the various chapters covering the return/risk relationship, modern portfolio theory, mathematical analysis of stock prices, a literature review on previous empirical studies, the empirical analysis conducted, and a conclusion.
Apoorva Javadekar - Conditional Correlations of Macro Variables and Implica... (Apoorva Javadekar)
This presentation by Apoorva Javadekar covers the structure of cross-country correlations for macro variables, and their asset pricing and risk sharing implications.
A dynamic Nelson-Siegel model with forward-looking indicators for the yield c... (FGV Brazil)
This paper proposes a Factor-Augmented Dynamic Nelson-Siegel (FADNS) model to predict the yield curve in the US that relies on a large data set of weekly financial and macroeconomic variables. The FADNS model significantly improves interest rate forecasts relative to the extant models in the literature. For longer horizons, it beats autoregressive alternatives, with a reduction in mean absolute error of up to 40%. For shorter horizons, it offers a good challenge to autoregressive forecasting models, outperforming them for the 7- and 10-year yields. The out-of-sample analysis shows that the good performance comes mostly from the forward-looking nature of the variables we employ. Including them reduces the mean absolute error by 5 basis points on average with respect to models that reflect only past macroeconomic events.
Date: 2017-03
Authors:
Vieira, Fausto José Araújo
Chague, Fernando Daniel
Fernandes, Marcelo
This paper examines how disagreement among investors about macroeconomic factors affects stock returns. The author finds that following periods of high disagreement, stocks highly sensitive to macroeconomic factors ("high macro beta stocks") earn lower future returns compared to less sensitive stocks ("low macro beta stocks"). This suggests high macro beta stocks become overvalued during high disagreement periods due to investors' differing views. Regression analyses show macroeconomic factor risk premiums are negatively related to lagged macro disagreement measures. The findings support theories that macroeconomic factors price risk but also show how disagreement can lead to speculative stock prices.
A logistic regression was conducted to predict if homeowners would accept or decline a solar panel subsidy offer based on household income and monthly mortgage payment. The full model was a good fit to the data and correctly classified 83.3% of cases. While the predictors were not statistically significant individually, they distinguished between acceptors and decliners as a set. Other factors may provide a better fitting model.
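A minimal sketch of the analysis described above can be written from scratch. The data points, scales, and labels below are entirely hypothetical (the study's actual data are not given here), and the fit uses plain gradient ascent rather than the statistical package the study presumably used; the point is only to show the predict-accept/decline-then-score-classification-rate workflow.

```python
import math
import random
random.seed(1)

# Hypothetical toy data: (income in $10k, mortgage in $100s) -> accepted (1)
# or declined (0) the subsidy offer. Values and labels are illustrative.
data = [((9, 4), 1), ((8, 5), 1), ((10, 3), 1), ((7, 6), 1),
        ((3, 9), 0), ((4, 8), 0), ((2, 10), 0), ((5, 7), 0),
        ((6, 6), 1), ((4, 9), 0), ((8, 4), 1), ((3, 8), 0)]

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp for numerical safety
    return 1.0 / (1.0 + math.exp(-z))

# Fit weights (bias, income, mortgage) by gradient ascent on the
# log-likelihood, updating once per observation.
w = [0.0, 0.0, 0.0]
for _ in range(5000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
        for i, xi in enumerate((1, x1, x2)):
            w[i] += 0.05 * (y - p) * xi

# Classify at the usual 0.5 threshold and compute the classification rate.
predictions = [1 if sigmoid(w[0] + w[1] * x1 + w[2] * x2) >= 0.5 else 0
               for (x1, x2), _ in data]
accuracy = sum(p == y for p, (_, y) in zip(predictions, data)) / len(data)
```

The "percentage of cases correctly classified" the study reports is exactly this `accuracy` figure; individual predictor significance would additionally require standard errors (e.g. Wald tests), which this sketch omits.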
This document discusses quantitative approaches to forecasting, including time series analysis and forecasting techniques. It covers the components of a time series, including trends, cycles, seasonality, and irregular components. Specific quantitative forecasting approaches covered include smoothing methods like moving averages, weighted moving averages, and exponential smoothing. Examples are provided to demonstrate how to perform moving averages and exponential smoothing on time series data for sales of headache medicine. The document aims to teach readers how to analyze time series data and select appropriate forecasting techniques.
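The time series components named above (trend and seasonality) can be separated with a short classical-decomposition sketch. The quarterly sales values are made up for illustration, not the headache-medicine data from the document; a centered 4-quarter moving average extracts the trend, and the ratio of actual to trend yields seasonal indices.

```python
# Hypothetical quarterly sales over 3 years (illustrative values only).
sales = [20, 35, 28, 17, 24, 41, 32, 21, 28, 47, 36, 25]

# Trend: centered 4-quarter moving average (mean of two adjacent
# 4-term averages, so the result aligns with an actual quarter).
trend = []
for t in range(2, len(sales) - 2):
    first = sum(sales[t - 2:t + 2]) / 4
    second = sum(sales[t - 1:t + 3]) / 4
    trend.append((first + second) / 2)

# Seasonal index per quarter: average ratio of actual sales to trend.
ratios = {q: [] for q in range(4)}
for i, tr in enumerate(trend):
    t = i + 2  # position in the original series
    ratios[t % 4].append(sales[t] / tr)
seasonal = {q: sum(r) / len(r) for q, r in ratios.items()}
```

An index above 1 marks a quarter that runs above trend (here the second quarter of each year) and below 1 a quarter that runs below it; dividing the series by these indices deseasonalizes it before applying the smoothing forecasts.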
This document discusses operations management and forecasting. It explains that operations management deals with designing and managing processes, products, services and supply chains to deliver goods and services customers want. Forecasting helps managers reduce uncertainty by predicting future demand to match supply. The document then discusses various forecasting methods including qualitative judgmental methods and quantitative mathematical modeling methods. It covers short, medium and long-range forecasting as well as different time series and causal modeling techniques.
We use data on twins matched to register-based information on earnings to examine the longstanding puzzle of non-existent compensating wage differentials. The use of twin data allows us to remove otherwise unobserved productivity differences that were a prominent reason for estimation bias in earlier studies. Using twin differences, we find evidence of positive compensation for adverse working conditions in the labor market.
This paper examines factors that explain and help forecast inflation in Pakistan. The authors specify an inflation model including standard monetary variables like money supply and credit growth, as well as interest rates, exchange rates, economic activity, and wheat support prices. Estimating the model from 1998-2005, the results indicate that monetary factors have played a dominant role in recent inflation in Pakistan, affecting prices with a one year lag. Money supply and credit growth are also good leading indicators that can be used to forecast future inflation.
The document discusses the key challenges facing Pakistan's economy. It outlines that Pakistan consumes more than it saves, imports more than it exports, and the government spends more than it earns in revenues. This leads to high fiscal deficits and reliance on external financing. Other challenges include a shrinking share of world trade, poor social indicators like education and health, high costs of doing business, weak governance and a lack of policy continuity between governments. Addressing these challenges is important for sustainable economic growth and development in Pakistan.
A transformational generative approach towards understanding Al-Istifham (Alexander Decker)
This document discusses a transformational-generative approach to understanding Al-Istifham, which refers to interrogative sentences in Arabic. It begins with an introduction to the origin and development of Arabic grammar. The paper then explains the theoretical framework of transformational-generative grammar that is used. Basic linguistic concepts and terms related to Arabic grammar are defined. The document analyzes how interrogative sentences in Arabic can be derived and transformed via tools from transformational-generative grammar, categorizing Al-Istifham into linguistic and literary questions.
Statistics is the systematic collection, organization, analysis, and interpretation of data. It plays an important role in decision making by helping extract meaningful information from raw data. There are two main types of statistics - descriptive statistics which summarizes and presents data, and inferential statistics which makes inferences, tests hypotheses, and determines relationships in the data. Statistics has many applications in fields like business, medicine, economics and more. It helps simplify complex data, enable comparisons, identify trends, and aid decision making. Common statistical terms include population, sample, variables, attributes, and parameters. Data can be collected through various methods including direct observation, interviews, questionnaires, and more.
This document summarizes a journal article about modeling economic growth under uncertainty. It introduces a one-sector stochastic growth model where production depends on capital, labor, and a random variable. It maximizes the expected discounted utility of consumption to determine optimal policies. The model generalizes previous deterministic growth models by incorporating uncertainty. It analyzes the long-run properties of the stochastic growth process, showing properties like the existence of a unique stationary distribution are analogous to the steady state in deterministic models. The techniques used differ from previous work and help unify different approaches to modeling growth under uncertainty.
Analysis of Forecasting Sales By Using Quantitative And Qualitative MethodsIJERA Editor
This paper focuses on analysis of forecasting sales using quantitative and qualitative methods. This forecast should be able to help create a model for measuring a successes and setting goals from financial and operational view points. The resulting model should tell if we have met our goals with respect to measures, targets, initiatives
This document discusses estimating stochastic relative risk aversion from interest rates. It first introduces a model for deriving relative risk aversion from interest rates using a time inhomogeneous single factor short rate model. It then details the estimation methodology used, which calibrates the model to US LIBOR data to estimate a time series for the market price of risk and ex-ante bond Sharpe ratio. This allows deducing a stochastic process for relative risk aversion under a power utility function. Estimated mean relative risk aversion is 49.89. The document then introduces modifying a Real Business Cycle model to allow time-varying relative risk aversion, finding it better matches empirical consumption volatility than a baseline model.
Sensitivity analysis is the study of how uncertainty in the inputs of a mathematical model propagates to uncertainty in the model's outputs. It is useful for understanding relationships between inputs and outputs, identifying important inputs, and reducing uncertainty. Sensitivity analysis typically involves running the model many times while varying inputs, and calculating sensitivity measures from the resulting outputs to determine which inputs most influence uncertainty in the outputs. Common methods include variance-based approaches and screening methods.
Parac hacer Macroeconomia Necesitamos Fundamentos MicroeconomicosPAD Ancash
The document argues that macroeconomics does not need microeconomic foundations, and instead it is microeconomics that needs foundations built from neuroscience and biology. It challenges five common arguments for microfoundations in macroeconomics. Specifically, it argues that 1) the Lucas critique does not necessarily require microfoundations, 2) there are no policy-invariant micro principles, 3) new Keynesian models are not truly founded on first principles, 4) there are no well-established micro principles given limitations of current choice theory, and 5) aggregate behavior cannot be built up from individual behavior in a straightforward way.
This powerpoint presentation was done as part of the course STAT 591 titled Mater's Seminar during Third semester of MSc. Agricultural Statistics at Agricultural College, Bapatla under ANGRAU, Andhra Pradesh.
The document discusses fundamentals of data analysis for marketing research. It covers preparing data for analysis through editing, coding, weighting and variable transformation. Simple tabulation, frequency distributions, descriptive statistics and cross tabulation techniques are presented. An overview of statistical techniques includes univariate, bivariate and multivariate methods. Factors influencing statistical technique choices and concepts of hypothesis testing like null hypotheses and significance levels are also summarized.
10th Alex Marketing Club (Forecasting) by Dr. Haitham Maraei 6 Jan-2018Mahmoud Bahgat
Forecasting is an important part of marketing and business planning. There are many techniques for forecasting, including both qualitative and quantitative methods. Qualitative methods include surveys, expert opinions, and market experiments, while quantitative time series methods analyze past trends and patterns to predict the future. Effective forecasting requires understanding factors like demand trends, seasonality, elasticity, and uncertainty. The summary provides an overview of key concepts and challenges in forecasting for marketing and business.
The document summarizes a simulation study that examined the effects of using raw scores versus IRT-derived scores when operationalizing latent constructs in moderated multiple regression analyses. The study found that using raw scores can inflate Type 1 error rates for interaction terms under conditions of assessment inappropriateness. However, rescaling the scores using the Graded Response Model, a polytomous IRT model, mitigated these effects. The study supports the idea that IRT scores provide a more robust metric than raw scores in moderated regression analyses, especially under suboptimal assessment conditions.
This document is a thesis submitted by Jai Kedia for a degree in mathematics and business economics. It examines alternative risk measures to the traditional beta measure in predicting stock returns. The thesis provides an introduction and acknowledges the contributions of the advisors. It then presents an abstract that outlines the goal of analyzing if alternative risk measures such as higher moments, size, leverage, and price-to-book ratio can improve predictions of stock returns beyond just beta. Finally, it presents a table of contents that outlines the various chapters covering the return/risk relationship, modern portfolio theory, mathematical analysis of stock prices, a literature review on previous empirical studies, the empirical analysis conducted, and a conclusion.
Apoorva Javadekar - Conditional Correlations of Macro Variables and Implica... – Apoorva Javadekar
This presentation by Apoorva Javadekar examines the structure of cross-country correlations of macro variables, along with their asset pricing and risk sharing implications.
A dynamic Nelson-Siegel model with forward-looking indicators for the yield c... – FGV Brazil
This paper proposes a Factor-Augmented Dynamic Nelson-Siegel (FADNS) model to predict the yield curve in the US that relies on a large data set of weekly financial and macroeconomic variables. The FADNS model significantly improves interest rate forecasts relative to the extant models in the literature. For longer horizons, it beats autoregressive alternatives, with a reduction in mean absolute error of up to 40%. For shorter horizons, it offers a good challenge to autoregressive forecasting models, outperforming them for the 7- and 10-year yields. The out-of-sample analysis shows that the good performance comes mostly from the forward-looking nature of the variables we employ. Including them reduces the mean absolute error by 5 basis points on average relative to models that reflect only past macroeconomic events.
Date: 2017-03
Authors:
Vieira, Fausto José Araújo
Chague, Fernando Daniel
Fernandes, Marcelo
This paper examines how disagreement among investors about macroeconomic factors affects stock returns. The author finds that following periods of high disagreement, stocks highly sensitive to macroeconomic factors ("high macro beta stocks") earn lower future returns compared to less sensitive stocks ("low macro beta stocks"). This suggests high macro beta stocks become overvalued during high disagreement periods due to investors' differing views. Regression analyses show macroeconomic factor risk premiums are negatively related to lagged macro disagreement measures. The findings support theories that macroeconomic factors price risk but also show how disagreement can lead to speculative stock prices.
A logistic regression was conducted to predict if homeowners would accept or decline a solar panel subsidy offer based on household income and monthly mortgage payment. The full model was a good fit to the data and correctly classified 83.3% of cases. While the predictors were not statistically significant individually, they distinguished between acceptors and decliners as a set. Other factors may provide a better fitting model.
This document discusses quantitative approaches to forecasting, including time series analysis and forecasting techniques. It covers the components of a time series, including trends, cycles, seasonality, and irregular components. Specific quantitative forecasting approaches covered include smoothing methods like moving averages, weighted moving averages, and exponential smoothing. Examples are provided to demonstrate how to perform moving averages and exponential smoothing on time series data for sales of headache medicine. The document aims to teach readers how to analyze time series data and select appropriate forecasting techniques.
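The smoothing methods this summary lists are easy to sketch concretely. A minimal pure-Python illustration (the sales figures below are hypothetical, not the document's headache-medicine example):

```python
def moving_average(series, window):
    """Forecast each period as the mean of the previous `window` observations."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series) + 1)]

def exponential_smoothing(series, alpha):
    """Single exponential smoothing: each smoothed value blends the newest
    observation with the previous smoothed value."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

sales = [120, 110, 130, 140, 125, 135]   # hypothetical weekly sales
ma = moving_average(sales, 3)            # 3-period moving averages
es = exponential_smoothing(sales, 0.2)   # smoothing constant alpha = 0.2
```

A larger `window` or a smaller `alpha` smooths more aggressively but reacts more slowly to genuine trend changes, which is the core trade-off these methods present.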
This document discusses operations management and forecasting. It explains that operations management deals with designing and managing processes, products, services and supply chains to deliver goods and services customers want. Forecasting helps managers reduce uncertainty by predicting future demand to match supply. The document then discusses various forecasting methods including qualitative judgmental methods and quantitative mathematical modeling methods. It covers short, medium and long-range forecasting as well as different time series and causal modeling techniques.
We use data on twins matched to register-based information on earnings to examine the longstanding puzzle of non-existent compensating wage differentials. The use of twin data allows us to remove otherwise unobserved productivity differences that were the prominent source of estimation bias in earlier studies. Using twin differences, we find evidence of positive compensation for adverse working conditions in the labor market.
This paper examines factors that explain and help forecast inflation in Pakistan. The authors specify an inflation model including standard monetary variables like money supply and credit growth, as well as interest rates, exchange rates, economic activity, and wheat support prices. Estimating the model from 1998-2005, the results indicate that monetary factors have played a dominant role in recent inflation in Pakistan, affecting prices with a one year lag. Money supply and credit growth are also good leading indicators that can be used to forecast future inflation.
The document discusses the key challenges facing Pakistan's economy. It outlines that Pakistan consumes more than it saves, imports more than it exports, and the government spends more than it earns in revenues. This leads to high fiscal deficits and reliance on external financing. Other challenges include a shrinking share of world trade, poor social indicators like education and health, high costs of doing business, weak governance and a lack of policy continuity between governments. Addressing these challenges is important for sustainable economic growth and development in Pakistan.
A transformational generative approach towards understanding al-istifham – Alexander Decker
This document discusses a transformational-generative approach to understanding Al-Istifham, which refers to interrogative sentences in Arabic. It begins with an introduction to the origin and development of Arabic grammar. The paper then explains the theoretical framework of transformational-generative grammar that is used. Basic linguistic concepts and terms related to Arabic grammar are defined. The document analyzes how interrogative sentences in Arabic can be derived and transformed via tools from transformational-generative grammar, categorizing Al-Istifham into linguistic and literary questions.
A trends of salmonella and antibiotic resistance – Alexander Decker
This document provides a review of trends in Salmonella and antibiotic resistance. It begins with an introduction to Salmonella as a facultative anaerobe that causes nontyphoidal salmonellosis. The emergence of antimicrobial-resistant Salmonella is then discussed. The document proceeds to cover the historical perspective and classification of Salmonella, definitions of antimicrobials and antibiotic resistance, and mechanisms of antibiotic resistance in Salmonella including modification or destruction of antimicrobial agents, efflux pumps, modification of antibiotic targets, and decreased membrane permeability. Specific resistance mechanisms are discussed for several classes of antimicrobials.
A unique common fixed point theorems in generalized d – Alexander Decker
This document presents definitions and properties related to generalized D*-metric spaces and establishes some common fixed point theorems for contractive type mappings in these spaces. It begins by introducing D*-metric spaces and generalized D*-metric spaces, defines concepts like convergence and Cauchy sequences. It presents lemmas showing the uniqueness of limits in these spaces and the equivalence of different definitions of convergence. The goal of the paper is then stated as obtaining a unique common fixed point theorem for generalized D*-metric spaces.
A universal model for managing the marketing executives in nigerian banks – Alexander Decker
This document discusses a study that aimed to synthesize motivation theories into a universal model for managing marketing executives in Nigerian banks. The study was guided by Maslow and McGregor's theories. A sample of 303 marketing executives was used. The results showed that managers will be most effective at motivating marketing executives if they consider individual needs and create challenging but attainable goals. The emerged model suggests managers should provide job satisfaction by tailoring assignments to abilities and monitoring performance with feedback. This addresses confusion faced by Nigerian bank managers in determining effective motivation strategies.
A usability evaluation framework for b2 c e commerce websites – Alexander Decker
This document presents a framework for evaluating the usability of B2C e-commerce websites. It involves user testing methods like usability testing and interviews to identify usability problems in areas like navigation, design, purchasing processes, and customer service. The framework specifies goals for the evaluation, determines which website aspects to evaluate, and identifies target users. It then describes collecting data through user testing and analyzing the results to identify usability problems and suggest improvements.
Abnormalities of hormones and inflammatory cytokines in women affected with p... – Alexander Decker
Women with polycystic ovary syndrome (PCOS) have elevated levels of hormones like luteinizing hormone and testosterone, as well as higher levels of insulin and insulin resistance compared to healthy women. They also have increased levels of inflammatory markers like C-reactive protein, interleukin-6, and leptin. This study found these abnormalities in the hormones and inflammatory cytokines of women with PCOS ages 23-40, indicating that hormone imbalances associated with insulin resistance and elevated inflammatory markers may worsen infertility in women with PCOS.
This document discusses validating risk models using intraday value-at-risk (VaR) and expected shortfall (ES) approaches with the Multiplicative Component GARCH (MC-GARCH) model. The study assesses different distributional assumptions for innovations in the MC-GARCH model and evaluates their effects on modeling and forecasting performance. Backtesting procedures are used to validate the models' predictive power for VaR and ES. Results show non-normal distributions best fit the intraday data and forecast ES, while an asymmetric distribution best forecasts VaR.
This document summarizes John Sneed's research on developing an earnings forecasting model based on theoretical factors rather than statistical selection of variables. It begins by describing Ou's existing model and its limitations in relying on statistical techniques without theoretical justification. It then discusses theories from economics literature on factors that could lead to differential profits across firms/industries: improper measurement of intangible capital like R&D/advertising, differential returns on such investments, and existence of market power. Based on these theories, Sneed develops a model incorporating variables like average R&D over 5 years, prior year's advertising, and 5-year average capital expenditures to test if it improves upon Ou's model.
Forecasting is the process of making predictions about events that have not yet occurred based on past data and other information. There are many different forecasting methods that can be qualitative or quantitative, including time series analysis, causal modeling, judgmental approaches, and more recently artificial intelligence techniques. Accuracy is important in forecasting and is typically measured using values like mean absolute error or mean squared error. Forecasting has wide applications in domains like business, economics, weather, earthquakes, and more. Limitations to forecasting accuracy exist, such as the chaotic nature of systems like the weather beyond two weeks.
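The accuracy measures named above can be written out directly. A short sketch with hypothetical actual and forecast values:

```python
def mean_absolute_error(actual, forecast):
    """Average of the absolute forecast errors."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mean_squared_error(actual, forecast):
    """Average of the squared forecast errors; penalizes large misses more."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

actual   = [100, 102, 105]   # hypothetical observed values
forecast = [ 98, 103, 104]
mae = mean_absolute_error(actual, forecast)   # (2 + 1 + 1) / 3
mse = mean_squared_error(actual, forecast)    # (4 + 1 + 1) / 3
```

MAE treats all errors linearly, while MSE weights large errors quadratically, which is why the two can rank competing forecasting methods differently.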
MODELING THE AUTOREGRESSIVE CAPITAL ASSET PRICING MODEL FOR TOP 10 SELECTED... – IAEME Publication
Systematic risk is the uncertainty inherent in the entire market or an entire market segment, while unsystematic risk is the uncertainty that comes with the specific company or industry in which we invest; the latter can be reduced through diversification. The study selects a non-linear capital asset pricing model for top securities on the BSE and attempts to identify the marketable and non-marketable risk faced by investors in top companies. The analysis was conducted in stages, including vector autoregression of systematic and unsystematic risk.
This summary analyzes a study that compares the forecast accuracy of healthcare product demand using linear and non-linear regression models.
The study finds that for fluctuating demand with small historical values, non-linear regression using a polynomial function provides better forecast accuracy than linear regression. However, for steady demand, linear regression is more accurate. Two factors - percentage of elderly people and regular exercise - are found to have the most significant impact on healthcare demand forecasts. When these factors are varied in the linear model, forecast accuracy is highest. For some products, a 5th or 6th degree polynomial function in non-linear regression also improves accuracy over the linear model.
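The linear-versus-polynomial comparison described here can be illustrated with synthetic data. In this sketch a quadratic demand pattern (an assumption for illustration, not the study's healthcare data) shows why a higher-degree fit can reduce in-sample error for fluctuating demand:

```python
import numpy as np

x = np.arange(10, dtype=float)
demand = 5.0 + 2.0 * x - 0.3 * x**2   # hypothetical curved demand pattern

def fit_sse(degree):
    """Fit a polynomial of the given degree by least squares and return
    the in-sample sum of squared errors."""
    coeffs = np.polyfit(x, demand, degree)
    fitted = np.polyval(coeffs, x)
    return float(np.sum((demand - fitted) ** 2))

linear_sse = fit_sse(1)   # straight-line fit leaves the curvature unexplained
quad_sse = fit_sse(2)     # quadratic recovers the generating shape
```

In-sample error always falls as the degree rises, so in practice (as the study's 5th/6th-degree results suggest) the degree should be chosen on held-out forecast accuracy, not on the fit itself.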
How to assess the reliability of measurements in rehabilitation – analisedecurvas
This document discusses methods for assessing the reliability of measurements in rehabilitation research. It summarizes several statistical methods that can be used to evaluate reliability, including correlation coefficients, changes in mean values between tests, measurement variability, and clinically important changes. The document uses an example of measurements of isokinetic muscle strength to demonstrate how to calculate and interpret various reliability indices. Overall, it provides rehabilitation researchers with guidance on comprehensively assessing reliability using multiple statistical approaches.
https://utilitasmathematica.com/index.php/Index/
Utilitas Mathematica is an international research journal published by the Utilitas Mathematica Academy, committed to strengthening the professional mathematics community worldwide.
Forecasting Economic Activity using Asset Prices – Panos Kouvelis
This dissertation evaluates how well asset prices (in particular the term spread, the short rate, and real stock returns) forecast GDP growth and industrial production. The study uses data from seven countries (Canada, France, Germany, Italy, Japan, the United Kingdom and the United States) covering the period from 1966 to the present. It finds that asset prices have forecasting power at the one-quarter/one-month horizon but lose that power as the forecasting horizon increases. The paper also finds that the real stock return is the best predictor of GDP growth, and that the short rate has more predictive content than the term spread.
Keywords: Term spread, short rate, stock returns, output growth, forecasting horizon, out-of-sample statistics
This paper examines the "variance premium", which is the difference between the squared VIX index and expected realized variance. The authors show that the variance premium captures attitudes toward economic uncertainty and predicts future stock returns over short horizons. They develop a generalized long-run risks model that generates a time-varying variance premium consistent with market return and risk-free rate levels. The model requires extensions to match the large size, volatility and skewness of the variance premium and its short-term return predictability. Calibrating the model to cash flow and asset pricing targets allows it to generate the variance premium and its return predictability features.
This document presents an estimated arbitrage-free model that jointly models nominal and real US Treasury yields. It estimates separate arbitrage-free Nelson-Siegel models for nominal and real yields, finding a three-factor model fits nominal yields well and a two-factor model fits real yields. It then estimates a four-factor joint model that fits both yield curves. The joint model is used to decompose breakeven inflation rates into inflation expectations and inflation risk premium components.
SUITABILITY OF COINTEGRATION TESTS ON DATA STRUCTURE OF DIFFERENT ORDERS – BRNSS Publication Hub
This document summarizes research investigating the suitability of cointegration tests on time series data of different orders. The researchers used simulated time series data from normal and gamma distributions at sample sizes of 30, 60, and 90. Three cointegration tests (Engle-Granger, Johansen, and Phillips-Ouliaris) were applied to the data. The tests were assessed based on type 1 error rates and power to determine which test was most robust for different distributions and sample sizes. The results indicated the Phillips-Ouliaris test was generally the most effective at determining cointegration across different sample sizes and distributions.
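Of the three tests compared, the Engle-Granger procedure is the simplest to sketch. The following is a simplified two-step illustration on simulated data, using an OLS residual regression in place of a full ADF test with Engle-Granger critical values (the series, sample size, and the -0.1 cutoff are all assumptions for illustration, not the researchers' setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two series sharing a stochastic trend: y tracks x plus stationary noise.
x = np.cumsum(rng.normal(size=500))   # random walk
y = 2.0 * x + rng.normal(size=500)    # cointegrated with x by construction

# Step 1 (Engle-Granger): OLS of y on x; the residual estimates the
# equilibrium error.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

# Step 2: regress residual changes on the lagged residual. A clearly
# negative coefficient indicates mean reversion; a real test would compare
# the statistic against Engle-Granger critical values instead.
d_resid = np.diff(resid)
lagged = resid[:-1]
gamma = float(np.sum(lagged * d_resid) / np.sum(lagged ** 2))
cointegrated = gamma < -0.1   # crude illustrative cutoff, not a critical value
```

Libraries such as statsmodels implement the proper test (with critical values) directly; the sketch above only shows the two-step logic the researchers' comparison rests on.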
Relationship between macroeconomic variables and malaysia available shariah i... – Azrul Abdullah
This paper studies the relationship between local and foreign macroeconomic variables and Malaysia's available Shariah indices. We use the Vector Error Correction (VEC) framework, first examining the long-run and short-run relationships between Malaysia's available Shariah indices (KLSI, FTSE Bursa Malaysia EMAS Shariah Index and FTSE Bursa Malaysia Hijrah Shariah Index) and the macroeconomic variables via the Johansen cointegration technique. Monthly data for the twenty-two-year period from January 1990 to December 2011 were collected from DataStream and tested. The findings show a positive relationship between the variables from 1990 to 2006; however, mixed results were found from then until 2011. The study concludes that the standard set of macroeconomic variables specified by earlier researchers can still be relied upon, but only with careful policy formulation.
A LINEAR REGRESSION APPROACH TO PREDICTION OF STOCK MARKET TRADING VOLUME: A ... – ijmvsc
Predicting the daily behavior of the stock market is a serious challenge for investors and corporate stockholders, and it can help them invest with more confidence by taking risks and fluctuations into consideration. In this paper, by applying linear regression to predict the behavior of the S&P 500 index, we show that our proposed method performs well in comparison to real volumes, so that stockholders can invest confidently based on it.
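A one-lag linear regression of the kind such a paper might apply can be sketched in a few lines; the volume figures below are hypothetical, not the S&P 500 data used in the study:

```python
import numpy as np

# Hypothetical daily trading volumes (billions of shares).
volume = np.array([2.1, 2.3, 2.2, 2.5, 2.4, 2.6, 2.7], dtype=float)

# Regress each day's volume on the previous day's volume (one-lag model).
x_prev, y_next = volume[:-1], volume[1:]
slope, intercept = np.polyfit(x_prev, y_next, 1)

# One-step-ahead forecast from the last observed volume.
next_volume = slope * volume[-1] + intercept
```

With real index data the same fit would typically use many more lags or exogenous predictors; the single-lag form just makes the regression mechanics visible.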
This document summarizes methods for establishing meaningful performance expectations across different test forms by setting invariant latent standards along the underlying competence continuum, rather than cutscores that vary by test content. It describes how Angoff ratings can be analyzed using item response curves to identify the latent threshold (θ*) representing each performance level. Preliminary analyses of expert ratings for a licensure exam show ratings better differentiated item difficulties and performance levels after aligning with item curves, and several methods for deriving θ* from the ratings are demonstrated and compared.
Combining forecasts from different models has been shown to perform better than a single forecast for most time series, so combining forecasts is a way to improve forecast quality. We study the effect of decomposing a series into multiple components and forecasting each component separately... The original series is decomposed into trend, seasonality and an irregular component. Statistical methods such as ARIMA and Holt-Winters are used to forecast these components. In this paper we focus on how the best models of one series can be applied, via association mining, to series with a similar frequency pattern. The proposed method's forecasts are compared with the Holt-Winters method and shown to be better.
A NOVEL PERFORMANCE MEASURE FOR MACHINE LEARNING CLASSIFICATION – IJMIT JOURNAL
Machine learning models have been widely used in numerous classification problems, and performance measures play a critical role in machine learning model development, selection, and evaluation. This paper gives a comprehensive overview of performance measures in machine learning classification. In addition, we propose a framework to construct a novel evaluation metric based on the voting results of three performance measures, each of which has strengths and limitations. The new metric can be shown to be better than accuracy in terms of consistency and discriminancy.
Similar to Forecast analysis of food price inflation in Pakistan (20)
A time series analysis of the determinants of savings in namibia – Alexander Decker
This document summarizes a study on the determinants of savings in Namibia from 1991 to 2012. It reviews previous literature on savings determinants in developing countries. The study uses time series analysis including unit root tests, cointegration, and error correction models to analyze the relationship between savings and variables like income, inflation, population growth, deposit rates, and financial deepening in Namibia. The results found inflation and income have a positive impact on savings, while population growth negatively impacts savings. Deposit rates and financial deepening were found to have no significant impact. The study reinforces previous work and emphasizes the importance of improving income levels to achieve higher savings rates in Namibia.
A therapy for physical and mental fitness of school children – Alexander Decker
This document summarizes a study on the importance of exercise in maintaining physical and mental fitness for school children. It discusses how physical and mental fitness are developed through participation in regular physical exercises and cannot be achieved solely through classroom learning. The document outlines different types and components of fitness and argues that developing fitness should be a key objective of education systems. It recommends that schools ensure pupils engage in graded physical activities and exercises to support their overall development.
A theory of efficiency for managing the marketing executives in nigerian banks – Alexander Decker
This document summarizes a study examining efficiency in managing marketing executives in Nigerian banks. The study was examined through the lenses of Kaizen theory (continuous improvement) and efficiency theory. A survey of 303 marketing executives from Nigerian banks found that management plays a key role in identifying and implementing efficiency improvements. The document recommends adopting a "3H grand strategy" to improve the heads, hearts, and hands of management and marketing executives by enhancing their knowledge, attitudes, and tools.
This document discusses evaluating the link budget for effective 900MHz GSM communication. It describes the basic parameters needed for a high-level link budget calculation, including transmitter power, antenna gains, path loss, and propagation models. Common propagation models for 900MHz that are described include Okumura model for urban areas and Hata model for urban, suburban, and open areas. Rain attenuation is also incorporated using the updated ITU model to improve communication during rainfall.
A synthetic review of contraceptive supplies in punjab – Alexander Decker
This document discusses contraceptive use in Punjab, Pakistan. It begins by providing background on the benefits of family planning and contraceptive use for maternal and child health. It then analyzes contraceptive commodity data from Punjab, finding that use is still low despite efforts to improve access. The document concludes by emphasizing the need for strategies to bridge gaps and meet the unmet need for effective and affordable contraceptive methods and supplies in Punjab in order to improve health outcomes.
A synthesis of taylor’s and fayol’s management approaches for managing market... – Alexander Decker
1) The document discusses synthesizing Taylor's scientific management approach and Fayol's process management approach to identify an effective way to manage marketing executives in Nigerian banks.
2) It reviews Taylor's emphasis on efficiency and breaking tasks into small parts, and Fayol's focus on developing general management principles.
3) The study administered a survey to 303 marketing executives in Nigerian banks to test if combining elements of Taylor and Fayol's approaches would help manage their performance through clear roles, accountability, and motivation. Statistical analysis supported combining the two approaches.
A survey paper on sequence pattern mining with incremental – Alexander Decker
This document summarizes four algorithms for sequential pattern mining: GSP, ISM, FreeSpan, and PrefixSpan. GSP is an Apriori-based algorithm that incorporates time constraints. ISM extends SPADE to incrementally update patterns after database changes. FreeSpan uses frequent items to recursively project databases and grow subsequences. PrefixSpan also uses projection but claims to not require candidate generation. It recursively projects databases based on short prefix patterns. The document concludes by stating the goal was to find an efficient scheme for extracting sequential patterns from transactional datasets.
A survey on live virtual machine migrations and its techniques – Alexander Decker
This document summarizes several techniques for live virtual machine migration in cloud computing. It discusses works that have proposed affinity-aware migration models to improve resource utilization, energy efficient migration approaches using storage migration and live VM migration, and a dynamic consolidation technique using migration control to avoid unnecessary migrations. The document also summarizes works that have designed methods to minimize migration downtime and network traffic, proposed a resource reservation framework for efficient migration of multiple VMs, and addressed real-time issues in live migration. Finally, it provides a table summarizing the techniques, tools used, and potential future work or gaps identified for each discussed work.
A survey on data mining and analysis in hadoop and mongo db – Alexander Decker
This document discusses data mining of big data using Hadoop and MongoDB. It provides an overview of Hadoop and MongoDB and their uses in big data analysis. Specifically, it proposes using Hadoop for distributed processing and MongoDB for data storage and input. The document reviews several related works that discuss big data analysis using these tools, as well as their capabilities for scalable data storage and mining. It aims to improve computational time and fault tolerance for big data analysis by mining data stored in Hadoop using MongoDB and MapReduce.
1. The document discusses several challenges for integrating media with cloud computing including media content convergence, scalability and expandability, finding appropriate applications, and reliability.
2. Media content convergence challenges include dealing with the heterogeneity of media types, services, networks, devices, and quality of service requirements as well as integrating technologies used by media providers and consumers.
3. Scalability and expandability challenges involve adapting to the increasing volume of media content and being able to support new media formats and outlets over time.
This document surveys trust architectures that leverage provenance in wireless sensor networks. It begins with background on provenance, which refers to the documented history or derivation of data. Provenance can be used to assess trust by providing metadata about how data was processed. The document then discusses challenges for using provenance to establish trust in wireless sensor networks, which have constraints on energy and computation. Finally, it provides background on trust, which is the subjective probability that a node will behave dependably. Trust architectures need to be lightweight to account for the constraints of wireless sensor networks.
This document discusses private equity investments in Kenya. It provides background on private equity and discusses trends in various regions. The objectives of the study discussed are to establish the extent of private equity adoption in Kenya, identify common forms of private equity utilized, and determine typical exit strategies. Private equity can involve venture capital, leveraged buyouts, or mezzanine financing. Exits allow recycling of capital into new opportunities. The document provides context on private equity globally and in developing markets like Africa to frame the goals of the study.
This document discusses a study that analyzes the financial health of the Indian logistics industry from 2005-2012 using Altman's Z-score model. The study finds that the average Z-score for selected logistics firms was in the healthy to very healthy range during the study period. The average Z-score increased from 2006 to 2010 when the Indian economy was hit by the global recession, indicating the overall performance of the Indian logistics industry was good. The document reviews previous literature on measuring financial performance and distress using ratios and Z-scores, and outlines the objectives and methodology used in the current study.
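The Z-score model this study applies is a fixed linear combination of five financial ratios (the original 1968 Altman coefficients for publicly traded firms). A sketch with hypothetical balance-sheet figures, not data from the logistics firms in the study:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Original (1968) Altman Z-score for publicly traded firms."""
    ta, tl = total_assets, total_liabilities
    return (1.2 * working_capital / ta
            + 1.4 * retained_earnings / ta
            + 3.3 * ebit / ta
            + 0.6 * market_value_equity / tl
            + 1.0 * sales / ta)

# Hypothetical firm (figures in millions). Conventionally, Z above 2.99 is
# the "healthy" zone and Z below 1.81 the "distress" zone.
z = altman_z(working_capital=30, retained_earnings=40, ebit=25,
             market_value_equity=120, sales=150, total_assets=100,
             total_liabilities=60)
```

Averaging such scores over firms and years, as the study does, gives the industry-level health measure it reports.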
A study to evaluate the attitude of faculty members of public universities of... – Alexander Decker
This study evaluated faculty members' attitudes toward shared governance in public universities in Pakistan. It used a questionnaire to assess attitudes on 4 indicators of shared governance: the role of the dean, role of faculty, role of the board, and role of joint decision-making. The study analyzed responses from 90 faculty across various universities. Statistical analysis found significant differences in perceptions of shared governance based on faculty rank and gender. Faculty rank influenced perceptions of the dean's role and role of joint decision-making. Gender influenced overall perceptions of shared governance. The results indicate a need to improve shared governance practices in Pakistani universities.
A study to assess the knowledge regarding prevention of pneumonia among middl... – Alexander Decker
1) The study assessed knowledge of pneumonia prevention among 60 middle-aged adults in rural Moodbidri, India. Most subjects (55%) had poor knowledge and 41.67% had average knowledge. The mean knowledge score was 40.66%.
2) Knowledge was lowest in areas of diagnosis, prevention and management (35.61%) and highest in introduction to pneumonia (45.42%).
3) There was a significant association between knowledge and gender but not other demographic factors like age, education level or occupation. The study concluded knowledge of prevention was low and health education is needed.
A study regarding analyzing recessionary impact on fundamental determinants o... – Alexander Decker
This document analyzes the impact of fundamental factors on stock prices in India during normal and recessionary periods. It finds that during normal periods from 2000-2007, earnings per share had a positive and significant impact on stock prices, while coverage ratio had a negative impact. During the recession from 2007-2009, price-earnings ratio positively and significantly impacted stock prices, while growth had a negative effect. Overall, the study aims to compare the influence of fundamental factors like book value, dividends, earnings, etc. on stock prices during different economic conditions in India.
A study on would be urban-migrants’ needs and necessities in rural bangladesh... – Alexander Decker
This document summarizes a study on the needs and necessities of potential rural migrants in Bangladesh and how providing certain facilities could encourage them to remain in rural areas. The study involved surveys of 350 local and non-local people across 7 upazilas to understand their satisfaction with existing services and priority of needs. The findings revealed variations in requirements between local and non-local respondents. Based on the analysis, the study recommends certain priority facilities, such as employment opportunities and community services, that should be provided in rural areas to improve quality of life and reduce migration to cities. Limitations include the small sample size not representing all of Bangladesh and difficulties collecting full information from all respondents.
A study on the evaluation of scientific creativity among scienceAlexander Decker
This study evaluated scientific creativity among 31 science teacher candidates in Turkey. The candidates were asked open-ended questions about scientific creativity and how they would advance science. Their responses showed adequate fluency and scientific knowledge, but low flexibility and originality. When asked to self-evaluate, most said their scientific creativity was partially adequate. The study aims to help improve the development of scientific creativity among future teachers.
A study on the antioxidant defense system in breast cancer patients.Alexander Decker
This document discusses a study on the antioxidant defense system in breast cancer patients. The study measured levels of reduced glutathione (GSH), superoxide dismutase (SOD) activity, total antioxidant potential (AOP), malondialdehyde (MDA), and nitrate in 40 breast cancer patients and 20 healthy controls. The results found increased MDA, SOD, and nitrite levels and decreased GSH and AOP levels in breast cancer patients compared to controls, indicating higher oxidative stress in patients from increased free radicals and lower antioxidant defenses.
This study examined 79 dry crania (55 male and 24 female) from southern Nigeria to determine the incidence and dimensions of single and double hypoglossal canals, and whether these dimensions differ between sexes. Measurements were taken of the internal and external diameters of the hypoglossal canals. The results showed significant differences in all dimensions between males and females. Bilateral single hypoglossal canals were most prevalent. In conclusion, the size of the hypoglossal canal is sex-specific, with significant differences found between males and females in this population.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Ocean lotus Threat actors project by John Sitima 2024 (1).pptxSitimaJohn
Ocean Lotus cyber threat actors represent a sophisticated, persistent, and politically motivated group that poses a significant risk to organizations and individuals in the Southeast Asian region. Their continuous evolution and adaptability underscore the need for robust cybersecurity measures and international cooperation to identify and mitigate the threats posed by such advanced persistent threat groups.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
CAKE: Sharing Slices of Confidential Data on BlockchainClaudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
11.forecast analysis of food price inflation in pakistan
Developing Country Studies www.iiste.org
ISSN 2224-607X (Paper) ISSN 2225-0565 (Online)
Vol 2, No.1, 2012
Forecast Analysis of Food Price Inflation in Pakistan: Applying
Rationality Criterion for VAR Forecast
Ms. Madiha Riaz
1. Department of Economics, The Islamia University of Bahawalpur, Pakistan
*E-mail of the corresponding author: madihatarar@hotmail.com
Abstract
Forecast performance is considered an acid test of an econometric model. An accurate forecasting system is necessary for every industry to be able to take appropriate actions for future planning, and planning creates a substantial need for forecasts. The purpose of this study is to evaluate forecast efficiency using rationality criteria for forecasts. It is therefore designed to analyze the forecasting efficiency of food price inflation and the consumer price index using thirty-three years of quarterly data for Pakistan covering the period 1975 to 2008. Forecasts are obtained from a VAR model specification. Four forecast accuracy measures, namely Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Theil's Inequality Coefficient (TIC), are used to select the most accurate forecast from the VAR. These forecasts are then evaluated against the rationality criteria defined below. We find the food price forecasts are consistent, efficient and fulfil the given criteria of weak and strong rationality. We propose that assessing forecasts against the different criteria applied here makes them more reliable and correct for use in policymaking and management decisions.
Keywords: Food Price Forecasts, Weak Rationality, Strong Rationality, Strict Rationality
1. Introduction
Forecast performance is considered a stringent test of an econometric model, particularly when that model is based on a well-designed economic theory, since forecast performance is assumed to provide support for the theory. It is a common notion that a good forecasting performance constitutes a 'seal of approval' for the empirical model and therefore for the theory on which the model is based. An accurate forecasting system is necessary for every industry to be able to take appropriate actions for the future. It is widely recognized that one of the most important functions of managers at all levels in an organization is planning, and planning creates a substantial need for forecasts.
The analysis of time series and forecasting has a long history, dating back at least to Yule (1927). Forecasting is often the goal of a time series analysis. Time series analysis is generally used in business and economics to investigate the dynamic structure of a process, to find the dynamic relationship between variables, to perform seasonal adjustment of economic data, to improve regression analysis when the errors are serially correlated, and to produce point and interval forecasts for both level and volatile data series. Accuracy of forecasts is important to policymakers.
The traditional measure of forecast efficiency was a comparison of RMSEs: a forecast with a lower RMSE was considered the best among competing forecasts with higher RMSEs. A thorough criticism of RMSE is made by Armstrong et al. (1995). After the rejection of these conventional tools for analyzing forecast efficiency, the cointegration approach named consistency was introduced; this technique was used by Liu et al. (1992) and Aggerwal et al. (1995) to assess the unbiasedness, integration and cointegration characteristics of macroeconomic data and their respective forecasts. Hafer et al. (1985), McNees (1986), Pearce (1987) and Zarnowitz (1984, 1985, 1993) place great weight on minimum mean square error (MSE) but do not incorporate accuracy analysis convincingly in their tests of forecasts.
Many researchers have contributed to rationality testing, such as Carlson (1977), Figlewski et al. (1981), Friedman (1980), Gramlich (1983), Mullineaux (1978), Pearce (1979) and Pesando (1975). Many studies examine the rationality of IMF and OECD forecasts, such as Holden et al. (1987), Ash et al. (1990, 1998), Artis (1996), Pons (1999, 2000, 2001), Kreinin (2000), Oller et al. (2000) and Batchelor (2001); these studies show that the IMF and OECD forecasts pass most of the tests of rationality. The doctrine of rationality is defined by Lee (1991): expectations are said to be rational if they fully incorporate all of the information available to the agents at the time the forecast is made. The efficiency of forecasts is analyzed by different approaches, e.g. consistent forecasts, efficient forecasts and rational forecasts. Bonham et al. (1991) include a test for conditional efficiency in the definition of strong rationality. In order to analyze the rationality of price forecasts, Bonham et al. (1991) define a hierarchy of rationality tests, from 'weak rationality' to 'strict rationality':
• Weak rationality
• Sufficient rationality
• Strong rationality
• Strict rationality
2. Rationality
2.1 Weak Rationality
Most applied work, such as Evans et al. (1984), Friedman (1980), Pearce (1987) and Zarnowitz (1984, 1985), viewed rationality in terms of the necessary conditions of unbiasedness and information efficiency [2]. The same notion of weak rationality was defined by Bonham et al. (1991): the forecast must be unbiased and meet the tests of weak information efficiency. Ruoss (2002) stated that unbiasedness is often tested using the Theil-Mincer-Zarnowitz equation, a regression of the actual values on a constant and the forecast values. Clement (1998) suggested running a regression of the forecast error on a constant; if the constant deviates from zero, the hypothesis that the forecast is unbiased is rejected.
2.2 Sufficient Rationality
The forecast must be weakly rational and must pass a more demanding test of sufficient orthogonality, namely that the forecast errors are not correlated with any variable in the information set available at the time of prediction.
2.3 Strong Rationality
The forecast must be sufficiently rational and pass tests of conditional efficiency [3]. Conditional efficiency requires a comparison of forecasts: call some sufficiently rational forecast a benchmark, and combine the benchmark with some competing forecast. Conditional efficiency refers to the concept of Granger et al. (1973) that measures the reduction in RMSE which occurs when a forecast is combined with one of its competitors. In line with this notion, Granger (1989) suggests that combining often produces a forecast superior to both components. A similar idea is pursued by Timmermann (2006), who asks whether forecasts can be improved by combining WEO forecasts with the Consensus forecasts. Stock et al. (2001) report broad support for simple combinations of forecasts in a study of a large cross-section of macroeconomic and financial variables. If the combination produces an RMSE that is significantly smaller than the benchmark RMSE, the latter fails the test for conditional efficiency because it has not efficiently utilized some information contained in the competing forecast.
[2] The same kind of unbiasedness and efficiency notion was built by Eichenbaum et al. (1988) and Razzak (1997).
[3] Emanating from the classic study by Bates et al. (1969), a long literature on forecast combination, summarized by Clemen (1989), Diebold et al. (1996) and Timmermann (2005), has found evidence that combined forecasts tend to produce better out-of-sample performance than individual forecasting models.
2.4 Strict Rationality
Bonham et al. (1991) explained in their study that a statement about rationality should not depend on an arbitrary selection of time periods. A forecast is strictly rational if it passes tests of strong rationality in a variety of sub-periods. Empirical results regarding the rationality of forecasts were reported by Lee (1991): forecasts fail to be rational in the strong sense even though they are not rejected by the conventional test of weak-form rationality. Ruoss (2002) examines the forecast rationality of the Swiss economy and finds that GDP forecasts in the sample do not pass the most stringent test, i.e. the test of strong informational efficiency, because in some cases forecast errors correlate with the forecasts of the other institutes.
Similar results are shown by Bonham et al. (1991): the most stringent criteria for testing rationality will not be useful for empirical work, since on these criteria there might not be a rational forecast of inflation. Thus there is a tension between what econometricians would like to require of rationality and the imperative that agents act on what information they have. This tension might be eliminated by relaxing the criterion that defines strict rationality.
Razzak (1997) and Rich (1989) test the rationality of the National Bank of New Zealand's survey data on inflation expectations and the SRC expected price change data, respectively. Both studies reach the same conclusion: the results do not reject the null hypotheses of unbiasedness, efficiency and orthogonality for a sample from their particular survey data series. A study of the US and Sweden by Bryan et al. (2005) concludes that the US data seem very unsupportive of near-rationality [4], whereas the Swedish data are more inconclusive. From all this discussion it can be inferred that the central goal is to produce unbiased and efficient forecasts with uncorrelated forecast errors. Typically, as mentioned by Yin-Wong Cheung and Menzie David Chinn (1997), when examining forecast accuracy researchers examine the mean, variance and serial correlation properties of the forecast error. Following basic principles of economic forecasting, the performance of a forecast can thus be evaluated in terms of unbiasedness, efficiency and uncorrelated errors.
3. Plan of Study
The aim of this study is to assess forecast accuracy by means of rationality tests applied to forecasts of food price inflation and consumer price index data for Pakistan, which are essential for efficient planning by farmers and other industries connected to food production. Such forecasts are also of interest to governments and other organizations. The study uses 33 years of quarterly data covering the period 1975-2008. We obtain forecasts from a VAR model and apply a number of alternative criteria (Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Theil's Inequality Coefficient (TIC)) for measuring forecast accuracy when selecting the best forecasts. To test whether the forecasts are biased, erratic and unreliable, or instead use existing information in a reasonably effective manner, we apply rationality tests; these issues of forecast efficiency are rarely addressed. The accuracy criteria can give different rankings, so there is no guarantee that a forecast that performs well under one criterion is satisfactory under the others. Any conclusion from a given data set should therefore be regarded only as an indicator of forecasting ability and not as proof of the correctness of the underlying model and criterion for that data.
In order to test the performance of food price inflation forecasts, we forecast two data series, namely food price inflation (CPI food, as a proxy of food price inflation) and the general consumer price index (CPIG). These two series are selected because of their strong causality with each other.
Quarterly figures are taken from the IMF's International Financial Statistics (2009) and the World Bank's World Development Indicators (2009). Data are taken on a quarterly basis for the period 1974-75 to 2007-08, comprising 133 observations from 1974Q2 to 2008Q2. The following sections explain the framework of analysis and discuss the results.
3.1 Framework of Analysis
[4] The proposition of near-rationality of inflation expectations was suggested by the work of Akerlof et al. (2000).
We use the VAR approach presented by Sims (1980) for multivariate analysis. In the estimation of the VAR we use food price inflation alternately with four other variables: real GDP, M2, the interest rate and the exchange rate. A VAR model consists of a set of seemingly unrelated regression (SUR) equations. To tackle autocorrelation, a sufficient lag structure has to be considered in the specification of the VAR model. However, to preserve parsimony, the lag length needs to be justified; we therefore started with a lag of eight periods and then followed a 'general to specific' diagnostic/specification procedure. We applied a Wald test to the restriction that all the coefficients at the eighth lag are equal to zero. If this restriction was accepted, the model was re-estimated with a seven-period lag, and the same procedure was repeated until the Wald test results supported the rejection of the null hypothesis. Once the VAR model was estimated, we used the selected VAR specification to obtain forecasts for further application of the rationality tests. Performance tests of the forecasts were based on OLS.
3.1.1 Weak Rationality Test
A forecast must be unbiased and meet tests of weak information efficiency to be weakly rational. In the following equation

P^o_t = α_0 + α_1 P^e_t + ε_t    (1)

P^e is the unbiased forecast of P^o if ε_t is serially uncorrelated and the coefficients α_0 and α_1 are insignificantly different from zero and one respectively. Weak information efficiency means that the forecast errors E_t = P^e_t − P^o_t are uncorrelated with the past values of the predicted variables. To test the weak efficiency hypothesis we estimate the following regression

E_t = α_0 + Σ_{i=1}^{m} α_i P^o_{t−i} + ε_t    (2)

and test the joint hypothesis

H_0: α_0 = α_j = 0 for all j = 1, …, m    (3)

Rejection of this joint hypothesis implies that past values help to explain the forecast errors. Acceptance of the hypothesis indicates that the forecast error at time t is independent of the past information contained in the relevant observed price index.
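As a concrete illustration of the unbiasedness regression in equation (1), the OLS point estimates of α_0 and α_1 can be computed from the actual and forecast series alone. This is a minimal pure-Python sketch on hypothetical data, not the study's code; the usual t- and F-tests of H_0: α_0 = 0, α_1 = 1 would then be applied to these estimates.

```python
def mincer_zarnowitz(actual, forecast):
    """OLS estimates for P^o_t = a0 + a1 * P^e_t + e_t (equation 1).

    The forecast is unbiased when a0 is insignificantly different
    from 0 and a1 from 1 (and the residuals are serially uncorrelated).
    """
    n = len(actual)
    mean_f = sum(forecast) / n
    mean_a = sum(actual) / n
    var_f = sum((f - mean_f) ** 2 for f in forecast)
    cov_fa = sum((f - mean_f) * (a - mean_a)
                 for f, a in zip(forecast, actual))
    a1 = cov_fa / var_f          # slope on the forecast
    a0 = mean_a - a1 * mean_f    # intercept
    return a0, a1
```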
3.1.2 Sufficient Rationality Test
The doctrine of sufficient rationality states that the forecast errors are not correlated with any variable in the information set available at the time of the forecast. If Z_t is a variable, or a vector of variables, used to build our forecast model, then Z_t is the exogenous variable in the following equation

E_t = α_0 + Σ_{i=1}^{m} α_i Z_{t−i} + ε_t    (4)

After estimating equation 4 we test the hypothesis

H_0: α_0 = α_j = 0 for all j = 1, …, m    (5)

Rejection of this hypothesis indicates that the information contained in the past values of the related series has not been used efficiently in forming the forecast.
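The joint hypothesis in equation (5) amounts to an ordinary F-test of the regression in equation (4) against the restricted model E_t = ε_t. A sketch using numpy least squares, with hypothetical inputs rather than the study's series:

```python
import numpy as np

def sufficient_rationality_F(errors, Z):
    """F statistic for H0: a0 = a1 = ... = am = 0 in
    E_t = a0 + sum_j a_j * Z_{t,j} + e_t  (equations 4 and 5).

    errors : (T,) array of forecast errors E_t
    Z      : (T, m) array of lagged information variables
    """
    T, m = Z.shape
    X = np.column_stack([np.ones(T), Z])       # constant plus regressors
    beta, *_ = np.linalg.lstsq(X, errors, rcond=None)
    resid = errors - X @ beta
    rss_u = resid @ resid                      # unrestricted RSS
    rss_r = errors @ errors                    # restricted model: E_t = e_t
    q = m + 1                                  # number of restrictions
    return ((rss_r - rss_u) / q) / (rss_u / (T - q))
```

A large F (small p-value against F(q, T−q)) would reject sufficient rationality.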
3.1.3 Strong Rationality Test
A forecast is said to be strongly rational if it passes the test of conditional efficiency suggested by Granger et al. (1973). Conditional efficiency requires a comparison of forecasts: call some sufficiently rational forecast the benchmark and combine the benchmark with some competing forecast. Then estimate the following regression

D_t = α + β (S_t − S̄) + ε_t    (6)
where D_t and S_t are the difference and the sum of the benchmark and combination forecast errors, respectively, and S̄ is the mean of the sum. Under the null hypothesis of conditional efficiency, that the combination does not produce a lower RMSE, α = β = 0. The F test is appropriate if β > 0 and the mean errors of both forecasts have the same sign as α. If the mean errors of the two forecasts do not have the same sign, then α cannot be interpreted as an indicator of the relative bias of the two forecasts.
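The point estimates in equation (6) reduce to a simple bivariate OLS on the mean-centred sum of errors. A pure-Python sketch with hypothetical error series (the significance test of α = β = 0 would then follow via the usual F test):

```python
def conditional_efficiency(bench_err, comp_err):
    """OLS estimates for D_t = alpha + beta * (S_t - S_bar) + e_t
    (equation 6), where D_t and S_t are the difference and sum of the
    benchmark and competing forecast errors."""
    n = len(bench_err)
    D = [b - c for b, c in zip(bench_err, comp_err)]
    S = [b + c for b, c in zip(bench_err, comp_err)]
    s_bar = sum(S) / n
    x = [s - s_bar for s in S]                 # mean-centred regressor
    beta = (sum(v * d for v, d in zip(x, D))
            / sum(v * v for v in x))
    alpha = sum(D) / n   # with a centred regressor, alpha = mean(D)
    return alpha, beta
```

Under conditional efficiency of the benchmark, combining it with the competitor should not shrink the RMSE, so both estimates should be statistically indistinguishable from zero.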
3.1.4 Strict Rationality Test
A forecast is strictly rational if it passes tests of strong rationality in a variety of sub-periods. If a strongly rational forecast passes the same test based on equation 6 in sub-periods, then according to Bonham et al. (1991) that particular forecast is deemed strictly rational.
4. Results and Discussions
Food price inflation (CPIF) and the general consumer price index (CPIG) are both modelled as VAR(1, 2) for our data series. Four variables are included in each model, i.e. real GDP, M2, the interest rate and the exchange rate, to estimate the VAR.
Table 1.1 in the appendix reports the forecast statistics: Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE) and Theil's Inequality Coefficient (TIC). In every case the forecast error is defined as the forecast value minus the actual value. The MAE is a measure of overall accuracy that gives an indication of the degree of spread, where all errors are assigned equal weight. The MSE is also a measure of overall accuracy indicating the degree of spread, but here large errors are given additional weight; it is the most common measure of forecasting accuracy. Often the square root of the MSE, the RMSE, is considered, since the seriousness of the forecast error is then expressed in the same dimensions as the actual and forecast values themselves. The MAPE is a relative measure that corresponds to the MAE and is the most useful measure for comparing the accuracy of forecasts between different items or products, since it measures relative performance. If the calculated MAPE is less than 10 percent, the forecast is interpreted as highly accurate; between 10 and 20 percent as good; between 20 and 50 percent as reasonable; and over 50 percent as inaccurate. Theil's Inequality Coefficient (TIC) is another statistical measure of forecast accuracy. A Theil's U greater than 1.0 indicates that the forecast model performs worse than a naive benchmark, while a value less than 1.0 indicates that it performs better; the closer U is to 0, the better the model. In summary, we obtain the best forecasts from our data series using the VAR model (see the statistics in Table 1.1 for detail).
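The four accuracy measures reported in Table 1.1 can be computed directly from the actual and forecast series. A pure-Python sketch follows; note that the bounded, 0-to-1 form of Theil's coefficient is used here, which is one common convention, and the study's exact variant is not spelled out.

```python
import math

def forecast_accuracy(actual, forecast):
    """RMSE, MAE, MAPE (percent) and Theil's inequality coefficient.

    The forecast error is defined as forecast minus actual, as in the
    study. TIC here is the bounded form: 0 = perfect, 1 = worst.
    """
    n = len(actual)
    errors = [f - a for a, f in zip(actual, forecast)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    mape = 100.0 * sum(abs(e / a) for a, e in zip(actual, errors)) / n
    # Bounded TIC: RMSE scaled by the root mean squares of both series
    denom = (math.sqrt(sum(a * a for a in actual) / n)
             + math.sqrt(sum(f * f for f in forecast) / n))
    tic = rmse / denom
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "TIC": tic}
```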
4.1 Rationality Test for Forecasts
Carl S. Bonham and Douglas C. Dacy (1991) classify the rationality of time series forecasts as (1) weakly rational, (2) sufficiently rational, (3) strongly rational and (4) strictly rational.
4.1.1 Weak Rationality
A forecast must be (a) unbiased and (b) meet the tests of weak informational efficiency to be weakly rational. In this part we test unbiasedness by regressing the observed series on the forecasts:

CPIF = 0.5539311996 + 1.001230912*F1
       (0.191)        (296.822)***
CPIG = 1.591295003 + 1.000070449*F2
       (0.681)        (400.601)***

The forecasts are significant in explaining the observed series; t-values in parentheses indicate significance.
Unbiasedness Tests
Breusch-Godfrey Serial Correlation LM Test
Table 1.2  H0: serially uncorrelated errors

Forecast   F-statistic   Probability   Lag length
CPIF       0.605         0.438         1
CPIG       5.751         0.004         2
Table 1.2 reports the serial-correlation results for the forecast errors. CPIG errors are serially correlated, whereas CPIF errors are serially uncorrelated, which confirms that the CPIF forecasts pass this part of the unbiasedness test; the restriction that the coefficients are insignificantly different from zero and one is examined next.
Table 1.3  H0: C(1) = 0, C(2) = 1

Forecast   F-statistic   Probability   Chi-square   Probability
CPIF       0.539         0.585         1.077        0.583
CPIG       0.754         0.473         1.507        0.471

Table 1.3 shows that the CPIF and CPIG forecast coefficients are insignificantly different from zero and one respectively, as the null hypothesis is not rejected.
To test the weak information efficiency of the forecasts, we regress the forecast errors on past predicted values and find that the past values are uncorrelated with the forecast errors.

E1 = -0.3925389311 - 0.001473105748*CPIF(-1)
     (-0.136)        (-0.431)
E2 = -1.415973065 - 0.0003045008285*CPIG(-1)
     (-0.608)       (-0.120)
Weak Informational Efficiency Tests
Table 1.4  H0: C(1) = C(2) = 0

Forecast   F-statistic   Probability   Chi-square   Probability
CPIF       0.566         0.569         1.131        0.568
CPIG       0.761         0.470         1.521        0.467

We fail to reject the joint hypothesis reported in Table 1.4, which implies that past values do not help to explain the forecast errors; hence CPIF and CPIG both qualify under the test of weak informational efficiency. Acceptance of the null hypothesis indicates that the forecast error at time t is independent of the past information contained in the relevant observed price index.
4.2 Sufficient Rationality
The doctrine of sufficient rationality states that forecast errors are uncorrelated with every variable in the information set available at the time of forecast. We therefore regress the forecast errors on the information set used to estimate the VAR model: lags of real GDP, M2, the interest rate, and the exchange rate.
E1 = -4.977+ 0.0848*CPIF (-1) + 4.2e-06*RGDP (-1)-9.5e-05*M2 (-1)-1.404*R(-1)-0.816*ER(-1)
(-0.75) (2.59)*** (0.25) (-2.03) ** (-1.71)* (-1.64)*
E2 = 0.025+0.26*CPIG (-1)-1.4e-06*RGDP (-1)-7.9e-06*M2 (-1) -0.02*R (-1)-0.017*ER (-1)
(0.14) (5.091) *** (-2.86) *** (-4.00) *** (-1.17) (-1.98) **
Sufficient Rationality Tests
Table 1.5
Ho: All the Coefficients are zero
Forecast F-statistic Probability Chi-square Probability
CPIF 1.470 0.194 8.823 0.184
CPIG 5.000 0.180 29.999 0.196
Table 1.5 reports the sufficient rationality tests. Rejection of the above hypothesis would indicate that the information contained in the past values of the related series had not been used efficiently in forming the forecasts; since the null is not rejected here, the given information has been incorporated into the forecasts. Both CPIF and CPIG therefore fulfil the sufficient rationality criterion.
4.3 Strong Rationality
A forecast is said to be strongly rational if it passes the test of conditional efficiency suggested by Granger et al. (1973). Conditional efficiency requires a comparison of forecasts, so a benchmark is needed: we obtain forecasts of CPIF from an ARIMA (1, 1, 1) (Autoregressive Integrated Moving Average)5 model and combine this benchmark with the competing (VAR) forecast. From the two sets of forecast errors we compute their difference and their sum, and use the mean of the sum to estimate α and β. The results, reported in Table 1.6 in the appendix, indicate that the VAR forecast of CPIF is strongly efficient when combined with the ARIMA forecast of CPIF.
4.4 Strict Rationality
A forecast is strictly rational if it passes the tests of strong rationality in a variety of sub-periods. In this study the CPIF forecasts met the strong efficiency criterion, so we estimated equation 6 over the sub-periods below and find that CPIF does not satisfy the strict rationality criterion.
We break the sample into the following sub-periods:
• 1975Q3 to 1980Q2
• 1980Q3 to 1985Q4
• 1986Q1 to 1990Q1
• 1990Q2 to 1995Q2
• 1995Q3 to 2000Q4
• 2001Q1 to 2005Q2
• 2005Q3 to 2008Q2
Conclusion
Result summary (quarterly data; “1” = meets the criterion, “0” = does not, NA = not applicable):
Series                            4.1.1   4.1.2   4.2   4.3   4.4
Food price inflation                1       1      1     1     0
Consumer price index (general)      0       1     NA    NA    NA
4.1.1 Unbiasedness Test
4.1.2 Weak Informational Efficiency Test
4.2 Sufficient Rationality
4.3 Strong Rationality
4.4 Strict Rationality
It is clear from the result summary that the food price inflation forecasts qualify under the rationality criteria used to check forecast accuracy: they are unbiased and fulfil the weak, sufficient, and strong rationality criteria. The consumer price index (general) forecasts pass only the weak informational efficiency test. We infer from our analysis that the food price forecasts are reliable for further application. Forecast rationality tests reduce the range of uncertainty within which management judgment must be exercised, so they can be used in the decision-making process to the benefit of organizations and policy makers. The food price inflation forecasts satisfy all the criteria, except strict rationality, used to check the performance of the VAR forecasts for the given data series. We suggest that policy makers and planning authorities rely on these criteria to obtain better forecasts for further application; if every forecast is screened against such criteria, more consistent and reliable results can be expected.
5. For more detail see Box, G. E. P. and G. M. Jenkins (1976), “Time Series Analysis, Forecasting and Control”, Holden-Day, San Francisco.
References
Aggarwal, R. Mohanty, S. and Song, F. (1995), “Are Survey Forecasts of Macroeconomic Variables Rational?”,
Journal of Business, 68, (1), 99-119.
Akerlof, G. A., William, T. D. and George, L. P. (2000), “Near-Rational Wage and Price Setting and the Long-
Run Phillips Curve”, Brookings Papers on Economics Activity, 1-60.
Armstrong, J. S. and Fildes R. (1995), “On the Selection of Error Measures for Comparisons among Forecasting
Methods”, Journal of Forecasting, Vol. 14, 67-71.
Artis, M. J. (1996), “How Accurate are the IMF’s Short-term Forecasts? Another Examination of the World
Economic Outlook”, International Monetary Fund, Working Paper No. 96/89.
Ash, J. C. K., Smyth, D. J. and Heravi, S. M. (1990), “The Accuracy of OECD Forecasts of the International
Economy”, International Journal of Forecasting, 6, 379-392.
Ash, J. C. K., Smyth, D. J. and Heravi, S. M. (1998), “Are OECD Forecasts Rational and Useful?: A Directional
Analysis”, International Journal of Forecasting, 14, 381-391.
Bakhshi, H., George, K. and Anthony, Y. (2003), “Rational Expectations and Fixed-Event Forecasts: an
Application to UK Inflation”, Bank of England, UK, Working Paper No. 176.
Batchelor, R. (2001), “How Useful are the Forecasts of Intergovernmental Agencies? The OECD and IMF
versus the Consensus”, Applied Economics, 33, 225-235.
Bonham, C. S. and Douglas, C. D. (1991), “In Search of a “Strictly Rational” Forecast”, The Review of
Economics and Statistics, Vol. 73, No. 2, 245-253.
Bonham, C. S. and Cohen, R. (1995), “Testing the Rationality of Price Forecasts: Comment”, The American
Economic Review, Vol. 85, 284-289.
Box, G. E. P. and G. M. Jenkins (1976), “Time Series Analysis, Forecasting and Control”, Holden-Day, San
Francisco.
Bryan, M. F. and Stefan, P. (2005), “Testing Near-Rationality Using Detailed Survey Data”, Federal Reserve
Bank of Cleveland, Working Paper No. 05-02.
Carlson, J. A. (1977), “A Study of Price Forecast”, Annals of Economic and Social Measurement, 6, 27-56.
Engle, R. F. and C. W. J. Granger, (1987), “Co-integration and Error Correction: Representation, Estimation and
Testing”, Econometrica, 55, 251-276.
Evans, George, and Gulmani, R. (1984), “Tests for Rationality of the Carlson-Parkin Inflation Expectation
Data”, Oxford Bulletin of Economics and Statistics, 46, 1-19.
Figlewski, Stephen and Paul W. (1981), “The Formation of Inflationary Expectations”, Review of Economics
and Statistics, 63, 1-10.
Friedman, Benjamin, M. (1980), “Survey Evidence on the ‘Rationality’ of Interest Rate Expectations”, Journal
of Monetary Economics, 6, 453-465.
Gramlich, Edward, M. (1983), “Models of Inflation Expectations Formation: A comparison of Households and
Economist Forecasts”, Journal of Money, Credit and Banking, 15, 155-173.
Granger, C. W. J., (1981), “Some Properties of Time Series Data and Their Use in Econometric Model
Specification”, Journal of Econometrics, 16, 121-130.
Granger, C. W. J., (1989), “Forecasting In Business and Economics”, Second edition Academic Press, London,
page 194.
Granger, C. W. J., (1996), “Can We Improve the Perceived Quality of Economic Forecast?”, Journal of Applied
Econometrics, Vol. 11, No. 5, 455-473.
Government of Pakistan, Economic survey (various issues), Islamabad, Ministry of Finance.
Hafer, R. W. and Hein, S. E. (1985), “On the Accuracy of Time Series, Interest Rate, and Survey Forecast of
Inflation”, Journal of Business, 5, 377-398.
Holden, K. and Peel, D. A. (1990), “On Testing for Unbiasedness and Efficiency of Forecasts”, Manchester
School, 58, 120-127.
Lee, Bong-soo (1991), “On the Rationality of Forecasts”, The Review of Economics and Statistics, Vol. 73, No.
2, 365-370.
Liu, P. and G.S. Maddala (1992), “Rationality of Survey Data and Tests for Market Efficiency in the Foreign
Exchange Markets”, Journal of International Money and Finance, 11, 366-381
McNees, Stephen, K. (1986), “The Accuracy of Two Forecasting Techniques: Some Evidence and
Interpretations”, New England Economics Review, April, 20-31.
Mullineaux, D. J. (1978), “On Testing for Rationality: Another Look at the Livingston Price Expectations Data”,
Journal of Political Economy, 86, 329-336.
Pesando, J. E. (1975), “A Note on the Rationality of Livingston Price Expectations”, Journal of Political
Economy, 83, 849-858.
Pons, J. (1999), “Evaluating the OECD’s Forecasts for Economic Growth”, Applied Economics, 31, 893-902.
Pons, J. (2000), “The Accuracy of IMF and OECD Forecasts for G7 Countries”, Journal of Forecasting, 19, 56-63.
Razzak, W. A. (1997), “Testing the Rationality of the National Bank of New Zealand’s Survey Data”, National
Bank of New Zealand, G97/5
Rich, R. W. (1989), “Testing the Rationality of Inflation from Survey Data: Another Look at the SRC Expected
Price Change Data”, The Review of Economics and Statistics, Vol. 71, No. 4, 682-686.
Ruoss, E. and Marcel, S. (2002), “How accurate are GDP Forecast? An Empirical Study for Switzerland”,
Quarterly Bulletin, Swiss National Bank, Zurich, 3, 42-63.
Timmermann, A. (2005), “Forecast Combinations” forthcoming in Handbook of Economic Forecasting,
Amsterdam, North Holland.
Yule, G. U. (1927), “On a Method of Investigating Periodicities in Disturbed Series with Special Reference to
Wolfer’s Sunspot Numbers”, Philosophical Transactions of the Royal Society London, Ser. A, 226, 267-298.
Zarnowitz, V. (1985), “Rational Expectations and Macroeconomic Forecasts”, Journal of Business and
Economic Statistics, 3, 293-311.
Zarnowitz, V. and Phillip, B. (1993), “Twenty-Two Years of the NBER-ASA Quarterly Economic Outlook
Surveys: Aspects and Comparisons of Forecasting Performance”, Business Cycles, Indicators and Forecasting,
University of Chicago Press, 11-84
Appendix
Table 1.1
Forecast Statistics of Quarterly Data with VAR Model
CPIF CPIG
Included observations 129 127
Root Mean Squared Error 5.644 5.025
Mean Absolute Error 3.276 3.291
Mean Absolute Percentage Error 1.874 1.405
Theil Inequality Coefficient 0.010 0.008
Bias Proportion 0.74% 1.19%
Variance Proportion 0.26% 0.03%
Covariance Proportion 99.00% 98.78%
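The accuracy statistics in Table 1.1 are computed from an actual/forecast pair as sketched below. The data here are simulated stand-ins; only the formulas matter.

```python
# Standard forecast accuracy statistics of the kind reported in Table 1.1.
import numpy as np

rng = np.random.default_rng(4)
actual = rng.normal(100.0, 10.0, 129)
forecast = actual + rng.normal(0.0, 5.0, 129)

err = actual - forecast
rmse = np.sqrt(np.mean(err ** 2))                 # Root Mean Squared Error
mae = np.mean(np.abs(err))                        # Mean Absolute Error
mape = np.mean(np.abs(err / actual)) * 100        # Mean Absolute Percentage Error
# Theil inequality coefficient (U1), bounded between 0 and 1
theil_u = rmse / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(forecast ** 2)))
print(rmse, mae, mape, theil_u)
```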
Table 1.6
Strong Rationality Test Results (sample: 1975Q3 to 2008Q2)
Panel A: Benchmark forecast CPIF ARIMA, combined with CPIF VAR
  Sign of mean error: -ve, -ve
  α: 0.386856
  β: -0.042682
  Prob. Bias
  Conclusion: Cannot Reject
Panel B: Benchmark forecast CPIF VAR, combined with CPIF ARIMA
  Sign of mean error: -ve, -ve
  α: -0.387
  β: 0.043
  Prob.: 0.7267
  Conclusion: Cannot Reject
Table 1.6 shows the results: in Panel A the benchmark forecast is the ARIMA model, and in Panel B the benchmark is the VAR, combined with the ARIMA forecast of CPIF. The sign of α matches the sign of the mean forecast error in Panel B, so the forecast passes the test.