1. The document proposes a new measure of downside risk, realised semivariance (RS-), computed from high-frequency asset price data using only downward price movements.
2. RS- is shown to have important predictive properties for future volatility and provides new information about risk beyond what can be learned from quadratic variation and realised variance alone.
3. An empirical analysis of trades in General Electric stock finds the downside realised semivariance captures more dependence over time than upside realised semivariance, suggesting it may be a better measure of predictive risk.
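The core construction is simple enough to sketch. A minimal illustration in Python (not the paper's full estimator, which is developed in terms of quadratic-variation limits), splitting realised variance into its downside and upside parts:

```python
import math

def realised_semivariances(prices):
    """Split realised variance into downside (RS-) and upside (RS+) parts.

    RS- sums squared negative intraday log-returns, RS+ the non-negative
    ones; by construction RS- + RS+ equals the ordinary realised variance.
    """
    log_returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    rs_minus = sum(r * r for r in log_returns if r < 0)
    rs_plus = sum(r * r for r in log_returns if r >= 0)
    return rs_minus, rs_plus

# Illustrative intraday price path (hypothetical values)
rs_minus, rs_plus = realised_semivariances([100.0, 99.5, 100.2, 99.8, 100.1])
```

Because the two parts add up exactly to realised variance, RS- can carry information about downside risk that realised variance alone averages away.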
1. Economists care about standard errors in addition to point estimates because they quantify the confidence one can place in results. However, standard errors are often calculated incorrectly when observations are not independent or identically distributed.
2. Survey data is often clustered, stratified, and uses sampling weights, violating assumptions of independence and identical distribution. This can lead to standard errors that are too low if not properly accounted for.
3. Robust standard errors that account for survey design are important for many commonly used surveys to obtain accurate measures of confidence in results. Failing to use robust standard errors can result in incorrect statistical inference.
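The mechanics can be illustrated with the simplest possible estimator, a sample mean. This is a hedged sketch, not the procedure of any particular survey package: the cluster-robust variance sums residuals within each cluster before squaring them, so positive within-cluster correlation inflates the standard error relative to the i.i.d. formula.

```python
def clustered_se_of_mean(y, groups):
    """Cluster-robust standard error for a sample mean.

    Residuals within a cluster are summed before squaring, so positive
    within-cluster correlation raises the SE, as it should.
    """
    n = len(y)
    ybar = sum(y) / n
    cluster_sums = {}
    for v, g in zip(y, groups):
        cluster_sums[g] = cluster_sums.get(g, 0.0) + (v - ybar)
    var = sum(s * s for s in cluster_sums.values()) / (n * n)
    return ybar, var ** 0.5

def iid_se_of_mean(y):
    """Conventional standard error that assumes independent observations."""
    n = len(y)
    ybar = sum(y) / n
    var = sum((v - ybar) ** 2 for v in y) / (n * n)
    return var ** 0.5
```

In the extreme case where observations are duplicated within clusters, e.g. `y = [1, 1, 5, 5, 3, 3]` with pairwise clusters, the clustered standard error exceeds the i.i.d. one, illustrating how ignoring the design understates uncertainty.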
This document provides an overview of statistical concepts used for risk assessment. It discusses descriptive statistics such as measures of central tendency (mean, median, mode) and dispersion (variance, standard deviation) used to describe data. Inferential statistics use random sampling to draw conclusions about unknown populations. Regression analysis is used to construct risk models and measure relationships between variables by finding the regression line that best fits the data, with goodness of fit summarized by the R² value.
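As a concrete illustration of the regression-line idea, a minimal least-squares fit with its R² (a generic sketch, not tied to the document's own examples):

```python
def fit_line(xs, ys):
    """Least-squares regression line y = a + b*x and its R^2."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot         # fraction of variance explained
    return a, b, r2
```

On perfectly linear data R² equals 1; real risk data yields R² strictly below 1, and the gap is itself a rough measure of unexplained variation.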
The document discusses a model of dynamic trading volume under price impact. It begins with motivations and an outline of the model, which considers a representative agent facing constant investment opportunities and risk aversion, trading in a market with finite depth. Key results discussed include the optimal trading policy, welfare implications, and dynamics of the implied trading volume. Asymptotic expansions show turnover is proportional to displacement from the target risky weight and depends on parameters like volatility, risk aversion, and market depth. Trading volume is characterized as having properties similar to an Ornstein-Uhlenbeck process. The model provides a way to estimate market depth from observed trading volumes.
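The Ornstein-Uhlenbeck analogy can be made concrete with a short simulation. This is a generic Euler discretization of an OU process, not the paper's calibrated volume process; `theta`, `mu`, and `sigma` are illustrative parameters.

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, n, seed=0):
    """Euler scheme for dX = theta*(mu - X) dt + sigma dW.

    Mean-reverting toward mu, qualitatively like turnover driven by
    displacement from a target portfolio weight.
    """
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs
```

With `sigma = 0` the path decays deterministically toward `mu`, which makes the mean-reversion mechanism easy to verify before adding noise.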
This document discusses filtering and likelihood inference. It begins by introducing filtering problems in economics, such as evaluating DSGE models. It then presents the state space representation approach, which models the transition and measurement equations with stochastic shocks. The goal of filtering is to compute the conditional densities of states given observed data over time using tools like the Chapman-Kolmogorov equation and Bayes' theorem. Filtering provides a recursive way to make predictions and updates estimates as new data arrives.
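The predict/update recursion is easy to sketch for a finite state space. A minimal discrete-state filter step (a generic illustration, not a DSGE-specific filter):

```python
def bayes_filter_step(prior, transition, likelihood):
    """One predict/update cycle of a discrete-state filter.

    predict (Chapman-Kolmogorov): p(x_t) = sum_s p(x_t | s) p(s)
    update (Bayes' rule):         p(x_t | y_t) proportional to
                                  p(y_t | x_t) p(x_t)
    """
    n = len(prior)
    predicted = [sum(transition[s][x] * prior[s] for s in range(n))
                 for x in range(n)]
    unnorm = [likelihood[x] * predicted[x] for x in range(n)]
    z = sum(unnorm)                     # marginal likelihood of y_t
    return [u / z for u in unnorm]
```

Iterating this step over a data sample, and accumulating the normalizing constants `z`, is exactly what produces the likelihood used for inference.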
The document discusses three examples of nonlinear and non-Gaussian DSGE models. The first example features Epstein-Zin preferences to allow for a separation between risk aversion and the intertemporal elasticity of substitution. The second example models volatility shocks using time-varying variances. The third example aims to distinguish between the effects of stochastic volatility ("fortune") versus parameter drifting ("virtue") in explaining time-varying volatility in macroeconomic variables. The document outlines the motivation, structure, and solution methods for these three nonlinear DSGE models.
Real-time information reconstruction - The Prediction Market (raghavr186)
This document summarizes an approach to analyzing prediction markets using convex optimization. It discusses both offline and online formulations of the problem. The offline formulation involves accepting bids to maximize profit, which can be modeled as a linear program. Uniqueness of the solution is analyzed. The online formulation updates the model sequentially as bids arrive in real time. Various choices for the objective function are discussed, balancing risk aversion with expected gain. Truthfulness of bids is also addressed.
1) A K-map is a graphical representation of a logic function's truth table as an array. Each cell corresponds to an input combination.
2) Cells containing 1s can be combined if their input combinations differ in exactly one variable. The resulting product term keeps literals only for the variables that agree across the combined cells.
3) Finding a minimal sum involves expressing the function as a sum of prime implicants - product terms that cannot be combined further - that together cover all the 1-cells; every essential prime implicant (one that alone covers some 1-cell) must be included.
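The one-variable combining rule is mechanical enough to code directly. A small sketch, representing minterms as integers and the merged cube as a string with `-` for the eliminated variable:

```python
def combine_minterms(a, b, n_vars):
    """Try to merge two minterms (as integers) into one product term.

    They combine iff their binary codes differ in exactly one bit; the
    differing variable drops out, written '-' in the cube string.
    Returns None when the pair cannot be combined.
    """
    diff = a ^ b
    if diff == 0 or diff & (diff - 1):   # zero bits or more than one bit set
        return None
    bits = []
    for i in reversed(range(n_vars)):
        mask = 1 << i
        if mask == diff:
            bits.append('-')
        else:
            bits.append('1' if a & mask else '0')
    return ''.join(bits)
```

For example, minterms 5 (`0101`) and 7 (`0111`) merge into the cube `01-1`, dropping the variable in which they differ, while minterms 0 and 3 differ in two bits and cannot combine.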
The document outlines a model for analyzing transaction costs in portfolio choice. It presents explicit formulas for trading boundaries, certainty equivalent rates, liquidity premiums, and trading volumes in terms of model parameters like the spread. Graphs show how these quantities vary with factors like risk aversion. The results are obtained by solving a free boundary problem using a shadow price approach and smooth pasting conditions at the boundaries. Asymptotics of the solutions are also derived in terms of the spread approaching zero.
Hedging, Arbitrage, and Optimality with Superlinear Frictions (guasoni)
In a continuous-time model with multiple assets described by cadlag processes, this paper characterizes superhedging prices, absence of arbitrage, and utility maximizing strategies, under general frictions that make execution prices arbitrarily unfavorable for high trading intensity. With such frictions, dual elements correspond to a pair of a shadow execution price combined with an equivalent martingale measure. For utility functions defined on the real line, optimal strategies exist even if arbitrage is present, because it is not scalable at will.
This document discusses heterogeneous agent models without aggregate uncertainty. It introduces a model with a continuum of agents who face idiosyncratic income fluctuations but no aggregate shocks. There is a unique stationary equilibrium with constant interest rates and wages. The document discusses the recursive competitive equilibrium, existence and uniqueness of the stationary equilibrium, transition functions, computation methods, and some qualitative results from calibrating the model.
This document introduces new notation and concepts for modeling an open economy using the Regional Economy (REG) model. It defines symbols for variables such as exports (X), imports (IM), government expenditures (GTN), and regional holdings (N, S). It also presents the national income equations and other key equations that describe regional disposable income, taxes, wealth, consumption, money demand, and bond demand for the open REG model. The objectives are to analyze steady state solutions and conduct experiments by changing parameters like trade propensity (μS) and government spending.
Asset Prices in Segmented and Integrated Markets (guasoni)
This document summarizes a model of asset pricing in segmented and integrated markets. It begins with motivation from the financialization of commodities and market integration. It then presents a model with two regions/trees producing dividends. Equilibria are characterized for when the regions are segmented and integrated. Key results include asset prices being more cyclical and negatively correlated in segmentation, but highly positively correlated in integration. Integration always increases welfare even if it sometimes lowers asset prices and total wealth. Both regions would choose integration to access a smoother consumption stream.
Hierarchical Deterministic Quadrature Methods for Option Pricing under the Ro... (Chiheb Ben Hammouda)
Seminar talk at École des Ponts ParisTech about our recently published work "Hierarchical adaptive sparse grids and quasi-Monte Carlo for option pricing under the rough Bergomi model". - Link of the paper: https://www.tandfonline.com/doi/abs/10.1080/14697688.2020.1744700
Saddlepoint approximations, likelihood asymptotics, and approximate condition... (jaredtobin)
Maximum likelihood methods may be inadequate for parameter estimation in models where many nuisance parameters are present. The modified profile likelihood (MPL) of Barndorff-Nielsen (1983) serves as a highly accurate approximation to the marginal or conditional likelihood, when either exists, and can be viewed as an approximate conditional likelihood when they do not. We examine the modified profile likelihood, its variants, and its connections with Laplace and saddlepoint approximations through both theoretical and pragmatic lenses.
This document provides an overview of statistical tests commonly used in neuroimaging such as t-tests, ANOVAs, and regression. It discusses the purposes of these tests and how they are applied. T-tests are used to compare means, for example to determine if the difference between two conditions is statistically significant. ANOVAs examine variances and can be used when comparing more than two groups. Regression allows describing and predicting the relationship between variables and is useful in the general linear model approach used in SPM. Key assumptions and calculations for each method are outlined.
This document describes a collapsed dynamic factor analysis model for macroeconomic forecasting. It summarizes that multivariate time series models can more accurately capture relationships between economic variables compared to univariate models. The document then presents a collapsed dynamic factor model that relates a target time series (yt) to unobserved dynamic factors (Ft) estimated from related macroeconomic data (gt). Out-of-sample forecasting experiments on US personal income and industrial production data demonstrate the model achieves more accurate point forecasts than univariate benchmarks like random walk or AR(2) models.
This document contains notes from a Calculus I class at New York University. It discusses related rates problems, which involve taking derivatives of equations relating changing quantities to determine rates of change. The document provides examples of related rates problems involving an oil slick, two people walking towards and away from each other, and electrical resistors. It also outlines strategies for solving related rates problems, such as drawing diagrams, introducing notation, relating quantities with equations, and using the chain rule to solve for unknown rates.
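The oil-slick example follows the standard pattern: relate the quantities (a circular slick has A = πr²), then differentiate with the chain rule, giving dA/dt = 2πr · dr/dt. A small sketch checking the analytic rate against a finite difference (the radius and growth rate below are illustrative, not values from the class notes):

```python
import math

def slick_area_rate(radius, dr_dt):
    """Related rates for a circular oil slick.

    A = pi * r^2, so by the chain rule dA/dt = 2 * pi * r * dr/dt.
    """
    return 2.0 * math.pi * radius * dr_dt
```

With r = 3 and dr/dt = 0.5 the slick's area grows at 3π per unit time, and a central difference of A(t) = π(3 + 0.5t)² at t = 0 reproduces the same number.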
The document discusses methods for solving dynamic stochastic general equilibrium (DSGE) models. It outlines perturbation and projection methods for approximating the solutions of DSGE models. Perturbation methods use Taylor series expansions around a steady state, with linearization as the first-order case. Projection methods find parametric functions that best satisfy the model equations. The document provides examples applying these methods to solve a simple neoclassical growth model.
This document provides an overview of basic statistical concepts and terms. It discusses variables, observational vs experimental research, dependent and independent variables, measurement scales, systematic and random errors, accuracy vs precision, populations, distributions like binomial and normal, central tendency, dispersion, and other key statistical concepts. Examples are provided to illustrate statistical terminology.
6. Bounds test for cointegration within ARDL or VECM (Quang Hoang)
This document discusses using the bounds test approach within an autoregressive distributed lag (ARDL) model to test for cointegration and causality between time series variables. The ARDL model estimates error correction models in which the change in one variable (ΔYt or ΔXt) is regressed on lags of itself and of the other variable. The bounds test involves calculating an F-statistic and comparing it to critical value bounds: if the F-statistic exceeds the upper bound there is cointegration, if it falls below the lower bound there is no cointegration, and if it lies between the bounds the test is inconclusive. The document provides the null and alternative hypotheses for the bounds test when each variable is the dependent variable in the error correction model.
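The decision rule itself is a three-way comparison. A sketch in Python; the critical-value bounds are inputs because they must be taken from the appropriate table (e.g. Pesaran, Shin and Smith, 2001) rather than computed here:

```python
def bounds_test_decision(f_stat, lower_bound, upper_bound):
    """Decision rule for the ARDL bounds test for cointegration.

    The F-statistic is compared against tabulated critical-value
    bounds; between the bounds the test is inconclusive.
    """
    if f_stat > upper_bound:
        return "cointegration"
    if f_stat < lower_bound:
        return "no cointegration"
    return "inconclusive"
```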
The document describes the functions and objectives of a School Environmental Ethics Committee at the Instituto Normal para Varones de Oriente. Students from the MEOMA group will present the Committee's project to the school principal, to parents, and to the Industrial Arts teacher in order to win their support for spreading the environmental ethics process through the school and the community.
This document discusses assessment and evaluation in blended teaching. It distinguishes between assessing student learning to determine the quality of student work, and evaluating courses to determine effectiveness. It provides examples of formative assessments that can be used online like low-stakes quizzes and discussion forums. New forms of assessment in blended courses are discussed, like allowing additional resources on forums or documenting group work processes. Specific assessment tools are also outlined, including CATs (Classroom Assessment Techniques), rubrics, and checklists. CATs and rubrics are presented as beneficial for providing feedback and reducing instructor workload while avoiding high-stakes assessments.
Nicotinergic impact on focal and non-focal neuroplasticity induced by non-inv... (merzak emerzak)
This study investigated the effects of nicotine on neuroplasticity induced by non-invasive brain stimulation techniques. 48 healthy non-smokers received either nicotine or placebo patches combined with transcranial direct current stimulation (tDCS) or paired associative stimulation (PAS) applied to the motor cortex. Motor evoked potentials were measured as an indicator of corticospinal excitability. Nicotine abolished or reduced inhibitory neuroplasticity induced by both tDCS and PAS, but only slightly prolonged facilitatory plasticity induced by PAS. Thus, nicotine influences plasticity differently than global cholinergic enhancement, demonstrating discernible effects of activating nicotinic receptors.
The document discusses the National Education Technology Plan and how it supports the goals of the No Child Left Behind Act. It outlines seven action steps proposed by the plan, including strengthening leadership, improving teacher training, supporting e-learning, and integrating data systems. The plan aims to fully engage today's technology-savvy students and ensure they have the skills needed for a global, digital economy. If implemented well over the next decade, the plan could help boost student achievement through new, technology-facilitated models of education.
This document summarizes two online teaching courses. The first course, TEC 948, explores the history and evolution of online teaching and strategies for effective online instruction. It aims to strengthen skills for online teaching and meet needs of 21st century students. The second course, TEC 965, focuses on designing an online course module using Moodle. It guides participants in instructional design and creating standards-aligned online learning activities using sound pedagogical principles. Both courses emphasize skills for online instruction and use of educational technology.
Web 2.0: Instruction, Assessment, Differentiation (maryebennett)
This document discusses ways to differentiate instruction and assessment using Web 2.0 tools. It outlines universal design principles that provide flexibility and meet diverse learner needs. It also describes multiple intelligences and strategies for each, along with technology tools to support different learning styles, including verbal-linguistic, logical-mathematical, visual-spatial, musical-rhythmic, intrapersonal, bodily-kinesthetic, interpersonal, and naturalist intelligences. Specific Web 2.0 tools are suggested for differentiating instruction across content, process and product.
The document discusses the importance of customer service to business success. It notes that small and medium-sized business owners recognize customer service as an important concern, and that a wide range of resources exists on how to optimize human and material resources to improve service.
A history of optogenetics the development of tools for controlling brain circ...merzak emerzak
Optogenetics allows specific control of neural activity with light by expressing light-sensitive microbial opsins in neurons. The development of optogenetics involved adapting opsins like channelrhodopsin and halorhodopsin that transport ions in response to light. Channelrhodopsin was identified as enabling fast activation of neurons with light, and was expressed in neurons to control their activity, demonstrating the potential of optogenetics to causally study neural circuits.
On estimating the integrated co volatility usingkkislas
This document proposes a method to estimate the integrated co-volatility of two asset prices using high-frequency data that contains both microstructure noise and jumps.
It considers two cases - when the jump processes of the two assets are independent, and when they are dependent. For the independent case, it proposes an estimator that is robust to jumps. For the dependent case, it proposes a threshold estimator that combines pre-averaging to remove noise with a threshold method to reduce the effect of jumps. It proves the estimators are consistent and establishes their central limit theorems. Simulation results are also presented to illustrate the performance of the proposed methods.
This document summarizes research on small ball probabilities of α-stable Lévy processes. It provides definitions of strictly α-stable random variables and Lévy processes. It then discusses small deviations of symmetric α-stable Lévy processes and how these probabilities relate to functional laws of the iterated logarithm. Key results presented include asymptotic estimates of small deviation probabilities as the radius goes to zero for centered and shifted α-stable processes.
1. This document summarizes statistics on climate extremes, including time series plots and extreme value analyses of temperature and precipitation data from Houston, Texas.
2. Fitted generalized extreme value (GEV) distributions to one-day, three-day, and seven-day maximum precipitation values show increasing return levels with longer durations.
3. Bayesian and frequentist methods are demonstrated for fitting GEV distributions and estimating return levels of extreme precipitation events.
UT Austin - Portugal Lectures on Portfolio Choiceguasoni
This document outlines key concepts related to long-term investment opportunities and frictions. It discusses how investment opportunities depend on state variables that influence returns and risks over time. It also introduces the concepts of equivalent safe rate and equivalent annuity, which define the optimal growth rate of wealth or utility for a long-term investor. The document proposes solving for long-term optimal portfolios using duality bounds, stationary equations, and criteria for long-run optimality.
Certainly! "Modeling and simulation of energy efficiency measures in industrial processes" is a fascinating topic to explore in your research paper. Here are some key points and areas you can cover:
The document discusses various models that have been used to model power markets, including models derived from finance like Black-Scholes and multifactor models. It notes that most early models simply transposed models from finance without considering factors specific to power markets like seasonality. More recently, models have started to incorporate external variables like temperature and better represent features of power prices like switching behavior and jumps. Overall, significant work remains to develop models that fully capture the complexity of power markets.
This document discusses probability distributions for random variables. It introduces discrete distributions like the binomial and Poisson distributions which are used for counting experiments. It also introduces continuous distributions like the normal distribution which are defined over continuous ranges of values. Key concepts covered include probability density functions, cumulative distribution functions, and how to relate random variables with specific parameters to standard distributions. Examples are provided to illustrate concepts like modeling the number of plant stems in a sampling area with a Poisson distribution.
Basic concepts and how to measure price volatility
Presented by Carlos Martins-Filho at the AGRODEP Workshop on Analytical Tools for Food Prices
and Price Volatility
June 6-7, 2011 • Dakar, Senegal
For more information on the workshop or to see the latest version of this presentation visit: http://www.agrodep.org/first-annual-workshop
This document discusses regression analysis and correlation. It provides examples of functional and statistical relationships between variables. It shows how to find the least squares regression line that best fits a set of data and minimizes the prediction errors. This line can be used to predict the dependent variable from the independent variable. It also defines key regression concepts like the total sum of squares, sum of squares due to regression, sum of squared errors, coefficient of determination, and correlation coefficient.
A glimpse into mathematical finance? The realm of option pricing modelsHoracio González Duhart
This talk was given by Istvan Redl on the 8 October 2013 as part of the PSS at the University of Bath.
http://people.bath.ac.uk/hgd20/pss.html
Abstract: After introducing one of the most important concepts of mathematical finance, the fundamental theorem of asset pricing (FTAP) and the related no arbitrage pricing theory (NAPT), I will briefly discuss the main techniques and tools extensively used in option pricing, namely Monte Carlo, Fourier Transform and PDE methods. In order to give a fairly well-structured overview of a great chunk of currently preferred models, through a simple example the hierarchy of the mathematical models will be demonstrated by going from the basic Black-Scholes to some more advanced models, e.g. Stochastic Volatility with jumps. (Even those people, who are familiar with these concepts, might find the main focus, i.e. structured overview, of this talk beneficial).
Knowledge of cause-effect relationships is central to the field of climate science, supporting mechanistic understanding, observational sampling strategies, experimental design, model development and model prediction. While the major causal connections in our planet's climate system are already known, there is still potential for new discoveries in some areas. The purpose of this talk is to make this community familiar with a variety of available tools to discover potential cause-effect relationships from observed or simulation data. Some of these tools are already in use in climate science, others are just emerging in recent years. None of them are miracle solutions, but many can provide important pieces of information to climate scientists. An important way to use such methods is to generate cause-effect hypotheses that climate experts can then study further. In this talk we will (1) introduce key concepts important for causal analysis; (2) discuss some methods based on the concepts of Granger causality and Pearl causality; (3) point out some strengths and limitations of these approaches; and (4) illustrate such methods using a few real-world examples from climate science.
Estimation of the score vector and observed information matrix in intractable...Pierre Jacob
This document discusses methods for estimating derivatives of intractable likelihoods. It introduces shift estimators that use a normal prior distribution centered on the parameter value. As the prior variance goes to zero, the posterior mean approximates the score vector. Monte Carlo methods can be used to estimate the posterior moments and provide estimators of the score vector and observed information matrix with good asymptotic properties. Shift estimators are more robust than finite difference methods when the likelihood estimators have high variance. The methods have applications to hidden Markov models and other intractable models.
Portfolios and Risk Premia for the Long Runguasoni
This document summarizes a research paper on modeling long-run optimal portfolios and risk premia. It includes:
1) An outline describing the goal of developing a tractable framework for portfolio choice and derivatives pricing using a model with stochastic investment opportunities across several assets.
2) A section on the main result regarding long-run portfolios and risk premia, and implications like static fund separation and horizon effects.
3) An overview of the solution method which involves differential equations to identify candidate solutions, finite-horizon bounds to characterize performance, and conditions for long-run optimality.
2 Review of Statistics. 2 Review of Statistics.WeihanKhor2
This document provides an overview of discrete probability distributions, including the binomial and Poisson distributions.
1) It defines key concepts such as random variables, probability mass functions, and expected value as they relate to discrete random variables. 2) The binomial distribution describes independent Bernoulli trials with a constant probability of success, and is used to calculate probabilities of outcomes from events like coin flips. 3) The Poisson distribution approximates the binomial when the number of trials is large and the probability of success is small. It models rare, independent events with a constant average rate and can be used for problems involving traffic accidents or natural disasters.
When trading incurs proportional costs, leverage can scale an asset's return only up to a maximum multiple, which is sensitive to the asset's volatility and liquidity. In a continuous-time model with one safe and one risky asset with constant investment opportunities and proportional transaction costs, we find the efficient portfolios that maximize long term expected returns for given average volatility. As leverage and volatility increase, rising rebalancing costs imply a declining Sharpe ratio. Beyond a critical level, even the expected return declines. For funds that seek to replicate multiples of index returns, such as leveraged ETFs, our efficient portfolios optimally trade off alpha against tracking error.
Option Pricing under non constant volatilityEcon 643 Fina.docxjacksnathalie
Option Pricing under non constant volatility
Econ 643: Financial Economics II
Econ 643: Financial Economics II Non constant volatility 1 / 21
Department of Economics
Introduction
Attempts have been made to fix option pricing puzzles: How to be
consistent with volatility smile and smirk.
The Gram-Charlier expansion is one of then but volatility is constant
which is inconsistent with asset return’s dynamics
We review thre approaches that aim at integrating information
embedded in past returns:
GARCH type of approach,
Stochastic volatility models: Hull and White (1987),
Stochastic volatility models: Heston (1993),
Econ 643: Financial Economics II Non constant volatility 2 / 21
The GARCH option pricing
Let St be the asset price at time t and rt = ln(St/St−1) be the log-return
process. Assume that the process rt is a (G)GARCH(1,1) process:
rt = ln
(
St
St−1
)
= µt−1 + σt−1zt, zt ∼ NID(0, 1)
σ2t = ω + α(σt−1zt − θσt−1)2 + βσ2t−1.
(1)
In this model,
µt−1 = E(rt|Jt−1) is a known function of past returns.
Ex: µt = 0, µt = µ = cst, µt = µ + λσt, µt = r + λσt − 12σ
2
t , etc.
σ2t−1 = Var(rt|Jt−1) is the conditional variance of rt given the
information Jt−1 available at t − 1.
Econ 643: Financial Economics II Non constant volatility 3 / 21
GARCH: How to price options on S?
We can rely on the risk-neutral approach:
C = e−rτ E∗ (max(ST − X, 0)) ,
where E∗ is the expectation under risk-neutral dynamics.
What is the risk-neutral dynamics of St if ln(St/St−1) is a
GARCH(1,1)?
Under risk-neutral dyn., E∗
(
St
St−1
)
= er and Var∗(rt|Jt−1) = σ2t−1
(same as under historical measure). Hence, if rt ∼ GARCH(1, 1)
under risk-neutral, the corresponding mean has to be
µ∗t−1 = r −
σ2
t−1
2
. That is:
rt = r −
σ2
t−1
2
+ σt−1z
∗
t , z
∗
t ∼ NID(0, 1)
σ2t = ω + α(σt−1z
∗
t + r −
σ2
t−1
2
− µt−1 − θσt−1)2 + βσ2t−1.
(2)
Econ 643: Financial Economics II Non constant volatility 4 / 21
GARCH: Simulating the option price
To obtain the price C by simulation:
Simulate B paths of stock price using the risk-neutral dynamics (2):
(S
(b)
t+1, S
(b)
t+2, . . . , S
(b)
T
) for b = 1, . . . , B (e.g. B = 5000).
Obtain the simulated price as
Ĉ = e−rτ Ê(max(ST − X, 0)),
with
Ê(max(ST − X, 0)) =
1
B
B
∑
b=1
max(S
(b)
T
− X, 0).
Econ 643: Financial Economics II Non constant volatility 5 / 21
Option pricing under stochastic volatility
GARCH option pricing is convenient but evidence are out that
volatility is more likely stochastic.
Option pricing under SV is quite challenging because of the extra
source of uncertainty brought by the volatility equation.
The induced PDE (by SV) for option pricing can be derived but is
hard to solve.
The most common SV option pricing models are from Hull and White
(1987) and Heston (1993).
Econ 643: Financial Economics II Non constant volatility 6 / 21
Hull and White (1987)
Consider the price process St and its instantaneous variance process
V − t = σ2t obeying the dynamics:
dS.
Macroeconomic fluctuations with HANK & SAM ADEMU_Project
This document summarizes an analytical framework for studying macroeconomic fluctuations combining heterogeneous agent models (HANK) with search and matching frictions in the labor market (SAM). It shows that the interaction between endogenous income risk from SAM unemployment and precautionary savings from HANK can amplify business cycles. Specifically, it finds that this interaction can lead to an unemployment trap, cause a breakdown of the Taylor principle, amplify shock responses, and create a positive nexus between labor market tightness and the real interest rate due to time-varying unemployment risk. The framework is calibrated to match U.S. macro data and its implications are derived both analytically and quantitatively.
Nonlinear Price Impact and Portfolio Choiceguasoni
In a market with price-impact proportional to a power of the order flow, we derive optimal trading policies and their implied welfare for long-term investors with constant relative risk aversion, who trade one safe asset and one risky asset that follows geometric Brownian motion. These quantities admit asymptotic explicit formulas up to a structural constant that depends only on the price-impact exponent. Trading rates are finite as with linear impact, but they are lower near the target portfolio, and higher away from the target. The model nests the square-root impact law and, as extreme cases, linear impact and proportional transaction costs.
FellowBuddy.com is an innovative platform that brings students together to share notes, exam papers, study guides, project reports and presentation for upcoming exams.
We connect Students who have an understanding of course material with Students who need help.
Benefits:-
# Students can catch up on notes they missed because of an absence.
# Underachievers can find peer developed notes that break down lecture and study material in a way that they can understand
# Students can earn better grades, save time and study effectively
Our Vision & Mission – Simplifying Students Life
Our Belief – “The great breakthrough in your life comes when you realize it, that you can learn anything you need to learn; to accomplish any goal that you have set for yourself. This means there are no limits on what you can be, have or do.”
Like Us - https://www.facebook.com/FellowBuddycom
Similar to Measuring Downside Risk — Realised Semivariance (20)
The document examines mirror neuron activity in children with autism spectrum disorders (ASD) compared to typically developing children. Ten high-functioning children with ASD and ten controls underwent fMRI scans imitating and observing emotional facial expressions. While both groups performed equally well, the children with ASD showed no mirror neuron activity in the inferior frontal gyrus, an area associated with the mirror neuron system. Activity in this area was inversely related to social symptom severity in ASD, suggesting dysfunctional mirror neurons may underlie social deficits in autism.
The Age Of Balance Sheet Recessions What Post 2008 U.S., Europe And China Can...merzak emerzak
The document discusses balance sheet recessions and lessons that can be learned from Japan's experience in the 1990s-2000s. It provides exhibits showing how in a balance sheet recession, private sector prioritizes debt repayment over spending even at low interest rates, causing a deflationary gap. It discusses how Japan addressed this by borrowing and spending the private sector's excess savings, sustaining GDP growth despite large deficits that did not raise rates. However, premature fiscal reforms weakened the economy and increased deficits. The key lesson is that fiscal stimulus is needed until private sector balance sheets are repaired in a balance sheet recession.
1) Yijinjing is an ancient Chinese exercise originating from Shaolin Temple combining stretching and breathing.
2) Engineer Liu Sha practices yijinjing and finds it helps with shortness of breath, appetite, and neck pain under instructor Liu Yuchao.
3) Dr. Liu Yuchao from Yueyang Chinese Medicine Hospital teaches yijinjing classes in Shanghai to help patients and promote the exercises.
This document provides an abstract for a paper that will be presented at the 12th Pacific Rim Real Estate Society Conference from January 22-25, 2006 in Auckland, New Zealand. The paper aims to examine the relationships between size and systematic downside risk and unsystematic downside risk for Malaysian property shares. While previous studies have looked at the relationship between size and risk using variance as a risk measure, this study uses the alternative measure of downside risk. The abstract outlines the concepts of systematic and unsystematic downside risk, reviews prior literature, and previews the methodology and findings of the research.
1) Portfolio theory shows that risk and return are negatively correlated, and diversification across many assets reduces unsystematic risk. Only systematic risk cannot be eliminated through diversification.
2) The efficient frontier graphs the set of optimal portfolios that maximize return for a given level of risk. The capital market line depicts the combination of investments in the market portfolio and risk-free asset.
3) The Capital Asset Pricing Model derives the security market line relationship between risk and return, defining risk as an asset's beta coefficient measuring its volatility relative to the market.
This document discusses quadratic programming problems and the efficient frontier concept in portfolio optimization. It addresses combinations of risk-free rates and risky portfolios, how risky portfolio A relates to lending and borrowing at the risk-free rate, and how the efficient frontier is shaped when lending is allowed at the risk-free rate but borrowing is limited.
Executive Directors Chat Leveraging AI for Diversity, Equity, and InclusionTechSoup
Let’s explore the intersection of technology and equity in the final session of our DEI series. Discover how AI tools, like ChatGPT, can be used to support and enhance your nonprofit's DEI initiatives. Participants will gain insights into practical AI applications and get tips for leveraging technology to advance their DEI goals.
Strategies for Effective Upskilling is a presentation by Chinwendu Peace in a Your Skill Boost Masterclass organisation by the Excellence Foundation for South Sudan on 08th and 09th June 2024 from 1 PM to 3 PM on each day.
This presentation includes basic of PCOS their pathology and treatment and also Ayurveda correlation of PCOS and Ayurvedic line of treatment mentioned in classics.
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
A review of the growth of the Israel Genealogy Research Association Database Collection for the last 12 months. Our collection is now passed the 3 million mark and still growing. See which archives have contributed the most. See the different types of records we have, and which years have had records added. You can also see what we have for the future.
Walmart Business+ and Spark Good for Nonprofits.pdfTechSoup
"Learn about all the ways Walmart supports nonprofit organizations.
You will hear from Liz Willett, the Head of Nonprofits, and hear about what Walmart is doing to help nonprofits, including Walmart Business and Spark Good. Walmart Business+ is a new offer for nonprofits that offers discounts and also streamlines nonprofits order and expense tracking, saving time and money.
The webinar may also give some examples on how nonprofits can best leverage Walmart Business+.
The event will cover the following::
Walmart Business + (https://business.walmart.com/plus) is a new shopping experience for nonprofits, schools, and local business customers that connects an exclusive online shopping experience to stores. Benefits include free delivery and shipping, a 'Spend Analytics” feature, special discounts, deals and tax-exempt shopping.
Special TechSoup offer for a free 180 days membership, and up to $150 in discounts on eligible orders.
Spark Good (walmart.com/sparkgood) is a charitable platform that enables nonprofits to receive donations directly from customers and associates.
Answers about how you can do more with Walmart!"
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
বাংলাদেশের অর্থনৈতিক সমীক্ষা ২০২৪ [Bangladesh Economic Review 2024 Bangla.pdf] কম্পিউটার , ট্যাব ও স্মার্ট ফোন ভার্সন সহ সম্পূর্ণ বাংলা ই-বুক বা pdf বই " সুচিপত্র ...বুকমার্ক মেনু 🔖 ও হাইপার লিংক মেনু 📝👆 যুক্ত ..
আমাদের সবার জন্য খুব খুব গুরুত্বপূর্ণ একটি বই ..বিসিএস, ব্যাংক, ইউনিভার্সিটি ভর্তি ও যে কোন প্রতিযোগিতা মূলক পরীক্ষার জন্য এর খুব ইম্পরট্যান্ট একটি বিষয় ...তাছাড়া বাংলাদেশের সাম্প্রতিক যে কোন ডাটা বা তথ্য এই বইতে পাবেন ...
তাই একজন নাগরিক হিসাবে এই তথ্য গুলো আপনার জানা প্রয়োজন ...।
বিসিএস ও ব্যাংক এর লিখিত পরীক্ষা ...+এছাড়া মাধ্যমিক ও উচ্চমাধ্যমিকের স্টুডেন্টদের জন্য অনেক কাজে আসবে ...
How to Manage Your Lost Opportunities in Odoo 17 CRMCeline George
Odoo 17 CRM allows us to track why we lose sales opportunities with "Lost Reasons." This helps analyze our sales process and identify areas for improvement. Here's how to configure lost reasons in Odoo 17 CRM
This presentation was provided by Steph Pollock of The American Psychological Association’s Journals Program, and Damita Snow, of The American Society of Civil Engineers (ASCE), for the initial session of NISO's 2024 Training Series "DEIA in the Scholarly Landscape." Session One: 'Setting Expectations: a DEIA Primer,' was held June 6, 2024.
How to Fix the Import Error in the Odoo 17Celine George
An import error occurs when a program fails to import a module or library, disrupting its execution. In languages like Python, this issue arises when the specified module cannot be found or accessed, hindering the program's functionality. Resolving import errors is crucial for maintaining smooth software operation and uninterrupted development processes.
Measuring downside risk — realised semivariance
Ole E. Barndorff-Nielsen
The T.N. Thiele Centre for Mathematics in Natural Science,
Department of Mathematical Sciences,
University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark
oebn@imf.au.dk
Silja Kinnebrock
Oxford-Man Institute, University of Oxford,
Blue Boar Court, 9 Alfred Street, Oxford OX1 4EH, UK
&
Merton College, University of Oxford
silja.kinnebrock@oxford-man.ox.ac.uk
Neil Shephard
Oxford-Man Institute, University of Oxford,
Blue Boar Court, 9 Alfred Street, Oxford OX1 4EH, UK
&
Department of Economics, University of Oxford
neil.shephard@economics.ox.ac.uk
January 21, 2008
Abstract
We propose a new measure of risk, based entirely on downwards moves measured using high
frequency data. Realised semivariances are shown to have important predictive qualities for
future market volatility. The theory of these new measures is spelt out, drawing on some new
results from probability theory.
Keywords: Market frictions; Quadratic variation; Realised variance; Semimartingale; Semivariance.
‘It was understood that risk relates to an unfortunate event occurring, so for an
investment this corresponds to a low, or even negative, return. Thus getting returns in
the lower tail of the return distribution constitutes this “downside risk.” However, it is
not easy to get a simple measure of this risk.’ Quoted from Granger (2008).
1 Introduction
A number of economists have wanted to measure downside risk, the risk of prices falling, just
using information based on negative returns — a prominent recent example is by Ang, Chen, and
Xing (2006). This has been operationalised by quantities such as semivariance, value at risk and
expected shortfall, which are typically estimated using daily returns. In this paper we introduce
a new measure of the variation of asset prices based on high frequency data. It is called realised
semivariance (RS). We derive its limiting properties, relating it to quadratic variation and, in
particular, negative jumps. Further, we show it has some useful properties in empirical work,
enriching the standard ARCH models pioneered by Rob Engle over the last 25 years and building
on the recent econometric literature on realised volatility.
Realised semivariance extends the influential work of, for example, Andersen, Bollerslev, Diebold, and Labys (2001) and Barndorff-Nielsen and Shephard (2002), on formalising so-called realised variances (RV), which links these commonly used statistics to the quadratic variation process. Realised
semivariance measures the variation of asset price falls. At a technical level it can be regarded as
a continuation of the work of Barndorff-Nielsen and Shephard (2004) and Barndorff-Nielsen and
Shephard (2006), who showed it is possible to go inside the quadratic variation process and separate
out components of the variation of prices into that due to jumps and that due to the continuous evolution. This work has prompted papers by, for example, Andersen, Bollerslev, and Diebold (2007),
Huang and Tauchen (2005) and Lee and Mykland (2008) on the importance of this decomposition
empirically in economics. Surveys of this kind of thinking are provided by Andersen, Bollerslev,
and Diebold (2006) and Barndorff-Nielsen and Shephard (2007), while a lengthy discussion of the
relevant probability theory is given in Jacod (2007).
Let us start with statistics and results which are well known. Realised variance (RV) estimates
the ex-post variance of asset prices over a fixed time period. We will suppose that this period is
0 to 1. In our applied work it can be thought of as any individual day of interest. Then RV is
defined as
$$\mathrm{RV} = \sum_{j=1}^{n} \left(Y_{t_j} - Y_{t_{j-1}}\right)^2,$$
where $0 = t_0 < t_1 < \cdots < t_n = 1$ are the times at which (trade or quote) prices are available. For arbitrage-free markets, $Y$ must follow a semimartingale. This estimator converges, as we have more and more data in that interval, to the quadratic variation at time one,
$$[Y]_1 = \operatorname*{p\text{-}lim}_{n \to \infty} \sum_{j=1}^{n} \left(Y_{t_j} - Y_{t_{j-1}}\right)^2$$
(e.g. Protter (2004, p. 66–77)) for any sequence of deterministic partitions $0 = t_0 < t_1 < \cdots < t_n = 1$ with $\sup_j \{t_{j+1} - t_j\} \to 0$ as $n \to \infty$. This limiting operation is often referred to as “in-fill asymptotics” in statistics and econometrics.¹
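As a concrete illustration of these definitions, here is a minimal Python sketch (function names are ours, not the paper's) that computes RV from an intraday price grid and checks its in-fill behaviour on a simulated constant-volatility Brownian path, where the limit is simply $\sigma^2$:

```python
import numpy as np

def realised_variance(prices):
    """RV = sum_j (Y_{t_j} - Y_{t_{j-1}})^2, with Y the log price."""
    y = np.log(np.asarray(prices, dtype=float))
    r = np.diff(y)                       # high-frequency returns
    return float(np.sum(r ** 2))

# In-fill check: for Brownian motion with constant volatility sigma,
# RV over [0, 1] should be close to the integrated variance sigma^2.
rng = np.random.default_rng(42)
sigma, n = 0.2, 100_000
returns = sigma / np.sqrt(n) * rng.standard_normal(n)
prices = 100.0 * np.exp(np.concatenate(([0.0], np.cumsum(returns))))
rv = realised_variance(prices)           # approx. sigma**2 = 0.04
```

With a finer and finer grid the estimate concentrates around $\sigma^2$, exactly as the convergence result above predicts.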
One of the initially strange things about realised variance is that it solely uses squares of the
data, while the research of, for example, Black (1976), Nelson (1991), Glosten, Jagannathan, and
Runkle (1993) and Engle and Ng (1993) has indicated the importance of falls in prices as a driver
of conditional variance. The reason for this is clear: as the high frequency data becomes dense, the extra information in the sign of the data can fall to zero — see also the work of Nelson (1992).
The most elegant framework in which to see this is where $Y$ is a Brownian semimartingale
$$Y_t = \int_0^t a_s\,\mathrm{d}s + \int_0^t \sigma_s\,\mathrm{d}W_s, \qquad t \ge 0,$$
where $a$ is a locally bounded predictable drift process and $\sigma$ is a càdlàg volatility process – all adapted to some common filtration $\mathcal{F}_t$, implying the model can allow for classic leverage effects. For such a process
$$[Y]_t = \int_0^t \sigma_s^2\,\mathrm{d}s,$$
and so
$$\mathrm{d}[Y]_t = \sigma_t^2\,\mathrm{d}t,$$
which means for a Brownian semimartingale the QV process tells us everything we can know about the ex-post variation of $Y$. The signs of the returns are irrelevant in the limit — this is true whether there is leverage or not.
If there are jumps in the process there are additional things to learn beyond the QV process. Let
$$Y_t = \int_0^t a_s\,\mathrm{d}s + \int_0^t \sigma_s\,\mathrm{d}W_s + J_t,$$
where $J$ is a pure jump process. Then, writing the jumps in $Y$ as $\Delta Y_t = Y_t - Y_{t-}$,
$$[Y]_t = \int_0^t \sigma_s^2\,\mathrm{d}s + \sum_{s \le t} (\Delta Y_s)^2,$$
1
When there are market frictions it is possible to correct this statistic for their effect using the two scale estimator
of Zhang, Mykland, and A¨ ıt-Sahalia (2005), the realised kernel of Barndorff-Nielsen, Hansen, Lunde, and Shephard
(2006) or the pre-averaging based statistic of Jacod, Li, Mykland, Podolskij, and Vetter (2007).
and so QV aggregates two sources of risk. Even when we employ bipower variation (Barndorff-Nielsen and Shephard (2004) and Barndorff-Nielsen and Shephard (2006)2), which allows us to estimate $\int_0^t \sigma_s^2 \, ds$ robustly to jumps, this still leaves us with estimates of $\sum_{s \leq t} (\Delta J_s)^2$. This tells us nothing about the asymmetric behaviour of the jumps — which is important if we wish to understand downside risk.
In this paper we introduce the downside realised semivariance ($\mathrm{RS}^-$)
$$ \mathrm{RS}^- = \sum_{j=1}^{t_j \leq 1} \left( Y_{t_j} - Y_{t_{j-1}} \right)^2 1_{\{ Y_{t_j} - Y_{t_{j-1}} \leq 0 \}} , $$
where $1_{\{y\}}$ is the indicator function taking the value 1 if the argument $y$ is true. We will study the behaviour of this statistic under in-fill asymptotics. In particular we will see that
$$ \mathrm{RS}^- \xrightarrow{\ p\ } \frac{1}{2} \int_0^1 \sigma_s^2 \, ds + \sum_{s \leq 1} (\Delta Y_s)^2 1_{\{ \Delta Y_s \leq 0 \}} , $$
under in-fill asymptotics. Hence $\mathrm{RS}^-$ provides a new source of information, one which focuses on squared negative jumps.3 Of course the corresponding upside realised semivariance
$$ \mathrm{RS}^+ = \sum_{j=1}^{t_j \leq 1} \left( Y_{t_j} - Y_{t_{j-1}} \right)^2 1_{\{ Y_{t_j} - Y_{t_{j-1}} \geq 0 \}} \xrightarrow{\ p\ } \frac{1}{2} \int_0^1 \sigma_s^2 \, ds + \sum_{s \leq 1} (\Delta Y_s)^2 1_{\{ \Delta Y_s \geq 0 \}} , $$
may be of particular interest to investors who have short positions in the market (hence a fall in price can lead to a positive return and hence is desirable), such as hedge funds. Of course,
$$ \mathrm{RV} = \mathrm{RS}^- + \mathrm{RS}^+ . $$
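To make the definitions concrete, the following is a minimal sketch (ours, not the paper's downside.ox code) of how RV, RS− and RS+ can be computed from one day of high-frequency prices; the function name and the convention of placing zero returns in RS+ only (which makes RV = RS− + RS+ hold exactly in finite samples) are our choices.

```python
import numpy as np

def realised_measures(prices):
    """Realised variance and semivariances from one day of high-frequency prices.

    `prices` is an array of intraday prices (e.g. every 15th trade, as in the
    paper's empirical work). Returns (RV, RS-, RS+). Assigning zero returns to
    RS+ only is our convention; it makes RV = RS- + RS+ hold exactly.
    """
    r = np.diff(np.log(prices))        # high-frequency log returns
    rv = np.sum(r ** 2)                # realised variance
    rs_minus = np.sum(r[r < 0] ** 2)   # downside realised semivariance
    rs_plus = np.sum(r[r >= 0] ** 2)   # upside realised semivariance
    return rv, rs_minus, rs_plus
```

In practice one such triple would be computed per trading day, giving the daily series studied in Section 3.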
Semivariances, or more generally measures of variation below a threshold (target semivariance), have a long history in finance. The first references are probably Markowitz (1959), Mao (1970b), Mao (1970a), Hogan and Warren (1972) and Hogan and Warren (1974). Examples include the work of Fishburn (1977) and Lewis (1990). Sortino ratios (which are an extension of Sharpe ratios and were introduced by Sortino and van der Meer (1991)), and the so-called post-modern portfolio theory of, for example, Rom and Ferguson (1993), have attracted attention. Sortino and Satchell (2001) look at recent developments and provide a review, while Pedersen and Satchell (2002) look at the economic theory of this measure of risk. Of course these types of measures are likely not
2 Threshold based decompositions have also been suggested in the literature; examples include Mancini (2001), Jacod (2007) and Lee and Mykland (2008).
3 This type of statistic relates to the work of Babsiria and Zakoian (2001), who built separate ARCH type conditional variance models of daily returns using positive and negative daily returns. It also resonates with the empirical results in a recent paper by Chen and Ghysels (2007) on news impact curves estimated through semi-parametric MIDAS regressions.
to be very informative for exchange rate investments or for the individual holdings of hedge funds
which can go either short or long. Our innovation is to bring high frequency analysis to bear on
this measure of risk.
The empirical essence of daily downside realised semivariance can be gleaned from Figure 1
which shows an analysis of trades on General Electric (GE) carried out on the New York Stock
Exchange4 from 1995 to 2005 (giving us 2,616 days of data). In graph (a) we show the path of the
trades drawn in trading time on a particular randomly chosen day in 2004, to illustrate the amount
of daily trading which is going on in this asset. Notice by 2004 the tick size has fallen to one cent.
Graph (b) shows the open to close returns, measured on the log-scale and multiplied by 100,
which indicates some moderation in the volatility during the last and first piece of the sample
period. The corresponding daily realised volatility (the square root of the realised variance) is
plotted in graph (c), based upon returns calculated every 15 trades. The Andersen, Bollerslev,
Diebold, and Labys (2000) variance signature plot is shown in graph (d), to assess the impact of
noise on the calculation of realised volatility. It suggests statistics computed on returns calculated
every 15 trades should not be too sensitive to noise for GE. Graph (e) shows the same but focusing
on daily RS − and RS + . Throughout, the statistics are computed using returns calculated every
15 trades. It indicates they are pretty close to one another on average over this sample period.
This component signature plot is in the spirit of the analysis pioneered by Andersen, Bollerslev,
Diebold, and Labys (2001) in their analysis of realised variance. Graph (f) shows the correlogram
for the downside realised semivariance and the realised variance and suggests the downside realised
semivariance has much more dependence in it than RS + . Some summary statistics for this data
are available in Table 2, which will be discussed in some detail in Section 3.
In the realised volatility literature, authors have typically worked out the impact of using realised
volatilities on volatility forecasting using regressions of future realised variance on lagged realised
variance and various other explanatory variables.5 Engle and Gallo (2006) prefer a different route, which is to add lagged realised quantities as variance regressors in Engle (2002) and Bollerslev (1986) GARCH type models of daily returns — the reason for their preference is that this approach is aimed at a key quantity, a predictive model of future returns, and is more robust to the heteroskedasticity inherent in the data. Typically when Engle generalises to allow for leverage he uses the Glosten, Jagannathan, and Runkle (1993) (GJR) extension. This is the method we follow here. Throughout
we will use the subscript i to denote discrete time.
4 This data is taken from the TAQ database, managed through WRDS. Although information on trades is available from all the different exchanges in the U.S., we solely study trades which are made at the exchange in New York.
5 Leading references include Andersen, Bollerslev, Diebold, and Labys (2001) and Andersen, Bollerslev, and Meddahi (2004).
[Figure 1 appears here; panels: (a) trading prices on a day in 2004, (b) daily log returns (open to close) times 100, (c) daily realised volatility computed every 15 trades, (d) ABDL variance signature plot, (e) component variance signature plot, (f) ACF of the components of realised variance.]
Figure 1: Analysis of trades on General Electric carried out on the NYSE from 1995 to 2005. (a): path of the trades drawn in trading time on a random day in 2004. (b): daily open to close returns $r_i$, measured on the log-scale and multiplied by 100. (c): the corresponding daily realised volatility ($\sqrt{\mathrm{RV}_i}$), based upon returns calculated every 15 trades. (d): variance signature plot in trade time to assess the impact of noise on the calculation of realised variance (RV). (e): the same thing, but for the realised semivariances ($\mathrm{RS}_i^+$ and $\mathrm{RS}_i^-$). (f): correlogram for $\mathrm{RS}_i^+$, $\mathrm{RV}_i$ and $\mathrm{RS}_i^-$. Code: downside.ox.
We model daily open to close returns $\{r_i;\ i = 1, 2, \ldots, T\}$ as
$$ \mathrm{E}(r_i \mid \mathcal{G}_{i-1}) = \mu, $$
$$ h_i = \mathrm{Var}(r_i \mid \mathcal{G}_{i-1}) = \omega + \alpha (r_{i-1} - \mu)^2 + \beta h_{i-1} + \delta (r_{i-1} - \mu)^2 1_{\{ r_{i-1} - \mu < 0 \}} + \gamma' z_{i-1}, $$
and then use a standard Gaussian quasi-likelihood to make inference on the parameters, e.g. Bollerslev and Wooldridge (1992). Here $z_{i-1}$ are the lagged daily realised regressors and $\mathcal{G}_{i-1}$ is the information set generated by discrete time daily statistics available to forecast $r_i$ at time $i-1$.
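The recursion above is straightforward to code. The following is a minimal sketch of the Gaussian quasi-log-likelihood for this GJR-GARCH model with one lagged realised regressor; it is our own illustration, not the G@ARCH 5.0 code used in the paper, and the start-up value for the conditional variance is our assumption.

```python
import numpy as np

def garch_x_loglik(params, r, z):
    """Gaussian quasi-log-likelihood of the conditional variance model above.

    params = (mu, omega, alpha, beta, delta, gamma); r are daily open to close
    returns and z is the lagged daily realised regressor (e.g. RS-). The
    initialisation h[0] = sample variance is our choice, not the paper's.
    """
    mu, omega, alpha, beta, delta, gamma = params
    e = r - mu
    h = np.empty_like(r)
    h[0] = np.var(r)
    for i in range(1, len(r)):
        h[i] = (omega + alpha * e[i - 1] ** 2 + beta * h[i - 1]
                + delta * e[i - 1] ** 2 * (e[i - 1] < 0)  # GJR leverage term
                + gamma * z[i - 1])                       # realised regressor
    return -0.5 * np.sum(np.log(2.0 * np.pi * h) + e ** 2 / h)
```

In practice the parameters would be chosen to maximise this function numerically, with robust (sandwich) standard errors in the spirit of Bollerslev and Wooldridge (1992).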
Table 1 shows the fit of the GE trade data from 1995–2005. It indicates that the lagged RS− beats out both the pure GARCH model (δ = 0) and the lagged RV. Both realised terms yield large likelihood
improvements over a standard daily returns based GARCH. Importantly there is a vast shortening
in the information gathering period needed to condition on, with the GARCH memory parameter
β dropping from 0.953 to around 0.7. This makes fitting these realised based models much easier
in practice, allowing their use on relatively short time series of data.
When the comparison with the GJR model is made, which allows for traditional leverage effects,
the results are more subtle, with the RS − significantly reducing the importance of the traditional
leverage effect while the high frequency data still has an important impact on improving the fit of
the model. In this case the RS − and RV play similar roles, with RS − no longer dominating the
impact of the RV in the model.
ARCH type models and lagged realised semivariance and variance

                             GARCH                                  GJR
                 (i)      (ii)     (iii)    (iv)       (i)      (ii)     (iii)    (iv)
Lagged RS−      0.685    0.499      .        .        0.371    0.441      .        .
               (2.78)   (2.86)                       (0.91)   (2.74)
Lagged RV      −0.114      .        .       0.228     0.037      .        .       0.223
              (−1.26)                      (3.30)    (0.18)                      (2.68)
ARCH            0.040    0.036    0.046    0.040      0.017    0.021    0.016    0.002
               (2.23)   (2.068)  (2.56)   (2.11)     (0.74)   (1.27)   (1.67)   (0.12)
GARCH           0.711    0.691    0.953    0.711      0.710    0.713    0.955    0.708
               (7.79)   (7.071)  (51.9)   (9.24)     (7.28)   (7.65)   (58.0)   (7.49)
GJR               .        .        .        .        0.055    0.048    0.052    0.091
                                                     (1.05)   (1.51)   (2.86)   (2.27)
Log-Likelihood −4527.3  −4527.9  −4577.6  −4533.5   −4526.2  −4526.2  −4562.2  −4526.9

Table 1: Gaussian quasi-likelihood fit of GARCH and GJR models fitted to daily open to close returns on General Electric share prices, from 1995 to 2005. We allow lagged daily realised variance (RV) and realised semivariance (RS) to appear in the conditional variance. They are computed using every 15th trade. T-statistics, based on robust standard errors, are reported in brackets. Code: GARCH analysis.ox
The rest of this paper has the following structure. In Section 2 we will discuss the theory of
realised semivariances, deriving a central limit theory for it under some mild assumptions. In
Section 3 we will deepen the empirical work reported here, looking at a variety of stocks and also
both trade and quote data. In Section 4 we will discuss various extensions and areas of possible
future work.
2 Econometric theory
2.1 The model and background
We start this section by repeating some of the theoretical story from Section 1.
Consider a Brownian semimartingale $Y$ given as
$$ Y_t = \int_0^t a_s \, ds + \int_0^t \sigma_s \, dW_s, \qquad (1) $$
where $a$ is a locally bounded predictable drift process and $\sigma$ is a càdlàg volatility process. For such a process
$$ [Y]_t = \int_0^t \sigma_s^2 \, ds, $$
and so $d[Y]_t = \sigma_t^2 \, dt$, which means that when there are no jumps the QV process tells us everything we can know about the ex-post variation of $Y$.
When there are jumps this is no longer true; in particular let
$$ Y_t = \int_0^t a_s \, ds + \int_0^t \sigma_s \, dW_s + J_t, \qquad (2) $$
where $J$ is a pure jump process. Then
$$ [Y]_t = \int_0^t \sigma_s^2 \, ds + \sum_{s \leq t} (\Delta J_s)^2 , $$
and $d[Y]_t = \sigma_t^2 \, dt + (\Delta Y_t)^2$. Even when we employ devices like bipower variation (Barndorff-Nielsen and Shephard (2004) and Barndorff-Nielsen and Shephard (2006))
$$ \{Y\}_t^{[1,1]} = \mu_1^{-2} \operatorname*{p-lim}_{n \to \infty} \sum_{j=2}^{t_j \leq t} \left| Y_{t_j} - Y_{t_{j-1}} \right| \left| Y_{t_{j-1}} - Y_{t_{j-2}} \right| , \qquad \mu_1 = \mathrm{E}|U|, \quad U \sim N(0,1), $$
we are able to estimate $\int_0^t \sigma_s^2 \, ds$ robustly to jumps, but this still leaves us with estimates of $\sum_{s \leq t} (\Delta J_s)^2$. This tells us nothing about the asymmetric behaviour of the jumps.
2.2 Realised semivariances
The empirical analysis we carry out throughout this paper is based in trading time, so data arrives into our database at irregular points in time. However, these irregularly spaced observations can be thought of as being equally spaced observations on a new time-changed process, in the same stochastic class, as argued by, for example, Barndorff-Nielsen, Hansen, Lunde, and Shephard (2006). Thus there is no intellectual loss in initially considering equally spaced returns
$$ y_i = Y_{i/n} - Y_{(i-1)/n}, \qquad i = 1, 2, \ldots, n. $$
We study the functional
$$ V(Y, n) = \sum_{i=1}^{\lfloor nt \rfloor} \begin{pmatrix} y_i^2 1_{\{ y_i \geq 0 \}} \\ y_i^2 1_{\{ y_i \leq 0 \}} \end{pmatrix} . \qquad (3) $$
The main results then come from an application of some limit theory of Kinnebrock and Podolskij (2007) for bipower variation. This work can be seen as an important generalisation of Barndorff-Nielsen, Graversen, Jacod, and Shephard (2006), who studied bipower type statistics of the form
$$ \frac{1}{n} \sum_{j=2}^{n} g(\sqrt{n}\, y_j)\, h(\sqrt{n}\, y_{j-1}), $$
when $g$ and $h$ were assumed to be even functions. Kinnebrock and Podolskij (2007) give the extension to the uneven case, which is essential here.6
6 It is also useful in developing the theory for realised autocovariance under a Brownian motion, which is important in the theory of realised kernels developed by Barndorff-Nielsen, Hansen, Lunde, and Shephard (2006).
Proposition 1 Suppose (1) holds; then
$$ V(Y, n) \xrightarrow{\ p\ } \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} . $$
Proof. Trivial application of Theorem 1 in Kinnebrock and Podolskij (2007).
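Proposition 1 is easy to check by simulation. A quick sketch (our illustration, using constant volatility σ and zero drift so that the limit is known in closed form):

```python
import numpy as np

# Monte Carlo check of Proposition 1: for Y a Brownian semimartingale with
# zero drift and constant volatility sigma, each component of V(Y, n) should
# converge in probability to sigma^2 * t / 2. Parameter values are illustrative.
rng = np.random.default_rng(42)
sigma, t, n = 0.2, 1.0, 200_000
y = sigma * np.sqrt(t / n) * rng.standard_normal(n)  # equally spaced increments of Y
v_plus = np.sum(y[y >= 0] ** 2)   # first component of V(Y, n)
v_minus = np.sum(y[y < 0] ** 2)   # second component of V(Y, n)
# both components should be close to sigma**2 * t / 2 = 0.02 for large n
```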
Corollary 1 Suppose
$$ Y_t = \int_0^t a_s \, ds + \int_0^t \sigma_s \, dW_s + J_t $$
holds, where $J$ is a finite activity jump process; then
$$ V(Y, n) \xrightarrow{\ p\ } \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} + \sum_{s \leq t} \begin{pmatrix} (\Delta Y_s)^2 1_{\{ \Delta Y_s \geq 0 \}} \\ (\Delta Y_s)^2 1_{\{ \Delta Y_s \leq 0 \}} \end{pmatrix} . $$
Remark. The above means that
$$ (1, -1)\, V(Y, n) \xrightarrow{\ p\ } \sum_{s \leq t} \left\{ (\Delta Y_s)^2 1_{\{ \Delta Y_s \geq 0 \}} - (\Delta Y_s)^2 1_{\{ \Delta Y_s \leq 0 \}} \right\} , $$
the difference in the squared jumps. Hence this statistic gives us direct econometric evidence on the importance of the sign of jumps. Of course, by combining with bipower variation,
$$ V(Y, n) - \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} \xrightarrow{\ p\ } \sum_{s \leq t} \begin{pmatrix} (\Delta Y_s)^2 1_{\{ \Delta Y_s \geq 0 \}} \\ (\Delta Y_s)^2 1_{\{ \Delta Y_s \leq 0 \}} \end{pmatrix} , $$
we can straightforwardly estimate the QV of just positive or negative jumps.
In order to derive a central limit theory we need to make two assumptions on the volatility process.
(H1). If there were no jumps in the volatility then it would be sufficient to employ
$$ \sigma_t = \sigma_0 + \int_0^t a_s^* \, ds + \int_0^t \sigma_s^* \, dW_s + \int_0^t v_s^* \, dW_s^* . \qquad (4) $$
Here $a^*$, $\sigma^*$, $v^*$ are adapted càdlàg processes, with $a^*$ also being predictable and locally bounded. $W^*$ is a Brownian motion independent of $W$.
(H2). $\sigma_t^2 > 0$ everywhere.
The assumption (H1) is rather general from an econometric viewpoint as it allows for flexible leverage effects, multifactor volatility effects, jumps, non-stationarities, intraday effects, etc. Indeed we do not know of a continuous time continuous sample path volatility model used in financial economics which is outside this class. Kinnebrock and Podolskij (2007) also allow jumps in the volatility under the usual (in this context) conditions introduced by Barndorff-Nielsen, Graversen, Jacod, Podolskij, and Shephard (2006) and discussed by, for example, Barndorff-Nielsen, Graversen, Jacod, and Shephard (2006), but we will not detail this here.
The assumption (H2) is also important: it rules out the situation where the diffusive component disappears.
Proposition 2 Suppose (1), (H1) and (H2) hold; then
$$ \sqrt{n} \left( V(Y, n) - \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right) \xrightarrow{\ D_{st}\ } V_t , $$
where
$$ V_t = \int_0^t \alpha_s(1) \, ds + \int_0^t \alpha_s(2) \, dW_s + \int_0^t \alpha_s(3) \, dW_s' , $$
$$ \alpha_s(1) = \frac{1}{\sqrt{2\pi}} \left\{ 2 a_s \sigma_s + \sigma_s \sigma_s^* \right\} \begin{pmatrix} 1 \\ -1 \end{pmatrix} , \qquad \alpha_s(2) = \frac{2}{\sqrt{2\pi}}\, \sigma_s^2 \begin{pmatrix} 1 \\ -1 \end{pmatrix} , $$
$$ A_s = \frac{1}{4}\, \sigma_s^4 \begin{pmatrix} 5 & -1 \\ -1 & 5 \end{pmatrix} , \qquad \alpha_s(3)\, \alpha_s(3)' = A_s - \alpha_s(2)\, \alpha_s(2)' , $$
where $\alpha_s(3)$ is a $2 \times 2$ matrix. Here $W' \perp\!\!\!\perp (W, W^*)$, the Brownian motions which appear in the Brownian semimartingale (1) and (H1).
Proof. Given in the Appendix.
Remark. When we look at
$$ \mathrm{RV} = (1, 1)\, V(Y, n), $$
we produce the well known result
$$ \sqrt{n} \left( \mathrm{RV} - \int_0^t \sigma_s^2 \, ds \right) \xrightarrow{\ D_{st}\ } \int_0^t \sqrt{2}\, \sigma_s^2 \, dW_s' , $$
which appears in Jacod (1994) and Barndorff-Nielsen and Shephard (2002).
Remark. Assume $a, \sigma \perp\!\!\!\perp W$; then
$$ \sqrt{n} \left( V(Y, n) - \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right) \xrightarrow{\ D_{st}\ } MN \left( \frac{1}{\sqrt{2\pi}} \int_0^t \left\{ 2 a_s \sigma_s + \sigma_s \sigma_s^* \right\} ds \begin{pmatrix} 1 \\ -1 \end{pmatrix} , \; \frac{1}{4} \int_0^t \sigma_s^4 \, ds \begin{pmatrix} 5 & -1 \\ -1 & 5 \end{pmatrix} \right) . $$
If there is no drift and the volatility of volatility is small, then the mean of this mixed Gaussian distribution is zero and we could use this limit result to construct confidence intervals on these quantities. When the drift is not zero we cannot use this result, as we do not have a method for estimating the bias, which is a scaled version of
$$ \frac{1}{\sqrt{n}} \int_0^t \left\{ 2 a_s \sigma_s + \sigma_s \sigma_s^* \right\} ds . $$
Of course in practice this bias will be small. The asymptotic variance of $(1, -1)\, V(Y, n)$ is $\frac{3}{n} \int_0^t \sigma_s^4 \, ds$, but the limit is obviously not mixed Gaussian.
Remark. When the assumption $a, \sigma \perp\!\!\!\perp W$ fails, we do not know how to construct confidence intervals even if the drift is zero. This is because in the limit
$$ \sqrt{n} \left( V(Y, n) - \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right) $$
depends upon $W$. All we know is that the asymptotic variance is again
$$ \frac{1}{4n} \int_0^t \sigma_s^4 \, ds \begin{pmatrix} 5 & -1 \\ -1 & 5 \end{pmatrix} . $$
Notice that, throughout, the asymptotic variance of $\mathrm{RS}^-$ is
$$ \frac{5}{4n} \int_0^t \sigma_s^4 \, ds, $$
so is less than that of the RV (of course it estimates a different quantity, so perhaps this observation is not particularly important). It also means the asymptotic variance of $\mathrm{RS}^+ - \mathrm{RS}^-$ is
$$ \frac{3}{n} \int_0^t \sigma_s^4 \, ds . $$
3 More empirical work
3.1 More on GE trade data
For the GE trade data, Table 2 reports basic summary statistics for squared open to close daily
returns, realised variance and downside realised semivariance. Much of this is familiar, with the
average level of squared returns and realised variance being roughly the same, while the mean of
the downside realised semivariance is around one half that of the realised variance. The most
interesting results are that the RS − statistic has a correlation with RV of around 0.86 and that
it is negatively correlated with daily returns. The former correlation is modest for an additional
volatility measure and indicates it may have additional information not in the RV statistic. The
latter result shows that large daily semivariances are associated with contemporaneous downward
moves in the asset price — which is not surprising of course.
The serial correlation in the daily statistics is also presented in Table 2. It shows that the RV statistic has some predictability through time, but that the autocorrelation in the RS− is much higher. Together with the negative correlation between returns and contemporaneous RS− (which is consistent across a number of different assets), this suggests one should be able to modestly predict returns using past RS−.
Table 3 shows the regression fit of $r_i$ on $r_{i-1}$ and $\mathrm{RS}^-_{i-1}$ for the GE trade data. The t-statistic on lagged RS− is just significant and positive. Hence a small amount of the variation in the
Summary information for daily statistics for GE trade data

Variable   Mean   S.D.              Correlation matrix                ACF1   ACF20
r_i        0.01   1.53    1.00                                       −0.01    0.00
r_i^2      2.34   5.42    0.06  1.00                                  0.17    0.07
RV_i       2.61   3.05    0.03  0.61  1.00                            0.52    0.26
RS_i^+     1.33   2.03    0.20  0.61  0.94  1.00                      0.31    0.15
RS_i^-     1.28   1.28   −0.22  0.47  0.86  0.66  1.00                0.65    0.37
BPV_i      2.24   2.40    0.00  0.54  0.95  0.84  0.93  1.00          0.64    0.34
BPDV_i     0.16   0.46   −0.61 −0.10 −0.08 −0.34  0.34 −0.01  1.00    0.06    0.03

Table 2: Summary statistics for daily GE data computed using trade data. r_i denotes daily open to close returns, RV_i is the realised variance, RS_i are the realised semivariances, BPV_i is the daily realised bipower variation and BPDV_i = RS_i^- − 0.5 BPV_i is the realised bipower downward variation.
high frequency falls of price in the previous day are associated with rises in future asset prices — presumably because the high frequency falls increase the risk premium. The corresponding t-statistics for the impact of $\mathrm{RS}^-_{i-1}$ for other series are given in Table 6; they show a similar weak pattern.
GE trade data: Regression of returns on lagged realised semivariance and returns

                                  (i)                (ii)               (iii)
                             Coeff.  t-value    Coeff.  t-value    Coeff.  t-value
Constant                      0.009    0.03     −0.061   −1.43     −0.067   −1.56
r_{i−1}                      −0.012    0.01     −0.001   −0.06      0.016    0.67
RS^-_{i−1}                                       0.054    2.28      0.046    1.85
RS^-_{i−1} − 0.5 BPV_{i−1}                                          0.109    1.26
log L                       −4802.2            −4799.6            −4798.8

Table 3: Regression of returns r_i on lagged realised semivariance RS^-_{i−1} and returns r_{i−1} for daily returns based on the GE trade database.
The RS− statistic has a similar dynamic pattern to the bipower variation statistic.7 The mean and standard deviation of the RS− statistic are slightly higher than half the realised BPV ones. The difference estimator
$$ \mathrm{BPDV}_i = \mathrm{RS}_i^- - 0.5\, \mathrm{BPV}_i , $$
which estimates the squared negative jumps, is highly negatively correlated with returns but not very correlated with other measures of volatility. Interestingly this estimator is slightly autocorrelated, and at each of the first 10 lags this correlation is positive, which means it has some forecasting potential.
Summary information for daily statistics for other trade data

           Mean   S.D.              Correlation matrix                ACF1   ACF20
DIS
r_i       −0.02   1.74    1.00                                       −0.00    0.00
r_i^2      3.03   6.52    0.04  1.00                                  0.15    0.08
RV_i       3.98   4.69   −0.00  0.53  1.00                            0.69    0.35
RS_i^+     1.97   2.32    0.19  0.55  0.94  1.00                      0.66    0.35
RS_i^-     2.01   2.60   −0.18  0.46  0.95  0.81  1.00                0.57    0.30
BPV_i      3.33   3.97   −0.00  0.53  0.98  0.93  0.93  1.00          0.69    0.37
BPDV_i     0.35   1.03   −0.46  0.13  0.52  0.25  0.72  0.43  1.00    0.05    0.04
AXP
r_i        0.01   1.86    1.00                                        0.01    0.01
r_i^2      3.47   7.75   −0.00  1.00                                  0.15    0.09
RV_i       3.65   4.57   −0.01  0.56  1.00                            0.64    0.37
RS_i^+     1.83   2.62    0.22  0.52  0.93  1.00                      0.48    0.27
RS_i^-     1.82   2.30   −0.28  0.53  0.91  0.72  1.00                0.64    0.36
BPV_i      3.09   3.74   −0.04  0.52  0.94  0.83  0.92  1.00          0.69    0.39
BPDV_i     0.27   0.90   −0.63  0.27  0.37  0.10  0.62  0.28  1.00    0.20    0.11
IBM
r_i        0.01   1.73    1.00                                       −0.05    0.01
r_i^2      3.02   7.25    0.04  1.00                                  0.13    0.04
RV_i       2.94   3.03    0.03  0.55  1.00                            0.65    0.34
RS_i^+     1.50   1.81    0.24  0.54  0.94  1.00                      0.50    0.26
RS_i^-     1.44   1.43   −0.24  0.48  0.91  0.74  1.00                0.65    0.34
BPV_i      2.62   2.60    0.00  0.51  0.96  0.86  0.93  1.00          0.70    0.38
BPDV_i     0.13   0.49   −0.71  0.05  0.13 −0.11  0.44  0.10  1.00    0.04   −0.01

Table 4: Summary statistics for various daily data computed using trade data. r_i denotes daily open to close returns, RV_i is the realised variance, RS_i is the realised semivariance, and BPV_i is the daily realised bipower variation. BPDV_i is the realised bipower downward variation statistic.
3.2 Other trade data
Results in Table 4 show that broadly the same results hold for a number of frequently traded assets: American Express (AXP), Walt Disney (DIS) and IBM. Table 5 shows the log-likelihood improvements by including RV and RS− statistics into the GARCH and GJR models based on trades. The conclusion is clear for GARCH models. By including RS− statistics in the model there is little need to include a traditional leverage effect. Typically it is only necessary to include RS− in the information set; adding RV plays only a modest role. For GJR models, the RV statistic becomes more important and is sometimes slightly more effective than the RS− statistic.
Trades: logL improvements by including lagged RS− and RV in conditional variance

Lagged variables        GARCH model                    GJR model
                   AXP   DIS   GE    IBM         AXP   DIS   GE    IBM
RV, RS− & BPV      59.9  66.5  50.5  64.8        47.7  57.2  36.7  45.7
RV & BPV           53.2  63.7  44.7  54.6        45.4  56.9  36.0  44.6
RS− & BPV          59.9  65.7  48.7  62.6        47.6  53.2  36.4  42.5
BPV                46.2  57.5  44.6  43.9        40.0  50.0  35.8  34.5
RV & RS−           59.8  66.3  49.5  60.7        47.5  56.9  35.4  42.4
RV                 53.0  63.5  43.2  51.5        45.1  56.7  34.7  41.9
RS−                59.6  65.6  48.7  60.6        47.1  52.4  35.4  41.7
None                0.0   0.0   0.0   0.0         0.0   0.0   0.0   0.0

Table 5: Improvements in the Gaussian quasi-likelihood by including lagged realised quantities in the conditional variance over standard GARCH and GJR models. Fit of GARCH and GJR models for daily open to close returns on four share prices, from 1995 to 2005. We allow lagged daily realised variance (RV), realised semivariance (RS−) and realised bipower variation (BPV) to appear in the conditional variance. They are computed using every 15th trade. Code: GARCH analysis.ox
t-statistics for r_i on RS^-_{i−1}, controlling for lagged returns

           AXP     DIS    GE     IBM
Trades    −0.615   3.79   2.28   0.953
Quotes     0.059   5.30   2.33   1.72

Table 6: The t-statistics on realised semivariance calculated by regressing daily returns r_i on lagged daily returns and lagged daily semivariances (RS^-_{i−1}). This is carried out for a variety of stock prices using trade and quote data. The RS statistics are computed using every 15th high frequency data point.
3.3 Quote data
We have carried out the same analysis based on quote data, looking solely at the series for offers to buy placed on the New York Stock Exchange. The results are given in Tables 6 and 7. The results are in line with the previous trade data. The RS− statistic is somewhat less effective for quote data, but the changes are marginal.
7 This is computed using not one but two lags, which reduces the impact of market microstructure, as shown by Andersen, Bollerslev, and Diebold (2007).
Quotes: logL improvements by including lagged RS and RV in conditional variance

Lagged variables        GARCH model                    GJR model
                   AXP   DIS   GE    IBM         AXP   DIS   GE    IBM
RV & RS−           50.1  53.9  45.0  53.8        39.7  48.0  31.7  31.5
RV                 45.0  53.6  43.3  43.9        39.1  46.3  31.6  31.3
RS−                49.5  50.7  44.5  53.7        38.0  39.4  29.1  30.0
None                0.0   0.0   0.0   0.0         0.0   0.0   0.0   0.0

Table 7: Quote data: Improvements in the Gaussian quasi-likelihood by including lagged realised quantities in the conditional variance. Fit of GARCH and GJR models for daily open to close returns on four share prices, from 1995 to 2005. We allow lagged daily realised variance (RV) and realised semivariance (RS) to appear in the conditional variance. They are computed using every 15th trade. Code: GARCH analysis.ox
4 Additional remarks
4.1 Bipower variation
We can build on the work of Barndorff-Nielsen and Shephard (2004), Barndorff-Nielsen and Shep-
hard (2006), Andersen, Bollerslev, and Diebold (2007) and Huang and Tauchen (2005) by defining
$$ \mathrm{BPDV} = \sum_{j=1}^{t_j \leq 1} \left( Y_{t_j} - Y_{t_{j-1}} \right)^2 1_{\{ Y_{t_j} - Y_{t_{j-1}} \leq 0 \}} \; - \; \frac{1}{2}\, \mu_1^{-2} \sum_{j=2}^{t_j \leq 1} \left| Y_{t_j} - Y_{t_{j-1}} \right| \left| Y_{t_{j-1}} - Y_{t_{j-2}} \right| $$
$$ \xrightarrow{\ p\ } \sum_{s \leq t} (\Delta Y_s)^2 1_{\{ \Delta Y_s \leq 0 \}} , $$
the realised bipower downward variation statistic (upward versions are likewise trivial to define). This seems a novel way of thinking about jumps — we do not know of any literature which has identified $\sum_{s \leq t} (\Delta Y_s)^2 1_{\{\Delta Y_s \leq 0\}}$ before. It is tempting to try to carry out jump tests based upon it to test for the presence of downward jumps against a null of no jumps at all. However, the theory developed in Section 2 suggests that this is going to be hard to implement solely based on in-fill asymptotics without stronger assumptions than we usually like to make, due to the presence of the drift term in the limiting result and the non-mixed Gaussian limit theory (we could do testing if we assumed the drift was zero and there is no leverage term). Of course, this does not stop us from testing things based on the time series dynamics of the process — see the work of Corradi and Distaso (2006).
Further, a time series of such objects can be used to assess the factors which drive downward jumps, by simply building a time series model for it, conditioning on explanatory variables.
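A minimal sketch (ours, with a hypothetical function name) of the BPDV estimator applied to one day's intraday returns:

```python
import numpy as np

MU1 = np.sqrt(2.0 / np.pi)  # mu_1 = E|U| for U ~ N(0, 1)

def bpdv(r):
    """Realised bipower downward variation from intraday returns `r`:
    downside realised semivariance minus half of realised bipower variation.
    Estimates the sum of squared negative jumps, so it should be roughly
    zero on a jump-free path. Unstaggered bipower variation is used here
    for simplicity."""
    rs_minus = np.sum(r[r < 0] ** 2)
    bpv = MU1 ** -2 * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    return rs_minus - 0.5 * bpv
```

Note that the empirical work in Section 3 uses a two-lag ("staggered") version of bipower variation to reduce the impact of market microstructure (footnote 7); the unstaggered form is shown above only for clarity.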
An alternative to this approach is to use higher order power variation statistics (e.g. Barndorff-Nielsen and Shephard (2004) and Jacod (2007)),
$$ \sum_{j=1}^{t_j \leq 1} \left| Y_{t_j} - Y_{t_{j-1}} \right|^r 1_{\{ Y_{t_j} - Y_{t_{j-1}} \leq 0 \}} \xrightarrow{\ p\ } \sum_{s \leq t} |\Delta Y_s|^r 1_{\{ \Delta Y_s \leq 0 \}} , \qquad r > 2, $$
as $n \to \infty$. The difficulty with using these higher order statistics is that they will be more sensitive to noise than the BPDV estimator.
4.2 Effect of noise
Suppose instead of seeing $Y$ we see
$$ X = Y + U, $$
and think of $U$ as noise. Let us focus entirely on
$$ \sum_{i=1}^n x_i^2 1_{\{ x_i \leq 0 \}} = \sum_{i=1}^n y_i^2 1_{\{ y_i \leq -u_i \}} + \sum_{i=1}^n u_i^2 1_{\{ y_i \leq -u_i \}} + 2 \sum_{i=1}^n y_i u_i 1_{\{ y_i \leq -u_i \}} $$
$$ \simeq \sum_{i=1}^n y_i^2 1_{\{ u_i \leq 0 \}} + \sum_{i=1}^n u_i^2 1_{\{ u_i \leq 0 \}} + 2 \sum_{i=1}^n y_i u_i 1_{\{ u_i \leq 0 \}} . $$
If we use the framework of Zhou (1996), where $U$ is white noise, uncorrelated with $Y$, with $\mathrm{E}(U) = 0$ and $\mathrm{Var}(U) = \omega^2$, then it is immediately apparent that the noise will totally dominate this statistic in the limit as $n \to \infty$.
Pre-averaging based statistics of Jacod, Li, Mykland, Podolskij, and Vetter (2007) could be used here to reduce the impact of noise on the statistic.
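The domination is easy to see by simulation. A sketch (ours; the values of sigma, omega and the seed are illustrative choices): the diffusive part of the statistic stays near $\frac{1}{2}\int_0^1 \sigma_s^2\,ds$ for every $n$, while the noise contribution grows roughly linearly in $n$.

```python
import numpy as np

def noisy_downside_stat(n, sigma=0.2, omega=0.001, seed=7):
    """Downside statistic sum x_i^2 1{x_i <= 0} when the observed price is
    X = Y + U, with Y a scaled Brownian motion and U i.i.d. (Zhou-style)
    white noise. sigma, omega and the seed are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    y = np.cumsum(sigma * np.sqrt(1.0 / n) * rng.standard_normal(n))  # efficient price
    x = y + omega * rng.standard_normal(n)                            # observed price
    r = np.diff(x)
    return np.sum(r[r <= 0] ** 2)

# The signal part stays near sigma**2 / 2 = 0.02 for every n, while the
# noise contribution grows roughly like n * omega**2, so the statistic diverges.
```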
5 Conclusions
This paper has introduced a new measure of variation called downside “realised semivariance.” It
is determined solely by high frequency downward moves in asset prices. We have seen it is possible
to carry out an asymptotic analysis of this statistic and see that its limit only contains downward
jumps.
We have assessed the effectiveness of this new measure using it as a conditioning variable for a
GARCH model of daily open to close returns. Throughout, for non-leverage based GARCH models, downside realised semivariance is more informative than the usual realised variance statistic. When a leverage term is introduced it is hard to tell the difference.
Various extensions to this work were suggested.
6 Acknowledgements
This paper was prepared for a conference in honour of Robert F. Engle to be held in San Diego in
June 2008. We are extremely grateful to Tim Bollerslev, Jeffrey Russell and Mark Watson for the
invitation to give it.
The ARCH models fitted in this paper were computed using G@ARCH 5.0, the package of
Laurent and Peters (2002). Throughout programming was carried out using the Ox language of
Doornik (2001) within the OxMetrics 5.0 environment.
We are very grateful for the help of Asger Lunde in preparing some of the data we used in this
analysis and advice on various issues. We also would like to thank Anthony Ledford and Andrew
Patton for helpful suggestions at various points.
7 Appendix: Proof of Proposition 2
Consider the framework of Theorem 2 in Kinnebrock and Podolskij (2007) and choose
$$ g(x) = \begin{pmatrix} g_1(x) \\ g_2(x) \end{pmatrix} = \begin{pmatrix} x^2 1_{\{ x \geq 0 \}} \\ x^2 1_{\{ x \leq 0 \}} \end{pmatrix} , \qquad h(x) = I_2 . $$
Assume that $X$ is a Brownian semimartingale, conditions (H1) and (H2) are satisfied and note that $g$ is continuously differentiable, and so their theory applies directly. Due to the particular choice of $h$ we obtain the stable convergence
$$ \sqrt{n} \left( V(Y, n)_t - \frac{1}{2} \int_0^t \sigma_s^2 \, ds \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right) \to \int_0^t \alpha_s(1) \, ds + \int_0^t \alpha_s(2) \, dW_s + \int_0^t \alpha_s(3) \, dW_s' , \qquad (5) $$
where $W'$ is a 1-dimensional Brownian motion which is defined on an extension of the filtered probability space and is independent of the $\sigma$-field $\mathcal{F}$. Using the notation
$$ \rho_\sigma(g) = \mathrm{E} \{ g(\sigma U) \} , \qquad \rho_\sigma^{(1)}(g) = \mathrm{E} \{ U g(\sigma U) \} , \qquad U \sim N(0, 1), $$
$$ \rho_\sigma^{(1,1)}(g) = \mathrm{E} \left\{ g(\sigma W_1) \int_0^1 W_s \, dW_s \right\} , $$
the $\alpha(1)$, $\alpha(2)$ and $\alpha(3)$ are defined by
$$ \alpha_s(1)_j = \sigma_s^* \, \rho_{\sigma_s}^{(1,1)} \!\left( \frac{\partial g_j}{\partial x} \right) + a_s \, \rho_{\sigma_s} \!\left( \frac{\partial g_j}{\partial x} \right) , \qquad \alpha_s(2)_j = \rho_{\sigma_s}^{(1)}(g_j) , \qquad \alpha_s(3)\, \alpha_s(3)' = A_s - \alpha_s(2)\, \alpha_s(2)' , $$
and the elements of the $2 \times 2$ matrix process $A$ are given by
$$ A_s^{j,j'} = \rho_{\sigma_s}(g_j g_{j'}) + \rho_{\sigma_s}(g_j)\, \rho_{\sigma_s}(g_{j'}) + \rho_{\sigma_s}(g_{j'})\, \rho_{\sigma_s}(g_j) - 3 \rho_{\sigma_s}(g_j)\, \rho_{\sigma_s}(g_{j'}) . $$
Then we obtain the result using the following Lemma.
Lemma 1 Let $U$ be standard normally distributed. Then
$$ \mathrm{E} \left( 1_{\{ U \geq 0 \}} U^3 \right) = \frac{2}{\sqrt{2\pi}} , \qquad \mathrm{E} \left( 1_{\{ U \geq 0 \}} U \right) = \frac{1}{\sqrt{2\pi}} , $$
$$ \mathrm{E} \left( 1_{\{ U \leq 0 \}} U^3 \right) = -\frac{2}{\sqrt{2\pi}} , \qquad \mathrm{E} \left( 1_{\{ U \leq 0 \}} U \right) = -\frac{1}{\sqrt{2\pi}} . $$
Proof. Let $f$ be the density of the standard normal distribution. Then
$$ \int_0^\infty f(x)\, x \, dx = \frac{1}{\sqrt{2\pi}} \int_0^\infty \exp\!\left( -\frac{x^2}{2} \right) x \, dx = \frac{1}{\sqrt{2\pi}} \left[ -\exp\!\left( -\frac{x^2}{2} \right) \right]_0^\infty = \frac{1}{\sqrt{2\pi}} . $$
Using partial integration we obtain
$$ \int_0^\infty f(x)\, x \, dx = \frac{1}{\sqrt{2\pi}} \int_0^\infty \exp\!\left( -\frac{x^2}{2} \right) x \, dx = \frac{1}{\sqrt{2\pi}} \left[ \frac{x^2}{2} \exp\!\left( -\frac{x^2}{2} \right) \right]_0^\infty - \frac{1}{\sqrt{2\pi}} \int_0^\infty \frac{x^2}{2} \left( -x \exp\!\left( -\frac{x^2}{2} \right) \right) dx $$
$$ = \frac{1}{2\sqrt{2\pi}} \int_0^\infty \exp\!\left( -\frac{x^2}{2} \right) x^3 \, dx = \frac{1}{2} \int_0^\infty x^3 f(x) \, dx . $$
Thus
$$ \int_0^\infty x^3 f(x) \, dx = \frac{2}{\sqrt{2\pi}} . $$
Obviously, it holds that
$$ \int_{-\infty}^0 f(x)\, x \, dx = -\int_0^\infty f(x)\, x \, dx , \qquad \int_{-\infty}^0 x^3 f(x) \, dx = -\int_0^\infty x^3 f(x) \, dx . $$
This completes the proof of the Lemma.
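These half-normal moments are easy to verify numerically; a quick Monte Carlo sketch of ours:

```python
import numpy as np

# Monte Carlo check of the moments in Lemma 1 for U ~ N(0, 1).
rng = np.random.default_rng(0)
u = rng.standard_normal(2_000_000)
m1 = np.mean(u * (u >= 0))       # E[1{U>=0} U]   -> 1 / sqrt(2*pi) ~ 0.3989
m3 = np.mean(u ** 3 * (u >= 0))  # E[1{U>=0} U^3] -> 2 / sqrt(2*pi) ~ 0.7979
```

The negative-tail moments follow by the symmetry argument above.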
Using the lemma we can calculate the moments
$$ \rho_{\sigma_s}(g_1) = \rho_{\sigma_s}(g_2) = \frac{1}{2}\, \sigma_s^2 , $$
$$ \rho_{\sigma_s}^{(1)}(g_1) = \frac{2}{\sqrt{2\pi}}\, \sigma_s^2 = -\rho_{\sigma_s}^{(1)}(g_2) , $$
$$ \rho_{\sigma_s}\!\left( \frac{\partial g_1}{\partial x} \right) = \frac{2}{\sqrt{2\pi}}\, \sigma_s = -\rho_{\sigma_s}\!\left( \frac{\partial g_2}{\partial x} \right) , $$
$$ \rho_{\sigma_s}^{(1)}\!\left( \frac{\partial g_1}{\partial x} \right) = \rho_{\sigma_s}^{(1)}\!\left( \frac{\partial g_2}{\partial x} \right) = \sigma_s , $$
$$ \rho_{\sigma_s}(g_1^2) = \rho_{\sigma_s}(g_2^2) = \frac{3}{2}\, \sigma_s^4 , $$
$$ \rho_{\sigma_s}^{(1,1)}\!\left( \frac{\partial g_1}{\partial x} \right) = \frac{\sigma_s}{\sqrt{2\pi}} = -\rho_{\sigma_s}^{(1,1)}\!\left( \frac{\partial g_2}{\partial x} \right) . $$
The last statement follows from
$$ \rho_{\sigma_s}^{(1,1)}\!\left( \frac{\partial g_1}{\partial x} \right) = \mathrm{E} \left\{ \frac{\partial g_1}{\partial x}(\sigma_s W_1) \int_0^1 W_u \, dW_u \right\} = 2 \mathrm{E} \left\{ \sigma_s W_1 1_{\{ W_1 \geq 0 \}} \int_0^1 W_u \, dW_u \right\} $$
$$ = 2 \mathrm{E} \left\{ \sigma_s W_1 1_{\{ W_1 \geq 0 \}} \left( \frac{1}{2} W_1^2 - \frac{1}{2} \right) \right\} = \sigma_s \mathrm{E} \left\{ \left( W_1^3 - W_1 \right) 1_{\{ W_1 \geq 0 \}} \right\} = \frac{\sigma_s}{\sqrt{2\pi}} . $$
References
Andersen, T. G., T. Bollerslev, and F. X. Diebold (2006). Parametric and nonparametric measurement of volatility. In Y. Aït-Sahalia and L. P. Hansen (Eds.), Handbook of Financial Econometrics. Amsterdam: North Holland. Forthcoming.
Andersen, T. G., T. Bollerslev, and F. X. Diebold (2007). Roughing it up: Including jump components in
the measurement, modeling and forecasting of return volatility. Review of Economics and Statistics 89,
707–720.
Andersen, T. G., T. Bollerslev, F. X. Diebold, and P. Labys (2000). Great realizations. Risk 13, 105–108.
Andersen, T. G., T. Bollerslev, F. X. Diebold, and P. Labys (2001). The distribution of exchange rate
volatility. Journal of the American Statistical Association 96, 42–55. Correction published in 2003,
volume 98, page 501.
Andersen, T. G., T. Bollerslev, and N. Meddahi (2004). Analytic evaluation of volatility forecasts. Inter-
national Economic Review 45, 1079–1110.
Ang, A., J. Chen, and Y. Xing (2006). Downside risk. Review of Financial Studies 19, 1191–1239.
Babsiria, M. E. and J.-M. Zakoian (2001). Contemporaneous asymmetry in garch processes. Journal of
Econometrics 101, 257–294.
Barndorff-Nielsen, O. E., S. E. Graversen, J. Jacod, M. Podolskij, and N. Shephard (2006). A central limit
theorem for realised power and bipower variations of continuous semimartingales. In Y. Kabanov,
R. Lipster, and J. Stoyanov (Eds.), From Stochastic Analysis to Mathematical Finance, Festschrift for
Albert Shiryaev, pp. 33–68. Springer.
Barndorff-Nielsen, O. E., S. E. Graversen, J. Jacod, and N. Shephard (2006). Limit theorems for realised
bipower variation in econometrics. Econometric Theory 22, 677–719.
Barndorff-Nielsen, O. E., P. R. Hansen, A. Lunde, and N. Shephard (2006). Designing realised kernels to
measure the ex-post variation of equity prices in the presence of noise. Unpublished paper: Nuffield
College, Oxford.
Barndorff-Nielsen, O. E. and N. Shephard (2002). Econometric analysis of realised volatility and its use in
estimating stochastic volatility models. Journal of the Royal Statistical Society, Series B 64, 253–280.
Barndorff-Nielsen, O. E. and N. Shephard (2004). Power and bipower variation with stochastic volatility
and jumps (with discussion). Journal of Financial Econometrics 2, 1–48.
Barndorff-Nielsen, O. E. and N. Shephard (2006). Econometrics of testing for jumps in financial economics
using bipower variation. Journal of Financial Econometrics 4, 1–30.
Barndorff-Nielsen, O. E. and N. Shephard (2007). Variation, jumps and high frequency data in financial
econometrics. In R. Blundell, T. Persson, and W. K. Newey (Eds.), Advances in Economics and
Econometrics. Theory and Applications, Ninth World Congress, Econometric Society Monographs,
pp. 328–372. Cambridge University Press.
Black, F. (1976). Studies of stock price volatility changes. Proceedings of the Business and Economic
Statistics Section, American Statistical Association, 177–181.
Bollerslev, T. (1986). Generalised autoregressive conditional heteroskedasticity. Journal of Econometrics 31, 307–327.
Bollerslev, T. and J. M. Wooldridge (1992). Quasi maximum likelihood estimation and inference in dy-
namic models with time varying covariances. Econometric Reviews 11, 143–172.
Chen, X. and E. Ghysels (2007). News - good or bad - and its impact over multiple horizons. Unpublished
paper: Department of Economics, University of North Carolina at Chapel Hill.
Corradi, V. and W. Distaso (2006). Semiparametric comparison of stochastic volatility models using
realized measures. Review of Economic Studies 73, 635–667.
Doornik, J. A. (2001). Ox: Object Oriented Matrix Programming, 5.0. London: Timberlake Consultants
Press.
Engle, R. F. (2002). Dynamic conditional correlation - a simple class of multivariate GARCH models. Journal
of Business and Economic Statistics 20, 339–350.
Engle, R. F. and G. M. Gallo (2006). A multiple indicator model for volatility using intra-daily data. Journal
of Econometrics 131, 3–27.
Engle, R. F. and V. Ng (1993). Measuring and testing the impact of news on volatility. Journal of
Finance 48, 1749–1778.
Fishburn, P. C. (1977). Mean-risk analysis with risk associated with below-target returns. American Economic
Review 67, 116–126.
Glosten, L. R., R. Jagannathan, and D. Runkle (1993). Relationship between the expected value and the
volatility of the nominal excess return on stocks. Journal of Finance 48, 1779–1802.
Granger, C. W. J. (2008). In praise of pragmatic econometrics. In J. L. Castle and N. Shephard (Eds.), The
Methodology and Practice of Econometrics: A Festschrift in honour of David F Hendry, pp. 105–116.
Oxford University Press.
Hogan, W. W. and J. M. Warren (1972). Computation of the efficient boundary in the E-S portfolio
selection model. Journal of Financial and Quantitative Analysis 7, 1881–1896.
Hogan, W. W. and J. M. Warren (1974). Toward the development of an equilibrium capital-market model
based on semivariance. Journal of Financial and Quantitative Analysis 9, 1–11.
Huang, X. and G. Tauchen (2005). The relative contribution of jumps to total price variation. Journal of
Financial Econometrics 3, 456–499.
Jacod, J. (1994). Limit of random measures associated with the increments of a Brownian semimartingale.
Preprint number 120, Laboratoire de Probabilités, Université Pierre et Marie Curie, Paris.
Jacod, J. (2007). Statistics and high frequency data. Unpublished paper.
Jacod, J., Y. Li, P. A. Mykland, M. Podolskij, and M. Vetter (2007). Microstructure noise in the continuous
case: the pre-averaging approach. Unpublished paper: Department of Statistics, University of Chicago.
Kinnebrock, S. and M. Podolskij (2007). A note on the central limit theorem for bipower variation of
general functions. Stochastic Processes and Their Applications. Forthcoming.
Laurent, S. and J. P. Peters (2002). G@RCH 2.2 : An Ox package for estimating and forecasting various
ARCH models. Journal of Economic Surveys 16, 447–485.
Lee, S. and P. A. Mykland (2008). Jumps in financial markets: A new nonparametric test and jump
dynamics. Review of Financial Studies. Forthcoming.
Lewis, A. L. (1990). Semivariance and the performance of portfolios with options. Financial Analysts
Journal.
Mancini, C. (2001). Disentangling the jumps of the diffusion in a geometric Brownian motion. Giornale
dell’Istituto Italiano degli Attuari LXIV, 19–47.
Mao, J. C. T. (1970a). Models of capital budgeting, E-V vs. E-S. Journal of Financial and Quantitative Analysis 4, 657–675.
Mao, J. C. T. (1970b). Survey of capital budgeting: Theory and practice. Journal of Finance 25, 349–360.
Markowitz, H. (1959). Portfolio Selection: Efficient Diversification of Investments. New York: John Wiley & Sons.
Nelson, D. B. (1991). Conditional heteroskedasticity in asset returns: a new approach. Econometrica 59,
347–370.
Nelson, D. B. (1992). Filtering and forecasting with misspecified ARCH models I: getting the right variance
with the wrong model. Journal of Econometrics 52, 61–90.
Pedersen, C. S. and S. E. Satchell (2002). On the foundation of performance measures under asymmetric
returns. Quantitative Finance 2, 217–223.
Protter, P. (2004). Stochastic Integration and Differential Equations. New York: Springer-Verlag.
Rom, B. M. and K. Ferguson (1993). Post-modern portfolio theory comes of age. Journal of Investing.
Sortino, F. and S. E. Satchell (2001). Managing Downside Risk in Financial Markets. Butterworth-
Heinemann.
Sortino, F. A. and R. van der Meer (1991). Downside risk. The Journal of Portfolio Management 17,
27–31.
Zhang, L., P. A. Mykland, and Y. Aït-Sahalia (2005). A tale of two time scales: determining integrated
volatility with noisy high-frequency data. Journal of the American Statistical Association 100, 1394–
1411.
Zhou, B. (1996). High-frequency data and volatility in foreign-exchange rates. Journal of Business and
Economic Statistics 14, 45–52.