Discussion Papers

This page contains downloadable versions of a selection of recent discussion papers that have not yet appeared in print. For a full list of my discussion papers since 2000, see the ICMA Centre Discussion Papers in Finance series.

A comprehensive description of the trading and statistical characteristics of VIX futures and their exchange-traded notes motivates our study of their benefits to equity investors seeking to diversify their exposure. We analyze when diversification into VIX futures is ex-ante optimal for standard mean-variance investors, then extend this analysis to include (a) skewness preference, and (b) a moderation of personal forecasts by equilibrium returns, as in the Black-Litterman framework. An empirical study shows that skewness preference increases the frequency of diversification, but out-of-sample the optimally diversified portfolios rarely outperform equity alone, even according to a generalized Sharpe ratio that incorporates skewness preference, except during an extreme crisis period or when the investor has access to accurate personal forecasts of VIX futures returns.
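
The standard mean-variance condition behind such an ex-ante diversification test can be sketched as follows: a new asset improves the maximum Sharpe ratio if and only if its own Sharpe ratio exceeds its correlation with the portfolio times the portfolio's Sharpe ratio. All the numbers below are illustrative, not estimates from the paper.

```python
def improves_mean_variance(mu_new, sigma_new, rho, mu_port, sigma_port, rf=0.0):
    """Ex-ante test: adding an asset raises the maximum attainable Sharpe
    ratio iff its Sharpe ratio exceeds rho times the portfolio's Sharpe.
    All inputs are hypothetical illustration values."""
    sharpe_new = (mu_new - rf) / sigma_new
    sharpe_port = (mu_port - rf) / sigma_port
    return sharpe_new > rho * sharpe_port

# VIX futures typically carry a negative expected return but are strongly
# negatively correlated with equities, so the condition can still hold:
print(improves_mean_variance(mu_new=-0.05, sigma_new=0.60, rho=-0.7,
                             mu_port=0.06, sigma_port=0.18))  # True
```

With zero correlation the same negative expected return would fail the test, which is why the timing of the correlation regime matters so much in practice.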

We model investment opportunities with a single source of uncertainty, namely the market price of the investment. The investment cost can be predetermined or perfectly correlated with the market price. The common paradigm for risk-neutral real-option pricing is a special case encompassed within our general framework, and we analyse the relationship between standard real option prices and the more general risk-averse real option values. Numerical examples illustrate how these general values depend on the frequency of decision opportunities, the investor’s risk tolerance and its sensitivity to wealth, the expected return and volatility of the underlying asset, and the price of the asset relative to initial wealth. Specific applications to real estate include property investment under ‘boom-bust’ or mean-reverting price scenarios, and buy-to-let or land-development opportunities.
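
A minimal sketch of the invest-or-wait logic, assuming a binomial price lattice and exponential (CARA) utility; the paper's framework is more general, and every parameter here is purely illustrative. As the risk-aversion coefficient tends to zero, the certainty-equivalent valuation approaches the familiar risk-neutral real-option paradigm.

```python
import math

def wait_or_invest(p0, K, u, d, q, steps, gamma):
    """Toy risk-averse real option: at each decision date the investor either
    pays cost K for an asset worth P, or waits. Continuation values are
    certainty equivalents under exponential utility with coefficient gamma.
    All parameters are hypothetical illustration values."""
    def ce(values, probs):
        # certainty equivalent of a lottery under CARA utility
        eu = sum(pr * -math.exp(-gamma * v) for v, pr in zip(values, probs))
        return -math.log(-eu) / gamma

    def value(p, t):
        exercise = max(p - K, 0.0)
        if t == steps:
            return exercise
        cont = ce([value(p * u, t + 1), value(p * d, t + 1)], [q, 1 - q])
        return max(exercise, cont)

    return value(p0, 0)

# Greater risk aversion lowers the real option value:
print(wait_or_invest(p0=100, K=100, u=1.2, d=0.85, q=0.5, steps=4, gamma=0.01))
```

The decreasing value in gamma mirrors the paper's comparison of risk-averse real option values with standard risk-neutral prices.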

This paper develops a model to analyse the financial relationship between Henry III, king of England between 1216 and 1272, and one group of creditors, namely the Flemish merchants who provided cloth to the royal wardrobe. From the surviving royal documents, we have reconstructed the credit advanced to the royal wardrobe by the merchants of Ypres and Douai for each year between 1247 and 1270, together with the arrears owed by the king at certain points. The model is flexible and able to capture the dynamics of the actual number of merchants trading in England as well as the extent to which the king made debt repayments.

Different theoretical and numerical methods for calculating the fair-value of a variance swap give rise to systematic biases that are most pronounced during volatile periods. For instance, differences of 10-20 percentage points would have been observed on fair-value index variance swap rates during the banking crisis in 2008, depending on the formula used and its implementation. Our empirical study utilizes more than 16 years of FTSE 100 daily options prices to compare three fair-value variance swap rates. The exchange’s variance swap rate formula, used to quote volatility indices such as VIX, has an upward bias induced by Riemann sum numerical integration that empirically outweighs the negative jump and discrete monitoring biases that are inherent in this fair-value formula. On average, the exchange’s methodology provides less accurate predictors of discretely-monitored realised volatility than the approximate swap rate formula introduced in this paper, which we implement using an almost exact analytical integration technique.
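
The exchange-style discretization can be sketched as below: a Riemann sum of out-of-the-money option prices over the strike grid, with the usual forward correction term. To make the example self-contained, the option quotes are generated from a flat-volatility Black (1976) model rather than market data, so the true fair variance is known; the strike grid and parameters are illustrative.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_price(F, K, T, vol, call):
    # Black (1976) price with zero rates, used only to generate test quotes
    d1 = (math.log(F / K) + 0.5 * vol**2 * T) / (vol * math.sqrt(T))
    d2 = d1 - vol * math.sqrt(T)
    if call:
        return F * norm_cdf(d1) - K * norm_cdf(d2)
    return K * norm_cdf(-d2) - F * norm_cdf(-d1)

def exchange_swap_rate(F, strikes, quotes, T):
    """VIX-style discretization: (2/T) * sum dK/K^2 * Q(K) minus the
    (1/T)(F/K0 - 1)^2 correction, with Q(K) the OTM option prices."""
    K0 = max(k for k in strikes if k <= F)
    total = 0.0
    for i, k in enumerate(strikes):
        if i == 0:                                  # one-sided spacing at ends
            dK = strikes[1] - strikes[0]
        elif i == len(strikes) - 1:
            dK = strikes[-1] - strikes[-2]
        else:                                       # central-difference spacing
            dK = (strikes[i + 1] - strikes[i - 1]) / 2.0
        total += dK / k**2 * quotes[i]
    return (2.0 / T) * total - (1.0 / T) * (F / K0 - 1.0)**2

F, T, vol = 100.0, 0.5, 0.2
strikes = [50 + 2 * i for i in range(51)]           # 50, 52, ..., 150
quotes = [black_price(F, k, T, vol, call=(k > F)) for k in strikes]
print(exchange_swap_rate(F, strikes, quotes, T))    # close to vol**2 = 0.04
```

Under flat volatility the continuous-strike integral equals exactly vol squared, so any residual gap in the output is the discretization and truncation bias discussed in the abstract.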

It is widely accepted that some of the most accurate predictions of aggregated asset returns are based on an appropriately specified GARCH process. When the forecast horizon exceeds the frequency of the GARCH model, such predictions either require time-consuming simulations or they can be estimated using a recent development in the GARCH literature, viz. quasi-analytic GARCH returns distributions based on analytic moment formulae for GARCH aggregated returns. We demonstrate that this new methodology yields robust and rapid calculations of the Value-at-Risk (VaR) generated by a GARCH process that can be well over 100 times faster than using Monte Carlo simulation. Our extensive empirical study applies normal and Student-t, symmetric and asymmetric (GJR) GARCH processes to returns data on different financial assets, validates the accuracy of the quasi-analytic approximations to GARCH aggregated returns and thus derives GARCH VaR estimates that are shown to be highly accurate over multiple horizons and significance levels.
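
One simple way to turn four analytic moments into a VaR estimate is a Cornish-Fisher expansion, sketched below. This is only a stand-in for the quasi-analytic distributions referenced in the abstract, and the moment values are hypothetical rather than output from a fitted GARCH model.

```python
import math
from statistics import NormalDist

def cornish_fisher_quantile(mean, var, skew, exkurt, alpha):
    """Approximate alpha-quantile from the first four moments via the
    Cornish-Fisher expansion; exkurt is excess kurtosis."""
    z = NormalDist().inv_cdf(alpha)
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * exkurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return mean + math.sqrt(var) * z_cf

# Hypothetical 10-day aggregated-return moments from a GARCH model:
var_99 = -cornish_fisher_quantile(mean=0.0, var=0.001 * 10,
                                  skew=-0.3, exkurt=1.2, alpha=0.01)
print(var_99)  # larger than the Gaussian VaR because of skew and fat tails
```

Because the quantile is a closed-form function of the moments, it can be recomputed for every horizon and significance level at negligible cost, which is the source of the speed advantage over Monte Carlo.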

This paper explores the properties of random orthogonal matrix (ROM) simulation when the random matrix is drawn from the class of rotational matrices. We describe the characteristics of ROM simulated samples that are generated using random Hessenberg, Cayley and exponential matrices and compare the computational efficiency of parametric ROM simulations with standard Monte Carlo techniques.
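
One of the rotational-matrix constructions compared in this line of work is the Cayley transform, which maps a skew-symmetric matrix to a rotation. The sketch below, with an arbitrary seed and dimension, just verifies the defining properties; in ROM simulation such random rotations post-multiply a fixed moment-matching matrix so that every simulated sample reproduces the target moments exactly.

```python
import numpy as np

def cayley_rotation(n, rng):
    """Random rotation via the Cayley transform Q = (I - S)(I + S)^{-1},
    where S is skew-symmetric. Dimension and seed are illustrative."""
    A = rng.standard_normal((n, n))
    S = (A - A.T) / 2.0                      # skew-symmetric part
    I = np.eye(n)
    return (I - S) @ np.linalg.inv(I + S)

rng = np.random.default_rng(0)
Q = cayley_rotation(5, rng)
print(np.allclose(Q.T @ Q, np.eye(5)))       # True: Q is orthogonal
print(np.isclose(np.linalg.det(Q), 1.0))     # True: a rotation, det = +1
```

The determinant is always +1 because det(I - S) = det(I + S) for skew-symmetric S, so the Cayley transform only ever produces proper rotations.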

Conditional returns distributions generated by a GARCH process, which are important for many problems in market risk assessment and portfolio optimization, are typically generated via simulation. This paper extends previous research on analytic moments of GARCH returns distributions in several ways: we consider a general GARCH model – the GJR specification with a generic innovation distribution; we derive analytic expressions for the first four conditional moments of the forward return, of the forward variance, of the aggregated return and of the aggregated variance – corresponding moments for some specific GARCH models widely used in practice are recovered as special cases; we derive the limits of these moments as the time horizon increases, establishing regularity conditions for the moments of aggregated returns to converge to normal moments; and we demonstrate empirically that some excellent approximate predictive distributions can be obtained from these analytic moments, thus precluding the need for time-consuming simulations.
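
The simplest of these moments, the first conditional moment of the forward and aggregated variance, can be sketched in closed form. For a GJR-GARCH(1,1) with a symmetric innovation density the variance persistence is alpha + lambda/2 + beta; the parameter values below are illustrative, not estimates from the paper.

```python
def expected_forward_variance(omega, alpha, beta, lam, sigma2_next, h):
    """E_t[sigma^2_{t+h}] for GJR-GARCH(1,1) with symmetric innovations:
    mean-reverts geometrically, at rate phi, towards omega/(1 - phi)."""
    phi = alpha + 0.5 * lam + beta
    return omega * (1 - phi**(h - 1)) / (1 - phi) + phi**(h - 1) * sigma2_next

def expected_aggregated_variance(omega, alpha, beta, lam, sigma2_next, n):
    """First conditional moment of the n-step aggregated variance."""
    return sum(expected_forward_variance(omega, alpha, beta, lam, sigma2_next, h)
               for h in range(1, n + 1))

# Illustrative daily parameters (not taken from the paper):
p = dict(omega=2e-6, alpha=0.05, beta=0.90, lam=0.04, sigma2_next=3e-4)
print(expected_aggregated_variance(n=10, **p))
```

Setting lam = 0 recovers the symmetric GARCH(1,1) case, illustrating how moments of specific models drop out of the GJR formulae as special cases.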

Recent research advocates volatility diversification for long equity investors. Diversification into volatility can even be justified when its short-term expected return is highly negative, but only if its equilibrium return is ignored. Its advantages during stock market crises are clear, but we show that the high transaction costs and the negative carry and roll yield on volatility futures during normal periods would outweigh any benefits gained unless volatility trades are carefully timed. Our analysis highlights the difficulty of predicting when volatility diversification is optimal. Hence institutional investors should be sceptical of studies that extol its benefits. Volatility is better left to experienced traders such as speculators, vega hedgers and hedge funds.
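
The negative roll yield arises because volatility futures normally trade in contango: rolling a long position means repeatedly selling a cheaper expiring contract and buying a dearer one. A back-of-the-envelope annualization, with purely illustrative prices:

```python
def roll_yield(front, second, days_between):
    """Annualized roll yield from holding the front future and rolling into
    the second contract; negative in contango. Prices are illustrative."""
    return -(second - front) / front * (365 / days_between)

# A typical contango term structure for volatility futures:
print(round(roll_yield(front=18.0, second=19.5, days_between=30), 3))
```

A steady drag of this magnitude during calm markets is what can overwhelm the crisis-period benefits unless the trades are well timed.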

We quantify and endogenize the model risk associated with quantile estimates using a maximum entropy distribution (MED) as benchmark. Moment-based MEDs cannot have heavy tails; however, generalized beta generated distributions have attractive properties for popular applications of quantiles. These are MEDs under three simple constraints on the parameters that explicitly control tail weight and peakedness. Model risk arises because analysts are constrained to use a model distribution that is not the MED. Then the model’s alpha quantile differs from the alpha quantile of the MED, so the tail probability under the MED associated with the model’s alpha quantile is not alpha; it is a random variable. Model risk is endogenized by parameterizing the uncertainty about this random variable, whence the model’s alpha quantile becomes a generated random variable. To obtain a point model-risk-adjusted quantile, the generated distribution is used to adjust the model’s alpha quantile for any systematic bias and uncertainty due to model risk. An illustration based on Value-at-Risk (VaR) computes a model-risk-adjusted VaR for risk capital reserves that encompasses both portfolio risk and VaR model risk.
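
The core mechanism, the model's alpha quantile carrying a tail probability other than alpha under the benchmark, can be sketched with stand-ins: below, the benchmark is a heavy-tailed normal mixture and the constrained model is a single normal, rather than the MEDs and generalized beta distributions of the paper.

```python
from statistics import NormalDist

def mixture_cdf(x, w=0.9, s1=0.8, s2=2.0):
    """Heavy-tailed normal mixture playing the role of the benchmark MED;
    weights and scales are illustrative."""
    return w * NormalDist(0, s1).cdf(x) + (1 - w) * NormalDist(0, s2).cdf(x)

def mixture_quantile(alpha, lo=-10.0, hi=0.0):
    # bisection on the mixture CDF (no closed-form inverse)
    for _ in range(80):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mixture_cdf(mid) < alpha else (lo, mid)
    return (lo + hi) / 2

alpha = 0.01
model_q = NormalDist().inv_cdf(alpha)     # the model's alpha quantile
true_tail = mixture_cdf(model_q)          # its tail probability under the MED
adjusted_q = mixture_quantile(alpha)      # quantile consistent with the MED
print(round(true_tail, 4), round(model_q, 3), round(adjusted_q, 3))
```

Here the normal model's 1% quantile carries a tail probability above 1% under the heavy-tailed benchmark, so the bias-adjusted quantile sits further in the tail, the direction of adjustment a model-risk-adjusted VaR would make.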

This paper examines the ability of several different continuous-time one- and two-factor jump-diffusion models to capture the dynamics of the VIX volatility index for the period between 1990 and 2010. For the one-factor models we study affine and non-affine specifications, possibly augmented with jumps. Jumps in one-factor models occur frequently, but add surprisingly little to the ability of the models to explain the dynamics of the VIX. We present a stochastic volatility of volatility model that can explain all the time-series characteristics of the VIX studied in this paper. Extensions demonstrate that sudden jumps in the VIX are more likely during tranquil periods and the days when jumps occur coincide with major political or economic events. Using several statistical and operational metrics we find that non-affine one-factor models outperform their affine counterparts and modeling the log of the index is superior to modeling the VIX level directly.
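
A minimal simulation of one model in this family, a mean-reverting log-VIX process augmented with upward jumps, illustrates the kind of dynamics being compared. The specification and every parameter below are illustrative and are not the paper's estimates.

```python
import math
import random

def simulate_log_vix(v0, kappa, theta, sigma, jump_intensity, jump_mean,
                     T, n, rng):
    """Euler scheme for d(log V) = kappa*(theta - log V)dt + sigma dW + J dN,
    with exponentially distributed jump sizes J and Poisson arrivals dN.
    Illustrative one-factor specification, not the paper's fitted model."""
    dt = T / n
    x = math.log(v0)
    path = [v0]
    for _ in range(n):
        jump = (rng.expovariate(1 / jump_mean)
                if rng.random() < jump_intensity * dt else 0.0)
        x += (kappa * (theta - x) * dt
              + sigma * math.sqrt(dt) * rng.gauss(0, 1) + jump)
        path.append(math.exp(x))   # modeling the log keeps the level positive
    return path

rng = random.Random(7)
path = simulate_log_vix(20.0, kappa=5.0, theta=math.log(20.0), sigma=0.9,
                        jump_intensity=10.0, jump_mean=0.2, T=1.0, n=252,
                        rng=rng)
print(len(path), min(path) > 0)
```

Working on the log scale guarantees a positive index level by construction, one practical reason the log specification can dominate modeling the VIX level directly.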

GARCH option pricing models have the advantage of a well-established econometric foundation. However, multiple states need to be introduced, as single-state GARCH and even Lévy processes are unable to explain the term structure of the moments of financial data. We show that the continuous-time version of the Markov switching GARCH(1,1) process is a stochastic model where the volatility follows a switching process. The continuous-time switching GARCH model derived in this paper, where the variance process jumps between two or more GARCH volatility states, is able to capture the features of implied volatilities in an intuitive and tractable framework.
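
The state-switching mechanism can be sketched in discrete time: volatility jumps between two levels according to a Markov chain with given transition intensities. The state levels and intensities below are illustrative placeholders, not the GARCH-implied states of the paper.

```python
import random

def switching_volatility_path(levels, intensities, initial_state, T, n, rng):
    """Two-state switching volatility sketch: the volatility jumps between
    the given levels with per-state transition intensities (per year).
    All values are illustrative."""
    dt = T / n
    state = initial_state
    path = []
    for _ in range(n):
        if rng.random() < intensities[state] * dt:
            state = 1 - state            # switch regime
        path.append(levels[state])
    return path

rng = random.Random(3)
path = switching_volatility_path(levels=[0.15, 0.45], intensities=[2.0, 6.0],
                                 initial_state=0, T=1.0, n=252, rng=rng)
print(len(path), set(path) <= {0.15, 0.45})
```

Persistent low-volatility spells punctuated by shorter high-volatility episodes are exactly the feature a single-state model struggles to produce across horizons.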

Discrete time volatility analysis has focussed almost exclusively on GARCH processes, which are very flexible models for time varying conditional variance. The continuous limit of these processes is therefore of considerable interest for continuous time volatility modelling. Unfortunately, progress in this area has been hampered by conflicting results. The limit of the symmetric normal GARCH model is fundamental for limits of other GARCH processes, yet even this has been the point of much debate amongst econometricians. Nelson (1990) derived the limit of the strong GARCH model as a stochastic volatility process that is uncorrelated with the price process, so this limit has limited applicability. However, since the strong GARCH process is not time-aggregating, one should question whether it is sensible to derive its continuous limit at all. This paper derives the continuous limit of the weak GARCH process, which is time-aggregating. Moreover, the limit model is a stochastic volatility model with non-zero price-volatility correlation, in which both the variance diffusion coefficient and the price-volatility correlation are related to the skewness and kurtosis of the physical returns density. When returns are normally distributed this limit model reduces to Nelson’s strong GARCH diffusion; however, more generally it has the flexibility to fit most short term volatility skews without adding jumps in either the price or the volatility process.
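
An Euler-scheme sketch of a GARCH diffusion with correlated Brownian drivers illustrates the limit models being contrasted: a zero correlation corresponds to Nelson's strong-GARCH limit, while the weak-GARCH limit ties the correlation and the variance diffusion coefficient to the skewness and kurtosis of returns. The parameterization below is generic and all values are illustrative.

```python
import math
import random

def simulate_garch_diffusion(s0, v0, omega, theta, xi, rho, T, n, rng):
    """Euler scheme for dS/S = sqrt(V) dW1, dV = (omega - theta*V)dt
    + xi*V dW2, with corr(dW1, dW2) = rho. rho = 0 mimics Nelson's
    uncorrelated limit; rho != 0 mimics a weak-GARCH-type limit.
    All parameter values are illustrative."""
    dt = T / n
    s, v = s0, v0
    for _ in range(n):
        z1, z = rng.gauss(0, 1), rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho**2) * z    # correlated shock
        s *= math.exp(-0.5 * v * dt + math.sqrt(v * dt) * z1)
        v = max(v + (omega - theta * v) * dt
                  + xi * v * math.sqrt(dt) * z2, 1e-12)
    return s, v

rng = random.Random(1)
s, v = simulate_garch_diffusion(100.0, 0.04, omega=0.02, theta=0.5, xi=1.0,
                                rho=-0.6, T=1.0, n=2520, rng=rng)
print(s > 0 and v > 0)
```

A negative rho produces the leverage-type asymmetry behind short-term volatility skews, which is why the non-zero-correlation limit matters for option pricing applications.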

The orthogonal GARCH model is a multivariate, factor GARCH model based on principal components analysis. It is my preferred approach for constructing large dimensional GARCH covariance matrices for highly correlated systems, such as term structures of interest rates or commodity futures. I wrote this primer, more than 10 years ago, as a simple introduction to the model, aimed at inexperienced users. Although I have since published many more advanced articles on the subject, I have made this little primer available on my website by popular demand.

Examples for the OGARCH Primer
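
The three steps of the model can be sketched briefly: PCA on the returns, a univariate variance model on each principal component, then reconstruction of the full covariance matrix from the loadings. To keep the sketch short, an EWMA recursion stands in for the univariate GARCH models, and the simulated data are arbitrary.

```python
import numpy as np

def ogarch_covariance(returns, n_factors, lam=0.94):
    """O-GARCH sketch: PCA on the returns, a univariate conditional-variance
    model per principal component (an EWMA stand-in for GARCH here), then
    reconstruct the covariance as W diag(v) W'."""
    X = returns - returns.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(X.T))
    order = np.argsort(evals)[::-1][:n_factors]
    W = evecs[:, order]                          # factor loadings
    P = X @ W                                    # principal components
    v = np.zeros(n_factors)
    for j in range(n_factors):                   # EWMA variance per component
        var = P[:, j].var()
        for p in P[:, j]:
            var = lam * var + (1 - lam) * p**2
        v[j] = var
    return W @ np.diag(v) @ W.T                  # conditional covariance

rng = np.random.default_rng(0)
loadings = np.array([[1.0, 0.8, 0.7, 0.6],
                     [0.0, 0.3, 0.2, 0.2],
                     [0.0, 0.0, 0.2, 0.1],
                     [0.0, 0.0, 0.0, 0.1]])
R = rng.standard_normal((500, 4)) @ loadings     # correlated toy returns
V = ogarch_covariance(R, n_factors=2)
print(V.shape)  # (4, 4)
```

Because only a handful of univariate variance models are estimated, the resulting covariance matrix is always positive semi-definite and scales easily to large, highly correlated systems.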