Statistical power is central to the planning of studies that search for disease genes, particularly since multiple testing is often a serious problem. The resources available for data collection and genotyping are limited, and the problem is to choose the combination of family data and independent controls that gives maximal power for a given total number of individuals. We show that relative efficiency is a much simpler concept to work with than power in this setting, how the change-of-variance function can be used to select among designs, and how an extended score test can be used to minimize the efficiency loss in multiple testing.

A Transformed Copula Function Approach to Credit Portfolio Modeling

Abstract

We present a fundamental modification to the widely used copula function approach to credit portfolio modeling introduced by Li (2000). The original approach simply uses a copula function to create a joint survival time distribution from individual survival time distributions that are already risk neutralized and specified from single-name perspectives only. Based on Buhlmann's equilibrium pricing model (1980), under certain assumptions on the aggregate risk, or on the multivariate Esscher and Wang transforms, we find that the covariance between each individual risk and the market or aggregate risk should be included in the measure change. In the Gaussian copula model, it is shown that we simply need to adjust the asset return by subtracting a term associated with the covariance risk.

This discovery allows us to theoretically link credit portfolio modeling with classical equity portfolio modeling in the CAPM setting, which can help solve some practical problems encountered in credit portfolio modeling.
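To make the adjustment concrete, the following sketch simulates default times under the standard one-factor Gaussian copula, with a scalar shift `cov_adj` standing in for the covariance-risk term the abstract describes; the exact form of that term is not reproduced here, and all names and parameter values are illustrative.

```python
import numpy as np
from math import erf, sqrt

# Standard normal CDF, vectorized with stdlib erf (avoids a scipy dependency).
Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))

def default_times(n_names, n_paths, rho, hazard, cov_adj=0.0, seed=0):
    """Simulate correlated default times under a one-factor Gaussian copula.

    Asset return: X_i = sqrt(rho)*M + sqrt(1-rho)*eps_i - cov_adj, where
    cov_adj is an illustrative stand-in for the covariance-risk adjustment
    discussed in the abstract. Marginal survival times are exponential
    with intensity `hazard`.
    """
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n_paths, 1))          # common market factor
    eps = rng.standard_normal((n_paths, n_names))  # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * eps - cov_adj
    U = Phi(X)                                     # uniform marginals
    return -np.log1p(-U) / hazard                  # invert exponential survival CDF

# Same random draws; a positive covariance adjustment pulls defaults earlier.
tau0 = default_times(5, 20000, rho=0.3, hazard=0.02, cov_adj=0.0)
tau1 = default_times(5, 20000, rho=0.3, hazard=0.02, cov_adj=0.5)
```

With identical seeds, the shifted asset returns produce systematically earlier default times, which is the direction one expects from a positive covariance with aggregate risk.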

On the asymptotics of Nadaraya-Watson estimator: Toward a unified approach

Abstract

This paper investigates the asymptotics of the Nadaraya-Watson estimator, providing a framework and a unified approach for stationary and non-stationary time series. It also establishes an extension to multivariate regression with non-stationary time series and provides a brief overview of the asymptotic results in relation to non-linear cointegrating regression.
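For reference, the Nadaraya-Watson estimator itself is straightforward to compute; the minimal sketch below (Gaussian kernel, illustrative simulated stationary data) shows the object whose asymptotics the paper studies.

```python
import numpy as np

def nadaraya_watson(x_eval, x_obs, y_obs, h):
    """Nadaraya-Watson estimate m_hat(x) = sum_t K((x - X_t)/h) Y_t /
    sum_t K((x - X_t)/h), with a Gaussian kernel and bandwidth h."""
    u = (x_eval[:, None] - x_obs[None, :]) / h
    K = np.exp(-0.5 * u**2)            # kernel weights (normalization cancels)
    return (K @ y_obs) / K.sum(axis=1)

# Illustrative stationary example: recover m(x) = sin(x) from noisy data.
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 2000)
y = np.sin(x) + 0.1 * rng.standard_normal(2000)
grid = np.linspace(-2, 2, 9)
m_hat = nadaraya_watson(grid, x, y, h=0.3)
```

The estimate tracks sin(x) closely on the interior of the support; the paper's interest is in how such pointwise limit behavior changes once the regressor is a non-stationary (e.g. integrated) process.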

Functional Ito Calculus and Financial Applications

Abstract

We briefly present the Functional Ito Calculus, which gives a natural setting for defining the Greeks of path-dependent options and yields a generalized PDE for the price of path-dependent options, even in the case of non-Markovian dynamics. It leads to a variational calculus on volatility surfaces and a fine decomposition of the volatility risk, as well as links with super-replication strategies. We examine a few practical examples and analyze whether some popular structures can be hedged.
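For orientation, the central tool can be stated compactly. Writing \(\Delta_t\) for the horizontal (time) derivative and \(\Delta_x, \Delta_{xx}\) for the vertical (space) derivatives of a path functional \(F_t\), the functional Ito formula for a continuous semimartingale \(X\) reads:

```latex
dF_t(X_t) \;=\; \Delta_t F_t(X_t)\,dt
\;+\; \Delta_x F_t(X_t)\,dX_t
\;+\; \tfrac{1}{2}\,\Delta_{xx} F_t(X_t)\,d\langle X \rangle_t .
```

This is the standard formula of the functional Ito literature (Dupire 2009; Cont and Fournie), stated here only for context; the talk's notation and regularity conditions may differ.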

Intermediation and Voluntary Exposure to Counterparty Risk

Abstract

I develop a model of the financial sector in which intermediation among banks interacts with the debt nature of bank liabilities to generate excessive systemic risk. The central idea is to explore the possibility that certain financial institutions can use their lending and borrowing decisions to tilt the division of surplus in their own favor by capturing intermediation spreads, even if the implied change in the structure of the financial system hurts the total surplus of the economy. The paper predicts that there is excessive exposure among banks that make risky investments and too little exposure among those that mainly provide funding. Inefficiency arises because the financial institutions that intermediate among other institutions are exposed to excessive counterparty risk: replacing them with certain other banks would mitigate the extent of failure, when failure is inevitable, without hurting the optimal level of investment. However, the equilibrium intermediaries choose to overexpose themselves to other risky banks, and they accept the cost of failure through contagion so long as they can absorb enough rents when they survive.

Limit Theorems for nearly unstable Hawkes Processes

Abstract

Because of their tractability and their natural interpretation in terms of market quantities, Hawkes processes are nowadays widely used in high-frequency finance. However, in practice, statistical estimation results very often seem to show that only nearly unstable Hawkes processes are able to fit the data properly. By nearly unstable, we mean that the L1 norm of their kernel is close to unity. In this work, we study such processes, for which the stability condition is almost violated.

Our main result states that, after suitable rescaling, they asymptotically behave like integrated Cox-Ingersoll-Ross models. Thus, modeling financial order flows as nearly unstable Hawkes processes may be a good way to reproduce both their high- and low-frequency stylized facts. We then extend this result to the Hawkes-based price model introduced by Bacry et al. We show that under a similar criticality condition, this process converges to a Heston model. Again, we recover well-known stylized facts of prices, both at the microstructure level and at the macroscopic scale. (Joint work with Thibault Jaisson, Ecole Polytechnique, Paris.)
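The stability condition is easy to make explicit for the exponential kernel phi(t) = alpha * exp(-beta * t), whose L1 norm is alpha/beta. The sketch below (standard Ogata thinning, illustrative parameter values, not code from the talk) simulates a stable and a nearly unstable process and exhibits the event-rate explosion as alpha/beta approaches one.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Simulate a Hawkes process with kernel phi(t) = alpha*exp(-beta*t)
    by Ogata's thinning.  ||phi||_1 = alpha/beta; stable iff alpha/beta < 1."""
    rng = np.random.default_rng(seed)
    t, excess, events = 0.0, 0.0, []
    while True:
        lam_bar = mu + excess              # valid bound: intensity only decays
        w = rng.exponential(1.0 / lam_bar)
        t += w
        if t > T:
            return np.array(events)
        excess *= np.exp(-beta * w)        # decayed self-excitation at time t
        if rng.uniform() < (mu + excess) / lam_bar:
            events.append(t)
            excess += alpha                # each event adds one kernel copy

# Long-run event rate is mu / (1 - alpha/beta): it blows up near criticality.
stable = simulate_hawkes(mu=1.0, alpha=0.5, beta=1.0, T=500.0)   # norm 0.5
near = simulate_hawkes(mu=1.0, alpha=0.95, beta=1.0, T=500.0)    # norm 0.95
```

The nearly unstable run produces roughly ten times as many events as the stable one, and its bursty intensity paths are the objects that, after rescaling, converge to the integrated CIR limit described above.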

Estimating the entire quadratic covariation in case of asynchronous observations

Abstract

In this talk we consider the estimation of the quadratic (co)variation of a semimartingale from discrete observations which are irregularly spaced under high-frequency asymptotics. In the univariate setting, standard results are generalized to the case of irregular observations. In the two-dimensional setup under non-synchronous observations, we derive a stable central limit theorem for the Hayashi-Yoshida estimator in the presence of jumps. We reveal how idiosyncratic and simultaneous jumps affect the asymptotic distribution. Observation times generated by Poisson processes are explicitly discussed.
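A minimal sketch of the Hayashi-Yoshida estimator on simulated correlated Brownian motions observed at asynchronous times may help fix ideas. The sampling scheme and parameter values below are illustrative, and this continuous-path example omits the jumps treated in the talk.

```python
import numpy as np

def hayashi_yoshida(t1, x1, t2, x2):
    """Hayashi-Yoshida covariance estimator for asynchronous observations:
    sum of increment products over pairs of overlapping observation
    intervals, with no synchronization or interpolation."""
    cov = 0.0
    for i in range(len(t1) - 1):
        a0, a1 = t1[i], t1[i + 1]
        dx = x1[i + 1] - x1[i]
        for j in range(len(t2) - 1):
            b0, b1 = t2[j], t2[j + 1]
            if min(a1, b1) > max(a0, b0):   # observation intervals overlap
                cov += dx * (x2[j + 1] - x2[j])
    return cov

# Two correlated Brownian motions on [0, 1], observed at different random times.
rng = np.random.default_rng(2)
n, rho = 20000, 0.7
dW = rng.standard_normal((2, n)) * np.sqrt(1.0 / n)
B1 = np.concatenate([[0.0], np.cumsum(dW[0])])
B2 = np.concatenate([[0.0], np.cumsum(rho * dW[0] + np.sqrt(1 - rho**2) * dW[1])])
grid = np.linspace(0.0, 1.0, n + 1)
idx1 = np.sort(rng.choice(n + 1, 300, replace=False))  # asynchronous sampling
idx2 = np.sort(rng.choice(n + 1, 400, replace=False))
est = hayashi_yoshida(grid[idx1], B1[idx1], grid[idx2], B2[idx2])
# The true quadratic covariation over [0, 1] is rho = 0.7.
```

The estimate lands near 0.7 despite the two price series never being observed at the same instants, which is the point of the estimator.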

April 3, 2014 Thursday
4:00pm
5727 S. University Ave, Room 112

Jian Sun

Morgan Stanley

Implied Remaining Variance in Derivative Pricing

Abstract

In this note, we give a way to calculate a swaption implied volatility curve in closed form via the well-known quadratic root formula. The closed-form expression has three free parameters, which parsimoniously govern the assumed dynamics of implied volatility under the forward swap measure. Preliminary empirical work suggests the curve fits the swaptions market well (though not perfectly). Unlike previous models of stochastic implied volatility, the current model has no implications for the dynamics of instantaneous volatility.

Model-Free Leverage Effect Estimators at High Frequency

The Statistical Price to Pay for Computational Efficiency in Sparse PCA

Abstract

Computational limitations of statistical problems have largely been ignored or simply overcome by ad hoc relaxation techniques. If optimal methods cannot be computed in reasonable time, what is the best possible statistical performance of a computationally efficient procedure? Building on average-case reductions, we establish these fundamental limits in the context of sparse principal component analysis and quantify the statistical price to pay for computational efficiency. Our results can be viewed as complexity-theoretic lower bounds, conditional on the assumption that some instances of the planted clique problem cannot be solved in randomized polynomial time. [Joint work with Quentin Berthet.]
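For orientation, one classical computationally efficient procedure of the kind such lower bounds speak to is diagonal thresholding: select the highest-variance coordinates, then run ordinary PCA on them. The sketch below, with an illustrative planted single-spike model, is an assumption-laden example and not the talk's construction.

```python
import numpy as np

def diagonal_thresholding_spca(X, k):
    """Simple polynomial-time sparse-PCA heuristic: keep the k coordinates
    with largest sample variance, run PCA on that submatrix, and embed the
    leading eigenvector back into R^p.  Efficient procedures like this need
    larger sample sizes than the (intractable) optimal ones, which is the
    gap the talk quantifies."""
    n, p = X.shape
    support = np.argsort(X.var(axis=0))[-k:]      # top-k variance coordinates
    S = X[:, support].T @ X[:, support] / n       # covariance on the support
    eigvals, eigvecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    v = np.zeros(p)
    v[support] = eigvecs[:, -1]                   # leading eigenvector
    return v

# Planted sparse spike: rows are sqrt(theta)*g*v_true + standard Gaussian noise.
rng = np.random.default_rng(3)
n, p, k, theta = 2000, 200, 10, 3.0
v_true = np.zeros(p)
v_true[:k] = 1.0 / np.sqrt(k)
g = rng.standard_normal((n, 1))
X = np.sqrt(theta) * g * v_true + rng.standard_normal((n, p))
v_hat = diagonal_thresholding_spca(X, k)
overlap = abs(v_hat @ v_true)                     # close to 1 when recovery succeeds
```

In this easy regime the heuristic recovers the spike; the talk's results concern the harder regimes where no efficient method can, assuming planted clique hardness.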

A technique for online estimation of spot volatility for high-frequency data is developed. The method uses a price model with time shift in combination with a nonlinear market microstructure noise model. A benefit of the model is that it leads to an identifiable decomposition of spot volatility into spot volatility per transaction and the trading intensity - thus highlighting the influence of trading intensity on volatility. The online algorithm uses a computationally efficient particle filter. It works directly on the transaction data and updates the volatility estimate immediately after the occurrence of a new transaction. It also allows for the approximation of the unknown efficient prices. For volatility estimation a nonparametric recursive EM algorithm is used. We neither assume that the transaction times are equidistant nor do we use interpolated prices. For the theoretical investigation of the estimates we present a theoretical framework with infill asymptotics. [Joint work with Jan C. Neddermeyer and Sophon Tunyavetchakit.]
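The talk's filter targets a transaction-level model with time shift and nonlinear microstructure noise; as a minimal illustration of particle filtering for spot volatility, here is a bootstrap filter for a standard discrete-time stochastic-volatility model. The model, parameter values, and function names are illustrative assumptions, not the talk's algorithm.

```python
import numpy as np

def bootstrap_filter_sv(y, n_part=2000, phi=0.97, sigma_h=0.2, seed=4):
    """Bootstrap particle filter for the log-volatility state-space model
    h_t = phi*h_{t-1} + sigma_h*eta_t,  y_t = exp(h_t/2)*eps_t.
    Returns the filtered spot volatility E[exp(h_t/2) | y_1..y_t]."""
    rng = np.random.default_rng(seed)
    h = rng.standard_normal(n_part) * sigma_h / np.sqrt(1 - phi**2)  # stationary init
    est = []
    for yt in y:
        h = phi * h + sigma_h * rng.standard_normal(n_part)  # propagate particles
        logw = -0.5 * (h + yt**2 * np.exp(-h))               # N(0, e^h) log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est.append(np.exp(h / 2) @ w)                        # weighted posterior mean
        h = h[rng.choice(n_part, n_part, p=w)]               # multinomial resampling
    return np.array(est)

# Simulated path: the filter should track the true spot volatility exp(h_t/2).
rng = np.random.default_rng(5)
T, phi, sigma_h = 500, 0.97, 0.2
h_true = np.zeros(T)
for t in range(1, T):
    h_true[t] = phi * h_true[t - 1] + sigma_h * rng.standard_normal()
y = np.exp(h_true / 2) * rng.standard_normal(T)
vol_hat = bootstrap_filter_sv(y)
```

The filtered estimate updates after each new observation, which is the online property the abstract emphasizes; the transaction-level setting adds irregular observation times and microstructure noise on top of this basic recursion.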

The Unbearable Transparency of Stein Estimation

Charles Stein (1956) discovered that, under quadratic loss, the usual unbiased estimator for the mean vector of a multivariate normal distribution is inadmissible if the dimension $p$ of the mean vector exceeds two. It has since been claimed that Stein's results and the subsequent James-Stein estimator are counter-intuitive, even paradoxical, and not very useful. In response to such doubts, various authors have presented alternative derivations of Stein shrinkage estimators. Surely Stein himself did not find his results paradoxical. This talk argues that assertions of "paradoxical" or "counter-intuitive" or "not practical" have overlooked essential arguments and remarks in Stein's beautifully written 1956 paper. Among these overlooked aspects are the asymptotic geometry of quadratic loss in high dimensions that makes Stein estimation transparent; the asymptotic optimality results that can be associated with Stein estimation; his explicit mention of practical multiple shrinkage estimators; and the foreshadowing of Stein confidence balls. These ideas prove fundamental for studies of modern regularization estimators that rely on multiple shrinkage, whether implicitly or overtly.
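Stein's phenomenon is easy to verify numerically. The sketch below (illustrative dimension and Monte Carlo setup) compares the quadratic risk of the unbiased estimator X with the James-Stein shrinkage estimator for X ~ N_p(theta, I).

```python
import numpy as np

def james_stein(x):
    """James-Stein estimator for the mean of X ~ N_p(theta, I), p >= 3:
    shrink the observation toward the origin by a data-driven factor."""
    p = x.size
    return (1.0 - (p - 2) / (x @ x)) * x

# Monte Carlo check of Stein's result: for p > 2, shrinkage dominates the
# unbiased estimator under squared-error loss.
rng = np.random.default_rng(6)
p, n_rep = 10, 5000
theta = np.ones(p)                          # true mean, moderate signal strength
X = theta + rng.standard_normal((n_rep, p))
risk_mle = ((X - theta)**2).sum(axis=1).mean()
risk_js = np.mean([((james_stein(x) - theta)**2).sum() for x in X])
# risk_mle is close to p = 10; risk_js is strictly smaller.
```

The simulation reflects the geometry the talk highlights: in high dimensions the observation lies far from the mean with high probability, so pulling it toward the origin (or any fixed point) reduces risk everywhere.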