S. Bürger, P. Mathé, Discretized Lavrent'ev regularization for the autoconvolution equation, Applicable Analysis, 96 (2017), pp. 1618--1637, DOI 10.1080/00036811.2016.1212336. Abstract: Lavrent'ev regularization for the autoconvolution equation was considered by J. Janno in Lavrent'ev regularization of ill-posed problems containing nonlinear near-to-monotone operators with application to autoconvolution equation, Inverse Prob. 2000;16:333--348. Here this study is extended by considering discretization of the Lavrent'ev scheme by splines. It is shown how to maintain the known convergence rate by an appropriate choice of spline spaces and a proper choice of the discretization level. For piecewise constant splines the discretized equation admits an explicit solver, in contrast to higher-order splines. This is exploited to design a fast implementation by means of post-smoothing, which provides results indistinguishable from those obtained by direct discretization using cubic splines.

M. Ladkau, J.G.M. Schoenmakers, J. Zhang, Libor model with expiry-wise stochastic volatility and displacement, International Journal of Portfolio Analysis and Management, 1 (2013), pp. 224--249. Abstract: We develop a multi-factor stochastic volatility Libor model with displacement, where each individual forward Libor is driven by its own square-root stochastic volatility process. The main advantage of this approach is that, maturity-wise, each square-root process can be calibrated to the corresponding cap(let) vola-strike panel in the market. However, since the Libor dynamics are not affine even after freezing the Libors in the drift, new affine approximations have to be developed in order to obtain Fourier-based (approximate) pricing procedures for caps and swaptions. As a result, we end up with a Libor modelling package that allows for efficient calibration to a complete system of cap/swaption market quotes and performs well even in crisis times, where structural breaks in vola-strike-maturity panels are typically observed.

V. Krätschmer, H. Zähle, Sensitivity of risk measures with respect to the normal approximation of total claim distributions, Insurance: Mathematics & Economics, 49 (2011), pp. 335--344. Abstract: A simple and commonly used method to approximate the total claim distribution of a (possibly weakly dependent) insurance collective is the normal approximation. In this article, we investigate the error made when the normal approximation is plugged into a fairly general distribution-invariant risk measure. We focus on the rate of convergence of the error relative to the number of clients, we specify the relative error's asymptotic distribution, and we illustrate our results by means of a numerical example. Regarding the risk measure, we take into account distortion risk measures as well as distribution-invariant coherent risk measures.
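The kind of comparison studied in this entry can be mimicked with a small Monte Carlo sketch; all concrete choices below (i.i.d. Exp(1) claims, 500 clients, VaR at level 0.99) are illustrative assumptions, not the paper's setting:

```python
import math
import random

def empirical_quantile(samples, alpha):
    """Order-statistic estimate of the alpha-quantile (Value-at-Risk)."""
    s = sorted(samples)
    return s[int(alpha * len(s))]

rng = random.Random(0)
n_clients = 500
n_sim = 5000

# Hypothetical collective: one i.i.d. Exp(1) claim per client;
# simulate the total claim amount n_sim times.
totals = [sum(rng.expovariate(1.0) for _ in range(n_clients))
          for _ in range(n_sim)]

alpha = 0.99
var_mc = empirical_quantile(totals, alpha)

# Normal approximation: total claims ~ N(n, n) for Exp(1) claims,
# so VaR_alpha ~ n + sqrt(n) * z_alpha.
z_alpha = 2.3263  # standard normal 99% quantile
var_normal = n_clients + math.sqrt(n_clients) * z_alpha
```

For a collective of this size the two numbers are close; the paper quantifies exactly how fast such a gap shrinks as the number of clients grows, for a general distribution-invariant risk measure rather than the plain quantile used here.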

E. Giacomini, W. Härdle, V. Spokoiny, Inhomogeneous dependency modelling with time varying copulae, Journal of Business & Economic Statistics, 27 (2009), pp. 224--234. Abstract: Measuring dependence in a multivariate time series is tantamount to modelling its dynamic structure in space and time. In the context of a multivariate normally distributed time series, the evolution of the covariance (or correlation) matrix over time describes this dynamic. A wide variety of applications, though, requires a modelling framework different from the multivariate normal. In risk management the non-normal behaviour of most financial time series calls for nonlinear dependency. The correct modelling of non-Gaussian dependencies is therefore a key issue in the analysis of multivariate time series. In this paper we use copula functions with adaptively estimated time-varying parameters for modelling the distribution of returns, free from the usual normality assumptions. Further, we apply copulae to the estimation of Value-at-Risk (VaR) of a portfolio and show that this approach outperforms RiskMetrics, a widely used methodology for VaR estimation.

V. Spokoiny, Multiscale local change point detection with applications to Value-at-Risk, The Annals of Statistics, 37 (2009), pp. 1405--1436. Abstract: This paper offers a new procedure for nonparametric estimation and forecasting of time series with applications to volatility modeling for financial data. The approach is based on the assumption of local homogeneity: for every time point there exists a historical interval of homogeneity in which the volatility parameter can be well approximated by a constant. The procedure recovers this interval from the data using local change point (LCP) analysis. Afterwards the estimate of the volatility can be obtained simply by local averaging. The approach carefully addresses the question of choosing the tuning parameters of the procedure using the so-called “propagation” condition. The main result is a new “oracle” inequality in terms of the modeling bias, which measures the quality of the local constant approximation. This result yields the optimal rate of estimation for smooth and piecewise constant volatility functions. The new procedure is then applied to some data sets and a comparison with a standard GARCH model is provided. Finally we discuss applications of the new method to the Value-at-Risk problem. The numerical results demonstrate very reasonable performance of the new method.

I.G. Grama, V. Spokoiny, Statistics of extremes by oracle estimation, The Annals of Statistics, 36 (2008), pp. 1619--1648. Abstract: We use the fitted Pareto law to construct an accompanying approximation of the excess distribution function. A selection rule for the location of the excess distribution function is proposed, based on a stagewise lack-of-fit testing procedure. Our main result is an oracle-type inequality for the Kullback-Leibler loss of the obtained adaptive estimator.

Y. Chen, W. Härdle, V. Spokoiny, Portfolio value at risk based on independent components analysis, Journal of Computational and Applied Mathematics, 205 (2007), pp. 594--607. Abstract: Risk management technology applied to high-dimensional portfolios needs simple and fast methods for the calculation of value at risk (VaR). The multivariate normal framework provides a simple off-the-shelf methodology but lacks the heavy-tailed distributional properties that are observed in data. A principal-component-based method (tied closely to the elliptical structure of the distribution) is therefore expected to be unsatisfactory. Here, we propose and analyze a technology that is based on independent component analysis (ICA). We study the proposed ICVaR methodology in an extensive simulation study and apply it to a high-dimensional portfolio situation. Our analysis yields very accurate VaRs.

G.N. Milstein, J.G.M. Schoenmakers, V. Spokoiny, Forward and reverse representations for Markov chains, Stochastic Processes and their Applications, 117 (2007), pp. 1052--1075. Abstract: In this paper we carry over the concept of reverse probabilistic representations, developed for diffusion processes in Milstein, Schoenmakers and Spokoiny (2004), to discrete-time Markov chains. We outline the construction of reverse chains in several situations and apply this to processes connected with jump-diffusion models and finite state Markov chains. By combining forward and reverse representations we then construct transition density estimators for chains which have root-N accuracy in any dimension, and consider some applications.

D. Belomestny, V. Spokoiny, Spatial aggregation of local likelihood estimates with applications to classification, The Annals of Statistics, 35 (2007), pp. 2287--2311. Abstract: This paper presents a new method for spatially adaptive local (constant) likelihood estimation which applies to a broad class of nonparametric models, including the Gaussian, Poisson and binary response models. The main idea of the method is, given a sequence of local likelihood estimates (“weak” estimates), to construct a new aggregated estimate whose pointwise risk is of the order of the smallest risk among all “weak” estimates. We also propose a new approach to selecting the parameters of the procedure by prescribing the behavior of the resulting estimate in the simple parametric situation. We establish a number of important theoretical results concerning the optimality of the aggregated estimate. In particular, our “oracle” result states that its risk is, up to a logarithmic multiplier, equal to the smallest risk for the given family of estimates. The performance of the procedure is illustrated by application to the classification problem. A numerical study demonstrates its good performance on simulated and real-life examples.

E. van den Berg, A.W. Heemink, H.X. Lin, J.G.M. Schoenmakers, Probability density estimation in stochastic environmental models using reverse representations, Stochastic Environmental Research & Risk Assessment, 20 (2006), pp. 126--139. Abstract: The estimation of probability densities of variables described by stochastic differential equations has long been done using forward time estimators, which rely on the generation of forward-in-time realizations of the model. Recently, an estimator based on the combination of forward and reverse time estimators has been developed. This estimator has a higher order of convergence than the classical one. In this article, we explore the new estimator and compare the forward and forward-reverse estimators by applying them to a biochemical oxygen demand model. Finally, we show that the computational efficiency of the forward-reverse estimator is superior to the classical one, and discuss the algorithmic aspects of the estimator.

D. Spivakovskaya, A.W. Heemink, G.N. Milstein, J.G.M. Schoenmakers, Simulation of the transport of particles in coastal waters using forward and reverse time diffusion, Advances in Water Resources, 28 (2005), pp. 927--938. Abstract: Particle models are often used to simulate the spreading of a pollutant in coastal waters in case of a calamity at sea. Here many different particle tracks starting at the point of release are generated to determine the particle concentration at some critical locations after the release. This Monte Carlo method, however, consumes a large amount of CPU time. Recently, Milstein, Schoenmakers and Spokoiny (2003) introduced the concept of reverse-time diffusion. They derived a reverse system from the original forward simulation model and showed that the Monte Carlo estimator can also be based on realizations of this reverse system. In this paper we apply this concept to estimate particle concentrations in coastal waters. The results for the experiments considered show that the CPU time is reduced by orders of magnitude compared with the classical method.

H. Haaf, O. Reiss, J.G.M. Schoenmakers, Numerically stable computation of CreditRisk+, The Journal of Risk, 6 (2004), pp. 1--10. Abstract: The CreditRisk+ model launched by Credit Suisse First Boston in 1997 is widely used by practitioners in the banking sector as a simple means for the quantification of credit risk, primarily of the loan book. We present an alternative numerical recursion scheme for CreditRisk+, equivalent to an algorithm recently proposed by Giese, that is based on well-known expansions of the logarithm and the exponential of a power series. We show that it is advantageous over the Panjer recursion advocated in the original CreditRisk+ document, in that it is numerically stable. The crucial stability arguments are explained in detail. We explain how to apply the suggested recursion scheme to incorporate stochastic exposures into the CreditRisk+ model as introduced by Tasche (2004). Finally, the computational complexity of the resulting algorithm is stated and compared with other methods for computing the CreditRisk+ loss distribution.
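The power-series machinery this entry refers to rests on a classical identity: if f(z) = exp(g(z)), then f' = g' f, giving the recurrence f_n = (1/n) * sum_{k=1}^{n} k * g_k * f_{n-k}, which involves only additions and multiplications of the g_k. A minimal sketch of that recurrence follows; the compound-Poisson sanity check is an illustration, not the CreditRisk+ model itself:

```python
import math

def exp_power_series(g, n_max):
    """Coefficients of f(z) = exp(g(z)) up to order n_max.

    From f' = g'*f one gets n*f_n = sum_{k=1}^{n} k*g_k*f_{n-k};
    when the relevant g_k are nonnegative this involves no
    cancellation, which is the source of the numerical stability.
    """
    f = [math.exp(g[0])]
    for n in range(1, n_max + 1):
        kmax = min(n, len(g) - 1)
        s = sum(k * g[k] * f[n - k] for k in range(1, kmax + 1))
        f.append(s / n)
    return f

# Sanity check: a compound Poisson sum with rate lam and unit claims
# has probability generating function exp(lam*(z - 1)), i.e. the loss
# distribution is Poisson(lam).
lam = 2.0
g = [-lam, lam]  # coefficients of g(z) = -lam + lam*z
pmf = exp_power_series(g, 10)
```

A direct Panjer-style recursion on the loss probabilities can mix terms of opposite sign; working on the level of log/exp of the generating function, as above, keeps every step sign-definite.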

G.N. Milstein, J.G.M. Schoenmakers, V. Spokoiny, Transition density estimation for stochastic differential equations via forward-reverse representations, Bernoulli, 10 (2004), pp. 281--312. Abstract: The general reverse diffusion equations are derived and applied to the problem of transition density estimation of diffusion processes between two fixed states. For this problem we propose density estimation based on forward-reverse representations and show that this method allows essentially better results to be achieved than the usual kernel or projection estimation based on forward representations only.

J.G.M. Schoenmakers, A.W. Heemink, K. Ponnambalam, P.E. Kloeden, Variance reduction for Monte Carlo simulation of stochastic environmental models, Applied Mathematical Modelling, 26 (2002), pp. 787--795. Abstract: To determine the probability of exceedance, Monte Carlo simulation of stochastic models is often used. Mathematically this requires the evaluation of an expectation of some function of the solution of a stochastic model, which can be reformulated as a Kolmogorov final value problem. It can thus be calculated numerically either by solving a deterministic partial differential equation (Kolmogorov's backward equation) or by simulating a large number of trajectories of the stochastic differential equation. Here we discuss a composite method of variance-reduced Monte Carlo simulation. The variance reduction is obtained via a Girsanov transformation that modifies the stochastic model by a correction term obtained from an approximate solution of the partial differential equation, computed by a classical numerical method. The composite method is more efficient than either the standard Monte Carlo or the classical numerical method. The approach is applied to estimate the probability of exceedance in a model for biochemical oxygen demand.
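The change-of-measure principle behind this kind of variance reduction can be illustrated with a generic exponential-tilting sketch for a normal tail probability; this is only an illustration of the idea, not the paper's PDE-based Girsanov correction:

```python
import math
import random

def exceedance_plain(c, n, rng):
    """Crude Monte Carlo estimate of P(Z > c), Z standard normal."""
    return sum(1.0 for _ in range(n) if rng.gauss(0.0, 1.0) > c) / n

def exceedance_tilted(c, n, rng):
    """Importance sampling: draw from N(c, 1), so the rare region is hit
    about half the time, and reweight each hit by the likelihood ratio
    phi(y) / phi(y - c) = exp(-c*y + c*c/2)."""
    total = 0.0
    for _ in range(n):
        y = rng.gauss(c, 1.0)
        if y > c:
            total += math.exp(-c * y + 0.5 * c * c)
    return total / n

rng = random.Random(42)
c = 4.0
p_tilted = exceedance_tilted(c, 10000, rng)  # true value is about 3.17e-5
```

With c = 4 the plain estimator would see, on average, fewer than one exceedance in 10000 samples, while the tilted estimator produces a stable estimate from the same budget; in the paper the drift correction is derived from an approximate PDE solution rather than chosen by hand.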

Externally published preprints

A. Kyprianou, R.L. Loeffen, J.-L. Pérez, Optimal control with absolutely continuous strategies for spectrally negative Lévy processes, Preprint no. arXiv:1008.2363, Cornell University Library, arXiv.org, 2010. Abstract: In the last few years there has been renewed interest in the classical control problem of de Finetti for the case that the underlying source of randomness is a spectrally negative Lévy process. In particular, a significant step forward was made in an article of Loeffen, where it is shown that a natural and very general condition on the underlying Lévy process which allows one to proceed with the analysis of the associated Hamilton-Jacobi-Bellman equation is that its Lévy measure is absolutely continuous with a completely monotone density. In this paper we consider de Finetti's control problem, but now with the restriction that control strategies are absolutely continuous with respect to Lebesgue measure. This problem has been considered by Asmussen and Taksar, Jeanblanc and Shiryaev, and Boguslavskaya in the diffusive case, and by Gerber and Shiu for the case of a Cramér-Lundberg process with exponentially distributed jumps. We show the robustness of the condition that the underlying Lévy measure has a completely monotone density and establish an explicit optimal strategy for this case that envelops the aforementioned existing results. The explicit optimal strategy in question is the so-called refraction strategy.