[The objective of the talk is to show how stochastic modelling is routinely used by financial firms to evaluate risk. It also aims to provide a window on the role that mathematicians, physicists and computer scientists play in financial institutions, either as "quants" or as actuaries.]
Insurance companies work on the principle of taking on a large number of risks that individually have low probability but a financial impact too large to be borne comfortably by individuals. The standard framework in which these risks are analysed (and the price charged to customers is set) is the so-called "collective model". This is a stochastic model that analyses the frequency and the severity of losses separately and combines them, typically by Monte Carlo simulation, to obtain the distribution of total losses per policy and hence the expected loss.
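As a minimal sketch of the collective model described above: total annual losses are S = X_1 + ... + X_N, where the claim count N and the individual severities X_i are drawn from separate distributions. The Poisson/lognormal choice and all parameter values below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

# Collective model sketch (assumed distributions and parameters):
# claim count per policy-year N ~ Poisson(lam),
# claim severities X_i ~ Lognormal(mu, sigma),
# total loss S = X_1 + ... + X_N, estimated by Monte Carlo.
rng = np.random.default_rng(42)

lam = 0.1             # expected claims per policy per year (assumption)
mu, sigma = 8.0, 1.5  # lognormal severity parameters (assumption)
n_sims = 100_000      # number of simulated policy-years

counts = rng.poisson(lam, size=n_sims)
totals = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

expected_loss = totals.mean()  # pure premium, before expense and risk loadings
```

In practice the simulated distribution of `totals`, not just its mean, is what matters: quantiles of it drive capital requirements and reinsurance pricing.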
Reinsurance companies work on the same principle -- they aggregate risks that are too large to be borne by (sometimes small) insurance companies, or that would threaten their financial stability. The collective model is still valid, but a special branch of statistics (extreme value theory), originally developed to analyse extreme natural events such as flood and sea levels, is becoming the standard tool for the particular challenges posed by large losses.
The core of the talk will be the frequency/severity model for large losses, with special emphasis on the Pickands-Balkema-de Haan theorem, which states that for an essentially arbitrary ground-up distribution, the distribution of excesses above a sufficiently high threshold converges to a generalized Pareto distribution. This result will be presented in the practical context of pricing reinsurance policies.
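The peaks-over-threshold idea behind the theorem can be sketched as follows: take ground-up losses, choose a high threshold u, and fit a generalized Pareto distribution to the excesses above u. The lognormal loss model, the parameter values, and the 95th-percentile threshold below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-threshold sketch: by the Pickands-Balkema-de Haan theorem,
# excesses over a high threshold u are approximately generalized Pareto,
# whatever the ground-up distribution. Here the ground-up losses are
# simulated lognormals (an assumption for illustration only).
rng = np.random.default_rng(0)

losses = rng.lognormal(mean=8.0, sigma=1.5, size=50_000)
u = np.quantile(losses, 0.95)     # high threshold (choice is an assumption)
excesses = losses[losses > u] - u

# Fit a GPD to the excesses, with the location parameter fixed at 0.
shape, loc, scale = genpareto.fit(excesses, floc=0)
```

The fitted shape and scale parameters can then be used to extrapolate the tail beyond the observed data, which is exactly what a reinsurance pricing exercise for high layers requires.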