## MAXWELL INSTITUTE for MATHEMATICAL SCIENCES Actuarial and Financial Mathematics Seminars Abstracts

### Bonnie-Jeanne MacDonald: Defined contribution pension plans for all: what if?

This study gauges the risk inherent in defined contribution (DC) pension plans on both an individual and an aggregate basis, using United States data. Our aim is to gain insight into the consequences of a DC pension scheme becoming the predominant pillar of retirement income for an entire society. The primary source of retirement income must, by design, provide a pension sufficient to offer financial security to the elderly and to ease the transition from employment to retirement. Owing to the uncertainty in its accumulated wealth, a traditional DC pension plan with a fixed pension delivery date cannot meet this requirement. Therefore, rather than focus on the accumulated wealth at a specified retirement age, this study investigates the likely retirement age of DC participants who wish to maintain, from retirement until death, a fixed standard of living. Based on the simulated output of a DC flexible-retirement-age model, we identify optimal investment strategies. We then examine the demographic dynamics of an entire population of DC pension plan participants. The conclusions demonstrate the significant role the market plays in the success or failure of a DC pension scheme. There is a high level of uncertainty in the retirement age of each DC participant, regardless of his or her investment strategy. Furthermore, there are large retirement-age discrepancies between DC participants in different cohorts, despite their identical characteristics. We find that, even when we allow for a wide range of investment strategies amongst the members, the ratio of retirees to workers varies significantly over time. This suggests that countries dominated by DC schemes of this type may, over time, be exposed to significant risk in the size of their labour force. (The talk is based on a paper by Bonnie-Jeanne MacDonald and Andrew Cairns.)
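The flexible-retirement-age mechanism described above can be sketched in a few lines: simulate a career of contributions under random returns, and record the first age at which the fund can buy the target annuity. All figures below (contribution rate, annuity factor, return parameters) are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(42)

def retirement_age(salary=1.0, contrib_rate=0.10, target_ratio=0.67,
                   annuity_factor=15.0, start_age=25, max_age=80,
                   mu=0.05, sigma=0.18):
    """One DC career path: retire at the first age where the accumulated
    fund buys a life annuity paying `target_ratio` of salary."""
    wealth = 0.0
    target_fund = target_ratio * salary * annuity_factor
    for age in range(start_age, max_age):
        # lognormal annual return on the fund, then the year's contribution
        wealth *= np.exp(mu - 0.5 * sigma**2 + sigma * rng.standard_normal())
        wealth += contrib_rate * salary
        if wealth >= target_fund:
            return age + 1
    return max_age  # target never reached: forced retirement at the cap

ages = [retirement_age() for _ in range(2000)]
print(np.percentile(ages, [10, 50, 90]))  # the spread illustrates retirement-age risk
```

The wide inter-decile range of simulated retirement ages is precisely the individual-level uncertainty the abstract refers to.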

### Anne Bronstein: Sequential entry and exit decisions with an ergodic pathwise performance criterion

We consider a variant of an optimisation problem involving sequential entry and exit decisions that has emerged in the economics literature. The problem that we solve aims at maximising a performance criterion of an ergodic, or long-term average, nature, which is better suited to decision making within a sustainable economic environment. Our results include a complete characterisation of the optimal strategy, which can take qualitatively different forms depending on the problem's data, as well as explicit expressions for the maximal value of the associated performance index.

### Torsten Kleinow: Fair Valuation of Participating Insurance Policies with Interest Rate Guarantees

### Tom Fischer: Consumption Processes and Semilinear Projection Properties

In this talk, we consider consumption strategies for stochastic money accounts. We constructively prove the existence of discrete finite-time consumption processes fulfilling certain restrictions (e.g. the money account being always positive and exactly zero at the end) and pre-specified semilinear projection properties. One possible example is consumption rates that are a martingale under the mentioned restrictions. However, it is shown that any consumption strategy with restrictions as above possesses at least one corresponding semilinear projection property and could therefore be constructed from it. As an actuarial application we show how the introduced consumption techniques can be used to develop bonus strategies for with-profits policies.

### Pavel Grigoriev: No-Arbitrage in Illiquid Markets

Some problems of characterizing no-arbitrage in markets with frictions will be discussed. In particular, we concentrate on recent results for markets with two assets, due to Grigoriev and to Cetin and Grigoriev.

### Andrew Cairns: A generalisation of the Olivier-Smith model for stochastic mortality

The talk will begin by reviewing the background to stochastic mortality modelling and the original version of the Olivier-Smith model (OS). The OS model is equivalent to a no-arbitrage model for interest rates. The model takes as input a full term structure of mortality by cohort and term to maturity. It then imposes a term-structure of volatility to describe how survival probabilities evolve over time. It is a one factor model driven by a sequence of i.i.d. Gamma random variables with consequent restrictions on volatilities.
Using some rather crude statistical procedures, we will argue that the one-factor OS model does not fit historical England and Wales mortality data very well. Instead more than one factor is required to model dynamics and a greater degree of flexibility is required to capture the observed volatility in mortality rates at different ages.
We will discuss progress towards resolving these issues, with copulas playing a central role in a possible solution.

### Alex McNeil: Statistical inference for dependent credit events

Any portfolio credit risk model that is to be used to calculate a loss distribution associated with defaults and changes in rating must address the challenge of modelling dependent defaults and dependent rating migrations. Most industry models (such as KMV, CreditMetrics, CreditRisk+) incorporate mechanisms for modelling this dependence, generally by assuming conditional independence of defaults and migrations given common economic factors. However, the calibration of these mechanisms is often quite ad hoc, despite the fact that the tail of the portfolio loss distribution is extremely sensitive to small changes in the parameters governing dependence. We consider the problem of making formal statistical inference for such models based on historical default and rating migration data. In the solution we propose, portfolio credit models are represented as generalized linear mixed models (GLMMs) and inference is made using Markov chain Monte Carlo (MCMC) techniques. This general framework allows quite complex models where the random effects essentially play the role of unobserved latent factors influencing default and migration rates; to capture economic cycle effects the latent factors are allowed to have a dynamic time-series structure. An empirical study of Standard and Poors data shows strong evidence for economic cycles and also reveals pronounced sectoral heterogeneity in default and migration rates.
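The conditional-independence structure with a dynamic latent factor can be illustrated with a toy one-factor default model. The probit link, factor loading and AR(1) persistence below are illustrative assumptions, not the fitted GLMM of the talk:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n_obligors, n_years = 1000, 500
a, b, phi_ar = -2.054, 0.4, 0.5   # a ≈ Φ⁻¹(2%) baseline PD; loading; AR(1) persistence

Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))  # standard normal cdf

# latent systematic factor with economic-cycle (AR(1)) dynamics
F = np.zeros(n_years)
for t in range(1, n_years):
    F[t] = phi_ar * F[t - 1] + sqrt(1.0 - phi_ar**2) * rng.standard_normal()

# conditional independence: given F[t], the year-t defaults are i.i.d. Bernoulli
pd_t = Phi(a + b * F)
losses = rng.binomial(n_obligors, pd_t)

print(losses.mean(), losses.var())  # variance well above the independent-binomial level
```

The overdispersion of annual default counts relative to an independent binomial is exactly the footprint of the latent factor that the MCMC machinery is designed to infer.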

### Andrea Macrina: Information-based framework for asset price dynamics

A new framework for asset price dynamics is introduced where the concept of noisy information about future cashflows is used to derive the corresponding asset price processes. In particular, we price equity by modelling the dividend process, first for a single-dividend-paying asset, and then also for the multi-dividend case. The share price is given by the sum of the discounted conditional expectations of the dividends, where the conditional expectation is taken with respect to the noisy partial information associated with each impending cash flow. Dividend growth is taken into account by introducing additional structure on the dividend process. Various growth models can be considered, depending on the context. The prices of options on dividend-paying assets are derived and, remarkably, the form of the price process of a European-style call option is of Black-Scholes type. For gamma-distributed dividend payments a closed-form expression for the share-price process is obtained, and a semi-analytical formula is computed for a European call option. The framework has yet another interesting twist: it generates a natural family of stochastic volatility models without the need for specifying on an ad hoc basis the stochastic dynamics of the volatility. (Work carried out in collaboration with D. C. Brody, Imperial College London, and L. P. Hughston, King's College London.)

### Tim Johnson: The optimal timing of investment decisions

We consider the situation where a decision maker is able, at a cost, to initiate and then abandon a running payoff. The objective of the decision maker is to maximise the expected discounted cashflow of the system over an infinite time horizon. The underlying stochastic process driving the system is modelled by a general one-dimensional positive Itô diffusion, and the initialisation and abandonment costs, the discounting factor, and the running payoffs can all be functions of the diffusion. A set of sufficient conditions on the problem data is identified that admits an explicit analytic solution to the problem. This problem has a number of applications in finance and economics.
In the course of solving this problem we address the fundamental real options question, that of when to capitalise an asset, in a general setting. This is related to the pricing of perpetual American options. We also establish a range of results that can provide useful tools for developing the solution to other stochastic control problems.

### Zoran Vondracek: Ruin probabilities and decompositions for general perturbed risk processes

We study a general perturbed risk process with cumulative claims modelled by a subordinator with finite expectation, and the perturbation being a spectrally negative Lévy process with zero expectation. We derive a Pollaczek-Khinchine type formula for the survival probability of that risk process, and give an interpretation of the formula based on the decomposition of the dual risk process at modified ladder epochs.
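For orientation, the classical (unperturbed) compound Poisson case is governed by the Pollaczek-Khinchine formula, which the result above generalizes. With premium rate $c$, claim intensity $\lambda$, mean claim size $\mu$, loading condition $\rho = \lambda\mu/c < 1$ and integrated-tail distribution $F_I(x) = \mu^{-1}\int_0^x (1-F(y))\,dy$ (standard notation, not necessarily the speaker's), the ruin probability is the compound-geometric sum

$$\psi(u) = (1-\rho)\sum_{n=1}^{\infty} \rho^{n}\left(1 - F_I^{*n}(u)\right).$$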

### Antonis Papapantoleon: Lévy driven term structure models and cap-floor symmetries

One aim of this talk is to give an overview of term structure models (HJM and LIBOR) driven by time-inhomogeneous Lévy processes. Next, we present a relationship between caps and floors with the same time to maturity and "moneyness", in term structure models driven by time-inhomogeneous Lévy processes. (Based on joint work with Ernst Eberlein and Wolfgang Kluge.)

### Boualem Djehiche: Standard approaches to asset and liability risk

We compare two different models for the assets and liabilities of an insurance company that can be considered in the standard approach to solvency assessment and, in particular, in determining the required target capital. The first model is suggested by a joint working party of members of the CEA, Comité Européen des Assurances, and is based on the duration concept; the second is based on ideas from Arbitrage Pricing Theory (APT).

### Boualem Djehiche: A finite time optimal starting and stopping problem to sustain profitability of producing a commodity

We address the issue of finding a strategy to sustain the structural profitability of producing a commodity, e.g. a power plant producing non-storable energy such as electricity. The power plant continues to produce electricity until its profitability reaches a critically low level, at which point production is suspended; production restarts when it again becomes profitable, depending on the market price. If, however, structural non-profitability persists for some time, the power plant faces the risk of default and permanent closure.
We suggest a general probabilistic set-up to model profitability as a function of the market price of the commodity, and find the related optimal strategy to sustain it, under the constraint that the power plant faces the risk of defaulting when non-profitable over a fixed finite time interval.

### Mark Owen: Optimal investment with an unbounded random bequest

Consider the problem of maximizing utility of terminal wealth for a financial agent who will receive an unbounded random bequest. We assume a utility function which supports both positive and negative wealth. We prove the existence of an optimal trading strategy within a class of permissible strategies - those strategies whose wealth process is a supermartingale under all pricing measures with finite relative entropy.
We also investigate the concept of a marginal utility-based price (MUBP), and show that a price process is a MUBP if and only if it is a local martingale under the optimal measure for the utility maximizing investor.

### Peter Friz: Regular variation and smile asymptotics

We consider risk-neutral returns and give an explicit and novel formula that relates their tail asymptotics to the asymptotics of the implied volatility smile. The theory of regular variation provides the (ideal) mathematical framework to formulate and prove such results. The practical value of our formulae comes from the vast literature on tail asymptotics, and our conditions are often seen to be true by simple inspection of known results. In cases with known moment generating function (but unknown tail asymptotics) we can play some Tauberian tricks and still apply our formula. This is joint work with Shalom Benaim (Cambridge).

### Piet de Jong: Extending Lee-Carter demographic forecasting

This seminar discusses different approaches to long term demographic forecasting. Demographic forecasts are immensely important to society. Yet existing methods are simplistic and often suffer from shortcomings. One established method is the "Lee-Carter" method. The seminar considers this method, and points out its shortcomings. The method is cast into "state-space" form and is used as a springboard to more sophisticated approaches based on modern time series methods. These approaches include building in smoothness via regression, "functional" methods, double spline methods, methods based on the Wang transform, and "joint" mortality modelling across countries and different groups. An attempt is made to evaluate the different approaches, and arrive at some consensus regarding the advantages and disadvantages of different methods. Approaches are illustrated using different mortality data sets.

### Xunyu Zhou: Continuous-time behavioral portfolio selection: models, theory, and examples

This paper formulates and studies a general continuous-time behavioral portfolio selection model under Kahneman and Tversky's (cumulative) prospect theory, featuring S-shaped utility (value) functions and probability distortions. Unlike the conventional expected utility maximization model, such a behavioral model could be easily mis-formulated (a.k.a. ill-posed) if its different components do not coordinate well with each other. Certain classes of an ill-posed model are identified. A new, systematic approach, which is fundamentally different from the ones employed in the utility model, is developed to solve a well-posed model, assuming a complete market and general Itô processes for asset prices. The optimal terminal wealth positions, derived in fairly explicit forms, possess a surprisingly simple structure: they resemble the payoff of a portfolio of two binary options written on the state price density. An example with a two-piece CRRA utility is presented to illustrate the general results obtained, and is solved completely for all specifications of the parameters. The effect of the behavioral criterion on the risky allocations is finally discussed.

### Hailiang Yang: Ruin probability under an insurance risk model with investment income

In this talk, I will present an insurance risk model. The ruin probability and absolute ruin probability will be considered. In some special cases, closed form solutions are obtained. (This talk is based on some joint work with Hans Gerber).

### Luciano Campi: Insider trading in an equilibrium model with default (joint work with U. Cetin, LSE)

We study an equilibrium model for the pricing of a defaultable zero-coupon bond in the framework of Back (1992). The market consists of an informed agent, noise traders, and a market maker who sets the price using the total order. When the insider does not trade, the default time is modelled as a totally inaccessible stopping time for the market maker, as in reduced-form credit risk models. We find the equilibrium pricing rule for the market maker and show that in equilibrium the total order behaves like a Bessel bridge from the insider's viewpoint, but the insider's trades cannot be detected by the market. We also prove that in equilibrium the default time becomes predictable for the market maker.

### Christian Ewald: Malliavin differentiability of the Heston Volatility and an extension of the Hull and White pricing formula

The talk is based on results obtained jointly with Elisa Alos, UPF Barcelona. We show that the Heston volatility, or equivalently the Cox-Ingersoll-Ross process, is Malliavin differentiable and give an explicit expression for the derivative. This result ensures the applicability of Malliavin calculus in the framework of the Heston stochastic volatility model and the Cox-Ingersoll-Ross model for interest rates. Furthermore we derive conditions on the parameters which guarantee the existence of the second Malliavin derivative of the Heston volatility. We apply this result in order to obtain an extension of the classical Hull and White formula to the Heston model with correlation and derive an approximate option pricing formula.

### Ales Cerny: Martingale Properties of Good-Deal Price Bounds

The talk explores computation of so-called good-deal price bounds and their relationship to certain martingale measures and certain hedging problems. We provide a unifying framework for the HARA class of expected utility preferences which includes among others quadratic, logarithmic and exponential utility functions. We further explore links with indifference pricing, q-optimal measures and f-divergences.

### Timothy C Johnson: The optimal timing of investment decisions: updated

We investigate the solution of an ordinary differential equation and discuss its relevance to optimal stopping and singular control problems. The ODE $$\frac{1}{2}\sigma^2(x)w''(x)+ b(x)w'(x) - r(x)w(x) = 0$$ plays a fundamental role in the solution of stopping problems and singular control problems in stochastic control. We shall discuss some of the properties of its solution and of the associated non-homogeneous equation. We shall use these properties to address stopping problems and entry and exit problems.
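For a concrete instance, take a geometric Brownian motion, $\sigma(x)=\sigma x$, $b(x)=bx$, with constant discounting $r(x)=r$; then $w(x)=x^{n}$ solves the ODE whenever $n$ solves the quadratic $\tfrac{1}{2}\sigma^2 n(n-1) + bn - r = 0$. A quick numerical check (parameter values are illustrative, not from the talk):

```python
import numpy as np

sigma, b, r = 0.3, 0.05, 0.1   # illustrative GBM coefficients and discount rate

# substituting w(x) = x^n gives (1/2)σ² n² + (b − (1/2)σ²) n − r = 0
n1, n2 = np.roots([0.5 * sigma**2, b - 0.5 * sigma**2, -r])

def ode_residual(n, x, h=1e-5):
    """Finite-difference check that w(x) = x^n satisfies the ODE."""
    w = lambda y: y**n
    w2 = (w(x + h) - 2 * w(x) + w(x - h)) / h**2
    w1 = (w(x + h) - w(x - h)) / (2 * h)
    return 0.5 * sigma**2 * x**2 * w2 + b * x * w1 - r * w(x)

print(n1, n2, ode_residual(n1, 2.0), ode_residual(n2, 2.0))  # residuals ≈ 0
```

The two roots (one positive, one negative) give the increasing and decreasing fundamental solutions from which value functions of stopping and entry-exit problems are typically assembled.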

### Bugar Gashi: New approach to optimal investment and differentiable trading strategies

An integral form of the expected utility from terminal wealth is used as a guide in proposing a new risk-return objective functional. In such a criterion, the risk-aversion is achieved via a wealth-dependent quadratic penalty on the fractions of invested wealth. In addition to retaining the properties of the expected utility from terminal wealth, this approach also offers the investor the means for selecting the measure of return, the measure of risk, and richer risk-aversion preferences. The solution to the corresponding portfolio control problem is carried out via dynamic programming, and an explicit closed-form solution is presented for an important example. An implicit approach to the problem of transaction costs is to constrain the trading strategy to be differentiable and thus of finite variation. By also using criteria that penalize the rate of change of the trading strategy, a significant reduction of the eventual transaction cost is achieved. Simulation results for the pseudo-log-optimal portfolio illustrate this method.

### Dirk Tasche: Capital allocation with kernel estimators

Capital allocation, in the context of credit portfolio risk, is often understood as the determination of the value-at-risk (VaR) of the loss distribution and a risk-sensitive break-down of VaR across the parts of the portfolio. When the loss distribution is inferred from a Monte-Carlo simulation sample, the break-down of VaR requires estimating the expectations of loan losses conditional on portfolio-wide losses. We discuss the question of how kernel estimation methods have to be adapted for this purpose.
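A minimal sketch of the kind of estimator in question: a Nadaraya-Watson (Gaussian-kernel) estimate of expected loan losses conditional on the portfolio loss equalling VaR, on a simulated one-factor portfolio. The portfolio model and bandwidth rule are illustrative assumptions, not those of the talk:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, n_loans = 50_000, 5

# hypothetical Monte-Carlo sample: loan losses driven by one common factor
factor = rng.standard_normal(n_sims)
loan_losses = np.maximum(0.0, factor[:, None] + rng.standard_normal((n_sims, n_loans)))
total = loan_losses.sum(axis=1)

var_99 = np.quantile(total, 0.99)

# Nadaraya-Watson kernel estimate of E[loan_i loss | total loss = VaR]
bandwidth = 1.06 * total.std() * n_sims ** (-0.2)   # Silverman's rule of thumb
w = np.exp(-0.5 * ((total - var_99) / bandwidth) ** 2)
contributions = (w[:, None] * loan_losses).sum(axis=0) / w.sum()

print(var_99, contributions)
```

Since the conditional expectations are additive, the estimated contributions sum back to the VaR level up to smoothing bias, which is the property that makes the break-down risk-sensitive.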

### Alfred Mueller: Modelling and comparing risks

In recent years there has been increasing interest in better models for dependent risks, and in studying the effect of dependence on the riskiness of portfolios. In this talk it will be demonstrated how copulas and stochastic orderings can be used in this context. Moreover, these static results will be extended to a dynamic stochastic-process context, showing some recent findings on dependence properties of Lévy copulas.

### Joern Sass: Optimal portfolio policies under transaction costs

In a Black-Scholes market - consisting of one stock whose prices evolve like a geometric Brownian motion and one risk free asset - an investor wants to maximize the asymptotic growth rate of his wealth (portfolio value). Without transaction costs the optimal policy would be given by the constant Merton fraction, which is the fraction of the wealth to be invested in the stock. Facing transaction costs it is no longer adequate to keep the risky fraction constant. We consider a combination of fixed (proportional to wealth) and proportional costs which punish the trading frequency as well as the magnitude of the transactions. Then an optimal trading strategy will consist of a sequence of stopping times and the optimal transactions at those times. So we have to deal with impulse control strategies which can be described as solutions of quasi-variational inequalities. Motivated by various structural results we first look at a restricted class of trading strategies which can be described by four parameters, two for the stopping boundaries and two for the new risky fractions. In this class the problem can be simplified by renewal arguments to one period between two trading times, where we have to weight the new risky fractions by their invariant distribution. This yields an explicit functional that only has to be maximized over these four parameters. So the computation of the best strategy in this class is very simple. Then we use the corresponding quasi-variational inequalities to prove that an optimal solution exists and that it can be found in this class. Solutions for short selling and borrowing can be given.

### Arne Lokka: On variations of De Finetti's optimal dividend problem

The surplus or book value of an insurance company is a random process. It can be described by the Cramer-Lundberg model or by a diffusion process. According to De Finetti, it is at some point optimal to pay dividends to the shareholders. However, doing so often implies that the book value will at some point be non-positive with probability one. In my talk, I will focus on the case where the risk process is modelled by a Brownian motion, and discuss how the issuance of equity or bail-out loans from benefactors influences the optimal dividend strategy and the value of the insurance company. I will discuss various cases of costs associated with the issuance of equity, and indicate possible extensions to the case of risk processes modelled by general diffusions or Lévy processes.

### Johanna Neslehova: Extremes, dependence modelling and operational risk

Due to the new regulatory guidelines known as Basel II for banking and Solvency 2 for insurance, the financial industry is looking for qualitative approaches to and quantitative models for operational risk. This talk gives an overview of the Basel II requirements for quantitative modeling of operational risk and discusses several possible approaches. Special focus is laid on the advanced measurement approach and the calculation of the operational-risk capital charge. We also raise several issues concerning diversification effects and overall quantitative risk management consequences of extremely heavy-tailed data.

### Alexander McNeil: Statistical inference for dependent defaults and credit migrations

Any portfolio credit risk model that is to be used to calculate a loss distribution associated with defaults and changes in rating must address the challenge of modeling dependent defaults and dependent rating migrations. Most industry models (such as KMV, CreditMetrics, CreditRisk+) incorporate mechanisms for modeling this dependence, generally by assuming conditional independence of defaults and migrations given common economic factors. However, the calibration of these mechanisms is often quite ad hoc, despite the fact that the tail of the portfolio loss distribution is extremely sensitive to small changes in the parameters governing dependence.

We consider the problem of making formal statistical inference for such models based on historical default and rating migration data. In the solution we propose, portfolio credit models are represented as generalized linear mixed models (GLMMs) and inference is made using Markov chain Monte Carlo (MCMC) techniques. This general framework allows quite complex models where the random effects essentially play the role of unobserved latent factors influencing default and migration rates; to capture economic cycle effects the latent factors are allowed to have a dynamic time-series structure. An empirical study of Standard and Poors data shows strong evidence for economic cycles and also reveals pronounced sectoral heterogeneity in default and migration rates.

### Gordon Woo: Catastrophe modelling of pandemic excess mortality risk

Over the past two decades, catastrophe risk modelling has expanded to cover a wide range of natural and man-made hazards. Underlying all catastrophe risk modelling is the realization that historical experience, however extensive, provides only a limited window on future disaster outcomes. In respect of influenza pandemics, 1918 is not the worst-case scenario. Concerned over the number of human H5N1 cases, life and health insurers worldwide have sought a quantitative risk assessment of their pandemic loss potential. RMS has developed a catastrophe pandemic risk model that is constructed using a large stochastic set of pandemic scenarios. This model will be described and compared with some actuarial approaches.

### Enrico Biffis: Optimal retention levels in dynamic reinsurance markets

We consider the problem of determining optimal retention levels for insurers willing to mitigate their risk exposure by purchasing proportional reinsurance. We revisit De Finetti's classical results in continuous-time and allow retention levels to change dynamically in response to claims experience and market performance. We also take up some ideas from dynamic reinsurance markets to intertwine De Finetti's work and Markowitz's mean-variance portfolio theory.

### David Wilkie: Some aspects of random number generation

Many actuaries and financial economists use Monte Carlo simulation methods, for which of course they require a random number generator. Many would use what is supplied within the computer system available to them. But there is a lot to random number generation. It is of interest in itself to see how it may be done, and if you understand the principles, you may be able to use your own system, which you can control much better than whatever is provided. I shall share some of my experiences with you.
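As a toy illustration of the principles involved, here is a minimal linear congruential generator (using well-known constants from Numerical Recipes); it is a teaching sketch of how such a generator can be built and controlled, not a recommendation for production use:

```python
class LCG:
    """Minimal linear congruential generator: x_{n+1} = (a x_n + c) mod 2^32."""

    def __init__(self, seed=12345):
        self.state = seed & 0xFFFFFFFF

    def next_u32(self):
        # Numerical Recipes constants give full period 2^32
        self.state = (1664525 * self.state + 1013904223) & 0xFFFFFFFF
        return self.state

    def uniform(self):
        """A draw in [0, 1)."""
        return self.next_u32() / 2**32

rng = LCG(seed=2024)
sample = [rng.uniform() for _ in range(10_000)]
print(sum(sample) / len(sample))  # should be close to 0.5
```

Owning the generator in this way makes simulations exactly reproducible across machines, which is one of the practical benefits the talk alludes to; the statistical weaknesses of plain LCGs are also part of the story.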

### Laura Ballotta: A Dirichlet bridge sampling of the variance gamma process: pricing path-dependent options and participating life insurance contracts

A new method for sampling a Variance Gamma (VG) process, called Dirichlet bridge sampling, is proposed and is shown to represent a generalization of the known gamma bridge sampling method. Dirichlet bridge sampling allows immediate generation of the entire trajectory of the process over a certain period of time, avoiding sequential sampling at arbitrary intermediate points in time, as is instead the case with the gamma bridge method. We explore the efficiency of the proposed simulation methodology and apply some variance reduction techniques such as stratification. The proposed method is then tested against approaches already existing in the literature, such as sequential sampling and bridge sampling of the VG process. In particular, we price path-dependent options such as Asian options, lookback options and barrier options. In addition, we use the proposed technique in order to calculate the market-consistent value of some participating life insurance contracts. This application is particularly important for the insurance industry, following the introduction of market-based accounting standards and stricter capital requirements in accordance with the EU Solvency II project.

### Andreas Tsanakas: To split or not to split? Capital allocation with convex risk measures

Capital allocation when aggregate requirements are given by coherent risk measures has been exhaustively studied. Approaches based on marginal costs yield allocations that provide no incentives to split the portfolio, which is consistent with the subadditivity property of the risk measure. This presentation discusses the capital allocation problem with convex risk measures, which relax the positive homogeneity/subadditivity property of coherent ones. In that context, aggregation penalties are applied and there may be a legitimate case for splitting a portfolio. It is shown that the convexity property has very strong implications in a capital allocation context, since it implies ad infinitum splitting of portfolios, if such splitting can take place at no additional cost. Finally, in a modest attempt to inject some realism into the model, constraints are imposed on possible portfolio splits and appropriate solutions are sought in the theory of coalitional games.

### Carol Alexander: Model-free hedge ratios and scale-invariant models

A price process is scale-invariant if and only if the returns distribution is independent of the price measurement scale. We show that most stochastic processes used for pricing options on financial assets have this property and that many models not previously recognised as scale-invariant are indeed so. We also prove that price hedge ratios for a wide class of contingent claims under a wide class of pricing models are model-free. In particular, previous results on model-free price hedge ratios of vanilla options based on scale-invariant models are extended to any contingent claim with homogeneous pay-off, including complex, path-dependent options. However, model-free hedge ratios only have the minimum variance property in scale-invariant stochastic volatility models when price-volatility correlation is zero. In other stochastic volatility models and in scale-invariant local volatility models, model-free hedge ratios are not minimum variance ratios and our empirical results demonstrate that they are less efficient than minimum variance hedge ratios.
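One standard way to see why hedge ratios can be model-free for homogeneous claims (an Euler-theorem argument consistent with, though not taken from, the talk): if the claim price $f(S,K)$ is homogeneous of degree one in spot and strike, then

$$f(\lambda S,\lambda K)=\lambda f(S,K) \;\Longrightarrow\; S\frac{\partial f}{\partial S}+K\frac{\partial f}{\partial K}=f \;\Longrightarrow\; \Delta=\frac{\partial f}{\partial S}=\frac{f-K\,\partial f/\partial K}{S},$$

so the delta can be read off from observed prices across strikes without committing to a particular model.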

### Jacques Pezier: Global portfolio optimization revisited - a least discrimination alternative to Black-Litterman

Global portfolio optimization models rank among the proudest achievements of modern finance theory, but for years practitioners have struggled to put them to work. In 1992, Black and Litterman put the problem down to the difficulty portfolio managers have in expanding views about some expected asset returns into full probabilistic forecasts about all asset returns. They proposed a method to moderate personal forecasts so that the ensuing optimal portfolio does not depart too much from a chosen reference portfolio. But we find that their method lacks a sound rationale, requires ad hoc inputs, and is limited in scope. We propose a more general method based on a least discrimination principle. It produces a probabilistic forecast that is true to personal views but is otherwise as close as possible, in an expected utility sense, to the forecast implied by the reference portfolio. The least discrimination method accommodates a variety of views, including views on volatility and correlation, and produces the corresponding optimal portfolios, including options when appropriate. It also justifies a simple linear interpolation between market and personal forecasts should a compromise be reached.

### Mogens Steffensen: From life insurance to credit risk: on optimal decisions in a multi-state model

We present two situations where optimal decisions are to be made in a multi-state model: (a) decisions on protection against risk in a life insurance context, including the risk of losing income due to disability or unemployment; (b) portfolio optimization with credit-risky assets, where the credit risk is modelled by a (conditional) Markov chain. The two situations seem very different, but they turn out to be closely related through the mathematics it takes to solve the optimization problems.

### Arnold Shapiro: Modeling fuzzy random variables

There are two important sources of uncertainty: randomness and fuzziness. Randomness models the stochastic variability of all possible outcomes of a situation and fuzziness relates to the unsharp boundaries of the parameters of the model. In this sense, randomness is largely an instrument of a normative analysis that focuses on the future, while fuzziness is more an instrument of a descriptive analysis reflecting the past and its implications. Clearly, randomness and fuzziness are complementary, and so a natural question is how fuzzy variables could interact with random variables. This presentation focuses on one important dimension of this issue, fuzzy random variables (FRVs). The goal is to model these FRVs and, in doing so, to illustrate how naturally compatible and complementary randomness and fuzziness are.

### Ashay Kadam: Issuer Heterogeneity in Credit Ratings Migration

Sources of heterogeneity in rating migration behavior are explored using a continuous-time Markov chain framework. Working in continuous time circumvents the embedding problem, mitigates the censoring effect and facilitates term-structure modelling with arbitrary prediction horizons. Classical estimation provides ample evidence of heterogeneity; moreover, adopting a Bayesian estimation procedure can help mitigate the problems arising from data sparsity and reduce estimation error. The transition probability matrices estimated for different issuer profiles can be quite different from each other. Using the CreditRisk+ framework and a sample credit portfolio, it can be shown that ignoring heterogeneity may give erroneous estimates of VaR and a misleading picture of the risk capital.
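
In a continuous-time framework of this kind, the transition probability matrix over horizon t is the matrix exponential of a generator, P(t) = exp(tQ), so any prediction horizon is available from a single estimated Q. A minimal sketch with a hypothetical three-state generator (illustrative numbers, not estimates from the paper):

```python
def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(q, terms=60):
    """exp(Q) via a truncated Taylor series (adequate for small generators)."""
    n = len(q)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, q)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

# Hypothetical generator for states (investment grade, speculative, default):
# off-diagonal entries are migration intensities, each row sums to zero.
Q = [[-0.11,  0.10, 0.01],
     [ 0.05, -0.15, 0.10],
     [ 0.00,  0.00, 0.00]]   # default is absorbing
P1 = mat_exp(Q)              # one-year transition probability matrix
```

Each row of P1 is a proper probability distribution, and replacing Q with t·Q gives the matrix for an arbitrary horizon t.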

### Andrea Macrina: Information, Inflation, and Interest

A class of stochastic models for the pricing of inflation-linked assets is proposed. The nominal and the real pricing kernels, in terms of which the consumer price index can be expressed, are modelled by introducing a bivariate utility function depending on (a) aggregate consumption, and (b) the aggregate real liquidity benefit conferred by the money supply. Consumption and money supply policies are chosen such that the expected joint utility obtained over a specified time horizon is maximised, subject to a budget constraint that takes into account the “value” of the liquidity benefit associated with the money supply. For any choice of the bivariate utility function, the resulting model determines a relation between the rate of consumption, the price level, and the money supply. The model also produces explicit expressions for the real and nominal pricing kernels, and hence establishes a basis for the arbitrage-free valuation of inflation-linked securities. In conclusion I will also make some remarks about the modelling of interest rates and inflation in an information-based setting. (Work in collaboration with L. P. Hughston, King's College London).

### Rainer Schulz: Renting versus Owning and the Role of Income Risk: The Case of Germany

For most households, choosing whether to rent or buy a home is a difficult, multifaceted problem. Not only do households have to grapple with the uncertainties of future movements of rents and house prices and the substantial cost of changing residence; housing tenure decisions are further complicated if households' exposure to labour income risk varies across occupations, industries and regions, since potential correlations with these background risks may influence the rent-or-buy decision. In this study, we present preliminary empirical evidence, derived from the German Socio-Economic Panel (GSOEP), that both labour income growth and rent growth vary across industries and regions. We find that income-rent correlations have a statistically significant influence on industry-specific average rental shares in West German federal states. However, the economic significance of the relationship between real rent growth and real income growth for the decision to rent or own is rather low: a one-standard-deviation increase in the income-rent correlation implies an increase in rental shares of about 1.75 percentage points. (Work in collaboration with Martin Wersing and Axel Werwatz).

### Andreas Milidonis: Estimation of Distress Costs Associated with Downgrades Using Regime Switching Models

We use a unique dataset of bond downgrades from a niche rating company that has been found to be reacting faster to publicly available information than its competitors. Using regime-switching models we propose risk measures to quantify stock return disturbances (distress costs) associated with the timing of downgrades. These risk measures are based on the Capital Asset Pricing Model (CAPM) and use the estimated parameters of the regime-switching models in a method that resembles a dynamic event study. We observe a noticeable switch from a low-volatility to a high-volatility regime one day before the day of downgrades. On average the volatility in stock returns triples around the time of downgrades and the stock return process remains in the high-volatility regime for about three days. Using our proposed risk measure we find that stock returns are associated with distress costs of about twenty-two*d percent (where “d” is the daily market price of risk) over a window of ten days before and after downgrades. These costs can be further separated between bond rating companies that are designated by the SEC as nationally recognized to rate debt and those which are not. (Work in collaboration with Shaun Wang, Georgia State University)

### Anton Pelsser: Approximate solutions for indifference pricing under general utility functions

With the aid of Taylor-based approximations, this paper presents results for pricing insurance contracts by using indifference pricing under general utility functions. We discuss the connection between the resulting “theoretical” indifference prices and the pricing rule-of-thumb that practitioners use: Best Estimate plus a “Market Value Margin”. Furthermore, we compare our approximations with known analytical results for exponential and power utility.

### William Shaw: Quantile mechanics and dependency

Recent work on copula theory has reinvigorated the use of quantile functions for the Monte Carlo simulation of marginal distributions. The first half of this talk discusses quantile functions as solutions of certain non-linear ordinary and partial differential equations. The PDE representation leads us to a natural generalization to a collection of multivariate distributions in which quite exotic combinations of marginal distributions are coupled together in a natural way. (Joint work with G. Steinbrecher)
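
The first-order ODE in question follows directly from Q(F(x)) = x: differentiating gives dQ/du = 1/f(Q(u)), where f is the density. A minimal numerical sketch for the standard normal quantile (illustrative code, not the authors' implementation), integrating from the median:

```python
import math

def normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def normal_quantile(p, steps=2000):
    """Solve dQ/du = 1/f(Q) from (u, Q) = (0.5, 0) to u = p with RK4."""
    u, q = 0.5, 0.0
    h = (p - 0.5) / steps
    for _ in range(steps):
        k1 = 1.0 / normal_pdf(q)
        k2 = 1.0 / normal_pdf(q + 0.5 * h * k1)
        k3 = 1.0 / normal_pdf(q + 0.5 * h * k2)
        k4 = 1.0 / normal_pdf(q + h * k3)
        q += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        u += h
    return q

z = normal_quantile(0.975)   # about 1.95996
```

The same recipe applies to any distribution whose density is computable, which is what makes the ODE view useful for Monte Carlo sampling of exotic marginals.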

### Adam Butterworth: Family history as a risk factor for common, complex disease

Under the moratorium on the use of genetic tests in insurance underwriting, greater emphasis is placed on family history information in predicting risk. We conducted literature-based systematic reviews and meta-analyses on 7 common diseases (colorectal cancer, breast cancer, lung cancer, prostate cancer, ovarian cancer, stroke and multiple sclerosis) to estimate pooled relative risks. To allow individual risk prediction, we used life-table methods based on population disease and mortality data to convert relative risks to absolute risks for different patterns of family history. I will present data from the different diseases with comments on the availability and reliability of population data, validity of methods and conclusions that can be drawn from our risk estimation efforts.
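
The conversion from relative to absolute risk can be sketched very simply (hypothetical incidence rates and relative risk; the study's actual life-table calculation also accounts for competing mortality):

```python
def absolute_risk(baseline_hazards, relative_risk):
    """Convert a relative risk into a cumulative absolute risk.

    baseline_hazards: annual disease incidence rates for successive ages.
    Ignores competing mortality, so this is only an upper-bound sketch of
    the life-table method described in the talk.
    """
    surv = 1.0
    for h in baseline_hazards:
        surv *= 1.0 - relative_risk * h
    return 1.0 - surv

# Hypothetical incidence of 0.2% per year for ages 50-79, and a pooled
# relative risk of 2 for a positive family history.
baseline = [0.002] * 30
risk_no_fh = absolute_risk(baseline, 1.0)   # roughly 6%
risk_fh = absolute_risk(baseline, 2.0)      # roughly 11%
```

Doubling the relative risk roughly doubles the 30-year absolute risk here only because the hazards are small; with larger hazards the relationship is visibly sub-multiplicative.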

### Ermanno Pitacco: Modelling Disability. Applications to Sickness and Accident Insurance

Starting from a rather general description of the “disability process” (that is, the development through time of an individual occurrence of disability), we show that a reasonable approximation to the related probabilistic structure leads to the multistate model. In actuarial practice, the multistate model is used, for example, for pricing and reserving in Income Protection and Long Term Care business. Conversely, calculations for other insurance products in the “health” area are commonly based on simpler (and often less rigorous) methods. We first show that, as the features of the multistate model allow for several disability degrees, a rigorous modelling for Personal Accident Insurance can be obtained; in this context, risk factors (and hence rating factors) can be represented by an appropriate choice of the transition intensities. Secondly, as the multistate model provides a sound framework for interpreting practical calculation methods used in the health insurance area, we discuss some pricing and reserving formulae for Personal Accident Insurance and Sickness Insurance.

### Rüdiger Frey: Pricing and hedging credit derivatives via nonlinear filtering

We start with a brief introduction to portfolio credit risk modelling. In the main part of the talk a new information-based approach for modelling the dynamic evolution of a portfolio of credit risky securities is proposed. In this context market prices of liquidly traded derivatives are given by the solution of a nonlinear filtering problem. This problem is solved via the innovations approach to nonlinear filtering. Moreover, we derive the ensuing asset price dynamics and compute risk-minimizing hedging strategies. We conclude with some (preliminary) numerical results.

### Andrew Cairns: Quantitative and qualitative comparison of stochastic mortality models

Longevity risk (the risk that future mortality rates are lower than anticipated) has, in recent years, become a focus of attention in the insurance and pensions industry. Alongside this a number of new models have been developed to describe the stochastic evolution of mortality rates over time. In this extended seminar, we will review a number of these models, ranging from the simple Lee-Carter model to more complex multifactor models incorporating a cohort effect. As a starting point, one might use a quantitative criterion such as the BIC to identify which models are the best. However, this on its own reveals only part of the picture. In the talk we will discuss a variety of additional criteria that can be used to give a much clearer picture of the merits of each model. Amongst these, criteria that relate to the robustness of a particular model give a clear indication that one model with a high BIC is unreliable.
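
For reference, the BIC mentioned here is typically computed as a penalised log-likelihood, in the convention where a higher value indicates a better model (a standard formula, not specific to the talk):

```latex
\mathrm{BIC} = \ell(\hat{\theta}) - \tfrac{1}{2}\, k \ln n
```

where ℓ(θ̂) is the maximised log-likelihood, k the number of parameters and n the number of observations; the penalty term favours parsimonious models, which is why multifactor cohort models must earn their extra parameters.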

### Mark Owen: Utility maximisation with transaction costs

My talk will be about optimal investment in a model of currency trading with transaction costs. The model is general enough to allow a discontinuous bid-offer spread. The investor wishes to maximise their "direct" utility of consumption, which is measured in terms of consumption assets linked to some (but not necessarily all) of the traded currencies. The analysis will centre on two conditions under which the straightforward existence of a dual minimiser leads to the existence of an optimal terminal wealth. The first condition is a well-known growth condition on the dual function. The second, weaker and more natural, condition is that of "asymptotic satiability" of the value function.

### David Dickson: Some explicit solutions for the joint density of the time of ruin and the deficit at ruin

I will start with a review of results in Dickson (ASTIN Bulletin, 2008) about the joint density of the time of ruin and deficit at ruin in the classical risk model. I will then show how these ideas can be applied to the Erlang(2) risk model. Some interesting contrasts arise between these two models, particularly in the special case when the initial surplus is 0. An exact solution for the joint density will be presented in the case of Erlang(2) claims, and some computational issues will be discussed.
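
For orientation, in the classical model with Poisson claim arrivals at rate λ, premium rate c and exponential claims with mean μ, the ruin probability itself is explicit (a standard textbook result, quoted here for context rather than taken from the paper):

```latex
\psi(u) = \frac{\lambda \mu}{c}\,
          \exp\!\left(-\frac{(c - \lambda\mu)\, u}{c\,\mu}\right),
\qquad c > \lambda\mu,
```

and the special case u = 0 gives ψ(0) = λμ/c irrespective of the claim size distribution, which is one reason the zero-initial-surplus case is singled out in results of this kind.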

### Carole Bernard: Structured investment products and the retail investor

Structured products are popular with retail investors. Many of these products provide a guaranteed return combined with some participation in the performance of the equity market. These contracts often have complex (path-dependent) designs. We explain why expected utility maximizing investors should prefer simpler (European) contracts. However, if consumers overweight the probability of getting the maximum possible return, they may prefer the more complex contracts. We explore this explanation and provide evidence that sellers encourage this type of overweighting through the projections they select in the prospectus documents.

### Dmitriy Kim: Ruin Probability for a process with switching

We consider the ruin probability for a Markov process with one level of switching between two independent Lévy processes, one of which is spectrally negative while the other is a compound Poisson process with drift. The real line is partitioned into two sets, and a probability distribution is associated with each set: when the Markov process takes a value in one of these sets, its next increment has the distribution associated with that set. Explicit representations for the ruin probability are found in terms of ladder heights. As a consequence, results are obtained for a risk process in which the premium rate and the claim size depend on whether the current reserve is above or below a certain threshold.

### Ralf Korn: Worst-case portfolio optimization with applications in finance and insurance

We consider the problem of optimal investment under the threat of a crash of uncertain height, where we do not know if and when the crash will happen. For this, a new stock price model is introduced and the approach of worst-case portfolio optimization is developed. The computed optimal portfolio strategies show much more realistic behaviour than those obtained in the standard Merton setting. Applications to portfolio problems in both financial and actuarial models are given.

### Filip Lindskog: Heavy tail analysis for stochastic processes and ruin probabilities under optimal investments

The talk may be divided into three parts. The first two parts introduce ideas and methods that appear in the analysis of rare events for heavy-tailed stochastic processes; in the last part, these ideas are used in an analysis of a ruin problem. Part I, on heavy tails, introduces and explains the concept of regular variation, beginning with the Pareto distribution; the aim is to show that it is natural and very useful to look at regular variation in a general setting, via a weak convergence approach. Part II, on extremes for stochastic processes, focuses on understanding the extremal behaviour of stochastic integral processes driven by heavy-tailed noise; this illustrates the ideas discussed in Part I and will be important in the analysis of Part III, on ruin probabilities under optimal investments. There, explicit results for the asymptotics of ruin probabilities are found without strong distributional assumptions on the claim sizes or on the processes representing the investment possibilities, one aim being to illustrate the usefulness of the approach discussed in Parts I and II.
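
As a pointer for Part I: a positive random variable X has a regularly varying tail with index α > 0 if (standard definition, included here for orientation)

```latex
\lim_{t \to \infty} \frac{P(X > tx)}{P(X > t)} = x^{-\alpha},
\qquad x > 0,
```

which the Pareto distribution, with P(X > x) = (x/x_m)^{-α} for x ≥ x_m, satisfies exactly; the weak-convergence formulation generalises this limit to measures on spaces of paths.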

### Rutang Thanawalla: Measuring Liquidity in the CDS Market

Based on joint work with Mirela Predescu, Greg Gupton, Ahmet Kocagil, Wei Liu, Alexander Reyngold, Quantitative Research, Fitch Solutions.

This talk reviews a statistical model that ranks the liquidity of reference entities in the single-name credit default swap (CDS) market. A reduced-form approach is adopted: a handful of price- and market activity-based predictors are selected to signal the liquidity characteristics of each reference entity. When combined in a regression model, these predictors provide a basis for ranking reference entities on their relative liquidity.

The model's main contribution is that it generates a liquidity score for each reference entity. This provides a framework for: evaluating overall CDS market liquidity; assessing the liquidity in each sector; comparing corporate and sovereign liquidity; understanding the relationship between liquidity and credit quality.
The data cover well over 1,000 reference entities from different geographical regions over a near three-year period. The model has been statistically validated in several ways. It is shown to have high discriminatory power in separating those names which the wider market perceives to be liquid from those which it does not, and the results are significant across different geographical regions. The model also performs well on walk-forward tests, which is especially reassuring given that the model estimation period spans the recent credit and banking crisis.

As of June 2008, the single-name CDS market had notional outstanding of over USD 33 trillion with total (netted) market valuation of approximately USD 2 trillion, an increase of over 65% over the previous six months. Given the large volume and increased market valuation, and the recent liquidity and counterparty-related concerns in various markets, this OTC market (along with other credit derivatives) is under consideration by legislative bodies and regulators for closer scrutiny and regulation going forward. The results of this research should partially address some of the concerns raised. It also provides a concrete framework for managing liquidity risk in this market.

### Alan Forrest: Correlation in Retail Credit Risk

This talk aims to show how ideas of correlation can change the way Retail Banks model Retail Credit Risk in practice, and how correlation can help them understand cyclic Credit Risk phenomena that otherwise require ad hoc solutions. This is illustrated by analysis of long-term default time series at portfolio level and at obligor level. Unlike Wholesale Credit Risk, where data is thin, Retail Credit Risk can take advantage of huge datasets to apply data-hungry General Linear Mixed Modelling techniques. Nevertheless, Retail Operations remain strongly influenced by their zero-correlation traditions, and this talk will examine the cultural and technical issues that are expected to arise as Retail catches up with Wholesale in its adoption of correlation.

### Gordon Woo: Using Geroscience to Quantify Extreme Longevity Risk

Geroscience is the scientific interface between ageing and age-related disease. As a modern discipline, it is one of a number of 21st century innovations that will drive change in mortality risk in coming decades. A longevity risk modelling initiative will be described that models mortality risk at an individual level, and recognizes the intrinsic randomness in the process of medical discovery in simulating future trajectories of mortality improvement.

### Christian-Oliver Ewald: Risk minimization in stochastic volatility models: Model risk and empirical performance

In this paper the performance of locally risk-minimizing hedge strategies for European options in stochastic volatility models is studied from an experimental as well as from an empirical perspective. These hedge strategies are derived for a large class of diffusion-type stochastic volatility models, and they are as easy to implement as usual delta hedges. Our simulation results on model risk show that the locally risk-minimizing hedges are robust with respect to uncertainty and even misconceptions about the underlying data generating process. The empirical study indicates that locally risk-minimizing hedge strategies consistently produce lower standard deviations of profit-and-loss-ratios than delta hedges (over different time periods as well as in different markets). The more skewed the market and the more out-of-the-money the option, the higher the benefit.

### Chris Rogers: Optimal and robust contracts for a risk-constrained principal

The theory of risk measurement has been extensively developed over the past ten years or so, but there has been comparatively little effort devoted to using this theory to inform portfolio choice. One theme of this paper is to study how an investor in a conventional log-Brownian market would invest to optimize expected utility of terminal wealth, when subjected to a bound on his risk, as measured by a coherent law-invariant risk measure. Results of Kusuoka lead to remarkably complete expressions for the solution to this problem.

The second theme of the paper is to discuss how one would actually manage (not just measure) risk. We study a principal/agent problem, where the principal is required to satisfy some risk constraint. The principal proposes a compensation package to the agent, who then optimises selfishly, ignoring the risk constraint. The principal can pick a compensation package that induces the agent to select the principal's optimal choice. We consider two possibilities: firstly, a contract which is cheapest subject to satisfying the agent's participation constraint; and secondly, a robust contract which perfectly aligns the objectives of principal and agent. The two typically differ little in price, though their forms can look surprisingly different.

### Rudiger Kiesel: Modeling the Forward Surface of Mortality

In recent literature, different methods have been proposed on how to define and model stochastic mortality. In most of these approaches, the so-called spot force of mortality is modeled as a stochastic process. In contrast to such spot force models, forward force mortality models infer dynamics on the entire age/term-structure of mortality.

This paper considers forward models defined based on best-estimate forecasts of survival probabilities as can be found in so-called best-estimate generation life tables. We show that the forward approach bears profound advantages in view of actuarial applications and provide a detailed analysis of forward mortality models driven by finite-dimensional Brownian motion. In particular, we address the relationship to other modeling approaches, the consistency problem of parametric forward models, and the existence of finite dimensional realizations for Gaussian forward models.

All results are illustrated based on a simple example with an affine specification.

The talk is based on joint work with Daniel Bauer (Georgia State University) and Fred Espen Benth (University of Oslo).

### Mario Wuethrich: Cost-of-Capital Margin for a General Insurance Liability Runoff

Under new solvency regulations, general insurance companies need to calculate a risk margin for the risks that go beyond
the best-estimate liabilities. One approach currently used is the so-called cost-of-capital approach, where companies
calculate the necessary risk-bearing capital and then build reserves for the price of this risk-bearing capital.

Since a general insurance liability runoff takes several years, this involves multiperiod risk measures. Because multiperiod
risk measures are often difficult to handle, companies calculate the (univariate) risk measure for the next accounting year
and then use a proxy, based on that univariate risk measure, for the remaining accounting years.

We derive a rigorous multiperiod risk measure approach for a specific chain-ladder claims reserving model, where one
is still able to calculate or approximate the margin analytically. Using these analytical formulas we then compare our results to
the proxies used in practice.
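
The chain-ladder mechanics underlying the reserving model can be illustrated on a toy cumulative run-off triangle (illustrative numbers and code, not the paper's model, which puts a stochastic structure on top of these estimates):

```python
def chain_ladder(triangle):
    """Chain-ladder completion of a cumulative run-off triangle.

    triangle[i] holds cumulative claims for accident year i across the
    development years observed so far (later rows are shorter).
    Returns the development factors and the projected ultimates.
    """
    n = len(triangle)
    factors = []
    for j in range(n - 1):
        rows = [r for r in triangle if len(r) > j + 1]
        factors.append(sum(r[j + 1] for r in rows) /
                       sum(r[j] for r in rows))
    ultimates = []
    for row in triangle:
        u = row[-1]
        for j in range(len(row) - 1, n - 1):
            u *= factors[j]          # project with later-year factors
        ultimates.append(u)
    return factors, ultimates

# Toy triangle: three accident years of cumulative payments.
tri = [[100.0, 150.0, 165.0],
       [110.0, 170.0],
       [120.0]]
factors, ultimates = chain_ladder(tri)
```

The reserve is the difference between each projected ultimate and the latest observed cumulative amount; the uncertainty in that difference over successive accounting years is what the multiperiod risk measure has to capture.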