Published academic works

2019

DATA, DATA SCIENCE, MACHINE LEARNING, SIMULATIONS AND COMPUTATIONS WITH APPLICATIONS TO INSURANCE

A Machine Learning approach for individual claims reserving in insurance
Maximilien Baudry, Christian Robert [2019], Applied Stochastic Models in Business and Industry, Volume 35, Issue 5, September/October 2019, Pages 1127-1155.

Abstract: Accurate loss reserves are an important item in the financial statement of an insurance company and are mostly evaluated by macrolevel models with aggregate data in run-off triangles. In recent years, a new set of literature has considered individual claims data and proposed parametric reserving models based on claim history profiles. In this paper, we present a nonparametric and flexible approach for estimating outstanding liabilities using all the covariates associated with the policy, its policyholder, and all the information received by the insurance company on the individual claims since their reporting dates. We develop a machine learning-based method and explain how to build specific subsets of data for the machine learning algorithms to be trained and assessed on. The choice of a nonparametric model leads to new issues since the target variables (claim occurrence and claim severity) are right-censored most of the time. The performance of our approach is evaluated by comparing the predictive values of the reserve estimates with their true values on simulated data. We compare our individual approach with the most used aggregate data method, namely, chain ladder, with respect to the bias and the variance of the estimates. We also provide a short real case study based on a Dutch loan insurance portfolio.
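
The chain-ladder benchmark named in the abstract can be stated compactly. Below is a minimal sketch, not the authors' code, of volume-weighted chain-ladder development factors applied to an illustrative cumulative run-off triangle (all figures invented):

```python
import numpy as np

# Cumulative run-off triangle (rows: accident years, cols: development years);
# np.nan marks future, unobserved cells. Values are purely illustrative.
tri = np.array([
    [100.0, 150.0, 170.0, 175.0],
    [110.0, 168.0, 190.0, np.nan],
    [ 95.0, 140.0, np.nan, np.nan],
    [125.0, np.nan, np.nan, np.nan],
])

n = tri.shape[1]
# Volume-weighted development factors f_j = sum(C_{i,j+1}) / sum(C_{i,j})
# over accident years where both cells are observed.
factors = []
for j in range(n - 1):
    obs = ~np.isnan(tri[:, j + 1])
    factors.append(tri[obs, j + 1].sum() / tri[obs, j].sum())

# Project each accident year to ultimate with the chained factors.
completed = tri.copy()
for i in range(tri.shape[0]):
    for j in range(n - 1):
        if np.isnan(completed[i, j + 1]):
            completed[i, j + 1] = completed[i, j] * factors[j]

# Reserve = projected ultimate minus latest observed cumulative payment.
latest = np.array([row[~np.isnan(row)][-1] for row in tri])
print("development factors:", np.round(factors, 3))
print("reserves by accident year:", np.round(completed[:, -1] - latest, 1))
```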

A tree-based algorithm adapted to microlevel reserving and long development claims
Olivier Lopez, Xavier Milhaud, Pierre-Emmanuel Thérond [2019], ASTIN Bulletin, Cambridge University Press (CUP), 2019, 49(3), pp. 741-762.

Abstract: In non-life insurance, business sustainability requires accurate and robust predictions of reserves related to unpaid claims. To this aim, two different approaches have historically been developed: aggregated loss triangles and individual claim reserving. The former has achieved great operational success in past decades, whereas the use of the latter remains limited. Through two illustrative examples and by introducing an appropriate tree-based algorithm, we show that individual claim reserving can be very promising, especially in the context of long-term risks.

A self-organizing predictive map for non-life insurance
Donatien Hainaut [2019], European Actuarial Journal, July 2019, Volume 9, Issue 1, pp 173–207.

Abstract: This article explores the capacity of self-organizing maps (SOMs) for analysing non-life insurance data. Contrary to feed-forward neural networks, also called perceptrons, a SOM does not need any a priori information on the relevancy of variables. During the learning procedure, the SOM algorithm selects the most relevant combination of explanatory variables and thereby reduces the collinearity bias. However, the specific features of insurance data require adapting the classic SOM framework to manage categorical variables and the low frequency of claims. This work proposes several extensions of SOMs in order to study the claims frequency of a portfolio of motorcycle insurance policies. Results are then compared with those computed with variants of the k-means clustering algorithm.

Orthogonal polynomial expansions to evaluate stop-loss premiums
Pierre-Olivier Goffard, Patrick J. Laub (2019), Journal of Computational and Applied Mathematics (to appear)

Abstract: A numerical method is proposed to evaluate the survival function of a compound distribution and the stop-loss premiums associated with a non-proportional global reinsurance treaty. The method relies on a representation of the probability density function in terms of Laguerre polynomials and the gamma density. We compare the method against a well established Laplace transform inversion technique at the end of the paper.
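
As a rough illustration of the kind of expansion the abstract describes, the sketch below builds a Laguerre-type polynomial approximation of a simulated compound distribution around a moment-matched gamma reference and integrates it for a stop-loss premium. It is not the paper's algorithm: the expansion coefficients are estimated here by Monte Carlo rather than derived analytically, and all numerical values are illustrative.

```python
import numpy as np
from scipy.special import eval_genlaguerre, gammaln
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)

# Compound Poisson sum with gamma(2, 1) severities as a stand-in for the
# paper's compound distributions; we work conditionally on S > 0.
lam, n_sim = 3.0, 50_000
N = rng.poisson(lam, n_sim)
S = np.array([rng.gamma(2.0, 1.0, n).sum() for n in N])
S = S[S > 0]

# Gamma(r, m) reference density matched to the first two moments of S | S > 0.
mu, var = S.mean(), S.var()
r, m = mu**2 / var, var / mu

def f_ref(x):
    return np.exp((r - 1) * np.log(x) - x / m - gammaln(r) - r * np.log(m))

def Q(n, x):
    # Polynomials orthonormal with respect to the gamma(r, m) density.
    logc = 0.5 * (gammaln(n + 1) + gammaln(r) - gammaln(n + r))
    return np.exp(logc) * eval_genlaguerre(n, r - 1, x / m)

# Coefficients a_n = E[Q_n(S)], estimated by Monte Carlo here (the paper
# derives them from transform representations instead).
K = 15
a = np.array([Q(n, S).mean() for n in range(K)])

x = np.linspace(0.01, S.max(), 2000)
dens = f_ref(x) * sum(a[n] * Q(n, x) for n in range(K))

# Stop-loss premium E[(S - d)_+ | S > 0]: quadrature vs. plain simulation.
d = 10.0
print("expansion :", round(trapezoid(np.clip(x - d, 0.0, None) * dens, x), 4))
print("simulation:", round(np.clip(S - d, 0.0, None).mean(), 4))
```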

Composite likelihood estimation method for hierarchical Archimedean copulas defined with multivariate compound distributions
Hélène Cossette, Simon-Pierre Gadoury, Etienne Marceau, Christian Y. Robert (2019), Journal of Multivariate Analysis, Volume 172, July 2019, Pages 59-83

Abstract: We consider the family of hierarchical Archimedean copulas obtained from a multivariate exponential mixture distribution through compounding, as introduced by Cossette et al. (2017). We investigate ways of determining the structure of these copulas and estimating their parameters. An agglomerative clustering technique based on the matrix of Spearman’s rhos, combined with a bootstrap procedure, is used to identify the tree structure. Parameters are estimated through a top-down composite likelihood. The validity of the approach is illustrated through two simulation studies in which the procedure is explained step by step. The composite likelihood method is also compared to the full likelihood method in a simple case where the latter is computable.
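
A minimal sketch of the structure-identification step named in the abstract, agglomerative clustering on a matrix of Spearman's rhos, run here on synthetic two-block data (the paper's bootstrap validation step is omitted):

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n = 5000
g = rng.normal(size=n)                            # weak global factor
b1, b2 = rng.normal(size=n), rng.normal(size=n)   # strong block factors
eps = rng.normal(size=(n, 4))
X = np.column_stack([
    0.3 * g + b1 + eps[:, 0], 0.3 * g + b1 + eps[:, 1],
    0.3 * g + b2 + eps[:, 2], 0.3 * g + b2 + eps[:, 3],
])

# Pairwise Spearman's rhos, turned into a dissimilarity matrix.
rho, _ = spearmanr(X)
dist = squareform(1.0 - rho, checks=False)

# Agglomerative clustering recovers the nesting of the hierarchical copula.
tree = linkage(dist, method="average")
print(fcluster(tree, t=2, criterion="maxclust"))  # e.g. [1 1 2 2]
```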

Orthonormal polynomial expansions and lognormal sum densities
Søren Asmussen, Pierre-Olivier Goffard, Patrick J. Laub (2019), Risk and Stochastics: Ragnar Norberg at 70 (Mathematical Finance Economics), World Scientific

Abstract: Approximations for an unknown density g in terms of a reference density fν and its associated orthonormal polynomials are discussed. The main application is the approximation of the density f of a sum S of lognormals which may have different variances or be dependent. In this setting, g may be f itself or a transformed density, in particular that of logS or an exponentially tilted density. Choices of reference densities fν that are considered include normal, gamma and lognormal densities. For the lognormal case, the orthonormal polynomials are found in closed form and it is shown that they are not dense in L2(fν), a result that is closely related to the lognormal distribution not being determined by its moments and provides a warning to the most obvious choice of taking fν as lognormal. Numerical examples are presented and comparisons are made to established approaches such as the Fenton–Wilkinson method and skew-normal approximations. Also extensions to density estimation for statistical data sets and non-Gaussian copulas are outlined.
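
For context, the Fenton-Wilkinson benchmark cited in the abstract matches the first two moments of a lognormal sum with a single lognormal. A sketch with illustrative parameters, under an independence assumption (the paper also treats dependent summands):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(2)

# Fenton-Wilkinson: approximate a sum of independent lognormals by one
# lognormal with the same mean and variance (parameters are illustrative).
mu = np.array([0.0, 0.5, -0.3])
sig = np.array([0.4, 0.6, 0.5])

m1 = np.exp(mu + sig**2 / 2).sum()                          # E[S]
v = (np.exp(2 * mu + sig**2) * (np.exp(sig**2) - 1)).sum()  # Var[S]
s2 = np.log1p(v / m1**2)                                    # matched sigma^2
mu_s = np.log(m1) - s2 / 2                                  # matched mu

# Monte Carlo check of a tail probability P(S > t).
t = 8.0
S = np.exp(mu + sig * rng.normal(size=(1_000_000, 3))).sum(axis=1)
print("FW approx  :", 1 - lognorm.cdf(t, s=np.sqrt(s2), scale=np.exp(mu_s)))
print("Monte Carlo:", (S > t).mean())
```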

Phase-type models in life insurance: fitting and valuation of equity-linked benefits
Søren Asmussen, Jevgenijs Ivanovs, Patrick J. Laub, and Hailiang Yang (2019), Risks, 7(1), 17 pages

Abstract: Phase-type (PH) distributions are defined as distributions of lifetimes of finite continuous-time Markov processes. Their traditional applications are in queueing, insurance risk, and reliability, but more recently, also in finance and, though to a lesser extent, in life and health insurance. The advantage is that PH distributions form a dense class and that problems having explicit solutions for exponential distributions typically become computationally tractable under PH assumptions. In the first part of this paper, fitting of PH distributions to human lifetimes is considered. The class of generalized Coxian distributions is given special attention. In part, some new software is developed. In the second part, pricing of life insurance products such as guaranteed minimum death benefit and high-water benefit is treated for the case where the lifetime distribution is approximated by a PH distribution and the underlying asset price process is described by a jump diffusion with PH jumps. The expressions are typically explicit in terms of matrix-exponentials involving two matrices closely related to the Wiener-Hopf factorization, for which recently, a Lévy process version has been developed for a PH horizon. The computational power of the approach is illustrated via a number of numerical examples.

Monte Carlo estimation of the density of the sum of dependent random variables
Patrick J. Laub, Robert Salomone, Zdravko I. Botev (2019), Mathematics and Computers in Simulation, 161, pp. 23-31

Abstract: We study an unbiased estimator for the density of a sum of random variables that are simulated from a computer model. A numerical study on examples with copula dependence is conducted where the proposed estimator performs favorably in terms of variance compared to other unbiased estimators. We provide applications and extensions to the estimation of marginal densities in Bayesian statistics and to the estimation of the density of sums of random variables under Gaussian copula dependence.
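
The flavour of such an unbiased conditional density estimator can be conveyed on a toy case. The sketch below uses a bivariate normal pair, where the conditional density is explicit, rather than the copula models of the paper:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Unbiased conditional density estimator for S = X1 + X2:
# f_S(s) = E[ f_{X2|X1}(s - X1) ], averaged over simulated draws of X1.
rho, n = 0.5, 100_000
x1 = rng.normal(size=n)

def f_S_hat(s):
    # X2 | X1 = x is N(rho * x, 1 - rho^2) for a standard bivariate normal.
    return norm.pdf(s - x1, loc=rho * x1, scale=np.sqrt(1 - rho**2)).mean()

s = 1.0
exact = norm.pdf(s, scale=np.sqrt(2 + 2 * rho))  # S ~ N(0, 2 + 2*rho)
print("estimate:", round(f_S_hat(s), 5), " exact:", round(exact, 5))
```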

 

MODELS FOR INSURANCE

Stochastic Deflator for an Economic Scenario Generator with Five Factors
Cheng P.K., Planchet F. [2019], Bankers Markets Investors, n°157, June 2019.

Abstract: In this paper, we implement a stochastic deflator with five economic and financial risk factors: interest rates, market price of risk, stock prices, default intensities, and convenience yields. We examine the deflator with different financial assets, such as stocks, zero‐coupon bonds, vanilla options, and corporate coupon bonds. We find required regularity conditions to implement our stochastic deflator. Our numerical results show the reliability of the deflator approach in pricing financial derivatives.

How to Define the Quality of an Economic Scenario Generator to Assess the Best Estimate of a French Savings Contract in €?
Armel K., Planchet F. [2019], Bankers Markets Investors, n°157, June 2019.

Abstract: Applying a Mark-to-Market approach to evaluate the fair value of the insurer’s commitment (best estimate) for a French savings contract in € requires the prices of the options and guarantees of insurance policies. Since this information is not observable on an organized and liquid market, the calculation is made in a Mark-to-Model framework. Calibrating and validating the economic scenario generator (ESG) used to evaluate the best estimate by comparing simulations with observed data, as in a statistical approach, is therefore not feasible. The ESG is instead calibrated and validated with reference to financial instruments (caps, floors, swaptions, etc.) derived from the modelled risk factors, without justifying a direct link or a bijection between these financial instruments and the liability options (see for example Laurent & al. [2014], Planchet & al. [2009], Armel & Planchet [2018]). The purpose of this paper is to examine how we can define the quality of an economic scenario generator to evaluate the best estimate of French savings contracts in €.

A Reduced-Form Model for A Life Insurance’s Net Asset Value
A. Bignon, A. Ndjeng-Ndjeng, Y. Salhi, P.-E. Thérond [2019], Bankers, Markets and Investors (2019), 157, 3-15

Abstract: In this paper we develop a closed-form model for the net asset value of a life insurance portfolio aimed at simplifying the assessment and quantification of the impact of financial stress scenarios on the insurer’s solvency. In fact, the current practice based on an internal model is time-consuming and thus not suitable for sensitivity studies that require rapid action from the management. Since the stress scenarios are mostly related to financial market determinants, their impact on the market value of financial assets is quite straightforward. Therefore, in this paper, we focus on the distortion caused on the liability side and investigate a reduced-form model for the best estimate liabilities that is not only easily interpretable but also capable of anticipating the impact of market variations on the liabilities. The model is built on a dataset drawn from a French life insurer’s projection model using single, double and triple shocks on the interest rate yield curve, the equity market value and the profit-sharing provision. In order to capture as much information as possible from the dataset, several feasible regression specifications are used. The general form of the empirical model is specified as a linear combination of the risk factors, and its predictive ability is investigated using an out-of-sample analysis.

Testing the Martingale Hypothesis in a Risk-Neutral Economic Scenarios Generator
Pierre-Emmanuel Thérond, Florian Bollotte [2019], Bankers Markets & Investors: an academic & professional review, Groupe Banque, 2019, pp. 16-31

Abstract: This paper examines how statistical tests for the martingale hypothesis can be applied to audit a risk-neutral Economic Scenarios Generator (ESG). The martingale test usually used to assess the risk-neutrality of the generated scenarios consists in testing that the mean of discounted asset prices is constant over time, which is a necessary condition for a martingale process. Although we have not found many studies that refer to statistical martingale tests in the framework of ESGs, we have noticed that this approach can be very useful from the point of view of the actuarial function, allowing for example to detect implementation errors in the ESG and providing results that are more easily interpretable in sensitivity analyses than the standard martingale test. Nevertheless, their application can raise many questions for the actuarial function, especially concerning the choice of the tests retained, the ESG variable (asset) to which the test is applied and the acceptance threshold; we thus propose some recommendations for the use of these tests within a risk-neutral ESG.
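
A minimal sketch of the basic check described in the abstract, verifying that the mean discounted asset price is flat across horizons, on simulated risk-neutral GBM scenarios standing in for ESG output (a real audit would also handle the multiple-testing issue across horizons, one of the threshold questions the paper discusses):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Risk-neutral GBM scenarios: E[exp(-r t) S_t] should equal s0 for every t.
r, sigma, s0 = 0.02, 0.2, 100.0
n_scen, n_years, dt = 10_000, 30, 1.0
z = rng.normal(size=(n_scen, n_years))
log_paths = np.cumsum((r - sigma**2 / 2) * dt + sigma * np.sqrt(dt) * z, axis=1)
disc = np.exp(-r * dt * np.arange(1, n_years + 1))
deflated = s0 * np.exp(log_paths) * disc

# One-sample t-test per horizon that the mean deflated price equals s0;
# note that 30 correlated tests are run, so thresholds need care.
pvals = [stats.ttest_1samp(deflated[:, t], s0).pvalue for t in range(n_years)]
print("min p-value over horizons:", round(min(pvals), 3))
```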

Experience Prospective Life-Tables for the Algerian Retirees
Flici F., Planchet F. [2019], Risks, 7(2), 38, Special Issue “New Perspectives in Actuarial Risk Management”.

Abstract: The aim of this paper is to construct prospective life tables adapted to the experience of Algerian retirees. Mortality data for the retired population are only available for ages 50 to 95 and older and for the period from 2004 to 2013. Conventional prospective mortality models cannot be expected to provide robust forecasts given the data limitations in terms of both exposure to death risk and data length. To improve forecasting robustness, we use the global population mortality as an external reference. Adjusting the experience mortality on the reference allows projecting the age-specific death rates calculated from the experience of the retired population. We propose a generalized version of the Brass-type relational model incorporating a quadratic effect to perform the adjustment. Results show no significant difference for men, whether retired or not, but reveal a gap of over three years in the remaining life expectancy at age 50 in favor of retired women compared to those of the global population.
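
A minimal sketch of a Brass-type relational adjustment with a quadratic term, the form named in the abstract, fitted by ordinary least squares on synthetic reference and experience rates (the paper's data and fitting details differ):

```python
import numpy as np

# Brass-type relational model with a quadratic effect:
#   logit(q_x^exp) = a + b * logit(q_x^ref) + c * logit(q_x^ref)^2.
# Reference and experience rates below are synthetic placeholders.
ages = np.arange(50, 96)
q_ref = 0.0005 * np.exp(0.09 * (ages - 50))    # toy reference mortality
q_exp = 0.0004 * np.exp(0.092 * (ages - 50))   # toy experience mortality

logit = lambda q: np.log(q / (1 - q))
Y, x = logit(q_exp), logit(q_ref)
A = np.column_stack([np.ones_like(x), x, x**2])
a, b, c = np.linalg.lstsq(A, Y, rcond=None)[0]  # OLS fit of (a, b, c)

# Projected experience rates follow by pushing projected reference rates
# through the fitted relation.
q_fit = 1 / (1 + np.exp(-(a + b * x + c * x**2)))
print("fitted (a, b, c):", np.round([a, b, c], 3))
```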

Mesure de l’espérance de vie sans dépendance en France métropolitaine
Guibert Q., Planchet F., Schwarzinger M. [2018c], Bulletin Français d’Actuariat, vol. 18, n°35.

Abstract: The aim of this article is to construct prospective mortality tables without severe long-term care from French national data. These tables make it possible in particular to determine expectations of survival without severe long-term care, by sex and by nature of the loss of autonomy (cognitive or physical).

Mesure du risque de perte d’autonomie en France métropolitaine
Guibert Q., Planchet F., Schwarzinger M. [2018b], Bulletin Français d’Actuariat, vol. 18, n°35.

Abstract: This paper focuses on the construction of an incidence law for the loss of total autonomy over the period 2010-2012 from data from the national hospital databases (PMSI 2008-2013). Our results are broken down into two types of dependence: cognitive dependence (or dementia) and physical dependence. Women have a slightly higher risk of dementia, while the risk of physical dependence is higher for men. The incidence of “all-cause” dependence is comparable between men and women. The results suggest a slowdown in incidence beyond the age of 90 and a convergence between men and women at older ages. The implications of these results for extrapolation to older ages are discussed.

Mesure de l’espérance de vie en dépendance totale en France métropolitaine
Guibert Q., Planchet F., Schwarzinger M. [2018a], Bulletin Français d’Actuariat, vol. 18, n°35.

Abstract: The aim of this article is to construct life tables in total dependence from French national data. These tables make it possible in particular to determine expectations of survival in dependence, by sex and by nature of the loss of autonomy (dementia or physical dependence).

Insurance: models, digitalization, and data science
Albrecher, H., Bommier, A., Filipovic, D., Koch-Medina, P., Loisel, S., & Schmeiser, H. [2019], European Actuarial Journal (2019), 9(2), 349-360.

Abstract: This article summarizes the main topics and findings from the Swiss Risk and Insurance Forum 2018. That event gathered experts from academia, the insurance industry, regulatory bodies, and consulting companies to discuss the challenges arising from the impact of data science and, more generally, of digitalization on the insurance sector.

Partially Schur-constant vectors
A. Castaner, C. Lefèvre, S. Loisel, M. Claramunt [2019], Journal of Multivariate Analysis (2019), Vol. 172, 47-58.

Abstract: In this paper, we introduce a new multivariate dependence model that generalizes the standard Schur-constant model. The difference is that the random vector considered is partially exchangeable, instead of exchangeable, whence the term partially Schur-constant. Its advantage is to allow some heterogeneity of marginal distributions and a more flexible dependence structure, which broadens the scope of potential applications. We first show that the associated joint survival function is a monotonic multivariate function. Next, we derive two distributional representations that provide an intuitive understanding of the underlying dependence. Several other properties are obtained, including correlations within and between subvectors. As an illustration, we explain how such a model could be applied to risk management for insurance networks.

Modelling net carrying amount of shares for market consistent valuation of life insurance liabilities
D. Dorobantu, Y. Salhi, P.-E. Thérond [2019], Methodology and Computing in Applied Probability (2019), in press

Abstract: The attractiveness of insurance saving products is driven, among others, by dividend payments to policyholders and participation in profits. These are mainly constrained by regulatory measures on profit-sharing on the basis of statutory accounts. Moreover, since both prudential and financial reporting regulations require market consistent best estimate measurement of insurance liabilities, cash-flow projection models have to be used for such a purpose in order to derive the underlying financial incomes. Such models are based on Monte-Carlo techniques. The latter should simulate the future accounting profits and losses needed for profit-sharing mechanisms. In this paper we deal with impairment losses on equity securities for financial portfolios, which rely on an instrument-by-instrument assessment (whereas projection models consider groups of shares). Our motivation is to describe the joint distribution of the market value and impairment provision of a book of equity securities, with regard to the French accounting rules for depreciation. The results we obtain improve the ability of projection models to represent such an asymmetric mechanism. Formally, an impairment loss is recognized for an equity instrument if there has been a significant and prolonged decline in its market value below the carrying cost (acquisition value). Such constraints are formalized using an assumption on the dynamics of the equity and lead to a complex option-like pay-off. Using this formulation, we propose analytical formulas for some quantitative measurements related to the impairment losses of a book of financial equities. These are derived in a general framework and some tractable examples are illustrated. We also investigate the operational implementation of these formulas and compare their computational time to that of a basic simulation approach.

A Model-Point Approach to Indifference Pricing of Life Insurance Portfolios with Dependent Lives
C. Blanchet-Scalliet, D. Dorobantu, Y. Salhi [2019], Methodology and Computing in Applied Probability (2019) 21(2) 423-448

Abstract: In this paper, we study the pricing of life insurance portfolios in the presence of dependent lives. We assume that an insurer with an initial exposure to n mortality-contingent contracts wants to acquire a second portfolio constituted of m contracts. The policyholders’ lifetimes in these portfolios are correlated through a Farlie-Gumbel-Morgenstern (FGM) copula, which induces a dependency between the two portfolios. In this setting, we compute the indifference price charged by the insurer endowed with an exponential utility. The indifference price is characterized as a solution to a backward stochastic differential equation (BSDE), which can be decomposed into (n − 1)n! auxiliary BSDEs. In this general case, the derivation of the indifference price is computationally infeasible. Therefore, focusing on the example of death benefit contracts, we develop a model-point based approach in order to ease the computation of the price. It consists in replacing each portfolio with a single policyholder that replicates some risk metrics of interest. Also, the two representative contracts should adequately reproduce the observed dependency between the initial portfolios. We implement the proposed procedure and compare the computed prices to those of a classical valuation approach.

Le Prix du Risque de Longévité
N. El Karoui, C. Hillairet, S. Loisel, Y. Salhi [2019], Revue d’Economie Financière (2019), 133, 129-145

Abstract: In this article, we address the issue of the price of longevity risk. We begin by describing the risk of longevity and its components, distinguishing biometric, financial and regulatory aspects. We then explain the different valuation frameworks (actuarial, financial and regulatory), their common points and their differences. We discuss the issue of discounting and modeling long-term interest rates for longevity risk management. We also give details on the subjective and pragmatic way to handle different components of longevity risk, especially the most extreme, in the market.

2018

Age-Specific Adjustment of Graduated Mortality
Y. Salhi, P.-E. Thérond, ASTIN Bulletin (2018) 48(2) 543-569

Abstract: Recently, there has been an increasing interest from life insurers to assess their portfolios’ mortality risks. The new European prudential regulation, namely Solvency II, emphasized the need to use mortality and life tables that best capture and reflect the experienced mortality, and thus policyholders’ actual risk profiles, in order to adequately quantify the underlying risk. Therefore, building a mortality table based on the experience of the portfolio is highly recommended and, for this purpose, various approaches have been introduced into the actuarial literature. Although such approaches succeed in capturing the main features, it remains difficult to assess the mortality when the underlying portfolio lacks sufficient exposure. In this paper, we propose graduating the mortality curve using an adaptive procedure based on the local likelihood. The latter has the ability to model mortality patterns even in the presence of complex structures and avoids relying on expert opinions. However, such a technique fails to offer a consistent yet regular structure for portfolios with limited deaths. Although the technique borrows information from the adjacent ages, this is sometimes not sufficient to produce a robust life table. In the presence of such a bias, we propose adjusting the corresponding curve, at the age level, based on a credibility approach. This consists in reviewing the assumption on the mortality curve as new observations arrive. We derive the updating procedure and investigate the benefits of using it instead of a sole graduation, based on real datasets. Moreover, we look at the divergences in the mortality forecasts generated by the classic credibility approaches, including Hardy-Panjer, the Poisson-Gamma model and the Makeham framework, on portfolios originating from various French insurance companies.
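
For intuition, the Poisson-Gamma credibility update mentioned among the benchmarks reduces to a simple closed form at a given age; the sketch below uses invented figures:

```python
# Poisson-Gamma credibility at a single age: prior Gamma(a, b) centred on the
# graduated rate; with D observed deaths on exposure E, the posterior is
# Gamma(a + D, b + E). All numbers below are illustrative only.
graduated_rate = 0.012
a, b = 50.0, 50.0 / 0.012        # prior mean a/b equals the graduated rate
D, E = 4, 250.0                  # portfolio deaths and central exposure

post_mean = (a + D) / (b + E)    # credibility-adjusted mortality rate
z = E / (b + E)                  # weight given to the portfolio's experience
print(round(post_mean, 5), "credibility weight:", round(z, 3))
```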

Can Pension Funds Partially Manage Longevity Risk by Investing in a Longevity Megafund?
Debonneuil E., Eyraud-Loisel A., Planchet F. [2018], Risks, Special Issue “New Perspectives in Actuarial Risk Management”, to appear.

Abstract: Pension funds, which manage the financing of a large share of global retirement schemes, need to invest their assets in a diversified manner and over long durations while managing interest rate and longevity risks. In recent years, a new type of investment has emerged, that we call a longevity megafund, which invests in clinical trials for solutions against lifespan-limiting diseases and provides returns positively correlated with longevity. After describing ongoing biomedical developments against ageing-related diseases, we model the needed capital for pension funds to face longevity risk and find that it is far above current practices. After investigating the financial returns of pharmaceutical developments, we estimate the returns of a longevity megafund. Combined, our models indicate that investing in a longevity megafund is an appropriate method to significantly reduce longevity risk and the associated economic capital need.

Non-Parametric Inference of Transition Probabilities Based on Aalen-Johansen Integral Estimators for Semi-Competing Risks Data: Application to LTC Insurance
Guibert Q., Planchet F. [2018] Insurance: Mathematics and Economics, Volume 82, Pages 21-36, doi: 10.1016/j.insmatheco.2018.05.004.

Abstract: Studying Long Term Care (LTC) insurance requires modeling the lifetime of individuals in the presence of both terminal and non-terminal events which are concurrent. Although a nonhomogeneous semi-Markov multi-state model is probably the best candidate for this purpose, most current research assumes, perhaps abusively, that the Markov assumption is satisfied when fitting the model. In this context, using the Aalen-Johansen estimators for transition probabilities can induce bias, which can be important when the Markov assumption is strongly violated. Based on some recent studies developing non-Markov estimators in the illness-death model, which we can easily extend to a more general acyclic multi-state model, we exhibit three non-parametric estimators of transition probabilities of paying cash-flows, which are of interest when pricing or reserving LTC guarantees in discrete time. As our method directly estimates these quantities instead of transition intensities, it is possible to derive asymptotic results for these probabilities under non-dependent random right-censorship, obtained by re-setting the system with two competing risk blocks. Inclusion of left-truncation is also considered. We conduct simulations to compare the performance of our transition probability estimators without the Markov assumption. Finally, we propose a numerical application with LTC insurance data, which are traditionally analyzed with a semi-Markov model.

Comment construire un générateur de scénarios économiques risque neutre destiné à l’évaluation économique des contrats d’épargne ?
Armel K., Planchet F. [2018] Assurances et gestion des risques, Vol. 85 (1-2).

Abstract: This paper presents an approach for building a risk-neutral economic scenario generator intended for the best-estimate valuation of savings contracts, in an economic environment characterized by negative interest rates.

Multiple time series forecasting using quasi-randomized functional link neural networks
Cousin A., Moudiki T., Planchet F. [2018] Risks, 2018, 6(1), 22; doi:10.3390/risks6010022.

Abstract: We are interested in obtaining forecasts for multiple time series by taking into account the potential nonlinear relationships between their observations. For this purpose, we use a specific type of regression model on an augmented dataset of lagged time series. Our model is inspired by dynamic regression models (Pankratz 2012), with the response variable’s lags included as predictors, and is known as a Random Vector Functional Link (RVFL) neural network. RVFL neural networks have been successfully applied in the past to regression and classification problems. The novelty of our approach is to apply an RVFL model to multivariate time series, under two separate regularization constraints on the regression parameters.
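
A minimal RVFL sketch under the usual construction (fixed random hidden features plus a direct link, fitted by ridge regression) on a toy two-series lagged dataset; the paper's two separate regularization constraints are collapsed here into a single ridge penalty:

```python
import numpy as np

rng = np.random.default_rng(5)

def make_lags(Y, p):
    # rows: time t; columns: all series at lags 1..p
    T = Y.shape[0]
    X = np.column_stack([Y[p - k:T - k] for k in range(1, p + 1)])
    return X, Y[p:]

T, p, H, lam = 300, 3, 50, 1.0
t = np.arange(T)
Y = np.column_stack([np.sin(0.1 * t), np.cos(0.1 * t)])
Y = Y + rng.normal(scale=0.1, size=(T, 2))       # noisy toy bivariate series

X, Y_target = make_lags(Y, p)
W = rng.normal(size=(X.shape[1], H))             # random, untrained weights
Z = np.column_stack([X, np.tanh(X @ W)])         # direct link + hidden features
beta = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ Y_target)

pred = Z @ beta
print("in-sample RMSE:", np.sqrt(((pred - Y_target) ** 2).mean()).round(4))
```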

Do actuaries believe in longevity deceleration?
Debonneuil E., Loisel S., Planchet F. [2018] Insurance: Mathematics and Economics, Volume 78, Pages 325-338.

Abstract: As more and more people believe that significant life extensions may come soon, should commonly used future mortality assumptions be considered prudent? We find here that commonly used actuarial tables for annuitants – as well as the Lee–Carter model – do not extrapolate life expectancy at the same rate for future years as for past years; instead, they produce some longevity deceleration. This is typically because their mortality improvements decrease after a certain age, and those age-specific improvements are constant over time. As potential alternatives, (i) we study the Bongaarts model, which produces straight increases in life expectancy; (ii) we adapt it to produce best-practice longevity trends; (iii) we compare with various longevity scenarios, including a model for “life extension velocity”; and (iv) after gathering advances in biogerontology, we discuss elements to help retirement systems cope with a potential strong increase in life expectancy.

Utilisation des estimateurs de Kaplan-Meier par génération et de Hoem pour la construction de tables de mortalité prospectives
Guibert Q., Planchet F. [2018] Bulletin Français d’Actuariat, vol. 17, n°33.

Operational choices for risk aggregation in insurance: PSDization and SCR sensitivity
X. Milhaud, V. Poncelet, C. Saillard, Risks (2018), 6, 36

Abstract: This work addresses crucial questions about the robustness of the PSDization process for applications in insurance. PSDization refers to the process that forces a matrix to become positive semidefinite. For companies using copulas to aggregate risks in their internal model, PSDization occurs when working with correlation matrices to compute the Solvency Capital Requirement (SCR). We examine how classical operational choices concerning the modelling of risk dependence impact the SCR during PSDization. These operations refer to the permutations of risks (or business lines) in the correlation matrix, the addition of a new risk, and the introduction of confidence weights given to the correlation coefficients. The use of genetic algorithms shows that theoretically neutral transformations of the correlation matrix can surprisingly lead to significant sensitivities of the SCR (up to 6%). This highlights the need for a very strong internal
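
One common PSDization, clipping negative eigenvalues and rescaling to a unit diagonal, can be sketched in a few lines; the paper studies how such operational choices propagate to the SCR. The matrix below is an invented, non-PSD "expert-set" correlation matrix:

```python
import numpy as np

def psdize(C, eps=0.0):
    # Clip negative eigenvalues, then rescale back to unit diagonal.
    vals, vecs = np.linalg.eigh(C)
    C2 = vecs @ np.diag(np.clip(vals, eps, None)) @ vecs.T
    d = np.sqrt(np.diag(C2))
    return C2 / np.outer(d, d)

# Symmetric "correlation" matrix with expert coefficients that is not PSD.
C = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])
print("min eigenvalue before:", np.linalg.eigvalsh(C).min().round(4))
print("min eigenvalue after :", np.linalg.eigvalsh(psdize(C)).min().round(4))
```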

Cluster size distributions of extreme values for the Poisson-Voronoï tessellation
Chenavier, N. and Robert, C. (2018) To appear in Annals of Applied Probability.

Abstract: We consider the Voronoi tessellation based on a homogeneous Poisson point process in R^d. For a geometric characteristic of the cells (e.g. the inradius, the circumradius, the volume), we investigate the point process of the nuclei of the cells with large values. Conditions are obtained for the convergence in distribution of this point process of exceedances to a homogeneous compound Poisson point process. We provide a characterization of the asymptotic cluster size distribution which is based on the Palm version of the point process of exceedances. This characterization allows us to compute efficiently the values of the extremal index and the cluster size probabilities by simulation for various geometric characteristics. The extension to the Poisson-Delaunay tessellation is also discussed.

A central limit theorem for functions of stationary max-stable random fields on R^d
Koch, E., Dombry, C. and Robert, C. (2018). To appear in Stochastic Processes and Their Applications.

Abstract: Max-stable random fields are very appropriate for the statistical modelling of spatial extremes. Hence, integrals of functions of max-stable random fields over a given region can play a key role in the assessment of the risk of natural disasters, meaning that it is relevant to improve our understanding of their probabilistic behaviour. For this purpose, in this paper, we propose a general central limit theorem for functions of stationary max-stable random fields on R^d. Then, we show that appropriate functions of the Brown–Resnick random field with a power variogram and of the Smith random field satisfy the central limit theorem. Another strong motivation for our work lies in the fact that central limit theorems for random fields on R^d have been barely considered in the literature. As an application, we briefly show the usefulness of our results in a risk assessment context.

Geometric ergodicity for some space-time max-stable Markov chains
Koch, E. and Robert, C. (2018). To appear in Statistics and Probability letters.

Abstract: Max-stable processes are central models for spatial extremes. In this paper, we focus on some space-time max-stable models introduced in Embrechts et al. (2016). The processes considered induce discrete-time Markov chains taking values in the space of continuous functions from the unit sphere of R^3 to (0,∞). We show that these Markov chains are geometrically ergodic. An interesting feature lies in the fact that the state space is not locally compact, making the classical methodology inapplicable. Instead, we use the fact that the state space is Polish and apply results presented in Hairer (2010).

2017

MODELS FOR INSURANCE

Measures of risk and performance in the management of insurance organisations > LONGEVITY

A Class of Random Field Memory Models for Mortality Forecasting, P. Doukhan, J. Rynkiewicz, D. Pommeret, Y. Salhi, Insurance: Mathematics and Economics, Vol. 77 (2017), pp. 97-110
Abstract: This article proposes a parsimonious alternative approach for modeling the stochastic dynamics of mortality rates. Instead of the commonly used factor-based decomposition framework, we consider modeling mortality improvements using a random field specification with a given causal structure. Such a class of models introduces dependencies among adjacent cohorts aiming at capturing, among others, the cohort effects and cross generations correlations. It also describes the conditional heteroskedasticity of mortality. The proposed model is a generalization of the now widely used AR-ARCH models for random processes. For such class of models, we propose an estimation procedure for the parameters. Formally, we use the quasi-maximum likelihood estimator (QMLE) and show its statistical consistency and the asymptotic normality of the estimated parameters. The framework being general, we investigate and illustrate a simple variant, called the three-level memory model, in order to fully understand and assess the effectiveness of the approach for modeling mortality dynamics.

Minimax optimality in robust detection of a disorder time in doubly-stochastic Poisson processes, N. El Karoui, S. Loisel, Y. Salhi, Annals of Applied Probability, Vol. 27, N. 4 (2017), pp. 2515-2538.
Abstract: We consider the minimax quickest detection problem of an unobservable time of change in the rate of an inhomogeneous Poisson process. We seek a stopping rule that minimizes the robust Lorden (1971) criterion, formulated in terms of the number of events until detection, both for the worst-case delay and the false alarm constraint. In the Wiener case, such a problem has been solved using the so-called cumulative sums (CUSUM) strategy by Shiryaev (1963, 2009), or Moustakides (2004), among others. In our setting, we derive the exact optimality of the CUSUM stopping rule by using finite variation calculus and elementary martingale properties to characterize the performance functions of the CUSUM stopping rule in terms of scale functions. These are solutions of some delayed differential equations that we solve elementarily. The case of detecting a decrease in the intensity is easy to study because the performance functions are continuous. In the case of an increase where the performance functions are not continuous, martingale properties require using a discontinuous local time. Nevertheless, from an identity relating the scale functions, the optimality of the CUSUM rule still holds. Finally, some numerical illustrations are provided.

Basis risk modeling: A co-integration based approach, Y. Salhi, S. Loisel, Statistics, Vol. 51, N. 1 (2017), pp. 205-221
Abstract: In this paper we propose a multivariate approach for forecasting pairwise mortality rates of related populations. The need for joint modeling of mortality rates is analyzed using a causality test. We show that for the datasets considered, the inclusion of national mortality information enhances predictions on its sub-populations. The investigated approach links national population mortality to that of a subset population, using an econometric model that captures a long-term relationship between the two mortality dynamics. This model does not focus on the correlation between the mortality rates of the two populations, but rather on their long-term behavior, which suggests that the two time series cannot wander off in opposite directions for long before mean-reverting, which is consistent with biological reasoning. The model can additionally capture short-term adjustments in the mortality dynamics of the two populations. An empirical comparison of the forecasts of one-year death probabilities for policyholders is performed using both a classical factor-based model and the proposed approach. The robustness of the model is tested on mortality rate data for England and Wales, alongside the Continuous Mortality Investigation assured lives dataset, representing the sub-population.
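
A toy illustration of the co-integration premise, two log-mortality series sharing a common stochastic trend, using the Engle-Granger test from statsmodels (synthetic data; the paper's model is a full long-run/short-run specification):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(7)

# National and sub-population log mortality built around one common trend.
T = 60
trend = np.cumsum(rng.normal(-0.01, 0.02, T))    # shared stochastic trend
nat = trend + rng.normal(0, 0.005, T)            # national log death rate
sub = 0.1 + trend + rng.normal(0, 0.01, T)       # sub-population series

stat, pval, _ = coint(sub, nat)                  # Engle-Granger test
print("co-integration p-value:", round(pval, 4)) # small => long-run link
```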

Utilisation des estimateurs de Kaplan-Meier par génération et de Hoem pour la construction de tables de mortalité prospectives, Guibert Q., Planchet F. [2017], Bulletin Français d’Actuariat, vol. 17, n°33.
Abstract: Data quality is an overarching concern when it comes to building a mortality model or prospective mortality tables. This is even more significant when these procedures are based on a small population, as data may show major random fluctuations due to a lack of information for particular ages. Such situations arise frequently with the entry into force of Solvency II, as insurers shall consider their own datasets, limited in size, to build best estimate tables. Since parametric methods are too rough to capture a realistic mortality pattern in two dimensions, the mortality profile is quite often adjusted using exogenous information, such as a table based on a national population. In light of this, the aim of this paper is to discuss the problem of choosing appropriate estimators for two-dimensional mortality rates or death rates in the presence of independent censoring. Indeed, practitioners currently use the Hoem estimator or the Kaplan-Meier estimator split by generation without questioning their relevance and reliability. We propose in this paper a comparative analysis of these estimators and try to give some criteria to choose one approach over another, and give some figures based on a real insurance portfolio and simulated data. Finally, we provide some non-parametric estimators for a direct estimation of death rates with both the cohort and the period approaches.

Le risque de longévité est-il assurable?, N. El Karoui, S. Loisel, Revue d’Economie Financière (2017).
Abstract: In this article, we study the insurability of longevity risk through the lens of enterprise risk management (ERM). We begin by identifying the various components of longevity risk. We then present ways of modeling, measuring and detecting changes in this risk. Finally, we study the possible risk controls and the associated residual risks.

Measures of risk and performance in the management of insurance organisations > MULTIVARIATE DEPENDENCE MODELING

Tail approximations for sums of dependent regularly varying variables under Archimedean copula models, H. Cossette, E. Marceau, Q.H. Nguyen and C. Y. Robert (2017), to appear in Methodology and Computing in Applied Probability
Abstract: In this paper, we compare two numerical methods for approximating the probability that the sum of dependent regularly varying random variables exceeds a high threshold under Archimedean copula models. The first method is based on conditional Monte Carlo. We present four estimators and show that most of them have bounded relative errors. The second method is based on analytical expressions of the multivariate survival or cumulative distribution functions of the regularly varying random variables and provides sharp and deterministic bounds of the probability of exceedance. We discuss implementation issues and illustrate the accuracy of both procedures through numerical studies.

A note on upper-patched generators for Archimedean copulas, Di Bernardino, E., Rullière, D. (2017), ESAIM: Probability and Statistics, to appear.
doi: 10.1051/ps/2017003, ISSN: 1292-8100 – eISSN: 1262-3318.
Abstract: The class of multivariate Archimedean copulas is defined by using a real-valued function called the generator of the copula. This generator satisfies some properties, including d-monotonicity. We propose here a new basic transformation of this generator, preserving these properties, thus ensuring the validity of the transformed generator and inducing a proper valid copula. This transformation acts only on a specific portion of the generator; it allows both the non-reduction of the likelihood on a given dataset and the choice of the upper tail dependence coefficient of the transformed copula. Numerical illustrations show the utility of this construction, which can improve the fit of a given copula both in its central part and in its tail.

On finite exchangeable sequences and their dependence, C. Lefèvre, S. Loisel, S. Utev, to appear in Journal of Multivariate Analysis (2017).
Abstract: This paper deals with finite sequences of exchangeable 0–1 random variables. Our main purpose is to exhibit the dependence structure between such indicators. Working with Kendall’s representation by mixture, we prove that a convex order of higher degree on the mixing variable implies a supermodular order of same degree on the indicators, and conversely. The convex order condition is then discussed for three standard distributions (binomial, hypergeometric and Stirling) in which the parameter is randomized. Distributional properties of exchangeable indicators are also revisited using an underlying Schur-constant property. Finally, two applications in insurance and credit risk illustrate some of the results.

Measures of risk and performance in the management of insurance organisations > FINANCIAL MODELS AND MARKET CONSISTENCY

Continuous Mixed-Laplace Jump Diffusion models for stocks and commodities, D. Hainaut (2017), Quantitative Finance and Economics, Vol. 1(2), pp. 145-173.
In this paper, we propose new dynamics for the jumps affecting stock and commodity prices. The jump process is the continuous limit of a mixture of compound Poisson processes. This model is compatible with Solvency II recommendations for risk management and is related to the axis “models for insurance” of the DAMI chair.

Clustered Lévy processes and their financial applications, D. Hainaut (2017), accepted in Journal of Computational and Applied Mathematics (JCAM, 5y impact factor 1.413), Vol. 319, pp. 117-140.
This article studies the properties of Lévy processes that are time changed by a stochastic clock which is the integral of a Hawkes process. The purpose of this model is to introduce jump clustering in the dynamics of financial assets or insurance liabilities. This work is related to the axis “models for insurance” of the DAMI chair.

Proposition d’un modèle de projection des scénarios économiques pour le développement de la zone CIPRES, Ahoussi A., Gbongué F., Planchet F. [2017], Assurances et gestion des risques, Vol. 84 (1-2).
Abstract: An economic scenario generator (ESG) is a tool for projecting economic and financial risk factors. It is an important element in the technical steering of insurance business, notably for the valuation of economic provisions, strategic asset allocation and the management of financial risks. The ESG models found in the literature are difficult to apply in French-speaking sub-Saharan Africa, mainly because of insufficient or non-existent data. To address this problem, we propose in this article an approach for designing a relevant economic scenario generator adapted to the context of the CIPRES zone.

Market inconsistencies of the market-consistent European life insurance economic valuations: pitfalls and practical solutions, N. El Karoui, S. Loisel, J.L. Prigent, J. Vedani, accepted, to appear in European Actuarial Journal (2017).
Abstract: The Solvency II directive has introduced a specific so-called risk-neutral framework to valuate economic accounting quantities throughout European life insurance companies. The adaptation of this theoretical notion for regulatory purposes requires the addition of a specific criterion, namely market-consistency, in order to objectify the choice of the valuation probability measure. This paper aims at pointing out and fixing some of the major risk sources embedded in the current regulatory life insurance valuation scheme. We compare actuarial and financial valuation schemes. We then address operational issues and potential market manipulation sources in life insurance, induced by both theoretical and regulatory pitfalls. For example, we show that calibrating the interest rate model in October 2014 instead of on December 31st, 2014 generates a 140% increase in the economic own funds of a representative French life insurance company. We propose various adaptations of the current implementations, including a product-specific valuation scheme, to limit the impact of these market inconsistencies.

Governance of models and decision-makers’ behavior

Contagion modelling between Financial and Insurance markets with time changed processes, D. Hainaut (2017), Insurance: Mathematics & Economics, Vol. 74, pp. 63-77 (CNRS rank 3, ABS 3*, impact factor 5y 1.748).
In this article, we introduce contagion between non-life insurance liabilities and financial markets through a stochastic clock, which is the integral of a Hawkes process. This is an elegant solution for introducing correlation between a compound Poisson process and a Brownian motion. In this framework, we infer a bound on the insurer’s ruin probability and the optimal investment-reinsurance policy. This work is related to the axis “models for insurance” of the DAMI chair.

Modeling policyholder behavior

Lapse risk in life insurance: correlation and contagion effects among policyholders’ behaviors, F. Barsotti, X. Milhaud, Y. Salhi, Insurance: Mathematics and Economics, (Oct. 2016) Volume 71, pp. 317-331.
Abstract: The present paper proposes a new methodology to model the lapse risk in life insurance by integrating the dynamic aspects of policyholders’ behaviors and the dependency of the lapse intensity on macroeconomic conditions. Our approach, suitable for stable economic regimes as well as stress scenarios, introduces a mathematical framework where the lapse intensity follows a dynamic contagion process, see Dassios and Zhao (2011). This makes it possible to capture both contagion and correlation potentially arising among insureds’ behaviors. In this framework, an external market-driven jump component drives the lapse intensity process depending on the interest rate trajectory: when the spread between the market interest rates and the contractual crediting rate crosses a given threshold, the insurer is likely to experience more surrenders. A log-normal dynamic for the forward rates is introduced to build trajectories of an observable market variable and mimic the effect of a macroeconomic triggering event based on interest rates on the lapse intensity. Contrary to previous works, our shot-noise intensity is not constant and the resulting intensity process is not Markovian. Closed-form expressions and analytic sensitivities for the moments of the lapse intensity are provided, showing how lapses can be affected by massive copycat behaviors. Further analyses are then conducted to illustrate how the mean risk varies depending on the model’s parameters, while a simulation study compares our results with those obtained using standard practices. The numerical outputs highlight a potential misestimation of the expected number of lapses under extreme scenarios when using classical stress testing methodologies.
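
The self-exciting part of such a lapse intensity can be simulated by Ogata-style thinning. A minimal sketch with invented parameters; the paper's dynamic contagion process adds an external interest-rate-driven jump component that is omitted here:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hawkes-type lapse intensity with exponential decay:
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
mu, alpha, beta, T = 0.5, 0.8, 1.5, 50.0
events, t = [], 0.0

def lam(s):
    return mu + sum(alpha * np.exp(-beta * (s - ti)) for ti in events)

while True:
    lam_bar = lam(t)                     # valid bound: intensity only decays
    t += rng.exponential(1.0 / lam_bar)  # candidate next event time
    if t >= T:
        break
    if rng.uniform() < lam(t) / lam_bar: # thinning acceptance step
        events.append(t)                 # a lapse occurs and excites lambda

print(len(events), "lapses; long-run mean rate ~",
      round(mu / (1 - alpha / beta), 3))
```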

Hedging of options in presence of jump clustering, D. Hainaut and F. Moraux, accepted for publication in Journal of Computational Finance (ABS 1*, 5-year impact factor 0.651). Publication forthcoming in 2018.
Participating life insurance contracts contain several long-term options that may not be hedged in financial markets due to a lack of liquidity for maturities over one year. This paper explores the performance of delta-gamma hedging strategies in the presence of jump clustering, modelled by a Hawkes process. This work is related to the axis “models for insurance” of the DAMI chair.

Robust proxies for use in calculating sensitivities and constituting model points, etc., to rapidly obtain assessments with reasonable calculation times

Robust SCR valuation of participating life insurance policies in the Solvency II Framework, D. Hainaut, P. Devolder and A. Pelsser, accepted for publication in Insurance: Mathematics & Economics (CNRS rank 3, ABS 3*, impact factor 5y 1.748). Publication online in 2018.
This article proposes a framework to evaluate the solvency capital requirements of a participating life insurance contract, as defined in the Solvency II regulation. The novelty of our approach consists in taking into account the uncertainty about model specifications. This work is related to the axis “models for insurance” of the DAMI chair.

A Model-Point Approach to Indifference Pricing of Life Insurance Portfolios with Dependent Lives, C. Blanchet-Scalliet, D. Dorobantu, Y. Salhi, Methodology and Computing in Applied Probability (2017), pp. 1-25
Abstract: In this paper, we study the pricing of life insurance portfolios in the presence of dependent lives. We assume that an insurer with an initial exposure to n mortality-contingent contracts wants to acquire a second portfolio constituted of m individuals. The policyholders’ lifetimes in these portfolios are correlated through a Farlie-Gumbel-Morgenstern (FGM) copula, which induces a dependency between the two portfolios. In this setting, we compute the indifference price charged by the insurer endowed with an exponential utility. The optimal price is characterized as a solution to a backward stochastic differential equation (BSDE). The latter can be decomposed into (n – 1)n! auxiliary BSDEs. In this general case, the derivation of the indifference price is computationally infeasible. Therefore, focusing on the example of death benefit contracts, we develop a model-point based approach in order to ease the computation of the price. It consists in replacing each portfolio with a single policyholder that replicates some risk metrics of interest. Also, the two representative agents should adequately reproduce the observed dependency between the initial portfolios.

Nested Kriging predictions for datasets with a large number of observations, Rullière, D., Durrande, N., Bachoc, F., Chevalier, C. (2017), Statistics and Computing, in press.
doi: 10.1007/s11222-017-9766-2, ISSN: 0960-3174 (Print) 1573-1375 (Online).
Abstract: This work falls within the context of predicting the value of a real function at some input locations given a limited number of observations of this function. The Kriging interpolation technique (or Gaussian process regression) is often considered to tackle such a problem, but the method suffers from its computational burden when the number of observation points is large. We introduce in this article nested Kriging predictors which are constructed by aggregating sub-models based on subsets of observation points. This approach is proven to have better theoretical properties than other aggregation methods that can be found in the literature. Contrary to some other methods, the proposed aggregation method can be shown to be consistent. Finally, the practical interest of the proposed method is illustrated on simulated datasets and on an industrial test case with observations in a 6-dimensional space.

Kriging of financial term-structures, Cousin, A., Maatouk, H., Rullière, D. (2016), European Journal of Operational Research, vol. 255, issue 2, pp. 631-648.
doi: 10.1016/j.ejor.2016.05.057, ISSN: 0377-2217, SCIE, CC-ECT.
Abstract: Due to the lack of reliable market information, building financial term-structures may be associated with a significant degree of uncertainty. In this paper, we propose a new term-structure interpolation method that extends classical spline techniques by additionally allowing for quantification of uncertainty. The proposed method is based on a generalization of kriging models with linear equality constraints (market-fit conditions) and shape-preserving conditions such as monotonicity or positivity (no-arbitrage conditions). We define the most likely curve and show how to build confidence bands. The Gaussian process covariance hyper-parameters under the construction constraints are estimated using cross-validation techniques. Based on observed market quotes at different dates, we demonstrate the efficiency of the method by building curves together with confidence intervals for term-structures of OIS discount rates, of zero-coupon swaps rates and of CDS implied default probabilities. We also show how to construct interest-rate surfaces or default probability surfaces by considering time (quotation dates) as an additional dimension.

DATA ANALYTICS IN INSURANCE

Modern analytics methodologies

A neural network analyzer for mortality forecast, D. Hainaut, accepted for publication in the ASTIN Bulletin (CNRS rank 3, ABS 2*, impact factor 1.083, Journal of the International Actuarial Association), published online in 2018. Link to the article
This article proposes a new method based on “bottleneck” neural networks to forecast mortality tables. An empirical study on French, UK and US populations reveals that this approach outperforms traditional actuarial models, such as the Lee-Carter model. This work belongs to the “data analytics for insurance” research axis of the DAMI chair.
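
To make the “bottleneck” idea concrete: it is an autoencoder whose narrow hidden layer compresses the age profile of log-mortality into a few latent factors, playing a role analogous to the period factors of Lee-Carter. The sketch below (assuming Keras is available) uses made-up layer sizes and simulated data; it is not the architecture of the paper.

    import numpy as np
    from tensorflow import keras

    rng = np.random.default_rng(2)

    # Simulated log-mortality surface: 60 years x 100 ages (illustrative)
    log_mx = (-8.0 + 0.08 * np.arange(100)
              + 0.02 * rng.standard_normal((60, 100)))

    model = keras.Sequential([
        keras.layers.Input(shape=(100,)),
        keras.layers.Dense(16, activation="tanh"),
        keras.layers.Dense(3, name="bottleneck"),   # low-dimensional latent factors
        keras.layers.Dense(16, activation="tanh"),
        keras.layers.Dense(100),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(log_mx, log_mx, epochs=200, batch_size=16, verbose=0)

    # Extract the latent factor paths; forecasting then reduces to projecting them
    encoder = keras.Model(model.input, model.get_layer("bottleneck").output)
    print("latent factor paths:", encoder.predict(log_mx, verbose=0).shape)  # (60, 3)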

Tree-based censored regression with applications in insurance, O. Lopez, X. Milhaud, P. Therond, Electronic Journal of Statistics (Oct. 2016), Volume 10, Issue 2, pp. 2685-2716. Link to the article
Abstract: We propose a regression tree procedure to estimate the conditional distribution of a variable which is not directly observed due to censoring. The model that we consider is motivated by applications in insurance, including the analysis of guarantees that involve durations, and claim reserving. We derive consistency results for our procedure, and for the selection of an optimal subtree using a pruning strategy. These theoretical results are supported by a simulation study, and two applications involving insurance datasets. The first concerns income protection insurance, while the second deals with reserving in third-party liability insurance.
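
One standard way to prototype the censoring adjustment — in the spirit of, though not identical to, the weighting used in this line of work — is inverse-probability-of-censoring weighting (IPCW): uncensored observations are reweighted by a Kaplan-Meier estimate of the censoring survival function and fed to an off-the-shelf regression tree. The simulated data and the lifelines/scikit-learn calls below are assumptions of this sketch, not the authors' code.

    import numpy as np
    from lifelines import KaplanMeierFitter
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(3)

    # Simulated censored durations: T = true duration, C = censoring time
    n = 5000
    x = rng.uniform(size=(n, 1))
    t = rng.exponential(1.0 + 2.0 * x[:, 0])
    c = rng.exponential(3.0, size=n)
    y = np.minimum(t, c)                   # observed duration
    delta = (t <= c).astype(float)         # 1 if the duration is uncensored

    # Kaplan-Meier estimate of the censoring survival G(t) = P(C > t)
    km = KaplanMeierFitter().fit(y, event_observed=1.0 - delta)
    G = km.survival_function_at_times(y).to_numpy()

    # IPCW: weight delta / G(Y) for uncensored points, 0 for censored ones
    w = delta / np.clip(G, 1e-3, None)

    tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=100)
    tree.fit(x, y, sample_weight=w)
    print("predicted mean duration at x = 0.9:", tree.predict([[0.9]])[0])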


Access the chair's HAL collection of publications

2016

Old-Age Provision: Past, Present, Future, H. Albrecher, P. Embrechts, D. Filipovic, G. Harrison, P. Koch-Medina, S. Loisel, P. Vanini, J. Wagner, accepted, to appear in European Actuarial Journal (2016). Preprint on SSRN. Link to the article
This article is a collective piece reporting on the discussions held at the conference Old-age Provision: Past, Present, Future, organised at the Swiss Re Center for Global Dialogue in June 2015.

Partial Splitting of Longevity and Financial Risks: The Longevity Nominal Choosing Swaptions, H. Bensusan, N. El Karoui, S. Loisel, Y. Salhi, accepted (2016), to appear in Insurance: Mathematics and Economics 68:61-72. Link to the article
In this article, we propose a product hedging the interest-rate risk of an annuity portfolio. The product allows its holder to adjust, after a few years, the notional of the interest-rate hedge according to the realised or re-estimated longevity trend.

Systemic tail risk distribution. Bienvenüe, A. and Robert, C. (2016). To appear in Annals of Economics and Statistics. Link to the article
In this paper, we propose a model based on extreme value theory that gives the average number of assets in a market suffering a large loss in value when at least one of them experiences a significant drop. This model allows us to measure systemic risk in a financial market and to understand how an insurer should diversify its risks in order to avoid large losses during a financial crisis.
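
For intuition, the target quantity admits a naive empirical counterpart — the average number of assets breaching a large-loss threshold on days when at least one does — which can be computed as below; the paper's EVT-based estimator is of course more refined. Data and thresholds here are simulated and purely illustrative.

    import numpy as np

    rng = np.random.default_rng(4)

    # Simulated daily returns for 30 assets (a real study would use market data)
    returns = 0.01 * rng.standard_t(df=4, size=(2500, 30))

    # A "large loss" = a return below the asset's empirical 1% quantile
    hits = returns < np.quantile(returns, 0.01, axis=0)
    n_hit = hits.sum(axis=1)               # number of assets in distress each day

    # Mean number of assets in distress, given that at least one is
    print("E[# assets hit | at least one hit]:", n_hit[n_hit >= 1].mean())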

Likelihood inference for multivariate extreme value distributions whose spectral vectors have known conditional distributions. Bienvenüe, A. and Robert, C. (2016). To appear in Scandinavian Journal of Statistics. Link to the article
In this paper, we develop the statistical methodology required to estimate the model of the previous paper.

Measuring mortality heterogeneity with multi-state models and interval-censored data, A. Boumezoued, N. El Karoui, S. Loisel, accepted, to appear in Insurance: Mathematics and Economics (2016). Link to the article
In this article, we explain how to measure mortality in a heterogeneous population within a multi-state model. This amounts to modelling longevity through population dynamics.

Pilotage de la participation aux bénéfices et calcul de l’option de revalorisation, Combes F., Planchet F., Tammar M. [2016], Bulletin Français d’Actuariat, vol. 16, no. 31. Link to the article

Benchmarking asset allocation strategies in the presence of liability constraints. Cousin, A., Jiao, Y., Robert, C. and Zerbib, D. (2016). To appear in Insurance: Mathematics and Economics. Link to the article
In this paper, we present an optimal asset allocation strategy in the presence of credit risk when the asset manager is subject to liability constraints.

Impact of volatility clustering on equity indexed annuities, D. Hainaut, 2016, Insurance: Mathematics & Economics 71, pp. 367-381. Link to the article
This study analyses the impact of volatility clustering in stock markets on the evaluation and risk management of equity indexed annuities (EIA). To introduce clustering in equity returns, the reference index is modelled by a diffusion combined with a bivariate self-excited jump process.

A bivariate Hawkes process for interest rates modelling. D. Hainaut, 2016, Economic Modelling 57, pp. 180-196. Link to the article
This paper proposes a continuous time model for interest rates, based on a bivariate mutually exciting point process. The two components of this process represent the global supply and demand for fixed income instruments. In this framework, closed form expressions are obtained for the first moments of the short term rate and for bonds, under an equivalent affine risk neutral measure.
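
For readers unfamiliar with mutually exciting point processes: each jump raises the intensity of future jumps, which then decays over time. Below is a minimal univariate sketch (the paper's model couples two such intensities) using Ogata's thinning algorithm, with purely illustrative parameters.

    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_hawkes(mu, alpha, beta, horizon, rng):
        """Ogata thinning for lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))."""
        events, t = [], 0.0
        while True:
            # Between events the intensity only decays, so lambda(t) is an upper bound
            lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
            t += rng.exponential(1.0 / lam_bar)
            if t >= horizon:
                return np.array(events)
            lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
            if rng.uniform() <= lam_t / lam_bar:     # accept with prob lam_t / lam_bar
                events.append(t)

    ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=1000.0, rng=rng)
    # Stationarity requires alpha/beta < 1; long-run rate is mu / (1 - alpha/beta)
    print("simulated:", len(ev), "events; theoretical rate:", 0.5 / (1 - 0.8 / 1.5))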

Tree-based censored regression with applications in insurance, O. Lopez, X. Milhaud, P. Therond, Electronic Journal of Statistics (2016), Volume 10, Issue 2, pp. 2685-2716. Link to the article
This paper addresses the problem of incomplete information (censored data) in statistical learning algorithms, more specifically classification and regression trees (CART). Convergence questions for tree estimators with this type of data are studied, and the insurance application compares estimates of final claim amounts produced by experts with those produced by CART trees.

Wind Storm Risk Management: Sensitivity of Return Period Calculations and Spread on the Territory, A. Mornet, T. Opitz, M. Luzi, S. Loisel, accepted, to appear in Stochastic Environmental Research and Risk Assessment (SERRA) (2016). Link to the article
This article combines meteorological and insurance data to propose a storm severity index, and studies the robustness of return-period calculations for extreme storms.
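
As background on return periods: if annual maxima of a severity index are fitted by a generalized extreme value (GEV) distribution, the T-year return level is the (1 - 1/T) quantile of that fit, and its sensitivity can be probed by resampling. A sketch on simulated maxima (not the article's data or index) follows.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)

    # Hypothetical annual maxima of a storm severity index (simulated)
    annual_max = stats.genextreme.rvs(c=-0.1, loc=30.0, scale=8.0,
                                      size=80, random_state=rng)

    # Fit a GEV; the 50-year return level is the 98% quantile of annual maxima
    shape, loc, scale = stats.genextreme.fit(annual_max)
    level = stats.genextreme.ppf(1.0 - 1.0 / 50.0, shape, loc=loc, scale=scale)
    print("estimated 50-year return level: %.1f" % level)

    # Robustness of the calculation, probed by a simple bootstrap of the fit
    boot = [stats.genextreme.ppf(0.98, *stats.genextreme.fit(
                rng.choice(annual_max, size=annual_max.size)))
            for _ in range(200)]
    print("bootstrap 5%%-95%% spread: %s" % np.quantile(boot, [0.05, 0.95]).round(1))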

A Credibility Approach for the Makeham Mortality Law. Y. Salhi, P.-E. Thérond, J. Tomas (2016), European Actuarial Journal 6(1), 61-96. Link to the article
We propose a credibility approach that revises, as new observations arrive, the parameters of a Makeham fit of the mortality tables. This adds structure, which proves useful when portfolios are of limited size, and the proposed revision process preserves a good representation of mortality across ages.
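
For reference, the Makeham law writes the force of mortality at age x as mu(x) = A + B·c^x. The credibility machinery of the paper updates (A, B, c) sequentially; the sketch below only shows a static least-squares fit of the law to crude rates, on made-up data.

    import numpy as np
    from scipy.optimize import curve_fit

    def makeham(x, A, B, c):
        """Makeham force of mortality: mu(x) = A + B * c**x."""
        return A + B * c ** x

    ages = np.arange(40, 91, dtype=float)
    rng = np.random.default_rng(7)
    # Crude rates from a small portfolio (illustrative, noisy around a Makeham law)
    crude = makeham(ages, 5e-4, 2e-5, 1.10) * rng.lognormal(0.0, 0.05, ages.size)

    params, _ = curve_fit(makeham, ages, crude, p0=[1e-3, 1e-5, 1.1],
                          bounds=([0.0, 0.0, 1.0], [0.01, 1e-3, 1.3]))
    print("fitted (A, B, c):", np.round(params, 6))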

Period 2010-2015 – Management de la Modélisation en Assurance chair

Since its launch in October 2010, the academic research carried out within the chair has focused on the following themes.

  • Explicit computation of ruin probabilities in settings with dependence between claim amounts and/or inter-claim times.
  • Assessment of estimation risk and of the error related to risk aggregation.
  • Statistical estimation of risk indicators in the presence of several risk factors (multidimensional problems), with applications to risk theory, ERM (Enterprise Risk Management) and the Solvency II framework.
  • Modelling of rare or extreme events and its applications in risk theory.
  • Computation of the risk margin for non-life insurance risks under Solvency II.
  • Modelling of risks related to demography and to catastrophic events.
  • Management of correlated risks, in particular credit, counterparty and liquidity risk.
  • Accounting aspects and issues related to the implementation of Solvency II.

ACADEMIC RESEARCH

Over the 2010-2015 period, the academic research carried out within the chair fell into three main themes:

  1. New developments in risk theory: accounting for dependence, explicit computation of ruin probabilities, multivariate risk indicators, rare or extreme events, economic analysis of risk.
  2. Economic capital management for insurance companies: adaptation to the Solvency II framework, economic scenario generators, model risk, risk aggregation, capital allocation, risk margin.
  3. Default, liquidity and demographic risks: correlated default risks, counterparty and own-credit risks, impact of new regulations on the cost and sustainability of funding, modelling of longevity and mortality risks.