Published academic works

2017

MODELS FOR INSURANCE

Measures of risk and performance in the management of insurance organisations > LONGEVITY

A Class of Random Field Memory Models for Mortality Forecasting, P. Doukhan, J. Rynkiewicz, D. Pommeret, Y. Salhi, Insurance: Mathematics and Economics, Vol. 77 (2017), pp. 97-110. Link to the article
Abstract: This article proposes a parsimonious alternative approach for modeling the stochastic dynamics of mortality rates. Instead of the commonly used factor-based decomposition framework, we consider modeling mortality improvements using a random field specification with a given causal structure. Such a class of models introduces dependencies among adjacent cohorts, aiming at capturing, among other features, cohort effects and cross-generation correlations. It also describes the conditional heteroskedasticity of mortality. The proposed model is a generalization of the now widely used AR-ARCH models for random processes. For this class of models, we propose an estimation procedure for the parameters. Formally, we use the quasi-maximum likelihood estimator (QMLE) and show its statistical consistency and the asymptotic normality of the estimated parameters. The framework being general, we investigate and illustrate a simple variant, called the three-level memory model, in order to fully understand and assess the effectiveness of the approach for modeling mortality dynamics.
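As a purely illustrative sketch (the paper's exact causal neighbourhood and parametrisation may differ), a random-field AR-ARCH specification on mortality improvements Z_{x,t}, indexed by age x and year t, could read:

\[
Z_{x,t} = \sum_{(i,j)\in\mathcal{N}} a_{i,j}\, Z_{x-i,\,t-j} + \sigma_{x,t}\,\varepsilon_{x,t},
\qquad
\sigma_{x,t}^{2} = b_{0} + \sum_{(i,j)\in\mathcal{N}} b_{i,j}\, Z_{x-i,\,t-j}^{2},
\]

where \(\mathcal{N}\) is a finite causal neighbourhood of past ages and years, the \(\varepsilon_{x,t}\) are i.i.d. innovations, and the parameters \((a_{i,j}, b_{i,j})\) would be estimated by QMLE as described in the abstract.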

Do actuaries believe in longevity deceleration?, Debonneuil E., Loisel S., Planchet F. [2017], Insurance: Mathematics and Economics, to appear. Link to the article
Abstract: As more and more people believe that significant life extensions may come soon, should commonly used future mortality assumptions be considered prudent? We find here that commonly used actuarial tables for annuitants – as well as the Lee-Carter model – do not extrapolate life expectancy at the same rate for future years as for past years; instead they produce some longevity deceleration. This is typically because their mortality improvements decrease after a certain age, and those age-specific improvements are constant over time. As potential alternatives, i) we study the Bongaarts model that produces straight increases in life expectancy; ii) we adapt it to produce best-practice longevity trends; iii) we compare with various longevity scenarios, even including a model for “life extension velocity”; iv) after gathering advances in biogerontology, we discuss elements to help retirement systems cope with a potentially strong increase in life expectancy.
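For reference, the Lee-Carter model mentioned above decomposes the log central death rate \(m_{x,t}\) at age x in year t as

\[
\ln m_{x,t} = a_x + b_x\, k_t + \varepsilon_{x,t},
\]

where the period index \(k_t\) is usually extrapolated as a random walk with drift. Because the age-specific loading \(b_x\) is constant over time, age-specific improvement rates do not evolve, which is the mechanism behind the longevity deceleration discussed in the paper.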

Minimax optimality in robust detection of a disorder time in doubly-stochastic Poisson processes, N. El Karoui, S. Loisel, Y. Salhi, Annals of Applied Probability, Vol. 27, No. 4 (2017), pp. 2515-2538. Link to the article
Abstract: We consider the minimax quickest detection problem of an unobservable time of change in the rate of an inhomogeneous Poisson process. We seek a stopping rule that minimizes the robust Lorden (1971) criterion, formulated in terms of the number of events until detection, both for the worst-case delay and the false alarm constraint. In the Wiener case, such a problem has been solved using the so-called cumulative sums (cusum) strategy by Shiryaev (1963, 2009) or Moustakides (2004), among others. In our setting, we derive the exact optimality of the cusum stopping rule by using finite variation calculus and elementary martingale properties to characterize the performance functions of the cusum stopping rule in terms of scale functions. These are solutions of some delayed differential equations that we solve elementarily. The case of detecting a decrease in the intensity is easy to study because the performance functions are continuous. In the case of an increase where the performance functions are not continuous, martingale properties require using a discontinuous local time. Nevertheless, from an identity relating the scale functions, the optimality of the cusum rule still holds. Finally, some numerical illustrations are provided.
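As a schematic reminder (not the paper's exact formulation), the cusum rule tracks the running maximum of a log-likelihood-ratio statistic between the post-change and pre-change laws \(P_1\) and \(P_0\), and raises an alarm when it exceeds a threshold:

\[
U_t = \log \frac{dP_1}{dP_0}\Big|_{\mathcal{F}_t},
\qquad
S_t = U_t - \inf_{0 \le s \le t} U_s,
\qquad
\tau = \inf\{\, t \ge 0 : S_t \ge b \,\},
\]

where the threshold \(b\) is calibrated to the false alarm constraint; here the criterion is expressed in terms of the number of observed events rather than calendar time.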

Basis risk modeling: A co-integration based approach, Y. Salhi, S. Loisel, Statistics, Vol. 51, No. 1 (2017), pp. 205-221. Link to the article
Abstract: In this paper we propose a multivariate approach for forecasting pairwise mortality rates of related populations. The need for joint modeling of mortality rates is analyzed using a causality test. We show that for the datasets considered, the inclusion of national mortality information enhances predictions on its sub-populations. The investigated approach links national population mortality to that of a subset population, using an econometric model that captures a long-term relationship between the two mortality dynamics. This model does not focus on the correlation between the mortality rates of the two populations, but rather their long-term behavior, which suggests that the two time series cannot wander off in opposite directions for long before mean reverting, which is consistent with biological reasoning. The model can additionally capture short-term adjustments in the mortality dynamics of the two populations. An empirical comparison of the forecast of one-year death probabilities for policyholders is performed using both a classical factor-based model and the proposed approach. The robustness of the model is tested on mortality rate data for England and Wales, alongside the Continuous Mortality Investigation assured lives data set, representing the sub-population.
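To fix ideas, a generic (hedged) error-correction formulation of such a long-term link between a sub-population log-mortality series \(x_t\) and the national series \(y_t\) could be written as

\[
\Delta x_t = \mu + \alpha \left( x_{t-1} - \beta\, y_{t-1} - c \right) + \gamma\, \Delta y_t + \varepsilon_t ,
\]

where the cointegrating relation \(x_{t-1} - \beta y_{t-1} - c\) captures the long-run equilibrium, \(\alpha < 0\) drives mean reversion towards it, and \(\gamma \Delta y_t\) captures the short-term adjustments; the exact specification used in the paper may differ.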

Utilisation des estimateurs de Kaplan-Meier par génération et de Hoem pour la construction de tables de mortalité prospectives ?, Guibert Q., Planchet F. [2017], Bulletin Français d’Actuariat, vol. 17, n°33. Link to the article
Abstract: Data quality is an overarching concern when it comes to building a mortality model or prospective mortality tables. This is even more significant when these procedures are based on a small population, as data may show major random fluctuations due to a lack of information for particular ages. Such situations arise frequently with the entry into force of Solvency II, as insurers shall consider their own data sets, limited in size, to build best estimate tables. Since parametric methods are too rough to capture a realistic mortality pattern in two dimensions, the mortality profile is quite often adjusted using exogenous information, such as a table based on a national population. In light of this, the aim of this paper is to discuss the problem of choosing appropriate estimators for two-dimensional mortality rates or death rates in the presence of independent censoring. Indeed, practitioners currently use the Hoem estimator or the Kaplan-Meier estimator split by generation without questioning their relevance and reliability. We propose in this paper a comparative analysis of these estimators, try to give some criteria to choose one approach over another, and give some figures based on a real insurance portfolio and simulated data. Finally, we provide some non-parametric estimators for a direct estimation of death rates with both the cohort and the period approaches.
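For context, the Kaplan-Meier estimator referred to here is the standard product-limit estimator of the survival function under independent right censoring,

\[
\hat{S}(t) = \prod_{t_i \le t} \left( 1 - \frac{d_i}{n_i} \right),
\]

where \(d_i\) is the number of deaths observed at time \(t_i\) and \(n_i\) the number of individuals still at risk just before \(t_i\); the Hoem (actuarial) estimator instead computes age-specific death rates as death counts divided by suitably adjusted exposures.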

Le risque de longévité est-il assurable ?, N. El Karoui, S. Loisel, Revue d’Economie Financière (2017). Link to the article
Abstract: In this article, we study the question of the insurability of longevity risk through the lens of enterprise risk management (ERM). We begin by identifying the different components of longevity risk. We then present ways of modelling, measuring and detecting changes in this risk. Finally, we examine the possible risk controls and the associated residual risks.

Measures of risk and performance in the management of insurance organisations > MULTIVARIATE DEPENDENCE MODELING

Tail approximations for sums of dependent regularly varying variables under Archimedean copula models, H. Cossette, E. Marceau, Q.H. Nguyen and C. Y. Robert (2017), to appear in Methodology and Computing in Applied Probability. Link to the article
Abstract: In this paper, we compare two numerical methods for approximating the probability that the sum of dependent regularly varying random variables exceeds a high threshold under Archimedean copula models. The first method is based on conditional Monte Carlo. We present four estimators and show that most of them have bounded relative errors. The second method is based on analytical expressions of the multivariate survival or cumulative distribution functions of the regularly varying random variables and provides sharp and deterministic bounds of the probability of exceedance. We discuss implementation issues and illustrate the accuracy of both procedures through numerical studies.
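As a reminder, a d-dimensional Archimedean copula is built from a generator \(\psi\) (a d-monotone function with \(\psi(0) = 1\) and \(\lim_{t\to\infty}\psi(t) = 0\)) via

\[
C(u_1, \dots, u_d) = \psi\!\left( \psi^{-1}(u_1) + \cdots + \psi^{-1}(u_d) \right),
\]

so the dependence structure of the regularly varying summands considered in the paper is entirely driven by the choice of \(\psi\).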

A note on upper-patched generators for Archimedean copulas, Di Bernardino, E., Rullière, D. (2017), ESAIM: Probability and Statistics, to appear.
doi: 10.1051/ps/2017003, ISSN: 1292-8100 – eISSN: 1262-3318. Link to the article
Abstract: The class of multivariate Archimedean copulas is defined by using a real-valued function called the generator of the copula. This generator satisfies some properties, including d-monotonicity. We propose here a new basic transformation of this generator, preserving these properties, thus ensuring the validity of the transformed generator and inducing a proper valid copula. This transformation acts only on a specific portion of the generator; it allows both the non-reduction of the likelihood on a given dataset and the choice of the upper tail dependence coefficient of the transformed copula. Numerical illustrations show the utility of this construction, which can improve the fit of a given copula both on its central part and its tail.

On finite exchangeable sequences and their dependence, C. Lefèvre, S. Loisel, S. Utev, to appear in Journal of Multivariate Analysis (2017). Link to the article
Abstract: This paper deals with finite sequences of exchangeable 0–1 random variables. Our main purpose is to exhibit the dependence structure between such indicators. Working with Kendall’s representation by mixture, we prove that a convex order of higher degree on the mixing variable implies a supermodular order of same degree on the indicators, and conversely. The convex order condition is then discussed for three standard distributions (binomial, hypergeometric and Stirling) in which the parameter is randomized. Distributional properties of exchangeable indicators are also revisited using an underlying Schur-constant property. Finally, two applications in insurance and credit risk illustrate some of the results.

Measures of risk and performance in the management of insurance organisations > FINANCIAL MODELS AND MARKET CONSISTENCY

Continuous Mixed-Laplace Jump Diffusion models for stocks and commodities, D. Hainaut (2017), Quantitative Finance and Economics, Vol. 1(2), pp. 145-173. Link to the article
In this paper, we propose new dynamics for the jumps affecting stock and commodity prices. The jump process is the continuous limit of a mixture of compound Poisson processes. This model is compatible with Solvency II recommendations for risk management and is related to the axis “models for insurance” of the DAMI chair.

Clustered Lévy processes and their financial applications, D. Hainaut (2017), accepted in Journal of Computational and Applied Mathematics (5-year impact factor 1.413), Vol. 319, pp. 117-140. Link to the article
This article studies the properties of Lévy processes that are time changed by a stochastic clock which is the integral of a Hawkes process. The purpose of this model is to introduce jump clustering in the dynamics of financial assets or insurance liabilities. This work is related to the axis “models for insurance” of the DAMI chair.
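For readers unfamiliar with Hawkes processes, a one-dimensional self-exciting counting process with exponential kernel has the stochastic intensity

\[
\lambda_t = \lambda_{\infty} + \sum_{t_i < t} \alpha\, e^{-\beta (t - t_i)},
\qquad 0 \le \alpha < \beta,
\]

so each past event \(t_i\) temporarily raises the probability of further events, which is what generates jump clustering; in the paper, the business clock used to time-change the Lévy process is built by integrating such a process.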

Proposition d’un modèle de projection des scénarios économiques pour le développement de la zone CIPRES, Ahoussi A., Gbongué F., Planchet F. [2017], Assurances et gestion des risques, Vol. 84 (1-2). Link to the article
Abstract: An economic scenario generator (ESG) is a tool for projecting economic and financial risk factors. It is an important element in the technical steering of an insurance business, notably for the valuation of economic provisions, strategic asset allocation and the management of financial risks. The ESG models found in the literature are difficult to apply in French-speaking sub-Saharan Africa, mainly because data are insufficient or non-existent. To address this problem, we propose in this article an approach for designing a relevant economic scenario generator adapted to the context of the CIPRES zone.

Market inconsistencies of the market-consistent European life insurance economic valuations: pitfalls and practical solutions, N. El Karoui, S. Loisel, J.L. Prigent, J. Vedani, accepted, to appear in European Actuarial Journal (2017). Link to the article
Abstract: The Solvency II directive has introduced a specific so-called risk-neutral framework to valuate economic accounting quantities throughout European life insurance companies. The adaptation of this theoretical notion for regulatory purposes requires the addition of a specific criterion, namely market-consistency, in order to objectify the choice of the valuation probability measure. This paper aims at pointing out and fixing some of the major risk sources embedded in the current regulatory life insurance valuation scheme. We compare actuarial and financial valuation schemes. We then address first operational issues and potential market manipulation sources in life insurance, induced by both theoretical and regulatory pitfalls. For example, we show that calibrating the interest rate model in October 2014 instead of December 31st, 2014 generates a 140% increase in the economic own funds of a representative French life insurance company. We propose various adaptations of the current implementations, including product-specific valuation schemes, to limit the impact of these market inconsistencies.

Governance of models and decision-makers’ behavior

Contagion modelling between Financial and Insurance markets with time changed processes, D. Hainaut (2017), Insurance: Mathematics & Economics, Vol. 74, pp. 63-77 (CNRS rank 3, ABS 3*, 5-year impact factor 1.748). Link to the article
In this article, we introduce contagion between non-life insurance liabilities and financial markets through a stochastic clock, which is the integral of a Hawkes process. This is an elegant solution to introduce correlation between a compound Poisson process and a Brownian motion. In this framework, we infer a bound on the insurer’s ruin probability and the optimal investment-reinsurance policy. This work is related to the axis “models for insurance” of the DAMI chair.

Modeling policyholder behavior

Lapse risk in life insurance: correlation and contagion effects among policyholders’ behaviors, F. Barsotti, X. Milhaud, Y. Salhi, Insurance: Mathematics and Economics (Oct. 2016), Volume 71, pp. 317-331. Link to the article
Abstract: The present paper proposes a new methodology to model the lapse risk in life insurance by integrating the dynamic aspects of policyholders’ behaviors and the dependency of the lapse intensity on macroeconomic conditions. Our approach, suitable to stable economic regimes as well as stress scenarios, introduces a mathematical framework where the lapse intensity follows a dynamic contagion process, see Dassios and Zhao (2011). This allows us to capture both contagion and correlation potentially arising among insureds’ behaviors. In this framework, an external market driven jump component drives the lapse intensity process depending on the interest rate trajectory: when the spread between the market interest rates and the contractual crediting rate crosses a given threshold, the insurer is likely to experience more surrenders. A log-normal dynamic for the forward rates is introduced to build trajectories of an observable market variable and mimic the effect of a macroeconomic triggering event based on interest rates on the lapse intensity. Contrary to previous works, our shot-noise intensity is not constant and the resulting intensity process is not Markovian. Closed-form expressions and analytic sensitivities for the moments of the lapse intensity are provided, showing how lapses can be affected by massive copycat behaviors. Further analyses are then conducted to illustrate how the mean risk varies depending on the model’s parameters, while a simulation study compares our results with those obtained using standard practices. The numerical outputs highlight a potential misestimation of the expected number of lapses under extreme scenarios when using classical stress testing methodologies.
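Schematically, and without reproducing the authors' exact dynamics, a dynamic-contagion lapse intensity combines self-excited jumps with externally triggered, shot-noise-type shocks:

\[
\lambda_t = \lambda_0(t) + \sum_{T_k \le t} Y_k\, e^{-\delta (t - T_k)} + \sum_{S_j \le t} Z_j\, e^{-\delta (t - S_j)},
\]

where the \(S_j\) are market-driven trigger times (here, crossings of a threshold by the spread between market rates and the crediting rate), the \(T_k\) are lapse events themselves, the \(Y_k, Z_j\) are jump sizes and \(\delta\) is a decay rate; because the external jumps depend on the interest rate path, the resulting intensity is not Markovian, as stressed in the abstract.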

Hedging of options in presence of jump clustering, D. Hainaut and F. Moraux. Accepted for publication in Journal of Computational Finance (ABS 1*, 5-year impact factor 0.651). Publication forthcoming in 2018. Link to the article
Participating life insurance contracts contain several long-term options that may not be hedged in financial markets due to a lack of liquidity for maturities over one year. This paper explores the performance of delta-gamma hedging strategies in the presence of jump clustering, modelled by a Hawkes process. This work is related to the axis “models for insurance” of the DAMI chair.

Robust proxies for calculating sensitivities, constructing model points, etc., in order to rapidly obtain assessments within reasonable calculation times

Robust SCR valuation of participating life insurance policies in the Solvency II Framework, D. Hainaut, P. Devolder and A. Pelsser. Accepted for publication in Insurance: Mathematics & Economics (CNRS rank 3, ABS 3*, 5-year impact factor 1.748). Publication online in 2018. Link to the article
This article proposes a framework to evaluate the solvency capital requirement of a participating life insurance contract, as defined in the Solvency II regulation. The novelty of our approach consists of taking into account the uncertainty about model specifications. This work is related to the axis “models for insurance” of the DAMI chair.

A Model-Point Approach to Indifference Pricing of Life Insurance Portfolios with Dependent Lives, C. Blanchet-Scalliet, D. Dorobantu, Y. Salhi, Methodology and Computing in Applied Probability (2017), pp. 1-25. Link to the article
Abstract: In this paper, we study the pricing of life insurance portfolios in the presence of dependent lives. We assume that an insurer with an initial exposure to n mortality-contingent contracts wants to acquire a second portfolio consisting of m individuals. The policyholders’ lifetimes in these portfolios are correlated with a Farlie-Gumbel-Morgenstern (FGM) copula, which induces a dependency between the two portfolios. In this setting, we compute the indifference price charged by the insurer endowed with an exponential utility. The optimal price is characterized as a solution to a backward stochastic differential equation (BSDE). The latter can be decomposed into (n – 1)n! auxiliary BSDEs. In this general case, the derivation of the indifference price is computationally infeasible. Therefore, while focusing on the example of death benefit contracts, we develop a model point based approach in order to ease the computation of the price. It consists in replacing each portfolio with a single policyholder that replicates some risk metrics of interest. Also, the two representative agents should adequately reproduce the observed dependency between the initial portfolios.
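For reference, the bivariate Farlie-Gumbel-Morgenstern copula used here to correlate lifetimes is

\[
C_\theta(u, v) = u\,v\,\bigl[\, 1 + \theta\,(1-u)(1-v) \,\bigr],
\qquad \theta \in [-1, 1],
\]

a simple family that permits only moderate positive or negative dependence between the lifetimes of the two portfolios.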

Nested Kriging predictions for datasets with a large number of observations, Rullière, D., Durrande, N., Bachoc, F., Chevalier, C. (2017), Statistics and Computing, in press. Link to the article
doi: 10.1007/s11222-017-9766-2, ISSN: 0960-3174 (Print), 1573-1375 (Online).
Abstract: This work falls within the context of predicting the value of a real function at some input locations given a limited number of observations of this function. The Kriging interpolation technique (or Gaussian process regression) is often considered to tackle such a problem, but the method suffers from its computational burden when the number of observation points is large. We introduce in this article nested Kriging predictors which are constructed by aggregating sub-models based on subsets of observation points. This approach is proven to have better theoretical properties than other aggregation methods that can be found in the literature. Contrary to some other methods, the proposed aggregation method can be shown to be consistent. Finally, the practical interest of the proposed method is illustrated on simulated datasets and on an industrial test case with observations in a 6-dimensional space.
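As background, the exact (non-aggregated) Kriging/Gaussian-process predictor that becomes expensive for large n has, in the zero-mean case, conditional mean and variance

\[
m(x) = k(x)^{\top} K^{-1} y,
\qquad
s^{2}(x) = k(x, x) - k(x)^{\top} K^{-1} k(x),
\]

where \(K\) is the \(n \times n\) covariance matrix of the observations and \(k(x)\) the vector of covariances between the new point \(x\) and the n observed inputs; the \(O(n^{3})\) cost of factorising \(K\) is what motivates aggregating sub-models built on subsets of the data.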

Kriging of financial term-structures, Cousin, A., Maatouk, H., Rullière, D. (2016), European Journal of Operational Research, vol. 255, issue 2, pp. 631-648.
doi: 10.1016/j.ejor.2016.05.057, ISSN: 0377-2217, SCIE, CC-ECT. Link to the article
Abstract: Due to the lack of reliable market information, building financial term-structures may be associated with a significant degree of uncertainty. In this paper, we propose a new term-structure interpolation method that extends classical spline techniques by additionally allowing for quantification of uncertainty. The proposed method is based on a generalization of kriging models with linear equality constraints (market-fit conditions) and shape-preserving conditions such as monotonicity or positivity (no-arbitrage conditions). We define the most likely curve and show how to build confidence bands. The Gaussian process covariance hyper-parameters under the construction constraints are estimated using cross-validation techniques. Based on observed market quotes at different dates, we demonstrate the efficiency of the method by building curves together with confidence intervals for term-structures of OIS discount rates, of zero-coupon swaps rates and of CDS implied default probabilities. We also show how to construct interest-rate surfaces or default probability surfaces by considering time (quotation dates) as an additional dimension.

DATA ANALYTICS IN INSURANCE

Modern analytics methodologies

A neural network analyzer for mortality forecast, D. Hainaut, accepted for publication in the ASTIN Bulletin (CNRS rank 3, ABS 2*, impact factor 1.083, Journal of the International Actuarial Association), publication online in 2018. Link to the article
This article proposes a new method based on “bottleneck” neural networks to forecast mortality tables. An empirical study on the French, UK and US populations reveals that our approach outperforms traditional actuarial models, such as the Lee-Carter model. This work is related to the axis “data analytics for insurance” of the DAMI chair.
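As a purely illustrative sketch (the paper's architecture, data pipeline and hyper-parameters are not reproduced here), a "bottleneck" network can be set up as a small autoencoder on a matrix of log-mortality rates, for instance with Keras; the random matrix below is only a stand-in for real data:

import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
mx = 0.0001 + 0.1 * rng.random((60, 100))   # stand-in for observed central death rates (years x ages)
log_mx = np.log(mx)
n_ages = log_mx.shape[1]

model = keras.Sequential([
    keras.layers.Input(shape=(n_ages,)),
    keras.layers.Dense(16, activation="tanh"),    # encoder
    keras.layers.Dense(2, activation="linear"),   # bottleneck: low-dimensional mortality factors
    keras.layers.Dense(16, activation="tanh"),    # decoder
    keras.layers.Dense(n_ages, activation="linear"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(log_mx, log_mx, epochs=200, verbose=0)  # learn to reconstruct each year's log-mortality curve

smoothed = model.predict(log_mx)                  # reconstructed (smoothed) log-mortality surface

Under this sketch, forecasting would then amount to extrapolating the time series of bottleneck activations and decoding them, a step omitted here.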

Tree-based censored regression with applications in insurance, O. Lopez, X. Milhaud, P. Therond, Electronic Journal of Statistics (Oct. 2016), Volume 10, issue 2, pp. 2685-2716. Link to the article
Abstract: We propose a regression tree procedure to estimate the conditional distribution of a variable which is not directly observed due to censoring. The model that we consider is motivated by applications in insurance, including the analysis of guarantees that involve durations, and claim reserving. We derive consistency results for our procedure, and for the selection of an optimal subtree using a pruning strategy. These theoretical results are supported by a simulation study, and two applications involving insurance datasets. The first concerns income protection insurance, while the second deals with reserving in third-party liability insurance.
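One common (hedged) way to mimic this kind of procedure with off-the-shelf tools is to weight the fully observed responses by inverse-probability-of-censoring (IPCW) weights before growing a regression tree; the sketch below, on simulated data, is an illustration under that assumption and not the authors' exact algorithm:

import numpy as np
from lifelines import KaplanMeierFitter
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                                 # covariates
true_T = np.exp(1.0 + X[:, 0] + 0.5 * rng.normal(size=500))   # latent durations
C = np.exp(1.5 + 0.5 * rng.normal(size=500))                  # censoring times
T = np.minimum(true_T, C)                                     # observed durations
delta = (true_T <= C).astype(float)                           # 1 = uncensored

# Kaplan-Meier estimate of the censoring survival function G(t) = P(C > t)
kmf = KaplanMeierFitter()
kmf.fit(T, event_observed=1 - delta)
G_hat = np.clip(kmf.survival_function_at_times(T).values, 1e-3, None)

weights = delta / G_hat            # IPCW weights: censored observations get weight zero

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=30)
tree.fit(X, T, sample_weight=weights)   # weighted CART on the observed durations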


Access the HAL collection of the chair’s publications

2016

Old-Age Provision: Past, Present, Future, H. Albrecher, P. Embrechts, D. Filipovic, G. Harrison, P. Koch-Medina, S. Loisel, P. Vanini, J. Wagner, accepted, to appear in European Actuarial Journal (2016). Preprint on SSRN. Link to the article
This article is a collective work reporting on the discussions of the Old-Age Provision: Past, Present, Future conference organised at the Swiss Re Center for Global Dialogue in June 2015.

Partial Splitting of Longevity and Financial Risks: The Longevity Nominal Choosing Swaptions, H. Bensusan, N. El Karoui, S. Loisel, Y. Salhi, accepted (2016), to appear in Insurance: Mathematics and Economics, 68:61-72. Link to the article
In this article, we propose an interest-rate hedging product for an annuity portfolio. This product allows its holder to adjust, after a few years, the notional of its interest-rate hedge according to the realised or re-estimated longevity trend.

Systemic tail risk distribution, Bienvenüe, A. and Robert, C. (2016). To appear in Annals of Economics and Statistics. Link to the article
In this paper, we propose a model based on extreme value theory that gives the average number of assets in a market that suffer a large loss in value when at least one of them experiences a significant drop in value. This model allows us to measure systemic risk in a financial market and to understand how an insurer should diversify its risks in order to avoid large losses during a financial crisis.

Likelihood inference for multivariate extreme value distributions whose spectral vectors have known conditional distributions, Bienvenüe, A. and Robert, C. (2016). To appear in Scandinavian Journal of Statistics. Link to the article
In this paper, we develop the statistical approach required to estimate the model of the previous paper.

Measuring mortality heterogeneity with multi-state models and interval-censored data, A. Boumezoued, N. El Karoui, S. Loisel, accepted, to appear in Insurance: Mathematics and Economics (2016). Link to the article
In this article, we explain how to measure mortality in a heterogeneous population using a multi-state model. This corresponds to modelling longevity through population dynamics.

Pilotage de la participation aux bénéfices et calcul de l’option de revalorisation, Combes F., Planchet F., Tammar M. [2016], Bulletin Français d’Actuariat, vol. 16, n°31. Link to the article

Benchmarking asset allocation strategies in the presence of liability constraints, Cousin, A., Jiao, Y., Robert, C. and Zerbib, D. (2016). To appear in Insurance: Mathematics and Economics. Link to the article
In this paper, we present an optimal asset allocation strategy in the presence of credit risk when the asset manager is subject to liability constraints.

Impact of volatility clustering on equity indexed annuities, D. Hainaut (2016), Insurance: Mathematics & Economics, Vol. 71, pp. 367-381. Link to the article
This study analyses the impact of volatility clustering in stock markets on the evaluation and risk management of equity indexed annuities (EIA). To introduce clustering in equity returns, the reference index is modelled by a diffusion combined with a bivariate self-excited jump process.

A bivariate Hawkes process for interest rates modelling, D. Hainaut (2016), Economic Modelling, Vol. 57, pp. 180-196. Link to the article
This paper proposes a continuous time model for interest rates, based on a bivariate mutually exciting point process. The two components of this process represent the global supply and demand for fixed income instruments. In this framework, closed form expressions are obtained for the first moments of the short term rate and for bonds, under an equivalent affine risk neutral measure.

Tree-based censored regression with applications in insurance, O. Lopez, X. Milhaud, P. Therond, Electronic Journal of Statistics (2016), Volume 10, issue 2, pp. 2685-2716. Link to the article
This paper deals with the issue of incomplete information (censored data) in statistical learning algorithms, more precisely in classification and regression trees (CART). Convergence of the tree estimators with this type of data is studied, and the insurance application compares estimates of ultimate claim amounts produced by experts with those produced by CART trees.

Wind Storm Risk Management: Sensitivity of Return Period Calculations and Spread on the Territory, A. Mornet, T. Opitz, M. Luzi, S. Loisel, accepted, to appear in SERRA (2016). Link to the article
This article combines meteorological and insurance data to propose a storm severity index, and studies the robustness of return-period calculations for extreme storms.

A Credibility Approach for the Makeham Mortality Law, Y. Salhi, P.-E. Thérond, J. Tomas (2016), European Actuarial Journal, 6(1), pp. 61-96. Link to the article
We propose a credibility approach that revises, as new observations arrive, the parameters of the Makeham fit underlying the mortality tables. This adds structure, which proves useful when portfolios are of limited size, and the proposed revision process ensures an adequate representation of mortality at the different ages.
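For reference, the Makeham law mentioned above models the force of mortality at age x as

\[
\mu_x = a + b\,c^{x}, \qquad a, b > 0,\ c > 1,
\]

with an age-independent accident term \(a\) and an exponentially increasing senescence term \(b\,c^{x}\); the credibility mechanism of the paper revises the estimates of \((a, b, c)\) as new portfolio observations arrive.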

2010-2015 period – Management of Modelling in Insurance

Since its launch in October 2010, the academic research carried out within the chair has focused on the following themes.

  • Explicit computation of ruin probabilities in settings with dependence between claim amounts and/or inter-claim times.
  • Assessment of estimation risk and of the error associated with risk aggregation.
  • Statistical estimation of risk indicators in the presence of several risk factors (multidimensional problems), with applications to risk theory, ERM (Enterprise Risk Management) and Solvency II.
  • Modelling of rare or extreme events and its applications in risk theory.
  • Computation of the risk margin for non-life insurance risks under Solvency II.
  • Modelling of risks related to demography and catastrophic events.
  • Management of correlated risks, in particular credit, counterparty and liquidity risk.
  • Accounting aspects and issues related to the implementation of Solvency II.

ACADEMIC WORKS

During the 2010-2015 period, the academic research carried out within the chair was organised around three main themes:

  1. New developments in risk theory: accounting for dependencies, explicit computation of ruin probabilities, multivariate risk indicators, rare or extreme events, economic analysis of risk.
  2. Economic capital management of insurance companies: adaptation to the Solvency II framework, economic scenario generators, model risk, risk aggregation, capital allocation, risk margin.
  3. Default, liquidity and demographic risks: correlated default risks, counterparty and own-credit risks, impact of new regulations on the cost and sustainability of funding, modelling of longevity and mortality risks.