A Bayesian Model for Estimating Long Tailed Excess of Loss Reinsurance Loss Costs

Abstract

In this paper we will describe a Bayesian model for excess of loss reinsurance pricing which has many advantages over existing methods. The model is currently used in production for multiple lines of business at one of the world’s largest reinsurers. This model treats frequency and severity separately. In estimating ultimate frequency, the model analyzes nominal claim count data jointly against uncertain ultimate frequency and development pattern priors, allowing for more careful analysis of sparse claim count information and properly differentiating between triangulated and last diagonal data. The severity model is pragmatic, yet accounts for severity distribution development and weighs the volume of data against prior distributions. The model is programmed in R and Stan, thus eliminating the need for a considerable amount of algebra and calculus and the necessity to use conjugate prior distribution families. We compare this method with the more established Buhlmann-Straub credibility application to excess of loss pricing (for instance in Cockroft), and the more complex model given by Mildenhall, showing numerous advantages of our method.
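To make the frequency component concrete, the following is a minimal illustrative sketch, in Stan, of the idea the abstract describes: reported claim counts are modeled jointly against an uncertain ultimate frequency and an uncertain development (reporting) pattern. The variable names, the gamma and lognormal priors, and the exponential reporting curve are assumptions made for illustration only and are not taken from the authors' production code.

// Hypothetical frequency sketch: latest-diagonal claim counts modeled with
// uncertain ultimate frequency and an uncertain reporting pattern.
data {
  int<lower=1> n_years;                    // number of accident years
  array[n_years] int<lower=0> rep_counts;  // reported claim counts at the latest diagonal
  vector<lower=0>[n_years] exposure;       // earned exposure by accident year
  vector<lower=0>[n_years] age;            // development age of each year, in periods
}
parameters {
  real<lower=0> lambda;                    // ultimate claim frequency per unit exposure
  real<lower=0> theta;                     // speed of the reporting-lag curve
}
model {
  lambda ~ gamma(2, 4);                    // illustrative prior on ultimate frequency
  theta ~ lognormal(0, 0.5);               // illustrative prior on development speed
  for (y in 1:n_years) {
    // expected reported counts = ultimate frequency x exposure x proportion reported
    real pct_reported = 1 - exp(-age[y] / theta);  // exponential reporting pattern (assumption)
    rep_counts[y] ~ poisson(lambda * exposure[y] * pct_reported);
  }
}

Because the development pattern parameter theta is estimated jointly with lambda rather than fixed, the posterior for ultimate frequency reflects both the sparseness of the claim counts and the uncertainty in how fully each year is reported, which is the property the abstract emphasizes.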

Volume
Summer
Year
2023
Keywords
Credibility, Bayesian statistics, Reinsurance, Excess of loss pricing, Pareto distribution, Gamma distribution, Poisson distribution, Weibull distribution, Copulas
Publications
Casualty Actuarial Society E-Forum
Authors
Stephanie Chin
Greg McNulty