On Predictive Modeling for Claim Severity

Abstract
Reinsurers typically face two problems when they want to use insurer claim severity experience to experience rate their liability excess of loss treaties. First, the claim severity data has insufficient volume to make credible projections of excess layer costs. Second, the data they do receive is not fully developed; most claims that pierce the excess layers take at least a few years to settle. This paper starts with some introductory examples that illustrate how to quantify the inherent uncertainty in fitting claim severity distributions. Then the paper illustrates a Bayesian methodology to estimate the expected cost of excess layers, and shows how to quantify the uncertainty in these estimates. The Bayesian "prior models" are not derived from purely subjective considerations. Instead, they are derived after examining the claim severity data of several insurers. Each "prior model" contains claim severity distributions of immature data submitted by an insurer, along with a fully developed claim severity distribution. The estimate of the cost of an excess layer is the average of the fully developed excess layer costs, weighted by the posterior probabilities calculated from the immature data submitted by the insurer.
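The posterior-weighted estimate described in the abstract can be sketched as follows. This is a minimal illustration of Bayes' theorem applied to a discrete set of prior models, not the paper's actual procedure or data; the prior probabilities, likelihoods, and layer costs below are hypothetical placeholders.

```python
def posterior_weighted_cost(prior_probs, likelihoods, layer_costs):
    """Estimate an excess layer cost as the posterior-weighted average
    of each prior model's fully developed layer cost.

    prior_probs  -- prior probability of each model
    likelihoods  -- likelihood of the insurer's immature claim data
                    under each model
    layer_costs  -- fully developed excess layer cost for each model
    """
    # Bayes' theorem: posterior_i is proportional to prior_i * likelihood_i
    joint = [p * l for p, l in zip(prior_probs, likelihoods)]
    total = sum(joint)
    posteriors = [j / total for j in joint]
    # Expected layer cost under the posterior distribution over models
    return sum(w * c for w, c in zip(posteriors, layer_costs))

# Three hypothetical prior models (illustrative numbers only)
priors = [0.5, 0.3, 0.2]           # prior model probabilities
likelihoods = [0.02, 0.05, 0.01]   # likelihood of the immature data
costs = [1.0e6, 1.5e6, 2.5e6]      # fully developed layer costs
estimate = posterior_weighted_cost(priors, likelihoods, costs)
```

The same posterior weights can also be used to quantify the uncertainty in the estimate, e.g. by computing the posterior variance of the layer cost across models.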
Volume
Spring
Page
215-254
Year
2005
Keywords
predictive analytics
Categories
Financial and Statistical Methods > Aggregation Methods > Collective Risk Model
Financial and Statistical Methods > Aggregation Methods > Fourier
Actuarial Applications and Methodologies > Reserving > Reserve Variability
Actuarial Applications and Methodologies > Reserving > Reserving Methods
Actuarial Applications and Methodologies > Reserving > Uncertainty and Ranges
Financial and Statistical Methods > Credibility
Publications
Casualty Actuarial Society E-Forum
Authors
Glenn G Meyers