Browse Research

2016
This paper studies a novel capital allocation framework based on the tail mean-variance (TMV) principle for multivariate risks. The new capital allocation model has many intriguing properties, such as controlling the magnitude and variability of tail risks simultaneously. General formulas for optimal capital allocations are discussed according to the semideviation distance measure.
2016
The concept of excess losses is widely used in reinsurance and retrospective insurance rating. The mathematics related to it has been studied extensively in the property and casualty actuarial literature. However, it seems that the formulas for higher moments of the excess losses are not readily available.
2016
Experience shows that U.S. risk-based capital measures do not always signal financial troubles until it is too late. Here we present an alternative, reasonable capital adequacy model that can be easily implemented using data commonly available to company actuaries. The model addresses the three most significant risks common to property and casualty companies—namely, pricing, interest rate, and reserving risk.
2016
The statistical foundation of disaster risk analysis is actual loss experience. The past cannot be changed and is treated by actuaries as fixed. But from a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios.
2016
Despite the occurrence of numerous casualty catastrophes that have had a significant impact on the insurance industry, the state of casualty catastrophe modeling lags far behind that of property catastrophes. One reason for this lag is that casualty catastrophes develop slowly, as opposed to property catastrophes that occur suddenly, so that the impact, both financial and psychological, has less of a shock element.
2016
This paper is concerned with dependency between business segments in the Property & Casualty industry. When considering the business of an insurance company at the aggregate level, dependence structures can have a major impact in several areas of Enterprise Risk Management, such as in claims reserving and capital modelling.
2016
Actuarial research has aimed to tame uncertainty, typically by building on two premises: (a) the malleability of uncertainty to quantification and (b) the separability of quantitative modelling from decision principles. We argue that neither of the two premises holds true. The first ignores deeper 'ontological' and 'framing' uncertainties, which do not lend themselves to quantification.
2016
Starting in 2012 and continuing for several years since, I have been witness to a burst of innovation in the seemingly staid world of crop insurance, which is too often dismissed as an obscure backwater of the U.S. insurance industry.
2016
We generally do not tend to think of innovation and risk management as compatible themes. Innovation conjures up images of bold new ideas, thinking outside the box, disrupting established ways of doing things, and breaking new ground.
2016
There are many papers that describe the over-dispersed Poisson (ODP) bootstrap model, but these papers are either limited to the basic calculations of the model or focus on its theoretical aspects, and they implicitly assume that the ODP bootstrap model is perfectly suited to the data being analyzed.
2016
“The Actuary and IBNR” was published in 1972 by Ronald Bornhuetter and Ronald Ferguson. The methodology from that paper has become a virtually universal tool for actuaries, commonly referred to as the “Born Ferg” or “BF” method. The technique and its application are included in the syllabus for the CAS actuarial exams, and its use is pervasive in both the reserving and pricing worlds.
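The BF calculation the abstract refers to is standard: ultimate losses equal reported losses plus an a priori expected-loss provision for the portion expected to be unreported. A minimal sketch (function name and inputs are illustrative, not from the paper):

```python
def bornhuetter_ferguson(reported, premium, expected_loss_ratio, cdf):
    """Bornhuetter-Ferguson ultimate loss estimate.

    reported: losses reported to date
    premium: earned premium
    expected_loss_ratio: a priori loss ratio
    cdf: cumulative development factor to ultimate (>= 1)
    """
    pct_unreported = 1.0 - 1.0 / cdf   # expected share still unreported
    ibnr = premium * expected_loss_ratio * pct_unreported
    return reported + ibnr

# Example: 600 reported, 1,000 premium, 65% a priori loss ratio, CDF of 2.0
# IBNR = 1000 * 0.65 * 0.5 = 325, so the ultimate is 925
```

Unlike the pure chain ladder, the BF estimate does not multiply reported losses by the development factor, so it is less sensitive to immature or volatile reported amounts.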
2016
Although it is an analytic construct important in its own right, a stationary population is an integral component of a life table. Using this perspective, we discuss well-known and not-so-well-known equalities that are found in a stationary population, as well as a set of inequalities. There are two parts to the set of inequalities we discuss.
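One of the well-known equalities the abstract alludes to: in a stationary population, total population size equals annual births times life expectancy at birth (P = B·e0). A minimal sketch checking this from a life-table survivorship column, assuming trapezoidal person-years (names are illustrative):

```python
def stationary_population_identity(lx, width=1.0):
    """Illustrate the stationary-population identity P = B * e0.

    lx: survivorship column l(x), starting at the radix l(0) and ending at 0
    width: age-interval width in years

    Person-years lived L(x) are approximated by the trapezoid rule.  Total
    population P is the sum of L(x), annual births B equal the radix l(0),
    and life expectancy at birth e0 is P / B.
    """
    Lx = [width * (a + b) / 2 for a, b in zip(lx[:-1], lx[1:])]
    P = sum(Lx)    # total stationary population (person-years per year)
    B = lx[0]      # annual births = radix
    e0 = P / B     # life expectancy at birth
    return P, B, e0
```

With any survivorship column, the returned values satisfy P = B * e0, which is the sense in which a stationary population and its life table are two views of the same object.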
2016
The common calculations used in developing case reserves are based on “hindsight” from a separate development test, so they rely on data that already reflects judgment. A method is presented for estimating development factors for case reserves that uses only data within the standard loss development triangles, primarily the ratios of paid losses to case reserves disposed, or “runoff” ratios.
2016
The evolving definition of Advanced Analytics and the emergence of the Data Scientist: In its infancy, Actuarial Science operated at the leading edge of contemporary analytic capabilities and could easily be said to be employing “advanced analytics.” Over the past 50 years, however, relentless data and technology breakthroughs have created modern analytic capabilities that far outstrip many of our traditional actuarial pricing and reserving methods.
2016
CAS E-Forum, Winter 2016 Featuring the RBC Dependencies and Calibration Subcommittee
2016
This report starts a discussion on the interrelationships between the general levels of unemployment in the economy and the sale and persistency of insurance products. The study hypothesizes that insurance products experience reduced sales and persistency as the level of unemployment increases in a given economic environment.
2016
Ninth Survey of Emerging Risks: Risk management can be looked at in many ways, from how the volatility of an individual risk impacts profit distribution to how it threatens solvency. Emerging risks fall into the latter category. Risk managers seek out information about risks that, over a long time horizon, could have a great impact on an entity’s survival.
2016
Large yet infrequent disruptions of electrical power can impact tens of millions of people in a single event, triggering significant economic damages, portions of which are insured. Small and frequent events are also significant in the aggregate. This article explores the role that insurance claims data can play in better defining the broader economic impacts of grid disruptions in the U.S. context.
2016
Motivation: This paper proposes a triangle-based stochastic reserving framework for parsimoniously describing insurance claims generation, reporting, and settlement processes with intuitive parameters. Method: Deterministic compartmental models are explored as extensible tools to describe and project the insurance claims process using a small number of parameters, including a measure of case reserve robustness.
2016
CAS E-Forum, Fall 2016 Featuring the CAS Data & Technology Working Party Report and Independent Research
2016
This paper provides an introduction to the use of Bayesian methods for blending prior information with a loss development pattern from a triangle. The methods build upon conjugate forms discussed in earlier literature but introduce the Generalized Dirichlet as a prior, which allows for a significant simplification in calculation.
2016
The Cape Cod method is a commonly used technique in which the a priori loss ratio is calculated as the weighted average of the chain ladder ultimate loss ratios across all years, with the “used” premium as the weights. It applies the same a priori loss ratio estimate (on a trended, current-rate level) across all years, without consideration for changes that may have occurred over time.
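The weighted average described above simplifies nicely: the chain-ladder ultimate loss ratio for a year is reported × CDF / premium, and the “used” premium is premium / CDF, so the used-premium-weighted average collapses to total reported losses over total used premium. A minimal sketch of that calculation (names are illustrative):

```python
def cape_cod_elr(reported, premium, cdf):
    """Cape Cod a priori expected loss ratio.

    reported: reported losses by accident year (trended, on-level)
    premium: on-level earned premiums by accident year
    cdf: cumulative development factors to ultimate by accident year

    The 'used' premium for a year is premium / CDF, the share of premium
    whose losses are expected to have emerged so far.  The used-premium-
    weighted average of the chain-ladder ultimate loss ratios equals
    total reported losses divided by total used premium.
    """
    used = [p / f for p, f in zip(premium, cdf)]
    return sum(reported) / sum(used)

# Example: two years with 1,000 premium each, reported 300 (green year,
# CDF 2.0) and 500 (mature year, CDF 1.25):
# used premium = 500 + 800 = 1,300, so ELR = 800 / 1300, about 0.615
```

Because the used premium down-weights immature years, a green year with little reported loss moves the a priori estimate far less than it would in a simple average of chain-ladder loss ratios.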
2016
CAS E-Forum, Summer 2016 Featuring the CAS Reserves Call Papers, Innovation Essays and Independent Research