Browse Research

2012
Index insurance and probabilistic seasonal forecasts are becoming available in developing countries to help farmers manage climate risks in production. Although these tools are intimately related, work has not been done to formalize the connections between them. We investigate the relationship between the tools through a model of input choice under uncertainty, forecasts, and insurance.
2011
This paper is based on a commissioned research study by the Casualty Actuarial Society with a focus on a theoretical framework for a liquidity risk premium and the interaction of illiquidity with credit effects on the valuation of assets and liabilities. The problem addressed is the development of a theory of liability valuation distinct from a theory of asset pricing or valuation.
2011
In this paper we evaluate the single-loss approximation method for high-quantile loss estimation on the basis of SAS OpRisk Global Data. Due to its simplicity, the single-loss approximation method has become a popular tool for capital requirement calculation purposes in the financial services industry. As the single-loss approximation method requires some strict assumptions, the naive use of this method was criticized in a 2010 paper by Degen.
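For context, the single-loss approximation estimates a high quantile of a compound loss from a single severity quantile. The sketch below is a minimal illustration assuming a Poisson frequency and lognormal severity with made-up parameter values (none of which come from the paper), and compares the approximation against a crude Monte Carlo estimate.

```python
import numpy as np
from scipy import stats

# Assumed model for illustration only: Poisson(lam) annual loss counts and
# lognormal severities; the parameter values are invented for this sketch.
lam, mu, sigma, alpha = 25.0, 10.0, 2.0, 0.999
severity = stats.lognorm(s=sigma, scale=np.exp(mu))

# Single-loss approximation: the alpha-quantile of the aggregate annual loss
# is approximated by one high quantile of the severity distribution.
sla = severity.ppf(1.0 - (1.0 - alpha) / lam)

# Crude Monte Carlo benchmark of the aggregate quantile.
rng = np.random.default_rng(0)
counts = rng.poisson(lam, size=50_000)
aggregate = np.array([severity.rvs(size=n, random_state=rng).sum() for n in counts])
print(f"single-loss approximation: {sla:,.0f}")
print(f"simulated {alpha:.1%} quantile: {np.quantile(aggregate, alpha):,.0f}")
```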
2011
Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach is not prescriptive regarding the class of statistical model utilized to undertake capital estimation. It has, however, become well accepted to utilize a Loss Distributional Approach (LDA) paradigm to model the individual OpRisk loss processes corresponding to the Basel II business line/event type.
2011
We investigate the question of how the development pattern in the Bornhuetter-Ferguson method should be estimated and derive the corresponding conditional mean square error of prediction (MSEP) of the ultimate claim prediction. An estimator of this conditional MSEP in a distribution-free model was given by Mack [9], whereas in Alai et al.
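For reference, the Bornhuetter-Ferguson predictor of the ultimate claim discussed here is commonly written as below; the notation is our assumption, since the abstract does not display it. Here C_{i,j} is the cumulative claim of accident year i after j development years, I the last observed diagonal, beta_j the cumulative development pattern, and mu_i the prior estimate of the ultimate claim for accident year i.

```latex
\widehat{C}^{\,\mathrm{BF}}_{i,J} = C_{i,I-i} + \bigl(1 - \widehat{\beta}_{I-i}\bigr)\,\widehat{\mu}_i
```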
2011
This paper introduces a new framework for modelling the joint development over time of mortality rates in a pair of related populations with the primary aim of producing consistent mortality forecasts for the two populations.
2011
Regression analysis is one of the most commonly used statistical methods. But in its basic form, ordinary least squares (OLS) is not suitable for actuarial applications because the relationships are often nonlinear and the probability distribution of the dependent variable may be non-normal.
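As a small illustration of the point about OLS versus GLMs, the sketch below fits a gamma GLM with a log link to simulated, right-skewed severity data; the data, the rating factor, and all parameter values are invented for the example.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical severity data: positive, right-skewed, with a log-linear mean
# in the rating factor. Parameter values are illustrative only.
rng = np.random.default_rng(1)
age = rng.uniform(18, 80, 500)
mean = np.exp(8.0 - 0.02 * age)                 # log-linear mean
sev = rng.gamma(shape=2.0, scale=mean / 2.0)    # gamma-distributed severity

X = sm.add_constant(age)
# OLS assumes a linear mean and normal errors; a gamma GLM with a log link
# respects the positivity and skewness of the response.
glm = sm.GLM(sev, X, family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(glm.summary())
```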
2011
Given an n x n triangle of losses, x_{AY,Lag} (AY = 1, ..., n; Lag = 1, ..., n; AY + Lag < n + 2), the goal of a stochastic loss reserve model is to predict the distribution of outcomes, X_{AY,Lag} (AY + Lag > n + 1), and sums of losses.
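A small sketch of the indexing described above (our own illustration, not from the paper): in an n x n rectangle, the cells with AY + Lag < n + 2 form the observed triangle, and the cells with AY + Lag > n + 1 are the outcomes to be predicted.

```python
import numpy as np

n = 5  # illustrative triangle size
AY, Lag = np.meshgrid(np.arange(1, n + 1), np.arange(1, n + 1), indexing="ij")

observed = AY + Lag < n + 2    # upper-left triangle: the given losses x_{AY,Lag}
to_predict = AY + Lag > n + 1  # lower-right triangle: the outcomes X_{AY,Lag}

print(observed.astype(int))    # 1 where the cell is observed, 0 where it must be predicted
```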
2011
Those familiar with classic linear regression, as many actuaries are, are aware that, for any regression including an intercept term, there is an exact balance (equality) between (weighted) fitted and (weighted) observed values in aggregate over the whole dataset. Many are also aware that this balance holds in aggregate over any level of any classification variable appearing in the regression as a main effect.
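The balance property described here is easy to check numerically; the sketch below uses a made-up dataset and an ordinary least squares fit with one classification variable as a main effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data with one classification variable entering as a main effect.
rng = np.random.default_rng(2)
df = pd.DataFrame({"region": rng.choice(["A", "B", "C"], 200)})
df["y"] = 100 + 20 * (df["region"] == "B") + rng.normal(0, 10, 200)

fit = smf.ols("y ~ C(region)", data=df).fit()
df["fitted"] = fit.fittedvalues

# Balance: fitted and observed totals agree overall and within each level
# of the main-effect classification variable.
print(df[["y", "fitted"]].sum())
print(df.groupby("region")[["y", "fitted"]].sum())
```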
2011
GLMs that include explanatory classification variables with sparsely populated levels assign large standard errors to these levels but do not otherwise shrink estimates toward the mean in response to low credibility. Accordingly, actuaries have attempted to superimpose credibility on a GLM setting, but the resulting methods do not appear to have caught on.
2011
Predictive models are used by insurers for underwriting and ratemaking in personal lines insurance. Focusing on homeowners insurance, this paper provides a systematic comparison of many predictive generalized linear models. We compare pure premium (Tweedie) and frequency/severity models based on single perils as well as multiple perils.
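To make the comparison concrete, here is a minimal sketch of the two modelling routes on simulated policy-level data; the data-generating process, the Tweedie variance power, and the single rating factor are assumptions for illustration, not choices taken from the paper.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical policy-level data with a single rating factor; all values illustrative.
rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)
n_claims = rng.poisson(np.exp(-1.5 + 0.4 * x))
loss = np.where(n_claims > 0,
                rng.gamma(2.0 * np.maximum(n_claims, 1), 500.0),
                0.0)

X = sm.add_constant(x)

# Pure premium route: one Tweedie GLM on total loss per policy.
pure_premium = sm.GLM(loss, X,
                      family=sm.families.Tweedie(var_power=1.6,
                                                 link=sm.families.links.Log())).fit()

# Frequency/severity route: a Poisson GLM on claim counts plus a gamma GLM on
# average severity for policies with at least one claim.
frequency = sm.GLM(n_claims, X, family=sm.families.Poisson()).fit()
has_claim = n_claims > 0
severity = sm.GLM(loss[has_claim] / n_claims[has_claim], X[has_claim],
                  family=sm.families.Gamma(sm.families.links.Log())).fit()

print(pure_premium.params, frequency.params, severity.params, sep="\n")
```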
2011
NCCI changed its workers compensation ratemaking methodology to improve the treatment of large individual claims and catastrophic multiclaim events related to the perils of industrial accidents, earthquake, and terrorism. NCCI worked with a well-known modeling firm to determine provisions for catastrophic events on a state basis. This paper describes the new methodology that NCCI has filed in many states.
2011
In pricing excess of loss reinsurance, the traditional method for applying credibility is as a weighted average of two estimates of expected loss: one from experience rating and a second from exposure rating. This paper shows how this method can be improved by incorporating loss estimates from lower layers, producing a multifactor credibility-weighted estimate of expected loss.
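The traditional starting point mentioned here is a simple credibility-weighted average of the two estimates; a minimal numerical sketch, with invented figures rather than values from the paper, looks like this:

```python
# Illustrative numbers only, not values from the paper.
experience_loss = 1_200_000   # expected loss in the layer from experience rating
exposure_loss = 1_500_000     # expected loss in the layer from exposure rating
Z = 0.35                      # credibility assigned to the experience estimate

blended = Z * experience_loss + (1 - Z) * exposure_loss
print(f"credibility-weighted expected loss: {blended:,.0f}")
```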
2011
Motivated by the empirical evidence of the long-range dependency found within the Greek motor insurance market, we formulate a particular stochastic pricing model in a continuous framework. We assume the structure of a competitive insurance market where the business volume of each company is directly related to the existing relativity between the company’s premium and the market’s average premium.