Browse Research
2010
Distortion risk measures are perspective risk measures because they allow an asset manager to reflect a client’s attitude toward risk by choosing the appropriate distortion function. In this paper, the idea of asymmetry was applied to the standard construction of distortion risk measures. The new asymmetric distortion risk measures are derived based on the quadratic distortion function with different risk-averse parameters.
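As general background (notation assumed here, not taken from the paper), a distortion risk measure for a non-negative loss X with survival function S_X is

\rho_g(X) = \int_0^\infty g\big(S_X(x)\big)\,dx, \qquad g:[0,1]\to[0,1],\ g(0)=0,\ g(1)=1,

and a common quadratic distortion is g_r(u) = (1+r)u - ru^2 with 0 \le r \le 1, where larger r expresses greater risk aversion; the paper's asymmetric measures build on a quadratic distortion of this kind with different risk-aversion parameters.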
2010
A simple and commonly used method to approximate the total claim distribution of a (possibly weakly dependent) insurance collective is the normal approximation. In this article, we investigate the error made when the normal approximation is plugged into a fairly general distribution-invariant risk measure.
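To make the setting concrete, here is a minimal sketch (independent claims, gamma severities, and a 99.5% quantile as the risk measure are illustrative assumptions, not taken from the article) comparing the normal approximation of the total-claim distribution with a simulated benchmark:

```python
import numpy as np
from scipy.stats import norm, gamma

n_policies, mean_claim, sd_claim = 1000, 2.0, 3.0

# Normal approximation to the aggregate (total-claim) distribution
mu_S = n_policies * mean_claim
sd_S = np.sqrt(n_policies) * sd_claim
var_normal = norm.ppf(0.995, loc=mu_S, scale=sd_S)

# Monte Carlo benchmark with skewed (gamma) individual claims
shape, scale = (mean_claim / sd_claim) ** 2, sd_claim ** 2 / mean_claim
totals = gamma(a=shape, scale=scale).rvs(size=(5000, n_policies), random_state=0).sum(axis=1)
var_mc = np.quantile(totals, 0.995)

print(f"99.5% quantile, normal approximation: {var_normal:.0f}, simulated: {var_mc:.0f}")
```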
2010
This study reviews the valuation models for three types of catastrophe-linked instruments: catastrophe bonds, catastrophe equity puts, and catastrophe futures and options. First, it looks into the pricing of catastrophe bonds under stochastic interest rates and examines how (re)insurers can apply catastrophe bonds to reduce the default risk.
2010
The International Accounting Standards Board (IASB) and the Financial Accounting Standards Board (FASB) continue to debate and refine the financial reporting standards that will emerge from Phase II of their joint project on insurance contracts. The changes to the measurement of insurance liabilities for financial reporting are potentially quite significant for most insurance organizations around the world.
2009
Actuarial Standard of Practice No. 13, Trending Procedures in Property/Casualty Insurance Ratemaking
This standard of practice provides a basis for assessing procedures appropriate for estimating future expected values by analyzing historical data and other relevant information. The historical data to be considered for analysis are those referred to in the Statement of Principles Regarding Property and Casualty Insurance Ratemaking of the Casualty Actuarial Society (CAS).
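For instance, under a simple exponential trending procedure (an illustrative form, not one prescribed by the standard), a historical loss observed t years before the future policy period is projected as

\text{projected loss} = \text{historical loss} \times (1 + i)^{t},

where i is the selected annual trend rate and t is measured between the midpoints of the experience period and the forecast period.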
2009
The paper analyzes the implications of extreme events for the proper choice of discounting. Any discounting with constant or declining rates can be linked to random "stopping time" events, which define the internal discount-related horizons of evaluations. Conversely, any stopping time induces a discounting, in particular, with the standard discount rates.
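In symbols (a sketch of the connection, not the paper's notation): a constant discount rate r corresponds to an exponentially distributed stopping time \tau, since

e^{-rt} = \Pr(\tau > t), \qquad \tau \sim \operatorname{Exp}(r),

and conversely any stopping time \tau induces the discount factor D(t) = \Pr(\tau > t). If the rate itself is random, D(t) = E[e^{-Rt}] has an effective rate -\tfrac{d}{dt}\ln D(t) that declines over time toward the lowest possible value of R.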
2009
In the literature, one of the main objects of stochastic claims reserving is to find models underlying the chain-ladder method in order to analyze the variability of the outstanding claims, either analytically or by bootstrapping.
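For readers unfamiliar with the deterministic starting point, a minimal chain-ladder sketch on a made-up cumulative triangle (figures and layout are illustrative only, not from the paper) looks like this:

```python
import numpy as np

# Illustrative cumulative claims triangle (rows: accident years,
# columns: development years); NaN marks cells not yet observed.
tri = np.array([
    [1000., 1800., 2100., 2200.],
    [1100., 1900., 2300., np.nan],
    [1200., 2100., np.nan, np.nan],
    [1300., np.nan, np.nan, np.nan],
])
n_dev = tri.shape[1]

# Volume-weighted development factors f_k = sum_i C_{i,k+1} / sum_i C_{i,k}
f = np.empty(n_dev - 1)
for k in range(n_dev - 1):
    obs = ~np.isnan(tri[:, k + 1])
    f[k] = tri[obs, k + 1].sum() / tri[obs, k].sum()

# Project each accident year from its latest observed value to ultimate
last_obs = np.array([np.where(~np.isnan(row))[0].max() for row in tri])
latest = tri[np.arange(tri.shape[0]), last_obs]
ultimate = np.array([latest[i] * np.prod(f[last_obs[i]:]) for i in range(tri.shape[0])])

print("development factors:", np.round(f, 3))
print("estimated outstanding claims:", np.round((ultimate - latest).sum(), 1))
```

Stochastic reserving models of the kind discussed in the paper attach distributional assumptions to the cells or the development factors so that the variability around such point estimates can be quantified.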
2009
In this study, we present an approach based on neural networks, as an alternative to the ordinary least squares method, for describing the relation between dependent and independent variables.
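A minimal sketch of the comparison, using scikit-learn on synthetic nonlinear data (the library, data, and network size are illustrative assumptions, not taken from the study):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(500)  # nonlinear signal plus noise

ols = LinearRegression().fit(X, y)
nn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0).fit(X, y)

print("OLS R^2:", round(ols.score(X, y), 3))
print("NN  R^2:", round(nn.score(X, y), 3))
```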
2009
In this paper, we take the point of view of an insurer dealing with life annuities that aims at building up a (partial) internal model in order to quantify the impact of mortality risks, namely process risk and longevity risk, with a view to taking appropriate risk management actions.
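As a toy illustration of the distinction between the two risks (the figures are assumptions, not the paper's model), process risk is the binomial fluctuation of deaths around a known mortality rate, while longevity (systematic) risk makes the rate itself uncertain:

```python
import numpy as np

rng = np.random.default_rng(4)
n_annuitants, q = 10_000, 0.02   # portfolio size, best-estimate one-year death probability
n_sims = 20_000

# Process risk only: binomial fluctuation around a known mortality rate
deaths_process = rng.binomial(n_annuitants, q, size=n_sims)

# Process plus longevity risk: the mortality rate itself is uncertain
q_random = q * rng.lognormal(mean=0.0, sigma=0.05, size=n_sims)
deaths_both = rng.binomial(n_annuitants, q_random)

print("std dev of deaths, process only:       ", round(deaths_process.std(), 1))
print("std dev of deaths, process + longevity:", round(deaths_both.std(), 1))
```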
2009
The use of generalized linear models (GLM) to estimate claims reserves has become a standard method in insurance. Most frequently, the exponential dispersion family (EDF) is used; see e.g. England, Verrall [2]. We study the so-called Tweedie EDF and test the sensitivity of the claims reserves and their mean square errors of prediction (MSEP) over this family.
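A minimal sketch of fitting GLMs across the Tweedie family, using statsmodels on synthetic data (the data, covariates, and variance powers are illustrative; the paper's reserving triangles and MSEP calculations are not reproduced):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
X = sm.add_constant(rng.normal(size=(n, 2)))
mu = np.exp(X @ np.array([1.0, 0.3, -0.2]))
# crude stand-in for compound-Poisson data: many exact zeros, gamma-like positives
y = np.where(rng.random(n) < 0.3, 0.0, rng.gamma(2.0, mu / 2.0))

# Refit the same linear predictor under several Tweedie variance powers p in (1, 2)
for p in (1.1, 1.5, 1.9):
    fit = sm.GLM(y, X, family=sm.families.Tweedie(var_power=p)).fit()
    print(f"p = {p}: total fitted claims = {fit.fittedvalues.sum():.1f}")
```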
2009
Only high-quality internal models optimally reflecting the risk situation facing the company allow insurers to assess the level of risk capital required. This importantly involves measuring and evaluating reserve risk as a part of insurance risks. In the literature, there is a wide variety of methods for stochastic reserving, such as the Mack method, the bootstrap method, regression approaches, Bayesian methods, etc.
2009
In this paper, a long-term equilibrium model of a local market is developed. Subject to minor qualifications, the model is arbitrage-free. The variables modelled are the prices of risk-free zero-coupon bonds – both index-linked and conventional – and of equities, as well as the inflation rate. The model is developed in discrete (nominally annual) time, but allowance is made for processes in continuous time subject to continuous rebalancing.
2009
Generalized Linear Models (GLMs) are gaining popularity as a statistical analysis method for insurance data. For segmented portfolios, as in car insurance, the question of credibility arises naturally: how many observations are needed in a risk class before the GLM estimators can be considered credible?
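As classical background only (the paper develops credibility specifically for GLM estimators, which is not reproduced here), the limited-fluctuation standard deems a claim-frequency estimate fully credible once the expected claim count \lambda satisfies

\lambda \ge \left(\frac{z_{(1+p)/2}}{k}\right)^{2} \approx 1082 \quad \text{for } p = 0.90,\ k = 0.05,

with partial credibility Z = \min\!\big(1, \sqrt{n/\lambda_{\text{full}}}\big) applied below that threshold.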
2009
In this paper, we examine the claims reserving problem using Tweedie's compound Poisson model. We develop the maximum likelihood and Bayesian Markov chain Monte Carlo simulation approaches to fit the model and then compare the estimated models under different scenarios. The key point we demonstrate relates to the comparison of reserving quantities with and without model uncertainty incorporated into the prediction.
2009
Quantifying liability risk for insurance companies requires projections of distributions of future contingent costs. Past patterns are typically forecast to continue but with the possibility of deviations to various extents.
2009
This paper presents a Bayesian stochastic loss reserve model with the following features:
2009
Excess loss factors, which are ratios of expected losses excess of a limit to total expected losses, are used by the National Council on Compensation Insurance (NCCI) in class ratemaking (estimating the expected ratio of losses to payroll for individual workers compensation classifications) and are used by insurance carriers to determine premiums for certain retrospectively rated policies (on policies for which claims used in the premium determination …
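In symbols (standard notation, not necessarily the NCCI's), the excess loss factor at limit L for a ground-up loss X is

\mathrm{ELF}(L) = \frac{E[(X-L)_+]}{E[X]} = \frac{E[X] - E[X \wedge L]}{E[X]},

i.e. the share of total expected losses lying above the limit.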
2009
This paper investigates the practical aspects of applying the second-order Bayesian revision of a generalized linear model (GLM) to form an adaptive filter for claims reserving. It discusses the application of such methods to three typical models used in Australian general insurance circles. Extensions, including the application of bootstrapping to an adaptive filter and the blending of results from the three models, are considered.
2009
The paper considers a model with multiplicative accident period and development period effects, and derives the ML equations for parameter estimation in the case that the distribution of each cell of the claims triangle is a general member of the Tweedie family.
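In a common notation for such cross-classified models (illustrative, not necessarily the paper's), the incremental claim Y_{ij} for accident period i and development period j satisfies

E[Y_{ij}] = \mu_{ij} = \alpha_i \beta_j, \qquad \operatorname{Var}(Y_{ij}) = \phi\, \mu_{ij}^{\,p},

where the Tweedie variance power p \in (1, 2) covers the compound Poisson-gamma case; the ML equations then follow from maximizing the corresponding Tweedie likelihood over the \alpha_i and \beta_j (and possibly p).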
2009
The popular General/Property-Casualty Insurance chain ladder method was first expanded to include variance calculations by Mack [1]. As new research expands the chain ladder method’s stochastic functionality, it is as important as ever to understand the assumptions underlying this fundamental approach and evaluate their appropriateness given the data.
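For reference, Mack's distribution-free model assumes that, given the development to date, E[C_{i,k+1} \mid C_{i,1},\ldots,C_{i,k}] = f_k C_{i,k} and \operatorname{Var}(C_{i,k+1} \mid C_{i,1},\ldots,C_{i,k}) = \sigma_k^2 C_{i,k}, with accident years independent. The corresponding estimators (sums over the accident years with both entries observed) are

\hat f_k = \frac{\sum_i C_{i,k+1}}{\sum_i C_{i,k}}, \qquad \hat\sigma_k^2 = \frac{1}{I-k-1} \sum_{i=1}^{I-k} C_{i,k}\left(\frac{C_{i,k+1}}{C_{i,k}} - \hat f_k\right)^{2},

which is why checking those assumptions against the data matters before relying on the resulting variance estimates.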
2009
Generalized Linear Model [GLM] theory is a commonly accepted framework for building insurance pricing and scoring models. A helpful feature of the GLM framework is the “offset” option. An offset is a model variable with a known or pre-specified coefficient. This paper presents several sample applications of offsets in property-casualty modeling applications.
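A minimal sketch of the offset idea, using statsmodels and a Poisson frequency model with log(exposure) as the offset (variable names and data are illustrative, not taken from the paper):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5_000
exposure = rng.uniform(0.25, 1.0, size=n)            # e.g. earned car-years
urban = (rng.random(n) < 0.4).astype(float)          # a hypothetical rating variable
X = sm.add_constant(urban)
lam = exposure * np.exp(-2.0 + 0.5 * urban)          # true frequency per unit exposure
claims = rng.poisson(lam)

# log(exposure) enters as an offset, i.e. a term whose coefficient is fixed at 1
fit = sm.GLM(claims, X, family=sm.families.Poisson(), offset=np.log(exposure)).fit()
print(fit.params)   # intercept and urban relativity on the log scale, per unit exposure
```

Because the offset's coefficient is fixed at 1, the fitted relativities are expressed per unit of exposure; the same device can be used to hold any pre-specified factor out of the fit.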
2009
Motivation: GLMs are widely used in insurance modeling applications. Claim or frequency models are a key component of many GLM ratemaking models. Enhancements to the traditional GLM that are described in this paper may be able to address practical issues that arise when fitting count models to insurance claims data.
For modeling claims within the GLM framework, the Poisson distribution is a popular distribution choice.
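Its probability mass function and moments are

\Pr(N = k) = \frac{e^{-\lambda}\lambda^{k}}{k!}, \qquad E[N] = \operatorname{Var}(N) = \lambda,

and this forced equality of mean and variance is a well-known practical limitation with real claim-count data, which is often overdispersed (a general observation, not a summary of this paper's specific proposals).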