Browse Research
2010
This paper proposes a methodology to calculate the credibility risk premium based on the uncertainty of the risk premium (also known as the pure loss cost or pure premium), as estimated by the standard deviation of the risk premium estimator. An optimal estimator based on the uncertainties involved in the pricing process is constructed.
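As a minimal illustration of how estimator uncertainty can drive a credibility weight (my own inverse-variance sketch, not necessarily the paper's estimator), two unbiased estimates of the risk premium can be blended as
\[
\hat{P}_{\text{cred}} = Z\,\hat{P}_{\text{class}} + (1-Z)\,\hat{P}_{\text{collective}},
\qquad
Z = \frac{\operatorname{Var}(\hat{P}_{\text{collective}})}{\operatorname{Var}(\hat{P}_{\text{class}}) + \operatorname{Var}(\hat{P}_{\text{collective}})},
\]
which gives less weight to whichever estimate has the larger standard deviation.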
2010
Property/casualty reserves are estimates of losses and loss development and as such will not match the ultimate results. Sources of error include model error (the methodology used does not accurately reflect the development process), parameter error (incorrect model parameters), and process error (future development is random). This paper provides a comprehensive and practical methodology for quantifying risk that includes all three sources.
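In the usual notation (a sketch of the standard decomposition, not necessarily the paper's), the mean squared error of prediction of a reserve estimate separates the two quantifiable sources, while model error is assessed separately:
\[
\operatorname{MSEP}(\hat{R}) \;=\; \underbrace{\operatorname{Var}(R)}_{\text{process error}} \;+\; \underbrace{\operatorname{Var}(\hat{R})}_{\text{parameter error}},
\]
with model error entering only through the choice of the model under which both terms are computed.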
2010
This paper offers a methodology for calculating optimal bounds on tail risk probabilities by deriving upper and lower semiparametric bounds, given only the first two moments of the distribution. We apply this methodology to determine bounds for the probabilities of two tail events. The first tail event occurs when two financial variables simultaneously have extremely low values.
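For a single variable, the one-sided Chebyshev (Cantelli) inequality is the archetype of such a two-moment bound; the paper's bivariate bounds generalize this kind of statement to joint tail events:
\[
P\!\left(X \le \mu - k\sigma\right) \;\le\; \frac{1}{1+k^{2}}, \qquad k > 0,
\]
where \(\mu\) and \(\sigma\) are the mean and standard deviation of \(X\).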
2010
In this paper, linear mixed models are employed for the estimation of structural parameters in a credibility context. In particular, Hachemeister’s model and Dannenburg’s crossed classification model are considered. Maximum likelihood (ML) and restricted maximum likelihood (REML) methods are developed to estimate the variance and covariance parameters.
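A minimal sketch (illustrative data and column names, not the paper's models) of estimating variance components by ML and REML with a random-intercept mixed model in Python:

    # Fit a random-intercept model and compare REML vs. ML variance components.
    # "loss_ratio", "year" and "contract" are assumed column names.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("credibility_data.csv")    # hypothetical data set
    model = smf.mixedlm("loss_ratio ~ year", data=df, groups=df["contract"])

    fit_reml = model.fit(reml=True)     # restricted maximum likelihood
    fit_ml = model.fit(reml=False)      # ordinary maximum likelihood

    print(fit_reml.cov_re)   # between-group (structural) variance component
    print(fit_reml.scale)    # residual (within-group) variance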
2010
In 2009, in the aftermath of the Global Financial Crisis, 140 American banks failed, and hundreds of other banks were classified as “problem institutions” by the FDIC. This has led to numerous books and articles examining the causes of systemic risk in our financial system. In this paper we step back in history to see what we should have learned from a previous banking crisis, which occurred during the 1980s.
2010
This paper presents a bootstrap approach to estimate the prediction distributions of reserves produced by the Munich chain ladder (MCL) model. The MCL model was introduced by Quarg and Mack (2004) and takes into account both paid and incurred claims information. In order to produce bootstrap distributions, this paper addresses the application of bootstrapping methods to dependent data, with the consequence that correlations are considered.
2010
Robust statistical procedures have a growing body of literature and have been applied to loss severity fitting in actuarial applications. An introduction of robust methods for loss reserving is presented in this paper. In particular, following Tampubolon (2008), reserving models for a development triangle are compared based on the sensitivity of the reserve estimates to changes in individual data points.
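A minimal sketch of the sensitivity idea under my own simplified chain-ladder setup (not Tampubolon's or the paper's reserving models): perturb one observed cell of the triangle and record the change in the reserve estimate.

    import numpy as np

    def chain_ladder_reserve(tri):
        # tri: cumulative run-off triangle with NaN below the latest diagonal
        n = tri.shape[0]
        f = np.ones(n - 1)
        for j in range(n - 1):
            rows = ~np.isnan(tri[:, j + 1])
            f[j] = tri[rows, j + 1].sum() / tri[rows, j].sum()
        reserve = 0.0
        for i in range(1, n):
            latest = tri[i, n - 1 - i]
            ultimate = latest * np.prod(f[n - 1 - i:])
            reserve += ultimate - latest
        return reserve

    tri = np.array([                       # illustrative cumulative triangle
        [100.0, 180.0, 220.0, 240.0],
        [110.0, 200.0, 245.0, np.nan],
        [120.0, 215.0, np.nan, np.nan],
        [130.0, np.nan, np.nan, np.nan],
    ])

    base = chain_ladder_reserve(tri)
    shocked = tri.copy()
    shocked[1, 1] *= 1.10                  # +10% shock to one observed cell
    print("reserve sensitivity:", chain_ladder_reserve(shocked) - base)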
2010
Prediction Error of the Future Claims Component of Premium Liabilities under the Loss Ratio Approach
In this paper we construct a stochastic model and derive approximation formulae to estimate the standard error of prediction under the loss ratio approach of assessing premium liabilities. We focus on the future claims component of premium liabilities and examine the weighted and simple average loss ratio estimators.
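In broad strokes (a hedged sketch, not the authors' formulae), the future claims component under the loss ratio approach is the unearned premium \(U\) multiplied by an estimated loss ratio \(\widehat{LR}\), so the prediction error combines the estimation error of the loss ratio with the process variability of the claims themselves:
\[
\hat{C} = U\,\widehat{LR},
\qquad
\operatorname{MSEP}(\hat{C}) \;\approx\; U^{2}\operatorname{Var}\!\big(\widehat{LR}\big) + \operatorname{Var}\!\big(C \mid LR\big).
\]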
2010
The behavior of competing insurance companies investigating insurance fraud follows one of several Nash equilibria, under which companies consider the claim savings, net of investigation cost, on a portion, or all, of the total claim. This behavior can reduce the effectiveness of investigations when two or more competing insurers are involved.
2010
Insurers purchase catastrophe reinsurance primarily to reduce underwriting risk in any one experience period and thus enhance the stability of their income stream over time. Reinsurance comes at a cost and therefore it is important to maintain a balance between the perceived benefit of buying catastrophe reinsurance and its cost.
2010
Motivation: Advanced calculations on large data sets provide important business insights. Such calculations must be flexible enough for the dynamic nature of advanced analytics done by actuaries and other high-skill users, yet must also leverage the power and stability of large-scale IT systems.
2010
Copulas are an elegant mathematical tool for decoupling a joint distribution into the marginal component and the dependence structure component; thus enabling us to model simultaneous events with a greater degree of flexibility. However, as with many statistical techniques, the application of copulas in practice is as much art as it is science.
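By Sklar's theorem the joint distribution factors as \(H(x,y)=C\!\big(F_X(x),F_Y(y)\big)\); a minimal simulation sketch (illustrative marginals and correlation, not taken from the paper) shows the decoupling in practice:

    # Simulate dependent losses: a Gaussian copula joined to gamma and
    # lognormal marginals (all parameter choices here are illustrative).
    import numpy as np
    from scipy import stats

    rho = 0.6
    rng = np.random.default_rng(0)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
    u = stats.norm.cdf(z)                                   # the copula: uniform margins
    x = stats.gamma(a=2.0, scale=5_000.0).ppf(u[:, 0])      # marginal 1
    y = stats.lognorm(s=1.0, scale=2_000.0).ppf(u[:, 1])    # marginal 2
    print(np.corrcoef(x, y)[0, 1])                          # induced dependence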
2010
The rise and fall of subprime mortgage securitizations contributed in part to the ensuing credit crisis and financial crisis of 2008. Some participants in the subprime-mortgage-backed securities market relied at least in part on analyses grounded in the loss development factor (LDF) method, and many did not conduct their own credit analyses, relying instead on the work of others such as securities brokers and rating agencies.
2010
In his book, The Best Way to Rob a Bank is to Own One, William Black describes in detail the complex collusion between bankers, regulators, and legislators that brought about the Savings and Loan crisis of the 1980s and early 1990s. As part of the scheme, leverage was used to purchase bankrupt companies that became the basis for a Ponzi-like speculative bubble that ultimately collapsed.
2010
This paper argues that no single valuation basis is completely reliable: neither market price nor other alternatives can accurately measure value. Therefore, this paper proposes that a preferable solution is to simultaneously record two bases of valuation: market price and appraisal value.
2010
This article starts with primitive assumptions on preferences and risk. It then derives prices consistent with a social optimum within an insurance company and the consumer-level capital allocation implied therein. The allocation “adds up” to the total capital of the firm (a result echoing findings in the congestion pricing literature, where optimal tolls exactly cover the rental cost of the highway).
2010
This work deals with prediction of the IBNR reserve under a different data ordering of the non-cumulative run-off triangle. The rows of the triangle are stacked, resulting in a univariate time series with several missing values.
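A minimal sketch (layout only, with my own illustrative numbers) of the reordering: stacking the rows of a non-cumulative triangle yields a single series in which the not-yet-observed cells appear as missing values to be forecast.

    import numpy as np

    tri = np.array([                      # incremental run-off triangle
        [50.0, 30.0, 15.0, 5.0],
        [55.0, 32.0, 16.0, np.nan],
        [60.0, 35.0, np.nan, np.nan],
        [65.0, np.nan, np.nan, np.nan],
    ])
    series = tri.reshape(-1)              # rows stacked into a univariate series
    print(series)                         # NaN entries are the cells to predict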
2010
We study the solvency of insurers in a practical model where, in addition to basic insurance claims and premiums, economic factors such as inflation, real growth, and investment returns affect the capital development of the companies. The objective is to give qualitative descriptions of risks by means of crude estimates for finite-time ruin probabilities. In our setup, the economic factors have a dominant role in the estimates.
2010
A model is proposed using the run-off triangle of paid claims and also the numbers of reported claims (in a similar triangular array). These data are usually available, and allow the model proposed to be implemented in a large variety of situations. On the basis of these data, the stochastic model is built from detailed assumptions for individual claims, but then approximated using a compound Poisson framework.
2010
The separation method was introduced by Verbeek (1972) in order to forecast numbers of excess claims and it was developed further by Taylor (1977) to be applicable to the average claim cost.
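In its usual form (a sketch of the standard presentation, not a quotation from either paper), the separation method assumes that the expected incremental claims for origin period \(i\) and development period \(j\) factor into a development proportion and a calendar-period index:
\[
E[S_{ij}] = n_i\, r_j\, \lambda_{i+j}, \qquad \sum_j r_j = 1,
\]
where \(n_i\) is the exposure (e.g., number of claims) for origin period \(i\), \(r_j\) the proportion emerging in development period \(j\), and \(\lambda_{i+j}\) an index capturing calendar-period effects such as inflation.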
2010
An insurance company entering the property and liability insurance market at the high point of the insurance cycle may decide to slash premiums to gain an advantageous market share. Such aggressive intrusion may call forth a concerted industry response, producing a severe decline in the insurance market price. This can ruin some companies, an outcome consistent with the observation that insurance cycles are correlated with clustered insolvencies.
2010
We consider a set of workers’ compensation insurance claim data where the aggregate number of losses (claims) reported to insurers is classified by the year of occurrence of the event causing the loss, the US state in which the loss event occurred, and the occupation class of the insured workers to which the loss count relates.
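A minimal sketch (the column names and the Poisson specification are my own assumptions, not necessarily the authors' model) of fitting such cross-classified counts with a log-exposure offset:

    # Poisson regression of claim counts on year, state and occupation class.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("wc_claim_counts.csv")   # hypothetical data set
    model = smf.glm(
        "claim_count ~ C(year) + C(state) + C(occ_class)",
        data=df,
        family=sm.families.Poisson(),
        offset=np.log(df["exposure"]),
    )
    print(model.fit().summary())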
2010
We analyze fire exposure rating for three types of risk profiles: policy profiles, top location profiles and location profiles. Location profiles offer more detailed information than top location profiles, which in turn are better than policy profiles. We prove criteria to ensure that a better quality of risk profile leads to a lower price.
2010
This paper considers insurance claims that are available by cause of loss, or peril. Using this multi-peril information, we investigate multivariate frequency and severity models, emphasizing alternative dependency structures. Although dependency models may be used for many risk management strategies, we focus on ratemaking.
2010
We consider the unit-linked endowment with guarantee and periodic premiums, where at each premium payment date the insurance company invests a certain fraction of the premium into a risky reference portfolio. In the dual random environment of stochastic interest rates with deterministic volatilities and mortality risk, and for a fixed guarantee, simple analytical lower and upper bounds for the fair periodic premium are explicitly derived.