Browse Research

1994
This session extends the discussions of Reinsurance Reserving I. It will address approaches for facultative vs. treaty business and special considerations for financial reinsurance and retrocessions. In addition to offering other reserving techniques, the panelists will review pitfalls that are unique to reinsurance such as aggregate deductibles, commutations, retrocessions and insolvent cedents.
1994
This session provides a basic understanding of loss reserving principles, considerations and techniques as applied to reinsurance assumed. While reinsurance reserving principles are generally similar to those for direct business, their sound application to reinsurance is difficult.
1994
This session is a sequel to Traditional Regression Methods in Loss Reserving. That session presented regression methods for modeling loss development, which assume the parameters to be estimated are constant over time.
1994
We consider compound distributions where the counting distribution has the property that the ratio between successive probabilities may be written as the ratio of two polynomials. We derive a recursive algorithm for the compound distribution, which is more efficient than the one suggested by Panjer & Willmot (1982) and Willmot & Panjer (1987).
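As background for the class of recursions the paper generalizes, here is a minimal sketch of the classical Panjer recursion for a compound Poisson distribution, assuming integer-valued claim severities with no mass at zero (the function name and the example portfolio are illustrative, not from the paper):

```python
import math

def panjer_poisson(lam, f, smax):
    """Compound Poisson pmf via Panjer's recursion.

    lam:  Poisson parameter of the claim count N.
    f:    dict mapping positive-integer claim sizes j to P(X = j).
    Returns g[0..smax], the pmf of the aggregate claim S = X_1 + ... + X_N.
    """
    g = [0.0] * (smax + 1)
    g[0] = math.exp(-lam)  # P(S = 0) = P(N = 0) when f has no mass at 0
    for s in range(1, smax + 1):
        # Panjer recursion for Poisson counts: g(s) = sum_j (lam*j/s) f(j) g(s-j)
        g[s] = sum(lam * j / s * f.get(j, 0.0) * g[s - j]
                   for j in range(1, s + 1))
    return g

# Sanity check: degenerate severity X = 1 makes S ~ Poisson(lam).
g = panjer_poisson(2.0, {1: 1.0}, 5)
```

With a degenerate severity at 1, the recursion reproduces the Poisson(2) pmf exactly, which makes it easy to verify an implementation before feeding in a real severity distribution.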
1994
Finite-time ruin methods typically rely on diffusion approximations or discretization. We propose a new method based on the surplus process embedded at claim instants and develop a recursive scheme for calculating ruin probabilities, assuming that claim sizes follow a phase-type distribution. The proposed method is exact.
1994
During the past fifteen years, environmental legislation has proliferated at the federal, state and local levels. There is evidence that corporate executive and director attitudes have begun to reflect a greater awareness of, and increasing sensitivity to, environmental issues.
1994
Lloyd’s has gone through some significant changes during the last year including the formation of the Newco project, the introduction of corporate capital and the closing of several well-established syndicates. This session will focus on the issues facing Newco and on the various reserving methods being adopted. The session will also elaborate on the market features that created the need for a solution to the old year problems at Lloyd’s.
1994
While actuaries have had a Bayesian view of the world for decades, the adoption of methods that adhere strictly to the principles of modern Bayesian analysis has been slow. In his paper, Glenn Meyers shows that for a particular problem such an approach is not only feasible, but easy to complete. I am delighted that he has continued to take up the Bayesian cause, and with this note, I hope to provide just two extensions.
1994
This paper addresses the question: How valuable is a sample of excess claims in determining the expected claim severity in an excess layer of insurance? An established procedure to estimate this expected claim severity is to first fit a model distribution to claim size data and then, using the fitted distribution, estimate the expected claim severity in the given excess layer.
1994
Actuarial standards can be a double-edged sword providing both a safe harbor as well as potential points of attack in a malpractice action. The actuary may be called upon to demonstrate compliance with standards and defend procedures, documentation, and results. This session will present a mock trial based upon a loss reserve opinion for a hypothetical insurance company that subsequently becomes insolvent.
1994
This paper aims to present a statistical modeling framework and environment for conducting loss reserving analysis. The modeling framework and approach affords numerous advantages including increased accuracy of estimates and modeling of loss reserve variability.
1994
According to the Statement of Principles Regarding Property and Casualty Insurance Ratemaking, consideration must be given to the impact catastrophes have on loss experience and procedures must be developed to include an allowance for the catastrophe exposure in the insurance rate.
1994
Pricing a new product line with limited data poses a major challenge to the actuary. Standard actuarial methods require a quantity and consistency of data that may not be available. Therefore, unique solutions may be required. This does not mean that the actuary must develop an entire new methodology. Instead it is often possible to use a combination of techniques found in actuarial literature in reaching a solution.
1994
This paper gives a method for premium rating by postcode area. The method is based on spatial models in a Bayesian framework and uses the Gibbs sampler for estimation. A summary of the theory of Bayesian spatial methods is given, and the data analyzed by Taylor (1989) are reanalyzed. An indication is given of the wide range of models within this class which would be suitable for insurance data.
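The paper's spatial prior is postcode-specific, but the Gibbs sampler machinery it relies on can be sketched on a much simpler model. Below is a toy Gibbs sampler for a normal model with unknown mean and precision under conjugate priors; the model, priors and all names are illustrative assumptions, not the paper's spatial formulation:

```python
import random
from statistics import mean

def gibbs_normal(y, iters=3000, burn=500, seed=1):
    """Gibbs sampler for y_i ~ N(mu, 1/tau), with conjugate priors
    mu ~ N(0, 10^2) and tau ~ Gamma(shape=1, rate=1)."""
    rng = random.Random(seed)
    n, ybar = len(y), mean(y)
    mu, tau = 0.0, 1.0
    mus, taus = [], []
    for t in range(iters):
        # Draw mu | tau, y : normal with precision-weighted mean
        prec = tau * n + 1.0 / 100.0
        mu = rng.gauss(tau * n * ybar / prec, prec ** -0.5)
        # Draw tau | mu, y : Gamma(1 + n/2, rate = 1 + SSE/2)
        sse = sum((v - mu) ** 2 for v in y)
        tau = rng.gammavariate(1.0 + n / 2.0, 1.0 / (1.0 + sse / 2.0))
        if t >= burn:
            mus.append(mu)
            taus.append(tau)
    return mus, taus

# Simulated data from N(5, 1); the posterior mean of mu should sit
# very close to the sample mean, since the prior is nearly flat.
rng = random.Random(0)
data = [rng.gauss(5.0, 1.0) for _ in range(100)]
mus, taus = gibbs_normal(data)
```

The spatial version replaces the single mean with one parameter per postcode plus a prior that smooths neighbouring areas, but the alternating full-conditional draws work the same way.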
1994
An overly trendy title for a paper that shows how computing facilities now available can carry out improvements to Markowitz theory that Markowitz himself had suggested. It especially features measuring risk by only downward deviations from expected returns and moving away from normal-distribution assumptions.
1994
Shows how asset pricing formulas can be adapted to include transaction costs.
1994
The Esscher transform is a time-honored tool in actuarial science. This paper shows that the Esscher transform is also an efficient technique for valuing derivative securities if the logarithms of the prices of the primitive securities are governed by certain stochastic processes with stationary and independent increments.
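A toy illustration of the idea (not the paper's general treatment of processes with stationary, independent increments): for geometric Brownian motion, the Esscher parameter h* that makes the discounted stock price a martingale shifts the log-return drift to the risk-neutral value, so the resulting call price must agree with Black-Scholes regardless of the physical drift. The function names and parameters below are illustrative assumptions:

```python
import math

def norm_cdf(x):
    # Standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def esscher_call(S0, K, r, sigma, T, mu):
    """European call under the Esscher-transformed measure, when the
    log-return over T is N(mu*T, sigma^2*T)."""
    # Martingale condition fixes the Esscher parameter:
    h = (r - mu - 0.5 * sigma ** 2) / sigma ** 2
    # Transformed drift mu + h*sigma^2 equals r - sigma^2/2 (risk-neutral),
    # so the physical drift mu cancels out of the price.
    mu_star = mu + h * sigma ** 2
    d1 = (math.log(S0 / K) + (mu_star + sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, r = 5%, sigma = 20%, T = 1 year.
price = esscher_call(100.0, 100.0, 0.05, 0.2, 1.0, mu=0.10)
```

Because h* absorbs the drift, calling `esscher_call` with any value of `mu` returns the same price, which is exactly the martingale property the transform is chosen to enforce.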
1994
Based on recurrence equation theory and relative error (rather than absolute error) analysis, the concept and criterion for the stability of a recurrence equation are clarified. A family of recursions, called congruent recursions, is proved to be strongly stable in evaluating its non-negative solutions. A type of strongly unstable recursion is identified.
1994
An iteration scheme is derived for calculating the aggregate claims distribution in the individual life model. The (exact) procedure is an efficient reformulation of De Pril's (1986) algorithm, considerably reducing both the number of arithmetic operations to be carried out and the amount of data to be kept at each step of the iteration.
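For context, a direct sketch of De Pril's (1986) original recursion that the paper reformulates, for a portfolio grouped by integer benefit amount and mortality class (the function name and grouping format are illustrative assumptions):

```python
from math import prod

def de_pril(portfolio, smax):
    """Exact aggregate-claims pmf in the individual life model.

    portfolio: list of (benefit, q, n) groups, with integer benefit >= 1,
    death probability q, and n identical policies in the group.
    Returns f[0..smax], the pmf of aggregate claims S.
    """
    f = [0.0] * (smax + 1)
    # P(S = 0): every policy survives
    f[0] = prod((1.0 - q) ** n for _, q, n in portfolio)
    benefits = sorted({b for b, _, _ in portfolio})

    def h(i, k):
        # De Pril's auxiliary coefficients for benefit amount i
        return i * (-1.0) ** (k + 1) * sum(
            n * (q / (1.0 - q)) ** k for b, q, n in portfolio if b == i)

    for x in range(1, smax + 1):
        total = 0.0
        for i in benefits:
            if i > x:
                break
            for k in range(1, x // i + 1):
                total += f[x - i * k] * h(i, k)
        f[x] = total / x
    return f

# Two unit-benefit policies with q = 0.5 each: S ~ Binomial(2, 0.5).
f = de_pril([(1, 0.5, 2)], 2)
```

The tiny binomial portfolio is a convenient correctness check; the cost of this direct form grows with the number of benefit amounts and the k-summation depth, which is what the paper's reformulation reduces.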
1994
The authors categorize proxies that produce particular relations between expected returns and true betas.
1994
Goovaerts and Kaas (1991) present a recursive scheme, involving Panjer's recursion, to compute the compound generalized Poisson distribution (CGPD). In the present paper, we study the CGPD in detail. First, we express the generating functions in terms of Lambert's W function. Then, for absolutely continuous claim severities, an integral equation for the pdf of the CGPD is derived from first principles.
1994
Our work is aimed at testing and further developing the asset models, the Wilkie model and its variants, presented in the Practical Risk Theory textbook by Daykin, Pentikäinen and Pesonen (1993). Empirical data collected from 12 countries are studied.
1994
We consider a risk generating claims for a period of N consecutive years (after which it expires), N being an integer-valued random variable. Let X_k denote the total claims generated in the kth year, k = 1, …, N. The X_k's are assumed to be independent and identically distributed random variables, and claims are paid at the end of the year.
1994
Data Administration Including Warehousing & Design (narrow topic or advanced)