Browse Research

1996
Maximum likelihood estimation is derived for the Lagrangian Poisson distribution for a simple and a loglinear model and illustrated with real data. Keywords: loglinear, Lagrangian Poisson, maximum likelihood, Newton-Raphson
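A minimal fitting sketch, assuming Consul's parameterisation of the Lagrangian (generalized) Poisson, P(X = x) = θ(θ + λx)^(x−1) e^(−(θ+λx))/x!; this is an illustration rather than the paper's derivation, the data vector is hypothetical, and a general-purpose optimiser stands in for the Newton-Raphson iteration named in the keywords:

```python
# Illustrative MLE for the Lagrangian (generalized) Poisson distribution.
# Assumes Consul's pmf: P(X=x) = theta*(theta+lam*x)**(x-1)*exp(-(theta+lam*x))/x!
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_loglik(params, x):
    theta, lam = params
    if theta <= 0 or not (0.0 <= lam < 1.0):
        return np.inf                      # keep the search in the valid region
    return -np.sum(np.log(theta)
                   + (x - 1) * np.log(theta + lam * x)
                   - (theta + lam * x)
                   - gammaln(x + 1))

x = np.array([0, 1, 1, 2, 3, 0, 5, 2, 1, 4])    # hypothetical claim counts
fit = minimize(neg_loglik, x0=[x.mean(), 0.1], args=(x,), method="Nelder-Mead")
theta_hat, lam_hat = fit.x
```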
1996
Reinsurance Research - Loss Distributions, Size of
1996
Data Administration Including Warehousing & Design (general or introductory); Designed for IS managers and database designers, this practical and thoughtful exposition explores dimensional data warehouse concepts. Dimensional database warehouses incorporate three or more dimensions, and lend themselves to multi-dimensional data modeling and analysis.
1996
Occurrences and developments of claims are modelled as a marked point process. The individual claim consists of an occurrence time, two covariates, a reporting delay, and a process describing partial payments and settlement of the claim. Under certain likelihood assumptions the distribution of the process is described by 14 one-dimensional components. The modelling is nonparametric Bayesian.
1996
The premiums for a bonus-malus system which stays in financial equilibrium over the years are calculated. This is done by minimizing a quadratic function of the difference between the premium for an optimal BMS with an infinite number of classes and the premium for a BMS with a finite number of classes, weighted by the stationary probability of being in a certain class, and by imposing various constraints on the system.
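In generic notation (not necessarily the paper's), with π_j the stationary probability of class j, b_j the premium of the finite-class BMS, and b*_j the premium of the optimal infinite-class BMS, the calculation described above amounts roughly to

```latex
\min_{b_1,\dots,b_n}\; \sum_{j=1}^{n} \pi_j \bigl(b_j - b^{*}_{j}\bigr)^{2}
\qquad \text{subject to} \qquad \sum_{j=1}^{n} \pi_j\, b_j = \bar{P},
```

where the constraint shown is the financial-equilibrium condition that the average premium collected under the stationary distribution equals the required premium, and the paper's other constraints would enter the same way.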
1996
In the wake of the recent catastrophes, a new way of transferring insurance risk was born. In 1993, the Chicago Board of Trade began trading contracts on an index sensitive to insurer catastrophe experience. Such indices provide an insurer with a means to transfer a portion of its catastrophe risk to the capital markets by buying futures and options contracts.
1996
The chain ladder method is a simple and suggestive tool in claims reserving, and various attempts have been made aiming at its justification in a stochastic model. Remarkable progress has been achieved by Schnieper and Mack who considered models involving assumptions on conditional distributions. The present paper extends the model of Mack and proposes a basic model in a decision theoretic setting.
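For reference, the conditional assumptions of the distribution-free chain ladder model usually attributed to Mack can be written as below, with C_{i,k} the cumulative claims of accident year i at development year k (standard notation, not necessarily the paper's):

```latex
E\bigl[C_{i,k+1} \mid C_{i,1},\dots,C_{i,k}\bigr] = f_k\, C_{i,k},
\qquad
\operatorname{Var}\bigl[C_{i,k+1} \mid C_{i,1},\dots,C_{i,k}\bigr] = \sigma_k^{2}\, C_{i,k},
\qquad
\hat{f}_k = \frac{\sum_i C_{i,k+1}}{\sum_i C_{i,k}} .
```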
1996
The risk-based capital requirements adopted by the NAIC in 1994 are a major advance in the solvency regulation of property/casualty insurance companies. The components of the risk-based capital formula are grounded in actuarial and financial analyses of the risks faced by insurance companies and of the capital needed to guard against those risks.
1996
Like primary insurance, reinsurance is a mechanism for spreading risk. A reinsurer takes some portion of the risk assumed by the primary insurer (or other reinsurer) for premium charged. Most of the basic concepts for pricing this assumption of risk are the same as those underlying ratemaking for other types of insurance.
1996
Several approaches for estimating liabilities under a high deductible program are described. Included is a proposal for a more sophisticated approach relying upon a loss distribution model. Additionally, the discussion addresses several related issues dealing with deductible size and mix, absence of long-term histories, as well as the determination of consistent loss development factors among deductible limits.
1996
As new types of losses appear for which traditional "triangular" analysis is inadequate, different approaches must be used. This paper defines policy-event based loss estimation (PEBLE), which is being used primarily in developing natural disaster and toxic tort rates and loss estimates. Although PEBLE appears to be new, its history goes back to life and disability reserving.
1996
Asset share pricing models are used extensively in life and health insurance premium determination. In contrast, property/casualty ratemaking procedures consider only a single period of coverage. This is true for both traditional methods, such as loss ratio and pure premium ratemaking, and financial pricing models, such as discounted cash flow or internal rate of return models.
1996
This paper presents a method for estimating the premium asset on retrospectively rated policies, using the functional relationship between the losses and the retrospective premium. This relationship is examined using the historical premium and loss development data and the retro rating parameters sold in the underlying policy.
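The functional relationship in question is, in broad terms, the standard retrospective rating formula; the symbols below are generic illustrations rather than the paper's notation:

```latex
R \;=\; \min\!\Bigl(R_{\max},\; \max\bigl(R_{\min},\; (B + c\,L)\,T\bigr)\Bigr),
```

where B is the basic premium, c the loss conversion factor, L the (limited) incurred losses, T the tax multiplier, and R_min, R_max the minimum and maximum premiums; the premium asset arises because R continues to move with L as the losses develop.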
1996
Most businesses that purchase property/casualty insurance design these programs using inappropriate analytical methods that result in inefficient insurance programs. Such programs often provide insurance to reduce variability that does not concern an entity's stakeholders, while offering no protection for the catastrophic losses that do concern them.
1996
The size distribution of yearly claims in the French business interruption insurance branch is a Pareto law with an extremely long tail. The behavior of that law reflects the fact that the total value of the yearly claims is dominated by a small number of huge claims. The estimated characteristic exponent of the tail is very close to one.
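As one illustration of how such a tail exponent might be estimated (not necessarily the estimator used in the paper), a Hill-type estimate from the k largest claims could be sketched as follows; the simulated data and the choice of k are hypothetical:

```python
# Illustrative Hill estimator for the tail exponent alpha of Pareto-type claims.
import numpy as np

def hill_estimator(claims, k):
    """Tail-index estimate based on the k largest observations."""
    x = np.sort(np.asarray(claims, dtype=float))
    top = x[-k:]                  # k largest claims
    threshold = x[-k - 1]         # (k+1)-th largest claim as the threshold
    return 1.0 / np.mean(np.log(top) - np.log(threshold))

claims = np.random.pareto(a=1.05, size=5000) + 1.0   # simulated heavy-tailed claims
alpha_hat = hill_estimator(claims, k=200)             # should come out near 1
```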
1996
One of the new challenges facing the workers compensation reserving actuary is the incorporation of cost containment measures into the reserving process. The drastic reduction in medical payments due to these measures distorts historical development patterns and makes the prediction of future development patterns increasingly uncertain.
1996
This paper explains the most commonly used complements of credibility and offers a comparison of the effectiveness of the various methods. It includes numerous examples. It covers credibility complements used in excess ratemaking as well as those used in first dollar ratemaking. It also offers six criteria for judging the effectiveness of various credibility complements.
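For context, a complement of credibility C is the quantity that receives the remaining weight in the familiar credibility-weighted estimate (generic notation, not the paper's):

```latex
\hat{X} \;=\; Z\,\bar{X} + (1 - Z)\,C,
```

where Z is the credibility assigned to the subject experience; the paper's criteria concern how well various choices of C perform in that role.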
1996
This paper discusses the strengths, weaknesses, and application of several methods used to obtain an estimated ultimate loss distribution from data whose valuation is less than final. The central issues are introduced by examining several basic methods via a simple example.
1996
A stochastic planning model is a representation, to an appropriate level of detail, of all of the cash flows of an insurance company, where the variables are stochastic (randomly generated). The variables are connected by simple econometric equations whose form and parameters are estimated from the relevant underlying data.
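As a toy illustration (purely hypothetical, not the model described here) of the kind of stochastically generated, econometrically linked variable such a model contains, a short-term interest-rate driver might be simulated as a simple autoregressive process:

```python
# Toy sketch: one stochastic driver (an AR(1) interest rate) of the kind a
# stochastic planning model might generate and feed into cash-flow equations.
import numpy as np

rng = np.random.default_rng(seed=1)
n_years, mean_rate, persistence, vol = 10, 0.05, 0.7, 0.01   # assumed parameters

rates = [0.05]                                    # hypothetical starting rate
for _ in range(n_years):
    shock = rng.normal(0.0, vol)
    next_rate = mean_rate + persistence * (rates[-1] - mean_rate) + shock
    rates.append(max(next_rate, 0.0))             # floor at zero for simplicity

# Downstream cash flows (e.g., investment income) would then be simple
# functions of these simulated rates.
```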
1996
This paper describes a new methodology for determining a reserve for unallocated claim expenses. While the discussion will focus on workers compensation claims, the methodology is equally applicable to other lines of business. This paper will describe both a methodology to determine the reserve for all claims (including IBNR claims) and a procedure to determine the reserve for claims reported to date (excluding IBNR claims).
1996
This paper describes the dynamic financial analysis model currently being used by a property catastrophe reinsurer to manage its business. The model is an integral part of the day-to-day operations at the Company and is used as a decision-making tool in the underwriting, investment, and capital management processes. The paper begins by describing the framework that the Company uses for risk management.
1996
This paper describes a financial model currently being used by a major U.S. multi-line insurer.