Browse Research
1994
Challenges foundations of financial theory (expected utility, the Sharpe Diagonal Model, and variance as a satisfactory proxy for risk).
1994
The automobile third party insurance merit-rating systems of 22 countries are simulated and compared, using as main tools the stationary average premium level, the variability of the policyholders' payments, their elasticity with respect to the claim frequency, and the magnitude of the hunger for bonus. Principal components analysis is used to define an "Index of Toughness" for all systems.
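As an illustration of the stationary calculations involved (not taken from the study), a minimal Python sketch for one hypothetical three-class bonus-malus system, with illustrative rules and premiums:

```python
# A minimal sketch (not the paper's comparison): build the Markov transition matrix
# implied by a Poisson claim frequency for a hypothetical 3-class bonus-malus system,
# solve for the stationary distribution, and report the stationary average premium.
import numpy as np

premiums = np.array([100.0, 80.0, 60.0])        # class 0 = malus ... class 2 = best bonus
lam = 0.10                                      # assumed claim frequency
p0 = np.exp(-lam)                               # probability of a claim-free year

# Transition rules (illustrative): one class up if claim-free, back to class 0 otherwise.
P = np.zeros((3, 3))
for i in range(3):
    P[i, min(i + 1, 2)] += p0                   # claim-free: move up (or stay at the top)
    P[i, 0] += 1.0 - p0                         # at least one claim: fall to the bottom

# Stationary distribution pi solves pi P = pi with sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print("stationary class distribution:", np.round(pi, 4))
print("stationary average premium level:", round(float(pi @ premiums), 2))
```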
1994
This paper develops a discrete time model for valuing treasury bills and either forward or futures contracts written against them. It provides formulae for bill prices, forward prices, futures prices, and their conditional variances and risk premiums. The interest rate process is described by a multiplicative binomial random walk whose features conform to some principal characteristics of observed processes.
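A minimal sketch (assumptions mine, not the paper's formulae) of the lattice idea: a multiplicative binomial random walk for the one-period rate, with a zero-coupon bill priced by backward induction under an assumed risk-neutral probability q:

```python
# Multiplicative binomial short-rate lattice; bill priced by discounting backward.
import numpy as np

r0, u, d, q, n = 0.05, 1.10, 0.92, 0.5, 4       # illustrative parameters, n periods

def rate(t, k):
    # Short rate at step t after k up-moves.
    return r0 * (u ** k) * (d ** (t - k))

# Value at maturity is 1; discount back one period at the prevailing short rate.
value = np.ones(n + 1)
for t in range(n - 1, -1, -1):
    value = np.array([
        (q * value[k + 1] + (1 - q) * value[k]) / (1 + rate(t, k))
        for k in range(t + 1)
    ])

print(f"time-0 price of the n-period bill: {value[0]:.6f}")
```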
1994
For some time now, the convenient and fast calculability of collective risk models using the Panjer algorithm has been well known, and practitioners almost always make use of collective risk models in their daily numerical computations.
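For reference, a minimal Python sketch of the Panjer recursion for a compound Poisson distribution (the algorithm the abstract refers to), with an illustrative severity distribution already discretized on a unit lattice:

```python
# Panjer's (a, b, 0) recursion, Poisson case (a = 0, b = lam).
import numpy as np

lam = 2.0                                        # Poisson claim count parameter
f = np.array([0.0, 0.5, 0.3, 0.2])               # discretized severity pmf on 0, 1, 2, 3
n_max = 40                                       # evaluate the aggregate pmf up to here

g = np.zeros(n_max + 1)
g[0] = np.exp(-lam * (1.0 - f[0]))               # P(S = 0)
for s in range(1, n_max + 1):
    j = np.arange(1, min(s, len(f) - 1) + 1)
    g[s] = (lam / s) * np.sum(j * f[j] * g[s - j])

print("P(S = 0..5):", np.round(g[:6], 5))
print("total probability captured:", round(g.sum(), 5))
```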
1994
A practical method of allowing for covariates when modeling compound Poisson claims distributions is discussed.
KEYWORDS: premium rating; compound Poisson distributions; generalized linear models; power variance function.
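A minimal sketch (assumptions mine) of the idea: covariates enter through a GLM with a Tweedie power variance function V(mu) = mu^p, 1 < p < 2, which corresponds to a compound Poisson-gamma claims distribution. Data and rating factors below are simulated purely so the example runs; statsmodels' Tweedie family is one convenient choice:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
X = pd.DataFrame({"age_band": rng.integers(0, 4, n), "urban": rng.integers(0, 2, n)})

# Simulate compound Poisson-gamma aggregate losses per policy so the example is self-contained.
freq = 0.08 * np.exp(0.25 * X["age_band"] + 0.30 * X["urban"])
counts = rng.poisson(freq)
loss = np.array([rng.gamma(2.0, 400.0, k).sum() for k in counts])

design = sm.add_constant(X.astype(float))
fit = sm.GLM(loss, design,
             family=sm.families.Tweedie(var_power=1.5)).fit()   # log link by default
print(fit.params)          # fitted rating relativities on the log scale
```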
1994
In deductible pricing formulas presented in the CAS Part 9 syllabus, a "safety factor" is mentioned but not fully explained. This paper describes the purpose, scope, concepts, and applications of the safety factor in Workers Compensation deductible programs. A procedure for quantification of this factor is presented using a component approach. The authors also discuss the theory and practice of estimating each component.
1994
Reinsurance Research - Pricing/Contract Design
1994
Traditional actuarial techniques may not always produce appropriate pricing and reserving analyses for the self-insured market. In particular, potential self-insurers often have very limited historical data. The paper presents an overview of the various self-insurance mechanisms and discusses common limitations inherent in self-insurers' data.
1994
The so-called "parallelogram" method is standard in actuarial practice as a conceptual and calculational device for illustrating loss and exposure statistics; ratemaking is a prime example. In this article we propose a similar device based on three-variable calculus.
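For context, a minimal Python sketch of the classic parallelogram calculation done numerically (dates and rate changes are illustrative, and the assumptions of annual policies written uniformly are mine):

```python
# Average earned rate level in a calendar year and the resulting on-level factor.
import numpy as np

rate_changes = [(1993.50, 1.00), (1994.25, 1.05)]   # (effective date, cumulative rate level)

def rate_level(written):
    level = rate_changes[0][1]
    for eff, lvl in rate_changes:
        if written >= eff:
            level = lvl
    return level

def earned_overlap(written, cy_start=1994.0, cy_end=1995.0, term=1.0):
    # Exposure an annual policy written at `written` earns inside the calendar year.
    return max(0.0, min(written + term, cy_end) - max(written, cy_start))

written_dates = np.linspace(1993.0, 1995.0, 20_001)          # uniform writings
weights = np.array([earned_overlap(w) for w in written_dates])
levels = np.array([rate_level(w) for w in written_dates])
avg_earned_level = (weights * levels).sum() / weights.sum()

print(f"average earned rate level in CY 1994: {avg_earned_level:.4f}")
print(f"on-level factor to current rate level: {rate_changes[-1][1] / avg_earned_level:.4f}")
```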
Loss Development, IBNR
1994
Regulation
1994
The claims generating process for a non-life insurance portfolio is modelled as a marked Poisson process, where the mark associated with an incurred claim describes the development of that claim until final settlement. An unsettled claim is at any point in time assigned to a state in some state-space, and the transitions between different states are assumed to be governed by a Markovian law.
1994
This paper considers the application of state space modelling to the chain ladder linear model in order to allow the run-off parameters to vary with accident year. In the usual application of the chain ladder technique, the development factors are assumed to be the same for each accident year. This implies that the run-off shape does not alter with accident year.
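For contrast, a minimal Python sketch of the ordinary chain ladder calculation the paper generalizes: volume-weighted development factors, assumed identical across accident years, applied to square a small cumulative run-off triangle (figures are illustrative):

```python
import numpy as np

tri = np.array([                       # cumulative paid, rows = accident years
    [100.0, 180.0, 220.0, 240.0],
    [110.0, 200.0, 245.0, np.nan],
    [120.0, 215.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])
n = tri.shape[0]

# Volume-weighted development factors, one per development period.
factors = []
for j in range(n - 1):
    mask = ~np.isnan(tri[:, j + 1])
    factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())

# Project each accident year to ultimate with the common factors.
ultimates = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(ultimates[i, j + 1]):
            ultimates[i, j + 1] = ultimates[i, j] * factors[j]

print("development factors:", np.round(factors, 4))
print("ultimate by accident year:", np.round(ultimates[:, -1], 1))
```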
1994
Mean/variance methodology has been commonly used as a basis for making asset allocation decisions. Sherris (1992) demonstrated how this approach was really a special case of a more general utility maximization problem. This paper intends to carry this idea further by applying numerical techniques to obtain the optimal asset allocation strategy, as well as incorporating explicit constraints into the selection problem.
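A minimal sketch (not the paper's algorithm) of solving a constrained mean-variance problem numerically, here maximizing E[r_p] - (gamma/2) Var[r_p] subject to full investment and no short sales; the expected returns, covariance matrix, and risk aversion are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.04, 0.07, 0.10])                        # expected returns
cov = np.array([[0.001, 0.000, 0.000],
                [0.000, 0.010, 0.004],
                [0.000, 0.004, 0.030]])                   # covariance matrix
gamma = 5.0                                               # risk aversion

objective = lambda w: -(w @ mu - 0.5 * gamma * w @ cov @ w)
result = minimize(objective, x0=np.full(3, 1 / 3), method="SLSQP",
                  bounds=[(0.0, 1.0)] * 3,
                  constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
print("optimal weights:", np.round(result.x, 4))
```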
1994
This article presents a detailed analysis of asset liability management strategies.
1994
This paper explores the collective risk model as a vehicle for estimating the probability distribution for reserves. Though the basic model has been suggested in the past and provides a direct means to estimate process uncertainty, it does not directly address the potentially more significant problem of parameter uncertainty.
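A minimal sketch (assumptions mine) of the collective risk model as a reserve distribution: simulate aggregate unpaid losses as a Poisson number of claims with lognormal severities, once with the claim frequency fixed (process uncertainty only) and once with the frequency itself drawn from a distribution (adding a parameter-uncertainty component). All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
sims, lam, mu_sev, sigma_sev = 20_000, 50.0, 8.0, 1.2

def aggregate(lams):
    counts = rng.poisson(lams)
    return np.array([rng.lognormal(mu_sev, sigma_sev, n).sum() for n in counts])

process_only = aggregate(np.full(sims, lam))
with_parameter = aggregate(rng.gamma(shape=25.0, scale=lam / 25.0, size=sims))  # uncertain frequency

for name, s in [("process only", process_only), ("process + parameter", with_parameter)]:
    print(f"{name:>20}: mean {s.mean():,.0f}  cv {s.std() / s.mean():.3f}  "
          f"95th pct {np.quantile(s, 0.95):,.0f}")
```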
1994
Aggregate loss distributions have been used in a number of different applications over the last few years. These applications have usually focused on the distribution of losses at ultimate or final values and have not studied how losses move to ultimate values over time. The approach outlined in this note models claim activity through the use of transition matrices.
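A minimal sketch of the transition-matrix idea: claims occupy discrete states, and a Markov transition matrix moves the portfolio from one evaluation date to the next. The states and probabilities below are illustrative, not taken from the note:

```python
import numpy as np

states = ["unreported", "open", "closed"]
T = np.array([[0.60, 0.35, 0.05],      # unreported -> ...
              [0.00, 0.55, 0.45],      # open       -> ...
              [0.00, 0.00, 1.00]])     # closed is absorbing

x = np.array([100.0, 0.0, 0.0])        # expected claim counts by state at time 0
for period in range(1, 6):
    x = x @ T                          # one development period
    print(f"period {period}: " + ", ".join(f"{s}={v:.1f}" for s, v in zip(states, x)))
```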
1994
Uses an artificial intelligence approach to combine financial ratios into an insolvency predictor.
1994
This paper develops a three-dimensional statistical approach to the estimation of the mean and the standard deviation of pure incurred but not reported (IBNR) reserves. This means that the time of occurrence, the reporting lag, and the claim severity are separately modeled. It is assumed that, beyond any fixed time $t$, the claim number development process is Poisson and that the severity of loss depends on the length of the reporting lag.
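A minimal sketch (assumptions mine) of the paper's ingredients: Poisson occurrences, a reporting-lag distribution, and a severity whose mean depends on the lag. Pure IBNR claims at valuation time $t$ are occurrences whose lag carries them past $t$; their mean and standard deviation are estimated here by simulation with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
t, lam, sims = 5.0, 40.0, 10_000          # valuation time, annual claim rate, simulations

reserves = np.empty(sims)
for i in range(sims):
    n = rng.poisson(lam * t)                              # claims occurring in (0, t]
    occ = rng.uniform(0.0, t, n)
    lag = rng.exponential(1.5, n)                         # reporting lag
    ibnr = occ + lag > t                                  # still unreported at time t
    sev = rng.gamma(2.0, 500.0 * (1.0 + 0.2 * lag[ibnr])) # severity grows with the lag
    reserves[i] = sev.sum()

print(f"pure IBNR reserve: mean {reserves.mean():,.0f}, sd {reserves.std():,.0f}")
```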
1994
The goal of this paper is to develop an arbitrage free valuation formula for an American put option on a catastrophe insurance futures contract. This contract (denoted CATS) was introduced in December 1992 by the Chicago Board of Trade. The option buyer’s valuation problem is formulated as an optimal stopping problem within a continuous trading, arbitrage-free and complete financial market.
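A minimal sketch (not the paper's continuous-time, arbitrage-free valuation): an American put on a futures price valued on a Cox-Ross-Rubinstein lattice with early exercise checked at every node. For a futures underlier the risk-neutral up-probability is q = (1 - d) / (u - d). All inputs are illustrative:

```python
import numpy as np

F0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.30, 0.5, 200   # futures price, strike, etc.
dt = T / n
u, d = np.exp(sigma * np.sqrt(dt)), np.exp(-sigma * np.sqrt(dt))
q = (1.0 - d) / (u - d)
disc = np.exp(-r * dt)

# Terminal futures prices and payoffs.
F = F0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
V = np.maximum(K - F, 0.0)

# Step back through the tree, taking the larger of continuation and exercise value.
for step in range(n - 1, -1, -1):
    F = F0 * u ** np.arange(step, -1, -1) * d ** np.arange(0, step + 1)
    V = np.maximum(disc * (q * V[:-1] + (1.0 - q) * V[1:]), K - F)

print(f"American put value on the futures: {V[0]:.4f}")
```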
1994
As the cost of Workers’ Compensation insurance coverage continues to rise, many employers are looking for alternative ways to fulfill this obligation. Two products stand out in particular: the Large Dollar Deductible plan, and Excess WC insurance, which covers excess loss exposure for a self-insurance program.
1994
Homeowners Insurance to Value is a sometimes-forgotten, yet critical, factor in the ratemaking process. Some aspects of Insurance to Value have remained the same while others have changed. Actuaries must consider the impact of Insurance to Value, which can change quickly, as a part of their Homeowner ratemaking analyses.
1994
The goal of the paper is to introduce catastrophe insurance futures and options. We first provide background on natural catastrophes and the associated financial effects on both the primary and reinsurance markets for property insurance. Next, we describe the recently introduced catastrophe insurance futures product and present a general framework for its pricing.
1994
Forty-one years of catastrophe loss data by state are used in this study to produce a model for rating catastrophe covers for insurers in any region of the continental United States. Smooth surfaces are fitted to the data by region, and experience rating is applied in an attempt to give appropriate weight to regional departures from the smoothed results.