Browse Research

1980
Mr. Stanard's paper opens a new area of actuarial research, namely the use of simulation to investigate the reliability of commonly used pricing (and related) models. He is not using simulation to forecast insurance results directly, but rather to determine how well a given technique for such forecasting can be expected to perform.
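The approach described here, simulation used to test a forecasting technique rather than to produce a forecast, can be sketched in a few lines: generate claims from a known severity model, apply an estimator, and measure its error over many trials. The lognormal parameters and sample sizes below are illustrative assumptions, not Mr. Stanard's actual setup.

```python
import math
import random
import statistics

def pure_premium_rmse(n_trials=2000, n_claims=50, mu=0.0, sigma=1.0, seed=42):
    """Estimate the RMSE of the sample-mean pure premium when claims are
    lognormal(mu, sigma) with a known true mean (illustrative parameters)."""
    rng = random.Random(seed)
    true_mean = math.exp(mu + sigma ** 2 / 2)  # mean of a lognormal variable
    errors = []
    for _ in range(n_trials):
        claims = [rng.lognormvariate(mu, sigma) for _ in range(n_claims)]
        errors.append(statistics.mean(claims) - true_mean)
    return true_mean, math.sqrt(statistics.mean(e * e for e in errors))
```

The same harness can compare competing estimators (a trimmed mean, a fitted-distribution mean) on identical simulated data, which is the sense in which simulation measures a technique's reliability.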
1980
Using an individual insured's own past loss experience to arrive at its rate is a procedure that is used in many different areas of insurance.
1980
The authors are to be commended for their willingness to address as controversial a subject as expense allocation. Their approach provides a basic introduction to the subject. This reviewer, with limited experience with the subject, feels that a few general comments are in order.
Ex/Ind. Risk Rating Plans
1980
Until the present time, the great majority of actuarial study and literature in the ratemaking area has revolved around analyzing and quantifying the loss component of the insurance rate. Actuaries have evolved an elaborate system in which losses are trended, developed and credibility weighted, and in which premiums are placed at current rates or at least current rate levels.
1980
LOB-Auto Physical Damage
1980
It is often necessary to estimate probability distributions to describe the loss processes covered by insurance contracts. For example, in order that the premium charged for a particular contract be correct according to any reasonable premium calculation principle, it must be based upon the underlying loss process for the contract.
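As a concrete instance of estimating such a distribution, the maximum-likelihood fit of an exponential severity model reduces to the reciprocal of the sample mean. This is a deliberately simple sketch; real severity work would typically use heavier-tailed families.

```python
import math

def fit_exponential_mle(losses):
    """MLE for an exponential severity model: rate = 1 / sample mean."""
    return len(losses) / sum(losses)

def exceedance_prob(rate, x):
    """P(X > x) under the fitted exponential model."""
    return math.exp(-rate * x)
```

With the fitted rate in hand, quantities a premium calculation principle needs, such as the expected loss or the probability of exceeding a policy limit, follow directly from the model.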
1980
It is demonstrated that the problems of balancing a reinsurance network and finding the maximum flow in a graph are identical. Gale's theorem is applied first in order to prove a conjecture of Sousselier concerning simple first order networks, next to extend those results to any network. The balanced reinsurance scheme can effectively be constructed by means of Ford and Fulkerson's algorithm, as is shown by an example.
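The max-flow machinery referred to here can be illustrated with a compact Edmonds-Karp implementation, the breadth-first-search variant of Ford and Fulkerson's algorithm. The tiny network in the usage example is hypothetical, not the paper's reinsurance network.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow. `capacity` is a dict of dicts:
    capacity[u][v] = capacity of the directed edge u -> v."""
    nodes = set(capacity)
    for u in capacity:
        nodes.update(capacity[u])
    # build residual capacities, including zero-capacity reverse edges
    res = {u: {} for u in nodes}
    for u in capacity:
        for v, c in capacity[u].items():
            res[u][v] = res[u].get(v, 0) + c
            res[v].setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # recover the path and its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
        flow += bottleneck
```

For example, `max_flow({'s': {'a': 3, 'b': 2}, 'a': {'t': 2}, 'b': {'t': 3}}, 's', 't')` augments along s-a-t and s-b-t; in the balanced-reinsurance reading, edge capacities play the role of cession limits between companies.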
1980
Aggregate loss probability is an effective tool in actuarial rate making, risk charging, and retention analysis for both primary and secondary insurance companies. A noticeable trend over recent years indicates that it also is becoming an indispensable element in the risk management operations of many manufacturing and commercial firms.
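A minimal way to make aggregate loss probabilities operational is Monte Carlo simulation of a compound Poisson model. The sketch below estimates the probability that aggregate losses exceed a retention; the Poisson frequency and lognormal severity parameters are illustrative assumptions, not figures from the paper.

```python
import math
import random

def sample_poisson(rng, lam):
    """Knuth's multiplication method for Poisson sampling (fine for small lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def aggregate_exceedance(retention, lam=3.0, mu=8.0, sigma=1.2,
                         n_sims=20000, seed=1):
    """Monte Carlo estimate of P(S > retention), where S is the sum of
    N ~ Poisson(lam) lognormal(mu, sigma) claims."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        n = sample_poisson(rng, lam)
        s = sum(rng.lognormvariate(mu, sigma) for _ in range(n))
        if s > retention:
            exceed += 1
    return exceed / n_sims
```

Sweeping the retention over a grid of candidate values turns this directly into the retention-analysis exercise the abstract mentions.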
1980
The purpose of this paper is to add another chapter to the fund of knowledge being accumulated on loss reserving techniques.
1980
Much work has been done in the past few years on the applications of Bayesian credibility to insurance pricing. This work has been born of necessity due to the failure of "classical" credibility theories. Recent work by Bühlmann and Straub as well as Morris and Van Slyke incorporates the Bayesian concept of utilizing as much information as possible from historical data in predicting behavior for a segment of a population.
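The Bayesian blend this line of work describes, weighting a risk's own experience against the population mean, is captured by the Bühlmann credibility factor Z = n/(n + v/a), where v is the expected process variance and a the variance of hypothetical means. Below is a textbook sketch assuming equal observation counts per risk; it is not the exposure-weighted Bühlmann-Straub form.

```python
import statistics

def buhlmann_credibility(experience):
    """Empirical Buhlmann credibility.
    experience: list of per-risk loss histories, each a list of n observations.
    Returns (Z, credibility premiums)."""
    n = len(experience[0])
    risk_means = [statistics.mean(r) for r in experience]
    grand_mean = statistics.mean(risk_means)
    v = statistics.mean(statistics.variance(r) for r in experience)  # process variance
    a = max(statistics.variance(risk_means) - v / n, 0.0)  # var. of hypothetical means
    Z = n * a / (n * a + v) if (n * a + v) > 0 else 0.0    # = n / (n + v/a)
    premiums = [Z * m + (1 - Z) * grand_mean for m in risk_means]
    return Z, premiums
```

Each premium is a convex combination of the risk's own mean and the grand mean, so the portfolio total is preserved while individual risks are shrunk toward the collective.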
1980
An important fact brought into sharp focus by the papers submitted to last year's program is that a healthy insurance enterprise not only must produce adequate earnings but must do so steadily and predictably. That is to say: the risks that confront the enterprise must be tightly controlled. Management must steer a course mindful not only of reasonable expectations but also of unforeseen deviations therefrom.
1980
Analytical steps towards a numerical calculation of the ruin probability for a finite period when the risk process is of the Poisson type or of the more general type studied by Sparre Andersen (Astin Bulletin, vol. 6, pp. 54-65), by Olof Thorin, Stockholm. On pp. 56-57 of the above-mentioned paper there is a digression about K(t), the distribution function of the time between successive claims.
1980
As most actuaries that have had an opportunity to prepare a rate filing will tell you, the ratemaker will generally have to convince three principals that the rate he is generating is a reasonable one. First, he must convince himself. This first step alone is, in many cases, a difficult and laborious task due to the many technical uncertainties with which an actuary must deal.
1980
A recent article in the Journal of Commerce cited an address given at the convention of The National Association of Casualty & Surety Agents by its President, Mr. John S. Childross, Vice President of Marsh & McLennan. His remarks emphasized the value of educating the public regarding casualty ratemaking procedures and company needs.
General/Premium Analysis/Regulation
1980
Insurance pricing is a combination of rating classification and underwriting selection. The author defines necessary and sufficient standards for rating classification which are not met by underwriting selection or insurance pricing as a whole. It is unreasonable to assert that it is necessary for rating classifications to meet certain standards when the pricing structure as a whole does not meet those same standards.
1980
The escalating inflation of the past decade spawned complaints about more than just overall insurance rate increases. Unlike most other products, insurance costs depend upon buyer characteristics, so questions of fairness have naturally arisen as some insureds were confronted with four-digit auto insurance prices along with double-digit inflation.
1980
This paper examines the economic consequences of allocating common costs by (1) gross revenues, (2) directly attributable costs, and (3) relative output levels (such as ton-miles) to determine fully distributed cost prices for regulated firms.
1980
A premium calculation principle is a functional assigning to a random variable (or its distribution function) a real number. The interpretation is rather obvious. The random variable stands for the possible claims of a risk, whereas the resulting number is the premium charged for assuming this risk. Of course, in economics premiums depend not only on the risk but also on market conditions.
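The functional view can be made concrete: each principle maps a loss distribution (here a finite sample standing in for it) to a single premium number. Three standard examples follow, with illustrative loading parameters chosen for the sketch.

```python
import statistics

def expected_value_principle(sample, loading=0.1):
    """H(X) = (1 + theta) * E[X]"""
    return (1 + loading) * statistics.mean(sample)

def variance_principle(sample, alpha=0.05):
    """H(X) = E[X] + alpha * Var[X]"""
    return statistics.mean(sample) + alpha * statistics.pvariance(sample)

def std_dev_principle(sample, beta=0.5):
    """H(X) = E[X] + beta * SD[X]"""
    return statistics.mean(sample) + beta * statistics.pstdev(sample)
```

All three charge more than the expected loss, which is the minimal requirement for the insurer to survive in the long run; they differ in how the risk loading responds to the spread of the claim distribution.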
1980
This paper presents an experimental investigation of risk taking in the domain of losses. The results are partly compatible with expected utility theory, assuming an inflection point in the utility function over losses. However, overweighting of low probabilities and underweighting of high ones was observed, which runs counter to the expected utility model.
1980
Properties of individual willingness to pay for changes in mortality probabilities are examined using a decision-theoretic model. There is no unique value per life saved.
1979
The determination of optimal rules for sharing risks and constructing reinsurance treaties has important practical and theoretical interest. Medolaghi, de Finetti, and Ottaviani developed the first linear reciprocal reinsurance treaties based upon minimizing individual and aggregate variance of risk.
1979
At last, a paper applying microeconomic theory to the insurance industry! Mr. Brubaker's paper is extremely basic, and I wish he had carried his thought process further. The result is a model which needs a great deal of additional work and refinement before it can even begin to approach reality. Still, this is the first paper I recall in our Proceedings which applies economic theory to the insurance industry. For that we owe Mr. Brubaker our thanks.
1979
It can hardly be disputed that inflation has become one of the biggest concerns and worries of the American public. Nor is it a secret that inflation rates in recent years have risen, become more erratic and, more often than not, defied control. The chance of returning to the inflation rates of the 1950's and 1960's appears slim, as economic forecasters are projecting an inflation rate of 6 to 7% for the next several years.