Browse Research
2000
We propose a new method for calculating the risk of ruin, over a finite time period, for both life and non-life (damages) insurance portfolios, and in particular for calculating:
a) the total claims probability distribution in multivariate situations, according to the Collective Risk Theory model;
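As background to the Collective Risk Theory model invoked in the abstract above, here is a minimal univariate sketch (not the paper's multivariate method) of how the total claims probability distribution of a compound Poisson portfolio can be computed with Panjer's recursion; the claim rate and the discretized severity distribution below are illustrative assumptions.

    import math

    def panjer_poisson(lam, severity_pmf, max_total):
        """Total claims pmf for a compound Poisson (collective risk) model.

        lam          -- expected number of claims (Poisson parameter)
        severity_pmf -- severity_pmf[j] = P(single claim amount = j), j = 0..m
        max_total    -- largest aggregate amount for which the pmf is returned
        """
        f = severity_pmf + [0.0] * (max_total + 1 - len(severity_pmf))
        g = [0.0] * (max_total + 1)
        g[0] = math.exp(-lam * (1.0 - f[0]))   # P(S = 0)
        for s in range(1, max_total + 1):
            g[s] = (lam / s) * sum(j * f[j] * g[s - j] for j in range(1, s + 1))
        return g

    # illustrative portfolio: 2 expected claims, each of size 1 or 2 with equal probability
    dist = panjer_poisson(lam=2.0, severity_pmf=[0.0, 0.5, 0.5], max_total=10)
    print(dist[0])      # P(no aggregate claims) = exp(-2)
    print(sum(dist))    # close to 1 once max_total is large enough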
2000
The level of future health service expenditure is one of the most important issues in planning and calculating contributions for social health insurance and premiums in private health insurance.
2000
In this paper we propose a new reinsurance treaty called “Excess Volatility” (XV). This treaty aims at reducing the volatility of the underwriting result of a non-life insurance company.
2000
In order to apply asset-liability management techniques to property-liability insurers, the sensitivity of liabilities to interest rate changes, or duration, must be calculated. The current approach is to use the Macaulay or modified duration calculations, both of which presume that the cash flows are invariant with respect to interest rate changes.
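For readers who want the formulas behind that point, a small sketch of the Macaulay and modified duration calculations for a fixed stream of cash flows (the bond, cash flows, and yield are made up for illustration); both measures treat the cash flows as invariant when the yield moves, which is exactly the limitation the abstract refers to.

    def macaulay_duration(cash_flows, times, y):
        """Macaulay duration of fixed cash flows at annually compounded yield y."""
        pv = [cf / (1 + y) ** t for cf, t in zip(cash_flows, times)]
        price = sum(pv)
        return sum(t * v for t, v in zip(times, pv)) / price

    def modified_duration(cash_flows, times, y):
        # modified duration = Macaulay duration / (1 + y) under annual compounding
        return macaulay_duration(cash_flows, times, y) / (1 + y)

    # illustrative 3-year bond: 5% annual coupon, face 100, yield 6%
    cfs, ts = [5.0, 5.0, 105.0], [1, 2, 3]
    print(round(macaulay_duration(cfs, ts, 0.06), 4))
    print(round(modified_duration(cfs, ts, 0.06), 4))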
2000
Premium trend has been an integral part of the ratemaking process. The Statement of Principles Regarding Property and Casualty Insurance Ratemaking lists it in its enumeration of considerations for trends. However, current models for estimating the premium trend have been limited to an exploration of changes in the base exposure.
2000
Michael Wacek’s paper is based on the well-known fact that the Black–Scholes call option price is the discounted expected excess value of a certain lognormal random variable. Specifically, the Black–Scholes price can be written as C = e^{-r(T-t)} E[(\tilde{S}(T) - k)^+], where r is the risk-free rate of interest, T is the time when the option expires, t is the current time, \tilde{S}(T) is a lognormal random variable related to the stock price S(T) at time T, and k is the exercise price.
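Written out, the relation the review refers to is the risk-neutral valuation identity; the explicit lognormal law for \tilde{S}(T) below is the standard Black–Scholes specification and is supplied here only for context:

    C = e^{-r(T-t)}\, E\!\left[\bigl(\tilde{S}(T) - k\bigr)^{+}\right],
    \qquad
    \log \tilde{S}(T) \sim N\!\left(\log S(t) + \Bigl(r - \tfrac{\sigma^{2}}{2}\Bigr)(T - t),\; \sigma^{2}(T - t)\right),

    \text{and evaluating the expectation recovers } C = S(t)\,\Phi(d_{1}) - k\,e^{-r(T-t)}\,\Phi(d_{2}).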
2000
In this note we give a multivariate extension of the proof of Ospina & Gerber (1987) of the result of Feller (1968) that a univariate distribution on the non-negative integers is infinitely divisible if and only if it can be expressed as a compound Poisson distribution.
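For reference, the univariate result being extended can be stated in terms of probability generating functions: a distribution on the non-negative integers with pgf P(z) is infinitely divisible if and only if P has the compound Poisson form

    P(z) = \exp\{\lambda\,(Q(z) - 1)\}, \qquad \lambda > 0,

where Q is the pgf of a distribution concentrated on the positive integers (the secondary, or "claim size", distribution).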
2000
This paper explains securitization of insurance risk by describing its essential components and its economic rationale. We use examples and describe recent securitization transactions. We explore the key ideas without abstract mathematics. Insurance-based securitizations improve opportunities for all investors. Relative to traditional reinsurance, securitizations provide larger amounts of coverage and more innovative contract terms.
2000
We use resampling techniques to analyze the impact of providers on workers' compensation costs, taking into consideration inherent differences in claim populations between providers. Resampling techniques provide a nonparametric determination of a statistic's distribution and a measure of effectiveness that is not sensitive to deviations from the assumptions underlying most parametric statistical procedures.
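As a minimal illustration of the resampling idea (not the authors' actual procedure), the sketch below bootstraps the mean cost per claim for a single hypothetical provider; the claim costs, the choice of statistic, and the percentile interval are all illustrative assumptions.

    import random

    def bootstrap_distribution(claim_costs, statistic, n_resamples=5000, seed=1):
        """Nonparametric bootstrap distribution of a statistic of claim costs."""
        rng = random.Random(seed)
        n = len(claim_costs)
        return [statistic([claim_costs[rng.randrange(n)] for _ in range(n)])
                for _ in range(n_resamples)]

    # hypothetical per-claim costs (dollars) for one provider
    costs = [1200, 800, 15000, 2300, 400, 9800, 650, 3100]
    mean = lambda xs: sum(xs) / len(xs)
    boot = sorted(bootstrap_distribution(costs, mean))
    # rough 90% percentile interval for the mean cost per claim
    print(boot[int(0.05 * len(boot))], boot[int(0.95 * len(boot))])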
2000
This paper proposes a multivariate generalization of the generalized Poisson distribution. Its definition and main properties are given. The parameters are estimated by the method of moments.
Keywords: Multivariate generalized Poisson distribution (MGPm); generalized Poisson distribution (GPD); bivariate generalized Poisson distribution (BGPD).
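For reference, the univariate generalized Poisson distribution (GPD) underlying the multivariate construction has the standard Consul–Jain probability mass function

    P(X = x) = \frac{\theta\,(\theta + \lambda x)^{x-1}}{x!}\; e^{-\theta - \lambda x}, \qquad x = 0, 1, 2, \dots,

with \theta > 0 and 0 \le \lambda < 1 (negative \lambda is possible under additional restrictions); \lambda = 0 recovers the ordinary Poisson(\theta) distribution.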
2000
Some areas of the country are just plain risky to live in. Is it the federal government's job to assume the risk for the people who decide to live there?
2000
Nielsen (1999) showed the surprising fact that a nonparametric one-dimensional hazard as a function of time can be estimated √n-consistently if a high quality marker is observed. In this paper we show that the hazard relevant for predicting remaining duration time, given the current status of a high quality marker, can be estimated √n-consistently if a Markov-type property holds for the high quality marker.
2000
The capital base of property casualty insurers includes an increasing proportion of equities relative to fixed income securities. This paper analyzes the risk/reward attributes of various fixed income/equity asset allocation alternatives using dynamic financial analysis (DFA) and demonstrates that a typical company could improve its returns without significantly reducing its financial security by further increasing its proportion of equities.
2000
Source of earnings analysis has long been a staple of life insurance policy pricing and profitability monitoring. It has grown in importance with the advent of universal life insurance and of similar contracts with non-guaranteed benefits or charges.
2000
This paper itemizes the questions and answers from an International Survey on Ratemaking Principles and Methods distributed by the CAS Committee on Ratemaking in 1996.
2000
In this paper we study some bivariate counting distributions that are obtained by the trivariate reduction method. We work with Poisson compound distributions and we use their good properties in order to derive recursive algorithms for the bivariate distribution and bivariate aggregate claims distribution. A data set is also fitted.
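For readers unfamiliar with the trivariate reduction method mentioned above, the basic construction in the Poisson case (standard textbook background, not the paper's specific model) is

    X_1 = N_1 + N_0, \qquad X_2 = N_2 + N_0,

where N_0, N_1, N_2 are independent Poisson variables with means \lambda_0, \lambda_1, \lambda_2; the pair (X_1, X_2) is then bivariate Poisson with \operatorname{Cov}(X_1, X_2) = \lambda_0, and the same device carries over to the compound Poisson setting in which a bivariate aggregate claims distribution is built.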
2000
Should the pricing of reinsurance catastrophes be related to the price of the default risk embedded in corporate bonds? If not, why not?
A risk is a risk is a risk, in whatever market it appears. Shouldn't the risk prices in these different markets be comparable? More basically perhaps, how should reinsurance prices and bond prices be set? How does the market currently set them?
2000
For insurance products, multiple loss triggers have emerged as a tool for both risk managers and insurance companies to customize coverage. This paper focuses on an example drawn from a utility industry coverage with triggers based separately on spot prices for electricity and on lost power generation capacity. The paper also provides background on the current electric power industry to help the reader understand the interaction of the triggers.
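A toy sketch of how a double-trigger coverage of this kind can respond, paying only when both an electricity spot-price trigger and a lost-capacity trigger are breached; the trigger levels, payout rule, and limit are hypothetical and are not taken from the coverage the paper describes.

    def double_trigger_payout(spot_price, lost_capacity_mw,
                              price_trigger=150.0, capacity_trigger=200.0,
                              payout_per_mw=1000.0, limit=5_000_000.0):
        """Hypothetical payout: nothing unless BOTH triggers are breached."""
        if spot_price <= price_trigger or lost_capacity_mw <= capacity_trigger:
            return 0.0
        # payout scales with the capacity shortfall beyond its trigger, capped at the limit
        return min((lost_capacity_mw - capacity_trigger) * payout_per_mw, limit)

    print(double_trigger_payout(spot_price=90.0, lost_capacity_mw=400.0))   # 0.0: price trigger not hit
    print(double_trigger_payout(spot_price=300.0, lost_capacity_mw=450.0))  # 250 MW x $1,000 = $250,000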
2000
A substantial number of national accounts have insurance programs with large retentions (i.e., deductibles), which can range from $100,000 to $1,000,000 per occurrence. This high-retention business obviously limits the insurance risk to the insurance company. However, if an insured with a deductible program goes into default and is unable to pay its insurance liabilities, those liabilities become the insurance company's responsibility.
2000
In this paper we discuss the concept of excess of loss reinsurance with reinstatements. The main objective is to provide a methodology to calculate the distribution of total aggregate losses for two or more consecutive layers when there is a limited number of reinstatements. We also compare different premium principles and their properties to price these treaties for any number of free or paid reinstatements.
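To make the quantity of interest concrete, here is a small Monte Carlo sketch of the aggregate loss ceded to a single excess-of-loss layer with a limited number of reinstatements; the frequency and severity assumptions and the layer terms are purely illustrative, and the paper's own methodology for obtaining the distribution is not reproduced here.

    import math
    import random

    def poisson_sample(rng, lam):
        # Knuth's method for drawing a Poisson claim count
        threshold = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    def layer_loss_with_reinstatements(claims, retention, limit, n_reinstatements):
        """Loss ceded to an XL layer 'limit xs retention' whose aggregate
        capacity is (1 + number of reinstatements) * limit."""
        capacity = (1 + n_reinstatements) * limit
        ceded = 0.0
        for x in claims:
            per_claim = min(max(x - retention, 0.0), limit)
            ceded += min(per_claim, capacity - ceded)
            if ceded >= capacity:
                break
        return ceded

    # illustrative assumptions: Poisson(3) claim counts, exponential severities with
    # mean 400,000, layer 500,000 xs 250,000 with two reinstatements
    rng = random.Random(7)
    sims = []
    for _ in range(10_000):
        claims = [rng.expovariate(1 / 400_000) for _ in range(poisson_sample(rng, 3.0))]
        sims.append(layer_loss_with_reinstatements(claims, 250_000, 500_000, 2))
    print(sum(sims) / len(sims))   # Monte Carlo estimate of the expected loss to the layer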
2000
This paper approaches portfolio selection in a Bayesian framework that incorporates a prior degree of belief in an asset pricing model. Sample evidence on home bias and value and size effects is evaluated from an asset-allocation perspective.