Browse Research

Viewing 3776 to 3800 of 7690 results
1999
The price of catastrophe reinsurance in the United States has fluctuated markedly in recent years. These fluctuations are commonly associated with the pattern of catastrophe occurrences. For example, catastrophe losses during the period 1992-94 totaled $38.6 billion in 1994 dollars, exceeding the cumulative total of losses during 1949-91 of $34.6 billion.
1999
This paper examines the impact that insurance coupled with specific risk mitigation measures (RMMs) could have on reducing losses from hurricanes and earthquakes as well as improving the solvency position of insurers who provide coverage against these hazards. We first explore why relatively few individuals adopt cost-effective RMMs by reporting on the results of empirical studies and controlled laboratory studies.
1999
For over half a century financial experts have regarded the movements of markets as a random walk - unpredictable meanderings akin to a drunkard's unsteady gait - and this hypothesis has become a cornerstone of modern financial economics and many investment strategies. Here Andrew W. Lo and A. Craig MacKinlay put the Random Walk Hypothesis to the test.
1999
Severe natural catastrophes in the early 1990s generated a lack of financial capacity in the catastrophe line of the global reinsurance market. The finance industry reacted to this situation by issuing innovative products designed to spread the excess risk more widely among international investors (risk securitization). The paper reviews these developments and emphasizes their significance with respect to the economic theory of risk exchanges.
1999
We discuss the concept of the risk measure as an expectation using a probability distortion, and classify the standard risk measures according to their associated distortion functions. Using two examples, we explore the features of the different measures.
1999
Implications of factor-based asset pricing models for estimation of expected returns and for portfolio selection are investigated. In the presence of model mispricing due to a missing risk factor, the mispricing and the residual covariance matrix are linked together. Imposing a strong form of this link leads to expected return estimates that are more precise and more stable over time than unrestricted estimates.
1999
The past few years have seen the development and growth of traded securities with payoffs tied to natural and industrial disasters. Pricing the insurance features imbedded in these securities is difficult and imprecise. This lack of pricing precision translates to greater required return premiums to holders of these securities.
1999
This paper clarifies how option pricing methods can be used to determine how much surplus an insurance company should carry and how that surplus requirement should be allocated across the company's lines of insurance business. Surplus is important because more surplus means more collateral for outstanding policies. Surplus is costly for at least two reasons.
1999
In contrast to alternative measures of risk, value at risk (VaR) has important virtues (intelligibility, comparability, and practicality) that make it a potentially valuable tool for strategic decision making and capital management in a wide variety of industries.
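The VaR discussed in this abstract is, at its simplest, an empirical quantile of a loss distribution. A minimal historical-simulation sketch (the function name and quantile convention are illustrative, not taken from the paper):

```python
def value_at_risk(pnl, alpha=0.99):
    """Historical-simulation VaR: the loss level exceeded with
    probability roughly (1 - alpha), given a sample of profit/loss
    outcomes (gains positive, losses negative)."""
    losses = sorted(-x for x in pnl)   # convert P&L to losses, ascending
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]                   # empirical alpha-quantile of loss
```

For example, with 100 P&L outcomes of -1 through -100, the 95% VaR under this convention is 96.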
1999
This paper develops a theory of capital allocation in opaque financial intermediaries. The model endogenizes risk management and capital structure decisions, and it provides a simple setting within which to address questions relating to capital budgeting, performance measurement, and employee compensation. It provides a theoretical foundation for understanding the appropriate use, and misuse, of the widely-employed RAROC methodology.
1999
The computer-intensive statistical methodology of subsampling has received considerable attention since the early 1990s when it was discovered that it is generally valid under minimal assumptions. Indeed, subsampling provides a robust alternative to the bootstrap, whose consistency may fail unless problem-specific regularity conditions hold true.
1999
This paper reports an investigation of demand for risk reduction, or risk intolerance. It has often been assumed that risk level is a good, almost self-evident, predictor of demand for risk reduction. However, few previous studies have addressed the issue explicitly. Two empirical studies are reported here.
1999
When a rate of return is regressed on a lagged stochastic regressor, such as a dividend yield, the regression disturbance is correlated with the regressor's innovation. The OLS estimator's finite-sample properties, derived here, can depart substantially from the standard regression setting.
1999
This is a 1999 draft of a chapter in a textbook tentatively titled "Derivatives, Risk Management, and Financial Engineering". This chapter deals with issues of risk management as they affect the financial evaluation of assets and liabilities. The objectives of the chapter are: Introduce risk measures appropriate to financial and nonfinancial firms.
1999
This paper introduces a class of distortion operators, g_a(u) = F[F^{-1}(u) + a], where F is the standard normal cumulative distribution. For any loss (or asset) variable X with a probability distribution S_X(x) = 1 - F_X(x), g_a[S_X(x)] defines a distorted probability distribution whose mean value yields a risk-adjusted premium (or an asset price).
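The distortion operator above can be sketched directly from its definition; the premium helper below assumes a nonnegative discrete loss, and the function names are illustrative rather than from the paper:

```python
from statistics import NormalDist  # stdlib normal CDF and quantile

def wang_distortion(u, a):
    """g_a(u) = F[F^{-1}(u) + a], with F the standard normal CDF."""
    nd = NormalDist()
    if u <= 0.0:
        return 0.0
    if u >= 1.0:
        return 1.0
    return nd.cdf(nd.inv_cdf(u) + a)

def wang_premium(losses, probs, a):
    """Risk-adjusted premium: the mean of X under the distorted
    distribution, i.e. the sum of g_a(S_X(x)) over loss increments
    (valid for a nonnegative discrete loss X)."""
    pts = sorted(zip(losses, probs))
    premium, prev = 0.0, 0.0
    for x, _ in pts:
        s = sum(q for y, q in pts if y >= x)     # S_X at levels just below x
        premium += wang_distortion(s, a) * (x - prev)
        prev = x
    return premium
```

With a = 0 the distortion is the identity and the premium reduces to the expected loss; a > 0 shifts probability toward large losses and loads the premium upward.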
1998
Much has been written in recent years about the types of factors that should be considered in a dynamic financial analysis model. Much less has been written that actually provides a reader with an understanding of how the various pieces of a dynamic financial analysis model need to fit together. This paper is intended to provide a reader with a look “under the covers” at the structure of a model being used for dynamic financial analysis.
1998
How does one measure the effect of improved policy retention on such key variables as market share and profitability?
1998
This paper presents a continuous time version of a stochastic investment model originally due to Wilkie. The model is constructed via stochastic differential equations. Explicit distributions are obtained in the case where the SDEs are driven by Brownian motion, which is the continuous time analogue of the time series with white noise residuals considered by Wilkie.
1998
A new stochastic model based on the traditional chain ladder is introduced. It makes explicit use of cumulative distribution functions and payment patterns. It incorporates a mathematical rationale for non-stochastic variations in the age-to-age factors. Perturbation methods are used to obtain and justify the solution. Estimation of liabilities in the tail is a natural product of the model.
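The traditional chain ladder underlying this model completes a cumulative run-off triangle with volume-weighted age-to-age factors. A minimal sketch under that classical convention (names illustrative):

```python
def chain_ladder(triangle):
    """Complete a cumulative run-off triangle (row i has n - i entries)
    using volume-weighted age-to-age factors; return the completed
    triangle and the ultimate loss per origin year."""
    n = len(triangle)
    tri = [row[:] for row in triangle]
    # Age-to-age factor for development j: ratio of column sums over
    # all rows where both columns j and j + 1 are observed.
    factors = []
    for j in range(n - 1):
        num = sum(tri[i][j + 1] for i in range(n) if len(tri[i]) > j + 1)
        den = sum(tri[i][j] for i in range(n) if len(tri[i]) > j + 1)
        factors.append(num / den)
    # Roll each row forward to ultimate with the fitted factors.
    for i in range(n):
        for j in range(len(triangle[i]) - 1, n - 1):
            tri[i].append(tri[i][j] * factors[j])
    return tri, [row[-1] for row in tri]
```

For the triangle [[100, 150, 175], [110, 165], [120]] the factors are 1.5 and 7/6, giving ultimates of 175, 192.5, and 210.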
1998
This paper presents a statistical model underlying the chain-ladder technique. This is related to other statistical approaches to the chain-ladder technique which have been presented previously. The statistical model is cast in the form of a generalized linear model, and a quasi-likelihood approach is used. It is shown that this enables the method to process negative incremental claims.
1998
This paper will illustrate how the insurance marketplace is characterized by relatively fixed demand and has many of the aspects of a commodity market. It will also discuss how this leads to the pricing cycles that are characteristic of commodity markets.
1998
It is shown that a distribution-free implicit price loading method, which sets prices using a modified Hardy-Littlewood majorant of the stop-loss ordered maximal random variable by given range, mean and variance, induces distribution-free safe layer-additive distortion pricing. As a by-product, Karlsruhe pricing turns out to be a valid linear approximation to Hardy-Littlewood pricing in case the coefficient of variation is sufficiently high.
1998
In this paper the problem of evaluating IBNR claims is tackled by the hazard rates approach. Both the separation method and the chain ladder method can be incorporated in the semiparametric Cox model, a very popular specification among the methods based on proportional hazard rates.
1998
This paper provides a number of comments on two well-known approaches to the problem of loss development forecasting, namely the chain ladder method and the development factor family. These comments are mainly based on recent forecasting experience in time series statistics and econometrics.