Browse Research
Viewing 3351 to 3375 of 7690 results
2001
This paper discusses a methodology for establishing reserves for the portion of loss adjustment expense associated with the cost of claim adjusters. The actuarial literature contains very little material on how to estimate unallocated loss adjustment expense (ULAE) reserves. The literature briefly mentions “transaction-based” methods that require claim department time studies.
2001
Economics can be distinguished from other social sciences by the belief that most (all?) behavior can be explained by assuming that rational agents with stable, well-defined preferences interact in markets that (eventually) clear. An empirical result qualifies as an anomaly if it is difficult to "rationalize" or if implausible assumptions are necessary to explain it within the paradigm.
2001
In this groundbreaking work, Jack L. King, Ph.D. provides the basis for an in-depth understanding of operational risk by focusing on its measurement and modelling. Using both theoretical and practical material, he lays out a foundation theory that can be applied and refined for application in the financial sector and beyond.
2001
Actuarial analysis can be viewed as the process of studying profitability and solvency of an insurance firm under a realistic and integrated model of key input random variables such as loss frequency and severity, expenses, reinsurance, interest and inflation rates, and asset defaults. Traditional models of input variables have generally fitted parameters for a predetermined family of probability distributions.
2000
This paper considers a range of stochastic models which give the same reserve estimates as the chain-ladder technique. The relationship between the models described by Renshaw and Verrall (Renshaw, A.E., Verrall, R.J., 1998. British Actuarial Journal 4, 903–923) and Mack (Mack, T., 1993. ASTIN Bulletin 23, 213–225) is explored in more detail than previously.
2000
A simple upper bound for the variance of the frequency estimates in a multivariate tariff using class criteria is deduced. This upper bound is based exclusively on univariate statistics and can therefore be calculated before a GLM analysis is carried out.
2000
We use a doubly stochastic Poisson process (or the Cox process) to model the claim arrival process for catastrophic events. The shot noise process is used for the claim intensity function within the Cox process. The Cox process with shot noise intensity is examined by piecewise deterministic Markov process theory. We apply the Cox process incorporating the shot noise process as its intensity to price a stop-loss catastrophe reinsurance contract.
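The shot-noise mechanism described above can be sketched in a few lines: catastrophe shocks raise the claim intensity, which then decays exponentially, and claims arrive as a Poisson process driven by that intensity. The following is an illustrative discretised simulation; all parameter names and values are our own assumptions, not the paper's:

```python
import random
import math

random.seed(42)

def simulate_cox_shot_noise(T=10.0, dt=0.01, lam0=1.0, delta=0.5,
                            shock_rate=0.3, mean_jump=2.0):
    """Discretised simulation of a Cox process whose intensity is a
    shot-noise process: shocks arrive at Poisson rate shock_rate, each
    adds an exponential jump to the intensity, which decays at rate
    delta. Illustrative sketch only (parameters are assumptions)."""
    lam = lam0
    t = 0.0
    claim_times = []
    while t < T:
        # catastrophe shock in this small interval?
        if random.random() < shock_rate * dt:
            lam += random.expovariate(1.0 / mean_jump)
        # claim arrival in this small interval, driven by current intensity
        if random.random() < lam * dt:
            claim_times.append(t)
        lam *= math.exp(-delta * dt)  # exponential decay of the shot noise
        t += dt
    return claim_times

claims = simulate_cox_shot_noise()
print(len(claims), "claims over the horizon")
```

Averaging the discounted payoff of a stop-loss contract over many such simulated paths is one Monte Carlo route to the kind of price the paper derives analytically.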
2000
It is shown that the (over-dispersed) Poisson model is not the same as the distribution-free chain ladder model of Mack (Distribution-free calculation of the standard error of chain ladder reserve estimates, ASTIN Bulletin 23 (1993) 213--225) although both reproduce the historical chain ladder estimator for the claims reserve. For example, the true expected claims reserves, ignoring estimation issues, described by the two models are different.
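The historical chain-ladder point estimate that both models reproduce can be computed directly. A minimal sketch on an invented cumulative triangle (the numbers are ours, purely for illustration):

```python
# Chain-ladder reserve on a toy cumulative claims triangle.
# Both the over-dispersed Poisson model and Mack's model reproduce
# these same point estimates, as the paper notes; the figures below
# are invented for illustration.

triangle = [
    [100.0, 150.0, 165.0],   # oldest origin year: fully developed
    [110.0, 168.0],          # one development period still missing
    [120.0],                 # only the first period observed
]

n = len(triangle)
# volume-weighted development factors f_j = sum C_{i,j+1} / sum C_{i,j}
factors = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
    den = sum(row[j] for row in triangle if len(row) > j + 1)
    factors.append(num / den)

# project each origin year to ultimate and take the reserve
reserves = []
for row in triangle:
    ultimate = row[-1]
    for f in factors[len(row) - 1:]:
        ultimate *= f
    reserves.append(ultimate - row[-1])

print("factors:", factors)
print("total reserve:", sum(reserves))
```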
2000
This paper explains how a dynamic pricing system can be built for personal lines business, whereby profit loads and risk premiums can be tailored to the individual behavioral characteristics of the customer.
2000
This paper discusses the modeling and control of pension funds. A continuous-time stochastic pension fund model is proposed in which there are risky assets plus the risk-free asset as well as randomness in the level of benefit outgo. We consider Markov control strategies which optimize over the contribution rate and over the range of possible asset-allocation strategies.
2000
We extend the Cox-Ingersoll-Ross (1985) model of the short interest rate by assuming a stochastic reversion level, which better reflects the time dependence caused by the cyclical nature of the economy or by expectations concerning the future impact of monetary policies. In this framework, we have studied the convergence of the long-term return by using the theory of generalized Bessel-square processes.
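A simple Euler discretisation illustrates the idea of a CIR short rate whose reversion level is itself stochastic. Here we assume, purely for illustration, that the reversion level follows an Ornstein-Uhlenbeck process; this need not match the paper's exact specification, and all parameter values are invented:

```python
import random
import math

random.seed(0)

def simulate_extended_cir(r0=0.03, theta0=0.04, T=30.0, dt=1 / 252,
                          kappa=0.5, sigma=0.05,
                          kappa_theta=0.1, theta_bar=0.04, sigma_theta=0.01):
    """Euler scheme for dr = kappa*(theta_t - r)dt + sigma*sqrt(r)dW,
    where the reversion level theta_t is itself mean-reverting
    (an OU process here, as an illustrative assumption)."""
    r, theta = r0, theta0
    path = [r]
    steps = round(T / dt)
    for _ in range(steps):
        dw_r = random.gauss(0.0, math.sqrt(dt))
        dw_t = random.gauss(0.0, math.sqrt(dt))
        # stochastic reversion level, reflecting e.g. the economic cycle
        theta += kappa_theta * (theta_bar - theta) * dt + sigma_theta * dw_t
        # CIR dynamics for the short rate around the moving level
        r += kappa * (theta - r) * dt + sigma * math.sqrt(max(r, 0.0)) * dw_r
        r = max(r, 0.0)  # full truncation keeps the square root real
        path.append(r)
    return path

path = simulate_extended_cir()
print("terminal short rate:", path[-1])
```

Averaging the realised long-term return over many such paths is one way to observe numerically the convergence behaviour the paper studies via Bessel-square process theory.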
2000
We develop a two-tiered agency model that shows how rent-seeking behavior on the part of division managers can subvert the workings of an internal capital market. By rent-seeking, division managers can raise their bargaining power and extract greater overall compensation from the CEO.
2000
If asset returns have systematic skewness, expected returns should include rewards for accepting this risk. We formalize this intuition with an asset pricing model that incorporates conditional skewness. Our results show that conditional skewness helps explain the cross-sectional variation of expected returns across assets and is significant even when factors based on size and book-to-market are included.
2000
This paper provides an analytical and practical framework, consistent with maximizing the wealth of existing shareholders, to address the following questions: What are the costs associated with economic capital? What is the tradeoff between the probability of default and the costs of economic capital? How do we take into account the time profile of economic capital when assessing the performance of a business?
2000
In credibility ratemaking, one seeks to estimate the conditional mean of a given risk. The most accurate estimator (as measured by squared error loss) is the predictive mean. To calculate the predictive mean one needs the conditional distribution of losses given the parameter of interest (often the conditional mean) and the prior distribution of the parameter of interest. Young (1997).
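The predictive mean has a closed form in conjugate cases. A minimal sketch in the classical Poisson-gamma setup (our choice of illustrative model, not necessarily the one analysed in the paper), showing that the predictive mean is exactly a credibility-weighted average:

```python
# Illustrative conjugate example: claim counts X_i | lam ~ Poisson(lam),
# prior lam ~ Gamma(shape=a, rate=b). The prior values a, b below are
# invented for the example.

def predictive_mean(counts, a=2.0, b=4.0):
    n = len(counts)
    # posterior is Gamma(a + sum(counts), b + n); its mean is the
    # squared-error-optimal estimate of the next period's expected count
    return (a + sum(counts)) / (b + n)

def credibility_form(counts, a=2.0, b=4.0):
    # the same number written as Z * sample mean + (1 - Z) * prior mean
    n = len(counts)
    z = n / (n + b)
    return z * (sum(counts) / n) + (1 - z) * (a / b)

obs = [1, 0, 2, 1]
print(predictive_mean(obs))   # 0.75
print(credibility_form(obs))  # 0.75, identical by algebra
```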
2000
This paper addresses that class of complex problems where there is little or no underlying theory upon which to build a model and the situation dictates the use of an adaptive approach based on the observed data. The field of study is known as adaptive nonlinear models (ANMs), and its goal is to quantify interaction terms without imposing assumptions on the solution.
2000
This paper deals with the problem of pricing a financial product relying on an index of reported claims from catastrophe insurance. The problem of pricing such products is that, at a fixed time in the trading period, the total claim amount from the catastrophes that have occurred is not known. Therefore, one has to price these products solely from knowing the aggregate amount of the reported claims at the fixed time point.
2000
Adaptive nonlinear models (ANMs) are currently being proposed for use in actuarial and financial modeling. These models include techniques such as neural networks and genetic algorithms. While there is a general awareness of the nature of these ANM techniques, there is often only vague familiarity with the details of how they are implemented. This article is intended to help alleviate this situation.
2000
Consider a ratio statistic (e.g. the mean) built from observations assigned into classes. An example would be losses=L, claim counts=C, and exposures=E, each aggregated by rating class, with the applicable statistic being either case severity=L/C or case frequency=C/E. The note discusses comparing two observed values for such a statistic. The difference is expressed as a sum of two components.
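The quantities involved can be made concrete with invented numbers; the two-component decomposition of the difference is the note's own contribution and is not reproduced here:

```python
# Toy illustration of the ratio statistics in question (figures invented):
# per rating class, losses L, claim counts C, and exposures E give
# case severity L/C and case frequency C/E.

classes = {
    "A": {"L": 50000.0, "C": 40, "E": 1000.0},
    "B": {"L": 90000.0, "C": 60, "E": 1200.0},
}

severity = {k: v["L"] / v["C"] for k, v in classes.items()}
frequency = {k: v["C"] / v["E"] for k, v in classes.items()}

# comparing two observed values of the same ratio statistic: the note
# goes further and splits this gap into two interpretable components
gap = severity["B"] - severity["A"]
print(severity, frequency, gap)
```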
2000
In this paper the authors describe how to link the technical aspects of the Dynamic Financial Analysis (DFA) modeling process with the ultimate purpose of that process, the enlightenment of senior management for the purposes of strategic thinking. The authors desire to enlighten both the model user and the senior executive by describing the elements that connect the merits of a rigorous quantitative analysis to fundamental strategic issues.
2000
Young (1999) discussed the conjecture proposed by Christofides (1998) regarding the premium principle of Wang (1995, 1996). She shows that this conjecture is true for location-scale families and for certain other families, but false in general. In addition Young (1999) states that it remains an open problem to determine under what circumstances Wang's premium principle reduces to the standard deviation (SD) premium principle.
2000
Actuaries are often asked to provide a range or confidence level for the loss reserve along with a point estimate. Traditional methods of loss reserving do not provide an estimate of the variance of the estimated reserve, and actuaries use various ad hoc methods to derive a range for the indicated reserve.
2000
In the world of mass torts, asbestos and pollution are the best known and clearly the largest to date, with $29 billion and $30 billion (respectively) of incurred losses as of year-end 1998 for the U.S. insurance industry. Can another mass tort of comparable size arise in the future? The authors argue that, while it is possible, such a large loss is unlikely.