Browse Research
2007
Over the past twenty years many actuaries have argued that the chain-ladder method of loss reserving is biased; nonetheless, it remains the favorite tool of reserving actuaries. Nearly everyone who acknowledges this bias believes it to be upward. While supporting these claims and beliefs, the author proposes herein to address two deeper issues.
2007
This paper presents a framework for stochastically modeling the path of the ultimate loss ratio estimate through time from the inception of exposure to the payment of all claims. The framework is illustrated using Hayne’s lognormal loss development model, but the approach can be used with other stochastic loss development models.
2007
The tax shields from debt financing reduce the cost of operations for firms with low cost of bankruptcy. State regulation prevents insurers from using long-term debt as statutory surplus, to ensure sufficient equity capital to meet policyholder obligations. Constraints on regulatory capital force policyholders to fund high tax costs on insurers and reduce the market forces that support solvency.
2007
While accounting principles and actuarial standards of practice are all well designed, they provide only broad guidance to the actuary on what is “reasonable.” This broad guidance is based on the principle that “reasonable” assumptions and methods lead to “reasonable” estimates.
2007
This paper applies the exponential dispersion family with its associated conjugates to the claims reserving problem. This leads to a formula for the claims reserves that is equivalent to applying credibility weights to the chain-ladder reserves and Bornhuetter-Ferguson reserves.
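As a rough illustration of that credibility-weighting idea (a sketch only, not the paper's derivation), the reserve can be written as a blend R = Z · R_CL + (1 − Z) · R_BF. The figures and the weight Z below are illustrative assumptions.

```python
# Minimal sketch of a credibility blend between chain-ladder and
# Bornhuetter-Ferguson reserves. All figures and the weight Z are
# illustrative assumptions, not taken from the paper.

paid_to_date = 650_000          # cumulative paid losses for one accident year
dev_factor_to_ult = 1.30        # chain-ladder development factor to ultimate
expected_losses = 900_000       # a priori expected ultimate losses (BF prior)

# Chain-ladder: project paid losses to ultimate; the reserve is the unpaid piece.
cl_ultimate = paid_to_date * dev_factor_to_ult
cl_reserve = cl_ultimate - paid_to_date

# Bornhuetter-Ferguson: reserve is the expected unpaid fraction of the prior.
pct_unpaid = 1.0 - 1.0 / dev_factor_to_ult
bf_reserve = expected_losses * pct_unpaid

# Credibility blend, R = Z * R_CL + (1 - Z) * R_BF.
Z = 0.6                          # assumed credibility weight
blended_reserve = Z * cl_reserve + (1.0 - Z) * bf_reserve

print(f"CL reserve: {cl_reserve:,.0f}")
print(f"BF reserve: {bf_reserve:,.0f}")
print(f"Blended reserve (Z={Z}): {blended_reserve:,.0f}")
```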
2007
The purpose of this study note is to educate actuaries on certain basic insurance accounting topics that may be omitted in other syllabus readings. These topics include:
• Loss and loss adjustment expense accounting basics
• Reinsurance accounting basics
• Examples of how ceded reinsurance impacts an insurer's financial statements
• Deposit accounting basics
2007
Written from a global perspective on risk, hazards, and disasters, Introduction to International Disaster Management provides practitioners, educators and students with a comprehensive overview of the players, processes and special issues involved in the management of large-scale natural and technological disasters.
2007
The most important new development in the past two decades in the personal lines of insurance may well be the use of an individual's credit history as a classification and rating variable to predict losses.
2007
This paper shows how expert opinion can be inserted into a stochastic framework for loss reserving. The reserving methods used are the chain-ladder and Bornhuetter-Ferguson, and the stochastic framework follows England and Verrall [8]. Although stochastic models have been studied, there are two main obstacles to their more frequent use in practice: difficulty of implementation and lack of adaptability to user needs.
2007
This paper discusses an approach to the correlation problem in which losses from different lines of insurance are linked by a common variation (or shock) in the parameters of each line’s loss model. The paper begins with a simple common shock model and graphically illustrates the effect of the magnitude of the shocks on correlation.
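A minimal simulation of that idea, under assumed lognormal line-level losses and a single multiplicative shock applied to both lines' means, shows how the shock's volatility drives the correlation between lines; the parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch of a simple common shock model: two lines of business whose
# mean losses are scaled by the same random shock, inducing correlation.
# All parameters are illustrative assumptions.

rng = np.random.default_rng(42)
n_sims = 100_000

for shock_sd in (0.0, 0.1, 0.3):
    # Common multiplicative shock with mean 1 applied to both lines.
    shock = rng.lognormal(mean=-0.5 * shock_sd**2, sigma=shock_sd, size=n_sims)

    # Line-specific losses, independent apart from the shared shock.
    line_a = shock * rng.lognormal(mean=np.log(1_000_000), sigma=0.20, size=n_sims)
    line_b = shock * rng.lognormal(mean=np.log(2_000_000), sigma=0.35, size=n_sims)

    corr = np.corrcoef(line_a, line_b)[0, 1]
    print(f"shock sd = {shock_sd:.1f} -> simulated correlation = {corr:.3f}")
```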
2007
Although the copula literature has many instances of bivariate copulas, once more than two variates are correlated, the choice of copulas often comes down to selection of the degrees-of-freedom parameter in the t-copula. In search of a wider selection of multivariate copulas we review a generalization of the t-copula and some copulas defined by Harry Joe. Generalizing the t-copula gives more flexibility in setting tail behavior.
2007
In applications of the collective risk model, significantly more attention is often given to modelling severity than to modelling frequency. Sometimes frequency modelling is neglected to the extent of using a Poisson distribution for the number of claims. The Poisson distribution has variance equal to its mean, and there are multiple reasons why this is almost never appropriate when forecasting numbers of non-life insurance claims.
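To illustrate the overdispersion point (an illustration only, not the paper's example), the sketch below compares a Poisson, whose variance equals its mean, with a negative binomial having the same mean but a larger variance. The parameters are assumed.

```python
import numpy as np

# Illustrative comparison of claim-count models: a Poisson has variance
# equal to its mean, while a negative binomial allows variance > mean
# (overdispersion). Parameters are assumptions chosen for illustration.

rng = np.random.default_rng(0)
mean_claims = 50.0

# Poisson counts.
poisson_counts = rng.poisson(lam=mean_claims, size=200_000)

# Negative binomial with the same mean but extra dispersion.
# Parameterisation: mean = n * (1 - p) / p, variance = mean / p.
p = 0.5
n = mean_claims * p / (1.0 - p)
negbin_counts = rng.negative_binomial(n=n, p=p, size=200_000)

for name, x in (("Poisson", poisson_counts), ("Negative binomial", negbin_counts)):
    print(f"{name:18s} mean = {x.mean():6.2f}  variance = {x.var():7.2f}")
```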
2007
Data and data quality are an increasingly critical part of our lives today, both personally and corporately, yet they are not subjects that many actuaries focus on. Yet every business operation creates or consumes huge quantities of data. Data Quality: The Field Guide, by Thomas C. Redman, Ph.D., provides many practical approaches to establishing or improving data quality programs in businesses.
2007
In this paper we explore the bias in the estimation of the Value at Risk and Conditional Tail Expectation risk measures using Monte Carlo simulation. We assess the use of bootstrap techniques to correct the bias for a number of different examples.
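A hedged sketch of the general idea (not the authors' specific procedure): estimate VaR and CTE from a simulated sample, then use bootstrap resamples to estimate and remove the bias of the empirical estimators. The lognormal loss model, sample sizes, and confidence level are assumptions.

```python
import numpy as np

# Sketch of bootstrap bias correction for empirical VaR and CTE estimates.
# The lognormal loss model, sample size, and confidence level are
# illustrative assumptions, not the paper's examples.

rng = np.random.default_rng(1)
alpha = 0.99

def var_cte(sample, alpha):
    """Empirical VaR (quantile) and CTE (mean of losses at or beyond VaR)."""
    var = np.quantile(sample, alpha)
    return var, sample[sample >= var].mean()

# One Monte Carlo sample of losses.
losses = rng.lognormal(mean=0.0, sigma=1.0, size=1_000)
var_hat, cte_hat = var_cte(losses, alpha)

# Bootstrap: resample the losses, re-estimate, and use the average shift
# relative to the original estimate as the bias estimate to subtract.
n_boot = 2_000
boot_var = np.empty(n_boot)
boot_cte = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(losses, size=losses.size, replace=True)
    boot_var[b], boot_cte[b] = var_cte(resample, alpha)

var_corrected = 2 * var_hat - boot_var.mean()
cte_corrected = 2 * cte_hat - boot_cte.mean()

print(f"VaR  raw = {var_hat:.3f}, bias-corrected = {var_corrected:.3f}")
print(f"CTE  raw = {cte_hat:.3f}, bias-corrected = {cte_corrected:.3f}")
```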
2007
Motivation: Capital allocation can have substantial ramifications for measuring risk-adjusted profitability as well as for setting risk loads for pricing.
2007
Enterprise risk management (ERM) is the process of analyzing the portfolio of risks facing the enterprise to ensure that the combined effect of such risks is within an acceptable tolerance. While more firms are adopting ERM, little academic research exists about the costs and benefits of ERM.
2007
Simultaneous modelling of operational risks occurring in different event-type/business-line cells poses a challenge for operational risk quantification. Invoking the new concept of Lévy copulas for dependence modelling yields simple approximations of high quality for multivariate operational VaR.
2007
Decisions taken in order to achieve planned targets are always connected with risk that influences the company's resources, in both positive and negative ways. This risk should be considered from a capital perspective.
2007
Value-at-risk (VaR) is a widely used risk measure among financial institutions. Cash-flow-at-risk (CFaR) is an attempt to transfer the same ideas to the setting of a non-financial firm.
2007
Now that Basel II is a fait accompli in many regions of the globe, risk professionals may feel that the financial world is ready for the ERM challenge. Is this really the case?
The Asian scene, in which I have worked for over 15 years, paints a different picture altogether; a picture which I believe is relevant to risk professionals wherever they are. In Asian financial institutions, risk management as a value-creating business has failed.
2007
In this paper, a re-rating formula is introduced that calculates proposed rates directly from fully developed and trended loss costs and current class differentials. The new formula eliminates the need to calculate several previously necessary intermediate variables.
2007
Different financial products usually have very different risk profiles. In the financial industry, risk measures based on VaR for financial products are either the dominant market VaR or credit VaR, or Add VaR, which is obtained by evaluating market VaR and credit VaR separately and then adding them together.
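A small illustration of the "Add VaR" idea under assumed, independent market and credit loss distributions (not from the paper): compute market VaR and credit VaR separately, sum them, and compare with the VaR of the combined loss distribution.

```python
import numpy as np

# Sketch of the "Add VaR" idea: evaluate market VaR and credit VaR
# separately and add them, versus taking the VaR of the combined losses.
# Distributions and parameters are illustrative assumptions.

rng = np.random.default_rng(7)
alpha = 0.99
n = 500_000

market_loss = rng.normal(loc=0.0, scale=1_000_000, size=n)   # market losses
credit_loss = rng.lognormal(mean=12.0, sigma=1.0, size=n)     # credit losses

market_var = np.quantile(market_loss, alpha)
credit_var = np.quantile(credit_loss, alpha)
add_var = market_var + credit_var                              # "Add VaR"

combined_var = np.quantile(market_loss + credit_loss, alpha)   # joint VaR

print(f"Market VaR:   {market_var:,.0f}")
print(f"Credit VaR:   {credit_var:,.0f}")
print(f"Add VaR:      {add_var:,.0f}")
print(f"Combined VaR: {combined_var:,.0f}")
```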
2007
In many instances when using the BF method, we do not have an external source for the expected loss costs. In this situation, we will frequently use a weighted average of ultimate loss costs of the preceding accident periods.
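As a rough sketch of that approach (hypothetical figures and weights, not the paper's), the prior expected loss cost for the latest accident period can be taken as a weighted average of trended ultimate loss costs from earlier periods and then fed into the usual BF reserve formula.

```python
import numpy as np

# Sketch: derive the BF expected loss cost for the latest accident period
# as a weighted average of the preceding periods' ultimate loss costs,
# then apply the usual BF reserve formula. All figures are hypothetical.

# Ultimate loss costs (per unit of exposure) for prior accident periods,
# already trended to the current period's cost level.
prior_ultimate_loss_costs = np.array([310.0, 295.0, 330.0, 320.0])
weights = np.array([0.1, 0.2, 0.3, 0.4])        # more weight on recent periods

expected_loss_cost = np.average(prior_ultimate_loss_costs, weights=weights)

# BF reserve for the current accident period.
exposure = 10_000                                # current-period exposure units
pct_unreported = 0.45                            # assumed unreported fraction
bf_reserve = exposure * expected_loss_cost * pct_unreported

print(f"Expected loss cost: {expected_loss_cost:.2f}")
print(f"BF reserve:         {bf_reserve:,.0f}")
```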
2007
Banks all over the world are still concerned with the implementation of the new Basel Accord for Capital Adequacy, which refines, among other things, the minimum capital requirements. In recent years, huge silo-like structures for data acquisition, data management, and data processing have been created to comply with these new standards.