Browse Research

Viewing 3701 to 3725 of 7690 results
1999
Special Topics (narrow focus or advanced); In the Second Edition of this best-selling distributed database systems text, the authors address new and emerging issues in the field while maintaining the key features and characteristics of the First Edition. The text has been revised and updated to reflect changes in the field.
1999
When a rate of return is regressed on a lagged stochastic regressor, such as a dividend yield, the regression disturbance is correlated with the regressor's innovation. The OLS estimator's finite-sample properties, derived here, can depart substantially from the standard regression setting.
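A minimal sketch of the regression system being described, in the standard predictive-regression notation (the symbols below are chosen here for illustration; they are not taken from the paper):

```latex
% r_t: rate of return; x_{t-1}: lagged stochastic regressor (e.g. dividend yield).
\begin{align*}
  r_t &= \alpha + \beta x_{t-1} + u_t, \\
  x_t &= \theta + \rho x_{t-1} + v_t, \qquad \operatorname{Cov}(u_t, v_t) \neq 0 .
\end{align*}
% Because the disturbance u_t is correlated with the regressor's innovation v_t,
% the OLS estimate of \beta inherits the small-sample bias of the autoregressive
% estimate of \rho, so its finite-sample distribution departs from the standard
% regression setting.
```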
1999
This is a follow-up of a previous paper by the author, where claims reserving in non-life insurance was treated in the framework of a marked Poisson claims process. A key result on decomposition of the process is generalized, and a number of related results are added. Their usefulness is demonstrated by examples and, in particular, the connection to the analogous discrete time model is clarified.
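The framework referred to can be sketched schematically as follows; this is the generic marked Poisson setup and its thinning-type decomposition, not the paper's own statement or notation:

```latex
% Claims occur at the epochs of a Poisson process with intensity w(t); claim i
% carries a mark Z_i (reporting delay, development, payments) with conditional
% distribution P_{Z|T=t}.  For a partition of the mark space into categories
% A_1, ..., A_m (e.g. settled / reported-but-not-settled / incurred-but-not-
% reported at a valuation date), the restricted processes are independent
% marked Poisson processes with intensities
\[
  w_g(t) \;=\; w(t)\, P_{Z\mid T=t}(A_g), \qquad g = 1,\dots,m .
\]
```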
1999
This paper documents the methodology used to develop the primary and excess credibilities which underlie the experience rating plan of the Workers' Compensation Insurance Rating Bureau of California (the Bureau) and the translation of these credibilities into the B and W rating values used in the experience rating formula. The method is demonstrated with an analysis based on projecting experience modifications for policy year 1991.
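For orientation, the NCCI-style experience rating formula in which B (ballast) and W (weighting) values appear is shown below; the Bureau's California plan is a variant of this structure, so the exact form and values used there may differ:

```latex
% A_p, E_p: actual and expected primary losses; A_e, E_e: actual and expected
% excess losses; B: ballast value; W: excess weighting value.
\[
  M \;=\; \frac{A_p + W A_e + (1-W)\,E_e + B}{E_p + W E_e + (1-W)\,E_e + B}
\]
% B and W are tabulated by expected loss size and carry the primary and excess
% credibilities assigned to the employer's own experience.
```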
1999
Actuaries are now being called upon to incorporate interest rate models in a variety of applications, including dynamic financial analysis (DFA), ratemaking, and valuation. Although there are many articles and texts on interest rate models, most of these presume an understanding of financial terminology and mathematical techniques that makes it difficult to begin learning this material.
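As a concrete example of the kind of model the paper introduces, here is a minimal simulation of one widely used one-factor short-rate model (Vasicek); the parameter values are arbitrary placeholders, not calibrated figures from the paper:

```python
import numpy as np

def simulate_vasicek(r0, a, b, sigma, dt, n_steps, rng=None):
    """Simulate one path of the Vasicek short-rate model
    dr = a*(b - r)*dt + sigma*dW using an Euler discretization."""
    rng = np.random.default_rng() if rng is None else rng
    rates = np.empty(n_steps + 1)
    rates[0] = r0
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))               # Brownian increment
        rates[t + 1] = rates[t] + a * (b - rates[t]) * dt + sigma * dw
    return rates

# Example: monthly steps over ten years with illustrative parameters.
path = simulate_vasicek(r0=0.05, a=0.15, b=0.06, sigma=0.01, dt=1/12, n_steps=120)
```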
1999
The authors use a dynamic model to impart predictability to changes in a firm's systematic risk and its expected returns.
1999
Consider a classical compound Poisson model. The safety loading can be positive, negative or zero. Explicit expressions for the distributions of the surplus prior to and at ruin are given in terms of the ruin probability. Moreover, the asymptotic behavior of these distributions as the initial capital tends to infinity is obtained.
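A sketch of the classical setup in generic notation:

```latex
% Classical compound Poisson (Cramer-Lundberg) surplus process:
\[
  U(t) \;=\; u + c\,t - \sum_{i=1}^{N(t)} X_i ,
\]
% with initial capital u, premium rate c, Poisson claim-arrival process N(t)
% with rate \lambda, and i.i.d. claim amounts X_i with mean \mu; the safety
% loading is \theta = c/(\lambda\mu) - 1.  Writing T = \inf\{t : U(t) < 0\} for
% the time of ruin, the quantities studied are the distributions of the surplus
% immediately prior to ruin, U(T^-), and of the deficit at ruin, |U(T)|,
% expressed in terms of the ruin probability \psi(u) = \Pr(T < \infty \mid U(0) = u).
```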
1999
In the present paper we generalize Panjer's (1981) recursion for compound distributions to a multivariate situation where each claim event generates a random vector. We discuss situations within insurance where such models could be applicable, and consider some special cases of the general algorithm. Finally we deduce from the algorithm a multivariate extension of De Pril's (1985) recursion for convolutions.
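For reference, the univariate recursion being generalized can be sketched as follows for Poisson claim counts and integer-valued severities; this is the classical Panjer (1981) algorithm, not the multivariate extension derived in the paper:

```python
import math

def panjer_poisson(lam, severity, s_max):
    """Classical Panjer recursion for a compound Poisson distribution.

    lam      -- Poisson parameter of the claim count N
    severity -- list of probabilities f[0], f[1], ... for the claim-size
                distribution on the non-negative integers
    s_max    -- largest aggregate amount s for which to return g[s] = P(S = s)
    """
    f = severity + [0.0] * max(0, s_max + 1 - len(severity))
    g = [0.0] * (s_max + 1)
    g[0] = math.exp(lam * (f[0] - 1.0))              # P(S = 0)
    for s in range(1, s_max + 1):
        # Poisson is the (a, b, 0) member with a = 0, b = lam.
        g[s] = (lam / s) * sum(j * f[j] * g[s - j] for j in range(1, s + 1))
    return g

# Example: on average two claims, each of size 1 or 2 with equal probability.
agg = panjer_poisson(lam=2.0, severity=[0.0, 0.5, 0.5], s_max=10)
```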
1999
Data Administration Including Warehousing & Design (narrow topic or advanced); Special Topics (narrow topic or advanced); Like all professionals in the information era, actuaries need computers to automate non-creative activities and to relieve them from the burden of repetitive actions. Actuaries need a system which shields them from the complexities of computer architecture and provides an abstraction and generalization exactly at the level
1999
Algorithms for the calculation of the distribution of the aggregate claims from a life insurance portfolio have been derived by Kornya (1983), Hipp (1986) and De Pril (1986 and 1989). All these authors considered the distribution of the aggregate claims over a single period. In this paper we derive algorithms for the calculation of the joint distribution of the aggregate claims from a life portfolio over several periods.
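As background, De Pril's (1986) single-period recursion, which the paper extends to several periods, can be stated as follows in generic notation (n_{ij} independent policies with benefit amount i = 1, ..., I and death probability q_j, j = 1, ..., J):

```latex
\begin{align*}
  f_S(0) &= \prod_{i=1}^{I}\prod_{j=1}^{J} (1-q_j)^{n_{ij}}, \\
  f_S(s) &= \frac{1}{s} \sum_{i=1}^{\min(s,I)} \sum_{k=1}^{\lfloor s/i \rfloor}
            h(i,k)\, f_S(s-ik), \qquad s \ge 1, \\
  h(i,k) &= i\,(-1)^{k+1} \sum_{j=1}^{J} n_{ij}
            \left( \frac{q_j}{1-q_j} \right)^{k}.
\end{align*}
```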
1999
In order to be complete, dynamic financial analysis (DFA) models should deal with both the amount and timing of future loss and loss adjustment expense payments. Even more than asset cash flows, these future payments are very uncertain. However, even with this uncertainty, one would expect to see payments that are somewhat stable from year to year. This paper presents an approach that can deal with this seeming contradiction.
1999
Finding a parametric model that fits loss data well is often difficult. This paper offers an alternative—the semiparametric mixed exponential distribution. The paper gives the reason why this is a good model and explains maximum likelihood estimation for the mixed exponential distribution. The paper also presents an algorithm to find parameter estimates and gives an illustrative example.
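To make the estimation step concrete, here is a minimal EM-type fit of a finite mixture of exponentials; this is the standard EM update for exponential mixtures, offered as a sketch rather than the specific algorithm presented in the paper:

```python
import numpy as np

def fit_mixed_exponential(x, n_components=3, n_iter=200, seed=0):
    """Fit a finite mixture of exponentials by a standard EM algorithm.

    Density: f(x) = sum_k w[k] * (1/theta[k]) * exp(-x/theta[k]), x > 0.
    Returns the mixing weights w and the component means theta.
    """
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    w = np.full(n_components, 1.0 / n_components)
    theta = np.quantile(x, rng.uniform(0.1, 0.9, n_components))   # crude start
    for _ in range(n_iter):
        # E-step: posterior probability that each loss came from component k.
        dens = (w / theta) * np.exp(-x[:, None] / theta)          # shape (n, K)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form updates for exponential mixtures.
        w = resp.mean(axis=0)
        theta = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    return w, theta

# Example with simulated losses from a two-component mixture.
rng = np.random.default_rng(1)
losses = np.concatenate([rng.exponential(1_000, 800), rng.exponential(10_000, 200)])
weights, means = fit_mixed_exponential(losses, n_components=2)
```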
1999
Special Topics (narrow topic or advanced); A guide to principles and practice of archives management in private and public sector organizations. Reviews the history and function of archives and archival services, and explains how to set up and run a records management program, manage the interface with archival management, and conduct a records survey. Also covers coding and description of archival material, user issues, and security.
1999
The paper by Halliwell [1] and the Discussion of Halliwell’s paper by Dr. Schmidt both consider the form of “best” linear unbiased estimators for unknown quantities based on observable values. This paper proposes a general definition of “best” called Uniformly Best (UB) to distinguish it from previous definitions and provides various equivalent forms for the definition.
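For context, the classical setting in which “best” linear unbiased estimation is usually framed is sketched below; the notation is generic, and the paper's Uniformly Best definition generalizes this, so the sketch is background rather than the paper's own formulation:

```latex
% Observable vector y with E[y] = X\beta and Var(y) = \Sigma; target quantity
% q = \Lambda\beta.  A linear estimator \hat{q} = Ay is unbiased if AX = \Lambda,
% and "best" usually means minimum variance among all linear unbiased
% estimators.  With full-rank X and positive-definite \Sigma this is attained by
\[
  \hat{q} \;=\; \Lambda\,(X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1} y
\]
% (the Gauss--Markov--Aitken result).
```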
1999
In a recent paper on loss reserving, Halliwell suggests predicting outstanding claims by the method of generalized least squares applied to a linear model. An example is a linear model for Z_{i,k}, the total claim amount of all claims which occur in year i and are settled in year i+k. The predictor proposed by Halliwell is known in econometrics but it is perhaps not well-known to actuaries.
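The econometric predictor alluded to is, in generic notation, the generalized-least-squares (Goldberger-type) predictor sketched below; this is offered as background and is not necessarily the exact form in Halliwell's paper:

```latex
% Observed claims y with E[y] = X\beta, outstanding claims z with E[z] = X_0\beta,
% and covariance blocks V_{yy} = Var(y), V_{zy} = Cov(z, y):
\begin{align*}
  \hat{\beta} &= (X' V_{yy}^{-1} X)^{-1} X' V_{yy}^{-1} y, \\
  \hat{z}     &= X_0 \hat{\beta} + V_{zy} V_{yy}^{-1}\,(y - X\hat{\beta}),
\end{align*}
% i.e. the best linear unbiased predictor of z when the covariance structure is known.
```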
1999
Having had the pleasure of seeing my paper in the Proceedings, I am even more pleased now that Klaus Schmidt and Michael Hamer have deigned to discuss it. But even with their discussions, most of the subject of statistically modeling loss triangles remains terra incognita; and I hope that actuaries and academics will continue to explore it.
1999
When commuting workers compensation reinsurance claims, the standard method is to project the future value of the claims using stated assumptions for future medical usage, medical inflation, cost-of-living adjustments, and investment income. The actuary selects a best estimate for each variable, and assumes this deterministic number will be realized in the future.
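A stylized version of the deterministic calculation described, with function and variable names and assumption values chosen purely for illustration:

```python
def commutation_present_value(annual_medical, annual_indemnity, n_years,
                              medical_trend=0.07, cola=0.03, discount=0.05):
    """Project future workers compensation payments under fixed best-estimate
    assumptions and discount them to a single commutation value.

    annual_medical   -- current annual medical cost for the claim
    annual_indemnity -- current annual indemnity (wage-replacement) benefit
    n_years          -- assumed number of remaining payment years
    medical_trend    -- assumed annual medical inflation
    cola             -- assumed annual cost-of-living adjustment on indemnity
    discount         -- assumed annual investment yield used for discounting
    """
    pv = 0.0
    for t in range(1, n_years + 1):
        medical = annual_medical * (1 + medical_trend) ** t
        indemnity = annual_indemnity * (1 + cola) ** t
        pv += (medical + indemnity) / (1 + discount) ** t
    return pv

# Example: a lifetime claim projected for 30 years under the assumptions above.
value = commutation_present_value(annual_medical=25_000, annual_indemnity=18_000,
                                  n_years=30)
```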
1999
Data Management Profession (narrow topic or advanced); This paper provides a timely overview of the legal, political and practical implications of intellectual property concepts as they apply to insurance data collection and use. Intellectual property issues have become common in regulatory discussions during the 1990's and have also become important to the understanding of advisory organizations.
1999
The current syllabus of the Casualty Actuarial Society, especially parts four and ten thereof, exposes actuaries to mathematical finance, particularly to the valuation and management of cash flows. The Society believes that financial matters will be even more important to actuaries of the twenty-first century. However, the syllabus readings do not take advantage of the mathematical proficiency of actuaries.
1999
Data Administration Including Warehousing & Design (narrow topic or advanced); Data Quality (narrow topic or advanced); Each year, companies lose millions as a result of inaccurate and missing data in their operational databases. This in turn corrupts data warehouses, causing them to fail.
1999
Special Topics (narrow topic or advanced); Recent announcements such as the Prudential's plan to fully demutualize have brought the issue of demutualization to the forefront of the insurance industry. The Center for Insurance Research estimates that one in six households may be impacted by the demutualization of Prudential alone.