Browse Research
2001
Hedge funds are how the other half invests. If you've got a few hundred thousand just sitting around not doing anything, read this.
2001
Chapter headings:
Investment Income
Investment and Tax Strategies
Rate of Return Measures
Impact of Investment Income on Pricing
Conclusion
References
2001
In the last few years we have witnessed growing interest in Dynamic Financial Analysis (DFA) in the nonlife insurance industry. DFA combines many economic and mathematical concepts and methods. It is almost impossible to identify and describe a unique DFA methodology. There are some DFA software products for nonlife companies available in the market, each of them relying on its own approach to DFA.
2001
Data Administration Including Warehousing & Design (narrow topic or advanced); Inside, ORM authority Terry Halpin blends conceptual information with practical instruction that will let you begin using ORM effectively as soon as possible.
2001
In this paper we consider the problem raised in the Astin Bulletin (1999) by Prof. Benktander at the occasion of his 80th birthday concerning the choice of an appropriate claim size distribution in connection with reinsurance rating problems. Appropriate models for large claim distributions play a central role in this matter. We review the literature on extreme value methodology and consider its use in reinsurance.
2001
Whittaker graduation is applied to the spatial smoothing of insurance data. Such data (e.g. claim frequency) form a surface over the 2-dimensional geographic domain to which they relate. Observations of this surface are subject to sampling error. They need to be smoothed spatially if a reliable estimate of the underlying surface is to be obtained. A measure of smoothness of a surface has been defined.
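The penalized least-squares idea behind Whittaker graduation translates directly into code. The following is a minimal one-dimensional sketch (the paper's setting is two-dimensional); the roughness weight h, the difference order z, and the noisy claim-frequency series are all chosen for illustration rather than taken from the paper.

```python
# Minimal 1-D Whittaker graduation: fitted values minimize the sum of squared
# residuals plus h times the sum of squared z-th order differences.
import numpy as np

def whittaker_smooth(y, h=10.0, z=2):
    """Solve (I + h * D'D) f = y, where D is the z-th difference operator."""
    n = len(y)
    D = np.diff(np.eye(n), n=z, axis=0)   # (n - z) x n difference matrix
    A = np.eye(n) + h * D.T @ D
    return np.linalg.solve(A, np.asarray(y, dtype=float))

# Hypothetical noisy observations of an underlying claim-frequency curve.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 0.05 + 0.02 * np.sin(2 * np.pi * x) + rng.normal(0, 0.005, 50)
f = whittaker_smooth(y, h=50.0, z=2)
```

Larger h forces a smoother (flatter) fitted surface; h = 0 reproduces the raw observations.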
2001
The normal inverted gamma mixture or generalized Student and the symmetric double Weibull, as well as their logarithmic counterparts, are proposed for modeling some loss distributions in non-life insurance and daily index return distributions in financial markets.
2001
For problems such as rating excess of loss reinsurance and estimating deductible credits, actuaries frequently employ exposure rating factors. In the context of property insurance this takes the form of loss tables such as the Lloyd's scale or Salzmann tables. These tables display the fraction of loss cost retained for layers expressed as fractions of insured value, or policy limit.
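As a rough illustration of how such a scale is used, the sketch below prices a property layer from a hypothetical exposure curve G(d), where G(d) is the fraction of ground-up loss cost retained below a deductible d expressed as a fraction of insured value. The curve form, its parameter c, and the policy figures are assumptions for illustration, not values from the Lloyd's scale or Salzmann tables.

```python
# Exposure rating sketch with an assumed first-loss scale.
import math

def exposure_curve(d, c=3.0):
    """Hypothetical concave exposure curve on [0, 1] with G(0)=0 and G(1)=1."""
    return (1 - math.exp(-c * d)) / (1 - math.exp(-c))

def layer_loss_cost(total_loss_cost, insured_value, attachment, limit, c=3.0):
    """Expected loss cost ceded to the layer 'limit xs attachment'."""
    low = min(attachment / insured_value, 1.0)
    high = min((attachment + limit) / insured_value, 1.0)
    return total_loss_cost * (exposure_curve(high, c) - exposure_curve(low, c))

# Example: $50,000 expected ground-up loss cost on a $1M insured value,
# ceding the layer $500K xs $500K.
ceded = layer_loss_cost(50_000, 1_000_000, 500_000, 500_000)
```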
2001
This paper proposes bonus-malus systems for fleets of vehicles, by using the individual characteristics of both the vehicles and the carriers. Bonus-malus coefficients are computed from the history of claims or from the history of safety offences of the carriers and the drivers.
2001
Companies subject to significant catastrophe losses are continually evaluating their appetite and tolerance for risk. A sound process of linking catastrophe models to financial output is critical to understanding the implications of risk management strategies.
2001
This paper presents a dynamic method to estimate fair value insurance liabilities for the whole book of business (with separate but correlated lines). The model studies the aggregate liability without assuming independence of individual losses. A non-traditional approach is proposed which estimates the fair value liability based on a stochastic model of individual losses.
2001
This paper has been submitted in response to the Committee on Dynamic Financial Analysis 2001 Call for Papers. The authors have applied dynamic financial analysis to DFA Insurance Company (DFAIC) to address capital adequacy and capital allocation issues. The DFA model used for this analysis was the Swiss Re Investors Financial Integrated Risk Management (FIRM™) System. This paper is Part 2 of a two-part submission.
2001
From a major world event (such as a military action) to a seemingly minor detail (such as the use of a new plastic washer in a faucet design), change must be accounted for when collecting, interpreting and analyzing data. Indeed, the intervention itself may be the focus of the study. Theoretically, the best way to model some interventions, especially time-dependent ones, is via the hazard function.
2001
The majority of optimal Bonus-Malus Systems (BMS) presented up to now in the actuarial literature assign to each policyholder a premium based on the number of his accidents. In this way, a policyholder who had an accident with a small loss is unfairly penalized in the same way as a policyholder who had an accident with a large loss.
2001
With rapid advances in technology and communications, globalization is impacting every sector of the economy, and insurance is no exception.
2001
The application of loss trends has long been a fundamental part of the ratemaking process. Despite this, the actuarial literature is somewhat lacking in the description of methods by which one can estimate the proper loss trend from empirical data. Linear or exponential least squares regression is widely used in this regard. However, there are problems with the use of least squares regression when applied to insurance loss data.
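For context, the exponential least-squares fit referred to here is usually carried out as a log-linear regression of severity (or frequency, or pure premium) on time, with the slope converted to an annual trend factor. The yearly severities in the sketch below are made up for illustration.

```python
# Exponential loss trend via log-linear least squares (illustrative data).
import numpy as np

years = np.arange(1995, 2001)                     # accident years
severity = np.array([4200, 4510, 4700, 5050, 5320, 5680], dtype=float)

slope, intercept = np.polyfit(years - years[0], np.log(severity), 1)
annual_trend = np.exp(slope) - 1.0                # e.g. 0.06 means +6% per year
fitted = np.exp(intercept + slope * (years - years[0]))
```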
2001
With so much discussion about claim benchmarking, treatment protocols and the like, did you ever wish someone could just point you in the right direction? This analysis of detailed workers compensation [WC] claim data, now becoming available to researchers, leads to a picture that resembles a simplified navigational chart.
2001
Special Topics (narrow focus or advanced); With E-business comes the opportunity for companies to really get to know their customers—who they are and their buying patterns. Business managers need an integrated strategy that supports customers from the moment they enter the front door—or Web site—right through to fulfillment, support, and promotion of new products and services.
2001
Aggregate Loss Distributions are used extensively in actuarial practice, both in ratemaking and reserving. A number of approaches have been developed to calculate aggregate loss distributions, including the Heckman-Meyers method, Panjer method, Fast Fourier transform, and stochastic simulations.
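Of the methods listed, the Panjer recursion is the easiest to show compactly. The sketch below assumes a Poisson claim count and an already-discretized severity distribution, both chosen for illustration; it is not code from the paper.

```python
# Panjer recursion for a Poisson frequency and discrete severity pmf.
import numpy as np

def panjer_poisson(lam, severity_pmf, n_points):
    """Aggregate loss pmf on the same discrete grid as severity_pmf.

    severity_pmf[k] = P(X = k * step), with severity_pmf[0] assumed 0.
    Recursion: g[s] = (lam / s) * sum_{k=1..s} k * f[k] * g[s-k],
               g[0] = exp(-lam).
    """
    f = np.zeros(n_points)
    f[:len(severity_pmf)] = severity_pmf
    g = np.zeros(n_points)
    g[0] = np.exp(-lam)
    for s in range(1, n_points):
        k = np.arange(1, s + 1)
        g[s] = (lam / s) * np.sum(k * f[k] * g[s - k])
    return g

# Example: Poisson(2) claim count, severities of 1 or 2 units with equal chance.
agg = panjer_poisson(lam=2.0, severity_pmf=[0.0, 0.5, 0.5], n_points=20)
```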
2001
Based on the notions of value-at-risk and expected shortfall, we consider two functionals, abbreviated VaR and RaC, which represent the economic risk capital of a risky business over some time period required to cover losses with a high probability. These functionals are consistent with the risk preferences of profit-seeking (and risk averse) decision makers and preserve the stochastic dominance order (and the stop-loss order).
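Empirically, the two notions underlying these functionals can be computed from a simulated loss sample as a high quantile (value-at-risk) and the mean loss beyond it (expected shortfall). The sketch below uses a simulated lognormal sample purely for illustration and is not the paper's RaC construction.

```python
# Empirical value-at-risk and expected shortfall from simulated losses.
import numpy as np

def var_es(losses, alpha=0.99):
    """Return (VaR, ES) at confidence level alpha for a loss sample."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)
    tail = losses[losses >= var]
    es = tail.mean()                 # average of losses at or beyond the VaR
    return var, es

rng = np.random.default_rng(1)
sample = rng.lognormal(mean=10.0, sigma=1.2, size=100_000)
var99, es99 = var_es(sample, alpha=0.99)
```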
2001
The DFA Insurance Company (DFAIC) is a fictional insurance company created by the CAS for the 2001 Dynamic Financial Analysis (DFA) Call for Papers. Those who respond to the call are expected to use DFA to answer specific questions about DFAIC's capital adequacy, capital allocation and reinsurance strategy. This paper is a response to that call.
2001
The paper contains a brief review of the bonus/malus ratemaking methodology found within the European community. It proceeds to explain how, under such systems, a priori and a posteriori ratemaking have to be integrated into a continuous risk evaluation mechanism.