Search Presentations

STAY TUNED! Additional search filters by attribute and level, aligned with the CAS Capability Model, are coming later this summer. As the CAS begins to code recorded sessions by specific attributes and levels (starting with the 2023 Annual Meeting), these tags will be applied to the CAS database of presentations going forward and will be searchable.

But you may use the Capability Model now to help you identify topics. For example, if you want to move up one level under the content area “Functional Expertise,” you may search topics in the particular functional area to expand your knowledge.

Recorded content is searchable by Capability Model attribute and level in the CAS Online Library.

Impact of Health Care Reform on Medical Malpractice

Health care reform is here, and it brings significant short- and long-term challenges to employers and the insurance industry. Actuaries should understand the potential impact of health reform on their lines of business. This session will focus on workers compensation and medical professional liability and will address how changes created by the reform may affect the cost of claims going forward. While we can't quantify the impact of health reform at this time, attendees will leave with an understanding of the factors that might affect the frequency and severity of claims and will be better prepared to evaluate and respond to changes in claims experience as it emerges.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: Harry Shuford, John Mize

Assembling Claim Adjuster Notes and Other Unstructured Data for Data Analytics Applications

Claim adjuster notes, diary notes, and other forms of unstructured text data are largely untapped sources of detailed information on property-casualty insurance claims. These data can provide useful information on the particular circumstances of an incident, the assignment of liability, and other details of a claim that may not be tracked in standard structured-data fields. For example, while structured data are usually limited to reporting one or a few causes of an incident, mining unstructured data can capture numerous causes. Furthermore, although there is often an initial assignment of liability, the assignment (or apportionment) can change as additional information becomes available; claim adjuster notes and other unstructured data can be used to obtain current information on liability on an as-available, ongoing basis. While structured data are often gathered when a claim is reported, the collection of unstructured data is an open-ended, ongoing process, so unstructured data provide additional information that continues to be gathered weeks, months, and years after an incident. Finally, working with unstructured data overcomes coding limitations that often confine information in structured data. The size and properties of unstructured data often present challenges in extracting and organizing the information into a usable form. Using real-world data on automobile complaints, this session will describe procedures for extracting and compiling text data into a form usable for data analytics.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Terri Kremenski
Panelists: Philip Borba
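
To make the extraction step concrete, here is a minimal Python sketch of flagging multiple causes of loss in free-text adjuster notes; the note texts and keyword patterns are hypothetical, not the presenters' actual procedure.

# A minimal sketch, not the session's actual procedure: flag candidate
# causes of loss in free-text claim adjuster notes. Notes and keyword
# patterns below are hypothetical.
import re
import pandas as pd

notes = pd.DataFrame({
    "claim_id": [101, 102, 103],
    "note": [
        "Insd vehicle rear-ended claimant at stop light; wet road surface.",
        "Claimant alleges brake failure; vendor inspection pending.",
        "Hail damage to roof and hood; no injuries reported.",
    ],
})

# Unlike a single structured cause-of-loss code, text mining can flag
# several contributing causes on the same claim.
causes = {
    "rear_end": r"rear[- ]end",
    "weather": r"\b(wet|ice|snow|hail)\b",
    "mechanical": r"\b(brake|steering|tire) failure\b",
}

for name, pattern in causes.items():
    notes[name] = notes["note"].str.contains(pattern, case=False, regex=True)

print(notes[["claim_id"] + list(causes)])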

Usage Based Insurance: Are you Ready?

Usage-based automobile insurance (UBI) is now being implemented by a number of insurers, and the battle for product differentiation using telematics data is heating up. Telematics data is clearly powerful for segmentation, and the economics are now becoming favorable for mass rollout to consumers. As a result, UBI has caught the attention of most insurers, and actuaries are being asked difficult questions about how and why to implement it. This session is designed to educate participants on the current status of UBI and to explore key questions that must be answered to understand the challenges of implementation: What is the current market landscape for usage-based insurance? What data can be obtained from telematics, and how powerful are these data for risk segmentation and product differentiation? How are insurers and telematics providers overcoming the obstacles to implementation?
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Terri Kremenski
Panelists: Robin Harbage, Allen Greenberg

Trends in Workers Compensation Medical Costs

The link between changes in medical severity and medical price inflation was markedly different in the last half of the 1990s than in the first half of the current decade. The shift creates uncertainty when estimating loss cost trends. Harry Shuford will present National Council on Compensation Insurance research that quantifies the factors accounting for these differences; the findings and potential explanations for the changes will be discussed. Frank Schmid will present work on a state-level price index of physician services in workers compensation. Two price indexes are calculated: one based on transaction prices and one based on the fee schedule. Comparing the two indexes shows how medical prices respond to fee schedule changes at the level of AMA categories and in the aggregate.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Terri Kremenski
Panelists: Frank Schmid
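
The abstract does not specify how the indexes are constructed; as a hedged illustration only, a fixed-weight (Laspeyres-type) price index over procedure codes takes the form

\[ I_t = \frac{\sum_j w_j \, p_{j,t}}{\sum_j w_j \, p_{j,0}}, \]

where $p_{j,t}$ is the price of procedure $j$ in period $t$ (a transaction price for one index, the scheduled fee for the other) and $w_j$ is a base-period utilization weight. Running the same formula with the two price series yields directly comparable indexes.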

Quantifying Risk Load for Property Cat Exposure

Determining the appropriate compensation for the bearing of risk is a critical element in insurance pricing. The bearing of property catastrophe risk, and the required compensation for such high-layer risks, have become increasingly important topics in today's insurance market. Recently developed methodologies in actuarial pricing have begun to use the capital markets to determine the required compensation. The volume of insurance-linked securities (ILS) available in the capital markets is substantial, and catastrophe bonds are one form of ILS. In this session, available capital market data on catastrophe bonds will be used to quantify the cost of catastrophe risk for property insurance. Several applications will be presented, including quantifying risk loads and evaluating the cost of catastrophe reinsurance. In addition, practical considerations for implementation will be discussed.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Terri Kremenski
Panelists: Paul Anderson, Seth Goodchild
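
As a hedged illustration of how capital market data can be turned into a risk load (not necessarily the presenters' method): if a cat bond's spread over the risk-free rate is $s$ and its modeled expected loss is $\mathrm{EL}$, the implied risk load is

\[ \mathrm{RL} = s - \mathrm{EL} = \mathrm{EL}\left(\frac{s}{\mathrm{EL}} - 1\right), \]

and the multiple $s/\mathrm{EL}$ observed on bonds with comparable expected loss can be applied to an insurance layer's modeled $\mathrm{EL}$ to load its price for catastrophe risk.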

Optimization of Distribution Channels Using Statistical Techniques

Insurance companies have always strived to measure and improve the performance of their distribution channels, the lifelines of their production. Traditionally, this important function has been assigned to the company's sales and marketing department. While many other departments within insurance companies (e.g., actuarial, underwriting, and claims) have successfully utilized predictive modeling and other statistical techniques to improve their performance, these techniques have not been used prevalently in the sales and marketing area. In this session, we will demonstrate some of the techniques that can be used to design strategies in a distribution system, such as agency management, and present specific examples of the techniques used. We will discuss how to statistically measure the various factors that affect an agency's performance, as well as the effects of interactions between those factors. The uses and benefits of the statistical approach, which is designed to complement traditional sales and marketing techniques, will be discussed as well, including the measurement of success.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Rachel Boles
Panelists: Benny Yuen

Loss Cost Modeling vs. Frequency and Severity Modeling

In recent years, loss cost modeling using the Tweedie distribution has been gaining popularity in GLM-based predictive modeling practice, joining the longer-established frequency and severity modeling approach. This has left us with two widely used GLM design approaches, each with its own strengths and weaknesses. The frequency-severity approach prescribes modeling claim frequency (claim count over exposure) and claim severity (incurred loss over claim count) separately, and then combining those results to create loss cost estimates. Enhancements of the basic frequency-severity approach include the creation of modeled loss cost datasets to facilitate offsetting and special treatment or modeling of high-severity losses. The loss cost approach uses loss cost data (incurred loss over exposure) directly as the target variable in the model. In this session, we will discuss the pros and cons of the two model design approaches in the context of a class plan study or underwriting tier development. The discussion will cover both business and statistical considerations for personal and commercial lines, and we will use multiple data sources to illustrate comparisons and support our findings.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Jun Yan, Chad Davis
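
As a rough illustration of the two design approaches compared above, the sketch below fits both to simulated data using the open-source statsmodels package; the rating factor, simulated distributions, and Tweedie power p = 1.5 are assumptions for illustration only, not the presenters' study.

# Minimal sketch of the two GLM designs on hypothetical policy data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({"x": rng.integers(0, 3, n), "exposure": 1.0})
df["claim_count"] = rng.poisson(0.1 * (1 + 0.3 * df["x"]))
df["incurred_loss"] = np.where(
    df["claim_count"] > 0,
    rng.gamma(2.0, 1000.0, n) * df["claim_count"],
    0.0,
)

# Approach 1: separate frequency (Poisson) and severity (gamma) models.
freq = smf.glm(
    "claim_count ~ C(x)", data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["exposure"]),   # exposure enters as an offset
).fit()
sev_df = df[df["claim_count"] > 0].copy()
sev_df["severity"] = sev_df["incurred_loss"] / sev_df["claim_count"]
sev = smf.glm(
    "severity ~ C(x)", data=sev_df,
    family=sm.families.Gamma(sm.families.links.Log()),
).fit()

# Combined loss cost = predicted frequency * predicted severity.
new = pd.DataFrame({"x": [0, 1, 2], "exposure": 1.0})
lc_freq_sev = freq.predict(new) * sev.predict(new)

# Approach 2: model loss cost directly with a Tweedie GLM (1 < p < 2).
tw = smf.glm(
    "incurred_loss ~ C(x)", data=df,
    family=sm.families.Tweedie(var_power=1.5),  # log link by default
    offset=np.log(df["exposure"]),
).fit()
lc_tweedie = tw.predict(new)

print(pd.DataFrame({"freq_sev": lc_freq_sev, "tweedie": lc_tweedie}))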

Intelligent Use of Competitive Analysis

Competitive analysis is one of the key elements in measuring the performance of a rating algorithm, and it requires the capture and analysis of competitive data. Over the years, much work has been done on capturing the data; however, there is a significant lack of sophistication in analyzing it. This session will begin by discussing the sources of good competitive information and the challenges of acquiring it. The focus will then turn to how to analyze the data to make informed decisions. Analysis strategies will run the gamut from traditional data mining to more sophisticated analysis to the incorporation of the information into demand curves.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Kelleen Arquette

Workers Compensation—State of the Market

An overview of the current state of the workers compensation line will be presented, including a review of financial results, recent trends, and a discussion of where the line might be headed.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Nancy Treitel-Moore

Overview of Cat Bond Market

Non-life catastrophe bond issuance rose to $4.8 billion in 2010, a 41% increase over 2009, and cat bonds remain a competitive alternative to catastrophe reinsurance. This session will provide an overview of how the cat bond market works. It will include an explanation of the costs associated with cat bond issuance and how cat bonds are priced, so that costs can be identified for use in ratemaking applications. It will also cover the available data, serving as an introduction to session RR-3, where the cat bond data will be used in other ratemaking applications. Additionally, Morton Lane plans to share some of the interesting analysis he does on cat bond market data.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Morton Lane

Catastrophe Modeling for Commercial Lines

This session will address catastrophe modeling from a commercial lines perspective, where modeled losses can be highly dependent on assumptions concerning construction and occupancy mappings, proper accounting for policy conditions, dealing with large data sets, and extra coverages. Improved methodologies for modeling business interruption and complex industrial facilities, approaches for understanding the sensitivity of modeled losses to input data, and trends in how commercial lines insurers are using catastrophe models will be discussed.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Jason Nonis

Risk and Return Considerations in Ratemaking: Calculating the Profit Provision

After you have the projected loss costs and expenses, the final step in deriving the indicated premium is to load in the underwriting profit provision. But what is the right number? This session will supply not one, but several, answers to that question. It will survey different approaches, from those mandated by regulators to those used by corporate pricing actuaries for internal profitability analysis. The assumptions and parameter selections for each method will be discussed and the sensitivity of results to key parameters will also be explored. The session will have a practical focus with an emphasis on clarifying basic concepts and highlighting key distinctions between different methods.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
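
For reference, one standard version of the loading step the abstract describes (the pure premium indication formula) is

\[ P = \frac{\bar{L} + \bar{E}_F}{1 - V - Q}, \]

where $\bar{L}$ is the projected loss cost per exposure, $\bar{E}_F$ the fixed expense per exposure, $V$ the variable expense ratio, and $Q$ the profit (and contingencies) provision as a share of premium. The session's question is what value of $Q$ is right, and the formula makes clear why the indicated premium is sensitive to that choice.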

Moving Beyond the Credit Score

Credit-based insurance scores have generated a lot of interest over the past 10 years. Before the use of credit scores in insurance became prevalent in the late 1990s and early 2000s, it had been many years since a new factor had such a significant impact on the rating and underwriting of personal lines insurance. For this reason, insurance companies and credit score vendors have argued for their use, consumer groups have rallied against it, and regulators have been caught somewhere in the middle. It is clear from recent history that the debate over credit scores is not going away in the immediate future, and its resolution is not clear either. Ultimately, though, whether credit scores are allowed or disallowed does not change one important point: insurance companies need to move beyond the credit score. In response to this reality, this presentation will discuss:
* The history of adoption and the support behind the use of credit scores
* The challenges to the use of credit scoring
* Reasons why credit score is useful in predicting insurance loss, and how companies can use this logic to move beyond credit scores
* How companies have responded to states where the use of credit is banned
* Analytic approaches to determining an optimal rating and underwriting approach with or without credit
* Segmentation factors and tools that can have a significant impact on the industry going forward
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Chris Maydak

Medical Fee Schedules and Their Impact on Workers Compensation Costs

Over the last decade, medical inflation and the utilization of medical services have increased significantly, and today close to 60% of workers compensation benefits are attributable to medical costs. With this escalation of medical claim costs, regulators and legislators in many states have implemented and revised medical fee schedules as part of their medical cost containment solutions. This session will provide background on workers compensation medical fee schedules. It will also include a discussion on evaluating the cost impacts of various types of schedule changes, as well as the data challenges associated with pricing them.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Christian Lemay
Panelists: Ziv Kimmel
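
As a simplified, hedged illustration of the cost evaluation discussed above (ignoring utilization responses and network discounts, which a full pricing must consider), a first-order estimate of a fee schedule change's direct price impact weights the fee changes by payment shares:

\[ \Delta \approx \sum_g w_g \left( \frac{f_g^{\text{new}}}{f_g^{\text{old}}} - 1 \right), \]

where $f_g$ is the scheduled fee for service group $g$ and $w_g$ is the share of medical payments in that group.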

GLM II

GLM I provided the case for using GLMs and some basic GLM theory. GLM II will be a practical session outlining basic modeling strategy. The discussion will cover topics such as overall modeling strategy, selecting an appropriate error structure and link function, simplifying the GLM (i.e., excluding variables, grouping levels, and fitting curves), complicating the GLM (i.e., adding interactions), and validating the final model. The session will discuss diagnostics that help test the selections made.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Struppeck
Panelists: Joseph Marker, Amel Arhab
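
As a minimal sketch of one step in the strategy outlined above, the code below complicates a Poisson GLM with an interaction and checks whether the deviance and AIC support keeping it; the data and variable names are hypothetical, not the session's material.

# Complicating a GLM with an interaction and testing the selection.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "age_group": rng.integers(0, 3, n),
    "territory": rng.integers(0, 2, n),
})
# Simulate an effect that differs by territory only for age_group 0.
mu = 0.08 * (1 + 0.4 * df["age_group"]) * \
     (1 + 0.5 * df["territory"] * (df["age_group"] == 0))
df["claims"] = rng.poisson(mu)

base = smf.glm("claims ~ C(age_group) + C(territory)", data=df,
               family=sm.families.Poisson()).fit()
inter = smf.glm("claims ~ C(age_group) * C(territory)", data=df,
                family=sm.families.Poisson()).fit()

# A lower AIC (and a deviance drop material relative to the added
# parameters) supports keeping the interaction; otherwise simplify.
print("base        AIC:", base.aic, " deviance:", base.deviance)
print("interaction AIC:", inter.aic, " deviance:", inter.deviance)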

The Differences & Similarities Between Cyber and Errors & Omissions Insurance: Who Needs What?

This session is designed to address a common question, and potential misunderstanding, from insurance agents, brokers, and risk managers: do I need "cyber" insurance when I already provide or carry errors & omissions insurance? The answer, of course, is "it depends." Some organizations will have little to no coverage for "cyber"-type events, while others may have significant liability coverage. This session reviews the differences and the important questions that an agent, broker, or risk manager should ask about the different coverages.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: George Allport, Steven Anderson

Workers Compensation Ratemaking—An Overview

The panel will review the essential components of a typical rate filing from the perspective of NCCI, other bureaus, and from the view of companies in loss cost jurisdictions. The discussion will highlight coverages, exposure bases, and data sources used for workers compensation ratemaking.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Struppeck
Panelists: Scott Skansberg

GLM I

Do terms such as "link function," "exponential family," and "deviance" leave you puzzled? If so, this session will clarify those terms and demystify generalized linear models (GLMs). The session will provide a basic introduction to linear models and GLMs. Targeted at those who have modest experience with statistics or modeling, the session will start with a brief review of traditional linear models, particularly regression, which has been taught and widely applied for decades. Session leaders will explain how GLMs naturally arise as some of the restrictive assumptions of linear regression are relaxed. GLMs can model a wide range of phenomena, including frequencies and severities as well as the probability that a claim is fraudulent or abusive, to name just a few. The session will emphasize intuition and insight rather than mathematical calculations. Illustrations will be presented using actual insurance data.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Kevin Lee
Panelists: Ashley Lambeth
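
As a minimal sketch of the terms the session demystifies, using one of the examples named in the abstract (the probability that a claim is fraudulent): a binomial-family GLM with a logit link, fit here on simulated rather than actual insurance data.

# Binomial GLM: the family is an exponential-family member, the logit
# link maps the linear predictor onto (0, 1), and the deviance is the
# GLM goodness-of-fit measure. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({"late_report_days": rng.exponential(10, n)})
p = 1 / (1 + np.exp(-(-3.0 + 0.08 * df["late_report_days"])))
df["fraud"] = rng.binomial(1, p)

model = smf.glm("fraud ~ late_report_days", data=df,
                family=sm.families.Binomial(sm.families.links.Logit())).fit()
print(model.summary())              # coefficients on the log-odds scale
print("deviance:", model.deviance)  # lower is a better fit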

Comprehensive Data Review for Commercial Insurers as Part of Pricing Analysis

As part of a periodic review of rate level for a commercial insurance product, the actuary should consider a comprehensive data review, which would address the following for the product managers and other recipients of the actuarial work product:
1. Is the data complete? Are there material facts not available in the data source?
2. Is the data accurate? What is the error rate in the data? What steps are in place to ensure accurate information? Can the information be relied on?
3. Are there anomalies in the data that, while correct, are a 'red flag'?
4. Are there anomalies or significant changes in subsegments of the data with respect to hit ratios, coverage, growth rates, etc.?
5. Is there 'orphan' data, that is, data being captured that serves no purpose?
6. Does the product manager have access to pertinent information between periodic actuarial reviews, information that is accurate, complete, and delivered in an efficient manner?
7. Are there multiple sources for certain data, and are those sources consistent with each other?
8. What systems feed the data warehouse? It would be helpful for the actuary to know how the data gets from a transaction system to the data warehouse.
9. Is there clear, consistent guidance about the meanings of all data fields? Are all the underwriters using consistent definitions?
Such a comprehensive data review results in a continuous cycle of improvement in the actuarial data. The discussion throughout this session will give both the actuary and the product manager a better qualitative understanding of the business, in addition to improving the results of quantitative analyses.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent

Commercial Lines Predictive Analytics: Before & After

Commercial lines predictive analytics continue to develop and become more sophisticated. Some exciting changes are occurring related to new or improved external data available to commercial lines insurers. In addition, companies are implementing better business intelligence tools to monitor program performance after implementing analytics. This session will focus on the latest developments in commercial lines, before and after predictive modeling.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: Robert Walling, Tony Phillips

Update on Latest Vehicle Changes and Safety Enhancements - Vehicle Rating Developments

The panelists will discuss recent changes in automotive design and safety characteristics and the latest developments in rating plans utilized for personal and commercial automobile insurance.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: Kim Hazelbaker, LeRoy Boison

Generalized Linear Mixed Models for Ratemaking: A Means of Introducing Credibility into a Generalized Linear Model Setting

GLMs that include explanatory classification variables with sparsely populated levels assign large standard errors to those levels but do not otherwise shrink estimates toward the mean in response to low credibility. Accordingly, actuaries have attempted to superimpose credibility on a GLM setting, but the resulting methods do not appear to have caught on. The Generalized Linear Mixed Model (GLMM) is yet another way of introducing credibility-like shrinkage toward the mean in a GLM setting, and recently available statistical software, such as SAS PROC GLIMMIX, renders these models more readily accessible to actuaries. This paper offers background on GLMMs and presents a case study displaying shrinkage toward the mean very similar to Bühlmann-Straub credibility.
Source: 2011 Ratemaking and Product Management Seminar
Type: paper
Panelists: Fred Klinker
Keywords: Generalized Linear Mixed Models for Ratemaking, GLM
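
The paper reports GLMM shrinkage very similar to Bühlmann-Straub credibility. As a reference point, the sketch below computes standard Bühlmann-Straub estimates on hypothetical class data, showing sparsely exposed classes pulled strongly toward the grand mean; it is not the paper's case study.

# Buhlmann-Straub shrinkage on hypothetical class/period data.
import numpy as np

# rows = classes, columns = periods; m = exposures, x = claim frequencies
m = np.array([[100., 120.], [10., 8.], [500., 550.]])
x = np.array([[0.05, 0.07], [0.00, 0.25], [0.06, 0.05]])

m_i = m.sum(axis=1)                      # exposure by class
xbar_i = (m * x).sum(axis=1) / m_i       # exposure-weighted class means
xbar = (m_i * xbar_i).sum() / m_i.sum()  # grand mean

# Expected process variance and variance of hypothetical means.
c, n = m.shape
epv = (m * (x - xbar_i[:, None]) ** 2).sum() / (c * (n - 1))
denom = m_i.sum() - (m_i ** 2).sum() / m_i.sum()
vhm = max(((m_i * (xbar_i - xbar) ** 2).sum() - (c - 1) * epv) / denom, 1e-12)

k = epv / vhm
z = m_i / (m_i + k)                      # credibility by class
estimate = z * xbar_i + (1 - z) * xbar   # shrunken class estimates
print(np.column_stack([xbar_i, z, estimate]))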

Credibility for a Tower of Excess Layers

In pricing excess of loss reinsurance, the traditional method for applying credibility is a weighted average of two estimates of expected loss: one from experience rating and a second from exposure rating. This paper shows how the method can be improved by incorporating loss estimates from lower layers, producing a multi-factor credibility-weighted estimate of expected loss. The method described is based on a minimum variance criterion, whereby the resulting credibility-weighted estimator has a lower variance than any other combination of the individual estimators. It is shown that the multi-factor credibility model can be presented as a simple recursive procedure for practical application.
Source: 2011 Ratemaking and Product Management Seminar
Type: paper
Panelists: Dave Clark
Keywords: Credibility
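
As an illustration of the minimum-variance criterion the paper invokes: given unbiased estimators $\hat{\mu}_1, \dots, \hat{\mu}_n$ of a layer's expected loss with covariance matrix $\Sigma$, the weights summing to one that minimize the variance of $\hat{\mu} = \sum_i w_i \hat{\mu}_i$ are

\[ \mathbf{w} = \frac{\Sigma^{-1}\mathbf{1}}{\mathbf{1}^{\top}\Sigma^{-1}\mathbf{1}}, \]

which reduces to $w_i \propto 1/v_i$ when the estimators are uncorrelated with variances $v_i$. The resulting variance, $(\mathbf{1}^{\top}\Sigma^{-1}\mathbf{1})^{-1}$, is no larger than that of any individual estimator.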

Federal vs. State Insurance Regulation

President Barack Obama signed the Dodd-Frank Act on July 22, 2010. The regulators and the level of regulation for the insurance industry have changed and continue to change. This session reviews recent issues in the debate over federal versus state regulation, provides a discussion of the pros and cons of each, and gives a status report on the implementation of increased federal legislation. In addition, legislative and other insurance-related regulatory changes made at the state level in reaction to the financial sector collapse will be identified.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Mo Masud
Panelists: Orin Linden, Edward Collins, Mary Hudson

Using C++: A First Step

This session will focus on using C++ as a tool for actuarial analysis. It will cover some of the basics in the creation of a program, in the management of data, and in the implementation of algorithms. Suggestions on using C++ to complement other tools will be given. Code will be analyzed as well.
Source: 2011 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: Jeremy Benson, William Rudolph