Search Presentations

The presentation materials are offered in connection with CAS professional education offerings. © 2022 Casualty Actuarial Society. All Rights Reserved. The presentation materials may contain copyrighted content the use of which has not been specifically authorized by the copyright owner. You are permitted to view and print the materials for personal/professional noncommercial research purposes. Except for the foregoing, you agree not to reproduce, distribute, modify, create derivative works, or commercially exploit the presentation materials without prior written permission from CAS. Please direct any copyright permission inquiries regarding use of the presentation materials to acs@casact.org.

STAY TUNED! If you are anticipating additional search filters by attribute and level to align with the CAS Capability Model, they are coming later this summer. As the CAS begins to code recorded sessions by specific attributes and levels (starting with the 2023 Annual Meeting), these will be tagged in the CAS database of presentations going forward and should be searchable.

But you may use the Capability Model now to help you identify topics. For example, if you want to move up one level under the content area “Functional Expertise,” you may search topics in that functional area to expand your knowledge.

Recorded content is searchable by Capability Model attribute and level in the CAS Online Library.

Innovations in Commercial Auto Insurance

The commercial automobile insurance line offers product and rating plan development opportunities. On one hand, many of the factors and data sources developed for personal auto can be used in commercial auto. Credit scores, vehicle identification numbers (VINs), motor vehicle records (MVRs), and territory refinements all make the transition to commercial auto quite well. On the other hand, the unique characteristics of commercial auto risks offer some new opportunities. Factors such as industry classification, trailer and cargo type, and truck-to-car ratio all contribute to the dynamic world of commercial auto predictive modeling. Usage-based rating approaches can be incorporated as well. This session examines issues faced by two leaders in this market.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Panelists: David Otto, Gary Wang

Implementation of Risk Models

This session will discuss various deployment strategies including pricing, marketing, and internal process flow changes.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Panelists: Christopher Cooksey, Joe Walkush, Jose Trasancos, Mike Cronin

Implementing Commercial Lines Predictive Models in a Soft Market: Challenges and Best Practices

Given the recent challenges P&C companies are facing as they operate in a soft market, this session will examine the strategic and tactical choices companies have to make related to the use of predictive analytics in their day-to-day decision-making process. Using a case study, the presenters will discuss real-world challenges and solutions that companies can examine to maximize the value of predictive analytics in a soft market from an IT, actuarial, business process, and strategic perspective. Market leaders of today will be defined not as those companies that have the core competency to develop predictive analytics but as those that can successfully integrate predictive analytics across the enterprise and monitor its impact and resulting business decisions in a soft market. Given the importance of this holistic approach to predictive analytics development and deployment, the role of the actuary in today's environment will change. The actuary's role is growing from developing and validating the technical aspects of predictive analytics to being a key strategic business partner with operations, offering insight on the strategic and tactical business value that predictive analytics can provide. This session will examine IT, actuarial, marketing, underwriting, and business implementation/process strategies that can help companies overcome the impact that the soft market cycle can have on the successful deployment of predictive analytics.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Erik Johnson
Panelists: Paul Cohen, Mo Masud, Stacey Peterson

Trends in Workers Compensation Medical Costs

Medical severity continues its relentless growth even as claim frequency continues to fall, and pressure from indemnity severity has eased. This session will highlight the National Council on Compensation Insurance research exploring many facets of the growth in medical costs, including factors that have contributed to the growth in medical utilization, the apparent role of generics and FDA regulation on Rx drug trends, and the relationship between workers compensation medical trends and those of the U.S. health care system generally.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Tom McIntyre
Panelists: Tanya Restrepo

Raising Your Actuarial IQ (Improving Information Quality)

Predictive modeling, Sarbanes-Oxley, and other recent developments have renewed the focus on the quality of information. In this session, we approach data quality from the perspective of the cost of poor information quality. We then define information quality and give tips and examples on how to pursue it, including how actuaries can be proactive in improving data quality. The emphasis will be on:
* Techniques that should be easy for most actuaries and analysts to apply right away
* Aspects of data quality that actuaries are best able to fulfill
This session is drawn from the work of the CAS Data Management Educational Materials Working Party (Research Working Party 5).
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Panelists: Aleksey Popelyukhin, Robert Campbell

Measuring the Value of Rate Segmentation

Actuaries and other insurance professionals intuitively understand the importance of rate segmentation in the competitive marketplace. However, it is often difficult to express the value of enhancing the segmentation of an existing rate plan. Measurement of segmentation value facilitates a cost/benefit analysis of decisions to implement new rate segmentation plans. Yet new segmentation plans are often implemented in a “revenue neutral” manner, which some may understand to mean there is no revenue benefit to offset the implementation cost. This session will discuss potential frameworks to develop a measurement of the value of rate segmentation. These frameworks evaluate the potential costs resulting from lack of segmentation in a competitive marketplace, including the effects of adverse selection, customer retention, and customer price sensitivity. These potential costs can then be compared to the expected implementation costs in a manner similar to a traditional cost/benefit analysis. Presentations will also relate these frameworks to other statistical measures of lift, such as the Gini index and the Value of Lift.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Tom McIntyre
Panelists: Frank Karlinski, Neeraj Arora
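The Gini index mentioned above can be computed directly from a holdout sample: sort risks from lowest to highest predicted loss cost, trace the Lorenz curve of cumulative actual losses, and measure the area between that curve and the line of equality. The sketch below is a minimal illustration with invented toy data, not the presenters' own measurement.

```python
def gini_index(predicted, actual):
    """Gini index of a rating plan's lift.

    Sort risks ascending by predicted loss cost, accumulate their actual
    losses into a Lorenz curve, and return twice the area between the
    line of equality and that curve (0 = no lift, larger = more lift).
    """
    order = sorted(range(len(predicted)), key=lambda i: predicted[i])
    total = sum(actual)
    cum = 0.0
    area = 0.0  # area under the Lorenz curve, trapezoid rule
    for i in order:
        area += (cum + actual[i] / 2) / total
        cum += actual[i]
    return 1 - 2 * area / len(predicted)

# A plan that ranks risks well earns a high Gini; a flat plan earns ~0.
print(gini_index([1, 2, 3, 4], [0, 0, 0, 10]))  # strong segmentation
print(gini_index([1, 2, 3, 4], [1, 1, 1, 1]))   # no segmentation
```

In practice the curve is usually built on exposure-weighted deciles of a holdout set rather than individual risks, but the area calculation is the same.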

More Flexible GLMs: Zero-Inflated Models and Hybrid Models

For modeling claims within the GLM framework, the Poisson distribution is a popular choice. In the presence of overdispersion, the negative binomial is also sometimes used. The statistical literature has suggested that taking excess zeros into account can improve the fit of count models when overdispersion is present. In insurance, excess zeros may arise when claims near the deductible are not reported to the insurer, thus inflating the number of zero-claim policies when compared to the predictions of a Poisson or negative binomial distribution. In predictive modeling practice, data mining techniques such as neural networks and decision trees are often used to handle data complexities such as nonlinearities and interactions. Data mining techniques are sometimes combined with GLMs to improve the performance or efficiency of the predictive modeling analysis. One augmentation of GLMs uses decision tree methods in the data preprocessing step. An important preprocessing task reduces the number of levels of categorical variables so that sparse cells are eliminated and only significant groupings of the categories remain. This paper addresses some common problems in fitting count models to data:
* Excess zeros
* Parsimonious reduction of category levels
* Nonlinearity
Source: 2009 Ratemaking and Product Management Seminar
Type: Paper
Moderators: Marn Rivelle
Panelists: Louise Francis, Matthew Flynn
Keywords: Flexible GLMs, Zero-Inflated Models, Hybrid Models
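The excess-zeros idea can be made concrete with the zero-inflated Poisson mixture the paper builds on: with probability pi a policy is a structural zero, otherwise its claim count is Poisson with mean lam. The lam and pi values below are illustrative assumptions, not figures from the paper.

```python
import math

def poisson_zero_prob(lam):
    """P(N = 0) under a Poisson claim-count distribution with mean lam."""
    return math.exp(-lam)

def zip_zero_prob(lam, pi):
    """P(N = 0) under a zero-inflated Poisson: a point mass at zero with
    weight pi, mixed with a Poisson(lam) carrying weight 1 - pi."""
    return pi + (1 - pi) * math.exp(-lam)

# Illustrative frequency assumptions only.
lam, pi = 0.15, 0.10
print(poisson_zero_prob(lam))  # zero share a plain Poisson predicts
print(zip_zero_prob(lam, pi))  # larger zero share once excess zeros mix in
```

When the observed share of zero-claim policies materially exceeds exp(-lam) from a fitted Poisson, a zero-inflated fit (Poisson or negative binomial) is worth testing.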

Class Ratemaking for Workers Compensation: NCCI's New Methodology

For the first time in many years, NCCI is revising the methodology used to determine class relativities in workers compensation loss cost filings. This paper will describe the new methodology NCCI has developed, and reveal the research approach and analyses underlying the modifications NCCI will be implementing to several key class ratemaking components. The paper will discuss in detail how the traditional areas of class ratemaking were modified, namely loss development, limiting large claims and applying expected excess provisions, updating credibility standards, and the derivation of industry group differentials. The paper will also focus on the new NCCI class ratemaking approach from an educational perspective for actuaries who are just becoming familiar with workers compensation. Exhibits are provided in Appendix B illustrating the stepwise derivation of a loss cost for a classification from beginning to end.
Source: 2009 Ratemaking and Product Management Seminar
Type: Paper
Moderators: Marn Rivelle
Panelists: Thomas Daley
Keywords: Ratemaking for Workers Compensation

Impact of the Economy on Workers Compensation Insurance

"It’s the economy!" applies to workers compensation as well as to politics. National Council on Compensation Insurance's economists have conducted extensive research into the economic factors that drive loss costs. This session will cover a range of issues, most notably why it seems likely that the decline in claim frequency will continue for the foreseeable future, how the business cycle affects loss costs, and the relationship between the business cycle and the underwriting cycle.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Frank Schmid

Implementing Usage Based Auto Insurance as a Segmentation Tool

Different methods of monitoring mileage and driving behavior are being introduced into the market. Some public data exists that demonstrates the significance this information has as a risk segmentation tool. Early adopters have faced significant hurdles and lead time to launch. This session is geared toward the issues insurers face in launching a program and how to develop actual rates from the vast quantity of information that can be collected from monitoring devices.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Panelists: Germain Denoncourt, Robin Harbage

Conversion and Retention Modeling

Predictive modeling has gained widespread acceptance within the U.S. insurance industry as a means to estimate loss costs. However, the U.S. insurance industry lags behind other industries in the use of predictive modeling as a means to understand customer response. This session will cover the use of multivariate techniques to study and predict outcomes such as response rate, policyholder retention, and new business conversion. The panel will provide practical tips and illustrative results associated with modeling customer response data. Furthermore, the panel will address the benefits and applications of modeling customer response, tying it in, at a high level, with price optimization techniques.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Geoffrey Werner, James Tanser

Product Management In Small to Medium Sized Companies

This session will explore the unique challenges to a small company versus a big company in building a product management team/culture.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Mark Crutcher, Joe Walkush

GLM II

GLM I provided the case for using GLMs and some basic GLM theory. GLM II will be a practical session outlining basic modeling strategy. The discussion will cover topics such as overall modeling strategy, selecting an appropriate error structure and link function, simplifying the GLM (i.e., excluding variables, grouping levels, and fitting curves), complicating the GLM (i.e., adding interactions), and validating the final model. The session will discuss diagnostics that help test the selections made.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Robert Weishaar, Ernesto Schirmacher

Workers Compensation Claim Frequency

This session will provide an in-depth review of workers compensation claim frequency research over the past decade. Recent changes by claim size, injury type, and occupation are several categories that will be explored. Uses of claim frequency in pricing and predictive modeling for workers compensation will also be discussed.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Jacob Geyer

Price Optimization for the U.S. Market: Techniques and Implementation Strategies

Incorporating the concept of demand into the ratemaking process is not new, but new developments and trends in the industry point towards a more quantitative application of price elasticity and demand in the future. Integrating these concepts into ratemaking is part of price optimization. Although price optimization has been used in most major markets for some time, regulatory constraints have slowed its adoption in the U.S. market. This session will focus on the technical aspects of price optimization, carefully laying out the steps to determine a set of optimized rates and also discussing the means by which insurers can use these optimized rates to implement actuarially sound rates in the U.S. market.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Michael Greene
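The core trade-off the session lays out, retention against margin, can be sketched as a grid search over candidate renewal prices. Everything here (the logistic demand curve, the elasticity value, the function names) is a hypothetical stand-in for a fitted demand model, not the presenters' method.

```python
import math

def retention_prob(price, current_price, elasticity=-5.0):
    """Hypothetical logistic demand curve: renewal probability falls as
    the quoted price rises relative to the current price."""
    return 1 / (1 + math.exp(-elasticity * (price / current_price - 1)))

def optimal_price(current_price, expected_cost):
    """Pick the candidate price that maximizes expected profit:
    retention probability times (price minus expected loss cost)."""
    grid = [current_price * (0.80 + 0.01 * i) for i in range(41)]
    return max(grid, key=lambda p: retention_prob(p, current_price)
                                   * (p - expected_cost))

print(optimal_price(current_price=1000.0, expected_cost=800.0))
```

Real implementations optimize over a whole portfolio under rate-regulation and actuarial-soundness constraints, which is exactly the U.S.-specific complication the abstract highlights.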

GLM I

Do terms such as "link function," "exponential family," and "deviance" intimidate you? If so, this session will help demystify generalized linear models (GLMs) by providing a basic introduction to linear models and GLMs. Targeted at those who have modest experience with statistics or modeling, the session will start with a brief review of traditional linear models, particularly regression, which has been taught and widely applied for decades. Session leaders will explain how GLMs naturally arise as some of the restrictive assumptions of linear regression are relaxed. GLMs can model a wide range of phenomena, including frequencies and severities as well as the probability that a claim is fraudulent or abusive, to name just a few. The session will emphasize intuition and insight rather than mathematical calculations. Illustrations will be presented using actual insurance data.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Paul Anderson, Tanya Havlicek

Finding a Balance Between Rate Stability and Adverse Selection Avoidance

Actuaries are developing more sophisticated predictive models that better match rate to risk at an unprecedented pace. Given how quickly these models generate new insights, incorporating them into the product as soon as they're discovered could lead to rate instability. On the other hand, waiting to incorporate them in the name of rate stability may perpetuate existing adverse selection or subject you to it in the near future. This session will discuss a framework for determining the optimal time to implement product enhancements. Furthermore, when the time for implementation arrives, there is a direct relationship between the amount of rating accuracy improvement and the amount of potential in-force customer dislocation. On one hand, it could be argued that using the same rates for new and in-force customers is the correct and "actuarially sound" approach. However, the company must consider the impact on retention if a large portion of its book receives a significant rate change. This session will discuss various strategies for mitigating in-force customer dislocation when implementing a product enhancement. For instance, one popular approach is to implement the enhancement for both new and in-force customers but apply rate caps to the in-force customers. The implementation of rate caps is far from simple: the capping rule must be precisely defined and consideration must be given to what different DOIs will allow.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Gerald Kirschner
Panelists: Kelly McKeethan, Wade Warriner
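The capping rule mentioned at the end must be stated precisely; one common formulation limits each renewal's change to a fixed percentage of the current premium. A minimal sketch follows, with the ±10% cap chosen arbitrarily for illustration (real filings add details such as cap duration and DOI-specific constraints):

```python
def capped_renewal_premium(current, indicated, cap=0.10):
    """Move toward the indicated premium, but limit the renewal change
    to within +/- cap of the current premium."""
    floor = current * (1 - cap)
    ceiling = current * (1 + cap)
    return max(floor, min(indicated, ceiling))

print(capped_renewal_premium(1000.0, 1250.0))  # hits the +10% ceiling
print(capped_renewal_premium(1000.0, 1050.0))  # within the cap, uncapped
print(capped_renewal_premium(1000.0, 820.0))   # held at the -10% floor
```

A capped risk converges to its indicated rate over successive renewals, which is why the choice of cap width trades dislocation against how long adverse selection persists.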

Federal vs State Insurance Regulation

Recent events in the financial sector have resulted in questions as to what level of regulation is optimal and who should provide the oversight. With respect to insurance in particular, the debate about federal vs. state regulation has intensified. This session will provide a discussion of the pros and cons of each, as well as a status report on the federal legislation. In addition, legislative and other insurance-related regulatory changes made at the state level in reaction to the financial sector collapse will be identified.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Annette Goodreau
Panelists: Eric Nordman, Karen Adams, J. Zielezienski

Applications of Quantile Regression in Commercial Underwriting Models

Session panelists will present a robust regression technique, quantile regression, and use it to develop underwriting models for commercial insurance. Ordinary least-squares regression or a GLM analyzes the relationship between X and the conditional mean of Y. In contrast, quantile regression models the relationship between X and the conditional quantiles of Y. Quantile regression produces a very robust estimation because random large noises will not affect the model as much as in least-squares and GLM regressions. Quantile regression is especially useful in commercial lines, where the data is very volatile and extremes are important, such as identifying the highest risks. Another advantage of quantile regression is that it provides a more complete picture of the conditional distribution of the loss ratio Y. A case study will demonstrate this robust regression technique numerically.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Panelists: Cheng-Sheng Wu, Luyang Fu

Workers Compensation Ratemaking-An Overview

The panel will review the essential components of a typical rate filing from the perspective of the National Council on Compensation Insurance, other bureaus, and from the view of companies in loss cost jurisdictions. The discussion will highlight coverages, exposure bases, and data sources used for workers compensation ratemaking.
Source: 2009 Ratemaking and Product Management Seminar
Type: Concurrent
Moderators: Camille Minogue
Panelists: Jay Rosen

Basic Ratemaking: Introduction to Credibility

Considering credibility in the context of ratemaking concepts, this session will review variables affecting credibility and credibility formulas, as well as practical techniques for increasing credibility. Both the classical and Bühlmann models will be described.
Source: 2009 Ratemaking and Product Management Seminar
Type: Workshop
Moderators: Martin King
Panelists: Christopher Otterman
Keywords: Ratemaking, Credibility
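The two models the session covers reduce to short formulas. In the classical (limited-fluctuation) model, full credibility is commonly set at 1,082 claims (a 5% margin at 90% confidence), with partial credibility following a square-root rule; in the Bühlmann model, Z = n / (n + k). The claim counts and rates below are invented for illustration:

```python
import math

FULL_CREDIBILITY_CLAIMS = 1082  # classical standard: 5% margin, 90% confidence

def classical_credibility(n):
    """Limited-fluctuation credibility: Z = sqrt(n / n_full), capped at 1."""
    return min(1.0, math.sqrt(n / FULL_CREDIBILITY_CLAIMS))

def buhlmann_credibility(n, k):
    """Buhlmann credibility: Z = n / (n + k), where k is the expected
    process variance divided by the variance of hypothetical means."""
    return n / (n + k)

def credibility_weighted(z, observed, complement):
    """Blend observed experience with its complement of credibility."""
    return z * observed + (1 - z) * complement

# Invented example: 500 claims of experience, k = 1082, rates per exposure.
z = buhlmann_credibility(500, 1082)
print(round(z, 3), round(credibility_weighted(z, 0.080, 0.050), 4))
```

Note the practical difference: the classical Z reaches 1 at the full-credibility standard, while the Bühlmann Z only approaches 1 asymptotically as experience grows.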

Product Development: Managing The Product

The fourth segment of the Product Development Workshop, this interactive session will allow participants to learn the basics of product implementation, including monitoring, modification, and follow-up.
Source: 2009 Ratemaking and Product Management Seminar
Type: Workshop
Moderators: Martin King
Panelists: Mark Crutcher
Keywords: Product Development

Basic Ratemaking: Introduction to Increased Limit Factors

This session will present an overview of increased limits ratemaking. Participants will cover general concepts, such as calculating limited average severities, and practical problems with developing increased limit factors (ILFs) from a distribution of loss data. The session will also provide an overview of excess and deductible pricing and will discuss common approaches for calculating ILFs.
Source: 2009 Ratemaking and Product Management Seminar
Type: Workshop
Moderators: Martin King
Panelists: Patrick Thorpe
Keywords: Ratemaking
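The limited-average-severity calculation mentioned above reduces to capping each loss at the limit and averaging; the ILF is then the ratio of the LAS at the desired limit to the LAS at the basic limit. A minimal sketch with an invented loss sample (real ILF work also trends and develops losses, and typically loads for risk and expenses):

```python
def limited_average_severity(losses, limit):
    """Average severity with each loss capped at the policy limit."""
    return sum(min(loss, limit) for loss in losses) / len(losses)

def increased_limit_factor(losses, limit, basic_limit=100_000):
    """ILF(limit) = LAS(limit) / LAS(basic limit), before any risk load."""
    return (limited_average_severity(losses, limit)
            / limited_average_severity(losses, basic_limit))

# Invented ground-up losses for illustration.
losses = [5_000, 20_000, 60_000, 150_000, 400_000, 1_200_000]
for lim in (100_000, 250_000, 500_000, 1_000_000):
    print(lim, round(increased_limit_factor(losses, lim), 3))
```

The sparseness of large losses in a sample like this is exactly the "practical problem" the session flags: at high limits, fitted severity distributions usually replace raw empirical averages.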

Product Development: Marketing Of A New Product

This third segment of the product development workshop will use an interactive approach to cover the basics of marketing a new product. This may be a "window of opportunity" that most actuaries rarely see through.
Source: 2009 Ratemaking and Product Management Seminar
Type: Workshop
Moderators: Martin King
Panelists: Mark Crutcher
Keywords: Product Development

Cat Modeling: Different Uses for the Model Output

This session will look at various ways that outputs from the models can be used and misused. Possible topics include:
* Policy terms and conditions
* Loss and profit components of the overall rate indication
* Territorial and class ratemaking
* Compliance with Standards of Practice 38 and 39
* Use of regulators' analyses to choose a company's model
* Regulatory constraints on use of models in ratemaking
* Solvency regulation
* ERM, capital management and allocation, reinsurance issues
* Rating agency issues
* Underwriting new and existing risks
* Risk mitigation
Presentation of these topics may continue into the final session of the day.
Source: 2009 Ratemaking and Product Management Seminar
Type: Workshop
Moderators: Martin King
Panelists: Larry Johnson, Shawna Ackerman, Stephen Russell
Keywords: Cat Modeling