Search Presentations

The presentation materials are offered in connection with CAS professional education offerings. © 2022 Casualty Actuarial Society. All Rights Reserved. The presentation materials may contain copyrighted content the use of which has not been specifically authorized by the copyright owner. You are permitted to view and print the materials for personal/professional noncommercial research purposes. Except for the foregoing, you agree not to reproduce, distribute, modify, create derivative works, or commercially exploit the presentation materials without prior written permission from CAS. Please direct any copyright permission inquiries regarding use of the presentation materials to acs@casact.org.

STAY TUNED! If you are anticipating additional search filters by attribute and level to align with the CAS Capability Model, they are coming later this summer. As the CAS begins to code recorded sessions by specific attributes and levels (starting with the 2023 Annual Meeting), these will be tagged in the CAS database of presentations going forward and should be searchable.

But you may use the Capability Model now to help you identify topics. For example, if you want to move up one level under the content area “Functional Expertise,” you may search topics in the particular functional area to expand your knowledge.

Recorded content is searchable by Capability Model attribute and level in the CAS Online Library.

Do Medical Fee Schedules Really Work? Evidence on Provider Behavior Reflected in Observed Changes in Medical Utilization and Severity as Well as the Prices Actually Paid

Quantifying the effects of changes to physician fee schedules in workers compensation has become an integral part of NCCI legislative pricing, as an increasing number of jurisdictions have introduced such legal provisions over the past decades. Indeed, understanding the dynamics of WC medical costs in response to market changes is critical to effective ratemaking for all workers compensation market participants. The session will start with a discussion of the creation of a series of medical cost indexes (for fee schedule prices, prices actually paid, severity, and utilization) and offer evidence on how changes in actual medical prices paid depart from and may offset changes in the fee schedule itself. The analysis also provides evidence to evaluate the common belief that changes in utilization enable medical providers to further offset the direct impact of fee schedule changes. This analysis provides important insights into the dynamics of WC medical costs.

This will be followed by a discussion of the impact of the introduction of fee schedules in states where previously there were none. The introduction of fee schedules likely serves as a “shock” to the status quo in markets for WC medical services. The market reaction to such a shock is likely to differ in material ways from patterns observed once fee schedules are the norm. The medical index methodology presented in the first part of the session can also be used to assess the impact of the introduction of fee schedules in states that previously had no direct pricing controls. This analysis also requires the development of additional methods to estimate the cost trends that would have been observed if the fee schedules had not been introduced. The analysis indicates that fee schedules impact trends as well as levels of medical costs.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Harry Shuford, Nathan Lord

Cyber Data and Analytics

The underwriting and pricing process for cyber risk is driven mainly by qualitative rather than quantitative considerations. The availability of cyber event data, however, has grown rapidly over the past few years due to cyber-disclosure guidelines. By leveraging an extensive cyber event data set together with predictive modeling approaches, we are introducing more rigor into the cyber insurance risk evaluation process.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Panelists: Mark Hoffmann

Customer Lifetime Value - Opportunities and Challenges

Customer lifetime value (CLV) is a useful tool in marketing and customer relationship management (CRM). CLV has been gaining ground in the insurance industry over the last several years. Despite the theoretical simplicity of CLV, it is fraught with difficulty when applied in practice. In this talk, we will discuss critical issues to consider when modeling and implementing CLV applications within the insurance industry.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Mohamad Hindawi, Gregory Firestone

Credit Scoring

The use of credit-based scores for determining insurance rates remains a controversial and unresolved issue. In this session, we bring together a proponent, an opponent, and a moderator who has dealt with insurance from both a regulatory and insurance company viewpoint. They will present their positions and provide an opportunity for the audience to hear and explore various positions on this issue. Birny Birnbaum, Executive Director of the Center for Economic Justice, will present consumer concerns about insurers’ use of consumer credit information for underwriting and rating homeowners and auto insurance. Birny will also discuss risk classification issues beyond consumer credit information, including appropriate actuarial and public policy standards for evaluating the reasonableness of emerging risk classifications. John B. Wilson, Director of Insurance Analytics at LexisNexis Risk Solutions, will present the advantages of using consumer credit information to underwrite and rate insurance policies. Michael Lamb, who dealt with insurance scores as a regulator and now is a neutral consultant, will serve as moderator. He will facilitate the discussion and posit panel-puzzling perspectives.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Birny Birnbaum, John Wilson

Creating a Data Driven Culture

What is the difference between having lots of data and having a corporate culture that relies on data to drive better business decisions? What factors help some companies achieve this goal? What barriers prevent the efficient use of data at others? Building on the success of last year’s interactive roundtable discussion, this year’s session will be a fully interactive, audience participation driven look at best practices, barriers, success stories, and the role of actuaries in creating a data driven culture.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek

Cost of Capital and Capital Attribution - A Primer for the Property Casualty Actuary

This session will provide a historical primer on the subject and address past and current research on practical ways of reflecting the cost of capital in ratemaking. The primer will discuss approaches developed by Merton-Perold, Myers-Read, Mango’s asset share model, and the RMK procedures. Also considered will be current research on whether actual capital should be viewed in tranches based on the risk metrics employed. We have come a long way since the premium-to-surplus ratios of 30 years ago. Let’s continue the evolution.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Robert Wolf, Glenn Meyers, David Ruhm

Commercial Usage-Based Insurance

Making the business case for a telematics application is easier in commercial lines than in personal lines. Surprisingly, the majority of successful product offerings in the insurance industry are in personal lines. Nevertheless, commercial auto insurers are moving quickly to catch up with personal auto insurers. In this presentation, we will contrast personal and commercial lines needs. We will discuss UBI strategies that worked for personal lines and why they do not work in commercial lines. Finally, we will focus on how to build a strategy for your commercial UBI offering that is aligned with your customers’ needs and your company’s long-term goals.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Mohamad Hindawi

Climate Change Panel

We will kick off the Severe Weather Workshop with a panel discussion led by the Chair of the CAS Climate Change Committee. Top meteorological experts will share their perspectives on climate change, focusing on its impacts on the insurance industry. This will set the stage for the rest of the day as we learn techniques for pricing for severe weather events.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Peter Dailey, Vijay Manghnani

Catastrophe Pricing: Making Sense of the Alternatives

You have run your favorite catastrophe simulation model and have thousands of years of simulated losses. You are ready to derive the indicated price, but now questions arise about how to compute the indicated risk load. There are many types of statistics that could be generated from the simulated data. Which one should be used for pricing? Each statistic can be applied on a standalone basis or from several portfolio perspectives. Which approach makes the most sense? There is a set of desirable pricing properties called Coherence, yet some commonly employed methods fail to be Coherent. Does that matter? Some methods are driven by the extreme tail, while others look at all events that consume capital. Where is the proper focus? This session will not provide definitive answers to any of those questions, but it will explain alternative catastrophe pricing approaches and use simple examples to clarify the definition and behavior of different algorithms. The session promises to be informative and to offer new insights.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Chuck Boucek
Panelists: Ira Robbin
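As a toy illustration of the standalone-versus-portfolio question raised in the abstract above, the sketch below (all figures simulated here, not drawn from any cat model) computes a Tail Value-at-Risk (TVaR) statistic both standalone and via a co-TVaR allocation; TVaR is one of the coherent risk measures the abstract alludes to, and the co-TVaR split is one of several possible portfolio perspectives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated annual losses for two hypothetical accounts, deliberately
# correlated through a shared component so diversification is imperfect.
shared = rng.lognormal(10, 1.0, 10_000)
acct_a = 0.6 * shared + rng.lognormal(9, 0.8, 10_000)
acct_b = 0.4 * shared + rng.lognormal(9, 0.8, 10_000)

def tvar(losses, p=0.99):
    """Tail Value-at-Risk: mean loss in the worst (1 - p) of simulated years."""
    threshold = np.quantile(losses, p)
    return losses[losses >= threshold].mean()

# Standalone view: each account measured against its own worst years.
standalone = tvar(acct_a) + tvar(acct_b)

# Portfolio view: TVaR of the combined book, allocated back by each
# account's average contribution in the portfolio's tail years (co-TVaR).
total = acct_a + acct_b
tail = total >= np.quantile(total, 0.99)
co_tvar_a = acct_a[tail].mean()
co_tvar_b = acct_b[tail].mean()
portfolio = tvar(total)

# Co-TVaR allocations sum exactly to the portfolio TVaR, and the portfolio
# TVaR does not exceed the standalone sum (subadditivity, a coherence property).
print(round(standalone), round(portfolio), round(co_tvar_a + co_tvar_b))
```

The gap between `standalone` and `portfolio` is the diversification benefit a pricing approach must decide how to credit back to individual accounts.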

Catastrophe Modeling for Commercial Lines

This session will address catastrophe modeling from a commercial lines perspective, where modeled losses can be highly dependent on assumptions concerning construction and occupancy mappings, proper accounting for policy conditions, dealing with large data sets, and extra coverages. Improved methodologies for modeling business interruption and complex industrial facilities, approaches for understanding the sensitivity of modeled losses to input data, and trends in how commercial lines insurers are using catastrophe models will be discussed.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Vanita Banks

Case Studies in Adding Variable Interactions in GLMs

Model development using generalized linear models (GLMs) has traditionally focused on the search for and inclusion of main-effects variables. More recently, interactions between variables have become the aim of many practitioners. This session explores various methods employed in identifying variable interactions.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Vanita Banks
Panelists: Paul Beinat, Chun Li

Captivated by Captives - Promises and Pitfalls

An interactive discussion of the captive marketplace, including domiciles, types of captives, and common lines of business written. We will address a range of advantages and disadvantages of captive insurance companies, and then initiate a discussion of the areas surrounding captives that can be problematic for an actuary. Particular focus is placed on the calculation of expected losses for low-frequency lines of insurance, estimating premium from loss projections, premium transfer-pricing issues, and reserve sufficiency.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Vanita Banks
Panelists: James Bulkowski, Charles Mitchell

Beyond the Cost Model: Understanding Price Elasticity and Its Applications

Once cost models have been constructed, insurers spend a significant amount of time translating those expected cost models into a rating algorithm. Today, competitive analytics are widely used to support this effort. However, companies often fail to properly integrate them into the pricing process. The intent of this paper is to provide the basic tools needed for insurers to make more effective pricing decisions using customer price elasticity of demand. To achieve this, we will explore demand modeling techniques, as well as practical applications of demand in pricing.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Vanita Banks
Panelists: Greg McNulty
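The demand-side reasoning in the abstract above can be made concrete with a small sketch. Everything here is illustrative: the logistic retention curve and its coefficients are made up, not taken from the paper. The example computes a point elasticity of retention with respect to premium and scans candidate rate changes for the one that maximizes expected retained premium.

```python
import numpy as np

# Hypothetical fitted retention model: probability a policyholder renews
# as a function of the proposed rate change (coefficients are invented).
a, b = 2.0, 8.0   # intercept and slope on the proportional rate change

def renewal_prob(rate_change):
    """Logistic retention curve: P(renew) given a rate change (0.10 = +10%)."""
    return 1.0 / (1.0 + np.exp(-(a - b * rate_change)))

def elasticity(rate_change, h=1e-5):
    """Numerical point elasticity of retention w.r.t. the renewal premium."""
    p = renewal_prob(rate_change)
    dp = (renewal_prob(rate_change + h) - renewal_prob(rate_change - h)) / (2 * h)
    premium = 1.0 + rate_change          # premium relative to expiring
    return dp * premium / p

# Expected premium retained per expiring policy at each candidate rate change;
# the maximizer balances rate increase against lost renewals.
candidates = np.linspace(-0.05, 0.25, 61)
expected_premium = (1.0 + candidates) * renewal_prob(candidates)
best = candidates[np.argmax(expected_premium)]

print(f"elasticity at +10%: {elasticity(0.10):.3f}")
print(f"retained-premium-maximizing rate change: {best:+.1%}")
```

In practice the demand model would be fitted to quote and renewal data and the objective would reflect expected cost as well as premium, but the trade-off has the same shape.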

Beyond Auto - Lines You Are Less Familiar With

This session will explore two lines of business that you may not know a lot about: lender-placed home insurance and boat owners insurance. We’ll explore what lender-placed home insurance is and how it differs from traditional homeowners insurance, what challenges it presents with regard to ratemaking, and why it is attracting regulatory and media attention. An introduction to boat insurance pricing involves a brief overview of the product/rating plans followed by a discussion of the challenges this line presents to actuaries. This discussion will include considerations of the application of standard pricing methodologies in the areas of rate level indications and rating plan development.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Vanita Banks
Panelists: Patrick Curtis, Eric Schmidt

Bayesian Trend Selection

Selecting loss ratio trends is an integral part of NCCI aggregate ratemaking. The trend selection draws on an exponential trend regression model that is applied, alternatively, to the latest 5, 8, and 15 observations. Then, using actuarial judgment, the three estimates are aggregated into a single forecast. This process of decision making under uncertainty can be formalized using Bayesian model selection. A Bayesian trend selection model is introduced that averages across the three exponential trend models. Using a double-exponential likelihood, this model minimizes the sum of absolute forecast errors for a set of (overlapping) holdout periods. The model selection is accomplished by means of a categorical distribution with a Dirichlet prior. The model is estimated by way of Markov chain Monte Carlo simulation (MCMC). The Bayesian trend selection is validated on data from past ratemaking seasons. Further, the robustness of the model is examined for past ratemaking data and a long series of injury (and illness) incidence rates for the manufacturing industry. In both cases, the performance of the Bayesian trend selection is compared to the 5-point, 8-point, and 15-point exponential trend regression, using the random walk as a benchmark. Finally, for the purpose of illustration, the Bayesian trend selection is implemented for an unidentified state.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Tom Hettinger
Panelists: Chris Laws
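The 5-, 8-, and 15-point exponential trend fits described above can be sketched in a few lines. This is a deliberately simplified, non-Bayesian stand-in: the weights below come from inverse absolute error on rolling holdout periods rather than from the categorical/Dirichlet MCMC model in the abstract, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic annual loss ratios with a mild downward exponential trend.
years = np.arange(20)
loss_ratio = 0.70 * np.exp(-0.02 * years) * rng.lognormal(0, 0.03, 20)

def exp_trend_forecast(series, n_points):
    """Fit log-linear OLS to the last n_points and forecast one step ahead."""
    y = np.log(series[-n_points:])
    x = np.arange(n_points)
    slope, intercept = np.polyfit(x, y, 1)
    return np.exp(intercept + slope * n_points)

# Score each model's one-step-ahead absolute error on rolling holdouts,
# then weight models inversely to total error (a crude analogue of
# averaging across the three exponential trend models).
models = (5, 8, 15)
errors = {n: 0.0 for n in models}
for t in range(16, 20):                       # holdout years
    for n in models:
        errors[n] += abs(exp_trend_forecast(loss_ratio[:t], n) - loss_ratio[t])

inv = np.array([1.0 / errors[n] for n in models])
weights = inv / inv.sum()
forecast = sum(w * exp_trend_forecast(loss_ratio, n)
               for w, n in zip(weights, models))
print(dict(zip(models, weights.round(3))), round(forecast, 4))
```

The Bayesian version replaces the ad hoc inverse-error weights with posterior model probabilities estimated by MCMC under a double-exponential likelihood, which is what lets it minimize the sum of absolute forecast errors in a principled way.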

Balancing Robust Statistics - Gradient Boosting

Gradient Boosting (GB) is an iterative algorithm that combines simple parameterized functions with “poor” performance (high prediction error) to produce a highly accurate prediction rule. GB can be interpreted as a hybrid of traditional statistical modeling and data mining. In contrast to both extremes, GB usually provides accuracy comparable to data mining tools while giving interpretable results similar to GLMs. Another advantage of GB is that it requires little data preprocessing and parameter tuning. The method is highly robust to less-than-clean data and can be applied to classification or regression problems with a variety of response distributions (Gaussian, Bernoulli, Poisson, and Laplace). Complex interactions are modeled simply, missing values in the predictors are managed almost without loss of information, and feature selection is performed as an integral part of the procedure. These properties make GB a good candidate for insurance loss cost modeling. However, to the best of our knowledge, the application of this method to insurance pricing has not been fully documented to date. This session presents the theory of GB and its application to the problem of predicting auto “at-fault” accident loss cost using data from a major Canadian insurer. The predictive accuracy of the model is compared against the conventional generalized linear model (GLM) approach and a neural network.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bernie Horovitz
Panelists: Simon Lee
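The iterative idea in the abstract above can be shown with a minimal least-squares gradient boosting loop: each round fits a weak learner (here a one-split regression stump) to the current residuals, which are the negative gradient of squared-error loss. This is a toy sketch on synthetic data, not the panelists' implementation, which used full trees and insurer data.

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_stump(x, residual):
    """Best single-split regression stump on one feature (squared error)."""
    best = None
    for split in np.quantile(x, np.linspace(0.1, 0.9, 9)):
        left, right = residual[x <= split], residual[x > split]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, split, left.mean(), right.mean())
    _, split, lval, rval = best
    return lambda z, s=split, l=lval, r=rval: np.where(z <= s, l, r)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Each stump fits the residuals (negative gradient of squared error)."""
    pred = np.full_like(y, y.mean())
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(x, y - pred)
        pred = pred + lr * stump(x)
        stumps.append(stump)
    base = y.mean()
    return lambda z: base + lr * sum(s(z) for s in stumps)

# Toy target with a nonlinear signal the stump ensemble must recover.
x = rng.uniform(0, 10, 500)
y = np.sin(x) + 0.5 * x + rng.normal(0, 0.2, 500)

model = gradient_boost(x, y)
mse_model = np.mean((model(x) - y) ** 2)
mse_mean = np.mean((y - y.mean()) ** 2)
print(mse_model, mse_mean)
```

Swapping the squared-error residual for the gradient of a Poisson or Laplace deviance is what extends the same loop to the other response distributions mentioned in the abstract.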

Ask a Regulator

Each member of this panel of regulators will present a short overview of current issues and the rate regulation process in their state. “Hot Button” issues will be identified and discussed. A roundtable group discussion will follow, with audience participation strongly encouraged. The differences and the similarities in regulatory approach will be highlighted. In addition, panelists will comment on their experiences, and field questions from the audience.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Richard Betterley
Panelists: Thomas Botsko, David Dahl, Sharon Li, Sarah McNair-Grove

Allocating Capital- A Hands on Case Study

A laptop is recommended for participation in this session. Join us for a hands-on technical session where the audience will be allocating capital to lines of business considering various methods and will be working in groups to make strategic decisions based on their respective results.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Stephen D'Arcy

Adventures in Rate Capping

As companies gain better understanding of their customers’ sensitivity to price, more insurers are implementing Rate Capping programs. Under Rate Capping, a customer’s rate increase (or decrease) may be capped at renewal. Therefore, a new business risk and a renewal risk, all else being equal, may pay different rates, at least in the near term. The primary motivation for Rate Capping is companies’ belief that it meets their customers’ preference for stable rate changes. However, managing Rate Capping can prove difficult. Many of the challenges do not emerge until capping has been implemented and is in place for several years. In this session, we will discuss the pros and cons of Rate Capping and the various challenges it presents, such as its implications for rate indications, managing calendar year results, implementation, and data. The panel will discuss how companies can address these inherent complexities.
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Morgan Bugbee
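The mechanics described above can be sketched in a few lines (the premiums and cap level here are hypothetical): the capped premium moves toward the indicated premium by no more than the cap each term, which is exactly why identical new and renewal risks can pay different rates until the cap unwinds.

```python
def capped_renewal(current, indicated, cap=0.10):
    """Limit the renewal change to +/- cap of the current premium."""
    change = indicated / current - 1.0
    change = max(-cap, min(cap, change))
    return current * (1.0 + change)

# Hypothetical expiring premium vs. the fully indicated premium.
premium, indicated = 800.0, 1100.0
path = []
for term in range(5):
    premium = capped_renewal(premium, indicated)
    path.append(round(premium, 2))

# The renewal premium steps up at most 10% per term until it
# reaches the indicated level.
print(path)
```

The rate-indication wrinkle the panel raises follows directly: on-level earned premium reflects the capped `path`, not the indicated rate, so the indication must account for premium the cap has deferred.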

A Look at Asia Personal Lines Markets

This session will provide an overview of the personal auto markets in Asia, with an in-depth look at China, India, and Malaysia. The overview will include information about the key players, industry experience, rating variables, residual markets, and distribution. The session will also discuss regulation history with a focus on de-tariffing and its impact and implications for the market. In addition, the session will discuss areas where predictive modeling applications can be used to improve profitability (e.g., pricing, risk selection, claims fraud detection, marketing).
Source: 2013 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Ronald Kozlowski

Working With Elephants & Swans in The Real World

In recent years, there have been some watershed disasters that changed traditional thinking about risk, at least for a period of time. Were they black swans? Were they predictable? Were they preventable? This session will take you on an interactive excursion of the current risk environment. We will visit the lair of the black swan and the world of disaster denial. We will share first-hand accounts with swans, elephants and dinosaurs, as well as lessons learned. We are saving a seat for you on this tour.
Source: 2013 In Focus Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Harry Rhulen

Using Credibility to Mitigate the “Winner’s Curse”

It is possible for a reinsurer to use a pricing model based on unbiased estimators and still produce a portfolio of under-priced business. This is because there is variance in any pricing estimator, and in a competitive auction the lowest price “wins” the business; a reinsurer may find that it is more likely to win the bids that are below the true expected price level. In this session, we will discuss how to mitigate this “winner’s curse” by using minimum variance estimators in pricing. The main application will be based on Clark’s Variance paper, “Credibility for a Tower of Excess Layers.”
Source: 2013 In Focus Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Dave Clark
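The mechanism in the abstract above is easy to demonstrate by simulation (all numbers invented): unbiased but noisy quotes, lowest quote wins, and the winning quote is systematically below the true cost. Shrinking each quote toward a benchmark, a crude credibility-style variance reduction, narrows the bias; Clark's paper develops proper minimum variance estimators for excess layers.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each of 5 reinsurers prices each account with an unbiased but noisy
# estimator of the same true expected cost; the lowest quote wins.
true_cost = 100.0
sigma = 0.2
quotes = true_cost * rng.lognormal(-0.5 * sigma**2, sigma, (10_000, 5))

# Unconditionally, every quote is unbiased on average...
print(round(quotes.mean(), 2))

# ...but the winning (lowest) quote is systematically below true cost.
winning = quotes.min(axis=1)
print(round(winning.mean(), 2))

# Shrinking each estimate toward a benchmark (here the known true cost,
# purely for illustration) lowers estimator variance and so reduces
# the winner's-curse bias of the winning quote.
z = 0.5
shrunk_winning = (z * quotes + (1 - z) * true_cost).min(axis=1)
print(round(shrunk_winning.mean(), 2))
```

In practice the benchmark would itself be estimated (e.g., from the whole tower of layers), which is where the credibility machinery comes in.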

Understanding and Modeling Wildfire Risk in a World of Increasing Climatic Variability

This session will present two different perspectives on understanding and modeling wildfire risk during a period of increasing risk from climatic change and from urban expansion into fire-prone areas.

CoreLogic - Over the last two decades, wildfire has been responsible for billions of dollars of property damage. Wildfires that dominated news headlines in 2012, and again this summer, have been responsible for record-setting property destruction across a swath of states spanning the West. CoreLogic builds highly granular wildfire risk models and has identified more than 740,000 residences across 13 states in the western U.S. currently at high or very high risk for wildfire damage, representing a combined total property value estimated at more than $136 billion. This part of the session will focus on understanding and modeling wildfire risk at the parcel level and how this information can be used for pricing, underwriting, and risk mitigation. Case studies, including the June 2013 Black Hills Fire near Colorado Springs and the 2011 Bastrop Fire in Texas, will be presented.

AIR - In October 2007, wildfires in Southern California caused insured property losses of more than $2 billion. At present, more than 5 million homes in California are located in high-risk areas abutting wildland - more than twice as many as in any other state. Continued development in these areas and changes in building practices limit the effectiveness of relying on historical losses for risk management. Yet despite the magnitude of potential losses, and their upward trend over the last two decades, many companies continue to rely on historical data to estimate future wildfire losses. In this session, a senior scientist from AIR Worldwide will discuss AIR’s fully probabilistic wildfire model for California. The presentation will describe how the model makes use of high-resolution LANDFIRE fuels data and a complex fire-spread algorithm that includes the simulation of man-made firebreaks and explicitly captures the possibility that a wildland fire will evolve into an urban conflagration. Additionally, this session will address the relative vulnerability of various types of buildings to fire and how mitigation factors may help curb losses. Also covered will be some of the detailed validation the model undergoes, using both claims data and damage surveys.
Source: 2013 In Focus Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Scott Stransky, Howard Botts

Tornado/Hail: The Emerging Elephant

The session will explore changing weather patterns in North America and review engineering research on the effects of wind, hail, and other perils on structures, with an eye toward research into improving the durability of insured properties. The panelists include one of the authors of Munich Re's recently published book, "Severe Weather in North America," and the CEO of a nonprofit devoted to objective, scientific research that identifies and promotes effective actions to strengthen homes, businesses, and communities against natural disasters and other causes of loss.
Source: 2013 In Focus Seminar
Type: concurrent
Moderators: Alan Roth
Panelists: Mark Bove, Ian Giammanco

The Right UBI Data for Now and the Future

Many insurance companies spend a considerable amount of time and energy building their UBI product and collecting data, only to find that the data they have accumulated is inadequate. This session is focused on UBI data strategy, discussing how to determine the best level of data to collect, expectations for data quantity, and common issues with data quality.
Source: 2013 In Focus Seminar
Type: concurrent
Moderators: Guntram Werther
Panelists: Kelleen Arquette