Search Presentations

The presentation materials are offered in connection with CAS professional education offerings. © 2022 Casualty Actuarial Society. All Rights Reserved. The presentation materials may contain copyrighted content the use of which has not been specifically authorized by the copyright owner. You are permitted to view and print the materials for personal/professional noncommercial research purposes. Except for the foregoing, you agree not to reproduce, distribute, modify, create derivative works, or commercially exploit the presentation materials without prior written permission from CAS. Please direct any copyright permission inquiries regarding use of the presentation materials to acs@casact.org.

STAY TUNED! Additional search filters by attribute and level, aligned with the CAS Capability Model, are coming later this summer. As the CAS begins to code recorded sessions by specific attributes and levels (starting with the 2023 Annual Meeting), these will be tagged in the CAS database of presentations going forward and should be searchable.

In the meantime, you may use the Capability Model to help identify topics. For example, if you want to move up one level within the content area “Functional Expertise,” you may search for topics in that functional area to expand your knowledge.

Recorded content is searchable by Capability Model attribute and level in the CAS Online Library.

Catastrophe Modeling

Source: 2012 Regional Affiliate - BACE
Type: affiliate
Panelists: George Freimarck

Balancing Robust Statistics and Data Mining in Ratemaking: Gradient Boosting Modeling

Gradient Boosting (GB) is an iterative algorithm that combines simple parameterized functions with “poor” performance (high prediction error) to produce a highly accurate prediction rule. GB can be interpreted as a hybrid of traditional statistical modeling and data mining. In contrast to either extreme, GB usually provides accuracy comparable to data mining tools while giving interpretable results similar to a GLM. Another advantage of GB is that it requires little data preprocessing and parameter tuning. The method is highly robust to less-than-clean data and can be applied to classification or regression problems with a variety of response distributions (Gaussian, Bernoulli, Poisson, and Laplace). Complex interactions are modeled simply, missing values in the predictors are managed with almost no loss of information, and feature selection is performed as an integral part of the procedure. These properties make GB a good candidate for insurance loss cost modeling. However, to the best of our knowledge, the application of this method to insurance pricing has not been fully documented to date. This session presents the theory of GB and its application to the problem of predicting auto “at-fault” accident loss cost using data from a major Canadian insurer. The predictive accuracy of the model is compared against the conventional Generalized Linear Model (GLM) approach and against a neural network.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Mark Littmann
Panelists: Leo Guelman
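
As a rough illustration of the boosting idea described in the abstract above, the sketch below compares a single shallow regression tree with a gradient-boosted ensemble on synthetic data; the features, target, and hyperparameters are invented for illustration and are not the session's Canadian auto data or method.

```python
# Illustrative sketch only: gradient boosting vs. one weak learner on
# synthetic data (not the session's data or model).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 5000
X = rng.uniform(size=(n, 4))                       # stand-ins for rating variables
signal = 2.0 * X[:, 0] + np.sin(6 * X[:, 1]) + X[:, 2] * X[:, 3]  # non-linearity + interaction
y = signal + rng.normal(scale=0.5, size=n)         # noisy loss-cost-like target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single shallow tree is one "weak" base learner.
weak = DecisionTreeRegressor(max_depth=2).fit(X_tr, y_tr)

# Boosting fits many weak learners sequentially to the remaining error.
gb = GradientBoostingRegressor(n_estimators=300, max_depth=2,
                               learning_rate=0.05).fit(X_tr, y_tr)

print("weak learner MSE:", mean_squared_error(y_te, weak.predict(X_te)))
print("boosted MSE:    ", mean_squared_error(y_te, gb.predict(X_te)))
```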

Update on Latest Vehicle Changes and Safety Enhancements - Vehicle Rating Developments

The panelists will discuss recent changes in automotive design and safety characteristics and the latest developments in rating plans utilized for personal and commercial automobile insurance.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Toce
Panelists: LeRoy Boison

Product Architecture

Product Architecture is a product management concept rapidly gaining popularity in the commercial lines space. It is a comprehensive mapping of the components, dimensions, and rules of an insurance product, with a focus on isolating reusable assets. Especially when combined with leading product management practices and emerging technologies, the use of product architecture can greatly enhance a company’s flexibility in launching new products and reduce the workload associated with product enhancements. The session will focus on:
• What is Product Architecture?
• Why is it important?
• Illustrative case study examples of:
  • How products are decomposed
  • How market offerings are built
  • Expected benefits
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent

Pricing and Product Issues in Latin American Personal Lines Insurance

This session will provide an overview of the personal auto and homeowners markets in the largest Latin American countries, with an in-depth look at Mexico, Argentina, and Brazil. The overview will include information about the key players, market share, distribution, and other information relevant to each line. The session will also discuss a series of hot topics relevant in both the US and these international markets, including predictive modeling for pricing and risk selection, usage-based insurance, product differentiation, inflation, and regulation.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Toce
Panelists: Lidia Leonardi

Price Optimization for the U.S. Market: Techniques and Implementation Strategies

Recent years have seen continued advancements in the integration of customer demand information into the ratemaking process. As more sophisticated quantitative techniques for combining competitor and price elasticity information with cost models gain popularity in the U.S. market, pricing actuaries are faced with a new array of technical, practical and regulatory challenges. This session will focus on the technical aspects of price optimization, carefully laying out the steps to determine a set of optimized rates. With a focus on the U.S. market, it will also include an overview of data structures and alternatives for using optimized prices to implement actuarially sound rates.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Toce
Panelists: Duncan Anderson
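
A minimal sketch of the kind of calculation the abstract above describes: combining a cost estimate with an assumed demand (conversion) curve and searching candidate rate changes for the most profitable one. All figures, the elasticity form, and the cap are hypothetical and far simpler than a production optimization.

```python
# Illustrative sketch only: picking a rate change that balances expected
# profit against demand response. All numbers are invented.
import numpy as np

expected_loss_cost = 800.0          # modeled loss + expense cost per policy
current_premium = 1000.0
price_elasticity = 4.0              # assumed sensitivity of demand to price change

def conversion_prob(rate_change):
    """Toy logistic demand curve: higher price -> lower probability of sale."""
    return 1.0 / (1.0 + np.exp(price_elasticity * rate_change))

rate_changes = np.linspace(-0.20, 0.30, 101)   # candidates: -20% to +30%
premiums = current_premium * (1 + rate_changes)
expected_profit = conversion_prob(rate_changes) * (premiums - expected_loss_cost)

best = rate_changes[np.argmax(expected_profit)]
print(f"profit-maximizing rate change (unconstrained): {best:+.1%}")

# In practice the search would be constrained, e.g., to stay within
# actuarially supportable and regulatory bounds.
allowed = np.abs(rate_changes) <= 0.10         # hypothetical +/-10% cap
best_capped = rate_changes[allowed][np.argmax(expected_profit[allowed])]
print(f"profit-maximizing rate change (capped):        {best_capped:+.1%}")
```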

Predictive Modeling and By-Peril Analysis for Homeowners Insurance

Homeowners insurance covers a multitude of perils. However, most of the traditional variables used to rate homeowners insurance, such as amount of insurance and construction class, are most appropriate for fire insurance. The session will show how to apply predictive modeling separately by peril. Independent variables will include the traditional rating variables along with external variables. In the models, the effect of the traditional variables will differ by peril. The session will also provide company insight on the development and implementation of by-peril rating.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Thomas Toce
Panelists: David Cummings
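
A minimal sketch, assuming synthetic data and invented rating variables, of the by-peril idea in the abstract above: fit a separate frequency GLM per peril and sum the by-peril predictions. It is not the presenters' model.

```python
# Illustrative sketch only: separate frequency GLMs by peril on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 10_000
df = pd.DataFrame({
    "exposure": rng.uniform(0.25, 1.0, n),
    "amount_of_insurance": rng.uniform(100, 500, n),   # in $000s, invented
    "age_of_home": rng.integers(0, 60, n),
})
# Simulated claim counts with different drivers by peril.
fire_rate = np.exp(-4.0 + 0.002 * df.amount_of_insurance)
water_rate = np.exp(-3.5 + 0.02 * df.age_of_home)
df["fire_claims"] = rng.poisson(fire_rate * df.exposure)
df["water_claims"] = rng.poisson(water_rate * df.exposure)

# One frequency GLM per peril, each with a log-exposure offset.
models = {
    peril: smf.glm(f"{peril}_claims ~ amount_of_insurance + age_of_home",
                   data=df, family=sm.families.Poisson(),
                   offset=np.log(df["exposure"])).fit()
    for peril in ("fire", "water")
}

# The combined expected claim count is the sum of the by-peril fits.
df["expected_claims"] = sum(m.fittedvalues for m in models.values())
print(df[["fire_claims", "water_claims", "expected_claims"]].head())
```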

Predictive Analysis & A Data Driven Culture

Cloud computing. Business analytics. Predictive modeling. As technology allows us to analyze more and more data at ever-increasing speeds, insurance companies are faced with significant threats and tremendous opportunities. This session will take a decidedly strategic look at the role business analytics, including predictive modeling, can play in transforming the insurance-company culture into one that relies on greater analysis of data for decision making. Emphasis will be placed on the cultural dynamic within an insurance company and how internal communication can facilitate the move to a data-driven culture. This session will present a conceptual framework of the opportunities for insurance companies as well as specific examples of current company best practices in commercial lines.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent

Practitioner's Guide to Cost of Capital/Profit Provisions

The panelists will examine how the results of several cost of capital methods have changed over time as economic conditions have changed. We will also examine the requirements and regulations that dictate how the profit provision is determined in several states, as well as the profit provision resulting from the ISO State X approach. Finally, we will summarize the cost of capital expected to be generated by these state-mandated profit provisions.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Yi Jing
Panelists: Paul Ericksen
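
The arithmetic below is a deliberately simplified, hypothetical illustration of how a cost of capital can translate into an underwriting profit provision; it ignores taxes, discounting, and the method differences the panel will discuss, and every number is invented.

```python
# Simplified, hypothetical illustration only; not any panelist's method.
target_return_on_capital = 0.10        # assumed cost of capital
premium_to_surplus_ratio = 2.0         # assumed leverage
investment_income_pct_premium = 0.03   # assumed investment income on policyholder funds

capital_to_premium = 1.0 / premium_to_surplus_ratio
required_total_return_pct_premium = target_return_on_capital * capital_to_premium
underwriting_profit_provision = required_total_return_pct_premium - investment_income_pct_premium

print(f"required total return (% of premium): {required_total_return_pct_premium:.1%}")
print(f"indicated UW profit provision:        {underwriting_profit_provision:.1%}")
# 10% x 0.5 = 5% of premium is needed in total; 3% comes from investment
# income, leaving a 2% underwriting profit provision in this toy example.
```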

Part I - Introduction and Basics

This beginner-level course covers the basics of the R language, including introductions to loading data, performing calculations, creating graphs, and using simple statistical functions. Exercises will cover common tasks that actuaries may have to perform in their analyses. The course is a lead-in to the Predictive Modeling course by Jim Guszcza but is not a prerequisite. This first section of the workshop day will give attendees an introduction to R, including its history and language features. The section is designed to spend time on areas important to first-time users (Booting Up, Getting Help, Loading/Saving, Interactive vs. Batch, Packages). Finally, participants will spend ample time working through exercises on Basics, Simple Calculations, Control Structures, Vectors, and Lists; exercises include Loss Data to Ultimate Losses and Loss Reserving Distribution.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Yi Jing

Network Security & Privacy Liability: An Overview of Loss Mitigation Concepts and Data Breach Response Services

This session is designed to provide an overview of the various coverages available prior to a breach incident, as well as the post-breach response services that have become a critical component of the overall policy. The session will look at the concept of a Data Breach Team and the benefits of partnering with a carrier who can provide access to a variety of breach response vendors in the event of a breach. The session will also provide a general market update on the network security and privacy liability environment – it's not just “cyber” anymore.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Yi Jing
Panelists: Brian Cole

Loss Cost Modeling vs. Frequency and Severity Modeling

In recent years, loss cost modeling using the Tweedie distribution has been gaining popularity in GLM-based predictive modeling practice, alongside frequency and severity modeling. This leaves us with two widely used GLM design approaches, each with its own strengths and weaknesses. The frequency-severity approach models claim frequency (claim count over exposure) and claim severity (incurred loss over claim count) separately, then combines the results to create loss cost estimates. Enhancements of the basic frequency-severity approach include the creation of modeled loss cost datasets to facilitate offsetting and special treatment or modeling of high-severity losses. The loss cost approach uses loss cost data (incurred loss over exposure) directly as the target variable in the model. In this session, we will discuss the pros and cons of the two model design approaches in the context of a class plan study or underwriting tier development. The discussion will cover both business and statistical considerations for personal and commercial lines, and we will use multiple data sources to illustrate comparisons and support our findings.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Yi Jing
Panelists: Jun Yan
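
A minimal sketch, on synthetic data with one invented rating variable, of the two design approaches compared in the abstract above: separate Poisson frequency and Gamma severity GLMs combined into a pure premium, versus a single Tweedie loss cost GLM. Package choices and parameters are illustrative only.

```python
# Illustrative sketch only: frequency-severity GLMs vs. a Tweedie loss cost GLM.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 20_000
df = pd.DataFrame({
    "exposure": rng.uniform(0.5, 1.0, n),
    "driver_age": rng.integers(18, 80, n),
})
freq = np.exp(-2.0 - 0.01 * (df.driver_age - 18)) * df.exposure
df["claim_count"] = rng.poisson(freq)
df["incurred_loss"] = np.where(
    df.claim_count > 0,
    rng.gamma(shape=2.0, scale=2500.0, size=n) * df.claim_count, 0.0)
df["loss_cost"] = df.incurred_loss / df.exposure

# Approach 1: separate frequency and severity models, combined afterward.
freq_model = smf.glm("claim_count ~ driver_age", data=df,
                     family=sm.families.Poisson(),
                     offset=np.log(df.exposure)).fit()
sev_df = df[df.claim_count > 0].copy()
sev_df["severity"] = sev_df.incurred_loss / sev_df.claim_count
sev_model = smf.glm("severity ~ driver_age", data=sev_df,
                    family=sm.families.Gamma(sm.families.links.Log()),
                    freq_weights=sev_df.claim_count).fit()

# Approach 2: model loss cost directly with a Tweedie GLM.
tweedie_model = smf.glm("loss_cost ~ driver_age", data=df,
                        family=sm.families.Tweedie(
                            link=sm.families.links.Log(), var_power=1.6),
                        var_weights=df.exposure).fit()

new = pd.DataFrame({"driver_age": [25, 45, 65], "exposure": 1.0})
pp_freq_sev = (freq_model.predict(new, offset=np.log(new.exposure))
               * sev_model.predict(new))
pp_tweedie = tweedie_model.predict(new)
print(pd.DataFrame({"freq_sev": pp_freq_sev, "tweedie": pp_tweedie}))
```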

Introduction to Bayesian Modeling for Ratemaking

Bayesian methods have undergone a dramatic resurgence in recent decades, sparked by the introduction in 1990 of Markov Chain Monte Carlo (MCMC) simulation as a method for estimating Bayesian posterior distributions. With the advent of MCMC, statisticians suddenly possessed a powerful computational tool for solving hitherto intractable Bayesian problems. However, MCMC methods have received relatively little use within the actuarial community. While this is ironic given the profession’s longstanding Bayesian heritage in the form of credibility theory, it is perhaps unsurprising given the technical nature of Bayesian computation. This session will present the fundamentals of Bayesian theory and computation with an emphasis on their key component: hierarchical models. One major theme of the session will be Bayesian prior probabilities as a formal mechanism for incorporating relevant information not captured in one’s data, thereby enabling one to overcome ambiguous or false indications due to sparse data. Another major theme will be hierarchical models as the next logical step in the evolution of GLM-based ratemaking, unifying credibility and GLM modeling methods. The session will be highly practical, emphasizing fundamental concepts and intuition above mathematical formalism. A number of ratemaking examples will be presented to illustrate the practical benefits of the Bayesian approach.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Edward Stone
Panelists: James Guszcza
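
As a small illustration of the credibility connection mentioned above, the sketch below works through a conjugate gamma-Poisson model, whose posterior mean is exactly a credibility-weighted blend of the data and the prior; the numbers are invented, and real hierarchical GLMs would typically be estimated by MCMC rather than in closed form.

```python
# Illustrative sketch only: conjugate gamma-Poisson, the simplest Bayesian
# analogue of classical credibility. Numbers are invented.
import numpy as np

# Prior belief about a class's claim frequency: mean 5/100 = 0.05 per exposure.
prior_shape, prior_rate = 5.0, 100.0

# Observed experience for one risk class (sparse data): frequency 0.075.
claims, exposures = 9.0, 120.0

# Conjugacy: the posterior is gamma(shape + claims, rate + exposures).
post_shape = prior_shape + claims
post_rate = prior_rate + exposures
posterior_mean = post_shape / post_rate

# The posterior mean is a credibility-weighted blend of data and prior.
Z = exposures / (exposures + prior_rate)       # credibility factor
blend = Z * (claims / exposures) + (1 - Z) * (prior_shape / prior_rate)

print(f"posterior mean frequency: {posterior_mean:.4f}")
print(f"credibility blend (Z={Z:.2f}): {blend:.4f}")   # identical by construction

# For models without closed forms (e.g., hierarchical GLMs), the same
# posterior would typically be estimated by MCMC simulation instead.
draws = np.random.default_rng(0).gamma(post_shape, 1.0 / post_rate, size=10_000)
print(f"simulated posterior mean: {draws.mean():.4f}")
```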

Insurance Modeling with the Tweedie Compound Poisson Distribution

The Tweedie compound Poisson distribution has been used extensively in insurance data modeling because it corresponds to the underlying loss generating process. The distribution places a probability mass at the origin accompanied by a skewed continuous distribution on the positive values. This mixed-type distribution results from mixing two underlying components - the Poisson and Gamma distributions - where the mixing behavior is determined by an index parameter. However, since the density of the compound Poisson distribution is not analytically tractable, full statistical inference remains challenging when the index parameter is not known, and research on compound Poisson generalized linear mixed models and Bayesian models is fairly sparse. In this session, we give a detailed account of the challenges insurance modelers currently face with this distribution, and present a latent variable approach that lays down a unified and flexible framework for statistical inference in the compound Poisson distribution. The proposed latent variable method is intuitively appealing and results in coherent parameter estimates. Our discussion includes generalized linear models, hierarchical models, and Bayesian methods, and an extensive list of examples will be presented, many of which are real-life applications.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Edward Stone
Panelists: Yanwei Zhang
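
A minimal simulation sketch of the mechanism described above: a Poisson number of gamma claims produces a point mass at zero plus a skewed continuous part, with the index parameter implied by the gamma shape. Parameters are invented, and the reparameterization shown is the standard one, not the session's latent variable method.

```python
# Illustrative sketch only: simulating the Tweedie compound Poisson mechanism.
import numpy as np

rng = np.random.default_rng(3)
lam = 0.3                          # Poisson claim frequency
alpha, gamma_scale = 2.0, 5000.0   # gamma severity shape and scale

n = 500_000
counts = rng.poisson(lam, size=n)
# A sum of k independent gamma(alpha, scale) claims is gamma(k*alpha, scale).
losses = np.where(counts > 0,
                  rng.gamma(alpha * np.maximum(counts, 1), gamma_scale), 0.0)

# The point mass at zero is the Poisson probability of no claims.
print(f"share of exact zeros: {(losses == 0).mean():.4f} vs exp(-lam) = {np.exp(-lam):.4f}")

# Standard reparameterization: index p and dispersion phi implied by
# (lam, alpha, gamma_scale); the Tweedie variance is phi * mean**p.
p = (alpha + 2) / (alpha + 1)
mu = lam * alpha * gamma_scale
phi = gamma_scale / ((p - 1) * mu ** (p - 1))
print(f"index parameter p = {p:.3f}")
print(f"theoretical variance phi*mu**p = {phi * mu ** p:,.0f}")
print(f"simulated variance             = {losses.var():,.0f}")
```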

GLM III - The Matrix Reloaded

This session will consider new techniques and refinements to the basic GLM which can add material value to the modeling process. It will specifically consider amendments which address some of the purported failings of GLMs in comparison to emerging methods such as machine learning techniques. The session will include a discussion of:
• an innovative approach to detecting subtle and higher-dimensional interactions in an efficient way, potentially eliminating the need to consider alternative, harder-to-implement model forms such as nonlinear models
• the role of such automated methods in comparison with more manual construction of composite explanatory variables
• ways to mitigate the risk of over-parameterization through the use of modifications which incorporate elements of credibility within the GLM framework
• simple practical modeling steps that can be used to remove distortions created by combining models across claim types
• innovative ways of modeling bodily injury claims
• other miscellaneous refinements
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Edward Stone
Panelists: Duncan Anderson
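
As a generic illustration of interaction testing (not the session's specific approach), the sketch below fits a Poisson GLM with and without an interaction term on synthetic data and compares them with a likelihood-ratio test and AIC.

```python
# Illustrative sketch only: testing an interaction term in a GLM.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(4)
n = 20_000
df = pd.DataFrame({
    "young": rng.integers(0, 2, n),
    "sports_car": rng.integers(0, 2, n),
})
# The true frequency includes a young x sports_car interaction.
mu = np.exp(-2.5 + 0.3 * df.young + 0.2 * df.sports_car
            + 0.5 * df.young * df.sports_car)
df["claims"] = rng.poisson(mu)

base = smf.glm("claims ~ young + sports_car", data=df,
               family=sm.families.Poisson()).fit()
inter = smf.glm("claims ~ young * sports_car", data=df,
                family=sm.families.Poisson()).fit()

lr_stat = 2 * (inter.llf - base.llf)
p_value = stats.chi2.sf(lr_stat, df=1)
print(f"LR statistic = {lr_stat:.1f}, p-value = {p_value:.4f}")
print(f"AIC without interaction: {base.aic:.1f}   with: {inter.aic:.1f}")
```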

Estimating Systemic Risk for Professional Lines

This session will examine the actual and potential impact of systemic risk factors on Professional Lines. The session will view systemic risk from an industry-wide perspective (i.e., viewing insurance as the system creating the risk) and a financial-system-wide perspective (i.e., systemic risks such as those that caused the financial crisis). In this session we will demonstrate open source software, developed in a research project funded by the North American Actuarial Council, that can be used to model extreme financial scenarios. We will then discuss the possible effect of the scenarios on Professional Lines, including:
• Effect on inflation (economy-wide and for professional lines)
• Effect on interest rates
• Effect on the overall economy
We will also discuss systemic events and factors that impact professional lines, such as:
• Tort reform
• Legal climate
• Emerging exposures
• Underwriting cycle
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Jennifer Caulder
Panelists: Louise Francis

Emerging Trends in Auto Related Medical Claims Payments - or UCR After Ingenix

On October 27, 2009, New York Attorney General Cuomo issued a press release announcing 'nationwide reform of the consumer reimbursement system for out-of-network health care charges'. This action found that the Ingenix MDR databases, commonly used to reimburse out-of-network physicians and hospitals, were systematically flawed. The key issues in the finding were:
1. Ingenix is owned by United Healthcare, which constituted a conflict of interest.
2. The data was supplied by the same insurance customers that use the data, which created an incentive to skew the supplied data.
3. The UCR methodology was proprietary and inaccessible.
AG Cuomo's findings led to several lawsuits, which were combined into a class action. While the action was directed at Health Insurers, it turns out that Auto Insurers are also big users of the Ingenix MDR product, since they do not maintain nationwide negotiated contracts with physicians and hospitals. Most insurers settled the suit by contributing money to establish a new database at SUNY and agreeing to stop using the MDR product. This is the new FAIR database. The presentation could cover the following items:
1. What UCR data for medical-related claims in casualty lines (e.g., Auto Insurance, Workers' Compensation) are available?
2. A brief history of UCR data, why it is needed, and how it is used.
3. The issues with Ingenix.
4. Attributes of the FAIR database.
5. Alternatives to FAIR.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: David Williams

Distracted Driving, Text Data, and Predictive Analytics

The distraction that can be caused by cell phone use while driving continues to attract considerable attention from government officials, auto insurers, auto manufacturers, suppliers of cell phone equipment and services, and policymakers. Recently, the National Highway Traffic Safety Administration (NHTSA) issued a policy statement recommending that states prohibit novice drivers from using electronic communication devices (including cell phones). Studies are appearing in the trade and academic literature debating whether cell phone use is a significant causal factor in auto accidents, as well as the influence that driver and environmental factors may have on the usage-causal relationship. A challenge in performing analytics for accidents that might be caused by distractions is that much of the causal information may not be available in the structured-data fields typically captured by auto insurers, but instead resides in the descriptions (text data, including case adjuster notes and other reports) of the accidents. The NHTSA has developed a database with detailed information for a representative sample of almost 7,000 accidents. The database includes structured-data fields that are commonly part of automobile accident claim systems, as well as lengthy text (unstructured data) descriptions of the accidents. These text descriptions contain information important to identifying the causal factors of an accident that may not be captured in the structured data. The two purposes of this session will be to (1) explain how text data can be extracted and organized to gather information not readily available in structured data and (2) demonstrate how information captured in unstructured data can improve the results from multivariate, predictive analytics. The objective will be to demonstrate how the results from predictive analytics can be improved using data that are usually not readily accessible to the analyst (because text data is in an unstructured form).
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Philip Borba
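
A minimal sketch of the general idea of combining structured fields with features extracted from unstructured text; the claims, notes, and model are invented toy examples, not the NHTSA data or the presenters' methodology.

```python
# Illustrative sketch only: structured fields + simple text features from
# invented adjuster notes, used to predict a distraction flag.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from scipy.sparse import hstack, csr_matrix

claims = pd.DataFrame({
    "driver_age": [19, 45, 23, 60, 31],
    "night_time": [1, 0, 1, 0, 1],
    "notes": [
        "driver admitted texting before rear-end collision",
        "vehicle skidded on wet pavement, no distraction noted",
        "witness saw driver looking at phone prior to impact",
        "mechanical brake failure reported by repair shop",
        "driver reaching for dropped cell phone, crossed center line",
    ],
    "distraction_involved": [1, 0, 1, 0, 1],
})

# Text features: bag-of-words counts from the unstructured notes.
vectorizer = CountVectorizer(lowercase=True, stop_words="english")
text_features = vectorizer.fit_transform(claims["notes"])

# Combine the text features with the structured fields.
structured = csr_matrix(claims[["driver_age", "night_time"]].values)
X = hstack([structured, text_features])

model = LogisticRegression(max_iter=1000).fit(X, claims["distraction_involved"])
print("in-sample predictions:", model.predict(X))
```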

Data Management Call Paper Session #1: Social Media Analytics: Data Mining Applied to Insurance Twitter Posts

The use of social media has grown significantly in recent years. With the growth in its use, there has also been a substantial growth in the amount of information generated by users of social media. Insurers are making significant investments in social media, but many are not systematically analyzing the valuable information that is resulting from their investments. This session discusses the application of correlation, clustering, and association analyses to social media. This is demonstrated by analyzing insurance Twitter posts. The results of these analyses help identify keywords and concepts in the social media data, and can facilitate the application of this information by insurers. As insurers analyze this information and apply the results of the analysis in relevant areas, they will be able to proactively address potential market and customer issues more effectively.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Roosevelt Mosley
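
A small sketch of the clustering idea mentioned above, applied to a handful of invented insurance-related posts with TF-IDF features and k-means; it is illustrative only and not the paper's analysis.

```python
# Illustrative sketch only: clustering invented posts to surface themes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

tweets = [
    "my claim was handled quickly, great service",
    "still waiting three weeks for my claim payment",
    "just got a quote, rates went up again this year",
    "shopping around because my premium increased",
    "adjuster was friendly and resolved my claim fast",
    "why did my auto premium go up with no accidents",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Show the top terms per cluster to label the themes (e.g., claims
# service vs. price concerns).
terms = vectorizer.get_feature_names_out()
for k in range(2):
    center = km.cluster_centers_[k]
    top = [terms[i] for i in center.argsort()[::-1][:4]]
    print(f"cluster {k}: {top}")
```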

Credit Scoring

The use of credit-based scores for determining insurance rates remains a controversial and unresolved issue. In this session, we bring together a proponent, an opponent, and a moderator who has dealt with insurance from both a regulatory and an insurance company viewpoint. They will present their positions and provide an opportunity for the audience to hear and explore various positions on this issue. Birny Birnbaum, Executive Director of the Center for Economic Justice, will present consumer concerns about insurers' use of consumer credit information for underwriting and rating homeowners and auto insurance. Birny will also discuss risk classification issues beyond consumer credit information, including appropriate actuarial and public policy standards for evaluating the reasonableness of emerging risk classifications. John B. Wilson, Director of Insurance Analytics at LexisNexis Risk Solutions, will present the advantages of using consumer credit information to underwrite and rate insurance policies. Chet Szczepanski, who has dealt with insurance scores both as a regulator and as the Chief Actuary of the Donegal Insurance Group, will serve as moderator. He will facilitate the discussion and raise awkward questions.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Birny Birnbaum

COTOR Update: The Liquidity Risk Premium Project

This session covers the results of the Liquidity Risk Premium Research Project of the Committee on Theory of Risk. Liquidity, Credit, and Market risks are considered together in pricing assets. A novel theory of a two-price market, based on the liquidity measures of bid-ask spreads, becomes the setting to untangle the three risks. An application to reinsurance will be presented using distortion probabilities and catastrophe loss distributions. The Project reports are available at  www.casact.org/liquidity.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Philip Heckman
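
As a loosely related illustration of pricing with distortion probabilities, the sketch below applies a Wang-transform distortion to a simulated catastrophe loss distribution; the loss model and the distortion parameter are invented, and the project's two-price (bid-ask) framework is considerably richer than this.

```python
# Illustrative sketch only: a Wang-transform distortion applied to a toy
# catastrophe loss distribution to produce a risk-loaded price.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_years = 20_000
# Toy cat model: a Poisson number of events per year with lognormal severities.
events = rng.poisson(0.8, size=n_years)
annual_loss = np.array([rng.lognormal(mean=16.0, sigma=1.2, size=k).sum()
                        for k in events])

def wang_price(losses, lam):
    """Distorted expectation: sort losses descending and weight the k-th
    largest by g(k/n) - g((k-1)/n), with g(p) = Phi(Phi^-1(p) + lam)."""
    y = np.sort(losses)[::-1]
    n = len(y)
    g = norm.cdf(norm.ppf(np.arange(n + 1) / n) + lam)
    return float((y * np.diff(g)).sum())

print(f"expected annual loss:           {annual_loss.mean():,.0f}")
print(f"Wang-distorted price (lam=0.5): {wang_price(annual_loss, 0.5):,.0f}")
```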

Commercial Lines External Data Tools

This session will discuss how commercial lines insurance has followed the example of personal lines in discovering the power of credit-based rating and underwriting tools. Middle market accounts have known the benefits of commercial credit scores and related information for years; however, some of these tools were not as effective for smaller accounts. Insurance programs for smaller commercial lines insureds have since adopted a wide variety of tools to increase the predictive power of credit-based rating. This session will examine both traditional and non-traditional tools used for rating and underwriting commercial insurance products. Both the financial data available to insurers and the techniques used to incorporate them into commercial lines pricing tools will be presented.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent

Discussion of Using Tiers for Insurance Segmentation from Pricing, Underwriting, and Product Management Perspectives

Tier, as a composite variable based on multiple tier elements, has become popular over the last several years as a way for the P&C insurance industry to segment business. It started in personal auto rating as a new component for aggregating the pricing impact of non-traditional rating variables to expand the price range and increase pricing points. Lately, it has been applied to commercial lines. Its application is also now beyond pricing; it appears to have become one of the critical strategic elements in helping underwriting and product management react more effectively to market competition and regulation. In this presentation, we will discuss several frequently asked questions regarding tier applications:
• How to effectively design a tiering structure?
• How to integrate tier with pricing and/or underwriting? For example, should pricing-oriented tiers be created for manual and class pricing or for underwriting-driven pricing?
• How to decide the optimal number of tiers?
• For commercial lines, how to integrate tiers with other pricing and underwriting components, such as schedule modifications?
• How to embed underwriting tiers for multiple writing companies?
• Should pricing tiers be developed based on a pure premium approach or a loss ratio approach?
• What are the differences between personal lines tier applications and commercial lines tier applications?
We will discuss the above through various design options with their pros and cons, and we will use numerical examples to support the discussion.
Source: 2012 Ratemaking and Product Management Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Jun Yan
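
A minimal sketch of one way a tier can be built as a composite variable, as the abstract above describes: score each policy from a few hypothetical tier elements and bucket the score into tiers by quantile. The elements, point values, and tier count are all invented.

```python
# Illustrative sketch only: a composite tier score bucketed into tiers.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n = 10_000
policies = pd.DataFrame({
    "prior_carrier_years": rng.uniform(0, 10, n),
    "lapse_in_coverage": rng.integers(0, 2, n),
    "payment_plan_full": rng.integers(0, 2, n),
})

# Composite score: hypothetical point values per tier element.
policies["tier_score"] = (
    -2.0 * policies.prior_carrier_years
    + 5.0 * policies.lapse_in_coverage
    - 3.0 * policies.payment_plan_full
)

# Bucket the score into five tiers of roughly equal volume.
policies["tier"] = pd.qcut(policies.tier_score, q=5,
                           labels=["A", "B", "C", "D", "E"])
print(policies.groupby("tier", observed=True)["tier_score"]
      .agg(["count", "mean"]))
```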

Managing Cat Risk: It's All about the Portfolio

Traditional actuarial techniques, as well as standard casualty actuarial training and education, rely heavily on the law of large numbers. As such, many pricing actuaries focus on the expected losses and expenses of a book of business and expect volatility to decrease as the volume of business increases. Catastrophe risk, on the other hand, is a low-frequency, high-severity risk, and so the risk is an increasing function of business volume - the more business is written, the higher the probability of capital impairment. Unlike standard insurance risk, mis-pricing of catastrophe risk in and of itself doesn't really cause any problems - unless the mis-pricing leads to a suboptimal portfolio, either by underestimating the exposure and accumulating more risk than desired or by overestimating the loss potential and turning away profitable business. There will never be enough premium to cover the actual losses from a catastrophic event (or any low-frequency, high-severity event). Premium for catastrophic event coverage is meant to provide an appropriate return on the capital supporting the cover and to pay for any reinsurance or retrocessional purchases. This session will discuss the interactions between pricing and portfolio management, and touch upon some ERM issues related to catastrophe risk.
Source: 2012 In Focus Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: Kevin Madigan, Thomas Le
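
A toy simulation of the abstract's central point - that attritional risk diversifies as volume grows while catastrophe risk does not - under the simplifying assumption that capital is held fixed as the book grows. Every frequency, severity, and capital figure is invented.

```python
# Illustrative sketch only: attritional vs. catastrophe impairment as volume grows.
import numpy as np

rng = np.random.default_rng(7)
n_sims = 100_000
capital = 100_000.0              # fixed capital supporting the book
margin_per_policy = 50.0         # premium minus expected attritional loss

for n in (1_000, 5_000, 20_000):
    buffer = margin_per_policy * n + capital     # funds available before impairment

    # Attritional: many independent per-policy losses, so the aggregate's
    # relative volatility shrinks roughly like 1/sqrt(n).
    attritional_excess = rng.normal(0.0, 5_000.0 * np.sqrt(n), n_sims)

    # Catastrophe: one shared event hits the whole portfolio at once, so the
    # loss scales linearly with the book and never diversifies away.
    event = rng.random(n_sims) < 0.02                    # 1-in-50-year event
    cat_loss = event * rng.lognormal(0.0, 0.8, n_sims) * 100.0 * n

    p_attr = (attritional_excess > buffer).mean()
    p_cat = (cat_loss > buffer).mean()
    print(f"{n:>7,} policies: P(impair | attritional) = {p_attr:.4f}   "
          f"P(impair | catastrophe) = {p_cat:.4f}")
```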

Is That a Worm or a Bot? Cyber Liability Insurance

Cyber Liability and Network Security insurance are new coverage concepts that are evolving at an amazing pace. There are many questions surrounding these concepts, such as: What are the risks and how do you insure them? What is the coverage and what types of businesses need to have it? What does this mean for your company, be it a primary insurer, a reinsurer, a broker, or a consultant? And, most importantly, what does it mean to your clients? Our speaker will present current trends for this line of business as well as the pain points of the marketplace as he answers these and other important questions. The speaker is knowledgeable in this fast-growing coverage area and will share his experience from the point of view of both a primary insurer and a reinsurance company.
Source: 2012 In Focus Seminar
Type: concurrent
Moderators: Bill Lakins
Panelists: John Merchant