Introduction to Multilevel/Hierarchical Modeling
The multilevel/hierarchical (a.k.a. "mixed effects") modeling framework is a powerful and intuitive generalization of the linear modeling framework that has become a cornerstone of much actuarial work. Yet it has received relatively scant attention in actuarial publications and seminars. Hierarchical models apply when one's data is naturally structured in groups (e.g., repeated observations for each policy, policies within territories) and one would like the model coefficients to reflect this group structure. This is achieved by allowing some of the model parameters to vary by group. This session will sketch some fundamental concepts of hierarchical models, discuss some of their many practical applications in actuarial science, and draw a connection between the theory of hierarchical models and Bayesian credibility theory.
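The shrinkage idea behind hierarchical models, and their connection to credibility, can be sketched in a few lines. Below is a minimal, illustrative Python sketch (not material from the session; the function name and constant k are invented for illustration): each group's estimate is a blend of its own mean and the grand mean, with credibility weight Z = n / (n + k) as in Buhlmann credibility.

```python
# Illustrative sketch of partial pooling / Buhlmann credibility:
# groups with little data are shrunk toward the overall mean.

def partial_pool(groups, k):
    """Credibility-weighted (partially pooled) mean for each group.

    groups: dict mapping group id -> list of observations
    k:      credibility constant (ratio of within- to between-group variance)
    """
    all_obs = [x for obs in groups.values() for x in obs]
    grand_mean = sum(all_obs) / len(all_obs)
    estimates = {}
    for g, obs in groups.items():
        n = len(obs)
        z = n / (n + k)  # credibility weight: more data -> closer to group mean
        group_mean = sum(obs) / n
        estimates[g] = z * group_mean + (1 - z) * grand_mean
    return estimates

# A thinly observed group ("A") is pulled strongly toward the grand mean,
# while a well-observed group ("B") keeps most of its own experience.
est = partial_pool({"A": [10.0, 12.0], "B": [20.0] * 50}, k=10)
```

This is exactly the "parameters varying by group" idea: rather than fully pooling the data (one estimate for everyone) or fully separating it (one independent estimate per group), the hierarchical model lands in between, with the data volume deciding where.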
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
James Guszcza, Bill Stergiou
Use of Scoring in Underwriting Analytics and Marketing
Customer life cycles offer many opportunities for improving profitability: prospecting, lead generation, risk segmentation and selection, acquisition, retention, cross-sell, up-sell, attrition, and win-back. But the opportunity is not limited to the contractual relationship; price elasticity and brand value must also be considered over multiple policy periods, for the customer's "lifetime." We will discuss how to define a customer, and then how predictive models and operational implementation can improve your company's profitability, both immediately and in the future.
In addition, the marketing department of an insurance organization seeks alignment with their underwriting department in terms of what attributes are associated with a “good” risk when pursuing new business. External predictors used in a company’s underwriting/pricing models can be leveraged to achieve better alignment. Potential new accounts can be scored and ranked based on the likelihood of passing through a company’s underwriting filter. The accounts can also be compiled reflecting the relevant distribution channel. The distribution channel factors considered include proximity, type of account, and referral information related to the targeted account.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Bethany Cass
Panelists:
Martin Ellingsworth, Gary Ciardiello
Optimization of Distribution Channels using Statistical Techniques
Insurance companies strive to measure the performance of their distribution channels. Predictive modeling and other statistical techniques can be used to generate growth and increase profitability while controlling the cost of the strategies used. This session will address some of the techniques that can be used to design strategies in distribution system areas such as agency management and direct mail campaigns. The session will also address how the effectiveness of these strategies can be measured.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Marina Ashiotou, Benny Yuen
Communicating Predictive Modeling Results
Predictive modeling is highly technical work. The successful implementation of a predictive modeling project often relies on communicating project results to a less technical audience. Graphical presentation of results is thus a key communication tool for predictive modeling work. In this session, the presenters will draw on their experience from a variety of predictive modeling projects in order to demonstrate a number of graphical presentation methodologies that they have found critical for proper presentation of model results. This will include techniques to understand key aspects of the data, to identify and analyze predictor variables, and to summarize key model results to senior management. Selected elements of the presentation will be in case-study format.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Bethany Cass
Panelists:
William Carpenter, Marlowe Leibensperger, Marina Ashiotou
Text Mining on Unstructured Data
It is generally believed that about 80% of all data is unstructured. Unstructured data is data that is not stored in a specified format in which each value has a definite meaning to users, as structured data in a computerized database typically is. Unstructured data includes free-form claim description fields in a claims database, underwriter notes in an underwriting file, the titles and contents of e-mails, answers to open-ended survey questions, and words or phrases typed into a search engine. Unstructured data is often ignored when performing predictive modeling analyses. This session will give an overview of some kinds of unstructured data and how they can be used in predictive modeling. Specific examples of the application of text mining will be provided. Presenters will also provide references and discuss some of the key literature on the topic.
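As a purely illustrative sketch (the function name and sample claims are invented, not the presenters' method), one basic text-mining step turns free-form claim descriptions into word-count features that a predictive model could consume:

```python
# Bag-of-words featurization of free-form claim description text.
import re
from collections import Counter

def bag_of_words(descriptions):
    """Tokenize claim descriptions and count term frequencies per record."""
    features = []
    for text in descriptions:
        tokens = re.findall(r"[a-z]+", text.lower())  # crude lowercase tokenizer
        features.append(Counter(tokens))
    return features

claims = [
    "Rear-end collision, minor bumper damage",
    "Water damage to kitchen floor from burst pipe",
]
feats = bag_of_words(claims)
```

Real applications layer much more on top (stop-word removal, stemming, dimension reduction), but the output at each stage is the same in spirit: numeric features derived from text, ready to join the structured predictors in a model.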
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Virginia Prevosto
Panelists:
Louise Francis, Philip Borba, Karthik Balakrishnan
Underwriting Cycles - Are They Unavoidable?
Alternating periods of hard and soft markets have been a notable, and sometimes painful, characteristic of the P&C insurance business.
Over the past few years, prices have declined, coverages provided may have increased, and for workers compensation, residual markets have generally depopulated. Will the market turn in 2009 or 2010, especially with investment losses reducing capital in the insurance industry?
Our panel of practitioners and observers will explore the environmental, structural, and behavioral factors that give rise to the cycle and will examine the situation as it exists today. They will also tackle the question, "Is the underwriting cycle unavoidable?"
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Virginia Prevosto
Panelists:
Matthew Mosher, Spencer Gluck
GLM III
GLM III will cover additional refinements, such as regression splines and how to use them to improve models, as well as investigating the appropriateness of a multiplicative model structure, how to combine GLMs across multiple claim types, and the use of the offset term to constrain models. Further refinements discussed will include techniques for modeling large claims, practical model validation approaches, and specific issues that arise when modeling price demand elasticity with GLMs, which is of particular importance when undertaking price optimization analyses.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Duncan Anderson
Marrying Underwriter Intuition & Predictive Modeling - A Workers Compensation Perspective
As predictive analytics and GLM increase in popularity, there is an increasing challenge associated with optimizing the acceptance and integration of these new techniques into sound underwriting processes and decision-making structures. All too often these new tools are met by underwriters with the same skepticism as any other challenge to their established intuitive norms for risk selection.
This presentation will discuss how both actuaries and underwriters can engage each other in cooperative ways through predictive analytic endeavors to both sharpen the underwriters' intuitions and skills at risk selection and increase the real-world, business-case perspective of the predictive modeler. It will explore why no predictive model can be sustained in a vacuum as well as why no underwriter should address any predictive modeling project with anything less than a strong enthusiasm for the insights that can be garnered through the process. Finally, the presentation will give examples of how predictive models can help to mitigate some standard intuitive underwriting decision traps and how an underwriter's intuition into the daily use cases for a predictive model can help stave off degradation of model stability.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Virginia Prevosto
Panelists:
Gaetan Veilleux, Matt Frazier
Keynote Speaker: Ian Ayres
The RPM seminar will kick off with a dynamic keynote address by Ian Ayres, author of Super Crunchers. Super crunching - analyzing massive databases to inform real-world decisions - is a growing trend, and it shows why thinking by numbers is the new way to be smart. Mr. Ayres is a preeminent expert on new methods of prediction and decision making that are changing not only the way decisions are made, but the decisions themselves. He is a columnist for Forbes magazine, a regular commentator on public radio's Marketplace, and regularly writes op-eds for the New York Times. His research has been featured on Prime Time Live, Oprah, and Good Morning America, and in Time and Vogue magazines. For further information about Ian Ayres, please visit www.ianayres.com. Early registrants can win big at this year's RPM Seminar! The first 250 attendees to register for the seminar will receive a brand-new copy of Super Crunchers. Please do not hesitate to sign up early for this great seminar and leave Vegas winning.
Source:
2009 Ratemaking and Product Management Seminar
Type:
keynote
Moderators:
Virginia Prevosto
California Auto Class Plan after July 14, 2008, Including a Case Study
On July 14, 2008, the state of California required all insurers to comply with new auto rating factor regulations (California Code of Regulations, Title 10, Section 2632) specifying that a constrained sequential analysis process be followed in the development of classification plans. The regulations' intent is to ensure that the safety record, mileage, and years licensed rating factors are developed in such a way as to maximize their "contributions," removing previously allowable rating factors and restricting the contributions of the 16 remaining allowable rating factors so that territory has the least influence. This session briefly describes the California auto rating factor regulations and reviews the filings of the largest California auto insurance companies, including effects on rate levels, class plan compliance, approval status, and policyholder rate dislocation. Also discussed will be potential market effects of the new restrictions, including pushback from the CDI on rate changes, the creation of new areas of profitability and unprofitability, and a focus on accurate mileage data. Finally, the session will include suggestions on how to achieve profits in a chaotic and actuarially unsound market by measuring competitiveness, rate adequacy, and available opportunity. A case study will illustrate how optimal rates can be developed within the restrictions of the California Code of Regulations.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Virginia Prevosto
Panelists:
Susan Miller, Scott Sobel, Mark Jones, Paul Stahlschmidt
Predictive Modeling for Personal Lines
This session will discuss a novel approach to predicting the loss cost for a single policy, with examples taken from actual models for personal auto and homeowners insurance. The approach involves separately modeling frequency and severity using detailed data that pertains to individual policies. The session will describe 1) the types of GLMs used for frequency and severity modeling; 2) an overview of the data used to make the loss cost predictions; 3) special modeling problems, such as partial-year exposures; 4) partial residual plots to diagnose the effectiveness of individual variables or groups of variables; and 5) graphical plots that indicate the effectiveness of the overall model.
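To illustrate the frequency/severity split described above, here is a minimal, hypothetical Python sketch (the coefficients, variable, and function names are invented for illustration, not taken from the actual models): separate log-linear frequency and severity predictions multiply into a loss cost, and a partial-year exposure scales the result.

```python
# Hypothetical frequency x severity loss cost prediction with one
# made-up rating variable ("urban") and log-link (multiplicative) factors.
import math

FREQ_COEF = {"base": math.log(0.05), "urban": 0.20}   # claims per exposure
SEV_COEF = {"base": math.log(4000.0), "urban": 0.10}  # cost per claim

def predict_loss_cost(urban, exposure):
    """Expected loss = frequency * severity, scaled by earned exposure."""
    freq = math.exp(FREQ_COEF["base"] + (FREQ_COEF["urban"] if urban else 0.0))
    sev = math.exp(SEV_COEF["base"] + (SEV_COEF["urban"] if urban else 0.0))
    return exposure * freq * sev  # partial-year exposure scales linearly

full_year_urban = predict_loss_cost(urban=True, exposure=1.0)
half_year_rural = predict_loss_cost(urban=False, exposure=0.5)
```

Modeling frequency and severity separately, then multiplying, lets each component have its own predictors and error structure, which is one motivation for the approach the session describes.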
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
A. Cummings, Mark Richards
Homeowners Catastrophe Ratemaking - Risk Load and Reinsurance Costs
The volume of insurance-linked securities available in the capital markets is growing. In this session, available catastrophe bond data from the capital markets will be used to quantify the cost of catastrophe risk for property insurance. Several applications will be presented, including quantifying risk loads and evaluating the cost of catastrophe reinsurance. The panel will also examine ways to allocate the risk loads and reinsurance costs by geographic area.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
David Foley
Panelists:
David Appel
Price Optimization versus Regulation
This session will address the often conflicting requirements of statutory rate standards, actuarial standards of practice, and market realities. How does one balance price optimization with actuarial soundness? How does one satisfy regulatory requirements while achieving desired business results? The interrelationships among these concepts will be covered, along with suggested solutions.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
David Foley
Panelists:
Cara Blank, Arthur Schwartz
Predictive Modeling for Small Companies
Predictive modeling for small companies can be a challenge. Most small companies do not have the resources or in-house expertise to do predictive modeling. In addition, many small companies feel that they do not have enough data to obtain a credible result. Several case studies will be presented to illustrate how a smaller company can overcome these obstacles and reap the benefits of predictive modeling.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Robert Race
Panelists:
Donald Closter, Thomas Boyer
Price Optimization for the U.S. Market: Introduction and Fundamentals
This session will introduce price optimization techniques and discuss the technical foundations of an effective price optimization approach. It will cover items such as data and general modeling techniques as well as how to approach and sell senior management on the benefits of price optimization. This session is intended to complement "Price Optimization for the U.S. Market: Techniques and Implementation Strategies" in the Implementation track.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Kevin Dyke
Panelists:
Robin Harbage
Model Validation Techniques
Modern computing power has greatly enhanced the complexity of statistical models that one can attempt to fit to observed data. Used in an indiscriminate fashion, almost any statistical technique can end up modeling noise, resulting in a model that fits the observed data well but predicts poorly.
This session will discuss validation (including cross-validation) techniques that can and should be applied regardless of the underlying statistical model employed. One key goal of these straightforward techniques is to avoid models that “overfit” the observed data and thus will not generalize well to future experience.
In addition, the bottom-line question, "How well do GLMs fit?" will be discussed. Much work can be done to improve the fit of a GLM, but even with the best efforts, when (if ever) can you be satisfied with a certain level of error?
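The cross-validation idea can be sketched in a few lines. The following is an illustrative Python sketch (invented names, not the panel's code): each fold is held out once, the model is fit on the remaining folds, and the average held-out error estimates how well the model will generalize to future experience.

```python
# k-fold cross-validation: estimate out-of-sample error by holding out
# each fold once and fitting on the rest.
def k_fold_mse(data, k, fit, predict):
    """Average held-out squared error over k folds.

    data:    list of (x, y) pairs
    fit:     function(train_pairs) -> model
    predict: function(model, x) -> prediction
    """
    folds = [data[i::k] for i in range(k)]  # simple interleaved split
    total, count = 0.0, 0
    for i in range(k):
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        model = fit(train)
        for x, y in folds[i]:
            total += (predict(model, x) - y) ** 2
            count += 1
    return total / count

# Demo with a constant (grand-mean) predictor as the "model" to validate.
data = [(x, float(x % 3)) for x in range(12)]
mse = k_fold_mse(
    data, k=4,
    fit=lambda train: sum(y for _, y in train) / len(train),
    predict=lambda m, x: m,
)
```

The key property is that no observation is ever used to score the model that was fit on it, so a model that merely memorizes noise is penalized rather than rewarded.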
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Thomas Mack
Panelists:
Paul Beinat, Christopher Monsour
Project Management for Predictive Models
The use of predictive modeling continues to expand within the P&C insurance industry. The need to manage both the development and implementation of products based on predictive modeling efforts is critical to accomplishing the goals for the business unit. This session will discuss the aspects of managing a complex project and the issues to consider for successful implementation.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Thomas Mack
Panelists:
Jonathan White
Workers Compensation - State of the Market
An overview of the current state of the workers compensation line will be presented, including a review of financial results, recent trends, and a discussion of where the line might be headed.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Thomas Mack
Panelists:
Nancy Treitel-Moore
Handling High-Dimensional Variables
Generalized linear modeling has become the standard technique for predictive modeling within the insurance industry. Some variables have a large number of levels, and individual levels of these variables often have relatively few exposures. It is important to consider advanced techniques when handling such variables.
This session will discuss special techniques for handling high-dimensional variables, with a focus on territory boundary analysis, vehicle symboling, and workers compensation classes.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Thomas Mack
Panelists:
Klayton Southwood, Serhat Guven
Understanding Data Warehousing Needs
As companies have expanded their predictive modeling analytics, the need for quality data has risen. Given the large expansion of variables and the vast volumes of information that can be stored, this session will discuss what insurance companies need to know about building effective data warehousing structures.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Tracy Spadola, Frank Capobianco
Update on Auto Insurance Costs - Other Make/Model Symbol Issues
The increase in size and types of vehicles on the road today coupled with the market leaders’ desires for greater granularity in price points has resulted in a renewed focus on the necessity to price individual vehicles accurately—for both liability and physical damage coverages. Recent experience also points to significantly higher loss costs for performance versions of some makes and models. What can be done in the ratemaking process to adequately account for these differences?
The panel will present a discussion of the current pricing issues related to rating individual makes and models of automobiles, including a discussion of the various “symbol rating” programs in use by insurers in the U.S.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Kim Hazelbaker, LeRoy Boison
Predictive Modeling for Commercial Lines with Schedule Rating
Predictive modeling continues to gain popularity as a tool for pricing and underwriting various commercial lines. However, what role should these models play in lines where significant underwriting judgment and wide-ranging schedule rating modifiers have long been the norm?
Topics will include:
* Benefits of adding predictive models to the process
* Methods that use underwriting judgment to improve predictive models
* Methods that use predictive model results to improve pricing and underwriting decisions, along with pros and cons
* Consideration of whether predictive model implementation should coincide with a decreased reliance on schedule rating modification
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Moderators:
Thomas Mack
Panelists:
Larry Seymour, Adam Sherwin
Leveraging Machine Learning Techniques
GLMs were first formulated in 1972 - but are they still the best tool available today? This session explores alternative machine learning techniques available today that have been shown to yield impressive results relative to the GLM.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Paul Beinat
Revenue Management and Insurance Cycles
This paper investigates how an insurer's pricing strategy can be adapted to respond to market conditions, in particular the insurance cycle. For this purpose, we explore the use of dynamic pricing strategies, such as the revenue management techniques used in other industries (e.g., airlines, car rentals, Internet service providers), in an insurance context. We then compare these dynamic pricing techniques with the static ones currently used in the market and demonstrate that dynamic pricing can prove very valuable to insurers looking to enhance their competitive strategy.
Source:
2009 Ratemaking and Product Management Seminar
Type:
Paper
Moderators:
Wyndi White
Panelists:
Jean-Bernard Crozet
Keywords:
Insurance Cycles, Revenue Management
Is Any Insurance Product a Commodity?
A commodity is defined as “anything for which there is demand, but which is supplied without qualitative differentiation across a market.” Certainly some insurance products (e.g., personal auto) have some of the characteristics of a commodity and many insurance products may be moving in the direction of becoming a commodity. On the other hand, there are ways to prevent a product from becoming a commodity and they apply to insurance just as they do to other types of products.
Source:
2009 Ratemaking and Product Management Seminar
Type:
concurrent
Panelists:
Ronald Baker