Updating the Berquist Sherman Paper: 30 Years Later
        
    
      
          What if Berquist and Sherman were to update their paper? Rick Sherman will also offer practice insights based on his service as an expert witness in a number of lawsuits against actuaries regarding applications of the paper.
        
    
      
              Source: 
            2010 Regional Affiliate - CANW
        
    
      
              Type: 
            affiliate
        
    
      
              Panelists: 
            Rick Sherman
        
    
      
              Keywords: 
            Berquist Sherman Paper
        
    
      
  
   
    
      
      
          
The Actuarial Foundation
        
    
      
          Pamela will tell us what The Actuarial Foundation is and what it is currently doing.
        
    
      
              Source: 
            2010 Regional Affiliate - CANW
        
    
      
              Type: 
            affiliate
        
    
      
  
   
    
      
      
          
Update from the CAS
        
    
      
          Alice will take us through current CAS issues and directions.
        
    
      
              Source: 
            2010 Regional Affiliate - CANW
        
    
      
              Type: 
            affiliate
        
    
      
              Panelists: 
            Alice Underwood
        
    
      
  
   
    
      
      
          
Predictive Modeling & Price Optimization
        
    
      
              Source: 
            2010 Regional Affiliate - CASE
        
    
      
              Type: 
            affiliate
        
    
      
              Panelists: 
            Alex Laurie
        
    
      
              Keywords: 
            Predictive Modeling, Price Optimization
        
    
      
  
   
    
      
      
          
WC Predictive Modeling
        
    
      
              Source: 
            2010 Regional Affiliate - CASE
        
    
      
              Type: 
            affiliate
        
    
      
              Panelists: 
            Gaetan Veilleux
        
    
      
              Keywords: 
            Predictive Modeling
        
    
      
  
   
    
      
      
          
Panel Session on Catastrophe Modeling
        
    
      
              Source: 
            2010 Regional Affiliate - CASE
        
    
      
              Type: 
            affiliate
        
    
      
              Panelists: 
            Tim Aman, Kay Clearly
        
    
      
              Keywords: 
            Catastrophe Modeling
        
    
      
    
      
  
   
    
      
      
          
Recent Developments in Workers Compensation Claim Frequency
        
    
      
          This session will summarize recent workers compensation claim frequency research using multistate data. Claim frequency changes by industry, injury type, and employer size are a few of the categories that will be examined. In addition, we will explore the recent substantial claim frequency declines in California and Florida. In both states the reductions occurred after the enactment of major system reforms. After providing some background and historical perspective, the latest experience will be reviewed. Potential reasons for the drop in claim frequency in each state will also be discussed.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            James Donelon
        
    
      
              Panelists: 
            David Bellusci
        
    
      
    
      
  
   
    
      
      
          
Predictive Modeling and By-Peril Analysis for Homeowners Insurance
        
    
      
          Homeowners insurance covers a multitude of perils. However, most of the traditional variables used to rate homeowners insurance, such as amount of insurance and construction class, are most appropriate for fire insurance. The first presentation will show how to apply predictive modeling separately by peril. Independent variables will include the traditional rating variables along with external variables. In these models, the effect of the traditional variables will differ by peril. The second presentation will offer a case study on deriving deductible factors for by-peril modeling. The structure of deductible factors needs to reflect interactions by peril between deductible and Coverage A limit. Loss elimination ratios are estimated under two assumed loss distributions: a gamma from GLM output and a mixture of gamma and lognormal.
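The loss elimination ratio at the heart of the deductible case study can be sketched in a few lines. This is a hedged illustration, not the presenters' method: the gamma parameters and deductibles below are hypothetical, and a seeded Monte Carlo estimate stands in for the GLM-based severity fit described in the abstract.

```python
import random

def loss_elimination_ratio(deductible, shape, scale, n_sims=100_000, seed=42):
    """Monte Carlo estimate of LER(d) = E[min(X, d)] / E[X] for gamma losses.

    The eliminated portion of each loss is the part below the deductible,
    i.e., min(X, d); dividing by total losses gives the fraction eliminated.
    """
    rng = random.Random(seed)
    losses = [rng.gammavariate(shape, scale) for _ in range(n_sims)]
    eliminated = sum(min(x, deductible) for x in losses)
    return eliminated / sum(losses)

# Hypothetical gamma severity with mean = shape * scale = 2 * 5000 = 10,000
ler_500 = loss_elimination_ratio(500, shape=2.0, scale=5000.0)
ler_1000 = loss_elimination_ratio(1000, shape=2.0, scale=5000.0)
```

Because the seed is fixed, both calls price the same simulated losses, so the LER is strictly increasing in the deductible, as expected.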
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Panelists: 
            Luyang Fu, A. Cummings, Jerry Han
        
    
      
    
      
  
   
    
      
      
          
Large Scale Analysis of Renewal Discounts for P&C Insurance
        
    
      
          In this session, we discuss whether a price discount for renewal business is warranted for property and casualty insurance. The discount is motivated by the fact that new business with a lapse in insurance coverage, or new business in general, may perform worse than renewal business. We will support the discussion with a large amount of real industry data: 25 books of insurance business totaling almost $29 billion of premium. The data cover all of the primary property and casualty lines of business, including personal auto and homeowners as well as commercial business owners policies, auto, WC, GL, and property. Our discussion will show that new business universally has a higher loss ratio and a lower retention rate than renewal business across all 25 books of business. We will attempt to offer reasons why such differences exist between new and renewal business.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            James Donelon
        
    
      
              Panelists: 
            Cheng-Sheng Wu, Hua Lin
        
    
      
  
   
    
      
      
          
Intelligent Use of Competitive Analysis
        
    
      
          Competitive analysis is one of the key elements in measuring the performance of a rate algorithm. This requires the capture and analysis of competitive data. Over the years, much work has been performed in capturing the data; however, there is a significant lack of sophistication in analyzing it. This session will begin by discussing the sources and challenges of acquiring good competitive information. Then, the focus will be on how to analyze the data to make informed decisions. Analysis strategies will run the gamut of traditional mining of the data to more sophisticated analysis and then to incorporation of the information into demand curves.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            James Donelon
        
    
      
              Panelists: 
            Jacob Fetzer
        
    
      
  
   
    
      
      
          
Current Developments in Workers Compensation Experience Rating
        
    
      
          After providing a brief review of the NCCI Experience Rating (ER) formula and historical background, recent performance tests of the ER Plan will be presented. Other topics for discussion will be NCCI's comprehensive review of the ER Plan that is currently underway and how the recent changes to the NCCI class ratemaking methodology impact the ER parameters.
In addition, this session will provide a different perspective on reflecting individual risk historical experience in workers compensation ratemaking through a class rating approach. Examples using actual industry data and comparisons to other lines will be presented.
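The idea of credibility-weighting an individual risk's experience against a class expectation, which underlies the ER formula discussed above, can be sketched in a heavily simplified form. This is not the NCCI formula: the actual plan splits losses into primary and excess components and derives its weights and ballast values differently, and the figures below are hypothetical.

```python
def experience_mod(actual_losses, expected_losses, credibility):
    """Simplified experience mod: a credibility-weighted blend of the risk's
    actual-to-expected loss ratio with the class expectation of 1.0.
    (The real NCCI plan is considerably more structured than this.)"""
    return credibility * (actual_losses / expected_losses) + (1 - credibility)

# Hypothetical risk running 20% better than its class expectation
mod = experience_mod(actual_losses=80_000, expected_losses=100_000, credibility=0.4)
```

With 40% credibility, the 0.80 actual-to-expected ratio moves the mod only partway below unity, to 0.92, illustrating how credibility dampens swings from individual experience.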
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Dave Horn
        
    
      
              Panelists: 
            Cheng-Sheng Wu, Jonathan Evans, Min Wang
        
    
      
  
   
    
      
      
          
Personal Lines Pricing Strategies
        
    
      
          Pricing strategies for personal insurance in the United States differ from those of other consumer products because of both regulation and the stochastic nature of insurance losses. Due to the evolution of personal lines pricing with its origins in the era of pencil-and-paper statistics, there is often a gap between actuarial cost-plus projections and the prices actually charged for personal insurance. As the field has evolved, new techniques have been introduced from academia, other industries, and business contexts. This presentation will be a panel discussion of the evolution of pricing, encouraging audience participation and interaction. Parallels will be drawn between innovations in personal lines pricing strategies and the integration of ideas from statistics, data mining/machine learning, and marketing science. Particular focus will be given to the use of scorecard models for refined pricing and underwriting, as well as the steps involved in integrating demand modeling and price optimization into the pricing process.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Dave Horn
        
    
      
              Panelists: 
            Michael Greene, Jose Trasancos
        
    
      
  
   
    
      
      
          
Essentials of Data Quality for Predictive Modeling
        
    
      
          This session will outline the components needed in a data collection and data quality plan to prepare for a predictive modeling project. We will discuss the need for a data collection plan and explain why a feedback loop between the data team, the modeling team, and the business is necessary. We will also discuss the data quality dimensions critical to success and the type of data cleansing required prior to modeling.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Panelists: 
            Jeremy Benson, Alietia Caughron
        
    
      
  
   
    
      
      
          
Alternatives to Credit Score
        
    
      
          Credit-based insurance scores have become an important part of many insurers' rating and underwriting plans. These scores can result in significant rate impacts. However, given the regulatory and public scrutiny of the use of credit information, there is a concern over what would happen if insurers could no longer use credit. The use of credit is currently banned in several states, and the potential for losing the use of credit in other states is very real. In response to this reality, this session will discuss:
    * The historical and current attacks on the use of insurance scoring
    * Reasons that credit score is useful in predicting insurance loss
    * Companies' reactions in states where the use of credit is banned
    * Analytic approaches to determining an optimal rating and underwriting approach without using credit
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Dave Horn
        
    
      
              Panelists: 
            Roosevelt Mosley, Eliade Micu
        
    
      
  
   
    
      
      
          
Quantifying Risk Load for Property Catastrophe Exposure
        
    
      
          The volume of insurance-linked securities (ILS) available in the capital markets is growing. Catastrophe Bonds are one form of ILS. In this session, available catastrophe bond data from the capital markets will be used to quantify the cost of catastrophe risk for property insurance. Several applications will be presented, including quantifying risk loads and evaluating the cost of catastrophe reinsurance. The panel will also examine ways to allocate the risk loads and reinsurance costs by geographic area.
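One simple way to back a risk load out of a catastrophe bond quote is to compare the coupon spread with the modeled expected loss. This is a hedged sketch of the general idea, not the panel's methodology, and the 8% spread and 3% expected loss below are made-up figures.

```python
def implied_risk_load(spread, expected_loss):
    """Decompose a cat bond spread into a risk load (the excess over expected
    loss) and a loss multiple (spread as a multiple of expected loss)."""
    return spread - expected_loss, spread / expected_loss

# Hypothetical quote: 8% coupon spread against a 3% modeled expected loss
load, multiple = implied_risk_load(0.08, 0.03)
```

Here the market is charging a 5% risk load, or about 2.7 times the modeled expected loss, which is the kind of quantity the session proposes to extract from capital-market data.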
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            David Appel
        
    
      
    
      
  
   
    
      
      
          
Commercial Applications of Predictive Analytics
        
    
      
          Commercial lines of insurance offer tremendous product and rating plan development opportunities. On one hand, many of the factors and data sources developed for personal lines can be similarly used in commercial lines. For example, credit scores, vehicle identification numbers (VINs), motor vehicle records (MVRs), and territory refinements all make the transition from personal to commercial auto quite well. On the other hand, workers compensation, commercial auto, and commercial package products, including BOPs, each have unique characteristics that offer actuaries and product managers new opportunities. Factors such as number of years in business, hours of operation, fleet size, type of ownership and a myriad of others all contribute to the dynamic world of commercial lines product development. This session examines the distinct issues relevant to development of commercial lines rating plans, particularly from a predictive analytics perspective.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Panelists: 
            Robert Walling, Anthony Phillips
        
    
      
    
      
  
   
    
      
      
          
Geospatial Data Visualization and Analysis
        
    
      
          The legendary statistician John Tukey once said, "The greatest value of a picture is when it forces us to notice what we never expected to see." This is particularly true in the case of geospatial data, as has been recognized in fields like epidemiology, marketing, political science, economic geography, and increasingly in actuarial science. This session will begin by covering some basic concepts of data visualization and Geographic Information Systems (GIS). Case studies of effective knowledge discovery and communication through the use of geospatial data visualization will be included to illustrate key points. Further topics of discussion will include an introduction to the different forms of spatial data, commonly used geospatial software packages, potential insurance applications, and major types of geospatial modeling techniques. Case studies will be provided to illustrate the application of geospatial modeling to insurance data.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            Michael Greene, Satadru Sengupta
        
    
      
    
      
    
      
  
   
    
      
      
          
Introduction to Increased Limit Factors
        
    
      
          This session will present an overview of increased limits ratemaking. Participants will cover general concepts, such as calculating limited average severities, and practical problems with developing increased limit factors (ILFs) from a distribution of loss data. The session will also provide an overview of excess and deductible pricing and will discuss common approaches for calculating ILFs.
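The limited-average-severity calculation that underlies ILFs can be sketched directly from a list of ground-up losses. The loss amounts and the $100,000 basic limit below are hypothetical, chosen only to make the arithmetic visible.

```python
def limited_average_severity(losses, limit):
    """Average of ground-up losses with each loss capped at the limit."""
    return sum(min(x, limit) for x in losses) / len(losses)

def increased_limit_factor(losses, limit, basic_limit=100_000):
    """ILF(L) = LAS(L) / LAS(basic limit)."""
    return (limited_average_severity(losses, limit)
            / limited_average_severity(losses, basic_limit))

# Hypothetical ground-up loss data
losses = [20_000, 55_000, 90_000, 150_000, 400_000, 1_200_000]
ilf_500k = increased_limit_factor(losses, 500_000)
ilf_1m = increased_limit_factor(losses, 1_000_000)
```

Raising the limit can only increase the capped average, so ILFs grow with the limit; in practice a fitted severity distribution would replace this raw empirical average, which is exactly the practical problem the session flags.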
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            workshop
        
    
      
              Panelists: 
            Patrick Thorpe
        
    
      
              Keywords: 
            Limit Factors
        
    
      
  
   
    
      
      
          
Combining Machine Learning with GLMs in Ratemaking
        
    
      
          This session will cover three topics:
   1. "GLMs: The Good, Bad, and Ugly"
      As GLMs increasingly become the pricing standard for the U.S. insurance industry, a review of their assumptions and limitations becomes more important. The title refers to three distinct topics. The Good: Advantages of GLMs/what they do well. The Bad: Difficulties associated with GLMs/what they do not do well. The Ugly: Limitations of GLMs/what they cannot do. For each of these three topics, the implications for pricing are discussed. This topic concludes with solutions for complementing the weaknesses of GLMs.
   2. "Combining Machine Learning with GLMs in Ratemaking"
      Statistical methods have been successfully employed in ratemaking for a considerable time. Their parametric nature provides a framework of derived statistics that describe model fit, under the assumption that the real world adheres to the models' assumptions. We will examine how well these assumptions hold by seeking signals within the residuals, represented as both discrete and continuous values. An approach that combines these methods is presented along with the results obtained. Finally, we look at how these methods can be used to retrieve the signal lost when a predictor variable, such as credit score, is removed from the model.
   3. "Leveraging Machine Learning Techniques"
      Machine learning is most commonly associated with algorithms such as neural networks, decision trees, case-based reasoning, or support vector machines, to name just a few. These algorithms work reasonably well on a range of problems, but they are not universally applicable. Recently, work at the Data Sciences and Knowledge Discovery Laboratory has focused on mining activity patterns, looking for associations among actors, and incorporating domain knowledge into algorithms. Finally, fusion algorithms are presented and some results obtained on real-world data sets are discussed.
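The residual-mining idea from the second talk can be illustrated with a deliberately tiny sketch: fit a trivial model, then look for leftover signal from a variable excluded from that model. The groups and frequencies below are hypothetical, and a grand-mean fit stands in for a real GLM.

```python
# Hypothetical policy groups and observed claim frequencies; the grouping
# variable is deliberately left out of the "model" (an overall mean).
data = [
    ("A", 0.08), ("A", 0.10), ("A", 0.09),
    ("B", 0.16), ("B", 0.18), ("B", 0.17),
]
fitted = sum(f for _, f in data) / len(data)  # the stand-in GLM prediction
residuals = [(g, f - fitted) for g, f in data]

def mean_residual(group):
    """Average residual within a group: nonzero means the omitted variable
    still carries signal the fitted model failed to capture."""
    vals = [r for g, r in residuals if g == group]
    return sum(vals) / len(vals)

signal_a = mean_residual("A")  # negative: group A runs better than fitted
signal_b = mean_residual("B")  # positive: group B runs worse than fitted
```

The systematic split of residual means by the omitted group is the kind of signal a machine learning layer can recover on top of a GLM, including when a predictor such as credit score must be dropped.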
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            Christopher Cooksey, Paul Beinat
        
    
      
    
      
  
   
    
      
      
          
Text Mining Handbook
        
    
      
          Text Mining is an emerging technology that can be used to augment existing data in corporate databases by making unstructured text data available for analysis. In this session you will learn how the authors applied the text processing language Perl and the statistical language R to two text databases, an accident description database and a survey database. For the accident description data, new variables were extracted from the free-form text data that are used to predict the likelihood of attorney involvement and the severity of claims. For the survey data, the text mining identified key themes in the responses. The authors will show that useful information that would not otherwise be available was derived from both databases.
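A minimal sketch of deriving a structured predictor from free-form text follows, using Python regular expressions rather than the Perl/R toolchain the authors used. The claim descriptions and the keyword list are hypothetical; a production approach would be far more thorough.

```python
import re

# Hypothetical free-form accident descriptions; the derived flag mimics
# extracting an attorney-involvement predictor from unstructured text.
claims = [
    {"id": 1, "desc": "Claimant slipped on wet floor; attorney retained."},
    {"id": 2, "desc": "Minor fender bender, no injuries reported."},
    {"id": 3, "desc": "Letter of representation received from counsel."},
]
ATTORNEY_PATTERN = re.compile(r"\b(attorney|counsel|law firm)\b", re.IGNORECASE)

for claim in claims:
    claim["attorney_flag"] = 1 if ATTORNEY_PATTERN.search(claim["desc"]) else 0

flags = [c["attorney_flag"] for c in claims]
```

The new `attorney_flag` column can then enter a severity or litigation-propensity model alongside conventional structured fields, which is the essence of the augmentation the session describes.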
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            paper
        
    
      
              Panelists: 
            Louise Francis, Matthew Flynn
        
    
      
              Keywords: 
            Text Mining
        
    
      
    
      
  
   
    
      
      
          
Applications of Quantile Regression in Commercial Underwriting Models
        
    
      
          Session panelists will present a robust regression technique, quantile regression, and use it to develop underwriting models for commercial insurance. Ordinary least-squares regression, or GLM, analyzes the relationship between X and the conditional mean of Y. In contrast, quantile regression models the relationship between X and the conditional quantiles of Y. Quantile regression produces a very robust estimation because large random noise will not affect the model as much as it affects least-squares and GLM regressions. Quantile regression is especially useful in commercial lines, where the data is very volatile and extremes are important, such as when identifying the highest risks. Another advantage of quantile regression is that it provides a more complete picture of the conditional distribution of the loss ratio Y. A case study will numerically demonstrate this robust regression technique.
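The robustness argument can be illustrated with the check ("pinball") loss that quantile regression minimizes: the median-type estimate barely moves in the presence of an extreme observation, while the mean gets pulled toward it. The loss-ratio observations below are hypothetical, and a grid search stands in for a real quantile regression fit.

```python
def pinball_loss(tau, y_values, prediction):
    """Average check (pinball) loss; quantile regression chooses the
    prediction minimizing this, yielding the tau-th conditional quantile."""
    total = 0.0
    for y in y_values:
        err = y - prediction
        total += tau * err if err >= 0 else (tau - 1) * err
    return total / len(y_values)

# Hypothetical loss-ratio observations with one extreme value (3.00)
y = [0.45, 0.50, 0.55, 0.60, 0.65, 0.70, 3.00]
tau = 0.5
candidates = [i / 100 for i in range(30, 320)]
best = min(candidates, key=lambda p: pinball_loss(tau, y, p))  # the median
mean_y = sum(y) / len(y)  # what a squared-error fit targets
```

The tau = 0.5 minimizer sits at the median, 0.60, essentially ignoring the 3.00 outlier, whereas the mean is dragged above 0.90; higher tau values would instead target the adverse tail, which is what makes the technique useful for flagging the highest risks.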
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Panelists: 
            Cheng-Sheng Wu, Luyang Fu
        
    
      
  
   
    
      
      
          
Using Your Professionalism GPS to Navigate the Actuarial World
        
    
      
          Panelists will cover a series of thought-provoking case studies to help attendees explore different aspects of professionalism.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Panelists: 
            Susan Forray, Dustin Loeffler, Karen Terry
        
    
      
  
   
    
      
      
          
Multilevel Models, Credibility Theory, and Ratemaking
        
    
      
          The multilevel/hierarchical (a.k.a. "mixed effects") modeling framework is a powerful and intuitive generalization of the generalized linear modeling framework that has become a cornerstone of much actuarial work. Yet mixed effects modeling has received relatively scant attention in actuarial publications and seminars. This is particularly surprising given that the theory of multilevel/hierarchical models can be viewed as a generalization of Bühlmann's credibility theory. Multilevel modeling is applicable when one's data is naturally structured in groups (e.g., repeated observations for each policy, or policies within territories and business classes) and one would like one's model coefficients to vary appropriately by group. This session will sketch some fundamental concepts of multilevel models, spell out the connection between multilevel modeling and credibility theory in some detail, and discuss some of the many practical applications of multilevel models in actuarial science. The presenters' point of view is that multilevel models provide a statistically consistent inferential method that unifies two pillars of actuarial modeling: credibility theory and generalized linear models. Finally, a case study will be presented to illustrate the application of hierarchical models in a ratemaking setting.
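The credibility connection drawn above can be made concrete with Bühlmann's classic formula, which is also what the group-level shrinkage in a multilevel model produces. The exposure count and credibility constant below are hypothetical.

```python
def buhlmann_estimate(group_mean, grand_mean, n, k):
    """Credibility-weighted estimate Z * group mean + (1 - Z) * grand mean,
    with Z = n / (n + k), Buhlmann's credibility factor. Multilevel models
    shrink group coefficients toward the population mean in the same way."""
    z = n / (n + k)
    return z * group_mean + (1 - z) * grand_mean, z

# Hypothetical class: 50 exposures against a credibility constant k = 100
estimate, z = buhlmann_estimate(group_mean=0.90, grand_mean=0.70, n=50, k=100)
```

With only a third of full credibility, the class estimate is pulled most of the way back toward the portfolio mean; as n grows, Z approaches 1 and the group's own experience dominates, mirroring how multilevel model coefficients behave with more data per group.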
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            Fred Klinker
        
    
      
    
      
  
   
    
      
      
          
Responses to Unavailable Personal Lines Insurance in the Voluntary Market
        
    
      
          This session will cover the latest update on the Massachusetts Personal Auto Open Competition Law enacted in 2008 and the effect this law change has had on carriers, rate regulation, and the competitive environment. An overview of the current environment of personal auto residual markets and FAIR plans will also be provided.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            Frederick Strauss, John Winkleman
        
    
      
    
      
  
   
    
      
      
          
Account Proposals: A Team Effort
        
    
      
          Why aren't we making money?
    * CEO - "The actuaries don't know how to rate the business!"
    * Actuary - "The underwriters have opened the floodgates!"
    * Underwriter - "You should see the kind of business we write!"
    * Marketing - "The CEO told us to grow!" 
Sound familiar? The blame game is all too common in the industry. Learn how one company evolved to play a different game in which the underwriters and actuaries (and usually marketing) play on the same team! Also hear about how the account proposal process was the impetus for changes that significantly affected the entire organization.
        
    
      
              Source: 
            2010 Ratemaking and Product Management Seminar
        
    
      
              Type: 
            concurrent
        
    
      
              Moderators: 
            Ian Ayres
        
    
      
              Panelists: 
            Kelly McKeethan