Casualty Actuarial Society


Submit a Proposal for 2012 Data Call for Papers - Read papers from 2010 Call

12/01/2010 —

Recent technological advances have enabled actuaries to process larger quantities of data than ever before. Credit scoring, multivariate modeling, economic capital modeling, catastrophe modeling, and business intelligence models are all examples of advancements that expand the scope of actuaries' analyses and allow them to apply their skills to new business problems. The CAS Committee on Management Data and Information sought papers on a variety of subjects for its 2010 Call Paper Program, which produced a record number of submissions. Below is a brief synopsis of two submitted papers.

Data and Disaster: The Role of Data in the Financial Crisis by Louise Francis and Virginia Prevosto

In this paper the authors showed that data quality played a significant role in the mispricing and business intelligence errors that caused the financial crisis. They utilized a number of relatively simple statistics to illustrate the due diligence that should have been, but was not, performed, using the Madoff fraud and the mortgage meltdown as data quality case studies. They then applied exploratory procedures to illustrate simple techniques that could have been used to detect problems. Lastly, they illustrated some modeling methods that could have been used to help underwrite mortgages and detect indications of fraud.
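To give a flavor of the kind of simple exploratory screen the authors advocate, the sketch below flags a return series whose volatility and loss frequency look implausibly low for a market-exposed strategy (a well-known warning sign in the Madoff case). The data, function name, and thresholds here are hypothetical illustrations, not taken from the paper.

```python
# Illustrative data-quality screen using only simple summary statistics.
# Thresholds and sample data are hypothetical, for demonstration only.

def screen_returns(returns, min_negative_share=0.2, min_stdev=0.01):
    """Return a list of red flags raised by basic summary statistics."""
    n = len(returns)
    mean = sum(returns) / n
    stdev = (sum((r - mean) ** 2 for r in returns) / (n - 1)) ** 0.5
    negative_share = sum(1 for r in returns if r < 0) / n

    flags = []
    if negative_share < min_negative_share:
        flags.append("too few losing periods for a market-exposed strategy")
    if stdev < min_stdev:
        flags.append("volatility implausibly low")
    return flags

# A suspiciously smooth, almost-always-positive monthly series:
smooth = [0.010, 0.009, 0.011, 0.010, 0.012,
          0.009, 0.010, 0.011, -0.001, 0.010]
print(screen_returns(smooth))  # both flags fire for this series
```

Checks this crude would not prove fraud on their own, but they are cheap enough to run as routine due diligence before any data feeds a pricing or business intelligence model.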

Duplicate FHA Single-Family Mortgage Records and Related Problems by Thomas N. Herzog

The Federal Housing Administration (FHA) insures tens of millions of mortgages against the risk of foreclosure. Maintaining accurate databases has been a challenge: status data has been lost through mergers and acquisitions, and lenders do not always transmit accurate, timely data to FHA on the termination of FHA-insured single-family mortgages that they service. This, in turn, means that FHA's database has many mortgages listed as "active" that have in fact been terminated. In addition, the FHA decided many years ago to omit the property address of insured mortgages from its databases on single-family mortgages because of the high cost of computer storage at that time. To improve the quality of FHA's databases in these areas, the author applied record linkage techniques. The first approach was to use a variety of internal consistency checks to identify "active" mortgage records whose underlying mortgages had in fact terminated. The second approach involved matching FHA records with corresponding records of the Government National Mortgage Association (GNMA). This second approach allowed the author to (1) obtain property addresses from the GNMA database and add them to the FHA database as well as (2) identify additional "active" FHA mortgage records whose underlying mortgages had in fact terminated.
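The two linkage steps described above can be sketched in a few lines. The field names, case numbers, and staleness threshold below are hypothetical stand-ins; the actual FHA and GNMA file layouts are not described in this summary.

```python
# Illustrative sketch of the two record-linkage steps: an internal
# consistency check, then a match against a second agency's records.
# All records and field names here are hypothetical.

fha = [
    {"case": "A1", "status": "active", "last_payment_year": 1998, "address": None},
    {"case": "A2", "status": "active", "last_payment_year": 2010, "address": None},
    {"case": "A3", "status": "terminated", "last_payment_year": 2005, "address": None},
]

# GNMA-style records keyed on the same hypothetical case number.
gnma = {
    "A1": {"status": "terminated", "address": "12 Elm St"},
    "A2": {"status": "active", "address": "34 Oak Ave"},
}

# Step 1: internal consistency check -- an "active" mortgage with no
# payment activity for many years is suspect (threshold is hypothetical).
stale = [r["case"] for r in fha
         if r["status"] == "active" and r["last_payment_year"] < 2005]

# Step 2: link to the second database to backfill missing addresses and
# catch "active" records the other agency shows as terminated.
terminated_per_gnma = []
for r in fha:
    match = gnma.get(r["case"])
    if match:
        r["address"] = match["address"]  # backfill the omitted address
        if r["status"] == "active" and match["status"] == "terminated":
            terminated_per_gnma.append(r["case"])

print(stale)                # case flagged by the internal check
print(terminated_per_gnma)  # case flagged by the cross-database match
```

Real linkage work must also handle records that match only approximately (typos, transposed digits, reused identifiers), which is where formal record linkage methods go beyond this exact-key sketch.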

The rest of the papers can be found in the 2010 Spring E-Forum.

We encourage everyone to read these forward-thinking papers, which go beyond theoretical knowledge to offer valuable practical tools. We further encourage you to consider a submission for the

2012 Data Management and Information Call Paper Program
