Actuarial Review

Latest Research



Barnett and Zehnwirth Provide Road Map
for Probabilistic Reserve Models

by Frederick F. Cripe, Chairperson,
CAS Research Policy and Management Committee

At the March Ratemaking Seminar in Las Vegas, I was mildly surprised to see so many actuaries spending their evenings and early mornings at the gambling tables. Any actuary who has completed the probability and statistics coursework on the syllabus knows that every casino wager has a negative expected value. Based on this limited evidence, I conclude that actuaries (like lesser mortals) often act in ways that are incongruent with what they know to be true.

This incongruence is clearly displayed by many actuaries involved in reserving or loss development estimation. Although the actuarial literature is replete with articles explaining the shortcomings of traditional link ratio methods, most actuarial reserve analyses still rely mainly on them.

A relatively recent paper accepted by the Committee on Review of Papers (CORP) provides a theoretical and practical road map for actuaries who would like to improve their reserving methods. In "Best Estimates for Reserves," Glen Barnett and Ben Zehnwirth provide an outstanding discussion of the relationship between traditional link ratio methods, broader regression models, and probabilistic reserve estimation models. The authors make a very strong case for the advantages of using a probabilistic model. The paper is currently available on the CAS Web Site under Publications, then Papers Accepted by CORP.

After a brief introduction, Section Two of the paper shows that traditional link ratio methods are special cases of a broad group of regression models, which the authors call the "extended link ratio family" (ELRF). Working within this broader framework, the authors examine the implicit assumptions underlying link ratio methods and find that in almost all real-world scenarios the data does not support those assumptions.
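To make the connection concrete, here is a minimal sketch (my own illustration with made-up figures, not code from the paper) of the fact that the familiar volume-weighted link ratio is exactly a weighted least-squares regression through the origin, one member of the ELRF:

    import numpy as np

    # Illustrative cumulative losses at two adjacent development periods
    # (hypothetical numbers, not data from the paper).
    x = np.array([1000.0, 1200.0, 900.0])   # cumulative at development d
    y = np.array([1500.0, 1740.0, 1400.0])  # cumulative at development d+1

    # Traditional volume-weighted (chain ladder) link ratio.
    link_ratio = y.sum() / x.sum()

    # The same quantity as weighted least squares for y = b*x + error,
    # with Var(error) proportional to x, i.e. weights w = 1/x:
    # b = sum(w*x*y) / sum(w*x*x) = sum(y) / sum(x).
    w = 1.0 / x
    b = np.sum(w * x * y) / np.sum(w * x * x)

    print(link_ratio, b)  # both print the same value

Once the link ratio is seen as a regression estimate, its assumptions (and their diagnostics) can be tested against the data, which is precisely what the paper does.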

The analysis also compares the relative precision of methods based on traditional link ratios to methods based on modeling trends in the incremental data. Not surprisingly, modeling trends in the incremental data yields substantially more precise estimates.

In Section Three, Barnett and Zehnwirth go a step further.  Using the ELRF models as a starting point, they create a probabilistic model structure.  The new structure is based on the analysis of logarithms of the incremental data and focuses on trends in three directions:
      1) across accident periods,
      2) across development periods, and
      3) across calendar periods.
The next step in building the structure is fitting a distribution to each cell of the loss development matrix. Models based on this structure are called the Probabilistic Trend Family (PTF). The paper then describes the process for selecting the optimal PTF model for any given set of reserving data.
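As a rough sketch of the model form (in my own notation, which may not match the authors' symbols exactly), a PTF model expresses the logarithm of each incremental loss as a sum of trend effects in the three directions plus a random error:

    \ln p_{w,d} = \alpha_w + \sum_{j=1}^{d} \gamma_j + \sum_{t=w+1}^{w+d} \iota_t + \varepsilon_{w,d}

Here w indexes accident periods and d development periods, so w + d indexes calendar periods; alpha sets each accident period's level, the gammas capture development-period trends, the iotas capture calendar-period trends, and the error term is normally distributed, so each cell of the matrix is modeled as lognormal.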

The final section of the paper discusses the potential benefits that arise from the use of a probabilistic model of reserves.

Barnett and Zehnwirth have written an excellent paper that provides a flexible and powerful set of methods for estimating reserves.  The body of the paper is quite easy to follow and understand (even for a nontechnical guy like me).  The appendices are considerably more complicated and require more effort to understand, but it's worth taking the time to read through them.

I highly recommend "Best Estimates for Reserves" and hope that more of us will begin to use the methods that Barnett and Zehnwirth describe.  If any of you out there are currently using similar models in your work, the Research Policy and Management Committee would be interested in hearing from you.  Also, if you have an idea for research, please visit the "Research" section of the CAS Web Site.