An Interview with Jason Russ, VP of Admissions
In this insightful interview, Jason Russ, vice president of Admissions at the CAS, shares his perspectives on the evolving landscape of actuarial exams and candidate preparation. With over 27 years of involvement with the CAS, Russ discusses the importance of supplementary study materials, the integration of predictive analytics and data science into the education pathway, and the shift towards computer-based testing. He also highlights the significance of project-based evaluations in developing practical skills and ongoing efforts to enhance the candidate experience. This conversation provides valuable insights for current and future CAS candidates navigating their actuarial credentialing journey.
The following transcript has been edited for length and clarity. The full interview can be found online: https://tinyurl.com/CASFutureFellows.
CAWG: Can you share something about yourself that candidates might not know?
Russ: With respect to the CAS, I’ve been involved with the syllabus and exam committee in some capacity for over 27 years. I’ve been involved a long time and have seen a lot: I’ve been a question writer, a grader, and a member of a pass mark panel. I was the chair for an individual exam, and I was the chair for the whole committee. Then I was a Board member and served on the board committee overseeing the governance of admissions. I’ve seen admissions from a lot of angles over many, many years, and I bring those perspectives to my current role.
Outside the CAS, I like to spend time ballroom dancing. In my younger days, this was a serious hobby. I would train for 10 hours a week and travel all around the U.S. and to other countries doing competitions with my dance partner, who’s now my wife. Now we’re still doing a bit of dancing just for fun and to stay active.
CAWG: We’ve seen some recent content outline updates, such as the expansion with PCPA, as well as some updates to the MAS exams and Exam 8 that really focus on predictive analytics and data science. How do you think these changes help prepare candidates for actuarial roles of the future?
Russ: One of the pillars of the CAS Strategic Plan is to ensure that our actuaries are building the skills needed for their future work. These changes with respect to predictive analytics and data science are a great example of how we’re addressing that. Actuaries need to be able to work with large sets of detailed data. Years ago, if I were doing a reserve assignment for a client, the data I received might be limited to just aggregate loss triangles and information on premium and rate changes. That’s very rarely the case now.
Now it’s very typical to receive databases with the details for every single policy written and every single claim received, with a large number of fields for each. That’s a lot of data that we didn’t have to deal with in the past.
There are skills actuaries need to work with such large datasets: reading the data, reviewing it, and querying it to understand what’s there; testing it and addressing any data quality issues that arise; and then actually putting it to use. That could mean using a GLM to identify loss cost drivers, producing visualizations to help explain trends, or many other things.
The actuaries on my team at Milliman dig into questions like: Is there a change in the business mix that drove a change in claim frequency? Are different types of claims emerging that drove changes in claim severity? Investigating those changes, and then incorporating the conclusions into the actuarial analysis, is vital to producing a high-quality work product. And it’s only going to become more important in the future.
CAWG: Regarding the new PCPA exam and project, this is the first time the CAS has used a project-based testing approach. How does that project-based evaluation approach support the development of new skills for candidates?
Russ: There are several different skills that this approach will highlight. This project-based exam will allow candidates to demonstrate their abilities in a way that cannot be done using traditional exams. Candidates are not just going to talk about how to build a model or what they would look at in a model or how they would interpret model output. They’re going to actually do the modeling.
It’s moving from theory to practice, which I think is a really great step in our exam process.
We need to know that not only do candidates understand the principles, but that they can actually do the work.
This came about from a very clear directive from the CAS Board a few years back: we need to build skills for the future. And this was viewed as a must—to make sure that all actuaries can build models.
On top of that, candidates will need to write a report that describes their work, which goes beyond merely testing technical skills. They also must clearly identify the rationale for creating the model, discuss the relevant business decisions to be made, and so on, demonstrating their ability to be not just technical support but also businesspeople, decision-makers, and influencers. Those roles require certain soft skills, such as communication, presentation, and problem-solving. Such skills are difficult to assess using traditional exams, but they can come through in a project environment like this.
CAWG: Any closing thoughts to share with candidates?
Russ: Thank you for the opportunity. One thing we haven’t touched on is how our exam committee and leadership are much more in touch with the Candidate Advocate Working Group now than ever before. I think that candidates should understand that. Candidates can use that stronger relationship to feed any comments and perspectives through the CAWG. That’s a great way for us to hear what our candidates are really thinking.