Notes From the Syllabus and Examination Committee
Occasionally the Syllabus and Examination Committee responds to questions that arise about specific exam issues. The most recent responses can be found in the sections below.
Notes by Year
For CAS candidates taking the at times seemingly endless journey through the exams, there is fortunately no shortage of assistance. Textbooks, commercial study guides, practice questions and exams, and Examiner's Reports all help candidates master the material they need to reach the FCAS designation. But there is one group you don't often hear from: the exam writers themselves. Although you rarely communicate with them directly, an army of volunteers works year-round to create your exams. And while we can't give you an advance copy of the next exam, there are a few tips we can offer to help you on your voyage.
Commercial study guides are abundant and popular aids for CAS exam candidates, but they're just that: aids. They are a supplement to the source material, not a replacement. CAS exam writers do not make use of these myriad guides; they rely exclusively on the syllabus texts when writing questions. And while study guides may try to anticipate how the source material will be covered on exams, if at all, this is at best an educated guess. To ensure adequate preparation for the exams, we strongly recommend reading all of the syllabus texts directly.
Read Exam Questions Carefully
The committee does not go out of its way to make questions intentionally tricky or misleading to candidates. Riddles can be fun, but not when your career is at stake. But this doesn't mean that all questions will be straightforward and require rote regurgitation of facts. To test deep understanding of the material, some questions may be more challenging than they appear at first glance, and the question will not always directly bring that to your attention. The 15-minute reading period at the beginning of each exam is useful for finding these potentially complex or unusually tough problems. Read through each item carefully and make sure you understand exactly what the question is asking. And if you don't understand, read it again.
In the Spring 2018 sitting, some candidate challenges alluded to reinterpretations of questions. This, frankly, caught us off guard. All of the exams go through many rounds of review and editing before being printed. Despite this, mistakes do happen occasionally. However, even if you think it is painfully obvious that something is wrong in a problem, do not simply assume that we made an error and answer the question you think we intended. If there is a typo that results in a nonsensical answer, even if it's not what we intended, we will always at least accept a response consistent with the literal reading of the question. On constructed-response exam questions, candidates are also welcome to state their assumptions if they think an item is unclear. Graders will consider these assumptions, though simply stating them is no guarantee that they will be accepted as valid. Either way, we encourage candidates to let us know after the exam if they think there's an issue with a question. We will do our best to resolve the issue judiciously and fairly. Caution: do not assume that you know what we "meant," solve that reinterpreted problem without writing down an assumption, and only tell us afterward.
If a candidate believes that an exam question was flawed, we want to hear about it. The most effective way to write a challenge is to first succinctly and clearly state what you believe the issue to be, and second provide any supporting documentation you can. Ideally this will be a specific reference in the syllabus texts that supports your point, but any sort of documentation will be helpful. We consider every challenge fairly and thoroughly, and we want nothing but the most just outcome for everyone.
The CAS exams are created by hard-working volunteers. We donate our time to this effort because we're proud of the CAS and the opportunities that it has given us professionally. Our goal is always to make the best and most relevant set of credentialing exams that we can, and we're constantly striving to improve the process. At times, candidates may view the process as frustrating, confusing, or dysfunctional, but we assure you that we're doing everything we can to make this as painless as possible in the long run. We hope that you will all someday be in a similar position, as credentialed actuaries, to pass along your appreciation for the organization by volunteering for one of its many great committees.
The CAS Board of Directors has approved revisions to the organization’s credentialing requirements effective in 2018, with the creation of two new exams. The new exams, which are intended to address the emerging needs of future actuaries and their employers, will be called Modern Actuarial Statistics I and II (MAS-I and MAS-II).
MAS-I is largely a modification of the current CAS Exam S, which it will replace when it is first offered in the spring of 2018. MAS-II will replace the current CAS Exam 4 requirement, which most candidates fulfill through completion of SOA Exam C, an exam that is being discontinued. MAS-II will cover several topics from Exam C, along with new statistics and predictive analytics material, and will first be offered in the fall of 2018.
The discontinuation of Exam C provided an opportunity for the CAS to create a replacement exam that focuses on the modern statistics that actuaries are increasingly using. This will enhance the relevance of the CAS exam syllabus with respect to emerging statistical and analytics skills, with minimal changes to the overall exam structure.
“The CAS has a duty to the actuarial profession to help ensure that future actuaries are prepared to address the challenges of our changing world,” said Nancy Braithwaite, CAS President. “With our new exams, we are demonstrating our commitment to providing education that is uniquely relevant to property and casualty actuaries so that we continue to meet the needs of our employers and principals.”
Both MAS-I and MAS-II will be four-hour exams resulting in virtually no net increase in exam hours required for CAS credentials. The exams will be offered every six months as multiple-choice paper-and-pencil exams, in the same general windows in the spring and fall in which other CAS exams are offered.
The transition rules allow candidates with credit for Exam S achieved through an examination administered prior to January 1, 2018 to receive credit for MAS-I. Candidates with credit for SOA Exam C achieved through an examination administered prior to July 1, 2018 will receive credit for MAS-II.
The syllabus and learning objectives for Exams MAS-I and II are still being finalized, but in general, candidates should expect MAS-I to be similar to Exam S, but with more emphasis on applied modeling and a deeper coverage of generalized linear models. MAS-II will retain coverage of credibility from Exam C, and will also include advanced statistical topics like Bayesian Markov Chain Monte Carlo (MCMC) methods.
The CAS has announced that it will create a new exam and modify a current exam, and brand these exams as Modern Actuarial Statistics I and II (MAS-I and MAS-II). MAS-I will be nearly equivalent to current CAS Exam S and will first be offered in the spring of 2018. MAS-II will cover much of the current SOA Exam C, which is being discontinued in 2018. The new exam will add statistics and predictive analytics material, and will first be offered in the fall of 2018.
Recent candidate feedback to the Examination Committee questioned an exam passing score, known as a passmark, and the process used to set that score, known as standard setting. Because it is an interesting psychometric issue, the Examination Committee agreed to publish its response to the feedback as an Open Letter so that all candidates could better understand the process.
A candidate had expressed concern with the process used by the CAS to set the exam passing score. The passing score on each CAS exam is directly linked with the minimally-acceptable competency in the aspect of practice the particular exam is assessing. The passing score is objectively set so that each and every examinee who meets or exceeds that passing standard (passmark) will pass the exam. Those who do not meet the standard will not pass the exam.
This process was noted in a 2011 open letter from then-CAS president Pat Teufel, who wrote that “ … the Examination Committee has, as its sole objective, the implementation of an objective, content-based approach to assessing whether candidates have demonstrated the qualifications outlined in the learning objectives …” This was true in 2011 and it remains true today, and it is an important point. Each CAS passmark is individually set by the respective exam committee prior to test administration, with the process overseen by an expert psychometrician, Dr. Richard Fischer, who joined CAS in March 2013 as Director of Admissions.
Since every CAS passmark is an objective standard, every examinee could pass a particular test, or no examinee could pass. Of course, having everyone pass (or fail) a particular exam is never expected, but it is possible, since each passmark is set with respect to minimally-acceptable competency in the aspect of practice the particular exam is assessing. This is known as the “criterion-referenced” approach to testing. Examinees who meet or exceed a properly established passing score (the exam’s passmark) have met the minimum standard of knowledge and professional competence established for that exam, and so they pass.
In contrast, in a “norm-referenced” approach, which the CAS does not follow, one’s performance on a test is judged with respect to how others did on that test, making the evaluation of examinee performance subjective. A norm-referenced passmark is set based on test results so that a particular number of examinees will pass. For instance, under such a “quota system,” the top, say, 60 scorers would pass simply because those 60 scores were the highest on the test, without regard to any minimum competency or proficiency standard. Moreover, since performance is relative to the others who took the test, the “Top 60” in one examinee group may differ from the “Top 60” in another examinee group on the same test, depending on the group means in each administration of the test.
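As a purely hypothetical illustration (the scores, the passmark of 70, and the quota of 60 below are invented and do not reflect any actual CAS procedure or data), the short Python sketch that follows shows how the two approaches classify the same set of scores differently: under a criterion-referenced standard, the number of passers depends only on how many examinees clear the fixed passmark, while under a norm-referenced quota the effective cutoff moves with the strength of the group.

import random

# Hypothetical exam scores for one sitting (illustration only).
random.seed(0)
scores = [random.gauss(65, 10) for _ in range(200)]

PASSMARK = 70   # hypothetical fixed competency standard (criterion-referenced)
QUOTA = 60      # hypothetical "top 60" quota (norm-referenced)

# Criterion-referenced: everyone who meets the fixed standard passes,
# whether that turns out to be nobody or the entire group.
criterion_passers = [s for s in scores if s >= PASSMARK]

# Norm-referenced: the cutoff is whatever score the 60th-highest examinee earned,
# so roughly 60 people pass regardless of how strong or weak the group is.
norm_cutoff = sorted(scores, reverse=True)[QUOTA - 1]
norm_passers = [s for s in scores if s >= norm_cutoff]

print(f"Criterion-referenced: {len(criterion_passers)} of {len(scores)} pass "
      f"(could be anywhere from none to all)")
print(f"Norm-referenced: {len(norm_passers)} of {len(scores)} pass "
      f"(always about {QUOTA}; cutoff of {norm_cutoff:.1f} drifts with the group)")

Rerunning the sketch with a stronger or weaker hypothetical group leaves the criterion-referenced passmark unchanged but moves the norm-referenced cutoff, which is exactly the dependence on group performance described above.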
In summary, the CAS adheres to a “criterion-referenced” approach to testing in which each passmark is set with respect to minimally-acceptable competency. A “norm-referenced” system has no place in professional credentialing, particularly in the CAS, which has been setting the standard of expertise, credibility, and professional integrity for the property and casualty actuarial profession for almost 100 years. As a result, CAS credentials are unmatched for their rigor, integrity, and value.
To further improve transparency and to provide insights into CAS test development and the ways that overall exam quality continues to be improved, we’re developing a series of Future Fellows articles on individual aspects of the process. The first article, on the topic of standard setting, is set for publication in the March 2014 issue.
The Examination Committee is looking forward to sharing test development particulars in subsequent issues of Future Fellows. In the meantime, candidates and members are welcome to submit test development and related psychometric issues of concern and interest to the Examination Committee for possible discussion in a future article. We welcome your questions and feedback through the CAS website.
Some candidates have expressed concern about the CAS’ use of Bloom’s Taxonomy in developing questions. Others have asked that the CAS release candidate papers for upper-level exams. The CAS Examination Committee Officers wanted to respond to these candidate questions.
First, about Bloom’s Taxonomy and the Board’s decision to assess at higher cognitive levels on our upper-level exams.
Cognitive level within Bloom’s Taxonomy refers to the complexity of the mental process needed to complete a task. The higher the cognitive level of a test item, the more complex the mental process needed to correctly answer that question. In moving to higher cognitive levels on exams, the CAS’ goal is to ask questions that represent the complex thinking required of actuaries in the workplace.
However, while the structure and focus of questions are changing, the learning objectives remain the same. The Examination Committee recognizes that these shifts in question types may require candidates to change their study methods and may make past exam questions less useful as study aids. We take into account the higher cognitive level of questions in designing exams, as well as during the grading process.
The CAS Examination Committee and the Candidate Liaison Committee will work together to provide information on the changes in question types, and to assist candidates in targeting their study plans to the new format.
The CAS has no immediate plans to release candidate papers. We will continue to provide detailed sample solutions, as well as the Examiner’s Report, which can serve as study tools for candidates preparing for future exams.
Admittedly, attaining Fellowship can be an arduous process. Fellowship requirements are designed to ensure that fellows have the knowledge and analytic ability to succeed. Our added effort to move exams to a higher cognitive level supports this premise.