
Examiner Newsletter, Vol. 1, No. 1, April 2018

What’s new and what do you need to know

Welcome to the first issue of the Royal College Examiner Newsletter. We created this venue to improve our communication with our examiners and meet some of your needs. Feel free to reach out to us if you need information or clarification, or if you have ideas for future articles. We promise to keep topics brief and avoid overwhelming your already busy inboxes; we'll share only what we think you really need to hear about.

Thank you for all that you do for the Royal College, medical education in Canada and the public we serve! Your role as examiners is critical to our mission and helps maintain the public trust in the profession.

Farhan Bhanji, MD, MSc (Ed), FAHA, FRCPC
Associate Director, Examination Strategy

Eryn McDonell, BSc
Coordinator Evaluation, Exam Quality and Analytics (EQA)

Exam testing

Formal reviews and examiner conduct

How can I contribute to fair exams and minimize the risk of an appeal?

The number of formal review requests is rising substantially.

Examiners need to take care in how they interact with exam candidates: examiner conduct can affect candidates' perception of the fairness of Royal College exams.

Candidates can request that the Royal College formally review a process irregularity that they believe caused them to fail their exams. If the Royal College's investigation finds that the candidate did experience a process irregularity that may have affected their exam result, it will grant a "no standing" on the exam, refund the fees and add another year of eligibility. This process is critical for demonstrating the fairness of exams to candidates.

In about half of the formal review requests, the process irregularity relates to some aspect of examiner conduct before, during or after the exam. For example, candidates have raised concerns about negative interactions with examiners before the exam and about examiner demeanour during the exam, saying it showed a bias against them.

The formal review process is not the only reason examiner conduct matters. For candidates, examiners are the face of the Royal College: their conduct influences candidates' perception of the fairness and validity of exams and the overall reputation of the Royal College. Beyond the process actually being fair, candidates need to perceive the exam process as fair to all candidates, whether or not they pass their exams.

Given the significant effect of examiner conduct, the Royal College has started educating examiners on how their actions can affect the reputation of the Royal College and the formal review process. We look forward to continuing our education with our examiners.

The Royal College thanks you for your cooperation in this new process.

The future of exams in a CBD world

Medical education is undergoing a major transformation with the launch of Competence by Design (CBD)! Anesthesiology and Otolaryngology — Head and Neck Surgery are our pioneer specialties, having launched in July 2017.

Workplace-based assessment will increase substantially under CBD. This will help both in determining the competence of residents and in providing them with concrete, specific feedback to help them improve.

Despite this increased emphasis on workplace-based assessment, there will continue to be a role for Royal College exams as a standardized, third-party, unbiased assessment of competence in the certification process, one that is needed to maintain public trust. We may revisit the role of exams in the future, after we have data on the effectiveness of CBD assessments, but that will require time and a change in societal expectations.

In transitioning to CBD, we need to revisit our exams and ensure that they meet the needs of the new educational system. Additionally, as we prepare to move to computer-based exams, we need to standardize our practices across Royal College exams. Many of the differences between the 67 disciplines are historical, and that variance puts us at risk of making errors.

We have identified some of the key issues exam boards are facing as they transition to CBD. While this is not an exhaustive list, as examiners, you need to be aware of these issues and changes.

  • Blueprints need to be revised, so they can best supplement and support the CBD assessment program.
  • Answerless review or exam quality review will be required for all exams.
  • Scores for written and applied (i.e., oral, OSCE or practical) exams will no longer be combined to determine the pass or fail decision of candidates. For those specialties that use written and applied exams, candidates will need to pass the written before taking the applied exam.
  • To simplify decision rules for passing an exam, candidates must score at least 70 per cent to pass any exam. There will not be multiple decision rules, such as the number of stations passed, and grids will no longer be used.
  • Exams will occur earlier in training, ideally in the Core phase for primary specialties and within Core or Transition to Practice for subspecialties.
  • Discussions of borderline candidates will no longer take place. Multiple internal and external reviews, legal counsel and exam and assessment committees raised concerns about these discussions, which have been the reason for successful appeals. We will continue to improve the quality of the exams so boards can trust the outcomes of their exams.
  • Transparency for candidates will be improved: we will publish sample questions, better reflect the blueprint on our website, and make clear how candidates will be marked and what the pass score is.
  • Examiners will use a global rating scale to score applied exams because it better assesses candidates. Checklists reward thoroughness rather than the higher-order thinking that characterizes experts.
  • Specialties need to consider the costs of exam development and delivery (as this is passed on to candidates) in determining what exam components are necessary.
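The simplified decision rules above (a single 70 per cent pass mark, no combining of written and applied scores, and the written exam as a prerequisite for the applied exam) can be sketched as a short decision function. This is a minimal illustration only; the function name and score values are hypothetical, not Royal College data or software.

```python
def exam_outcome(written_pct, applied_pct=None):
    """Illustrative sketch of the simplified CBD-era decision rules:
    one 70% pass mark per component, scores never combined, and the
    written exam must be passed before the applied exam is taken."""
    PASS_MARK = 70.0  # single threshold for every exam component

    if written_pct < PASS_MARK:
        # Failing the written bars the candidate from the applied exam
        return "fail written (not eligible for applied exam)"
    if applied_pct is None:
        # Specialty has an applied component the candidate has not yet taken
        return "pass written (applied exam pending)"
    if applied_pct < PASS_MARK:
        return "fail applied"
    return "pass"

# Hypothetical candidates
print(exam_outcome(65.0))        # fail written (not eligible for applied exam)
print(exam_outcome(82.0, 71.5))  # pass
```

Note that under the old rules a strong applied score could offset a weak written score; under the sketch above each component stands alone against the same 70 per cent threshold.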

New learning activity for new examiners

The Examiner Onboarding Program is an online learning activity for new Royal College examiners. The aim is to support them by providing key foundational knowledge for their important role.

In this pilot year, the program will be offered to newly appointed examiners only. We will identify and register all new examiners automatically. We may extend access for existing examiners in subsequent years, depending on feedback and perceived value.

Your examination and assessment committees developed the objectives of this pilot, which in turn determined the content. They reviewed the alpha version of the program and were pleased with the quality and interactivity of the modules.

This program has received MOC Section 3 accreditation, just in time for the launch of a new exam season.

The program consists of eight modules and one master test.

1. Introduction to Royal College Exams
  • Purpose of exams
  • Criteria for good assessment
  • Roles and responsibilities
  • Exam development and administration process
2. Validity
  • What is validity?
  • How to determine whether a Royal College examination is valid
3. Current Topics in Exam Quality
  • Exam review processes
  • Rating scales
  • Rater calibration
4. Basic Psychometrics
  • Introduction to psychometrics
  • Criterion-referenced vs. norm-referenced
  • Statistics
  • Classical test theory
5. Royal College Exam Policies
  • Terms of reference
  • Examiner conduct before, during and after exams
  • Confidentiality and conflict of interest
  • Maintenance of Certification (MOC)
6. Creating MCQ items
  • Characteristics of well-constructed MCQs (multiple-choice questions)
  • Steps in developing MCQs
  • Guidelines for assessing MCQs
7. Creating SAQ items
  • Characteristics of well-constructed SAQs (short-answer questions)
  • Steps in developing SAQs
  • Guidelines for assessing SAQs
8. Creating Oral and OSCE Cases
  • What is an applied exam?
  • Case development
  • Case administration

The program will be launched on April 13 in English and French. Stay tuned!

If you have any questions, please contact Monica Rodriguez at


Cybersecurity: Keeping your computer and the Royal College exams safe

The world of cybersecurity has changed. Hackers are no longer attacking the defensive measures and tools implemented by IT; they are going around them by targeting us individually, because we have direct access to the information and resources they want.

Why the change? Because it’s easier. We can be tricked. We are trusting, helpful and curious by nature. We open email messages and click on links. Hackers use those opportunities to install malware on our computers, and then they exploit that control.

Think about what you have access to — confidential data like exam questions, personal information on our colleagues (Fellows) and your own personal information (bank accounts, contact lists, photos, etc.). If that data were to be posted online, what damage could it cause?

What you can do

  • Stop, look and think before you respond to a suspicious email or click on links in an email, plug in a USB key, or connect to public Wi-Fi.
  • Consider the potential risks of clicking on a suspicious link and delete the email instead.
  • Protect your computer by installing an antivirus tool and keeping all your software up to date.
  • Make sure you have good backups of your data and test them regularly.
  • Do not store exam questions outside of Royal College systems. Use RCeds and Alfresco to store questions and only transmit them through those tools or by encrypted email.

If you have questions about sending documents securely, reach out to your exam facilitators or


Royal College International sees increased interest in exams

Royal College International (RCI) continues to see increased interest in the Royal College's exam processes and offerings. Recognized internationally as a leader in postgraduate medical education (PGME), the Royal College's exam services are viewed as a gold standard in the PGME field.

In 2016, RCI entered into a long-term collaboration with the Kuwait Institute for Medical Specialization to provide assessment and exam improvements for 21 medical specialties.

Although this project is still in its infancy, its success has led other countries to seek similar value-added improvements. Most notably, Qatar and Oman have expressed interest in working with the Royal College to enhance their assessment and exam offerings. The proposed programs could include training Omani and Qatari examiners in the development of exam content and improving the delivery of specialty exams within their respective countries.

To support the current growth in the Royal College's assessment and exam offerings, the Examinations unit will be growing: the team expects to add two exam facilitators and one data analyst before the end of March to support the Royal College's expanding international partnerships.

For more information about Royal College International and opportunities to get involved, please email