
Teaching, learning and assessing evidence-based psychiatry

Published online by Cambridge University Press:  02 January 2018

Stuart Carney*
Affiliation:
UK Foundation Programme Office
James Warner
Affiliation:
Central and North West London NHS Foundation Trust and Imperial College London
Sheraz Ahmad
Affiliation:
Charing Cross and St Mary's Higher Specialist Training Scheme, London Deanery/School of Psychiatry
Gianetta Rands
Affiliation:
Camden and Islington NHS Foundation Trust and University College London
Sajid Suleman
Affiliation:
South London and Maudsley NHS Foundation Trust

Summary

This paper sets out the rationale, process for development and the content of the new evidence-based practice syllabus, which is examined as part of the Membership of the Royal College of Psychiatrists' Paper 3. The syllabus was developed by the Critical Review Paper Panel of the Royal College of Psychiatrists. Suggestions for learning and teaching evidence-based practice are also put forward.

Type
Education & Training
This is an Open Access article, distributed under the terms of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © Royal College of Psychiatrists, 2011

Readers of the scientific literature will be acutely aware that publication in a peer-reviewed journal is no guarantee of the quality of a research paper. The tendency for published research funded by the pharmaceutical industry to favour new therapies is well known.1,2 Composite end-points, subgroup analyses and faulty comparators in clinical research reports can mislead the unsuspecting clinician.3 Perhaps one of the most famous examples of potentially inappropriate comparators was the use of high-dose conventional antipsychotics in randomised controlled trials of atypical antipsychotics for the treatment of schizophrenia.4

Recognising the need to ensure that psychiatrists have the skills to make informed judgements about the validity, importance and applicability of research papers, the Royal College of Psychiatrists was one of the first medical Royal Colleges in the UK to introduce a critical review paper as part of its membership examination.5 Initially, this was a 90-minute written examination (short-answer questions) based on published research articles. However, since the critical review paper was first incorporated into the examination, there have been significant changes in postgraduate medical education in the UK, and an increasing recognition of the need for high-stakes examinations to be reliable as well as valid. These changes have seen the critical appraisal examination evolve from a short-answer question format to an electronically marked one (using single-best answer and extended matching item questions) and have crystallised the need for a defined syllabus for both exam setters and candidates.

Although there is no longer a separate critical review paper, this topic is covered in the new Membership of the Royal College of Psychiatrists (MRCPsych) Paper 3 exam, where it contributes around a third of the marks for this written paper. For the past 3 years, the panel responsible for setting this component of Paper 3 has been redesigning the examination format and developing a syllabus to explicitly define what is required. The single-best answer questions and extended matching items are mapped to the syllabus.

The shift to an electronically marked test mirrors practice in the USA. The American Board of Psychiatry and Neurology's (ABPN's) psychiatry part I examination is a 500-item, multiple-choice test administered by computer over 9.5 hours. Evidence-based practice is assessed as part of the epidemiology and public policy section, which accounts for 8% of the Part A examination. The ABPN publishes a content outline for Parts A and B of this examination, but a detailed syllabus is not publicly available.6

In this paper, we describe the evidence-based practice syllabus (formerly the critical appraisal paper syllabus) that is assessed as part of the MRCPsych Paper 3. In addition, we review strategies for teaching, learning and assessing evidence-based practice in psychiatric training.

The evidence-based practice syllabus

Four principles underpinned the development of the evidence-based practice syllabus for Paper 3 of the MRCPsych exam:

  1. face validity - it must cover the knowledge and skills necessary for evidence-based practice;

  2. feasibility - candidates must be able to access training resources to support their learning and it must be possible to formally assess their knowledge and skills;

  3. content coverage - the syllabus must describe the breadth and depth of the knowledge and skills required;

  4. transparency - the syllabus must be published and both learners and trainers must be able to access it.

In this section, we describe how the Critical Review Paper Panel developed this part of the syllabus. The syllabus is set out in an online supplement to this paper and will also be published on the exams section of the Royal College of Psychiatrists’ website (www.rcpsych.ac.uk/exams.aspx).

The evidence-based practice syllabus aims to cover the knowledge and skills that psychiatrists need to use research data to inform their clinical practice for the benefit of patient care. Therefore, the syllabus has been structured around the five steps of evidence-based practice, as recommended in the ‘Sicily statement’:7

  1. translation of uncertainty to an answerable question

  2. systematic retrieval of best available evidence

  3. critical appraisal of evidence for validity, clinical relevance and applicability

  4. application of results in practice

  5. evaluation of performance.

These five steps are integral to the General Medical Council's definition of good clinical care and also to clinical governance.8,9 The Academy of Medical Royal Colleges has incorporated these steps into the evidence and guidelines section of the Common Competences Framework for Doctors (Table 1).10

Table 1 Common Competences Framework for Doctors: evidence and guidelines10

Objectives
• To make the optimal use of current best evidence in making decisions about the care of patients
• To develop the ability to construct evidence-based guidelines and protocols in relation to medical practice

Knowledge
• Outlines the principles of critical appraisal
• Knows the advantages and disadvantages of different study methodologies (quantitative and qualitative) for different types of questions
• Outlines levels of evidence and quality of evidence
• Knows how to apply statistics in scientific medical practice
• Understands the use and differences between the basic measures of risk and uncertainty
• Describes the role and limitations of evidence in the development of clinical guidelines and protocols
• Understands the processes that result in nationally applicable guidelines (e.g. those from NICE and SIGN)
• Knows the principles of service development

Skills
• Able to search the medical literature including use of PubMed, Medline, Cochrane reviews and the internet
• Appraises retrieved evidence to address a clinical question
• Applies conclusions from critical appraisal into clinical care
• Contributes to the construction, review and updating of local (and national) guidelines of good practice using the principles of evidence-based medicine

Behaviours
• Aims for best clinical practice (clinical effectiveness) at all times, as informed by evidence-based medicine
• Recognises knowledge gaps and keeps a logbook of clinical questions
• Keeps up to date with national reviews, key new research and guidelines of practice (e.g. those from NICE and SIGN)
• Recognises the common need to practise outside clinical guidelines
• Communicates risk information, and risk-benefit trade-offs, in ways appropriate for individual patients
• Encourages discussion among colleagues on evidence-based practice
• Proposes and tests ways to improve patient care

The Panel reviewed the syllabus content of the Royal College of Psychiatrists’ curriculum and mapped it to the five steps outlined above. Over the past few years, the Panel has discussed additional content at each meeting and blueprinted proposed questions in the critical review paper against the agreed syllabus. This iterative process has taken account of comments from psychiatrists who volunteered to contribute questions to Paper 3. The performance of individual questions has been reviewed after each sitting of Paper 3 and, where necessary, changes have been made to the syllabus content. Whenever the syllabus was reviewed, the Panel considered two key questions: ‘Do psychiatrists require this knowledge and these skills to practise effectively?’ and ‘Can psychiatric trainees realistically acquire and develop this knowledge and these skills as part of their training?’

Fundamental to the process of developing this syllabus has been a commitment to define the limits of what will be examined. The syllabus is necessary to ensure that trainers and trainees are aware of what they should be learning and also for blueprinting assessment. Inevitably, any syllabus is open to interpretation but the Panel believes that this describes the core evidence-based practice knowledge and skills required for satisfactory completion of basic specialty training in psychiatry.

Teaching, learning and assessing evidence-based practice

Specialty training programmes build upon the knowledge, skills, attitudes and behaviours acquired and developed as an undergraduate and during foundation training. Psychiatric trainees should have access to a range of approaches to continue to develop their competence in evidence-based practice, including:

  1. workplace-based experiential learning

  2. independent self-directed learning driven by clinical questions

  3. taught courses, which model evidence-based practice and describe explicitly the evidence upon which assertions are made.

Clinically integrated teaching on evidence-based practice (that is, basing teaching sessions on encounters with patients on the ward and in clinics, or focused training in clinical ward rounds) has been shown to improve the relevant knowledge, skills, attitudes and behaviours.11 Stand-alone teaching appears to improve knowledge only. Therefore, the predominant mode of evidence-based practice learning (after initial skills training) should be experiential; that is, the five steps described earlier should be applied in the management of current patient problems. Supervision and feedback provide an important opportunity to help develop these skills, in addition to helping doctors in training reflect on their learning needs.

Knowledge and understanding of the concepts and principles of evidence-based practice can be reliably assessed using single-best answer and extended matching item questions. The MRCPsych examination now uses these techniques to assess basic epidemiology, basic biostatistics, qualitative methods, health economics, guideline development and critical appraisal, i.e. the knowledge and skills underpinning evidence-based practice.
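The paper does not specify which item statistics the College uses to judge question performance. As one common approach from classical test theory, the short Python sketch below (the function name and example data are ours, purely illustrative) computes the point-biserial correlation between performance on a single item and candidates' total scores; higher values indicate an item that better separates stronger from weaker candidates.

import statistics

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between one item (1 = correct, 0 = incorrect)
    and candidates' total scores: a common index of item discrimination."""
    correct = [t for i, t in zip(item_scores, total_scores) if i == 1]
    incorrect = [t for i, t in zip(item_scores, total_scores) if i == 0]
    p = len(correct) / len(item_scores)      # proportion answering correctly
    q = 1 - p
    sd = statistics.pstdev(total_scores)     # population s.d. of total scores
    return (statistics.mean(correct) - statistics.mean(incorrect)) / sd * (p * q) ** 0.5

# Illustrative data: 8 candidates' responses to one item and their total scores
item = [1, 1, 1, 0, 1, 0, 0, 1]
totals = [58, 55, 52, 40, 49, 38, 35, 50]
print(f"Discrimination: {point_biserial(item, totals):.2f}")

In classical test theory, values of roughly 0.2 or above are often treated as acceptable discrimination, although conventions vary between examination bodies.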

Although moving away from the short-answer format was initially challenging for the question setters, the College now has an expanding bank of highly reliable questions that discriminate well between good and less able candidates. The evidence-based practice part of Paper 3 comprises 60 questions, taking up about a third of a 3-hour paper. These questions include 8-10 single-best answer questions linked to a short précis (about one page long) of a research paper with a data-set or graph, together with stand-alone single-best answer and extended matching item questions. Examples of this format are provided below.

Sample single-best answer question

Which of the following is the least adequate method of randomisation?

  a. Minimisation

  b. Odd/even last digit of date of birth

  c. Permuted block randomisation

  d. Simple randomisation by computer

  e. Toss of a fair, unbiased coin
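To make the contrast between these options concrete, the following Python sketch (ours, purely illustrative and not part of the examination) implements allocation by odd/even last digit of date of birth alongside simple and permuted block randomisation; minimisation is omitted for brevity. Allocation by date of birth is deterministic and predictable, so allocation concealment is easily subverted, whereas the other methods rely on a random number generator.

import random

def allocate_by_birth_digit(last_digit):
    """Quasi-random allocation by odd/even last digit of date of birth:
    deterministic and predictable, so recruiters can foresee the arm."""
    return "treatment" if last_digit % 2 == 0 else "control"

def simple_randomisation(n, seed=None):
    """Simple (unrestricted) randomisation: each participant allocated
    independently with probability 0.5, like a fair coin toss."""
    rng = random.Random(seed)
    return [rng.choice(["treatment", "control"]) for _ in range(n)]

def permuted_block_randomisation(n, block_size=4, seed=None):
    """Permuted block randomisation: allocations balanced within each block,
    keeping group sizes close to equal throughout recruitment."""
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n:
        block = ["treatment"] * (block_size // 2) + ["control"] * (block_size // 2)
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n]

print(simple_randomisation(10, seed=1))
print(permuted_block_randomisation(10, block_size=4, seed=1))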

Sample extended matching item

Theme: calculations in critical appraisal

Options:

  A. 0

  B. 1

  C. 4

  D. 5

  E. 20

  F. 80

  G. 100

For each of the questions below, select the most appropriate number from the list above.

  1. The usual upper limit of risk of type II error (expressed as a percentage) in power calculations for randomised clinical trials.

  2. The ideal number needed to treat (NNT).

  3. The sensitivity of a test, expressed as a percentage where 80 people were classified ‘true positive’ and 20 people were classified ‘false negative’.

And so on, for six to eight questions per extended matching item.
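For readers revising these calculations, the short Python sketch below works through the kinds of quantities such items draw on: sensitivity, the relationship between power and type II error, and the number needed to treat. The function names and example figures are ours for illustration, not model answers to the item above.

def sensitivity(true_positives, false_negatives):
    """Sensitivity = TP / (TP + FN): the proportion of people with the
    disorder whom the test correctly identifies."""
    return true_positives / (true_positives + false_negatives)

def number_needed_to_treat(control_event_rate, experimental_event_rate):
    """NNT = 1 / absolute risk reduction; an NNT of 1 means every patient
    treated benefits."""
    return 1 / (control_event_rate - experimental_event_rate)

# Sensitivity with 80 true positives and 20 false negatives: 80 / (80 + 20)
print(f"Sensitivity: {sensitivity(80, 20):.0%}")

# Power is conventionally set at 80-90%, i.e. a type II error (beta) of 10-20%
beta = 0.20
print(f"Power: {1 - beta:.0%}")

# NNT when the event rate falls from 40% (control) to 20% (treatment)
print(f"NNT: {number_needed_to_treat(0.40, 0.20):.0f}")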

Despite the changes to the exam, specialty training programmes must also assess whether specialty registrars are competent in practice. The Royal College of Psychiatrists has identified nine tools for workplace-based assessments in psychiatry training. Only the case presentation tool explicitly asks about ‘interpretation of clinical evidence’, although evidence-based practice skills could be highlighted in a case-based discussion, journal club presentation or the mini-Peer Assessment Tool (mini-PAT) multisource feedback. Requiring psychiatric trainees to produce critically appraised topics could provide another means of assessing skills in evidence-based practice.

Conclusion

It is essential that all psychiatrists use the best available evidence to inform patient care. The knowledge and skills required for evidence-based practice are comprehensively examined as part of MRCPsych Paper 3. The College now has an evidence-based practice syllabus for this exam and the revised format works well. Psychiatrists should consolidate and develop their evidence-based practice skills and behaviours throughout their training and their careers.

Footnotes

Declaration of interest

J.W. is chair and S.C., S.A., G.R. and S.S. are members of the MRCPsych Critical Review Paper Panel.

References

1 Djulbegovic, B, Lacevic, M, Cantor, A, Fields, K, Bennett, C, Adams, J, et al. The uncertainty principle and industry-sponsored research. Lancet 2000; 356: 635–8.
2 Perlis, R, Perlis, C, Wu, Y, Hwang, C, Joseph, M, Nierenberg, A. Industry sponsorship and financial conflict of interest in the reporting of clinical trials in psychiatry. Am J Psychiatry 2005; 162: 1957–60.
3 Montori, V, Jaeschke, R, Schunemann, H, Bhandari, M, Brozek, J, Devereaux, P, et al. Users' guide to detecting misleading claims in clinical research reports. BMJ 2004; 329: 1093–6.
4 Geddes, J, Freemantle, N, Harrison, P, Bebbington, P. Atypical antipsychotics in the treatment of schizophrenia: systematic overview and meta-regression analysis. BMJ 2000; 321: 1371–6.
5 Leung, W, Whitty, P. Is evidence-based medicine neglected by royal college examinations? A descriptive study of their syllabuses. BMJ 2000; 321: 603–4.
6 American Board of Psychiatry and Neurology. Part I Examination in Psychiatry A and B Content Outline 2011. ABPN, 2011 (http://www.abpn.com/Initial_Psych.htm).
7 Dawes, M, Summerskill, W, Glasziou, P, Cartabellotta, A, Martin, J, Hopayian, K, et al. Sicily statement on evidence-based practice. BMC Med Educ 2005; 5: 1.
8 General Medical Council. Good Medical Practice. GMC, 2001.
9 Scally, G, Donaldson, L. Clinical governance and the drive for quality improvement in the new NHS in England. BMJ 1998; 317: 61–5.
10 Academy of Medical Royal Colleges. Common Competences Framework for Doctors. AMRC, 2009 (http://www.rcpe.ac.uk/training/files/ccfd-august-2009.pdf).
11 Coomarasamy, A, Khan, K. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ 2004; 329: 1017.

Supplementary material

Carney et al. supplementary material (PDF, 37.5 KB) is available with the online version of this paper.