
Curriculum for workplace-based assessments: a Delphi study

Published online by Cambridge University Press:  02 January 2018

Rahul Bhattacharya*
Affiliation:
East London NHS Foundation Trust
Michael Maier
Affiliation:
Imperial College, London
Dinesh Bhugra
Affiliation:
Institute of Psychiatry, King's College, London
James Warner
Affiliation:
Central and Northwest London NHS Foundation Trust
*Correspondence: Rahul Bhattacharya (rahul.bhattacharya@nhs.net)

Abstract

Aims and method

To generate a list of 'core curriculum' topics that can be used as a guide for trainees and trainers carrying out workplace-based assessments (WPBAs). A three-stage Delphi consultation was carried out.

Results

A list of topics for WPBAs appropriate to each year of core training was generated, each with a mean rating of its importance in the curriculum.

Clinical implications

In the absence of formal guidance, the list generated can serve as an informal guide.

Type
Education & Training
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © The Royal College of Psychiatrists, 2010

The impetus for restructuring medical education came from the recognition that acquisition of knowledge does not in itself translate into quality assurance or professionalism. Many of the skills deemed essential for providing excellent clinical care and ensuring medico-legal safety lie beyond the remit of what is tested in the MRCPsych examination. The theoretical basis for this stems from Miller's work, which proposed a hierarchy for the assessment of competencies.1 At the lowest level of this hierarchy is knowledge (knows); followed by competence (knows how); then performance (shows); and finally action (does). To assure quality in training, it is proposed that one needs to focus on what a clinician 'does'. Workplace-based assessments (WPBAs) aim to measure what a clinician 'does' in Miller's pyramid.

Educationists have categorised learning in terms of skills, knowledge and attitudes (Bloom's taxonomy)2 for more than half a century. In 2002, the Chief Medical Officer for England published Unfinished Business, which called for reform of basic specialist training and suggested that the focus should shift towards competencies.3 The General Medical Council (GMC) then published Tomorrow's Doctors, which suggested that the core curriculum should be broadened to incorporate skills and attitudes alongside knowledge.4 Unfinished Business also raised the issue of a lack of standardisation in basic training. These documents set the scene for the Modernising Medical Careers (MMC) programme in 2003. Setting consistent national standards for training was identified as one of the principles of MMC.5 Workplace-based assessments were introduced to postgraduate psychiatric training in the UK in 2007 under MMC, after being piloted with pre-registration foundation-year trainees. The 'Gold Guide', a generic guide to postgraduate specialty training, reiterated the need for trainees to acquire the skills and knowledge appropriate to their particular specialty.6 There was an expectation that the medical Royal Colleges would adapt their postgraduate curricula to meet these changing needs. Consequently, the Royal College of Psychiatrists updated the postgraduate psychiatric curriculum, introducing a competency-based framework in 2007; this was revised to become the Core Curriculum in 2009.7

The psychiatric curriculum prior to these modifications listed the topics to be covered, essentially reflecting a syllabus: a list of topics with a time frame for assessment or examination. The Core Curriculum is a much broader and more detailed document, although it does list some topics in an appendix. With this transition, however, many of the clinical topics in the earlier curriculum or syllabus were lost. Many common mental illnesses are mentioned only nominally or omitted altogether in the Core Curriculum document: schizophrenia, for example, is mentioned twice, whereas depression and dementia are not mentioned at all.

In January 2009, the regulations regarding WPBAs were simplified: candidates are no longer expected to undertake a required number of WPBAs to progress to a particular part of the Membership examination, as long as they progress through the Annual Review of Competence Progression (ARCP).8 The curriculum for postgraduate psychiatric training is developed by the Royal College of Psychiatrists, whereas the assessment of competency for progression to the next year of core specialty training is decided by the Deanery. There is therefore a disconnect between the College-conducted Membership examination and the Deanery-conducted ARCP, which reviews trainees' WPBA results. This lack of alignment between WPBAs and the curriculum makes it possible for trainees to progress to completion of training without being assessed on core competencies such as risk assessment; in theory, a trainee could undertake assessments on the same topic every year, or even on every occasion. In the absence of guidance, trainees, trainers and training schemes decide on the content of the WPBAs carried out at a particular stage of training. This is potentially inconsistent, and inconsistency can defeat the very purpose of the national-level standardisation that the overhaul of medical education was intended to achieve.

Our aim was to involve stakeholders relevant to postgraduate psychiatric training in developing a consensus and generating a list of topics that adhere to the College's postgraduate curriculum, can be tested through WPBAs and can be reviewed by the Deanery, thereby linking workplace-based training to the postgraduate curriculum. The list generated can act as guidance for trainees and trainers carrying out WPBAs. We believe this exercise will also highlight the potential gap between the curriculum and the ARCP process, and may initiate consultation and debate that ultimately results in official consensus guidelines on the matter.

Method

Delphi consultation is an iterative exercise enabling stakeholders to arrive at a consensus on an issue. It is a flexible and participative exercise intended to engage more people than could feasibly meet at one time. The Delphi technique is well recognised as a method for generating healthcare curriculum content9 and has been used to devise postgraduate curricula.10

We carried out three cycles of questioning, each developing further from the responses to the previous cycle. In the first stage, participants were asked to generate a list of topics they thought were of sufficient importance that core trainees should be tested on them at each stage of training. This involved identifying ACE (Assessment of Clinical Expertise), Mini-ACE (Mini-Assessed Clinical Encounter) and CbD (case-based discussion) topics appropriate and important for core trainees in each year (CT1, CT2, CT3). At the end of the first stage, the individual lists of suggested topics were collated into one exhaustive list.

In the second round of consultation this list was circulated among the stakeholders, who were asked to weight the importance of each topic on a five-point Likert scale and, if necessary, to comment on the topics. The rationale for weighting the topics was to develop a 'core curriculum' based on importance. At the end of the second stage the list of topics was refined in line with suggestions and duplicates were removed. An average weighting, or mean Likert score, was attached to each topic in the list; topics that were similar were combined and their mean scores displayed in the final list.
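As a purely illustrative sketch (not part of the study), the weighting step amounts to attaching a mean Likert score to each topic and ranking topics by importance. The Python fragment below shows this calculation for three topics drawn from Box 2; the ratings themselves are invented for illustration.

from statistics import mean

# Each stakeholder rates each topic on a 1-5 Likert scale of importance
# (the ratings below are hypothetical).
ratings = {
    "Risk assessment for suicide or self-harm": [5, 5, 4, 5, 4],
    "Mental State Examination": [5, 4, 4, 5, 5],
    "Writing prescriptions": [3, 4, 3, 4, 3],
}

# Attach a mean Likert score to each topic, as in the refined second-round list.
weighted = {topic: round(mean(scores), 2) for topic, scores in ratings.items()}

# Rank topics by mean importance to inform the 'core curriculum'.
for topic, score in sorted(weighted.items(), key=lambda item: item[1], reverse=True):
    print(f"{score:.2f}  {topic}")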

In the third and final stage, the weighted and refined list was circulated to stakeholders for final comments and ratification. Participants could add new topics at the later stages if they believed a 'core' topic was missing from the list. As consensus was reached at this stage, no further consultation was needed.

The participants included local tutors, trainers, trainees, a university teaching fellow, a medical director and a director of postgraduate medical education of a mental health trust, the Head of the Specialty School of Psychiatry at the London Deanery, and the then Dean of the Royal College of Psychiatrists, who was responsible for overseeing the postgraduate curriculum in psychiatry. Multidisciplinary working and working in partnership with service users are key issues in Tomorrow's Doctors.4 We therefore also sought the opinions of a general practitioner training to become a trainer with the London Deanery, a mental health social worker, a primary care service user and lay member registered with the general practitioner, and a carer representative involved with a north London carer forum. Fifteen stakeholders relevant to postgraduate psychiatric training were identified by authors J.W. and R.B.; R.B. acted as the coordinator for the consultation. Contributions to the consultation were collected anonymously by the coordinator, who did not contribute to the consultation.

Box 1 Topics for Assessment of Clinical Expertise (ACE)

CT1

  1. Assessment for uncomplicated mood/affective disorder and psychosis

  2. Assessment of physical health and examination of a psychiatric patient plus discussion of investigation and management

  3. Assessment of Mental State Examination with understanding of phenomenology in a cooperative patient and implications for diagnosis and management

  4. Clinical encounter focusing on communication skills

CT2

  1. Assessing and managing a psychiatric emergency

  2. Detailed Mental State Examination in an uncooperative patient or patient with mania, interpretation of phenomenology and implications for diagnosis and management

  3. Demonstrating awareness of biopsychosocial approach in devising management plans

  4. Demonstrating ability to devise a diagnostic formulation with awareness of differentials and classification system

  5. Psychiatric assessment of a relatively complex case

CT3

  1. Assessment leading to a management plan based on the evidence base in psychopharmacology and on the biopsychosocial model

  2. Assessment leading to developing a ‘formulation’ for a case including psychological formulation

  3. Psychiatric assessment in a complex case

Results

After the initial survey, 147 topics for WPBAs were identified as suitable for the assessment of CT1–3 trainees using ACE, Mini-ACE or CbD. At the end of the second stage there were 31 ACEs, 55 Mini-ACEs and 48 CbDs, a total of 134 topics after removing duplicates. At the next stage, two new CbDs and one Mini-ACE were added; these therefore do not have a score. At the third stage several topics were also reworded, combined or brought together under broader umbrella topics, so the final list contains 26 ACEs, 52 Mini-ACEs and 47 CbDs including the new additions, a total of 125 topics. Topics generated for ACE are listed in Box 1 (the complete list with weighted scores is given in online Table DS1), those for Mini-ACE in Box 2 (full details with weighted scores in online Table DS2) and those for CbD in online Table DS3. Not all stakeholders responded to every stage of the consultation, although all contributed at least once.

Other forms of WPBA were also suggested. It was suggested that journal club presentations should be tested at each stage of core training with increasing complexity (for example, incorporating literature searches), using the journal club presentation WPBA tool. Similarly, teaching could be stratified across the years of postgraduate training by selecting an appropriate audience: suggested examples included CT1 trainees teaching undergraduates, CT2 trainees teaching foundation-year trainees and CT3 trainees teaching a multidisciplinary audience within the team. Teaching can be assessed using the assessment of teaching tool.

Box 2 Topics for Mini-Assessment of Clinical Expertise (Mini-ACE)

CT1

  1. Risk assessment for suicide or self-harm

  2. Mental State Examination

  3. Demonstration of good communication skills

  4. Understanding the principles of mental health legislation

  5. Obtaining history focusing on an individual component

  6. Writing prescriptions

  7. Brief physical examination

CT2

  1. Assessment of risk to self and others

  2. Physical examination – general physical examination

  3. Management of self-harm

  4. Assessment for detention under mental health legislation

  5. History taking or Mental State Examination

  6. Communication skills

  7. Multidisciplinary working

  8. Appropriate use of investigations (e.g. blood tests)

CT3

  1. Communication skills

  2. Risk assessment in the context of mental health legislation

  3. Ability to choose a treatment option based on evidence (e.g. choosing antipsychotic medication)

  4. Assessing suitability for psychological therapies

  5. Application of evidence base in treatment of an individual patient

  6. Assessment of children referred to mental health services

  7. Prescribing controlled drugs (e.g. methadone or methylphenidate)

  8. Taking personal history with emphasis on interpersonal difficulties

  9. Devising a management plan, including deciding on the level of support/setting of care (e.g. community or hospital)

Discussion

The list generated can act as a guide for trainees and trainers on 'core' topics for WPBAs suited to an individual's training needs at that particular stage of training. Specific WPBAs can be incorporated within the individual trainee's learning objectives. This is important to ensure that the process does not become merely a tick-box exercise producing the right number of WPBAs for the ARCP, but instead provides a meaningful record of curricular progression. The list of topics generated by this exercise is not meant to be exhaustive and does not represent an official curriculum; trainees and trainers should not feel constrained by our suggestions. Instead, the aims are to improve the consistency of WPBAs in postgraduate psychiatric training, to define parameters within which case selection is legitimate, and to offer a guide to what constitutes an adequate mix of cases.

The consultation resulted in an extended list of topics, ranging from those that have traditionally been tested to ones on which few clinicians have previously been formally assessed. Trainees' awareness of these varied topics can now potentially be assessed.

The topics generated for Mini-ACE (Box 2 and online Table DS2) were similar to Objective Structured Clinical Examination (OSCE) or Clinical Assessment of Skills and Competencies (CASC) topics, focusing on communication skills, history taking or more focused assessments including physical examination. The list generated for ACE (Box 1 and online Table DS1) was reminiscent of the long cases previously tested in the Membership examinations, although ACEs are expected to reflect real clinical situations more closely. The CbDs (online Table DS3) offer an opportunity not only to discuss complex ethical issues surrounding management but also to assess professionalism. A formal forum for such exploratory discussions was lost with the abandoning of the Patient Management Plan from the erstwhile MRCPsych Part II examination. Some topics, such as teamworking and leading ward rounds or meetings, were not traditionally assessed; workplace-based assessments give an opportunity to assess such skills, which are now recognised as key competencies within training.

The generation of topics such as prioritising resources, quality markers and performance indicators reflects the changing perceptions of stakeholders in postgraduate training within the National Health Service: trainees are now expected to be aware of the corporate and business management aspects of healthcare.

There was repetition in the content of topics generated for WPBAs aimed at different stages of training (e.g. assessment of risk). This possibly reflects a lack of clarity about the roles and competencies expected at CT1, CT2 and CT3. If the same skill is expected of trainees with increasing sophistication at different stages of training, a further challenge arises: the trainer needs a clear expectation of what constitutes a 'good enough' trainee at each level. Although developing a consensus around these expectations is beyond the scope of this study, it is needed if the reliability of WPBAs is to improve, and assessors need to be aware of this issue. The validity of assessments can potentially be improved through regular training.11 This problem is probably more acute in 'non-procedural' specialties such as psychiatry. Psychiatry therefore needs to adopt the concept of the 'spiral curriculum',12 using the same exercises with increasing complexity over several years of training. This provides an ideal opportunity to use WPBA as a formative tool.

Generating a list of topics does not in itself resolve all the challenges that WPBAs present. It is important to remember that as we ascend Miller's pyramid the reliability of a test decreases, which means WPBAs are less reliable than a written paper testing knowledge. It has been suggested that repeating WPBAs improves their reliability.13 To improve the reliability of assessments and ensure clarity of expectation at each level of training, there is a need for ongoing training of trainers. Training and the exchange of views between trainers are important, as self-assessment and peer evaluation improve the validity of assessment through triangulation.11 Attention to validity and reliability matters because WPBAs were not originally devised to assess postgraduate competencies in psychiatry. At all stages, underperformance needs to be addressed. It remains to be seen whether the new training curriculum and assessment of competencies will enable trainees to meet their learning needs14,15 or, more importantly, produce better psychiatrists.

We believe this exercise synthesises the postgraduate psychiatric curriculum with WPBA requirements to create a practical framework for workplace-based assessment.

Acknowledgements

Contributors to the Delphi consultation: Professor Dinesh Bhugra, President Royal College of Psychiatrists; Dr Michael Maier, Head of London Specialty School of Psychiatry; Dr Alex Lewis, Medical Director; Dr Elaine Arnold, Training Programme Director for Higher Training; Dr Rizkar Amin, Training Programme Director for Basic or Core Training; Dr John Lowe, College Tutor and Consultant in Adult Psychiatry; Dr Gary Wannan, College Tutor and Consultant Child and Adolescent Psychiatry; Dr Abanti Paul, General Practitioner; Dr Caroline Methuen, Teaching Fellow; Dr Alex Bailey, Higher Trainee; Dr Mary Linton, Core Trainee; Ms Lydia De Rieu, Social Worker; Ms Lynn Strother, Carer and Carer Representative; Ms Lillian Worley, Lay Member and Primary Care Service User; Dr James Warner, Director of Postgraduate Medical Education.

We would also like to thank Dr Claire Hilton and Dr Tim Swanwick for their comments.

Footnotes

Declaration of interest

At the time of conducting the consultation D.B. was the Dean of the Royal College of Psychiatrists and M.M. was the Head of the London Specialty School of Psychiatry.

References

1 Miller, GE. The assessment of clinical skills/performance. Acad Med 1990; 65: S63–7.
2 Bloom, BS. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. David McKay Co., 1956.
3 Department of Health. Unfinished Business – Proposals for Reform of the Senior House Officer Grade. Department of Health, 2002.
4 General Medical Council. Tomorrow's Doctors. GMC, 2002.
5 Modernising Medical Careers. About Modernising Medical Careers: The Principles of MMC. MMC, 2007 (http://www.mmc.nhs.uk/medical_education/about_modernising_medical_care.aspx).
6 Department of Health, Department of Health Social Services and Public Safety, National Health Service. The Gold Guide: A Guide to Postgraduate Specialty Training in the UK. Modernising Medical Careers, 2007.
7 Royal College of Psychiatrists. A Competency Based Curriculum for Specialist Training in Psychiatry – Core Module. Royal College of Psychiatrists, 2009.
8 Royal College of Psychiatrists. Eligibility Criteria and Regulations for MRCPsych Examinations. Royal College of Psychiatrists, 2009.
9 Williams, PL, Webb, C. The Delphi technique: a methodological discussion. J Adv Nurs 1994; 19: 180–6.
10 Broomfield, D, Humphris, GM. Using the Delphi technique to identify the cancer education requirements of general practitioners. Med Educ 2001; 35: 928–37.
11 Clarke, R. Foundation Programme Assessments in General Practice. Education for Primary Care. Radcliffe Publishing, 2006.
12 Bruner, J. The Process of Education (2nd edn). Harvard University Press, 1977.
13 Wass, V, Vleuten, CV, Shatzer, J, Jones, R. Assessment of clinical competence. Lancet 2001; 357: 945–9.
14 Talbot, M. Monkey see, monkey do: a critique of the competency model in graduate medical education. Med Educ 2004; 38: 587–92.
15 Rees, CE. The problem with outcomes-based curricula in medical education: insights from educational theory. Med Educ 2004; 38: 593–8.