Raising the standards of educational evaluation and research

Published online by Cambridge University Press:  02 January 2018

David Cottrell*
Affiliation:
School of Medicine, University of Leeds, 12A Clarendon Road, Leeds LS2 9NN (e-mail: d.j.cottrell@leeds.ac.uk)
Type
Editorial
This is an Open Access article, distributed under the terms of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © The Royal College of Psychiatrists, 2003

This is a time of significant change in medical education. Tomorrow's Doctors (General Medical Council, 1993) had a major impact on undergraduate education that is still reverberating around medical schools. Changes in post-qualification training have seen the development of some psychiatry house officer posts, restructuring of senior house officer training, and the introduction of the specialist registrar grade and the Certificate of Completion of Specialist Training (CCST). Revalidation, consultant appraisal and the personal development plan are likely to presage significant changes in the way that consultant psychiatrists plan their continuing professional development (Royal College of Psychiatrists, 2001). The College is consulting on a document that will set out the core competencies for psychiatry and all its sub-specialities. Such a specification is likely to have far reaching implications for what psychiatrists need to learn and how they will be taught.

A parallel development has seen the introduction of new teaching methods that focus on encouraging learners to find things out for themselves. Self-directed learning, project work in groups, problem-based learning and the promotion of ‘finding-out skills’ are now prioritised. However, evaluation of these new teaching methods is often not rigorous — we may have evidence-based clinical practice, but what about evidence-based education? (Hutchinson, 1999; Petersen, 1999; Wilkes & Bligh, 1999; Lilley, 2000; Prideaux, 2002).

Petersen (1999) notes that survival of a surgical procedure is rarely seen as qualifying a person to perform that procedure. All doctors have been successful medical students, but by the same logic, survival of medical education should not in itself qualify doctors to teach. The task of teaching others requires training and expertise — an expertise that is recognised in the skills required for the CCST and in the new consultation paper on core psychiatric competencies. Equally, those of us entrusted with the task of teaching deserve access to high-quality evaluations of teaching methods, to allow us to select the best methods to use when teaching others.

The ‘Education & Training’ section has become a regular feature of the Bulletin. In 2001, eight articles were published under this banner with a further nine articles explicitly about education or training being found under the headings ‘Special Articles’ or ‘Original Papers’. Other published papers were indirectly related to training matters. Unfortunately, many other papers on this subject were submitted and rejected because they contained no more than a brief description of an educational programme and a summary of learner feedback. The Royal College of Psychiatrists is committed to raising standards of education and training and, to this end, the Psychiatric Bulletin is seeking to raise the standards of published evaluation and research into teaching methods.

Good educational research can range from naturalistic studies, including detailed observational descriptions of teaching, through controlled comparisons of educational experience and outcomes, to experimental studies such as randomised controlled trials of different educational interventions. Both qualitative and quantitative methods have their place — evaluation of teaching process and learner experience is as important as evaluation of outcome. Audits of training experience, such as surveys of learners, have much to contribute but, as with articles about clinical audit, they need to ‘close the loop’: relating results to existing standards of educational practice, or generating new standards for others to audit.

What good-quality evaluations should have in common is an attention to the theory behind the educational process and a proactive evaluation strategy that considers, at the outset, how learning and teaching will be evaluated rather than tagging on an evaluation once teaching is complete. Most studies will not be randomised controlled trials; these are relatively infrequent in medical education (Petersen, 1999). However, many of the criticisms of their use (problems with randomisation, difficulties with ‘blinding’, number and complexity of other variables, difficulties in specifying and measuring outcomes and problems with manualising interventions) will be familiar to psychiatrists who have been involved with evaluation of psychological treatments. Murray (2002) provides a good overview of some of these problems and suggests that educational research shares many similarities with health services research. Interventions are multi-factorial and take place in the real world where economic, political and social factors may change during the study period, making interpretation of the results complex. Psychiatric experience in running complex trials of psychotherapies may enhance our ability to conduct controlled trials in education — both seek to measure the effectiveness of interventions designed to bring about behavioural change.

The British Medical Journal has already published guidance for authors of papers seeking to describe the evaluation of educational interventions (Abbasi & Smith, 1999). Such papers should have clear aims, appropriate design, samples, measures and analysis, and a structured discussion. An adaptation of these guidelines is a useful guide for readers and contributors to the Bulletin (Box 1). It is to be hoped that the coming years will see an increase in both the quality and quantity of published research into psychiatric education.

Box 1. Suggested guidance for evaluators of educational research*

Aims
  1. Are the aims and objectives clearly stated?

  2. Is the educational rationale clearly stated?

  3. Is the educational intervention described in context?

Design
  1. Is the method described in detail?

  2. Does the study design allow the questions posed to be answered?

  3. Are the methods used for sample recruitment described in sufficient detail?

  4. Was the evaluation method planned in advance and linked to the aims of the study?

  5. Are the outcome measures appropriate to the aims of the study?

  6. Are the measures appropriate for the specified outcomes?

  7. Are the results meaningful?

  8. If an audit, are new standards generated or results related to existing standards?

Discussion
  1. Are strengths and weaknesses of the study discussed in relation to other studies?

  2. Are the meaning and implication of the study discussed?

  3. Is there a discussion of the need for further work?

Declaration of interest

None.

References

Abbasi, K. & Smith, R. (1999) Guidelines for evaluating papers on educational interventions (editorial). BMJ, 318, 1265–1267.
General Medical Council (1993) Tomorrow's Doctors. London: General Medical Council.
Hutchinson, L. (1999) Evaluating and researching the effectiveness of educational interventions. BMJ, 318, 1267–1269.
Lilley, P. (2000) Best evidence medical education (BEME): report of meeting 3–5 December 1999, London, UK. Medical Teacher, 22, 242–245.
Murray, E. (2002) Challenges in educational research. Medical Education, 36, 110–112.
Petersen, S. (1999) Time for evidence based medical education. BMJ, 318, 1223–1224.
Prideaux, D. (2002) Researching the outcomes of educational interventions: a matter of design. BMJ, 324, 126–127.
Royal College of Psychiatrists (2001) Good Psychiatric Practice: CPD (Council Report CR90). London: Royal College of Psychiatrists.
Wilkes, M. & Bligh, J. (1999) Evaluating educational interventions. BMJ, 318, 1269–1272.