J Educ Eval Health Prof.  2011;8:13.

Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates

Affiliations
  1. Gippsland Medical School, School of Rural Health, Monash University, Churchill, Australia. debra.nestel@monash.edu
  2. West Gippsland Healthcare Group, Warragul, Australia.
  3. Imperial College London, London, UK.
  4. Gippsland Regional Clinical School, School of Rural Health, Monash University, Traralgon, Australia.
  5. Department of Family and Community Medicine, University of Toronto, Toronto, Canada.
  6. West Gippsland Hospital, Warragul, Australia.

Abstract

Evaluation of educational interventions often focuses on immediate or short-term metrics of knowledge and skills acquisition. We developed an educational intervention to support international medical graduates working in rural Victoria, and we wanted an evaluation strategy that captured participants' reactions while also considering the transfer of learning to the workplace and the retention of learning. However, with participants in distributed locations and limited program resources, this was likely to prove challenging. We have reported the outcomes of this evaluation elsewhere. In this educational development report, we present our evaluation strategy as a case study, describing its underpinning theoretical framework, the strategy itself, and its benefits and challenges. The strategy sought to address issues of program structure, process, and outcomes. We used a modified version of Kirkpatrick's model as a framework to map our evaluation of participants' experiences, their acquisition of knowledge and skills, and the application of both in the workplace. The predominant benefit was that most of the evaluation instruments allowed the program to be personalized: the baseline instruments provided a broad view of participants' expectations, needs, and current perspectives on their role; the immediate evaluation instruments allowed ongoing tailoring of the program to meet learning needs; and the intermediate evaluations provided insight into the transfer of learning. The principal challenge was the resource-intensive nature of the strategy, which required a dedicated program administrator to manage data collection. Despite this cost, we recommend baseline, immediate, and intermediate data collection points, with multi-source feedback proving especially illuminating. We believe our experiences may be valuable to faculty involved in program evaluations.

Keywords

Medical education; Educational measurement; Medical students

MeSH Terms

Administrative Personnel
Data Collection
Education, Medical
Educational Measurement
Humans
Learning
Program Evaluation
Retention (Psychology)
Students, Medical
Transfer (Psychology)
Victoria

References

1. Calder J. Programme evaluation and quality: a comprehensive guide to setting up an evaluation system. London: Kogan Page Limited; 1995.
2. Freeth D, Hammick M, Reeves S, Koppel I, Barr H. Effective interprofessional education: development, delivery and evaluation. Oxford: Blackwell; 2005.
3. Garman K. Eastside, Westside. An exercise in applying document analysis techniques in educational evaluation. Research on evaluation program paper and report series (No. 78). Portland: Northwest Regional Educational Lab; 1982.
4. Joint Committee on Standards for Educational Evaluation. Standards for evaluation of educational programs, projects and materials. New York: McGraw-Hill; 1981.
5. Mertens D. Research and evaluation in education and psychology: integrating diversity with quantitative, qualitative, and mixed methods. Thousand Oaks: Sage Publications Inc; 2005.
6. Morrison J. ABC of learning and teaching in medicine: evaluation. BMJ. 2003;326:385–7.
7. Owen J. Program evaluation: forms and approaches. 3rd ed. Crows Nest: Allen & Unwin; 2006.
8. Patton M. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks: Sage Publications Inc; 2002.
9. Medical Training Review Panel Overseas Trained Doctor Subcommittee. Overseas trained doctor subcommittee report. [place unknown]: Medical Training Review Panel Overseas Trained Doctor Subcommittee; 2004.
10. Spike NA. International medical graduates: the Australian perspective. Acad Med. 2006;81:842–6.
11. Hawthorne L, Birrell B, Young D. The retention of overseas trained doctors in general practice in regional Victoria. Melbourne: Faculty of Medicine, Dentistry and Health Sciences, University of Melbourne; 2003.
12. Wright A, Regan M, Haigh C, Sunderji I, Vijayakumar P, Smith C, Nestel D. Supporting international medical graduates in rural Australia: a mixed methods evaluation. Rural Remote Health. 2012; in press.
13. Kirkpatrick DL. Evaluating training programmes: the four levels. San Francisco: Berrett-Koehler; 1994.
14. Barr H, Freeth D, Hammick M, Koppel I, Reeves S. Evaluations of interprofessional education. London: United Kingdom Review of Health and Social Care; 2000.
15. Marinopoulos SS, Dorman T, Ratanawongsa N, Wilson LM, Ashar BH, Magaziner JL, Miller RG, Thomas PA, Prokopowicz GP, Qayyum R, Bass EB. Effectiveness of continuing medical education. Evidence Report/Technology Assessment No. 149. AHRQ Publication No. 07-E006. Rockville, MD: Agency for Healthcare Research and Quality; 2007.
16. The Foundation Programme. The foundation learning portfolio [Internet]. Cardiff, UK: The Foundation Programme; 2007 [cited 2011 Nov 29]. Available from: http://www.foundationprogramme.nhs.uk/pages/home/keydocs.