J Educ Eval Health Prof. 2018;15:17. doi: 10.3352/jeehp.2018.15.17.

Examiner seniority and experience are associated with bias when scoring communication, but not examination, skills in objective structured clinical examinations in Australia

Affiliations
  • 1Clinical Skills Teaching Unit, Prince of Wales Hospital, Sydney, Australia.
  • 2University of New South Wales, Sydney, Australia.
  • 3Prince of Wales Clinical School, University of New South Wales, Sydney, Australia.
  • 4Office of Medical Education, University of New South Wales, Sydney, Australia. b.shulruf@unsw.edu.au
  • 5Centre for Medical and Health Sciences Education, University of Auckland, Auckland, New Zealand.

Abstract

PURPOSE
The biases that may influence objective structured clinical examination (OSCE) scoring are well understood, and recent research has attempted to establish the magnitude of their impact. However, the influence of examiner experience, clinical seniority, and occupation on communication and physical examination scores in OSCEs has not yet been clearly established.
METHODS
We compared the mean scores awarded for generic and clinical communication and physical examination skills in 2 undergraduate medicine OSCEs in relation to examiner characteristics (gender, examining experience, occupation, seniority, and speciality). The statistical significance of the differences was assessed using 2-tailed independent t-tests and analysis of variance (ANOVA).
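To illustrate the kind of comparison described above, here is a minimal sketch in Python using SciPy's standard tests. All scores, group sizes, and variable names are fabricated assumptions for illustration only; they are not the study's data.

```python
# Minimal sketch of the analyses described in the Methods, using
# simulated, entirely hypothetical scores (the dataset is not public).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical generic-communication scores, grouped by examiner seniority.
junior_scores = rng.normal(7.2, 1.0, size=120)  # placeholder values
senior_scores = rng.normal(6.8, 1.0, size=117)

# Two-tailed independent t-test (ttest_ind is two-tailed by default).
t, p = stats.ttest_ind(junior_scores, senior_scores)
print(f"t = {t:.2f}, p = {p:.3f}")

# One-way ANOVA for a characteristic with more than 2 levels,
# e.g. occupation: academic vs clinician vs clinical tutor.
academic = rng.normal(7.0, 1.0, size=80)
clinician = rng.normal(7.1, 1.0, size=90)
tutor = rng.normal(6.9, 1.0, size=67)
f, p_anova = stats.f_oneway(academic, clinician, tutor)
print(f"F = {f:.2f}, p = {p_anova:.3f}")
```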
RESULTS
Five hundred and seventeen students were examined by 237 examiners at the University of New South Wales in 2014 and 2016. Examiner gender, occupation (academic, clinician, or clinical tutor), and job type (specialist or generalist) did not significantly impact scores. Junior doctors gave consistently higher scores than senior doctors in all domains, and this difference was statistically significant for generic and clinical communication scores. Examiner experience was significantly inversely correlated with generic communication scores.
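The inverse association reported above is a correlation analysis. A hedged sketch follows, again with fabricated numbers purely to show the computation; Pearson's r is used here as an assumption, since the abstract does not state which correlation coefficient the study applied.

```python
# Hypothetical sketch: testing an inverse correlation between examiner
# experience and mean generic-communication score, with made-up data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years_experience = rng.integers(0, 25, size=237).astype(float)
# Simulate a weak negative relationship, for illustration only.
mean_comm_score = 7.5 - 0.03 * years_experience + rng.normal(0, 0.8, 237)

r, p = stats.pearsonr(years_experience, mean_comm_score)
print(f"r = {r:.2f}, p = {p:.3f}")  # negative r indicates an inverse correlation
```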
CONCLUSION
We suggest that the assessment of physical examination skills may be less susceptible to bias because the process is fairly prescriptive, affording greater scoring objectivity. To reduce bias in OSCE assessment, we recommend examiner training that defines the marking criteria, the teaching curriculum, and the expected level of performance in communication skills.

Keywords

Examiner; Bias; Communication; Examination; Objective structured clinical examination; Australia

MeSH Terms

Australia*
Awards and Prizes
Bias (Epidemiology)*
Curriculum
Humans
New South Wales
Occupations
Physical Examination

Cited by 2 articles

Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen, Sun Huh
J Educ Eval Health Prof. 2021;18:11. doi: 10.3352/jeehp.2021.18.11.

Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen, Sun Huh
J Educ Eval Health Prof. 2021;18:23. doi: 10.3352/jeehp.2021.18.23.

