The Role of Professional Staff in Assessing Students: A Case Study of the Objective Structured Clinical Exam

Part of the book series: University Development and Administration (UDAA)

Abstract

Conducting an objective structured clinical exam (OSCE) to assess a student’s clinical competency is a complex and dynamic process that requires more than just academic input because of its intricate logistical and technical requirements. Such complexity necessitates the involvement of professional staff, who work collaboratively with academic staff in planning and conducting the OSCE itself – often having direct contact with students leading up to and during the exam. This chapter presents a case study to highlight the integral role of professional staff in the assessment of students undertaking an OSCE at an Australian university. The OSCE process involves a multiplicity of roles and skills, blurring the boundaries between traditional academic and professional staff roles and creating a partnership that arguably promotes mutual respect for the expertise of both groups in higher education. The technical, curriculum, and administrative expertise of professional staff is vital to running an effective OSCE, and professional staff often assume leadership responsibilities during an OSCE to ensure a positive experience for the student. This chapter unpacks the nature of the work and expertise involved in designing, developing, and delivering an OSCE and the range of qualities and skills required to ensure a successful experience for students.

Author information

Corresponding author

Correspondence to Darci Taylor.

Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this entry

Cite this entry

Taylor, D. (2018). The Role of Professional Staff in Assessing Students: A Case Study of the Objective Structured Clinical Exam. In: Bossu, C., Brown, N. (eds) Professional and Support Staff in Higher Education. University Development and Administration. Springer, Singapore. https://doi.org/10.1007/978-981-10-1607-3_4-1

  • DOI: https://doi.org/10.1007/978-981-10-1607-3_4-1

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-1607-3

  • Online ISBN: 978-981-10-1607-3

  • eBook Packages: Springer Reference Education, Reference Module Humanities and Social Sciences, Reference Module Education

Chapter history

  1. Latest
     Published: 30 October 2018
     DOI: https://doi.org/10.1007/978-981-10-1607-3_4-3

  2. Published: 30 June 2018
     DOI: https://doi.org/10.1007/978-981-10-1607-3_4-2

  3. Original
     Published: 14 November 2017
     DOI: https://doi.org/10.1007/978-981-10-1607-3_4-1