Abstract
Medical educators must make decisions on trainee progression and credentialing for independent clinical practice, which requires robust evidence from workplace-based assessment. It is unclear how the current promotion of workplace-based assessment as a pedagogical approach to support learning has affected its use for decision-making; meeting both purposes may present unforeseen challenges. In this study we explored how supervisors make decisions on trainee progress in practice. We conducted semi-structured interviews with 19 supervisors of postgraduate anesthesia training across Australia and New Zealand and undertook thematic analysis of the transcripts. When making performance decisions, supervisors looked beyond the formal assessment portfolio. Instead, they used assessment ‘shadow systems’ based on their own observations and confidential judgements from trusted colleagues. Their decision making involved expert judgement of the perceived salient aspects of performance and the standard to be attained, while allowing for the opportunities and constraints of the local learning environment. Supervisors found making progress decisions an emotional burden. When faced with difficult decisions, they found ways to share the responsibility and to balance the potential consequences for the trainee against the need to protect their patients. Viewed through the lens of community of practice theory, the development of assessment ‘shadow systems’ indicates a lack of alignment between local workplace assessment practices and the prescribed programmatic assessment approach to high-stakes progress decisions. Avenues for improvement include cooperative development of formal assessment processes to better meet local needs, or incorporation of the information in ‘shadow systems’ into formal assessment processes.
Acknowledgements
The authors wish to thank the ANZCA Clinical Trials Network for their assistance in recruiting participants for this study.
Funding
This study was supported by an untied grant from the ANZCA Research Foundation (Grant No: S16/043).
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
DC and JW hold voluntary positions on ANZCA Educational Committees. Otherwise, all authors report no declarations of interest in this work.
Ethical approval
Ethics approval was granted by Monash University Human Research Ethics Committee (Reference: 2016000919) and the University of Auckland Human Participants Ethics Committee (Reference: 017408).
Informed consent
Informed consent was obtained from all individual participants included in the study. All procedures performed were in accordance with the ethical standards of the institutional ethics committees and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Castanelli, D.J., Weller, J.M., Molloy, E. et al. Shadow systems in assessment: how supervisors make progress decisions in practice. Adv in Health Sci Educ 25, 131–147 (2020). https://doi.org/10.1007/s10459-019-09913-5