
A national training program for simulation educators and technicians: evaluation strategy and outcomes

Abstract

Background

Simulation-based education (SBE) has seen a dramatic uptake in health professions education over the last decade. SBE offers learning opportunities that are difficult to access by other methods. Competent faculty is seen as key to high quality SBE. In 2011, in response to a significant national healthcare issue (the need to enhance the quality and scale of SBE), a group of Australian universities was commissioned to develop a national training program: the Australian Simulation Educator and Technician Training (AusSETT) Program. This paper reports the evaluation of this large-scale initiative.

Methods

The AusSETT Program adopted a train-the-trainer model, offering up to three days of workshops and between four and eight hours of e-learning. The Program was offered across all professions in all states and territories. Three hundred and three participants attended workshops, with 230 also completing e-learning modules. Topics included foundational learning theory, orientation to diverse simulation modalities, briefing and debriefing. A layered objectives-oriented evaluation strategy was adopted, with multiple stakeholders (participants, external experts), multiple methods of data collection (end-of-module evaluations, workshop observer reports and individual interviews) and multiple data points (immediately and two months after the Program). Descriptive statistics were used to analyse numerical data, while textual data (written comments and transcripts of interviews) underwent content or thematic analysis.

Results

For each module, between 45 and 254 participants completed evaluations. The content and educational methods were rated highly, with most items exceeding the pre-established standard. In written evaluations, participants identified strengths (e.g. high quality facilitation, breadth and depth of content) and areas for development (e.g. the electronic portfolio, the learning management system) of the Program. Interviews with participants suggested the Program had positively impacted their educational practices. Observers reported a high quality educational experience for participants, with alignment of content and methods with perceived participant needs.

Conclusions

The AusSETT Program is a significant and enduring learning resource. The development of a national training program to support a competent simulation workforce is feasible, and the Program objectives were largely met. Although there are limitations to the study design (e.g. reliance on self-report), there are also strengths, such as exploring impact two months after the Program. The evaluation informs the next phase of the national strategy for simulation educators and technicians with respect to content and processes, strengths and areas for development.


Background

Educators may require unique and specific pedagogical skills to effectively support learning through simulation. As McGaghie et al. state, simulation-based education (SBE) “is not easy or intuitive; clinical experience alone is not a proxy for simulation instructor effectiveness” [1]. As simulation is increasingly used in health professional education, there is a concomitant need for appropriately skilled teachers. While there are many faculty development programs in SBE, we could not locate any published theoretical or empirical reports about them. From an international perspective, Navedo and Simon (2013) provide a descriptive account of thirteen multi-day continuing education simulation instructor courses [2]. Although it is unclear how the programs were identified (or sampled), they write, “Typically, these programs are on-site, intensive experiences with established and well-defined learning outcomes.” (p596) The authors report features common to the courses such as, “overviews in teaching and learning theory, introduction to simulation-based learning, orientation to the equipment, debriefing fundamentals, and management of common problems and offer hands-on opportunities to practice.” (p596) For simulation educators who seek further scholarly advancement, the authors describe eight award programs with key components on SBE. Zigmont et al. (2015) describe the content of basic and advanced courses for simulation educators [3], recommending Kern et al.’s (1998) six steps for curriculum design [4] and mapping course content to the blueprint of the Certified Healthcare Simulation Educator standards of the Society for Simulation in Healthcare (http://www.ssih.org/Certification/CHSE). Nestel et al. (2013) also offer a descriptive account, but of a national strategic approach to SBE and the role of programmatic factors [5]. They argue for enabling connections between national, state and local initiatives such that programs are not seemingly ad hoc but are coordinated and facilitated through the development of communities of practice. Reviews of effective SBE cite the need for targeted training for faculty [1, 6].

In Australia, the healthcare simulation education community has experienced a period of particularly rapid growth. This is in part a response to the actions of Health Workforce Australia (HWA), a body “established to meet the future challenges of providing a health workforce that responds to the needs of the Australian community” [7]. Disestablished in late 2014, HWA was federally funded and closely linked with state and territory governments. Like most Australian bureaucracies, HWA faced the challenge of ensuring equity across a nation with a huge geographic area but with most of the population concentrated in major coastal cities. Additionally, it had a broad focus, including issues such as enhancing clinical training, workforce planning analysis and supporting the role of international health professionals. Within its program of clinical training reform [7], HWA contributed more than $90 million over three years to enhance SBE within Australia as part of developing the health workforce.

The AusSETT Program

In 2010, HWA commissioned surveys of health professional curricula in order to establish current and potential uses of simulation. The reports, which included medical and nursing schools, identified the key issue of insufficient trained faculty to maximise the benefits of SBE [8]. Responding to this issue in 2011, HWA funded a consortium of Australian organisations, each of which delivered postgraduate courses in simulation, to develop a national train-the-trainer program for simulation educators and simulation technicians. The latter were considered to be a group of faculty who provided technical support in the implementation of SBE. The resulting Australian Simulation Educator and Technician Training (AusSETT) Program aimed to provide experienced simulation faculty with a curriculum and skillset to train others (see Table 1). The Program was designed to have relevance to all health professions, at all levels of training and across every state and territory in Australia.

Table 1 Participants’ ratings of the extent to which they met learning objectives for the modules in the AusSETT program

HWA’s original proposal was that AusSETT alumni would support the training of 6000 further educators and technicians through a separately funded program. National train-the-trainer programs are rare in this field and we have not identified any national programs for simulation educators and simulation technicians anywhere else in the world. This paper describes the AusSETT Program and the associated objectives-oriented program evaluation.

The institutional members responsible for developing the AusSETT Program came from four states and included leading simulation education groups. The development team comprised ten authors with clinical (e.g. medicine, nursing, paramedicine, operating theatre technician), content (e.g. human factors, communication, assessment) and process (e.g. education, simulation modality) backgrounds. Authors have experience of several simulation modalities, including manikins, task trainers, hybrid simulation, simulated patients, confederates, virtual patients and virtual environments, and in different settings such as purpose-built centres, non-specialist settings and in situ. They have between six and 35 years’ experience of using simulation as an educational method to support learning in healthcare. All authors lived and worked in Australia at the time of development. The ten reviewers had a similar profile, although three were from Canada and the United Kingdom.

The development team designed a curriculum with three key features. First, the Program emphasised educational principles based on published healthcare simulation literature. Second, the Program included a broad coverage of simulation modalities including manikins, task trainers, simulated patients and virtual environments. Third, the Program comprised both e-learning and workshops. This approach ensured that the Program was accessible and applicable to diverse simulated learning environments. There were two shared core modules for educators and technicians (C1 and C2). Educators and technicians each completed four additional modules (E1 to E4 and T1 to T4 respectively). Each module was designed to take between four and eight hours. Table 2 summarises the module content and format. Overall goals of the Program were to enable participants to:

Table 2 Brief description of AusSETT Program modules
  • Apply knowledge and skills in the design and delivery of SBE; and

  • Practice contemporary training approaches that influence, motivate and inspire learners.

The Program combined e-learning and workshops and was free to participants. HWA networks nominated participants. The learning management system was custom designed, while the e-portfolio was a proprietary product (PebblePad) [9]. Workshops were conducted across the country with between six and 28 participants at each. Over ten months, 230 educators completed the Program.

In this paper we report the evaluation addressing the following questions:

  • To what extent can a national program for healthcare simulation educators and technicians prepare participants to support others in using SBE?

  • What are the strengths and limitations of the Program as identified in structure, process and outcome factors?

Methods

We adopted an objectives-oriented program evaluation approach [10], exploring structure, process and outcome [11]. In the context of the AusSETT Program, examples of structure include the range of skills and abilities of the team to develop and deliver the Program, the sequence and validity of the content (including the learning objectives), the scope of the Program within the time frame and the physical infrastructure for delivery. Examples of process include the educational methods, such as the e-learning, the workshop format, the e-portfolio and feedback mechanisms. Outcome refers to the extent to which the Program goals were met. These mainly focused on outcomes for participants and were made explicit in the learning objectives (Table 1). However, there were likely to be unexpected outcomes too. It was beyond the scope of our evaluation to monitor these changes, but we anticipated identifying some through qualitative interviews with participants.

Data were collected using a range of methods, which are described below. Instruments are summarised in Table 3 and examples are provided in Additional file 1.

Table 3 Summary of instruments used in the evaluation of the AusSETT Program

Module review

Program partners and external subject matter experts reviewed each module prior to delivery to ensure content was contemporary, accurate and contextualised. External reviewers were identified through collegial networks and published literature. Written feedback was sought and used to adjust content and methods. Internal review was undertaken throughout the development and implementation process: at meetings, in response to specific requests for feedback, and at workshops. Although all modules were sent externally for review with the same instruction, not all received formal written responses. Authors who nominated subject matter experts and wrote personally to the reviewers received detailed written feedback. Feedback was iteratively incorporated into the modules, and the text of the feedback was thematically analysed.

Baseline questionnaires

Online questionnaires were completed by all participants and faculty before commencing the Program and sought demographic and professional characteristics, and current practices of SBE (Additional file 1). Data were analysed using descriptive statistics in SPSS and content analysis.

Observation of workshops

In each state and territory, an external observer was invited to observe workshops (Additional file 1). The eighteen observers were selected based on their expertise in health professional education, simulation education and higher education. Observers used a semi-structured template for each workshop to enable consistency; they were not trained in its use and instead relied on their expert judgement. Free text comments were collated and content analysis was used to identify commonly recurring themes.

End of module evaluations

On completion of each e-learning module, participants were asked to rate the degree to which they had met the learning objectives and the value of the educational methods in attaining them. Prior to the Program, we aspired for mean scores to equal or exceed 4.5 on the 6-point rating scale (1 = not at all met to 6 = completely met); that is, for participants to report more than ‘moderate’ achievement of the learning objectives. Participants were also asked to identify five things they learned during the module that would inform their ability to train others, what worked well in the module and what needed to be improved (Additional file 1). Data were analysed using descriptive statistics. Conventional content analysis was used to collate free text comments [12].
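The quantitative side of this analysis is straightforward to reproduce. The following minimal sketch (written in Python with pandas rather than the SPSS used by the authors; the objective labels and ratings are hypothetical) computes per-objective descriptive statistics and flags mean ratings that fall below the 4.5 standard.

```python
import pandas as pd

STANDARD = 4.5  # aspired mean rating on the 6-point scale

# Hypothetical long-format data: one row per participant rating of an objective.
ratings = pd.DataFrame({
    "objective": ["C1.1", "C1.1", "C1.1", "C1.2", "C1.2", "C1.2"],
    "rating":    [5, 4, 6, 4, 5, 3],  # 1 = not at all met ... 6 = completely met
})

summary = ratings.groupby("objective")["rating"].agg(
    n="count", mean="mean", median="median", sd="std"
)
# 95% confidence interval via the normal approximation (reasonable for the
# large module samples; a t-based interval would suit small n better).
half_width = 1.96 * summary["sd"] / summary["n"].pow(0.5)
summary["ci_low"] = summary["mean"] - half_width
summary["ci_high"] = summary["mean"] + half_width
summary["meets_standard"] = summary["mean"] >= STANDARD

print(summary.round(2))
```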

Individual interviews

The goal of the individual telephone interviews was to gain deeper insight into participants’ experiences. Interviews were also used to interrogate other evaluation data (Additional file 1). Purposeful sampling identified participants across jurisdictions, health professional groups and levels of experience in simulation-based education. A topic guide was developed with the overall Program goals in mind and to explore ‘unexpected’ outcomes. Telephone interviews were conducted by an independent researcher employed for the task and were scheduled for up to one hour; they took place two months after the completion of the Program. Interviews were recorded and transcribed prior to thematic analysis [13]. JH and DN undertook the thematic analysis using an inductive approach. Key themes were identified independently and then negotiated. The data were then checked again for confirming and disconfirming evidence. An audit trail of the analysis was maintained.

Numerical data were analysed by JH, while JH and DN together undertook the analysis of textual data. The Monash University Human Research Ethics Committee approved the study (#CF12/0244-2012000099). Written consent was obtained from participants for the inclusion of their data in this study.

Results

Module review

Recommendations included raising the profile of interprofessional SBE, distinguishing essential from recommended or optional readings, indicating areas of overlap between modules, providing a glossary of terms and checking that content addressed general rather than specific principles of SBE. These structural recommendations were addressed.

Baseline questionnaire

The demographic data of the participants are presented in Table 4. Percentages are calculated on those responding to each item, with absolute numbers presented in brackets. Sixty-three percent (n = 172) of the participants were female and 37 % (n = 103) were male; the remainder did not respond to this item. Thirty-seven percent (n = 100) were 40 years of age or younger and 63 % (n = 168) were 41 years of age or older. Seventy-eight percent (n = 235) of participants chose to complete the educator stream and 22 % (n = 65) opted for the technician stream. Fifty-five percent (n = 153) were employed by public health care services, while 29 % (n = 80) and 8 % (n = 23) were employed by universities and private health care services, respectively. The remainder were employed by Technical and Further Education (TAFE) Colleges (4 %, n = 10) and other organisations (4 %, n = 11). Over half of the participants (52 %, n = 133) taught both undergraduate and postgraduate students, 28 % (n = 73) taught only at postgraduate level and 20 % (n = 51) taught only undergraduates; the balance did not respond.

Table 4 Participants’ demographic information

The majority of participants had a nursing or midwifery background (57 %, n = 161), 17 % (n = 49) had a medical background and 12 % (n = 34) were from allied health with fourteen professions noted, including physiotherapy, paramedicine and occupational therapy. Thirteen percent (n = 37) described educational, administrative or other non-clinical backgrounds.

On simulation experience, the majority of participants (53 %, n = 135) spent ten hours or less per month on simulation activities, including 25 participants with no simulation experience. Seven percent (n = 18) indicated they spent over 80 hours per month on simulation activities, suggesting a full-time simulation education position. Forty percent (n = 102) of participants had previous simulation training, such as a debriefing course, an Advanced Life Support instructor course, train-the-trainer courses or fellowships. Of these, nineteen had undertaken or were undertaking postgraduate studies in simulation.

Table 5 provides a content analysis of free text responses on participants’ views of simulation and the AusSETT Program; the ‘top five’ themes are presented. This analysis indicates that skills such as debriefing and learner engagement were the most challenging areas for participants; that manikin-based simulation was the modality most commonly used by participants; and that 61 % of participants enrolled in the AusSETT Program to improve their own skills, knowledge or understanding, with 19 % enrolling to train others. Additionally, participants appreciated that the Program was free, that they were nominated to attend and that they had dedicated time to spend in professional development.

Table 5 Content analysis of participants’ views on simulation and the AusSETT Program

Observation of workshops

The key outcomes of the observer reports suggested that the educational methods supported participants in meeting the learning objectives; that is, there was alignment between content, process and perceived participant needs. Observers noted effective management of group dynamics, the creation of safety for participants and the benefits of learning across consecutive days, with relationships forming between participants and faculty. Recommendations included making better use of e-learning in workshop activities, a common template for all slides, extended time for discussion of local issues, and longer review time for some experiential activities.

End of module evaluations

Table 1 summarises the participants’ evaluation of the attainment of learning objectives. There were 58 learning objectives in total, with 17 in the core modules, 27 in the educator modules and 14 in the technician modules. Overall, 37 learning objectives exceeded a mean rating of 4.5, while the remaining 21 had means in the range of 4.15 to 4.43; that is, mean ratings for all objectives exceeded 4, and most exceeded the pre-established standard of 4.5. For the educator modules E2 (7 objectives) and E4 (10 objectives), all learning objectives exceeded 4.5. For the technician modules, all objectives in T1 exceeded 4.5, two of the four objectives in T2 exceeded 4.5, and no objective in T3 or T4 reached this standard. Within the core modules (C1 and C2), one of the highest rated learning objectives was to describe various types of simulation activities, with a mean of 4.72 (Median = 5.00, SD = 0.95, 95 % confidence interval = 4.60 to 4.84). The lowest rated learning objective was to develop an evaluation plan of a simulated learning event, with a mean of 4.15 (Median = 4.00, SD = 1.12, 95 % confidence interval = 4.01 to 4.29).

The highest rated learning objective within the educator stream was to describe a systematic approach to simulated patient (SP) training for role portrayal, with a mean of 5.06 (Median = 5.00, SD = 0.84, 95 % confidence interval = 4.85 to 5.26). The lowest rated learning objective within the educator stream was to teach simulation educators to review virtual environments for SBE (Mean = 4.25, Median = 4.00, SD = 1.14, 95 % confidence interval = 3.90 to 4.60). In the technician stream, the highest rated learning objective was to describe the concept of fidelity in relation to the [specialist/technician] role, with a mean of 5.13 (Median = 5.00, SD = 0.67, 95 % confidence interval = 4.93 to 5.32). The lowest rated learning objective was to describe the basics of programming manikins (Mean = 4.17, Median = 5.00, SD = 1.14, 95 % confidence interval = 4.26 to 4.89).
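The paper does not state how these confidence intervals were derived; assuming the conventional normal-approximation interval for a mean, they follow from the reported statistics as

$$\bar{x} \pm 1.96\,\frac{s}{\sqrt{n}}.$$

For example, the reported interval of 4.60 to 4.84 around the mean of 4.72 (SD = 0.95) has a half-width of about 0.12, which this formula reproduces with roughly n ≈ 240 respondents, consistent with the module evaluation response numbers reported above.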

Content analysis

Qualitative responses to the module evaluations covered three areas: i) what worked well; ii) what could be improved; and iii) things that participants had learned. As the data collected provided a large body of information, only selected responses are included.

What worked well

There were many successes with the modules, such as the “open” small and large group discussions, high quality facilitation, the opportunity to learn from each other, the debriefing exercises (especially observing different styles of debriefing), the use of non-clinical scenarios for learning about SBE principles, the e-learning exercises and debriefing resources, the reference to educational theory relevant to SBE and the breadth of simulation modalities.

These successes are illustrated by examples of text from a core module (C2):

“It was a great discussion-based learning experience, drawing on others’ knowledge and experiences to create themes around simulation and debriefing; The faculty were extremely supportive”

“Different simulation modalities and the understanding that you don’t need the best simulator to achieve the outcome”

“I enjoyed the hands on part of the module where we designed a simulation exercise and needed to give feedback; Also the chance to meet other people trying to do the same thing or with other ideas/experience was great; As I am very new to this area it was all a bit daunting, so hearing some of their tips/techniques was helpful”

What can be improved

Participants identified the need to improve the learning management system and the use of the e-portfolio, and asked for more time to complete the preparation required prior to attending workshops.

“The response boxes need to allow for editing. After brainstorming some ideas and submitting I realised that I could not re-enter to finalise my ideas; Scrolling between 2 pages to re-read quotes then meant that what I had entered into the text box had been lost; Quite frustrating…” (C1)

“The computer interface; Many of my responses were partially or fully deleted on submission; I was unable to then edit the response after submission and 90 % of my scenario was not loaded; I found this extremely frustrating and do not feel the saved answers reflect well either what I actually wrote or my effort in this module.” (E1)

“Couldn’t access the pebblepad” (C2)

“I have had difficulties accessing many of the links and the audio component, so that made it difficult; There was also a huge amount of readings required making it very lengthy to work through” (E2)

Some participants also expressed the need for information about when the next phase of the Program would commence.

“I think AusSETT needs to have a clearer idea of where this project is going, as the presenters just didn’t have the answers about the next few steps; I worry that by the time the project is developed fully, I will have forgotten what I am supposed to train others in.” (C2)

Content analysis of participants’ reported learning

Participants were asked to list up to five things they had learned during each module. This provided further insight into whether the objectives of the sessions were met. Only responses from a core module (C2) are presented, as the data collected yielded a large amount of information. Some of these statements overlapped with the successes reported above. The top five themes (practical SBE skills, reinforcement of important processes in SBE, access to resources for SBE, theories that inform SBE and generic teaching approaches) suggest that the module offered consolidation and/or development of knowledge and skills (Table 6). The module also provided helpful resources for the participants.

Table 6 Content analysis of top five things learned from Module C2: Training simulation educators (n = 102)

Individual interviews

Nine interviews were carried out between August and November 2012. Each interview took between 18 and 30 minutes, with 239 minutes of data obtained in total. Feedback from the interviews triangulated the end-of-module evaluation data: overall, participants reported positive thoughts and feelings about the Program, reinforcing the earlier data. The interviews identified an overarching theme, the impact of the Program, with three specific areas of impact: a) personal; b) organisational; and c) professional community. Verbatim statements from the interview data are presented in Table 7. Participants reported that the Program had boosted their confidence as trainers in SBE. The Program reinforced as well as expanded their knowledge of SBE, increasing their interest in similar programs for furthering their professional development. These individuals reported that they became “champions” for the use of SBE within their organisations. Some reported that their organisations had taken an active interest in SBE as a direct consequence of the AusSETT Program. The Program also encouraged interprofessional learning within their organisations. Development of a community of practice was a further outcome: some participants reported that they had kept in regular contact with others to exchange ideas and information on SBE, and that the AusSETT Program offered a shared language as well as a repository of resources.

Table 7 Verbatim statements from interviews with participants with key themes

Discussion

The results support the notion that the AusSETT Program was an important national initiative. The interviews and observations indicate the value that participants placed upon the Program and how access to the Program generated significant goodwill in the healthcare simulation professional community. The overwhelming sense was that participants had high standards, were aspirational for their practices and wanted to make a difference using simulation as an educational method. However, the impact of the Program cannot be fully measured until the participants themselves offer simulation educator and technician training.

Structural factors

Many of the Program’s strengths were primarily structural. They included the building of a significant and enduring repository of contemporary resources (e-learning and workshops) and its delivery in urban and rural regions, across jurisdictions, simulation modalities and professions. The Program also had a national profile, reflected in the representativeness of the development team. The module review process further strengthened these structural factors.

Enabling structural factors from participants’ perspectives included being nominated to attend the Program, dedicated time away from work to focus on professional development, the networking associated with spending time immersed in SBE and the Program being offered free of charge. Access to e-learning was another enabler. However, the e-learning also represented a significant structural constraint, particularly with respect to the functionality of its features. Additionally, participants largely resisted the use of PebblePad, which did not seem related to the tool itself but to its use in this particular context. At a higher level, a constraining factor was that, at the time of Program delivery, the next steps were not known to the Program faculty. Although HWA had longer-term goals, the mechanisms of government, especially in relation to budgets, did not permit more timely sharing of plans. This created a barrier to engagement with the Program by individuals and institutions.

Process factors

Participants and workshop observers identified the active learning opportunities as a strength of the Program. Indeed, participants identified as a key point of learning a series of ‘processes that worked well’, which were modelled during Program delivery. The need for skilled facilitators was reinforced by workshop observers; this may be a key enabling process factor for any future Program.

Outcome factors

Participants reported that they largely met the learning objectives. Where ratings fell short of our pre-set standard, this may have related to the relative inexperience of some participants. Although observers reported alignment of content with the perceived needs of participants, this may not have been achieved in every workshop, as not all workshops were observed.

It is worth noting that there was no assessment associated with the AusSETT Program, so we have no way of knowing whether participants achieved competence relevant to the learning objectives. This may be seen as both an enabling factor (in terms of access) and a constraining factor (in terms of accreditation), depending on perspective. Our philosophical view was that we wanted to provide continuing professional development and that, at this nascent stage of development of a national community of practice of healthcare simulation educators and technicians, assessment may not have been well received. Also, within the funding model it was not possible to offer robust assessment, although this may be a future step.

Overall, the evaluation results give some indication that a national program for simulation educators and technicians is valuable for supporting others in using simulation as an educational method, at least from the perspectives of those who participated as future ‘trainers’. There is one major caveat. Many participants had no or limited experience with simulation and very few had formal qualifications. Indeed, the most common reason given for enrolling in the Program was to “develop, expand or consolidate” skills. The AusSETT Program was intended for experienced simulation practitioners who could support others, not for novices. In many ways this suggests that a ‘training’ program, rather than a ‘train-the-trainer’ (‘educate-the-educator’) program, may be more aligned with participants’ needs. The question of training others will only be fully answered once the AusSETT alumni have had an opportunity to offer professional development to other cohorts. However, there is no question that the Program promoted individual development and ignited interest in SBE in local facilities.

The contractual agreement strongly influenced some elements of the Program’s development and implementation, including some of the language used to describe simulation practices, such as the terms ‘educator’ and ‘technician’. Although the opportunity to pilot the Program is likely to have reduced the scale of iterative development, the actual adjustments to the Program were minimal. The funding body requested a train-the-trainer model, but in many respects we adopted an ‘educate-the-educator’ perspective, especially with respect to sharing the theoretical underpinnings of many facets of simulation-based education.

Strengths and limitations of the study

The strengths of the study were the objectives-oriented program evaluation design, the mixed methods, the multiple sources of data (participants and subject matter experts) and the multiple data points: baseline, immediately after the e-learning and workshops and, for some participants, in the longer term (two months).

Limitations of the study are that not all participants gave permission to publish their data, which accounts for some of the discrepancies in reporting. The data are self-reported and may not reflect the actual practices of participants. As internal evaluators, our own biases and assumptions may have influenced the data analysis and what we have chosen to share from the very large data set; however, we employed an independent researcher (JH) to play a leading role in data analysis. Finally, in the absence of data-driven accounts of similar programs, it is not possible to discuss our findings in that context.

Conclusions

In summary, the evaluation of a national ‘train-the-trainer’ program for simulation educators and technicians suggested the value of, and need for, a nationwide approach to training educators in simulation methodologies. The multi-layered objectives-oriented evaluation identified strengths and weaknesses of the Program, with specific recommendations for improving its content and processes. We hope that sharing the details of the learning objectives and our broader experiences may inform others embarking on the development of simulation educator and technician training courses. The ‘national’ identity was achieved through the scale and diversity of the Program in terms of simulation modalities, geographical spread, breadth of professional disciplines, range of educational methods and multi-institutional development.

References

  1. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63. doi:10.1111/j.1365-2923.2009.03547.x.

  2. Navedo D, Simon R. Specialized courses in simulation. In: Levine A, DeMaria S, Schwartz A, Sim A, editors. The Comprehensive Textbook of Healthcare Simulation. New York: Springer; 2013. p. 593–7.


  3. Zigmont JJ, Oocuma N, Szyld D, Maestre J. Educator training and simulation methodology courses. In: Palaganas J, Maxworthy J, Epps C, Mancini M, editors. Defining Excellence in Simulation Programs. Philadelphia: Wolters Kluwer; 2015. p. 546–57.


  4. Kern D, Thomas P, Howard D, Bass E. Curriculum development for medical education: A six-step approach. Baltimore: Johns Hopkins University Press; 1998.


  5. Nestel D, Watson M, Bearman M, Morrison T, Pritchard S, Andreatta P. Strategic approaches to simulation-based education: A case study from Australia. J Health Spec. 2013;1(1):4–12.


  6. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. doi:10.1080/01421590500046924.


  7. Health Workforce Australia. Health Workforce Australia 2012 [cited 2012 October 29]. Available from: https://www.hwa.gov.au/sites/uploads/hwa-work-plan-2012-13-approved-SCoH-20120810.pdf.

  8. Health Workforce Australia (HWA). Use of Simulated Learning Environments (SLE) in Professional Entry Level Curricula of selected professions in Australia: HWA; 2010. Available from: http://www.hwa.gov.au/sites/uploads/simulated-learning-environments-2010-12.pdf.

  9. Pebble Learning Ltd. Pebble Learning Home. Telford, UK: Pebble Learning Ltd; [updated 2015; cited 2014 Dec 23]. Available from: http://www.pebblelearning.co.uk/.

  10. Fitzpatrick J, Sanders J, Worthen B. Program-Oriented Evaluation Approaches. In: Fitzpatrick J, Sanders J, Worthen B, editors. Program Evaluation: Alternative Approaches and Practical Guidelines. 4th ed. Upper Saddle River, NJ: Pearson Education; 2011. p. 153–71.


  11. Weiss C. Evaluation Research: Methods of Assessing Program Effectiveness. Englewood Cliffs, NJ: Prentice Hall; 1972.


  12. Hsieh H, Shannon S. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.


  13. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3:77–101.



Acknowledgments

The Program was funded by Health Workforce Australia through the Department of Health. We thank our colleagues in several jurisdictions, including Harry Owen, Joanna Tai, Matt Shuker, Lisa McCoy, Stephanie O’Regan and Nigel Choong, for their valuable contributions.

Author information


Corresponding author

Correspondence to Debra Nestel.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

DN was involved in the development, implementation and evaluation of the AusSETT Program, devised the evaluation strategy and drafted the manuscript with JH. MB, PB, KF, DC, JG, BJ, LR, CR, CS, BS, MW were involved in the development and implementation of the AusSETT Program. MB was involved in drafting the manuscript. JH and DN analysed the data. All authors read and approved the final manuscript.

Additional file

Additional file 1: Baseline questionnaires on participants’ views of simulation and the AusSETT Program. (PDF 45 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Nestel, D., Bearman, M., Brooks, P. et al. A national training program for simulation educators and technicians: evaluation strategy and outcomes. BMC Med Educ 16, 25 (2016). https://doi.org/10.1186/s12909-016-0548-x
