
Scripted Debriefing for Resuscitation Training: A scoping review: EIT 6413 TF ScR


This CoSTR is a draft prepared by ILCOR to allow the public to comment and is labeled “Draft for Public Comment”. The comments will be considered by ILCOR. The next version will be labeled “Draft” to comply with the copyright rules of journals. The final CoSTR will be published on this website, labeled “Final”, once a summary article has been published in a scientific journal.

Conflict of Interest Declaration

The ILCOR Continuous Evidence Evaluation process is guided by a rigorous ILCOR Conflict of Interest policy. The following Task Force members and other authors declared an intellectual conflict of interest, which was acknowledged and managed by the Task Force Chairs and Conflict of Interest committees: Adam Cheng was the first author of two of the included articles. He was not involved in the assessment of those studies or the data extraction; however, he participated as an expert in the Task Force discussion. All other authors declared no conflicts of interest.

Task Force Synthesis Citation

Yiqun Lin, Robert Greif, Andrew Lockey, Adam Cheng on behalf of the International Liaison Committee on Resuscitation Education, Implementation and Teams Task Force.

Scripted debriefing for resuscitation training: A scoping review and Task Force Insights: International Liaison Committee on Resuscitation (ILCOR) Education, Implementation, and Teams Task Force, 2023 November 1. Available from: http://ilcor.org

Methodological Preamble and Link to Published Scoping Review

The search strategy was reviewed by a librarian (Caitlin McClurg) at the Health Sciences Library, University of Calgary, with the involvement of members of the ILCOR Education, Implementation, and Teams (EIT) Task Force working on this scoping review (Y.L., A.C., R.G., A.L.). Current literature relevant to scripted debriefing in resuscitation training was sought in the Medline, EMBASE, and Scopus databases. After title screening and full-text assessment, data from studies containing evidence on scripted debriefing during resuscitation training were extracted by the writing group and presented to the EIT Task Force for discussion. The task force insights of the final scoping review were discussed and agreed upon during EIT Task Force meetings and approved by the ILCOR Science Advisory Committee.

Scoping Review

The scoping review citation and link to PubMed will be inserted here when/if it becomes available.

PICOST

The PICOST (Population, Intervention, Control, Outcome, Study Design and Timeframe)

Population: Healthcare providers or laypeople receiving resuscitation training (primary), and instructors teaching resuscitation courses (secondary)

Intervention: Debriefing with a cognitive aid, checklist, script or tool

Comparisons: Debriefing without the use of a cognitive aid, checklist, script or tool

Outcomes: Primary population: (1) Patient outcomes [CRITICAL]; (2) Improved resuscitation performance in clinical environments [CRITICAL]; (3) Improved learning outcomes (knowledge and skill acquisition and retention) [IMPORTANT]; (4) Learner satisfaction [IMPORTANT].

Secondary population: (5) Quality of teaching / debriefing [IMPORTANT]; (6) Workload / Cognitive load of instructor/debriefer [IMPORTANT]

Study Designs: Randomized controlled trials (RCTs) and non-randomized studies (non-randomized controlled trials, interrupted time series, controlled before-and-after studies, cohort studies) were eligible for inclusion. Unpublished studies (e.g., conference abstracts, trial protocols) and grey literature were excluded. All relevant publications in any language were included as long as there was an English abstract available.

Timeframe: All years and all languages were included as long as there was an English abstract.
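
Purely as an illustrative sketch, and not part of the Task Force methodology, the PICOST elements above can also be expressed as a single structured record so that screening decisions reference one machine-readable eligibility specification; the field names and Python encoding below are assumptions made for illustration.

# Illustrative only: the PICOST above expressed as a structured record.
# Field names and this encoding are hypothetical, not part of the ILCOR methodology.
from dataclasses import dataclass

@dataclass
class Picost:
    population: list      # primary and secondary populations
    intervention: str
    comparison: str
    outcomes: dict        # outcome description -> priority (CRITICAL / IMPORTANT)
    study_designs: list
    timeframe: str

eit_6413 = Picost(
    population=[
        "Healthcare providers or laypeople receiving resuscitation training (primary)",
        "Instructors teaching resuscitation courses (secondary)",
    ],
    intervention="Debriefing with a cognitive aid, checklist, script or tool",
    comparison="Debriefing without a cognitive aid, checklist, script or tool",
    outcomes={
        "Patient outcomes": "CRITICAL",
        "Resuscitation performance in clinical environments": "CRITICAL",
        "Learning outcomes (knowledge and skill acquisition and retention)": "IMPORTANT",
        "Learner satisfaction": "IMPORTANT",
        "Quality of teaching / debriefing": "IMPORTANT",
        "Workload / cognitive load of instructor or debriefer": "IMPORTANT",
    },
    study_designs=[
        "Randomized controlled trial", "Non-randomized controlled trial",
        "Interrupted time series", "Controlled before-and-after study", "Cohort study",
    ],
    timeframe="All years and all languages, provided an English abstract is available",
)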

Search Strategies

Summary of the databases that were searched and important search terms. Clinical terms such as ‘resuscitation’ and ‘cardiac arrest’ were not included as search terms to maximize the breadth of the search.

Medline (Last search date: Apr 18, 2023)

1."cognitive aid".kf,tw.

2. script*.kf,tw.

3."cognitive tool".kf,tw.

4.(debriefing adj4 tool*).kf,tw.

5.(debriefing adj4 algorithm).kf,tw.

6.(debriefing adj4 checklist).kf,tw.

7.(debriefing adj4 form).kf,tw.

8.(debriefing adj4 aid).kf,tw.

9.(debriefing adj4 guide).kf,tw.

10.(debriefing adj4 template).kf,tw.

11.(debriefing adj4 model).kf,tw.

12.1 or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10 or 11

13.debrief*.kf,tw.

14.exp Computer Simulation/

15.(12 and 13) not 14

EMBASE (Last search date: Apr 18, 2023)

1. debrief*.kf,tw.

2. script*.kf,tw.

3. "cognitive aid".kf,tw.

4. (cognitive adj4 tool).kf,tw.

5. (debriefing adj4 tool).kf,tw.

6. (debriefing adj4 template).kf,tw.

7. (debriefing adj4 model).kf,tw.

8. (debriefing adj4 checklist).kf,tw.

9. (debriefing adj4 algorithm).kf,tw.

10. (debriefing adj4 script).kf,tw.

11. (debriefing adj4 form).kf,tw.

12. (debriefing adj4 guide).kf,tw.

13. (debriefing adj4 aid).kf,tw.

14. 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10 or 11 or 12 or 13

15. 1 and 14

16. exp computer simulation/

17. 15 not 16

Scopus (Last search date: Apr 18, 2023)

( TITLE-ABS-KEY ( debriefing ) AND TITLE-ABS-KEY ( "scripted" OR "script" OR "cognitive guidance" OR "debriefing guidance" OR "cognitive aid" OR "debriefing algorithm" OR "debriefing checklist" OR "debriefing template" OR "debriefing tool" OR "debriefing model" ) )
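
As a minimal, hypothetical sketch of the record-handling step that typically follows such searches, and not a description of the tools actually used by the writing group, exports from the three databases could be merged and deduplicated before title and abstract screening; the file names and column labels below are assumptions.

# Illustrative sketch only: merging database exports and removing duplicates
# before title/abstract screening. File names and column labels are hypothetical.
import csv

def load_records(path, source):
    # Assumes a CSV export with 'title' and 'doi' columns (hypothetical format).
    with open(path, newline="", encoding="utf-8") as f:
        return [{**row, "source": source} for row in csv.DictReader(f)]

def dedupe(records):
    # Prefer the DOI as the duplicate key; fall back to a normalized title.
    seen, unique = set(), []
    for rec in records:
        key = (rec.get("doi") or "").strip().lower() or \
              " ".join((rec.get("title") or "").lower().split())
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = (load_records("medline.csv", "Medline")
           + load_records("embase.csv", "EMBASE")
           + load_records("scopus.csv", "Scopus"))
unique_records = dedupe(records)
print(f"{len(records)} records retrieved, "
      f"{len(records) - len(unique_records)} duplicates removed, "
      f"{len(unique_records)} left for title/abstract screening")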

Inclusion and Exclusion criteria

Inclusion criteria:

Studies comparing the use of debriefing scripts, tools, cognitive aids or checklists to debriefing without any adjuncts.

The context of studies was resuscitation training, including adult and/or pediatric BLS and ALS courses, neonatal resuscitation courses, or local/institutional resuscitation training sessions, courses or programs.

Exclusion criteria:

No English abstract available.

Unpublished studies (e.g., conference abstracts, trial protocols), letters, editorials, comments, case reports and grey literature.

Studies describing the use of debriefing scripts, tools, cognitive aids or checklists outside of the resuscitation training environment.


Data tables: Attached (EIT 6413 Data Tables)

Task Force Insights

1. Why was this topic reviewed?

Debriefing is defined as a learning conversation “in which aspects of performance are explored and analyzed with the aim of gaining insights that will impact the quality of future clinical practice”7. Debriefing conducted during simulation-based training improves provider knowledge, clinical performance, and non-technical skills performance7-12. Studies assessing the impact of debriefing after cardiac arrest events demonstrate improved provider performance13, 14, while debriefings informed by clinical data have been associated with enhanced survival outcomes from cardiac arrest15, 16. Many different debriefing frameworks have been developed and implemented, leading to variability in how debriefing is conducted across programs and institutions17. This variability may influence the overall impact of debriefing as an educational intervention.

Debriefing scripts and tools have been developed to help standardize the approach to debriefing during resuscitation training. Debriefing scripts support the facilitation of debriefing conversations by providing a written plan for the debriefing, which may include topics for discussion, suggested words or phrases to guide discussion, and/or an overarching framework to structure the debriefing1, 18-22. Debriefing scripts have been described using different terminology, including ‘debriefing tool’, ‘debriefing checklist’, or ‘debriefing cognitive aid’, amongst others. While their use has gained traction in both educational1, 18 and clinical settings19-21, the benefits of debriefing script use in resuscitation education have not been clearly summarized. Understanding the value of debriefing script use during resuscitation training will assist programs in appropriately supporting their instructors with this resource. In this review, we aim to describe whether use of a debriefing script, compared with regular debriefing without a debriefing script, improves learning and performance outcomes of learners, patient outcomes, and instructor debriefing performance, workload, and cognitive load.

2. Narrative summary of evidence identified

Study Characteristics

An extensive search of the databases (see search strategy) yielded 1151 citations (Figure 1). After removing 494 duplicates, 657 articles were screened by title and abstract. Of these, 11 articles were included for full-text review, resulting in 6 eligible studies being included in this review, with publication years ranging from 2013 to 20231-6 (Table 1). We found 5 randomized controlled trials1-4, 6 and one quasi-experimental (non-RCT) study5; three were conducted in Canada and/or the USA1, 3, 6, and one each was conducted in Norway5, Australia4, and Germany2.
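
The screening flow above reduces to simple arithmetic; the short sketch below merely restates those counts (a minimal illustration of the numbers reported for Figure 1, not a reproduction of the actual screening), with the per-stage exclusion counts derived rather than quoted from the text.

# Illustrative check of the screening flow described above (Figure 1).
retrieved = 1151                              # citations identified across databases
duplicates = 494                              # duplicates removed
screened = retrieved - duplicates             # 657 records screened by title/abstract
full_text = 11                                # articles assessed in full text
included = 6                                  # studies included in the review
excluded_at_screening = screened - full_text  # 646 excluded at title/abstract (derived)
excluded_at_full_text = full_text - included  # 5 excluded after full-text review (derived)
assert screened == 657
print(screened, excluded_at_screening, excluded_at_full_text, included)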

In each of the included studies, clinical resuscitation scenarios were provided as the trigger for the debriefing. Three studies utilized pediatric scenarios1, 4, 6 and three others used adult scenarios2, 3, 5. Five of the studies had real participants (healthcare providers or trainees) in the simulated scenarios and debriefings1, 2, 4-6, while one study used pre-recorded scenarios and actors as participants in the debriefing (i.e. the debriefers were the study population)3. Amongst the five studies that recruited participants for the debriefing, participants were healthcare providers in three studies1, 4, 6 and medical or nursing students in the other two2, 5.

Debriefing Script Design

The nature of the scripted debriefing intervention varied amongst the included studies. Five studies1, 3-6 used debriefing scripts that included a debriefing framework, topics for discussion and suggested phrasing, while the remaining paper described a script comprising a framework and teamwork principles (i.e. topics) but no suggested phrases2. Of the six included studies, only one utilized a debriefing script that incorporated objective data (i.e. CPR quality, no flow time, time to critical interventions) collected during the resuscitation event6. Some form of instructor/debriefer training describing use of the debriefing script was provided in four studies3-6; in the other two studies the script was introduced but no formal training was conducted1, 2. Different debriefing frameworks were utilized amongst studies, with the PEARLS (Promoting Excellence and Reflective Learning in Simulation) tool being the most common (three studies)3, 5, 6, followed by advocacy-inquiry (two studies)1, 4, and the GAS (Gather, Analyze, Summarize) model with a TeamTAG script (one study)2.

Clinical Outcomes

No studies evaluated patient outcomes or provider performance on real patients.

Learning Outcomes and Learner Satisfaction

Four studies assessed a wide range of learning outcomes1, 2, 5, 6. One randomized trial of healthcare providers found that groups receiving a data-informed, scripted PEARLS debriefing after a simulated pediatric cardiac arrest scenario showed improved overall excellent CPR (p=0.02), guideline-compliant depth (p=0.02), chest compression fraction (p=0.03), and peri-shock pause duration (p=0.004) when compared to groups that received a non-scripted debriefing6. They found no difference between groups in time to critical interventions.

Two studies reported the effect of scripted debriefing on non-technical skills1, 2. A single center study of German medical and nursing students showed no difference in teamwork performance when comparing groups who received scripted vs. non-scripted debriefings by their colleagues2. In a multicenter RCT of healthcare providers, those receiving scripted debriefing by novice instructors had improved team leadership skills (p=0.03) compared to those who received a non-scripted debriefing1. This study also reported improved knowledge acquisition in the scripted debriefing group (p=0.04), but no difference in clinical performance scores (p=0.18).

One study reported no improvement in learner satisfaction2, while another demonstrated no improvement in clinical judgement with scripted debriefing5.

Debriefing Quality

Two studies assessing debriefing quality in scripted vs. non-scripted groups demonstrated mixed results. In a multicenter trial of both novice and expert instructors, scripted debriefings were of higher quality compared to non-scripted debriefings (i.e. OSAD score; p=0.01), with the effect of debriefing scripts significant in novices (p=0.03) but not experts (p=0.48)4. One other study found no difference in debriefing quality (i.e. DASH score, p=0.436) between novice instructors using a PEARLS script vs. those not using a PEARLS debriefing script3.

Debriefer Cognitive Load and Workload

One randomized trial evaluated the impact of a PEARLS scripted debriefing tool when used by novice debriefers (i.e. simulation fellows) facilitating debriefing of actors portraying the roles of participants in simulated resuscitations3. When comparing scripted vs. non-scripted debriefing, there was reduced cognitive load (PASS scale, p=0.04) in the scripted debriefing group, but no difference in debriefer workload (i.e. NASA-TLX score, p=0.456).

3. Narrative Reporting of the task force discussions

Our review investigating the effect of scripted debriefing in resuscitation education found that, when compared to traditional, non-scripted debriefing, groups receiving scripted debriefing showed improved CPR performance6, team leadership skills and knowledge acquisition1. One study found no difference in teamwork performance2. As it relates to debriefing quality, the results were mixed amongst the studies identified3, 4. Lastly, use of a debriefing script reduced the cognitive load of the debriefer3.

Amongst the six studies identified in this review, there was significant heterogeneity in the design and implementation of the scripted debriefing intervention. Debriefing scripts and tools are typically composed of some (or all) of these key components: (1) an overarching framework describing the phases of debriefing; (2) varying degrees of scripted language to support the implementation of certain conversational strategies; (3) specific content areas, learning objectives or performance gaps to review; and (4) suggested use of objective data (e.g. CPR quality metrics) during debriefing1-4, 18, 22. The inclusion of these key components across studies was mixed, and even when included, the nature and type of content were quite variable. For example, some studies utilized tools promoting a blended method and framework of debriefing3, 5, 6, while others implemented a single method of debriefing (e.g. advocacy inquiry)1, 4. Furthermore, there was a notable difference in how facilitators were familiarized with the script, ranging from handing the debriefing script to facilitators immediately prior to the debriefing to comprehensive debriefing training with the script as the key focus. These variables may have introduced bias contributing to the variability in results identified in this review.

Our scoping review did not identify any studies reporting patient or process outcomes in real resuscitations. Only one study included CPR performance metrics as an outcome measure. In this study6, objective CPR data pulled from a defibrillator were directly integrated into the debriefing script, which also included standardized, evidence-based strategies to address the various CPR performance deficits. This allowed the debriefing content in the script to be directly linked to clinically relevant performance metrics (i.e. CPR quality). Future studies should explore other ways of linking the content of debriefing scripts to clinically important metrics in an effort to enhance the overall impact of debriefing during resuscitation education. Furthermore, the application of these principles to scripts used during clinical event debriefings should be evaluated. Lessons learned from clinical event debriefings have immense potential to improve clinical performance, as these conversations generally take place in the workplace after real cardiac arrest resuscitations.

Our scoping review did not identify sufficient evidence to prompt a systematic review or meta-analysis. However, we were able to highlight knowledge gaps and identify future opportunities for research.

To advance our knowledge in this area of resuscitation science, we encourage researchers to: (1) design their debriefing scripts with careful consideration of the four key components described above; (2) integrate objective data into debriefing that can be linked to clinically relevant outcomes; and (3) provide training for facilitators to ensure they have a complete understanding of how the debriefing script should be used.

Based on the results of this scoping review and the expert opinion of the ILCOR Task Force on Education, Implementation and Teams, we will issue an educational good practice statement that resuscitation programs should consider using debriefing scripts as a resource to support debriefers, as there is potential benefit for learning and performance outcomes.

Knowledge Gaps

We identified several knowledge gaps in the literature.

The relative and synergistic effect of scripted wording vs. data-informed debriefing during resuscitation training is unclear.

The impact of scripted debriefing on knowledge and skill retention is unknown.

The impact of scripted debriefing during training on patient or process outcomes in real resuscitations is unknown.

The importance of debriefer adherence to debriefing scripts, and its influence on learning and performance outcomes needs to be explored further.

The influence of debriefer experience and learner characteristics on the impact of debriefing scripts needs to be elucidated.

Acknowledgement

The authors acknowledge the assistance provided by Caitlin McClurg, librarian at the University of Calgary, in building the search strategy. The following ILCOR EIT Task Force members are acknowledged as collaborators on this scoping review: Cristian Abelairas-Gomez, Natalie Anderson, Farhan Bhanji, Jan Breckwoldt, Andrea Cortegiani, Aaron Donoghue, Kathryn Eastwood, Barbara Farquharson, Ming-Ju Hsieh, Ying-Chih Ko, Elina Koota, Kasper G. Lauridsen, Andrew Lockey, Tasuku Matsuyama, Sabine Nabecker, Kevin Nation, Alexander Olaussen, Taylor Sawyer, Sebastian Schnaubelt, Chih-Wei Yang, and Joyce Yeung. We would like to thank Peter Morley (Chair, ILCOR Science Advisory Committee) for his valuable contributions.

References

1. Cheng A, Hunt EA, Donoghue A, Nelson-McMillan K, Nishisaki A, Leflore J, et al. Examining pediatric resuscitation education using simulation and scripted debriefing: a multicenter randomized trial. JAMA Pediatr. 2013;167(6):528-36.

2. Freytag J, Stroben F, Hautz WE, Penders D, Kammer JE. Effects of using a cognitive aid on content and feasibility of debriefings of simulated emergencies. GMS Journal for Medical Education. 2021;38(5):1-17.

3. Meguerdichian M, Bajaj K, Ivanhoe R, Lin Y, Sloma A, de Roche A, et al. Impact of the PEARLS Healthcare Debriefing cognitive aid on facilitator cognitive load, workload, and debriefing quality: a pilot study. Adv Simul (Lond). 2022;7(1):40.

4. Snelling PJ, Dodson L, Monteagle E, Ware RS, Acworth J, Symon B, et al. PRE-scripted debriefing for Paediatric simulation Associated with Resuscitation EDucation (PREPARED): A multicentre, cluster randomised controlled trial. Resusc Plus. 2022;11:100291.

5. Høegh-Larsen AM, Ravik M, Reierson IÅ, Husebø SIE, Gonzalez MT. PEARLS Debriefing Compared to Standard Debriefing Effects on Nursing Students’ Professional Competence and Clinical Judgment: A Quasi-Experimental Study. Clinical Simulation in Nursing. 2023;74:38-48.

6. Cheng A, Davidson J, Wan B, St-Onge-St-Hilaire A, Lin Y. Data-informed debriefing for cardiopulmonary arrest: A randomized controlled trial. Resusc Plus. 2023;14.

7. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ. 2014;48(7):657-66.

8. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007;2(2):115-25.

9. Garden AL, Le Fevre DM, Waddington HL, Weller JM. Debriefing after simulation-based non-technical skill training in healthcare: a systematic review of effective practice. Anaesth Intensive Care. 2015;43(3):300-8.

10. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today. 2014;34(6):e58-63.

11. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013;35(10):e1511-30.

12. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc. 2011;6 Suppl:S52-7.

13. Couper K, Salman B, Soar J, Finn J, Perkins GD. Debriefing to improve outcomes from critical illness: a systematic review and meta-analysis. Intensive Care Med. 2013;39(9):1513-23.

14. Edelson DP, Litzinger B, Arora V, Walsh D, Kim S, Lauderdale DS, et al. Improving in-hospital cardiac arrest process and outcomes with performance debriefing. Arch Intern Med. 2008;168(10):1063-9.

15. Wolfe H, Zebuhr C, Topjian AA, Nishisaki A, Niles DE, Meaney PA, et al. Interdisciplinary ICU cardiac arrest debriefing improves survival outcomes*. Crit Care Med. 2014;42(7):1688-95.

16. Hunt EA, Jeffers J, McNamara L, Newton H, Ford K, Bernier M, et al. Improved Cardiopulmonary Resuscitation Performance With CODE ACES(2): A Resuscitation Quality Bundle. J Am Heart Assoc. 2018;7(24):e009860.

17. Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simul Healthc. 2016;11(3):209-17.

18. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106-15.

19. Mullan PC, Wuestner E, Kerr TD, Christopher DP, Patel B. Implementation of an in situ qualitative debriefing tool for resuscitations. Resuscitation. 2013;84(7):946-51.

20. Zinns LE, Mullan PC, O'Connell KJ, Ryan LM, Wratney AT. An Evaluation of a New Debriefing Framework: REFLECT. Pediatric Emergency Care. 2017.

21. Welch-Horan TB, Lemke DS, Bastero P, Leong-Kee S, Khattab M, Eggers J, et al. Feedback, reflection and team learning for COVID-19: development of a novel clinical event debriefing tool. BMJ Simul Technol Enhanc Learn. 2021;7(1):54-7.

22. Bajaj K, Meguerdichian M, Thoma B, Huang S, Eppich W, Cheng A. The PEARLS Healthcare Debriefing Tool. Acad Med. 2018;93(2):336.

