Evaluation of Faculty Debriefing Post-Simulation Events


Introduction

Debriefing is a core component of simulation-based learning. Reflection on the events of a completed simulation is a crucial step whereby participants learn and modify their behavior.[1] This reflection is generally guided by a facilitator, whose goal is to identify knowledge gaps and attempt to address them.[2] Commercially available courses exist to teach debriefing, and workshops devoted to improving these skills are often part of the curriculum at national and international simulation courses. Also, many programs host their own internal debriefing training. Numerous debriefing models exist to help facilitate the debriefing of participants within a simulation.[3]

Fewer models exist to facilitate the debriefing of the faculty running the simulation. As of this publication date, the International Nursing Association for Clinical Simulation and Learning (INACSL) has created a curriculum of best practices in simulation, including session facilitation and debriefing. While the summary article mentions maintaining debriefing skills through observed practice and peer evaluation, specific evaluation tools are not explicitly addressed.[4]

Continuing Education

Several post-simulation evaluation tools have been created to meet this need, but vary in focus. Some assess the simulation as a whole. Others focus only on the debriefing portion. A few focus specifically on the facilitator.

OSAD

One early tool created to evaluate the faculty debriefer is the Objective Structured Assessment of Debriefing (OSAD).[5] Described in the surgery literature in 2012, the OSAD was initially developed by researchers through a review of the existing literature and focused interviews of providers and receivers of debriefing. These results were synthesized into a list of eight features essential to effective debriefing, which became the categories of the final OSAD: approach, environment, engagement, reaction, reflection, analysis, diagnosis, and application. Trained observers rate each category on a 5-point Likert scale with descriptive anchors. Benefits of the OSAD include its brevity (an estimated 5 minutes to complete), validity, and interrater reliability.[5]

A similar development process was used to develop a pediatric-specific OSAD.[6] This tool had related categories and 5-point Likert benchmarks. It was later validated and adopted by several committees and simulation centers for debriefing standardization and faculty development.[6]

Citing concerns with the traditional, paper-based form of the OSAD, other researchers created a modified electronic version (eOSAD).[7] These authors note its good interrater reliability, its ease of use with video-recorded debriefing sessions, and the ability to add comments, which was missing from the traditional OSAD survey. They also describe protection from data corruption as a benefit, namely by eliminating the risk of losing paper copies or introducing transcription errors. One major limitation of the eOSAD, however, is its requirement for real-time computer and internet access.

DASH

An alternative to the OSAD for evaluating faculty debriefing and simulation sessions is the Debriefing Assessment for Simulation in Healthcare (DASH). This tool, first described in 2012, evaluates six elements of debriefing.[8] Using descriptions of observable behaviors as anchors, raters score each element on a 7-point effectiveness scale. The tool has undergone validation testing and demonstrated reliability. It requires standardized training to use, which generally takes place via webinar.

Authors of the DASH boast that it applies to simulations in a variety of domains and disciplines.[8] Indeed, since its development, it has been used to evaluate debriefing in several contexts. In addition to its use in faculty development, it has served to measure outcomes in research studies with a simulation component. For example, a modified version of the DASH (the DASH student version) has been used to compare faculty-led and resident-led debriefing sessions for medical students and residents.[9][10][11] It has also been used to evaluate interprofessional simulation debriefings.[12]

PADI  

More recently, the allied health literature introduced an alternative evaluation tool. The Peer Assessment Debriefing Instrument (PADI) adds a component of self-evaluation to the post-debriefing assessment.[13] It evaluates eight aspects of planning and conducting a simulation debriefing. Both the debriefer and the evaluator rate performance on multiple elements within each domain using a 4-point Likert scale. They then compare responses, opening a conversation that allows the debriefer to focus the feedback on particular areas he or she wants the evaluator to address.[13] The PADI is reliable and valid for evaluating debriefing across healthcare disciplines.

Creators of the PADI suggest its benefits include the short time necessary to learn and implement the tool.[14] They also suggest it could be a valuable source of data when evaluating teaching skills and effectiveness.[14]

Other Tools

Some authors have created their own debriefing evaluation tools for their studies, but these are not in common use outside of those particular settings.[15][16] Others modify existing scales to fit their needs. For example, a self-reported debriefing quality scale based on the OSAD and DASH was used in the initial development of the TeamGAINS debriefing tool.[17]

Still others devote a portion of a more comprehensive tool to debriefing, such as the Facilitator Competency Rubric, which provides a holistic evaluation of nursing simulation facilitators that includes a debriefing component.[18]

Several tools seek participant evaluations of the debriefer as part of a larger assessment of the simulated learning session. These include the Simulation Design Scale, created by the National League for Nursing, the Simulation Effectiveness Tool-Modified, and the Debriefing Experience Scale.[19][20] Each is a post-event evaluation in which students describe their perception of a faculty member's or simulation event's effectiveness.

Clinical Significance

There is no direct clinical significance to evaluating faculty debriefing after simulations. Instead, tools such as the OSAD, DASH, and PADI allow faculty to receive an assessment of their debriefing skills during a single simulation event. This feedback may improve debriefing skills, which in turn may enhance learning and ultimately impact clinical care.

Enhancing Healthcare Team Outcomes

Simulation-based medical education is a growing component of medical and nursing education. It is used to test systems, enhance communication, and improve teamwork.[21][22] The post-simulation debriefing is the primary venue for exploring and addressing knowledge and behavior gaps. Principles such as psychological safety and a nonjudgmental attitude are crucial to enhancing this learning environment.[23] Faculty development for facilitators leading these debriefing sessions may improve debriefing quality and, by extension, the team's learning.


Details

Editor:

Ashley Rider

Updated:

9/26/2022 5:59:15 PM

References


[1]

Fanning RM,Gaba DM, The role of debriefing in simulation-based learning. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2007 Summer;     [PubMed PMID: 19088616]


[2]

Dismukes RK,Gaba DM,Howard SK, So many roads: facilitated debriefing in healthcare. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2006 Spring;     [PubMed PMID: 19088569]


[3]

Sawyer T,Eppich W,Brett-Fleegler M,Grant V,Cheng A, More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2016 Jun;     [PubMed PMID: 27254527]


[4]

Sittner BJ,Aebersold ML,Paige JB,Graham LL,Schram AP,Decker SI,Lioce L, INACSL Standards of Best Practice for Simulation: Past, Present, and Future. Nursing education perspectives. 2015 Sep-Oct;     [PubMed PMID: 26521497]

Level 3 (low-level) evidence

[5]

Arora S,Ahmed M,Paige J,Nestel D,Runnacles J,Hull L,Darzi A,Sevdalis N, Objective structured assessment of debriefing: bringing science to the art of debriefing in surgery. Annals of surgery. 2012 Dec;     [PubMed PMID: 22895396]


[6]

Runnacles J,Thomas L,Sevdalis N,Kneebone R,Arora S, Development of a tool to improve performance debriefing and learning: the paediatric Objective Structured Assessment of Debriefing (OSAD) tool. Postgraduate medical journal. 2014 Nov;     [PubMed PMID: 25201993]


[7]

Zamjahn JB,Baroni de Carvalho R,Bronson MH,Garbee DD,Paige JT, eAssessment: development of an electronic version of the Objective Structured Assessment of Debriefing tool to streamline evaluation of video recorded debriefings. Journal of the American Medical Informatics Association : JAMIA. 2018 Oct 1;     [PubMed PMID: 30299477]


[8]

Brett-Fleegler M,Rudolph J,Eppich W,Monuteaux M,Fleegler E,Cheng A,Simon R, Debriefing assessment for simulation in healthcare: development and psychometric properties. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2012 Oct;     [PubMed PMID: 22902606]


[9]

Cooper DD,Wilson AB,Huffman GN,Humbert AJ, Medical students' perception of residents as teachers: comparing effectiveness of residents and faculty during simulation debriefings. Journal of graduate medical education. 2012 Dec;     [PubMed PMID: 24294426]


[10]

Adams T,Newton C,Patel H,Sulistio M,Tomlinson A,Lee W, Resident versus faculty member simulation debriefing. The clinical teacher. 2018 Dec;     [PubMed PMID: 29144023]


[11]

Doherty-Restrepo J,Odai M,Harris M,Yam T,Potteiger K,Montalvo A, Students' Perception of Peer and Faculty Debriefing Facilitators Following Simulation-Based Education. Journal of allied health. 2018 Summer;     [PubMed PMID: 29868695]


[12]

Brown DK,Wong AH,Ahmed RA, Evaluation of simulation debriefing methods with interprofessional learning. Journal of interprofessional care. 2018 Jul 19;     [PubMed PMID: 30024297]


[13]

Saylor JL,Wainwright SF,Herge EA,Pohlig RT, Peer-Assessment Debriefing Instrument (PADI): Assessing Faculty Effectiveness in Simulation Education. Journal of allied health. 2016 Fall;     [PubMed PMID: 27585622]


[14]

Saylor JL,Wainwright SF,Herge EA,Pohlig RT, Development of an Instrument to Assess the Clinical Effectiveness of the Debriefer in Simulation Education. Journal of allied health. 2016 Fall;     [PubMed PMID: 27585615]


[15]

Kable AK,Levett-Jones TL,Arthur C,Reid-Searl K,Humphreys M,Morris S,Walsh P,Witton NJ, A cross-national study to objectively evaluate the quality of diverse simulation approaches for undergraduate nursing students. Nurse education in practice. 2018 Jan;     [PubMed PMID: 29195107]

Level 2 (mid-level) evidence

[16]

Gururaja RP,Yang T,Paige JT,Chauvin SW, Examining the Effectiveness of Debriefing at the Point of Care in Simulation-Based Operating Room Team Training. 2008 Aug;     [PubMed PMID: 21249934]


[17]

Kolbe M,Grande B,Spahn DR, Briefing and debriefing during simulation-based training and beyond: Content, structure, attitude and setting. Best practice &amp; research. Clinical anaesthesiology. 2015 Mar;     [PubMed PMID: 25902470]


[18]

Leighton K,Mudra V,Gilbert GE, Development and Psychometric Evaluation of the Facilitator Competency Rubric. Nursing education perspectives. 2018 Nov/Dec;     [PubMed PMID: 30335707]

Level 3 (low-level) evidence

[19]

Franklin AE,Burns P,Lee CS, Psychometric testing on the NLN Student Satisfaction and Self-Confidence in Learning, Simulation Design Scale, and Educational Practices Questionnaire using a sample of pre-licensure novice nurses. Nurse education today. 2014 Oct;     [PubMed PMID: 25066650]


[20]

Leighton K,Ravert P,Mudra V,Macintosh C, Updating the Simulation Effectiveness Tool: Item Modifications and Reevaluation of Psychometric Properties. Nursing education perspectives. 2015 Sep-Oct;     [PubMed PMID: 26521501]

Level 3 (low-level) evidence

[21]

Miller D,Crandall C,Washington C 3rd,McLaughlin S, Improving teamwork and communication in trauma care through in situ simulations. Academic emergency medicine : official journal of the Society for Academic Emergency Medicine. 2012 May;     [PubMed PMID: 22594369]


[22]

Auerbach M,Roney L,Aysseh A,Gawel M,Koziel J,Barre K,Caty MG,Santucci K, In situ pediatric trauma simulation: assessing the impact and feasibility of an interdisciplinary pediatric in situ trauma care quality improvement simulation program. Pediatric emergency care. 2014 Dec;     [PubMed PMID: 25407035]

Level 2 (mid-level) evidence

[23]

Rudolph JW,Simon R,Dufresne RL,Raemer DB, There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2006 Summer;     [PubMed PMID: 19088574]