The expository phase of debriefing in clinical simulation: a qualitative study

Abstract

Background

Clinical simulation fosters reflective, experiential learning in a safe environment, allowing participants to learn from mistakes without patient risk. Debriefing, essential for reflection, is typically facilitator-driven. The MAES© methodology (Self-Learning Methodology in Simulated Environments) shifts the focus to students, guiding them through six sequential phases: group identity creation, topic selection, setting of objectives and competencies, scenario design, simulation, and debriefing. MAES© introduces an expository phase in debriefing, in which students present theoretical and practical content. The facilitator assumes a significant yet secondary role, fostering increased student-led learning opportunities and, at times, enabling even trained real patients to co-facilitate the debriefing.

Objective

To explore participants’ experiences and perceptions regarding the expository phase of debriefing within the MAES© methodology framework, with specific focus on the student-led debriefing component.

Method

A descriptive qualitative inductive approach with thematic content analysis was used. Open-ended questionnaires completed by 151 final-year undergraduate and postgraduate nursing students captured their experiences with the MAES© expository phase. Open-ended questionnaires allow participants to freely and anonymously express their perspectives and experiences. Responses were transcribed, independently coded, and analyzed using MaxQDA® v18. Data were coded and analyzed based on absolute and relative frequencies of emerging categories. The study adhered to the SRQR (Standards for Reporting Qualitative Research) guidelines.

Results

The analysis revealed several key themes in student evaluations. Satisfaction with the methodology emerged strongly, with over one-third of participants expressing no desired changes. The reflective nature of the approach was prominently valued, along with its effectiveness for concept clarification and fostering collaborative learning. Participants particularly noted developmental outcomes in communication competencies and technical skills, while appreciating the motivational learning environment and evidence-based focus. The suggested improvements focused on three main aspects: increased session dynamism, a greater use of visual and interactive elements, and reduced dependence on slide-based presentations.

Conclusion

The study highlights the value of the expository phase in the MAES© methodology, emphasizing its effectiveness in clarifying concepts, fostering collaboration, and developing technical and communication skills. It also promotes student autonomy through active engagement. However, participants suggested improvements, such as greater dynamism, personalization, and varied presentation methods using videos, skill stations, or patient-oriented debriefing. Overall, the expository phase proves to be a valuable pedagogical tool with potential for broader application in simulation-based learning and other debriefing models.

Clinical trial number

Not applicable.

Introduction

Simulation-Based Learning (SBL) is one of the most prominent learning methodologies in university teaching and clinical practice [1, 2]. Training and learning through simulation is considered a reflective and experiential method [3, 4]. Aligned with these principles is the MAES© methodology (Self-Learning Methodology in Simulated Environments, from its Spanish acronym), which shifts simulation-based learning from an instructor-led to a student-centered model, promoting autonomy and active engagement. Through six structured phases (group identity creation, study topic selection, definition of initial objectives and competencies, scenario design, simulation, and debriefing), students take control of their learning, design simulations, and engage in reflective analysis [5]. This approach fosters “learning by doing” in a safe environment, integrating theory with practice while encouraging collaboration and critical thinking. Participants can make mistakes and learn from them without posing risks to the patient [6]. The MAES© methodology is theoretically grounded in constructivist approaches [7], particularly Problem-Based Learning [8], which emphasizes active knowledge construction through problem-solving, and Collaborative Learning [9], where social interaction drives cognitive development. This framework aligns with Peer Learning principles [10], where knowledge co-creation occurs through symmetrical relationships among learners.

The reflective analysis of the simulated event and of the participants’ strengths and weaknesses is carried out during the debriefing, which most authors consider essential for learning through simulation [11, 12]. Despite its essential role in simulation, debriefing frameworks and techniques vary widely in their approach and effectiveness [13]. Debriefing began as a critical method initially focused on identifying errors, which limited the effectiveness of feedback. To address this, the Debriefing with Good Judgment methodology was created, finding a balance between learning, reflection, motivation, and emotions [14]. Building on the need for structured feedback, the GAS model (Gather, Analyze, Summarize) was introduced to organize debriefing into three logical, progressive stages, ensuring a systematic review of both strengths and areas for improvement [15]. Expanding this structure, PEARLS (Promoting Excellence and Reflective Learning in Simulation) offers a detailed framework with defined phases: Setting the scene, Reaction, Description, Analysis, and Summary [16]. The main difference from GAS is that PEARLS links participants’ emotions to the learning experience. In contrast to facilitator-led models, “self-debriefing” offers a more autonomous approach: participants individually reflect on their performance without a facilitator’s intervention, fostering self-assessment and personal insight [17]. Expanding on this concept, “videodebriefing” emerged during the COVID-19 pandemic as a remote self-assessment alternative. Through simulation recordings, participants can objectively observe their performance, integrating visual feedback into the reflection process [18].

The debriefing models discussed provide structured and reflective analysis but are predominantly facilitator-driven [2, 19]. Studies suggest that instructor-led debriefing is not always beneficial [19]. The unique features of MAES© underscore the need for a tailored debriefing model that gives students a central role [5, 3]. With MAES©, the facilitator retains the role of guiding and moderating the reflective process but transitions to a more supportive and concluding position [20]. This approach allows students to take on greater responsibility, fostering autonomy and critical thinking.

MAES© debriefing adheres to a well-defined five-phase structure aligned with the INACSL debriefing standards [21]. It is distinguished from other models by its inclusion of an expository phase. In the initial phase, as in PEARLS debriefing [22], participants express their emotions immediately after the simulation scenario. The second phase, descriptive in nature, involves a narrative recounting of the events in the simulated scenario. The third phase, analytical, facilitates deeper reflective learning [23], using analysis techniques such as “Plus/Delta” [24]. The analytical phase of MAES© emphasizes active student participation; specifically, the students who prepared the case lead this phase by posing questions or offering reflections [5].

The expository phase, which follows the analytical phase, represents the primary structural innovation of MAES© debriefing. Collaborative learning (Barkley et al., 2014) and peer-to-peer learning (Damon, 1984) are actively fostered in this phase. The students who designed the scenario take the lead, dynamically presenting scientific evidence related to the case while addressing the learning objectives established by the group. Participants are encouraged to convey evidence creatively through diverse methods (e.g., presentations, visual content, interactive activities, brief workshops, and even inviting experts or patients with relevant experiences to share) [20, 3]. The expository phase is intentionally dynamic and designed to last no longer than 10 minutes to maintain engagement and effectiveness. MAES© debriefing ends with a summary: the facilitator guides students in synthesizing key learning points and ensures that all learning objectives are addressed, including emergent objectives that may not have been anticipated during the initial scenario design [25]. This distinctive approach not only enriches the reflective process but also enhances the autonomy and engagement of learners.

With its structure, MAES© emphasizes the active role of students throughout their learning process, including debriefing [26]. This process enhances intrinsic motivation, educational responsibility, and participant engagement [27, 28]. The facilitator takes a seemingly secondary role, giving students more space and even enabling trained real patients to co-debrief [20]. The facilitator’s role in MAES©, adhering to INACSL facilitation standards [29], is crucial in the expository debriefing: facilitators guide students toward the initial objectives, focusing on students’ baseline knowledge and motivations [3].

The MAES© debriefing retains the structured framework common to most debriefing models [30], adding a key structural innovation: the expository phase. Despite its conceptual grounding, prior research has not fully explored the perceptions and experiences of users who have applied this phase in practice, particularly in relation to student-led debriefing. Existing studies have largely focused on the theoretical aspects of the expository phase [5, 20], overlooking how it is perceived and experienced by participants in real-world settings. Consequently, the primary objective of this study is to explore participants’ experiences and perceptions regarding the expository phase of debriefing within the MAES© methodology framework, with specific focus on the student-led debriefing component.

Method

Research design

This study employed a descriptive qualitative design with an inductive approach, using content analysis of an open-ended questionnaire to explore participants’ experiences and opinions regarding the expository phase of MAES© debriefing. This methodology was chosen to provide a detailed, in-depth understanding of the MAES© expository phase: it allows complex phenomena to be understood by capturing lived experiences, with great flexibility in data collection and rich contextual insights [31]. Open-ended questionnaires provide a loosely structured approach that allows participants to freely and anonymously articulate their subjective perspectives and experiences in writing [32]. Although less commonly employed in qualitative research, this method offers valuable preliminary insights, particularly for exploring focused topics. Similar approaches have been applied in health sciences education research [33, 34, 35]. The study adhered to the SRQR (Standards for Reporting Qualitative Research) guidelines [36].

Participants

Participants were initially selected through non-probabilistic purposive sampling. Participants were fourth-year (final-year) nursing students enrolled in Practicum V and postgraduate students in emergency care at the Catholic University of Murcia (Spain). Students participating in exchange programs and not enrolled in simulation courses were excluded. The Practicum V cohort consisted of 304 students divided across 20 simulation groups (16 students per group on average). The postgraduate cohort consisted of 36 students divided into two groups of 18 students. The total non-probabilistic purposive sample was 340 students.

Practicum V incorporates 20 h of simulation-based learning using the MAES© method. These hours are structured as a 4-hour prebriefing session and four 4-hour simulation and debriefing sessions, each covering three scenarios, with debriefings divided into five phases: reactions, descriptive, analytical, expository, and summary. The postgraduate students completed 24 h of simulation using the same methodology during their Master’s degree.

As a qualitative study examining learner experiences, we employed a systematic sampling approach within a convenience sampling framework, selecting eight undergraduate simulation groups and both postgraduate groups (totaling 10 groups with 164 eligible participants). This dual sampling strategy (systematic selection within a convenience sample) is particularly suited to qualitative research where: (1) the population has natural groupings (simulation cohorts), (2) depth of experience with the intervention (MAES© methodology) is prioritized over probabilistic representation, and (3) logistical constraints require practical sampling solutions. Selecting 10 of the 22 available groups achieved three critical qualitative research objectives: experiential diversity (ensuring varied clinical backgrounds), data richness (maintaining manageable but substantial narrative data), and methodological rigor (through systematic randomization within the convenience sample to minimize bias). In total, 151 students completed the qualitative questionnaire, yielding a 92% response rate.

Procedure and data collection

All MAES© sessions adhered to the previously standardized self-directed learning structure, regardless of the specific competencies being addressed. Data were collected during the 2022–2023 academic year (September 2022–July 2023).

The data were gathered through an anonymous, self-administered questionnaire with no word limit or time constraints, allowing participants to respond flexibly from home. Students were aware that data collection was planned to end in July 2023. The questionnaire was distributed at the end of the simulation sessions to undergraduate and postgraduate nursing students participating in MAES©-based courses at UCAM. Students could complete the questionnaire online (via Google Forms®) or on paper; all participants chose to complete it online. Brief instructions were provided at the beginning of the questionnaire. Students were instructed to focus their comments on the expository phase of debriefing and were told that all questions requesting sociodemographic data (e.g., age, gender, academic level, previous MAES© experience, and educational resources used in the expository phase) were mandatory. Two open-ended questions guided the qualitative data collection: “Provide your honest opinion on the strengths of the expository phase of debriefing without a word limit.” and “Provide your opinion on the limitations and suggestions for improvement of the expository debriefing phase without a word limit.” The research team carefully designed the questions to ensure they were open-ended, enabling participants to express diverse perspectives for a comprehensive thematic analysis. Participants could withdraw from the study at any time simply by not answering the questionnaire before the end of the data collection period (July 2023). Anonymized raw data in Spanish are publicly available in the following repository: https://doiorg.publicaciones.saludcastillayleon.es/10.6084/m9.figshare.27959604.

Data analysis

Responses from the questionnaires were transcribed into a single document, organized by participant and dimension (strengths and suggestions for improvement). This document was the foundation for an inductive thematic content analysis with a descriptive orientation [37] and a directed approach [38]. Two researchers, GF (Master’s degree, 7 years of simulation experience) and CAJ (PhD, 4 years of simulation experience), independently coded the data using qualitative analysis software (MaxQDA® v18). The software was used to create codes for key themes, organize subcategories, analyze patterns, and visualize relationships between themes. This process helped categorize data into dimensions, categories, and subcategories. The results were discussed during two consensus meetings with a third researcher, JLDA (PhD, expert qualitative researcher with more than 15 years of clinical simulation experience), who helped resolve discrepancies, and a final version of the coded data was obtained. Quantitative measures of absolute and relative frequencies were calculated for each category. Although qualitative research primarily focuses on understanding meaning and experience, incorporating quantitative frequencies can enhance analytical rigor. Reporting how often specific themes or codes appear provides a systematic structure that strengthens transparency and rigor, supports replicability, and demonstrates that results are not based solely on subjective interpretation. In our study, it also supports comparisons between groups (undergraduate and postgraduate) and reinforces the validity of qualitative themes by showing their frequency and distribution. Absolute frequency refers to the number of occurrences of a category; relative frequency expresses its percentage of the total dataset, calculated by dividing the absolute frequency by the total number of responses and multiplying by 100. All codes, including those representing less than 5% of the dataset, were described [39].
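To make the frequency calculation concrete, the short sketch below shows in Python how absolute and relative frequencies could be tabulated from a list of coded responses. This is purely illustrative and is not the authors’ MaxQDA® workflow; the category labels used are hypothetical.

```python
from collections import Counter

# Hypothetical coded responses: one category label per coded segment
coded_segments = [
    "clarification of concepts", "collaborative learning",
    "clarification of concepts", "communication skills",
    "reflection on errors",
]

total = len(coded_segments)          # total coded responses in the dataset
absolute = Counter(coded_segments)   # absolute frequency (AF) per category

# Relative frequency = absolute frequency / total responses * 100
for category, af in absolute.most_common():
    rf = af / total * 100
    print(f"{category}: AF={af}, RF={rf:.1f}%")
```

Applied over the full set of coded responses, the same calculation yields the AF and percentage values reported for each subcategory in Tables 1 and 2.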

Study rigor

To ensure the rigor of the study, the following measures were applied [40, 36]:

To ensure the trustworthiness of our study, we took deliberate steps to strengthen each aspect of rigor. During participant selection, we carefully balanced both homogeneity and heterogeneity. All participants had been trained in the MAES© methodology, ensuring a shared foundation for comparison, while the inclusion of both undergraduate and postgraduate nursing students introduced a range of perspectives. This diversity enriched the analysis by capturing variations in experience and learning levels. For data collection, we used self-administered questionnaires, allowing participants to share their experiences freely without the presence or influence of researchers. This approach minimized social desirability bias, ensuring that responses genuinely reflected participants’ perspectives rather than perceived expectations. To enhance transferability, we provided a detailed description of the study context and participant characteristics. By offering a rich account of the educational setting and demographic backgrounds, we enabled other researchers to assess the applicability of our findings to similar contexts. Transparency was ensured through open data practices. Anonymized raw data was made available in a public repository, allowing future researchers to verify our findings and build upon our work. This openness reinforces the credibility of the study and fosters continued exploration of the topic.

Two researchers independently conducted the coding and thematic analysis. Through a structured consensus process, discrepancies were discussed and resolved, ensuring that findings were consistently interpreted and not solely dependent on individual perspectives. Given our familiarity with the MAES© methodology, we remained critically aware of potential biases. Regular discussions within the research team encouraged self-reflection and objective analysis, helping to mitigate any influence of prior knowledge on data interpretation. Both these actions made reliability and reflexivity an integral part of our research process.

Ethical considerations

All participants were informed of the study’s objectives and provided written consent before participation, during the first MAES© simulation session of their group. The consent form was aligned with institutional research standards and approved by the ethics committee of the Catholic University of Murcia (UCAM) prior to the study. All participants were informed that they could withdraw from the study at any time. Anonymity was preserved, as students completed the questionnaire using an anonymous single-use code. Privacy and anonymity were also protected in accordance with Spanish data protection legislation. The study was approved by the UCAM ethics committee (reference number CE052308). Participant responses are presented with alphanumeric codes (e.g., A1, A2, A3), specifying whether the respondent is an undergraduate or postgraduate student, to better represent the data obtained and reduce possible bias.

Results

The study included 151 participants with a mean age of 22.75 years (SD = 2.57). Most were women (n = 115), while 36 were men. All participants were students at the Catholic University of Murcia (Spain). Regarding prior experience with MAES©, 109 students had previous exposure, whereas 42 did not. Academically, 126 participants were undergraduate students, and 25 were enrolled in postgraduate nursing programs.

A thematic content analysis of the qualitative responses identified two primary dimensions: strengths and areas for improvement in the expository phase of MAES© debriefing. Data were categorized into thematic groups, and absolute frequency (AF) and percentage (%) were calculated for each subcategory.

Dimension 1: Strengths of the expository phase in MAES© debriefing

The first dimension identified in our study focuses on the strengths of the expository phase in MAES©. In this phase, students take an active role presenting the theoretical and practical competencies related to the simulation scenario experienced by another group, while the facilitator provides guidance and supervision. Findings suggest that peer-led information exchange significantly enhances learning. This phase fosters concept consolidation, reflection on errors and improvements, and collaborative learning by encouraging the exchange of ideas and experiences. Additionally, it strengthens communication skills, autonomy, and scientific literacy, creating a motivating and interactive learning environment. The absolute frequency (AF) and percentage of each category within Dimension 1 are summarized in Table 1.

Table 1 Table of frequencies and percentages by subcategories in dimension 1

The expository phase of MAES© debriefing enhances the clarification and consolidation of concepts (category 1). It facilitates conceptual clarity by resolving doubts (8%), with the opportunity to ask questions at any time, and reinforces theoretical and practical knowledge (5.3%) linked to the learning baseline previously set by the students. One participant said, “You can ask anything you don’t know or anything that wasn’t clear” (A4-undergraduate). Another stated, “It helps you understand the case better, providing you with the necessary information based on what we want to learn” (A82-postgraduate).

Beyond individual comprehension, the collaborative nature of this phase fosters teamwork and shared learning (category 2). Students emphasized the value of exchanging perspectives and discussing different approaches with their peers, highlighting collective discussion as an enriching aspect of the learning experience. One participant noted, “Discuss different points of view with classmates, see the mistakes and how to improve” (A2-undergraduate), while another emphasized, “Share practical knowledge with classmates” (A101-undergraduate). This collective reflection further supports reflection on errors and improvements (category 3).

The expository phase promotes reflective learning by encouraging the identification of mistakes (10%) and the analysis of errors, allowing students to learn from them and improve future performance (6.6%). One student stated, “Analyze the mistakes to learn from them” (A50-postgraduate), and another added, “You learn from mistakes” (A94-postgraduate).

In addition to deepening knowledge, the expository phase also strengthens the development of communication skills (category 4). Structured presentations and public speaking opportunities boost students’ confidence and expressiveness (6.6%), and the use of interactive methods made presentations more engaging (3.4%). As one participant described, “It helps to present in public and gives us confidence for other presentations like the Final Degree Project” (A12-undergraduate). Another remarked, “It’s interactive, so the class is invited to join the activity” (A90-postgraduate). At the same time, the expository phase supports the development of technical and practical skills (category 5).

The expository phase enhances technical competencies (6.6%) and their application in clinical practice (6%). One student commented, “Refresh nursing techniques, share experiences with classmates” (A70-undergraduate). At the same time, another highlighted, “Know the small details of the case that we overlooked in the simulation and that can be relevant to the patient’s well-being” (A86-postgraduate).

Academic and technical aspects are not the only ones highlighted as positive. In fact, the expository phase of MAES© debriefing is also seen as a chance to improve motivation and create a positive learning environment (category 6). It creates a supportive and engaging atmosphere where peer recognition and participation contribute to a positive learning experience. A participant noted, “The camaraderie and respect during the nervous moments we have when presenting” (A112-undergraduate), and another stated, “It’s good that the class takes an interest and listens to you; it motivates you to keep explaining” (A33-undergraduate).

The expository phase also promotes the use of and search for scientific evidence (category 7), reinforcing an evidence-based approach to learning (3.4%), especially in the preliminary preparation of this phase (4.7%). One participant reported, “Searching for information helps you learn” (A7-undergraduate), while another added, “With the expository phase of debriefing, you update and reinforce information and knowledge” (A92-postgraduate). Moreover, active student involvement (4.7%) in preparing and presenting material contributes to deeper understanding and better knowledge retention (2.7%), supporting autonomous and meaningful learning (category 8). One participant described, “Preparing material involves greater dedication and learning by the person doing it” (A5-postgraduate). Another noted, “The clarity with which the content stays and the ease of acquiring knowledge” (A91-postgraduate).

Finally, students recognized the educational innovation (category 9) of the expository phase, viewing it as an effective alternative to traditional learning due to its dynamic methodology (4%) and the diversity of resources students can use to learn (2%). One student observed, “A different way of learning and, in my opinion, more effective and dynamic than conventional” (A35-postgraduate). Another highlighted the creative approaches used in presentations: “I find it fun that classmates try other things, like a quiz game or doing theater” (A47-graduate).

Dimension 2: Areas for improvement in the expository phase of MAES© debriefing

The analysis of participants’ feedback on the expository phase of the MAES© debriefing reveals several key areas for improvement, as well as a significant number of students who expressed satisfaction with the current methodology. The primary focus of most suggestions revolved around enhancing student engagement and participation, either through reducing reliance on traditional methods like PowerPoint or incorporating more interactive, hands-on learning activities. Additionally, there was a call for more personalized support for students, particularly those who experience anxiety or discomfort in public speaking. Several participants also highlighted the need for clearer organization and better pacing of activities to optimize learning outcomes. The absolute frequency (AF) and percentage of each category within Dimension 2 are summarized in Table 2.

Table 2 Absolute frequency (AF) and percentage of each category in dimension 2

The most frequently identified category in this dimension was full satisfaction (category 1), with 37.1% of participants stating they would not change anything. While this does not represent a suggestion for improvement, it provides valuable insight into students’ positive perceptions of the current methodology. Common responses from both undergraduate and postgraduate students included: “Nothing” (A4, A6, A24, A27, A29) and “Wouldn’t change anything, honestly” (A7). Others expressed general contentment with statements such as: “Everything is perfect” (A150). These responses reinforce the positive reception of the expository phase among a considerable number of students.

Some participants (10%) found certain presentations excessively long (category 2), which they perceived as detrimental to engagement and learning. Comments included: “They should be shorter” (A12-postgraduate) and “Maybe the time, make it quicker and more manageable, but overall very good” (A72-graduate). These observations suggest the need to optimize the duration of activities to sustain attention and maximize learning outcomes. In line with this, a substantial number of participants (12%) advocated for greater dynamism and interactivity (category 3). One student proposed: “Improve by not using PowerPoint for all presentations; I think it’s more enriching to do practical workshops, using questions and answers” (A5-undergraduate). Another emphasized: “Make it more interactive” (A117-undergraduate). These suggestions underscore the importance of pedagogical strategies that foster active student participation in the learning process.

Many participants (10.6%) recommended incorporating more hands-on workshops and visual materials (category 4) to enhance learning effectiveness. Illustrative comments include: “Use more practical workshops” (A8-undergraduate) and “In conclusion, implement more workshops and not so many presentations” (A120-postgraduate). These perspectives highlight the value of experiential learning as a complement to theoretical instruction. In addition, some participants (7.3%) suggested that the organization and clarity of activities could be improved (category 5). For instance, one student remarked: “I think I would structure the plus and delta parts more” (A17-postgraduate). The need for final summaries to consolidate learning was also emphasized: “Make a summary at the end of the debriefing of everything learned. Sometimes it’s a bit scattered” (A94-undergraduate). These insights indicate a demand for clearer and more structured debriefing sessions. In line with the suggestions for more hands-on sessions and better organization and clarity in the expository phase of MAES©, several participants (7.3%) expressed concerns about excessive reliance on PowerPoint® presentations (category 6), arguing that this format can become monotonous. One participant noted: “The presentation mode we students use (generally PowerPoint) sometimes gets a bit tedious and you don’t pay the attention it deserves” (A105-undergraduate). A more extreme suggestion was: “Ban the use of PowerPoint presentations because they tend to be very boring” (A137-postgraduate). These critiques point to the need for diversifying presentation tools to maintain student engagement.

Although only a smaller group of participants (4.7%) highlighted the need for emotional support and personalization (category 7), this feedback is still significant. These participants emphasized the importance of adapting activities to accommodate individual emotional and psychological differences. One student commented: “The only drawback I find is that for people who are shy or get nervous speaking in public, it can penalize them” (A40-undergraduate). This suggests that some students would benefit from more individualized support, particularly in addressing the emotional and psychological challenges of public speaking and group presentations.

Lastly, some participants (11.3%) provided specific additional suggestions (category 8) that did not fully align with the previously identified categories. These included: “Have classmates make a summary-outline of the information gathered and share it with others in a Drive after each presentation” (A48-undergraduate). Another proposal was: “Require something practical with classmates in each case” (A126-postgraduate). These ideas contribute to a broader discussion on how the expository phase can be further refined. Finally, a residual category was identified, comprising responses that did not provide relevant content or were otherwise uninformative (category 9). These responses were minimal (6.6%) but were included for completeness in the analysis.

Discussion

This study aimed to explore students’ perceptions of the expository phase of debriefing within the MAES© methodology, an innovative approach that places students at the center of the learning process. By introducing a structured phase for student-led content presentation, MAES© modifies the traditional debriefing flow. While this alteration may initially seem disruptive, participants identified multiple strengths of the expository phase. The results indicate that students perceive it as an opportunity to consolidate knowledge, clarify concepts, improve communication skills, enhance autonomy, and foster creativity by cultivating a collaborative and motivating learning environment. These aspects contribute to a richer educational experience while also developing transversal skills critical for professional practice.

In MAES© debriefing, facilitators can integrate any established debriefing structure, provided they allocate space for the expository phase to address knowledge gaps identified by participants. Unlike traditional debriefing models, where facilitators assume a predominant role in guiding reflection [22, 13], the expository phase shifts the focus toward student-led learning. In our study, this transition, while potentially perceived as interrupting the linear flow of debriefing, promoted deeper theory-practice integration and increased student engagement. These findings align with active and collaborative learning theories [41], which emphasize that students construct knowledge more effectively through engagement in discussions, problem-solving, and experiential activities. Additionally, collaborative learning highlights how pooling skills and reflections among learners working toward shared goals enhances critical thinking and knowledge application [42].

A key outcome of this study is the recognition that the expository phase contributes to the development of multiple competencies and skills while enhancing collaborative learning. Through peer interaction and knowledge exchange, students reinforce both theoretical concepts and practical skills through active participation, increasing their motivation to learn [28]. The necessity of explaining and debating concepts improves communication abilities, as students must articulate their understanding clearly, adapt their explanations to different audiences, and engage in constructive dialogue. Explaining and discussing concepts requires students to articulate ideas, adapt their discourse, and engage in dialogic learning [43, 44], which strengthens cognitive processing. These aspects, together with the variety of ways students can present their knowledge during debriefing, make the expository phase of MAES© a valuable opportunity to train effective teamwork, which is fundamental in clinical settings [45, 46].

Additionally, preparing and presenting content fosters autonomy and creativity, requiring students to critically structure and convey their ideas. This autonomy in content delivery supports deeper cognitive engagement, as students take ownership of their learning process, aligning with self-regulated learning and the objectives of MAES© [47, 5]. In this sense, the study’s results regarding the promotion of autonomy in learning can be explained by self-regulated learning theories [48], which hold that preparing and presenting content promotes independent organization, critical evaluation, and active construction of knowledge. These findings support the role of the expository phase in promoting reflective and peer learning, both fundamental pillars of the MAES© model. Guided reflection enables students to internalize key concepts from simulation, identify learning gaps, and actively seek solutions. Furthermore, the process reinforces essential competencies such as the ability to search for, analyze, and apply scientific evidence, aligning with the principles of Evidence-Based Nursing [49, 50].

In response to the question about aspects to improve, a significant proportion of participants perceived the expository phase as effective and well-structured, requiring no modifications. This suggests that, for many students, the current methodology effectively facilitates learning. This strong satisfaction rate nevertheless coexists with several calls for improvement, a contrast that underlines the importance of balancing established strengths with continuous refinement, ensuring that the methodology remains responsive to diverse student needs and learning preferences.

Many participants emphasized the importance of diversifying presentation methodologies, reducing reliance on PowerPoint® and incorporating more dynamic and interactive elements. Based on our findings, many students reported having tried presentation methods other than PowerPoint® and found them more effective at enhancing interaction and knowledge retention. Active learning strategies such as group discussion, case-based problem-solving, and patient-oriented debriefing can deepen reflection and support critical thinking. Diversifying presentation styles with videos, infographics, and role-playing can cater to various learning preferences, and integrating gamification through quizzes and challenges can help maintain engagement. Additionally, allowing students to create multimedia content, such as podcasts, videos, or social media content, can enhance motivation and continuous improvement. This feedback reflects a broader pedagogical trend toward integrating innovative educational approaches, such as gamification and interactive technologies, to transform learning environments into more participatory and engaging spaces [51, 52].

These findings further underscore the need to reconsider the facilitator’s role in MAES© debriefing. Rather than directing the session, facilitators should adopt a more flexible, supportive role, intervening primarily when students deviate from the prebriefing objectives. This adjustment enhances student agency, fostering a more collaborative and student-driven learning environment that aligns with contemporary trends in health education [53, 31].

This student-centered approach to debriefing aligns with broader research on peer-led learning, such as the study by He et al. [19], which highlights the effectiveness of structured peer debriefing in enhancing simulation performance and knowledge acquisition [26]. Similarly, Christiansen et al. [54] found that student-led debriefings resulted in reflection levels comparable to those in facilitator-led debriefings, reinforcing the idea that students can effectively lead their own reflective learning processes. However, their findings also suggest that the complexity of the simulated scenario plays a crucial role in the depth of reflection achieved. In contrast, MAES© emphasizes structured, student-led knowledge consolidation through the expository phase, ensuring that reflection is complemented by a deeper integration of evidence-based content. While Christiansen et al. focus on the equivalency of student-led and facilitator-led reflection [54], our study highlights how structured student presentations enhance engagement, autonomy, and the practical application of theoretical concepts.

Integrating the expository phase within MAES© debriefing presents a valuable opportunity for refining simulation-based education by fostering more dynamic, personalized, and student-centered learning experiences. However, several limitations must be considered. Achieving a seamless flow in debriefing depends on the ability of facilitators to create a dynamic and engaging learning environment, particularly during the prebriefing phase. Without skilled facilitators and motivated students, the effectiveness of the expository phase may be compromised. Additionally, variability in students’ ability to search for and present learning objectives may impact the expected learning outcomes. Another critical aspect is the degree of creative freedom afforded to participants in choosing their presentation methods, as some students may engage more innovatively than others, influencing group perceptions and expectations.

Future research should investigate the adaptability of the expository phase across diverse educational contexts and assess its effectiveness when combined with different simulation methodologies. For example, studies could explore the implementation of the expository phase in interprofessional simulation to determine whether it enhances teamwork across disciplines. Additionally, emerging technologies such as mixed reality [55] could further optimize this phase; investigating their integration would be a valuable research direction, exploring the impact on participants’ engagement, perceptions, and knowledge retention. Finally, the expository phase could be analyzed in in-situ simulations, where researchers could investigate whether it improves work dynamics, clinical teamwork, and non-clinical group cohesion among healthcare professionals participating in a simulation experience whose debriefing includes this methodology.

Conclusions

This study indicates that participants highly value the expository phase of debriefing in the MAES© methodology. Students particularly emphasize its effectiveness in clarifying and consolidating concepts, integrating theory with practice, fostering collaborative learning, and developing both technical and communication skills. Students appreciate the active role this phase plays in promoting autonomy and encouraging collaborative learning. By providing structured reflection and evidence-based presentations, the expository phase supports deeper understanding and enhances students’ engagement with the material.

Despite these strengths, participants also identified areas for improvement, including the need for greater dynamism, interactivity, and personalization. There is a clear call for diversifying presentation methods to better accommodate various learning preferences and maintain engagement throughout the process. These suggestions point to the potential for evolving the MAES© methodology to further enhance its effectiveness and responsiveness to diverse student needs.

In conclusion, these results suggest that the expository phase within MAES© debriefing is a valuable pedagogical tool in simulation-based learning, enabling students to integrate theoretical and practical knowledge effectively. Its structured yet flexible design may allow broader application in other simulation-based reflective analysis contexts or debriefing models, particularly in the health sciences, especially where students assume a central role in presenting theoretical and practical insights related to a simulated scenario. By integrating structured reflection with evidence-based presentations, it enables deeper concept consolidation and enhances student ownership of the learning process, making it a valuable and novel addition to the debriefing literature. Addressing the identified areas for improvement can further strengthen its role in simulation, providing a more dynamic and inclusive learning environment. To achieve this, simulation educators and curriculum designers should consider training facilitators in student-led methods, such as encouraging active participation through peer teaching, adapting the pace and structure of the expository phase, and incorporating brief reflection intervals and interactive elements to maintain energy and focus. These adjustments would not only address the call for greater dynamism but also create a more responsive and inclusive learning experience that better meets the needs of all students.

The expository phase in the MAES© debriefing model offers significant potential for enhancing student engagement and autonomy in simulation-based education. As technology, such as mixed reality, continues to evolve, this phase can be further refined to create more interactive and personalized learning experiences. Its adaptability positions it to play a key role in the future of simulation education, empowering students to take ownership of their learning.

Limitations

This study has several limitations. As with all qualitative research, external validity is inherently constrained, and the exclusive focus on students from a single institution may limit the generalizability of the findings. Future research should include broader, multicenter studies to compare and extend these results across diverse educational settings.

Additionally, the use of written questionnaires for data collection, while beneficial for obtaining a wide range of responses, may have restricted the depth and richness of the data compared to other qualitative techniques such as interviews or focus groups. Employing a combination of qualitative methods in future research could provide deeper insights into students’ experiences and perceptions of the expository phase in simulation-based learning.

Data availability

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. Alshehri FD, Jones S, Harrison D. The effectiveness of high-fidelity simulation on undergraduate nursing students’ clinical reasoning-related skills: A systematic review. Nurse Educ Today. 2023;121:105679. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2022.105679.

  2. Cant RP, Cooper SJ. Use of simulation-based learning in undergraduate nurse education: an umbrella systematic review. Nurse Educ Today. 2017;49:63–71. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2016.11.015.

  3. Fenzi G, Reuben AD, Agea JLD, Ruipérez TH, Costa CL. Self-learning methodology in simulated environments (MAES©) utilized in hospital settings. Action-research in an emergency department in the united Kingdom. Int Emerg Nurs. 2022;61:101128. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ienj.2021.101128.

  4. Kolb DA. Experiential learning: Experience as the source of learning and development (Second edition). Pearson Education, Inc. 2015.

  5. Díaz JL, Leal C, García JA, Hernández E, Adánez MG, Sáez A. Self-Learning methodology in simulated environments (MAES©): elements and characteristics. Clin Simul Nurs. 2016;12(7):268–74. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ecns.2016.02.011.

  6. Jagneaux T, Caffery TS, Musso MW, Long AC, Zatarain L, Stopa E, Freeman N, Quin CC, Jones GN. Simulation-Based education enhances patient safety behaviors during central venous catheter placement. J Patient Saf. 2021;17(6):425–9. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/PTS.0000000000000425.

  7. Mann K, MacLeod A. Constructivism: learning theories and approaches to research. Researching medical education. John Wiley & Sons, Ltd; 2015. pp. 49–66. https://doiorg.publicaciones.saludcastillayleon.es/10.1002/9781118838983.ch6.

  8. Savery JR, Duffy TM. Problem based learning: an instructional model and its constructivist framework. Educational Technol. 1995;35(5):31–8.

  9. Laal M, Laal M. Collaborative learning: what is it? Procedia - Social Behav Sci. 2012;31:491–5. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.sbspro.2011.12.092.

  10. Topping KJ. Trends in peer learning. Educational Psychol. 2005;25(6):631–45. https://doiorg.publicaciones.saludcastillayleon.es/10.1080/01443410500345172.

  11. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ. 2014;48(7):657–66. https://doiorg.publicaciones.saludcastillayleon.es/10.1111/medu.12432.

  12. Eppich WJ, Hunt EA, Duval-Arnould JM, Siddall VJ, Cheng A. Structuring feedback and debriefing to achieve mastery learning goals. Acad Medicine: J Association Am Med Colleges. 2015;90(11):1501–8. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/ACM.0000000000000934.

  13. Duff JP, Morse KJ, Seelandt J, Gross IT, Lydston M, Sargeant J, Dieckmann P, Allen JA, Rudolph JW, Kolbe M. Debriefing methods for simulation in healthcare: A systematic review. Simul Healthcare: J Soc Simul Healthc. 2024;19(1S):S112–21. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/SIH.0000000000000765.

  14. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin. 2007;25(2):361–76. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.anclin.2007.03.007.

  15. Phrampus PE, O’Donnell JM. Debriefing using a structured and supported approach. In: Levine AI, DeMaria S, Schwartz AD, Sim AJ, editors. The Comprehensive Textbook of Healthcare Simulation. Springer; 2013. pp. 73–84. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/978-1-4614-5993-4_6

  16. Eppich W, Cheng A. Promoting excellence and reflective learning in simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthcare: J Soc Simul Healthc. 2015;10(2):106–15. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/SIH.0000000000000072.

  17. Isaranuwatchai W, Alam F, Hoch J, Boet S. A cost-effectiveness analysis of self-debriefing versus instructor debriefing for simulated crises in perioperative medicine in Canada. J Educational Evaluation Health Professions. 2017;13:44. https://doiorg.publicaciones.saludcastillayleon.es/10.3352/jeehp.2016.13.44.

  18. Tudor GJ, Podolej GS, Willemsen-Dunlap A, Lau V, Svendsen JD, McGarvey J, Vozenilek JA, Barker LT. The equivalence of video self-review versus debriefing after simulation: can faculty resources be reallocated? AEM Educ Train. 2020;4(1):36–42. https://doiorg.publicaciones.saludcastillayleon.es/10.1002/aet2.10372.

  19. He X, Rong X, Shi L, Qin F, Fang Y, Zhang P, Wei T, Liang Q, Liu W. Peer-led versus instructor-led structured debriefing in high-fidelity simulation: A mixed-methods study on teaching effectiveness. BMC Med Educ. 2024;24(1):1290. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s12909-024-06262-9.

  20. Díaz-Agea JL, Jiménez-Rodríguez D, García-Méndez JA, Hernández-Sánchez E, Sáez-Jiménez A, Leal-Costa C. Patient-Oriented debriefing: impact of real patients’ participation during debriefing. Clin Simul Nurs. 2017;13(9):405–13. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ecns.2017.04.008.

  21. Decker S, Alinier G, Crawford SB, Gordon RM, Jenkins D, Wilson C. Healthcare Simulation Standards of Best Practice™: the debriefing process. Clin Simul Nurs. 2021;58:27–32. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ecns.2021.08.011.

  22. Cheng A, Grant V, Robinson T, Catena H, Lachapelle K, Kim J, Adler M, Eppich W. The promoting excellence and reflective learning in simulation (PEARLS) approach to health care debriefing: A faculty development guide. Clin Simul Nurs. 2016;12(10):419–28. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ecns.2016.05.002.

  23. Bulman C, Schutz S. Reflective practice in nursing. Wiley; 2013.

  24. Cheng A, Eppich W, Epps C, Kolbe M, Meguerdichian M, Grant V. Embracing informed learner self-assessment during debriefing: the Art of plus-delta. Adv Simul. 2021;6(1):22. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s41077-021-00173-1.

  25. Díaz-Agea JL, Manresa-Parres M, Pujalte-Jesús MJ, Soto-Castellón MB, Aroca-Lucas M, Rojo-Rojo A, Leal-Costa C. What do I take home after the simulation? The importance of emergent learning outcomes in clinical simulation. Nurse Educ Today. 2022;109:105186. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2021.105186.

  26. Díaz Agea JL, Megías Nicolás A, García Méndez JA, Adánez Martínez MDG, Leal Costa C. Improving simulation performance through Self-Learning methodology in simulated environments (MAES©). Nurse Educ Today. 2019;76:62–7. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2019.01.020.

  27. Arizo-Luque V, Ramirez-Baena L, Pujalte-Jesús MJ, Rodríguez-Herrera MÁ, Lozano-Molina A, Arrogante O, Díaz-Agea JL. Does self-directed learning with simulation improve critical thinking and motivation of nursing students? A pre-post intervention study with the MAES© methodology. Healthcare. 2022;10(5):927. https://doiorg.publicaciones.saludcastillayleon.es/10.3390/healthcare10050927.

  28. Díaz-Agea JL, Pujalte-Jesús MJ, Leal-Costa C, García-Méndez JA, Adánez-Martínez MG, Jiménez-Rodríguez D. Motivation: bringing up the Rear in nursing education. Motivational elements in simulation. The participants’ perspective. Nurse Educ Today. 2021;103:104925. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2021.104925.

  29. Persico L, Belle A, DiGregorio H, Wilson-Keates B, Shelton C. Healthcare Simulation Standards of Best Practice™: facilitation. Clin Simul Nurs. 2021;58:22–6. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.ecns.2021.08.010.

  30. Ozekcin LR, Tuite P, Willner K, Hravnak M. Simulation education: early identification of patient physiologic deterioration by acute care nurses. Clin Nurse Spec CNS. 2015;29(3):166–73. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/NUR.0000000000000123.

  31. Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: A systematic review. Res Nurs Health. 2017;40(1):23–42. https://doiorg.publicaciones.saludcastillayleon.es/10.1002/nur.21768.

  32. Tracy SJ. Qualitative quality: eight Big-Tent criteria for excellent qualitative research. Qualitative Inq. 2010;16(10):837–51. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1077800410383121.

  33. Ginsburg S, Gold W, Cavalcanti RB, Kurabi B, McDonald-Blumer H. Competencies ‘plus’: the nature of written comments on internal medicine residents’ evaluation forms. Acad Medicine: J Association Am Med Colleges. 2011;86(10 Suppl):30–4. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/ACM.0b013e31822a6d92.

  34. Mamali FC, Lehane CM, Wittich W, Martiniello N, Dammeyer J. What couples say about living and coping with sensory loss: A qualitative analysis of open-ended survey responses. Disabil Rehabil. 2022;44(12):2784–805. https://doiorg.publicaciones.saludcastillayleon.es/10.1080/09638288.2020.1850889.

    Article  PubMed  Google Scholar 

  35. Myers KA, Zibrowski EM, Lingard L. A mixed-methods analysis of residents’ written comments regarding their clinical supervisors. Acad Medicine: J Association Am Med Colleges. 2011;86(10 Suppl):21–4. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/ACM.0b013e31822a6fd3.

    Article  Google Scholar 

  36. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: A synthesis of recommendations. Acad Medicine: J Association Am Med Colleges. 2014;89(9):1245–51. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/ACM.0000000000000388.

    Article  Google Scholar 

  37. Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: implications for conducting a qualitative descriptive study. Nurs Health Sci. 2013;15(3):398–405. https://doiorg.publicaciones.saludcastillayleon.es/10.1111/nhs.12048.

    Article  PubMed  Google Scholar 

  38. Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1049732305276687.

    Article  PubMed  Google Scholar 

  39. Mayring P. Qualitative content analysis: Theoretical foundation, basic procedures and software solution. 2014.

  40. Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24(2):105–12. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2003.10.001.

    Article  CAS  PubMed  Google Scholar 

  41. Roselli ND. Collaborative learning: theoretical foundations and applicable strategies to university. J Educational Psychol - Propositos Y Representaciones. 2016;4(1):251–80.

    Google Scholar 

  42. Mägi L, Uibu E, Moi AL, Mortensen M, Naustdal K, Põlluste K, Lember M, Kangasniemi M. Collaborative learning linking nursing practice and education—Interview study with master’s students and teachers. Nurse Educ Today. 2024;139:106261. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2024.106261.

    Article  PubMed  Google Scholar 

  43. Boyd VA, Woods NN, Kumagai AK, Kawamura AA, Orsino A, Ng SL. Examining the impact of dialogic learning on critically reflective practice. Acad Medicine: J Association Am Med Colleges. 2022;97(11S):S71–9. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/ACM.0000000000004916.

    Article  Google Scholar 

  44. Kim M-Y, Wilkinson IAG. What is dialogic teaching? Constructing, deconstructing, and reconstructing a pedagogy of classroom talk. Learn Cult Social Interact. 2019;21:70–86. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.lcsi.2019.02.003.

    Article  Google Scholar 

  45. Buljac-Samardzic M, Doekhie KD, van Wijngaarden JDH. Interventions to improve team effectiveness within health care: A systematic review of the past decade. Hum Resour Health. 2020;18(1):2. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s12960-019-0411-3.

    Article  PubMed  PubMed Central  Google Scholar 

  46. Weller JM, Mahajan R, Fahey-Williams K, Webster CS. Teamwork matters: team situation awareness to build high-performing healthcare teams, a narrative review. Br J Anaesth. 2024;132(4):771–8. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.bja.2023.12.035.

    Article  PubMed  Google Scholar 

  47. Brenner CA. Self-regulated learning, self-determination theory and teacher candidates’ development of competency-based teaching practices. Smart Learn Environ. 2022;9(1):3. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s40561-021-00184-5.

    Article  Google Scholar 

  48. Zimmerman BJ, Labuhn AS. Self-regulation of learning: Process approaches to personal development. In APA educational psychology handbook: Theories, constructs, and critical issues. American Psychological Association. 2012;1:399–425. https://doiorg.publicaciones.saludcastillayleon.es/10.1037/13273-014

  49. Giesen J, Berings M, Bakker-Jacobs A, Vermeulen H, Huisman-De Waal G, Van Vught A. Facilitating an Evidence-Based quality improvement learning culture in nursing teams through coaching and identification of key influencing factors: an action research approach. J Adv Nurs. 2024. https://doiorg.publicaciones.saludcastillayleon.es/10.1111/jan.16679.

    Article  PubMed  Google Scholar 

  50. Karlsholm G, Strand LB, André B, Grønning K. Learning evidence-based practice by writing the bachelor’s thesis—A prospective cohort study in undergraduate nursing education. Nurse Educ Today. 2024;139:106239. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nedt.2024.106239.

    Article  PubMed  Google Scholar 

  51. López-Navarrete A, Zimmermann-Vildoso M, de Brito Poveda V, de Nogueira S, L. Effectiveness of combined virtual and clinical simulation compared with other active teaching strategies on health students’ learning: A systematic review protocol. JBI Evid Synthesis. 2024;22(6):1170–6. https://doiorg.publicaciones.saludcastillayleon.es/10.11124/JBIES-23-00348.

    Article  Google Scholar 

  52. Song H, Cai L. Interactive learning environment as a source of critical thinking skills for college students. BMC Med Educ. 2024;24(1):270. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s12909-024-05247-y.

    Article  PubMed  PubMed Central  Google Scholar 

  53. García-Salido C, Ramírez-Baraldes E la, Garcia-Gutiérrez D. Learners to Leaders: Impact of Instructor Roles on Nursing Students’ Professional Development in Clinical Simulations. Nursing Reports (Pavia, Italy), 2024;14(4):3652–3666. https://doiorg.publicaciones.saludcastillayleon.es/10.3390/nursrep14040267

  54. Christiansen CR, Andersen JV, Dieckmann P. Comparing reflection levels between facilitator-led and student-led debriefing in simulation training for paramedic students. Adv Simul. 2023;8(1):30. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s41077-023-00273-0.

    Article  Google Scholar 

  55. McNeill L, Gum L, Graham K, Sweet L. Removing the home court advantage’: A qualitative evaluation of LEGO® as an interprofessional simulation icebreaker for midwifery and medical students. Nurse Educ Pract. 2024;80:104138. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.nepr.2024.104138.

    Article  PubMed  Google Scholar 

Acknowledgements

Not applicable.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Author information

Contributions

GF performed conceptualization, methodology, investigation, data curation, software, resources, validation, writing (original draft), and visualization. CAJ performed conceptualization, methodology, investigation, data curation, writing (review and editing), and visualization. PSCF performed methodology, investigation, data curation, writing (review and editing), and visualization. GSL performed methodology, investigation, data curation, writing (review and editing), and visualization. CLC performed methodology, formal analysis, data curation, writing (review and editing), visualization, and supervision. JLDA performed conceptualization, methodology, validation, formal analysis, writing (original draft), visualization, and supervision. All authors approved the final version.

Corresponding author

Correspondence to Cesar Leal-Costa.

Ethics declarations

Ethics approval and consent to participate

This study was performed in accordance with the Declaration of Helsinki and was approved by the Ethics Committee of the Catholic University of Murcia (reference number CE052308). All participants were informed of the study’s objectives and provided written consent before participation. Anonymity was preserved in accordance with Spanish data protection legislation. Participant responses are presented with alphanumeric codes (e.g., A1, A2, A3).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Fenzi, G., Alemán-Jiménez, C., Cayuela-Fuentes, P.S. et al. The expository phase of debriefing in clinical simulation: a qualitative study. BMC Nurs 24, 476 (2025). https://doi.org/10.1186/s12912-025-03067-z

Keywords