
Article

NursingXR: Advancing Nursing Education Through Virtual Reality-Based Training

by
Mohammad F. Obeid
1,*,
Ahmed Ewais
2 and
Mohammad R. Asia
3
1
Division of Advanced Technology and Engineering, Shenandoah University, Winchester, VA 22601, USA
2
Computer Science Department, Arab American University, Jenin P.O. Box 240, Palestine
3
Health Sciences Department, Arab American University, Ramallah P600, Palestine
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(6), 2949; https://doi.org/10.3390/app15062949
Submission received: 28 January 2025 / Revised: 27 February 2025 / Accepted: 4 March 2025 / Published: 9 March 2025
(This article belongs to the Special Issue Virtual and Augmented Reality: Theory, Methods, and Applications)
Figure 1. Conceptual design and system architecture of NursingXR.
Figure 2. Layout document for the Tracheostomy lesson showing the various interactions and objects involved, as part of the nomenclature translation process.
Figure 3. Customized assets created for the lesson “Starting an intravenous infusion”, including (a) the IV kit and (b) the room and patient avatar.
Figure 4. Example use of the ProcedureEvents script for the Intramuscular Injection lesson, showing a high-level view of the events created for this lesson’s (a) pre-procedure, (b) procedure, and (c) post-procedure event categories.
Figure 5. Textual instructions (a) paired with visual effects (b) to guide the user through the Intramuscular Injection procedure in Training Mode.
Figure 6. NursingXR menu and navigation showing the various screens available, including (a) Welcome Screen, (b) User ID input, (c) Modules tab, (d) Results tab, (e) Tutorial tab, and (f) Settings tab.
Figure 7. Nursing students participating in the study and using NursingXR to supplement their classroom experience on various devices such as (a) the Meta Quest 2 and (b) the HP Reverb.
Figure 8. A sampling of the various steps within the Intramuscular Injection submodule, including (a) the user washing their hands, (b) the user equipping gloves, (c) filling the syringe with the medication, and (d) injecting the patient’s arm.
Figure 9. A frequency chart of responses from experts related to the didactic capacity of NursingXR.
Figure 10. Feedback related to VR comfort from male and female participants.

Featured Application

An interactive training and simulation virtual reality platform for teaching fundamentals of nursing.

Abstract

The increasing complexity of healthcare delivery and the advancements in medical technology have highlighted the necessity for improved training in nursing education. While traditional training methods have their merits, they often encounter challenges such as limited access to clinical placements, static physical simulations, and performance anxiety during hands-on practice. Virtual reality (VR) has been increasingly adopted for immersive and interactive training environments, allowing nursing students to practice essential skills repeatedly in realistic, risk-free settings. This study presents NursingXR, a VR-based platform designed to help nursing students master essential clinical skills. With a scalable and flexible architecture, NursingXR is tailored to support a variety of nursing lessons and adapt to evolving curricula. The platform has a modular design and offers two interactive modes: Training Mode, which provides step-by-step guided instruction, and Evaluation Mode, which allows for independent performance assessment. This article details the development process of the platform, including key design principles, system architecture, and implementation strategies, while emphasizing its utility and scalability. A mixed-methods evaluation involving 78 participants—both novices and experts—was conducted to assess the platform’s usability and user satisfaction. The results underscore NursingXR’s ability to foster an effective and engaging learning environment as well as its potential as a supplementary resource for nursing training.

1. Introduction

As healthcare technology evolves, nurses must continually adapt to increasingly complex medical devices, treatment protocols, and patient care techniques. This growing demand for technical proficiency places significant pressure on nursing education to ensure students acquire both theoretical knowledge and hands-on clinical skills [1]. However, nursing students often struggle with the abstract and intricate concepts of foundational courses such as physiology, pharmacology, and pathophysiology. These difficulties are exacerbated by academic deficits, including limited scientific literacy and insufficient exposure to high-level medical content. Moreover, stress plays a crucial role during training, particularly in initial hands-on practice. These challenges are especially significant in high-stakes clinical settings, where performance anxiety further limits opportunities for gaining experience [2]. Furthermore, clinical placements are often limited, making it difficult for students to gain sufficient real-world experience. Many institutions struggle to provide adequate hands-on training due to faculty shortages, resource constraints, and competition for clinical sites [3]. Additionally, physical simulations, such as the use of mannequins, while helpful, do not always fully replicate the complexity of patient care [4]. These constraints have driven the need for alternative learning methods that can provide students with the flexibility to practice clinical tasks in a realistic and controlled environment. Virtual reality (VR) training platforms offer a solution by enabling students to repeatedly practice procedures in an immersive, interactive setting without the constraints of physical labs or patient availability [5].
The integration of emerging technologies into education is transforming how students learn and acquire practical skills. Virtual reality (VR) and immersive learning environments have gained considerable attention in the past decade for their potential to revolutionize training across various fields, particularly in healthcare [6,7]. Nursing education, which traditionally relies on physical simulations, textbooks, and supervised clinical practice, is increasingly looking to such immersive training tools to enhance the learning experience [8]. By immersing students in realistic, interactive environments, VR allows them to practice critical skills in a risk-free setting, offering an approach to bridging the gap between theoretical knowledge and hands-on clinical practice. Recent studies [5,8,9,10] have shown that VR can significantly enhance skill acquisition, decision-making, and confidence in medical and nursing students. For instance, research by Chen et al. [5] demonstrated that VR-based simulations enhance knowledge, task performance, and safety of clinical practice.
Despite its promise, the adoption of VR in nursing education is still met with skepticism. Some researchers and academics question whether VR can fully replace traditional hands-on training, particularly for tasks that require tactile feedback, such as wound care, peripheral pulse, or intravenous (IV) insertion [11,12]. Furthermore, concerns about the cost of VR implementation, the learning curve for students unfamiliar with the technology [13,14], and issues such as motion sickness highlight the need for continued research [7,15]. This divergence in perspectives underscores the necessity to further investigate the efficacy of VR as a supplementary, not replacement, tool for nursing education, particularly in skill retention and clinical performance.
NursingXR, an immersive learning environment for nursing fundamentals, was designed to address many of the challenges faced in nursing education. By simulating real-world clinical environments and procedures, this platform enables nursing students to practice fundamental tasks, such as patient assessments, wound care, and IV insertion, within an immersive, interactive setting. The platform provides real-time feedback, helping students refine their skills over multiple iterations, and its modular design allows for continuous updates to meet evolving educational needs. This study builds upon the framework and design principles described in previous work [16,17,18], and discusses the implementation methodologies and validation approaches used to assess the usability and technical robustness of NursingXR. Specifically, the study seeks to address the following research questions:
  • Can VR-based platforms like NursingXR significantly enhance nursing students’ task proficiency compared to traditional methods?
  • How impactful is the user interface and interaction design in facilitating intuitive navigation and task execution for users with varying levels of experience?
  • What technical and usability challenges, such as motion sickness or control precision, arise during the use of NursingXR?
  • To what extent does the modular and scalable architecture of a VR-based training platform support the integration of new features and lessons?
  • How well does the platform perform in terms of system responsiveness, visual fidelity, and real-time feedback, and how do these factors influence user satisfaction and engagement?
This paper is organized into six sections. Section 2 reviews the state-of-the-art in nursing education and the role of VR in training. Section 3 outlines the design and development of NursingXR, including its architecture, features, and implementation. Section 4 describes the validation study, covering participant demographics and experiment design. Section 5 presents the results and discussion, with both quantitative and qualitative analyses. Finally, Section 6 concludes with key findings and future directions.

2. Immersive Learning in Nursing Education

2.1. State-of-the-Art in Nursing Training

Nursing has a long history of using training simulations that allow students to gain the required knowledge, experience, and skills [19]. Well into the 20th century, nurse training was conducted directly on people who required medical care, an approach that put patients’ lives at risk. Training has since evolved to use many safer solutions, such as human dummies (mannequins), which allow nurses to practice procedures in a safe environment without risking accidental and permanent injury to a patient [20].
Most modern simulations take place in a physical lab environment designed to emulate a real sick room, stocked with replacement chemicals, prop tools and materials, and, most importantly, training mannequins [9]. Mannequins come in a variety of models with a wide array of features, making them suitable for educating students in a range of medical procedures and bedside care techniques. Their complexity, however, comes at a high price point, including (but not limited to) the mannequin’s initial cost and the recurring expense of replacing and restocking testing materials (such as chemicals and practice solutions) and the swappable props and items used to enhance training simulations [21].
The lockdown caused by COVID-19 created a real need for distance learning in higher education institutions around the world [22,23]. To overcome the limitations of distance learning for nursing students, VR technology is considered a means of transferring practical knowledge and skills [24]. Using VR technology, nursing students can view and manipulate three-dimensional models and objects in a safe, immersive environment [25]. They can repeatedly perform actions and procedures that would normally be costly or time consuming to set up, such as working on a mechanical device or learning a hands-on skill such as pipe maintenance or drawing [26,27]. Rapidly improving development platforms and a wide range of hardware and software mean that these kinds of training tools can be produced quickly while retaining a high level of educational merit and interactive fidelity.
A number of commercial VR products focus on business-to-business models and are not available to the public. UbiSim [28] is an immersive training platform that enables users to customize scenarios. Similarly, Oxford Medical Simulation (OMS) [29] is a VR application that can be integrated with the curriculum of a number of medical courses as a multiplayer environment. Another commercial VR solution is SimX [30], which provides users with different medical scenarios for nursing in hospitals and the military. Medical Assisting Clinical Suite [31] is a VR application that allows the user to select among guided, expert, and exam training modes. It also incorporates a degree of gamification through a scoring mechanism and sound effects keyed to correct tasks or mistakes.

2.2. Effectiveness of VR in Nursing Training

In general, effectiveness and usability are of great importance in VR solutions, particularly in healthcare. Different studies have investigated the effectiveness of VR and its utility as an educational or training tool in the nursing domain. For instance, researchers in [5,7,10,11,13,32,33,34] conducted systematic reviews to evaluate the effectiveness of VR in developing skills and preparing nursing students for real clinical practice. These studies reviewed a large number of articles on using VR in nursing contexts. The analyses conclude that VR simulations can support nursing students in drug management, intravenous drug infusion, and the safe administration of medications, among other tasks. Furthermore, the reviewed studies showed that the use of VR in nursing improves knowledge outcomes. In particular, Chen et al. [5] conducted a comprehensive evaluation of the literature on the integration of VR in nursing learning and teaching, assessing knowledge, skills, satisfaction, confidence, and the performance time of different nursing procedures and tasks. The study concludes that VR simulations are effective in reinforcing users’ knowledge relative to control groups. However, it also notes that VR provides no significant improvement in skills, satisfaction, confidence, or performance time. Nevertheless, VR can be considered a complement to other simulation strategies in nursing education to enhance knowledge, and a way to improve the quality and safety of clinical practice [5,9]. Another study investigated nursing students’ experiences with VR-based skill learning [35]. The study was conducted with students enrolled in an Adult Nursing and Practice course, who were asked to use a VR nasogastric tube care skill learning system. The results identified a fast skill learning process and a stress-free environment as advantages of the VR application. However, it was noted that rather than replacing conventional skill teaching methods, the VR application should be considered a supplementary tool. This was also confirmed by Klenke-Borgmann et al., who observed VR’s significantly positive impact on clinical training and assessment [14]. Other examples of studies investigating the effectiveness of VR technology for the healthcare and nursing domains include [36,37,38].

2.3. Technical Considerations in VR Development

Previous work has explored various strategies and technical details for developing VR applications across different domains, including nursing education. For instance, the study by [39] investigated the integration of multi-access edge computing (MEC) into virtual reality environments to alleviate the burden of network transmission, a critical factor in ensuring seamless VR experiences. Similarly, another study proposed algorithms for optimizing bandwidth utilization in small cell base stations (SBS) to enhance the delivery of VR applications [40].
A user-centered design approach for VR application development was reported in [41], highlighting the importance of identifying user needs to create useful, usable, and accessible VR tools. Other studies applied standard requirement elicitation techniques from software engineering to VR development, such as those outlined in [42,43]. For example, the study by [12] emphasized the use of storyboards, interviews, user observations, and focus groups to perform requirements elicitation for VR applications, showcasing a practical approach to aligning technical design with user expectations.
In the medical and healthcare domains, researchers in [44] presented diverse technical perspectives for developing VR applications tailored for disciplines such as medicine, surgery, dentistry, and nursing. The study discussed functional requirements, the design of 3D models, necessary equipment, and the integrated development environments (IDEs) and tools employed for VR solution development. Additionally, the study by [45] provided detailed insights into the development of a pediatric VR training solution, covering aspects such as IDE selection, task analysis, interaction design, and hardware (e.g., VR headsets).
However, a notable gap remains in the detailed documentation of development and deployment phases for proposed VR solutions [16,17,18]. While the existing literature provides valuable information on the technical and functional aspects of VR development, comprehensive frameworks outlining the full lifecycle, from design to implementation and deployment, are often missing, leaving room for further research and standardization in this area.

3. Materials and Methods

3.1. Conceptual Design

The design of NursingXR is centered around creating an immersive virtual environment that replicates real-world clinical settings. The goal is to provide nursing students with a realistic platform to practice fundamental nursing procedures in a controlled, repeatable manner. Recent research has demonstrated the effectiveness of VR in medical training, highlighting its ability to create immersive, interactive, and realistic learning environments that improve skill acquisition and retention [46]. Previous work [16] indicated a number of functional requirements, shown below, that guide the implementation and development of the platform.
FR1: Provide an easy-to-use interface to access the different available functionalities.
FR2: Allow users to log in to NursingXR using unique credentials.
FR3: Provide, for each procedure, clear training objectives and step-by-step instructions with interactive elements, medical equipment, etc.
FR4: Integrate virtual patients with different symptoms and responses depending on the scenario-based training defined for each procedure.
FR5: Provide feedback mechanisms and performance evaluation based on predefined criteria to guide the user.
FR6: Make technical support and onboarding available to assist new users.
Designed to run on mainstream untethered VR devices, the platform was developed to house a library of immersive learning experiences organized as modules and submodules (lessons). For example, the module “Vital Signs” will have submodules such as “Assessing Body Temperature” and “Assessing Peripheral Pulse”. As detailed in Section 3.2.2, each lesson will have two modes of interaction: training and evaluation. While the Training Mode provides users with hints and supporting cues to perform specific procedures, the Evaluation Mode allows users to perform a procedure without hints or feedback and obtain an assessment upon completion.
The conceptual design and system architecture of NursingXR (Figure 1) integrate three primary components: the User, the Game Engine (Unity), and the Virtual Reality Head-Mounted Display (VR HMD). The User interfaces with the system by selecting either Training Mode or Evaluation Mode through the Game Engine. In Evaluation Mode, the system utilizes the user’s unique userID to collect and store performance data. The Game Engine serves as the central hub, delivering the interactive experience by transmitting visuals, audio, and haptic feedback to the VR HMD. This feedback includes immersive visuals of the virtual nursing training environment (Virtual Environment), spatial audio with sound effects (SFX), and controller vibrations that simulate tactile interactions. Performance data collected in Evaluation Mode is stored on the HMD’s onboard memory for later analysis.
Within the Game Engine, several interconnected components work together to provide an immersive and functional experience. These include OpenXR for cross-platform compatibility, the XR Interaction Toolkit for enabling user input and interactions, and a Physics engine for collision detection, haptics, and object manipulation. The engine also houses the Nursing Curriculum, which drives the sequence of steps and visual cues in Training Mode, as well as the scoring mechanisms in Evaluation Mode. The Virtual Environment is further enhanced by 3D models, including a simulated clinical examination room, tools, instruments, and a patient avatar, alongside user interface (UI) elements. Spatial sound design adds realism, incorporating background noise, event-driven audio, and sound effects that align with user actions.
The VR HMD acts as the primary output device, equipped with a display, speakers, controllers, and storage memory. The HMD and its controllers are tracked in six degrees of freedom (6DOF), enabling the system to respond to the user’s movements and input in real time. The controllers provide haptic feedback, creating a tactile connection with virtual objects, while the display and speakers immerse the user in the simulated clinical environment. This architecture ensures seamless interaction and a cohesive user experience, facilitating realistic training scenarios and comprehensive performance evaluations.

3.2. Foundational Building Blocks

3.2.1. Scalable Cross-Platform Support

NursingXR was implemented using the game engine Unity [47], a versatile platform for developing interactive simulations and, consequently, VR applications. The software development approach ensures scalability and flexibility through device-agnostic and cross-platform deployment. Central to this approach is the use of the OpenXR framework as the foundational infrastructure for NursingXR [48]. OpenXR is a widely adopted open standard that provides unified Application Programming Interfaces (APIs) for developing AR and VR applications. By standardizing the interface between XR applications and hardware, OpenXR ensures compatibility with a broad range of standalone and tethered VR systems. This design allows NursingXR to remain adaptable to future hardware advancements, preventing obsolescence and enabling continuous expansion. The interoperability not only simplifies development but also future proofs the software against evolving hardware and platform ecosystems.
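As a hedged illustration of this device-agnostic setup, a minimal Unity bootstrap that initializes whichever OpenXR-compatible loader is configured might look like the following sketch. It assumes Unity’s XR Plug-in Management package; the class name is a placeholder, not part of the NursingXR codebase.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Hypothetical bootstrap sketch: initializes whichever XR loader
// (e.g., OpenXR) is configured in XR Plug-in Management, keeping
// application code itself device-agnostic.
public class XRBootstrap : MonoBehaviour
{
    private IEnumerator Start()
    {
        var manager = XRGeneralSettings.Instance.Manager;
        yield return manager.InitializeLoader();   // picks the first compatible loader

        if (manager.activeLoader != null)
        {
            manager.StartSubsystems();             // start display/input subsystems
            Debug.Log($"XR running on loader: {manager.activeLoader.name}");
        }
        else
        {
            Debug.LogWarning("No compatible XR loader could be initialized.");
        }
    }
}
```

Because the loader is resolved at runtime, the same build can target any headset whose OpenXR runtime is present, which is what keeps the deployment hardware-agnostic.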

3.2.2. Modular Expandable Design

One of the primary intents of NursingXR is to accommodate additional training modules, allowing for seamless integration of nursing procedures as the platform evolves. The modular software architecture enables new lessons, updates, and functionality enhancements to be incorporated without disrupting existing features. The system’s event-driven design allows interactions to be dynamically managed, making it easier to modify, extend, or refine procedural steps as nursing curricula evolve. Table 1 shows a list of modules and submodules within the nursing fundamentals curriculum and their level of integration within NursingXR.
To achieve this, the software development process follows a modular and templated design, enabling new procedures to be incorporated without disrupting existing functionalities. Building on the OpenXR framework for cross-platform compatibility, NursingXR employs the XR Interaction Toolkit (XRI) to manage user interactions in the VR environment. XRI utilizes action-based interaction mapping, decoupling user inputs from device-specific configurations and supporting consistent functionality across platforms. Interactions are programmed as events, enabling specific user actions to trigger a series of predefined steps and associated system responses. For example, grabbing interactions allow users to pick up virtual objects using the controller’s grip button, while socket interactors enable objects to snap into place at predefined locations, guided by visual cues that highlight compatibility. These interactions also trigger additional events, such as updating instructions, triggering a sound effect, highlighting key objects, or progressing to the next step in a procedure.
The system’s event-driven architecture further enhances flexibility by pairing interactions with defined inputs and outputs. Inputs activate the interactions, while outputs execute commands once interactions are completed successfully for both modes. The two modes implemented are explained in detail in Section 3.2.3, but in short, in Training Mode, objects are dynamically highlighted and activated as users progress through predefined steps, with each interaction leading to the next in a controlled sequence. In Evaluation Mode, however, objects remain interactable from the start, and the system tracks user actions against a predefined key that specifies the correct sequence of operations. At the end of the evaluation, the tracker compares user performance to the key and generates a score, providing actionable feedback. This combination of modularity, event-driven design, and cross-platform compatibility ensures that NursingXR is both scalable and adaptable, making it a powerful tool for nursing education and training.

3.2.3. Multi-Modal Functionality

For each nursing submodule (lesson), NursingXR offers two distinct modes: Training Mode and Evaluation Mode, each serving complementary purposes in the learning process (Table 2).
Training Mode focuses on teaching users how to perform selected procedures by providing step-by-step instructions and visual guidance. This mode helps mitigate performance anxiety by allowing users to practice at their own pace in a low-pressure environment, ensuring they understand each step correctly before moving forward. Users are restricted to one action at a time, reinforcing the correct sequence of steps. Training Mode consists of three interconnected sections: pre-procedure, procedure, and post-procedure, each guiding the user through key tasks. Instructional mechanisms include highlighting relevant 3D objects, playing sound effects, and displaying instructional boards to enhance comprehension and engagement. Training Mode does not assess or grade user performance; instead, its primary purpose is to foster confidence and competence through unlimited, guided practice.
Evaluation Mode, on the other hand, is designed to test users’ comprehension and readiness to apply what they have learned in Training Mode. This mode removes instructional guidance and visual indicators, requiring users to rely on memory and prior training to complete the selected procedure independently. Unlike Training Mode, Evaluation Mode consists of a single “manager” section instead of the three-step structure. This manager tracks the sequence of user actions, their duration, and procedural accuracy. The environment adopts a sandbox-style approach, granting users full control over object interaction and task execution order. The system generates a detailed performance report, comparing user actions against the correct procedural flow and assigning a proficiency score. Additionally, each user is assigned a unique ID, ensuring that their performance data can be securely stored, accessed later for review, and used for further evaluation or tailored instruction. Evaluation Mode provides an essential tool for assessing readiness and identifying areas where additional training may be required.
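To make the Evaluation Mode tracking concrete, the following is a minimal sketch of how such a “manager” could be structured in Unity C#. The article describes the behavior but not the code, so the class, field names, and scoring rule here are assumptions, not the shipped implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch (not the shipped code) of the Evaluation Mode
// manager: it records each completed user action and, at the end,
// compares the sequence against a predefined key to produce a
// proficiency score and the elapsed duration.
public class EvaluationManager : MonoBehaviour
{
    [SerializeField] private List<string> answerKey = new List<string>(); // correct ordered steps
    private readonly List<string> performed = new List<string>();
    private float startTime;

    private void Start() => startTime = Time.time;

    // Called by interaction events whenever the user completes an action.
    public void RecordAction(string actionId) => performed.Add(actionId);

    // Proficiency score: fraction of key steps performed in the correct order.
    public float ComputeScore()
    {
        int matched = 0;
        foreach (string action in performed)
        {
            if (matched < answerKey.Count && action == answerKey[matched])
                matched++;
        }
        return answerKey.Count == 0 ? 0f : (float)matched / answerKey.Count;
    }

    public float ElapsedSeconds => Time.time - startTime;
}
```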

3.3. Implementation and Development

3.3.1. Nomenclature Analysis and Translation

The accurate translation of nursing procedures into virtual interactions was a critical step in the design process. To achieve this, a detailed nomenclature analysis was conducted to break down each task into its essential steps. These steps were then mapped onto corresponding virtual interactions, ensuring that the tasks performed in VR mirrored real-world procedures as closely as possible. The nomenclature analysis involved the collaboration of nursing professionals and VR developers to create a common language that accurately reflects clinical practice.
Furthermore, to ensure the fidelity and validity of the developed system, a planning and data collection phase took place at the outset of the development cycle to obtain relevant information about each nursing lesson, including listing and analyzing each step of the procedure and the objects/interactions that step involves. Subsequently, categories of nomenclature were classified for the nursing procedures to identify specific terminology, which could then be translated (converted) into terminology compatible with game engine design and development. This mapping is created by asking questions such as: What is the user seeing? What object does the user touch? What does the user do with this object? What happens when the user takes that action? How should the environment react to the user’s action? The resulting translation mapping included (among many others) the following (a compact encoding sketch appears at the end of this subsection):
Nursing chapter = module.
Lesson in a chapter = submodule = Game engine scene.
Leave the lesson = interact with door handle.
User uses hands to handle objects = interactors.
Instruments and tools = 3D models and grabbable interactables.
UI elements or instructions = 2D graphics and selectable interactables.
Locations of interactions in the room = locomotion spots and anchors.
Two objects touching each other = collision detection.
Object moves = animation.
Sterile field = collision radius.
Wait for event to occur (e.g., mercury movement in thermometer) = start simulation clock and animate object.
Object placed in approximate location = socket interaction.
Item placed in trash or hazmat bin = dispose and destroy (garbage collection).
This mapping serves as the foundation of the developed program and creates the infrastructure for studying the real-world process and then building a faithful representation of its simulated equivalent. The result of this effort is a template or rubric that depicts the mapping specifically for each submodule (lesson) and brings each step of the procedure down to its lowest common denominator. This layout document shows the elements involved in performing an entire procedure, including interactions, UI displays, controls and settings, and user events (Figure 2).
This material was the outcome of multiple meetings, consultations, and discussions between nursing staff and software developers across institutions, as well as analysis of curriculum materials such as the textbook, lab manual, and syllabus. The devised template document was divided into sections covering general information, pre-procedure, procedure, and post-procedure. A color-coding scheme was used to identify the various elements for translation. Although time-consuming, this document served as a meeting ground for software developers and nursing professionals, and as a documented mapping of all elements involved in the simulation, equating each action that a nurse would take with an equivalent action inside the NursingXR application.
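To make the translation concrete, the mapping listed above could be encoded for tooling as a simple lookup, as in this illustrative C# sketch; the enum and entry names are assumptions for illustration, not the project’s actual schema.

```csharp
using System.Collections.Generic;

// Illustrative encoding of the nomenclature translation table above.
public enum EngineConstruct
{
    Module, Scene, Interactor, GrabInteractable, SelectableUI,
    LocomotionAnchor, CollisionDetection, Animation, CollisionRadius,
    SocketInteraction, DisposeAndDestroy
}

public static class NomenclatureMap
{
    public static readonly Dictionary<string, EngineConstruct> Table =
        new Dictionary<string, EngineConstruct>
        {
            { "Nursing chapter",                       EngineConstruct.Module },
            { "Lesson in a chapter",                   EngineConstruct.Scene },
            { "User uses hands to handle objects",     EngineConstruct.Interactor },
            { "Instruments and tools",                 EngineConstruct.GrabInteractable },
            { "UI elements or instructions",           EngineConstruct.SelectableUI },
            { "Locations of interactions in the room", EngineConstruct.LocomotionAnchor },
            { "Two objects touching each other",       EngineConstruct.CollisionDetection },
            { "Object moves",                          EngineConstruct.Animation },
            { "Sterile field",                         EngineConstruct.CollisionRadius },
            { "Object placed in approximate location", EngineConstruct.SocketInteraction },
            { "Item placed in trash or hazmat bin",    EngineConstruct.DisposeAndDestroy },
        };
}
```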

3.3.2. Environment Design and Assets

The visual fidelity of the virtual environment was a key focus in the design of NursingXR. The platform uses high-quality 3D models of medical equipment, patient rooms, and clinical settings to create an immersive experience for users. Photorealistic textures and lighting effects were incorporated to enhance the sense of presence, ensuring that users feel as though they are working in a clinical setting.
The design of the art assets was informed by consultations with healthcare professionals to ensure that the virtual representations accurately reflect real-world clinical settings. Several approaches were used to create content (assets) for the environment. These included the use of photogrammetry to create an optimized, to-scale digital twin of the examination room, with textures created from photographs. Once the 3D environment was integrated into the game engine, it was deconstructed into modular parts, allowing the development team to enable or disable its components based on the corresponding lesson (i.e., procedure). Other specialized, lesson-specific objects and instruments also had to be created. This necessitated examining each lesson to identify props, tools, objects, disposables, instruments, or any items the nurse would interact with. These models were created via 3D modeling tools such as Blender [49] or Maya [50], textured via physically based rendering (PBR) techniques, and optimized with levels of detail (LOD) where necessary. Examples of these models include thermometers, germicidal wet wipes, nasogastric tubes, IV kits, and injection kits (Figure 3). Generic models, such as patient avatars, towels, napkins, and gloves, were obtained from online asset repositories.

3.3.3. Controls and Interactions

The platform was designed to facilitate seamless interactions with the virtual environment, enabling users to repeatedly train on the clinical procedures. Users engage with the platform through VR controllers, with inputs and gestures mapped to specific actions that closely simulate real-world nursing tasks. These controls ensure that tasks such as navigation, object manipulation, and user interface (UI) interaction are intuitive and accessible.
Locomotion within the virtual environment is flexible, allowing for various movement modes based on user preferences as well as physical setups and constraints. Users can move through the environment by teleporting using the right joystick, engaging in continuous motion with the left joystick, or physically walking in room-scale mode in real life. Object manipulation is equally intuitive. Users can grab objects by pressing and holding the grip button on either controller. Releasing the button allows the object to be dropped or placed. This grabbing mechanism is seamlessly integrated into the simulation, enabling users to handle virtual tools and instruments naturally. Interacting with the UI is achieved using the trigger button on the controller. This button allows users to select on-screen elements or interact with sliders in the interface. The UI design ensures accessibility and simplicity, enabling users to make adjustments or selections without interrupting the flow of their training. To enhance immersion, specific actions are tied to contextual interactions within the environment. For example, users can exit a training room and return to the main menu by gripping and releasing the door handle, replicating real-world motions. These features collectively create a cohesive and intuitive interaction system that aligns with the platform’s goal of providing realistic and engaging nursing education experiences.
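For example, the door-handle exit described above maps naturally onto an XRI grab interactable whose release event loads the menu scene. The sketch below assumes XR Interaction Toolkit 2.x event names and a hypothetical scene name; it is an illustration, not the platform’s actual script.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: return to the main menu when the user grips
// and then releases the door handle, replicating a real-world motion.
[RequireComponent(typeof(XRGrabInteractable))]
public class DoorHandleExit : MonoBehaviour
{
    [SerializeField] private string mainMenuScene = "MainMenu"; // assumed scene name

    private void Awake()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        // selectExited fires when the grip button is released.
        interactable.selectExited.AddListener(OnReleased);
    }

    private void OnReleased(SelectExitEventArgs args)
    {
        SceneManager.LoadScene(mainMenuScene);
    }
}
```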
To support the contrasting lessons as well as the different modes within NursingXR, an event-driven system was implemented in the Unity game engine to monitor XR interactions in real time and track user progress through each procedure. Central to this system is a script named ProcedureEvents, which orchestrates the entire procedural flow. This script uses a composition-based relationship with a custom class named ProcedureCommentAndEvents, defined using UnityEvents. Each instance of this class represents a single step in the procedure and incorporates two key elements:
(a)
Step Description (TextAreaAttribute String variable): Stores textual descriptions of each step, providing informative guidance and instructions for users.
(b)
Event Trigger (UnityEvent variable): Leverages UnityEvents to execute custom code or actions upon activating a step, enabling dynamic behavior and real-time user interaction.
The ProcedureEvents script maintains an array of ProcedureCommentAndEvents, functioning as a structured list that defines the sequence of steps constituting the entire procedure. To activate a specific step, a public function accepts an integer value as input, referencing the corresponding step in the array. This function retrieves the associated step description for user presentation and triggers the linked event, executing any scripted actions. This architecture ensures a flexible and extensible system for managing procedural training. The same script is used in all lessons and, within each lesson, events/steps are grouped in pre-procedure, procedure, and post-procedure categories. The use of this script and modular approach is demonstrated in Figure 4 for the Intramuscular Injection lesson. The figure also shows an expanded view for one example event in each category in the adjacent column, depicting how various interactions can be invoked within the same event occurrence including enabling/disabling objects, making objects accessible, changing UI content, switching outlined objects, and triggering SFX, among others. Several interactions can be seamlessly added for each step. For other lessons, the same script can be used but the events/interactions would differ.
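A minimal reconstruction of this pattern is sketched below. The two class names, the TextArea string, the UnityEvent trigger, the step array, and the integer-indexed activation function come directly from the description above; everything else (such as the instruction-board field) is an assumption for illustration.

```csharp
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.UI;

// Sketch of the described pattern; only the two class names and each
// step's two fields come from the text, the rest is assumed.
[System.Serializable]
public class ProcedureCommentAndEvents
{
    [TextArea] public string stepDescription;   // textual guidance for this step
    public UnityEvent eventTrigger;             // actions fired when the step activates
}

public class ProcedureEvents : MonoBehaviour
{
    public ProcedureCommentAndEvents[] steps;       // ordered steps of the full procedure
    [SerializeField] private Text instructionBoard; // assumed UI target for step text

    // Activates the step at the given index: presents its description
    // and invokes the linked UnityEvent (highlighting, SFX, etc.).
    public void ActivateStep(int stepIndex)
    {
        if (stepIndex < 0 || stepIndex >= steps.Length) return;
        instructionBoard.text = steps[stepIndex].stepDescription;
        steps[stepIndex].eventTrigger.Invoke();
    }
}
```

In the editor, authoring a lesson then amounts to populating the steps array and wiring interaction events (e.g., a socket interactor’s select event) to call ActivateStep with the next index.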

3.3.4. Implementing Modes and Lessons

As mentioned above, incorporating OpenXR for cross-platform compatibility and leveraging the XR Interaction Toolkit (XRI) for managing action-based interactions facilitates scalability and adaptability and allows deployment to be hardware-agnostic. To streamline development and collaboration, GitHub [51] was used for version control, allowing teams from the two partnering international institutions to efficiently manage changes and contributions. Features like version tracking and workflow automation ensured synchronization and consistency throughout the project.
A custom template scene was developed within the NursingXR repository to serve as a reusable framework for creating new modes and lessons. This scene includes the developed event system as well as configured elements such as reusable 3D assets, interaction managers, and pre-procedure activities. Commonly used objects and their associated interactions, like medical tools, virtual patient avatars, and instructional boards, are preloaded into the template, simplifying the integration of standard components into new lessons. For procedures requiring unique tools or interactions, developers can add customized 3D assets and behaviors as needed. Unity’s array and list classes are utilized to manage these assets, ensuring structured and scalable development. This template supports the efficient implementation of both Training and Evaluation modes by providing a consistent foundation while allowing for customization. New nursing procedures can be added seamlessly by reusing and adapting existing elements, maintaining the platform’s modular and expandable nature.

3.3.5. Progressive Multi-Sensory Guidance

To enhance the realism and effectiveness of the training experience and contribute to knowledge retention, a combination of visual, haptic, textual, and auditory feedback was implemented and programmed to be driven by the event system. These guidance techniques are particularly prominent in Training Mode, where they serve to support users in mastering nursing procedures by offering clear, step-by-step assistance.
Visual cues play a central role in guiding users through tasks. For instance, objects that require interaction are surrounded by an outlined yellow box, which disappears once the task is successfully completed, reinforcing correct actions (see Figure 5a and Figure 6a). Additionally, green check boxes on the UI instructional panel signify task completion. Textual and auditory sound guidance are also implemented to provide a rich and immersive learning experience. Instructions are delivered in English through on-screen text and accompanying aural cues, offering comprehensive support as users navigate the procedures (see Figure 5b). Beyond simple guidance, the platform integrates sophisticated contextual sound effects, including fluid sounds, glove application sounds, heartbeats, clock ticking, plaster and bandage application sounds, bottle and vial opening sounds, and object discard sounds. These audio elements simulate realistic clinical scenarios, reinforcing task engagement and immersion. Haptic feedback further enhances the immersive experience by simulating tactile sensations associated with specific tasks. For example, users feel vibrations through the VR controllers when performing actions such as inserting an IV or feeling for a patient’s countable pulse.
By combining visual, audio, and tactile feedback, the system creates an immersive training experience that reinforces accuracy, enhances skill development, and bridges the gap between theoretical learning and practical application to some extent.
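As one concrete illustration of the haptic channel, the pulse-palpation feedback described above could be driven by periodic controller impulses. The sketch below assumes the XR Interaction Toolkit’s controller API; the component name and the rate/amplitude values are illustrative, not the shipped tuning.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal haptics sketch (assumed values): pulse the controller held
// against the patient's wrist roughly in sync with a heartbeat.
public class PulseHaptics : MonoBehaviour
{
    [SerializeField] private XRBaseController controller; // hand touching the wrist
    [SerializeField] private float beatsPerMinute = 72f;  // assumed demo rate
    [SerializeField, Range(0f, 1f)] private float amplitude = 0.4f;

    private float timer;

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer >= 60f / beatsPerMinute)
        {
            timer = 0f;
            // Short vibration per beat; second argument is duration in seconds.
            controller.SendHapticImpulse(amplitude, 0.1f);
        }
    }
}
```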

3.3.6. Functionality and Deployment

The NursingXR platform aimed to deliver a comprehensive and immersive training experience for nursing students with a wide range of interactive lessons. Available nursing lessons are organized into modules and submodules, each targeting a specific nursing procedure. Table 1 outlines the supported and under-development lessons within NursingXR. These modules encompass essential nursing skills, ranging from aseptic techniques to advanced procedures like intravenous infusion monitoring. Each lesson is divided into pre-procedure, procedure, and post-procedure sections, reflecting real-world clinical workflows. As shown in Figure 6, upon launching NursingXR, users are introduced to the platform through a brief tutorial that teaches navigation, teleportation, object manipulation, and interaction with the user interface (Figure 6a). After completing or skipping the tutorial, users input a four-digit identification number (Figure 6b), which is used by the Evaluation Mode to track progress and scores, stored locally as a JSON file (a minimal storage sketch follows the tab list below). They are then taken to the Main Menu (Figure 6c), which includes five tabs:
  • About: Provides details about the app and its developers.
  • Modules: Lists available nursing lessons, which can be launched in either Training or Evaluation Mode (Figure 6d).
  • Results: Displays the latest scores for the user’s completed lessons (Figure 6e).
  • Tutorial: Allows users to revisit the initial controls and navigation tutorial (Figure 6a).
  • Settings: Offers configuration options, including volume control, locomotion modes, subtitles, and user ID verification (Figure 6f).
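How such results might be persisted locally is sketched below, assuming Unity’s built-in JsonUtility and persistent data path; the LessonResult fields are illustrative, not the platform’s actual schema.

```csharp
using System.IO;
using UnityEngine;

// Hedged sketch of storing Evaluation Mode results as JSON on the
// headset, keyed by the four-digit user ID; field names are assumed.
[System.Serializable]
public class LessonResult
{
    public string userId;
    public string lesson;
    public int score;
    public float durationSeconds;
}

public static class ResultStore
{
    public static void Save(LessonResult result)
    {
        string json = JsonUtility.ToJson(result, prettyPrint: true);
        string path = Path.Combine(Application.persistentDataPath,
                                   $"results_{result.userId}_{result.lesson}.json");
        File.WriteAllText(path, json);
    }
}
```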
Within the Modules tab, users can select a module and submodule to launch a simulation. Each simulation incorporates the pre-procedure, procedure, and post-procedure sections to replicate the full scope of nursing tasks. For example, in the Wound Drainage and Specimen lesson, users perform activities such as cleaning a wound, collecting a specimen, and applying a new bandage while adhering to strict hygiene protocols. Similarly, the Peripheral Pulse lesson uses haptic feedback to simulate a patient’s pulse and requires users to accurately measure and record the value.
The software’s wide compatibility and accessibility are demonstrated by its support for an array of mainstream VR hardware. Table 3 lists the various supported hardware platforms, detailing key specifications such as current price, resolution, refresh rate, field of view (FOV), weight, and tracking capabilities.

4. Validation and User Study

In order to investigate the validity and utility of NursingXR, a user study was designed and conducted to highlight its strengths and areas of improvement as a training tool for two distinct groups of participants. The first group comprised students currently in their first or second year (Novices), and the second comprised students who had completed three years of a nursing bachelor program (Experts). This study was approved by Shenandoah University’s Institutional Review Board [SU IRB #1090]. All participants provided informed consent after receiving a detailed explanation of the process and potential risks. No personal or identifying data were collected in this study.

4.1. Demographics

Participants in this study (n = 78) included 25 novices and 53 experts, as defined above, from both participating institutions (SU and AAUP). The cohort was predominantly female, with approximately 72% identifying as such, while the remaining 28% were male, reflecting the general demographic trends in the nursing profession (Figure 7). Most of the participants, 92.3%, were between the ages of 18 and 25, with the remainder aged 40 and above. In terms of prior technological familiarity, 56% of participants reported having experience with video games or entertainment simulations, and 52% had previously used virtual reality (VR) systems.

4.2. Experiment Design and Protocol

The goal of this experiment is to investigate overall usability, satisfaction, and initial reactions to NursingXR’s interface, feedback mechanisms, and ability to support accurate clinical task execution. While this study did not include a direct comparison to traditional field-based training, participants’ responses were used to evaluate the system’s perceived realism and effectiveness in replicating clinical procedures. Participants engaged with the platform in both the Training Mode, which provides guided instructions, and the Evaluation Mode, where tasks are completed independently. A mixed-methods approach was employed, combining quantitative and qualitative data collection. Quantitatively, questions that evaluate usability, visual and haptic feedback, and overall satisfaction were used with a 5-point Likert scale. These measures assessed the platform’s effectiveness for participants with varying levels of experience. Qualitatively, on the other hand, open-ended responses captured participants’ perceptions of the platform’s realism, its potential impact on nursing education, and suggestions for improvement. Thematic analysis of these responses identified recurring patterns and provided actionable insights for refining the platform. Data collection for the experiment followed a structured protocol to ensure consistency and reliability across sessions:
Reading and signing the Informed Consent form (5 min): Participants were required to sign a consent form before starting the experiment, acknowledging that their data would remain confidential and that they had been informed of potential VR side effects, such as dizziness, vertigo, or loss of balance. The experiments were conducted with assistance from trained VR lab technicians.
Introduction to VR equipment (5 min): VR lab technicians guided participants on equipping and adjusting the VR headset and using the VR controllers. Participants were introduced to the VR tutorial, which demonstrated basic controls such as navigation, interaction, and object manipulation. Each participant was assigned a randomized four-digit ID to ensure anonymity while tracking progress.
Engagement with NursingXR (10 min): Participants were encouraged to explore NursingXR at their own pace. They selected a submodule of their choice and worked through the steps of the procedure. VR lab technicians remained available to provide guidance and assistance. Figure 8 depicts some moments from the Intramuscular Injection submodule.
Questionnaire Completion (5 min): After completing the session, participants were assisted in removing the VR headset and directed to an online questionnaire. Small talk initiated by the researcher helped participants reflect on their experience before filling out the survey. The questionnaire could be accessed by scanning a barcode on mobile devices or via lab PCs.
Two questionnaires were designed for the two participant groups, reflecting their unique perspectives and objectives. Both surveys included questions focused on usability, knowledge retention, visual and haptic feedback, ease of understanding, and overall satisfaction. However, the survey for experts included additional questions related to the utility of NursingXR in replicating clinical procedures and its pedagogical potential. Closed-ended questions were designed to evaluate these metrics on a 5-point Likert scale, while open-ended questions allowed students to provide qualitative insights into their experience to uncover nuanced perspectives on the platform’s usability, realism, and educational value, as well as suggestions for improvement.

5. Results and Discussion

This section presents the findings from the study, analyzing the usability, user experience, and effectiveness of NursingXR. The results are structured to align with the research questions (RQs) outlined in the Introduction, which focus on evaluating task proficiency, usability challenges, system design, and technical considerations. Throughout this section, references to these research questions will be denoted as (RQ1, RQ2, etc.) to maintain clarity and systematically connect the findings to the study’s objectives.

5.1. Quantitative Analysis

This section evaluates the responses from novices and experts regarding various aspects of the NursingXR platform. Ratings were based on a Likert scale (1 = Strongly Disagree to 5 = Strongly Agree), and independent sample t-tests were conducted to compare perceptions between the two groups. Table 4 shows the results, including descriptive statistics (mean, standard deviation) as well as the outcome of each t-test for the various questions (Q1, Q2, Q3, etc.) assessed in the questionnaire. These analyses address RQ1 regarding the platform’s effectiveness in enhancing task proficiency and RQ2 regarding the role of user interface and interaction design in facilitating task execution.
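For reference, the independent-samples t statistic behind these comparisons, in its common pooled-variance form (the paper does not state whether a pooled or Welch variant was used, so this form is an assumption), is

$$
t = \frac{\bar{x}_1 - \bar{x}_2}{s_p \sqrt{\frac{1}{n_1} + \frac{1}{n_2}}},
\qquad
s_p = \sqrt{\frac{(n_1 - 1) s_1^2 + (n_2 - 1) s_2^2}{n_1 + n_2 - 2}},
$$

with $n_1 = 25$ novices and $n_2 = 53$ experts, giving $n_1 + n_2 - 2 = 76$ degrees of freedom for the reported two-tailed p-values.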
With regard to user interface and user experience (Q1), novices rated the main menu’s ease of use slightly lower (mean: 3.48) than experts (mean: 3.7). However, the t-test revealed no statistically significant difference (two-tailed p-value = 0.9025), indicating that both novices and experts found the main menu intuitive. Feedback regarding ease of navigation within the application (Q2) was also high, with novices rating it 3.64 on average and experts slightly lower at 3.43. The difference did not approach statistical significance (p-value = 0.7604), highlighting that both groups generally found navigation intuitive, despite minor usability challenges experienced by some participants. Both groups rated object interaction (Q3) positively, with novices giving an average rating of 3.64 and experts rating it slightly lower at 3.28. The t-test results (p-value = 0.1570) confirmed that there was no statistically significant difference between the groups. This suggests that object manipulation was accessible across experience levels, although minor improvements could enhance interaction consistency. This provides insight into how interface design and interaction mechanics influence user experience (RQ2).
As for general functionality, the visual appeal (Q4) of the application was rated highly by both groups, with novices assigning a mean score of 3.4 and experts a mean score of 3.84. The difference approached significance but was ultimately not statistically significant (p-value = 0.0948). These results indicate that the platform’s visual design and fidelity were appreciated by participants regardless of experience, highlighting its engaging and immersive qualities (RQ5). For the clarity of procedural steps (Q5) which was evaluated with the statement, “Steps to complete procedure were easily understood”, novices gave this a mean score of 3.68, whereas experts rated it 3.21. While the t-test result (p-value = 0.0681) indicated a trend toward significance, the findings suggest that novices might find the procedures slightly easier to follow compared to experts, possibly due to differing expectations or familiarity with traditional methods (RQ3). When asked if “VR lessons are similar to those in real life” (Q6), novices and experts were in close agreement, with mean scores of 3.36 and 3.38, respectively. The lack of significant difference (p-value = 0.9514) suggests that both groups found the VR lessons reasonably reflective of real-world nursing procedures, validating the realism embedded in the platform (RQ1).
Finally, the potential of VR as an equivalent or complementary educational method was assessed through several questions. For the statement, “VR educational potential is at least equivalent to traditional methods” (Q7), novices assigned a mean score of 3.52, while experts rated it slightly higher at 3.64. However, the t-test (p-value = 0.6278) revealed no significant difference between the groups, suggesting that both novices and experts recognize VR’s viability as a teaching tool comparable to traditional laboratory methods (RQ1). In terms of VR serving as a complementary tool to traditional methods (Q8), novices rated this higher (mean = 3.36) than experts (mean = 3.09), though the difference was not significant (p-value = 0.3287). These findings indicate that novices tend to view VR more favorably as a supplement to traditional teaching methods compared to experts (RQ3). This may reflect the open-minded perspective of novices who are less entrenched in traditional methods. The platform’s potential for improving memory retention was evaluated through the statement, “VR provides better information retention (memory capacity) than traditional education” (Q9). Novices rated this higher (mean = 3.88) compared to experts (mean = 3.60), though the difference was not statistically significant (p-value = 0.3075). These results suggest that both groups perceive VR as a promising tool for enhancing memory retention (RQ5), though its effectiveness in this regard may vary based on individual preferences and learning styles, similar to what was found in [46].
Figure 9 further highlights experts’ perceptions of VR’s broader applicability in nursing education and didactic capacity, particularly its potential to support learning beyond the immediate scope of NursingXR. When asked to respond to three specific statements (on a 5-point Likert scale), responses from experts shed light on the platform’s perceived strengths and areas for broader educational impact. The findings show that responses were predominantly high, indicating a generally positive reception to VR’s role in nursing education. Most experts expressed interest in incorporating VR into other courses and acknowledged its potential to improve learning outcomes and alleviate stress, reflecting the platform’s perceived utility and adaptability in diverse educational contexts (RQ4).
The results of this analysis provide valuable insights into the usability, realism, and educational potential of NursingXR. Both novice and expert participants rated the platform positively, with no statistically significant differences for most metrics. This highlights the platform’s accessibility and effectiveness across varying experience levels. However, the trend toward significance for procedural clarity suggests that further refinements to task instructions and user guidance may improve usability, particularly for experts accustomed to traditional methods (RQ3). Additionally, while both groups appreciated the visual appeal and potential for information retention, these areas could be further enhanced to maximize engagement and learning outcomes (RQ5). The findings validate NursingXR as a viable and immersive training tool that aligns closely with traditional nursing education while offering the advantages of interactive and memorable VR experiences. Future iterations should address minor usability challenges and focus on optimizing its integration into existing nursing curricula to fully realize its potential as a transformative educational tool.

5.2. Qualitative Analysis

For the novice group, some broad patterns emerged, though the feedback lacked depth. Many novice participants described NursingXR as useful and enjoyable; phrases such as "fun experience", "good idea", and "useful and fun to use" were prevalent, supporting the platform's potential to engage learners and facilitate skill acquisition (RQ1, RQ5). Novices offered little specific criticism, apart from one mention of difficulty with tasks like "picking up an alcohol swab". The scarcity of detailed negative feedback could reflect either a limited ability to critically evaluate the platform or general contentment with its functionality (Table 5). Novice feedback was often brief and repetitive (e.g., "good", "no"), suggesting that engagement with the feedback process may have been surface-level, reflecting unfamiliarity with evaluating such platforms or, again, general contentment with the design (RQ2, RQ3).
As for the expert group, more nuanced feedback was provided (Table 5). Usability challenges were identified, including difficulty selecting menu items, issues with motion sickness, and the need for clearer instructional guidance. While experts acknowledged the platform’s educational value, they also highlighted refinements needed for widespread adoption (RQ3, RQ4). They also commented on specific interaction mechanisms, such as difficulty with grip mechanics or aligning objects, and suggested enhancements for tactile feedback and functionality. Notably, several experts emphasized the platform’s potential to “transform nursing education” but stressed that improvements in interaction precision and feedback mechanisms were necessary (RQ5).
Given the above, some of the themes identified include the following:
  • Positive reception by both groups of the platform's capacity to engage learners and its potential usefulness, with novices expressing more unqualified enthusiasm.
  • Usability challenges identified by experts and one novice related to controls, navigation, and motion sickness.
  • Educational potential recognized by both groups highlighting the platform’s ability to enhance nursing education, though experts were more cautious about its current readiness to replace traditional methods. It is worth noting that the intent is for NursingXR to supplement and not replace traditional teaching methods.

5.3. User Comfort

The study revealed a range of experiences regarding VR-induced discomfort: 34% of participants reported no discomfort at all, while others noted specific symptoms, including headache (21%), motion sickness or disorientation (23%), and eye strain (19%). Notably, these figures differed when broken down by gender. A closer look at the data, presented in Figure 10, indicates distinct patterns: the relative frequency of specific symptoms reported by male and female participants differed across symptom categories. Such insights are crucial for tailoring the platform to accommodate diverse user needs, particularly in addressing ergonomic and usability challenges that may vary with demographic factors (RQ3, RQ5).
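Although only descriptive percentages are reported here, a chi-square test of independence would be one way to examine formally whether symptom distribution differs by gender. The following sketch is hypothetical: the counts are placeholders rather than the study's data, and Python/SciPy is again an assumed tool.

```python
# Hypothetical sketch: chi-square test of independence between gender and
# reported discomfort symptom. The counts below are PLACEHOLDERS chosen for
# illustration; they are NOT the study's data (the study reports only
# aggregate percentages, visualized in Figure 10).
import numpy as np
from scipy.stats import chi2_contingency

# Columns: none, headache, motion sickness/disorientation, eye strain
# Rows: male, female (hypothetical counts)
observed = np.array([
    [14,  9,  8,  7],   # male
    [13,  7, 10,  8],   # female
])

chi2, p, dof, _ = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.4f}")
# A small p-value (e.g., < 0.05) would indicate the symptom distribution
# differs between genders; a large one would not support such a difference.
```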

6. Conclusions

This study detailed the development and evaluation of the NursingXR platform, a virtual reality-based tool for training nursing students in fundamental clinical skills. The research provided an in-depth account of the methods and techniques used to design and implement the platform, focusing on its modular and scalable architecture, realistic virtual environments, and multi-sensory feedback mechanisms. Alongside its development, NursingXR was evaluated for usability, task proficiency, and user satisfaction, offering insights into its effectiveness and potential as an immersive learning platform.
Through a mixed-methods evaluation, NursingXR demonstrated strong usability, realism, and educational potential, with generally positive ratings from both novice and expert users. Novices showed higher enthusiasm for the platform’s educational possibilities, while experts provided valuable insights into areas requiring refinement, such as interaction mechanics and procedural clarity. Both groups recognized the platform’s ability to simulate real-world nursing tasks and its potential to enhance information retention and skill acquisition. The findings validate NursingXR as a promising tool for nursing education, offering an innovative approach that aligns with modern healthcare demands. While the platform successfully bridges the gap between traditional and virtual training methods, addressing usability challenges and user comfort will further enhance its adoption and effectiveness. By integrating NursingXR into hybrid learning models, nursing educators can leverage its immersive capabilities to prepare students for real-world clinical practice in a dynamic and cost-effective manner.
A challenge for VR-based learning platforms is technological obsolescence, as advancements in hardware and software can make early implementations outdated over time. While initial enthusiasm for such platforms is often high, sustained adoption requires ongoing development and adaptation. To address this, NursingXR was designed with a modular and scalable framework, enabling continuous updates, content expansions, and integration with emerging VR technologies. Future work will explore strategies for maintaining long-term engagement, including periodic content updates, hardware-agnostic deployment, and multi-user functionality to support collaborative learning.
Additionally, future research will focus on refining interaction mechanics, improving object handling precision, and mitigating motion sickness for a smoother user experience. Expanding the platform to include advanced procedures and critical care simulations will enhance its applicability across educational levels. Longitudinal studies will assess skill transfer to real-world practice, ensuring sustained learning outcomes.

Author Contributions

Conceptualization, A.E. and M.F.O.; methodology, M.F.O.; software, M.F.O. and A.E.; validation, M.F.O., A.E. and M.R.A.; formal analysis, M.F.O.; investigation, M.F.O., M.R.A. and A.E.; resources, A.E.; data curation, M.F.O.; writing—original draft preparation, A.E. and M.F.O.; writing—review and editing, M.F.O.; visualization, M.F.O.; supervision, M.F.O. and A.E.; project administration, M.F.O. and M.R.A.; funding acquisition, M.F.O. and M.R.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the United States Department of State, grant number SIS50022GR0070.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Shenandoah University (protocol number 1090 approved on 21 March 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study (Shenandoah University IRB Protocol #1090).

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to other studies still being underway.

Acknowledgments

The effort of several students who were responsible for implementing various aspects of the project is acknowledged and appreciated: Cole Herndon, Orion Tighe, John Ulbrich, Jemison Goforth, Brynna Strader, Luke Yager, Mohammad Kmail, Hamza Sadaqa, Yazan AlOmari, Mohammad Azar, Adam Nabhan, Ahmad Alqerem, Doaa Turkman, and Mohammed Egbarea.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Aryuwat, P.; Holmgren, J.; Asp, M.; Radabutr, M.; Lövenmark, A. Experiences of Nursing Students Regarding Challenges and Support for Resilience during Clinical Education: A Qualitative Study. Nurs. Rep. 2024, 14, 1604–1620.
  2. Oner Altiok, H.; Ustun, B. The Stress Sources of Nursing Students. Educ. Sci. Theory Pract. 2013, 13, 760–766.
  3. Jarosinski, J.M.; Seldomridge, L.; Reid, T.P.; Willey, J. Nurse Faculty Shortage: Voices of Nursing Program Administrators. Nurse Educ. 2022, 47, 151–155.
  4. Koukourikos, K.; Tsaloglidou, A.; Kourkouta, L.; Papathanasiou, I.V.; Iliadis, C.; Fratzana, A.; Panagiotou, A. Simulation in Clinical Nursing Education. Acta Inform. Med. 2021, 29, 15–20.
  5. Chen, F.-Q.; Leng, Y.-F.; Ge, J.-F.; Wang, D.-W.; Li, C.; Chen, B.; Sun, Z.-L. Effectiveness of Virtual Reality in Nursing Education: Meta-Analysis. J. Med. Internet Res. 2020, 22, e18290.
  6. García Fierros, F.J.; Moreno Escobar, J.J.; Sepúlveda Cervantes, G.; Morales Matamoros, O.; Tejeida Padilla, R. VirtualCPR: Virtual Reality Mobile Application for Training in Cardiopulmonary Resuscitation Techniques. Sensors 2021, 21, 2504.
  7. Aiello, S.; Cochrane, T.; Sevigny, C. The affordances of clinical simulation immersive technology within healthcare education: A scoping review. Virtual Real. 2023, 27, 3485–3503.
  8. Bayram, S.B.; Caliskan, N. The use of virtual reality simulations in nursing education, and patient safety. In Contemporary Topics in Patient Safety–Volume 1; IntechOpen: London, UK, 2020.
  9. Fealy, S.; Jones, D.; Hutton, A.; Graham, K.; McNeill, L.; Sweet, L.; Hazelton, M. The integration of immersive virtual reality in tertiary nursing and midwifery education: A scoping review. Nurse Educ. Today 2019, 79, 14–19.
  10. Tusher, H.M.; Mallam, S.; Nazir, S. A Systematic Review of Virtual Reality Features for Skill Training. Technol. Knowl. Learn. 2024, 29, 843–878.
  11. Haque, S.; Srinivasan, S. A meta-analysis of the training effectiveness of virtual reality surgical simulators. IEEE Trans. Inf. Technol. Biomed. 2006, 10, 51–58.
  12. Gómez, V.; Peñaranda, K.; Figueroa, P. Lessons learned from requirements gathering for virtual reality simulators. Virtual Real. Intell. Hardw. 2021, 3, 407–422.
  13. Bansal, G.; Rajgopal, K.; Chamola, V.; Xiong, Z.; Niyato, D. Healthcare in Metaverse: A Survey on Current Metaverse Applications in Healthcare. IEEE Access 2022, 10, 119914–119946.
  14. Klenke-Borgmann, L.; Cantrell, M.A.; Mariani, B. Clinical Judgment in Nursing Students After Observation of In-Class Simulations. Clin. Simul. Nurs. 2021, 51, 19–27.
  15. Hitching, R.; Hoffman, H.G.; Garcia-Palacios, A.; Adamson, M.M.; Madrigal, E.; Alhalabi, W.; Alhudali, A.; Sampaio, M.; Peterson, B.; Fontenot, M.R.; et al. The Emerging Role of Virtual Reality as an Adjunct to Procedural Sedation and Anesthesia: A Narrative Review. J. Clin. Med. 2023, 12, 843.
  16. Ewais, A.; Obeid, M.F. Analyzing and Designing the Utility of Virtual Reality for Nursing Fundamentals Lab. Int. J. Online Biomed. Eng. 2024, 20, 27–51.
  17. Ewais, A.; Asia, M.; Herndon, C.; Tighe, O.; Ulbrich, J.; Obeid, M.F. Using HTA and UML in Analysis and Design Phases for a VR-Based Nursing Lab. In Extended Reality; De Paolis, L.T., Arpaia, P., Sacco, M., Eds.; Springer Nature: Cham, Switzerland, 2024; pp. 316–324.
  18. Obeid, M.F.; Ewais, A.; Asia, M. Development of an Immersive Learning Environment for Fundamentals of Nursing Labs. In Proceedings of the 2024 10th International Conference on Virtual Reality (ICVR), Bournemouth, UK, 24–26 July 2024.
  19. Sahin Karaduman, G.; Basak, T. Virtual Patient Simulations in Nursing Education: A Descriptive Systematic Review. Simul. Gaming 2024, 55, 159–179.
  20. Rauen, C.A. Simulation as a Teaching Strategy for Nursing Education and Orientation in Cardiac Surgery. Crit. Care Nurse 2004, 24, 46–51.
  21. Günay İsmailoğlu, E.; Zaybak, A. Comparison of the Effectiveness of a Virtual Simulator With a Plastic Arm Model in Teaching Intravenous Catheter Insertion Skills. Comput. Inform. Nurs. 2017, 36, 98–105.
  22. Mulrooney, H.M.; Kelly, A.F. COVID-19 and the Move to Online Teaching: Impact on Perceptions of Belonging in Staff and Students in a UK Widening Participation University. 2020. Available online: https://api.semanticscholar.org/CorpusID:225171484 (accessed on 15 February 2025).
  23. Hamilton, D.; McKechnie, J.; Edgerton, E.; Wilson, C. Immersive virtual reality as a pedagogical tool in education: A systematic literature review of quantitative learning outcomes and experimental design. J. Comput. Educ. 2021, 8, 1–32.
  24. Ahmed, H.; Allaf, M.; Elghazaly, H. COVID-19 and medical education. Lancet Infect. Dis. 2020, 20, 777–778.
  25. Bowman, D.A.; McMahan, R.P. Virtual Reality: How Much Immersion Is Enough? Computer 2007, 40, 36–43.
  26. Riva, G.; Wiederhold, B.K.; Mantovani, F. Neuroscience of Virtual Reality: From Virtual Exposure to Embodied Medicine. Cyberpsychol. Behav. Soc. Netw. 2019, 22, 82–96.
  27. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778.
  28. UbiSim. Labster. 2024. Available online: https://www.ubisimvr.com/ (accessed on 15 February 2025).
  29. Oxford Medical Simulation. OMS. 2024. Available online: https://oxfordmedicalsimulation.com/ (accessed on 15 February 2025).
  30. SimX. 2024. Available online: https://www.simxvr.com/virtual-reality-simulation-for-nurses/ (accessed on 15 February 2025).
  31. Medical Assisting Clinical Suite. Incite VR. 2020. Available online: https://www.incitevr.com/ma-skills/versions/professional (accessed on 15 February 2025).
  32. Yahya, L.B.; Naciri, A.; Radid, M.; Chemsi, G. Immersive simulation in nursing and midwifery education: A systematic review. J. Educ. Eval. Health Prof. 2024, 21, 19.
  33. Bajuri, M.Y.; Benferdia, Y.; Ahmad, M.N. Critical Success Factors for Virtual Reality Applications in Orthopaedic Surgical Training: A Systematic Literature Review. IEEE Access 2021, 9, 128574–128589.
  34. Hung, L.; Wong, J.; Wong, K.L.; Son, R.C.; Van, M.; Mortenson, W.B.; Lim, A.; Boger, J.; Wallsworth, C.; Zhao, Y. The Use and Impact of Virtual Reality Programs Supported by Aromatherapy for Older Adults: A Scoping Review. Appl. Sci. 2025, 15, 188.
  35. Chang, Y.M.; Lai, C.L. Exploring the experiences of nursing students in using immersive virtual reality to learn nursing skills. Nurse Educ. Today 2021, 97, 104670.
  36. Hussain, Z.; Ng, D.M.; Alnafisee, N.; Sheikh, Z.; Ng, N.; Khan, A.; Hussain, A.; Aitken, D.; Sheikh, A. Effectiveness of virtual and augmented reality for improving knowledge and skills in medical students: Protocol for a systematic review. BMJ Open 2021, 11, e047004.
  37. Boros, M.; Sventekova, E.; Cidlinova, A.; Bardy, M.; Batrlova, K. Application of VR Technology to the Training of Paramedics. Appl. Sci. 2022, 12, 1172.
  38. Cunha, C.R.; Moreira, A.; Pires, L.; Fernandes, P.O. Using Mixed Reality and Machine Learning to Assist Caregivers in Nursing Home and Promote Well-being. Procedia Comput. Sci. 2023, 219, 1081–1088.
  39. Guo, Z.; Zhang, P.; Xia, J. Design of Virtual Reality Education Platform based on 5G MEC. In Proceedings of the 2021 20th International Conference on Ubiquitous Computing and Communications (IUCC/CIT/DSCI/SmartCNS), London, UK, 20–22 December 2021; pp. 572–578.
  40. Mahbub, M.; Barua, B. Optimal Coverage and Bandwidth-Aware Transmission Planning for Augmented Reality/Virtual Reality. In Proceedings of the 2021 International Conference on Information Technology (ICIT), Amman, Jordan, 14–15 July 2021; pp. 612–615.
  41. Thalen, J.P.; van der Voort, M.C. User centred methods for gathering VR design tool requirements. In Proceedings of the 17th Eurographics Conference on Virtual Environments & Third Joint Virtual Reality Conference (EGVE-JVRC'11), Nottingham, UK, 20–21 September 2011; Eurographics Association: Goslar, Germany, 2011; pp. 75–81.
  42. ISO/IEC/IEEE 29148:2018(E); Systems and Software Engineering–Life Cycle Processes–Requirements Engineering. ISO: Geneva, Switzerland, 2018; pp. 1–104.
  43. Lloyd, W.J.; Rosson, M.B.; Arthur, J.D. Effectiveness of elicitation techniques in distributed requirements engineering. In Proceedings of the IEEE Joint International Conference on Requirements Engineering, Essen, Germany, 9–13 September 2002; pp. 311–318.
  44. Akinwale, O.B.; Abiona, O.; Oluwatope, A.O.; Otuyemi, O.D.; Ijarotimi, O.A.; Olubusola Komolafe, A.; Aregbesola, S.B.; Kolawole, B.A.; Adetutu, O.M.; Agunbiade, O.M.; et al. Designing a virtual reality system for clinical education and examination. Comput. Educ. X Real. 2024, 5, 100083.
  45. Matthews, T.; Tian, F.; Dolby, T. Interaction design for paediatric emergency VR training. VR Exp. Simul. 2020, 2, 330–344.
  46. Vergara, D.; Extremera, J.; Rubio, M.P.; Dávila, L.P. Meaningful Learning Through Virtual Reality Learning Environments: A Case Study in Materials Engineering. Appl. Sci. 2019, 9, 4625.
  47. Unity. Unity Technologies. 2024. Available online: https://unity.com/ (accessed on 15 February 2025).
  48. The Khronos Group Inc. OpenXR. 2024. Available online: www.khronos.org/openxr (accessed on 15 February 2025).
  49. Blender. Available online: https://www.blender.org/about/ (accessed on 15 February 2025).
  50. Autodesk Inc. Autodesk Maya. Available online: https://www.autodesk.com/education/support (accessed on 15 February 2025).
  51. GitHub, Inc. GitHub. Available online: https://github.com/about (accessed on 15 February 2025).
Figure 1. Conceptual design and system architecture of NursingXR.
Figure 2. Layout document for Tracheostomy lesson showing the various interactions and objects involved, as part of the nomenclature translation process.
Figure 3. Customized assets created for the lesson "Starting an intravenous infusion" including (a) the IV kit and (b) the room and patient avatar.
Figure 4. Example use of ProcedureEvents script for the Intramuscular Injection lesson, showing a high-level view of the events that were created for this specific lesson's (a) pre-procedure, (b) procedure, and (c) post-procedure event categories.
Figure 5. Textual instructions (a) paired with visual effects (b) to guide the user through the Intramuscular Injection procedure in the Training Mode.
Figure 6. NursingXR menu and navigation showing the various screens available including (a) Welcome Screen, (b) User ID input, (c) Modules tab, (d) Results tab, (e) Tutorial tab, and (f) Settings tab.
Figure 7. Nursing students participating in the study and using NursingXR to supplement their classroom experience on various devices such as the Meta Quest 2 (a) and the HP Reverb (b).
Figure 8. A sampling of the various steps within the Intramuscular Injection submodule including (a) user washing their hands, (b) user equipping gloves, (c) filling the syringe with the medication, and (d) injecting the patient's arm.
Figure 9. A frequency chart for responses from Experts related to the didactic capacity of NursingXR.
Figure 10. Feedback related to VR comfort from male and female participants.
Table 1. Modules and submodules currently supported or under development within NursingXR.

| Module | Submodule | Status |
|---|---|---|
| Vital Signs | Assessing Body Temperature | Supported |
| | Assessing Peripheral Pulse | Supported |
| | Assessing an Apical Pulse | Under development |
| | Assessing Blood Pressure | Supported |
| Asepsis | Performing Hand Hygiene | Supported |
| | Applying and Removing PPE | Supported |
| | Establishing and Maintaining a Sterile Field | Supported |
| | Applying and Removing Sterile Gloves (Open Method) | Supported |
| Fluid, Electrolyte, and Acid–Base Balance | Starting an intravenous infusion | Supported |
| | Monitoring an intravenous infusion | Under development |
| | Changing an intravenous container, tubing, and dressing | Under development |
| Medication | Administering an Intramuscular Injection | Supported |
| | Adding Medications to Intravenous Fluid Containers | Under development |
| Skin Integrity and Wound Care | Obtaining a Wound Drainage Specimen for Culture | Supported |
| | Irrigating a Wound | Under development |
| Activity and Exercise | Assisting a client to sit on the side of the bed (dangling) | Under development |
| | Transferring between bed and chair | Supported |
| | Transferring between bed and stretcher | Under development |
| Nutrition | Inserting a NG tube | Supported |
| | Removing a NG tube | Under development |
| Urinary Elimination | Applying an external urinary device | Under development |
| | Performing urinary catheterization | Under development |
| Oxygenation | Providing tracheostomy care | Supported |
Table 2. Side-by-side comparison between Training and Evaluation modes for NursingXR.

| Aspect | Training Mode | Evaluation Mode |
|---|---|---|
| Purpose | Educate via guidance and feedback | Assess proficiency and readiness |
| Instructions provided | Step-by-step instructions | Only main goal (procedure) |
| Feedback | Real-time during procedure | Based on generated performance report |
| Visual aids | Blinking visual cues highlight objects to interact with | Absent; user must independently recall steps |
| Environment | Predefined steps in structured phases | Sandbox-style environment, up to user |
| Object functionality | Objects require enabling before interaction (sequential) | Objects are immediately usable (non-sequential) |
| Data collection | No data collection for actions or performance | Tracks step order, duration, and scores; stores progress |
| Grading/scoring | Non-graded; focus on learning and mastery | Graded; generates performance and task completion report |
| User ID tracking | Not applicable | Unique ID tracks user performance |
| Ability to make mistakes | Absent | User can make mistakes or perform steps out of order |
| Wall clock time provision | Timer provided | Wall clock provided |
| Error proofing and prevention | Present | Absent |
Table 3. List of VR HMD platforms with which NursingXR is compatible.

| Platform | Price (USD) | Resolution (pixels/eye) | Refresh Rate (Hz) | FOV (degrees) | Weight (g) | Tracking Capabilities |
|---|---|---|---|---|---|---|
| Meta Quest 2 | $299 | 1832 × 1920 | 60/72/90 | ~97° | 503 | Inside-out, 6DOF |
| Meta Quest 3 | $499 | 2064 × 2208 | Up to 120 | ~110° | 515 | Inside-out, 6DOF |
| Meta Quest 3S | $299 | 1832 × 1920 | Up to 120 | ~97° | 503 | Inside-out, 6DOF |
| Meta Quest Pro | $999 | 1800 × 1920 | 90 | ~106° | 722 | Inside-out, 6DOF + eye and face tracking |
| Pico Neo 3 | $599 | 1832 × 1920 | 72/90 | ~98° | 395 | Inside-out, 6DOF |
| Pico Neo 4 | $379 | 2160 × 2160 | 72/90 | ~105° | 586 | Inside-out, 6DOF |
| HTC Vive | $499 | 1080 × 1200 | 90 | ~110° | 605 | External base stations |
| HTC XR Elite | $1099 | 1920 × 1920 | 90 | ~110° | 625 | Inside-out, 6DOF |
| HTC Vive Focus | $799 | 1600 × 1440 | 75 | ~100° | 695 | Inside-out, 6DOF |
| HP Reverb G2 | $599 | 2160 × 2160 | 90 | ~114° | 500 | Inside-out, 6DOF |
Table 4. Outcomes of the two-tailed t-test comparing Novice and Expert groups after using NursingXR.

| Q# | Statement | Novices (n = 25) Mean (SD) | Novices Mode/Median | Experts (n = 53) Mean (SD) | Experts Mode/Median | t | p-value |
|---|---|---|---|---|---|---|---|
| Q1 | Main Menu was easy to use. | 3.48 (0.92) | 3/3.48 | 3.70 (1.03) | 4/4 | 0.9025 | 0.3697 |
| Q2 | Navigation was intuitive. | 3.64 (1.11) | 4/4 | 3.43 (1.12) | 4/4 | 0.7604 | 0.4494 |
| Q3 | Object manipulation was intuitive and consistent. | 3.64 (0.86) | 4/4 | 3.28 (1.10) | 4/3.4 | 1.4295 | 0.1570 |
| Q4 | The experience was visually pleasing. | 3.40 (1.00) | 4/4 | 3.83 (1.07) | 4/4 | 1.6918 | 0.0948 |
| Q5 | Steps to complete procedure were easily understood. | 3.68 (0.95) | 4/4 | 3.21 (1.10) | 3/3 | 1.8505 | 0.0681 |
| Q6 | VR lessons are similar to those in real life. | 3.36 (1.22) | 4/4 | 3.38 (1.15) | 4/4 | 0.0611 | 0.9514 |
| Q7 | VR educational potential is at least equivalent to traditional methods. | 3.52 (1.05) | 4/4 | 3.64 (1.02) | 4/4 | 0.4868 | 0.6278 |
| Q8 | VR educational potential as complement to traditional methods. | 3.36 (1.11) | 4/4 | 3.09 (1.11) | 3/3 | 0.9830 | 0.3287 |
| Q9 | VR provides better information retention (memory capacity) than traditional education. | 3.88 (1.24) | 4/4 | 3.60 (1.04) | 4/3.4 | 1.0274 | 0.3075 |
Table 5. Qualitative and thematic analysis from user study.

| Category | Novices (n = 25) | Experts (n = 53) |
|---|---|---|
| Positive feedback | "Useful and fun", "Good idea", "Real". | "Amazing", "Breakthrough in education". |
| Negative feedback | Minimal (e.g., "difficult to pick up swab"). | Detailed (usability and motion sickness). |
| Focus on learning | Emphasis on fun and general utility. | Suggestions for refinement to align with real-life practice. |
| Thematic issues | Simplistic responses, limited critique. | Nuanced, detailed improvement suggestions. |