WO2025027132A1 - Intravascular medical device and external imaging software simulation in virtual reality - Google Patents
- Publication number
- WO2025027132A1 (PCT/EP2024/071807)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual
- simulation
- software
- processor
- change
- Prior art date
Links
- 238000004088 simulation Methods 0.000 claims abstract description 170
- 230000008859 change Effects 0.000 claims abstract description 81
- 238000003384 imaging method Methods 0.000 claims description 37
- 238000004891 communication Methods 0.000 claims description 28
- 238000012544 monitoring process Methods 0.000 claims description 12
- 238000000034 method Methods 0.000 description 23
- 238000012549 training Methods 0.000 description 23
- 238000013459 approach Methods 0.000 description 15
- 238000010586 diagram Methods 0.000 description 13
- 230000009471 action Effects 0.000 description 12
- 239000000994 contrast dye Substances 0.000 description 9
- 238000002608 intravascular ultrasound Methods 0.000 description 9
- 238000005259 measurement Methods 0.000 description 8
- 230000008569 process Effects 0.000 description 8
- 210000004204 blood vessel Anatomy 0.000 description 7
- 238000012545 processing Methods 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 6
- 230000003993 interaction Effects 0.000 description 5
- 230000001360 synchronised effect Effects 0.000 description 5
- 230000008901 benefit Effects 0.000 description 4
- 230000036772 blood pressure Effects 0.000 description 4
- 238000004590 computer program Methods 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 230000033001 locomotion Effects 0.000 description 3
- 238000012014 optical coherence tomography Methods 0.000 description 3
- 230000004044 response Effects 0.000 description 3
- 241001465754 Metazoa Species 0.000 description 2
- 230000003213 activating effect Effects 0.000 description 2
- 230000004075 alteration Effects 0.000 description 2
- 230000003190 augmentative effect Effects 0.000 description 2
- 239000008280 blood Substances 0.000 description 2
- 210000004369 blood Anatomy 0.000 description 2
- 230000017531 blood circulation Effects 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 2
- 230000006872 improvement Effects 0.000 description 2
- 238000002595 magnetic resonance imaging Methods 0.000 description 2
- 201000003152 motion sickness Diseases 0.000 description 2
- OKTJSMMVPCPJKN-UHFFFAOYSA-N Carbon Chemical compound [C] OKTJSMMVPCPJKN-UHFFFAOYSA-N 0.000 description 1
- 238000003491 array Methods 0.000 description 1
- QVGXLLKOCUKJST-UHFFFAOYSA-N atomic oxygen Chemical compound [O] QVGXLLKOCUKJST-UHFFFAOYSA-N 0.000 description 1
- 238000011888 autopsy Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000009530 blood pressure measurement Methods 0.000 description 1
- 229910052799 carbon Inorganic materials 0.000 description 1
- 238000013170 computed tomography imaging Methods 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 210000005069 ears Anatomy 0.000 description 1
- 238000002592 echocardiography Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 239000012530 fluid Substances 0.000 description 1
- 238000002594 fluoroscopy Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 238000002347 injection Methods 0.000 description 1
- 239000007924 injection Substances 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 238000011835 investigation Methods 0.000 description 1
- 230000007774 longterm Effects 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000000056 organ Anatomy 0.000 description 1
- 229910052760 oxygen Inorganic materials 0.000 description 1
- 239000001301 oxygen Substances 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 230000035790 physiological processes and functions Effects 0.000 description 1
- 230000001737 promoting effect Effects 0.000 description 1
- 230000008439 repair process Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 238000002560 therapeutic procedure Methods 0.000 description 1
- 238000003325 tomography Methods 0.000 description 1
- 238000002604 ultrasonography Methods 0.000 description 1
- 238000012285 ultrasound imaging Methods 0.000 description 1
- 238000012800 visualization Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
Definitions
- the subject matter described herein relates to systems, methods, and devices for integrating intravascular medical device (e.g., catheter-based and/or guidewire-based) software and/or external imaging (e.g., X-ray) software into virtual reality simulations.
- intravascular medical device e.g., catheter-based and/or guidewire-based
- external imaging e.g., X-ray
- This virtual medical device simulation system has particular but not exclusive utility for training of clinicians on intravascular medical device systems.
- VR virtual reality
- modern medical interventions may rely on complex architectures made up of many independent systems and devices working together. Such complexity is expected to increase in the future.
- current-generation virtual reality medical simulation software may not accurately simulate complex software often found inside current clinical environments. Due to technical limitations inherent in the software used to make VR experiences, these complex systems may be reduced to still-frame images that can be clicked through like a slide show. Other similar technical workarounds may be used instead or in addition.
- Such monolithic simplification can inhibit learning by forcing the learner to train on a system with significant differences from reality. This can in turn limit the educational value of the VR simulation.
- Such commonly used VR training simulations have numerous drawbacks, including inaccurate or incomplete representations of the software associated with medical devices.
- Example medical devices include intravascular imaging devices or interventional devices (whether catheters, guidewires, or combinations thereof), external imaging devices (e.g., X-ray), etc.
- a trainee working inside the VR simulation has their inputs (e.g., interactions with a simulated medical device) transmitted via a network connection to the simulation of that device, which then updates accordingly.
- the software simulation would then send an updated screen display back to be displayed on virtual video screens inside the VR simulation on a continuous basis, creating the illusion of a fully realistic training environment.
- Individual simulation containers also have the ability to impact each other.
- the virtual medical device simulation system disclosed herein has particular, but not exclusive, utility for training of clinicians on intravascular devices and systems.
- a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions.
- One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
- Implementations may include one or more of the following features.
- the first virtual medical device may include a virtual intravascular catheter or guidewire.
- the at least one processor is configured to use the first medical software application to send commands to or receive data from a first non-virtual medical device.
- the at least one memory is configured to store: a virtual patient simulation software associated with a virtual patient visually depicted within the virtual environment; and a patient monitoring software application that may include a patient monitoring output visible on a patient monitoring virtual screen display visually depicted within the virtual environment, where the at least one processor is configured to execute the virtual patient simulation software and the patient monitoring software application.
- the at least one processor is configured to use the patient monitoring software application to receive data from a non-virtual patient sensor.
- the operation of the first virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the virtual patient simulation software, where the change within the virtual patient simulation software is configured to cause the at least one processor to direct a change within the patient monitoring software application, where the change within the patient monitoring software application is configured to cause the at least one processor to direct a change to the patient monitoring output visible on the patient monitoring virtual screen display.
- the at least one memory may include a virtual reality memory configured to store a virtual reality simulation software.
- the at least one processor may include a virtual reality processor in communication with the virtual reality memory, where the virtual reality processor is configured to execute the virtual reality simulation software to provide the virtual environment.
- Figure 9 is a screen display of an example virtual environment, in accordance with aspects of the present disclosure.
- Figure 11 is a schematic diagram of a processor circuit, in accordance with aspects of the present disclosure.
- Current-generation virtual reality medical simulation software may not accurately simulate complex software often found inside current clinical environments. These include the software systems controlling X-ray imaging devices (e.g., Philips Azurion software), patient monitoring systems (e.g., Philips IntelliVue software), and software related to complex interventional medical devices (e.g., Philips IntraSight software).
- systems such as the Philips Nexcimer Laser or Azurion image-guided therapy system may include both hardware and software components. However, this may not be the case for other patient monitoring hardware and/or software.
- Such cross-functional updating can occur on a continuous basis (e.g., at a rate of 30 Hz, 60 Hz, 120 Hz, etc.), thus creating the illusion of a fully realistic operating room environment.
- individual simulation containers also have the ability to affect one another. For example, if, inside the VR simulation, the learner injects contrast dye into a patient’s bloodstream, this can have a realistic effect not only on fluoroscopic X-ray images, but also on the patient’s heart rate, and thus on the measured blood pressure of a simulated intravascular pressure sensor.
- This approach allows each device and its software to be simulated separately, either on the same hardware or on physically separate processors, or even remote processors such as cloud-based computing assets.
- This approach also facilitates swapping out simulated components (e.g., swapping an intravascular ultrasound (IVUS) catheter for a balloon catheter), to allow realistic 1:1 training on different hardware and software versions. It also potentially allows for cloud streaming of some or all software component simulations from remote servers to reduce the complexity for a learner.
- IVUS intravascular ultrasound
- a virtual machine running a containerized medical device simulation can be linked to the VR simulation using a web browser or other display mechanism (e.g., Web Real-Time Communication (WebRTC)) inside a simulation engine such as Unreal or Unity, and can pass interactions back and forth with other containers via a simulation director (which may be integrated with the VR simulation, or may be a separate process).
- a simulation director which may be integrated with the VR simulation, or may be a separate process.
- the virtual medical device simulation system thus represents a substantially more complex approach, wherein multiple virtual displays, within the same VR environment, show the outputs of different device software, each being fed information from a simulated device with its own input controls. These virtual displays and controls exist only within the VR simulation, but are in communication with software and simulated devices operating outside the simulation.
- This containerized approach allows software, for example, to be swapped out and updated, with little impact to the overall system. It also permits learners to train on the exact software they will be using in the real world.
- Current approaches in the market may avoid software entirely, may create oversimplified copies of the software, or may offer interactive slide shows representing only a small fraction of the features and quirks of the real software.
- the virtual medical device simulation system avoids these limitations by displaying the real device software via web-browser windows in the simulated 3D environment. This approach can facilitate increased adoption of VR-based clinician training.
- an intracardiac catheter with an imaging device, such as an intracardiac echocardiography (ICE) catheter.
- imaging device e.g., transducer array
- ICE intracardiac echocardiography
- Each different type of device or combination of devices may be represented by a different synthetic intraluminal data provider application 220 and intraluminal data and/or co-registration software application 225.
- External imaging may for example include X-ray imaging (e.g., angiography, fluoroscopy) represented by a simulated X-ray imaging device, computed tomography (CT) represented by a simulated CT imaging device, magnetic resonance imaging (MRI) represented by a simulated MRI device, external ultrasound represented by a simulated external ultrasound imaging device, or other related external imaging modalities and devices.
- CT computed tomography
- MRI magnetic resonance imaging
- external ultrasound represented by an external ultrasound imaging device, or other related external imaging modalities and devices.
- upon receiving the updated state information 720, the simulation director 250 also sends updated state information 740 directly to the VR subsystem 120, directing the VR subsystem 120 to make changes in the VR environment (e.g., moving the catheter and the hand gripping the catheter).
- FIG. 8 is a screen display 800 of an example virtual environment 305, in accordance with aspects of the present disclosure. Visible are the screen displays 217 and 227 of the external imaging software 215 and the intraluminal imaging software 225, which in this example is intravascular ultrasound (IVUS) imaging software, although in general it could be any intraluminal device software, including but not limited to pressure measurement, flow measurement, and/or imaging with different modalities such as optical coherence tomography (OCT). Also visible is a virtual hand 810 (e.g., controlled by a hand controller of the clinician) gripping a virtual IVUS catheter 820, per an instruction window 830.
- IVUS intravascular ultrasound
- OCT optical coherence tomography
- the screen displays 217, 227 of the external imaging software 215 and the intraluminal imaging software 225 are also visible. Also visible is the virtual hand 810 gripping the virtual IVUS catheter 820. Per an instruction window 830, the clinician, trainee, or learner has advanced the virtual catheter 820 further into the patient’s blood vessel.
- Other types of intraluminal imaging user actions may include selecting a Record button, selecting an End Recording button, making body lumen area or diameter measurements, using blood flow visualization (e.g., Philips ChromaFlo), adding bookmarks, or reviewing recorded data to make clinical decisions.
- the user may also choose to execute a co-registration workflow, which may, for example, require the interaction of multiple software programs (e.g., Philips Azurion and IntraSight software) to align images acquired from different imaging modalities (e.g., IVUS and angiogram data), along with precise user actions taken inside VR.
- the screen display 227 of the intraluminal imaging software 225 shows a synthetic view 840 of a different portion of the blood vessel where the virtual catheter 820 is positioned.
- Other examples of changes in the screen display 227 of the intraluminal imaging software 225 include building up an image longitudinal display (ILD) in real time, etc.
- ILD image longitudinal display
- the communication module 1168 facilitates direct or indirect communication between various elements of the processor circuit 1150 and/or the virtual medical device simulation system 100.
- the communication module 1168 may communicate within the processor circuit 1150 through numerous methods or protocols.
- Serial communication protocols may include, but are not limited to, Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I2C), Recommended Standard 232 (RS-232), RS-485, Controller Area Network (CAN), Ethernet, Aeronautical Radio, Incorporated 429 (ARINC 429), MODBUS, Military Standard 1553 (MIL-STD-1553), or any other suitable method or protocol.
- External communication may be accomplished using any suitable wireless or wired communication technology, such as a cable interface (e.g., a universal serial bus (USB), micro USB, Lightning, or FireWire interface), Bluetooth, Wi-Fi, ZigBee, Li-Fi, or cellular data connections such as 2G/GSM (Global System for Mobile communications), 3G/UMTS (Universal Mobile Telecommunications System), 4G, Long-Term Evolution (LTE), WiMax, or 5G.
- a Bluetooth Low Energy (BLE) radio can be used to establish connectivity with a cloud service, for transmission of data, and for receipt of software patches.
- BLE Bluetooth Low Energy
- the controller may be configured to communicate with a remote server, or a local device such as a laptop, tablet, or handheld device, or may include a display capable of showing status variables and other information. Information may also be transferred on physical media such as a USB flash drive or memory stick.
- the terms "processor" and "memory" are used to refer to a range of potential technologies and implementations.
- "Processor" may, for example, refer to a device or devices providing computing power to complete a task. This may, for example, be a desktop or mobile central processing unit (CPU), a more specialized mobile Arm-style processor, a graphics processing unit (GPU) or tensor processing unit (TPU), or a network of different devices, including some physical and some in the cloud, working together.
- Memory similarly refers to a range of potential technologies and implementations of data storage. Data may be stored in traditional volatile RAM, or on non-volatile storage media such as hard drives or solid-state drives, or remotely in the cloud.
- the virtual medical device simulation system advantageously provides an enhanced virtual training environment wherein a simulated patient and multiple simulated medical devices or systems respond realistically as the various devices are operated or otherwise manipulated.
- Such interdependence of the device simulations provides greatly enhanced realism, increasing the value of virtual versus live training, potentially reducing the time required for live training, and thereby potentially reducing the costs and increasing the throughput of training programs.
- other medical devices may be simulated than those shown herein, including but not limited to surgical devices, measurement devices, interventional devices, transdermal devices, intraluminal devices, imaging devices, teleoperated or robotic devices, and otherwise.
- Other types of patients may be simulated, including but not limited to prenatal, pediatric, or geriatric patients, veterinary patients (e.g., pets, farm animals, wild animals, etc.), or deceased patients (e.g., for autopsy procedures).
- Other types of environments may be simulated, including but not limited to operating rooms, emergency rooms, medical offices, ambulances, field hospitals, etc.
- multiple patients may be represented, as for example in an organ transplant or blood transfusion scenario.
- the technology described herein may also be applied to veterinary applications, as well as to non-medical applications such as VR simulation training for chemical or electrical plant engineers, training of bridge crews for naval vessels, or automotive repair for those working on highly computerized vehicles.
- the approach is focused on providing value wherever humans, complex mechanical systems, and complex networked software systems interact, and where training simulations are believed to provide value.
- All directional references (e.g., upper, lower, inner, outer, upward, downward, left, right, lateral, front, back, top, bottom, above, below, vertical, horizontal, clockwise, counterclockwise, proximal, and distal) are only used for identification purposes to aid the reader’s understanding of the claimed subject matter, and do not create limitations, particularly as to the position, orientation, or use of the virtual medical device simulation system.
- Connection references (e.g., attached, coupled, connected, joined, or "in communication with") are to be construed broadly and may include intermediate members between a collection of elements and relative movement between elements unless otherwise indicated. As such, connection references do not necessarily imply that two elements are directly connected and in fixed relation to each other.
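The continuous cross-container update cycle described in this section (trainee input forwarded to a device simulation, with the refreshed screen output streamed back to an in-VR display at, e.g., 30, 60, or 120 Hz) can be sketched as a fixed-rate loop. This is a minimal, hypothetical Python illustration; the class and method names are assumptions made for the sketch, not interfaces of any actual product.

```python
import time


class _StubDeviceSim:
    """Stand-in for a containerized device simulation (assumed interface)."""

    def __init__(self):
        self.frames_rendered = 0

    def apply_input(self, user_input):
        # In a real system this would update the device simulation state.
        self.last_input = user_input

    def render_frame(self):
        # Produces the device software's updated screen output.
        self.frames_rendered += 1
        return f"frame-{self.frames_rendered}"


class _StubVirtualScreen:
    """Stand-in for a virtual display screen inside the VR environment."""

    def __init__(self):
        self.frames_shown = []

    def show(self, frame):
        self.frames_shown.append(frame)


def run_update_loop(device_sim, screen, get_user_input, rate_hz=60, ticks=3):
    """Forward VR input to the device simulation each tick and stream the
    updated screen display back to the in-VR virtual screen at a fixed rate."""
    period = 1.0 / rate_hz
    for _ in range(ticks):
        start = time.monotonic()
        device_sim.apply_input(get_user_input())  # trainee input -> simulation
        screen.show(device_sim.render_frame())    # updated display -> VR screen
        # Sleep off the remainder of the tick to hold the target rate.
        time.sleep(max(0.0, period - (time.monotonic() - start)))


device, screen = _StubDeviceSim(), _StubVirtualScreen()
run_update_loop(device, screen, get_user_input=lambda: "advance_catheter")
print(len(screen.frames_shown))  # 3
```

In a production system the loop would be driven by the simulation engine's frame callback rather than `time.sleep`, and the frames would be video streams (e.g., WebRTC) rather than strings; the structure of the round trip is the same.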
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Primary Health Care (AREA)
- Life Sciences & Earth Sciences (AREA)
- Epidemiology (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Heart & Thoracic Surgery (AREA)
- Robotics (AREA)
- Urology & Nephrology (AREA)
- Processing Or Creating Images (AREA)
Abstract
A virtual training system includes operational simulation software for a virtual medical device that is visually depicted within a virtual environment, and a medical software application for the virtual medical device, which includes an output visible on a virtual display screen visually depicted within the virtual environment. The system also includes simulation director software, and a processor configured to execute the operational simulation software, the medical software application, and the simulation director software. An operation of the virtual medical device within the virtual environment by a user causes the simulation director software to direct a corresponding change within the operational simulation software, causing a change within the medical software application. The change within the medical software application causes a change to the output visible on the virtual screen display.
Description
INTRAVASCULAR MEDICAL DEVICE AND EXTERNAL IMAGING SOFTWARE SIMULATION IN VIRTUAL REALITY
TECHNICAL FIELD
[0001] The subject matter described herein relates to systems, methods, and devices for integrating intravascular medical device (e.g., catheter-based and/or guidewire-based) software and/or external imaging (e.g., X-ray) software into virtual reality simulations. This virtual medical device simulation system has particular but not exclusive utility for training of clinicians on intravascular medical device systems.
BACKGROUND
[0002] Virtual reality (VR) can be an important training tool for clinicians learning to operate complex medical devices with both hardware and software components. However, modern medical interventions may rely on complex architectures made up of many independent systems and devices working together, and such complexity is expected to increase in the future. Current-generation virtual reality medical simulation software may not accurately simulate the complex software often found inside current clinical environments. Due to technical limitations inherent in the software used to make VR experiences, these complex systems may be reduced to still-frame images that can be clicked through like a slide show. Other similar technical workarounds may be used instead or in addition. Such monolithic simplification can inhibit learning by forcing the learner to train on a system with significant differences from reality, which can in turn limit the educational value of the VR simulation. Thus, it is to be appreciated that such commonly used VR training simulations have numerous drawbacks, including inaccurate or incomplete representations of the software associated with medical devices.
[0003] The information included in this Background section of the specification, including any references cited herein and any description or discussion thereof, is included for technical reference purposes only and is not to be regarded as subject matter by which the scope of the disclosure is to be bound.
SUMMARY
[0004] Disclosed is a virtual medical device simulation system that maintains separate, synchronized simulations of medical devices and their associated software, and sends data back and forth between software components to create a single, more realistic simulation for the user. Example medical devices include intravascular imaging devices or interventional devices (whether catheters, guidewires, or combinations thereof), external imaging devices (e.g., X-ray), etc. A trainee working inside the VR simulation has their inputs (e.g., interactions with a simulated medical device) transmitted via a network connection to the simulation of that device, which then updates accordingly. The software simulation then sends an updated screen display back to be displayed on virtual video screens inside the VR simulation on a continuous basis, creating the illusion of a fully realistic training environment. Individual simulation containers also have the ability to impact each other. For example, if, inside the VR simulation, the trainee makes a mistake that causes the virtual patient to expire, this is reflected in all other simulated components, including patient monitoring and imaging, which are each running as separate applications (either on the same hardware or on separate or remote hardware). This approach has other benefits, including the ability to easily swap out software components to allow realistic 1:1 training on different hardware and software versions. This approach also potentially allows for cloud streaming of some or all software component simulations from remote servers, to reduce the complexity for a trainee.
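The container interaction described above, where each device simulation runs independently and state changes are exchanged so that one container's events (e.g., the virtual patient expiring) propagate to all the others, can be sketched as a small publish/subscribe director. This is a hypothetical Python illustration; the class and topic names are assumptions for the sketch, not part of the disclosure.

```python
from typing import Callable, Dict, List


class SimulationDirector:
    """Routes state-change events between independent simulation containers."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every container subscribed to this topic.
        for handler in self._subscribers.get(topic, []):
            handler(event)


class PatientMonitorSim:
    """Stand-in for a patient-monitoring software container."""

    def __init__(self, director: SimulationDirector) -> None:
        self.heart_rate = 72
        director.subscribe("patient.state", self.on_patient_state)

    def on_patient_state(self, event: dict) -> None:
        # A change in the virtual patient simulation changes the monitor output.
        if event.get("status") == "expired":
            self.heart_rate = 0


director = SimulationDirector()
monitor = PatientMonitorSim(director)

# A trainee mistake in VR causes the patient simulation to publish a change,
# which the monitoring container picks up without any direct coupling:
director.publish("patient.state", {"status": "expired"})
print(monitor.heart_rate)  # 0
```

Because containers only share the director's message interface, a simulated component (e.g., one catheter's software) can be swapped for another by subscribing a different container to the same topics, which is what makes the 1:1 version-for-version training described above practical.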
[0005] The virtual medical device simulation system disclosed herein has particular, but not exclusive, utility for training of clinicians on intravascular devices and systems. A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a system with at least one memory configured to store: a first operational simulation software associated with a first virtual medical device visually depicted within a virtual environment; a first medical software application associated with the first virtual medical device, where the first medical software application may include an output visible on a first virtual display screen visually depicted within the virtual environment; and a simulation director software. The system also includes at least one processor configured for communication with the at least one memory and configured to execute the first operational simulation software, the first medical software application, and the simulation director software. An operation of the first virtual medical device within the virtual environment by a user is configured to cause the at least one
processor to use the simulation director software to direct a corresponding change within the first operational simulation software, where the change within the first operational simulation software is configured to cause the at least one processor to direct a change within the first medical software application. The change within the first medical software application is configured to cause the at least one processor to direct a change to the output visible on the first virtual screen display. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0006] Implementations may include one or more of the following features. In some embodiments, the first virtual medical device may include a virtual intravascular catheter or guidewire. In some embodiments, the at least one processor is configured to use the first medical software application to send commands to or receive data from a first non-virtual medical device. In some embodiments, the at least one memory is configured to store: a virtual patient simulation software associated with a virtual patient visually depicted within the virtual environment; and a patient monitoring software application that may include a patient monitoring output visible on a patient monitoring virtual screen display visually depicted within the virtual environment, where the at least one processor is configured to execute the virtual patient simulation software and the patient monitoring software application. In some embodiments, the at least one processor is configured to use the patient monitoring software application to receive data from a non-virtual patient sensor. In some embodiments, the operation of the first virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the virtual patient simulation software, where the change within the virtual patient simulation software is configured to cause the at least one processor to direct a change within the patient monitoring software application, where the change within the patient monitoring software application is configured to cause the at least one processor to direct a change to the patient monitoring output visible on the patient monitoring virtual screen display.
In some embodiments, the at least one memory is configured to store: a second operational simulation software associated with a second virtual medical device visually depicted within the virtual environment; a second medical software application associated with the second virtual medical device, where the second medical software application may include an output visible on a second virtual display screen visually depicted within the virtual environment; and where the at least one processor is configured to execute the second operational simulation software and the second medical
software application. In some embodiments, the second virtual medical device may include a virtual x-ray imaging device. In some embodiments, an operation of the second virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the second operational simulation software, where the change within the second operational simulation software is configured to cause the at least one processor to direct a change within the second medical software application, and where the change within the second medical software application is configured to cause the at least one processor to direct a change to the output visible on the second virtual screen display. In some embodiments, the operation of the first virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the second operational simulation software, where the change within the second operational simulation software is configured to cause the at least one processor to direct a change within the second medical software application, and the change within the second medical software application is configured to cause the at least one processor to direct a change to the output visible on the second virtual screen display. In some embodiments, the at least one memory may include a simulation memory configured to store the first operational simulation software and the first medical software application, where the at least one processor may include a simulation processor in communication with the simulation memory.
In some embodiments, the at least one memory may include a virtual reality memory configured to store a virtual reality simulation software, where the at least one processor may include a virtual reality processor in communication with the virtual reality memory, where the virtual reality processor is configured to execute the virtual reality simulation software to provide the virtual environment. In some embodiments, the virtual reality memory is configured to store the simulation director software. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
[0007] One general aspect includes a virtual medical system with a virtual reality subsystem that may include: a virtual reality processor and a virtual reality memory; a virtual environment running on the virtual reality processor; a first virtual medical device geometrically represented within the virtual environment; a first virtual screen display geometrically represented within the virtual environment; a simulation subsystem that may include: a simulation processor and a simulation memory; a first operational simulation of the first virtual medical device running on the simulation processor; a first medical software
application associated with the first virtual medical device and running on the simulation processor, where the first medical software application is configured to generate an output visible on the first virtual display screen. The system also includes a simulation director operatively coupled to the virtual reality subsystem and the simulation subsystem, such that an operation of the first virtual medical device within the virtual environment causes the simulation director to direct a corresponding change within the first operational simulation, and the change within the first operational simulation causes a change within the first medical software application that produces a change to the output visible on the first virtual screen display. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0008] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to limit the scope of the claimed subject matter. A more extensive presentation of features, details, utilities, and advantages of the virtual medical device simulation system, as defined in the claims, is provided in the following written description of various embodiments of the disclosure and illustrated in the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] Illustrative embodiments of the present disclosure will be described with reference to the accompanying drawings, of which:
[0010] Figure 1 is a schematic, diagrammatic representation of a virtual medical device simulation system, in accordance with aspects of the present disclosure.
[0011] Figure 2 is a schematic, diagrammatic representation of a virtual medical device simulation system, in accordance with aspects of the present disclosure.
[0012] Figure 3 is a screen display of an example virtual environment, in accordance with at least one aspect of the present disclosure.
[0013] Figure 4 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation, in accordance with aspects of the present disclosure.
[0014] Figure 5A is a representation of a virtual screen display within the virtual environment, in accordance with aspects of the present disclosure.
[0015] Figure 5B is a representation of a virtual screen display within the virtual environment, in accordance with aspects of the present disclosure.
[0016] Figure 6 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation, in accordance with aspects of the present disclosure.
[0017] Figure 7 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation, in accordance with aspects of the present disclosure.
[0018] Figure 8 is a screen display of an example virtual environment, in accordance with aspects of the present disclosure.
[0019] Figure 9 is a screen display of an example virtual environment, in accordance with aspects of the present disclosure.
[0020] Figure 10 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation, in accordance with aspects of the present disclosure.
[0021] Figure 11 is a schematic diagram of a processor circuit, in accordance with aspects of the present disclosure.
DETAILED DESCRIPTION
[0022] Current-generation virtual reality medical simulation software may not accurately simulate complex software often found inside current clinical environments. These include the software systems controlling X-ray imaging devices (e.g., Philips Azurion software), patient monitoring systems (e.g., Philips IntelliVue software), and software related to complex interventional medical devices (e.g., Philips IntraSight software). In an example, systems such as the Philips Nexcimer Laser or Azurion image-guided therapy system may include both hardware and software components. However, this may not be the case for other patient monitoring hardware and/or software.
[0023] Disclosed herein is a virtual medical device simulation system capable of maintaining separate, synchronized simulations of medical devices, and sending data back and forth between the simulated devices and their associated software components, to create a single integrated simulation that responds realistically, across multiple devices, to inputs from the user. Thus, when a learner inside the VR simulation interacts with a machine inside the VR simulation, their input may be transmitted via a network connection to the simulation of that machine, which then updates both its simulated physical state (e.g., the C-arm angle of
an X-ray imager) and its software outputs (e.g., a realistic simulated X-ray image of the simulated patient on a simulated video screen). Such cross-functional updating can occur on a continuous basis (e.g., at a rate of 30 Hz, 60 Hz, 120 Hz, etc.), thus creating the illusion of a fully realistic operating room environment. In some embodiments, individual simulation containers also have the ability to affect one another. For example, if, inside the VR simulation, the learner injects contrast dye into a patient’s bloodstream, this can have a realistic effect not only on fluoroscopic X-ray images, but also on the patient’s heart rate, and thus on the measured blood pressure of a simulated intravascular pressure sensor.
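The contrast-dye example above can be sketched as a small publish/subscribe loop, in which a simulation director fans each event out to every container and a container may publish follow-on events of its own. All class names, event fields, and the simplified physiological response below are illustrative assumptions for the sketch, not part of the disclosed system:

```python
# Hypothetical sketch: a simulation director propagating one user event
# (contrast-dye injection) across independent simulation containers.
class Container:
    def __init__(self, name):
        self.name = name
        self.state = {}

    def on_event(self, event, director):
        pass  # subclasses update their own simulated state


class PatientSim(Container):
    def __init__(self):
        super().__init__("patient")
        self.state = {"heart_rate": 70, "contrast_ml": 0.0}

    def on_event(self, event, director):
        if event["type"] == "inject_contrast":
            self.state["contrast_ml"] += event["volume_ml"]
            self.state["heart_rate"] += 5  # simplified physiological response
            # the patient change fans out to the other containers
            director.publish({"type": "patient_changed", "state": dict(self.state)})


class XRaySim(Container):
    def __init__(self):
        super().__init__("xray")
        self.state = {"contrast_visible": False}

    def on_event(self, event, director):
        if event["type"] == "patient_changed":
            self.state["contrast_visible"] = event["state"]["contrast_ml"] > 0


class PressureSensorSim(Container):
    def __init__(self):
        super().__init__("pressure")
        self.state = {"heart_rate_seen": None}

    def on_event(self, event, director):
        if event["type"] == "patient_changed":
            self.state["heart_rate_seen"] = event["state"]["heart_rate"]


class SimulationDirector:
    def __init__(self, containers):
        self.containers = containers

    def publish(self, event):
        for c in self.containers:
            c.on_event(event, self)


patient, xray, pressure = PatientSim(), XRaySim(), PressureSensorSim()
director = SimulationDirector([patient, xray, pressure])
director.publish({"type": "inject_contrast", "volume_ml": 5.0})
```

One injection event thus updates the patient simulation, which in turn updates both the X-ray container and the pressure-sensor container, mirroring the cross-container behavior described above.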
[0024] This approach allows each device and its software to be simulated separately, either on the same hardware or on physically separate processors, or even remote processors such as cloud-based computing assets. This approach also facilitates swapping out simulated components (e.g., swapping an intravascular ultrasound (IVUS) catheter for a balloon catheter), to allow realistic 1:1 training on different hardware and software versions. It also potentially allows for cloud streaming of some or all software component simulations from remote servers to reduce the complexity for a learner.
[0025] In some embodiments, a virtual machine running a containerized medical device simulation can be linked to the VR simulation using a web browser or other display mechanism (e.g., web real-time communications (web RTC)) inside a simulation engine such as Unreal or Unity, and can pass interactions back and forth with other containers via a simulation director (which may be integrated with the VR simulation, or may be a separate process). Thus, the actual medical device software, in communication with a simulated version of the device, can be portrayed inside the VR simulation, thus dramatically improving the realism of the simulation.
[0026] This approach is similar to how some aircraft training simulators work. Such aircraft simulators run the actual aircraft software inside containers (or on separate hardware similar or identical to the actual flight computer), while another computer feeds them information (e.g., simulated sensor readings). That way, the pilots interact with the normal aircraft software. Such systems can realistically portray normal flight operations, and can also induce complex failures wherein the software responds as it would in a real aircraft. However, such aircraft simulations have the benefit of actual cockpit hardware such as controls and display screens, rather than virtual controls and display screens inside the VR environment. These flight simulators also do not model and integrate the interaction of multiple separate software containers, each capable of affecting the others in real time.
[0027] The virtual medical device simulation system thus represents a substantially more complex approach, wherein multiple virtual displays, within the same VR environment, show the outputs of different device software, each being fed information from a simulated device with its own input controls. These virtual displays and controls exist only within the VR simulation, but are in communication with software and simulated devices operating outside the simulation.
[0028] This containerized approach allows software, for example, to be swapped out and updated, with little impact to the overall system. This also permits learners to train on the exact software they will be using in the real world. Current approaches in the market may avoid software entirely, or may create highly simplified copies of the software, or interactive slide shows representing only a small fraction of the features and quirks of the real software. The virtual medical device simulation system avoids these limitations by showing the actual device software, via web-browser windows, in the simulated 3D environment. This approach can facilitate increased adoption of VR-based clinician training.
[0029] The present disclosure aids substantially in the training of clinicians, by improving the realism of medical device software representations in a VR environment. Implemented on a VR processor in communication with multiple containerized software applications and their associated simulated devices, the virtual medical device simulation system disclosed herein provides practical improvements in the experience level of clinicians. This improved VR simulation environment transforms a slide-show type representation of device software into a full, simultaneous implementation of the software for multiple devices, without the normally routine need for a clinician’s first contact with the real device software to occur in the real world. This unconventional approach improves the functioning of the VR simulation system, by giving it access to real-time inputs and outputs of actual medical device software.

[0030] The virtual medical device simulation system may be implemented as a virtual environment viewable on a display (e.g., a VR headset display), and operated by a control process executing on a processor that accepts user inputs from a keyboard, mouse, touchscreen, hand controller interface, haptic simulation device, or other input device, and that is in communication with one or more containerized software applications. In that regard, the control process performs certain specific operations in response to different inputs or selections made at different times. Certain outputs of the virtual medical device simulation system may be printed, shown on a display, or otherwise communicated to human operators. Certain structures, functions, and operations of the processor, display, sensors, and user input
systems are known in the art, while others are recited herein to enable novel features or aspects of the present disclosure with particularity.
[0031] These descriptions are provided for exemplary purposes only, and should not be considered to limit the scope of the virtual medical device simulation system. Certain features may be added, removed, or modified without departing from the spirit of the claimed subject matter.
[0032] For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It is nevertheless understood that no limitation to the scope of the disclosure is intended. Any alterations and further modifications to the described devices, systems, and methods, and any further application of the principles of the present disclosure are fully contemplated and included within the present disclosure as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one embodiment may be combined with the features, components, and/or steps described with respect to other embodiments of the present disclosure. For the sake of brevity, however, the numerous iterations of these combinations will not be described separately.
[0033] Figure 1 is a schematic, diagrammatic representation of a virtual medical device simulation system 100, in accordance with aspects of the present disclosure. The virtual medical device simulation system 100 includes a simulation subsystem 110 and a virtual reality (VR) subsystem 120. The simulation subsystem 110 includes a simulation processor 130 and a simulation memory 140. The VR subsystem 120 includes a VR processor 160, a VR memory 170, a VR display 180 (e.g., a 3D virtual reality headset display), and a VR input device 190 (e.g., a VR hand controller, or a VR camera capturing the user’s hands, eyes, etc., such as with image processing of the video to identify user input). One, some, or all of the components of the VR subsystem 120 are or can be similar to commercially available VR systems from Meta (Quest), HTC (Vive), Apple (Vision Pro), etc. In the example shown in Figure 1, the VR subsystem 120 may send, to the simulation subsystem 110, a signal 195 that represents a user input in the VR environment. Such an input may for example be or include a click event on a touchscreen, a movement event for a toggle control, a grasping or movement event for a virtual hand, etc. As described below, based on this user input signal 195, the simulation subsystem may change the state of simulated devices, which in turn changes the internal state of the device software, which in
turn changes the screen displays generated by the device software. Thus, the simulation subsystem 110 generates an updated screen display frame 150, which is transmitted to the VR subsystem 120 for display on one of the virtual displays (e.g., browser windows) in the VR environment.
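The two messages exchanged in Figure 1 (user-input signal 195 and updated screen-display frame 150) could be represented roughly as follows. The field names, types, and the placeholder handler are assumptions made for illustration only:

```python
# Hypothetical message shapes for the Figure 1 exchange: a user-input
# signal (195) sent to the simulation subsystem, and an updated
# screen-display frame (150) returned for a virtual display.
from dataclasses import dataclass


@dataclass
class UserInputSignal:      # cf. signal 195
    event_type: str         # e.g. "click", "toggle", "grasp"
    position: tuple         # X, Y, Z position in VR space
    target_id: str          # which virtual control/device was touched


@dataclass
class ScreenDisplayFrame:   # cf. frame 150
    display_id: str         # which virtual display should show it
    width: int
    height: int
    pixels: bytes           # encoded frame contents


def handle_input(signal: UserInputSignal) -> ScreenDisplayFrame:
    # Placeholder: a real simulation subsystem would update the device
    # simulation state and re-render the device software's screen here.
    return ScreenDisplayFrame(display_id=signal.target_id,
                              width=1920, height=1080, pixels=b"")


frame = handle_input(UserInputSignal("click", (0.4, 1.2, -0.8), "imaging_touchscreen"))
```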
[0034] It is noted that block diagrams are provided herein for exemplary purposes; a person of ordinary skill in the art will recognize myriad variations that nonetheless fall within the scope of the present disclosure. For example, block diagrams may show a particular arrangement of components, modules, services, steps, processes, or layers, resulting in a particular data flow. It is understood that some embodiments of the systems disclosed herein may include additional components, that some components shown may be absent from some embodiments, and that the arrangement of components may be different than shown, resulting in different data flows while still performing the methods described herein.
[0035] Similarly, the logic represented by block diagrams may be shown or described as sequential. However, similar logic could be parallel, massively parallel, object-oriented, real-time, event-driven, cellular automaton, or otherwise, while accomplishing the same or similar functions. In order to perform the methods described herein, a processor may divide each of the steps described herein into a plurality of machine instructions, and may execute these instructions at the rate of several hundred, several thousand, several million, or several billion per second, in a single processor or across a plurality of processors. Such rapid execution may be necessary in order to execute the method in real time or near-real time as described herein. For example, in order to avoid motion sickness in the user, a 3D display of the VR environment may need to update at a rate of 20 Hz or faster (e.g., a minimum of 75-90 Hz may be used to limit the incidence of motion sickness), and with a latency of 100 milliseconds (ms) or less between the generation of a user input and a visible result on the display, and preferably 12 ms or less for a wired system or 70 ms or less for a wireless system.
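As a quick arithmetic check of the timing constraints above, the per-frame budget at a given refresh rate follows directly from the rate; at 90 Hz it is about 11.1 ms, which is consistent with the stated 12 ms wired-latency target:

```python
# Per-frame time budget implied by a display refresh rate.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

assert frame_budget_ms(90) < 12.0    # 90 Hz fits the 12 ms wired target
assert frame_budget_ms(20) == 50.0   # budget at the bare-minimum 20 Hz rate
```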
[0036] It is noted that the simulation subsystem 110 and the VR subsystem 120 may communicate with one another via wired or wireless communication, or combinations thereof. The subsystems may be directly connected (e.g., operating on the same processor, or different processors of the same machine), or may communicate over a network. The simulation subsystem 110 and VR subsystem 120 can be spaced or remote from one another (whether in the same room, different rooms within the same facility, or in different facilities). The processor of the simulation subsystem may execute software stored in memory to provide one or multiple simulations. The simulation subsystem 110 could be one computer or multiple computers (e.g., in communication with one another and VR subsystem).
[0037] Before continuing, it should be noted that the examples described above are provided for purposes of illustration, and are not intended to be limiting. Other devices and/or device configurations may be utilized to carry out the operations described herein.
[0038] Figure 2 is a schematic, diagrammatic representation of a virtual medical device simulation system 100, in accordance with aspects of the present disclosure. Visible are the simulation subsystem 110, simulation processor 130, simulation memory 140, VR subsystem 120, VR processor 160, VR memory 170, VR display 180, and VR input device 190.
[0039] Abstraction is an important aspect of the virtual medical device simulation system 100, and of medical device software. Between a user (e.g., a clinician) and the bare components of a medical device, there may exist several layers of abstraction that have been added over time to make complex tasks simpler. What appears to the user as a single object or action may in fact be a web of different parts or interactions. The virtual medical device simulation system 100 thus employs a family of simulated hardware and software working together and composited together in real time into a single virtual reality simulation.

[0040] In the example shown in Figure 2, the simulation memory 140 holds eight different containerized processes: a synthetic external imaging provider application 210, external imaging software 215, a synthetic intraluminal provider application 220, intraluminal data and/or co-registration software 225, simulated sensors of an intraluminal measurement or treatment device 230, intraluminal measurement or treatment software 235, a virtual patient simulation 240, and patient monitoring software 245. Each of the software applications 210, 215, 220, 225, 230, 235, 240, and 245 may be containerized within virtual machines executing on the simulation processor (or processors) 130.
[0041] An intraluminal medical device could take many forms. One example is an intravascular guidewire with flow sensor that obtains blood flow data (velocity, volume) used to calculate, e.g., CFR (coronary flow reserve). Another example is an intravascular guidewire with pressure sensor that obtains blood pressure data used to calculate a pressure ratio (e.g., FFR, iFR, Pd/Pa, etc.), another example is an intravascular guidewire with flow sensor and pressure sensor. Another example is an intravascular catheter with imaging device (e.g., IVUS transducer or IVUS transducer array, OCT, photoacoustic transducer, etc.). Another example is an intracardiac catheter with imaging device (e.g., transducer array) - such as an intracardiac echocardiography (ICE) catheter. Each different type of device or combination of devices may be represented by a different synthetic intraluminal data provider application 220 and intraluminal data and/or co-registration software application 225.
[0042] External imaging may for example include X-ray imaging (e.g., angioscopy, fluoroscopy) represented by a simulated X-ray imaging device, computer-aided tomography (CT) represented by a simulated CT imaging device, magnetic resonance imaging (MRI) represented by a simulated MRI device, external ultrasound represented by an external ultrasound imaging device, or other related external imaging modalities and devices.
[0043] The virtual patient may have simulated physiological variables such as heart rate, blood oxygen, electrocardiogram (ECG), etc., each represented by simulated sensors such as blood pressure cuff, heart rate sensor, ECG electrodes, etc., which are read by the patient monitoring software.
[0044] Similarly, the VR memory 170 holds one containerized process, a VR simulation 260 (e.g., a virtual environment running in a virtual reality engine such as Unreal or Unity), which may be in communication with the containerized processes of the simulation subsystem through, for example, a plurality of browser windows positioned within the VR environment. It is understood that the term “virtual reality” (VR) is used herein in an exemplary way, and that systems involving extended reality (XR), augmented reality (AR), mixed reality (MR), passthrough mixed reality (PMR), spatial computing (SC), 2D representations of 3D virtual environments, or combinations thereof, may be used instead or in addition, while remaining within the scope of the present disclosure. For example, in some implementations, a user might see a simulated surgical theater overlaid on the actual room in which the user is standing, or an image or representation thereof. Hardware for virtual, augmented, or extended reality presentation may include, but is not limited to: glasses, goggles, headsets, helmets, projectors (whether onto a screen or directly into the eye), contact lenses, projection domes, LED walls, 2D video displays, and otherwise.
[0045] In the example shown in Figure 2, the virtual medical device simulation system 100 also includes a simulation director 250, which is in two-way communication with both the simulation subsystem and the VR subsystem, to pass information between the various containerized software components 210-245 and the VR simulation 260 (e.g., by translating screen displays of the device software 215, 225, 235, and 245 into virtual display windows (e.g., browser windows) positioned with particular X,Y,Z positions and orientations within the virtual environment). The simulation director 250 is also responsible for translating user inputs within the VR subsystem 120 into state changes within the device simulations 210, 220, 230 and the patient simulation 240.
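The simulation director's placement of software screen outputs into the virtual environment can be pictured as a routing table mapping each application to a window pose. The application identifiers, coordinates, and pose fields below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical routing table: each containerized application's screen
# output is mapped to a virtual display window at a fixed X,Y,Z position
# and orientation in the VR scene.
WINDOW_POSES = {
    "external_imaging_sw": {"pos": (0.0, 1.6, -2.0), "yaw_deg": 0},
    "intraluminal_sw":     {"pos": (1.2, 1.6, -2.0), "yaw_deg": -15},
    "patient_monitor_sw":  {"pos": (-1.2, 1.6, -2.0), "yaw_deg": 15},
}


def route_frame(app_id, frame):
    """Attach the window pose so the VR engine knows where to draw the frame."""
    pose = WINDOW_POSES[app_id]
    return {"frame": frame, "pos": pose["pos"], "yaw_deg": pose["yaw_deg"]}


placed = route_frame("patient_monitor_sw", b"<frame bytes>")
```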
[0046] It is noted that the simulation director 250 may be executed on the VR subsystem 120 or simulation subsystem 110, or may be executed on separate, standalone hardware. It is
further noted that the simulation applications 210, 220, 230, and 240 represent operational models of the various medical devices and the patient, which are distinct from the geometric representations of the medical devices and the patient in the virtual environment.
[0047] Figure 3 is a screen display 300 of an example virtual environment 305, in accordance with at least one aspect of the present disclosure. The virtual environment 305 includes a virtual clinician’s head 260 (specifying the 3D positions and orientations of the user’s eyes and ears within the virtual environment 305) and hands 265 (specifying the position, orientation, and open-closed state of the user’s hands within the virtual environment 305). The virtual environment 305 also includes a simulated external imaging system 210, which drives outputs 217 from the external imaging software. The virtual environment 305 also includes a simulated intraluminal measurement device 220 (e.g., an intravascular pressure and/or flow measurement system, an IVUS imaging system, etc.), which drives outputs 227 from the intraluminal measurement device software. The virtual environment 305 also includes a simulated intraluminal treatment device 230, which drives outputs 237 from the intraluminal treatment device software. The virtual environment 305 also includes a simulated patient 240, whose simulated physiological state drives outputs 247 of the patient monitoring software.
[0048] As the user interacts with objects in the virtual environment 305 (e.g., by using the virtual hands 265 to move a catheter, inject fluid from a syringe, or click on a touchscreen), these inputs can change the state of the containerized software outputs or screen displays 217, 227, 237, 247, of the patient simulation 240, and/or of the device simulations 210, 220, 230. These changes can, in turn, create additional changes in the virtual environment 305, as described below.
[0049] Figure 4 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation 400, in accordance with aspects of the present disclosure. In the virtual medical device simulation implementation 400, similar to Figure 1, the VR subsystem 120, after receiving a user input, sends the medical software 420 a signal 410 representing a user input at a particular X, Y, Z position in VR space. This X,Y,Z position may for example be associated with a particular location within a touchscreen in the virtual environment, or may be associated with moving a toggle control to a “forward” or “up” position, or may be associated with moving a catheter or a plunger that the user has previously gripped with a gripper control. This input is then received by the medical software 420 (e.g., by a containerized software application such as medical software 215,
225, or 235 of Figure 2), which generates an updated software screen display frame 420, which is then transmitted to the VR subsystem for display (as shown below in Figures 5A and 5B).
[0050] Figure 5A is a representation of a virtual screen display 500 within the virtual environment, in accordance with aspects of the present disclosure. Figure 5A shows a screen display generated by medical device software, such as screen display 217 of Fig. 3 and/or a screen display generated by the medical software 420 of Fig. 4. Visible are a touchscreen 510, virtual pointer 520, and a touchscreen control 530. In the example shown in Figure 5A, the virtual pointer 520 is pointing at the touchscreen control 530. Such controls may for example change the state of the medical device software, which may in turn change the state of the associated medical device simulation, which may in turn change the state of the patient simulation, all of which may result in visible changes within the VR environment.
[0051] Figure 5B is a representation of a virtual screen display 500 within the virtual environment, in accordance with aspects of the present disclosure. Figure 5B shows a screen display generated by medical device software, such as screen display 217 of Fig. 3 and/or a screen display generated by the medical software 420 of Fig. 4. Visible are the touchscreen 510, virtual pointer 520, and touchscreen control 530. In the example shown in Figure 5B, the virtual pointer 520 has clicked on the touchscreen control 530, thus opening a touchscreen menu 540 that presents a number of software options available to the user.
[0052] The user input on the screen display 500, received via the VR subsystem (e.g., VR subsystem 120 of Figure 4), causes the medical software (e.g., medical software 420 of Figure 4) to generate the screen display frame of Fig. 5B and send the newly generated frame to the VR subsystem. This happens repeatedly as the user interacts with the medical software via the VR subsystem, so that the medical software appears to run in the VR subsystem.
[0053] Since the screen display 500 is generated by the actual medical device software, rather than a facsimile, subset, slide show, etc., the touchscreen menu 540 is representative of what a clinician would see in an actual clinical environment. Thus, training clinicians in the virtual environment can significantly reduce the amount of training time the clinicians require in actual clinical environments, in order to become familiar with the operation of the various medical device software required to perform a procedure.
[0054] Figure 6 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation 600, in accordance with aspects of the present disclosure. In the virtual medical device simulation implementation 600, the VR subsystem 120, after receiving a user input, sends a signal 605 representing a user input at a
particular X, Y, Z position in VR space (as described above) to the external imaging software 215 (e.g., directing the external imaging software 215 to change the C-arm angle of the external imaging system). The external imaging software 215 then sends an updated screen display 610 to the VR subsystem 120, showing the software’s response to the user input. The external imaging software also sends updated state information 610 to the simulation director 250, which performs several actions.
[0055] The simulation director 250 sends updated state information 630 to the synthetic external imaging provider application 210 (e.g., a simulated X-ray machine), which generates a new simulated external image 640 based on the updated state information (e.g., the new C-arm angle). These new external images 640 are then transmitted to the external imaging software 215, which generates a new screen display 615 containing the new image 640, and transmits this new screen display 615 to the VR subsystem 120 for viewing by the user on the virtual display screen of the external imaging system.
[0056] Upon receiving the updated state information 610, the simulation director 250 also sends updated state information 620 directly to the VR subsystem 120, directing the VR subsystem 120 to make changes in the VR environment (e.g., reorienting the external imaging system to the new C-arm angle).
[0057] Thus, by activating a single control (e.g., a C-arm toggle control), the user is able to change the VR representation of the external imaging system, the internal state of the external imaging software, the simulated images generated by the synthetic external imaging provider application 210, and the screen display information 615 shown on the VR environment’s external imaging system virtual display. Such a capability may not be found in present systems, and thus represents a significant improvement in the art.
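The fan-out just described — one user action routed through the simulation director to update the software state, the synthetic image provider, the on-screen image, and the VR scene — can be sketched as below. The class names and the string-valued "image" are illustrative assumptions only.

```python
class SyntheticImagingProvider:
    """Assumed stand-in for the synthetic external imaging provider."""

    def __init__(self):
        self.last_image = None

    def update(self, state):
        # Generate a new simulated X-ray for the new C-arm angle.
        self.last_image = f"xray@{state['c_arm_angle']}deg"


class ExternalImagingSoftware:
    """Assumed stand-in for the actual external imaging software."""

    def __init__(self):
        self.c_arm_angle = 0
        self.screen = None

    def update(self, state):
        self.c_arm_angle = state["c_arm_angle"]

    def show(self, image):
        self.screen = image


class VRScene:
    """Holds the 3D representation of the external imaging system."""

    def __init__(self):
        self.c_arm_angle = 0

    def update(self, state):
        # Reorient the virtual C-arm to match the new angle.
        self.c_arm_angle = state["c_arm_angle"]


class SimulationDirector:
    def __init__(self, provider, software, scene):
        self.provider, self.software, self.scene = provider, software, scene

    def on_user_input(self, state):
        # A single control event updates every dependent component.
        self.software.update(state)
        self.provider.update(state)
        self.software.show(self.provider.last_image)
        self.scene.update(state)


provider, software, scene = SyntheticImagingProvider(), ExternalImagingSoftware(), VRScene()
director = SimulationDirector(provider, software, scene)
director.on_user_input({"c_arm_angle": 30})
assert software.c_arm_angle == 30 and scene.c_arm_angle == 30
assert software.screen == "xray@30deg"
```

The design choice is that components never talk to each other directly; the director is the single point of coordination, which is what allows individual components to be swapped out.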
[0058] It is understood that the external imaging example shown in Figure 6 is merely exemplary, and other software applications, associated with other medical devices, may be used instead or in addition.
[0059] Figure 7 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation 700, in accordance with aspects of the present disclosure. In the virtual medical device simulation implementation 700, the VR subsystem 120, after receiving a user input, sends a signal 710 representing a user input at a particular X, Y, Z position in VR space (as described above) to the intraluminal device software 225 (e.g., advancing, rotating, withdrawing, unplugging, or plugging in an intraluminal catheter that is positioned within a vessel of the simulated patient). This information 710 is also passed to the simulation director 250, which performs several actions.
[0060] The simulation director 250 sends updated state information 750 to the synthetic intraluminal data provider application 220 (e.g., a simulated pressure-sensing guidewire), which generates new simulated intraluminal data 760 based on the updated state information (e.g., the new position of the guidewire). The new data 760 is then transmitted to the intraluminal device software 225, which generates a new screen display 730 containing the new data 760, and transmits this new screen display 730 to the VR subsystem 120 for viewing by the user on the virtual display screen of the intraluminal data system.
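A synthetic data provider of the kind just described maps the device's simulated position to plausible sensor readings. The sketch below uses a toy pressure model for an assumed pressure-sensing guidewire: a fixed proximal pressure with a drop once the sensor passes a stenosis. The positions, pressures, and function names are illustrative assumptions, not values from the disclosure.

```python
def simulated_pressure(position_mm, stenosis_mm=40.0,
                       proximal_mmhg=100.0, drop_mmhg=20.0):
    """Return a toy distal pressure reading for a guidewire position.

    Before the (assumed) stenosis the sensor reads the proximal
    pressure; past it, the reading falls by a fixed pressure drop.
    """
    if position_mm < stenosis_mm:
        return proximal_mmhg
    return proximal_mmhg - drop_mmhg


# Advancing the wire past the assumed stenosis lowers the reading,
# which the intraluminal device software would then plot on its
# virtual display, exactly as a real guidewire feed would.
assert simulated_pressure(10.0) == 100.0
assert simulated_pressure(50.0) == 80.0
```

Because the provider emits data in the same form the real device would, the unmodified intraluminal device software can consume it without knowing it is synthetic.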
[0061] Upon receiving the updated state information 720, the simulation director 250 also sends updated state information 740 directly to the VR subsystem 120, directing the VR subsystem 120 to make changes in the VR environment (e.g., moving the catheter and the hand gripping the catheter).
[0062] Thus, by activating a single control (e.g., a hand controller that controls a virtual hand gripping the virtual catheter), the user is able to change the VR representation of the intravascular device, the internal state of the intravascular software, the simulated data generated by the synthetic intraluminal data provider application 220, and the screen display information 730 shown on the VR environment’s intraluminal data system virtual display. It is understood that the intraluminal data example shown in Figure 7 is merely exemplary, and other software applications, associated with other medical devices, may be used instead or in addition.
[0063] Figure 8 is a screen display 800 of an example virtual environment 305, in accordance with aspects of the present disclosure. Visible are the screen displays 217 and 227 of the external imaging software 215 and the intraluminal imaging software 225, which in this example is intravascular ultrasound (IVUS) imaging software, although in general it could be any intraluminal device software, including but not limited to pressure measurement, flow measurement, and/or imaging with different modalities such as optical coherence tomography (OCT). Also visible is a virtual hand 810 (e.g., controlled by a hand controller of the clinician) gripping a virtual IVUS catheter 820, per an instruction window 830. The screen display 217 of the external imaging software 215 shows a synthetic fluoroscopic (e.g., X-ray) image of the patient, including the catheter 820. The synthetic fluoroscopic image may for example be generated by the synthetic external imaging provider application 210 of Figure 2. The screen display 227 of the intraluminal imaging software 225 shows a synthetic cross-sectional or circumferential IVUS view 840 of the interior of a blood vessel where the virtual catheter 820 is positioned. The synthetic IVUS view may for example be generated by the synthetic intraluminal data provider application 220 of Figure 2.
[0064] Figure 9 is a screen display 800 of an example virtual environment 305, in accordance with aspects of the present disclosure. Visible are the screen displays 217, 227 of the external imaging software 215 and the intraluminal imaging software 225. Also visible is the virtual hand 810 gripping the virtual IVUS catheter 820. Per an instruction window 830, the clinician, trainee, or learner has advanced the virtual catheter 820 further into the patient’s blood vessel. Other types of intraluminal imaging user actions may include selecting a Record button, selecting an End Recording button, making body lumen area or diameter measurements, using blood flow visualization (e.g., Philips ChromaFlo), adding bookmarks, or reviewing recorded data to make clinical decisions. The user may also choose to execute a co-registration workflow which may, for example, require the interaction of multiple software programs (e.g., Philips Azurion and IntraSight software) to align images acquired from different imaging modalities (e.g., IVUS and angiogram data), along with precise user actions taken inside VR. The screen display 217 of the external imaging software 215, showing the synthetic fluoroscopic (e.g., X-ray) image of the patient, therefore shows the catheter 820 in a different, more extended position than was shown in Figure 8. The screen display 227 of the intraluminal imaging software 225 shows a synthetic view 840 of a different portion of the blood vessel where the virtual catheter 820 is positioned. Other examples of changes in the screen display 227 of the intraluminal imaging software 225 include building up an image longitudinal display (ILD) in real time, etc.
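One building block of a co-registration workflow of the kind mentioned above is mapping an IVUS frame, recorded at a known pullback distance, onto a point along the vessel centerline extracted from the angiogram. The sketch below uses plain linear interpolation along a piecewise-linear centerline; the representation, the function name, and the sample coordinates are all simplifying assumptions, not the disclosed method.

```python
def coregister(pullback_mm, centerline):
    """Map a pullback distance to an (x, y) point on the angiogram
    centerline, given as a list of (cumulative_mm, x, y) samples
    ordered by increasing cumulative distance."""
    for (d0, x0, y0), (d1, x1, y1) in zip(centerline, centerline[1:]):
        if d0 <= pullback_mm <= d1:
            # Linear interpolation between the two bracketing samples.
            t = (pullback_mm - d0) / (d1 - d0)
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    raise ValueError("pullback distance outside centerline range")


# A toy centerline: 10 mm along +x, then 10 mm along +y.
centerline = [(0.0, 0.0, 0.0), (10.0, 10.0, 0.0), (20.0, 10.0, 10.0)]
assert coregister(5.0, centerline) == (5.0, 0.0)
assert coregister(15.0, centerline) == (10.0, 5.0)
```

In a full workflow this mapping would let the external imaging display highlight the angiogram location corresponding to the IVUS frame currently shown.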
[0065] Thus, a change in the virtual environment 305 (e.g., advancing the catheter) causes a change within the synthetic external imaging provider application 210, which causes a change in the external imaging software 215, which then causes a change in one of the screen displays in the virtual environment 305. The change in the virtual environment 305 (e.g., advancing the catheter) also causes a change in the synthetic intraluminal data provider 220, which causes a change in the intraluminal data and/or co-registration software 225, which then drives a change to another screen display in the virtual environment 305. These changes happen in a synchronized way, in real time (e.g., with 10-millisecond latency or less) or near-real time (e.g., with latency of 10-1000 milliseconds), using the actual software for the external imaging system and the intraluminal imaging system, thus providing a high degree of realism for the VR training environment.
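The latency bands given above (real time at 10 milliseconds or less, near-real time at 10–1000 milliseconds) can be expressed as a simple classifier; this helper is purely illustrative and is not part of the disclosed system.

```python
def latency_class(latency_ms):
    """Classify an end-to-end update latency using the bands
    stated in the text: <=10 ms real time, 10-1000 ms near-real time."""
    if latency_ms <= 10:
        return "real time"
    if latency_ms <= 1000:
        return "near-real time"
    return "exceeds near-real time"


assert latency_class(5) == "real time"
assert latency_class(250) == "near-real time"
assert latency_class(2000) == "exceeds near-real time"
```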
[0066] Such realism can improve the quality of VR training and thus reduce the amount of time the clinician, trainee, or learner needs to spend in actual surgical environments in order to learn the features and quirks of the various medical device software applications being used. Furthermore, this modular approach allows one medical device or software version to be swapped out in favor of a different medical device or software application.

[0067] Figure 10 is a schematic, diagrammatic illustration, in block diagram form, of an example virtual medical device simulation implementation 1000, in accordance with aspects of the present disclosure. In the virtual medical device simulation implementation 1000, the VR subsystem 120, after receiving a user input, sends a signal 1010 representing a user input at a particular X, Y, Z position in VR space (as described above) to the simulation director 250 (e.g., an eye tracker or hand tracker input, such as moving a virtual hand, gripping the plunger of a virtual syringe, and injecting X-ray contrast dye into the patient). The simulation director 250 then performs several actions. For example, the simulation director 250 sends updated state information 1020 directly to the VR subsystem (e.g., changing the position of the hand, plunger, contrast dye, and syringe).
[0068] The simulation director 250 also provides updated state information 1030 to the synthetic external imaging provider application 210 (e.g., indicating that the blood vessel under investigation now contains a radiopaque X-ray contrast dye). The synthetic external imaging provider application 210 then generates new synthetic external images 1040 based on the updated state information 1030 (e.g., new images showing the same view of the patient, but with the radiopaque contrast dye in the blood vessel). These synthetic images 1040 are then passed to the external imaging software 215, which generates updated state information 1050 (e.g., a new screen display showing the new image(s)) and passes this updated state information 1050 to the VR subsystem for display (e.g., on one of the virtual monitors or browser windows positioned within the VR environment).
[0069] The simulation director 250 also passes the updated state information 1030 (e.g., the presence of contrast dye in the blood vessel) to the virtual patient simulation 240, which may generate corresponding physiological changes (e.g., X-ray contrast dyes can cause an increase in the patient’s heart rate) that are passed on to the patient monitoring software 245 as updated medical state data 1060 based on the updated state information 1030. The patient monitoring software 245 then sends its own updated state information 1050 (e.g., a new screen display showing updated physiological data such as increased heart rate) to the VR subsystem 120 for display (e.g., on one of the virtual screens or browser windows positioned within the virtual environment).
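The chain just described — director reports the injection, the virtual patient simulation produces a physiological response, and the patient monitor displays the updated vitals — can be sketched as follows. The class names and the +10 bpm response are assumptions chosen for illustration, not values from the disclosure.

```python
class VirtualPatient:
    """Assumed stand-in for the virtual patient simulation."""

    def __init__(self, heart_rate=70):
        self.heart_rate = heart_rate

    def apply_event(self, event):
        # Contrast dye is modeled (as a toy assumption) as a
        # transient +10 bpm rise in heart rate.
        if event == "contrast_injected":
            self.heart_rate += 10
        return {"heart_rate": self.heart_rate}


class PatientMonitor:
    """Assumed stand-in for the patient monitoring software display."""

    def __init__(self):
        self.display = {}

    def update(self, vitals):
        self.display = vitals


patient, monitor = VirtualPatient(), PatientMonitor()
# Director passes the injection event to the patient simulation,
# then forwards the resulting vitals to the monitor.
monitor.update(patient.apply_event("contrast_injected"))
assert monitor.display == {"heart_rate": 80}
```

The same event could fan out to further simulations (pressure guidewire, IVUS appearance) in exactly the same way, which is the multi-system response noted below.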
[0070] Thus, a change in the virtual environment 305 (e.g., injecting the contrast dye) causes a change within the synthetic external imaging provider application 210, which causes a change in the external imaging software 215, which then causes a change in one of the
screen displays in the virtual environment 305. The change in the virtual environment 305 (e.g., advancing the catheter) also causes a change in the virtual patient simulation 240, which causes a change in the patient monitoring software 245, which then drives a change to another screen display in the virtual environment 305. These changes happen in a synchronized way, in real time or near-real time, using the actual software for the external imaging system and the patient monitoring system, thus providing a high degree of realism for the VR training environment.
[0071] It is noted that, depending on the implementation and the configuration of simulated devices, the change to the virtual environment (e.g., the injection of contrast dye) may trigger changes in other simulations as well, such as a change in the periodic blood pressure measured by a pressure-sensing guidewire, changes in the appearance of the blood vessel in IVUS images, etc. Such multi-system responses to a change in the virtual environment provide enhanced realism for the VR training simulation.
[0072] More generally, it is understood that operation of a first virtual medical device can cause a change in the state of the operational simulation software of a second virtual medical device and/or the patient monitoring software application, and operation of the second virtual medical device can cause change in the state of the operational simulation software of the first medical device and/or the patient monitoring software application.
[0073] Figure 11 is a schematic diagram of a processor circuit 1150, in accordance with aspects of the present disclosure. The processor circuit 1150 may be implemented in the virtual medical device simulation system 100, or other devices or workstations (e.g., third-party workstations, network routers, etc.), or on a cloud processor or other remote processing unit, as necessary to implement the method. As shown, the processor circuit 1150 may include a processor 1160, a memory 1164, and a communication module 1168. These elements may be in direct or indirect communication with each other, for example via one or more buses.
[0074] The processor 1160 may include a central processing unit (CPU), a digital signal processor (DSP), an ASIC, a controller, or any combination of general-purpose computing devices, reduced instruction set computing (RISC) devices, application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other related logic devices, including mechanical and quantum computers. The processor 1160 may also comprise another hardware device, a firmware device, or any combination thereof configured to perform the operations described herein. The processor 1160 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a
plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0075] The memory 1164 may include a cache memory (e.g., a cache memory of the processor 1160), random access memory (RAM), magnetoresistive RAM (MRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, solid state memory devices, hard disk drives, other forms of volatile and non-volatile memory, or a combination of different types of memory. In an embodiment, the memory 1164 includes a non-transitory computer-readable medium. The memory 1164 may store instructions 1166. The instructions 1166 may include instructions that, when executed by the processor 1160, cause the processor 1160 to perform the operations described herein. Instructions 1166 may also be referred to as code. The terms “instructions” and “code” should be interpreted broadly to include any type of computer-readable statement(s). For example, the terms “instructions” and “code” may refer to one or more programs, routines, sub-routines, functions, procedures, etc. “Instructions” and “code” may include a single computer-readable statement or many computer-readable statements.

[0076] The communication module 1168 can include any electronic circuitry and/or logic circuitry to facilitate direct or indirect communication of data between the processor circuit 1150 and other processors or devices. In that regard, the communication module 1168 can be an input/output (I/O) device. In some instances, the communication module 1168 facilitates direct or indirect communication between various elements of the processor circuit 1150 and/or the virtual medical device simulation system 100. The communication module 1168 may communicate within the processor circuit 1150 through numerous methods or protocols.
Serial communication protocols may include but are not limited to Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I2C), Recommended Standard 232 (RS-232), RS-485, Controller Area Network (CAN), Ethernet, Aeronautical Radio, Incorporated 429 (ARINC 429), MODBUS, Military Standard 1553 (MIL-STD-1553), or any other suitable method or protocol. Parallel protocols include but are not limited to Industry Standard Architecture (ISA), Advanced Technology Attachment (ATA), Small Computer System Interface (SCSI), Peripheral Component Interconnect (PCI), Institute of Electrical and Electronics Engineers 488 (IEEE-488), IEEE-1284, and other suitable protocols. Where appropriate, serial and parallel communications may be bridged by a Universal Asynchronous Receiver Transmitter (UART), Universal Synchronous Receiver Transmitter (USART), or other appropriate subsystem.
[0077] External communication (including but not limited to software updates, firmware updates, preset sharing between the processor and central server, or readings from the user controls) may be accomplished using any suitable wireless or wired communication technology, such as a cable interface such as a universal serial bus (USB), micro USB, Lightning, or FireWire interface, Bluetooth, Wi-Fi, ZigBee, Li-Fi, or cellular data connections such as 2G/GSM (global system for mobile communications), 3G/UMTS (universal mobile telecommunications system), 4G, long term evolution (LTE), WiMax, or 5G. For example, a Bluetooth Low Energy (BLE) radio can be used to establish connectivity with a cloud service, for transmission of data, and for receipt of software patches. The controller may be configured to communicate with a remote server, or a local device such as a laptop, tablet, or handheld device, or may include a display capable of showing status variables and other information. Information may also be transferred on physical media such as a USB flash drive or memory stick.
[0078] It is noted that the terms “processor” and “memory” are used to refer to a range of potential technologies and implementations. Processor may for example refer to a device or devices providing computing power to complete a task. This may for example be a desktop or mobile central processing unit (CPU), a more specialized mobile Arm-style processor, a graphics processing unit (GPU) or tensor processing unit (TPU), or a network of different devices including some physical and some in the cloud working together. Memory similarly refers to a range of potential technologies and implementations of data storage. Data may be stored in traditional volatile RAM, or on non-volatile storage media such as hard drives or solid-state drives, or remotely in the cloud.
[0079] Accordingly, it can be seen that the virtual medical device simulation system advantageously provides an enhanced virtual training environment wherein a simulated patient and multiple simulated medical devices or systems respond realistically as the various devices are operated or otherwise manipulated. Such interdependence of the device simulations provides greatly enhanced realism, thus increasing the value of virtual vs. live training, and thus potentially reducing the time required for live training and therefore potentially reducing the costs and increasing the throughput of training programs.
[0080] The virtual medical device simulation system thus provides an ability to simulate previously impossible levels of detail, thus disrupting the current status quo in medical simulation. It provides for modular simulation of original computing hardware, hardware sensors, and actual device software, and may thus represent a future-proof approach
that will allow the addition and removal of individual parts. This approach may allow for significant cost, feature, and carbon footprint advantages over current approaches.
[0081] A number of variations are possible on the examples and embodiments described above. For example, other medical devices may be simulated than those shown herein, including but not limited to surgical devices, measurement devices, interventional devices, transdermal devices, intraluminal devices, imaging devices, teleoperated or robotic devices, and otherwise. Other types of patients may be simulated, including but not limited to prenatal, pediatric, or geriatric patients, veterinary patients (e.g., pets, farm animals, wild animals, etc.), or deceased patients (e.g., for autopsy procedures). Other types of environments may be simulated, including but not limited to operating rooms, emergency rooms, medical offices, ambulances, field hospitals, etc. In some implementations, multiple patients may be represented, as for example in an organ transplant or blood transfusion scenario.
[0082] The technology described herein may also be applied to veterinary applications, as well as non-medical applications such as VR simulation training for chemical or electrical plant engineers, training of bridge crews for naval vessels, or automotive repair for those working on highly computerized vehicles. The approach is focused on providing value where humans, complex mechanical systems, and complex networked software systems interact, and where training simulations are believed to provide value.
[0083] Accordingly, the logical operations making up the embodiments of the technology described herein are referred to variously as operations, steps, objects, elements, components, subsystems, containers, or modules. Furthermore, it should be understood that these may occur or be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.
[0084] All directional references e.g., upper, lower, inner, outer, upward, downward, left, right, lateral, front, back, top, bottom, above, below, vertical, horizontal, clockwise, counterclockwise, proximal, and distal are only used for identification purposes to aid the reader’s understanding of the claimed subject matter, and do not create limitations, particularly as to the position, orientation, or use of the virtual medical device simulation system. Connection references, e.g., attached, coupled, connected, joined, or “in communication with” are to be construed broadly and may include intermediate members between a collection of elements and relative movement between elements unless otherwise indicated. As such, connection references do not necessarily imply that two elements are directly connected and in fixed relation to each other. The term “or” shall be interpreted to
mean “and/or” rather than “exclusive or.” The word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. Unless otherwise noted in the claims, stated values shall be interpreted as illustrative only and shall not be taken to be limiting.
[0085] The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments of the virtual medical device simulation system as defined in the claims. Although various embodiments of the claimed subject matter have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of the claimed subject matter.
[0086] Still other embodiments are contemplated. It is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative only of particular embodiments and not limiting. Changes in detail or structure may be made without departing from the basic elements of the subject matter as defined in the following claims.
Claims
1. A system, comprising: at least one memory configured to store: a first operational simulation software associated with a first virtual medical device visually depicted within a virtual environment; a first medical software application associated with the first virtual medical device, wherein the first medical software application comprises an output visible on a first virtual display screen visually depicted within the virtual environment; and a simulation director software; and at least one processor configured for communication with the at least one memory and configured to execute the first operational simulation software, the first medical software application, and the simulation director software, wherein an operation of the first virtual medical device within the virtual environment by a user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the first operational simulation software, wherein the change within the first operational simulation software is configured to cause the at least one processor to direct a change within the first medical software application, and wherein the change within the first medical software application is configured to cause the at least one processor to direct a change to the output visible on the first virtual screen display.
2. The system of claim 1, wherein the first virtual medical device comprises a virtual intravascular catheter or guidewire.
3. The system of claim 1, wherein the at least one processor is configured to use the first medical software application to send commands to or receive data from a first non-virtual medical device.
4. The system of claim 1, wherein the at least one memory is configured to store: a virtual patient simulation software associated with a virtual patient visually depicted within the virtual environment; and
a patient monitoring software application comprising a patient monitoring output visible on a patient monitoring virtual screen display visually depicted within the virtual environment, wherein the at least one processor is configured to execute the virtual patient simulation software and the patient monitoring software application.
5. The system of claim 4, wherein the at least one processor is configured to use the patient monitoring software application to receive data from a non-virtual patient sensor.
6. The system of claim 4, wherein the operation of the first virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the virtual patient simulation software, wherein the change within the virtual patient simulation software is configured to cause the at least one processor to direct a change within the patient monitoring software application, wherein the change within the patient monitoring software application is configured to cause the at least one processor to direct a change to the patient monitoring output visible on the patient monitoring virtual screen display.
7. The system of claim 1, wherein the at least one memory is configured to store: a second operational simulation software associated with a second virtual medical device visually depicted within the virtual environment; a second medical software application associated with the second virtual medical device, wherein the second medical software application comprises an output visible on a second virtual display screen visually depicted within the virtual environment; and wherein the at least one processor is configured to execute the second operational simulation software and the second medical software application.
8. The system of claim 7, wherein the second virtual medical device comprises a virtual x-ray imaging device.
9. The system of claim 7, wherein an operation of the second virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the second operational simulation software, wherein the change within the second operational simulation software is configured to cause the at least one processor to direct a change within the second medical software application, and wherein the change within the second medical software application is configured to cause the at least one processor to direct a change to the output visible on the second virtual screen display.
10. The system of claim 7, wherein the operation of the first virtual medical device within the virtual environment by the user is configured to cause the at least one processor to use the simulation director software to direct a corresponding change within the second operational simulation software, wherein the change within the second operational simulation software is configured to cause the at least one processor to direct a change within the second medical software application, and wherein the change within the second medical software application is configured to cause the at least one processor to direct a change to the output visible on the second virtual screen display.
11. The system of claim 1, wherein the at least one memory comprises a simulation memory configured to store the first operational simulation software and the first medical software application, wherein the at least one processor comprises a simulation processor in communication with the simulation memory.
12. The system of claim 11, wherein the simulation memory is configured to store the simulation director software.
13. The system of claim 11, wherein the at least one memory comprises a virtual reality memory configured to store a virtual reality simulation software, wherein the at least one processor comprises a virtual reality processor in communication with the virtual reality memory, wherein the virtual reality processor is configured to execute the virtual reality simulation software to provide the virtual environment.
14. The system of claim 13, wherein the virtual reality memory is configured to store the simulation director software.
15. A virtual medical system, comprising: a virtual reality subsystem comprising: a virtual reality processor and a virtual reality memory; a virtual environment running on the virtual reality processor; a first virtual medical device geometrically represented within the virtual environment; a first virtual screen display geometrically represented within the virtual environment; a simulation subsystem comprising: a simulation processor and a simulation memory; a first operational simulation of the first virtual medical device running on the simulation processor; a first medical software application associated with the first virtual medical device and running on the simulation processor, wherein the first medical software application is configured to generate an output visible on the first virtual display screen; and a simulation director operatively coupled to the virtual reality subsystem and the simulation subsystem, such that an operation of the first virtual medical device within the virtual environment causes the simulation director to direct a corresponding change within the first operational simulation,
wherein the change within the first operational simulation causes a change within the first medical software application that produces a change to the output visible on the first virtual screen display.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
US202363530139P | 2023-08-01 | 2023-08-01 |
US63/530,139 | 2023-08-01 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2025027132A1 (en) | 2025-02-06 |
Family
ID=92264034
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2024/071807 WO2025027132A1 (en) | 2023-08-01 | 2024-08-01 | Intravascular medical device and external imaging software simulation in virtual reality |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2025027132A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190355278A1 (en) * | 2018-05-18 | 2019-11-21 | Marion Surgical Inc. | Virtual reality surgical system including a surgical tool assembly with haptic feedback |
US20220293014A1 (en) * | 2016-09-29 | 2022-09-15 | Simbionix Ltd. | Virtual reality medical simulation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11694328B2 (en) | Method and system for outputting augmented reality information | |
US11594002B2 (en) | Overlay and manipulation of medical images in a virtual environment | |
US9208747B2 (en) | Control module and control method to determine perspective in the rendering of medical image data sets | |
Sielhorst et al. | An augmented reality delivery simulator for medical training | |
JP5269604B2 (en) | System and method for editing a model of a physical system for simulation | |
US20170296292A1 (en) | Systems and Methods for Surgical Imaging | |
EP2777034B1 (en) | Interacting with a three-dimensional object dataset | |
CN107705855B (en) | Personalized percutaneous coronary angioplasty (PTCA) simulation training system and method | |
CN115315729A (en) | Method and system for facilitating remote presentation or interaction | |
JP7466541B2 (en) | Positioning of medical X-ray imaging equipment | |
CN115715386A (en) | Augmented reality based user interface add-on components, systems, and methods for viewing 3D or 4D medical image data | |
Zhou et al. | Cardiovascular‐interventional‐surgery virtual training platform and its preliminary evaluation | |
CN114974548A (en) | Device for moving a medical object and method for providing a control preset | |
CN107316554A (en) | A kind of heart interventional therapy virtual training system | |
Mangalote et al. | A comprehensive study to learn the impact of augmented reality and haptic interaction in ultrasound-guided percutaneous liver biopsy training and education | |
CN116416383A (en) | Dynamic model building method, simulation operation device, equipment and medium | |
WO2025027132A1 (en) | Intravascular medical device and external imaging software simulation in virtual reality | |
Capellini et al. | 3D Printing and 3D Virtual Models for Surgical and Percutaneous Planning of Congenital Heart Diseases. | |
Chen et al. | Virtual-reality simulator system for double interventional cardiac catheterization using haptic force producer with visual feedback | |
Kumar et al. | Role of Augmented Reality and Virtual Reality in Medical Imaging | |
EP4181789B1 (en) | One-dimensional position indicator | |
CN204971576U (en) | Nose endoscopic surgery navigation emulation training system | |
US20230245376A1 (en) | System and method for four-dimensional angiography | |
Hong et al. | Virtual angioscopy based on implicit vasculatures | |
WO2024233351A2 (en) | Image guided procedures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24752383 Country of ref document: EP Kind code of ref document: A1 |