
WO2015027286A1 - A medical training simulation system and method - Google Patents

A medical training simulation system and method

Info

Publication number
WO2015027286A1
WO2015027286A1 (application PCT/AU2014/000865)
Authority
WO
WIPO (PCT)
Prior art keywords
manikin
images
simulation
computational
sar
Prior art date
Application number
PCT/AU2014/000865
Other languages
French (fr)
Inventor
Bruce Hunter THOMAS
Michael Robert MARNER
Ross Travers SMITH
Alexander William Walker
Michal Wozniak
Nirmal Kumar MENON
Original Assignee
University Of South Australia
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2013903338A external-priority patent/AU2013903338A0/en
Application filed by University Of South Australia filed Critical University Of South Australia
Publication of WO2015027286A1 publication Critical patent/WO2015027286A1/en


Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/30 - Anatomical models
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 - Electrically-operated educational appliances
    • G09B5/06 - Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/067 - Combinations of audio and projected visual presentation, e.g. film, slides
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • the present invention relates to medical training simulators.
  • the present invention relates to medical training systems using mannequins to simulate a patient.
  • the process of making a medical diagnosis is a cognitive process, where a clinician uses several sources of sensory observations, which are combined to form a diagnostic impression (a medical diagnosis).
  • Medical training simulation systems are used to allow a healthcare provider (medical student, doctor, nurse, paramedic, etc.) to follow the process of attempting to determine or identify the possible disease or disorder (a medical diagnostic procedure), and to reach an opinion by this process (a medical diagnostic opinion).
  • Medical training simulation systems typically use a Human Patient Simulation Manikin (HPSM, or simply manikin, or alternatively mannequin), which is placed in a clinical setting such as a simulated hospital ward, emergency room, or mock incident location to allow training to be performed in a realistic and immersive environment with multiple people and standard medical equipment available.
  • Such systems enable both trainee and experienced clinicians to develop the skills to identify, understand and combat life threatening situations that may occur in a clinical situation before they are actually faced with these situations. Further, whilst students often have the theoretical knowledge of how to deal with these situations, by physically acting out the procedures with a responsive (eg breathing, talking) patient, the student can develop the practical skills and psychological preparedness required for real clinical situations.
  • HPSMs are mechanically based simulation manikins, which are extremely limited in their range of patient types and simulation capabilities. These can be divided into low, medium and high fidelity manikins.
  • High fidelity manikins attempt to provide a lifelike patient which can respond to user interventions (either manually or remotely) and can contain over 20 different mechanical and electrical sub systems such as pupil dilation, blue LEDs around the lips to simulate cyanosis, and separate audio systems for heart, lungs, vocals and bowels.
  • Such systems are expensive, with initial costs of $20,000, with added expense for installation, upkeep of disposable parts, operator training, etc.
  • Medium fidelity manikins have fewer features, less realistic responses and are cheaper, and tend to focus on just one medical issue, or don't cover certain medical issues.
  • Low fidelity manikins are often little more than dumb solids, and include simple features such as a basic airway and lungs to allow CPR training.
  • As manikins are constructed of a dull plastic they lack a human look and feel, as well as mechanisms to imply different and changing skin tones, bruising, rashes, or boil-like symptoms. Such visual clues are often important when assessing a patient and making a medical diagnosis. They also use mechanical solutions to show effects such as tears, pupil dilation, and chest movement resulting from breathing. As a result the manikins lack realism, and simulation coordinators resort to techniques such as applying makeup and wigs to represent different patient types and symptoms, as well as resorting to directly informing the participants of what the makeup is supposed to represent. The mechanical subsystems also create unintentional sounds such as faint creaking and squeaking which interfere with audio based diagnosis, and can even lead to misdiagnosis as some students interpret these sounds as though they were real medical symptoms (eg creaking in the chest).
  • a method for simulating a medical condition comprising:
  • a computational system comprising at least one computing device and at least one projector.
  • the one or more images are visual indications of a simulated medical condition. These images may be a sequence of images corresponding to visual indications of different stages or symptoms of a simulated medical condition.
  • the medical condition is a trauma condition.
  • the medical condition is a neurological condition.
  • the method further comprises selecting an age, body size, ethnicity and sex for a simulated patient, and projecting one or more images corresponding to the selected age, body size, ethnicity and sex onto the manikin.
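A profile-to-texture lookup of this kind can be sketched in a few lines. The Python sketch below is illustrative only (the patent does not specify a data model); the `PatientProfile` class and the texture naming scheme are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PatientProfile:
    """Parameters a supervisor might select before a simulation run."""
    age_group: str   # e.g. "child", "adult", "elderly"
    body_size: str   # e.g. "small", "medium", "large"
    ethnicity: str
    sex: str

def texture_key(profile: PatientProfile) -> str:
    """Map a profile to the name of a pre-rendered skin texture.

    The naming scheme is hypothetical; a real system would index
    into its own image library however it chooses.
    """
    return "_".join([profile.age_group, profile.body_size,
                     profile.ethnicity, profile.sex]).lower()

profile = PatientProfile("elderly", "medium", "caucasian", "female")
print(texture_key(profile))  # elderly_medium_caucasian_female
```

The key would then select which projection textures the computational system loads for the session.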
  • the manikin is obtained by forming a 3D shape based upon a computer model.
  • the manikin is a standard manikin fitted with a de-featured face mask to hide existing manikin features and to provide a clean surface for projection of the one or more images by the one or more projectors.
  • the manikin includes one or more speakers, and the method further comprises generating one or more aural outputs from the speakers in the manikin.
  • the aural outputs are used to simulate one or more aural symptoms, and the aural symptoms are coordinated with the projected images.
  • the manikin includes a microphone and the method further comprises receiving aural input from a user. In one form, the method further comprises providing the aural input to a simulation supervisor.
  • the manikin includes one or more embedded subsystems
  • the method further comprises simulating one or more symptoms of the simulated medical condition using at least one embedded subsystem, wherein the symptoms are coordinated with the one or more projected images.
  • the computational system may also comprise one or more tracking systems for tracking at least the manikin.
  • the computational system may be used to project images onto other objects or to track other objects besides the manikin. In one form, the one or more projectors are ceiling mounted projectors.
  • the manikin is located on a bed, and the one or more projectors are mounted in or on the bed.
  • the manikin is a shell constructed of semi-translucent material.
  • the computational system is a SAR system comprising one or more projectors for projecting one or more images into a SAR environment which includes at least a portion of the manikin.
  • the method further comprises obtaining a video capture of the simulation, and storing simulation data to allow for later review of the medical simulation.
  • the method further comprises projecting treatment information to instruct a user on an appropriate procedure to treat the simulated medical condition. In one form, the method further comprises projecting one or more images representing an internal state of the simulated patient.
  • a computer readable medium comprising computer executable instructions for performing the method of the first aspect.
  • a computational apparatus comprising a processor and a memory, the processor configured to perform the method of the first aspect.
  • a medical training system comprising:
  • a computational system comprising:
  • one or more projectors for projecting one or more images onto at least a portion of the manikin.
  • the computational system comprises one or more tracking systems for tracking at least the manikin.
  • the computational system is a SAR system, and the at least one projector is for projecting the one or more images into a SA R environment, the SAR environment comprising at least a portion of the manikin.
  • a method for simulating a patient comprising:
  • a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers;
  • a simulation system comprising: a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers; and
  • a computational system comprising at least one processor and a memory, the at least one processor configured to display one or more images on the at least one surface and to provide one or more audio outputs to simulate a patient.
  • the above method and simulation system may use a standardised patient manikin to replace an actor in a simulation system. In one form, the at least one display device comprises at least an OLED based display or flat panel display device. In one form, the computational system is configured to
  • the computational system further comprises a microphone and the computational system is configured to output the audio signal via the one or more speakers to simulate a patient response in a simulation.
  • the manikin further comprises one or more touch sensitive surfaces for receiving user input, and the computational system is configured to select or generate one or more images and one or more audio outputs based upon the received user input.
  • Figure 1 is a schematic view of a spatial augmented reality (SAR) medical training system according to an embodiment
  • Figure 2A is an isometric view of spatial augmented reality (SAR) medical training system using ceiling mounted projectors according to an embodiment
  • Figure 2B is a top view of the embodiment shown in Figure 2A;
  • Figure 2C is a side view of the embodiment shown in Figure 2A;
  • Figure 3A is an isometric view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment
  • Figure 3B is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment
  • Figure 3C is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to another embodiment
  • Figure 3D is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to another embodiment
  • Figure 4 is an isometric view of a medical training system using internally mounted projectors according to an embodiment
  • Figure 5 A is an isometric view of a range of manikins of different sizes according to an embodiment
  • Figure 5B is a top view of retrofitting an existing manikin by placing a de-featured mask over the head of the manikin to hide facial features according to an embodiment;
  • Figure 5C shows a top view of the manikin of Figure 5B retrofitted with a de-featured mask according to an embodiment
  • Figure 5D is a top view of the projection of an injured face onto the retrofitted manikin of Figure 5C according to an embodiment
  • Figure 6A is an isometric view of the projection of a rash symptom onto a manikin by a projector according to an embodiment
  • Figure 6B is an isometric view of the projection of training information onto a manikin by a projector according to an embodiment
  • Figure 6C is an isometric view of the projection of internal information onto a manikin by a projector according to an embodiment.
  • Figure 7 is a schematic diagram of a computing device according to an embodiment.
  • the medical training system comprises at least one manikin, and a computational system.
  • the computational system comprises one or more projectors for projecting one or more images onto at least a portion of the manikin.
  • the manikin may be in a fixed location.
  • the computational system may also include one or more tracking systems for tracking at least the manikin.
  • the method of simulating a medical condition is performed by obtaining a manikin, and then projecting one or more images onto at least a portion of the manikin to simulate a medical condition.
  • the computing device may be configured to display images or a sequence of images according to a script or under the control of a simulation supervisor who may trigger specific sequences or images, or the selection of images may be automated and respond to actions by persons interacting with the manikin.
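A script-driven sequence of the kind described here can be modelled as timed cues. The following Python sketch is a minimal illustration, assuming a simple list of (elapsed seconds, image) cues; the image file names are placeholders, not assets named in the patent.

```python
import bisect

# A scenario "script": (elapsed_seconds, image_name) cues, in time order.
# Names are illustrative placeholders for textures to project.
SCRIPT = [
    (0.0,   "face_normal.png"),
    (60.0,  "face_pale.png"),
    (180.0, "face_cyanosis_mild.png"),
    (300.0, "face_cyanosis_severe.png"),
]

def image_at(script, elapsed):
    """Return the image cue active at `elapsed` seconds into the scenario."""
    times = [t for t, _ in script]
    i = bisect.bisect_right(times, elapsed) - 1  # last cue at or before `elapsed`
    return script[max(i, 0)][1]

print(image_at(SCRIPT, 200.0))  # face_cyanosis_mild.png
```

A supervisor trigger or an automated response to participant actions would amount to jumping to a different cue or swapping in a different script.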
  • the manikin could be a whole body or a part such as just the head or face, in which case static or animated medical condition and/or feature data is projected onto the manikin's face.
  • the manikin/head can be configured to talk (either computer animated or pre-recorded audio) and respond to established scenarios through the application of programmed intelligence, or respond via instructor input, to create a more realistic or immersive medical training simulation.
  • This computational system may also be a Spatial Augmented Reality (SAR) system.
  • AR is the addition of digital imagery and other information to the real world by a computer system. AR enhances a user's view or perception of the world by adding computer generated information to their view.
  • Spatial Augmented Reality (SAR) is a branch of AR research that uses projectors to augment physical objects with computer generated information and graphics. Traditionally, projectors have been used to project information onto purpose built projection screens, or walls. SAR, on the other hand, locates (or projects) information directly onto objects of interest, including moving objects.
  • SAR systems use sensors to develop a three dimensional (3D) model of the world, and typically include tracking systems that enable them to dynamically track movement of real world objects. Such movements or changes are integrated into the 3D model so that updates can be made to projections as objects are moved around.
  • SAR systems have considerable flexibility and scalability over other AR systems.
  • Multiple projectors may be used to provide projections onto multiple objects, or multiple surfaces of an object, and the projections may be of varying size (including very large projections).
  • high resolution projections can also be provided, either by the use of high resolution projectors, or multiple lower resolution projectors each handling different components of the projection to provide a high resolution output.
  • One advantage of SAR systems is that, as the information is projected onto an object (or a surface), the system frees the viewer from having to wear or hold a display device, and the information can be viewed by multiple people at the same time. Users can thus hold physical objects, and make and observe digital changes to the object, and these can be easily communicated to other viewers.
  • a basic SAR system comprises a computing apparatus that provides images (including computer generated animation sequences) for projection by a single projector onto a surface (such as a manikin), and thus the computational system could be a basic SAR system.
  • the computational system may also be a more complex SAR system, such as one that models the SAR environment and comprises multiple projectors, object tracking systems and alignment/calibration systems for projection of multiple images by the multiple projectors with real time compensation for movement of multiple objects within the field of view or SAR environment.
  • the computational system described herein may also be referred to as a SAR system, and covers both the use of basic SAR systems and advanced SAR systems within the medical simulation system.
  • SAR systems comprise a SAR device and a SAR platform for producing a SAR environment.
  • The SAR device is a computing device (ie comprising a processor and a memory), with inputs for receiving data and an output for connection to at least one device for human perception such as the SAR platform.
  • the SAR computing device loads SAR application modules and a SAR engine for receiving the input data and for interfacing between the SAR application modules and the outputs such as the SAR platform.
  • SAR application modules can be developed for simulating a range of medical conditions.
  • the SAR platform is defined as the devices which receive input or generate the SAR output - that is, the actual configuration and layout of devices used to generate the SAR environment and to detect changes or inputs.
  • These may be simple devices such as a keyboard, mouse or video projector which can be directly connected to the computing device, or the input devices may include more complex systems such as an object tracking system that comprises multiple sensors and a separate computing device which processes the sensor input and provides tracking input information to the computing device.
  • the tracking system tracks the manikin, although other objects can also be used.
  • the display of a device can be simulated based on the medical simulation data.
  • an ECG trace or blood oxygen levels could be simulated and projected onto a screen next to the manikin.
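A simulated monitor trace like this could be generated programmatically rather than played back from video. The Python sketch below is purely illustrative (the patent does not specify any signal model): it produces a crude periodic "pulse" waveform, one sharp peak per beat, not a physiologically accurate ECG.

```python
import math

def synthetic_pulse_trace(heart_rate_bpm, duration_s, sample_hz=50):
    """Generate a crude periodic pulse waveform for display on a
    simulated monitor screen. One narrow Gaussian bump per cardiac
    cycle stands in for the R-wave spike; this is a toy signal only."""
    period = 60.0 / heart_rate_bpm          # seconds per beat
    samples = []
    for n in range(int(duration_s * sample_hz)):
        t = n / sample_hz
        phase = (t % period) / period        # 0..1 within the current beat
        samples.append(math.exp(-((phase - 0.5) ** 2) / 0.002))
    return samples

trace = synthetic_pulse_trace(72, 2.0)
print(len(trace), max(trace) > 0.9)  # 100 samples, peak near 1.0
```

The supervisor could vary `heart_rate_bpm` over the scenario (e.g. tachycardia onset) and the projected trace would follow.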
  • the objects and surfaces on which information is to be projected or perceived can be stationary or moving.
  • the connection of the SAR platform, or of individual input and output devices of the SAR platform, to the computing device may use wired or wireless protocols, communications devices or means, including Bluetooth, Wi-Fi, infrared, or other wireless technologies, protocols and means.
  • the SAR system may run autonomously, or be under the control of a simulation supervisor overseeing the medical training simulation.
  • the SAR environment is defined to represent the physical environment within which augmented reality outputs generated by the SAR system are output (or may be output).
  • the SAR environment would be defined by the intrinsic and extrinsic parameters of the projector, such as the range within which an image remains visible (eg lamp power), the range at which individual pixels reach a predefined size limit (projection optics), and the position and orientation of the projector which defines the field of view or pointing limits.
  • the SAR environment may be an interior region of space, such as a portion of a room, an entire room, multiple rooms, a region of exterior space (ie outdoors), or some combination. For example, in the context of a medical simulation this might be a bed of a hospital ward.
  • the input devices and output devices may be located within the SAR environment, or they may be located outside of the SAR environment provided they can produce outputs which are perceptible within the SAR environment. That is, the SAR platform may be completely outside the SAR environment or partially within the SAR environment. In some circumstances the SAR environment may be taken to include the SAR platform and the physical environment within which SAR outputs are perceptible. Similarly, the observers who perceive or sense the outputs, or generate inputs, may be located in the SAR environment or they may be located outside of the SAR environment provided they can perceive the SAR outputs.
  • Referring now to Figure 1, there is shown a schematic view of a spatial augmented reality (SAR) medical training system 1 according to an embodiment.
  • a manikin 2 comprising a head 3, torso 4, arms 5 and legs 6 rests on a bed 7.
  • a SAR system 1 is used to project images and textures onto a manikin and comprises a projector 11, a computing device 1 comprising a processor 17 and memory 18, and a tracking system 19 (although, as noted above, in other embodiments the tracking system may be omitted, in which case the SAR system is simply an appropriately configured computational system).
  • the SAR system is used to project images that include facial features such as pupils 12 and a mouth 14, as well as symptoms such as a red flushed cheek 13, and a rash 15 on the torso. Further, facial features can be altered to indicate symptoms such as dilating the pupils or adjusting the colour of the lips to indicate cyanosis.
  • the images may be projected as a sequence of images to show time varying symptoms or features.
  • a memory in the computational system may store a library of images, or they may be computer generated. Images can be projected onto both the patient (the manikin) and other equipment such as the sheets, bed, floor, or medical equipment props (eg an ultrasound or blood pressure monitor) to simulate measurements.
  • Figures 2A, 2B and 2C show an isometric view 20, a top view 21 and a side view 22, respectively, of a SAR medical training system using ceiling mounted projectors according to an embodiment.
  • the SAR environment is a bed 7 with a manikin 2 in a hospital ward.
  • Three projectors 24, 25, 26 are distributed around the bed and located in the plenum space above a ceiling 27 and provide overlapping projection volumes. This allows projection from different angles to reduce or prevent shadowing effects caused by participants walking around and interacting with the manikin. Further, multiple projectors can be used to generate complex images and textures on the manikin.
  • the plenum space is the space between the true ceiling and a dropped ceiling and is used to house lighting, wiring, air vents, piping etc.
  • a ceiling tile could be modified to house a mount for the projector, and to include a projection aperture for projection of images onto the manikin.
  • This provides a discreet and subtle arrangement, as the only sign of the projectors is the apertures in the ceiling tiles.
  • the mounts can allow the projectors to be mounted at a range of angles, and can be motorised to allow adjustment during simulations, for example in the case of movement of the manikin on the bed.
  • the projectors could be mounted on the ceiling in the case the plenum space was not present or not available for use.
  • a motion tracking system or video recording system can be mounted in the plenum space or directly from the ceiling.
  • Figure 3A is an isometric view 30 of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment.
  • a first projector 31 is located at the head of the bed and a second projector 32 is located at the foot of the bed.
  • Figure 3B is a top view 33 of the embodiment of Figure 3A using projectors at the ends of the bed.
  • Figure 3C is a top view 34 of another embodiment in which four projectors are located one in each corner of the bed.
  • Figure 3D is a top view 35 of another embodiment in which projectors are located along the long edges of the bed (3 per side).
  • Bed mounted projectors can be integrated into the bed, or mounted on the bed.
  • Short throw projectors are used to project textures (and images) at extreme angles so as to prevent shadowing from participants standing over the manikin. Further, this approach has the advantage of saving space and increasing the mobility/freedom of movement of both the manikin and of participants without the creation of shadowing effects, as can occur with ceiling or wall mounted projectors.
  • a wireless pico-projector is installed within the manikin head/neck region. This is then used in conjunction with a periscopic mirror to rear-project up and across onto the manikin's face, or onto a face-mask.
  • the projectors are mounted on tripods or moveable (eg wheeled) stands to provide a readily portable and customisable system which can be used in a range of indoor and outdoor locations (eg simulated accident scenes).
  • a partial manikin could be a shell constructed of semi-translucent material, and the simulation system is used to internally project images onto the interior surface of the shell.
  • Figure 4 is an isometric view 40 of a medical training system using internally mounted projectors according to an embodiment.
  • the projectors are located within the bed 41 and are used to project images onto the interior surface of a manikin shell 42.
  • a first projector 43 is used to project images onto the lower legs, a second projector 44 projects images for the torso and arms, and a third projector 45 is used to project images into the head.
  • the bed can be adjusted from a flat to inclined position without affecting the projection system.
  • This embodiment also avoids the need for a tracking system, as the manikin can be fixed with respect to the bed and the projectors, and as internal projection is used the problem of shadowing is completely eliminated. In this embodiment the projectors need to be capable of extremely short throws, preferably with high resolution, although this can be compensated for by the use of additional lower resolution projectors.
  • portions of the manikin include a display or touch screen (including multi-touch touch screens).
  • the screen is an OLED screen and in another embodiment the OLED screen is a 3D formed (ie curved) OLED screen.
  • an OLED mask is placed on the manikin to represent facial features or medical conditions. In another embodiment, the display screens are permanently embedded in the manikin.
  • the manikin may include one or more touch screens to allow the manikin to become an interactive "smart surface" which can be interrogated through hand-tracking or gesture recognition, to provide additional training information.
  • the manikin may be fixed in place, or be allowed to move.
  • the computational system may be a SAR system including a tracking system used to track the location of the manikin in the SAR environment and with respect to the bed.
  • the SAR computing device updates the images to be projected in response to movement of the manikin, to ensure that projected features are consistently projected at the same locations on the manikin, irrespective of movement of the manikin.
  • Video (normal and IR) or RF based tracking systems may be used, eg the OptiTrack, ARToolKitPlus, and Vicon motion capture systems.
  • the tracking systems may use optical trackers, LED markers, magnetic trackers, fiducial markers, etc embedded in or on the manikin and bed to assist with tracking an object.
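This update step amounts to anchoring each projected feature in the manikin's local coordinate frame and re-deriving its world position from the tracked pose each frame. A real SAR pipeline does this in 3D with full projector calibration; the Python sketch below is a minimal 2D illustration under assumed conventions (pose as x, y, heading in radians; all values invented for the example).

```python
import math

def feature_world_position(feature_local, manikin_pose):
    """Transform a feature anchored on the manikin (local x, y in metres)
    into world coordinates, given a tracked pose (x, y, heading_rad).

    This shows why projected features stay 'attached' to the manikin as
    it moves: the local offset is constant, only the pose changes.
    """
    lx, ly = feature_local
    px, py, theta = manikin_pose
    # Standard 2D rigid-body transform: rotate the local offset, then translate.
    wx = px + lx * math.cos(theta) - ly * math.sin(theta)
    wy = py + lx * math.sin(theta) + ly * math.cos(theta)
    return (wx, wy)

# A rash 0.3 m along the torso axis, with the manikin rotated 90 degrees:
print(feature_world_position((0.3, 0.0), (1.0, 2.0, math.pi / 2)))
```

The projector then maps the world position into its own image coordinates via its calibrated intrinsics/extrinsics, which is the part the calibration procedures described later provide.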
  • the location of the bed, sheets, manikin and people may be tracked and the projection updated in response to movement of tracked objects.
  • the SAR system may compensate for obstruction of projected images due to movement of a tracked object.
  • the system may project a feature from multiple projectors to compensate for obscuration, or a second projector may project a feature if the projection from a first projector is blocked.
  • the simulation could be paused until a person moves out of an obscuring location.
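The fallback behaviour in the preceding points can be sketched as a simple selection policy. This is an illustrative sketch, not the patent's method: the projector names, the visibility map, and the pause-when-all-blocked policy are all assumptions for the example.

```python
def choose_projector(feature_id, line_of_sight):
    """Pick the first projector with a clear line of sight to a feature.

    `line_of_sight` maps projector name -> True when tracked participants
    do not block that projector's view of the feature (as determined by
    the tracking system). Returns None when every view is blocked, in
    which case a system might pause the simulation until someone moves.
    """
    for projector, clear in line_of_sight.items():
        if clear:
            return projector
    return None

visibility = {"ceiling_north": False, "ceiling_east": True, "ceiling_west": True}
print(choose_projector("rash_torso", visibility))  # ceiling_east
```

A blended variant would weight contributions from all unblocked projectors rather than switching to a single one, trading seams for brightness.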
  • the projectors are controlled by the SAR computing device (or more generally a computing device) and can be connected to the SAR computing device using wired or wireless communication links (eg an IR or RF based protocol such as WiFi, Bluetooth or an IEEE 802.11 based protocol).
  • Projectors can be mains or battery powered.
  • Calibration can be performed periodically, as part of a pre-simulation initialisation procedure, or during the simulation.
  • Multiple projector calibration such as that described in AU2014200683, titled "Method And Apparatus For Calibration Of Multiple Projector Systems" and filed on 7 February 2014, may be used.
  • the calibration process may be performed or assisted using calibration devices including light sensors and/or LEDs that are located on the bed or the manikin, such as those described in AU2014200683.
  • the computing device is used to project images onto the manikins to simulate a patient and symptoms.
  • the computing device may be a SAR computing device and may run various generic or supporting SAR computing modules related to calibration, tracking, image updating, etc, as well as medical simulation related SAR computing modules developed to support medical simulations.
  • the SAR system, including the development of SAR modules may be based on the SAR system described in
  • the computing device may also be used to control the overall medical simulation, or the computing device may be in communication with, or under the control of, another computing device which is used to control the simulation.
  • a simulation supervisor may use an interface to control or configure the medical simulation.
  • the simulation supervisor may directly supervise a medical simulation and control the progression of symptoms, as well as recovery in the case of correct treatment, as well as directly interacting with participants (including acting as the patient).
  • the simulation supervisor can simply configure or program simulations, with the simulation being controlled autonomously by appropriate software that follows the preconfigured script, and which can respond to treatments initiated by participants.
  • the projected images are used to simulate medical conditions, and thus act as visual indications to the participants in the simulation.
  • For ease of reference we will refer to the participants in the simulation as students.
  • the projected images may be a sequence of images corresponding to visual indications of different stages or symptoms of a simulated medical condition. This can be used to simulate the development of a simulated medical condition. Images can be projected onto both the patient (the manikin) and other equipment such as the sheets, bed, floor, or medical equipment props (eg an ultrasound or blood pressure monitor) to simulate measurements.
  • blood or other body fluids and body tissues can be projected onto the manikin and also sheets and other adjacent surfaces. This avoids the mess associated with using "fake blood" in training scenarios, which can be a big time consuming problem in clean-up and re-use of a simulation facility.
  • the images can be updated based on treatment performed by the participants. In this way the simulation can be dynamic and interactive.
  • a wide range of symptoms and medical conditions can be simulated by projecting appropriate images onto a manikin. Examples of simulated conditions are listed in Table 1. This can be used to generally support medical training for doctors, nurses and other medical staff. Additionally, specific modules can be created for specialised training. For example, a trauma module for simulating trauma conditions could be modelled. Most undergraduate training either does not cover trauma diagnosis and treatment, or only covers such areas superficially. A trauma simulation would fill a need in this area. Another specialised module is for obstetrics/antenatal simulation or more generally abdominal/pelvic pathology simulation to simulate pregnancy complications and trauma, and to display anatomy, physiology or pathology. Another specialised module is a neurological module for simulating neurological conditions.
  • cranial nerves are at the base of the brain, and are responsible for a lot of facial and sensory functions. Medical students are often required to perform cranial nerve examinations as part of their medical training, but in almost all cases the patient is healthy as it is near impossible for a person to fake the signs of a neurological condition, and thus students simply go through the motions of examination and no assessment is possible.
  • a medical simulator including a SAR medical simulator can be used to project the often subtle and nuanced neurological symptoms onto a manikin to enable realistic examination and assessment of such conditions.
  • Rashes such as anaphylaxis (urticaria or hives), and infectious rashes (eg meningococcus)
  • Trauma: bruises, penetrating wounds
  • Images can be used to simulate a wide range of skin textures and features onto a plain manikin. This can be used to simulate a desired age, ethnicity, and sex for a patient, and could be changed from simulation to simulation. For example, characteristics such as skin colour, age and sun effects including smooth young skin, old wrinkly skin, sun spots, freckles, moles, sun burn, ethnic features, and colourings (eg Asian eye shapes, red head) can all be controlled by selection of images. Often factors such as age, ethnicity and sex will influence the symptoms of a medical condition, as some combinations make it harder to identify and thus diagnose conditions.
  • a manikin with a range of body types and sizes can be used. Most manikins are made of plastic and represent white males of approximately 25 years of age, and are thus difficult to modify to simulate other body types and ages. As the complexity of symptoms and appearance is handled by the SAR system, the manikins can be cheaply manufactured with a range of desired shapes, such as children, females including pregnant females, thin patients, obese and morbidly obese patients, etc.
  • a manikin is obtained by forming a 3D shape based upon a computer model.
  • the computer model could be obtained from a body scan of a real person or be artificially generated using simulation software, or be a hybrid (eg modified from a body scan).
  • the manikin can then be manufactured by quickly cutting out a 3D manikin from a block of foam or plastic such as by using a CNC machine, or produced from a computer model using a 3D printer.
  • Figure 5A shows an isometric view 50 of a child 51, an obese person 52, and a female 53.
  • the manikin could be manufactured using a casting or moulding technique.
  • partial manikins can be produced, such as just a head, just a torso, an upper body comprising a head, torso and upper arms, or a lower body comprising the abdomen/pelvic area and legs.
  • Forming manikins in these ways provides a cheap way of generating manikins with a range of body shapes and sizes that can be used in the simulation system. Additionally, flat or slightly curved projection regions can be provided, for example on the face, as the projection system(s) can be used to project detailed facial features. Forming the manikins in this way provides a cheap source of manikins with a range of body types for the medical simulation system, thus enhancing the flexibility and cost effectiveness of the system.
  • an existing manikin is retrofitted with a de-featured cover, mask or surface to hide existing manikin features and to provide a clear (or clean) surface for projection of features/medical conditions by a projector.
  • the bottom surface or portion of the cover may be contoured to match manikin facial features/structure, or it may be deformable to mould to the manikin features/structure. The top surface may be a flat surface or slightly curved surface (ie substantially flat in locations of facial features) to allow projection of a wide range of features.
  • the mask may have basic structural features, or regions such as a nose, cheek structure, eye sockets and chin, but allow considerable fine control of presentation of specific features such as size/shape of eyes, mouth, pigmentation, as well as medical symptoms.
  • the mask may also include a wig or other related materials or features which are difficult to recreate through projection means alone.
  • the ears are utilised as a means of attaching the mask to the manikin head.
  • Figure 5B is a top view 54 of retrofitting an existing manikin 2 by placing a de-featured mask 55 over the head 4 of manikin 2 to hide facial features.
  • Figure 5C shows a top view 56 of the retrofitted manikin with the de-featured mask 55 providing a clean facial surface for projection of features or medical conditions by a projector.
  • Figure 5D shows a top view 57 of the projection of an injured face 58 by a projector 10 onto the clean projection surface provided by de-featured mask 55.
  • a cover, or covers, could be placed on other regions/body parts (eg torso, breasts, genitals, arms, legs) to alter the default manikin features, for example to add fat to torso/arms/legs to simulate obesity, to add/hide gender specific features, etc.
  • [0066] Whilst the simulation system facilitates the use of low fidelity custom manikins, it is to be understood that the simulation system can also be used to project images onto medium and high fidelity manikins.
  • a medium or high fidelity manikin that mechanically supports gross movements such as a head moving, or chest movements due to breathing, or a beating heart
  • SAR based simulation systems can be used to track the moving surface and provide realistic textures that move with the surface or which are used to enhance the mechanical movement
  • the manikin may simulate some physical movements whilst computer generated images are used to simulate other movements.
  • a manikin could mechanically move the chest to simulate breathing, whilst the simulation system is used to project images to simulate a heartbeat, such as by projecting different textures onto the moving surface (or vice versa).
  • [0067] Body and facial appearance, surface textures and symptoms, and dynamic effects can be computer generated, obtained from an image library or database, including images from real patients and symptoms, or obtained from body scans or image software.
  • Gaming engines, used for generating textures of characters in computer games, could be adapted to generate images for projection onto the manikin.
  • Figure 6A is an isometric view of the projection of skin texture onto the torso of a grey manikin along with a simulated rash symptom 61.
  • the simulated effects can be dynamic, so that they evolve over time, or change in response to student actions or requests. For example, a student could ask a manikin to open their mouth, in which case the simulation system could then project an open oral cavity onto the manikin showing the pathology of the tonsils, tongue and teeth.
  • the simulation system could also dynamically change the image based upon tracking the student's or manikin's head. For example, if the student moves their head to one side to get a better view of the interior of the oral cavity, the simulation system could track the student's head, and dynamically alter the projected image based upon the estimated viewing angle of the student relative to the manikin.
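One way the view-dependent projection described above might work is to estimate the student's horizontal viewing angle around the manikin from the tracked head positions and select the nearest pre-rendered view. This is a sketch under assumptions: the angle bins, image names and 3D coordinate convention below are illustrative, not the patented method.

```python
import math

def viewing_angle_deg(student_pos, manikin_pos):
    """Angle of the student around the manikin in the horizontal (x, y) plane."""
    dx = student_pos[0] - manikin_pos[0]
    dy = student_pos[1] - manikin_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def select_view(angle_deg, views):
    """views maps a centre angle (degrees) to an image; choose the nearest centre,
    measuring distance around the circle so 350 degrees is close to 0 degrees."""
    def angular_dist(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return views[min(views, key=lambda c: angular_dist(c, angle_deg))]

# Hypothetical pre-rendered views of the open oral cavity.
views = {0.0: "oral_front.png", 45.0: "oral_right.png", 315.0: "oral_left.png"}
```

As the head tracker updates, `select_view(viewing_angle_deg(head, manikin), views)` would be re-evaluated and the projected image swapped when the nearest view changes.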
  • the manikins can also be instrumented with one or more embedded subsystems.
  • the embedded subsystems can be used to simulate a medical condition or body noise (eg breathing, heart beat) and can be coordinated with the projected images. In one embodiment the manikin includes an aural user interface including one or more speakers which are used to generate aural outputs. These can allow the simulation controller to speak for the manikin, or allow pre-recorded or computer generated sounds or speech to be played.
  • the speaker may also be used to simulate one or more aural symptoms of the simulated medical condition, such as heart sounds, left/right lung sounds and abdomen sounds.
  • the aural symptoms can be coordinated with projected images and visual symptoms and effects.
  • the manikin can also include a microphone to receive aural input (eg questions) from a participant or to detect procedures performed by the participants.
  • the microphone can be internal and/or external. The microphone input can be provided to a simulation supervisor who can verbally respond (eg using the speakers) or select an appropriate prerecorded or computer generated response.
  • the microphone input could be provided to a speech recognition engine (or module) running either on a computing device embedded in the manikin or local environment (eg under a bed the manikin is on), or on a remote computing device (eg in a control room, remote server or even in the cloud).
  • the received speech can then be interpreted, for example using a natural language processing module, and an appropriate response can be provided via a speaker in the manikin.
  • a student could ask if a rash is itchy, and the manikin could respond with a response such as yes or no.
  • a student could ask the manikin to rate the pain in a range from 1 to 10, and an appropriate value could be provided based upon the underlying medical condition, eg if 10 is high, the manikin could respond with a value of 8 for a serious condition such as an intestinal obstruction.
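The question-and-answer behaviour described in the preceding points could be sketched as a keyword-rule lookup applied to the recognised speech. The condition, keywords and canned answers below are illustrative assumptions; a real system might use the natural language processing module mentioned above rather than keyword matching.

```python
# Hypothetical keyword rules per simulated condition: the first rule whose
# keywords all appear in the recognised text supplies the scripted answer.
RESPONSES = {
    "appendicitis": [
        (("itchy",),        "no"),
        (("pain", "rate"),  "8"),
        (("pain", "where"), "my lower right side"),
    ],
}

def respond(condition, recognised_text):
    """Return the scripted response whose keywords all appear in the text,
    falling back to a neutral answer when nothing matches."""
    text = recognised_text.lower()
    for keywords, answer in RESPONSES.get(condition, []):
        if all(k in text for k in keywords):
            return answer
    return "I'm not sure"
```

The returned string would then be spoken via speech synthesis or used to pick a pre-recorded clip for the manikin's speaker.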
  • the simulation system could use the aural user interface to respond to both questions and tactile inputs from the student. For example, if the manikin is simulating appendicitis, then the student could push or prod the manikin's torso or abdomen in several different locations and each time ask if the pushing hurts. Where the manikin is pushed in an area not linked to the inflamed appendix, the manikin could provide a low pain response, but when an appropriate location is pushed a strong pain reaction could be provided including a pained facial response (eg gritted teeth, closed eyes, etc), along with appropriate aural sounds such as a cry or groan. If gross movement of the manikin is possible then this could also be performed, eg flinching, or attempting to move the abdomen away from the pushing.
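The tactile pain response described above can be illustrated as a region test on the sensed touch position. The appendix region coordinates, asset names and pain values below are hypothetical placeholders, not values from the source.

```python
# Hypothetical region linked to the inflamed appendix, in normalised torso
# coordinates: (x_min, x_max, y_min, y_max).
APPENDIX_REGION = (0.6, 0.8, 0.2, 0.4)

def touch_response(x, y):
    """Map a touch position to a facial image, sound and pain rating:
    a strong reaction inside the appendix region, a mild one elsewhere."""
    x_min, x_max, y_min, y_max = APPENDIX_REGION
    if x_min <= x <= x_max and y_min <= y <= y_max:
        return {"face": "face_pained.png", "sound": "groan.wav", "pain": 9}
    return {"face": "face_neutral.png", "sound": None, "pain": 2}
```

The selected face image would be projected while the sound plays through the manikin's speakers, coordinating the visual and aural reaction.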
  • a relatively cheap standardised patient simulation system can be developed in which rear (internal) or front projection is used to project static or animated medical conditions and/or feature data onto the manikin's face.
  • the manikin can talk (via computer animated images and audio) and respond to established scenarios through the application of programmed intelligence or via instructor input.
  • This system utilises a basic SAR system such as a single projector, computer, and speakers/microphone but can provide an enhanced simulation experience at relatively low cost.
  • [0071] Other subsystems that can be included are basic airways, blinking, sweating, crying, IV/blood flow features (pulse, blood pressure, etc), and bladder/catheter interaction. These may be simple or complex systems depending upon the simulation requirements.
  • a modular approach could be used, in which cavities in the manikin are created to receive modular units such as a hollowed out airway unit for use in anaesthesia and emergency room simulations. This allows generation of cheap manikins that can be used for simulation of a wide range of conditions, but which provide the capability for upgrading or retrofitting of sophisticated simulation modules for simulating specific conditions or scenarios.
  • an actor could be used in place of a manikin, and a projection or SAR system used to project images of medical conditions or data onto the actor.
  • the actor would be briefed about the medical condition they have, and how to respond to questions or examination, and then the images of the medical condition would be projected onto the actor's body.
  • images simulating a wound and blood could be projected onto the actor's body or clothes, and the images updated as the examination continues, for example as clothing is removed, or to simulate a change in the condition such as fresh bleeding from a wound.
  • a SAR system could be used to track the actor's location and the actor could wear fiducial markers to assist with the tracking system.
  • training information can easily be projected onto a manikin 2. This can be used to guide a student on how to perform a new procedure, and/or to indicate what symptoms they can expect to encounter with a new procedure. In this embodiment, speakers 62 are projected onto the manikin along with text 63.
  • the speaker images indicate the location to listen for a symptom and the text can alert the student on what the symptom is. For example, in this embodiment the speaker on the right side of the manikin's chest has the text "Right Lung Sounds" to indicate to the student what lung sounds they should expect to hear.
  • these sounds may be played when the user touches that location, for example with a modified stethoscope that includes speakers in the ear pieces.
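The location-triggered auscultation described above could be sketched as a nearest-site lookup on the tracked position of the stethoscope head. The site coordinates, distance threshold and sound file names are illustrative assumptions.

```python
import math

# Hypothetical labelled auscultation sites in normalised chest coordinates,
# each with the sound to play in the modified stethoscope's earpieces.
SITES = {
    "right_lung": ((0.35, 0.40), "right_lung.wav"),
    "left_lung":  ((0.65, 0.40), "left_lung.wav"),
    "heart":      ((0.55, 0.50), "heart.wav"),
}

def sound_at(pos, threshold=0.08):
    """Return the sound file for the nearest site within threshold, else None."""
    best, best_d = None, threshold
    for centre, wav in SITES.values():
        d = math.dist(pos, centre)
        if d < best_d:
            best, best_d = wav, d
    return best
```

When `sound_at` returns `None` the student hears nothing, mirroring placing a real stethoscope away from any sound source.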
  • internal information 64 such as organs, muscles, circulatory and lymphatic systems can be projected onto the manikin. This can be used to assist in training to indicate the internal state of the organs associated with observed symptoms or a medical condition the student is learning about (or has diagnosed). For example, if the symptoms are associated with a rupturing appendix, then internal images could be shown to reinforce the association between the symptoms and root cause.
  • Projection of training or internal information can thus be used as part of a training or learning exercise to create an interactive interface, in which the participants are instructed or guided by the additional information projected onto the surface of the manikin.
  • a student can be guided through a medical diagnosis by the SAR system, which can present symptoms of a medical condition, including how symptoms change and how further symptoms develop as a condition progresses. This could be performed semi-automatically by software under the guidance of a simulation controller who could be supervising several students at once with the assistance of the software, or the process could be fully automated or controlled by the simulation software.
  • Instructional information, for example on what stage of the condition the symptoms represent, can be provided along with information on appropriate tests to perform. This can be supplemented with internal images of how the condition is developing.
  • instructional and internal information can be provided as feedback to a student either during or after a medical diagnosis simulation.
  • the system could be used to make a recording of the simulation.
  • a video recording of the simulation environment could be stored along with additional simulation data from the computational system relating to the patient, such as what symptoms were being displayed and when and how they developed.
  • the video and associated simulation data could then be played back to the students in a post briefing analysis to provide an improved learning tool.
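The synchronised recording described above might be sketched as an event log timestamped relative to the start of the video recording, so symptom events can be replayed alongside the footage in debriefing. The event fields and the injectable clock are assumptions for illustration.

```python
import json
import time

class SimulationLog:
    """Record symptom events with timestamps relative to the moment the
    video recording (and this log) started."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._t0 = clock()
        self.events = []

    def log(self, symptom, detail):
        self.events.append({"t": self._clock() - self._t0,
                            "symptom": symptom, "detail": detail})

    def events_between(self, t_start, t_end):
        """Events within a video interval, for re-enacting a specific sequence."""
        return [e for e in self.events if t_start <= e["t"] <= t_end]

    def dump(self):
        """Serialise the log for storage alongside the video file."""
        return json.dumps(self.events)
```

During review, seeking the video to a time window and calling `events_between` would recover which symptoms were being displayed at that point.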
  • the students and instructors could review the simulation, and specific sequences could be re-enacted and supplemented with instructional information presented onto the manikins.
  • the simulation system could also be used more widely than just medical simulations.
  • the simulation system could be used by police to create a three dimensional representation of a suspect, crime scene or missing person to assist in criminal investigation.
  • Rather than just a facial "sketch", a whole three dimensional clothed person, to the correct height, body type, and body shape, could be produced.
  • a computer sketch or computer generated image, including an image generated from one or more images of the suspect, could then be projected onto the manikin.
  • Such a simulation system could take the recognition process to the next level for identification purposes.
  • the simulation system described herein could also be extended to other applications such as in store manikins, or movie or theatrical productions, for example to allow use of a wider range of manikin body types, which can then be augmented with a range of facial features and textures to create more realistic manikins.
  • a shopper's facial features and skin texture could be projected onto a manikin wearing clothing for sale.
  • the computing device or apparatus may be configured to display images or a sequence of images according to a script or under the control of a simulation supervisor who may trigger specific sequences or images, or the selection of images may be automated and respond to actions by persons interacting with the manikin. This may be manually controlled by a supervisor who monitors the simulation.
  • More advanced systems may include embedded speakers and microphones, to allow the supervisor to respond as the patient or play back prerecorded patient responses or aural symptoms (eg groans, wheezing), or voice recognition and speech synthesis may be used.
  • Computer animations may also be displayed that are synchronised with the audio playback to reinforce the simulation of the manikin speaking or responding (eg mouth movements, grimacing when groaning, etc).
  • Sophisticated systems may include multiple projectors and projector calibration systems, object tracking systems, instrumented manikins, and one or more high performance computers for modelling, generating and controlling the simulation environment such as analysing input and generating appropriate output including rendering images and simulating symptoms.
  • computational systems may also include SAR systems.
  • SAR system is defined broadly to include relatively basic SAR systems as well as sophisticated SAR systems.
  • a basic SAR system comprises a computing device that provides images (including computer generated animation sequences) for projection by a single projector onto a surface such as a manikin, and can thus be broadly described simply as a computational system.
  • a computational standardised patient manikin is used to replace an actor in a medical (or other) simulation system.
  • the manikin incorporates at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers.
  • a computational system is configured to display one or more images on the at least one surface and to provide one or more audio outputs to simulate a patient.
  • the manikin's face comprises a display screen, such as a 3D formed OLED screen or other LED, LCD, or flat or curved panel display system.
  • the body may also include touch sensitive surfaces and additional display screens.
  • front or rear (internal) projectors are used to project static or animated medical condition and/or feature data onto the manikin's face or body.
  • the manikin is configured to talk or generate audio outputs using speech synthesis and computer animated movements and/or pre-recorded audio.
  • the manikin may be configured to respond to established scenarios through the application of programmed intelligence or configured to respond using audio input from a supervisor using a microphone, and directing the microphone's audio signal to the speakers.
  • Touch screens or other touch sensitive surfaces can be used to track hand movements and patient examinations, or a touch may reveal additional training information, for example displaying an internal representation (eg blood supply or muscular-skeletal) of the body region.
  • a standardised manikin head is provided (ie no body) using a display screen (eg 3D OLED) or front or rear projection system.
  • the entire manikin is a 3D formed OLED or similar display technology.
  • the standardised patient manikin could be programmed or configured to respond to a specific set of scenarios or be more generally used to simulate a patient response in a simulation, and in particular to simulate scenarios where actors have previously been used. This would enable a finer degree of standardisation in training scenarios where actors are used, particularly for neurological or complex medical scenarios.
  • the standardised patient manikin could be a stand alone manikin containing a computing system or in communication with a computing system, or be part of a more elaborate SAR system including multiple projectors and object tracking systems.
  • the standardised patient manikin could thus be used to provide a method for simulating a patient by obtaining (including constructing) a standardised patient manikin and then displaying one or more images on the at least one surface and providing one or more audio outputs to simulate a patient using a computational system comprising at least one computing device and at least one display device.
  • a computing device could be configured with suitable software modules to act as a simulation controller and comprise a display device, a processor and a memory, and one or more input devices.
  • the computing device acting as the simulation controller could also be used to execute SAR modules (ie the simulation controller is also the SAR computing device) or it may be in communication with a SAR system and SAR computing device in such a system.
  • the memory may comprise instructions to cause the processor to execute a method described herein (and thus the processor may be configured to execute the instructions). These instructions may be stored as computer codes, which are loaded and executed.
  • the processor, memory and display device may be included in a standard computing device, such as a desktop computer, a portable computing device such as a laptop computer or tablet, or they may be included in a customised device or system.
  • the computing device may be a unitary computing or programmable device, or a distributed device comprising several components operatively (or functionally) connected.
  • An embodiment of a computing apparatus or computing device 100 is illustrated in Figure 7 and comprises a central processing unit (CPU) 110, a memory 120, a display apparatus 130, and may include an input device 140 such as keyboard, mouse, microphone etc.
  • the CPU 110 comprises an Input/Output Interface 112, an Arithmetic and Logic Unit (ALU) 114 and a Control Unit and Program Counter element 116 which is in communication with input and output devices (eg input device 140 and display apparatus 130) through the Input/Output Interface.
  • the Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (eg TCP/IP, Bluetooth, WiFi, etc).
  • the display apparatus may comprise a flat screen display (eg LCD, LED, OLED, plasma, touch screen, etc), a projector, CRT, etc.
  • the computing device may comprise a single CPU (core) or multiple CPUs.
  • the computing apparatus may use a parallel processor, a vector processor, or be a distributed computing device.
  • the memory is operatively coupled to the processor(s) and may comprise RAM and ROM components, and may be provided within or external to the device.
  • the memory may be used to store the operating system and additional software modules that can be loaded and executed by the processor(s). These may include modules to configure the processor to implement aspects of the methods described herein.
  • the medical training system described herein is able, with high fidelity, to visually simulate patients with a broad range of medical conditions and visual diagnosis cues such as (but not limited to): skin pallor and general colouration; pupil dilation; blinking; rashes; sweating; and traumatic conditions.
  • a number of additional visual characteristics or patient types can be manipulated such as: sex; age; race; body type; and body size. In addition to allowing training on the wide range of patient types clinicians can expect to encounter, these factors often have additional diagnostic implications and thus enable a more thorough medical diagnostic procedure to be performed.
  • simulations can be provided for neurological conditions, and other conditions which current systems are either not capable of simulating, or for which they are only able to provide very rudimentary or poor simulations.
  • the medical training system can be used to address a number of existing problems with current medical training systems using manikins.
  • the use of simple manikins with realistic skin textures and dynamic symptoms provided by the simulation system allows the development of a low cost but highly realistic and immersive simulation environment.
  • the system can address significant audio issues which plague current systems by removing redundant mechanical subsystems such as breathing subsystems to reduce unwanted sounds. Such functionality can be replaced by embedded speakers and dynamic imagery simulating movement of the chest.
  • the system can also address the lack of visual detail and human factors by using a simulation system to display high quality and realistic textures onto a manikin.
  • the system can also be used to dynamically simulate medical procedures by providing a level of interaction with the system, changing visual and aural symptoms such as pupil dilation, colouration, pulse, etc, reflecting the progression of a medical condition, as well as responses to treatment, to provide an interactive simulation experience. Further, greater immersion can be achieved by using more realistic human shapes (eg a range of manikins) and audio relays to provide realistic sounds.
  • information and signals may be represented using any of a variety of technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
  • processing may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, or other electronic units designed to perform the functions described herein, or a combination thereof.
  • Software modules, also known as computer programs, computer codes, or instructions, may contain a number of source code or object code segments or instructions, and may reside in any computer readable medium such as a RAM memory, flash memory, ROM memory, EPROM memory, registers, hard disk, a removable disk, a CD-ROM, a DVD-ROM or any other form of computer readable medium.
  • the computer readable medium may be integral to the processor.
  • the processor and the computer readable medium may reside in an ASIC or related device.
  • the software codes may be stored in a memory unit and executed by a processor.
  • the memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Pure & Applied Mathematics (AREA)
  • Medical Informatics (AREA)
  • Medicinal Chemistry (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A medical training simulation system using a Human Patient Simulation Manikin is described. To increase realism a computational system comprising a computer and a projector (e.g. a basic spatial augmented reality (SAR) system) is used to project images onto the manikin to simulate a medical condition. The computational system may also be a sophisticated SAR system with multiple projectors and object tracking systems. The projected images can be used to simulate a range of patient body types (age, sex, ethnicity, etc) as well as a range of symptoms, including time varying symptoms. A range of manikins of different sizes can also be provided or formed using a range of materials, and projection can be internal or external. Internal subsystems such as speaker systems to replicate internal symptoms can also be included. Additional training information and play back facilities can also be provided to assist with learning outcomes.

Description

A MEDICAL TRAINING SIMULATION SYSTEM AND METHOD
PRIORITY DOCUMENTS

[0001] The present application claims priority from Australian Provisional Patent Application No. 2013903338 titled "Spatial Augmented Reality (SAR) Medical Training System and Method" and filed on 2 September 2013, the content of which is hereby incorporated by reference in its entirety.
INCORPORATION BY REFERENCE
[0002] The following co-pending patent applications are referred to in the following description:
PCT/AU2013/000952 titled "Spatial Augmented Reality (SAR) Application Development System" and filed on 27 August 2013; and
AU2014200683 titled "Method And Apparatus For Calibration Of Multiple Projector Systems" filed on 7 February 2014.
The content of each of these applications is hereby incorporated by reference in their entirety.

TECHNICAL FIELD

[0003] The present invention relates to medical training simulators. In a particular form, the present invention relates to medical training systems using mannequins to simulate a patient.
BACKGROUND
[0004] The process of making a medical diagnosis is a cognitive process, where a clinician uses several sources of sensory observations, which are combined to form a diagnostic impression (a medical diagnosis). Medical training simulation systems are used to allow a healthcare provider (medical student, doctor, nurse, paramedic etc.) to follow the process of attempting to determine or identify the possible disease or disorder (a medical diagnostic procedure), and to reach an opinion by this process (a medical diagnostic opinion).
[0005] Medical training simulation systems typically use a Human Patient Simulation Manikin (HPSM, or simply manikin, or alternatively mannequin), which is placed in a clinical setting such as a simulated hospital ward, emergency room, or mock incident location to allow training to be performed in a realistic and immersive environment with multiple people and standard medical equipment available. Such systems enable both trainee and experienced clinicians to develop the skills to identify, understand and combat life threatening situations that may occur in a clinical situation before they are actually faced with these situations. Further, whilst students often have the theoretical knowledge of how to deal with these situations, by physically acting out the procedures with a responsive (eg breathing, talking) patient, the student can develop the practical skills and psychological preparedness required for real clinical situations.
[0006] However, currently available HPSMs are mechanically based simulation manikins, which are extremely limited in their range of patient types and simulation capabilities. These can be divided into low, medium and high fidelity manikins. High fidelity manikins attempt to provide a lifelike patient which can respond to user interventions (either manually or remotely) and can contain over 20 different mechanical and electrical subsystems such as pupil dilation, blue LEDs around the lips to simulate cyanosis, and separate audio systems for heart, lungs, vocals and bowels. However, such systems are expensive, with initial costs of $20,000 and added expense for installation, upkeep of disposable parts, operator training, etc. Medium fidelity manikins have fewer features and less realistic responses, are cheaper, and tend to focus on just one medical issue, or don't cover certain medical issues. Low fidelity manikins are often little more than dumb solids, and include simple features such as a basic airway and lungs to allow CPR training.
[0007] Scholarly research papers have shown statistical increases in performance of students in training scenarios, especially when high fidelity manikins are used. However, such studies have also highlighted a number of deficiencies. For example, some studies have concluded that the benefits of high fidelity manikins are outweighed by the financial drain on the education facility in purchasing and supporting the manikins. Additionally, the manikin's body type is fixed, and is typically a white muscular approximately 25 year old male. As doctors are expected to diagnose and treat patients of all ages, sexes, races, and a range of body weights (including obese and morbidly obese), this body type is not very representative of the range of patients they will be expected to treat. Further, as the manikins are constructed of a dull plastic they lack a human look and feel, as well as mechanisms to imply different and changing skin tones, bruising, rashes, or boil-like symptoms. Such visual clues are often important when assessing a patient and making a medical diagnosis. They also use mechanical solutions to show effects such as tears, pupil dilation, and chest movement resulting from breathing. As a result the manikins lack realism, and simulation coordinators resort to techniques such as applying makeup and wigs to represent different patient types and symptoms, as well as resorting to directly informing the participants of what the makeup is supposed to represent. The mechanical subsystems also create unintentional sounds such as faint creaking and squeaking which interfere with audio based diagnosis, and can even lead to misdiagnosis as some students interpret these sounds as though they were real medical symptoms (eg creaking in the chest).
[0008] There is thus a need to provide medical training simulation systems using improved Human Patient Simulation Manikins that are more realistic and humanlike, and without effects such as unintentional sounds, so as to increase user immersion and improve learning quality.
SUMMARY
[0009] According to a first aspect of the present invention, there is provided a method for simulating a medical condition, the method comprising:
obtaining a manikin; and
projecting one or more images onto at least a portion of the manikin to simulate a medical condition using a computational system comprising at least one computing device and at least one projector.
[0010] In one form, the one or more images are visual indications of a simulated medical condition. These images may be a sequence of images corresponding to visual indications of different stages or symptoms of a simulated medical condition. In one form, the medical condition is a trauma condition. In one form, the medical condition is a neurological condition.
[0011] In one form, the method further comprises selecting an age, body size, ethnicity and sex for a simulated patient, and projecting one or more images corresponding to the selected age, body size, ethnicity and sex onto the manikin. In one form, the manikin is obtained by forming a 3D shape based upon a computer model. In one form, the manikin is a standard manikin fitted with a de-featured face mask to hide existing manikin features and to provide a clean surface for projection of the one or more images by the one or more projectors.
[0012] In one form, the manikin includes one or more speakers, and the method further comprises generating one or more aural outputs from the speakers in the manikin. In one form, the aural outputs are used to simulate one or more aural symptoms, and the aural symptoms are coordinated with the projected images.
[0013] In one form, the manikin includes a microphone and the method further comprises receiving aural input from a user. In one form, the method further comprises providing the aural input to a simulation supervisor.
[0014] In one form, the manikin includes one or more embedded subsystems, and the method further comprises simulating one or more symptoms of the simulated medical condition using at least one embedded subsystem, wherein the symptoms are coordinated with the one or more projected images.
[0015] In one form, the computational system may also comprise one or more tracking systems for tracking at least the manikin. The computational system may be used to project images onto other objects or to track other objects besides the manikin. In one form, the one or more projectors are ceiling mounted projectors. In one form, the manikin is located on a bed, and the one or more projectors are mounted in or on the bed. In one form, the manikin is a shell constructed of semi-translucent material and the computational system internally projects the one or more images onto the interior surface of the shell. In one form, the computational system is a SAR system comprising one or more projectors for projecting one or more images into a SAR environment which includes at least a portion of the manikin.
[0016] In one form, the method further comprises obtaining a video capture of the simulation, and storing simulation data to allow for later review of the medical simulation. In one form, the method further comprises projecting treatment information to instruct a user on an appropriate procedure to treat the simulated medical condition. In one form, the method further comprises projecting one or more images representing an internal state of the simulated patient.
[0017] In one aspect, there is provided a computer readable medium comprising computer executable instructions for performing the method of the first aspect. In another aspect, there is provided a computational apparatus comprising a processor and a memory, the processor configured to perform the method of the first aspect.
[0018] According to another aspect of the present invention, there is provided a medical training system, comprising:
at least one manikin; and
a computational system comprising:
one or more projectors for projecting one or more images onto at least a portion of the manikin.
[0019] In one form, the computational system comprises one or more tracking systems for tracking at least the manikin. In one form, the computational system is a SAR system, and the at least one projector is for projecting the one or more images into a SAR environment, the SAR environment comprising at least a portion of the manikin.
[0020] According to another aspect of the present invention, there is provided a method for simulating a patient, the method comprising:
obtaining a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers; and
displaying one or more images on the at least one surface and providing one or more audio outputs to simulate a patient using a computational system comprising at least one computing device and at least one display device.
[0021] According to another aspect of the present invention, there is provided a simulation system comprising:
a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers; and
a computational system comprising at least one processor and a memory, the at least one processor configured to display one or more images on the at least one surface and to provide one or more audio outputs to simulate a patient.
[0022] The above method and simulation system may use a standardised patient manikin to replace an actor in a simulation system. In one form, the at least one display device comprises at least one OLED based display or flat panel display device. In one form, the computational system is configured to computationally generate or play back a pre-recorded audio signal and to output the audio signal via the one or more speakers to simulate a patient response in a simulation. In one form, the computational system further comprises a microphone and the computational system is configured to output the audio signal via the one or more speakers to simulate a patient response in a simulation. In one form, the manikin further comprises one or more touch sensitive surfaces for receiving user input, and the computational system is configured to select or generate one or more images and one or more audio outputs based upon the received user input.
BRIEF DESCRIPTION OF DRAWINGS
[0023] Embodiments of the present invention will be discussed with reference to the accompanying drawings wherein:
[0024] Figure 1 is a schematic view of a spatial augmented reality (SAR) medical training system according to an embodiment;
[0025] Figure 2A is an isometric view of a spatial augmented reality (SAR) medical training system using ceiling mounted projectors according to an embodiment;
[0026] Figure 2B is a top view of the embodiment shown in Figure 2A;
[0027] Figure 2C is a side view of the embodiment shown in Figure 2A;
[0028] Figure 3A is an isometric view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment;
[0029] Figure 3B is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment;
[0030] Figure 3C is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to another embodiment;
[0031] Figure 3D is a top view of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to another embodiment;
[0032] Figure 4 is an isometric view of a medical training system using internally mounted projectors according to an embodiment;
[0033] Figure 5A is an isometric view of a range of manikins of different sizes according to an embodiment;
[0034] Figure 5B is a top view of retrofitting an existing manikin by placing a de-featured mask over the head of the manikin to hide facial features according to an embodiment;
[0035] Figure 5C shows a top view of the manikin of Figure 5B retrofitted with a de-featured mask according to an embodiment;
[0036] Figure 5D is a top view of the projection of an injured face onto the retrofitted manikin of Figure 5C according to an embodiment;
[0037] Figure 6A is an isometric view of the projection of a rash symptom onto a manikin by a projector according to an embodiment;
[0038] Figure 6B is an isometric view of the projection of training information onto a manikin by a projector according to an embodiment;
[0039] Figure 6C is an isometric view of the projection of internal information onto a manikin by a projector according to an embodiment; and
[0040] Figure 7 is a schematic diagram of a computing device according to an embodiment.
[0041] In the following description, like reference characters designate like or corresponding parts throughout the figures.
DESCRIPTION OF EMBODIMENTS
[0042] Embodiments of a computational medical training system and associated methods for simulating a medical condition using a computational and/or SAR based medical training system will now be described. The medical training system comprises at least one manikin, and a computational system. The computational system comprises one or more projectors for projecting one or more images onto at least a portion of the manikin. The manikin may be in a fixed location. Alternatively, the computational system may also include one or more tracking systems for tracking at least the manikin. The method of simulating a medical condition is performed by obtaining a manikin, and then projecting one or more images onto at least a portion of the manikin to simulate a medical condition. The computing device may be configured to display images or a sequence of images according to a script or under the control of a simulation supervisor who may trigger specific sequences or images, or the selection of images may be automated and respond to actions by persons interacting with the manikin. The manikin could be a whole body or a part such as just the head or face, in which case a static or animated medical condition and/or feature data is projected onto the manikin's face. Additionally, the manikin/head can be configured to talk (either computer animated or pre-recorded audio) and respond to established scenarios through the application of programmed intelligence, or respond via instructor input, to create a more realistic or immersive medical training simulation.
[0043] This computational system may also be a Spatial Augmented Reality (SAR) system. Augmented Reality (AR) is the addition of digital imagery and other information to the real world by a computer system. AR enhances a user's view or perception of the world by adding computer generated information to their view. Spatial Augmented Reality (SAR) is a branch of AR research that uses projectors to augment physical objects with computer generated information and graphics. Traditionally, projectors have been used to project information onto purpose built projection screens, or walls. SAR, on the other hand, locates (or projects) information directly onto objects of interest, including moving objects. SAR systems use sensors to develop a three dimensional (3D) model of the world, and typically include tracking systems that enable them to dynamically track movement of real world objects. Such movements or changes are integrated into the 3D model so that updates can be made to projections as objects are moved around.
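By way of illustration only (this sketch is not part of the specification), the update step described above, in which tracked movement is integrated into the model so a projection follows the object, can be sketched as follows. The function names, the translation-only pose and the toy pinhole projection model are all assumptions made for this example.

```python
# Illustrative sketch: re-projecting a feature (e.g. a rash location) when a
# tracking system reports that the manikin has moved.

def world_position(local_xy, manikin_pose):
    """Transform a feature defined in manikin-local coordinates into world
    coordinates using the tracked pose (translation-only for simplicity)."""
    lx, ly = local_xy
    tx, ty = manikin_pose
    return (lx + tx, ly + ty)

def to_projector_pixels(world_xy, focal=800.0, centre=(640.0, 360.0), depth=2.0):
    """Map a world point to projector pixel coordinates with a toy pinhole
    model: pixel = focal * coordinate / depth + principal point."""
    wx, wy = world_xy
    cx, cy = centre
    return (focal * wx / depth + cx, focal * wy / depth + cy)

# A rash texture anchored 0.1 m from the manikin origin along the bed.
rash_local = (0.1, 0.0)

# The tracker reports the manikin has shifted 0.05 m along the bed: the
# projected pixel location follows, so the rash stays on the same skin area.
before = to_projector_pixels(world_position(rash_local, (0.0, 0.0)))
after = to_projector_pixels(world_position(rash_local, (0.05, 0.0)))
```

A real SAR system would use full 6 degree-of-freedom poses and calibrated projector intrinsics, but the principle, namely that the tracked pose is applied before projection on every frame, is the same.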
[0044] SAR systems have considerable flexibility and scalability over other AR systems. Multiple projectors may be used to provide projections onto multiple objects, or multiple surfaces of an object, and the projections may be of varying size (including very large projections). Further, high resolution projections can also be provided, either by the use of high resolution projectors, or multiple lower resolution projectors each handling different components of the projection to provide a high resolution output. One advantage of SAR systems is that, as the information is projected onto an object (or a surface), the system frees the viewer from having to wear or hold a display device, and the information can be viewed by multiple people at the same time. Users can thus hold physical objects, and make and observe digital changes to the object, and these can be easily communicated to other viewers. This assists in creating an immersive environment.
[0045] A basic SAR system comprises a computing apparatus that provides images (including computer generated animation sequences) for projection by a single projector onto a surface (such as a manikin), and thus the computational system could be a basic SAR system. However, the computational system may also be a more complex SAR system, such as one that models the SAR environment, and comprises multiple projectors, object tracking systems and alignment/calibration systems for projection of multiple images by the multiple projectors with real time compensation for movement of multiple objects within the field of view or SAR environment. For the sake of convenience the computational system described herein may also be referred to as a SAR system, covering both the use of basic SAR systems and advanced SAR systems within the medical simulation system. In this context the computational device may also be referred to as the SAR computing device.
However, it is to be understood that references to SAR systems or SAR devices can be equivalently understood as a reference to a computational system.
[0046] Generally SAR systems comprise a SAR device and a SAR platform for producing a SAR environment. The SAR device is a computing device (ie comprising a processor and a memory), with inputs for receiving data and an output for connection to at least one device for human perception such as the SAR platform. The SAR computing device loads SAR application modules and a SAR engine for receiving the input data and for interfacing between the SAR application modules and the outputs such as the SAR platform. In particular, SAR application modules can be developed for simulating a range of medical conditions.
[0047] In the current specification, the SAR platform is defined as the devices which receive input or generate the SAR output; that is, the actual configuration and layout of devices used to generate the SAR environment and to detect changes or inputs. These may be simple devices such as a keyboard, mouse or video projector which can be directly connected to the computing device, or the input devices may include more complex systems such as an object tracking system that comprises multiple sensors and a separate computing device which processes the sensor input and provides tracking input information to the computing device. In medical simulation embodiments the tracking system tracks the manikin, although other objects can also be tracked. For example, rather than using real and typically expensive medical equipment, the display of a device can be simulated based on the medical simulation data. For example, an ECG trace or blood oxygen levels could be simulated and projected onto a screen next to the manikin. The objects and surfaces on which information is to be projected or perceived can be stationary or moving. The connection of the SAR platform, or individual input and output devices of the SAR platform, to the computing device may use wired or wireless protocols or communications devices or means, including Bluetooth, Wi-Fi, infrared, or other wireless technologies, protocols and means. The SAR system may run autonomously, or be under the control of a simulation supervisor overseeing the medical training simulation.
[0048] In the current specification, the SAR environment is defined to represent the physical environment within which augmented reality outputs generated by the SAR system are output (or may be output). For example, if the SAR output was generated by a video projector, then the SAR environment would be defined by the intrinsic and extrinsic parameters of the projector, such as the range within which an image remains visible (eg lamp power), the range at which individual pixels reach a predefined size limit (projection optics), and the position and orientation of the projector which defines the field of view or pointing limits. In some embodiments, the SAR environment may be an interior region of space, such as a portion of a room, an entire room, multiple rooms, a region of exterior space (ie outdoor) or some combination. For example, in the context of a medical simulation this might be a bed of a hospital ward. The input devices and output devices may be located within the SAR environment, or they may be located outside of the SAR environment provided they can produce outputs which are perceptible within the SAR environment. That is, the SAR platform may be completely outside the SAR environment or partially within the SAR environment. In some circumstances the SAR environment may be taken to include the SAR platform and the physical environment within which SAR outputs are perceptible. Similarly, the observers who perceive or sense the outputs, or generate inputs, may be located in the SAR environment or they may be located outside of the SAR environment provided they can perceive the SAR outputs.
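The projector-defined SAR environment described above can be approximated computationally. The following sketch (illustrative only; the cone model, names and numeric defaults are assumptions, not part of the specification) tests whether a point lies within a single projector's usable volume given its position, pointing direction, maximum useful range and projection half-angle.

```python
# Illustrative sketch: a projector's SAR environment modelled as a cone
# defined by extrinsic parameters (position, aim direction) and intrinsic
# limits (half-angle of the projection cone, maximum useful range).
import math

def in_projection_volume(point, projector_pos, aim_dir,
                         half_angle_deg=25.0, max_range=5.0):
    """True if a 3D point lies inside a simple cone shaped projection volume."""
    v = [p - q for p, q in zip(point, projector_pos)]
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0 or dist > max_range:
        return False  # at the lens, or beyond the visible/lamp-power range
    # Angle between the projector's aim direction and the direction to the point.
    dot = sum(a * b for a, b in zip(v, aim_dir))
    norm_aim = math.sqrt(sum(c * c for c in aim_dir))
    cos_angle = max(-1.0, min(1.0, dot / (dist * norm_aim)))
    return math.degrees(math.acos(cos_angle)) <= half_angle_deg
```

For example, a ceiling projector at (0, 0, 3) aiming straight down covers a point on the bed directly beneath it, but not a point well off to the side of its cone.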
[0049] Referring now to Figure 1, there is shown a schematic view of a spatial augmented reality (SAR) medical training system 1 according to an embodiment. A manikin 2 comprising a head 3, torso 4, arms 5 and legs 6 rests on a bed 7. The SAR system 1 is used to project images and textures onto the manikin and comprises a projector 11, a computing device 16 comprising a processor 17 and memory 18, and a tracking system 19 (although, as noted above, in other embodiments the tracking system may be omitted, in which case the SAR system is simply an appropriately configured computational system). In this embodiment, the SAR system is used to project images that include facial features such as pupils 12 and a mouth 14, as well as symptoms such as a red flushed cheek 13, and a rash 15 on the torso. Further, facial features can be altered to indicate symptoms, such as dilating the pupils or adjusting the colour of the lips to indicate cyanosis. The images may be projected as a sequence of images to show time varying symptoms or features. A memory in the computational system may store a library of images, or they may be computer generated. Images can be projected onto both the patient (the manikin) and other equipment such as the sheets, bed, floor, or medical equipment props (eg an ultrasound or blood pressure monitor) to simulate measurements.
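The projection of a sequence of images to show time varying symptoms, as described above, can be sketched as a simple scripted lookup. This is a minimal illustration only; the image file names, timings and condition are invented for the example, and a real system would draw on the image library held in the computational system's memory.

```python
# Illustrative sketch: selecting which symptom image to project at a given
# elapsed simulation time, from a preconfigured script of (start_time, image)
# keyframes. Times are in seconds from the start of the scenario.

symptom_script = [
    (0.0, "skin_normal.png"),
    (60.0, "cheeks_flushed.png"),      # symptom onset after one minute
    (180.0, "rash_torso_mild.png"),
    (420.0, "rash_torso_severe.png"),  # condition worsens if untreated
]

def image_at(elapsed_seconds, script=symptom_script):
    """Return the most recent scripted image whose start time has passed."""
    current = script[0][1]
    for start, image in script:
        if elapsed_seconds >= start:
            current = image
    return current
```

A render loop would call `image_at` each frame (or a supervisor could override the result), so the projected appearance of the manikin develops over the course of the scenario without manual intervention.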
[0050] Figures 2A, 2B and 2C show an isometric view 20, top view 21 and side view 22, respectively, of a SAR medical training system using ceiling mounted projectors according to an embodiment. The SAR environment is a bed 7 with a manikin 2 in a hospital ward. Three projectors 24, 25, 26 are distributed around the bed, located in the plenum space above a ceiling 27, and provide overlapping projection volumes. This allows projection from different angles to reduce or prevent shadowing effects caused by participants walking around and interacting with the manikin. Further, multiple projectors can be used to generate complex images and textures on the manikin. The plenum space is the space between the true ceiling and a dropped ceiling and is used to house lighting, wiring, air vents, piping etc. Dropped ceilings often use ceiling tiles, and a ceiling tile could be modified to house a mount for the projector, and to include a projection aperture for projection of images onto the manikin. This provides a discreet and subtle arrangement, as the only sign of the projectors is the apertures in the ceiling tiles. The mounts can allow the projectors to be mounted at a range of angles, and can be motorised to allow adjustment during simulations, for example in the case of movement of the manikin on the bed. However, the projectors could be mounted on the ceiling in the case that the plenum space is not present or not available for use. Additionally, a motion tracking system or video recording system can be mounted in the plenum space or directly from the ceiling.
[0051] In another embodiment, the one or more projectors are mounted in or on the bed 7 that the manikin 2 is located on. Figure 3A is an isometric view 30 of a spatial augmented reality (SAR) medical training system using bed mounted projectors according to an embodiment. In this case, a first projector 31 is located at the head of the bed and a second projector 32 is located at the foot of the bed. Figure 3B is a top view 33 of the embodiment of Figure 3A using projectors at the ends of the bed. Figure 3C is a top view 34 of another embodiment in which 4 projectors are located, one in each corner of the bed. Figure 3D is a top view 35 of another embodiment in which projectors are located along the long edges of the bed (3 per side). Bed mounted projectors can be integrated into the bed, or mounted on the bed. Short throw projectors are used to project textures (and images) at extreme angles so as to prevent shadowing from participants standing over the manikin. Further, this approach has the advantage of saving space and increasing mobility/freedom of movement of both the manikin and of participants without the creation of shadowing effects as can occur with ceiling or wall mounted projectors. In one embodiment, a wireless pico-projector is installed within the manikin head/neck region. This is then used in conjunction with a periscopic mirror to rear-project up and across onto the manikin's face, or onto a face-mask.
[0052] In another embodiment, the projectors are mounted on tripods or moveable (eg wheeled) stands to provide a readily portable and customisable system which can be used in a range of indoor and outdoor locations (eg simulated accident scenes).
[0053] In another embodiment, a partial manikin could be a shell constructed of semi-translucent material, and the simulation system is used to internally project images onto the interior surface of the shell. Figure 4 is an isometric view 40 of a medical training system using internally mounted projectors according to an embodiment. In this embodiment, the projectors are located within the bed 41 and are used to project images onto the interior surface of a manikin shell 42. A first projector 43 is used to project images onto the lower legs, a second projector 44 projects images for the torso and arms, and a
third projector 45 is used to project images into the head. As the projectors are located in the bed, the bed can be adjusted from a flat to an inclined position without affecting the projection system. This embodiment also avoids the need for a tracking system, as the manikin can be fixed with respect to the bed and the projectors, and as internal projection is used the problem of shadowing is completely eliminated. In this embodiment the projectors need to be capable of extremely short throws, preferably with high resolution, although this can be compensated for by the use of additional lower resolution projectors.
[0054] In another embodiment, rather than, or in addition to, externally or internally projecting images onto the manikin, portions of the manikin include a display or touch screen (including multi-touch touch screens). In one embodiment, the screen is an OLED screen, and in another embodiment the OLED screen is a 3D formed (ie curved) OLED screen. In one embodiment, an OLED mask is placed on the manikin to represent facial features or medical conditions. In another embodiment, the display screens are permanently embedded in the manikin. The manikin may include one or more touch screens to allow the manikin to become an interactive "smart surface" which can be interrogated through hand-tracking or gesture recognition to provide additional training information.
[0055] The manikin may be fixed in place, or be allowed to move. In the latter case, the computational system may be a SAR system including a tracking system used to track the location of the manikin in the SAR environment and with respect to the bed. The SAR computing device updates the images to be projected in response to movement of the manikin, to ensure that projected features are consistently projected in the same locations on the manikin, irrespective of movement of the manikin. Video (normal and IR) or RF based tracking systems may be used, eg OptiTrack, ARToolkitPlus, and Vicon motion capture systems. The tracking systems may use optical trackers, LED markers, magnetic trackers, fiducial markers, etc embedded in or on the manikin and bed to assist with tracking an object. For example, the location of the bed, sheets, manikin and people may be tracked and the projection updated in response to movement of tracked objects. Additionally, the SAR system may compensate for obstruction of projected images due to movement of a tracked object. For example, in a multi-projector system the system may project a feature from multiple projectors to compensate for obscuration, or a second projector may project a feature if the projection from a first projector is blocked. In single projector systems, the simulation could be paused until a person moves out of an obscuring location.
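The multi-projector fallback just described, where a second projector takes over a feature when a tracked person blocks the first projector's line of sight, can be sketched as follows. This is an illustrative sketch only: the 2D geometry (a person modelled as a circle intersecting a projector-to-target segment) and all names are assumptions for the example, not the specification's method.

```python
# Illustrative sketch: assigning a projected feature to an unobstructed
# projector using positions reported by the tracking system (2D, top view).
import math

def blocked(projector_xy, target_xy, person_xy, person_radius=0.4):
    """True if a person (modelled as a circle) intersects the straight
    segment from projector to target."""
    px, py = projector_xy
    tx, ty = target_xy
    ox, oy = person_xy
    dx, dy = tx - px, ty - py
    length_sq = dx * dx + dy * dy
    # Parameter of the closest point on the segment to the person's centre.
    t = max(0.0, min(1.0, ((ox - px) * dx + (oy - py) * dy) / length_sq))
    cx, cy = px + t * dx, py + t * dy
    return math.hypot(ox - cx, oy - cy) < person_radius

def choose_projector(projectors, target_xy, people):
    """Return the first projector with an unobstructed view, else None
    (a single-projector system might instead pause the simulation)."""
    for name, position in projectors:
        if not any(blocked(position, target_xy, p) for p in people):
            return name
    return None
```

Run each frame against the tracked positions, this reassigns features as participants move, so the projection appears continuous to the students.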
[0056] The projectors are controlled by the SAR computing device (or more generally a computing device) and can be connected to the SAR computing device using wired or wireless communication links (eg an IR or RF based protocol such as WiFi, Bluetooth or an IEEE 802.11 based protocol). Projectors can be mains or battery powered. Calibration can be performed periodically, as part of a pre-simulation initialisation procedure, or during the simulation. Multiple projector calibration, such as that described in AU2014200683 titled "Method And Apparatus For Calibration Of Multiple Projector Systems" filed on 7 February 2014, may be used. The calibration process may be performed or assisted using calibration devices including light sensors and/or LEDs that are located on the bed or the manikin, such as those described in AU2014200683.
[0057] Hospital environments are often very well lit, and thus to increase the visibility of projected images selective lighting can be used to create a darker area on the manikin to improve contrast, whilst maintaining normal lighting in the other areas (such as tool trolleys, diagnostic displays and charts). The actual colour and reflectivity of the projection surface can also be used to increase the apparent contrast of the projection. Grey screen technology can also be used, in which the use of a grey screen (or in this case a grey manikin) effectively increases perceived contrast at the expense of image brightness, which can be countered with a brighter projector.
[0058] The computing device is used to project images onto the manikins to simulate a patient and symptoms. The computing device may be a SAR computing device and may run various generic or supporting SAR computing modules related to calibration, tracking, image updating, etc, as well as medical simulation related SAR computing modules developed to support medical simulations. The SAR system, including the development of SAR modules, may be based on the SAR system described in
PCT/AU2013/000952 titled "Spatial Augmented Reality (SAR) Application Development System" and filed on 27 August 2013.
[0059] The computing device may also be used to control the overall medical simulation, or the computing device may be in communication with, or under control of, another computing device which is used to control the simulation. A simulation supervisor may use an interface to control or configure the medical simulation. The simulation supervisor may directly supervise a medical simulation and control the progression of symptoms, as well as recovery in the case of correct treatment, as well as directly interacting with participants (including acting as the patient). Alternatively, a simulation supervisor can simply configure or program simulations, with the simulation being controlled autonomously by appropriate software that follows the preconfigured script, and which can respond to treatments initiated by participants.
[0060] The projected images are used to simulate medical conditions, and thus act as visual indications to the participants in the simulation. For ease of reference we will refer to the participants in the
simulation as students. The projected images may be a sequence of images corresponding to visual indications of different stages or symptoms of a simulated medical condition. This can be used to simulate the development of a simulated medical condition. Images can be projected onto both the patient (the manikin) and other equipment such as the sheets, bed, floor, or medical equipment props (eg an
ultrasound or blood pressure monitor) to simulate measurements. For example, blood or other body fluids and body tissues can be projected onto the manikin and also sheets and other adjacent surfaces. This avoids the mess associated with using "fake blood" in training scenarios, which can be a time consuming problem in clean-up and re-use of a simulation facility. Further, the images can be updated based on treatment performed by the participants. In this way the simulation can be dynamic and interactive.
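The scripted progression described above, where projected images advance through the stages of a simulated condition and regress in response to correct treatment, can be sketched as a simple state machine. This is a minimal illustrative sketch, not part of the disclosed system; the stage names, image filenames and treatment names are all invented for the example.

```python
# Hypothetical sketch of a preconfigured simulation script: each stage of a
# simulated condition maps to a projection image; the condition worsens over
# time and regresses toward recovery when an appropriate treatment is applied.

class ConditionScript:
    def __init__(self, stages, treatments):
        self.stages = stages          # ordered list of (stage_name, image_file)
        self.treatments = treatments  # treatment name -> number of stages to regress
        self.index = 0

    def current_image(self):
        """Image to project for the current stage of the condition."""
        return self.stages[self.index][1]

    def progress(self):
        """Advance to the next stage (condition worsening over time)."""
        self.index = min(self.index + 1, len(self.stages) - 1)

    def apply_treatment(self, treatment):
        """Regress toward recovery if the treatment is appropriate."""
        steps = self.treatments.get(treatment, 0)
        self.index = max(self.index - steps, 0)

# Example: a rash that worsens over two time steps, then resolves after
# the correct (invented) treatment is recorded by the supervisor.
script = ConditionScript(
    stages=[("healthy", "skin_normal.png"),
            ("early_rash", "rash_mild.png"),
            ("severe_rash", "rash_severe.png")],
    treatments={"antihistamine": 2},
)
script.progress()
script.progress()
script.apply_treatment("antihistamine")
```

In a full system the same state machine would also drive coordinated audio and projected images on adjacent surfaces (sheets, monitor props) for each stage.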
[0061] A wide range of symptoms and medical conditions can be simulated by projecting appropriate images onto a manikin. Examples of simulated conditions are listed in Table 1. This can be used to generally support medical training for doctors, nurses and other medical staff. Additionally, specific modules can be created for specialised training. For example, a trauma module for simulating trauma conditions could be modelled. Most undergraduate training either does not cover trauma diagnosis and treatment, or only covers such areas superficially. A trauma simulation would fill a need in this area. Another specialised module is for obstetrics/antenatal simulation, or more generally abdominal/pelvic pathology simulation, to simulate pregnancy complications and trauma, and to display anatomy, physiology or pathology. Another specialised module is a neurological module for simulating neurological conditions including stroke. Current medical simulators lack the fidelity to simulate neurological symptoms and conditions, and there are no stroke or neurological simulators, especially for simulating often subtle facial effects. For example, cranial nerves are at the base of the brain, and are responsible for a lot of facial and sensory functions. Medical students are often required to perform cranial nerve examinations as part of their medical training, but in almost all cases the patient is healthy, as it is near impossible for a person to fake the signs of a neurological condition, and thus students simply go through the motions of examination and no assessment is possible. A medical simulator, including a SAR medical simulator, can be used to project the often subtle and nuanced neurological symptoms onto a manikin to address this need.
TABLE 1
Examples of conditions which can be simulated using images projected onto a manikin.
Body Part    Symptoms
Face Unilateral facial droop
Rashes (as per torso)
Trauma (bruising, wounds)
Skin texture - pale (anaemia), ashen skin
Mouth movement when speaking
Eyes Unilateral and bilateral changes to pupil size, with, ability to construct as response to light Eyelids open and close
Unilateral eyelid drooping due to facial palsy
Nystagmus (eye movement condition)
Orbital and peri-orbital trauma and infection
Unilateral and bilateral red eye (conjunctivitis and conjunctival injection)
Nose Bleeding (epistaxis)
Trauma - bruising/swelling
Nasal flaring - in babies having difficulty breathing
Mouth Pale/white/ashen
Blue (central cyanosis)
Drooping one side of mouth (part of facial palsy)
Neck Rashes as per torso
Distended neck veins
Generalised swelling (fat or anaphylaxis)
Tracheal tugging during difficult breathing
Torso Rashes such as anaphylaxis (urticaria or hives), and infectious rashes (meningococcus, shingles, chicken pox)
Lack of chest movement on one side to indicate lack of function (eg pneumothorax or haemothorax)
Trauma (bruising, penetrating wound)
In-drawing of muscles between ribs during difficult breathing (intercostal recession)
Abdominal/pelvic anatomy/pathology/physiology such as trauma or bruising; obstetrics/antenatal anatomy/pathology/physiology simulation (eg pregnancy complications)
Limbs Colour: pale/white (anaemia/blood loss); blue (peripheral cyanosis); red (infection - cellulitis)
Rashes - as per torso
Trauma - bruising, wounds - open, infected or bleeding

[0062] Images can be used to simulate a wide range of skin textures and features onto a plain manikin. This can be used to simulate a desired age, ethnicity, and sex for a patient, and could be changed from simulation to simulation. For example, characteristics such as skin colour, age and sun effects including smooth young skin, old wrinkly skin, sun spots, freckles, moles, sun burn, ethnic features, and colourings (eg Asian eye shapes, red head) can all be controlled by selection of images. Often factors such as age, ethnicity and sex will influence the symptoms of a medical condition, as some combinations make it harder to identify and thus diagnose conditions. In other cases the relevance of observed symptoms must be weighted by such factors - for example some symptoms may be due to multiple causes in an older patient whereas they would be extremely rare in a younger patient, and thus more weight should be placed on their presence in such cases. This also adds realism, as the appearance of the manikin can be varied considerably, and thus be more reflective of the diversity of patient types a medical practitioner would expect to see. Further, dynamic imagery can be used to replace mechanical systems to simulate dynamic effects such as breathing, heart-beat, and pulse, thus enabling the use of simple manikins.
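Selecting projected appearance by patient profile can be sketched as composing texture layers. This is an illustrative sketch only; the layer filenames, profile fields, and the age-60 threshold are invented for the example, not taken from the disclosed system.

```python
# Hypothetical sketch: compose a manikin's projected appearance from texture
# layers chosen by a patient profile (skin tone, age, optional features).
# All filenames and thresholds here are invented for illustration.

def appearance_layers(profile):
    # base skin tone layer, eg "skin_pale.png" or "skin_dark.png"
    layers = [f"skin_{profile['skin_tone']}.png"]
    if profile["age"] >= 60:          # older patients get an ageing overlay
        layers.append("wrinkles.png")
    if profile.get("freckles"):       # optional feature overlays
        layers.append("freckles.png")
    return layers
```

A projection pipeline would blend these layers in order before rendering onto the manikin, allowing the simulated patient's age, ethnicity and sex to vary from simulation to simulation.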
[0063] Additionally, manikins with a range of body types and sizes can be used. Most manikins are made of plastic and represent white males of approximately 25 years of age, and are thus difficult to modify to simulate other body types and ages. As the complexity of symptoms and appearance is handled by the SAR system, the manikins can be cheaply manufactured with a range of desired shapes, such as children, females including pregnant females, thin patients, obese and morbidly obese patients, etc. In one embodiment, a manikin is obtained by forming a 3D shape based upon a computer model. The computer model could be obtained from a body scan of a real person, be artificially generated using simulation software, or be a hybrid (eg modified from a body scan). The manikin can then be manufactured by quickly cutting out a 3D manikin from a block of foam or plastic, such as by using a CNC machine, or produced from a computer model using a 3D printer. Figure 5A shows an isometric view 50 of a child 51, an obese person 52, and a female 53. Alternatively, the manikin could be manufactured using a casting or moulding technique. Further, rather than form a full manikin, partial manikins can be produced, such as just a head, just a torso, an upper body comprising a head, torso and upper arms, or a lower body comprising the abdomen/pelvic area and legs. Forming manikins in these ways provides a cheap way of generating manikins with a range of body shapes and sizes that can be used in the simulation system. Additionally, flat or slightly curved projection regions can be provided, for example on the face, as the projection system(s) can be used to project detailed facial features. Forming the manikins in this way provides a cheap source of manikins with a range of body types for the medical simulation system, thus enhancing the flexibility and cost effectiveness of the system.
[0064] In another embodiment, an existing manikin is retrofitted with a de-featured cover, mask or surface to hide existing manikin features and to provide a clear (or clean) surface for projection of features/medical conditions by a projector. For example, in the case of a face mask, the bottom surface or portion of the cover may be contoured to match manikin facial features/structure, or it may be deformable to mould to the manikin features/structure. The top surface may be a flat surface or slightly curved surface (ie substantially flat in locations of facial features) to allow projection of a wide range of features. In another embodiment, the mask may have basic structural features or regions such as a nose, cheek structure, eye sockets and chin, but allow considerable fine control of presentation of specific features such as size/shape of eyes, mouth, pigmentation, as well as medical symptoms. The mask may also include a wig or other related materials or features which are difficult to recreate through projection means alone. In one embodiment, the ears are utilised as a means of attaching the mask to the manikin head.
[0065] Figure 5B is a top view 54 of retrofitting an existing manikin 2 by placing a de-featured mask 55 over the head 4 of manikin 2 to hide facial features. Figure 5C shows a top view 56 of the retrofitted manikin with the de-featured mask 55 providing a clean facial surface for projection of features or medical conditions by a projector. Figure 5D shows a top view 57 of the projection of an injured face 58 by a projector 10 onto the clear projection surface provided by de-featured mask 55. It will also be understood that a cover, or covers, could be placed on other regions/body parts (eg torso, breasts, genitals, arms, legs) to alter the default manikin features, for example to add fat to the torso/arms/legs to simulate obesity, to add/hide gender specific features, etc.

[0066] Whilst the simulation system facilitates the use of low fidelity custom manikins, it is to be understood that the simulation system can also be used to project images onto medium and high fidelity manikins. For example, a medium or high fidelity manikin may mechanically support gross movements such as a head moving, chest movements due to breathing, or a beating heart, whilst SAR based simulation systems can be used to track the moving surface and provide realistic textures that move with the surface or which are used to enhance the mechanical movement. In some embodiments, the manikin may simulate some physical movements whilst computer generated images are used to simulate other movements. For example, a manikin could mechanically move the chest to simulate breathing, whilst the simulation system is used to project images to simulate a heartbeat, such as by projecting different textures onto the moving surface (or vice versa).
[0067] Body and facial appearance, surface textures and symptoms, and dynamic effects (eg breathing, heart beats) can be computer generated, obtained from an image library or database, including images from real patients and symptoms, or obtained from body scans or image software. Gaming engines used for generating textures of characters in computer games could be adapted to generate images for projection onto the manikins. These can also be used to generate dynamic imagery to simulate breathing and pulse. Suitable examples include TRUSIM TRIAGE. Software such as 123D Catch, MindMapper and VPT 6.0 allows for easy, practical captures of 3D surfaces. For example, 123D Catch is a smart phone application that allows a user to take a series of images around a 3D object (eg a real person), and the software performs 3D image reconstruction to create a 3D texture model, which can then be applied to the manikin. Figure 6A is an isometric view of the projection of skin texture onto the torso of a grey manikin along with a simulated rash symptom 61. Further, the simulated effects can be dynamic, so that they evolve over time, or change in response to student actions or requests. For example, a student could ask a manikin to open their mouth, in which case the simulation system could then project an open oral cavity onto the manikin showing the pathology of the tonsils, tongue and teeth. The simulation system could also dynamically change the image based upon tracking the student's or manikin's head. For example, if the student moves their head to one side to get a better view of the interior of the oral cavity, the simulation system could track the student's head, and dynamically alter the projected image based upon the estimated viewing angle of the student relative to the manikin.
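The view-dependent update described above can be sketched as selecting, from a set of pre-rendered views, the one closest to the tracked viewing angle. This is a simplified illustrative sketch; a real SAR system would re-render continuously rather than pick from discrete views, and the coordinate convention and image names here are assumptions.

```python
import math

# Hypothetical sketch: choose which pre-rendered view of the oral cavity to
# project, based on the horizontal angle between the tracked position of the
# student's head and the manikin's face. Positions are (x, y, z) in metres,
# with z pointing away from the manikin's face (an assumed convention).

def select_view(student_pos, manikin_pos, views):
    """views: list of (angle_degrees, image_file) pairs."""
    dx = student_pos[0] - manikin_pos[0]
    dz = student_pos[2] - manikin_pos[2]
    angle = math.degrees(math.atan2(dx, dz))  # 0 deg = directly in front
    # pick the pre-rendered view closest to the estimated viewing angle
    return min(views, key=lambda v: abs(v[0] - angle))[1]

views = [(-45, "oral_left.png"), (0, "oral_front.png"), (45, "oral_right.png")]
```

As the head tracker reports new positions, calling `select_view` each frame would swap the projected image to match the student's estimated viewpoint.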
[0068] The manikins can also be instrumented with one or more embedded subsystems. The embedded subsystems can be used to simulate a medical condition or body noise (eg breathing, heart beat) and can be coordinated with the projected images. In one embodiment the manikin includes an aural user interface including one or more speakers which are used to generate aural outputs. These can allow a simulation controller to speak for the manikin, or allow pre-recorded or computer generated sounds or speech to be played. The speaker may also be used to simulate one or more aural symptoms of the simulated medical condition, such as heart sounds, left/right lung sounds and abdomen sounds. The aural symptoms can be coordinated with projected images and visual symptoms and effects. The manikin can also include a microphone to receive aural input (eg questions) from a participant or to detect procedures performed by the participants. The microphone can be internal and/or external. This can be provided to a simulation supervisor who can verbally respond (eg using the speakers) or select an appropriate pre-recorded or computer generated response. Alternatively, the microphone input could be provided to a speech recognition engine (or module) running either on a computing device embedded in the manikin or local environment (eg under a bed the manikin is on), or on a remote computing device (eg in a control room, remote server or even in the cloud). The received speech can then be interpreted, for example using a natural language processing module, and an appropriate response can be provided via a speaker in the manikin. For example, a student could ask if a rash is itchy, and the manikin could respond with a response such as yes or no. In another example, a student could ask the manikin to rate the pain in a range from 1 to 10, and an appropriate value could be provided based upon the underlying medical condition, eg if 10 is high, the manikin could respond with a value of 8 for a serious condition such as an intestinal obstruction/injury whereas the manikin could respond with a value of 4 for a less serious condition such as an upset stomach or bowel.
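The question-answering path described above can be sketched with simple keyword matching standing in for the speech recognition and natural language processing modules. This is purely illustrative; the condition names, pain values and matching rules are assumptions, and a real system would use proper speech recognition and NLP rather than substring tests.

```python
# Hypothetical sketch of the manikin's verbal response path. Recognised
# speech text comes in; a response string goes out to the manikin's speaker.
# Pain ratings (1 low .. 10 high) are looked up per simulated condition.

PAIN_BY_CONDITION = {
    "intestinal_obstruction": 8,  # serious condition -> high pain rating
    "upset_stomach": 4,           # less serious -> lower rating
}

def answer(question, condition, symptoms):
    q = question.lower()
    if "rate the pain" in q:
        return str(PAIN_BY_CONDITION.get(condition, 1))
    if "itchy" in q:
        return "yes" if "itchy_rash" in symptoms else "no"
    return "I'm not sure."  # fall back (or hand over to the supervisor)
```

In the supervised configuration, unmatched questions would instead be routed to the simulation supervisor's microphone/speaker link.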
[0069] In another embodiment, the simulation system could use the aural user interface to respond to both questions and tactile inputs from the student. For example, if the manikin is simulating appendicitis, then the student could push or prod the manikin's torso or abdomen in several different locations and each time ask if the pushing hurts. Where the manikin is pushed in an area not linked to the inflamed appendix, the manikin could provide a low pain response, but when an appropriate location is pushed a strong pain reaction could be provided, including a pained facial response (eg gritted teeth, closed eyes, etc), along with appropriate aural sounds such as a cry or groan. If gross movement of the manikin is possible then this could also be performed, eg flinching, or attempting to move the abdomen away from the pushing.
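The palpation response just described can be sketched as a distance test against the tender region. This is an illustrative sketch only; the coordinates, tenderness radius, and the facial-image and sound filenames are invented for the example.

```python
# Hypothetical sketch of the appendicitis palpation response: touch points
# within a small radius of the tender region (near the inflamed appendix)
# trigger a strong pain reaction; touches elsewhere give a mild response.
# Coordinates are 2D positions on the abdomen surface, in metres.

def palpation_response(touch_point, tender_point, radius=0.05):
    dist = ((touch_point[0] - tender_point[0]) ** 2 +
            (touch_point[1] - tender_point[1]) ** 2) ** 0.5
    if dist < radius:
        # strong reaction: projected grimace plus an aural groan
        return {"pain": 8, "face": "grimace.png", "sound": "groan.wav"}
    # mild reaction: neutral face, no sound
    return {"pain": 2, "face": "neutral.png", "sound": None}
```

The returned dictionary would drive both the projected facial image and the speaker output, keeping the visual and aural symptoms coordinated as described.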
[0070] A relatively cheap standardised patient simulation system can be developed in which rear (internal) or front projection is used to project static or animated medical conditions and/or feature data onto the manikin's face. The manikin can talk (via computer animated images and audio) and respond to established scenarios through the application of programmed intelligence or via instructor input. This system utilises a basic SAR system such as a single projector, computer, and speakers/microphone but can provide an enhanced simulation experience at relatively low cost.

[0071] Other subsystems that can be included are basic airways, blinking, sweating, crying, and
vomiting, IV/Blood flow features (pulse, blood pressure etc) and bladder/catheter interaction. These may be simple or complex systems depending upon the simulation requirements. A modular approach could be used, in which cavities in the manikin are created to receive modular units such as a hollowed out airway unit for use in anaesthesia and emergency room simulations. This allows generation of cheap manikins that can be used for simulation of a wide range of conditions, but which provide the capability for upgrading or retrofitting of sophisticated simulation modules for simulating specific conditions or scenarios.
[0072] In another embodiment, an actor could be used in place of a manikin, and a projection or SAR system used to project images of medical conditions or data onto the actor. The actor would be briefed about the medical condition they have, and how to respond to questions or examination, and then the images of the medical condition would be projected onto the actor's body. For example, images simulating a wound and blood could be projected onto the actor's body or clothes, and the images updated as the examination continues, for example as clothing is removed, or to simulate a change in the condition such as fresh bleeding from a wound. A SAR system could be used to track the actor's location and the actor could wear fiducial markers to assist with the tracking system.
[0073] The simulation system also naturally allows a wider range of training and learning opportunities to be performed. As shown in Figure 6B, training information can easily be projected onto a manikin 2. This can be used to guide a student on how to perform a new procedure, and/or to indicate what symptoms they can expect to encounter with a new procedure. In this embodiment, speakers 62 are projected onto the manikin along with text 63. The speaker images indicate the location to listen for a symptom and the text can alert the student on what the symptom is. For example, in this embodiment the speaker on the right side of the manikin's chest has the text "Right Lung Sounds" to indicate to the student what lung sounds they should expect to hear. These sounds may be played when the user touches that location, for example with a modified stethoscope that includes speakers in the ear pieces. Similarly, and as shown in Figure 6C, internal information 64 such as organs, muscles, and the circulatory and lymphatic systems can be projected onto the manikin. This can be used to assist in training to indicate the internal state of the organs associated with observed symptoms or a medical condition the student is learning about (or has diagnosed). For example, if the symptoms are associated with a rupturing appendix, then internal images could be shown to reinforce the association between the symptoms and root cause.
[0074] Projection of training or internal information can thus be used as part of a training or learning exercise to create an interactive interface, in which the participants are instructed or guided by the additional information projected onto the surface of the manikin. A student can be guided through a medical diagnosis by the SAR system, which can present symptoms of a medical condition, including how symptoms change and how further symptoms develop as a condition progresses. This could be performed semi-automatically by software under the guidance of a simulation controller, who could be supervising several students at once with the assistance of the software, or the process could be fully automated or controlled by the simulation software. Instructional information, for example on what stage of the condition the symptoms represent, can be provided along with information on appropriate tests to perform. This can be supplemented with internal images of how the condition is developing. Alternatively, instructional and internal information can be provided as feedback to a student either during or after a medical diagnosis simulation.

[0075] In another embodiment, the system could be used to make a recording of the simulation. A video recording of the simulation environment could be stored along with additional simulation data from the computational system relating to the patient, such as what symptoms were being displayed and when and how they developed. The video and associated simulation data could then be played back to the students in a post briefing analysis to provide an improved learning tool. For example, the students and instructors could review the simulation, and specific sequences could be re-enacted and supplemented with instructional information presented onto the manikins.
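The recording approach just described can be sketched as an event log of displayed symptoms, timestamped against the video timeline. This is a minimal sketch under assumed names; a real system would synchronise against the video recorder's clock and record far richer state.

```python
import json
import time

# Hypothetical sketch: log which symptom images were shown and when,
# relative to the start of the simulation recording, so the debrief
# playback can overlay them on the video timeline.

class SimulationLog:
    def __init__(self):
        self.start = time.monotonic()
        self.events = []

    def record(self, event, detail):
        """Append a timestamped event, eg ('symptom', 'rash_mild.png')."""
        self.events.append({"t": round(time.monotonic() - self.start, 3),
                            "event": event, "detail": detail})

    def export(self):
        """Serialise the event log for storage alongside the video file."""
        return json.dumps(self.events)
```

During the post briefing, the exported log could be stepped through alongside the video, with selected sequences re-projected onto the manikin for discussion.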
[0076] The simulation system could also be used more widely than just medical simulations. For example, the simulation system could be used by police to create a three dimensional representation of a suspect, crime scene or missing person to assist in criminal investigation. For example, rather than just producing a facial "sketch", a whole three dimensional clothed person, to the correct height, body type and body shape, could be produced. A computer sketch, or a computer generated image generated from one or more images of the suspect, could then be projected onto the manikin. Such a simulation system could take the recognition process to the next level for identification purposes. The simulation system described herein could also be extended to other applications such as in store manikins, or movie or theatrical productions, for example to allow use of a wider range of manikin body types, which can then be augmented with a range of facial features and textures to create more realistic manikins. In one embodiment, a shopper's facial features and skin texture could be projected onto a manikin wearing clothing for sale.
[0077] The computing device or apparatus may be configured to display images or a sequence of images according to a script or under the control of a simulation supervisor who may trigger specific sequences or images, or the selection of images may be automated and respond to actions by persons interacting with the manikin. This may be manually controlled by a supervisor who monitors the simulation environment and controls which images are displayed and when. More advanced systems may include embedded speakers and microphones to allow the supervisor to respond as the patient or play back pre-recorded patient responses or aural symptoms (eg groans, wheezing), or voice recognition and speech synthesis may be used. Computer animations may also be displayed that are synchronised with the audio playback to reinforce the simulation of the manikin speaking or responding (eg mouth movements, grimacing when groaning, etc). Sophisticated systems may include multiple projectors and projector calibration systems, object tracking systems, instrumented manikins, and one or more high performance computers for modelling, generating and controlling the simulation environment, such as analysing input and generating appropriate output including rendering images and simulating symptoms. Such computational systems may also include SAR systems. As noted above a SAR system is defined broadly to include relatively basic SAR systems as well as sophisticated SAR systems. For example, a basic SAR system comprises a computing device that provides images (including computer generated animation sequences) for projection by a single projector onto a surface such as a manikin, and can thus be broadly described simply as a computational system.
[0078] In another embodiment, a computational standardised patient manikin is used to replace an actor in a medical (or other) simulation system. In this embodiment, the manikin incorporates at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers. A computational system is configured to display one or more images on the at least one surface and provide one or more audio outputs to simulate a patient. In one embodiment the manikin's face comprises a display screen, such as a 3D formed OLED screen or other LED, LCD, or flat or curved panel display system. The body may also include touch sensitive surfaces and additional display screens.
Alternatively or additionally, front or rear (internal) projectors are used to project static or animated medical condition and/or feature data onto the manikin's face or body. The manikin is configured to talk or generate audio outputs using speech synthesis and computer animated movements and/or pre-recorded audio. The manikin may be configured to respond to established scenarios through the application of programmed intelligence, or configured to respond using audio input from a supervisor using a microphone and directing the microphone's audio signal to the speakers. Touch screens or other touch sensitive surfaces can be used to track hand movements and patient examinations, or to reveal additional training information, for example displaying an internal representation (eg blood supply or muscular-skeletal) of the body region. In one embodiment, a standardised manikin head is provided (ie no body) using a display screen (eg 3D OLED) or front or rear projection system. In one embodiment, the entire manikin is a 3D formed OLED or similar display technology.
[0079] The standardised patient manikin could be programmed or configured to respond to a specific set of scenarios, or be more generally used to simulate a patient response in a simulation, and in particular to simulate scenarios where actors have previously been used. This would enable a finer degree of standardisation in training scenarios where actors are used, particularly for neurological or complex medical scenarios. The standardised patient manikin could be a stand alone manikin containing a computing system, or in communication with a computing system, or be part of a more elaborate SAR system including multiple projectors and object tracking systems. The standardised patient manikin could thus be used to provide a method for simulating a patient by obtaining (including constructing) a standardised patient manikin and then displaying one or more images on the at least one surface and providing one or more audio outputs to simulate a patient using a computational system comprising at least one computing device and at least one display device.
[0080] Aspects of the methods described herein are computer implemented using one or more computing devices. For example, a computing device could be configured with suitable software modules to act as a simulation controller and comprise a display device, a processor and a memory, and one or more input devices. The computing device acting as the simulation controller could also be used to execute SAR modules (ie the simulation controller is also the SAR computing device), or it may be in communication with a SAR system and SAR computing device in such a system. The memory may comprise instructions to cause the processor to execute a method described herein (and thus the processor may be configured to execute the instructions). These instructions may be stored as computer codes, which are loaded and executed. The processor, memory and display device may be included in a standard computing device, such as a desktop computer, a portable computing device such as a laptop computer or tablet, or they may be included in a customised device or system. The computing device may be a unitary computing or programmable device, or a distributed device comprising several components operatively (or functionally) connected via wired or wireless connections.
[0081] An embodiment of a computing apparatus or computing device 100 is illustrated in Figure 7 and comprises a central processing unit (CPU) 110, a memory 120, a display apparatus 130, and may include an input device 140 such as a keyboard, mouse, microphone etc. The CPU 110 comprises an Input/Output Interface 112, an Arithmetic and Logic Unit (ALU) and a Control Unit and Program Counter element, which is in communication with input and output devices (eg input device 140 and display apparatus 130) through the Input/Output Interface. The Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (e.g. Bluetooth, ZigBee, IEEE 802.15, IEEE 802.11, TCP/IP, UDP, etc). A graphical processing unit (GPU) may also be included. The display apparatus may comprise a flat screen display (eg LCD, LED, OLED, plasma, touch screen, etc), a projector, CRT, etc. The computing device may comprise a single CPU (core) or multiple CPUs (multiple cores). The computing apparatus may use a parallel processor, a vector processor, or be a distributed computing device. The memory is operatively coupled to the processor(s) and may comprise RAM and ROM components, and may be provided within or external to the device. The memory may be used to store the operating system and additional software modules that can be loaded and executed by the processor(s). These may include modules to configure the processor to implement aspects of the methods described herein.
[0082] The medical training system described herein is able, with high fidelity, to visually simulate patients with a broad range of medical conditions and visual diagnosis cues such as (but not limited to): skin pallor and general colouration; pupil dilation; blinking; rashes; sweating; and traumatic conditions. In addition, a number of additional visual characteristics or patient types can be manipulated, such as: sex; age; race; body type; and body size. In addition to allowing training on the wide range of patient types clinicians can expect to encounter, these factors often have additional diagnostic implications and thus enable a more thorough medical diagnostic procedure to be performed. These capabilities are not available with current medical simulators, which typically use white, muscular males of approximately 25 years of age. Further, simulations can be provided for neurological conditions, and other conditions which current systems are either not capable of simulating, or for which they are only able to provide very rudimentary or poor simulations.
[0083] The medical training system can be used to address a number of existing problems with current medical training systems using manikins. The use of simple manikins with realistic skin textures and dynamic symptoms provided by the simulation system allows the development of a low cost but highly realistic and immersive simulation environment. The system can address significant audio issues which plague current systems by removing redundant mechanical subsystems, such as breathing subsystems, to reduce unwanted sounds. Such functionality can be replaced by embedded speakers and dynamic imagery simulating movement of the chest. The system can also address the lack of visual detail and human factors by using a simulation system to display high quality and realistic textures onto a manikin. The system can also be used to dynamically simulate medical procedures by providing a level of interaction, with the system changing visual and aural symptoms such as pupil dilation, colouration, pulse, etc, reflecting the progression of a medical condition, as well as responses to treatment, to provide an interactive simulation experience. Further, greater immersion can be achieved by using more realistic human shapes (eg a range of manikins) and audio relays to provide realistic sounds.

[0084] Those of skill in the art would understand that information and signals may be represented using any of a variety of technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
[0085] Those of skill in the art would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
[0086] The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. For a hardware implementation, processing may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof. Software modules, also known as computer programs, computer codes, or instructions, may contain a number of source code or object code segments or instructions, and may reside in any computer readable medium such as a RAM memory, flash memory, ROM memory, EPROM memory, registers, hard disk, a removable disk, a CD-ROM, a DVD-ROM or any other form of computer readable medium. In the alternative, the computer readable medium may be integral to the processor. The processor and the computer readable medium may reside in an ASIC or related device. The software codes may be stored in a memory unit and executed by a processor. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
[0087] Throughout the specification and the claims that follow, unless the context requires otherwise, the words "comprise" and "include" and variations such as "comprising" and "including" will be understood to imply the inclusion of a stated integer or group of integers, but not the exclusion of any other integer or group of integers.

[0088] The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement of any form of suggestion that such prior art forms part of the common general knowledge.
[0089] It will be appreciated by those skilled in the art that the invention is not restricted in its use to the particular application described. Neither is the present invention restricted in its preferred embodiment with regard to the particular elements and/or features described or depicted herein. It will be appreciated that the invention is not limited to the embodiment or embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the scope of the invention as set forth and defined by the following claims.

Claims

1. A method for simulating a medical condition, the method comprising:
obtaining a manikin; and
projecting one or more images onto at least a portion of the manikin to simulate a medical condition using a computational system comprising at least one computing device and at least one projector.
2. The method as claimed in claim 1, wherein the one or more images are visual indications of a simulated medical condition.
3. The method as claimed in claim 1 or 2, wherein projecting one or more images comprises projecting a plurality of images in sequence, wherein the sequence of images corresponds to visual indications of different stages or symptoms of a simulated medical condition.
4. The method as claimed in claim 3, further comprising receiving a response from a user after the projection of an image, and a next image in the sequence of images is based upon a previous image and the received response of the user.
5. The method as claimed in any one of claims 1 to 4, wherein the medical condition is a trauma condition.
6. The method as claimed in any one of claims 1 to 4, wherein the medical condition is a neurological condition.
7. The method as claimed in any one of claims 1 to 6, further comprising receiving a selection of an age, a body size, an ethnicity and a sex for a simulated patient, and projecting one or more images corresponding to the selected age, body size, ethnicity and sex onto the manikin.
8. The method as claimed in any one of claims 1 to 7, wherein the manikin is obtained by forming a 3D shape based upon a computer model.
9. The method as claimed in any one of claims 1 to 8, wherein the manikin is a standard manikin fitted with a de-featured face mask to hide existing manikin features and to provide a clean surface for projection of the one or more images by the one or more projectors.
10. The method as claimed in any one of claims 1 to 9, wherein the computational system further comprises one or more tracking systems for tracking at least the manikin, and the one or more images for projection are adjusted based upon a tracked movement of the manikin.
11. The method as claimed in any one of claims 1 to 10, wherein the one or more projectors are ceiling mounted projectors.
12. The method as claimed in any one of claims 1 to 11, wherein the manikin is located on a bed, and the one or more projectors are mounted in or on the bed.
13. The method as claimed in any one of claims 1 to 12, wherein the manikin is a shell constructed of semi-translucent material and the one or more images are projected onto the interior surface of the shell.
14. The method as claimed in any one of claims 1 to 13, wherein the manikin includes one or more speakers, and the method further comprises generating one or more aural outputs from the speakers in the manikin.
15. The method as claimed in the previous claim, wherein the method further comprises simulating one or more aural symptoms of the simulated medical condition using at least one of the one or more speakers, wherein the aural symptoms are coordinated with the one or more projected images.
16. The method as claimed in any one of claims 1 to 15, wherein the manikin includes a microphone and the method further comprises receiving aural input from a user.
17. The method as claimed in the previous claim, wherein the method further comprises providing the aural input to a simulation supervisor.
18. The method as claimed in any one of claims 1 to 17, wherein the manikin includes one or more embedded subsystems, and the method further comprises simulating one or more symptoms of the simulated medical condition using at least one embedded subsystem, wherein the symptoms are coordinated with the one or more projected images.
19. The method as claimed in any one of claims 1 to 18, further comprising obtaining a video capture of the simulation, and storing simulation data to allow for later review of the medical simulation.
20. The method as claimed in any one of claims 1 to 19, further comprising projecting treatment information to instruct a user on an appropriate procedure to treat the simulated medical condition.
21. The method as claimed in any one of claims 1 to 20, further comprising projecting one or more images representing an internal state of the simulated patient.
22. The method as claimed in any one of claims 1 to 21, wherein the computational system is a Spatial Augmented Reality (SAR) system, and the at least one projector is for projecting the one or more images into a SAR environment, the SAR environment comprising at least a portion of the manikin.
23. A medical training simulation system comprising:
at least one manikin; and
a computational system comprising:
at least one computing apparatus and at least one projector for projecting one or more images onto at least a portion of the manikin.
24. The system as claimed in the preceding claim, wherein the at least one manikin comprises a plurality of mannequins corresponding to a range of ages, body sizes, ethnicities and sexes.
25. The system as claimed in claim 23 or 24, wherein the at least one manikin is constructed using a rapid construction process.
26. The system as claimed in claim 23, 24 or 25, wherein the manikin includes one or more subsystems for simulating internal symptoms of a patient.
27. The system as claimed in the preceding claim, wherein the one or more subsystems includes one or more speakers for providing aural symptoms.
28. The system as claimed in any one of claims 23 to 27, wherein the computational system further comprises one or more tracking systems for tracking at least the manikin.
29. The system as claimed in any one of claims 23 to 28, wherein the manikin is a standard manikin fitted with a de-featured face mask to hide existing manikin features and to provide a clean surface for projection of the one or more images by the one or more projectors.
30. The system as claimed in any one of claims 23 to 29, wherein the computational system is a Spatial Augmented Reality (SAR) system, and the at least one projector is for projecting the one or more images into a SAR environment, the SAR environment comprising at least a portion of the manikin.
31. A computer readable medium comprising computer executable instructions for instructing a processor to perform any of method claims 1 to 22 to simulate a medical condition.
32. A computational apparatus comprising a processor and a memory, the processor configured to: provide one or more images stored in the memory to one or more projectors for projection onto at least a portion of a manikin to simulate a medical condition.
33. The computational apparatus as claimed in claim 32, wherein the one or more images are visual indications of a simulated medical condition.
34. The computational apparatus as claimed in claim 32 or 33, wherein the processor is configured to provide a plurality of images to the one or more projectors in a sequence, wherein the sequence of images corresponds to visual indications of different stages or symptoms of a simulated medical condition.
35. The computational apparatus as claimed in claim 34, wherein the processor is configured to receive a response from a user after the projection of an image, and a next image in the sequence is provided to the one or more projectors based upon a previous image and the received response of the user.
36. The computational apparatus as claimed in any one of claims 32 to 35, wherein the medical condition is a trauma condition.
37. The computational apparatus as claimed in any one of claims 32 to 35, wherein the medical condition is a neurological condition.
38. The computational apparatus as claimed in any one of claims 32 to 37, wherein the processor is configured to receive a selection of an age, body size, ethnicity and sex for a simulated patient, and the processor is configured to select an image for projection onto the manikin corresponding to the selected age, body size, ethnicity and sex.
39. The computational apparatus as claimed in any one of claims 32 to 37, further comprising one or more tracking systems for tracking at least the manikin, and the processor is configured to adjust one or more projection parameters based upon a tracked movement of the manikin.
40. The computational apparatus as claimed in any one of claims 32 to 39, wherein the processor is configured to send one or more aural outputs to one or more speakers in the manikin.
41. The computational apparatus as claimed in claim 40, wherein the processor is configured to simulate one or more aural symptoms of the simulated medical condition and provide the one or more aural symptoms to the one or more speakers, wherein the provision of aural symptoms is coordinated with the provision of the one or more projected images.
42. The computational apparatus as claimed in any one of claims 32 to 41, wherein the manikin includes a microphone and the computational apparatus is configured to receive aural input from a user.
43. The computational apparatus as claimed in the previous claim, wherein the processor is configured to provide the received aural input to a simulation supervisor.
44. The computational apparatus as claimed in any one of claims 32 to 43, wherein the manikin includes one or more embedded subsystems, and the processor is configured to simulate one or more symptoms of the simulated medical condition by controlling at least one embedded subsystem, wherein control of the at least one embedded subsystem is coordinated with the one or more projected images.
45. The computational apparatus as claimed in any one of claims 32 to 44, wherein the processor is configured to receive and store a video capture of the simulation in the memory, and to store simulation data to allow for later review of the medical simulation.
46. The computational apparatus as claimed in any one of claims 32 to 45, wherein the one or more images comprise treatment information to instruct a user on an appropriate procedure to treat the simulated medical condition.
47. The computational apparatus as claimed in any one of claims 32 to 46, wherein the one or more images comprise a representation of an internal state of the simulated patient.
48. The computational apparatus as claimed in any one of claims 32 to 47, wherein the computational apparatus is part of a Spatial Augmented Reality (SAR) system, and the at least one projector is for projecting the one or more images into a SAR environment, the SAR environment comprising at least a portion of the manikin.
49. A method for simulating a patient, the method comprising:
obtaining a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers; and
displaying one or more images on the at least one surface and providing one or more audio outputs to simulate a patient using a computational system comprising at least one computing device and at least one display device.
50. The method as claimed in claim 49, wherein the at least one display device comprises at least one OLED based display or flat panel display device, and the at least one surface for displaying a representation of a simulated patient feature or condition is the at least one OLED based display.
51. The method as claimed in claim 49 or 50, further comprising computationally generating or playing back a pre-recorded audio signal, and outputting the audio signal via the one or more speakers to simulate a patient response in the simulation.
52. The method as claimed in claim 49 or 50, further comprising receiving an audio signal from a microphone used by a simulation supervisor, and outputting the audio signal via the one or more speakers to simulate a patient response in the simulation.
53. The method as claimed in any one of claims 49 to 52, wherein the manikin comprises one or more touch sensitive surfaces for receiving user input, and the one or more images and one or more audio outputs are selected or generated based upon the received user input.
54. A simulation system comprising:
a manikin incorporating at least one surface for displaying a representation of a simulated patient feature or condition, and one or more speakers; and
a computational system comprising at least one processor and a memory, the at least one processor configured to display one or more images on the at least one surface and to provide one or more audio outputs to simulate a patient.
55. The simulation system as claimed in claim 54, wherein the at least one display device comprises at least one OLED based display or flat panel display device.
56. The simulation system as claimed in claim 54 or 55, wherein the computational system is configured to computationally generate or play back a pre-recorded audio signal and to output the audio signal via the one or more speakers to simulate a patient response in a simulation.
57. The simulation system as claimed in claim 54 or 55, wherein the computational system further comprises a microphone, and the computational system is configured to receive an audio signal from the microphone and to output the audio signal via the one or more speakers to simulate a patient response in a simulation.
58. The simulation system as claimed in any one of claims 54 to 57, wherein the manikin further comprises one or more touch sensitive surfaces for receiving user input, and the computational system is configured to select or generate one or more images and one or more audio outputs based upon the received user input.
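Claims 10 and 39 above describe adjusting the projected imagery based upon a tracked movement of the manikin. A minimal two-dimensional sketch of that re-anchoring step is shown below; a deployed SAR system would use a full 3D projector calibration, and the planar pose representation (x, y translation plus rotation) is an assumption made purely for illustration:

```python
import math

def track_to_projector(point, pose):
    """Map a point from the manikin's model space into projection space,
    given a tracked planar pose (x, y translation plus rotation in radians).
    This 2D rigid transform only illustrates re-anchoring projected imagery
    to a manikin that has been moved; it is not a full SAR calibration."""
    px, py = point
    tx, ty, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    # Rotate the model-space point, then translate by the tracked offset.
    return (c * px - s * py + tx, s * px + c * py + ty)
```

In use, every anchor point of a projected texture (e.g. the corners of a wound decal) would be passed through this transform each frame, so the imagery stays registered to the manikin as it is repositioned on the bed.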
PCT/AU2014/000865 2013-09-02 2014-09-02 A medical training simulation system and method WO2015027286A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2013903338 2013-09-02
AU2013903338A AU2013903338A0 (en) 2013-09-02 Spatial augmented reality (sar) medical training system and method

Publications (1)

Publication Number Publication Date
WO2015027286A1 true WO2015027286A1 (en) 2015-03-05

Family

ID=52585304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2014/000865 WO2015027286A1 (en) 2013-09-02 2014-09-02 A medical training simulation system and method

Country Status (1)

Country Link
WO (1) WO2015027286A1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017003693A1 (en) * 2015-06-30 2017-01-05 Thomson Licensing Method and apparatus using augmented reality with physical objects to change user states
US9721483B2 (en) 2013-08-22 2017-08-01 University Of Delaware Medical treatment simulation devices
EP3364322A1 (en) * 2017-02-15 2018-08-22 Fresenius Medical Care Deutschland GmbH Device and method for a simulation and evaluation system for medical treatment facilities
WO2018187748A1 (en) * 2017-04-07 2018-10-11 Unveil, LLC Systems and methods for mixed reality medical training
CN109074755A (en) * 2016-04-06 2018-12-21 皇家飞利浦有限公司 For making it possible to analyze the method, apparatus and system of the performance of vital sign detector
CN110178159A (en) * 2016-10-17 2019-08-27 沐择歌公司 Audio/video wearable computer system with integrated form projector
WO2019217247A1 (en) * 2018-05-05 2019-11-14 Mentice Inc. Simulation-based training and assessment systems and methods
US10540911B2 (en) 2013-08-22 2020-01-21 University Of Delaware Medical treatment simulation devices
WO2020018834A1 (en) * 2018-07-18 2020-01-23 Simulated Inanimate Models, LLC Surgical training apparatus, methods and systems
WO2020097122A1 (en) * 2018-11-05 2020-05-14 Children's Hospital Medical Center Computation model of learning networks
US10692401B2 (en) 2016-11-15 2020-06-23 The Board Of Regents Of The University Of Texas System Devices and methods for interactive augmented reality
CN112466479A (en) * 2020-12-07 2021-03-09 上海梅斯医药科技有限公司 Patient model creation method, system, device and medium based on virtual reality
US11955030B2 (en) 2021-03-26 2024-04-09 Avkin, Inc. Wearable wound treatment simulation devices
US12106678B2 (en) 2021-10-23 2024-10-01 Simulated Inanimate Models, LLC Procedure guidance and training apparatus, methods and systems
US12154456B2 (en) 2014-08-22 2024-11-26 University Of Delaware Medical treatment simulation devices
US12374236B2 (en) 2022-01-27 2025-07-29 Fresenius Medical Care Holdings, Inc. Dialysis training using dialysis treatment simulation system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100159434A1 (en) * 2007-10-11 2010-06-24 Samsun Lampotang Mixed Simulator and Uses Thereof
US20120038739A1 (en) * 2009-03-06 2012-02-16 Gregory Francis Welch Methods, systems, and computer readable media for shader-lamps based physical avatars of real and virtual people


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SAMOSKY, J. ET AL.: "BodyExplorerAR: enhancing a mannequin medical simulator with sensing and projective augmented reality for exploring dynamic anatomy and physiology", PROCEEDINGS TEI '12, PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON TANGIBLE, EMBEDDED AND EMBODIED INTERACTION, NEW YORK, NY, USA, pages 263 - 270 *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11562664B2 (en) 2013-08-22 2023-01-24 University Of Delaware Medical treatment simulation devices
US10540911B2 (en) 2013-08-22 2020-01-21 University Of Delaware Medical treatment simulation devices
US10373527B2 (en) 2013-08-22 2019-08-06 University Of Delaware Medical treatment simulation devices
US9721483B2 (en) 2013-08-22 2017-08-01 University Of Delaware Medical treatment simulation devices
US12154456B2 (en) 2014-08-22 2024-11-26 University Of Delaware Medical treatment simulation devices
WO2017003693A1 (en) * 2015-06-30 2017-01-05 Thomson Licensing Method and apparatus using augmented reality with physical objects to change user states
CN109074755B (en) * 2016-04-06 2024-06-14 皇家飞利浦有限公司 Methods, devices and systems for enabling analysis of performance of vital sign detectors
CN109074755A (en) * 2016-04-06 2018-12-21 皇家飞利浦有限公司 For making it possible to analyze the method, apparatus and system of the performance of vital sign detector
JP2019513432A (en) * 2016-04-06 2019-05-30 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Method, device and system for enabling to analyze characteristics of vital sign detector
US11810325B2 (en) 2016-04-06 2023-11-07 Koninklijke Philips N.V. Method, device and system for enabling to analyze a property of a vital sign detector
CN110178159A (en) * 2016-10-17 2019-08-27 沐择歌公司 Audio/video wearable computer system with integrated form projector
EP3526775A4 (en) * 2016-10-17 2021-01-06 Muzik Inc. Audio/video wearable computer system with integrated projector
US10692401B2 (en) 2016-11-15 2020-06-23 The Board Of Regents Of The University Of Texas System Devices and methods for interactive augmented reality
EP3364322A1 (en) * 2017-02-15 2018-08-22 Fresenius Medical Care Deutschland GmbH Device and method for a simulation and evaluation system for medical treatment facilities
US10438415B2 (en) 2017-04-07 2019-10-08 Unveil, LLC Systems and methods for mixed reality medical training
WO2018187748A1 (en) * 2017-04-07 2018-10-11 Unveil, LLC Systems and methods for mixed reality medical training
US12165536B2 (en) 2018-05-05 2024-12-10 Mentice, Inc. Simulation-based training and assessment systems and methods
WO2019217247A1 (en) * 2018-05-05 2019-11-14 Mentice Inc. Simulation-based training and assessment systems and methods
CN112822989A (en) * 2018-07-18 2021-05-18 西姆拉特无生命模型公司 Surgical training apparatus, method and system
WO2020018834A1 (en) * 2018-07-18 2020-01-23 Simulated Inanimate Models, LLC Surgical training apparatus, methods and systems
US10665134B2 (en) 2018-07-18 2020-05-26 Simulated Inanimate Models, LLC Surgical training apparatus, methods and systems
WO2020097122A1 (en) * 2018-11-05 2020-05-14 Children's Hospital Medical Center Computation model of learning networks
CN112466479A (en) * 2020-12-07 2021-03-09 上海梅斯医药科技有限公司 Patient model creation method, system, device and medium based on virtual reality
US11955030B2 (en) 2021-03-26 2024-04-09 Avkin, Inc. Wearable wound treatment simulation devices
US12300118B2 (en) 2021-03-26 2025-05-13 Avkin, Inc. Wearable wound treatment simulation devices
US12106678B2 (en) 2021-10-23 2024-10-01 Simulated Inanimate Models, LLC Procedure guidance and training apparatus, methods and systems
US12374236B2 (en) 2022-01-27 2025-07-29 Fresenius Medical Care Holdings, Inc. Dialysis training using dialysis treatment simulation system

Similar Documents

Publication Publication Date Title
WO2015027286A1 (en) A medical training simulation system and method
US20230094004A1 (en) Augmented reality system for teaching patient care
US20120270197A1 (en) Physiology simulation garment, systems and methods
CN107527542B (en) Percussion training system based on motion capture
US20140287395A1 (en) Method and system for medical skills training
CN112164135A (en) Virtual character image construction device and method
US20120288837A1 (en) Medical Simulation System
Girau et al. A mixed reality system for the simulation of emergency and first-aid scenarios
CN112150617A (en) Control device and method of three-dimensional character model
Preim et al. Virtual and augmented reality for educational anatomy
Oyama Virtual reality for the palliative care of cancer
Pereira Santos et al. Embodied Agents for Obstetric Simulation Training
KR100445846B1 (en) A Public Speaking Simulator for treating anthropophobia
Zheng et al. Using computer animation for emergency medicine education
WO2020047761A1 (en) Medical simulator, and medical training system and method
US20240221518A1 (en) System and method for virtual online medical team training and assessment
Hon Medical reality and virtual reality
Takacs Cognitive, Mental and Physical Rehabilitation Using a Configurable Virtual Reality System.
RU2799123C1 (en) Method of learning using interaction with physical objects in virtual reality
Canadelli " Scientific Peep Show" The Human Body in Contemporary Science Museums
Aydin An examination of the use of virtual reality in neonatal resuscitation learning and continuing education
RU2798405C1 (en) Simulation complex for abdominal cavity examination using vr simulation based on integrated tactile tracking technology
Scerbo et al. Medical simulation
Tripathi A Study on the Field of XR Simulation Creation, Leveraging Game Engines to Develop a VR Hospital Framework
Wysieński et al. PROSPECTS FOR THE USE OF THREE-DIMENSIONAL VIRTUAL REALITY (VR 3D) SUPPORTED BY ARTIFICIAL INTELLIGENCE (AI) ALGORITHMS IN MEDICAL EDUCATION

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14839152

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14839152

Country of ref document: EP

Kind code of ref document: A1