US20200138337A1 - Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging - Google Patents
- Publication number: US20200138337A1 (application US 16/676,366)
- Authority: US (United States)
- Prior art keywords: medium, thin, exhale, thermal, patient
- Legal status: Abandoned (status is an assumption by Google Patents, not a legal conclusion)
Classifications
- A61B5/0878 — Measuring breath flow using temperature sensing means
- A61B5/0033 — Features or image-related aspects of imaging apparatus; arrangements of imaging apparatus in a room
- A61B5/0077 — Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/0816 — Measuring devices for examining respiratory frequency
- A61B5/082 — Evaluation by breath analysis, e.g. determination of the chemical composition of exhaled breath
- A61B5/0873 — Measuring breath flow using optical means
- A61B5/091 — Measuring volume of inspired or expired gases, e.g. to determine lung capacity
- A61B5/6803 — Head-worn items, e.g. helmets, masks, headphones or goggles
- G06T17/10 — Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
- G06T5/10 — Image enhancement or restoration using non-spatial domain filtering
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T7/0012 — Biomedical image inspection
- G06T7/11 — Region-based segmentation
- G16H30/40 — ICT specially adapted for processing medical images, e.g. editing
- G16H50/50 — ICT specially adapted for simulation or modelling of medical disorders
- A61B2560/0431 — Portable apparatus, e.g. comprising a handle or case
- A61B2562/0271 — Thermal or temperature sensors
- A61B2576/00 — Medical imaging apparatus involving image processing or analysis
- A61B5/01 — Measuring temperature of body parts; diagnostic temperature sensing
- G06T2207/10024 — Color image
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/10048 — Infrared image
- G06T2207/20221 — Image fusion; image merging
Description
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity and identifying various pulmonary conditions. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analyzing system through thermal on projection medium imaging.
- Respiration monitoring techniques that are both accurate and comfortable for the patient are sorely lacking from the medical field.
- The most accurate methods of respiration monitoring may require placing ECG electrodes on the patient's body, putting thermistors in the patient's nose, having the patient wear an abdominal strain-gauge transducer, having the patient breathe through a tube while wearing a nose clip, or some combination of these techniques that monitor multiple biophysiological parameters concurrently.
- These traditional solutions all involve placing sensors directly on the patient's body. These direct measurements have a high rate of accuracy, but cause discomfort and alter the natural breathing of the patient.
- Pulmonologists employ a variety of respiratory monitoring tools to accurately assess the health of a patient, such as spirometers, plethysmographs, and polysomnography.
- These techniques require placing sensors directly on the body of the subject or having the subject breathe through a tube-based device. Though these methods boast high accuracy and utility, they often inflict physical and psychological discomfort and interfere with a patient's natural breathing behaviors, limiting their application for long-term monitoring. Additionally, cumbersome equipment and labor-intensive setups make these techniques difficult to utilize in out-patient clinics.
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analyzing through thermal imaging of thin-medium surfaces.
- Thin-medium thermal imaging refers to the visual collection of exhaled heat signatures obtained on a thin-medium imaged by a thermal camera.
- Various embodiments provide for systems and methods based on this premise for the extraction of clinically meaningful metrics for respiratory analysis. This includes, but is not limited to, the reconstruction of exhale flows to obtain volumetric estimates of exhaled breath, the identification of the separation between nose (each nostril) and mouth exhale flows to measure distribution, and both the velocity and strength of exhale flows.
- In some embodiments, the thermal camera is an infrared (IR) thermal camera.
- The IR thermal camera can be combined with a laser transmitter (Tx) and receiver (Rx), which can be used to measure the distance between the thermal camera and the thin-medium (e.g., formed as a planar or a curved surface).
- The IR thermal camera can be configured to image a subset interval of the electromagnetic spectrum within the infrared wavelengths and generate grayscale intensity values within a dense pixel array that represent the thermal intensity distribution of exhale behaviors within the thin-medium.
- The system can define a configuration where exhale characteristics and behaviors are identified through the visualization of heated exhale interacting with the thin medium (e.g., plastics, polymers, metals, fibers, or other synthetic materials), which is imaged from the opposing side or from surrounding directions of the thin-medium.
- A chemical coating may be applied to the thin-medium to enhance the thermal residual and dissipative characteristics of the medium, modifying the thermal signature imposed on the material by external heat sources.
- The chemical coating may alter the material's ability to maintain heat residuals or alter its dissipative characteristics.
- The thin medium can be selected to maintain a thermal radiance above ambient temperature for a given period of time.
- The IR thermal camera, distance laser, and thin-medium can be combined into a mobile or wearable device.
- The camera can be placed to allow imaging of a large portion (e.g., 60% or more) of the thin-medium for recording the thermal changes in the thin-medium.
- The thermal camera, laser distance system, and thin-medium may be mounted within an adjustable mount that provides a targetable vision system that tracks thermal changes in the thin-medium.
- Various embodiments may also include processing units, processors, or computers to execute image processing and respiratory metric algorithms.
- The computing resources can be integrated within the device or may be external to the device and communicably coupled via a wireless link.
- A system can include a thermal camera that images projections of an exhale cross-section on a projection-medium; a communication interface coupled to the thermal camera to receive a sequence of thermal images depicting the thermal distribution of the exhale on the thin-medium; a processor; and a memory storing instructions that, when executed by the processor, cause the processor to generate a representation or model of the exhale of the subject.
- Various metrics of the respiratory behavior, two- and three-dimensional representations of the exhale cross-sections, 3D reconstruction, a geometrically consistent estimation of the approximate flow volume via a reconstruction on a per-frame basis to provide an estimate of the subject's tidal volume, and/or the like can also be generated.
- A method of analysis for monitoring and extracting metrics related to pulmonary function can include taking thermal images of a thin-medium while the user breathes onto the surface of the medium.
- A temporary representation of the exhale signal on the medium that represents the exhale behaviors of the user can be created.
- Iso-lines can be created to form an iso-line map of the gradients and exhale boundary captured over time, forming a three-dimensional representation of the exhale.
- Respiratory information can be derived by applying image processing techniques to the thermal gradient images.
- A correlation between the intensity values and exhale behavior over time and objective metrics may also be derived through machine learning.
- Embodiments of the present technology also include computer-readable storage media containing sets of instructions to cause one or more processors to perform the methods, variations of the methods, and other operations described herein.
- The system can generate metrics that describe pulmonological function, including reconstructions of exhale flow behavior, by (1) imaging the thin-medium using the thermal camera to produce a video stream comprising a sequence of images, (2) extracting 2D cross-sectional regions of the exhale behavior within the medium for each image, and (3) generating a model based on these cross-sections over time.
- This model can then be used to estimate various pulmonological traits, including 3D reconstructions of the exhale volume used to obtain tidal volume estimates, the separation between the nose, mouth, and nostrils, and exhale strength over time.
- The image sequence produced by the thermal camera can be used to generate a sequence of contours that are extracted from the 2D intensity images.
- Each contour can be identified temporally by a time-stamp that can be used to define these 2D contours over time.
- The sequence of contours is then projected over time to define a 3D volume.
- The 3D volumetric model can then be used to provide estimates of exhale behaviors and tidal volume.
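The contour-to-volume projection described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the intensity threshold, pixel scale, and assumed flow speed are placeholder values, and each thresholded cross-section is simply extruded along the flow axis for one frame interval.

```python
import numpy as np

def estimate_exhale_volume(frames, temp_threshold=0.5, px_per_cm=10.0,
                           flow_speed_cm_s=60.0, fps=30.0):
    """Estimate exhaled volume by extruding per-frame exhale cross-sections.

    frames: sequence of 2D arrays of normalized thermal intensity (0..1).
    Pixels above `temp_threshold` are treated as part of the exhale contour.
    Each cross-section is extruded along the (assumed) flow axis by the
    distance the exhale travels in one frame interval.
    """
    dt = 1.0 / fps
    slab_depth_cm = flow_speed_cm_s * dt          # extrusion depth per frame
    volume_cm3 = 0.0
    for frame in frames:
        mask = frame > temp_threshold             # 2D exhale cross-section
        area_cm2 = mask.sum() / (px_per_cm ** 2)  # pixel count -> cm^2
        volume_cm3 += area_cm2 * slab_depth_cm
    return volume_cm3

# Synthetic example: a disc-shaped warm region that grows then fades.
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
frames = []
for r in (5, 10, 15, 10, 5):
    frames.append(((xx - 32) ** 2 + (yy - 32) ** 2 < r ** 2).astype(float))
vol = estimate_exhale_volume(frames)
```

In practice the extrusion depth would come from a measured flow velocity rather than a fixed constant, and the resulting volume would still need calibration against a reference device.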
- Exhale can be separated into a distribution between each nostril and the mouth. This distribution is represented as a percentage (%) relationship.
- The behaviors of the contours and the thermal distribution within the thin-medium are used to identify a relational proportion between each nostril and the mouth.
- The duration and thermal distribution of an exhale can be analyzed to extract the strength of the exhale. Strength may be defined as a relationship between the velocity, force, and duration of the exhale.
- Some embodiments may include an additional thermal camera to capture a side profile of the exhale.
- The additional thermal camera may include a wavelength filter tuned to a spectral subset of the infrared spectrum, used to visualize CO2 from exhaled airflows in a side profile. This serves as a ground-truth device with which to validate metrics collected from the medium.
- The thermal CO2 camera may be positioned perpendicular to the linear setup of the thermal camera and thin-medium. This provides a method for visualizing the exhale as it exits the nose/mouth and makes contact with the thin-medium, imparting thermal energy into the medium. The flow delay and nose/mouth distribution ground-truth values can then be correlated with the behaviors identified on the thin-medium.
- Measurements obtained from the side view using the thermal CO2 camera are used to improve the accuracy of the apparent behavior identified in the thin medium using the thermal camera. In some embodiments, this correlation is established by training a deep neural network on the relationship between the exhale metrics obtained with the thermal CO2 camera and the behaviors visualized on the thin-medium.
- FIG. 1 illustrates an example of a system that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology
- FIG. 2 illustrates an example of a sequence of a thermal distribution on the thin medium due to the exhale that may be generated in accordance with one or more embodiments of the present technology
- FIGS. 3A-3B illustrate examples of training and monitoring setups using the thin medium thermal imaging technique that may be used in some embodiments of the present technology
- FIG. 4 shows plots of the cumulative volume over time and the volume per exhale for a single patient that may be present in some embodiments of the present technology
- FIG. 5 is a flowchart illustrating an example of a set of operations for computing tidal volume in accordance with various embodiments of the present technology
- FIGS. 6A-6B provide illustrations of two possible stationary setups for the thin medium in a clinical setting in accordance with various embodiments of the present technology
- FIGS. 7A-7B illustrate examples of semi-contact mask-based thin-medium setups that may be used in one or more embodiments of the present technology
- FIGS. 8A-8B illustrate components of a stimulation-based respiratory analysis system that may be used in some embodiments of the present technology
- FIG. 9 illustrates a VR thin-medium respiratory analysis setup that may be used in one or more embodiments of the present technology
- FIG. 10 illustrates an example of a hybrid respiratory monitoring system that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology
- FIG. 11 illustrates an example of various components of the hybrid respiratory monitoring system that may be used in one or more embodiments of the present technology
- FIG. 12 illustrates an example of a thermal exhale sequence showing images from the medium-view and the side-view that can be collected by some embodiments of the present technology
- FIG. 13 is a confusion matrix for one of the four evaluations of the convolutional neural network implemented in some embodiments of the present technology
- FIG. 14 is a flowchart illustrating an example of a set of operations for computing a tidal volume estimate by training a Long-Short-Term Memory (LSTM) network to associate flow data from a spirometer with information extracted from the medium images in accordance with various embodiments of the present technology;
- FIG. 15 is a flowchart illustrating an example of a set of operations for identifying a person's breathing mode using a convolutional neural network in accordance with one or more embodiments of the present technology
- FIG. 16 illustrates techniques for computing various breathing metrics in accordance with some embodiments of the present technology
- FIGS. 17A-17B illustrate block diagrams of integrated mobile configurations and sensor only configurations of the breathing analysis system
- FIG. 18 is an illustration of a top view of breathing analysis system where the cameras are imaging the thin-medium from the same-side as the patient.
- FIG. 19 is a block diagram illustrating an example machine representing the computer systemization of the breathing analysis system.
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analyzing through Thin-Medium Thermal Imaging (TMTI).
- Breathing behaviors are captured as they contact a thin planar surface that is then imaged using a thermal camera.
- The heat distribution imposed on the thin-medium is then used to extract respiratory metrics.
- Conventional breath monitoring systems need to place devices on patients, which causes discomfort and may not be applicable for long-term monitoring.
- The primary evaluation criteria within respiratory analysis revolve around the collection of a limited set of quantitative metrics such as breathing rate, flow analysis, and tidal volume estimates.
- Some embodiments provide systems and methods of respiratory analysis that are non-contact but still measure the exhaled air of a human subject directly, through a medium-based exhale visualization technique.
- A thin medium can be placed perpendicular to the exhaled airflow of an individual, and a thermal camera can be used to record the heat signature from the exhaled breath on the opposite side of the material. Breathing rate and respiratory behaviors can be extracted from the thermal data in real-time.
- Some embodiments of the respiration monitoring technique accurately report breathing rate and provide other information not obtainable through other non-contact methods.
- Various embodiments can be implemented as a small low-cost device for ease of use in a clinical environment or within an at-home deployment.
- The thin (or projection) medium material can have thermally conductive properties that reflect the temperature changes from the exhale but also allow for rapid dissipation of the heat between breaths.
- The material may also be as thin as possible to promote rapid dissipation within the material, and may have a high emissivity so that changes in temperature can be seen with the thermal camera.
- When selecting the material, one or more of the following considerations may be made: (1) the thermal conductivity of the material, (2) its ability to allow rapid thermal dissipation, (3) whether the material is thermally opaque, (4) its ability to retain heat signatures long enough to capture (e.g., at 5 Hz or higher), and (5) its ability not to introduce material composition patterns into the resulting images.
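Criterion (4) can be checked with a simple lumped-capacitance cooling model. The temperatures and time constant below are illustrative assumptions, not measured material properties:

```python
import math

def signature_dwell_time(t0_c, ambient_c, tau_s, resolvable_delta_c=0.1):
    """Time for a lumped thin sheet to cool to within `resolvable_delta_c`
    of ambient, under T(t) = T_amb + (T0 - T_amb) * exp(-t / tau)."""
    return tau_s * math.log((t0_c - ambient_c) / resolvable_delta_c)

# Illustrative values: an exhale warms the sheet ~4 C above a 22 C room,
# and the sheet's lumped cooling time constant is 0.3 s.
dwell = signature_dwell_time(t0_c=26.0, ambient_c=22.0, tau_s=0.3)
frame_period = 1.0 / 5.0  # criterion (4): capture at 5 Hz or faster

# Under these assumptions the signature persists across several frames
# yet fades well within a typical 3-5 s breath cycle.
```

A real material choice would of course be validated empirically; the point of the sketch is only that the dwell time must sit between the camera's frame period and the breath period.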
- Some embodiments use specialty thermal and CO2 cameras to capture images and videos of human exhales and to extract clinically valuable information in a non-invasive way.
- Various embodiments allow for non-contact analysis of various breathing activities, including breathing rate, speed, strength, tidal volume, nose/mouth distribution, CO2 concentration of the exhale, lung efficiency, and obstructive breathing, which can be used in the diagnosis of various breathing/pulmonary diseases.
- A patient can breathe onto a thin medium while a thermal camera records images of the opposite side of the medium.
- The thin medium can accurately capture the heat signature of the breath, retaining the temperature gradient long enough to be recorded by a thermal camera but dissipating the heat quickly between breaths.
- These images can then be processed using signal processing or machine learning to convert the thermal signatures on the medium into clinically important metrics such as respiratory rate and volume, providing clinicians with additional breathing behavior information for a comprehensive view of a patient's respiratory function.
- Some embodiments use small, low-cost equipment and work for a variety of patient populations.
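One plausible signal-processing path from the thermal video to a breathing-rate metric is sketched below; the frame-averaged intensity signal, sampling rate, and peak-spacing threshold are illustrative assumptions rather than details from the patent:

```python
import numpy as np

def breathing_rate_bpm(mean_intensity, fps, min_gap_s=1.0):
    """Count exhale peaks in the frame-averaged thermal intensity signal.

    A sample counts as a peak if it exceeds the signal's midline and is a
    local maximum; peaks closer together than `min_gap_s` are merged.
    """
    x = np.asarray(mean_intensity, dtype=float)
    midline = 0.5 * (x.max() + x.min())
    peaks = []
    for i in range(1, len(x) - 1):
        if x[i] > midline and x[i] >= x[i - 1] and x[i] > x[i + 1]:
            if not peaks or (i - peaks[-1]) / fps >= min_gap_s:
                peaks.append(i)
    duration_min = len(x) / fps / 60.0
    return len(peaks) / duration_min

# Synthetic 30 s recording at 10 fps: 0.25 Hz breathing (~15 breaths/min).
fps = 10.0
t = np.arange(0, 30, 1 / fps)
signal = 0.5 + 0.4 * np.sin(2 * np.pi * 0.25 * t)
rate = breathing_rate_bpm(signal, fps)
```

A production system would likely band-pass filter the signal first; this sketch only shows how respiratory rate can fall out of the thermal sequence with very little computation.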
- Some embodiments can use a variety of techniques to generate respiratory metrics and behavior information. For example, some embodiments may use estimated values and self-reported information from the subject to determine unseen activity behind the medium. Metrics such as the distance between the person's face and the medium, the delay between the start of a breath and when the breath hits the medium, and the person's breathing mode (e.g., whether the person is breathing through their nose, mouth or both) may be unknown and/or subject to fluctuation throughout measurement. Data taken by having a person breathe through an intermediary respiratory measurement device (such as a spirometer) onto the medium can provide accurate timing and exhale information, but the resulting heat signatures may not represent natural breathing. A ground-truth device that does not interfere with natural breathing that also removes uncertainty about activities occurring behind the medium is required to improve some embodiments of the present technology.
- Various embodiments of the present technology employ a reinforced hybrid breathing model that uses an additional thermal camera with a spectral filter in the 3–5 μm range.
- This camera acts as a CO2 particle sensor, visualizing turbulent airflows as they exit a person's mouth or nose and collide with the medium.
- Human and airflow behaviors can be identified that contribute to the thermal signatures on the medium.
- These synchronized image sets can be used in some embodiments to train a Convolutional Neural Network (CNN) to identify breathing activities from medium images from an inexpensive device.
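As an illustration only, the forward pass of such a CNN (convolution, ReLU, global pooling, dense layer, softmax) can be written in a few lines of NumPy; the kernel count, class count, and random weights below are placeholders, not the trained network the patent describes:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernels):
    """Naive 'valid' cross-correlation of a (H, W) image with (K, kh, kw) kernels."""
    k, kh, kw = kernels.shape
    h, w = img.shape
    out = np.empty((k, h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            patch = img[i:i + kh, j:j + kw]
            out[:, i, j] = (kernels * patch).sum(axis=(1, 2))
    return out

def cnn_forward(img, kernels, weights, biases):
    """Conv -> ReLU -> global average pool -> dense -> softmax."""
    feat = np.maximum(conv2d_valid(img, kernels), 0.0)  # conv + ReLU
    pooled = feat.mean(axis=(1, 2))                     # global average pool
    logits = weights @ pooled + biases                  # dense layer
    e = np.exp(logits - logits.max())
    return e / e.sum()                                  # class probabilities

# Untrained toy network: 4 random 3x3 kernels mapping a 16x16 medium image
# to 3 hypothetical breathing-mode classes (nasal, oral, oronasal).
kernels = rng.normal(size=(4, 3, 3))
weights = rng.normal(size=(3, 4))
biases = np.zeros(3)
probs = cnn_forward(rng.random((16, 16)), kernels, weights, biases)
```

An actual deployment would use a deep-learning framework and learned weights; the sketch only makes concrete what "identify breathing activities from medium images" means as a classification problem.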
- The inventions introduced here can be embodied as special-purpose hardware (e.g., circuitry), as programmable circuitry appropriately programmed with software and/or firmware, or as a combination of special-purpose and programmable circuitry.
- Embodiments may include a machine-readable medium having stored thereon instructions that may be used to program a computer (or other electronic device) to perform a process.
- The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, ROMs, random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing electronic instructions.
- FIG. 1 illustrates an example of a system 100 that may be used for respiratory monitoring through Thin-Medium Thermal Imaging in accordance with various embodiments of the present technology.
- Various embodiments of the system can include an exhale medium 110, computing system 120, and a thermal camera 130. Respiratory behaviors can be captured by visualizing an exhale flow over time. To provide this visualization, some embodiments allow the patient to breathe onto the thin medium surface 110 and record the resulting heat distribution using thermal camera 130.
- The exhale medium 110 may be composed of any material that defines a uniformly smooth surface that can capture a heat signature.
- The exhale medium 110 can be composed of any natural or composite material. Typically, it is desirable that the material be highly emissive, very thin, and thermally opaque, and have thermal properties that retain heat long enough for the camera to capture the image but allow dissipation of the heat between breaths. Examples include, but are not limited to, thermochromic liquid crystal films, a piece of paper, plastics, polymers, metals, fibers, or other synthetic materials.
- Thermal camera 130 may be pointed at the opposite side of medium 110 relative to the patient.
- The exhale from the patient imparts a thermal signature on the exhale (or thin) medium 110 that can then be visualized using thermal camera 130.
- Some embodiments can track this change over time as a cross-sectional sequence of the exhale flow.
- A computer system can then use the sequence to form a 3D volume over time. This volume is then calibrated to the tidal volume read from a spirometer, or directly measured using the cross-sectional area over time to form a measured enclosed volume.
- The respiratory behaviors can include, but are not limited to, breathing rate, tidal volume, and nostril/mouth distribution.
- Some embodiments provide an effective method for estimating the tidal volume using only a small thermal camera and the Thin-Medium Thermal Imaging technique. This technique processes the sequence of thermal images, extracts the thermal distribution from them, and produces a cross-sectional area estimation over time to generate a volume. Then, some embodiments correlate this volume with an established ground truth (gold standard) method of measuring tidal volume with a spirometer. This correlation is then used to train a neural network to provide an estimate of the patient's tidal volume directly from the thermal image sequence.
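As a simplified stand-in for the neural-network correlation described above, a linear least-squares calibration against spirometer readings illustrates the idea; all numeric pairs below are synthetic:

```python
import numpy as np

def fit_volume_calibration(image_volumes, spirometer_volumes):
    """Least-squares fit v_spiro ~ a * v_image + b, a linear stand-in for
    the learned correlation between image-derived and measured volumes."""
    a, b = np.polyfit(image_volumes, spirometer_volumes, deg=1)
    return a, b

def estimate_tidal_volume(image_volume, a, b):
    """Map an image-derived volume estimate to a calibrated tidal volume."""
    return a * image_volume + b

# Synthetic training pairs: image-derived volumes (arbitrary units)
# against spirometer tidal volumes in mL, roughly linear with noise.
img_vol = np.array([10.0, 14.0, 18.0, 22.0, 26.0])
spiro_ml = np.array([390.0, 470.0, 555.0, 640.0, 710.0])
a, b = fit_volume_calibration(img_vol, spiro_ml)
pred = estimate_tidal_volume(20.0, a, b)
```

The patent's approach trains a neural network per patient; the linear fit merely makes the calibration step concrete and would be replaced by the learned model in practice.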
- An example of an image sequence is shown in FIG. 2, which illustrates a possible thermal distribution on the medium during a single breath.
- These images represent cross-sectional snapshots of the exhale pattern as identified by the heat distribution on the thin film or projection medium.
- Various embodiments take advantage of the temperature difference between human breath and the surrounding environment by projecting an individual's exhale onto a projection medium that is used to visualize the thermal distribution of the exhale.
- The resulting heat signature is preserved on the medium only for a short period of time, but it remains long enough for a conventional thermal sensor to capture the information.
- Various embodiments allow the monitoring system to capture thermal exhale behaviors before the exhaled air dissipates, and then analyze the respiratory behaviors based on the resulting thermal distribution.
- the exhale film can be placed close to the patient's face so that the film can be sufficiently influenced by exhale force.
- the thermal signature on the medium can be used to determine whether a patient is breathing nasally, orally, or oronasally. Each of these modes of respiration shows a unique thermal signature pattern on the medium that can be identified by its size, shape, location, and flow direction.
- the thermal signature can be segmented into separate exhale sources by filtering out the regional maxima, thresholding the image, and then using pixel clustering. After identifying individual exhale sources from the thermal signature, the contribution from the mouth and each nostril to the surface area of the heat signature can be determined by the pixel sum per thermal region. From this information, various embodiments can calculate the distribution ratio between nose and mouth or between each nostril.
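- The segmentation described above can be sketched with NumPy and SciPy. This is a minimal illustration, not the patent's implementation: the `exhale_source_ratios` helper and its default threshold are assumptions, and real frames would come from the thermal camera rather than the synthetic array used here.

```python
import numpy as np
from scipy import ndimage

def exhale_source_ratios(frame, threshold=None):
    """Segment a thermal frame into exhale sources and return per-region pixel sums.

    frame: 2D array of temperatures (or pixel intensities) on the medium.
    threshold: values above this count as exhale-heated; defaults to midway
    between the frame minimum and maximum.
    """
    if threshold is None:
        threshold = frame.min() + 0.5 * (frame.max() - frame.min())
    mask = frame > threshold                     # threshold the image
    labels, n = ndimage.label(mask)              # cluster pixels into regions
    # Pixel sum (region area) per thermal region, largest first.
    return sorted((ndimage.sum(mask, labels, i) for i in range(1, n + 1)),
                  reverse=True)

# Two synthetic "nostril" hot spots on a cool medium.
frame = np.zeros((40, 40))
frame[10:16, 8:14] = 30.0    # left nostril region (36 pixels)
frame[10:16, 26:30] = 30.0   # right nostril region (24 pixels)
sums = exhale_source_ratios(frame)
ratio = sums[0] / sums[1]    # nostril-to-nostril distribution ratio
```

The same ratio computation applies between the mouth region and the combined nostril regions for the nose-mouth distribution.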
- the strength of an individual exhale can be estimated by the rate of expansion of the thermal signature on the medium. Heat from a stronger exhale spreads farther after hitting the medium surface than heat from a less forceful exhale. To estimate exhale strength, some embodiments can use optical flow to estimate surface heat flow across the medium.
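- A simple proxy for this rate of expansion is the growth of the thresholded heated area across consecutive frames. The sketch below is an assumption for illustration (the `expansion_rate` helper, the threshold, and the 9 fps frame rate are not from the source); a fuller implementation could use the dense optical flow mentioned above.

```python
import numpy as np

def expansion_rate(frames, threshold, fps=9.0):
    """Estimate exhale strength as the growth rate of the heated area.

    frames: sequence of 2D thermal frames of the medium.
    threshold: temperature above which a pixel counts as exhale-heated.
    fps: camera frame rate; converts per-frame growth to per-second growth.
    Returns the mean increase in heated pixels per second.
    """
    areas = np.array([(f > threshold).sum() for f in frames], dtype=float)
    growth = np.diff(areas) * fps          # heated pixels gained per second
    return growth.mean()

# Synthetic breath: a heated disc expanding frame by frame.
yy, xx = np.mgrid[:60, :60]
r2 = (yy - 30) ** 2 + (xx - 30) ** 2
frames = [np.where(r2 < (4 + 3 * t) ** 2, 35.0, 22.0) for t in range(5)]
rate = expansion_rate(frames, threshold=30.0, fps=9.0)
```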
- FIGS. 3A-3B illustrate examples of training setup 300 and monitoring setup 350 using the thin medium thermal imaging technique that may be used in some embodiments of the present technology. Some embodiments use this two-stage process illustrated in FIGS. 3A-3B to establish the correlation between the thermal signature and the actual tidal volume provided by a spirometer.
- in the first training stage 300 , some embodiments have the patient 310 breathe through the spirometer 320 onto the medium 330 to train the network that will be used for this patient later in the monitoring stage.
- the computing system can generate sequences of thermal images 340 which can be correlated to the data collected by spirometer 320 .
- the monitoring stage 350 can be defined by having the patient 360 breathe onto the thin medium 370 without the spirometer. This will result in a natural breathing pattern within the thin medium from which some embodiments can then extract metrics.
- Some embodiments of the monitoring system can describe the extracted volume in different ways. For example, in some embodiments, a cumulative volume over time can be computed that represents how much exhaled air the patient breathes out during normal breathing. The result of this metric can then be used to form a volume-per-exhale estimate that represents the change from the last valley to the next peak in volume. As another example, some embodiments can measure the flow, or volume over time, described in liters per second. The results of this are shown in the plots 400 within FIG. 4 .
- FIG. 5 illustrates an example of a set of operations 500 for computing tidal volume in accordance with various embodiments of the present technology.
- a human subject breathes onto the thin medium.
- Thermal signature images are taken from the thin medium at every frame during capture operation 520 .
- thermal images of the medium can show a circular gradient pattern.
- the gradient values of these images are calculated and stored in calculation operation 530 .
- a Long Short-Term Memory (LSTM) network can be trained, in training operation 540 , to calculate volume measurements.
- the LSTM can take average thermal values from the medium and their associated ground-truth volume measurement from a spirometer as training input. Once the LSTM is trained with many samples of data, prediction operation 550 can use the LSTM to predict the volume measurement from a new series of thermal images.
- a breathing strength mode can identify the strength of an individual exhale.
- the system can estimate the breathing strength by the rate of expansion of the thermal signature on the medium. Heat from a stronger exhale spreads farther after hitting the medium surface than heat from a less forceful exhale.
- some embodiments use optical flow to estimate surface heat flow across the medium.
- Some embodiments aim to deliver higher-level respiratory metrics than traditional monitoring solutions. For example, some embodiments can measure or estimate nose/mouth distribution and tidal volume, neither of which is obtainable using existing methods. Therefore, the ability to extract this information from the medium-based method is useful.
- Some embodiments introduce the notion of controlled respiratory analysis that incorporates two factors: (1) patient distraction and (2) active stimulation.
- The first factor is patient distraction: if a patient is aware that they are being monitored, their breathing may not be natural or normal. Therefore, to prevent this observation bias, some embodiments introduce methods of distracting the patient and taking their mind off of the fact that they are being observed. This will result in much more natural recordings, even if they are still wearing the device.
- the second factor is an extension of this concept to active stimulation.
- a multimedia scene with specific content is chosen and presented to a patient to promote a particular reaction or emotion within the patient (e.g., happiness, relaxation, fear, etc.) so that different breathing patterns under active stimulation can be observed.
- This provides a differential factor for the types of natural breathing that can be recorded and analyzed based on the observed environment.
- Some embodiments provide for both a traditional view (e.g., media, movie, animation, image sequence, etc.) and also within a Virtual Reality (VR) environment. This provides an immersive and controlled environment in which some embodiments can invoke natural changes within the patient's breathing which may contribute to identifying the exaggerated effects of some pulmonary conditions.
- FIGS. 6A-6B provide illustrations of two possible stationary setups 600 and 650 for the thin medium in a clinical setting in accordance with various embodiments of the present technology.
- the system may include three primary components: (1) the thin medium 610 , (2) a chinrest 620 to ensure the patient is a constant distance away from the thin medium, and (3) the thermal camera 630 used to image the exhale thermal distributions on the medium.
- the chin rest 620 and thermal camera 630 can be separated by approximately the same distance on both sides of the thin medium.
- the distance of the thin medium 610 from the subject can directly affect the performance of the system and the exhale characteristics of the subject that are identified. For example, if the medium is too far away from the subject's face, the exhale may have mostly dissipated before making contact with the medium. This situation could be overcome by a strong exhale, but is exacerbated by light exhales. Conversely, if the medium is too close to the subject's face and the subject forcefully exhales, the airflow will hit the surface of the medium and spread in multiple directions. As such, various embodiments use a distance of one to eight inches between the subject's face and the medium. This range typically ensures a strong signal while maintaining enough distance to minimize the impact of turbulent flows and remain comfortable for the patient. Other embodiments may position the medium further from or closer to the patient.
- a flat medium 610 can be used to easily visualize the exhale from a reasonable distance established using a chin rest 620 .
- the design illustrated in FIG. 6B changes the shape of the medium 610 to ensure that small movements will not result in a loss of data while imaging the thin medium 610 .
- the chin rests 620 may be provided within each experimental setup as a reference for the patient to limit the effect of movement on the resulting analysis.
- the method can be transformed into a semi-contact solution that has the patient wear the thin-medium as a mask, separated from the face in some embodiments.
- the primary difference between this solution and a traditional mask is the separation distance between the patient's face and the thin-medium. This provides natural airflow during inhale and provides a constant distance at which the surface will accurately represent the patient's exhale.
- FIGS. 7A-7B illustrate examples of semi-contact mask-based thin-medium setups 700 and 750 that may be used in one or more embodiments of the present technology.
- the mask 710 may touch the patient; however, there is no contact in the nose and mouth region. This allows the patient to breathe normally and still permits some movement ( FIG. 7A ) while camera 720 is located or fixed away from the mask.
- the mask may include sliding rails or another mechanism that allows the distance of the mask 710 from the user's face to be adjusted. As the subject breathes onto the mask 710 , thermal images of the exhale can be captured by camera 720 .
- some embodiments can provide complete movement with the attached setup that also includes camera 720 physically coupled to mask 710 .
- the thin-medium and camera should be lightweight for this design to be feasible.
- a lightweight thermal camera is held in front of the medium at a fixed distance. This design allows the camera and the medium to move with the patient's head orientation while continuously monitoring respiration behavior. Therefore, the patient can move and naturally look around during the monitoring session. Since this design requires a fixed attachment to the patient, a very lightweight mobile thermal camera is used for the imaging in some embodiments.
- some embodiments try to minimize the distance between the thermal camera and the medium attached to the face.
- Respiratory analysis can be significantly affected by changes in a patient's natural breathing behavior due to the monitoring method, their movement, and their focus or concentration. These factors can significantly alter the results of the breathing analysis and should be minimized during the monitoring session.
- the solution is to disrupt the patient's concentration away from their breathing and the monitoring process and provide a sufficient distraction so that their breathing behaviors will return to normal. This will provide a more accurate method for recording natural behaviors.
- some embodiments provide an external stimulus that will draw the patient's attention away from the monitoring process.
- Two primary methods for doing this that some embodiments introduce are: (1) media-based and (2) Augmented Reality (AR) and/or Virtual Reality (VR) based distractions.
- FIGS. 8A-8B illustrate components of a stimulation-based respiratory analysis system 800 and 850 that may be used in some embodiments of the present technology.
- the media source can provide a specific stimulus to the patient that will modify their breathing behaviors. Based on these behavior modifications, the resulting changes in the respiratory patterns will be recorded on the thin medium.
- Some embodiments incorporate the concept of the mask and the VR distraction methodology for creating controlled stimulus during the monitoring period.
- some embodiments combine both the mask design and the VR setup. This will result in the patient wearing the Head Mounted Display (HMD) 910 as they normally would for VR with an additional thin-medium 820 and thermal camera 930 attached. Due to the weight of the HMD itself, the additional attachments will not significantly burden the patient.
- the illustration of this design 900 is shown in FIG. 9 . This provides a controlled environment for introducing specific stimuli to the patient to monitor their changes in breathing. Since the medium and thermal camera are attached to the HMD, the user can experience VR as they would normally without the monitoring devices.
- While the thermal camera provides a viable method for precisely generating a dense representation of the patient's exhale as an image, it requires the camera to be placed on the opposite side of the medium, which limits the mobility of the system.
- the solution to this problem would be to eliminate the camera from the design, while keeping the thin medium as the primary detection mechanism used for monitoring exhale patterns.
- Some embodiments use an electrically conductive mesh from which slight changes in voltage across the medium can be interpreted as thermal distributions. Based on the airflow through the permeable mesh, the temperature changes will modify the resistance of the mesh over its surface. These changes and the small fluctuations within the monitored voltages will provide a basis for identifying the thermal distribution of the exhale.
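- The voltage-to-temperature interpretation can be sketched numerically. Everything below is an illustrative assumption (the drive current, reference resistance, and temperature coefficient are made-up values; the patent does not specify them), modeling each mesh cell as a resistor with a linear temperature coefficient, so that V = I·R and R(T) = R0·(1 + α·ΔT).

```python
import numpy as np

# Assumed mesh properties (illustrative values only).
I_DRIVE = 1e-3     # constant sense current through each mesh cell, in amps
R0 = 100.0         # cell resistance at the reference temperature, in ohms
ALPHA = 3.9e-3     # temperature coefficient of resistance, per deg C

def mesh_temperature_map(voltages):
    """Convert measured cell voltages to temperature offsets from reference.

    Model: V = I * R and R(T) = R0 * (1 + ALPHA * dT),
    so dT = (V / (I * R0) - 1) / ALPHA.
    """
    resistance = np.asarray(voltages) / I_DRIVE
    return (resistance / R0 - 1.0) / ALPHA

# A 3x3 patch of mesh readings: the center cell is warmed by exhaled air.
volts = np.full((3, 3), 0.1)          # 0.1 V corresponds to the reference temperature
volts[1, 1] = 0.1 * (1 + ALPHA * 8)   # center cell 8 deg C warmer
d_temp = mesh_temperature_map(volts)
```

The resulting `d_temp` array plays the role of the thermal image that the camera otherwise provides.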
- FIG. 10 illustrates an example of a hybrid respiratory monitoring system 1000 that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology.
- various embodiments of the system 1000 can include an exhale medium 1010 , computing system 1020 , a thermal camera 1030 , and a side thermal camera 1040 .
- Various embodiments of the present technology may use local and/or remote computing resources to train and/or use a model correlating respiratory activity and recorded data.
- Images from the CO 2 camera 1030 show a detailed view of breathing behavior, such as when a breath starts and stops, how the person is breathing, and how the exhaled air collides with and spreads across the medium. Information gathered from this camera informs our understanding of the circumstances occurring for each frame from the medium-view camera.
- Breathing mode is one metric of interest to pulmonologists that can be obtained from the CO 2 camera images. Healthy individuals tend to breathe through their nose unless the nasal passage is obstructed, at which point the individual breathes either completely through their mouth, or through their nose and mouth simultaneously.
- various embodiments can label each medium-view image and use this data to train a CNN to classify the breathing mode of medium-view images 1060 .
- some embodiments can use various software (e.g., OpenCV library) to find the outline of the person's face through a combination of thresholding and morphological transformations.
- various embodiments can process this data (e.g., with a NumPy Python module) to find facial landmarks.
- the tip of the nose is the left-most pixel coordinate of the face, and the chinrest can be identified by taking the difference of the x-values of the pixel locations and finding the greatest peak.
- the chin and mouth are located between the tip of the nose and the chinrest and can be identified in a similar manner to the chinrest. These landmarks can be used in some embodiments to mask unnecessary information and to extract the person's breathing mode. Breathing mode can be determined by processing the pixels along the outline of the face (e.g., using the SciPy Python module) and looking for peaks in the data near the nose and mouth. The absence of prominent peaks in the data indicates that the person is inhaling. This information can be used to label each medium-view thermal image as one of four breathing states: not exhaling, exhaling through the nose, exhaling through the mouth, or exhaling through the nose and mouth.
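- The peak-finding step above can be sketched with `scipy.signal.find_peaks`. This assumes the face outline and its landmark indices have already been extracted (e.g., with the OpenCV thresholding and morphological steps described earlier); the `label_breathing_state` helper, the window size, and the prominence value are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def label_breathing_state(outline_temps, nose_idx, mouth_idx, win=5,
                          prominence=2.0):
    """Label one side-view frame from temperatures along the face outline.

    outline_temps: 1D array of thermal values sampled along the face outline.
    nose_idx / mouth_idx: outline indices of the nose tip and mouth
    (assumed found earlier from the facial landmarks).
    A prominent peak near a landmark indicates exhaled air at that location.
    """
    peaks, _ = find_peaks(outline_temps, prominence=prominence)
    nose = any(abs(p - nose_idx) <= win for p in peaks)
    mouth = any(abs(p - mouth_idx) <= win for p in peaks)
    if nose and mouth:
        return "nose+mouth"
    if nose:
        return "nose"
    if mouth:
        return "mouth"
    return "not exhaling"   # no prominent peaks -> inhaling

# Synthetic outline profile: a warm plume only at the nose landmark.
temps = np.full(100, 24.0)
temps[38:43] = [26, 29, 31, 29, 26]       # exhale plume near index 40
state = label_breathing_state(temps, nose_idx=40, mouth_idx=70)
```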
- FIG. 11 illustrates an example of various components of the hybrid respiratory monitoring system 1100 that may be used in one or more embodiments of the present technology.
- This figure shows a flow chart depicting the training process and how the monitoring system behaves during runtime.
- thermal images from the medium-view and from the side-view cameras are collected simultaneously.
- breathing information is extracted from the side-view images and paired with the medium-view image to train a model relating the side-view exhale behaviors with those indicated on the thin-medium.
- Temporal and spatial relationships about the exhale behavior are then extracted to be used during runtime.
- a thermal image is captured from the medium-view camera only and used as input to the model, which returns predicted breathing status and breathing metrics from the image, with improved accuracy due to the prior knowledge based on the initial training from two thermal cameras.
- FIG. 12 illustrates an example of a thermal exhale sequence 1200 showing images from the medium-view and the side-view that can be collected by some embodiments of the present technology.
- the medium-view images and the labels extracted from the CO 2 camera images can be used to train a machine learning model that predicts breathing mode from the medium-view images.
- an Artificial Neural Network (ANN) is one machine learning model well-suited for this task, but it would likely benefit from some temporal context, as it is difficult to determine if the thermal signature on the medium is increasing or decreasing in temperature by examining a single image.
- some embodiments feed both the original image and the previous image subtracted from the current image to the model, which indicates whether the medium temperature is increasing (exhale) or decreasing (inhale).
- the CNN may consist, in some embodiments, of a 2D convolutional layer with a tanh activation function, a 2D max pooling layer with a (2×2) pool size, a flattening layer, and then a dense layer.
- the dense layer uses a softmax activation function to return label probabilities for classification of exhale behaviors.
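- The forward pass of that architecture can be sketched in plain NumPy. This is a shape-level illustration with random, untrained weights, not the patent's implementation; a real system would use a deep learning framework and weights trained on labeled medium-view images.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2D convolution of a single-channel image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((kernels.shape[0], h, w))
    for k, ker in enumerate(kernels):
        for i in range(h):
            for j in range(w):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def maxpool2x2(x):
    c, h, w = x.shape
    return x[:, :h // 2 * 2, :w // 2 * 2].reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cnn_forward(img, kernels, dense_w, dense_b):
    x = np.tanh(conv2d(img, kernels))       # conv layer + tanh activation
    x = maxpool2x2(x)                       # 2x2 max pooling
    x = x.ravel()                           # flattening layer
    return softmax(dense_w @ x + dense_b)   # dense layer + softmax

# Toy 16x16 difference image and randomly initialized (untrained) weights.
img = rng.standard_normal((16, 16))
kernels = rng.standard_normal((4, 3, 3)) * 0.1         # 4 conv filters
flat_len = 4 * 7 * 7                                   # after conv (14x14) and pool (7x7)
dense_w = rng.standard_normal((4, flat_len)) * 0.1     # 4 breathing-state labels
probs = cnn_forward(img, kernels, dense_w, np.zeros(4))
```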
- a test subject places their chin on the chinrest and breathes onto the medium (see, e.g., FIG. 10 ).
- This experimental setup is designed to place constraints on the experimental parameters, and is not intended for clinical applications.
- One goal of this experiment was to improve the accuracy of some of the other embodiments so that in clinical applications, the patient can sit or recline in a comfortable position with the medium placed in line with the patient's exhaled airflow.
- Some embodiments may use paper as the medium material because it retains enough heat to be recorded by a thermal camera at a low framerate but dissipates heat between breaths.
- paper is also inexpensive, easy to find, and standardized.
- Test subjects that participated in this research were asked to provide six different samples of breathing data, approximately 60[s] each in length. They were asked to first provide the following four breathing samples while at a normal heart rate: (a) nose breathing, (b) normal mouth breathing, (c) breathing through a small mouth opening, and (d) breathing through a large mouth opening. Participants were then asked to provide nose and mouth breathing samples at a slightly elevated heart rate.
- the collected medium-view images were composed of 1245 images where the individual is inhaling, 422 images of nose breathing, 587 images of mouth breathing, and 77 images where the subject is breathing through both their nose and mouth simultaneously.
- the collected data was iteratively separated into training data and test data, using a 75% to 25% split.
- the performance of the CNN was evaluated by running the evaluation 4 times with a different set of training and test data for each, and then calculating the accuracy of each prediction.
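- The evaluation protocol (four runs, each on a fresh random 75%/25% split, reporting per-run accuracy) can be sketched as follows. The `majority_classifier` stand-in and all names here are illustrative assumptions; in the experiment the classifier is the trained CNN.

```python
import numpy as np

rng = np.random.default_rng(42)

def evaluate(images, labels, classify, n_runs=4, train_frac=0.75):
    """Repeat a random 75%/25% train/test split and report test accuracy per run.

    classify(train_x, train_y, test_x) stands in for training the CNN and
    predicting labels for the held-out images.
    """
    accuracies = []
    n = len(labels)
    for _ in range(n_runs):
        order = rng.permutation(n)
        cut = int(train_frac * n)
        train, test = order[:cut], order[cut:]
        preds = classify(images[train], labels[train], images[test])
        accuracies.append(float(np.mean(preds == labels[test])))
    return accuracies

# Stand-in "classifier": always predict the most common training label.
def majority_classifier(train_x, train_y, test_x):
    values, counts = np.unique(train_y, return_counts=True)
    return np.full(len(test_x), values[counts.argmax()])

labels = np.array([0] * 80 + [1] * 20)     # imbalanced toy label set
images = np.zeros((100, 8, 8))             # placeholder image stack
accs = evaluate(images, labels, majority_classifier)
```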
- FIG. 13 shows a confusion matrix 1300 for one of the 4 evaluations of the CNN. The results show accuracies between 91.08% and 93.81% for the 4 training and testing splits.
- Breathing mode prediction from the experimental results can be performed with reasonable accuracy in various embodiments of the present technology.
- the classification with the highest accuracy rate was the mouth, which is unsurprising given the unique thermal signatures mouth breathing tends to produce compared with nose breathing.
- the CNN would likely be improved with additional training. Additional training data is expected to continue to improve the accuracy of the system and may also be used in a real-time manner in some embodiments.
- Some embodiments may apply machine learning techniques to other information extracted from the side-view images, such as determining the area of the medium that is directly heated by the initial contact with exhaled air instead of heated by the spread of the exhaled air after colliding with the medium.
- FIG. 14 is a flowchart illustrating an example of a set of operations 1400 for computing a tidal volume estimate by training a Long-Short-Term Memory (LSTM) network to associate flow data from a spirometer with information extracted from the medium images in accordance with various embodiments of the present technology.
- the average pixel intensity can be calculated from the medium-view images and fed to the LSTM along with the flow rate (L/s) from the spirometer. After sufficient training, medium images are fed to the LSTM and the corresponding flow rate is estimated by the network. This can be coded as:
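- The original code listing is not reproduced here; the sketch below illustrates the idea with a minimal single-layer LSTM cell written in NumPy. The hidden size, the parameter layout, and the `lstm_flow_estimate` helper are assumptions; the weights shown are random and untrained, whereas in the described system they would be fitted against the spirometer's flow readings.

```python
import numpy as np

rng = np.random.default_rng(1)
HIDDEN = 8   # LSTM hidden size (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_flow_estimate(avg_intensities, params):
    """Run a single-layer LSTM over per-frame average medium intensities.

    params holds the gate weights W (input+hidden -> 4 gates), gate bias b,
    and a linear readout (w_out, b_out) mapping the hidden state to flow in
    L/s. In practice these would be trained against spirometer flow data.
    """
    W, b, w_out, b_out = params
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    flows = []
    for x in avg_intensities:
        z = W @ np.concatenate(([x], h)) + b
        i, f, o = (sigmoid(z[k * HIDDEN:(k + 1) * HIDDEN]) for k in range(3))
        g = np.tanh(z[3 * HIDDEN:])
        c = f * c + i * g                # cell state update
        h = o * np.tanh(c)               # hidden state
        flows.append(w_out @ h + b_out)  # per-frame flow estimate (L/s)
    return np.array(flows)

# Untrained (random) parameters, for shape and data-flow illustration only.
params = (rng.standard_normal((4 * HIDDEN, 1 + HIDDEN)) * 0.1,
          np.zeros(4 * HIDDEN),
          rng.standard_normal(HIDDEN) * 0.1,
          0.0)
avg_intensity = 0.5 + 0.4 * np.sin(np.linspace(0, 4 * np.pi, 40))
flow = lstm_flow_estimate(avg_intensity, params)
```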
- a second algorithm can be used to estimate tidal volume.
- a 3D volume over time can be constructed. Determining the heated area of the medium can be done through image processing of the medium image, potentially assisted by a side-view CO 2 -visualizing thermal camera to determine the dimensions of the medium heated by direct exhaled air.
- FIG. 15 is a flowchart illustrating an example of a set of operations 1500 for identifying a person's breathing mode using a Convolutional Neural Network (CNN) in accordance with one or more embodiments of the present technology.
- the CNN can be trained on the difference image resulting from subtracting the previous frame from the current frame and its label.
- a second CO 2 visualizing thermal camera can be used as a ground-truth device to view the exhaled air from a side-profile view in order to label the associated medium-view images. Images are labeled as mouth breathing, nose breathing, both, or none.
- the differenced medium frames are fed into the CNN.
- the CNN predicts the label associated with that medium image. This can be coded as:
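- The original code listing is not reproduced here; the sketch below shows the runtime pipeline shape: difference the current medium frame against the previous one, pass the difference image to a predictor, and take the highest-probability label. The `toy_predictor` is a deliberately trivial stand-in for the trained CNN, and all names are illustrative assumptions.

```python
import numpy as np

LABELS = ("not exhaling", "nose", "mouth", "nose+mouth")

def classify_medium_frame(prev_frame, curr_frame, predict_proba):
    """Classify one medium-view frame from its difference with the prior frame.

    predict_proba stands in for the trained CNN: it takes the difference
    image and returns one probability per label.
    """
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    probs = predict_proba(diff)
    return LABELS[int(np.argmax(probs))]

# Stand-in for the trained CNN: a warming medium suggests an exhale.
def toy_predictor(diff):
    warming = diff.mean() > 0
    return (np.array([0.1, 0.7, 0.1, 0.1]) if warming
            else np.array([0.7, 0.1, 0.1, 0.1]))

prev = np.full((32, 32), 24.0)
curr = prev + 2.0                         # medium heating up during an exhale
label = classify_medium_frame(prev, curr, toy_predictor)
```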
- FIG. 16 illustrates techniques for computing various breathing metrics 1600 in accordance with some embodiments of the present technology.
- One such metric distinguishes among the different breathing modes: nose, mouth, or both.
- Some embodiments can identify areas of the heat signature on the medium resulting from nasal breathing and mouth breathing by determining the number of “hot spots” on the medium and their locations with respect to one another. Simultaneous nasal and oral breathing produces three “hot spots” on the medium, whereas nasal breathing produces two “hot spots”, and mouth breathing produces one.
- Some embodiments may use a Watershed algorithm to segment the heat signature into one, two or three areas on the medium, and the areas are classified based on their location on the medium. The pixels belonging to each cluster are summed, and then the ratio between each nostril or between exhaled air from the nose and mouth can be calculated.
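- A watershed-based version of this segmentation can be sketched with SciPy's image-foresting-transform watershed. The `nostril_ratio` helper, the marker placement, and the synthetic frame are assumptions for illustration; the intensity is inverted first because the watershed floods low values, so hot spots must become basins.

```python
import numpy as np
from scipy import ndimage

def nostril_ratio(frame, seeds, threshold):
    """Watershed-segment the heat signature and count hot pixels per region.

    frame: 2D thermal image of the medium (hot exhale spots on a cool field).
    seeds: (row, col) coordinates of each exhale source (e.g., local maxima).
    threshold: values above this count as heated when summing each region.
    """
    # Invert so hot spots become basins, as the watershed floods low values.
    inv = frame.max() - frame
    inv = (255 * inv / max(inv.max(), 1e-9)).astype(np.uint8)
    markers = np.zeros(frame.shape, dtype=np.int16)
    for idx, (r, c) in enumerate(seeds, start=1):
        markers[r, c] = idx                      # one marker per exhale source
    markers[0, 0] = -1                           # background marker
    labels = ndimage.watershed_ift(inv, markers)
    hot = frame > threshold
    return [int(np.sum(hot & (labels == i))) for i in range(1, len(seeds) + 1)]

# Two synthetic nostril hot spots of equal size.
frame = np.full((40, 40), 22.0)
frame[12:18, 8:14] = 34.0    # left nostril (36 hot pixels)
frame[12:18, 24:30] = 34.0   # right nostril (36 hot pixels)
sums = nostril_ratio(frame, seeds=[(15, 11), (15, 27)], threshold=30.0)
ratio = sums[0] / sums[1]
```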
- breathing rate can be extracted from a sequence of medium images by plotting the average intensity values of the medium images for a window of time and performing a Fast Fourier Transform (FFT) of the data.
- the FFT transforms the data from the time domain to the frequency domain.
- the overarching frequency is the breaths per minute (BPM) for that window of time.
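- The breathing-rate extraction can be sketched in a few lines of NumPy. The `breaths_per_minute` helper, the 9 fps frame rate, and the synthetic 15 BPM signal are illustrative assumptions.

```python
import numpy as np

def breaths_per_minute(avg_intensities, fps):
    """Estimate breathing rate from average medium intensity over a window.

    avg_intensities: per-frame mean pixel value of the medium images.
    fps: thermal camera frame rate in frames per second.
    """
    x = np.asarray(avg_intensities) - np.mean(avg_intensities)  # drop DC term
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum)]     # overarching frequency, in Hz
    return dominant * 60.0                    # convert to breaths per minute

# Synthetic 60-second window at 9 fps with a 15 BPM (0.25 Hz) breathing cycle.
fps = 9.0
t = np.arange(0, 60, 1.0 / fps)
signal = 25.0 + 1.5 * np.sin(2 * np.pi * 0.25 * t)
bpm = breaths_per_minute(signal, fps)
```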
- optical flow image processing techniques can be applied to consecutive thin-medium thermal images to highlight the spread of heat across the medium. This results in a dense field of flow vectors across the image. The length of each individual vector indicates the flow of heat between frames, where longer vectors denote a faster flow of exhaled air across the surface of the medium, and shorter vectors denote a slower flow of air across the medium.
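- Dense optical flow is usually computed with a library routine (e.g., OpenCV's Farneback method); as a self-contained illustration of the underlying idea, the sketch below computes the "normal flow" from the brightness-constancy constraint between two frames. The helper name, the gradient floor, and the synthetic blob are assumptions, not the patent's implementation.

```python
import numpy as np

def normal_flow(prev, curr, grad_min=1e-3):
    """Per-pixel normal flow between consecutive thermal frames of the medium.

    Solves the brightness-constancy constraint I_x*u + I_y*v + I_t = 0 for the
    flow component along the intensity gradient. Longer vectors mean faster
    spread of exhaled heat across the medium surface.
    """
    gy, gx = np.gradient(prev.astype(float))
    gt = curr.astype(float) - prev.astype(float)   # temporal derivative
    mag2 = gx ** 2 + gy ** 2
    valid = mag2 > grad_min                        # skip flat (unstable) regions
    scale = np.zeros_like(gt)
    scale[valid] = -gt[valid] / mag2[valid]
    u, v = scale * gx, scale * gy                  # dense field of flow vectors
    return u, v, float(np.sqrt(u ** 2 + v ** 2).mean())

# A heated blob displaced one pixel to the right between consecutive frames.
yy, xx = np.mgrid[:48, :48].astype(float)
blob = lambda cx: 22.0 + 10.0 * np.exp(-((xx - cx) ** 2 + (yy - 24.0) ** 2) / 30.0)
u, v, mean_speed = normal_flow(blob(20.0), blob(21.0))
```

The mean vector magnitude gives a single heat-spread speed per frame pair, while the full (u, v) field preserves the spatial pattern of the spread.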
- the positive difference image values over time from the breathing rate calculation can provide useful insight into the exhale patterns of the individual.
- Breathing pattern abnormalities, such as several breaths in rapid succession over a short period of time, are lost after condensing the data into a single breathing rate value. However, these abnormalities are made visible as a plot of increasing difference sums over time.
- FIG. 17A is a block diagram 1700 illustrating components of an integrated mobile configuration of the breathing analysis system.
- the mobile device can include Tx/Rx depth sensor 1705 , thermal imaging sensor 1710 , local compute capability 1715 , local communication 1720 , and I/O communication 1725 all integrated into a single, mobile form factor.
- Other embodiments may include additional components (e.g., a display, AR/VR components, coprocessors, and/or the like).
- Through I/O communication 1725 , data collected from Tx/Rx depth sensor 1705 and thermal imaging sensor 1710 can be shared via external communication array 1730 with external compute capability 1735 (e.g., workstations, laptops, computers, remote servers, cloud-based solutions, etc.).
- some embodiments can share any processed data or computation with external compute capability 1735 .
- the mobile solution may include minimal computational abilities in order to save power.
- imaging instructions may be received directly from external compute capability 1735 , via I/O communication 1725 .
- the imaging instructions can cause the local compute capability 1715 to control Tx/Rx depth sensor 1705 and thermal imaging sensor 1710 .
- various embodiments of the present technology may use local and/or remote computing resources to train and/or use a model correlating respiratory activity and recorded data.
- FIG. 17B illustrates a block diagram 1750 of a sensor only configuration of the breathing analysis system.
- the breathing system may include depth imaging sensor 1755 , Tx/Rx depth sensor 1760 , local communication 1765 , and external communication 1770 .
- the device can connect to external compute capability 1775 where the images can be processed to generate an analysis of the respiratory behavior of the individual. In these embodiments, any significant processing is done by external compute capability 1775 .
- FIG. 18 is an illustration of a top view of breathing analysis system 1800 where the cameras are imaging the thin-medium from the same-side as the patient.
- the patient 1805 can breathe onto the thin-medium 1810 .
- One or more cameras 1815 A- 1815 B can be positioned on the same side of the thin-medium as the patient.
- the breathing analysis system can identify exhale characteristics and behaviors through the processing and/or visualization of heated exhale interacting with the thin medium (e.g., plastics, polymers, metals, fibers, or other synthetic materials). While some embodiments image the thin-film from the opposing side, other embodiments (such as those illustrated in FIG. 18 ) can image from the same or surrounding directions of the thin-medium.
- aspects and implementations of the breathing analysis system of the disclosure have been described in the general context of various steps and operations.
- a variety of these steps and operations may be performed by hardware components or may be embodied in computer-executable instructions, which may be used to cause a general-purpose or special-purpose processor (e.g., in a computer, server, or other computing device) programmed with the instructions to perform the steps or operations.
- the steps or operations may be performed by a combination of hardware, software, and/or firmware.
- FIG. 19 is a block diagram illustrating an example machine representing the computer systemization of the breathing analysis system.
- the system controller 1900 may be in communication with entities including one or more users 1925 , client/terminal devices 1920 , user input devices 1905 , peripheral devices 1910 , optional co-processor device(s) (e.g., cryptographic processor devices) 1915 , and networks 1930 . Users may engage with the controller 1900 via terminal devices 1920 over networks 1930 .
- Computers may employ a central processing unit (CPU) or processor to process information.
- Processors may include programmable general-purpose or special-purpose microprocessors, programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), embedded components, combinations of such devices, and the like.
- Processors execute program components in response to user and/or system-generated requests.
- One or more of these components may be implemented in software, hardware or both hardware and software.
- Processors pass instructions (e.g., operational and data instructions) to enable various operations.
- the controller 1900 may include clock 1965 , CPU 1970 , memory such as read only memory (ROM) 1985 and random access memory (RAM) 1980 and co-processor 1975 among others. These controller components may be connected to a system bus 1960 , and through the system bus 1960 to an interface bus 1935 . Further, user input devices 1905 , peripheral devices 1910 , co-processor devices 1915 , and the like, may be connected through the interface bus 1935 to the system bus 1960 .
- the interface bus 1935 may be connected to a number of interface adapters such as processor interface 1940 , input output interfaces (I/O) 1945 , network interfaces 1950 , storage interfaces 1955 , and the like.
- Processor interface 1940 may facilitate communication between co-processor devices 1915 and co-processor 1975 .
- processor interface 1940 may expedite encryption and decryption of requests or data.
- Input output interfaces (I/O) 1945 facilitate communication between user input devices 1905 , peripheral devices 1910 , co-processor devices 1915 , and/or the like and components of the controller 1900 using protocols such as those for handling audio, data, video interface, wireless transceivers, or the like (e.g., Bluetooth, IEEE 1394a-b, serial, universal serial bus (USB), Digital Visual Interface (DVI), 802.11a/b/g/n/x, cellular, etc.).
- Network interfaces 1950 may be in communication with the network 1930 . Through the network 1930 , the controller 1900 may be accessible to remote terminal devices 1920 .
- Network interfaces 1950 may use various wired and wireless connection protocols such as, direct connect, Ethernet, wireless connection such as IEEE 802.11a-x, and the like.
- Examples of network 1930 include the Internet, Local Area Network (LAN), Metropolitan Area Network (MAN), a Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol WAP), a secured custom connection, and the like.
- the network interfaces 1950 can include a firewall which can, in some aspects, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications.
- the firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities.
- the firewall may additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
- Other network security functions performed or included in the functions of the firewall can be, for example, but are not limited to, intrusion-prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure.
- Storage interfaces 1955 may be in communication with a number of storage devices such as storage devices 1990 , removable disc devices, and the like.
- the storage interfaces 1955 may use various connection protocols such as Serial Advanced Technology Attachment (SATA), IEEE 1394, Ethernet, Universal Serial Bus (USB), and the like.
- User input devices 1905 and peripheral devices 1910 may be connected to I/O interface 1945 and potentially other interfaces, buses and/or components.
- User input devices 1905 may include card readers, finger print readers, joysticks, keyboards, microphones, mouse, remote controls, retina readers, touch screens, sensors, and/or the like.
- Peripheral devices 1910 may include antenna, audio devices (e.g., microphone, speakers, etc.), cameras, external processors, communication devices, radio frequency identifiers (RFIDs), scanners, printers, storage devices, transceivers, and/or the like.
- Co-processor devices 1915 may be connected to the controller 1900 through interface bus 1935 , and may include microcontrollers, processors, interfaces or other devices.
- Computer executable instructions and data may be stored in memory (e.g., registers, cache memory, random access memory, flash, etc.) which is accessible by processors. These stored instruction codes (e.g., programs) may engage the processor components, motherboard and/or other system components to perform desired operations.
- the controller 1900 may employ various forms of memory including on-chip CPU memory (e.g., registers), RAM 1980 , ROM 1985 , and storage devices 1990 .
- Storage devices 1990 may employ any number of tangible, non-transitory storage devices or systems such as a fixed or removable magnetic disk drive, an optical drive, solid-state memory devices, and other processor-readable storage media.
- Computer-executable instructions stored in the memory may include one or more program modules such as routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types.
- the memory may contain operating system (OS) component 1995 , modules and other components, database tables, and the like. These modules/components may be stored and accessed from the storage devices, including from external storage devices accessible through an interface bus.
- the database components can store programs executed by the processor to process the stored data.
- the database components may be implemented in the form of a database that is relational, scalable and secure. Examples of such database include DB2, MySQL, Oracle, Sybase, and the like.
- the database may be implemented using various standard data-structures, such as an array, hash, list, stack, structured text file (e.g., XML), table, and/or the like. Such data-structures may be stored in memory and/or in structured files.
- the controller 1900 may be implemented in distributed computing environments, where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), the Internet, and the like.
- program modules or subroutines may be located in both local and remote memory storage devices.
- Distributed computing may be employed to load balance and/or aggregate resources for processing.
- aspects of the controller 1900 may be distributed electronically over the Internet or over other networks (including wireless networks).
- portions of the breathing analysis system may reside on a server computer, while corresponding portions reside on a client computer. Data structures and transmission of data particular to aspects of the controller 1900 are also encompassed within the scope of the disclosure.
- the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.”
- the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof.
- the words “herein,” “above,” “below,” and words of similar import when used in this application, refer to this application as a whole and not to any particular portions of this application.
- words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively.
- the word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
Description
- This application claims priority to U.S. Provisional Application Ser. No. 62/756,501 filed Nov. 6, 2018, which is incorporated herein by reference in its entirety for all purposes.
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity and identifying various pulmonary conditions. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analyzing system through thermal on projection medium imaging.
- Respiration monitoring techniques that are both accurate and comfortable for the patient are sorely lacking from the medical field. The most accurate methods of respiration monitoring may require placing ECG electrodes on the patient's body, putting thermistors in the patient's nose, having the patient wear an abdominal strain-gauge transducer, having the patient breathe through a tube while wearing a nose clip, or some combination of these techniques that monitor multiple biophysiological parameters concurrently. These traditional solutions all involve placing sensors directly on the patient's body. These direct measurements have a high rate of accuracy, but cause discomfort and alter the natural breathing of the patient.
- Pulmonologists employ a variety of respiratory monitoring tools to accurately assess the health of a patient, such as spirometers, plethysmographs, and polysomnography. Termed "contact methods", these techniques require placing sensors directly on the body of the subject or having the subject breathe through a tube-based device. Though these methods boast high accuracy and utility, they often inflict physical and psychological discomfort and interfere with a patient's natural breathing behaviors, limiting their application for long-term monitoring. Additionally, cumbersome equipment and labor-intensive setups make these techniques difficult to utilize in out-patient clinics.
- Many proposed non-contact measurement methods use remote sensors such as thermal cameras, RGB cameras, depth sensors, and ultrasonic sensors, among other imaging devices. Though inherently comfortable, these methods are not currently used due to their reduced accuracy, sensitivity to bodily characteristics, and limited utility in practical clinical settings. Thermal imaging methods that monitor breathing in open air have reduced accuracy due to fast heat dissipation, and methods that measure skin temperature changes are unable to provide detailed breathing behavior information. As such, there are a number of challenges and inefficiencies in traditional respiratory analysis systems. It is with respect to these and other problems that embodiments of the present technology have been made.
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analyzing through thermal imaging of thin-medium surfaces. Thin-medium thermal imaging refers to the visual collection of exhaled heat signatures obtained on a thin-medium imaged by a thermal camera. Various embodiments provide for systems and methods based on this premise for the extraction of clinically meaningful metrics for respiratory analysis. This includes, but is not limited to, the reconstruction of exhale flows to obtain volumetric estimates of exhaled breath, the identification of the separation between nose (each nostril) and mouth exhale flows to measure distribution, and both the velocity and strength of exhale flows.
- Various embodiments of the present technology provide for a system comprising an infrared (IR) thermal camera. The IR thermal camera can be combined with a laser transmitter (Tx) and receiver (Rx) which can be used to measure the distance between the thermal camera and the thin-medium (e.g., formed in a planar or a curved surface). In some embodiments the IR thermal camera can be configured to image a subset interval of the electromagnetic spectrum within the infrared wavelengths and generate grayscale intensity values within a dense pixel array that represent the thermal intensity distribution of exhale behaviors within the thin-medium.
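The disclosure does not specify the ranging principle behind the laser Tx/Rx pair; one common approach is time-of-flight ranging, sketched below (Python; the function name and the use of a round-trip timing model are illustrative assumptions, not part of the disclosure):

```python
# Approximate speed of light in air, m/s (illustrative constant).
C = 299_702_547.0

def laser_distance_m(round_trip_s):
    """Distance from the laser Tx/Rx to the thin-medium, assuming the
    receiver measures the round-trip time of a reflected laser pulse.
    The pulse covers the camera-to-medium distance twice, hence the
    division by two."""
    return C * round_trip_s / 2.0
```

For example, a pulse whose echo returns after `2 * 0.5 / C` seconds corresponds to a medium 0.5 m from the camera.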
- The system can define a configuration where exhale characteristics and behaviors are identified through the visualization of heated exhale interacting with the thin medium (e.g., plastics, polymers, metals, fibers, or other synthetic materials) which is imaged from the opposing side or from surrounding directions of the thin-medium. In some embodiments, a chemical coating may be applied to the thin-medium to enhance the thermal residual and dissipative characteristics of the medium to modify the thermal signature imposed on the material from external heat sources. The chemical coating may alter the material's ability to maintain heat residuals or alter the material's dissipative characteristics. In some embodiments, the thin medium can be selected to maintain a thermal radiance above ambient temperature for a given period of time.
- In some embodiments, the IR thermal camera, distance laser, and thin-medium can be combined into a mobile or wearable device. The camera can be placed to allow imaging of a large portion (e.g., 60% or more) of the thin-medium for recording the thermal changes in the thin-medium. In other embodiments, the thermal camera, laser distance system, and thin-medium may be mounted within an adjustable mount that provides a targetable vision system that tracks thermal changes in the thin-medium.
- Various embodiments may also include processing units, processors, or computers to execute image processing and respiratory metric algorithms. In some embodiments, the computing resources can be integrated within the device or may be external to the device and communicably coupled via a wireless link.
- In some embodiments, a system can include a thermal camera that images projections of an exhale cross-section on a projection-medium, a communication interface coupled to the thermal camera to receive a sequence of thermal images depicting the thermal distribution of the exhale on the thin-medium, a processor, and a memory storing a set of instructions that, when executed by the processor, cause the processor to generate a representation or model of the exhale of the subject. Various metrics of the respiratory behavior, two- and three-dimensional representations of the exhale cross-sections, a 3D reconstruction, a geometrically consistent estimation of the approximate flow volume via a per-frame reconstruction to provide an estimate of the subject's tidal volume, and/or the like can also be generated.
- In some embodiments, a method of analysis for monitoring and extracting metrics related to pulmonary function can include taking thermal images of a thin-medium while the user breathes onto the surface of the medium. A temporary representation of the exhale signal on the medium that reflects the exhale behaviors of the user can be created. Then, for each intensity value range within the image, iso-lines can be created to form an iso-line map of the gradients and exhale boundary; captured over time, these maps form a three-dimensional representation of the exhale. Respiratory information can be derived by applying image processing techniques to the thermal gradient images. In addition, a correlation between the intensity values and exhale behavior over time and objective metrics may also be derived through machine learning.
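The iso-line step might be sketched as follows (Python with NumPy; the function names and the band-mask approach are illustrative assumptions, not the disclosed implementation). Binning a thermal frame into iso-intensity bands approximates the iso-line map from which per-band areas can be measured:

```python
import numpy as np

def intensity_bands(frame, levels):
    """Split a grayscale thermal frame into iso-intensity bands.

    frame:  2D float array of thermal intensity values.
    levels: ascending intensity thresholds defining the bands.
    Returns one boolean mask per consecutive pair of thresholds;
    the band boundaries play the role of iso-lines on a contour map.
    """
    return [(frame >= lo) & (frame < hi)
            for lo, hi in zip(levels[:-1], levels[1:])]

def band_areas(frame, levels, pixel_area=1.0):
    """Approximate area enclosed by each iso-band (pixel count scaled
    by the physical area one pixel covers on the medium)."""
    return [mask.sum() * pixel_area for mask in intensity_bands(frame, levels)]
```

Applying `band_areas` to each frame of the sequence yields the per-band areas over time from which a three-dimensional representation could be assembled.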
- Embodiments of the present technology also include computer-readable storage media containing sets of instructions to cause one or more processors to perform the methods, variations of the methods, and other operations described herein.
- In some embodiments, the system can generate metrics that describe pulmonological function, including reconstructions of exhale flow behavior produced by (1) imaging the thin-medium using the thermal camera to produce a video stream comprising a sequence of images, (2) extracting 2D cross-sectional regions of the exhale behavior within the medium for each image, and (3) generating a model based on these cross-sections over time. This model can then be used to estimate various pulmonological traits, including 3D reconstructions of the exhale volume used to obtain tidal volume estimates, the separation between the nose (each nostril) and the mouth, and exhale strength over time. In some embodiments, the image sequence produced by the thermal camera can be used to generate a sequence of contours that are extracted from the 2D intensity images. Each contour can be identified temporally by a time-stamp that can be used to define these 2D contours over time. The sequence of contours is then projected over time to define a 3D volume. The 3D volumetric model can then be used to provide estimates of exhale behaviors and tidal volume. Exhale can be separated into a distribution between each nostril and the mouth. This distribution is represented as a percentage (%) relationship. The behaviors of the contours and the thermal distribution within the thin-medium are used to identify a relational proportion between each nostril and the mouth. The duration and thermal distribution of an exhale can be analyzed to extract the strength of the exhale. Strength may be defined as a relationship between the velocity, force, and duration of the exhale.
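As a rough sketch of the volume idea, each frame's cross-sectional area can be treated as a slab of air whose thickness is the distance the flow travels between frames (Python/NumPy; the function name and the constant-flow-speed assumption are illustrative, and a calibrated system would refine this with ground-truth data):

```python
import numpy as np

def exhale_volume_liters(areas_m2, flow_speed_m_s, dt_s):
    """Estimate exhaled volume from per-frame cross-sectional areas.

    areas_m2:       cross-sectional area on the medium for each frame, m^2.
    flow_speed_m_s: assumed (or estimated) exhale flow speed, m/s.
    dt_s:           time between frames, s.
    Each frame contributes area x (flow_speed * dt) of air; summing the
    slabs over one exhale gives a rough tidal-volume estimate in liters.
    """
    areas = np.asarray(areas_m2, dtype=float)
    return float(areas.sum() * flow_speed_m_s * dt_s * 1000.0)  # m^3 -> L
```

For instance, ten frames of 0.001 m² each at 1 m/s flow speed and 0.1 s frame spacing integrate to 1.0 L.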
- Some embodiments may include an additional thermal camera to capture a side profile of the exhale. The additional thermal camera may include a wavelength filter tuned to a spectral range subset of the infrared spectrum used to visualize CO2 from exhaled airflows from a side profile. This is used as a ground-truth device with which to validate metrics collected from the medium. The thermal CO2 camera may be positioned perpendicular to the linear setup of the thermal camera and thin-medium. This can provide a method for visualizing the exhale as it exits the nose/mouth and makes contact with the thin-medium, imparting thermal energy into the medium. The flow delay and nose/mouth distribution ground-truth values can then be correlated with the behaviors identified on the thin-medium. Measurements obtained from the side view using the thermal CO2 camera are used to improve the accuracy of the apparent behavior identified in the thin medium using the thermal camera. This correlation can be established in some embodiments by training a deep neural network with the relationship of the exhale metrics between those obtained with the thermal CO2 camera and the behaviors visualized on the thin-medium.
- While multiple embodiments are disclosed, still other embodiments of the present technology will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the technology. As will be realized, the technology is capable of modifications in various aspects, all without departing from the scope of the present technology. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
- Embodiments of the present technology will be described and explained through the use of the accompanying drawings in which:
- FIG. 1 illustrates an example of a system that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology;
- FIG. 2 illustrates an example of a sequence of a thermal distribution on the thin medium due to the exhale that may be generated in accordance with one or more embodiments of the present technology;
- FIGS. 3A-3B illustrate examples of training and monitoring setups using the thin-medium thermal imaging technique that may be used in some embodiments of the present technology;
- FIG. 4 shows plots of the cumulative volume over time and the volume per exhale for a single patient that may be present in some embodiments of the present technology;
- FIG. 5 is a flowchart illustrating an example of a set of operations for computing tidal volume in accordance with various embodiments of the present technology;
- FIGS. 6A-6B provide illustrations of two possible stationary setups for the thin medium in a clinical setting in accordance with various embodiments of the present technology;
- FIGS. 7A-7B illustrate examples of semi-contact mask-based thin-medium setups that may be used in one or more embodiments of the present technology;
- FIGS. 8A-8B illustrate components of a stimulation-based respiratory analysis system that may be used in some embodiments of the present technology;
- FIG. 9 illustrates a VR thin-medium respiratory analysis setup that may be used in one or more embodiments of the present technology;
- FIG. 10 illustrates an example of a hybrid respiratory monitoring system that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology;
- FIG. 11 illustrates an example of various components of the hybrid respiratory monitoring system that may be used in one or more embodiments of the present technology;
- FIG. 12 illustrates an example of a thermal exhale sequence showing images from the medium-view and the side-view that can be collected by some embodiments of the present technology;
- FIG. 13 is a confusion matrix for one of the four evaluations of the convolutional neural network implemented in some embodiments of the present technology;
- FIG. 14 is a flowchart illustrating an example of a set of operations for computing a tidal volume estimate by training a Long Short-Term Memory (LSTM) network to associate flow data from a spirometer with information extracted from the medium images in accordance with various embodiments of the present technology;
- FIG. 15 is a flowchart illustrating an example of a set of operations for identifying a person's breathing mode using a convolutional neural network in accordance with one or more embodiments of the present technology;
- FIG. 16 illustrates techniques for computing various breathing metrics in accordance with some embodiments of the present technology;
- FIGS. 17A-17B illustrate block diagrams of integrated mobile configurations and sensor-only configurations of the breathing analysis system;
- FIG. 18 is an illustration of a top view of the breathing analysis system where the cameras are imaging the thin-medium from the same side as the patient; and
- FIG. 19 is a block diagram illustrating an example machine representing the computer systemization of the breathing analysis system.
- The drawings have not necessarily been drawn to scale. Similarly, some components and/or operations may be separated into different blocks or combined into a single block for the purposes of discussion of some of the embodiments of the present technology. Moreover, while the technology is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the technology to the particular embodiments described. On the contrary, the technology is intended to cover all modifications, equivalents, and alternatives falling within the scope of the technology as defined by the appended claims.
- Various embodiments of the present technology generally relate to systems and methods for monitoring breathing activity. More specifically, some embodiments of the present technology relate to non-contact breathing activity monitoring and analysis through Thin-Medium Thermal Imaging (TMTI). In this respiratory monitoring and diagnosis system, breathing behaviors are captured as exhaled air contacts a thin planar surface that is then imaged using a thermal camera. The heat distribution imposed on the thin-medium is then used to extract respiratory metrics. Conventional breath monitoring systems must place devices on patients, which causes discomfort and may not be suitable for long-term monitoring. Moreover, the primary evaluation criteria within respiratory analysis revolve around the collection of a limited set of quantitative metrics such as breathing rate, flow analysis, and tidal volume estimates.
- Extensive research has culminated in numerous contact and non-contact methods that obtain these metrics with promising levels of accuracy. However, all current non-contact respiratory evaluation is performed using indirect methods; that is, measurements are inferred through secondary signals such as visible chest movements, vibration, pressure, acceleration, or sound. Prior methods using spectral analysis for CO2 visualization measure and model the exhale flow as the visualized thermal signature of the CO2 waveform. While prior methods only provide a breathing-rate evaluation, this form of visualization has the ability to generate numerous additional metrics such as nose/mouth distribution, velocity, dissipation, behavioral characteristics, and even insight into lung efficiency in controlled environments.
- In contrast, some embodiments provide for systems and methods of respiratory analysis that are non-contact, but also measure the exhaled air of a human subject directly through a medium-based exhale visualization technique. In some embodiments, a thin medium can be placed perpendicular to the exhaled airflow of an individual, and a thermal camera can be used to record the heat signature from the exhaled breath on the opposite side of the material. Breathing rate and respiratory behaviors can be extracted from the thermal data in real-time. Some embodiments of the respiration monitoring technique accurately report breathing rate and provide other information not obtainable through other non-contact methods. Various embodiments can be implemented as a small low-cost device for ease of use in a clinical environment or within an at-home deployment.
- In some embodiments, the thin (or projection) medium material can have thermally conductive properties that reflect the temperature changes from the exhale but also allow for rapid dissipation of the heat between breaths. The material may also be as thin as possible to promote this dissipation process as quickly as possible, and may have a high emissivity so that changes in temperature can be seen with the thermal camera. In accordance with various embodiments, one or more of the following considerations may be made when selecting the material: (1) the thermal conductivity of the material, (2) the ability of the material to allow for rapid thermal dissipation, (3) whether the material is thermally opaque, (4) the ability of the material to retain heat signatures long enough to capture (e.g., at 5 [Hz] or higher), and (5) the ability of the material not to introduce material composition patterns into the resulting images.
- Some embodiments use a specialty thermal and CO2 camera to capture images and videos of human exhales and to extract clinically valuable information in a non-invasive way. As a result, various embodiments allow for non-contact analysis of various breathing activities including breathing rate, speed, strength, tidal volume, nose/mouth distribution, CO2 concentration of the exhale, lung efficiency, and obstructive breathing, which can be used in the diagnosis of various breathing/pulmonary-related diseases.
- Some embodiments provide for Thin-Medium Thermal Imaging (TMTI), an innovative non-contact respiration sensing method that strives to address the problems of existing methods by monitoring respiration directly, but without touching the patient. In accordance with various embodiments, a patient can breathe onto a thin medium while a thermal camera records images of the opposite side of the medium. The thin medium can accurately capture the heat signature of the breath, retaining the temperature gradient long enough to be recorded by a thermal camera, but dissipating the heat quickly between breaths. These images can then be processed using signal processing or machine learning, which converts the thermal signatures on the medium into clinically important metrics such as respiratory rate and volume and provides additional breathing behavior information to clinicians for a comprehensive view of the respiratory functioning of a patient. Some embodiments use small, low-cost equipment and work for a variety of patient populations.
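One plausible signal-processing route for the respiratory-rate metric, sketched under the assumption that the mean medium intensity rises and falls with each exhale (Python/NumPy; this is an illustrative stand-in, not the disclosed implementation):

```python
import numpy as np

def breathing_rate_bpm(mean_intensity, fps):
    """Estimate respiratory rate from the mean medium intensity over time.

    Exhales warm the medium periodically, so the dominant frequency of the
    detrended mean-intensity signal approximates the breathing rate.  The
    search is restricted to 0.1-1.0 Hz (6-60 breaths per minute), a
    plausible physiological band.
    """
    x = np.asarray(mean_intensity, dtype=float)
    x = x - x.mean()                                  # remove DC offset
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)      # frequency axis, Hz
    power = np.abs(np.fft.rfft(x)) ** 2               # power spectrum
    band = (freqs >= 0.1) & (freqs <= 1.0)            # physiological band
    return float(freqs[band][np.argmax(power[band])] * 60.0)
```

For example, a 30-second recording at 10 frames/s whose mean intensity oscillates at 0.3 Hz yields an estimate of 18 breaths per minute.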
- Some embodiments can use a variety of techniques to generate respiratory metrics and behavior information. For example, some embodiments may use estimated values and self-reported information from the subject to determine unseen activity behind the medium. Metrics such as the distance between the person's face and the medium, the delay between the start of a breath and when the breath hits the medium, and the person's breathing mode (e.g., whether the person is breathing through their nose, mouth, or both) may be unknown and/or subject to fluctuation throughout measurement. Data taken by having a person breathe through an intermediary respiratory measurement device (such as a spirometer) onto the medium can provide accurate timing and exhale information, but the resulting heat signatures may not represent natural breathing. A ground-truth device that does not interfere with natural breathing and also removes uncertainty about activities occurring behind the medium is required to improve some embodiments of the present technology.
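Where spirometer readings are collected alongside medium-derived estimates, a simple linear least-squares calibration can serve as a minimal stand-in for the learned correlations used elsewhere in the disclosure (the disclosure trains neural networks for this step; the function below is only an illustrative sketch):

```python
import numpy as np

def fit_volume_calibration(medium_estimates, spirometer_volumes):
    """Fit v_spiro ~ a * v_medium + b by ordinary least squares.

    Pairs of raw medium-derived volume estimates and spirometer readings
    yield a per-setup calibration (slope a, intercept b) that maps image-
    based estimates onto ground-truth tidal volumes.
    """
    A = np.vstack([medium_estimates, np.ones(len(medium_estimates))]).T
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(spirometer_volumes, float),
                                 rcond=None)
    return a, b
```

Once fitted, `a * raw_estimate + b` gives a calibrated tidal-volume prediction for new breaths from the same setup.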
- Various embodiments of the present technology employ a reinforced hybrid breathing model that uses an additional thermal camera with a spectral filter in the 3-5 [μm] range. This camera acts as a CO2 particle sensor, visualizing turbulent airflows as they exit a person's mouth or nose and collide with the medium. By collecting images from this camera from a side-profile while also collecting thermal images of the medium, human and airflow behaviors can be identified that contribute to the thermal signatures on the medium. These synchronized image sets can be used in some embodiments to train a Convolutional Neural Network (CNN) to identify breathing activities from medium images from an inexpensive device.
- In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present technology. It will be apparent, however, to one skilled in the art that embodiments of the present technology may be practiced without some of these specific details.
- The techniques introduced here can be embodied as special-purpose hardware (e.g., circuitry), as programmable circuitry appropriately programmed with software and/or firmware, or as a combination of special-purpose and programmable circuitry. Hence, embodiments may include a machine-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, ROMs, random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or other type of media/machine-readable medium suitable for storing electronic instructions.
- The phrases “in some embodiments,” “according to some embodiments,” “in the embodiments shown,” “in other embodiments,” and the like generally mean the particular feature, structure, or characteristic following the phrase is included in at least one implementation of the present technology, and may be included in more than one implementation. In addition, such phrases do not necessarily refer to the same embodiments or different embodiments.
-
FIG. 1 illustrates an example of asystem 100 that may be used for respiratory monitoring through Thin-Medium Thermal Imaging in accordance with various embodiments of the present technology. As illustrated inFIG. 1 , various embodiments of the system can include anexhale medium 110,computing system 120, and athermal camera 130. Respiratory behaviors can be captured through visualizing an exhale flow over time. To provide this visualization, some embodiments can allow the patient to breathe onto thinmedium surface 110 and record the resulting heat distribution usingthermal camera 130. In accordance with various embodiments, theexhale medium 110 may be composed of any material that defines a uniformly smooth surface that can capture a heat signature. - The
exhale medium 110 can be composed of any natural or composite material. Typically, it is desirable that the material be highly emissive, very thin and thermally opaque, and have specific thermal properties that retain heat long enough for the camera to capture the image, but allow for dissipation of the heat between breaths. Examples include, but are not limited to, thermochromatic liquid crystal films, a piece of paper, plastics, polymers, metals, fibers, or other synthetic materials. - In some embodiments,
thermal camera 130 may be pointed at the opposite side ofmedium 110 relative to the patient. The exhale from the patient imparts a thermal signature on exhale (or thin) medium 110 that can then be visualized usingthermal camera 130. Based on this, some embodiments can track this change over time as a cross-sectional sequence of the exhale flow. A computer system can then use the sequence to form a 3D volume over time. This volume is then calibrated to the tidal volume read from a spirometer or directly measured using the cross-sectional area over time to form a measured enclosed volume. - Recording the sequence of thermal images over time allows a computer system (not shown in
FIG. 1) to analyze characteristics of the exhale per episode and extract clinically meaningful information about the patient's respiratory behaviors. In accordance with various embodiments, the respiratory behaviors can include, but are not limited to, breathing rate, tidal volume, and nostril/mouth distribution. - Some embodiments provide an effective method for estimating the tidal volume using only a small thermal camera and the Thin-Medium Thermal Imaging technique. This technique processes the sequence of thermal images, extracts the thermal distribution from them, and produces a cross-sectional area estimation over time to generate a volume. Then, some embodiments correlate this volume with an established ground-truth (gold standard) method of measuring tidal volume with a spirometer. This correlation is then used to train a neural network to provide an estimate of the patient's tidal volume directly from the thermal image sequence.
- An example of an image sequence is shown in
FIG. 2, which illustrates a possible thermal distribution on the medium during a single breath. These images represent cross-sectional snapshots of the exhale pattern as identified by the heat distribution on the thin film or projection medium. As such, various embodiments take advantage of the temperature difference between human breath and the surrounding environment by projecting an individual exhale onto a projection medium that is used to visualize the thermal distribution of the exhale. The resulting heat signature is preserved on the medium only for a short period of time, but it remains long enough for a conventional thermal sensor to capture the information. Various embodiments allow the monitoring system to capture thermal exhale behaviors before the exhaled air dissipates, and then analyze the respiratory behaviors based on the resulting thermal distribution. In accordance with various embodiments, the exhale film can be placed close to the patient's face so that the film can be sufficiently influenced by exhale force. - The thermal signature on the medium can be used to determine whether a patient is breathing nasally, orally, or oronasally. Each of these modes of respiration shows a unique thermal signature pattern on the medium that can be identified by its size, shape, location, and flow direction. The thermal signature can be segmented into separate exhale sources by filtering out the regional maxima, thresholding the image, and then using pixel clustering. After identifying individual exhale sources from the thermal signature, the contribution from the mouth and each nostril to the surface area of the heat signature can be determined by the pixel sum per thermal region. From this information, various embodiments can calculate the distribution ratio between the nose and mouth or between each nostril. The strength of an individual exhale can be estimated by the rate of expansion of the thermal signature on the medium.
Stronger exhale heat expands farther after hitting the medium surface than that from a less forceful exhale. To estimate exhale strength, some embodiments can use optical flow to estimate surface heat flow across the medium.
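The segmentation just described (thresholding the thermal image, clustering connected pixels, and summing pixels per region) can be sketched in Python. This is a minimal illustration using connected-component labeling in place of a full regional-maxima pipeline; the frame values, threshold, and blob layout are assumptions, not measured data:

```python
import numpy as np
from scipy import ndimage

def segment_exhale_sources(frame, threshold=30.0):
    """Split a thermal frame (2D array, degrees C) into exhale regions.

    Hypothetical sketch: threshold the image, label connected pixel
    clusters, and sum pixels per region to estimate each exhale
    source's contribution to the heat signature's surface area.
    """
    mask = frame > threshold                  # keep pixels warmed by breath
    labels, n_regions = ndimage.label(mask)   # cluster connected pixels
    # pixel count (surface-area contribution) per thermal region
    areas = ndimage.sum(mask, labels, index=range(1, n_regions + 1))
    return labels, list(areas)

# toy frame: two warm spots (nostril signatures) on a cool background
frame = np.full((8, 8), 25.0)
frame[1:3, 1:3] = 34.0   # left nostril signature
frame[1:3, 5:7] = 33.0   # right nostril signature
labels, areas = segment_exhale_sources(frame)
ratio = areas[0] / areas[1]   # left/right nostril distribution ratio
```

With real frames, the per-region pixel sums would feed the nose/mouth or nostril distribution ratios described above.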
-
FIGS. 3A-3B illustrate examples of training setup 300 and monitoring setup 350 using the thin medium thermal imaging technique that may be used in some embodiments of the present technology. Some embodiments use this two-stage process illustrated in FIGS. 3A-3B to establish the correlation between the thermal signature and the actual tidal volume provided by a spirometer. In the first training stage 300, some embodiments have the patient 310 breathe through the spirometer 320 onto the medium 330 to train the network that will be used for this patient later in the monitoring stage. The computing system can generate sequences of thermal images 340 which can be correlated to the data collected by spirometer 320. - This correlation does not need to be patient-specific because the difference between exhale patterns from different individuals can be handled by the complexity of the neural network. Thus, some embodiments can offer a solution that does not require the training process for every individual. Based on the training or trained network provided, the
monitoring stage 350 can be defined by having the patient 360 breathe onto the thin medium 370 without the spirometer. This will result in a natural breathing pattern on the thin medium from which some embodiments can then extract metrics. - Some embodiments of the monitoring system can describe the extracted volume in different ways. For example, in some embodiments, a cumulative volume over time can be computed that represents how much exhaled air the patient breathes out during normal breathing. The result of this metric can then be used to form a volume-per-exhale estimate that represents the change from the last valley to the next peak in volume. As another example, some embodiments can measure the flow, or volume over time, described in liters per second. The results of this are shown in the
plots 400 within FIG. 4. -
FIG. 5 illustrates an example of a set of operations 500 for computing tidal volume, the volume of an individual's natural exhalation at rest, in accordance with various embodiments of the present technology. As illustrated in the embodiments shown in FIG. 5, during breathing operation 510, a human subject breathes onto the thin medium. Thermal signature images are taken from the thin medium at every frame during capture operation 520. Typically, thermal images of the medium can show a circular gradient pattern. The gradient values of these images are calculated and stored in calculation operation 530. A Long-Short-Term-Memory (LSTM) network can be trained, in training operation 540, to calculate volume measurements. For example, in some embodiments, the LSTM can take average thermal values from the medium and their associated ground-truth volume measurement from a spirometer as training input. Once the LSTM is trained with many samples of data, prediction operation 550 can use the LSTM to predict the volume measurement from a new series of thermal images. - A breathing strength mode can identify the strength of an individual exhale. In accordance with various embodiments, the system can estimate the breathing strength by the rate of expansion of the thermal signature on the medium. Stronger exhale heat expands farther after hitting the medium surface than heat from a less forceful exhale. To estimate exhale strength, some embodiments use optical flow to estimate surface heat flow across the medium.
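The training operation 540 and prediction operation 550 described above could be sketched as follows, assuming a Keras implementation in which average thermal values per frame are grouped into fixed-length windows. The window size, layer widths, and toy training data are illustrative assumptions, not the patented configuration:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

window = 10  # assumed number of frames per training sample

# LSTM mapping a window of average thermal values to one volume value
model = keras.Sequential([
    layers.Input(shape=(window, 1)),
    layers.LSTM(16),
    layers.Dense(1),  # predicted volume for the window
])
model.compile(optimizer="adam", loss="mse")

# hypothetical training pairs: thermal averages and spirometer readings
x = np.random.rand(32, window, 1).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)

# prediction operation: estimate volume from a new thermal sequence
estimate = model.predict(x[:1], verbose=0)
```

In practice the spirometer readings would replace the random targets, and many breathing samples would be collected before the network's estimates become meaningful.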
- Some embodiments aim to deliver higher-level respiratory metrics than traditional monitoring solutions. For example, some embodiments can measure or estimate nose/mouth distribution and tidal volume, neither of which is obtainable using existing methods. Therefore, the ability to extract this information from the medium-based method is useful.
- Some embodiments introduce the notion of controlled respiratory analysis that incorporates two factors: (1) patient distraction and (2) active stimulation. For the first factor, if a patient is aware that they are being monitored, their breathing may not be natural or normal. Therefore, to prevent this observation bias, some embodiments introduce methods of distracting the patient and taking their mind off the fact that they are being observed. This will result in much more natural recordings, even if they are still wearing the device.
- The second factor is an extension of this concept to active stimulation. A multimedia scene with specific content is chosen and presented to a patient to promote a particular reaction or emotion within the patient (e.g., happiness, relaxation, fear, etc.) so that different breathing patterns under an active stimulation can be observed. This provides a differential factor for the types of natural breathing that can be recorded and analyzed based on the observed environment. Some embodiments provide for both a traditional view (e.g., media, movie, animation, image sequence, etc.) and also a Virtual Reality (VR) environment. This provides an immersive and controlled environment in which some embodiments can invoke natural changes within the patient's breathing which may contribute to identifying the exaggerated effects of some pulmonary conditions.
-
FIGS. 6A-6B provide illustrations of two possible stationary setups 600 and 650 for the thin medium in a clinical setting in accordance with various embodiments of the present technology. In the embodiments illustrated in FIGS. 6A-6B, the system may include three primary components: (1) the thin medium 610, (2) a chinrest 620 to ensure the patient is a constant distance away from the thin medium, and (3) the thermal camera 630 used to image the exhale thermal distributions on the medium. The chinrest 620 and thermal camera 630 can be separated by approximately the same distance on both sides of the thin medium. - The distance of the thin medium 610 from the subject can directly affect the performance of the system and the exhale characteristics of the subject which are identified. For example, if the medium is too far away from the subject's face, the exhale may have mostly dissipated before making contact with the medium. This situation could be overcome by a strong exhale, but is exacerbated by light exhales. Conversely, if the medium is close to the subject's face and the subject forcefully exhales, the airflow will hit the surface of the medium and spread in multiple directions. As such, various embodiments use a distance between one and eight inches between the subject's face and the medium. These ranges typically ensure a strong signal while keeping enough distance to minimize the impact of turbulent flows and remain comfortable for the patient. Other embodiments may position the medium farther from or closer to the patient.
- As the patient begins the monitoring process, they place their chin on the
chinrest 620 and breathe normally onto the thin medium 610. The thermal camera will then capture the exhale thermal distribution on the thin medium 610. These embodiments provide a practical and simple setup for using the thin medium within a clinical setting for short-term monitoring sessions, at the cost of asking the patient to maintain a steady posture during the monitoring process. - In the embodiments illustrated in
FIG. 6A, a flat medium 610 can be used to easily visualize the exhale from a reasonable distance established using a chinrest 620. The design illustrated in FIG. 6B changes the shape of the medium 610 to ensure that small movements will not result in a loss of data while imaging the thin medium 610. In accordance with various embodiments, the chinrests 620 may be provided within each experimental setup as a reference for the patient to limit the effect of movement on the resulting analysis. - One of the primary challenges associated with developing a non-contact method is the movement and orientation of the patient's face. This problem is further exacerbated by the orientation of the medium, which may lead to incomplete or lost thermal signatures based on the patient's breathing pattern or facial rotation. Therefore, in some embodiments the method can be transformed into a semi-contact solution that has the patient wear the thin medium as a mask, separated from the face. The primary difference between this solution and a traditional mask is the separation distance between the patient's face and the thin medium. This provides natural airflow during inhale and provides a constant distance at which the surface will accurately represent the patient's exhale.
-
FIGS. 7A-7B illustrate examples of semi-contact mask-based thin-medium setups 700 and 750 that may be used in one or more embodiments of the present technology. As illustrated in these figures, the mask 710 may touch the patient; however, for the nose and mouth region, there is no contact. This allows the patient to breathe normally while still permitting some movement (FIG. 7A) when camera 720 is located or fixed away from the mask. In some embodiments, the mask may include sliding rails or another mechanism that allows the distance of the mask 710 from the user's face to be adjusted. As the subject breathes onto the mask 710, thermal images of the exhale can be captured by camera 720. - As illustrated in
FIG. 7B, some embodiments can provide complete freedom of movement with an attached setup that also includes camera 720 physically coupled to mask 710. The thin medium and camera can be lightweight for the feasibility of this design. As part of the mask attachment, a lightweight thermal camera is held in front of the medium at a fixed distance. This design allows the camera and the medium to move with the patient's head orientation while continuously monitoring their respiration behavior. Therefore, the patient can move and naturally look around during the monitoring session. Since this is required as a fixed attachment to the patient, a very lightweight mobile thermal camera is used for the imaging in some embodiments. Additionally, to limit how cumbersome the setup can be, some embodiments try to minimize the distance between the thermal camera and the medium attached to the face. - Respiratory analysis can be significantly affected by changes in a patient's natural breathing behavior due to the monitoring method, their movement, and their focus or concentration. These factors can significantly alter the results of the breathing analysis and should be minimized during the monitoring session. The solution is to draw the patient's concentration away from their breathing and the monitoring process and provide a sufficient distraction so that their breathing behaviors will return to normal. This will provide a more accurate method for recording natural behaviors.
- To achieve this, some embodiments provide an external stimulus that will draw the patient's attention away from the monitoring process. Two primary methods that some embodiments introduce for doing this are: (1) media-based and (2) Augmented Reality (AR) and/or Virtual Reality (VR) based distractions. An illustration of how the medium and the entertainment source are configured with respect to the thin medium is shown in
FIGS. 8A-8B. FIGS. 8A-8B illustrate components of a stimulation-based respiratory analysis system 800 and 850 that may be used in some embodiments of the present technology. The media source can provide a specific stimulus to the patient that will modify their breathing behaviors. Based on these behavior modifications, the resulting changes in the respiratory patterns will be recorded on the thin medium. - Some embodiments incorporate the concept of the mask and the VR distraction methodology for creating a controlled stimulus during the monitoring period. To build this setup, some embodiments combine both the mask design and the VR setup. This will result in the patient wearing the Head Mounted Display (HMD) 910 as they normally would for VR with an additional thin-medium 820 and
thermal camera 930 attached. Due to the weight of the HMD itself, the additional attachments will not significantly burden the patient. The illustration of this design 900 is shown in FIG. 9. This provides a controlled environment for introducing specific stimuli to the patient to monitor their changes in breathing. Since the medium and thermal camera are attached to the HMD, the user can experience VR as they would normally without the monitoring devices. - While a thermal camera provides a viable method for precisely generating a dense representation of the patient's exhale as an image, it requires the camera to be placed on the opposite side of the medium, which limits the mobility of the system. The solution to this problem would be to eliminate the camera from the design, while keeping the thin medium as the primary detection mechanism used for monitoring exhale patterns. Some embodiments use an electrically conductive mesh from which slight changes in voltage across the medium can be interpreted as thermal distributions. Based on the airflow through the permeable mesh, the temperature changes will modify the resistance of the mesh over its surface. These changes and the small fluctuations within the monitored voltages will provide a basis for identifying the thermal distribution of the exhale.
- Smart Thin-Medium with Side Camera
- Various embodiments of the present technology provide for a data-driven correlation between exhale behaviors on a thin medium and the actual CO2 exhale of the user. The purpose of this model is to improve the accuracy of data extracted from medium-view images collected from a mobile thermal camera. Some embodiments use the medium-view images and data extracted from the CO2 camera images to train a model that correlates breathing behaviors with thin medium thermal images to improve the accuracy of the TMTI method.
FIG. 10 illustrates an example of a hybrid respiratory monitoring system 1000 that may be used for respiratory monitoring through thin-medium thermal imaging in accordance with various embodiments of the present technology. As illustrated in FIG. 10, various embodiments of the system 1000 can include an exhale medium 1010, computing system 1020, a thermal camera 1030, and a side thermal camera 1040. - Various embodiments of the present technology may use local and/or remote computing resources to train and/or use a model correlating respiratory activity and recorded data.
- Images from the CO2 camera 1030 show a detailed view of breathing behavior, such as when a breath starts and stops, how the person is breathing, and how the exhaled air collides with and spreads across the medium. Information gathered from this camera informs our understanding of the circumstances occurring for each frame from the medium-view camera. Breathing mode is one metric of interest to pulmonologists that can be obtained from the CO2 camera images. Healthy individuals tend to breathe through their nose unless the nasal passage is obstructed, at which point the individual breathes either completely through their mouth, or through their nose and mouth simultaneously. By extracting breathing mode from the CO2 camera images, various embodiments can label each medium-view image and use this data to train a CNN to classify the breathing mode of medium-
view images 1060. - To extract information from the side-view CO2 camera images 1070, some embodiments can use various software (e.g., OpenCV library) to find the outline of the person's face through a combination of thresholding and morphological transformations. Various embodiments can process this data (e.g., with a NumPy Python module) to find facial landmarks. The tip of the nose is the left-most pixel coordinate of the face, and the chinrest can be identified by taking the difference of the x-values of the pixel locations and finding the greatest peak.
- The chin and mouth are located between the tip of the nose and the chinrest and can be identified in a similar manner to the chinrest. These landmarks can be used in some embodiments to mask unnecessary information and to extract the person's breathing mode. Breathing mode can be determined by processing the pixels along the outline of the face (e.g., using the SciPy Python module) and looking for peaks in the data near the nose and mouth. An absence of prominent peaks in the data indicates that the person is inhaling. This information can be used to label each medium-view thermal image as one of four breathing states: not exhaling, exhaling through the nose, exhaling through the mouth, or exhaling through the nose and mouth.
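The peak-based labeling just described can be sketched with SciPy's peak finder. This is a hypothetical illustration: the outline values, landmark indices, tolerance, and prominence threshold are assumptions, and `classify_breathing_mode` is an invented helper name:

```python
import numpy as np
from scipy.signal import find_peaks

def classify_breathing_mode(outline_vals, nose_idx, mouth_idx,
                            tol=3, prominence=5.0):
    """Label a frame from intensity values sampled along the face outline.

    Hypothetical sketch: prominent peaks near the nose/mouth landmark
    indices mark exhaled air; no prominent peaks means inhaling.
    """
    peaks, _ = find_peaks(outline_vals, prominence=prominence)
    near_nose = any(abs(p - nose_idx) <= tol for p in peaks)
    near_mouth = any(abs(p - mouth_idx) <= tol for p in peaks)
    if near_nose and near_mouth:
        return "nose+mouth"
    if near_nose:
        return "nose"
    if near_mouth:
        return "mouth"
    return "not exhaling"

vals = np.zeros(40)
vals[10] = 12.0   # warm plume at the assumed nose landmark index
mode = classify_breathing_mode(vals, nose_idx=10, mouth_idx=25)
```

The four return values correspond to the four breathing states used to label the medium-view images.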
-
FIG. 11 illustrates an example of various components of the hybrid respiratory monitoring system 1100 that may be used in one or more embodiments of the present technology. This figure shows a flow chart depicting the training process and how the monitoring system behaves during runtime. During the training process, thermal images from the medium-view and side-view cameras are collected simultaneously. Using image processing, breathing information is extracted from the side-view images and paired with the medium-view image to train a model relating the side-view exhale behaviors with those indicated on the thin medium. Temporal and spatial relationships about the exhale behavior are then extracted to be used during runtime. During runtime, a thermal image is captured from the medium-view camera only and used as input to the model, which returns predicted breathing status and breathing metrics from the image, with improved accuracy due to the prior knowledge based on the initial training from two thermal cameras. -
FIG. 12 illustrates an example of a thermal exhale sequence 1200 showing images from the medium-view and the side-view that can be collected by some embodiments of the present technology. - The medium-view images and the labels extracted from the CO2 camera images can be used to train a machine learning model that predicts breathing mode from the medium-view images. As an image classification problem, an Artificial Neural Network (ANN) model is well-suited for this task, but would likely benefit from some temporal context, as it is difficult to determine whether the thermal signature on the medium is increasing or decreasing in temperature by examining a single image. To preserve some temporal context, some embodiments feed both the original image and the difference between the current and previous images to the model, which indicates whether the medium temperature is increasing (exhale) or decreasing (inhale).
An implementation based on a CNN model can be built using the Keras Python library with a TensorFlow backend. The CNN may consist, in some embodiments, of a 2D convolutional layer with a tanh activation function, a 2D max pooling layer with a (2×2) pool size, a flattening layer, and then a dense layer. The dense layer uses a softmax activation function to return label probabilities for classification of exhale behaviors.
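The layer stack described above might look like the following in Keras. The input image size, channel layout (thermal image stacked with its difference image), and filter count are assumptions for illustration, not values from the specification:

```python
from tensorflow import keras
from tensorflow.keras import layers

# input: thermal medium image stacked with its difference image
# (the 60x80 size and 8 filters are assumed for this sketch)
model = keras.Sequential([
    layers.Input(shape=(60, 80, 2)),
    layers.Conv2D(8, (3, 3), activation="tanh"),   # 2D convolution, tanh
    layers.MaxPooling2D(pool_size=(2, 2)),         # (2x2) max pooling
    layers.Flatten(),
    layers.Dense(4, activation="softmax"),         # 4 breathing-mode labels
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The softmax output returns a probability for each of the four breathing states, matching the labeling scheme derived from the side-view camera.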
- During data collection, a test subject places their chin on the chinrest and breathes onto the medium (see, e.g.,
FIG. 10). This experimental setup is designed to place constraints on the experimental parameters and is not intended for clinical applications. One goal of this experiment was to improve the accuracy of some of the other embodiments so that, in clinical applications, the patient can sit or recline in a comfortable position with the medium placed in line with the patient's exhaled airflow. - Some embodiments may use paper as the medium material because it retains enough heat to be recorded by a thermal camera at a low framerate but dissipates heat between breaths. As one potential material, paper is also inexpensive, easy to find, and standardized. Test subjects who participated in this research were asked to provide six different samples of breathing data, approximately 60[s] each in length. They were asked to first provide the following four breathing samples while at a normal heart rate: (a) nose breathing, (b) normal mouth breathing, (c) breathing through a small mouth opening, and (d) breathing through a large mouth opening. Participants were then asked to provide nose and mouth breathing samples at a slightly elevated heart rate.
- Approximately 4000 medium images were collected from 4 individuals. All test subjects were healthy individuals with no congestion due to illness and no other breathing obstructions. Of the collected data, medium-view images that were matched with ambiguous side-view images and abnormal images resulting from camera calibrations were excluded. This resulted in 2331 medium-view images and their associated labels and measurements.
- The collected medium-view images were composed of 1245 images where the individual is inhaling, 422 images of nose breathing, 587 images of mouth breathing, and 77 images where the subject is breathing through both their nose and mouth simultaneously. The collected data was iteratively separated into training data and test data, using a 75% to 25% split. The performance of the CNN was evaluated by running the
evaluation 4 times with a different set of training and test data for each, and then calculating the accuracy of each prediction. FIG. 13 shows a confusion matrix 1300 for one of the 4 evaluations of the CNN. The results show accuracies between 91.08% and 93.81% for the 4 training and testing splits. - Breathing mode prediction from the experimental results can be performed with reasonable accuracy in various embodiments of the present technology. The classification with the highest accuracy rate was the mouth, which is unsurprising given the unique thermal signatures mouth breathing tends to produce compared with nose breathing. The CNN would likely be improved with additional training. Additional training data is expected to continue to improve the accuracy of the system, which may also be used in a real-time manner in some embodiments. Some embodiments may apply machine learning techniques to other information extracted from the side-view images, such as determining the area of the medium that is directly heated by the initial contact with exhaled air instead of heated by the spread of the exhaled air after colliding with the medium.
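For reference, per-split accuracy of the kind reported above is the fraction of correct predictions, i.e., the trace of the confusion matrix divided by its total. The matrix values below are made up for illustration and are not the data behind FIG. 13:

```python
import numpy as np

# hypothetical 4-class confusion matrix (rows: true label, cols: predicted)
# class order: not exhaling, nose, mouth, nose+mouth
cm = np.array([
    [300,  10,   5,  0],
    [  8, 100,   4,  1],
    [  6,   3, 140,  2],
    [  1,   2,   3, 20],
])
accuracy = np.trace(cm) / cm.sum()   # correct predictions / all predictions
```

For this toy matrix the accuracy comes out near the low 90% range, comparable in scale to the reported 91.08%-93.81% splits.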
-
FIG. 14 is a flowchart illustrating an example of a set of operations 1400 for computing a tidal volume estimate by training a Long-Short-Term Memory (LSTM) network to associate flow data from a spirometer with information extracted from the medium images in accordance with various embodiments of the present technology. In some embodiments, the average pixel intensity can be calculated from the medium-view images and fed to the LSTM along with the flow rate (L/s) from the spirometer. After sufficient training, medium images are fed to the LSTM and the corresponding flow rate is estimated by the network. This can be coded as: -
While training LSTM model
    Get thermal medium image
    Get spirometer flow rate
    Get average intensity of image
    Train model with intensity, spirometer flow
While running application
    Get thermal medium image
    Get average intensity of image
    Estimated flow = Classify intensity using LSTM model
- As mentioned in and shown in
FIG. 14, a second algorithm can be used to estimate tidal volume. By stacking the cross-sectional sequence of images and determining the area of the medium that is heated directly by exhaled air, a 3D volume over time can be constructed. Determining the heated area of the medium can be done through image processing of the medium image, potentially assisted by a side-view CO2-visualizing thermal camera used to determine the dimensions of the medium heated by direct exhaled air. -
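One way to realize the stacking idea above is to extrude each frame's directly heated area by the distance the exhale travels between frames. The function name, area sequence, frame interval, and flow speed in this sketch are illustrative assumptions, not measured values:

```python
def tidal_volume_liters(areas_cm2, frame_dt_s, flow_speed_cm_s):
    """Stack per-frame heated cross-sections into a volume estimate.

    Hypothetical sketch: each frame contributes one slice of the 3D
    volume, whose depth is the distance the exhaled air travels
    during one frame interval.
    """
    slice_depth_cm = flow_speed_cm_s * frame_dt_s   # extrusion per frame
    volume_cm3 = sum(a * slice_depth_cm for a in areas_cm2)
    return volume_cm3 / 1000.0                      # cm^3 -> liters

# toy sequence of directly heated areas (cm^2) over one exhale
areas = [0.0, 4.0, 9.0, 9.0, 4.0, 0.0]
volume = tidal_volume_liters(areas, frame_dt_s=0.1, flow_speed_cm_s=50.0)
```

In a real system the per-frame areas would come from the medium images (possibly refined by the side-view camera) and the flow speed from the optical-flow estimate, rather than being fixed constants.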
FIG. 15 is a flowchart illustrating an example of a set of operations 1500 for identifying a person's breathing mode using a Convolutional Neural Network (CNN) in accordance with one or more embodiments of the present technology. The CNN can be trained on the difference image resulting from subtracting the previous frame from the current frame and its label. A second CO2-visualizing thermal camera can be used as a ground-truth device to view the exhaled air from a side-profile view in order to label the associated medium-view images. Images are labeled as mouth breathing, nose breathing, both, or none. After sufficient training, the differenced medium frames are fed into the CNN. The CNN then predicts the label associated with that medium image. This can be coded as: -
While training CNN model
    Get thermal medium image
    Get difference between current, previous images
    Get breathing mode
    Train CNN with difference image, breathing mode
-
FIG. 16 illustrates techniques for computing various breathing metrics 1600 in accordance with some embodiments of the present technology. Different breathing modes (nose, mouth, or both) produce unique thermal signatures on the medium. As discussed above, some embodiments can identify areas of the heat signature on the medium resulting from nasal breathing and mouth breathing by determining the number of "hot spots" on the medium and their locations with respect to one another. Simultaneous nasal and oral breathing produces three "hot spots" on the medium, whereas nasal breathing produces two "hot spots", and mouth breathing produces one. Some embodiments may use a Watershed algorithm to segment the heat signature into one, two, or three areas on the medium, and the areas are classified based on their location on the medium. The pixels belonging to each cluster are summed, and then the ratio between each nostril or between exhaled air from the nose and mouth can be calculated. - In some embodiments, breathing rate can be extracted from a sequence of medium images by plotting the average intensity values of the medium images for a window of time and performing a Fast Fourier Transform (FFT) of the data. The FFT transforms the data from the time domain to the frequency domain. The dominant frequency corresponds to the breaths per minute (BPM) for that window of time.
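The FFT-based breathing-rate computation can be sketched as follows. The frame rate, window length, and synthetic sine-wave intensity signal are assumptions for illustration:

```python
import numpy as np

def breaths_per_minute(avg_intensity, fps):
    """Estimate BPM from average medium intensity over a time window.

    Sketch of the FFT approach: the dominant nonzero-frequency bin of
    the intensity signal corresponds to the breathing frequency.
    """
    sig = np.asarray(avg_intensity) - np.mean(avg_intensity)
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return dominant_hz * 60.0

# synthetic window: 60 s of breathing at 0.25 Hz (15 breaths per minute)
fps = 10.0
t = np.arange(0, 60, 1.0 / fps)
intensity = 30.0 + 2.0 * np.sin(2 * np.pi * 0.25 * t)
bpm = breaths_per_minute(intensity, fps)
```

With real data the average medium intensity per frame would replace the synthetic sine wave, and the window length would set the frequency resolution of the estimate.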
- In some embodiments, optical flow image processing techniques can be applied to consecutive thin-medium thermal images to highlight the spread of heat across the medium. This results in a dense field of flow vectors across the image. The length of each individual vector indicates the flow of heat between frames, where longer vectors denote a faster flow of exhaled air across the surface of the medium, and shorter vectors denote a slower flow of air across the medium.
- The positive difference image values over time from the breathing rate calculation can provide useful insight into the exhale patterns of the individual. Breathing pattern abnormalities, such as several breaths in rapid succession over a short period of time, are lost after condensing the data into a single breathing rate value. However, these abnormalities are made visible as a plot of increasing difference sums over time.
-
FIG. 17A is a block diagram 1700 illustrating components of an integrated mobile configuration of the breathing analysis system. As illustrated in FIG. 17A, the mobile device can include Tx/Rx depth sensor 1705, thermal imaging sensor 1710, local compute capability 1715, local communication 1720, and I/O communication 1725, all integrated into a single, mobile form factor. Other embodiments may include additional components (e.g., a display, AR/VR components, coprocessors, and/or the like). Using I/O communication 1725, data collected from Tx/Rx depth sensor 1705 and thermal imaging sensor 1710 can be shared via external communication array 1730 with external compute capability 1735 (e.g., workstations, laptops, computers, remote servers, cloud-based solutions, etc.). In addition, some embodiments can share any processed data or computation with external compute capability 1735. In some embodiments, the mobile solution may include minimal computational abilities in order to save power. As such, imaging instructions may be received directly from external compute capability 1735 via I/O communication 1725. The imaging instructions can cause the local compute capability 1715 to control Tx/Rx depth sensor 1705 and thermal imaging sensor 1710. As such, various embodiments of the present technology may use local and/or remote computing resources to train and/or use a model correlating respiratory activity and recorded data. -
FIG. 17B illustrates a block diagram 1750 of a sensor-only configuration of the breathing analysis system. In the embodiments illustrated in FIG. 17B, the breathing system may include depth imaging sensor 1755, Tx/Rx depth sensor 1760, local communication 1765, and external communication 1770. The device can connect to external compute capability 1775, where the images can be processed to generate an analysis of the respiratory behavior of the individual. In these embodiments, any significant processing is done by external compute capability 1775. -
FIG. 18 is an illustration of a top view of breathing analysis system 1800 where the cameras are imaging the thin-medium from the same side as the patient. In the embodiments illustrated in FIG. 18, the patient 1805 can breathe onto the thin-medium 1810. One or more cameras 1815A-1815B can be positioned on the same side of the thin-medium as the patient. The breathing analysis system can identify exhale characteristics and behaviors through the processing and/or visualization of heated exhale interacting with the thin-medium (e.g., plastics, polymers, metals, fibers, or other synthetic materials). While some embodiments image the thin-medium from the opposing side, other embodiments (such as those illustrated in FIG. 18) can image from the same or surrounding directions of the thin-medium. - Aspects and implementations of the breathing analysis system of the disclosure have been described in the general context of various steps and operations. A variety of these steps and operations may be performed by hardware components or may be embodied in computer-executable instructions, which may be used to cause a general-purpose or special-purpose processor (e.g., in a computer, server, or other computing device) programmed with the instructions to perform the steps or operations. For example, the steps or operations may be performed by a combination of hardware, software, and/or firmware.
-
FIG. 19 is a block diagram illustrating an example machine representing the computer systemization of the breathing analysis system. The system controller 1900 may be in communication with entities including one or more users 1925, client/terminal devices 1920, user input devices 1905, peripheral devices 1910, optional co-processor device(s) (e.g., cryptographic processor devices) 1915, and networks 1930. Users may engage with the controller 1900 via terminal devices 1920 over networks 1930. - Computers may employ a central processing unit (CPU) or processor to process information. Processors may include programmable general-purpose or special-purpose microprocessors, programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), embedded components, combinations of such devices, and the like. Processors execute program components in response to user- and/or system-generated requests. One or more of these components may be implemented in software, hardware, or both hardware and software. Processors pass instructions (e.g., operational and data instructions) to enable various operations.
- The
controller 1900 may include clock 1965, CPU 1970, memory such as read-only memory (ROM) 1985 and random access memory (RAM) 1980, and co-processor 1975, among others. These controller components may be connected to a system bus 1960, and through the system bus 1960 to an interface bus 1935. Further, user input devices 1905, peripheral devices 1910, co-processor devices 1915, and the like may be connected through the interface bus 1935 to the system bus 1960. The interface bus 1935 may be connected to a number of interface adapters such as processor interface 1940, input/output interfaces (I/O) 1945, network interfaces 1950, storage interfaces 1955, and the like. -
Processor interface 1940 may facilitate communication between co-processor devices 1915 and co-processor 1975. In one implementation, processor interface 1940 may expedite encryption and decryption of requests or data. Input/output interfaces (I/O) 1945 facilitate communication between user input devices 1905, peripheral devices 1910, co-processor devices 1915, and/or the like and components of the controller 1900 using protocols such as those for handling audio, data, video interfaces, wireless transceivers, or the like (e.g., Bluetooth, IEEE 1394a-b, serial, universal serial bus (USB), Digital Visual Interface (DVI), 802.11a/b/g/n/x, cellular, etc.). Network interfaces 1950 may be in communication with the network 1930. Through the network 1930, the controller 1900 may be accessible to remote terminal devices 1920. Network interfaces 1950 may use various wired and wireless connection protocols such as direct connect, Ethernet, wireless connections such as IEEE 802.11a-x, and the like. - Examples of network 1930 include the Internet, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a wireless network (e.g., using Wireless Application Protocol (WAP)), a secured custom connection, and the like. The network interfaces 1950 can include a firewall which can, in some aspects, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall may additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand. Other network security functions performed by or included in the functions of the firewall can be, for example, but are not limited to, intrusion prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure. -
Storage interfaces 1955 may be in communication with a number of storage devices such as storage devices 1990, removable disc devices, and the like. The storage interfaces 1955 may use various connection protocols such as Serial Advanced Technology Attachment (SATA), IEEE 1394, Ethernet, Universal Serial Bus (USB), and the like. -
User input devices 1905 and peripheral devices 1910 may be connected to I/O interface 1945 and potentially other interfaces, buses, and/or components. User input devices 1905 may include card readers, fingerprint readers, joysticks, keyboards, microphones, mice, remote controls, retina readers, touch screens, sensors, and/or the like. Peripheral devices 1910 may include antennas, audio devices (e.g., microphones, speakers, etc.), cameras, external processors, communication devices, radio frequency identifiers (RFIDs), scanners, printers, storage devices, transceivers, and/or the like. Co-processor devices 1915 may be connected to the controller 1900 through interface bus 1935, and may include microcontrollers, processors, interfaces, or other devices. - Computer-executable instructions and data may be stored in memory (e.g., registers, cache memory, random access memory, flash, etc.) which is accessible by processors. These stored instruction codes (e.g., programs) may engage the processor components, motherboard, and/or other system components to perform desired operations. The
controller 1900 may employ various forms of memory including on-chip CPU memory (e.g., registers), RAM 1980, ROM 1985, and storage devices 1990. Storage devices 1990 may employ any number of tangible, non-transitory storage devices or systems such as a fixed or removable magnetic disk drive, an optical drive, solid-state memory devices, and other processor-readable storage media. Computer-executable instructions stored in the memory may include one or more program modules such as routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types. For example, the memory may contain operating system (OS) component 1995, modules and other components, database tables, and the like. These modules/components may be stored and accessed from the storage devices, including from external storage devices accessible through an interface bus. - The database components can store programs executed by the processor to process the stored data. The database components may be implemented in the form of a database that is relational, scalable, and secure. Examples of such databases include DB2, MySQL, Oracle, Sybase, and the like. Alternatively, the database may be implemented using various standard data structures, such as an array, hash, list, stack, structured text file (e.g., XML), table, and/or the like. Such data structures may be stored in memory and/or in structured files.
- The
controller 1900 may be implemented in distributed computing environments, where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network ("LAN"), Wide Area Network ("WAN"), the Internet, and the like. In a distributed computing environment, program modules or subroutines may be located in both local and remote memory storage devices. Distributed computing may be employed to load balance and/or aggregate resources for processing. Alternatively, aspects of the controller 1900 may be distributed electronically over the Internet or over other networks (including wireless networks). Those skilled in the relevant art(s) will recognize that portions of the breathing analysis system may reside on a server computer, while corresponding portions reside on a client computer. Data structures and transmission of data particular to aspects of the controller 1900 are also encompassed within the scope of the disclosure. - Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to." As used herein, the terms "connected," "coupled," or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words "herein," "above," "below," and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively.
The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
- The above Detailed Description of examples of the technology is not intended to be exhaustive or to limit the technology to the precise form disclosed above. While specific examples for the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Further, any specific numbers noted herein are only examples; alternative implementations may employ differing values or ranges.
- The teachings of the technology provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various examples described above can be combined to provide further implementations of the technology. Some alternative implementations of the technology may include not only additional elements to those implementations noted above, but also may include fewer elements.
- These and other changes can be made to the technology in light of the above Detailed Description. While the above description describes certain examples of the technology, and describes the best mode contemplated, no matter how detailed the above appears in text, the technology can be practiced in many ways. Details of the system may vary considerably in its specific implementation, while still being encompassed by the technology disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the technology with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the technology to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the technology encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the technology under the claims.
- To reduce the number of claims, certain aspects of the technology are presented below in certain claim forms, but the applicant contemplates the various aspects of the technology in any number of claim forms. For example, while only one aspect of the technology is recited as a computer-readable medium claim, other aspects may likewise be embodied as a computer-readable medium claim, or in other forms, such as being embodied in a means-plus-function claim. Any claims intended to be treated under 35 U.S.C. § 112(f) will begin with the words "means for", but use of the term "for" in any other context is not intended to invoke treatment under 35 U.S.C. § 112(f). Accordingly, the applicant reserves the right to pursue such additional claim forms after filing this application, in either this application or in a continuing application.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/676,366 US20200138337A1 (en) | 2018-11-06 | 2019-11-06 | Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862756501P | 2018-11-06 | 2018-11-06 | |
| US16/676,366 US20200138337A1 (en) | 2018-11-06 | 2019-11-06 | Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20200138337A1 (en) | 2020-05-07 |
Family
ID=70459934
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/676,346 Active 2041-07-30 US11589776B2 (en) | 2018-11-06 | 2019-11-06 | Non-contact breathing activity monitoring and analyzing through thermal and CO2 imaging |
| US16/676,366 Abandoned US20200138337A1 (en) | 2018-11-06 | 2019-11-06 | Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/676,346 Active 2041-07-30 US11589776B2 (en) | 2018-11-06 | 2019-11-06 | Non-contact breathing activity monitoring and analyzing through thermal and CO2 imaging |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US11589776B2 (en) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10957038B2 (en) * | 2019-02-04 | 2021-03-23 | International Business Machines Corporation | Machine learning to determine clinical change from prior images |
| CN112613431A (en) * | 2020-12-28 | 2021-04-06 | 中北大学 | Automatic identification method, system and device for leaked gas |
| US20220167856A1 (en) * | 2020-12-01 | 2022-06-02 | The Trustees Of Dartmouth College | Lung function monitoring from heart signals |
| US11607160B2 (en) * | 2019-06-05 | 2023-03-21 | Carlos Andres Cuestas Rodriguez | System and method for multi modal deception test scored by machine learning |
| US12213786B2 (en) | 2020-06-02 | 2025-02-04 | Converus, Inc. | Oculomotor based deception detection using audio multi-issue comparison testing |
| WO2025181642A1 (en) * | 2024-02-29 | 2025-09-04 | Covidien Lp | Calculation of respiratory rate using touchless monitoring and ai |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6693609B1 (en) * | 2018-11-28 | 2020-05-13 | コニカミノルタ株式会社 | Gas flow rate estimation device, gas flow rate estimation method, and gas flow rate estimation program |
| DE102019131328B3 (en) * | 2019-11-20 | 2021-04-15 | Lavision Gmbh | Method for detecting primary gas flows in flow spaces |
| US12051241B2 (en) * | 2020-05-26 | 2024-07-30 | Xtract One Technologies Inc. | Sensor systems and methods for facility operation management |
| FR3111532B1 (en) * | 2020-06-18 | 2022-07-29 | Rubix Si | Massive access control system and method using pathology indicators |
| WO2022004461A1 (en) * | 2020-07-03 | 2022-01-06 | コニカミノルタ株式会社 | Gas region determination device, gas region determination method, learning model generation device, learning model generation method, and program |
| CN111772633B (en) * | 2020-07-16 | 2023-06-23 | 韩锋 | Remote sensing respiratory function monitoring device and method |
| US20230301546A1 (en) * | 2020-08-12 | 2023-09-28 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Imaging human respiratory gas patterns to determine volume, rate and carbon dioxide concentration |
| JP6796306B1 (en) * | 2020-09-09 | 2020-12-09 | 株式会社エクサウィザーズ | Shooting equipment, shooting method and computer program |
| EP3967215A1 (en) | 2020-09-15 | 2022-03-16 | Alunos AG | Measurement device and method for assessing vital data of a human or animal subject |
| CN112924035B (en) * | 2021-01-27 | 2022-06-21 | 复旦大学附属中山医院 | Body temperature and respiration rate extraction method based on thermal imaging sensor and application thereof |
| US12211625B2 (en) * | 2021-02-05 | 2025-01-28 | Cisco Technology, Inc. | Systems and methods for detecting and tracking infectious diseases using sensor data |
| CN112992337B (en) * | 2021-02-07 | 2022-05-24 | 华南理工大学 | Lung function assessment algorithm, device, medium and equipment for cervical and spinal cord injury patient |
| CN114978189A (en) * | 2021-02-27 | 2022-08-30 | 华为技术有限公司 | A data encoding method and related equipment |
| CN113576452A (en) * | 2021-07-30 | 2021-11-02 | 深圳市商汤科技有限公司 | Respiration rate detection method and device based on thermal imaging and electronic equipment |
| CN114022353B (en) * | 2022-01-07 | 2022-03-29 | 成都国星宇航科技有限公司 | Method and device for fusing space-time image texture and image color |
| EP4523619A4 (en) * | 2022-05-10 | 2025-08-20 | Univ Osaka Public Corp | BREATH VISUALIZATION SYSTEM AND PROCEDURE AND BREATH ASSESSMENT SYSTEM AND PROCEDURE |
| US12376763B2 (en) * | 2022-06-29 | 2025-08-05 | Apple Inc. | Non-contact respiration sensing |
| US20250331777A1 (en) * | 2022-06-30 | 2025-10-30 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Sensor suspension system for supine co2 monitoring |
| EP4586890A4 (en) * | 2022-09-15 | 2025-08-13 | Hadassa Hartman | RESPIRATORY ASSESSMENT AND MONITORING SYSTEM AND METHOD |
| WO2024105719A1 (en) * | 2022-11-14 | 2024-05-23 | 三菱電機株式会社 | Imaging system, exhalation detecting system, physical information monitoring system, and air conditioning/ventilation system |
| WO2024145683A2 (en) * | 2022-12-30 | 2024-07-04 | The Regents Of The University Of Colorado, A Body Corporate | Non-contact diagnostic system for respiratory disease prognosis through thermal-depth spatiotemporal behavioral modeling |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4945919A (en) * | 1989-02-10 | 1990-08-07 | Yamaguchi Yakuhin Shokai Ltd. | Rhinological diagnostic device |
| US20120075462A1 (en) * | 2010-09-23 | 2012-03-29 | Sony Computer Entertainment Inc. | Blow tracking user interface system and method |
| US20170095157A1 (en) * | 2015-10-03 | 2017-04-06 | Facense Ltd. | Systems for collecting thermal measurements of the face |
| US20180092589A1 (en) * | 2015-06-14 | 2018-04-05 | Facense Ltd. | Selecting a users state based on shape of the exhale stream |
| WO2018218356A1 (en) * | 2017-05-30 | 2018-12-06 | Interaxon Inc. | Wearable computing device with electrophysiological sensors |
| US20190139300A1 (en) * | 2017-11-08 | 2019-05-09 | Siemens Healthcare Gmbh | Medical scene model |
| US20190175012A1 (en) * | 2016-06-08 | 2019-06-13 | Beyond 700 Pty Ltd | An apparatus and method for using infrared thermography for viewing a tear film |
| US20190274621A1 (en) * | 2016-10-28 | 2019-09-12 | Ajou University Industry-Academic Cooperation Foundation | Breath analysis system using gas image detection method |
| US20190323895A1 (en) * | 2018-04-24 | 2019-10-24 | Helen Of Troy Limited | System and method for human temperature regression using multiple structures |
| US20200000370A1 (en) * | 2017-03-13 | 2020-01-02 | Arizona Board Of Regents On Behalf Of Arizona State University | Imaging-based spirometry systems and methods |
Family Cites Families (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4955946A (en) | 1986-12-11 | 1990-09-11 | Marquette Gas Analysis | Respiratory CO2 detector circuit with high quality waveform |
| US4928703A (en) | 1988-11-23 | 1990-05-29 | Evionics, Inc. | Non-contact respiration rate and apnea monitor using pulmonary gas exchange technique |
| SE9904836L (en) | 1999-12-28 | 2001-06-29 | Jonas Sandsten | Quantitative imaging of gas emissions using optical technology |
| CN101828904A (en) | 2002-10-03 | 2010-09-15 | 斯科特实验室公司 | Be used to provide the system and method for sensor fusion |
| US20060239921A1 (en) * | 2005-04-26 | 2006-10-26 | Novadaq Technologies Inc. | Real time vascular imaging during solid organ transplant |
| US10213171B2 (en) * | 2005-07-08 | 2019-02-26 | Shimadzu Corporation | X-ray photography system |
| US7981038B2 (en) * | 2005-10-11 | 2011-07-19 | Carnegie Mellon University | Sensor guided catheter navigation system |
| US8500451B2 (en) * | 2007-01-16 | 2013-08-06 | Simbionix Ltd. | Preoperative surgical simulation |
| US7996066B2 (en) * | 2007-08-16 | 2011-08-09 | New Frontier Imaging LLC | Topographic optical infrared tomography system for biophysical imaging with infrared diagnostic exploratory algorithm sequencing (IDEAS) scripting language |
| US8554490B2 (en) * | 2009-02-25 | 2013-10-08 | Worcester Polytechnic Institute | Automatic vascular model generation based on fluid-structure interactions (FSI) |
| US20160156880A1 (en) * | 2009-06-03 | 2016-06-02 | Flir Systems, Inc. | Durable compact multisensor observation devices |
| US8520074B2 (en) | 2010-12-14 | 2013-08-27 | Xerox Corporation | Determining a total number of people in an IR image obtained via an IR imaging system |
| US8790269B2 (en) | 2011-05-09 | 2014-07-29 | Xerox Corporation | Monitoring respiration with a thermal imaging system |
| US8715202B2 (en) * | 2011-09-27 | 2014-05-06 | Xerox Corporation | Minimally invasive image-based determination of carbon dioxide (CO2) concentration in exhaled breath |
| US8854223B2 (en) | 2012-01-18 | 2014-10-07 | Xerox Corporation | Image-based determination of CO and CO2 concentrations in vehicle exhaust gas emissions |
| DE102015006902B3 (en) * | 2015-06-04 | 2016-06-30 | Drägerwerk AG & Co. KGaA | Device for processing and visualizing data from an electro-impedance tomography device for the detection and visualization of regional delays in ventilation in the lungs |
| US9697599B2 (en) | 2015-06-17 | 2017-07-04 | Xerox Corporation | Determining a respiratory pattern from a video of a subject |
| US10226201B2 (en) * | 2015-10-29 | 2019-03-12 | Invoy Holdings, Llc | Flow regulation device for breath analysis and related method |
| US9996981B1 (en) * | 2016-03-07 | 2018-06-12 | Bao Tran | Augmented reality system |
| US10489975B2 (en) * | 2017-01-04 | 2019-11-26 | Daqri, Llc | Environmental mapping system |
| US11684293B2 (en) * | 2019-12-09 | 2023-06-27 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Sensors and method for defining breathing signatures for identifying respiratory disease |
- 2019-11-06 US US16/676,346 patent/US11589776B2/en active Active
- 2019-11-06 US US16/676,366 patent/US20200138337A1/en not_active Abandoned
Non-Patent Citations (3)
| Title |
|---|
| Pešek, M., & Pech, O. (2014). The determination of field usability of method measuring temperature fields in the air using an infrared camera. EPJ Web of Conferences, 67, 02091. https://doi.org/10.1051/epjconf/20146702091 (Year: 2014) * |
| Schoun, B., Transue, S., Halbower, A. C., & Choi, M.-H. (2018). Non-contact tidal volume measurement through thin medium thermal imaging. Smart Health, 9-10, 37–49. https://doi.org/10.1016/j.smhl.2018.07.018 (Year: 2018) * |
| Vainer, B. G. (2018). A novel high-resolution method for the respiration rate and breathing waveforms remote monitoring. Annals of Biomedical Engineering, 46(7), 960–971. https://doi.org/10.1007/s10439-018-2018-6 (Year: 2018) * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20200138292A1 (en) | 2020-05-07 |
| US11589776B2 (en) | 2023-02-28 |
Similar Documents
| Publication | Title |
|---|---|
| US20200138337A1 (en) | Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging | |
| CN105636505B (en) | Apparatus and method for obtaining vital signs of a subject | |
| Murthy et al. | Noncontact measurement of breathing function | |
| Hu et al. | Synergetic use of thermal and visible imaging techniques for contactless and unobtrusive breathing measurement | |
| Murthy et al. | Touchless monitoring of breathing function | |
| Hassan et al. | Towards health monitoring using remote heart rate measurement using digital camera: A feasibility study | |
| Li et al. | An EEG-based multi-modal emotion database with both posed and authentic facial actions for emotion analysis | |
| Krzywicki et al. | A non-contact technique for measuring eccrine sweat gland activity using passive thermal imaging | |
| CN109259749A (en) | Non-contact heart rate measurement method based on vision camera | |
| Basu et al. | Infrared imaging based hyperventilation monitoring through respiration rate estimation | |
| Ganfure | Using video stream for continuous monitoring of breathing rate for general setting | |
| CN106999062A (en) | A method for extracting heart information based on human micro-movement | |
| Ruminski et al. | Evaluation of respiration rate using thermal imaging in mobile conditions | |
| Mohd et al. | Mental stress recognition based on non-invasive and non-contact measurement from stereo thermal and visible sensors | |
| Lin et al. | Contactless monitoring of pulse rate and eye movement for uveal melanoma patients undergoing radiation therapy | |
| Tarmizi et al. | A review of facial thermography assessment for vital signs estimation | |
| KR20140057867A (en) | System for mearsuring stress using thermal image | |
| Schoun et al. | Real-time thermal medium-based breathing analysis with python | |
| Sen et al. | Alternative method for pain assessment using EMG and GSR | |
| Wang et al. | Identification of deep breath while moving forward based on multiple body regions and graph signal analysis | |
| Wang et al. | Facial landmark based BMI analysis for pervasive health informatics | |
| Aario et al. | Respiratory Pattern Recognition from Low-Resolution Thermal Imaging. | |
| Murthy et al. | Non-contact monitoring of breathing function using infrared imaging | |
| Mustafa et al. | Heart rate estimation from facial videos for depression analysis | |
| Awais et al. | Face and its features detection during nap |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | AS | Assignment | Owner name: THE REGENTS OF THE UNIVERSITY OF COLORADO, A BODY CORPORATE, COLORADO; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHOI, MIN-HYUNG;TRANSUE, SHANE;SCHOUN, BREAWN;REEL/FRAME:053455/0775; Effective date: 20200116 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |