
WO2025090347A1 - Obtaining eye movement data using patient-holdable device - Google Patents


Info

Publication number
WO2025090347A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
eye movement
movement data
data
target image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/051701
Other languages
French (fr)
Inventor
Ali SABER TEHRANI
Jorge Otero-Millan
Nathan Farrell
Pouya BARAHIM BASTANI
Hector Rieiro
David E. NEWMAN-TOKER
Shervin BADIHIAN
Taylor Max PARKER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Johns Hopkins University
Original Assignee
Johns Hopkins University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Johns Hopkins University filed Critical Johns Hopkins University
Publication of WO2025090347A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/024 Subjective types, i.e. testing apparatus requiring the active assistance of the patient for determining the visual field, e.g. perimeter types
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4076 Diagnosing or monitoring particular conditions of the nervous system
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements

Definitions

  • This disclosure relates generally to ophthalmology and neurology.
  • any of a variety of diseases and disorders may be diagnosed or prognosed by examining a patient’s eye movements using a variety of ophthalmologic tests.
  • diseases and disorders include, by way of non-limiting example, vestibular strokes, multiple sclerosis (MS), and amyotrophic lateral sclerosis (ALS).
  • a method of providing patient eye movement data using a target image displayed on a patient-holdable device includes: displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device; obtaining, by the patient-holdable device, patient eye movement data; and outputting the patient eye movement data.
  • the patient-holdable device may include a smart phone.
  • the patient-holdable device may include a smart phone of the patient, and the displaying, the obtaining, and the outputting may be performed by an application installed on the smart phone of the patient.
  • the patient-holdable device may include patient-wearable goggles.
  • the patient-wearable goggles may include a smart phone.
  • the patient eye movement data may include video data.
  • the patient eye movement data may include eye tracking data.
  • the displaying may include displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient.
  • the method may include: based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and outputting the saccade velocity, the saccade latency, or the saccade accuracy.
  • the displaying may include displaying the target image smoothly moving within a field of view of the patient.
  • the method may include: based on the patient eye movement data, measuring a pursuit gain of the patient; and outputting the pursuit gain of the patient.
  • the displaying may include displaying the target image at a stationary location within a field of vision of the patient.
  • the method may further include: rotating a position of the patient’s head relative to the target image; based at least on the patient eye movement data, measuring a vestibulo-ocular reflex of the patient; and outputting the vestibulo-ocular reflex of the patient.
  • the method may further include: rotating the patient and the device about a vertical axis; based on the patient eye movement data, measuring nystagmus data of the patient; and outputting the nystagmus data of the patient.
  • the method may include: alternately covering each eye of the patient; based on the patient eye movement data, measuring eye misalignment data of the patient; and outputting the eye misalignment data of the patient.
  • the method may include: measuring, based on the patient eye movement data, patient eye characteristic data; and automatically diagnosing the patient based on the patient eye characteristic data.
  • a non-transitory computer readable medium comprising instructions that, when executed by an electronic processor, configure the electronic processor to provide patient eye movement data using a target image displayed on a patient-holdable device by performing actions.
  • the actions include: displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device; obtaining, by the patient-holdable device, patient eye movement data; and outputting the patient eye movement data.
  • the patient-holdable device may include a smart phone.
  • the patient-holdable device may include a smart phone of the patient, and the displaying, the obtaining, and the outputting may be performed by an application installed on the smart phone of the patient.
  • the patient-holdable device may include patient-wearable goggles.
  • the patient-wearable goggles may include a smart phone.
  • the patient eye movement data may include video data.
  • the patient eye movement data may include eye tracking data.
  • the displaying may include displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient.
  • the actions may further include: based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and outputting the saccade velocity, the saccade latency, or the saccade accuracy.
  • the displaying may include displaying the target image smoothly moving within a field of view of the patient.
  • the actions may further include: based on the patient eye movement data, measuring a pursuit gain of the patient; and outputting the pursuit gain of the patient.
  • the displaying may include displaying the target image at a stationary location within a field of vision of the patient.
  • the actions may further include: based at least on the patient eye movement data obtained while rotating a position of the patient’s head relative to the target image, measuring a vestibulo-ocular reflex of the patient; and outputting the vestibulo-ocular reflex of the patient.
  • the actions may further include: based on the patient eye movement data obtained while rotating the patient and the device about a vertical axis, measuring nystagmus data of the patient; and outputting the nystagmus data of the patient.
  • the actions may further include: based on the patient eye movement data obtained while alternately covering each eye of the patient, measuring eye misalignment data of the patient; and outputting the eye misalignment data of the patient.
  • the actions may further include: measuring, based on the patient eye movement data, patient eye characteristic data; and automatically diagnosing the patient based on the patient eye characteristic data.
  • Fig. 1 depicts a system for providing patient eye movement data using a target image displayed on a patient-holdable device, such as patient-wearable goggles or a smart phone, according to various embodiments;
  • Fig. 2 depicts a patient-holdable device displaying a target image sequentially at a plurality of disjoint locations within a field of view of a patient, according to various embodiments;
  • Fig. 3 depicts a patient-holdable device displaying a target image smoothly moving within a field of view of a patient, according to various embodiments;
  • Fig. 4 depicts a patient-holdable device displaying a stationary target image within a field of view of a patient, according to various embodiments;
  • Fig. 5 depicts a flow diagram for a method of providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments.
  • a variety of diseases and disorders may be diagnosed or prognosed by examining the movement of a patient’s eyes. Such diseases and disorders include, by way of non-limiting examples, vestibular strokes, multiple sclerosis (MS), and amyotrophic lateral sclerosis (ALS). A variety of ophthalmologic tests may rely on such eye movement assessment.
  • Such tests may include, by way of non-limiting example, a saccade test (which can measure saccade velocity, latency, or accuracy), a smooth pursuit test (which can measure pursuit gain), a head impulse test (which can measure vestibulo-ocular reflex), a skew deviation test (which can measure eye misalignment), and a vestibuloocular suppression, Dix-Hallpike, or horizontal head roll test (any of which can measure nystagmus velocity or direction)
  • ophthalmologic tests of this kind have the commonality of requiring a patient to fixate their gaze on a target, which may be stationary or mobile, while the test is performed.
  • Some tests include various patient manipulations, such as sequentially covering the patient’s eyes (e.g., for the skew deviation test) or moving the patient’s head (e.g., for the head impulse test) or body (e.g., for the vestibuloocular suppression test). Nevertheless, these and other ophthalmologic tests involve obtaining data regarding the movement of the patient’s eyes while the test is performed.
  • Some embodiments provide patient eye movement data using a target image displayed on a patient-holdable device.
  • the patient-holdable device may be patient-wearable goggles or a smart phone, which the patient themselves may own.
  • some embodiments utilize software, e.g., in the form of an app, which the patient may install on their own smart phone.
  • Some embodiments use a camera in the patient-holdable device itself to capture patient eye movement data.
  • some embodiments may utilize inexpensive and/or existing, lightweight, patient-holdable equipment to obtain patient eye movement data, e.g., during the performance of an ophthalmologic test.
  • the patient eye movement data may be used to prognose or diagnose any of a variety of diseases or disorders, not limited to ophthalmologic diseases and disorders.
  • Embodiments may be used by a physician, for example, in an emergency setting or in an outpatient setting, or by the patient themselves, e.g., at home.
  • Fig. 1 depicts a system 100 for providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments.
  • the system 100 may be used to obtain patient eye movement data for a patient 102.
  • the eye movement data may be raw video or eye tracking data.
  • the system 100 may be used to perform the method 500 of Fig. 5, for example.
  • the system 100 as shown in Fig. 1 includes two patient-holdable devices, namely, a smart phone 120 and wearable goggles 130. Either of the patient-holdable devices may be used in the alternative, according to various embodiments.
  • the smart phone 120 may be any suitable smart phone and may be owned and/or operated by the patient 102 or by a clinician.
  • the smart phone 120 includes a display 121, on which a target image may be presented to the patient 102.
  • the display 121 may be any suitable smart phone display.
  • the smart phone 120 may be positioned relative to the patient 102 such that the target image is within the field of view of the patient 102. For example, according to various embodiments, a distance between the patient 102 and the smart phone 120 may be stipulated.
  • the height of the smart phone 120 relative to the height of the patient’s eyes may be stipulated, e.g., such heights may be specified as being required to be equal, such that the display 121 of the smart phone 120 is at eye level of the patient 102.
  • the smart phone 120 also includes a camera 122, which can capture video and/or still images of the eyes of the patient 102 when the patient 102 is viewing the display 121.
  • the smart phone 120 further includes an electronic processor 124 and persistent memory 125.
  • the persistent memory 125 can hold program instructions that direct the processor 124 to perform a method of providing patient eye movement data using a target image displayed on a patient-holdable device, e.g., the method 500 as shown and described herein in reference to Fig. 5.
  • the smart phone 120 is optionally in communication with a computer 110.
  • the computer 110 may be implemented as a cloud service.
  • the communication channel may include a wired connection 128 and/or a wireless connection 126.
  • the wireless connection may be, by way of nonlimiting example, BLUETOOTH or Wi-Fi.
  • the smart phone 120 captures patient eye movement data in the form of raw video and/or still images using one or more cameras.
  • the raw video and/or still images may be in the visible spectrum and/or infrared spectrum, or a combination thereof.
  • Either of the smart phone 120 and/or the computer 110 may perform subsequent actions, nonlimiting examples of which are disclosed presently.
  • Such subsequent actions may include processing the raw video eye movement data into eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient 102.
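The disclosure does not specify how raw video is converted into gaze direction; as one minimal sketch (the function name, pixel scale, and assumed eyeball radius below are all illustrative assumptions), a small-angle pupil-offset model could be:

```python
import math

def gaze_direction(pupil_px, center_px, px_per_mm, eye_radius_mm=12.0):
    """Estimate gaze angles (degrees) from the pupil center's pixel
    offset relative to the eye's resting center. Small-angle model:
    the offset, converted to millimeters and divided by an assumed
    eyeball radius, gives the sine of the rotation angle."""
    dx_mm = (pupil_px[0] - center_px[0]) / px_per_mm
    dy_mm = (pupil_px[1] - center_px[1]) / px_per_mm
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, dx_mm / eye_radius_mm))))
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, dy_mm / eye_radius_mm))))
    return yaw, pitch
```

Under these assumptions, a 1 mm pupil offset corresponds to roughly 4.8 degrees of rotation; a production eye tracker would instead use a calibrated 3-D eye model.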
  • Other subsequent actions may include processing the patient eye movement data, e.g., the raw video or the eye tracking data, to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, saccade accuracy, pursuit gain, vestibulo-ocular reflex, nystagmus velocity, nystagmus direction and/or eye misalignment.
  • Other subsequent actions may include determining a diagnosis or prognosis based on the eye characteristic data generated from the patient eye movement data.
  • the goggles 130 may be any suitable virtual reality goggles (e.g., such that the patient cannot see outside the goggles 130 while wearing them) or augmented reality goggles (e.g., such that the patient can see through the goggles 130 while wearing them).
  • the goggles 130 may be owned and/or operated by the patient 102 and/or by a clinician.
  • the goggles 130 include an internal display, on which a target image may be presented to the patient 102 within the field of view of the patient 102.
  • the goggles 130 also include one or more cameras 132, which can capture video and/or still images of the eyes of the patient 102 as the patient 102 is viewing the target image on the internal display.
  • the one or more cameras 132 may capture video and/or still images in the visible spectrum and/or the infrared spectrum, or a combination thereof.
  • the goggles 130 further include an electronic processor 134 and persistent memory 135.
  • the persistent memory 135 can hold program instructions that direct the processor 134 to perform a method of providing patient eye movement data using a target image displayed on a patient-holdable device, e.g., the method 500 as shown and described herein in reference to Fig. 5.
  • the goggles 130 include a frame into which a smart phone is inserted in order for the goggles to function.
  • the goggles 130 may include a frame that incorporates a pair of lenses that adapt the screen of a smart phone as a goggles-based virtual reality display.
  • the frame itself may or may not include any electronics.
  • the smart phone that is inserted into the frame may utilize software, such as an app, that configures the display of the smart phone as a virtual reality display using the frame.
  • Such embodiments are in contrast to embodiments that utilize integrated goggles, which do not require any additional components, and in contrast to embodiments that utilize a smart phone by itself, without a goggles frame.
  • the goggles 130 are optionally in communication with a computer 110.
  • the communication channel may include a wired connection 138 and/or a wireless connection 136.
  • the wireless connection may be, by way of non-limiting examples, BLUETOOTH or Wi-Fi.
  • the goggles 130 capture patient eye movement data in the form of raw video and/or still images, and either of the goggles 130 and/or the computer 110 may perform subsequent actions, nonlimiting examples of which are disclosed presently. Such subsequent actions may include processing the raw video eye movement data into eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient 102.
  • Other subsequent actions may include processing the eye movement data, e.g., the raw video data or the eye tracking data, to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, saccade accuracy, pursuit gain, vestibulo-ocular reflex, nystagmus velocity, nystagmus direction and/or eye misalignment.
  • Other subsequent actions may include determining a diagnosis or prognosis based on the eye characteristic data generated from the patient eye movement data.
  • Figs. 2, 3, and 4 show various target image presentations according to various embodiments. Although the target images are shown as circles in these figures, embodiments are not so limited. In general, target images may be any shape, color, and brightness.
  • Fig. 2 depicts a patient-holdable device 200 displaying a target image 202 sequentially at a plurality of disjoint locations within a field of view of a patient, according to various embodiments.
  • the patient-holdable device 200 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1.
  • Although the patient-holdable device shown in Fig. 2 is a smart phone, embodiments are not so limited.
  • the target image 202 depicted in Fig. 2 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1.
  • the patient-holdable device 200 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
  • the target image 202 is shown as appearing alternately at each of two disjoint (e.g., separate) locations.
  • the target image 202 may appear at a first location for a period of time, and then at a second location for a period of time, and then back to the first location after a period of time, and so on.
  • the period of time may be any amount of time, e.g., a number of seconds between 0.5 seconds and 10 seconds.
  • Although two locations for the target image 202 are shown in Fig. 2, embodiments are not so limited.
  • a target image may appear at any number of disjoint locations according to various embodiments.
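Such an alternating presentation can be scheduled straightforwardly; the sketch below is illustrative only (the function name, the use of normalized screen coordinates, and the dwell time are assumptions, not details from the disclosure):

```python
def saccade_target_schedule(locations, dwell_s, total_s):
    """Build a list of (onset_time_s, location) pairs that cycle the
    target image among the given disjoint locations, showing it at
    each location for dwell_s seconds until total_s elapses."""
    schedule, t, i = [], 0.0, 0
    while t < total_s:
        schedule.append((round(t, 3), locations[i % len(locations)]))
        t += dwell_s
        i += 1
    return schedule
```

Because the location index wraps with the modulo, the same routine covers two alternating locations or any larger set of disjoint locations.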
  • the target image 202 as shown and described herein in reference to Fig. 2 may be presented to a patient as part of a saccade test.
  • the patient may be instructed to keep their head motionless during the saccade test.
  • the instructions may be presented to the patient on the display of the patient-holdable device.
  • An embodiment may obtain patient eye movement data from the patient during the saccade test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 202 to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, saccade accuracy.
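One plausible way to extract those saccade characteristics from a one-dimensional gaze trace is sketched below; the disclosure gives no formula, so the 30 deg/s detection threshold and all names here are assumptions:

```python
def saccade_metrics(times_s, gaze_deg, onset_s, target_deg,
                    velocity_threshold=30.0):
    """Estimate saccade latency, peak velocity, and accuracy from a
    gaze trace (degrees) sampled at times_s, given the time the target
    jumped (onset_s) and its new position (target_deg)."""
    # Finite-difference velocity between successive samples (deg/s).
    vel = [(gaze_deg[i + 1] - gaze_deg[i]) / (times_s[i + 1] - times_s[i])
           for i in range(len(gaze_deg) - 1)]
    # Latency: time from target onset to the first velocity-threshold
    # crossing after onset.
    latency = None
    for i, v in enumerate(vel):
        if times_s[i] >= onset_s and abs(v) > velocity_threshold:
            latency = times_s[i] - onset_s
            break
    peak_velocity = max(abs(v) for v in vel)
    # Accuracy expressed as gain: landing position over target step.
    accuracy = gaze_deg[-1] / target_deg if target_deg else None
    return latency, peak_velocity, accuracy
```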
  • Fig. 3 depicts a patient-holdable device displaying a target image smoothly moving within a field of view of a patient, according to various embodiments.
  • the patient-holdable device 300 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1.
  • Although the patient-holdable device shown in Fig. 3 is a smart phone, embodiments are not so limited.
  • the target image 302 depicted in Fig. 3 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1.
  • the patient-holdable device 300 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
  • the target image 302 is shown as smoothly moving from one side of the patient’s field of vision to the other.
  • the target image 302 may move back to the first location, and then to the second location, and so on, according to various embodiments.
  • Although the target image 302 is shown as a dot in Fig. 3, embodiments are not so limited.
  • the target image may include a bar, or a plurality of bars.
  • Although the target image 302 is shown as moving side-to-side in Fig. 3, embodiments are not so limited.
  • the target image 302 may move up and down within the field of view of the patient, or along any pattern, e.g., along a square, rectangular, triangular, or circular pattern within the field of view of the patient. Yet further, according to various embodiments, for closed-loop patterns such as, by way of non-limiting example, a circle, the target image 302 may move clockwise and/or counterclockwise.
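A smoothly moving target such as the side-to-side motion described above could be generated, for example, from a sinusoid; the amplitude and frequency below are illustrative choices, and the disclosure equally permits square, rectangular, triangular, or circular paths:

```python
import math

def pursuit_target_position(t_s, amplitude=0.4, frequency_hz=0.25):
    """Horizontal position of a sinusoidally moving pursuit target at
    time t_s, expressed as a fraction of screen width from center
    (so 0.4 swings between 40% left and 40% right of center)."""
    return amplitude * math.sin(2.0 * math.pi * frequency_hz * t_s)
```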
  • the target image 302 as shown and described herein in reference to Fig. 3 may be presented to a patient as part of a smooth pursuit test.
  • the patient may be instructed to keep their head motionless during the smooth pursuit test.
  • the instructions may be presented to the patient on the display of the patient-holdable device.
  • An embodiment may obtain patient eye movement data from the patient during the smooth pursuit test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 302 to generate eye characteristic data representing pursuit gain.
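Pursuit gain is conventionally the ratio of eye velocity to target velocity; a simplified estimate is sketched below (clinical analyses typically first remove catch-up saccades from the eye trace, which this sketch omits, and the function name is an assumption):

```python
def pursuit_gain(eye_deg, target_deg, dt_s):
    """Pursuit gain: mean absolute eye velocity divided by mean
    absolute target velocity over matched samples taken dt_s apart."""
    eye_vel = [(eye_deg[i + 1] - eye_deg[i]) / dt_s
               for i in range(len(eye_deg) - 1)]
    tgt_vel = [(target_deg[i + 1] - target_deg[i]) / dt_s
               for i in range(len(target_deg) - 1)]
    mean_eye = sum(abs(v) for v in eye_vel) / len(eye_vel)
    mean_tgt = sum(abs(v) for v in tgt_vel) / len(tgt_vel)
    return mean_eye / mean_tgt
```

A gain near 1.0 indicates the eye keeps pace with the target; markedly lower values suggest impaired smooth pursuit.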
  • Fig. 4 depicts a patient-holdable device displaying a stationary target image within a field of view of a patient, according to various embodiments.
  • the patient-holdable device 400 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1.
  • Although the patient-holdable device shown in Fig. 4 is a smart phone, embodiments are not so limited.
  • the target image 402 depicted in Fig. 4 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1.
  • the patient-holdable device 400 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
  • the target image 402 is shown as being stationary within the patient’s field of view.
  • the target image 402 may be presented anywhere within the patient’s field of view, e.g., at the center of the patient’s field of view, according to various embodiments.
  • the target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a head impulse test.
  • the patient’s head may be rotated side to side, e.g., by a clinician.
  • the patient may be instructed to rotate their head side to side.
  • the instructions may be presented to the patient on the display of the patient-holdable device.
  • An embodiment may obtain patient eye movement data from the patient during the head impulse test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 402 together with patient head movement data to generate eye characteristic data representing a vestibulo-ocular reflex.
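As one hedged sketch of such a computation (not a formula taken from the disclosure), vestibulo-ocular reflex gain can be estimated by regressing sign-inverted eye velocity against head velocity, a standard least-squares slope:

```python
def vor_gain(head_vel_dps, eye_vel_dps):
    """Vestibulo-ocular reflex gain from matched head and eye velocity
    samples (deg/s). Eye velocity is sign-inverted because the
    compensatory eye movement opposes the head movement; a gain near
    1.0 indicates a normal reflex."""
    num = sum(-e * h for e, h in zip(eye_vel_dps, head_vel_dps))
    den = sum(h * h for h in head_vel_dps)
    return num / den
```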
  • the target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a vestibuloocular suppression test.
  • the patient may be instructed to keep their head motionless during the vestibuloocular suppression test.
  • the patient’s body may be manipulated, for example, by a clinician, to rotate about a vertical axis (e.g., while seated in an office chair). Alternately, the patient may be instructed to rotate their body about a vertical axis (e.g., while seated in an office chair).
  • the instructions may be presented to the patient on the display of the patient-holdable device.
  • An embodiment may obtain patient eye movement data from the patient during the vestibuloocular suppression test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing nystagmus velocity and/or nystagmus direction.
  • the target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a skew deviation test.
  • the patient may be instructed to keep their head motionless during the skew deviation test.
  • the patient’s eyes may be alternately covered, e.g., by a clinician during the skew deviation test. Alternately, the patient may be instructed to alternately cover each eye during the skew deviation test.
  • the instructions may be presented to the patient on the display of the patient-holdable device.
  • An embodiment may obtain patient eye movement data from the patient during the skew deviation test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing eye misalignment.
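The disclosure gives no formula for eye misalignment; one illustrative estimate (all names assumed) is the mean vertical offset between the two eyes' gaze positions recorded during the alternate-cover maneuver:

```python
def skew_deviation_deg(right_eye_vert_deg, left_eye_vert_deg):
    """Vertical misalignment (skew deviation) estimated as the mean
    difference between matched right- and left-eye vertical gaze
    positions (degrees); near-zero values indicate aligned eyes."""
    diffs = [r - l for r, l in zip(right_eye_vert_deg, left_eye_vert_deg)]
    return sum(diffs) / len(diffs)
```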
  • the target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a Dix-Hallpike test or horizontal head roll test.
  • the patient’s head may be manipulated, e.g., by a clinician during these tests.
  • An embodiment may obtain patient eye movement data from the patient during the Dix-Hallpike test or horizontal head roll test, e.g., as shown and described herein in reference to Figs. 1 and/or 5.
  • An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing nystagmus data (e.g., nystagmus velocity and/or nystagmus direction).
  • the inventors conducted a study in which they used an embodiment as shown and described in reference to Fig. 4 to detect positional nystagmus, which is indicative of benign paroxysmal positional vertigo (BPPV).
  • the study recruited patients presenting with positional dizziness who were suspected to have BPPV.
  • the study used a smartphone application to obtain patient eye movement data while undergoing a Dix-Hallpike test and a horizontal head roll test.
  • the patient eye movement data was analyzed to detect traces with nystagmus.
  • An expert reviewed videos obtained by the smartphone and marked the ones with nystagmus consistent with BPPV.
  • the inventors assessed the accuracy of the embodiment against the expert review and calculated its sensitivity, specificity, positive and negative predictive values.
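Those four accuracy measures follow from a standard 2x2 confusion matrix comparing the automated detector against the expert video review; the counts in the usage example below are placeholders, not results from the study:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive/negative predictive
    values from confusion-matrix counts, where the expert review is
    the reference standard: tp = detector and expert both positive,
    tn = both negative, fp/fn = detector disagrees with expert."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv
```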
  • Fig. 5 depicts a flow diagram for a method 500 of providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments.
  • the method 500 may be practiced using a system such as the system 100 as shown and described herein in reference to Fig. 1. According to some embodiments, the method 500 may be practiced at home by a patient using the patient’s smart phone or goggles.
  • At 502, the method 500 includes displaying a target image to a patient on a patient-holdable device.
  • the patient-holdable device may be a smart phone.
  • the patient-holdable device may be patient-wearable goggles.
  • the patient-wearable goggles may comprise a frame into which a smart phone is inserted.
  • any, or any combination, of 502, 504, and/or 506 may be performed by an application, e.g., an app, installed on the smart phone or goggles.
  • the patient-holdable device may be a smart phone of the patient.
  • The target image may be displayed in any of a variety of manners.
  • The target image may be displayed sequentially at a plurality of disjoint locations in a field of view of the patient, e.g., as shown and described herein in reference to Fig. 2.
  • The target image may be displayed smoothly moving within a field of view of the patient, e.g., as shown and described herein in reference to Fig. 3.
  • The target image may be displayed at a stationary location within a field of vision of the patient, e.g., as shown and described herein in reference to Fig. 4.
  • The method 500 includes obtaining, by the patient-holdable device, patient eye movement data.
  • The patient eye movement data may be any of a variety of types.
  • The patient eye movement data includes raw video data.
  • The patient eye movement data includes eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient.
  • The patient eye movement data may be acquired while the patient undergoes certain physical actions, such as: manipulating or rotating a position of the patient’s head relative to the target image, rotating the patient and the device about a vertical axis, or alternately covering each eye of the patient.
  • The method 500 includes outputting the patient eye movement data.
  • The patient eye movement data may be output from the smart phone or goggles to another device, e.g., from the smart phone 120 or goggles 130 to the computer 110 as shown and described herein in reference to Fig. 1.
  • The patient eye movement data is output from such other device to a subsequent device, such as, by way of non-limiting examples, a computer or cloud server.
  • The outputting can include outputting the patient eye movement data to a clinician, e.g., to a clinical computer system over a network.
  • Any of a variety of actions may follow 502, 504, and 506.
  • Such actions may include measuring, based on the patient eye movement data, patient eye characteristic data, specific examples of which are set forth presently.
  • Such subsequent actions may include measuring a saccade velocity, a saccade latency, and/or a saccade accuracy and outputting the saccade velocity, the saccade latency, and/or the saccade accuracy.
  • Such subsequent actions may include measuring a pursuit gain of the patient and outputting the pursuit gain of the patient.
  • Such subsequent actions may include measuring a vestibulo-ocular reflex of the patient and outputting the vestibulo-ocular reflex of the patient, measuring nystagmus data of the patient and outputting the nystagmus data of the patient, and/or measuring eye misalignment data of the patient and outputting the eye misalignment data of the patient.
  • Such subsequent actions include prognosing or diagnosing the patient based on the eye movement data or any of the aforementioned eye characteristic data.
  • Such prognosis or diagnosis may be performed automatically, e.g., by a trained machine learning system.
  • Obtaining the prognosis or diagnosis may include one or more of: characterizing the eye movements as normal or abnormal, characterizing the nature of the eye movement abnormality, when one is present, clinically classifying the overall pattern of abnormal eye movements, generating a list of diagnostic possibilities from the pattern of abnormalities, and/or estimating the probability of one or more particular diseases on the basis of the pattern of abnormalities.
  • Such subsequent actions may include treating the patient based on the eye movement data, the eye characteristic data, the prognosis, and/or the diagnosis.
  • The computer programs can exist in a variety of forms, both active and inactive.
  • The computer programs can exist as software program(s) comprised of program instructions in source code, object code, executable code or other formats; firmware program(s); or hardware description language (HDL) files. Any of the above can be embodied on a transitory or non-transitory computer readable medium, which includes storage devices and signals, in compressed or uncompressed form.
  • Exemplary computer readable storage devices include conventional computer system RAM (random access memory), ROM (read-only memory), EPROM (erasable, programmable ROM), EEPROM (electrically erasable, programmable ROM), flash memory, and magnetic or optical disks or tapes.
  • These computer readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, server computer with computational capabilities, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the electronic processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer readable program instructions may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the C programming language or similar programming languages.
  • The computer readable program instructions may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • The terms “A or B” and “A and/or B” are intended to encompass A, B, or {A and B}. Further, the terms “A, B, or C” and “A, B, and/or C” are intended to encompass single items, pairs of items, or all items, that is, all of: A, B, C, {A and B}, {A and C}, {B and C}, and {A and B and C}.
  • The term “or” as used herein means “and/or.”
  • Language such as “at least one of X, Y, and Z,” “at least one of X, Y, or Z,” “at least one or more of X, Y, and Z,” “at least one or more of X, Y, or Z,” “at least one or more of X, Y, and/or Z,” or “at least one of X, Y, and/or Z,” is intended to be inclusive of both a single item (e.g., just X, or just Y, or just Z) and multiple items (e.g., {X and Y}, {X and Z}, {Y and Z}, or {X, Y, and Z}).
  • The phrase “at least one of” and similar phrases are not intended to convey a requirement that each possible item must be present, although each possible item may be present.
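The nystagmus measurements mentioned in the points above can be illustrated with a brief sketch. The helper below is a simplified, hypothetical estimator (its name, threshold, and units are illustrative, not taken from any embodiment described herein): a nystagmus trace alternates slow drifts with fast quick-phase resets, so a rough slow-phase velocity can be obtained by differentiating the gaze trace and taking the median velocity after discarding fast samples.

```python
from statistics import median

def slow_phase_velocity(gaze_deg, fs, quick_phase_thresh=100.0):
    """Estimate nystagmus slow-phase velocity (deg/s) from a gaze trace
    sampled at fs Hz, ignoring fast quick-phase (reset) samples."""
    velocities = [(b - a) * fs for a, b in zip(gaze_deg, gaze_deg[1:])]
    slow = [v for v in velocities if abs(v) < quick_phase_thresh]
    return median(slow) if slow else 0.0
```

A nonzero result with a consistent sign would suggest nystagmus, with the sign indicating its direction.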

Abstract

Techniques for providing patient eye movement data using a target image displayed on a patient-holdable device are presented. The techniques include displaying a target image to a patient, where the displaying includes displaying the target image on a patient-holdable device. The techniques also include obtaining, by the patient-holdable device, patient eye movement data. The techniques further include outputting the patient eye movement data.

Description

OBTAINING EYE MOVEMENT DATA USING PATIENT-HOLDABLE DEVICE
Field
[0001] This disclosure relates generally to ophthalmology and neurology.
Background
[0002] Any of a variety of diseases and disorders may be diagnosed or prognosed by examining a patient’s eye movements using a variety of ophthalmologic tests. Such diseases and disorders include, by way of non-limiting example, vestibular strokes, multiple sclerosis (MS), and amyotrophic lateral sclerosis (ALS).
[0003] Existing devices for obtaining patient eye movement data are immobile, large (e.g., 2’x2’), expensive (e.g., about $40,000), and require trained technicians to operate. Such devices are generally not readily available, particularly in settings such as urgent care facilities and emergency rooms.
Summary
[0004] According to various embodiments, a method of providing patient eye movement data using a target image displayed on a patient-holdable device is presented. The method includes: displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device; obtaining, by the patient-holdable device, patient eye movement data; and outputting the patient eye movement data.
[0005] Various optional features of the above method embodiments include the following. The patient-holdable device may include a smart phone. The patient-holdable device may include a smart phone of the patient, and the displaying, the obtaining, and the outputting may be performed by an application installed on the smart phone of the patient. The patient-holdable device may include patient-wearable goggles. The patient-wearable goggles may include a smart phone. The patient eye movement data may include video data. The patient eye movement data may include eye tracking data. The displaying may include displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient. The method may include: based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and outputting the saccade velocity, the saccade latency, or the saccade accuracy. The displaying may include displaying the target image smoothly moving within a field of view of the patient. The method may include: based on the patient eye movement data, measuring a pursuit gain of the patient; and outputting the pursuit gain of the patient. The displaying may include displaying the target image at a stationary location within a field of vision of the patient. The method may further include: rotating a position of the patient’s head relative to the target image; based at least on the patient eye movement data, measuring a vestibulo-ocular reflex of the patient; and outputting the vestibulo-ocular reflex of the patient. The method may further include: rotating the patient and the device about a vertical axis; based on the patient eye movement data, measuring nystagmus data of the patient; and outputting the nystagmus data of the patient. The method may include: alternately covering each eye of the patient; based on the patient eye movement data, measuring eye misalignment data of the patient; and outputting the eye misalignment data of the patient. The method may include: measuring, based on the patient eye movement data, patient eye characteristic data; and automatically diagnosing the patient based on the patient eye characteristic data.
[0006] According to various embodiments, a non-transitory computer readable medium comprising instructions that, when executed by an electronic processor, configure the electronic processor to provide patient eye movement data using a target image displayed on a patient-holdable device by performing actions is presented. The actions include: displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device; obtaining, by the patient-holdable device, patient eye movement data; and outputting the patient eye movement data.
[0007] Various optional features of the above computer-readable medium claims include the following. The patient-holdable device may include a smart phone. The patient-holdable device may include a smart phone of the patient, and the displaying, the obtaining, and the outputting may be performed by an application installed on the smart phone of the patient. The patient-holdable device may include patient-wearable goggles. The patient-wearable goggles may include a smart phone. The patient eye movement data may include video data. The patient eye movement data may include eye tracking data. The displaying may include displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient. The actions may further include: based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and outputting the saccade velocity, the saccade latency, or the saccade accuracy. The displaying may include displaying the target image smoothly moving within a field of view of the patient. The actions may further include: based on the patient eye movement data, measuring a pursuit gain of the patient; and outputting the pursuit gain of the patient. The displaying may include displaying the target image at a stationary location within a field of vision of the patient. The actions may further include: based at least on the patient eye movement data obtained while rotating a position of the patient’s head relative to the target image, measuring a vestibulo-ocular reflex of the patient; and outputting the vestibuloocular reflex of the patient. The actions may further include: based on the patient eye movement data obtained while rotating the patient and the device about a vertical axis, measuring nystagmus data of the patient; and outputting the nystagmus data of the patient. 
The actions may further include: based on the patient eye movement data obtained while alternately covering each eye of the patient, measuring eye misalignment data of the patient; and outputting the eye misalignment data of the patient. The actions may further include: measuring, based on the patient eye movement data, patient eye characteristic data; and automatically diagnosing the patient based on the patient eye characteristic data.
[0008] Combinations (including multiple dependent combinations) of the above-described elements and those within the specification have been contemplated by the inventors and may be made, except where otherwise indicated or where contradictory.
Brief Description of the Drawings
[0009] Various features of the examples can be more fully appreciated, as the same become better understood with reference to the following detailed description of the examples when considered in connection with the accompanying figures, in which:
[0010] Fig. 1 depicts a system for providing patient eye movement data using a target image displayed on a patient-holdable device, such as patient-wearable goggles or a smart phone, according to various embodiments;
[0011] Fig. 2 depicts a patient-holdable device displaying a target image sequentially at a plurality of disjoint locations within a field of view of a patient, according to various embodiments;
[0012] Fig. 3 depicts a patient-holdable device displaying a target image smoothly moving within a field of view of a patient, according to various embodiments;
[0013] Fig. 4 depicts a patient-holdable device displaying a stationary target image within a field of view of a patient, according to various embodiments; and
[0014] Fig. 5 depicts a flow diagram for a method of providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments.
Description of the Examples
[0015] Reference will now be made in detail to example implementations, illustrated in the accompanying drawings. Wherever convenient, the same reference numbers will be used throughout the drawings to refer to the same or like parts. In the following description, reference is made to the accompanying drawings that form a part thereof, and in which is shown by way of illustration specific exemplary examples in which the invention may be practiced. These examples are described in sufficient detail to enable those skilled in the art to practice the invention and it is to be understood that other examples may be utilized and that changes may be made without departing from the scope of the invention. The following description is, therefore, merely exemplary.
[0016] A variety of diseases and disorders may be diagnosed or prognosed by examining the movement of a patient’s eyes. Such diseases and disorders include, by way of non-limiting examples, vestibular strokes, multiple sclerosis (MS), and amyotrophic lateral sclerosis (ALS). A variety of ophthalmologic tests may rely on such eye movement assessment. Such tests may include, by way of non-limiting example, a saccade test (which can measure saccade velocity, latency, or accuracy), a smooth pursuit test (which can measure pursuit gain), a head impulse test (which can measure vestibulo-ocular reflex), a skew deviation test (which can measure eye misalignment), and a vestibulo-ocular suppression, Dix-Hallpike, or horizontal head roll test (any of which can measure nystagmus velocity or direction).
[0017] These and other ophthalmologic tests share the requirement that a patient fixate their gaze on a target, which may be stationary or mobile, while the test is performed. Some tests include various patient manipulations, such as sequentially covering the patient’s eyes (e.g., for the skew deviation test) or moving the patient’s head (e.g., for the head impulse test) or body (for the vestibulo-ocular suppression test). Nevertheless, these and other ophthalmologic tests involve obtaining data regarding the movement of the patient’s eyes while the test is performed.
[0018] Some embodiments provide patient eye movement data using a target image displayed on a patient-holdable device. The patient-holdable device may be patient-wearable goggles or a smart phone, which the patient themselves may own. For example, some embodiments utilize software, e.g., in the form of an app, which the patient may install on their own smart phone. Some embodiments use a camera in the patient-holdable device itself to capture patient eye movement data. Thus, some embodiments may utilize inexpensive and/or existing, lightweight, patient-holdable equipment to obtain patient eye movement data, e.g., during the performance of an ophthalmologic test. Subsequently, the patient eye movement data may be used to prognose or diagnose any of a variety of diseases or disorders, not limited to ophthalmologic diseases and disorders. Embodiments may be used by a physician, for example, in an emergency setting or in an outpatient setting, or by the patient themselves, e.g., at home.
[0019] These and other features and advantages are shown and described herein in reference to the figures.
[0020] Fig. 1 depicts a system 100 for providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments. The system 100 may be used to obtain patient eye movement data for a patient 102. The eye movement data may be raw video or eye tracking data. The system 100 may be used to perform the method 500 of Fig. 5, for example.
[0021] The system 100 as shown in Fig. 1 includes two patient-holdable devices, namely, a smart phone 120 and wearable goggles 130. Either of the patient-holdable devices may be used in the alternative, according to various embodiments.
[0022] The smart phone 120 may be any suitable smart phone and may be owned and/or operated by the patient 102 or by a clinician. The smart phone 120 includes a display 121, on which a target image may be presented to the patient 102. The display 121 may be any suitable smart phone display. According to an embodiment, the smart phone 120 may be positioned relative to the patient 102 such that the target image is within the field of view of the patient 102. For example, according to various embodiments, a distance between the patient 102 and the smart phone 120 may be stipulated. In addition, or in the alternative, the height of the smart phone 120 relative to the height of the patient’s eyes may be stipulated, e.g., such heights may be specified as being required to be equal, such that the display 121 of the smart phone 120 is at eye level of the patient 102. The smart phone 120 also includes a camera 122, which can capture video and/or still images of the eyes of the patient 102 when the patient 102 is viewing the display 121. The smart phone 120 further includes an electronic processor 124 and persistent memory 125. According to an embodiment, the persistent memory 125 can hold program instructions that direct the processor 124 to perform a method of providing patient eye movement data using a target image displayed on a patient-holdable device, e.g., the method 500 as shown and described herein in reference to Fig. 5.
[0023] The smart phone 120 is optionally in communication with a computer 110. According to some embodiments, the computer 110 may be implemented as a cloud service. The communication channel may include a wired connection 128 and/or a wireless connection 126. The wireless connection may be, by way of non-limiting example, BLUETOOTH or Wi-Fi. According to various embodiments, the smart phone 120 captures patient eye movement data in the form of raw video and/or still images using one or more cameras. The raw video and/or still images may be in the visible spectrum and/or infrared spectrum, or a combination thereof. Either of the smart phone 120 and/or the computer 110 may perform subsequent actions, non-limiting examples of which are disclosed presently. Such subsequent actions may include processing the raw video eye movement data into eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient 102. Other subsequent actions may include processing the patient eye movement data, e.g., the raw video or the eye tracking data, to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, saccade accuracy, pursuit gain, vestibulo-ocular reflex, nystagmus velocity, nystagmus direction and/or eye misalignment. Other subsequent actions may include determining a diagnosis or prognosis based on the eye characteristic data generated from the patient eye movement data.
[0024] The goggles 130 may be any suitable virtual reality goggles (e.g., such that the patient cannot see outside the goggles 130 while wearing them) or augmented reality goggles (e.g., such that the patient can see through the goggles 130 while wearing them). The goggles 130 may be owned and/or operated by the patient 102 and/or by a clinician. The goggles 130 include an internal display, on which a target image may be presented to the patient 102 within the field of view of the patient 102.
The goggles 130 also include one or more cameras 132, which can capture video and/or still images of the eyes of the patient 102 as the patient 102 is viewing the target image on the internal display. The one or more cameras 132 may capture video and/or still images in the visible spectrum and/or the infrared spectrum, or a combination thereof. The goggles 130 further include an electronic processor 134 and persistent memory 135. According to an embodiment, the persistent memory 135 can hold program instructions that direct the processor 134 to perform a method of providing patient eye movement data using a target image displayed on a patient-holdable device, e.g., the method 500 as shown and described herein in reference to Fig. 5.
[0025] According to some embodiments, the goggles 130 include a frame into which a smart phone is inserted in order for the goggles to function. According to such embodiments, the goggles 130 may include a frame that incorporates a pair of lenses that adapt the screen of a smart phone as a goggles-based virtual reality display. The frame itself may or may not include any electronics. According to some embodiments, the smart phone that is inserted into the frame may utilize software, such as an app, that configures the display of the smart phone as a virtual reality display using the frame. Such embodiments are in contrast to embodiments that utilize integrated goggles, which do not require any additional components, and in contrast to embodiments that utilize a smart phone by itself, without a goggles frame.
[0026] The goggles 130 are optionally in communication with a computer 110. The communication channel may include a wired connection 138 and/or a wireless connection 136. The wireless connection may be, by way of non-limiting examples, BLUETOOTH or Wi-Fi. According to various embodiments, the goggles 130 capture patient eye movement data in the form of raw video and/or still images, and either of the goggles 130 and/or the computer 110 may perform subsequent actions, non-limiting examples of which are disclosed presently. Such subsequent actions may include processing the raw video eye movement data into eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient 102. Other subsequent actions may include processing the eye movement data, e.g., the raw video data or the eye tracking data, to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, saccade accuracy, pursuit gain, vestibulo-ocular reflex, nystagmus velocity, nystagmus direction and/or eye misalignment.
Other subsequent actions may include determining a diagnosis or prognosis based on the eye characteristic data generated from the patient eye movement data.
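As one concrete illustration of processing raw video into eye tracking data, video-oculography pipelines often map a tracked pupil-center pixel coordinate to a gaze angle via a calibration fit. The two-point linear fit below is a minimal sketch of that idea; the function name and parameters are illustrative assumptions, not taken from any embodiment described herein.

```python
def calibrate_linear(px_a, deg_a, px_b, deg_b):
    """Fit gaze_deg = m * pupil_px + c from two calibration targets shown
    at known visual angles (a common first-order approximation)."""
    m = (deg_b - deg_a) / (px_b - px_a)
    c = deg_a - m * px_a
    return lambda pupil_px: m * pupil_px + c
```

For example, if targets at -10° and +10° place the pupil center at pixels 100 and 300, the fitted mapping sends pixel 200 to 0°.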
[0027] Figs. 2, 3, and 4 show various target image presentations according to various embodiments. Although the target images are shown as circles in these figures, embodiments are not so limited. In general, target images may be any shape, color, and brightness.
[0028] Fig. 2 depicts a patient-holdable device 200 displaying a target image 202 sequentially at a plurality of disjoint locations within a field of view of a patient, according to various embodiments. The patient-holdable device 200 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1. Thus, although the patient-holdable device shown in Fig. 2 is a smart phone, embodiments are not so limited. For example, the target image 202 depicted in Fig. 2 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1. The patient-holdable device 200 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
[0029] In Fig. 2, the target image 202 is shown as appearing alternately at each of two disjoint (e.g., separate) locations. The target image 202 may appear at a first location for a period of time, and then at a second location for a period of time, and then back to the first location after a period of time, and so on. The period of time may be any amount of time, e.g., a number of seconds between 0.5 seconds and 10 seconds. Although two locations for the target image 202 are shown in Fig. 2, embodiments are not so limited. A target image may appear at any number of disjoint locations according to various embodiments.
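The alternating presentation just described can be sketched as a small schedule generator. This is an illustrative sketch only; the positions are normalized screen coordinates, and the fixed dwell period corresponds to the 0.5-to-10-second range noted above.

```python
def saccade_schedule(locations, dwell_s, total_s):
    """Return (onset_time_s, location) pairs, cycling through the disjoint
    target locations and holding each for dwell_s seconds until total_s."""
    events, t, i = [], 0.0, 0
    while t < total_s:
        events.append((t, locations[i % len(locations)]))
        t += dwell_s
        i += 1
    return events
```

A rendering loop could consume these events to move the target at each onset time.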
[0030] The target image 202 as shown and described herein in reference to Fig. 2 may be presented to a patient as part of a saccade test. The patient may be instructed to keep their head motionless during the saccade test. The instructions may be presented to the patient on the display of the patient-holdable device. An embodiment may obtain patient eye movement data from the patient during the saccade test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 202 to generate eye characteristic data representing any, or any combination, of: saccade velocity, saccade latency, or saccade accuracy.
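The saccade measurements named above can be sketched from an eye trace and a known target step. The following is a simplified, hypothetical computation (names and threshold are illustrative): latency is the time from the target step until eye velocity first exceeds a threshold, peak velocity is the maximum speed after onset, and accuracy is the achieved amplitude relative to the target amplitude.

```python
def saccade_metrics(t, gaze_deg, step_time, target_deg, vel_thresh=30.0):
    """Return (latency_s, peak_velocity_deg_s, accuracy) for one saccade.
    t: sample times (s); gaze_deg: gaze angles (deg); step_time: target
    onset (s); target_deg: target amplitude relative to starting gaze."""
    vel = [(gaze_deg[i + 1] - gaze_deg[i]) / (t[i + 1] - t[i])
           for i in range(len(t) - 1)]
    # First sample after the target step where the eye moves fast enough.
    onset = next(i for i in range(len(vel))
                 if t[i] >= step_time and abs(vel[i]) > vel_thresh)
    latency = t[onset] - step_time
    peak_velocity = max(abs(v) for v in vel[onset:])
    accuracy = (gaze_deg[-1] - gaze_deg[0]) / target_deg
    return latency, peak_velocity, accuracy
```
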
[0031] Fig. 3 depicts a patient-holdable device displaying a target image smoothly moving within a field of view of a patient, according to various embodiments. The patient-holdable device 300 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1. Thus, although the patient- holdable device shown in Fig. 3 is a smart phone, embodiments are not so limited. For example, the target image 302 depicted in Fig. 3 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1 . The patient-holdable device 300 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
[0032] In Fig. 3, the target image 302 is shown as smoothly moving from one side of the patient’s field of vision to the other. The target image 302 may move back to the first location, and then to the second location, and so on, according to various embodiments. Although the target image 302 is shown as a dot in Fig. 3, embodiments are not so limited. According to various embodiments, the target image may include a bar, or a plurality of bars. Further, although the target image 302 is shown as moving side-to-side in Fig. 3, embodiments are not so limited. According to various embodiments, the target image 302 may move up and down within the field of view of the patient, or along any pattern, e.g., along a square, rectangular, triangular, or circular pattern within the field of view of the patient. Yet further, according to various embodiments, for closed-loop patterns such as, by way of non-limiting example, a circle, the target image 302 may move clockwise and/or counterclockwise.
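A smoothly moving target such as the side-to-side motion just described can be generated, for example, as a sinusoidal sweep between two screen positions, a common stimulus in pursuit testing because its velocity profile is smooth. The function and parameter names below are illustrative assumptions.

```python
import math

def pursuit_position(t, x_left, x_right, period_s):
    """Horizontal target position at time t for a sinusoidal sweep between
    x_left and x_right (normalized coordinates) with the given period."""
    mid = (x_left + x_right) / 2.0
    amp = (x_right - x_left) / 2.0
    return mid + amp * math.sin(2.0 * math.pi * t / period_s)
```

Sampling this at the display refresh rate yields the dot's position each frame.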
[0033] The target image 302 as shown and described herein in reference to Fig. 3 may be presented to a patient as part of a smooth pursuit test. The patient may be instructed to keep their head motionless during the smooth pursuit test. The instructions may be presented to the patient on the display of the patient-holdable device. An embodiment may obtain patient eye movement data from the patient during the smooth pursuit test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 302 to generate eye characteristic data representing pursuit gain.
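Pursuit gain, as referenced above, is conventionally the ratio of smooth eye velocity to target velocity (a gain near 1.0 indicates good pursuit). The helper below is a simplified, hypothetical estimate that regresses eye velocity on target velocity; a clinical implementation would first remove catch-up saccades, which this sketch omits.

```python
def pursuit_gain(eye_deg, target_deg, fs):
    """Least-squares estimate of pursuit gain from matched eye and target
    angle traces sampled at fs Hz."""
    eye_v = [(b - a) * fs for a, b in zip(eye_deg, eye_deg[1:])]
    tgt_v = [(b - a) * fs for a, b in zip(target_deg, target_deg[1:])]
    num = sum(e * g for e, g in zip(eye_v, tgt_v))
    den = sum(g * g for g in tgt_v)
    return num / den
```
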
[0034] The inventors conducted a study to evaluate the feasibility of Amyotrophic Lateral Sclerosis (ALS) patients self-recording their eye-movement data using an embodiment as shown and described in reference to Figs. 2 and 3. In general, ALS can affect various eye movements, making eye tracking useful for disease monitoring. The study recruited ten participants, who were provided with an embodiment that included a smartphone equipped with an app and step-by-step instructions for recording their eye-movement data. The eye-movement test of the study consisted of the following:
[0035] 1. Test of horizontal and vertical saccades: the participants were instructed to position their face 30 cm from the smartphone (using the smartphone’s distance measure feature) and run the saccade test in the app. The test displayed two red dots, which appeared alternately on the screen as shown and described herein in reference to Fig. 2. The dots were placed 12.5 cm apart, spanning 20° of the visual field at a 30-cm distance. After recording vertical saccades, the study participants were instructed to rotate the smartphone 90° counterclockwise to record horizontal saccades in a similar manner.
[0036] 2. Test of horizontal and vertical smooth pursuit: the instructions for this test were similar to the saccade test, except a single red dot moved smoothly across the smartphone screen and the study participants were instructed to follow it with their eyes. The dot was designed to move at a velocity of 10°/second at a 30-cm distance.
[0037] The goal of the study was for the participants to record their eye movements (saccades and smooth pursuit) without the help of the study team. Afterward, a trained physician administered the same tests using video-oculography (VOG) goggles and asked the participants to complete a questionnaire regarding their self-recording experience. All participants successfully completed the self-recording process without assistance from the study team. Questionnaire data, represented in the Table below, indicated that participants viewed self-recording using the embodiment favorably, considering it easy and comfortable. Moreover, 70% indicated that they prefer self-recording to being recorded by VOG goggles.
Question | Strongly disagree (%) | Disagree (%) | Neutral (%) | Agree (%) | Strongly agree (%)
It was comfortable to record the eye movements using the phone | 0 | 0 | 0 | 60 | 40
It was easy to adjust the phone in the recommended position in front of my face | 0 | 0 | 0 | 70 | 30
It was easy to follow the instructions provided by the instructional video | 0 | 0 | 0 | 30 | 70
It was easy to perform the tests without help from the study team | 0 | 0 | 0 | 50 | 50
It was easy to navigate between the features of the phone application | 0 | 0 | 0 | 30 | 70
It was easy for me to perform the saccade test (looking at the red dot at different locations on the screen) and recording my eye movements using the phone | 0 | 0 | 0 | 40 | 60
It was easy for me to perform the smooth pursuit test (following the red dot as it moves on the screen) and recording my eye movements using the phone | 0 | 0 | 0 | 20 | 80
If I were provided with the instructional video, I would be able to record my eye movements at home using my smartphone | 0 | 0 | 0 | 40 | 60
Table: Questionnaire Data
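As background on the stimulus geometry used in studies like the one above, the on-screen separation that subtends a given visual angle, and the screen speed corresponding to a given angular speed, follow from elementary trigonometry. This sketch is illustrative only; the function names are assumptions, and the study's actual stimulus parameters are those stated in paragraphs [0035] and [0036]:

```python
import math

def span_cm(angle_deg, viewing_distance_cm):
    """On-screen separation (cm) subtending `angle_deg` of visual field
    for a viewer at `viewing_distance_cm`, target centered on the gaze line."""
    return 2 * viewing_distance_cm * math.tan(math.radians(angle_deg / 2))

def screen_speed_cm_s(angular_speed_deg_s, viewing_distance_cm):
    """Small-angle conversion of target angular speed to on-screen speed."""
    return viewing_distance_cm * math.radians(angular_speed_deg_s)
```

For instance, at a 30-cm viewing distance, a target moving at 10°/second translates to roughly 5.2 cm/second on the screen.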
[0038] Fig. 4 depicts a patient-holdable device displaying a stationary target image within a field of view of a patient, according to various embodiments. The patient-holdable device 400 may be used in a system, such as the system 100 as shown and described herein in reference to Fig. 1. Thus, although the patient-holdable device shown in Fig. 4 is a smart phone, embodiments are not so limited. For example, the target image 402 depicted in Fig. 4 may be presented within a field of vision of a patient using goggles, e.g., the goggles 130 as shown and described in reference to Fig. 1. The patient-holdable device 400 may be used to perform a method of providing patient eye movement data using a target image, e.g., the method 500 as shown and described herein in reference to Fig. 5.
[0039] In Fig. 4, the target image 402 is shown as being stationary within the patient’s field of view. The target image 402 may be presented anywhere within the patient’s field of view, e.g., at the center of the patient’s field of view, according to various embodiments.
[0040] The target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a head impulse test. During the head impulse test, the patient’s head may be rotated side to side, e.g., by a clinician. Alternately, during the head impulse test, the patient may be instructed to rotate their head side to side. The instructions may be presented to the patient on the display of the patient-holdable device. An embodiment may obtain patient eye movement data from the patient during the head impulse test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 402 together with patient head movement data to generate eye characteristic data representing a vestibulo-ocular reflex.
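Vestibulo-ocular reflex (VOR) gain of the kind generated in paragraph [0040] is conventionally the ratio of compensatory eye velocity to head velocity during the impulse. The sketch below is a minimal illustration, not the disclosed embodiment; the function name, the assumption of simultaneously sampled velocity traces in degrees/second, and the 50°/second threshold for isolating the impulse are choices made for this example:

```python
import numpy as np

def vor_gain(eye_vel_deg_s, head_vel_deg_s, min_head_speed=50.0):
    """Estimate vestibulo-ocular reflex gain. A healthy VOR rotates the
    eyes opposite the head at matching speed, so gain is near 1."""
    eye = np.asarray(eye_vel_deg_s, dtype=float)
    head = np.asarray(head_vel_deg_s, dtype=float)
    impulse = np.abs(head) > min_head_speed  # consider only the rapid rotation
    return float(np.median(-eye[impulse] / head[impulse]))
```

For example, an eye trace that counter-rotates at 95% of head speed during the impulse yields a gain of 0.95.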
[0041] The target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a vestibulo-ocular suppression test. The patient may be instructed to keep their head motionless during the vestibulo-ocular suppression test. The patient’s body may be manipulated, for example, by a clinician, to rotate about a vertical axis (e.g., while seated in an office chair). Alternately, the patient may be instructed to rotate their body about a vertical axis (e.g., while seated in an office chair). The instructions may be presented to the patient on the display of the patient-holdable device. An embodiment may obtain patient eye movement data from the patient during the vestibulo-ocular suppression test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing nystagmus velocity and/or nystagmus direction.
[0042] The target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a skew deviation test. The patient may be instructed to keep their head motionless during the skew deviation test. The patient’s eyes may be alternately covered, e.g., by a clinician during the skew deviation test. Alternately, the patient may be instructed to alternately cover each eye during the skew deviation test. The instructions may be presented to the patient on the display of the patient-holdable device. An embodiment may obtain patient eye movement data from the patient during the skew deviation test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing eye misalignment.
[0043] The target image 402 as shown and described herein in reference to Fig. 4 may be presented to a patient as part of a Dix-Hallpike test or horizontal head roll test. The patient’s head may be manipulated, e.g., by a clinician during these tests. An embodiment may obtain patient eye movement data from the patient during the Dix-Hallpike test or horizontal head roll test, e.g., as shown and described herein in reference to Figs. 1 and/or 5. An embodiment may process the patient eye movement data obtained using the target image 402 to generate eye characteristic data representing nystagmus data (e.g., nystagmus velocity and/or nystagmus direction).
[0044] The inventors conducted a study in which they used an embodiment as shown and described in reference to Fig. 4 to detect positional nystagmus, which is indicative of benign paroxysmal positional vertigo (BPPV). The study recruited patients presenting with positional dizziness who were suspected to have BPPV. The study used a smartphone application to obtain patient eye movement data while the patients underwent a Dix-Hallpike test and a horizontal head roll test. The patient eye movement data was analyzed to detect traces with nystagmus. An expert reviewed the videos obtained by the smartphone and marked the ones with nystagmus consistent with BPPV. Finally, the inventors assessed the accuracy of the embodiment against the expert review and calculated its sensitivity, specificity, and positive and negative predictive values.
[0045] A total of ten participants (60% women) were recruited for the study, with an average age of 61.8±15.4 years. A total of 23 positional maneuvers were performed, four (17.4%) of which showed nystagmus upon expert review. The embodiment results indicated the presence of nystagmus in 3 traces (sensitivity = 75%) and correctly ruled out the presence of nystagmus in 19 traces (specificity = 100%). The positive predictive and negative predictive values of the embodiment were 100% and 95%, respectively.
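The accuracy figures reported for this study follow from a standard 2×2 confusion matrix: 3 true positives, 0 false positives, 1 false negative, and 19 true negatives. A short sketch of the computation (the function name is an illustrative choice, not part of the disclosure):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # positive predictive value
        "npv": tn / (tn + fn),  # negative predictive value
    }

# The study's counts reproduce the reported values:
# sensitivity 0.75, specificity 1.0, PPV 1.0, NPV 0.95
metrics = diagnostic_accuracy(tp=3, fp=0, fn=1, tn=19)
```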
[0046] Fig. 5 depicts a flow diagram for a method 500 of providing patient eye movement data using a target image displayed on a patient-holdable device according to various embodiments. The method 500 may be practiced using a system such as the system 100 as shown and described herein in reference to Fig. 1. According to some embodiments, the method 500 may be practiced at home by a patient using the patient’s smart phone or goggles.
[0047] At 502, the method 500 includes displaying a target image to a patient on a patient-holdable device. According to various embodiments, the patient-holdable device may be a smart phone. According to various embodiments, the patient-holdable device may be patient-wearable goggles. According to various embodiments, the patient-wearable goggles may comprise a frame into which a smart phone is inserted. According to various embodiments, any, or any combination, of 502, 504, and/or 506 may be performed by an application, e.g., an app, installed on the smart phone or goggles. According to various embodiments, the patient-holdable device may be a smart phone of the patient.
[0048] The target image may be displayed in any of a variety of manners. According to various embodiments, the target image may be displayed sequentially at a plurality of disjoint locations in a field of view of the patient, e.g., as shown and described herein in reference to Fig. 2. According to various embodiments, the target image may be displayed smoothly moving within a field of view of the patient, e.g., as shown and described herein in reference to Fig. 3. According to various embodiments, the target image may be displayed at a stationary location within a field of vision of the patient, e.g., as shown and described herein in reference to Fig. 4.
[0049] At 504, the method 500 includes obtaining, by the patient-holdable device, patient eye movement data. The patient eye movement data may be any of a variety of types. According to various embodiments, the patient eye movement data includes raw video data. According to various embodiments, the patient eye movement data includes eye tracking data, e.g., where the eye tracking data specifies a gaze direction of the eyes of the patient. According to some embodiments, the patient eye movement data may be acquired while the patient undergoes certain physical actions, such as: manipulating or rotating a position of the patient’s head relative to the target image, rotating the patient and the device about a vertical axis, or alternately covering each eye of the patient.
[0050] At 506, the method 500 includes outputting the patient eye movement data. According to some embodiments, the patient eye movement data may be output from the smart phone or goggles to another device, e.g., from the smart phone 120 or goggles 130 to the computer 110 as shown and described herein in reference to Fig. 1. According to some embodiments, the patient eye movement data is output from such other device to a subsequent device, such as, by way of non-limiting examples, a computer or cloud server. According to some embodiments, e.g., where the method 500 is practiced at home by a patient using the patient’s patient-holdable device, the outputting can include outputting the patient eye movement data to a clinician, e.g., to a clinical computer system over a network.
[0051] Any of a variety of actions may follow 502, 504, and 506. For example, such actions may include measuring, based on the patient eye movement data, patient eye characteristic data, specific examples of which are set forth presently. According to some embodiments, e.g., where the target image is displayed sequentially at a plurality of disjoint locations in a field of view of the patient, such subsequent actions may include measuring a saccade velocity, a saccade latency, and/or a saccade accuracy and outputting the saccade velocity, the saccade latency, and/or the saccade accuracy. According to some embodiments, e.g., where the target image is shown as smoothly moving from one side of the patient’s field of vision to the other, such subsequent actions may include measuring a pursuit gain of the patient and outputting the pursuit gain of the patient.
According to some embodiments, e.g., where the target image is shown as being stationary within the patient’s field of view, such subsequent actions may include measuring a vestibulo-ocular reflex of the patient and outputting the vestibulo-ocular reflex of the patient, measuring nystagmus data of the patient and outputting the nystagmus data of the patient, and/or measuring eye misalignment data of the patient and outputting the eye misalignment data of the patient.
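The saccade measurements mentioned in paragraph [0051] (velocity, latency, accuracy) are typically extracted with a velocity-threshold detector. The sketch below is illustrative only; the 30°/second threshold, the function name, and the sampling assumptions are choices made for this example, not part of the disclosure:

```python
import numpy as np

def first_saccade_metrics(eye_pos_deg, target_onset_idx, sample_rate_hz,
                          vel_threshold=30.0):
    """Find the first saccade after target onset; report latency (s)
    and peak velocity (deg/s) via a simple velocity-threshold detector."""
    vel = np.gradient(np.asarray(eye_pos_deg, dtype=float)) * sample_rate_hz
    above = np.abs(vel[target_onset_idx:]) > vel_threshold
    if not above.any():
        return None  # no saccade detected
    start = target_onset_idx + int(np.argmax(above))  # first suprathreshold sample
    end = start
    while end < len(vel) and abs(vel[end]) > vel_threshold:
        end += 1  # extend to the end of the suprathreshold run
    return {
        "latency_s": (start - target_onset_idx) / sample_rate_hz,
        "peak_velocity_deg_s": float(np.max(np.abs(vel[start:end]))),
    }
```

Applied to a trace sampled at 1000 Hz in which the eye starts moving 150 ms after target onset, the detector reports a latency near 0.15 s.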
[0052] According to some embodiments, such subsequent actions include prognosing or diagnosing the patient based on the eye movement data or any of the aforementioned eye characteristic data. Such prognosis or diagnosis may be performed automatically, e.g., by a trained machine learning system. Obtaining the prognosis or diagnosis may include one or more of: characterizing the eye movements as normal or abnormal, characterizing the nature of the eye movement abnormality when one is present, clinically classifying the overall pattern of abnormal eye movements, generating a list of diagnostic possibilities from the pattern of abnormalities, and/or estimating the probability of one or more particular diseases on the basis of the pattern of abnormalities.
[0053] According to some embodiments, such subsequent actions may include treating the patient based on the eye movement data, the eye characteristic data, the prognosis, and/or the diagnosis.
[0054] Certain examples can be performed using a computer program or set of programs. The computer programs can exist in a variety of forms, both active and inactive. For example, the computer programs can exist as software program(s) comprised of program instructions in source code, object code, executable code, or other formats; firmware program(s); or hardware description language (HDL) files. Any of the above can be embodied on a transitory or non-transitory computer readable medium, which includes storage devices and signals, in compressed or uncompressed form. Exemplary computer readable storage devices include conventional computer system RAM (random access memory), ROM (read-only memory), EPROM (erasable, programmable ROM), EEPROM (electrically erasable, programmable ROM), flash memory, and magnetic or optical disks or tapes.
[0055] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented using computer readable program instructions that are executed by an electronic processor.
[0056] These computer readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, server computer with computational capabilities, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the electronic processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
[0057] In embodiments, the computer readable program instructions may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the C programming language or similar programming languages. The computer readable program instructions may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
[0058] As used herein, the terms “A or B” and “A and/or B” are intended to encompass A, B, or {A and B}. Further, the terms “A, B, or C” and “A, B, and/or C” are intended to encompass single items, pairs of items, or all items, that is, all of: A, B, C, {A and B}, {A and C}, {B and C}, and {A and B and C}. The term “or” as used herein means “and/or.”
[0059] As used herein, language such as “at least one of X, Y, and Z,” “at least one of X, Y, or Z,” “at least one or more of X, Y, and Z,” “at least one or more of X, Y, or Z,” “at least one or more of X, Y, and/or Z,” or “at least one of X, Y, and/or Z,” is intended to be inclusive of both a single item (e.g., just X, or just Y, or just Z) and multiple items (e.g., {X and Y}, {X and Z}, {Y and Z}, or {X, Y, and Z}). The phrase “at least one of” and similar phrases are not intended to convey a requirement that each possible item must be present, although each possible item may be present.
[0060] The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function]...” or “step for [perform]ing [a function]...”, it is intended that such elements are to be interpreted under 35 U.S.C. § 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. § 112(f).
[0061] While the invention has been described with reference to the exemplary examples thereof, those skilled in the art will be able to make various modifications to the described examples without departing from the true spirit and scope. The terms and descriptions used herein are set forth by way of illustration only and are not meant as limitations. In particular, although the method has been described by examples, the steps of the method can be performed in a different order than illustrated or simultaneously. Those skilled in the art will recognize that these and other variations are possible within the spirit and scope as defined in the following claims and their equivalents.

Claims

What is claimed is:
1. A method of providing patient eye movement data using a target image displayed on a patient-holdable device, the method comprising:
displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device;
obtaining, by the patient-holdable device, patient eye movement data; and
outputting the patient eye movement data.
2. The method of claim 1, wherein the patient-holdable device comprises a smart phone.
3. The method of claim 2, wherein the patient-holdable device comprises a smart phone of the patient, and wherein the displaying, the obtaining, and the outputting are performed by an application installed on the smart phone of the patient.
4. The method of claim 1, wherein the patient-holdable device comprises patient-wearable goggles.
5. The method of claim 4, wherein the patient-wearable goggles comprise a smart phone.
6. The method of claim 1, wherein the patient eye movement data comprises video data.
7. The method of claim 1, wherein the patient eye movement data comprises eye tracking data.
8. The method of claim 1, wherein the displaying comprises displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient.
9. The method of claim 8, further comprising:
based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and
outputting the saccade velocity, the saccade latency, or the saccade accuracy.
10. The method of claim 1, wherein the displaying comprises displaying the target image smoothly moving within a field of view of the patient.
11. The method of claim 10, further comprising:
based on the patient eye movement data, measuring a pursuit gain of the patient; and
outputting the pursuit gain of the patient.
12. The method of claim 1, wherein the displaying comprises displaying the target image at a stationary location within a field of vision of the patient.
13. The method of claim 12, further comprising:
rotating a position of the patient’s head relative to the target image;
based at least on the patient eye movement data, measuring a vestibulo-ocular reflex of the patient; and
outputting the vestibulo-ocular reflex of the patient.
14. The method of claim 12, further comprising:
rotating the patient and the device about a vertical axis;
based on the patient eye movement data, measuring nystagmus data of the patient; and
outputting the nystagmus data of the patient.
15. The method of claim 12, further comprising:
alternately covering each eye of the patient;
based on the patient eye movement data, measuring eye misalignment data of the patient; and
outputting the eye misalignment data of the patient.
16. The method of claim 1, further comprising:
measuring, based on the patient eye movement data, patient eye characteristic data; and
automatically diagnosing the patient based on the patient eye characteristic data.
17. A non-transitory computer readable medium comprising instructions that, when executed by an electronic processor, configure the electronic processor to provide patient eye movement data using a target image displayed on a patient-holdable device, by performing actions comprising:
displaying a target image to a patient, wherein the displaying comprises displaying the target image on a patient-holdable device;
obtaining, by the patient-holdable device, patient eye movement data; and
outputting the patient eye movement data.
18. The computer readable medium of claim 17, wherein the patient-holdable device comprises a smart phone.
19. The computer readable medium of claim 18, wherein the patient-holdable device comprises a smart phone of the patient, and wherein the displaying, the obtaining, and the outputting are performed by an application installed on the smart phone of the patient.
20. The computer readable medium of claim 17, wherein the patient-holdable device comprises patient-wearable goggles.
21. The computer readable medium of claim 20, wherein the patient-wearable goggles comprise a smart phone.
22. The computer readable medium of claim 17, wherein the patient eye movement data comprises video data.
23. The computer readable medium of claim 17, wherein the patient eye movement data comprises eye tracking data.
24. The computer readable medium of claim 17, wherein the displaying comprises displaying the target image sequentially at a plurality of disjoint locations in a field of view of the patient.
25. The computer readable medium of claim 24, wherein the actions further comprise:
based on the patient eye movement data, measuring at least one of a saccade velocity, a saccade latency, or a saccade accuracy; and
outputting the saccade velocity, the saccade latency, or the saccade accuracy.
26. The computer readable medium of claim 17, wherein the displaying comprises displaying the target image smoothly moving within a field of view of the patient.
27. The computer readable medium of claim 26, wherein the actions further comprise:
based on the patient eye movement data, measuring a pursuit gain of the patient; and
outputting the pursuit gain of the patient.
28. The computer readable medium of claim 17, wherein the displaying comprises displaying the target image at a stationary location within a field of vision of the patient.
29. The computer readable medium of claim 28, wherein the actions further comprise:
based at least on the patient eye movement data obtained while rotating a position of the patient’s head relative to the target image, measuring a vestibulo-ocular reflex of the patient; and
outputting the vestibulo-ocular reflex of the patient.
30. The computer readable medium of claim 28, wherein the actions further comprise:
based on the patient eye movement data obtained while rotating the patient and the device about a vertical axis, measuring nystagmus data of the patient; and
outputting the nystagmus data of the patient.
31. The computer readable medium of claim 28, wherein the actions further comprise:
based on the patient eye movement data obtained while alternately covering each eye of the patient, measuring eye misalignment data of the patient; and
outputting the eye misalignment data of the patient.
32. The computer readable medium of claim 17, wherein the actions further comprise:
measuring, based on the patient eye movement data, patient eye characteristic data; and
automatically diagnosing the patient based on the patient eye characteristic data.
PCT/US2024/051701 2023-10-26 2024-10-17 Obtaining eye movement data using patient-holdable device Pending WO2025090347A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363593310P 2023-10-26 2023-10-26
US63/593,310 2023-10-26

Publications (1)

Publication Number Publication Date
WO2025090347A1 true WO2025090347A1 (en) 2025-05-01

Family

ID=95516367

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/051701 Pending WO2025090347A1 (en) 2023-10-26 2024-10-17 Obtaining eye movement data using patient-holdable device

Country Status (1)

Country Link
WO (1) WO2025090347A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160262608A1 (en) * 2014-07-08 2016-09-15 Krueger Wesley W O Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance
US20170367638A1 (en) * 2016-05-16 2017-12-28 Neuro Kinetics, Inc. Apparatus and Method for Computerized Rotational Head Impulse Test
US20210113079A1 (en) * 2016-02-16 2021-04-22 Massachusetts Eye And Ear Infirmary Mobile device application for ocular misalignment measurement
US20210174959A1 (en) * 2017-11-30 2021-06-10 Viewmind S.A. , System and method for detecting neurological disorders and for measuring general cognitive performance



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24883114

Country of ref document: EP

Kind code of ref document: A1