WO2025124699A1 - Foveated rendering of an extended reality environment - Google Patents
- Publication number
- WO2025124699A1 (PCT/EP2023/085410)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- tracking
- controller
- tracker
- environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
Definitions
- According to a third aspect there is presented a controller for foveated rendering of an XR environment.
- the controller comprises an obtain module configured to obtain, using an eye-tracker, a gaze direction of a user towards a point in the XR environment.
- the controller comprises an obtain module configured to obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment.
- the controller comprises a determine module configured to determine a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy.
- the controller comprises a render module configured to render the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
- a computer program for foveated rendering of an XR environment comprises computer code which, when run on processing circuitry of a controller, causes the controller to perform actions.
- One action causes the controller to obtain, using an eye-tracker, a gaze direction of a user towards a point in the XR environment.
- One action causes the controller to obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment.
- One action causes the controller to determine a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy.
- One action causes the controller to render the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
- a computer program product comprising a computer program according to the fourth aspect and a computer readable storage medium on which the computer program is stored.
- the computer readable storage medium could be a non-transitory computer readable storage medium.
- these aspects provide eye-tracking that does not suffer from the above-identified issues.
- these aspects enable energy-efficient foveated rendering in the context of eye-tracking.
- these aspects enable efficient foveated rendering using electrooculography-based eyetracking as well as camera-based eye-tracking.
- these aspects enable the time that electrooculography-based eye-tracking is used for foveated rendering to be maximized.
- Fig. 1 is a schematic diagram illustrating user devices according to embodiments
- Fig. 4 schematically illustrates an XR environment according to an embodiment
- the embodiments disclosed herein in particular relate to techniques for foveated rendering of an XR environment.
- These techniques are provided by a controller, a method performed by the controller, and a computer program product comprising code, for example in the form of a computer program, that, when run on a controller, causes the controller to perform the method.
- Fig. 1 is a schematic diagram illustrating user devices 110, represented by a pair of smart glasses, or XR glasses, for eye-tracking according to embodiments.
- the eye-tracking might be performed in the context of gaming, navigation, or tracking user behaviour in an XR environment. Further, the eye-tracking might be performed as part of a meeting, a learning event, etc.
- a first eye-tracking mode is represented by electrooculography-based eye-tracking and a second eye-tracking mode is represented by camera-based eye-tracking.
- the electrooculography-based eye-tracking is based on signals obtained from electrodes 120.
- Fig. 1 schematically illustrates different types of implementations of electrooculography-based eye-tracking, and particularly the placements of the electrodes 120.
- In Fig. 1(a), six electrodes 120 (denoted HR (horizontal right), VL (vertical lower), HL (horizontal left), VU (vertical upper), REF (reference), GND (ground)) are fixed to the skin of a user 150 wearing the user device 110.
- the electrodes (as represented by a single electrode 120) are part of the user device 110 itself.
- the electrodes (as represented by a single electrode 120) are placed on an in-ear headset 140.
- the electrodes might be integrated with audio and microphone circuitry in the in-ear headset 140.
- the camera-based eye-tracking is based on signals obtained from a camera 130.
- a system for selecting eye-tracking mode for an eye-tracking application comprises the electrodes 120 to be placed on a user 150 of the eye-tracking application as well as a controller 600, 700 for selecting the eye-tracking mode for the eye-tracking application. Functionality and operations as performed by the controller 600, 700 will be disclosed in further detail below.
- at least one second eye-tracking mode of the available eye-tracking modes involves camera-based eye-tracking, and the system further comprises a camera 130.
- S102: The controller 600, 700 obtains, using an eye-tracker, a gaze direction of a user towards a point in the XR environment 400.
- S104: The controller 600, 700 obtains a gaze accuracy for using the eye-tracker to perform eye-tracking of objects 430 in a vicinity of this point in the XR environment 400.
- the gaze accuracy for using the eye-tracker to perform the eye-tracking can be represented by an eye-tracking resolution map 300a, 300b, 300c. That is, in some embodiments, the gaze accuracy varies over the XR environment 400 according to an eye-tracking resolution map 300a, 300b, 300c. In the eye-tracking resolution map 300a, 300b, 300c, certain areas represent a higher gaze accuracy (i.e., better resolution) for performing the eye-tracking. Examples of three such eye-tracking resolution maps 300a, 300b, 300c are provided in Fig. 3.
- the eye-tracking resolution maps 300a, 300b, 300c can thus be used to provide information of where the field-of-view of the gaze detection is of good enough accuracy and, consequently, which objects can be distinguished with good-enough probability for the used eye-tracker.
- the eye-tracking resolution map 300a is an example of a highly irregular eye-tracking resolution map, where the gaze accuracy varies over the field-of-view
- the eye-tracking resolution maps 300b, 300c are examples of regular eye-tracking resolution maps, where the gaze accuracy is more or less constant over the whole field-of-view. This is schematically illustrated in Fig. 3.
- that the eye-tracking resolution map 300a is a highly irregular eye-tracking resolution map follows from the fact that the ellipses 310a are of different sizes, i.e., the diameters d1, d2, d3, d4 all are different from each other. In contrast, in each of the eye-tracking resolution maps 300b, 300c, the diameter d5, d6 is the same, irrespective of where the ellipses 310b, 310c are placed within the eye-tracking resolution maps 300b, 300c.
- The diameter d7 is then smaller than if the user instead is gazing in a direction corresponding to region 420b, for example looking at the triangular object or the rhombus-shaped object.
- Although the two circular objects are placed closer to each other than the triangular object and the rhombus-shaped object are, it might still be easier for the used eye-tracker to distinguish which of the two circular objects the user is gazing at than to distinguish whether the user is gazing at the triangular object or the rhombus-shaped object.
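One possible realisation of such an eye-tracking resolution map is a set of cells over the field-of-view, each holding the expected gaze error for gazes landing in that cell. This is only an illustrative sketch, not the patent's data structure; all names (`ResolutionCell`, `accuracy_at`, `is_regular`) are hypothetical. It also makes the regular/irregular distinction above testable:

```python
from dataclasses import dataclass

@dataclass
class ResolutionCell:
    x: float             # centre of the cell in the field-of-view (degrees)
    y: float
    accuracy_deg: float  # expected gaze error when gazing in this cell

class ResolutionMap:
    """Eye-tracking resolution map as a set of cells. Irregular maps
    (cells with different accuracies, cf. map 300a) and regular maps
    (one accuracy everywhere, cf. maps 300b, 300c) are both expressible."""
    def __init__(self, cells):
        self.cells = cells

    def accuracy_at(self, x, y):
        # Nearest-cell lookup of the gaze accuracy for a gaze direction.
        nearest = min(self.cells, key=lambda c: (c.x - x) ** 2 + (c.y - y) ** 2)
        return nearest.accuracy_deg

    def is_regular(self, tol=1e-9):
        # A map is "regular" when every cell reports the same accuracy.
        accs = [c.accuracy_deg for c in self.cells]
        return max(accs) - min(accs) <= tol
```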
- One example embodiment for foveated rendering of an XR environment 400 as performed by the controller 600, 700 will now be disclosed with reference to the flowchart of Fig. 5.
- there are two different eye-trackers available: camera-based eye-tracking (VOG) and electrooculography-based eye-tracking (EOG).
- VOG: camera-based eye-tracking
- EOG: electrooculography-based eye-tracking
- The eye-tracking starts with the camera-based eye-tracking (step S201).
- The position of the user's eye is analyzed (step S202), and at first there might be a need to calibrate the electrooculography-based eye-tracking.
- the calibration can be carried out using simultaneous operation of both the camera-based eye-tracking and the electrooculography-based eye-tracking, where data of the camera-based eye-tracking is used as reference, as disclosed above. This results in an understanding of where in the field-of-view the detection can be of good enough accuracy and which points, or areas, can be distinguished with good-enough probability for the electrooculography-based eye-tracking (step S203).
- If the gaze accuracy of the electrooculography-based eye-tracking is good enough for the current gaze direction of the user's eye, and if it is deemed that the energy consumption (or other suitable metric) would be lower with electrooculography-based eye-tracking for the current gaze of the user's eye, a switch is made to the electrooculography-based eye-tracking (step S204); otherwise the camera-based eye-tracking continues (step S205). If the energy consumption is lower, the camera-based eye-tracking sensor is switched off (step S206), thus reducing the overall energy consumption whilst prolonging the battery life of the device.
- The eye-tracking is continued using the electrooculography-based eye-tracking, and the estimation of the current gaze direction of the user's eye is therefore based on the electrooculography signals (step S207).
- The bias/drift of the electrooculography-based eye-tracking, as it accumulates over time, can be estimated (step S208). This bias/drift might affect the foveal area (step S209). Depending on the resolution map and the bias/drift, the size of the foveal area might need to be adapted to the estimated accuracy of the specific area and time. If the gaze accuracy of the electrooculography-based eye-tracking is still considered sufficient, then the electrooculography-based eye-tracking is continued (step S211).
- Otherwise, the camera-based eye-tracking is switched back on (step S212).
- the foveal area is in general larger when using electrooculography-based eye-tracking than when using camera-based eye-tracking.
- Since electrooculography-based eye-tracking is more energy-efficient, this trade-off is justified as long as the total energy consumption for foveated rendering based on the electrooculography-based eye-tracking is lower than the total energy consumption for foveated rendering based on the camera-based eye-tracking.
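The trade-off above can be sketched numerically: EOG needs a larger foveal area (worse gaze accuracy) but a much cheaper sensor, so the switch pays off only when the sensor saving outweighs the extra full-resolution rendering. This is a hedged illustration; the energy model, constants, and function names are assumptions, not taken from the patent.

```python
import math

def total_energy(tracker_power_mw, foveal_radius_deg, render_mw_per_deg2=0.8):
    """Energy proxy: tracker power plus a rendering cost that grows with the
    foveal area (larger radius => more pixels at full resolution).
    All constants are illustrative."""
    foveal_area = math.pi * foveal_radius_deg ** 2
    return tracker_power_mw + render_mw_per_deg2 * foveal_area

def select_tracker(eog_ok, eog_power, eog_radius, vog_power, vog_radius):
    """Switch to EOG only when (a) its gaze accuracy suffices for the current
    gaze direction and (b) its total energy, including the larger foveal
    area, is lower than VOG's (cf. steps S204-S206)."""
    if eog_ok and total_energy(eog_power, eog_radius) < total_energy(vog_power, vog_radius):
        return "EOG"  # camera sensor can be switched off
    return "VOG"
```

With a cheap EOG sensor (5 mW, 4 deg radius) against a camera at 80 mW (2 deg radius), EOG wins; if EOG's required foveal radius grows too large, VOG wins again.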
- the electrooculography-based eye-tracking is calibrated.
- each user 150 might have a slightly different anatomy.
- each user 150 might place the user device or the electrodes 120 slightly differently.
- calibration of the electrooculography-based eye-tracking is therefore performed. Eye-tracking data from the camera-based eye-tracking can then be used as reference data. In this way, the gaze accuracy can also be estimated.
- the gaze accuracy is obtained by comparing first eye-tracking data resulting from using the first eye-tracker (for example using electrooculography-based eye-tracking) to second eye-tracking data resulting from using the second eye-tracker (for example using camera-based eye-tracking). Further, assuming that the eye-tracking is performed for an eye-tracking application, then the first eye-tracking data can be obtained whilst the second eye-tracker is used during execution of the eye-tracking application, and the second eye-tracking data can be used as reference data when comparing the first eye-tracking data to the second eye-tracking data. For example, the calibration, and thus also the estimation of the gaze accuracy, could be performed before or during the execution of the eye-tracking application.
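The comparison of first and second eye-tracking data can be sketched as a per-axis least-squares fit of EOG readings against the camera-based (VOG) gaze used as reference, with the residual serving as an estimate of the achievable gaze accuracy. A minimal plain-Python sketch under assumed linearity; function and parameter names are hypothetical.

```python
def fit_linear_calibration(eog_samples, vog_gaze):
    """Fit gaze = a * eog + b by least squares for one axis, using VOG gaze
    as reference data, and report the worst-case residual as an estimate of
    the achievable EOG gaze accuracy. Assumes the samples are not constant."""
    n = len(eog_samples)
    mean_x = sum(eog_samples) / n
    mean_y = sum(vog_gaze) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(eog_samples, vog_gaze))
    var = sum((x - mean_x) ** 2 for x in eog_samples)
    a = cov / var
    b = mean_y - a * mean_x
    residual = max(abs(a * x + b - y) for x, y in zip(eog_samples, vog_gaze))
    return a, b, residual
```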
- the calibration could be performed upon every cold-start of the user device.
- a calibration can be performed upon the user having started using the user device after having temporarily removed the user device.
- the eye-tracking resolution map 300a, 300b, 300c is a time-varying eye-tracking resolution map 300a, 300b, 300c.
- the placement of the user device or the electrodes 120 might differ over time (e.g., as the user moves).
- electrooculography signals can be impacted by sweat.
- the user’s eyes adapt to darkness or other conditions which might impact electrooculography signals.
- a re-calibration can be performed upon the user having adjusted the position of the user device.
- a re-calibration could be performed upon the eye-tracking application, or some other application, having detected that the electrooculography-based eye-tracking is deviating in a consistently wrong way from some assumed gaze direction, e.g. when new objects are shown or moved.
- a re-calibration can be performed periodically or upon expiration of a timer, where the timer is started when the user device is started. Other trigger conditions for when the (re-)calibration is to be performed are also envisioned.
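The trigger conditions above can be collected in a single check. A hypothetical sketch: the parameter names and the 30-minute default period are illustrative assumptions, not values from the patent.

```python
def needs_recalibration(now_s, last_calibration_s, device_adjusted,
                        consistent_deviation, period_s=1800.0):
    """Re-calibration triggers from the text: a periodic timer expiring,
    the user having adjusted the device, or consistently wrong gaze
    estimates (e.g. when new objects are shown or moved)."""
    timer_expired = (now_s - last_calibration_s) >= period_s
    return timer_expired or device_adjusted or consistent_deviation
```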
- the re-calibration involves the same actions as the calibration.
- One difference is that the re-calibration can be based on the current settings, measuring errors and adjusting to those, whereas a calibration might be either based on previous settings or start from some default settings.
- the electrooculography-based eye-tracking can be further calibrated, or fine-tuned, beyond the dedicated calibration phases, which means fewer such dedicated calibrations need to be performed.
- the calibration is performed in the background whilst the camera-based eye-tracking is ongoing. That is, in some examples, the controller 600, 700 is configured to calibrate the electrooculography-based eye-tracking whilst the camera-based eye-tracking is used during execution of the eye-tracking application.
- the controller 600, 700 might for this purpose be configured to perform A-B tests.
- Fig. 6 schematically illustrates, in terms of a number of structural units, the components of a controller 600 according to an embodiment.
- Processing circuitry 610 is provided using any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), etc., capable of executing software instructions stored in a computer program product 810 (as in Fig. 8), e.g. in the form of a storage medium 630.
- the processing circuitry 610 may further be provided as at least one application specific integrated circuit (ASIC), or field programmable gate array (FPGA).
- ASIC: application specific integrated circuit
- FPGA: field programmable gate array
- the processing circuitry 610 is configured to cause the controller 600 to perform a set of operations, or steps, as disclosed above.
- the storage medium 630 may store the set of operations
- the processing circuitry 610 may be configured to retrieve the set of operations from the storage medium 630 to cause the controller 600 to perform the set of operations.
- the set of operations may be provided as a set of executable instructions.
- the processing circuitry 610 is thereby arranged to execute methods as herein disclosed.
- the storage medium 630 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
- the controller 600 may further comprise a communications (comm.) interface 620 at least configured for communications with other entities, functions, nodes, and devices (such as the user device 110, the electrodes 120, the camera 130, and the headset 140) as needed for the controller 600 to be able to perform the foveated rendering of an XR environment 400 in accordance with the herein disclosed embodiments.
- the communications interface 620 may comprise one or more transmitters and receivers, comprising analogue and digital components.
- the processing circuitry 610 controls the general operation of the controller 600 e.g. by sending data and control signals to the communications interface 620 and the storage medium 630, by receiving data and reports from the communications interface 620, and by retrieving data and instructions from the storage medium 630.
- Other components, as well as the related functionality, of the controller 600 are omitted in order not to obscure the concepts presented herein.
- Fig. 7 schematically illustrates, in terms of a number of functional modules, the components of a controller 700 according to an embodiment.
- the controller 700 of Fig. 7 comprises a number of functional modules: an obtain module 710 configured to perform step S102, an obtain module 720 configured to perform step S104, a determine module 730 configured to perform step S106, and a render module 740 configured to perform step S108.
- the controller 700 of Fig. 7 may further comprise a number of optional functional modules, as represented by functional module 750.
- each functional module 710:750 may in one embodiment be implemented only in hardware and in another embodiment with the help of software, i.e., the latter embodiment having computer program instructions stored on the storage medium 630 which, when run on the processing circuitry, make the controller 600 perform the corresponding steps mentioned above in conjunction with Fig. 8. It should also be mentioned that even though the modules correspond to parts of a computer program, they do not need to be separate modules therein, but the way in which they are implemented in software is dependent on the programming language used.
- one or more or all functional modules 710:750 may be implemented by the processing circuitry 610, possibly in cooperation with the communications interface 620 and/or the storage medium 630.
- the processing circuitry 610 may thus be configured to from the storage medium 630 fetch instructions as provided by a functional module 710:750 and to execute these instructions, thereby performing any steps as disclosed herein.
- the controller 600, 700 may be provided as a standalone device or as a part of at least one further device.
- the controller 600, 700 may be provided in the user device 110.
- Fig. 8 shows one example of a computer program product 810 comprising a computer readable storage medium 830.
- On this computer readable storage medium 830, a computer program 820 can be stored, which computer program 820 can cause the processing circuitry 610 and thereto operatively coupled entities and devices, such as the communications interface 620 and the storage medium 630, to execute methods according to embodiments described herein.
- the computer program 820 and/or computer program product 810 may thus provide means for performing any steps as herein disclosed.
- the computer program product 810 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc.
- the computer program product 810 could also be embodied as a memory, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory.
- While the computer program 820 is here schematically shown as a track on the depicted optical disc, the computer program 820 can be stored in any way which is suitable for the computer program product 810.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Health & Medical Sciences (AREA)
- Dermatology (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- Biomedical Technology (AREA)
- Eye Examination Apparatus (AREA)
Abstract
There are provided techniques for foveated rendering of an XR environment. A method is performed by a controller. The method comprises obtaining, using an eye-tracker, a gaze direction of a user towards a point in the XR environment. The method comprises obtaining a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment. The method comprises determining a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy. The method comprises rendering the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
Description
FOVEATED RENDERING OF AN EXTENDED REALITY ENVIRONMENT
TECHNICAL FIELD
Embodiments presented herein relate to a method, a controller, a computer program, and a computer program product for foveated rendering of an extended reality environment.
BACKGROUND
Camera-based eye-tracking is a well-used technology in e.g., laptops and extended reality (XR) headsets, to track the eyes of the user for several different types of applications or user analyses. Stereo-based eye-tracking uses two cameras as well as several illuminators using near-infrared light to enhance the eye-tracking.
Whereas camera-based eye-tracking can provide detailed gaze tracking, it is quite computationally complex, has a high cost, and requires high power consumption. Camera-based eye-tracking is often based on statically mounted cameras. There are simple camera-based eye-tracking techniques available, see for example E. Whitmire et al. "EyeContact: Scleral Coil Eye Tracking for Virtual Reality" in Proceedings of the International Symposium on Wearable Computers (ISWC), September 12-16, 2016, but these eye-tracking techniques are typically less accurate than camera-based eye-tracking with statically mounted cameras, are more difficult to place optimally in lightweight equipment, such as XR glasses, and still suffer from a significant power consumption due to the operation of multiple cameras, processing of image data at a certain framerate, and the use of near-infrared illumination (if included).
Originally developed for medical purposes, another technology to track the user's eyes is electrooculography, or EOG. Electrooculography is a technique for measuring the corneo-retinal standing potential that exists between the front and the back of the human eye. The resulting signal is called the electrooculogram. Primary applications are in ophthalmological diagnosis and in recording eye movements. To measure eye movement, pairs of electrodes are typically placed either above and below the eye or to the left and right of the eye.
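The electrode-pair measurement can be sketched as a conversion from differential voltages to approximate gaze angles, assuming a roughly linear corneo-retinal dipole response. The function name and the sensitivity constants below are purely illustrative; real sensitivities are person-dependent, which is one reason EOG needs calibration.

```python
def eog_gaze_angles(h_uv, v_uv, uv_per_deg_h=16.0, uv_per_deg_v=12.0):
    """Map differential voltages (microvolts) from the horizontal
    (left/right) and vertical (above/below) electrode pairs to gaze angles
    in degrees, under an assumed linear response. The per-degree
    sensitivities are hypothetical example values."""
    return h_uv / uv_per_deg_h, v_uv / uv_per_deg_v
```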
One advantage of electrooculography-based eye-tracking over camera-based eye-tracking is the significantly lower power consumption, due to the measurement of low-dimensional signals at a fairly low bitrate and significantly less processing to analyze those signals compared to camera-based eye-tracking.
However, there are some shortcomings of electrooculography-based eye-tracking compared to camerabased eye-tracking. For example, depending on implementation and/or placement of the electrodes, etc., electrooculography-based eye-tracking can have worse accuracy in vertical dimension than in horizontal dimension. For example, electrooculography-based eye-tracking might need calibration (which is persondependent). The calibration also depends on e.g., sweat and adaption of the eyes to certain lightning conditions. In turn, this means that for the best performance there is a need to re-calibrate at occasions.
For example, electrooculography-based eye-tracking is generally less accurate in terms of gaze angle compared to camera-based eye-tracking. Furthermore, the accuracy might depend on how the electrodes are attached to, or touch, the skin, which might vary depending on the industrial design of the electrodes, etc.
Foveated rendering is a video rendering technique based on eye movements. Only the area corresponding to the user’s gaze, sometimes referred to as the foveal area, is rendered in full resolution, and the remaining part of the view is rendered with gradually decreasing quality. Because human visual acuity is high only in the fovea, this does not noticeably affect the user experience, but the computational load is significantly reduced. While rendering techniques for general camera-based eye-tracking and electrooculography-based eye-tracking are known, current technologies for foveated rendering are mainly based on camera-based eye-tracking.
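To make the principle concrete, the quality falloff can be sketched as a function of angular distance from the gaze point. This is a minimal illustration only; the foveal radius and falloff constants below are arbitrary assumptions, not values from this disclosure:

```python
import math

def render_scale(angle_from_gaze_deg: float, foveal_radius_deg: float = 5.0) -> float:
    """Return a relative rendering resolution in [0, 1].

    Full resolution is used inside the foveal region; quality falls off
    gradually with angular distance from the gaze point, down to a floor.
    """
    if angle_from_gaze_deg <= foveal_radius_deg:
        return 1.0
    # Smooth exponential falloff outside the foveal region (constants assumed).
    return max(0.1, math.exp(-(angle_from_gaze_deg - foveal_radius_deg) / 20.0))
```

The key property is monotonicity: resolution never increases with distance from the gaze point, so the renderer spends its budget where the eye can actually resolve detail.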
SUMMARY
An object of embodiments herein is to address the above shortcomings with camera-based eye-tracking and electrooculography-based eye-tracking, especially in the context of foveated rendering.
A particular object is to provide energy-efficient foveated rendering in the context of eye-tracking.
According to a first aspect there is presented a method for foveated rendering of an XR environment. The method is performed by a controller. The method comprises obtaining, using an eye-tracker, a gaze direction of a user towards a point in the XR environment. The method comprises obtaining a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment. The method comprises determining a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy. The method comprises rendering the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
According to a second aspect there is presented a controller for foveated rendering of an XR environment. The controller comprises processing circuitry. The processing circuitry is configured to cause the controller to obtain, using an eye-tracker, a gaze direction of a user towards a point in the XR environment. The processing circuitry is configured to cause the controller to obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment. The processing circuitry is configured to cause the controller to determine a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy. The processing circuitry is configured to cause the controller to render the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
According to a third aspect there is presented a controller for foveated rendering of an XR environment.
The controller comprises an obtain module configured to obtain, using an eye-tracker, a gaze direction of
a user towards a point in the XR environment. The controller comprises an obtain module configured to obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment. The controller comprises a determine module configured to determine a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy. The controller comprises a render module configured to render the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
According to a fourth aspect there is presented a computer program for foveated rendering of an XR environment. The computer program comprises computer code which, when run on processing circuitry of a controller, causes the controller to perform actions. One action comprises obtaining, using an eye-tracker, a gaze direction of a user towards a point in the XR environment. One action comprises obtaining a gaze accuracy for using the eye-tracker to perform eye-tracking of objects in a vicinity of this point in the XR environment. One action comprises determining a size of a region in which this point is located based on the gaze accuracy. The size of the region decreases with increased gaze accuracy. One action comprises rendering the XR environment whilst performing the eye-tracking. Higher resolution rendering is used inside the region than outside the region.
According to a fifth aspect there is presented a computer program product comprising a computer program according to the fourth aspect and a computer readable storage medium on which the computer program is stored. The computer readable storage medium could be a non-transitory computer readable storage medium.
Advantageously, these aspects provide eye-tracking that does not suffer from the above-identified issues.
Advantageously, these aspects enable energy-efficient foveated rendering in the context of eye-tracking.
Advantageously, these aspects enable efficient foveated rendering using electrooculography-based eye-tracking as well as camera-based eye-tracking.
Advantageously, these aspects enable the foveated rendering to be adapted for an eye-tracker that requires only low power consumption.
Advantageously, these aspects enable the time that electrooculography-based eye-tracking is used for foveated rendering to be maximized.
Other objectives, features and advantages of the enclosed embodiments will be apparent from the following detailed disclosure, from the attached dependent claims as well as from the drawings.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, module, step, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, module, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS
The inventive concept is now described, by way of example, with reference to the accompanying drawings, in which:
Fig. 1 is a schematic diagram illustrating user devices according to embodiments;
Fig. 2 is a flowchart of methods according to an embodiment;
Fig. 3 schematically illustrates eye-tracking resolution maps according to embodiments;
Fig. 4 schematically illustrates an XR environment according to an embodiment;
Fig. 5 is a flowchart of methods according to an embodiment;
Fig. 6 is a schematic diagram showing structural units of a controller according to an embodiment;
Fig. 7 is a schematic diagram showing functional modules of a controller according to an embodiment; and
Fig. 8 shows one example of a computer program product comprising computer readable storage medium according to an embodiment.
DETAILED DESCRIPTION
The inventive concept will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the inventive concept are shown. This inventive concept may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art. Like numbers refer to like elements throughout the description. Any step or feature illustrated by dashed lines should be regarded as optional.
As noted above, an object of embodiments herein is to address the above shortcomings with camera-based eye-tracking and electrooculography-based eye-tracking, especially in the context of foveated rendering.
The embodiments disclosed herein in particular relate to techniques for foveated rendering of an XR environment. In order to obtain such techniques there is provided a controller, a method performed by the controller, a computer program product comprising code, for example in the form of a computer program, that when run on a controller, causes the controller to perform the method.
Fig. 1 is a schematic diagram illustrating user devices 110, represented by a pair of smart glasses, or XR glasses, for eye-tracking according to embodiments. The eye-tracking might be performed in the context of gaming, navigation, or tracking user behaviour in an XR environment. Further, the eye-tracking might be performed as part of a meeting, a learning event, etc. In these examples, a first eye-tracking mode is represented by electrooculography-based eye-tracking and a second eye-tracking mode is represented by camera-based eye-tracking. The electrooculography-based eye-tracking is based on signals obtained from electrodes 120. In more detail, Fig. 1 schematically illustrates different types of implementations of electrooculography-based eye-tracking, and particularly the placements of the electrodes 120. In Fig. 1(a) six electrodes 120 (denoted HR (horizontal right), VL (vertical lower), HL (horizontal left), VU (vertical upper), REF (reference), GND (ground)) are fixed to the skin of a user 150 wearing the user device 110. In Fig. 1(b) the electrodes (as represented by a single electrode 120) are part of the user device 110 itself. In Fig. 1(c) the electrodes (as represented by a single electrode 120) are placed on an in-ear headset 140. In Fig. 1(c) the electrodes might be integrated with audio and microphone circuitry in the in-ear headset 140. The camera-based eye-tracking is based on signals obtained from a camera 130.
In general terms, a system for selecting eye-tracking mode for an eye-tracking application comprises the electrodes 120 to be placed on a user 150 of the eye-tracking application as well as a controller 600, 700 for selecting the eye-tracking mode for the eye-tracking application. Functionality and operations as performed by the controller 600, 700 will be disclosed in further detail below. In some examples, as in the example of Fig. 1, at least one second eye-tracking mode of the available eye-tracking modes involves camera-based eye-tracking, and the system further comprises a camera 130.
Fig. 2 is a flowchart illustrating embodiments of methods for foveated rendering of an XR environment 400. The methods are performed by the controller 600, 700. The methods are advantageously provided as computer programs 820.
The herein disclosed embodiments are based on combining eye-tracking, such as electrooculography-based eye-tracking and/or camera-based eye-tracking, with foveated rendering, where the foveated rendering is based on the gaze accuracy of the used eye-tracker. One goal is to maximize the time a low-energy eye-tracker, such as an eye-tracker based on electrooculography-based eye-tracking, can be used due to its low energy consumption. This can be achieved by dynamically controlling the size of the foveal area, i.e., the region in which a higher resolution rendering is used.
S102: The controller 600, 700 obtains, using an eye-tracker, a gaze direction of a user towards a point in the XR environment 400.
S104: The controller 600, 700 obtains a gaze accuracy for using the eye-tracker to perform eye-tracking of objects 430 in a vicinity of this point in the XR environment 400.
S106: The controller 600, 700 determines a size of a region 420a, 420b in which this point is located based on the gaze accuracy. The size of the region 420a, 420b decreases with increased gaze accuracy.
S108: The controller 600, 700 renders the XR environment 400 whilst performing the eye-tracking. Higher resolution rendering is used inside the region 420a, 420b than outside the region 420a, 420b. In this way, foveated rendering is achieved.
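The steps S102–S108 above can be sketched in code as follows. This is a hypothetical illustration: the GazeSample type, the margin, and the resolution scales are assumptions, and gaze accuracy is represented here as an angular error, so a smaller value means a more accurate eye-tracker:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    point: tuple          # gaze point in the XR environment (step S102)
    accuracy_deg: float   # gaze accuracy as an angular error in degrees (step S104)

def foveal_region_diameter(accuracy_deg: float, margin_deg: float = 1.0) -> float:
    # Step S106: a smaller angular error (i.e., higher gaze accuracy)
    # yields a smaller foveal region, and vice versa.
    return 2.0 * (accuracy_deg + margin_deg)

def render_pass(sample: GazeSample) -> dict:
    # Step S108: higher resolution rendering inside the region than outside it.
    return {
        "center": sample.point,
        "diameter_deg": foveal_region_diameter(sample.accuracy_deg),
        "inside_scale": 1.0,    # full resolution inside the region
        "outside_scale": 0.25,  # reduced resolution outside the region
    }
```

The design choice to widen the region when accuracy degrades ensures the user's true gaze point stays inside the full-resolution area even when the tracker's estimate is uncertain.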
Embodiments relating to further details of foveated rendering of an XR environment 400 as performed by the controller 600, 700 will now be disclosed.
The gaze accuracy for using the eye-tracker to perform the eye-tracking can be represented by an eye-tracking resolution map 300a, 300b, 300c. That is, in some embodiments, the gaze accuracy varies over the XR environment 400 according to an eye-tracking resolution map 300a, 300b, 300c. In the eye-tracking resolution map 300a, 300b, 300c, certain areas represent a higher gaze accuracy (i.e., better resolution) for performing the eye-tracking. Examples of three such eye-tracking resolution maps 300a, 300b, 300c are provided in Fig. 3. The eye-tracking resolution maps 300a, 300b, 300c can thus be used to provide information of where in the field-of-view the gaze detection is of good enough accuracy and, consequently, which objects can be distinguished with good-enough probability for the used eye-tracker. The eye-tracking resolution map 300a is an example of a highly irregular eye-tracking resolution map, where the gaze accuracy varies over the field-of-view, whereas the eye-tracking resolution maps 300b, 300c are examples of regular eye-tracking resolution maps, where the gaze accuracy is more or less constant over the whole field-of-view. This is schematically illustrated in Fig. 3, where, for simplicity, the eye-tracking resolution maps 300a, 300b, 300c comprise regions representing low, medium, and/or high gaze accuracy/resolution of the used eye-tracker. For illustrative purposes, diameters d1, d2, d3, d4, d5, d6 of different ellipses 310a, 310b, 310c represent the smallest distance required for the used eye-tracker to successfully distinguish between two objects in the XR environment. That is, the smaller the ellipses 310a, 310b, 310c are, the higher the gaze accuracy/resolution of the used eye-tracker is. That the eye-tracking resolution map 300a is a highly irregular eye-tracking resolution map is based on the fact that the ellipses 310a are of different sizes, i.e., the diameters d1, d2, d3, d4 are all different from each other.
In contrast, in each of the eye-tracking resolution maps 300b, 300c, the diameter d5, d6 is the same, irrespective of where the ellipses 310b, 310c are placed within the eye-tracking resolution maps 300b, 300c.
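A minimal sketch of such resolution maps, assuming a simple grid representation of the field-of-view (the cell values below are arbitrary illustrative numbers, not values from Fig. 3):

```python
# Hypothetical eye-tracking resolution maps: each cell holds the smallest
# angular distance (in degrees) the eye-tracker can distinguish in that
# part of the field-of-view (analogous to the diameters d1..d6).
irregular_map = [           # like map 300a: accuracy varies over the view
    [4.0, 3.0, 4.0],
    [2.5, 1.5, 2.5],
    [4.0, 3.0, 4.0],
]
regular_map = [[2.0] * 3 for _ in range(3)]   # like maps 300b/300c: constant

def min_distinguishable_deg(res_map, row: int, col: int) -> float:
    """Look up the tracker resolution for the map cell a gaze direction
    falls in; the foveal region must be at least this large for the true
    gaze point to be covered with good-enough probability."""
    return res_map[row][col]
```

With an irregular map, the foveated-rendering region size then depends on where in the field-of-view the user is currently gazing; with a regular map, it is uniform.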
For reasons that will be further disclosed below, the gaze accuracy of the used eye-tracker might change over time. Consequently, the eye-tracking resolution maps 300a, 300b, 300c might also change over time. Therefore, in some embodiments, the eye-tracking resolution map 300a, 300b, 300c is a time-varying eye-tracking resolution map 300a, 300b, 300c, and thereby the size of the region 420a, 420b is time-varying. Thus, the size of the region 420a, 420b can be calculated based on a time-varying eye-tracking resolution map 300a, 300b, 300c, where there could be different causes of the variations in the eye-tracking resolution map 300a, 300b, 300c over time.
Reference is next made to Fig. 4, which schematically illustrates an XR environment 400 in which four objects 430 are provided (a triangular object, a rhombus-shaped object, and two circular objects), and where eye-tracking is performed using an eye-tracker. For illustrative purposes it is assumed that the gaze accuracy of the used eye-tracker corresponds to the eye-tracking resolution map 300a. This is illustrated in Fig. 4 by the regions 410 that represent the corresponding regions with low, medium, and high gaze accuracy/resolution in the eye-tracking resolution map 300a. Higher resolution rendering is used inside the regions 420a, 420b than outside the regions 420a, 420b. In this way, foveated rendering is achieved. Further, the regions 410 define the size, as represented by diameters d7, d8, of the regions 420a, 420b used by the foveated rendering. In this respect, since the different eye-tracking resolution maps 300a, 300b, 300c will yield different regions 410, the size of the region 420a, 420b in the XR environment 400 will differ among the eye-trackers, in case each eye-tracking resolution map 300a, 300b, 300c represents a different eye-tracker. With continued reference to Fig. 4, for example, in case the user is gazing in a direction corresponding to region 420a, for example looking at any of the two circular objects, the diameter d7 is smaller than if the user instead is gazing in a direction corresponding to region 420b, for example looking at the triangular object or the rhombus-shaped object. Hence, although the two circular objects are placed closer to each other than the triangular object and the rhombus-shaped object, it might still be easier for the used eye-tracker to distinguish which of the two circular objects the user is gazing at than to distinguish whether the user is gazing at the triangular object or the rhombus-shaped object.
Further aspects of the eye-tracker will be disclosed next. As disclosed above, Fig. 3 illustrates three exemplary eye-tracking resolution maps 300a, 300b, 300c. These eye-tracking resolution maps 300a, 300b, 300c could all belong to one and the same eye-tracker, where, as an illustrative example, the eye-tracking resolution map varies over time, starting from the eye-tracking resolution map 300c, and where a change over time of the gaze accuracy of the used eye-tracker causes the eye-tracking resolution map 300c to change to the eye-tracking resolution map 300b or the eye-tracking resolution map 300a. In other aspects, at least two of the eye-tracking resolution maps 300a, 300b, 300c belong to different eye-trackers. That is, in some embodiments, each eye-tracker has its own eye-tracking resolution map 300a, 300b, 300c.
Further, in some embodiments, the used eye-tracker is a first eye-tracker that is selected from a set of available eye-trackers comprising at least the first eye-tracker and a second eye-tracker. In some non-limiting examples, the first eye-tracker uses electrooculography-based eye-tracking and the second eye-tracker uses camera-based eye-tracking. There can be different criteria according to which the first eye-tracker is selected. In some embodiments, the first eye-tracker is selected to fulfil an energy requirement. Here, the energy requirement might pertain to selecting the eye-tracker yielding the lowest energy consumption for the foveated rendering of all the available eye-trackers.
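A sketch of such an energy-based selection, under the assumption that each available eye-tracker reports a power figure and whether its accuracy currently suffices (both field names are hypothetical):

```python
def select_eye_tracker(trackers: list) -> dict:
    """Select the first eye-tracker under an energy requirement: among the
    trackers whose accuracy currently suffices, pick the one with the
    lowest energy consumption."""
    usable = [t for t in trackers if t["accuracy_ok"]]
    return min(usable, key=lambda t: t["power_mw"])
```

For example, a low-power EOG tracker would be preferred over a camera-based tracker whenever its accuracy is deemed sufficient for the current gaze direction.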
One example embodiment for foveated rendering of an XR environment 400 as performed by the controller 600, 700 will now be disclosed with reference to the flowchart of Fig. 5. According to this example embodiment, there are two different eye-trackers available: camera-based eye-tracking (VOG) and electrooculography-based eye-tracking (EOG). Further, it is assumed that the camera-based eye-tracking is started up first (step S201). Due to its low energy consumption, the overhead of simultaneous usage of electrooculography-based eye-tracking is negligible. The position of the user’s eye is analyzed (step S202) and, at first, there might be a need to calibrate the electrooculography-based eye-tracking. The calibration can be carried out using simultaneous operation of both the camera-based eye-tracking and the electrooculography-based eye-tracking, where data of the camera-based eye-tracking is used as reference, as disclosed above. This results in an understanding of where in the field-of-view the detection can be of good enough accuracy and which points, or areas, can be distinguished with good-enough probability for the electrooculography-based eye-tracking (step S203). Given that the gaze accuracy of the electrooculography-based eye-tracking is good enough for the current gaze direction of the user’s eye, if it is deemed that the energy consumption (or other suitable metric) would be lower with electrooculography-based eye-tracking for the current gaze of the user’s eye, a switch is made to the electrooculography-based eye-tracking (step S204); otherwise the camera-based eye-tracking continues (step S205). If the energy consumption is lower, the camera-based eye-tracking sensor is switched off (step S206), thus reducing the overall energy consumption whilst prolonging the battery life of the device.
The eye-tracking then continues using the electrooculography-based eye-tracking to estimate the current gaze direction of the user’s eye (step S207). The bias/drift of the electrooculography-based eye-tracking, as accumulating over time, can be estimated (step S208). This bias/drift might affect the foveal area (step S209). Depending on the resolution map and the bias/drift, the size of the foveal area might need to be adapted depending on the estimated accuracy of the specific area and time. If the gaze accuracy of the electrooculography-based eye-tracking is still considered sufficient, then electrooculography-based eye-tracking is continued (step S211). Otherwise, camera-based eye-tracking is switched back on (step S212). In this respect, for a given gaze direction, the foveal area is in general larger when using electrooculography-based eye-tracking than when using camera-based eye-tracking. However, as electrooculography-based eye-tracking is more energy-efficient, this trade-off is justified as long as the total energy consumption for the foveated rendering based on the electrooculography-based eye-tracking is lower than the total energy consumption for foveated rendering based on the camera-based eye-tracking.
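The switching decision of steps S204–S206 and S211–S212 can be summarized as follows; this is a simplified sketch in which the accuracy judgement and the total-energy estimates are assumed to be computed elsewhere:

```python
def choose_mode(eog_accuracy_ok: bool,
                eog_total_energy: float,
                vog_total_energy: float) -> str:
    """Simplified switching rule from the flowchart of Fig. 5: use EOG when
    its accuracy suffices for the current gaze direction and the total
    energy of EOG-based foveated rendering (including the larger foveal
    area it implies) is lower than that of VOG-based rendering."""
    if eog_accuracy_ok and eog_total_energy < vog_total_energy:
        return "EOG"
    return "VOG"
```

The comparison of total energies captures the trade-off: EOG implies a larger foveal area (more rendering work) but far lower tracking energy, so the sum can still favour EOG.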
In some aspects, the electrooculography-based eye-tracking is calibrated. In general terms, there could be different reasons as to why the electrooculography-based eye-tracking needs to be calibrated. For example, each user 150 might have a slightly different anatomy. For example, each user 150 might place the user device or the electrodes 120 slightly differently. In some aspects, calibration of the
electrooculography-based eye-tracking is therefore performed. Eye-tracking data from the camera-based eye-tracking can then be used as reference data. In this way, also the gaze accuracy can be estimated. That is, in some embodiments, the gaze accuracy is obtained by comparing first eye-tracking data resulting from using the first eye-tracker (for example using electrooculography-based eye-tracking) to second eye-tracking data resulting from using the second eye-tracker (for example using camera-based eye-tracking). Further, assuming that the eye-tracking is performed for an eye-tracking application, then the first eye-tracking data can be obtained whilst the second eye-tracker is used during execution of the eye-tracking application, and the second eye-tracking data can be used as reference data when comparing the first eye-tracking data to the second eye-tracking data. For example, the calibration, and thus also the estimation of the gaze accuracy, could be performed before or during the execution of the eye-tracking application. For example, the calibration could be performed upon every cold-start of the user device. For example, a calibration can be performed upon the user having started using the user device after having temporarily removed the user device. Further, as noted above, in some embodiments, the eye-tracking resolution map 300a, 300b, 300c is a time-varying eye-tracking resolution map 300a, 300b, 300c. There could be different reasons for this. For example, the placement of the user device or the electrodes 120 might differ over time (e.g., as the user moves). For example, electrooculography signals can be impacted by sweat. For example, the user’s eyes adapt to darkness or other conditions which might impact electrooculography signals. Therefore, there might be a need for the electrooculography-based eye-tracking to be re-calibrated. For example, a re-calibration can be performed upon the user having adjusted the position of the user device.
For example, a re-calibration could be performed upon the eye-tracking application, or some other application, having detected that the electrooculography-based eye-tracking is deviating from some assumed gaze direction, e.g. when new objects are shown or moved, in a consistently wrong way. For example, a re-calibration can be performed periodically or upon expiration of a timer, where the timer is started when the user device is started. Other trigger conditions for when the (re-)calibration is to be performed are also envisioned. In general terms, the re-calibration involves the same actions as the calibration. One difference is that the re-calibration can be based on the current settings, measure errors, and adjust to those, whereas a calibration might be either based on previous settings or start from some default settings. The electrooculography-based eye-tracking can be further calibrated, or fine-tuned, beyond the dedicated calibration phases, which means fewer such dedicated calibrations need to be performed. In particular, in some aspects, the calibration is performed in the background whilst the camera-based eye-tracking is ongoing. That is, in some examples, the controller 600, 700 is configured to calibrate the electrooculography-based eye-tracking whilst the camera-based eye-tracking is used during execution of the eye-tracking application. By analyzing the accuracy of the electrooculography-based eye-tracking whilst the camera-based eye-tracking is used during execution of the eye-tracking application, it can be assessed to what extent the electrooculography-based eye-tracking would be possible to use for that eye-tracking application, e.g. during certain parts of the execution, and thereby decrease the duty cycle of the camera-based eye-tracking where the accuracy of the electrooculography-based eye-tracking is sufficient. The controller 600, 700 might for this purpose be
configured to perform A-B tests. Since the energy consumption of camera-based eye-tracking is higher than that of electrooculography-based eye-tracking, having the electrooculography-based eye-tracking active (for training purposes) in parallel with using camera-based eye-tracking (when camera-based eye-tracking is required) yields very low energy consumption overhead.
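As a sketch of how such background comparison might work, the accumulated EOG bias can be estimated from simultaneously captured EOG and camera-based (VOG) gaze samples; the simple averaging below is a placeholder for the actual calibration procedure, and the sample format is assumed:

```python
def estimate_eog_bias(eog_samples, vog_samples):
    """Estimate the EOG gaze-direction bias by averaging the per-sample
    difference to simultaneously captured camera-based (VOG) reference
    data. Each sample is an (x, y) gaze angle in degrees."""
    n = len(eog_samples)
    dx = sum(e[0] - v[0] for e, v in zip(eog_samples, vog_samples)) / n
    dy = sum(e[1] - v[1] for e, v in zip(eog_samples, vog_samples)) / n
    return (dx, dy)
```

The estimated bias can then be subtracted from subsequent EOG readings, or used to decide that the drift has grown large enough to switch back to camera-based tracking.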
Fig. 6 schematically illustrates, in terms of a number of structural units, the components of a controller 600 according to an embodiment. Processing circuitry 610 is provided using any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), etc., capable of executing software instructions stored in a computer program product 810 (as in Fig. 8), e.g. in the form of a storage medium 630. The processing circuitry 610 may further be provided as at least one application specific integrated circuit (ASIC), or field programmable gate array (FPGA).
Particularly, the processing circuitry 610 is configured to cause the controller 600 to perform a set of operations, or steps, as disclosed above. For example, the storage medium 630 may store the set of operations, and the processing circuitry 610 may be configured to retrieve the set of operations from the storage medium 630 to cause the controller 600 to perform the set of operations. The set of operations may be provided as a set of executable instructions.
Thus the processing circuitry 610 is thereby arranged to execute methods as herein disclosed. The storage medium 630 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory. The controller 600 may further comprise a communications (comm.) interface 620 at least configured for communications with other entities, functions, nodes, and devices (such as the user device 110, the electrodes 120, the camera 130, and the headset 140) as needed for the controller 600 to be able to perform the foveated rendering of an XR environment 400 in accordance with the herein disclosed embodiments. As such the communications interface 620 may comprise one or more transmitters and receivers, comprising analogue and digital components. The processing circuitry 610 controls the general operation of the controller 600 e.g. by sending data and control signals to the communications interface 620 and the storage medium 630, by receiving data and reports from the communications interface 620, and by retrieving data and instructions from the storage medium 630. Other components, as well as the related functionality, of the controller 600 are omitted in order not to obscure the concepts presented herein.
Fig. 7 schematically illustrates, in terms of a number of functional modules, the components of a controller 700 according to an embodiment. The controller 700 of Fig. 7 comprises a number of functional modules; an obtain module 710 configured to perform step S102, an obtain module 720 configured to perform step S104, a determine module 730 configured to perform step S106, and a render module 740 configured to perform step S108. The controller 700 of Fig. 7 may further comprise a number of optional functional modules, as represented by functional module 750. In general terms, each functional module 710:750 may in one embodiment be implemented only in hardware and in another
embodiment with the help of software, i.e., the latter embodiment having computer program instructions stored on the storage medium 630 which, when run on the processing circuitry, make the controller 600 perform the corresponding steps mentioned above in conjunction with Fig. 2. It should also be mentioned that even though the modules correspond to parts of a computer program, they do not need to be separate modules therein, but the way in which they are implemented in software is dependent on the programming language used. Preferably, one or more or all functional modules 710:750 may be implemented by the processing circuitry 610, possibly in cooperation with the communications interface 620 and/or the storage medium 630. The processing circuitry 610 may thus be configured to fetch from the storage medium 630 instructions as provided by a functional module 710:750 and to execute these instructions, thereby performing any steps as disclosed herein.
The controller 600, 700 may be provided as a standalone device or as a part of at least one further device. For example, the controller 600, 700 may be provided in the user device 110.
Fig. 8 shows one example of a computer program product 810 comprising computer readable storage medium 830. On this computer readable storage medium 830, a computer program 820 can be stored, which computer program 820 can cause the processing circuitry 610 and thereto operatively coupled entities and devices, such as the communications interface 620 and the storage medium 630, to execute methods according to embodiments described herein. The computer program 820 and/or computer program product 810 may thus provide means for performing any steps as herein disclosed.
In the example of Fig. 8, the computer program product 810 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc. The computer program product 810 could also be embodied as a memory, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory. Thus, while the computer program 820 is here schematically shown as a track on the depicted optical disk, the computer program 820 can be stored in any way which is suitable for the computer program product 810.
The inventive concept has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the inventive concept, as defined by the appended patent claims.
Claims
1. A controller (600, 700) for foveated rendering of an XR environment (400), wherein the controller (600, 700) comprises processing circuitry (610), and wherein the processing circuitry (610) is configured to cause the controller (600, 700) to: obtain, using an eye-tracker, a gaze direction of a user towards a point in the XR environment (400); obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects (430) in a vicinity of said point in the XR environment (400); determine a size of a region (420a, 420b) in which said point is located based on the gaze accuracy, wherein the size of the region (420a, 420b) decreases with increased gaze accuracy; and render the XR environment (400) whilst performing the eye-tracking, wherein higher resolution rendering is used inside the region (420a, 420b) than outside the region (420a, 420b).
2. The controller (600, 700) according to claim 1, wherein the gaze accuracy varies over the XR environment (400) according to an eye-tracking resolution map (300a, 300b, 300c).
3. The controller (600, 700) according to claim 2, wherein the eye-tracking resolution map (300a, 300b, 300c) is a time-varying eye-tracking resolution map (300a, 300b, 300c), and whereby the size of the region (420a, 420b) is time-varying.
4. The controller (600, 700) according to claim 1, wherein the eye-tracker is a first eye-tracker selected from a set of available eye-trackers comprising at least the first eye-tracker and a second eye-tracker.
5. The controller (600, 700) according to claim 4, wherein the first eye-tracker uses electrooculography-based eye-tracking and the second eye-tracker uses camera-based eye-tracking.
6. The controller (600, 700) according to claim 4 or 5 in combination with claim 2 or 3, wherein each of the eye-trackers has its own eye-tracking resolution map (300a, 300b, 300c).
7. The controller (600, 700) according to claim 4 or 5, wherein the size of the region (420a, 420b) in the XR environment (400) differs among the eye-trackers.
8. The controller (600, 700) according to claim 4 or 5, wherein the first eye-tracker is selected to fulfil an energy requirement.
9. The controller (600, 700) according to claim 8, wherein the energy requirement pertains to selecting the eye-tracker yielding lowest energy consumption for the foveated rendering of all the available eye-trackers.
10. The controller (600, 700) according to claim 4 or 5, wherein the gaze accuracy is obtained by comparing first eye-tracking data resulting from using the first eye-tracker to second eye-tracking data resulting from using the second eye-tracker.
11. The controller (600, 700) according to claim 10, wherein the eye-tracking is performed for an eye-tracking application, wherein the first eye-tracking data is obtained whilst the second eye-tracker is used during execution of the eye-tracking application, and wherein the second eye-tracking data is used as reference data when comparing the first eye-tracking data to the second eye-tracking data.
12. A controller (600, 700) for foveated rendering of an XR environment (400), the controller (600, 700) comprising: an obtain module (710) configured to obtain, using an eye-tracker, a gaze direction of a user towards a point in the XR environment (400); an obtain module (720) configured to obtain a gaze accuracy for using the eye-tracker to perform eye-tracking of objects (430) in a vicinity of said point in the XR environment (400); a determine module (730) configured to determine a size of a region (420a, 420b) in which said point is located based on the gaze accuracy, wherein the size of the region (420a, 420b) decreases with increased gaze accuracy; and a render module (740) configured to render the XR environment (400) whilst performing the eye-tracking, wherein higher resolution rendering is used inside the region (420a, 420b) than outside the region (420a, 420b).
13. A method for foveated rendering of an XR environment (400), the method being performed by a controller (600, 700), the method comprising: obtaining (S102), using an eye-tracker, a gaze direction of a user towards a point in the XR environment (400); obtaining (S104) a gaze accuracy for using the eye-tracker to perform eye-tracking of objects (430) in a vicinity of said point in the XR environment (400); determining (S106) a size of a region (420a, 420b) in which said point is located based on the gaze accuracy, wherein the size of the region (420a, 420b) decreases with increased gaze accuracy; and
rendering (S108) the XR environment (400) whilst performing the eye-tracking, wherein higher resolution rendering is used inside the region (420a, 420b) than outside the region (420a, 420b).
14. A computer program (820) for foveated rendering of an XR environment (400), the computer program comprising computer code which, when run on processing circuitry (610) of a controller (600, 700), causes the controller (600, 700) to: obtain (S102), using an eye-tracker, a gaze direction of a user towards a point in the XR environment (400); obtain (S104) a gaze accuracy for using the eye-tracker to perform eye-tracking of objects (430) in a vicinity of said point in the XR environment (400); determine (S106) a size of a region (420a, 420b) in which said point is located based on the gaze accuracy, wherein the size of the region (420a, 420b) decreases with increased gaze accuracy; and render (S108) the XR environment (400) whilst performing the eye-tracking, wherein higher resolution rendering is used inside the region (420a, 420b) than outside the region (420a, 420b).
15. A computer program product (810) comprising a computer program (820) according to claim 14, and a computer readable storage medium (830) on which the computer program is stored.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2023/085410 (WO2025124699A1) | 2023-12-12 | 2023-12-12 | Foveated rendering of an extended reality environment |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025124699A1 (en) | 2025-06-19 |
Family
ID=89897683
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/085410 (WO2025124699A1, pending) | Foveated rendering of an extended reality environment | 2023-12-12 | 2023-12-12 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025124699A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170285736A1 (en) * | 2016-03-31 | 2017-10-05 | Sony Computer Entertainment Inc. | Reducing rendering computation and power consumption by detecting saccades and blinks |
| US20180184002A1 (en) * | 2016-12-23 | 2018-06-28 | Microsoft Technology Licensing, Llc | Eye Tracking Using Video Information and Electrooculography Information |
| US20200394830A1 (en) * | 2019-06-13 | 2020-12-17 | Facebook Technologies, Llc | Dynamic tiling for foveated rendering |
Non-Patent Citations (1)
| Title |
|---|
| E. Whitmire et al.: "EyeContact: Scleral Coil Eye Tracking for Virtual Reality", Proceedings of the International Symposium on Wearable Computers (ISWC), 12 September 2016 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23851034; Country of ref document: EP; Kind code of ref document: A1 |