
WO2025050320A1 - Image display apparatus - Google Patents


Info

Publication number
WO2025050320A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
light source
controller
display apparatus
image
Prior art date
Application number
PCT/CN2023/117293
Other languages
French (fr)
Inventor
Yoji Okazaki
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2023/117293
Publication of WO2025050320A1


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Definitions

  • the controller 5 drives the light source 21 according to the direction of line-of-sight detected by the eye tracking mechanism 4.
  • the controller 5 acquires an image signal from the imaging element 42.
  • the controller 5 extracts information about the direction of user's line-of-sight from the image signal.
  • the controller 5 calculates an intersection CP of a line EL indicating the extracted direction of user's line-of-sight and the eyebox 200 (see FIG. 5) . This enables the controller 5 to estimate where the user's eye position EP lies on the eyebox 200 (see FIG. 3) .
  • the controller 5 performs image processing on the AR image 301 according to where the user's eye position EP lies on the eyebox 200 and drives the light source 21 according to a result of the image processing.
  • the controller 5 may perform image processing according to where the user's eye position EP lies on the eyebox 200 such that each pixel of the AR image 301 corresponding to a desired area EB among the two or more areas EB1 to EB9 has mutually closer brightness, and the controller 5 may drive the light source 21 according to a result of the image processing.
  • the controller 5 may drive the two or more light-emitting elements 21r, 21g, and 21b, separately.
  • the controller 5 may perform image processing according to where the user's eye position EP lies on the eyebox 200 such that each pixel of the AR image 301 corresponding to a desired area EB among the two or more areas EB1 to EB9 has mutually closer brightnesses in the corresponding color, and the controller 5 may drive the two or more light-emitting elements 21r to 21b separately according to a result of the image processing.
  • the controller 5 may perform image processing on the AR image 301 at each EB's position according to where the user's eye position EP lies on the eyebox 200, and the controller 5 may drive the light source 21 with pulses under a driving condition according to a result of the image processing.
  • the time cycle by which the controller 5 drives one pulse of the light source 21 is shorter than the time cycle by which the controller 5 drives one pixel with the light source 21.
  • the controller 5 can drive, for the AR image 301, the light source 21 with one or more pulses for each pixel according to a result of the image processing.
  • the controller 5 can perform image processing on the AR image 301 at each EB's position according to where the user's eye position EP lies on the eyebox 200, and the controller 5 can change the pulse peak intensity, the pulse width, and the number of pulses for each pixel according to a result of the image processing, thereby controlling the brightness.
  • FIGS. 6 and 8 illustrate brightness information used in generating calibration information 61.
  • FIGS. 7 and 9 illustrate the calibration information 61 that is generated by using the brightness information of FIGS. 6 and 8 and used in the calibrating process.
  • FIG. 6 (a) illustrates source data of brightness information of an image for creating calibration table groups corresponding to the areas EB1 to EB9 that are temporal divisions of the eyebox 200 (see FIG. 3) .
  • FIG. 7 illustrates the calibration information 61 operating as basic calibration tables that are calculated from FIG. 6 and used in the calibrating process.
  • FIGS. 6 (b) to FIG. 6 (d) illustrate source data of brightness for creating brightness tables of the AR image 301 for the area EB5, for example. They illustrate source data of brightness for creating calibration table groups corresponding to two or more areas of each FOV of the AR image 301, the data being represented as the color-based brightness information R11 to R55, G11 to G55, and B11 to B55.
  • FIGS. 8 (b) to 8 (d) illustrate source data for creating more precise calibration table groups for the two or more areas.
  • either the basic calibration table group as illustrated in FIG. 7 or the high-precision calibration table group as illustrated in FIG. 9 may be used as the calibration table group CT in order to have a low precision or a high precision.
  • the number of divisions may be more than 400; however, an increase in the number of divisions results in an increase in the amount of memory for the calibration table group CT. Therefore, a number of around 400 is desirable.
  • FIGS. 9 (b) to 9 (d) illustrate the calibration information 61 operating as high-precision calibration tables that are calculated from the above-mentioned more precise data and that are used in the calibrating process.
  • if brightness is uneven in an image of each FOV of the AR image 301 for one area (for example, EB1) among the two or more areas EB1 to EB9 in the eyebox 200 as illustrated in FIG. 3, color unevenness may result.
  • if brightness is also uneven in an image of each FOV of the AR image 301 viewed from EB2, EB3, or the like, then when the AR image is viewed from a different area EB by changing the line of sight, the brightness and the unevenness of brightness and color may change, and it is difficult for the user to see a uniform and high-quality AR image.
  • the controller 5 may firstly determine which one of the areas EB1 to EB9, i.e., the divisions of the eyebox 200, the eye position EP lies on. The controller 5 may then divide the AR image 301 into two or more areas and determine which one of the two or more areas corresponds to where the eye position EP lies. In response to a result of determination, the controller 5 may perform, according to the concept of foveated rendering, a calibrating process with the calibration information 61, changing the precision depending on the position of line-of-sight of the AR image 301 corresponding to one of the area EBs, and the controller 5 may drive the light source 21 according to a result of the calibrating process.
  • foveated rendering is a technology in which, in response to the fact that the density of photoreceptor cells in the user's retina is higher in the center of the field of view than in the surroundings, the resolution is increased for the one of two or more areas in the image corresponding to the eye position and decreased for its surrounding areas.
  • the present embodiment applies this concept to the precision in the brightness calibration for each area in the AR image 301 and to the precision in driving the light source 21 associated therewith.
  • the controller 5 performs image processing at a first precision when displaying a first area among the two or more areas, and the controller 5 drives the light source 21 according to a result of the image processing under a first driving condition.
  • the first area is an area corresponding to where the eye position EP lies.
  • the controller 5 performs image processing at a second precision when displaying a second area among the two or more areas, and the controller 5 drives the light source 21 according to a result of the image processing under a second driving condition.
  • the second area is an area not corresponding to where the eye position EP lies.
  • the second precision is lower than the first precision.
  • the second driving condition is rougher than the first driving condition.
  • the controller 5 may perform image processing at a third precision when displaying a third area among the two or more areas, and the controller 5 may drive the light source 21 according to a result of the image processing under a third driving condition.
  • the third area is an area not corresponding to where the eye position EP lies and located farther away from the first area than the second area is in the AR image 301.
  • the third precision is lower than the first precision and lower than the second precision.
  • the third driving condition is rougher than the first driving condition and rougher than the second driving condition.
  • the controller 5 calculates a reference brightness for the first area among the two or more areas of the AR image 301 at the first precision to generate first control information, and stores control information 62 including the first control information in the storage unit 6.
  • the first control information includes the first driving condition.
  • the controller 5 calculates a reference brightness for the second area at the second precision to generate second control information, and stores the control information 62 including the second control information in the storage unit 6.
  • the second control information includes the second driving condition.
  • the controller 5 may access the storage unit 6 when driving the light source 21 and refer to the control information 62.
  • the controller 5 may drive the light source 21 under the first driving condition, which corresponds to the first precision, when an image of the first area is projected to the eyebox 200 based on the control information 62.
  • the controller 5 may drive the light source 21 under the second driving condition, which corresponds to the second precision, when an image of the second area is projected to the eyebox 200 according to the control information 62.
  • the controller 5 may calculate a reference brightness for the third area at the third precision to generate third control information, and store the control information 62 including the third control information in the storage unit 6.
  • the third control information includes the third driving condition.
  • the controller 5 may drive the light source 21 under the third driving condition, which corresponds to the third precision, when an image of the third area is projected to the eyebox 200 based on the control information 62.
  • the controller 5 calculates respective reference brightnesses for the first number of partial areas to generate the first control information.
  • the controller 5 calculates respective reference brightnesses for the second number of partial areas to generate the second control information.
  • the second number is smaller than the first number.
  • the controller 5 thus generates the control information 62 including the first control information and the second control information.
  • the controller 5 drives the light source 21 under the first driving condition, which corresponds to the first number of partial areas, when an image of the first area is projected to the eyebox 200 based on the control information 62.
  • the controller 5 drives the light source 21 under the second driving condition, which corresponds to the second number of partial areas, when an image of the second area is projected to the eyebox 200 based on the control information 62.
  • the controller 5 may further calculate respective reference brightnesses for the third number of partial areas to generate the third control information.
  • the third number is smaller than the first number and smaller than the second number.
  • the controller 5 thus generates the control information 62 including the third control information.
  • the controller 5 may drive the light source 21 under the third driving condition, which corresponds to the third number of partial areas, when an image of the third area is projected to the eyebox 200 based on the control information 62.
  • the controller 5 includes a CPU 51, an image controller 52, light-source driving circuitry 53, and scan driving circuitry 54.
  • the CPU 51 is connected to the eye tracking mechanism 4, the image controller 52, and the storage unit 6.
  • the image controller 52 is connected to the CPU 51, the storage unit 6, the light-source driving circuitry 53, and the scan driving circuitry 54.
  • the CPU 51 and/or the image controller 52 performs a brightness re-calculating process according to the calibration information 61, and generates the control information 62 according to a result of the re-calculating process and stores it in the storage unit 6.
  • the control information 62 may be information used for controlling the light source 21 such that, among the two or more areas EB1 to EB9, the corresponding pixels have mutually closer brightness according to where the user's eye position EP lies on the eyebox 200.
  • the CPU 51 and/or the image controller 52 receives an image signal from the eye tracking mechanism 4 and extracts information about the direction of user's line-of-sight from the image signal.
  • the CPU 51 and/or the image controller 52 estimates the eye position EP on the eyebox 200 according to the information about the direction of user's line-of-sight.
  • the CPU 51 and/or the image controller 52 accesses the storage unit 6, refers to the calibration information 61, generates the control information 62 according to where the user's eye position EP lies on the eyebox 200 and the calibration information 61, and stores it in the storage unit 6.
  • the CPU 51 and/or the image controller 52 drives, according to where the user's eye position EP lies on the eyebox 200 and the control information 62, the light source 21 via the light-source driving circuitry 53 and drives the scanning mechanism 22 via the scan driving circuitry 54.
  • light is input from the scanning light-source unit 2 to the light-guiding member 3, and the two or more areas EB1 to EB9 are displayed on the eyebox 200 through the light-guiding member 3.
  • the controller 5 can drive and control the scanning light-source unit 2 with the reference brightness for each pixel re-calculated such that the corresponding pixels among the two or more areas EB1 to EB9 have mutually closer brightness.
  • the controller 5 may perform the calibrating process in the following steps (i) to (iv) as rough calibration for a case where brightness is relatively less uneven (first calibrating method; a simplified code sketch of this method is given at the end of this section) .
  • the controller 5 calculates in advance correction values for uneven brightness by using calibration table values for the reference brightness calculated from measured information (for example, R11 to R55) about a distribution of wavelength-based unevennesses of brightness at YZ positions of each FOV in one area (for example, EB5) of the two or more areas EB1 to EB9.
  • calibration table groups are created as the calibration information 61 for each of the areas EB1 to EB9 from the brightness data as illustrated in FIGS. 7 (a) to 7 (d) , and the calibration table groups are pre-stored in the calibration information 61.
  • color-based image data is acquired from one (for example, area EB5) of the areas in the eyebox 200 with a camera located at the pupil position.
  • the brightness information R11, etc., is calculated from an image of each FOV, which is divided into 25 pieces; an average brightness Lv (R) of the entire image is calculated; and the correction values CV are created from these values.
  • the controller 5 corrects brightness using the calibration table group CT1.
  • the controller 5 corrects brightness using the calibration table group CT5.
  • the controller 5 corrects brightness using the calibration table group CT9.
  • the controller 5 estimates where the user's eye position EP lies according to an eye tracking result from the eye tracking mechanism 4 and determines which one of the two or more areas EB1 to EB9 in the eyebox 200 the eye position EP lies in.
  • the controller 5 identifies an area EB in the eyebox 200 that is determined to include the eye position EP through the eye tracking.
  • the controller 5 corrects the reference brightness for each pixel of the image data using the color-based calibration tables calculated at step (i) for the identified area EB in the eyebox 200.
  • the controller 5 may multiply the correction value with the reference brightness for each pixel of the image data and set its product as a corrected reference brightness. With this configuration, the controller 5 acquires corrected image data for the area EB.
  • the controller 5 updates the control information 62 with the acquired corrected image data.
  • the controller 5 refers to the control information 62 and drives the light source 21 with the light intensity varying for each pixel according to the corrected image data included in the control information 62. This makes it possible to project an image with reduced unevenness of brightness to the eyebox 200.
  • the controller 5 performs steps (ii) to (iv) repeatedly for each frame time. In other words, if the area corresponding to the eye position EP changes to another area in the eyebox 200 in a given frame, the controller 5 changes the color-based calibration tables used for calibration to the calibration tables for the area EB corresponding to the current eye position EP in the eyebox 200.
  • because a commonly-used frame rate (for example, 60 Hz) is sufficiently faster than an eye movement (several Hz), the calibration can follow the movement of the eye position.
  • the controller 5 may determine the illuminance of the light source 21 so that the brightness obtained when the minimum of the above correction values is applied becomes relatively high (for example, the maximum brightness) .
  • the controller 5 can thereby expand the luminance dynamic range for the light source 21 and ensure a wide range of correctable brightnesses. This is especially desirable for a scanning light-source unit capable of outputting high brightness.
  • the rough brightness calibration has been explained above.
  • the brightness calibration with 400 divisions may be performed by using the high-precision calibration tables of 400 divisions per color as illustrated in FIGS. 9 (a) to 9 (c) .
  • the controller 5 may perform the following steps, as a calibration using foveated rendering (second calibrating method) , for a case where unevenness of brightness appears relatively strongly; a simplified code sketch of the table reduction used in this method is given at the end of this section.
  • FIG. 9 illustrates, as the calibration table groups, calibration table groups for the area EB5 among the areas EB1 to EB9 (see FIG. 3) in the eyebox 200.
  • the color-based calibration table groups form further-hierarchized high-precision calibration tables as illustrated in FIGS. 9 (b) to 9 (d) .
  • it is possible for the controller 5 to modify the calibration tables as illustrated in FIGS. 9 (b) to 9 (d) to produce calibration tables as illustrated in FIGS. 10 (b) to 10 (d) according to the concept of foveated rendering.
  • re-calculation is performed so that the number of color-based calibration tables decreases as it goes farther away from the area of the eye position EP, and the control information 62 is generated as a result of the re-calculation.
  • the red (R) calibration table as illustrated in FIG. 10 (a) may be created in the following steps (1) to (3) and included in the control information 62.
  • the color balance is set to produce white for multiple colors (for example, red (R) , green (G) , and blue (B) ) .
  • the controller 5 extracts color-based grayscale values from the image data and pre-adjusts the color-based grayscale values so that a color temperature in synthesis of multiple colors is within a range of color temperatures near white.
  • the pre-created calibration information 61 will be explained again.
  • Color-based image data is acquired from one (for example, the area EB5) of the areas in the eyebox 200 with a camera located at the pupil position.
  • the four detailed correction values CV are created in the width direction of R11.
  • the correction values are created similarly for R12 to R15.
  • the correction values CV are stored as high-precision calibration tables.
  • the correction values CV (R33 (11) ) to CV (R33 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R33) (FIGS. 9 (a) and 9 (d) ) .
  • the correction values CV (R44 (11) ) to CV (R44 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R44) (FIGS. 9 (a) and 9 (c) ) .
  • the correction values CV (R51 (11) ) to CV (R51 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R51) (FIGS. 9 (a) and 9 (b) ) .
  • the area RG1 includes a partial area (for example, a partial area of the brightness R33 as illustrated in FIG. 6 (b) ) corresponding to the eye position EP among two or more partial areas that are divisions of an image.
  • for the area RG1, the controller 5 uses the correction values of the high-precision calibration tables CV (R33 (11) ) to CV (R33 (44) ) stored in the calibration table group CV (R33) (FIG. 9 (d) ) .
  • the area RG2 includes partial areas adjacent to the outside of the area RG1 (for example, partial areas of the brightnesses R22 to R24, R32, R34, and R42 to R44 as illustrated in FIG. 6 (b) ) .
  • the area RG3 includes partial areas adjacent to the outside of the area RG2 (for example, partial areas of the brightnesses R11 to R15, R21, R25, R31, R35, R41, R45, and R51 to R55 as illustrated in FIG. 6 (b) ) .
  • it is possible for the controller 5 to use, among the areas RG1, RG2, and RG3 as illustrated in FIG. 10 (a) , a calibration table with the highest precision for the area RG1, a calibration table with the second highest precision for the area RG2, and a calibration table with the lowest precision for the area RG3 according to the concept of foveated rendering, thereby calculating an average of correction values or using a representative correction value at a lower precision. This allows reduction of processing load.
  • the controller 5 may re-create, for red R calibration table groups as the control information 62, calibration tables such that the number of calibration tables decreases as it goes farther away from the area RG1 of the eye position EP, i.e., the largest number for the area RG1, the second largest number for the area RG2, and the smallest number for the area RG3.
  • the controller 5 uses the high-precision calibration tables for the respective partial areas of the area RG1 without changes.
  • the controller 5 calculates the low-precision calibration tables CVF (R44 (11) ) to CVF (R44 (22) ) for foveated-rendering for the respective partial areas of the area RG2 by averaging the high-precision calibration tables or by taking any one representative value from the high-precision calibration tables.
  • the controller 5 calculates the low-precision calibration table CVF (R51 (11) ) for foveated-rendering for the respective partial areas of the area RG3 by averaging the high-precision calibration tables or by taking any one representative value from the high-precision calibration tables.
  • the controller 5 uses, among the areas RG1, RG2, and RG3 as illustrated in FIG. 10 (a) , a calibration table with the highest precision for the area RG1, a calibration table with the second highest precision for the area RG2, and a calibration table of the lowest precision for the area RG3 according to the concept of foveated rendering, thereby calculating correction values at a lower precision and generating the control information 62.
  • the controller 5 performs re-calculation of the calibration tables for the area RG1 as follows.
  • the controller 5 uses, as the calibration table group CV (R33) for the area RG1, the original high-precision calibration tables (sixteen calibration tables) initially stored, as illustrated in FIG. 10 (d) .
  • the controller 5 calculates the correction values for the respective partial areas of the pixel group PR of the area RG2 as illustrated in FIG. 10 (c) .
  • the controller 5 calculates low-precision correction values CVF (R44 (11) ) to CVF (R44 (22) ) for the respective partial areas of the pixel group PR44 from the high-precision calibration tables, for example by averaging them as described above.
  • the controller 5 includes, as the calibration table group CV (R44) of the area RG2, four low-precision calibration tables CVF (R44 (11) ) to CVF(R44 (22) ) in the control information 62.
  • the controller 5 creates four low-precision calibration tables CVF as each of the calibration table groups CV (R22) to CV (R24) , CV (R32) , CV (R34) , CV (R42) to CV (R43) and includes them in the control information 62.
  • it is also allowable to take a representative correction value for any one predetermined location as the low-precision correction value. For example, it is allowable to always take the value for the upper left section when the high-precision calibration table values are divided into four sections.
  • the controller 5 calculates the correction value for the partial areas of the pixel group PR of the area RG3 as illustrated in FIG. 10 (b) .
  • the controller 5 calculates the low-precision correction value CVF (R51 (11) ) for the partial areas of the pixel group PR51, for example by averaging the corresponding high-precision calibration tables.
  • the controller 5 includes, as the calibration table group CV (R51) of the area RG3, one low-precision calibration table CVF (R51 (11) ) in the control information 62.
  • the controller 5 creates one low-precision calibration table CVF as each of the calibration table groups CV (R11) to CV (R15) , CV (R21) , CV (R25) , CV (R31) , CV (R35) , CV (R41) , CV (R45) , CV (R52) to CV (R55) and includes these in the control information 62.
  • the green (G) calibration tables as illustrated in FIG. 7 (c) and the blue (B) calibration tables as illustrated in FIG. 7 (d) may be created in a similar manner.
  • the controller 5 may drive the light source 21 as illustrated in FIG. 11 according to the control information 62 including the corrected image data.
  • FIG. 11 is a diagram illustrating pulse driving for the light source 21.
  • the controller 5 may perform image processing for an image according to where the user's eye position EP lies on the eyebox 200, and drive the light source 21 with pulses under a driving condition according to the control information 62 including a result of the image processing.
  • the time cycle by which the controller 5 drives one pulse of the light source 21 may be shorter than the time cycle by which the controller 5 drives one pixel with the light source 21.
  • the controller 5 can drive, for an image, the light source with two or more pulses for each pixel according to the control information 62.
  • FIGS. 11 (a) to 11 (d) illustrate examples in which a time cycle T by which the controller 5 drives one pixel with the light source 21 is approximately five times as long as a time cycle 0.2T by which the controller 5 drives one pulse of the light source 21.
  • the time cycle T for one pixel may be 10 ns, and the time cycle 0.2T for one pulse may be 2 ns, for example.
  • the controller 5 can drive the light source 21 with five pulses for each pixel of the image.
  • the controller 5 can change the pulse intensity, the pulse width, and the number of pulses for each pixel of an image based on the reference brightness corrected in accordance with where the user's eye position EP lies on the eyebox 200.
  • the controller 5 can change, for each pixel and based on the reference brightness, the pulse intensity of the light source 21 within a predetermined range (for example, 0.5A to A) , the pulse width of the light source 21 within a predetermined range (for example, 0.1T to 0.2T) , and the number of pulses of the light source 21 within a predetermined range (for example, 0 to 5) ; a simplified code sketch of such a mapping is given at the end of this section.
  • the image display apparatus 1 detects a direction of line-of-sight EL toward the light-guiding member 3, performs image processing according to the detected direction of line-of-sight, and drives the light source 21 according to a result of the image processing.
  • with this configuration, it is possible to make the brightness uniform between the two or more areas EB1 to EB9 and, therefore, to easily improve usability for the user.
  • eye tracking enables color-based brightness calibration for each area EB according to where the user's eye position EP lies on the eyebox 200.
  • the concept of foveated rendering is applied to brightness calibration such that precise correction is made for the area corresponding to the eye position EP and the precision of correction decreases as it goes farther away from the eye position EP.
  • this may reduce the load (for example, processing time, power consumed, etc.) of the calibrating process.
  • the pulse driving can be performed at a time cycle shorter than a time cycle for one pixel, and driving can be performed with precision varying according to the brightness calibration.
  • precise driving is performed for an area corresponding to the eye position EP, and the driving precision decreases as it goes farther away from the eye position EP. This achieves effective driving of the light source 21 according to the brightness calibration.
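
A minimal sketch of steps (ii) to (iv) of the first calibrating method above, assuming that the calibration information 61 holds one 5 x 5 table of correction values per color and per area EB, and that the corrected reference brightness is the product of the correction value and the original reference brightness, as stated above. The data layout and all names are illustrative, not taken from the patent.

```python
import numpy as np

# calibration_info[area_eb][color] is a 5 x 5 array of correction values CV,
# pre-computed offline from camera measurements (step (i)); the layout is illustrative.
CalibrationInfo = dict[int, dict[str, np.ndarray]]

def correct_frame(image: dict[str, np.ndarray],
                  area_eb: int,
                  calibration_info: CalibrationInfo) -> dict[str, np.ndarray]:
    """Steps (ii)-(iv): pick the calibration table group for the area EB that contains
    the eye position EP and multiply each pixel's reference brightness by the
    correction value of the FOV region (pixel group) it belongs to."""
    corrected = {}
    for color, channel in image.items():           # 'R', 'G', 'B'
        cv = calibration_info[area_eb][color]      # 5 x 5 correction values
        h, w = channel.shape
        out = channel.astype(float).copy()
        for i in range(5):
            for j in range(5):
                out[i * h // 5:(i + 1) * h // 5,
                    j * w // 5:(j + 1) * w // 5] *= cv[i, j]
        corrected[color] = out
    return corrected

# Per frame: estimate the eye position EP, map it to an area EB (for example EB5),
# correct the image data, then drive the light source with the corrected brightness.
```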
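A sketch of the foveated-rendering table reduction used in the second calibrating method. The classification into RG1 (the pixel group containing the eye position), RG2 (its neighbors), and RG3 (the remaining groups) follows the example above; the reduction of a 4 x 4 high-precision table to four values or one value by averaging, or by taking one representative value, also follows the text, but the exact grouping of the sixteen values into 2 x 2 blocks is an assumption made for illustration.

```python
import numpy as np

def region_class(group_row: int, group_col: int, ep_row: int, ep_col: int) -> str:
    """Classify a 5 x 5 pixel group as RG1, RG2, or RG3 by its distance from the
    group that contains the eye position EP (for example, R33 -> row 2, col 2)."""
    d = max(abs(group_row - ep_row), abs(group_col - ep_col))
    return "RG1" if d == 0 else ("RG2" if d == 1 else "RG3")

def reduce_table(cv_high: np.ndarray, region: str) -> np.ndarray:
    """Reduce one 4 x 4 high-precision calibration table according to its region.

    RG1: keep all sixteen values.
    RG2: average each 2 x 2 block, giving four values CVF(..(11))..CVF(..(22)).
    RG3: average all sixteen values into a single CVF(..(11)).
    """
    if region == "RG1":
        return cv_high
    if region == "RG2":
        return cv_high.reshape(2, 2, 2, 2).mean(axis=(1, 3))
    return np.array([[cv_high.mean()]])

# Alternative mentioned in the text: take one representative value per block,
# e.g. always the upper-left entry: cv_high[::2, ::2] for RG2, cv_high[0, 0] for RG3.

# Example: eye position EP in the pixel group R33 (row 2, column 2, zero-based);
# the pixel group R44 is then in RG2 and its table is reduced to 2 x 2.
cv_r44 = np.linspace(0.8, 1.2, 16).reshape(4, 4)         # hypothetical table CV(R44)
print(region_class(3, 3, 2, 2))                          # -> 'RG2'
print(reduce_table(cv_r44, region_class(3, 3, 2, 2)))    # 2 x 2 low-precision table
```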
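A sketch of the pulse-driving parameter mapping. The ranges (0 to 5 pulses per pixel, pulse width 0.1T to 0.2T, peak intensity 0.5A to A) are taken from the example above; the way the target brightness is split between pulse count, pulse width, and peak intensity is an assumption for illustration and is not specified by the patent text.

```python
def pulse_parameters(target: float,
                     t_pixel: float = 10e-9,   # time cycle T for one pixel, e.g. 10 ns
                     a_max: float = 1.0):
    """Map a corrected per-pixel brightness target in [0, 1] to pulse-driving
    parameters (number of pulses, pulse width, peak intensity).

    Allocation strategy (an assumption): use the pulse count as the coarse control
    and the peak intensity as the fine control, keeping the pulse width at 0.2 * T.
    """
    if target <= 0.0:
        return 0, 0.0, 0.0                               # pixel stays dark
    target = min(target, 1.0)
    n_pulses = max(1, min(5, round(target * 5)))         # 0..5 pulses per pixel
    width = 0.2 * t_pixel                                # within 0.1T..0.2T
    # Energy per pixel ~ n_pulses * width * intensity; with full brightness defined
    # as 5 pulses of width 0.2T at intensity A, solve for the intensity and clamp
    # it to the allowed range 0.5*A..A.
    intensity = target * 5 * a_max / n_pulses
    intensity = min(max(intensity, 0.5 * a_max), a_max)
    return n_pulses, width, intensity

# Example: a mid-brightness pixel
print(pulse_parameters(0.55))   # -> (3, 2e-09, ~0.917)
```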

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

An image display apparatus (1) according to an aspect of the present invention includes a light source (21), a light-guiding member (3), an eye tracking mechanism (4), and a controller (5). The light-guiding member (3) receives light emitted from the light source (21). The light-guiding member (3) projects two or more images to an eyebox (200). The two or more images correspond to each other. The two or more images are arranged two-dimensionally. The eye tracking mechanism (4) detects a direction of line-of-sight toward the light-guiding member (3). The controller (5) performs image processing according to the detected direction of line-of-sight and drives the light source (21) according to a result of the image processing.

Description

IMAGE DISPLAY APPARATUS
[Technical Field]
The present invention relates to an image display apparatus.
[Background Art]
Some image display apparatuses such as augmented reality (AR) glasses can receive light from the external world at a light-guiding member and cause the light to pass therethrough to the user's eyeball side, and they also can receive light from a display device at the light-guiding member and guide the light to the user's eyeball side.
[Problem to be Solved by the Invention]
Some image display apparatuses may receive light from a display device (hereinafter, simply referred to as "light source") at a light-guiding member and copy, through the light-guiding member, mutually-corresponding and two-dimensionally arranged images in a wide eyebox. If, for example, brightness of the images is non-uniform between two or more areas in the eyebox, movement of the user's eye position between two or more areas in the eyebox causes an abrupt change in brightness of the image (AR image), etc., which causes uneven color or brightness when the user sees the AR image. As a result, quality of the image may degrade, and it may be difficult for the user to enjoy a realistic AR image. This may reduce usability of the image display apparatus for the user.
The present invention has been made in view of the above-described problem, and an aim of the invention is to provide an image display apparatus capable of easily improving usability for a user.
[Means for Solving Problem]
To solve the problem described above and achieve the object, an image display apparatus according to an aspect of the present invention includes a light source, a light-guiding member, an eye tracking mechanism, and a controller. The light-guiding member receives light emitted from the light source. The light-guiding member projects two or more images to an eyebox. The two or more images correspond to each other. The two or more images are arranged two-dimensionally. The eye tracking mechanism detects a direction of line-of-sight toward the light-guiding member. The controller performs image processing according to the detected direction of line-of-sight  and drives the light source according to a result of the image processing.
[Effect of the Invention]
According to one aspect of the present invention, it is possible to easily improve usability for a user.
[Brief Description of Drawings]
FIG. 1 is a diagram illustrating appearance configuration of an image display apparatus according to an embodiment;
FIG. 2 is a block diagram illustrating configuration of the image display apparatus according to the embodiment;
FIG. 3 is a diagram illustrating a positional relation between an eyebox, a light-guiding surface of a light-guiding member, and a virtual display surface according to the embodiment;
FIG. 4 is a diagram illustrating configuration of the light-guiding member according to the embodiment;
FIG. 5 is a diagram illustrating configuration of an eye tracking mechanism according to the embodiment;
FIG. 6 is a diagram illustrating brightness information according to the embodiment;
FIG. 7 is a diagram illustrating calibration information according to the embodiment;
FIG. 8 is a diagram illustrating brightness information according to the embodiment;
FIG. 9 is a diagram illustrating calibration information according to the embodiment;
FIG. 10 is a diagram illustrating control information according to the embodiment; and
FIG. 11 is a diagram illustrating pulse driving for the light source according to the embodiment.
[Embodiment (s) of Carrying Out the Invention]
Hereinafter, an image display apparatus according to an embodiment will be described in detail with reference to the accompanying drawings. Note that the present invention is not limited to this embodiment.
(Embodiment)
An image display apparatus according to an embodiment is, for example, an augmented reality (AR) glass for realizing AR. The image display apparatus can receive light from the external world at a light-guiding member and cause the light to pass therethrough to the user's eyeball side. In addition, the image display apparatus can receive light from a light source at the light-guiding member and guide the light to the user's eyeball side. The image display apparatus receives light emitted from the light source at the light-guiding member and projects an image to an eyebox through the light-guiding member. In response to this, a user wearing the image display apparatus is able to recognize that an image (AR image) is displayed on a virtual display surface opposite to the eyebox through the light-guiding member. The image display apparatus may use a point-light-source display device as the light source, and it may two-dimensionally scan with light of the point light source, thereby displaying an image on the eyebox through the light-guiding member. The image display apparatus may use a pulsed laser as the point light source and may drive it in laser beam scanning (LBS) mode. Compared with a surface light source (for example, a micro LED), a point light source can more easily achieve high brightness, and its output is easier to control. Using a point light source as the light source allows the user to see a clear AR image within a wide eye-moving range when the user is outdoors.
Upon receiving light from the light source, the light-guiding member projects mutually-corresponding and two-dimensionally arranged two or more images to the eyebox. If the eye position lies on any of the two or more images when the user moves his/her eyeball, the user is able to see the AR image. This makes it possible to expand the range (eyebox) within which the user is able to see the AR image.
However, if brightness is uneven between the two or more images, then when the user moves his/her eyeball and a replacement occurs between the images corresponding to the eye position, the unevenness of brightness or color changes dramatically and the brightness changes as well. Due to this, the image is no longer high quality for the user, and the user cannot enjoy a realistic AR image; even worse, the image may be difficult to see because of insufficient brightness.
In view of this, the image display apparatus according to the present embodiment detects a direction of line-of-sight toward the light-guiding member, performs image processing according to the detected direction of line-of-sight, and drives the light source with the amount of light changing according to a result of the image processing, which may achieve uniform brightness between two or more images and may improve the usability for the user.
For example, an image display apparatus 1 includes a scanning light-source unit 2, a light-guiding member 3, an eye tracking mechanism 4, a controller 5, and a storage unit 6, as illustrated in FIGS. 1 and 2. FIG. 1 is a diagram illustrating appearance configuration of the image display apparatus 1. In FIG. 1, an X direction is a direction perpendicular to a light-guiding surface 3a of the light-guiding member 3; a Y direction is a longitudinal direction of the image display apparatus 1; and a Z direction is a direction perpendicular to both the X direction and the Y direction. FIG. 2 is a block diagram illustrating configuration of the image display apparatus 1.
The controller 5 can control various sections of the image display apparatus 1 in general.
The scanning light-source unit 2 is connected to the light-guiding member 3 near one end of the light-guiding member 3, as illustrated in FIG. 1. With this configuration, the scanning light-source unit 2 can guide generated light to the light-guiding member 3 while scanning with the light under control of the controller 5. The scanning light-source unit 2 may be an LBS unit driven in an LBS mode.
As illustrated in FIG. 2, the scanning light-source unit 2 includes a light source 21 and a scanning mechanism 22. The light source 21 may be a point light source. The light source 21 may be a pulsed laser, for example. The light source 21 may be pulse-driven according to control of the controller 5 to generate light. The light source 21 may include a plurality of light-emitting elements 21r, 21g, and 21b corresponding to multiple colors and a multiplexer 21c. For example, the light-emitting elements 21r, 21g, and 21b may correspond to red (R) , green (G) , and blue (B) , respectively. The multiplexer 21c multiplexes multiple colors of light coming from the light-emitting elements 21r, 21g, and 21b and guides it to the scanning mechanism 22.
The scanning mechanism 22 guides the light emitted from the light source 21 to the light-guiding member 3 while scanning with the light according to the control by the controller 5. The scanning mechanism 22 may include one or more mirrors and its driving mechanism. The one or more mirrors may be one or more micro  electro mechanical system (MEMS) mirrors. The driving mechanism achieves scanning with the light, which is reflected by the mirror (s) and guided to the light-guiding member 3, by controlling the angle of the mirror (s) .
The light-guiding member 3 allows light coming from the external world to pass therethrough to the user's eyeball side and allows light coming from the light source 21 to project to the user's eyeball side. As illustrated in FIG. 1, the light-guiding member 3 is shaped like a plate extending in the X and Y directions. The light-guiding member 3 may be shaped like a rounded rectangle or an approximate ellipse in an XY planar view.
As illustrated in FIG. 3, the light-guiding member 3 projects light coming from the scanning light-source unit 2 to an eyebox 200 via the light-guiding surface 3a. The eyebox 200 is defined at a place where a pupil 101 of a user's eyeball 100 lies. The light-guiding member 3 generates an image of the light source 21 and projects the image to the eyebox 200 wherever the pupil 101 is positioned within the range of two or more areas EB1 to EB9 of the eyebox 200. In response to this, the user wearing the image display apparatus 1 is able to recognize that the image (AR image 301) is displayed on a virtual display surface 300 opposite to the eyebox 200 through the light-guiding member 3. FIG. 3 is a diagram illustrating a positional relation between the eyebox 200, the light-guiding surface 3a of the light-guiding member 3, and the virtual display surface 300. The total area of the respective areas EB1 to EB9 may be called the eyebox. Because the eyebox is temporarily divided into the two or more areas EB1 to EB9, their images mutually correspond to the same image data.
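
As a concrete illustration of how an estimated pupil position can be mapped to one of the areas EB1 to EB9, the following sketch models the eyebox 200 as a 3 x 3 grid of equally sized areas in the YZ plane. The grid layout, the numbering order of EB1 to EB9, the dimensions, and all names in the code are assumptions made for illustration; the patent text does not specify them.

```python
from dataclasses import dataclass

@dataclass
class Eyebox:
    """Eyebox modeled as a 3 x 3 grid of areas EB1..EB9 in the YZ plane (illustrative)."""
    y_min: float
    z_min: float
    width: float    # total extent along Y
    height: float   # total extent along Z

    def area_of(self, y: float, z: float) -> int | None:
        """Return the area index 1..9 containing the point (y, z), or None if outside.

        Row-major numbering of EB1..EB9 is an assumption for this sketch."""
        if not (self.y_min <= y < self.y_min + self.width and
                self.z_min <= z < self.z_min + self.height):
            return None
        col = int(3 * (y - self.y_min) / self.width)      # 0..2
        row = int(3 * (z - self.z_min) / self.height)     # 0..2
        return row * 3 + col + 1                          # EB1..EB9

# Example: a 9 mm x 9 mm eyebox centered on the nominal pupil position
eyebox = Eyebox(y_min=-4.5, z_min=-4.5, width=9.0, height=9.0)
print(eyebox.area_of(0.0, 0.0))   # -> 5 (the central area, EB5 under this numbering)
```

The calibration logic summarized in the Definitions section above only needs such an area index to select the matching calibration table group for that area.
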
Note that an image based on image data is displayed as the AR image 301, and the image includes R (Red) , G (Green) , and B (Blue) images in the case of full-color display. The R, G, and B images each include two or more FOV (Field Of View) regions that are arranged two-dimensionally. The R image includes two or more pixel groups PR11 to PR55 corresponding to the two or more FOV regions. Each pixel group PR includes two or more R pixels. The G image includes two or more pixel groups PG11 to PG55 corresponding to the two or more FOV regions. Each pixel group PG includes two or more G pixels. The B image includes two or more pixel groups PB11 to PB55 corresponding to the two or more FOV regions. Each pixel group PB includes two or more B pixels.
For example, the R image is captured with a camera to measure brightnesses R11 to R55 of the respective FOV regions, calibration tables are calculated based on them and stored, and efficiency and unevenness of brightness are measured. The two or more pixel groups PR11 to PR55 correspond to the two or more brightnesses R11 to R55. Each of the brightnesses R11 to R55 is an average brightness of the two or more R pixels in the corresponding pixel group PR. The G image is captured with a camera to measure brightnesses G11 to G55 of the respective FOV regions, calibration tables are calculated based on them and stored, and efficiency and unevenness in brightness are measured. The two or more pixel groups PG11 to PG55 correspond to the two or more brightnesses G11 to G55. Each of the brightnesses G11 to G55 is an average brightness of the two or more G pixels in the corresponding pixel group PG. The B image is captured with a camera to measure brightnesses B11 to B55 of the respective FOV regions, calibration tables are calculated based on them and stored, and efficiency and unevenness in brightness are measured. The two or more pixel groups PB11 to PB55 correspond to the two or more brightnesses B11 to B55. Each of the brightnesses B11 to B55 is an average brightness of the two or more B pixels in the corresponding pixel group PB. FIG. 3 exemplifies the two or more brightnesses R11 to R55 corresponding to the two or more pixel groups PR11 to PR55. The unevenness of R, G, and B brightnesses are thus stored as the calibration table and calibration is made using them; therefore, it becomes possible to correct uneven color, eventually.
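
The measurement step described above (capturing each color image with a camera, averaging the brightness of each FOV region, and deriving calibration tables) can be summarized in a short sketch. The 5 x 5 split into pixel groups follows the text; the specific form of the correction value (overall average brightness divided by the regional brightness, so that applying it evens out the regions) is an assumption made for illustration, since the exact formula is not reproduced in this text.

```python
import numpy as np

def region_brightnesses(channel: np.ndarray, n: int = 5) -> np.ndarray:
    """Average brightness of each of the n x n FOV regions of one color channel.

    channel: 2-D array of pixel brightnesses for one color (R, G, or B),
    captured with a camera placed at the pupil position.
    Returns an n x n array, e.g. R11..R55 for the red channel."""
    h, w = channel.shape
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            block = channel[i * h // n:(i + 1) * h // n,
                            j * w // n:(j + 1) * w // n]
            out[i, j] = block.mean()
    return out

def calibration_table(brightness: np.ndarray) -> np.ndarray:
    """Correction values CV for one color (assumed form: overall average / regional brightness)."""
    lv = brightness.mean()          # average brightness Lv of the whole image
    return lv / brightness          # CV > 1 brightens dark regions, CV < 1 dims bright ones

# Example with synthetic red-channel data for one eyebox area (e.g. EB5)
rng = np.random.default_rng(0)
red = rng.uniform(80, 120, size=(500, 500))           # hypothetical captured image
cv_r = calibration_table(region_brightnesses(red))    # 5 x 5 table of correction values
```
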
The light-guiding member 3 is configured to divide the received light so as to project the AR image 301 within the range of the areas EB1 to EB9, which may form the wide eyebox 200. As illustrated in FIG. 4, the light-guiding member 3 includes an input coupler IC, a waveguide WG, and an output coupler OC. FIG. 4 is a diagram illustrating the configuration of the light-guiding member 3. The input coupler IC, the waveguide WG, and the output coupler OC are arranged sequentially along an optical path in the light-guiding member 3.
The input coupler IC is disposed between the scanning light-source unit 2 and the waveguide WG. The configuration is illustrated as an example in which an image is input from the scanning light-source unit 2 to the light-guiding member 3; specifically, the image is input through the input coupler IC to the waveguide WG and is then guided from the waveguide WG through the output coupler OC and the light-guiding surface 3a to the eyebox 200.
The waveguide WG is disposed between the input coupler IC and the output coupler OC. The waveguide WG receives an image of the scanning light-source unit 2 from the input coupler IC. The input coupler IC, the waveguide WG, and the output coupler OC include two or more optical element groups corresponding to two or more images, the optical elements performing input, reflection, output, etc., of the images. Each optical element may be a holographic optical element (HOE) , a diffractive optical element (DOE) , or a reflective optical element.
As illustrated in FIG. 4, the waveguide WG divides the multiplexed light OB into two or more divided light groups DB1 to DB9 with the two or more optical element groups. The waveguide WG guides the two or more divided light groups DB1 to DB9 to the output coupler OC.
The output coupler OC is disposed between the waveguide WG and the eyebox 200. The output coupler OC receives the two or more divided light groups DB1 to DB9 and emits them toward the eyebox 200.
With this configuration, the light-guiding member 3 projects an image having the wide eyebox 200, covering the range of the areas EB1 to EB9 as illustrated in FIG. 3, onto the pupil 101 of the user's eyeball 100. The user is able to see the AR image 301 as long as the eye position lies in any of the two or more areas EB1 to EB9 when the user moves the eyeball 100. This can expand the range (the eyebox 200) within which the user is able to see the AR image 301.
The eye tracking mechanism 4 detects a direction of the user's line-of-sight toward the light-guiding member 3 under control of the controller 5. As illustrated in FIG. 5, the eye tracking mechanism 4 includes an illuminating element 41, an imaging element 42, and supporting members 43 and 44, these components being mounted mainly on a frame of the AR glasses, for example. FIG. 5 is a diagram illustrating the configuration of the eye tracking mechanism 4.
The illuminating element 41 and the imaging element 42 may be located opposite to each other with the light-guiding member 3 therebetween. The illuminating element 41 is fixed to one end of the light-guiding member 3 via the supporting member 43, and the imaging element 42 is fixed to the other end of the light-guiding member 3 via the supporting member 44. FIG. 5 illustrates an example in which the illuminating  element 41 is fixed to the positive Y-side of the light-guiding member 3, and the imaging element 42 is fixed to the negative Y-side of the light-guiding member 3.
The illuminating element 41 illuminates the pupil 101 of the user's eyeball 100 with light. The imaging element 42 receives light reflected by the pupil 101 of the user's eyeball 100 and acquires an image of the pupil 101. The image of the pupil 101 includes information about the direction of user's line-of-sight. The imaging element 42 supplies an image signal indicating the image of the pupil 101 to the controller 5.
The controller 5 drives the light source 21 according to the direction of line-of-sight detected by the eye tracking mechanism 4.
For example, the controller 5 acquires an image signal from the imaging element 42. The controller 5 extracts information about the direction of user's line-of-sight from the image signal. The controller 5 calculates an intersection CP of a line EL indicating the extracted direction of user's line-of-sight and the eyebox 200 (see FIG. 5) . This enables the controller 5 to estimate where the user's eye position EP lies on the eyebox 200 (see FIG. 3) .
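The following Python sketch illustrates, under stated assumptions, how the intersection CP of the line of sight EL with the eyebox 200 and the corresponding area EB could be estimated. The assumption that the eyebox lies in a YZ plane at a known X coordinate, the 3×3 layout of the areas, and all function and parameter names are illustrative and are not taken from the original description.

```python
import numpy as np

def gaze_eyebox_intersection(eye_center, gaze_dir, eyebox_x):
    """Intersection CP of the line of sight EL with the eyebox plane.

    eye_center: 3D position (x, y, z) of the eyeball.
    gaze_dir:   unit vector of the detected line-of-sight direction.
    eyebox_x:   x coordinate of the plane assumed to contain the eyebox.
    Returns the (y, z) coordinates of CP, or None if the gaze is parallel
    to that plane.
    """
    eye_center = np.asarray(eye_center, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    if abs(gaze_dir[0]) < 1e-9:
        return None
    t = (eyebox_x - eye_center[0]) / gaze_dir[0]
    cp = eye_center + t * gaze_dir
    return cp[1], cp[2]

def area_of_eye_position(cp_yz, eyebox_origin, eyebox_size, n=3):
    """Map CP onto one of the n x n areas EB1..EB9 of the eyebox."""
    y, z = cp_yz
    oy, oz = eyebox_origin
    sy, sz = eyebox_size
    iy = min(max(int((y - oy) / sy * n), 0), n - 1)
    iz = min(max(int((z - oz) / sz * n), 0), n - 1)
    return iz * n + iy + 1      # area index 1..9, i.e. EB1..EB9
```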
As for the two or more areas EB1 to EB9, the controller 5 performs image processing on the AR image 301 according to where the user's eye position EP lies on the eyebox 200 and drives the light source 21 according to a result of the image processing.
For example, the controller 5 may perform image processing according to where the user's eye position EP lies on the eyebox 200 such that the pixels of the AR image 301 corresponding to a desired area EB among the two or more areas EB1 to EB9 have mutually closer brightnesses, and the controller 5 may drive the light source 21 according to a result of the image processing. When the light source 21 has the two or more light-emitting elements 21r, 21g, and 21b corresponding to multiple colors (for example, red (R) , green (G) , and blue (B) ) , the controller 5 may drive the two or more light-emitting elements 21r, 21g, and 21b separately. The controller 5 may perform image processing according to where the user's eye position EP lies on the eyebox 200 such that the pixels of the AR image 301 corresponding to a desired area EB among the two or more areas EB1 to EB9 have mutually closer brightnesses in the corresponding color, and the controller 5 may drive the two or more light-emitting elements 21r to 21b separately according to a result of the image processing.
At this time, the controller 5 may perform image processing on the AR image 301 at each EB position according to where the user's eye position EP lies on the eyebox 200, and the controller 5 may pulse-drive the light source 21 under a driving condition according to a result of the image processing. The time cycle by which the controller 5 drives one pulse of the light source 21 is shorter than the time cycle by which the controller 5 drives one pixel with the light source 21.
The controller 5 can drive, for the AR image 301, the light source 21 with one or more pulses for each pixel according to a result of the image processing. The controller 5 can perform image processing on the AR image 301 at each EB's position according to where the user's eye position EP lies on the eyebox 200, and the controller 5 can change the pulse peak intensity, the pulse width, and the number of pulses for each pixel according to a result of the image processing, thereby controlling the brightness.
Explained below are details of a calibrating process by the controller 5 with reference to FIGS. 6, 7, 8, and 9. FIGS. 6 and 8 illustrate brightness information used in generating the calibration information 61. FIGS. 7 and 9 illustrate the calibration information 61 that is generated by using the brightness information of FIGS. 6 and 8 and used in the calibrating process.
FIG. 6 (a) illustrates source data of brightness information of an image for creating the calibration table groups corresponding to the areas EB1 to EB9, i.e., the divisions of the eyebox 200 (see FIG. 3) .
FIG. 7 illustrates the calibration information 61 operating as basic calibration tables that are calculated from FIG. 6 and used in the calibrating process.
FIGS. 6 (b) to 6 (d) illustrate source data of brightness for creating brightness tables of the AR image 301 for the area EB5, for example. They illustrate source data of brightness for creating calibration table groups corresponding to the two or more areas of each FOV of the AR image 301, the data being represented as the color-based brightness information R11 to R55, G11 to G55, and B11 to B55. The source data of brightness in this example is for creating 5×5=25 calibration tables per color, for simplicity. Such data is prepared for each of the nine areas EB1 to EB9.
FIG. 7 (a) exemplifies, as the calibration information, the two or more calibration table groups CT1 to CT9 that are calculated from the above data and that are used in the calibrating process. The two or more calibration table groups CT1 to CT9 correspond to the two or more areas EB1 to EB9 in the eyebox 200 (see FIG. 3) . FIGS. 7 (b) to 7 (d) illustrate R basic calibration table groups CV (R11) to CV (R55) , G basic calibration table groups CV (G11) to CV (G55) , and B basic calibration table groups CV (B11) to CV (B55) , which correspond to each calibration table group CT.
FIG. 8 (a) illustrates the brightness information R11 to R55 for creating more precise calibration table groups corresponding to two or more areas in the FOV of one piece of image data for red.
FIGS. 8 (b) to 8 (d) illustrate source data for creating more precise calibration table groups for the two or more areas. To allow the basic calibration table group of FIG. 7, corresponding to the calibration table group CT, to be easily modified toward a lower or a higher precision, it is desirable to calculate brightness data of (5×4) × (5×4) =20×20=400 pieces per color for one area EB in the eyebox 200. The number of divisions may be more than 400; however, an increase in the number of divisions results in an increase in the amount of memory for the calibration table group CT. Therefore, a number of around 400 is desirable.
In practice, data is created for three colors for each of the two or more areas EB1 to EB9 in the eyebox 200; therefore, it is desirable to prepare brightness data of 9×3×400=10800 pieces. These data are obtained by acquiring an image from each of the areas EB1 to EB9 with a camera and extracting color-based brightness data from the image data, and they are easily created by the later-described calibration-table creating methods.
FIGS. 9 (b) to 9 (d) illustrate the calibration information 61 operating as high-precision calibration tables that are calculated from the above-mentioned more precise data and that are used in the calibrating process.
If brightness is uneven in the image of each FOV of the AR image 301 for, for example, EB1 among the two or more areas EB1 to EB9 in the eyebox 200 as illustrated in FIG. 3, color unevenness may result. If brightness is also uneven in the image of each FOV of the AR image 301 viewed from EB2, EB3, or the like, then when the AR image is viewed from a different area EB by changing the line of sight, the brightness and the unevenness of brightness and color may change, and it is difficult for the user to see a uniform, high-quality AR image.
In contrast, the controller 5 may first determine which one of the areas EB1 to EB9, i.e., the divisions of the eyebox 200, the eye position EP lies on. The controller 5 may then divide the AR image 301 into two or more areas and determine which one of the two or more areas corresponds to where the eye position EP lies. According to the determination result, the controller 5 may perform, following the concept of foveated rendering, a calibrating process with the calibration information 61 while changing the precision depending on the line-of-sight position in the AR image 301 corresponding to one of the areas EB, and the controller 5 may drive the light source 21 according to a result of the calibrating process.
Foveated rendering is a technology in which, because the density of photoreceptor cells in the user's retina is higher at the center of the field of view than in the surroundings, the resolution is increased for the one of two or more areas in the image that corresponds to the eye position and is decreased for its surrounding areas. The present embodiment applies this concept to the precision of the brightness calibration for each area in the AR image 301 and to the precision of driving the light source 21 associated therewith.
For example, the controller 5 performs image processing at a first precision when displaying a first area among the two or more areas, and the controller 5 drives the light source 21 according to a result of the image processing under a first driving condition. The first area is an area corresponding to where the eye position EP lies. The controller 5 performs image processing at a second precision when displaying a second area among the two or more areas, and the controller 5 drives the light source 21 according to a result of the image processing under a second driving condition. The second area is an area not corresponding to where the eye position EP lies. The second precision is lower than the first precision. The second driving condition is rougher than the first driving condition.
Moreover, the controller 5 may perform image processing at a third precision when displaying a third area among the two or more areas, and the controller 5 may drive the light source 21 according to a result of the image processing under a third driving condition. The third area is an area not corresponding to where the eye position EP lies and located farther away from the first area than the second area is in the AR image 301. The third precision is lower than the first precision and lower than the second precision. The third driving condition is rougher than the first driving condition and rougher than the second driving condition.
At this time, the controller 5 calculates a reference brightness for the first area among the two or more areas of the AR image 301 at the first precision to generate first control information, and stores control information 62 including the first control information in the storage unit 6. The first control information includes the first driving condition. The controller 5 calculates a reference brightness for the second area at the second precision to generate second control information, and stores the control information 62 including the second control information in the storage unit 6. The second control information includes the second driving condition. The controller 5 may access the storage unit 6 when driving the light source 21 and refer to the control information 62. The controller 5 may drive the light source 21 under the first driving condition, which corresponds to the first precision, when an image of the first area is projected to the eyebox 200 based on the control information 62. The controller 5 may drive the light source 21 under the second driving condition, which corresponds to the second precision, when an image of the second area is projected to the eyebox 200 according to the control information 62.
Furthermore, the controller 5 may calculate a reference brightness for the third area at the third precision to generate third control information, and store the control information 62 including the third control information in the storage unit 6. The third control information includes the third driving condition. The controller 5 may drive the light source 21 under the third driving condition, which corresponds to the third precision, when an image of the third area is projected to the eyebox 200 based on the control information 62.
For example, by using a calibration table in which the first area among the two or more areas is divided into a first number of partial areas, the controller 5 calculates respective reference brightnesses for the first number of partial areas to generate the first control information. By using a calibration table in which the second area among the two or more areas is divided into a second number of partial areas, the controller 5 calculates respective reference brightnesses for the second number of partial areas to generate the second control information. The second number is smaller than  the first number. The controller 5 thus generates the control information 62 including the first control information and the second control information. The controller 5 drives the light source 21 under the first driving condition, which corresponds to the first number of partial areas, when an image of the first area is projected to the eyebox 200 based on the control information 62. The controller 5 drives the light source 21 under the second driving condition, which corresponds to the second number of partial areas, when an image of the second area is projected to the eyebox 200 based on the control information 62.
By using a calibration table in which the third area among the two or more areas is divided into a third number of partial areas, the controller 5 may further calculate respective reference brightnesses for the third number of partial areas to generate the third control information. The third number is smaller than the first number and smaller than the second number. The controller 5 thus generates the control information 62 including the third control information. The controller 5 may drive the light source 21 under the third driving condition, which corresponds to the third number of partial areas, when an image of the third area is projected to the eyebox 200 based on the control information 62.
The controller 5 includes a CPU 51, an image controller 52, light-source driving circuitry 53, and scan driving circuitry 54.
The CPU 51 is connected to the eye tracking mechanism 4, the image controller 52, and the storage unit 6. The image controller 52 is connected to the CPU 51, the storage unit 6, the light-source driving circuitry 53, and the scan driving circuitry 54.
The CPU 51 and/or the image controller 52 performs a brightness re-calculating process according to the calibration information 61, and generates the control information 62 according to a result of the re-calculating process and stores it in the storage unit 6. The control information 62 may be information used for controlling the light source 21 such that, among the two or more areas EB1 to EB9, the corresponding pixels have mutually closer brightness according to where the user's eye position EP lies on the eyebox 200.
The CPU 51 and/or the image controller 52 receives an image signal from the eye tracking mechanism 4 and extracts information about the direction of  user's line-of-sight from the image signal. The CPU 51 and/or the image controller 52 estimates the eye position EP on the eyebox 200 according to the information about the direction of user's line-of-sight. The CPU 51 and/or the image controller 52 accesses the storage unit 6, refers to the calibration information 61, generates the control information 62 according to where the user's eye position EP lies on the eyebox 200 and the calibration information 61, and stores it in the storage unit 6. The CPU 51 and/or the image controller 52 drives, according to where the user's eye position EP lies on the eyebox 200 and the control information 62, the light source 21 via the light-source driving circuitry 53 and drives the scanning mechanism 22 via the scan driving circuitry 54. In response to this, light is input from the scanning light-source unit 2 to the light-guiding member 3, and the two or more areas EB1 to EB9 are displayed on the eyebox 200 through the light-guiding member 3.
With this configuration, according to where the user's eye position EP lies on the eyebox 200, the controller 5 can drive and control the scanning light-source unit 2 with the reference brightness for each pixel re-calculated such that the corresponding pixels among the two or more areas EB1 to EB9 have mutually closer brightness.
For example, the controller 5 may perform the calibrating process in the following steps (i) to (iv) as a rough calibration for a case where the unevenness of brightness is relatively small (first calibrating method) .
(First calibrating method)
(i) The controller 5 calculates in advance correction values for uneven brightness by using calibration table values for the reference brightness calculated from measured information (for example, R11 to R55) about a distribution of wavelength-based unevennesses of brightness at YZ positions of each FOV in one area (for example, EB5) of the two or more areas EB1 to EB9. Note that calibration table groups are created for each of the areas EB1 to EB9 from the brightness data as illustrated in FIGS. 7 (a) to 7 (d) , and the calibration table groups are pre-stored as the calibration information 61.
More particularly, color-based image data is acquired from one (for example, the area EB5) of the areas in the eyebox 200 with a camera located at the pupil position. In the case of red, the brightness information R11, etc., is calculated from an image of each FOV, which is divided into 25 pieces, an average brightness Lv (R) of the entire image is calculated, and the correction values CV are created as follows:
CV (R11) =R11/Lv (R)
....
CV (R55) =R55/Lv (R)
These CV values are included in the calibration information 61.
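The following is a minimal Python sketch of the correction-value calculation described above, assuming the measured brightnesses R11 to R55 are held in a 5×5 NumPy array; the function name and the array representation are assumptions made for illustration only.

```python
import numpy as np

def correction_values(brightness):
    """Basic calibration table for one color and one eyebox area.

    brightness: 5x5 array of measured FOV brightnesses (R11..R55).
    Returns the 5x5 correction values CV(Rij) = Rij / Lv(R), where Lv(R)
    is the average brightness of the entire image.
    """
    lv = brightness.mean()            # Lv(R)
    return brightness / lv            # CV(R11)..CV(R55)
```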
When, for example, the eye position EP corresponds to the area EB1, the controller 5 corrects brightness using the calibration table group CT1. When the eye position EP corresponds to the area EB5, the controller 5 corrects brightness using the calibration table group CT5. When the eye position EP corresponds to the area EB9, the controller 5 corrects brightness using the calibration table group CT9.
(ii) Within each frame time, the controller 5 estimates where the user's eye position EP lies according to an eye tracking result from the eye tracking mechanism 4 and determines which one of the two or more areas EB1 to EB9 in the eyebox 200 the eye position EP lies in.
(iii) The controller 5 identifies an area EB in the eyebox 200 that is determined to include the eye position EP through the eye tracking. The controller 5 corrects the reference brightness for each pixel of the image data using the color-based calibration tables calculated at step (i) for the identified area EB in the eyebox 200. The controller 5 may multiply the correction value with the reference brightness for each pixel of the image data and set its product as a corrected reference brightness. With this configuration, the controller 5 acquires corrected image data for the area EB. The controller 5 updates the control information 62 with the acquired corrected image data.
(iv) The controller 5 refers to the control information 62 and drives the light source 21 with the light intensity varying for each pixel according to the corrected image data included in the control information 62. This makes it possible to project an image with reduced unevenness of brightness to the eyebox 200.
The controller 5 performs steps (ii) to (iv) repeatedly for each frame time. In other words, if the area corresponding to the eye position EP changes to another area in the eyebox 200 in every frame, the controller 5 changes the color-based calibration tables used for calibration to the calibration tables for the area EB corresponding to the current eye position EP in the eyebox 200.
Because a commonly-used frame rate (for example, 60 Hz) may be faster than an eye movement (several Hz) , it is possible to correct uneven color/brightness of an image in response to the eye movement by correcting the reference brightness for each pixel of the image data for each frame.
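As a sketch only, the per-pixel correction of step (iii) could be expressed as follows in Python, assuming the reference brightness of one color for the frame is a 2D array and the 5×5 table of correction values for the area EB identified in step (ii) is expanded to pixel resolution; the names and the expansion scheme are illustrative assumptions, not the original implementation.

```python
import numpy as np

def correct_reference_brightness(ref_brightness, cv_table):
    """Step (iii): correct per-pixel reference brightness for one color.

    ref_brightness: 2D array of reference brightnesses for the frame.
    cv_table:       5x5 correction values CV for the area EB identified
                    by eye tracking in step (ii).
    The 5x5 table is expanded to pixel resolution and multiplied with the
    reference brightness, following the description above.
    """
    h, w = ref_brightness.shape
    rows, cols = cv_table.shape
    # FOV region index of each pixel row/column
    ri = (np.arange(h) * rows // h)[:, None]
    ci = (np.arange(w) * cols // w)[None, :]
    return ref_brightness * cv_table[ri, ci]
```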
Note that the controller 5 may determine the illuminance of the light source 21 so that the brightness obtained after correction with the minimum correction value among the above correction values becomes relatively high (for example, the maximum brightness) . With this configuration, the controller 5 can expand the luminance dynamic range of the light source 21 and ensure a wide range of correctable brightnesses. This is especially desirable for a scanning light-source unit capable of producing a high-brightness output.
The rough brightness calibration has been explained above. The brightness calibration with 400 divisions may be performed by using the high-precision calibration tables of 400 divisions per color as illustrated in FIGS. 9 (a) to 9 (c) .
Alternatively, the controller 5 may perform the following steps, as a calibration using foveated rendering, for a case where the unevenness of brightness is relatively large (second calibrating method) .
(Second calibrating method)
At step (i) above, it is allowable to create in advance calibration table groups having a hierarchical structure as illustrated in FIGS. 7 and 9 and store them as the calibration information 61. FIG. 9 illustrates, as such calibration table groups, the calibration table groups for the area EB5 among the areas EB1 to EB9 (see FIG. 3) in the eyebox 200. The color-based calibration table groups form further-hierarchized high-precision calibration tables as illustrated in FIGS. 9 (b) to 9 (d) .
It is possible for the controller 5 to modify the calibration tables as illustrated in FIGS. 9 (b) to 9 (d) to produce calibration tables as illustrated in FIGS. 10 (b) to 10 (d) according to the concept of foveated rendering. In other words, re-calculation is performed so that the number of color-based calibration tables decreases with increasing distance from the area of the eye position EP, and the control information 62 is generated as a result of the re-calculation.
For example, the red (R) calibration table as illustrated in FIG. 10 (a) may be created in the following steps (1) to (3) and included in the control information 62.
FIGS. 8 (b) to 8 (d) illustrate source data for creating more precise calibration table groups for the two or more areas. To allow easy creation of a calibration table group CT such as that of FIG. 7, it is desirable to calculate brightness data of (5×4) × (5×4) =20×20=400 pieces per color for one area EB in the eyebox 200. For a standard display device, the color balance is set so that synthesizing multiple colors (for example, red (R) , green (G) , and blue (B) ) produces white. The controller 5 extracts color-based grayscale values from the image data and pre-adjusts the color-based grayscale values so that the color temperature obtained by synthesizing the multiple colors falls within a range of color temperatures near white.
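Purely as an illustration of the pre-adjustment mentioned above, the following Python sketch scales per-color grayscale values by fixed white-balance gains; the function name and the numerical gains are placeholders and are not measured values from the original description.

```python
import numpy as np

def pre_adjust_grayscale(rgb, white_gain=(1.0, 0.85, 0.75)):
    """Pre-adjust color-based grayscale values toward a near-white balance.

    rgb:        array of shape (..., 3) holding R, G, B grayscale values.
    white_gain: per-color gains chosen so that driving all three colors at
                full scale yields a color temperature near white; the
                values given here are placeholders, not measured data.
    """
    gains = np.asarray(white_gain, dtype=float)
    return np.asarray(rgb, dtype=float) * gains
```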
The pre-created calibration information 61 will be explained again. Color-based image data is acquired from one (for example, the area EB5) of the areas in the eyebox 200 with a camera located at the pupil position. In the case of red, brightness is calculated from an image that is divided into twenty parts in width and twenty parts in height, i.e., 20×20=400 parts in total, an average brightness Lv (R) of the entire image is calculated, and the correction values CV are created as follows:
CV (R11 (11) ) =R11 (11) /Lv (R)
CV (R11 (12) ) =R11 (12) /Lv (R)
CV (R11 (13) ) =R11 (13) /Lv (R)
CV (R11 (14) ) =R11 (14) /Lv (R)
Thus, the four detailed correction values CV are created in the width direction of R11. The correction values are created similarly for R12 to R15.
....
CV (R15 (11) ) =R15 (11) /Lv (R)
CV (R15 (12) ) =R15 (12) /Lv (R)
CV (R15 (13) ) =R15 (13) /Lv (R)
CV (R15 (14) ) =R15 (14) /Lv (R)
Thus, the four detailed correction values CV are created for R15, which is the last one in the width direction.
In this way, four detailed correction values are created for each of the five groups R11 to R15; as a result, the width is divided into 5×4=20 parts. The height is similarly divided into 5×4=20 parts. Thus, the correction values CV (high-precision calibration tables) are created, 20×20=400 in total.
Of the correction values created in this way, the corresponding 4×4=16 correction values are stored in the calibration tables of each of the 5×5 calibration table groups. In other words, (5×4) × (5×4) =20×20=400 high-precision calibration tables are created. For example, the correction values CV (R33 (11) ) to CV (R33 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R33) (FIGS. 9 (a) and 9 (d) ) . The correction values CV (R44 (11) ) to CV (R44 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R44) (FIGS. 9 (a) and 9 (c) ) . The correction values CV (R51 (11) ) to CV (R51 (44) ) are stored as the high-precision calibration tables of the calibration table group CV (R51) (FIGS. 9 (a) and 9 (b) ) .
The correction values are created for green (G) and blue (B) in a similar manner, and the corresponding 4×4=16 correction values are stored as sixteen high-precision calibration tables in each of the 5×5 calibration tables.
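The construction of the high-precision tables described above could be sketched as follows in Python, assuming the 400 measured brightnesses of one color for one area EB are available as a 20×20 array; the grouping into a dictionary keyed by pixel-group index is an assumption made only for illustration.

```python
import numpy as np

def high_precision_tables(brightness_400):
    """High-precision calibration tables for one color and one area EB.

    brightness_400: 20x20 array of measured brightnesses, i.e. each of the
    5x5 FOV regions (R11..R55) subdivided into 4x4 parts.
    Returns a dict mapping the group index (i, j) of R(i+1)(j+1) to its
    4x4 block of correction values CV(Rij(11))..CV(Rij(44)).
    """
    lv = brightness_400.mean()            # Lv(R) of the entire image
    cv = brightness_400 / lv              # 20x20 = 400 correction values
    groups = {}
    for i in range(5):
        for j in range(5):
            groups[(i, j)] = cv[4 * i:4 * i + 4, 4 * j:4 * j + 4]
    return groups
```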
Explained below is how to re-calculate the calibration tables for the areas RG1, RG2, and RG3 in a real-time manner.
The area RG1 includes a partial area (for example, a partial area of the brightness R33 as illustrated in FIG. 6 (b) ) corresponding to the eye position EP among two or more partial areas that are divisions of an image. For the re-calculation of the calibration tables for the area RG1, there are used, for example, the correction values of the high-precision calibration tables CV (R33 (11) ) to CV (R33 (44) ) stored in the calibration table group CV (R33) (FIG. 9 (d) ) .
The area RG2 includes partial areas adjacent to the outside of the area RG1 (for example, partial areas of the brightnesses R22 to R24, R32, R34, and R42 to R44 as illustrated in FIG. 6 (b) ) . For the re-calculation of the calibration tables for the area RG2, there are used, for example, the correction values of the high-precision calibration tables CV (R44 (11) ) to CV (R44 (44) ) stored in the calibration table group CV (R44) (FIG. 9 (c) ) .
The area RG3 includes partial areas adjacent to the outside of the area  RG2 (for example, partial areas of the brightnesses R11 to R15, R21, R25, R31, R35, R41, R45, and R51 to R55 as illustrated in FIG. 6 (b) ) . For the re-calculation of the calibration tables of the area RG3, there are used, for example, the correction values of the high-precision calibration tables CV (R51 (11) ) to CV (R51 (44) ) stored in the calibration table group CV (R51) (FIG. 9 (b) ) .
(1) It is possible for the controller 5 to use, among the areas RG1, RG2, and RG3 as illustrated in FIG. 10 (a) , a calibration table with the highest precision for the area RG1, a calibration table with the second highest precision for the area RG2, and a calibration table with the lowest precision for the area RG3 according to the concept of foveated rendering, calculating an average of correction values or using a representative correction value for the lower precisions. This allows a reduction of the processing load. More particularly, the controller 5 may re-create, as the control information 62 for the red (R) calibration table groups, calibration tables such that the number of calibration tables decreases with increasing distance from the area RG1 of the eye position EP, i.e., the largest number for the area RG1, the second largest number for the area RG2, and the smallest number for the area RG3.
For example, as illustrated in FIG. 10 (d) , the controller 5 uses the high-precision calibration tables for the respective partial areas of the area RG1 without changes.
As illustrated in FIG. 10 (c) , the controller 5 calculates the low-precision calibration tables CVF (R44 (11) ) to CVF (R44 (22) ) for foveated rendering for the respective partial areas of the area RG2 by averaging the high-precision calibration tables or by taking any one representative value from the high-precision calibration tables.
As illustrated in FIG. 10 (b) , the controller 5 calculates the low-precision calibration table CVF (R51 (11) ) for foveated rendering for the partial areas of the area RG3 by averaging the high-precision calibration tables or by taking any one representative value from the high-precision calibration tables.
(2) The controller 5 uses, among the areas RG1, RG2, and RG3 as illustrated in FIG. 10 (a) , a calibration table with the highest precision for the area RG1, a calibration table  with the second highest precision for the area RG2, and a calibration table of the lowest precision for the area RG3 according to the concept of foveated rendering, thereby calculating correction values at a lower precision and generating the control information 62.
For example, the controller 5 performs re-calculation of the calibration tables for the area RG1 as follows. The controller 5 uses, as the calibration table group CV (R33) for the area RG1, the original high-precision calibration tables (sixteen calibration tables) initially stored, as illustrated in FIG. 10 (d) .
The controller 5 calculates the correction values for the respective partial areas of the pixel group PR of the area RG2 as illustrated in FIG. 10 (c) . For example, the controller 5 calculates low-precision correction values CVF (R44 (11) ) to CVF (R44 (22) ) for the respective partial areas of the pixel group PR44 using the high-precision calibration tables by the following equations:
CVF (R44 (11) ) =Lv (CV (R44 (11) ) , CV (R44 (12) ) , CV (R44 (21) ) , CV (R44 (22) ) )
CVF (R44 (12) ) =Lv (CV (R44 (13) ) , CV (R44 (14) ) , CV (R44 (23) ) , CV (R44 (24) ) )
CVF (R44 (21) ) =Lv (CV (R44 (31) ) , CV (R44 (32) ) , CV (R44 (41) ) , CV (R44 (42) ) )
CVF (R44 (22) ) =Lv (CV (R44 (33) ) , CV (R44 (34) ) , CV (R44 (43) ) , CV (R44 (44) ) )
For example, the controller 5 includes, as the calibration table group CV (R44) of the area RG2, four low-precision calibration tables CVF (R44 (11) ) to CVF(R44 (22) ) in the control information 62. Similarly, the controller 5 creates four low-precision calibration tables CVF as each of the calibration table groups CV (R22) to CV (R24) , CV (R32) , CV (R34) , CV (R42) to CV (R43) and includes them in the control information 62.
Alternatively, for more reduction of processing load, it is allowable to take a representative correction value for any one predetermined location as the low-precision correction value. For example, it is allowable to always take a value for the upper left section when the high-precision calibration table values are divided into four sections.
CVF (R44 (11) ) =CV (R44 (11) )
CVF (R44 (12) ) =CV (R44 (13) )
CVF (R44 (21) ) =CV (R44 (31) )
CVF (R44 (22) ) =CV (R44 (33) )
The controller 5 calculates the correction value for the partial areas of the pixel group PR of the area RG3 as illustrated in FIG. 10 (b) . For example, the controller 5 calculates the low-precision correction value CVF (R51 (11) ) for the partial areas of the pixel group PR51 by the following equation.
CVF (R51 (11) ) =Lv (CV (R51 (11) ) , ...., CV (R51 (44) ) )
Alternatively, for a further reduction of the processing load, it is allowable to take any one representative correction value. For example, the value may similarly be taken as follows:
CVF (R51 (11) ) =CV (R51 (11) )
For example, the controller 5 includes, as the calibration table group CV (R51) of the area RG3, one low-precision calibration table CVF (R51 (11) ) in the control information 62. The controller 5 creates one low-precision calibration table CVF as each of the calibration table groups CV (R11) to CV (R15) , CV (R21) , CV (R25) , CV (R31) , CV (R35) , CV (R41) , CV (R45) , CV (R52) to CV (R55) and includes these in the control information 62.
The green (G) calibration tables as illustrated in FIG. 7 (c) and the blue (B) calibration tables as illustrated in FIG. 7 (d) may be created in a similar manner.
In this way, because the correction values are simplified according to the eye position EP, it becomes possible to reduce the processing load and increase the processing speed.
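As a sketch only, the re-calculation for the areas RG1, RG2, and RG3 described above could be written as follows in Python. It assumes one 4×4 group of high-precision correction values per pixel group, averages 2×2 blocks for RG2 and the whole group for RG3 (or takes the upper-left value as the representative when use_average is False); the function name and the boolean switch are illustrative assumptions.

```python
import numpy as np

def foveated_table(cv_group, region, use_average=True):
    """Re-calculate one calibration table group for foveated rendering.

    cv_group: 4x4 high-precision correction values of one pixel group,
              e.g. CV(R44(11))..CV(R44(44)).
    region:   "RG1" (contains the eye position), "RG2" (adjacent to RG1),
              or "RG3" (outermost).
    use_average=False takes the upper-left value of each block as the
    representative instead of averaging, for a lower processing load.
    """
    if region == "RG1":                 # keep the sixteen tables unchanged
        return cv_group
    if region == "RG2":                 # four tables CVF(..(11))..CVF(..(22))
        out = np.empty((2, 2))
        for i in range(2):
            for j in range(2):
                block = cv_group[2 * i:2 * i + 2, 2 * j:2 * j + 2]
                out[i, j] = block.mean() if use_average else block[0, 0]
        return out
    # "RG3": a single table CVF(..(11))
    return np.array([[cv_group.mean() if use_average else cv_group[0, 0]]])
```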
At step (2) above, the controller 5 may drive the light source 21 as illustrated in FIG. 11 according to the control information 62 including the corrected image data. FIG. 11 is a diagram illustrating pulse driving for the light source 21.
The controller 5 may perform image processing for an image according to where the user's eye position EP lies on the eyebox 200, and pulse-drive the light source 21 under a driving condition according to the control information 62 including a result of the image processing. At this time, the time cycle by which the controller 5 drives one pulse of the light source 21 may be shorter than the time cycle by which the controller 5 drives one pixel with the light source 21. The controller 5 can drive, for an image, the light source 21 with two or more pulses for each pixel according to the control information 62.
For example, FIGS. 11 (a) to 11 (d) illustrate examples in which a time cycle T by which the controller 5 drives one pixel with the light source 21 is approximately five times as long as a time cycle 0.2T by which the controller 5 drives one pulse of the light source 21. The time cycle T for one pixel may be 10 ns, and the time cycle 0.2T for one pulse may be 2 ns, for example. The controller 5 can drive the light source 21 with five pulses for each pixel of the image.
The controller 5 can change the pulse intensity, the pulse width, and the number of pulses for each pixel of an image based on the reference brightness corrected in accordance with where the user's eye position EP lies on the eyebox 200.
For example, in FIGS. 11 (a) to 11 (d) , the controller 5 can change, for each pixel and based on the reference brightness, the pulse intensity of the light source 21 within a predetermined range (for example, 0.5A to A) , the pulse width of the light source 21 within a predetermined range (for example, 0.1T to 0.2T) , and the number of pulses of the light source 21 within a predetermined range (for example, 0 to 5) .
For example, when the energy corresponding to the reference brightness is 0.09 AT for the pixel to be driven, the controller 5 can drive the light source 21 under the driving conditions of pulse intensity=0.9A, pulse width=0.1T, and number of pulses=1, as illustrated in FIG. 11 (a) .
When the energy corresponding to the reference brightness is 0.14 AT for the pixel to be driven, the controller 5 can drive the light source 21 under the driving conditions of pulse intensity=0.7A, pulse width=0.2T, and number of pulses=1, as illustrated in FIG. 11 (b) .
When the energy corresponding to the reference brightness is 0.27 AT for the pixel to be driven, the controller 5 can drive the light source 21 under the driving conditions of pulse intensity=A, pulse width=0.2T, and number of pulses=1, as illustrated in FIG. 11 (c) .
When the energy corresponding to the reference brightness is 0.2 AT for the pixel to be driven, the controller 5 can drive the light source 21 under the driving conditions of pulse intensity=A, pulse width=0.2T, and number of pulses=1, as illustrated in FIG. 11 (d) .
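The mapping from a per-pixel energy to a pulse-driving condition is not fully specified above, so the following Python sketch only illustrates one possible scheme, using the ranges given in the text (intensity 0.5A to A, pulse width 0.1T to 0.2T, 0 to 5 pulses); the exhaustive search and the clipping strategy are assumptions, not the original method. With A=T=1, pulse_conditions(0.09) returns intensity 0.9A, width 0.1T, and one pulse, matching the 0.09 AT example above.

```python
def pulse_conditions(energy, A=1.0, T=1.0):
    """Pick a pulse-driving condition whose energy approximates the target.

    energy is the per-pixel energy derived from the corrected reference
    brightness, in units of A*T.  The search ranges follow the text; the
    selection strategy itself is only an illustrative assumption.
    """
    if energy <= 0:
        return 0.0, 0.0, 0                     # light source kept off
    best = None
    for n in range(1, 6):                      # number of pulses
        for width in (0.1 * T, 0.2 * T):       # pulse width
            intensity = energy / (n * width)   # peak intensity for exact energy
            if 0.5 * A <= intensity <= A:
                err = 0.0
            else:
                intensity = min(max(intensity, 0.5 * A), A)
                err = abs(intensity * n * width - energy)
            cand = (err, n, width, intensity)
            if best is None or cand[0] < best[0]:
                best = cand
    _, n, width, intensity = best
    return intensity, width, n
```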
As described above, in the present embodiment, the image display apparatus 1 detects the direction of line-of-sight EL toward the light-guiding member 3, performs image processing according to the detected direction of line-of-sight, and drives the light source 21 according to a result of the image processing. With this configuration, it is possible to make the brightness uniform between the two or more areas EB1 to EB9 and, therefore, to easily improve usability for the user.
Moreover, in the present embodiment, eye tracking enables color-based brightness calibration for each area EB according to where the user's eye position EP lies on the eyebox 200. With this configuration, it is possible to make the brightness uniform between the two or more areas EB1 to EB9 in an easy manner and reduce the unevenness of brightness between the two or more areas EB1 to EB9.
Furthermore, in the present embodiment, the concept of foveated rendering is applied to the brightness calibration such that precise correction is made for the area corresponding to the eye position EP and the precision of the correction decreases with increasing distance from the eye position EP. With this configuration, it is possible to reduce the load (for example, processing time and power consumption) of the calibrating process while performing the brightness calibration effectively.
Moreover, in the present embodiment, by using a point light source as the light source 21, the pulse driving can be performed at a time cycle shorter than the time cycle for one pixel, and the driving can be performed with a precision that varies according to the brightness calibration. In other words, precise driving is performed for the area corresponding to the eye position EP, and the driving precision decreases with increasing distance from the eye position EP. This achieves effective driving of the light source 21 according to the brightness calibration.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
[Explanations of Letters or Numerals]
1: image display apparatus
2: scanning light source unit
3: light-guiding member
4: eye tracking mechanism
5: controller
6: storage unit
21: light source
22: scanning mechanism
51: CPU
52: image controller
53: light-source driving circuitry
54: scan driving circuitry
61: calibration information
62: control information

Claims (20)

  1. An image display apparatus comprising:
    a light source;
    a light-guiding member configured to receive light generated by the light source and project mutually-corresponding and two-dimensionally arranged two or more images to an eyebox;
    an eye tracking mechanism configured to detect a direction of line-of-sight toward the light guiding member; and
    a controller configured to perform image processing according to the detected direction of line-of-sight and drive the light source according to a result of the image processing.
  2. The image display apparatus according to claim 1, wherein
    the light source includes a point light source, and
    the image display apparatus further comprises a scanning mechanism configured to guide light coming from the point light source to the light-guiding member, while scanning with the light from the point light source.
  3. The image display apparatus according to claim 2, wherein
    the point light source includes a pulsed laser.
  4. The image display apparatus according to claim 1, wherein
    the eye tracking mechanism includes
    an illuminating element configured to illuminate a retina of a user's eyeball with light; and
    an imaging element configured to acquire an image on the retina.
  5. The image display apparatus according to claim 1, wherein
    the controller is configured to
    estimate where an eye position lies on the eyebox according to the detected direction of line-of-sight,
    perform image processing according to where the eye position lies on the  eyebox and
    drive the light source.
  6. The image display apparatus according to claim 1, wherein
    the controller is configured to
    perform image processing according to where an eye position lies on the eyebox such that corresponding pixels among the two or more images have mutually closer brightnesses and
    drive the light source.
  7. The image display apparatus according to claim 1, wherein
    the light source includes a plurality of light-emitting elements corresponding to a plurality of colors, and,
    the controller drives the plurality of light-emitting elements separately.
  8. The image display apparatus according to claim 7, wherein
    the controller is configured to
    perform image processing according to where an eye position lies on the eyebox such that corresponding pixels among the two or more images have mutually closer brightnesses in a corresponding color and
    drive the light source.
  9. The image display apparatus according to claim 1, wherein
    the controller is configured to
    perform, for the image, image processing according to where an eye position lies on the eyebox and
    pulse-drive the light source under a driving condition according to a result of the image processing.
  10. The image display apparatus according to claim 9, wherein
    a time cycle by which the controller drives one pulse of the light source is shorter than a time cycle by which the controller drives one pixel with the light source.
  11. The image display apparatus according to claim 10, wherein
    the controller is capable of driving the light source with a plurality of pulses for each pixel of the image.
  12. The image display apparatus according to claim 10, wherein
    the controller is configured to
    perform image processing for the image according to where an eye position lies on the eyebox, and
    the controller is capable of changing a pulse intensity, a pulse width, and a number of pulses for each pixel of the image according to a result of the image processing.
  13. The image display apparatus according to claim 12, wherein
    the controller is configured to
    perform image processing at a first precision when displaying, among two or more areas that are divisions of the image, a first area corresponding to where the eye position lies on the eyebox,
    perform the image processing at a second precision when displaying a second area not corresponding to where the eye position lies,
    drive the light source under a first driving condition according to a result of the image processing at the first precision, and
    drive the light source under a second driving condition according to a result of the image processing at the second precision, wherein the second precision is lower than the first precision, and the second driving condition is rougher than the first driving condition.
  14. The image display apparatus according to claim 13, wherein
    the controller is configured to
    calculate a reference brightness for the first area among the two or more areas at the first precision to generate first control information,
    calculate a reference brightness for the second area at the second precision to generate second control information,
    drive the light source under the first driving condition in accordance with the first control information when displaying the first area, and
    drive the light source under the second driving condition in accordance with the second control information when displaying the second area.
  15. The image display apparatus according to claim 14, wherein
    the controller is configured to
    calculate, by using a calibration table that is divided into a first number of partial areas for the first area among the two or more areas, respective reference brightnesses for the first number of partial areas to generate the first control information,
    calculate, by using a calibration table that is divided into a second number of partial areas for the second area, respective reference brightnesses for the second number of partial areas to generate the second control information, wherein the second number is smaller than the first number,
    drive the light source under the first driving condition according to the first control information when displaying the first area, and
    drive the light source under the second driving condition according to the second control information when displaying the second area.
  16. The image display apparatus according to claim 15, wherein
    the controller is configured to
    calculate a reference brightness by multiplying a correction value with a set brightness, the correction value being for the calibration table that is divided into the first number of partial areas for the first area among the two or more areas, thereby generating the first control information, and
    calculate a reference brightness by multiplying a correction value with a set brightness, the correction value being for the calibration table that is divided into the second number of partial areas for the second area, thereby generating the second control information.
  17. The image display apparatus according to claim 16, wherein
    the controller is configured to
    create the calibration table that is divided into the first number of partial areas by using correction values for a calibration table that is divided into the first number of partial areas and pre-created for the first area among the two or more areas, the correction values being used with no change, and
    create the calibration table that is divided into the second number of partial areas by averaging correction values for a calibration table that is divided into the first number of partial areas and pre-created for the second area, wherein the second number is smaller than the first number.
  18. The image display apparatus according to claim 15, wherein
    the controller is configured to
    create the calibration table that is divided into the first number of partial areas by using correction values for a calibration table that is divided into the first number of partial areas and pre-created for the first area among the two or more areas, the correction values being used with no change, and
    create the calibration table that is divided into the second number of partial areas by using, as a representative value, a correction value selected from correction values for a calibration table that is divided into the first number of partial areas and pre-created for the second area, wherein the second number is smaller than the first number.
  19. The image display apparatus according to claim 1, wherein
    the light source includes a plurality of light-emitting elements corresponding to multiple colors and a multiplexer configured to multiplex light of multiple colors coming from the plurality of light-emitting elements, and
    the light-guiding member is configured to receive the light multiplexed by the multiplexer.
  20. The image display apparatus according to claim 1, wherein
    the light-guiding member allows light coming from an external world to pass therethrough to a user's eyeball side and allows light coming from the light source to be guided to the user's eyeball side.