Disclosure of Invention
The camera module applied to eyeball tracking, the manufacturing method thereof, and the head-mounted device provided by the embodiments of the application can solve, or at least partially solve, the above-mentioned defects and other defects in the prior art.
According to the application, a camera module applied to eyeball tracking is mounted on a head-mounted device and comprises an optical lens and a photosensitive chip. The optical lens faces the eyeball of a user of the head-mounted device and receives detection light reflected by the eyeball; the photosensitive chip is positioned on the image side of the optical lens and images the detection light projected onto it through the optical lens; and the optical center of the optical lens is deviated from the center of a photosensitive surface of the photosensitive chip.
In one embodiment of the application, a first inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball, the lens plane of the optical lens is parallel to the photosensitive surface, and the optical center of the optical lens is offset from the center of the photosensitive surface by a first distance, so that the optical center of the optical lens deviates from the center of the photosensitive surface.
In one embodiment of the present application, the first distance is determined according to the first inclination angle and the size of the photosurface.
In one embodiment of the application, the first inclination angle is between 0° and 15°, and the first distance is between 120 μm and 200 μm.
In one embodiment of the present application, the first inclination angle is 7° and the first distance is 150 μm.
In one embodiment of the application, the head-mounted device comprises a virtual reality head-mounted display device, and the camera module is mounted at a nose pad of the virtual reality head-mounted display device or at a side of an eye.
In one embodiment of the application, a second inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball, and a non-zero first included angle is formed between the lens plane of the optical lens and the photosensitive surface, so that the optical center of the optical lens deviates from the center of the photosensitive surface, wherein the photosensitive surface, the lens plane and the eyeball plane intersect in a straight line.
In one embodiment of the present application, the first included angle is determined according to the second inclination angle and parameters of the optical lens.
In one embodiment of the present application, the second inclination angle is between 15° and 75°, and the first included angle is between 3.5° and 10°.
In one embodiment of the present application, the second inclination angle is 46° and the first included angle is 3.7°.
In one embodiment of the application, a third inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball, the optical center of the optical lens is offset from the center of the photosensitive surface by a second distance, and a non-zero second included angle is formed between the lens plane of the optical lens and the photosensitive surface, so that the optical center of the optical lens deviates from the center of the photosensitive surface, wherein the photosensitive surface, the lens plane and the eyeball plane intersect in a straight line.
In one embodiment of the application, the second distance is determined according to the third inclination angle and the size of the light sensing surface, and the second included angle is determined according to the third inclination angle, the second distance and the parameters of the optical lens.
In one embodiment of the present application, the third inclination angle is between 15° and 75°, the second distance is between 50 μm and 120 μm, and the second included angle is between 1° and 3.5°.
In one embodiment of the present application, the third inclination angle is 39°, the second distance is 100 μm, and the second included angle is 3°.
In one embodiment of the application, the head-mounted device comprises an augmented reality head-mounted display device, and the camera module is mounted at a nose pad of the augmented reality head-mounted display device or at a side of an eye.
According to the manufacturing method of the camera module applied to eyeball tracking provided by the application, the camera module is mounted on a head-mounted device, and the manufacturing method comprises: machining an optical lens array with a non-zero preset inclination angle on a glass wafer based on a semiconductor manufacturing process; cutting the optical lens array to obtain single optical lenses with the preset inclination angle; and assembling each optical lens with the preset inclination angle with a photosensitive chip based on an active alignment process, to obtain a camera module in which the preset inclination angle is the included angle between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip.
In one embodiment of the application, processing the optical lens array with the non-zero preset inclination angle on the glass wafer based on the semiconductor manufacturing process comprises: processing a first optical lens array with the preset inclination angle on a first glass wafer based on the semiconductor manufacturing process; processing a second optical lens array with the preset inclination angle on a second glass wafer based on the semiconductor manufacturing process; and splicing the first optical lens array with the second optical lens array, based on the alignment of each single first optical lens with a single second optical lens, to obtain the optical lens array with the preset inclination angle.
According to a third aspect of the application, a head-mounted device comprises a device body and the camera module applied to eyeball tracking according to the first aspect, the camera module being mounted on the device body.
According to the camera module applied to eyeball tracking, the manufacturing method thereof, and the head-mounted device provided by the embodiments of the application, the optical center of the optical lens is deviated from the center of the photosensitive surface of the photosensitive chip, so that the structure of the camera module is optimized based on the shift-axis (tilt-shift) optical principle. The images captured by the camera module at macro distance and at an oblique angle are thereby improved, and the precision of eyeball identification and tracking through those images can be increased.
The matters described in this section are not intended to identify key or critical features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following specification.
Detailed Description
Exemplary embodiments of the present application will now be described with reference to the accompanying drawings, in which various details of the embodiments are included to facilitate understanding and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Descriptions of well-known functions and constructions are likewise omitted in the following description for clarity and conciseness.
In addition, the embodiments of the present application and the features of the embodiments may be combined with each other where no conflict arises. The present application will be described in detail below with reference to the accompanying drawings in conjunction with the embodiments.
Currently, in augmented reality (AR) and virtual reality (VR) head-mounted display devices, the gaze direction of the human eye is detected mainly by eye tracking technology, so that the display position of a virtual image can be adjusted according to the gaze direction and the human eye can observe the virtual image. The eye tracking technology used in AR and VR head-mounted display devices mainly projects infrared detection light onto the eyes of the device user through an infrared light source and photographs the eyes through an infrared camera module, so as to determine the visual axis direction of the eyes from the pupil centers in the image. Limited by the structure and volume of AR and VR head-mounted display devices, the infrared camera module is typically mounted at the nose pad of the device or at the side of the eyes, so that it photographs the eyes at close range and at an oblique angle. As shown in fig. 1, the infrared camera module 100 is mounted at the nose pad of an AR or VR head-mounted display device (not shown); the distance d between the infrared camera module 100 and an eyeball 200 of the device user is about 3.5 cm, and the inclination β between the infrared camera module 100 and the eyeball 200 is about 25°.
In a conventional camera module such as the infrared camera module 100, the lens plane of the optical lens is parallel to the photosensitive surface of the photosensitive chip, and the optical center of the optical lens is aligned with the center of the photosensitive surface, so the focal plane of the optical lens is parallel to the lens plane and the photosensitive surface. As shown in fig. 1, during macro and oblique shooting, the eyeball plane 211 of the eyeball 200 is not parallel to the lens plane and photosensitive surface of the optical lens of the infrared camera module 100 but inclined to them at the angle β, so the focal plane of the optical lens does not coincide with the eyeball plane 211. In this case the depth of field of the image shot by the infrared camera module 100 is too narrow, and the image is blurred by defocus. At the same time, an image shot obliquely by the infrared camera module 100 suffers perspective deformation, which distorts the image. Macro and oblique shooting therefore degrades the images shot by the infrared camera module 100 and, in turn, the accuracy of eye recognition and tracking based on those images.
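The narrow depth of field at macro distance can be made concrete with a rough thin-lens estimate. The sketch below is illustrative only: the focal length, f-number, circle of confusion, and eye-region width are assumed values, not parameters disclosed by the embodiments; only the ~3.5 cm working distance and ~25° tilt come from the description above.

```python
import math

# Assumed illustrative parameters (not from the embodiments):
f = 2.0e-3    # effective focal length: 2 mm
N = 2.4       # f-number
c = 3.0e-6    # acceptable circle of confusion: ~3 um
# From the description: working distance ~3.5 cm, tilt ~25 degrees.
u = 35e-3
beta = math.radians(25)

# Approximate thin-lens depth of field around the focus distance u.
H = f * f / (N * c)               # hyperfocal-style term f^2/(N*c)
dof_near = u * H / (H + (u - f))  # near limit of acceptable sharpness
dof_far = u * H / (H - (u - f))   # far limit of acceptable sharpness
dof = dof_far - dof_near

# Depth variation across an eyeball plane of half-width w tilted by beta:
w = 15e-3                         # assumed half-width of the imaged eye region
depth_spread = 2 * w * math.tan(beta)

print(f"depth of field ~ {dof * 1e3:.1f} mm")
print(f"depth spread across tilted eye plane ~ {depth_spread * 1e3:.1f} mm")
```

With these assumed numbers the depth spread across the tilted eye plane exceeds the depth of field severalfold, which is why parts of the eye defocus; the exact figures depend entirely on the assumed lens parameters.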
In order to solve the above-mentioned problems, an embodiment of the present application provides an image capturing module 300 applied to eye tracking.
Fig. 2 is a schematic structural diagram of a camera module 300 applied to eye tracking according to an embodiment of the present application, and fig. 3 is a schematic structural diagram of a camera module 300 according to another embodiment of the present application. The camera module 300 applied to eye tracking according to the embodiments of the present application is mounted on a head-mounted device, for example an AR or VR head-mounted display device, and the embodiments of the present application are not limited thereto. As shown in fig. 2 and 3, the camera module 300 may include an optical lens 310 and a photosensitive chip 320. The optical lens 310 faces the eyeball of the user of the head-mounted device and receives the detection light reflected by the eyeball; the photosensitive chip 320 is located at the image side of the optical lens 310 and images the detection light projected onto it through the optical lens 310; and the optical center O of the optical lens 310 is deviated from the center O' of the photosensitive surface 321 of the photosensitive chip 320. The optical center O may be deviated from the center O' by forming a preset non-zero angle between the lens plane 311 of the optical lens 310 and the photosensitive surface 321, and/or by offsetting the optical center O from the center O' by a preset distance.
In the camera module 300 applied to eyeball tracking provided by the embodiments of the application, the optical center O of the optical lens 310 is deviated from the center O' of the photosensitive surface 321 of the photosensitive chip 320, and the structure of the module is optimized based on the shift-axis optical principle. The images captured by the module at macro distance and at an oblique angle are thereby improved: the depth of field is enlarged so that a clear image can be obtained, and the perspective deformation of obliquely captured images (a trapezoidal image, smaller at the top and larger at the bottom) is mitigated, which improves the precision of eyeball identification and tracking through the images.
It should be understood that the composition and implementation of the optical lens 310 are not limited by the embodiments of the present application, and the number and type of lens elements included in the optical lens 310 may be determined according to the accuracy requirements of eye tracking and the like. For example, the optical lens 310 may include two lens elements, one a plano-convex lens and the other a meniscus (concave-convex) lens; an optical film with a specific function may further be coated on the lens elements; and the optical lens 310 may be manufactured as a wafer-level optics (WLO) element using a semiconductor manufacturing process.
It should be understood that the embodiments of the present application do not limit the type of the photosensitive chip 320, and the type of the photosensitive chip 320 may be determined according to the accuracy requirements of eye tracking and the like. For example, the photosensitive chip 320 may be a charge-coupled device (CCD) chip or a complementary metal-oxide-semiconductor (CMOS) chip.
It should be understood that the camera module 300 in the embodiments of the present application refers to a device having an imaging function for light in the infrared band, and may include, but is not limited to, a camera, a video camera, a camera module, and the like.
In some embodiments of the present application, as shown in fig. 2, in the camera module 300, a non-zero first included angle α1 is formed between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 of the photosensitive chip 320, a first inclination angle β1 is formed between the lens plane 311 of the optical lens 310 and the focal plane 312, and the photosensitive surface 321, the lens plane 311 and the focal plane 312 intersect in a straight line a. In this embodiment, it is the non-zero first included angle α1 between the lens plane 311 and the photosensitive surface 321 that deviates the optical center O of the optical lens 310 from the center O' of the photosensitive surface 321. When the camera module 300 is mounted on the head-mounted device, the first inclination angle β1 is the inclination angle between the camera module 300 and the eyeball of the user of the head-mounted device, and also between the lens plane 311 of the optical lens 310 and the eyeball plane of that eyeball; the photosensitive surface 321, the lens plane 311 and the eyeball plane then intersect in the straight line a.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled so that the non-zero first included angle α1 is formed between the lens plane 311 and the photosensitive surface 321 and the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface of the photosensitive chip 320. When the camera module 300 is mounted to the head-mounted device at the first inclination angle β1, the focal plane 312 of the optical lens 310 can be made to coincide with the eyeball plane of the user's eyeball based on the shift-axis optical principle. The depth-of-field limit of the images captured by the module at macro distance and at an oblique angle thus changes from a front-to-back relation into a left-to-right relation, the depth of field is enlarged, and an image that is clear overall, including the edge field of view, is obtained.
Alternatively, in the camera module 300, the size of the first included angle α1 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 of the photosensitive chip 320 may be determined according to the size of the first inclination angle β1 at which the camera module 300 is mounted to the head-mounted device and parameters of the optical lens 310. For example, the parameters of the optical lens 310 may include its effective focal length, its field of view, and the like, which are not limited by the embodiments of the present application.
Alternatively, the first inclination angle β1 at which the camera module 300 is mounted on the head-mounted device may be between 15° and 75°, and the first included angle α1 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 of the photosensitive chip 320 may be between 3.5° and 10°. In one example, the distance between the camera module 300 and the eyeball of the user of the head-mounted device is 13 mm, the first inclination angle β1 is 46°, and the first included angle α1 is 3.7°; as shown in fig. 2, the focal plane 312 of the optical lens 310 then forms an angle of 46° with the lens plane 311 and coincides with the eyeball plane of the eyeball of the user of the head-mounted device.
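The dependence of the first included angle on the inclination angle can be sketched with the Scheimpflug approximation tan α ≈ m·tan β, where m is the image-to-object magnification. Both this relation and the magnification value below are illustrative assumptions, not figures disclosed by the embodiment, but they reproduce the 46°/3.7° pairing quoted above.

```python
import math

def sensor_tilt_deg(beta_deg: float, magnification: float) -> float:
    """Lens/sensor included angle needed so the focal plane coincides with an
    object plane inclined by beta, using the common Scheimpflug approximation
    tan(alpha) = m * tan(beta)."""
    return math.degrees(math.atan(magnification * math.tan(math.radians(beta_deg))))

# Assumed magnification for a ~13 mm working distance with a short-focal-length
# wafer-level lens (illustrative value, not from the embodiment):
m = 0.0625
alpha1 = sensor_tilt_deg(46.0, m)
print(f"first included angle ~ {alpha1:.1f} deg")  # ~3.7 deg
```

The same function, with the same caveats, would give the included angle for any other mounting inclination in the 15° to 75° range.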
Alternatively, since the camera module 300 shown in fig. 2 optimizes its structure by changing the angle between the optical lens 310 and the photosensitive chip 320, and thereby improves the images it captures at macro distance and at an oblique angle, the module has a relatively small volume and can be applied to a head-mounted device with a small installation space, such as augmented reality glasses. In one example, the camera module 300 shown in fig. 2 may be mounted at a nose pad of augmented reality glasses; in another example, it may be mounted at the side of an eye of augmented reality glasses.
In other embodiments of the present application, as shown in fig. 3, in the camera module 300, the lens plane 311 of the optical lens 310 is parallel to the photosensitive surface 321 of the photosensitive chip 320, the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 by a first distance d1, and the lens plane 311 of the optical lens 310 is parallel to the focal plane 312. In this embodiment, it is the offset of the optical center O from the center O' by the first distance d1, with the lens plane 311 parallel to the photosensitive surface 321, that deviates the optical center O of the optical lens 310 from the center O' of the photosensitive surface 321. When the camera module 300 is mounted on the head-mounted device, the module may form a very small second inclination angle β2 (not shown) with the eyeball of the user of the head-mounted device; the second inclination angle β2 is also the inclination angle between the lens plane 311 of the optical lens 310 and the eyeball plane of that eyeball, and the photosensitive surface 321 and the lens plane 311 are then almost parallel to the eyeball plane.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled such that the optical center O of the optical lens 310 is offset by the first distance d1 from the center O' of the photosensitive surface 321, thereby obtaining a camera module 300 in which the optical center O of the optical lens 310 deviates from the center O' of the photosensitive surface 321 of the photosensitive chip 320. When the camera module 300 is mounted on the head-mounted device at the very small second inclination angle β2, the module can, based on the shift-axis optical principle, operate in a state of almost no oblique shooting, and the entire eyeball of the user of the head-mounted device can be imaged on the photosensitive surface 321 of the photosensitive chip 320. The perspective deformation of obliquely captured images can thus be mitigated and the trapezoidal distortion (larger at the bottom, smaller at the top) avoided; meanwhile, the camera module 300 does not need to be assembled obliquely, which reduces the assembly difficulty of the camera module 300.
Alternatively, in the camera module 300, the size of the first distance d1 by which the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 may be determined according to the size of the second inclination angle β2 at which the camera module 300 is mounted to the head-mounted device and the size of the photosensitive surface 321 of the photosensitive chip 320. The offset of the optical center O from the center O' may be oriented along the long side or the short side of the photosensitive surface 321.
Alternatively, the second inclination angle β2 at which the camera module 300 is mounted on the head-mounted device may be between 0° and 15°, and the first distance d1 between the optical center O of the optical lens 310 and the center O' of the photosensitive surface 321 of the photosensitive chip 320 may be between 120 μm and 200 μm. In one example, the second inclination angle β2 is 7°, the first distance d1 is 150 μm, and the center of the eyeball of the user of the head-mounted device is imaged at the center of the photosensitive surface 321 of the photosensitive chip 320.
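One way to relate these quantities: for a thin lens, shifting the optical center laterally by d ≈ f·tan β re-centers an object seen at the off-axis angle β onto the middle of the sensor. Both the relation and the effective focal length below are illustrative assumptions, not values disclosed by the embodiment; the assumed focal length is chosen so that the 7° tilt reproduces the 150 μm shift quoted above.

```python
import math

def recenter_shift_um(beta_deg: float, efl_mm: float) -> float:
    """Lateral shift of the optical center (in micrometres) that re-centers an
    object viewed at off-axis angle beta: d = f * tan(beta), thin-lens model."""
    return efl_mm * 1e3 * math.tan(math.radians(beta_deg))

# Assumed effective focal length of ~1.22 mm (illustrative, not from the text):
d1 = recenter_shift_um(7.0, 1.22)
print(f"first distance d1 ~ {d1:.0f} um")  # ~150 um
```

Under the same assumptions, shifts for other small inclination angles in the 0° to 15° range follow directly from the same formula.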
Alternatively, since the camera module 300 shown in fig. 3 optimizes its structure by changing the relative position of the optical lens 310 and the photosensitive chip 320, and thereby improves the images it captures at macro distance and at an oblique angle, the module has a relatively large volume and can be applied to a head-mounted device with a large installation space, such as a virtual reality helmet. In one example, the camera module 300 shown in fig. 3 may be mounted at a nose pad of a virtual reality helmet; in another example, it may be mounted at the side of an eye of a virtual reality helmet.
In still other embodiments of the present application, as shown in fig. 2 and 3, in the camera module 300, the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320 by a second distance d2, a non-zero second included angle α2 is formed between the lens plane 311 of the optical lens 310 and the photosensitive surface 321, a third inclination angle β3 is formed between the lens plane 311 of the optical lens 310 and the focal plane 312, and the photosensitive surface 321, the lens plane 311 and the focal plane 312 intersect in a straight line a. In this embodiment, the offset of the optical center O by the second distance d2, together with the non-zero second included angle α2 between the lens plane 311 and the photosensitive surface 321, deviates the optical center O of the optical lens 310 from the center O' of the photosensitive surface 321. When the camera module 300 is mounted on the head-mounted device, the third inclination angle β3 is the inclination angle between the camera module 300 and the eyeball of the user of the head-mounted device, and also between the lens plane 311 of the optical lens 310 and the eyeball plane of that eyeball; the photosensitive surface 321, the lens plane 311 and the eyeball plane then intersect in the straight line a.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled such that the optical center O of the optical lens 310 is offset by the second distance d2 from the center O' of the photosensitive surface 321 and a non-zero second included angle α2 is formed between the lens plane 311 and the photosensitive surface 321, thereby obtaining a camera module 300 in which the optical center O of the optical lens 310 deviates from the center O' of the photosensitive surface of the photosensitive chip 320. When the camera module 300 is mounted on the head-mounted device at the third inclination angle β3, the shift-axis optical principle allows both the third inclination angle β3 and the second included angle α2 between the lens plane 311 and the photosensitive surface 321 to be reduced while keeping the focal plane 312 of the optical lens 310 coincident with the eyeball plane of the eyeball of the user. The depth of field of the images captured at macro distance and at an oblique angle can thus be enlarged and an image that is clear overall, including the edge field of view, obtained; the perspective deformation of obliquely captured images can be mitigated and the trapezoidal distortion (larger at the bottom, smaller at the top) avoided; and because the second distance d2 compensates part of the second included angle α2, the difficulty of designing the optical lens 310 can be reduced.
Alternatively, in the camera module 300, the size of the second distance d2 by which the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 may be determined according to the size of the third inclination angle β3 at which the camera module 300 is mounted to the head-mounted device and the size of the photosensitive surface 321 of the photosensitive chip 320. The second included angle α2 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 may be determined according to the third inclination angle β3 and parameters of the optical lens 310.
Alternatively, the third inclination angle β3 at which the camera module 300 is mounted to the head-mounted device may be between 15° and 75°, the second distance d2 between the optical center O of the optical lens 310 and the center O' of the photosensitive surface 321 of the photosensitive chip 320 may be between 50 μm and 120 μm, and the second included angle α2 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 may be between 1° and 3.5°. In one example, the distance between the camera module 300 and the eyeball of the user of the head-mounted device is 13 mm, the third inclination angle β3 is 39°, the second distance d2 is 100 μm, and the second included angle α2 is 3°.
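Under the same Scheimpflug approximation used above (tan α ≈ m·tan β, with an assumed magnification m; both the relation and the value of m are illustrative, not disclosed values), the smaller mounting angle of this combined embodiment already brings the required sensor tilt inside the 1° to 3.5° window quoted for the second included angle, leaving only a small residual for the 100 μm shift to absorb.

```python
import math

def scheimpflug_tilt_deg(beta_deg: float, m: float) -> float:
    """Sensor tilt for an object plane inclined by beta, using the
    approximation tan(alpha) = m * tan(beta) (m = image/object magnification)."""
    return math.degrees(math.atan(m * math.tan(math.radians(beta_deg))))

m = 0.0625                               # assumed magnification, as before
alpha2 = scheimpflug_tilt_deg(39.0, m)   # third inclination angle: 39 deg
print(f"required tilt ~ {alpha2:.1f} deg")  # close to the 3 deg of the embodiment
```

This is a sketch of the trade-off only; a real design would determine the shift and tilt jointly from the full lens prescription rather than from this single approximation.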
Alternatively, the camera module 300 obtained by combining fig. 2 and 3 optimizes its structure by changing both the relative position and the angle between the optical lens 310 and the photosensitive chip 320, and thereby improves the images it captures at macro distance and at an oblique angle; its volume therefore remains relatively small, and it can be applied to a head-mounted device with a small installation space, such as augmented reality glasses. In one example, this camera module 300 may be mounted at a nose pad of augmented reality glasses; in another example, it may be mounted at the side of an eye of augmented reality glasses.
The embodiments of the application also provide a manufacturing method 1000 of a camera module applied to eyeball tracking. Fig. 4 shows a flowchart of the method 1000 according to an embodiment of the present application. The camera module manufactured by the method 1000 is mounted on a head-mounted device, such as an AR or VR head-mounted display device, and the embodiments of the present application are not limited thereto. As shown in fig. 4, the method 1000 may include the following steps:
S410, processing an optical lens array with a non-zero preset inclination angle on a glass wafer based on a semiconductor manufacturing process.
S420, cutting the optical lens array to obtain single optical lenses with the non-zero preset inclination angle.
S430, for each optical lens with a non-zero preset inclination angle, assembling with a photosensitive chip based on an active alignment process to obtain an image pickup module with the non-zero preset inclination angle between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip.
According to the manufacturing method 1000 of the camera module applied to eyeball tracking provided by the embodiments of the application, an optical lens with a non-zero preset inclination angle is processed directly on a glass wafer using the wafer-level-optics semiconductor manufacturing process, and the optical lens and the photosensitive chip are assembled based on an active alignment (AA) process. The included angle between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip is therefore consistent across assembled modules, so the directional consistency of the final camera module products can be guaranteed while meeting the requirement for small-size, rapid mass production.
It should be understood that the steps shown in method 1000 are not exclusive and that other steps may be performed before, after, or between any of the steps shown. Further, some of the illustrated steps may be performed simultaneously, or may be performed in a different order than shown in fig. 4.
In some embodiments of the present application, fig. 5 shows a flowchart of processing an optical lens array on a glass wafer according to an embodiment of the present application, and fig. 6 shows the process of processing and dicing glass wafers to obtain a single optical lens. As shown in fig. 5 and 6, step S410 of the present embodiment may include the following steps:
S510, processing a first optical lens array with a non-zero preset inclination angle on a first glass wafer based on a semiconductor manufacturing process.
S520, processing a second optical lens array with a non-zero preset inclination angle on the second glass wafer based on the semiconductor manufacturing process.
And S530, based on the alignment of the single first optical lens and the single second optical lens, splicing the first optical lens array and the second optical lens array to obtain the optical lens array with the non-zero preset inclination angle.
The optical lens of the present embodiment is a two-piece (2P) lens assembly, each piece of which has the non-zero preset inclination angle α. First, a first optical lens array wafer-1 with the inclination angle α is processed on one glass wafer using the wafer-level-optics semiconductor manufacturing process. Then, a second optical lens array wafer-2 with the inclination angle α is processed on another glass wafer in the same way. Since the first optical lens array wafer-1 is parallel to the second optical lens array wafer-2, and within each array every single first optical lens and single second optical lens is inclined at the angle α, the two arrays are spliced based on aligning each single first optical lens with a single second optical lens, which yields the wafer-level optical lens array wafer-3. Finally, the wafer-level optical lens array wafer-3 is diced to obtain single optical lenses with the inclination angle α.
The embodiments of the application also provide a head-mounted device. The head-mounted device of the embodiments may include a device body and the camera module 300 applied to eye tracking of the above embodiments, the camera module 300 being mounted on the device body. Alternatively, the head-mounted device may be an AR or VR head-mounted display device, such as AR glasses or a VR helmet, to which the embodiments of the present application are not limited.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.