
CN119310735A - Camera module for eye tracking, manufacturing method thereof, and head-mounted device - Google Patents


Info

Publication number
CN119310735A
CN119310735A (application CN202310867725.XA)
Authority
CN
China
Prior art keywords
optical lens
camera module
center
optical
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310867725.XA
Other languages
Chinese (zh)
Inventor
张扣文
杜亚凤
周婷
茅武超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yuyao Sunny Optical Intelligence Technology Co Ltd
Original Assignee
Yuyao Sunny Optical Intelligence Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yuyao Sunny Optical Intelligence Technology Co Ltd filed Critical Yuyao Sunny Optical Intelligence Technology Co Ltd
Priority to CN202310867725.XA
Publication of CN119310735A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 - Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract


The present application discloses a camera module for eye tracking, a manufacturing method thereof, and a head-mounted device. The camera module is mounted on a head-mounted device and includes: an optical lens that faces the eyeball of the user of the head-mounted device and receives detection light reflected by the eyeball; and a photosensitive chip, located on the image side of the optical lens, that images the detection light projected onto it through the optical lens, wherein the optical center of the optical lens is offset from the center of the photosensitive surface of the photosensitive chip. By optimizing the structure of the camera module, the present application improves the quality of images captured at a macro distance and at a tilt, thereby improving the accuracy of eye recognition and tracking based on those images.

Description

Camera module for eye tracking, manufacturing method thereof, and head-mounted device
Technical Field
Embodiments of the present application relate to the technical field of smart devices, and in particular to a camera module for eye tracking, a manufacturing method thereof, and a head-mounted device.
Background
Currently, the eye tracking technology used in Augmented Reality (AR) and Virtual Reality (VR) devices mainly projects infrared detection light onto the eyeball of the device's user through an infrared light source and photographs the eyeball with an infrared camera module, so that the direction of the visual axis can be determined from the pupil center in the captured image. Limited by the structure and volume of AR and VR head-mounted display devices, the infrared camera module is typically mounted at the nose pad of the device or at the side of the eye, so that it photographs the eye at a macro distance and at a tilt.
However, macro and tilted shooting degrades the images captured by the infrared camera module, which in turn reduces the accuracy of eye recognition and tracking based on those images.
Disclosure of Invention
The camera module for eye tracking, the manufacturing method thereof, and the head-mounted device provided by the embodiments of the present application can solve, at least in part, the above-mentioned defects in the prior art.
According to a first aspect of the present application, a camera module for eye tracking is mounted on a head-mounted device and comprises an optical lens and a photosensitive chip. The optical lens faces the eyeball of a user of the head-mounted device and receives detection light reflected by the eyeball; the photosensitive chip is located on the image side of the optical lens and images the detection light projected onto it through the optical lens. The optical center of the optical lens is offset from the center of the photosensitive surface of the photosensitive chip.
In one embodiment of the present application, a first inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball; the lens plane is parallel to the photosensitive surface, and the optical center of the optical lens is offset from the center of the photosensitive surface by a first distance.
In one embodiment of the present application, the first distance is determined according to the first inclination angle and the size of the photosensitive surface.
In one embodiment of the present application, the first inclination angle is between 0° and 15°, and the first distance is between 120 µm and 200 µm.
In one embodiment of the present application, the first inclination angle is 7° and the first distance is 150 µm.
In one embodiment of the application, the head-mounted device comprises a virtual reality head-mounted display device, and the camera module is mounted at a nose pad of the virtual reality head-mounted display device or at a side edge of an eye.
In one embodiment of the present application, a second inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball, and a non-zero first included angle is formed between the lens plane and the photosensitive surface, so that the optical center of the optical lens is offset from the center of the photosensitive surface; the photosensitive surface, the lens plane, and the eyeball plane intersect in a straight line.
In one embodiment of the present application, the first included angle is determined according to the second inclination angle and the parameters of the optical lens.
In one embodiment of the present application, the second inclination angle is between 15° and 75°, and the first included angle is between 3.5° and 10°.
In one embodiment of the present application, the second inclination angle is 46° and the first included angle is 3.7°.
In one embodiment of the present application, a third inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball; the optical center of the optical lens is offset from the center of the photosensitive surface by a second distance, and a non-zero second included angle is formed between the lens plane and the photosensitive surface, so that the optical center of the optical lens is offset from the center of the photosensitive surface, wherein the photosensitive surface, the lens plane, and the eyeball plane intersect in a straight line.
In one embodiment of the present application, the second distance is determined according to the third inclination angle and the size of the photosensitive surface, and the second included angle is determined according to the third inclination angle, the second distance, and the parameters of the optical lens.
In one embodiment of the present application, the third inclination angle is between 15° and 75°, the second distance is between 50 µm and 120 µm, and the second included angle is between 1° and 3.5°.
In one embodiment of the present application, the third inclination angle is 39°, the second distance is 100 µm, and the second included angle is 3°.
In one embodiment of the present application, the head-mounted device comprises an augmented reality head-mounted display device, and the camera module is mounted at the nose pad of the augmented reality head-mounted display device or at the side of the eye.
According to a second aspect of the present application, a manufacturing method is provided for a camera module for eye tracking that is mounted on a head-mounted device. The manufacturing method comprises: machining an optical lens array with a non-zero preset inclination angle on a glass wafer based on a semiconductor manufacturing process; dicing the optical lens array to obtain individual optical lenses with the preset inclination angle; and assembling each optical lens with a photosensitive chip based on an active alignment process to obtain a camera module in which the preset inclination angle is the included angle between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip, so that the optical center of the optical lens is offset from the center of the photosensitive surface.
In one embodiment of the present application, machining the optical lens array with the non-zero preset inclination angle on the glass wafer based on the semiconductor manufacturing process comprises: machining a first optical lens array on a first glass wafer based on the semiconductor manufacturing process; machining a second optical lens array on a second glass wafer based on the semiconductor manufacturing process; and joining the first optical lens array to the second optical lens array with each first optical lens aligned to a corresponding second optical lens, thereby obtaining the optical lens array with the preset inclination angle.
According to a third aspect of the present application, a head-mounted device comprises a device body and the camera module for eye tracking according to the first aspect, wherein the camera module is mounted on the device body.
According to the camera module for eye tracking, the manufacturing method thereof, and the head-mounted device provided by the embodiments of the present application, the structure of the camera module is optimized based on the tilt-shift optical principle by offsetting the optical center of the optical lens from the center of the photosensitive surface of the photosensitive chip. This improves the quality of images captured by the camera module at a macro distance and at a tilt, and thereby improves the accuracy of eye recognition and tracking based on those images.
The matters described in this section are not intended to identify key or critical features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings. The drawings are included to provide a better understanding of the present application and are not to be construed as limiting the application. Wherein:
Fig. 1 is a schematic diagram of the positional relationship between a camera module for eye tracking and the human eye in a conventional head-mounted display device;
Fig. 2 is a schematic structural view of a camera module for eye tracking according to an embodiment of the present application;
Fig. 3 is a schematic structural view of a camera module according to another embodiment of the present application;
Fig. 4 is a flowchart of a manufacturing method of a camera module for eye tracking according to an embodiment of the present application;
Fig. 5 is a flowchart of machining an optical lens array on a glass wafer according to an embodiment of the present application;
Fig. 6 is a process flow diagram of machining and dicing a glass wafer to obtain individual optical lenses according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will now be described with reference to the accompanying drawings, in which various details of embodiments of the present application are included to facilitate understanding, and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In addition, the embodiments of the present application and the features of the embodiments may be combined with each other as long as they do not conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with the embodiments.
Currently, in Augmented Reality (AR) and Virtual Reality (VR) head-mounted display devices, the gaze direction of the human eye is detected mainly by eye tracking, so that the display position of a virtual image can be adjusted according to that gaze direction and the human eye can observe the virtual image. The eye tracking technology used in AR and VR head-mounted display devices mainly projects infrared detection light onto the eyes of the device's user through an infrared light source and photographs the eyes with an infrared camera module, so that the direction of the visual axis can be determined from the pupil center in the captured image. Limited by the structure and volume of AR and VR head-mounted display devices, the infrared camera module is typically mounted at the nose pad of the device or at the side of the eye, so that it photographs the eye at a macro distance and at a tilt. As shown in Fig. 1, the infrared camera module 100 is mounted at the nose pad of an AR or VR head-mounted display device (not shown); the distance d between the infrared camera module 100 and the eyeball 200 of the device's user is about 3.5 cm, and the inclination angle β between the infrared camera module 100 and the eyeball 200 is about 25°.
In a conventional camera module such as the infrared camera module 100, the lens plane of the optical lens is parallel to the photosensitive surface of the photosensitive chip and the optical center of the optical lens is aligned with the center of the photosensitive surface, so the focal plane of the optical lens is parallel to both the lens plane and the photosensitive surface. As shown in Fig. 1, when shooting at a macro distance and at a tilt, the eyeball plane 211 of the eyeball 200 is not parallel to the lens plane and the photosensitive surface of the infrared camera module 100; the two differ by the inclination angle β, so the focal plane of the optical lens does not coincide with the eyeball plane 211. As a result, the images captured by the infrared camera module 100 suffer from an overly narrow depth of field and are blurred by defocus. At the same time, images captured obliquely by the infrared camera module 100 suffer from perspective distortion, which deforms the image. Macro and tilted shooting therefore degrades the images captured by the infrared camera module 100, which in turn reduces the accuracy of eye recognition and tracking based on those images.
In order to solve the above-mentioned problems, an embodiment of the present application provides a camera module 300 for eye tracking.
Fig. 2 is a schematic structural diagram of a camera module 300 for eye tracking according to an embodiment of the present application. Fig. 3 is a schematic structural diagram of a camera module 300 according to another embodiment of the present application. The camera module 300 for eye tracking according to the embodiments of the present application is mounted on a head-mounted device, for example an AR or VR head-mounted display device, although the embodiments of the present application are not limited thereto. As shown in Figs. 2 and 3, the camera module 300 may include an optical lens 310 and a photosensitive chip 320. The optical lens 310 faces the eyeball of the user of the head-mounted device and receives the detection light reflected by the eyeball; the photosensitive chip 320 is located on the image side of the optical lens 310 and images the detection light projected onto it through the optical lens 310, wherein the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320. The optical center O may be offset from the center O' by setting a preset non-zero angle between the lens plane 311 of the optical lens 310 and the photosensitive surface 321, by laterally offsetting the optical center O from the center O' by a preset distance, or by both.
In the camera module 300 for eye tracking provided by the embodiments of the present application, the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320, and the structure of the camera module 300 is optimized based on the tilt-shift optical principle. This enlarges the depth of field of images captured at a macro distance and at a tilt, yielding sharp images; it also mitigates the perspective distortion of obliquely captured images, avoiding the keystone effect in which the near side of the subject appears larger than the far side. The accuracy of eye recognition and tracking based on these images is thereby improved.
It should be understood that the composition and implementation of the optical lens 310 are not limited by the embodiments of the present application; the number and type of lens elements included in the optical lens 310 may be determined according to the accuracy requirements of eye tracking and other factors. For example, the optical lens 310 may include two lens elements, one plano-convex and the other meniscus, and the lens elements may further be coated with an optical film having a specific function. The optical lens 310 may be manufactured using a semiconductor manufacturing process as a wafer-level optical element (Wafer-Level Optics, WLO for short).
It should be understood that the embodiments of the present application do not limit the type of the photosensitive chip 320; the type may be determined according to the accuracy requirements of eye tracking and other factors. For example, the photosensitive chip 320 may be a charge-coupled device (CCD) chip or a complementary metal-oxide-semiconductor (CMOS) chip.
It should be understood that the camera module 300 in the embodiments of the present application refers to a device capable of imaging light in the infrared band, and may include, but is not limited to, a camera, a video camera, a camera module, and the like.
In some embodiments of the present application, as shown in Fig. 2, in the camera module 300 a non-zero first included angle α1 is formed between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 of the photosensitive chip 320, a first inclination angle β1 is formed between the lens plane 311 and the focal plane 312, and the photosensitive surface 321, the lens plane 311, and the focal plane 312 intersect in a straight line A. In this embodiment, the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 by virtue of the non-zero first included angle α1 between the lens plane 311 and the photosensitive surface 321. When the camera module 300 is mounted on the head-mounted device, the first inclination angle β1 is the inclination angle between the camera module 300 and the eyeball of the user of the head-mounted device, and is also the inclination angle between the lens plane 311 and the eyeball plane; in this case, the photosensitive surface 321, the lens plane 311, and the eyeball plane intersect in the straight line A.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled so that the non-zero first included angle α1 is formed between the lens plane 311 and the photosensitive surface 321 and the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface of the photosensitive chip 320. When the camera module 300 is mounted on the head-mounted device at the first inclination angle β1, the focal plane 312 of the optical lens 310 can be made to coincide with the eyeball plane of the user's eyeball based on the tilt-shift optical principle. The plane of sharp focus is thereby rotated to lie along the tilted subject rather than across it, which enlarges the depth of field of images captured at a macro distance and at a tilt and yields an image that is sharp overall, including at the edge of the field of view.
Alternatively, in the image capturing module 300, the size of the first angle α1 between the lens plane 311 of the optical lens 310 and the light sensing surface 321 of the light sensing chip 320 may be determined according to the size of the first tilt angle β1 of the image capturing module 300 mounted to the head-mounted device and parameters of the optical lens 310. For example, the parameters of the optical lens 310 may include an effective focal length of the optical lens 310, a viewing angle of the optical lens 310, and the like, which is not limited by the embodiment of the present application.
Alternatively, the first inclination angle β1 of the camera module 300 mounted on the head-mounted device may be between 15 ° and 75 °, and the first angle α1 between the lens plane 311 of the optical lens 310 and the light sensing surface 321 of the light sensing chip 320 may be between 3.5 ° and 10 °. In an alternative example, the distance between the camera module 300 and the eyeball of the user of the head-mounted device is 13mm, the first inclination angle β1 at which the camera module 300 is mounted on the head-mounted device is 46 °, the first included angle α1 between the lens plane 311 of the optical lens 310 and the light sensing surface 321 of the light sensing chip 320 is 3.7 °, and as shown in fig. 2, the focal plane 312 of the optical lens 310 of the camera module 300 forms an angle of 46 ° with the lens plane 311 and coincides with the eyeball plane of the eyeball of the user of the head-mounted device.
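The geometry behind this embodiment is the Scheimpflug condition: when the sensor plane, the lens plane, and the plane of sharp focus intersect in one straight line, a tilted subject can be rendered sharp across the frame. Under a thin-lens model, the tilt ψ of the plane of sharp focus relative to the lens plane follows tan ψ = (u/v)·tan α, where u and v are the conjugate object and image distances and α is the lens-to-sensor angle. The sketch below checks the 46°/3.7° example above; the focal length used is an assumption chosen for illustration, not a value stated in the patent.

```python
import math

def sharp_focus_tilt_deg(u_mm: float, f_mm: float, alpha_deg: float) -> float:
    """Tilt (deg) of the plane of sharp focus relative to the lens plane,
    for a thin lens of focal length f_mm, an on-axis object at u_mm, and
    a sensor inclined alpha_deg to the lens plane (Scheimpflug geometry)."""
    v_mm = u_mm * f_mm / (u_mm - f_mm)  # thin-lens conjugate: 1/u + 1/v = 1/f
    tan_psi = (u_mm / v_mm) * math.tan(math.radians(alpha_deg))
    return math.degrees(math.atan(tan_psi))

# Eye at 13 mm and a 3.7 deg lens-to-sensor angle, as in the example above;
# the 0.76 mm focal length is an ASSUMED value for illustration only.
psi = sharp_focus_tilt_deg(u_mm=13.0, f_mm=0.76, alpha_deg=3.7)
print(f"plane of sharp focus tilted {psi:.1f} deg from the lens plane")
```

With these assumed numbers the plane of sharp focus comes out near 46°, illustrating how a lens-to-sensor angle of only a few degrees can rotate the focus plane onto a steeply tilted eyeball plane.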
Alternatively, since the image capturing module 300 shown in fig. 2 optimizes the structure of the image capturing module 300 by changing the angle between the optical lens 310 and the photosensitive chip 320, thereby optimizing the effect of the image capturing module 300 for macro-and tilt-photographing images, the image capturing module 300 shown in fig. 2 has a relatively small volume and can be applied to a head-mounted device with a small installation space, such as augmented reality glasses, etc. In an alternative example, the camera module 300 shown in fig. 2 may be mounted at a nose pad of augmented reality glasses. In another alternative example, the camera module 300 shown in fig. 2 may be mounted to the side of the eye of an augmented reality glasses.
In other embodiments of the present application, as shown in Fig. 3, in the camera module 300 the lens plane 311 of the optical lens 310 is parallel to the photosensitive surface 321 of the photosensitive chip 320, the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 by a first distance d1, and the lens plane 311 is parallel to the focal plane 312. In this embodiment, with the lens plane 311 kept parallel to the photosensitive surface 321, the optical center O is offset from the center O' by the lateral offset d1 alone. When the camera module 300 is mounted on the head-mounted device, the camera module 300 may form a very small second inclination angle β2 (not shown) with the eyeball of the user of the head-mounted device; the second inclination angle β2 is also the inclination angle between the lens plane 311 and the eyeball plane, and the photosensitive surface 321 and the lens plane 311 are then almost parallel to the eyeball plane.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled so that the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 by the first distance d1, thereby obtaining a camera module 300 whose optical center O is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320. When the camera module 300 is mounted on the head-mounted device at the very small second inclination angle β2, the camera module 300 operates almost as if it were not shooting obliquely at all, based on the tilt-shift optical principle, and the entire eyeball of the user can be imaged on the photosensitive surface 321. This mitigates the perspective distortion of obliquely captured images and avoids the keystone effect; at the same time, the camera module 300 does not need to be assembled at a tilt, which reduces the assembly difficulty of the camera module 300.
Optionally, in the camera module 300, the magnitude of the first distance d1 by which the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 may be determined according to the second inclination angle β2 at which the camera module 300 is mounted on the head-mounted device and the size of the photosensitive surface 321. The offset d1 may be applied along either the long side or the short side of the photosensitive surface 321.
Optionally, the second inclination angle β2 at which the camera module 300 is mounted on the head-mounted device may be between 0° and 15°, and the first distance d1 between the optical center O of the optical lens 310 and the center O' of the photosensitive surface 321 may be between 120 µm and 200 µm. In an alternative example, the second inclination angle β2 is 7°, the first distance d1 is 150 µm, and the center of the eyeball of the user of the head-mounted device is imaged at the center of the photosensitive surface 321.
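In a parallel-plane configuration like this, the required offset follows directly from the chief ray through the optical center: a field angle β2 maps to a lateral displacement of roughly v·tan β2 on the sensor, where v is the lens-to-sensor (image) distance. The sketch below uses the 7° tilt from the example above; the image distance is an assumed value for illustration, since the patent does not state it.

```python
import math

def center_offset_um(image_dist_mm: float, tilt_deg: float) -> float:
    """Lateral offset between the lens optical center and the sensor center
    that re-centers a subject seen at tilt_deg, with the lens plane kept
    parallel to the photosensitive surface (shift-lens principle)."""
    return image_dist_mm * math.tan(math.radians(tilt_deg)) * 1000.0  # mm -> um

# 7 deg module tilt from the example above; the 1.22 mm image distance is
# an ASSUMED value for illustration only.
d1 = center_offset_um(image_dist_mm=1.22, tilt_deg=7.0)
print(f"required center offset: {d1:.0f} um")
```

With the assumed image distance, the computed offset lands near the 150 µm of the example, showing how a sub-millimeter sensor shift can stand in for a 7° mechanical tilt.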
Optionally, since the camera module 300 shown in Fig. 3 optimizes its structure by changing the relative position of the optical lens 310 and the photosensitive chip 320, thereby improving the quality of images captured at a macro distance and at a tilt, it has a relatively large volume and is suited to head-mounted devices with a large installation space, such as a virtual reality helmet. In an alternative example, the camera module 300 shown in Fig. 3 may be mounted at the nose pad of a virtual reality helmet. In another alternative example, the camera module 300 shown in Fig. 3 may be mounted at the side of the eye of a virtual reality helmet.
In still other embodiments of the present application, as shown in Figs. 2 and 3, in the camera module 300 the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320 by a second distance d2, a non-zero second included angle α2 is formed between the lens plane 311 of the optical lens 310 and the photosensitive surface 321, a third inclination angle β3 is formed between the lens plane 311 and the focal plane 312, and the photosensitive surface 321, the lens plane 311, and the focal plane 312 intersect in a straight line A. In this embodiment, the optical center O is offset from the center O' by both the lateral offset d2 and the non-zero second included angle α2. When the camera module 300 is mounted on the head-mounted device, the third inclination angle β3 is the inclination angle between the camera module 300 and the eyeball of the user, and is also the inclination angle between the lens plane 311 and the eyeball plane; in this case, the photosensitive surface 321, the lens plane 311, and the eyeball plane intersect in the straight line A.
In this embodiment, the optical lens 310 and the photosensitive chip 320 are assembled such that the optical center O of the optical lens 310 is offset by the second distance d2 from the center O' of the photosensitive surface 321 and a non-zero second included angle α2 is formed between the lens plane 311 and the photosensitive surface 321, yielding a camera module 300 in which the optical center O of the optical lens 310 deviates from the center O' of the photosensitive surface of the photosensitive chip 320. When the camera module 300 is mounted on the head-mounted device at the third inclination angle β3, the tilt-shift optical principle allows the focal plane 312 of the optical lens 310 to coincide with the eyeball plane of the user's eyeball. This enlarges the depth of field of images captured by the camera module 300 at close range and at an oblique angle, so that the entire image, including the edge field of view, is sharp; it also mitigates the perspective distortion of obliquely captured images and avoids severe deformation. In addition, because the second distance d2 compensates part of the second included angle α2, the design difficulty of the optical lens 310 is reduced.
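The geometric condition invoked above — that the image (photosensitive) plane, the lens plane, and the plane of sharp focus meet in a single line A — is the Scheimpflug condition, and it can be checked numerically with a thin-lens model. The sketch below is illustrative only and is not part of the patent; the focal length and sensor tilt are arbitrary assumptions. Working in a 2D y–z section, it images points of a tilted sensor line through an untilted thin lens at z = 0 and verifies that the conjugate object points are collinear and that their line passes through the point where the sensor line meets the lens plane.

```python
# Numerical check of the Scheimpflug condition with a 2D thin-lens model.
# All values are illustrative assumptions, not parameters from the patent.

def conjugate(y_i, z_i, f):
    """Object point conjugate to image point (y_i, z_i) behind a thin
    lens at z = 0 with focal length f (1/u + 1/v = 1/f, m = -v/u)."""
    u = 1.0 / (1.0 / f - 1.0 / z_i)   # object distance in front of the lens
    y_o = -y_i * u / z_i              # transverse magnification m = -v/u
    return y_o, -u                    # object side is z < 0

f = 5.0                 # focal length (assumed)
v0, tilt = 10.0, 0.2    # tilted sensor line: z = v0 + tilt * y

# Image three points on the tilted sensor and find their object conjugates.
pts = [conjugate(y, v0 + tilt * y, f) for y in (-2.0, 0.0, 2.0)]
(y0, z0), (y1, z1), (y2, z2) = pts

# The conjugates must be collinear: the plane of sharp focus is a plane.
cross = (y1 - y0) * (z2 - z0) - (z1 - z0) * (y2 - y0)
assert abs(cross) < 1e-9

# Scheimpflug line A (a point in this 2D section): where the sensor line
# meets the lens plane z = 0. The focus line must pass through it too.
yA = -v0 / tilt
slope = (z1 - z0) / (y1 - y0)
zA = z0 + slope * (yA - y0)
assert abs(zA) < 1e-9
print("Scheimpflug condition verified: all three planes meet at y =", yA)
```

With the assumed numbers the three planes meet at y = -50 in the lens plane; changing `tilt` moves the intersection line but the three-plane condition continues to hold, which is what lets the focal plane 312 be laid onto the obliquely viewed eyeball plane.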
Alternatively, in the camera module 300, the second distance d2 by which the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320 may be determined according to the third inclination angle β3 at which the camera module 300 is mounted to the head-mounted device and the size of the photosensitive surface 321. The second included angle α2 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 may be determined according to the third inclination angle β3 and the parameters of the optical lens 310.
Alternatively, the third inclination angle β3 at which the camera module 300 is mounted to the head-mounted device may be between 15° and 75°, the second distance d2 by which the optical center O of the optical lens 310 is offset from the center O' of the photosensitive surface 321 of the photosensitive chip 320 may be between 50 μm and 120 μm, and the second included angle α2 between the lens plane 311 of the optical lens 310 and the photosensitive surface 321 may be between 1° and 3.5°. In an alternative example, the distance between the camera module 300 and the eyeball of the user of the head-mounted device is 13 mm, the third inclination angle β3 is 39°, the second distance d2 is 100 μm, and the second included angle α2 is 3°.
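The ranges just stated can be summarized as a small design-envelope check. The sketch below is not from the patent — the class name and field names are invented for illustration — but it encodes the stated ranges and verifies that the worked example (β3 = 39°, d2 = 100 μm, α2 = 3°) falls inside them.

```python
from dataclasses import dataclass

@dataclass
class TiltShiftParams:
    """Design envelope for the fig. 2/3 camera module, per the ranges
    stated in the text (names are illustrative, not from the patent)."""
    beta3_deg: float   # third inclination angle: 15 to 75 degrees
    d2_um: float       # optical-center offset d2: 50 to 120 micrometers
    alpha2_deg: float  # second included angle: 1 to 3.5 degrees

    def in_envelope(self) -> bool:
        return (15.0 <= self.beta3_deg <= 75.0
                and 50.0 <= self.d2_um <= 120.0
                and 1.0 <= self.alpha2_deg <= 3.5)

# Worked example from the text: 13 mm eye relief, beta3 = 39 deg,
# d2 = 100 um, alpha2 = 3 deg.
example = TiltShiftParams(beta3_deg=39.0, d2_um=100.0, alpha2_deg=3.0)
assert example.in_envelope()
print("example parameters fall within the stated design envelope")
```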
Alternatively, because the camera module 300 obtained by combining fig. 2 and 3 optimizes its structure by changing both the position and the angle between the optical lens 310 and the photosensitive chip 320, thereby improving its macro and oblique image capture, its volume remains relatively small, and it can be applied to head-mounted devices with limited installation space, such as augmented reality glasses. In an alternative example, the camera module 300 may be mounted at the nose pad of augmented reality glasses. In another alternative example, the camera module 300 may be mounted to the side of the user's eye on augmented reality glasses.
The embodiment of the present application also provides a method 1000 of manufacturing a camera module applied to eye tracking. Fig. 4 shows a flowchart of the method 1000 according to an embodiment of the present application. A camera module manufactured by the method 1000 is mounted on a head-mounted device, such as an AR or VR head-mounted display device, although the embodiments of the present application are not limited thereto. As shown in fig. 4, the method 1000 may include the following steps:
S410, processing an optical lens array having a non-zero preset inclination angle on a glass wafer based on a semiconductor manufacturing process.
S420, cutting the optical lens array to obtain single optical lenses having the preset inclination angle.
S430, assembling each optical lens having the preset inclination angle with a photosensitive chip based on an active alignment process, to obtain a camera module in which the preset inclination angle is formed between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip, so that the optical center of the optical lens deviates from the center of the photosensitive surface.
According to the method 1000 of manufacturing a camera module applied to eye tracking provided by the embodiment of the present application, an optical lens having a non-zero preset inclination angle is processed directly on a glass wafer using a wafer-level-optics semiconductor manufacturing process. After each optical lens is assembled with a photosensitive chip based on an active alignment (AA) process, the included angle between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip is consistent across assembled camera modules. This ensures the directional consistency of the final camera module products while meeting the requirements of small-size, rapid mass production.
It should be understood that the steps shown in method 1000 are not exhaustive, and that other steps may be performed before, after, or between any of the illustrated steps. Furthermore, some of the illustrated steps may be performed simultaneously, or in a different order than shown in fig. 4.
In some embodiments of the present application, fig. 5 shows a flowchart of processing an optical lens array on glass wafers according to an embodiment of the present application, and fig. 6 shows the process of processing and dicing glass wafers to obtain single optical lenses. As shown in figs. 5 and 6, step S410 of the present embodiment may include the following steps:
S510, processing a first optical lens array having the non-zero preset inclination angle on a first glass wafer based on a semiconductor manufacturing process.
S520, processing a second optical lens array having the non-zero preset inclination angle on a second glass wafer based on the semiconductor manufacturing process.
S530, splicing the first optical lens array and the second optical lens array based on the alignment of each single first optical lens with its corresponding single second optical lens, to obtain the optical lens array having the non-zero preset inclination angle.
The optical lens of the present embodiment is a two-piece (2P) lens comprising two lens elements, each having the non-zero preset inclination angle α. First, a first optical lens array, wafer-1, having the inclination angle α is processed on one glass wafer using the wafer-level-optics semiconductor manufacturing process. Then, a second optical lens array, wafer-2, having the inclination angle α is processed on another glass wafer in the same way. Because wafer-1 is parallel to wafer-2, and the individual first and second optical lenses in the two arrays are inclined at the same angle α, the two arrays can be spliced by aligning each single first optical lens with its corresponding single second optical lens, yielding the wafer-level optical lens array wafer-3. Finally, wafer-3 is diced to obtain single optical lenses having the inclination angle α.
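The wafer-level flow above (S510, S520, S530, then dicing) can be sketched schematically. The model below is illustrative only — the function and field names are invented, and the real process is a semiconductor manufacturing flow, not software — but it captures why every singulated lens inherits the same preset tilt, which is what gives the finished modules their angular consistency.

```python
# Schematic model of the wafer-level flow (S510-S530 plus dicing).
# Names are invented for illustration; values are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class LensArray:
    tilt_deg: float   # preset non-zero tilt angle alpha of every lens site
    sites: int        # number of lens sites on the wafer

def process_wafer(tilt_deg: float, sites: int) -> LensArray:
    """S510/S520: form an array of tilted lens surfaces on one glass wafer."""
    return LensArray(tilt_deg, sites)

def splice(a: LensArray, b: LensArray) -> LensArray:
    """S530: stack the two wafers with each first lens aligned to its
    matching second lens; tilts and site counts must agree for a 2P lens."""
    assert a.tilt_deg == b.tilt_deg and a.sites == b.sites
    return LensArray(a.tilt_deg, a.sites)

def dice(stack: LensArray) -> List[float]:
    """Cut the spliced wafer stack into single 2P lenses; each singulated
    lens carries the same preset tilt angle."""
    return [stack.tilt_deg] * stack.sites

wafer1 = process_wafer(tilt_deg=3.0, sites=500)   # alpha = 3 deg, assumed
wafer2 = process_wafer(tilt_deg=3.0, sites=500)
lenses = dice(splice(wafer1, wafer2))
assert len(set(lenses)) == 1   # all lenses share one tilt angle
print(f"{len(lenses)} lenses, uniform tilt {lenses[0]} deg")
```

Each lens would then be joined to a photosensitive chip by active alignment (S430), which is where the per-module included angle is finally fixed.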
The embodiment of the present application also provides a head-mounted device. The head-mounted device of the embodiment of the present application may include a device body and the camera module 300 applied to eye tracking in the above embodiments, the camera module 300 being mounted to the device body. Alternatively, the head-mounted device may be an AR or VR head-mounted display device, such as AR glasses or a VR helmet, to which the embodiments of the present application are not limited.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (10)

1. A camera module applied to eye tracking, wherein the camera module is mounted on a head-mounted device, the camera module comprising: an optical lens facing an eyeball of a user of the head-mounted device and receiving detection light reflected by the eyeball; and a photosensitive chip located at the image side of the optical lens and imaging the detection light projected onto the photosensitive chip through the optical lens; wherein the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip.

2. The camera module according to claim 1, wherein a first inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball; and in a case where the lens plane of the optical lens is parallel to the photosensitive surface, the optical center of the optical lens is offset from the center of the photosensitive surface by a first distance, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip.

3. The camera module according to claim 2, wherein the first distance is determined according to the first inclination angle and the size of the photosensitive surface.

4. The camera module according to claim 1, wherein a second inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball; a non-zero first included angle is formed between the lens plane of the optical lens and the photosensitive surface, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip; and the photosensitive surface, the lens plane and the eyeball plane intersect in a straight line.

5. The camera module according to claim 4, wherein the first included angle is determined according to the second inclination angle and parameters of the optical lens.

6. The camera module according to claim 1, wherein a third inclination angle is formed between the lens plane of the optical lens and the eyeball plane of the eyeball; the optical center of the optical lens is offset from the center of the photosensitive surface by a second distance, and a non-zero second included angle is formed between the lens plane of the optical lens and the photosensitive surface, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip; and the photosensitive surface, the lens plane and the eyeball plane intersect in a straight line.

7. The camera module according to claim 6, wherein the second distance is determined according to the third inclination angle and the size of the photosensitive surface; and the second included angle is determined according to the third inclination angle, the second distance and parameters of the optical lens.

8. A method of manufacturing a camera module applied to eye tracking, wherein the camera module is mounted on a head-mounted device, the method comprising: processing an optical lens array having a non-zero preset inclination angle on a glass wafer based on a semiconductor manufacturing process; cutting the optical lens array to obtain single optical lenses having the preset inclination angle; and assembling each optical lens having the preset inclination angle with a photosensitive chip based on an active alignment process, to obtain a camera module in which the preset inclination angle is formed between the lens plane of the optical lens and the photosensitive surface of the photosensitive chip, so that the optical center of the optical lens deviates from the center of the photosensitive surface of the photosensitive chip.

9. The method according to claim 8, wherein processing the optical lens array having the non-zero preset inclination angle on the glass wafer based on the semiconductor manufacturing process comprises: processing a first optical lens array having the preset inclination angle on a first glass wafer based on the semiconductor manufacturing process; processing a second optical lens array having the preset inclination angle on a second glass wafer based on the semiconductor manufacturing process; and splicing the first optical lens array and the second optical lens array based on the alignment of a single first optical lens with a single second optical lens, to obtain the optical lens array having the preset inclination angle.

10. A head-mounted device, comprising: a device body; and the camera module applied to eye tracking according to any one of claims 1 to 7, the camera module being mounted to the device body.
CN202310867725.XA 2023-07-13 2023-07-13 Camera module for eye tracking, manufacturing method thereof, and head-mounted device Pending CN119310735A (en)

Publications (1)

Publication Number Publication Date
CN119310735A true CN119310735A (en) 2025-01-14

