
CN117369197B - 3D structured light module, imaging system and method for obtaining a depth map of a target object - Google Patents


Info

Publication number
CN117369197B
CN117369197B (application CN202311659789.7A)
Authority
CN
China
Prior art keywords
event
light
camera
subunit
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311659789.7A
Other languages
Chinese (zh)
Other versions
CN117369197A (en)
Inventor
李安
张莉萍
陈驰
鲁亚东
张思曼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Minshi Microelectronics Co.,Ltd.
Original Assignee
Shenzhen Angstrong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Angstrong Technology Co ltd
Priority to CN202311659789.7A
Publication of CN117369197A
Application granted
Publication of CN117369197B
Legal status: Active


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/42Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B27/4205Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a 3D structured light module, an imaging system, and a method for obtaining a depth map of a target object. The 3D structured light module comprises: a structured light projector for projecting structured light onto the surface of an object to be measured, so that the surface of the object to be measured forms a reflected light beam; an event camera for sensing the reflected light beam and generating event data; and a module circuit board, to which the structured light projector and the event camera are respectively connected and on which they are arranged at intervals. The 3D structured light module of the technical scheme of the invention avoids the loss of measurement accuracy caused by illumination that is too strong or too weak, and enables accurate 3D measurement of high-speed objects and of objects against backgrounds with a large span of illumination intensity.

Description

3D structured light module, imaging system and method for obtaining a depth map of a target object
Technical Field
The invention relates to the technical field of 3D imaging equipment, in particular to a 3D structured light module, an imaging system, and a method for obtaining a depth map of a target object.
Background
With the development of artificial intelligence, three-dimensional imaging technology plays an increasingly important role and is widely applied in fields such as robotic vacuum cleaners, industrial robots, and augmented reality. Existing three-dimensional imaging technologies include 3D structured light, indirect/direct time-of-flight, and others.
The traditional 3D structured light module mainly comprises a structured light projector and an infrared receiving camera. The structured light projector projects structured light carrying characteristic information onto an object; the infrared receiving camera collects the structured light spots reflected by the object surface to form an infrared speckle pattern, and depth is computed using the triangulation principle. However, because the image obtained at each exposure of the infrared receiving camera contains the entire scene, the amount of data to be processed per frame is large, which directly limits the camera's image output frame rate; the traditional 3D structured light module therefore suffers performance degradation in fast-moving or dynamic scenes. It is also limited by the photosensitive dynamic range of the infrared receiving camera (usually about 65 dB): in scenes where the illumination is too strong or too weak, the camera overexposes or underexposes and the measurement accuracy suffers. In addition, the performance of traditional structured light in long-distance measurement is limited by the optical power of the structured light projector and the sensitivity of the infrared receiving camera.
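By way of illustration, the triangulation principle mentioned above relates the disparity of a speckle between the captured and reference images to depth as Z = f·B/d. A minimal sketch follows; the focal length, baseline, and disparity values are assumed for illustration, not taken from this patent:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Structured-light triangulation: Z = f * B / d.

    disparity_px -- horizontal shift of a speckle between the captured
                    image and the reference image, in pixels
    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- projector-to-camera baseline in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Assumed values: f = 580 px, B = 50 mm, d = 5.8 px  ->  Z = 5.0 m
print(depth_from_disparity(5.8, 580.0, 0.05))
```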
Disclosure of Invention
The invention mainly aims to provide a 3D structured light module, so as to solve the technical problem that the measurement accuracy of existing 3D structured light modules is affected by illumination intensity.
In order to achieve the above object, the present invention provides a 3D structured light module, comprising:
a structured light projector for projecting structured light onto the surface of an object to be measured, so that the surface of the object to be measured forms a reflected light beam;
an event camera for sensing the reflected light beam and generating event data;
and a module circuit board, with the structured light projector and the event camera respectively connected to the module circuit board and arranged on it at intervals.
Optionally, the structured light projector comprises:
a projector circuit board, arranged independently or sharing a common substrate with the module circuit board;
a light source for emitting light, the light source being electrically connected to the projector circuit board;
an optical element comprising a substrate and a microstructure surface, the microstructure surface being arranged on the surface of the substrate;
and a projector structure support provided with a light-emitting channel, the optical element being connected in the light-emitting channel, with the light passing through the optical element and exiting from the light-emitting channel.
Optionally, the microstructure surface is a diffractive microstructure surface; the projector structure support further includes at least one collimating lens disposed in the light-emitting channel, and the light passes through the collimating lens and the optical element in sequence before exiting the light-emitting channel.
Optionally, the microstructure surface comprises a diffractive microstructure surface and a collimating microstructure surface; the collimating microstructure surface and the diffractive microstructure surface are respectively located on two sides or the same side of the substrate, or are integrated on the same side of the substrate to form an integrated microstructure surface;
or, the microstructure surface is a metasurface structure comprising a first microstructure and a second microstructure, the first microstructure and the second microstructure being respectively arranged on the two surfaces of the substrate.
Optionally, the light source is a polarized laser light source for emitting, at intervals, first polarized laser light and second polarized laser light with different polarization directions; or the light source is a multispectral laser light source for emitting, at intervals, at least two laser beams of different wavelengths.
Optionally, when the light source is a polarized laser light source, the microstructure surface comprises a first subunit and a second subunit connected in a spliced manner, the first subunit being adapted to the first polarized laser light and the second subunit to the second polarized laser light;
when the light source is a multispectral laser light source, the microstructure surface comprises a plurality of subunits connected in a spliced manner, with lasers of different wavelengths corresponding to different subunits;
or, when the microstructure surface is a metasurface structure and the light source is a multispectral laser light source, laser light of one wavelength is adapted to the first microstructure and laser light of another wavelength to the second microstructure.
Optionally, the diffractive microstructure surface has a periodic phase distribution obtained through vector design;
or, the diffractive microstructure surface has a random phase distribution obtained through scalar design.
Optionally, the event camera includes:
a camera circuit board, arranged independently or sharing a common substrate with the module circuit board;
a dynamic vision sensor electrically connected with the camera circuit board and used for receiving the reflected light beams and generating event data;
a camera structure support arranged on the camera circuit board and provided with a light-entry channel, into which the speckle structured light reflected by the object to be measured enters;
and an infrared narrowband filter or an infrared metasurface lens;
when the event camera comprises the infrared narrowband filter, a first camera mounting boss is arranged in the light-entry channel and the infrared narrowband filter is arranged on the first camera mounting boss; the camera structure support is provided with at least one imaging lens arranged in the light-entry channel, or the camera structure support is provided with a metasurface lens, with a second camera mounting boss provided in the light-entry channel and the metasurface lens arranged on the second camera mounting boss;
when the event camera comprises the infrared metasurface lens, the light-entry channel is provided with a third camera mounting boss, and the infrared metasurface lens is arranged on the third camera mounting boss.
Optionally, two event cameras are provided, and/or the 3D structured light module further comprises an RGB camera for outputting a color image;
or, the 3D structured light module further comprises an RGB camera, with the event camera and the RGB camera fused to form an integrated receiving camera whose dynamic vision sensor comprises a plurality of pixel units, each pixel unit comprising four pixel sub-units: a red photosensitive pixel, a green photosensitive pixel, a blue photosensitive pixel, and an infrared photosensitive pixel.
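To illustrate the fused pixel layout just described, the sketch below splits a repeating 2×2 red/green/blue/infrared cell into per-channel planes. The specific cell arrangement is an assumption for illustration; the patent states only that each pixel unit contains the four photosensitive sub-pixels:

```python
import numpy as np

def split_rgbi_mosaic(raw: np.ndarray):
    """Split a raw frame whose repeating 2x2 cell is assumed to be
    [[R, G], [B, IR]] into four half-resolution channel planes."""
    r  = raw[0::2, 0::2]
    g  = raw[0::2, 1::2]
    b  = raw[1::2, 0::2]
    ir = raw[1::2, 1::2]   # infrared plane used for speckle sensing
    return r, g, b, ir

raw = np.arange(16, dtype=np.uint16).reshape(4, 4)  # toy 4x4 frame
r, g, b, ir = split_rgbi_mosaic(raw)
```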
The invention also proposes a 3D imaging system comprising a main chip and a 3D structured light module;
wherein the 3D structured light module is the 3D structured light module described above, and the main chip includes:
a control module in control connection with the 3D structured light module, used for receiving imaging information and controlling the structured light projector and the event camera to be started according to the imaging information;
a computing module in communication connection with the event camera, so as to receive the event data generated by the event camera, convert the event data into event images, and transmit the event images (a sketch of this conversion follows this list);
a storage module in communication connection with the computing module, used for receiving and storing the event images transmitted by the computing module; the storage module also stores calibration parameters and a reference map;
and an image data processing module connected with the storage module, used for calibrating the event images according to the calibration parameters to form a calibrated event map, and for comparing the calibrated event map with the reference map to obtain a depth map.
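As a sketch of the event-to-image conversion performed by the computing module; the (x, y, timestamp, polarity) event tuple and the windowing scheme are common event-camera conventions assumed here, not details fixed by the patent:

```python
import numpy as np

def events_to_frame(events, height, width, t_start, t_window):
    """Accumulate (x, y, t, polarity) events whose timestamps fall in
    [t_start, t_start + t_window) into a 2D event image. A pixel where
    the light intensity rose (e.g. a projected speckle landed on it)
    accumulates +1 per positive event, and -1 per negative event."""
    frame = np.zeros((height, width), dtype=np.int16)
    for x, y, t, p in events:
        if t_start <= t < t_start + t_window:
            frame[y, x] += 1 if p > 0 else -1
    return frame

# Toy usage: three events inside a 1 ms window on a 480x640 sensor
evts = [(10, 20, 0.0002, 1), (11, 20, 0.0004, 1), (10, 21, 0.0015, -1)]
img = events_to_frame(evts, 480, 640, t_start=0.0, t_window=0.001)
```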
The invention also proposes a method of obtaining a depth map of a target object, applied to a 3D imaging system as described above, the method comprising (an end-to-end sketch follows these steps):
calibrating the event camera and the structured light projector respectively, to obtain the internal and external parameters of the event camera and the internal and external parameters of the structured light projector;
the control module of the main chip first starts the event camera and then starts the structured light projector, with the time difference between the two starts greater than the initialization response time of the event camera;
the structured light projector projects structured light with a characteristic structure onto the object to be measured; the light spots reflected by the object are incident on the event camera, which captures the speckle and generates event data;
and the main chip obtains the depth image of the target object according to the event data, the internal and external parameters of the structured light projector, the internal and external parameters of the event camera, and the reference map.
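An end-to-end sketch of these steps, under stated assumptions: calibration and rectification have already been applied, matching is one-dimensional block matching along the rectified epipolar row, and the window size, disparity search range, and cost function are illustrative choices rather than the patent's method:

```python
import numpy as np

def match_disparity(event_img, reference_img, y, x, win=7, max_d=64):
    """Disparity of the patch centered at (y, x): minimize the sum of
    absolute differences against the reference image along the same
    (rectified) row."""
    h = win // 2
    patch = event_img[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_d, best_cost = 0, np.inf
    for d in range(max_d):
        if x - d - h < 0:
            break
        cand = reference_img[y - h:y + h + 1,
                             x - d - h:x - d + h + 1].astype(np.int32)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def depth_map(event_img, reference_img, focal_px, baseline_m, win=7):
    """Dense depth from a calibrated event image and the stored
    reference image via triangulation Z = f * B / d.
    Unoptimized brute-force loops, for illustration only."""
    h, w = event_img.shape
    z = np.zeros((h, w), dtype=np.float32)
    half = win // 2
    for y in range(half, h - half):
        for x in range(half, w - half):
            d = match_disparity(event_img, reference_img, y, x, win)
            z[y, x] = focal_px * baseline_m / d if d > 0 else 0.0
    return z
```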
According to the technical scheme of the invention, the structured light projector projects structured light spots onto the surface of the object to be measured, and the event camera senses the light beams reflected by the object and generates event data. After the projected structured light spots irradiate the surface of the object to be measured, the reflected light beams incident on the event camera cause light intensity changes, so the event camera can acquire event data even of a static scene; this avoids the loss of measurement accuracy caused by the projected light intensity being too strong or too weak. On the one hand, the sensor of the event camera has a very high dynamic range (more than 120 dB): it can capture weak light intensity changes on the surface of an object, operates under a wide range of illumination conditions, and outputs accurate event data under strong background light or low illumination, which benefits three-dimensional measurement under complex illumination. On the other hand, the event camera generates event data only when the light intensity changes and does not need to capture whole images at a fixed frame rate, which greatly reduces information redundancy and increases event data generation efficiency and data transmission and processing speed. The event camera can capture light intensity changes with microsecond time resolution, several orders of magnitude faster than a traditional camera (whose frame period is typically on the order of milliseconds), and is therefore suitable for three-dimensional measurement of fast-moving scenes. In yet another aspect, the event camera images in cooperation with the structured light projector rather than estimating the object's depth from the time differences of the event data alone, so a depth image with less noise can be obtained. In addition, the high dynamic range and high temporal resolution with which the event camera acquires data enable accurate 3D measurement of high-speed objects and of objects against backgrounds with a large span of illumination intensity.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic structural diagram of a 3D structured light module according to the present invention;
FIG. 2 is a schematic diagram of a structured light projector according to the present invention;
FIG. 3 is a schematic view of another structured light projector according to the present invention;
FIG. 4 is a schematic view of a structured speckle pattern projected by the structured light projector according to the present invention;
FIG. 5 is a schematic view of another structured speckle pattern projected by the structured light projector according to the present invention;
FIG. 6 is a schematic view of still another structured speckle pattern projected by the structured light projector according to the present invention;
FIG. 7 is a schematic view of another structured light spot pattern projected by the structured light projector according to the present invention;
FIG. 8 is a schematic illustration of yet another structured light projector provided by the present invention;
FIG. 9 is a schematic structural diagram of an event camera according to the present invention;
FIG. 10 is a schematic structural diagram of another 3D structured light module according to the present invention;
FIG. 11 is a schematic diagram of another 3D structured light module according to the present invention;
FIG. 12 is a schematic diagram of a partial pixel distribution of an RGB & event camera according to the present invention;
FIG. 13 is a schematic structural diagram of yet another 3D structured light module according to the present invention;
FIG. 14 is a schematic diagram of the constituent modules of a 3D imaging system according to the present invention;
FIG. 15 is a flowchart of a method for obtaining a depth map of a target object with the 3D imaging system according to the present invention.
Reference numerals (as used in the description below): 1000 3D structured light module; 110 structured light projector; 111 projector circuit board; 112 light source; 113 optical element; 114 projector structure support; 115 collimating lens; 118 first subunit; 119 second subunit; 120 event camera; 130 module circuit board; 1121 first polarized laser light; 1122 second polarized laser light; 1123 first infrared speckle pattern; 1124 second infrared speckle pattern; 1125 first wavelength laser light; 1126 second wavelength laser light; 210 control module; 211 calculation module; 212 storage module; 213 image data processing module.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that all directional indicators (such as up, down, left, right, front, rear, etc.) in the embodiments of the present invention are merely used to explain the relative positional relationship, movement, etc. between the components in a particular posture (as shown in the drawings); if the particular posture changes, the directional indicator changes accordingly.
In the present invention, unless specifically stated and limited otherwise, the terms "connected," "affixed," and the like are to be construed broadly, and for example, "affixed" may be a fixed connection, a removable connection, or an integral body; can be mechanically or electrically connected; either directly or indirectly, through intermediaries, or both, may be in communication with each other or in interaction with each other, unless expressly defined otherwise. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In addition, if there is a description of "first", "second", etc. in the embodiments of the present invention, the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In addition, the meaning of "and/or" as it appears throughout includes three parallel schemes, for example "A and/or B", including the A scheme, or the B scheme, or the scheme where A and B are satisfied simultaneously. In addition, the technical solutions of the embodiments may be combined with each other, but it is necessary to base that the technical solutions can be realized by those skilled in the art, and when the technical solutions are contradictory or cannot be realized, the combination of the technical solutions should be considered to be absent and not within the scope of protection claimed in the present invention.
The invention provides a 3D structure optical module.
In the prior art, the image obtained at each exposure of the infrared receiving camera of a conventional 3D structured light module contains the entire scene, so the amount of data to be processed per frame is large, which directly limits the camera's image output frame rate; the conventional module therefore suffers performance degradation in fast-moving or dynamic scenes. It is also limited by the photosensitive dynamic range of the infrared receiving camera (usually about 65 dB): when applied in scenes where the projection illumination intensity of the structured light projector is too strong or too weak, overexposure or underexposure of the infrared receiving camera affects the measurement accuracy. In addition, limited by the optical power of the structured light projector and the sensitivity of the infrared receiving camera, conventional structured light also performs unsatisfactorily in long-distance measurement.
In order to solve the above technical problems, the technical scheme of the invention adopts a structured light projector to project structured light onto the surface of the object to be measured, so that the surface forms a reflected light beam; an event camera senses the reflected light beam and generates event data; and the structured light projector and the event camera are respectively connected to a module circuit board and arranged on it at intervals. Specifically, the structured light projector projects structured light spots onto the surface of the object to be measured, and the event camera senses the light beams reflected by the object and generates event data. After the projected structured light spots irradiate the surface of the object, the reflected light beams incident on the event camera cause light intensity changes, so the event camera can acquire event data even of a static scene, avoiding the loss of measurement accuracy caused by the projected light intensity being too strong or too weak. On the one hand, the sensor of the event camera has a very high dynamic range (more than 120 dB): it can capture weak light intensity changes on the surface of an object, operates under a wide range of illumination conditions, and outputs accurate event data under strong background light or low illumination, which benefits three-dimensional measurement under complex illumination. On the other hand, the event camera generates event data only when the light intensity changes and does not need to capture whole images at a fixed frame rate, which greatly reduces information redundancy and increases event data generation efficiency and data transmission and processing speed. The event camera can capture light intensity changes with microsecond time resolution, several orders of magnitude faster than a traditional camera (whose frame period is typically on the order of milliseconds), and is suitable for three-dimensional measurement of fast-moving scenes. In yet another aspect, the event camera images in cooperation with the structured light projector rather than estimating the object's depth from the time differences of the event data alone, so a depth image with less noise can be obtained. In addition, the high dynamic range and high temporal resolution with which the event camera acquires data enable accurate 3D measurement of high-speed objects and of objects against backgrounds with a large span of illumination intensity.
The above technical scheme is described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a 3D structured light module according to the present invention. The 3D structured light module 1000 includes: a structured light projector 110, an event camera 120, and a module circuit board 130. The structured light projector 110 is configured to project structured light onto the surface of an object to be measured, so that the surface forms a reflected light beam; the event camera 120 is used for sensing the reflected light beam and generating event data; and the structured light projector 110 and the event camera 120 are respectively connected with the module circuit board 130 and arranged on it at intervals.
In a specific implementation, the structured light projector 110 and the event camera 120 may be separate modules electrically connected to the circuit board through connectors, or may be fixed directly to the circuit board, i.e., mounted on a board shared with other electronic devices; the circuit board also integrates the corresponding chip, an image transmission interface, a power transmission interface, and the like. The structured light projector 110 projects a structured light spot onto the surface of the object to be measured, and the event camera 120 senses the light beam reflected by the object and generates event data. After the structured light spot projected by the structured light projector 110 irradiates the surface of the object, the light beam reflected by the object is incident on the event camera 120 and causes a light intensity change, so the event camera 120 can acquire event data even of a static scene. The event camera 120 generates event data only when a light intensity change occurs, without capturing the entire image at a fixed frame rate, so event data generation efficiency can be greatly improved.
Example 1
In the embodiment of the present invention, fig. 2 shows a schematic structural diagram of the structured light projector provided by the present invention. Specifically, the structured light projector 110 includes: a projector circuit board 111, a light source 112, an optical element 113, and a projector structure support 114. The projector circuit board 111 is arranged alone, or the projector circuit board 111 and the module circuit board 130 are arranged together on one substrate; the light source 112 is used for emitting light and is electrically connected to the projector circuit board 111; the optical element 113 includes a substrate and a microstructure surface, the microstructure surface being arranged on the surface of the substrate; the projector structure support 114 is provided with a light-emitting channel, the optical element 113 is connected in the light-emitting channel, and light passes through the optical element 113 and exits from the light-emitting channel.
Specifically, the light source 112 is a laser light source electrically connected to the projector circuit board 111. The projector circuit board 111, which supplies power to the laser light source, may be a rigid-flex board, a ceramic substrate, a PCB, or the like. The laser light source may be a VCSEL (vertical-cavity surface-emitting laser), an HCSEL (horizontal-cavity surface-emitting laser), an EEL (edge-emitting laser), an LED, etc., composed of a plurality of pseudo-randomly distributed light emitting holes, and the emission wavelength may be an infrared wavelength such as 850 nm, 905 nm, or 940 nm. The light source 112 emits light toward the light-emitting channel, and the light passes through the optical element 113 and exits toward the surface of the object to be measured. The substrate of the optical element 113 is typically glass, resin, or quartz, and the microstructure surface may be formed on the substrate surface by etching or imprinting.
In this embodiment, the microstructure surface of the substrate surface of the optical element 113 may be a diffraction microstructure surface, and further, the microstructure surface may further include a diffraction microstructure surface and a collimation microstructure surface.
Specifically, when the microstructure surface is a diffractive microstructure surface, the projector structure support 114 further includes at least one collimating lens 115. As shown in fig. 2 (a), the collimating lens 115 is disposed in the light-emitting channel, and the light passes through the collimating lens 115 and the optical element 113 in sequence before exiting the light-emitting channel. In a specific implementation, the optical element 113 is a diffractive optical element, and the projector structure support 114 forms the barrel of the collimating lens; a light-emitting channel is formed in the barrel, a step protrudes into the channel, and the diffractive optical element is mounted on the step surface. One or more collimating lenses 115 are arranged in the barrel, integrally formed with it and positioned between the light source 112 and the optical element 113. The light emitted by the laser light source 112 is collimated into a parallel beam by the collimating lens; the parallel beam is replicated and diffused by the diffractive optical element into structured light carrying characteristic information, such as the common speckle structured light. After the speckle structured light irradiates the object to be measured, the light beams reflected by the speckle on the object are incident on the sensor of the event camera 120 and cause light intensity changes, and the corresponding pixel regions of the event camera 120 output event data.
When the microstructure surface includes a diffractive microstructure surface and a collimating microstructure surface, as shown in fig. 2 (b), the optical element 113 is a collimating-diffracting integrated optical element. The projector structure support 114 is fixed on the projector circuit board 111 with glue, and the integrated optical element is fixed on a step surface of the projector structure support 114 with glue; in this case there is no collimating lens 115 in the light-emitting channel. The collimating function of the collimating lens 115 and the diffracting function of the diffractive optical element are implemented with a single optical element: the collimating microstructure surface and the diffractive microstructure surface can be integrated on one substrate, or a single microstructure surface can implement both functions, for example a grating microstructure surface designed on the diffraction principle, or a metasurface microstructure surface designed on the generalized Snell's law.
In addition, the collimating microstructure surface and the diffractive microstructure surface are respectively located on two sides or the same side of the substrate, or are integrated on the same side of the substrate to form an integrated microstructure surface. Specifically, the structured light projector 110 may also take the structure shown in fig. 2 (b): the collimating-diffracting integrated optical element combines the collimating function of the collimating lens 115 and the diffracting function of the optical element 113 of fig. 2 (a) on one optical element, and is likewise composed of a plurality of collimating-diffracting subunits; this reduces the number of devices used and the design size and cost of the projector.
Based on the above embodiment, in order to improve the resolution (precision) of the depth map finally output by the 3D structured light module 1000, the structured light projector 110 can be made to project speckle infrared patterns with different speckle distributions multiple times; different event images are acquired by the event camera 120 in a time-sharing manner and matched with different event reference maps to generate multiple different depth maps, and the image data processing module 213 fuses the generated depth maps into one high-precision depth map for output (a sketch of such a fusion step follows), thereby improving the measurement precision of the 3D structured light.
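A minimal sketch of such a fusion step; the merge rule (averaging where the two maps agree, filling holes otherwise) is an illustrative assumption, since the patent states only that the depth maps are fused:

```python
import numpy as np

def fuse_depth_maps(depth_a: np.ndarray, depth_b: np.ndarray,
                    agree_tol: float = 0.01) -> np.ndarray:
    """Fuse two depth maps produced from different speckle patterns.
    Where both are valid (> 0) and agree within a relative tolerance,
    average them; where only one is valid, take that one."""
    both = (depth_a > 0) & (depth_b > 0)
    agree = both & (np.abs(depth_a - depth_b) <= agree_tol * depth_a)
    fused = np.where(depth_a > 0, depth_a, depth_b)      # fill holes
    return np.where(agree, 0.5 * (depth_a + depth_b), fused)
```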
Example 2
In this embodiment, fig. 3 is a schematic structural diagram of another structured light projector according to the present invention. Here the light source 112 is a polarized laser light source whose light emitting holes may consist of a single hole, a plurality of pseudo-randomly distributed holes, or a plurality of regularly distributed holes, and which can output laser light in two different polarization states at intervals. Polarization control of the output light can be achieved by introducing specific optical structures such as mirrors, waveguides, anisotropic materials, or gratings into the laser's optical cavity. The polarized laser light source can also integrate a fast-switching polarizer on the laser surface, controlling the polarization of the output light through the rotation angle of the polarizer or with an electro-optic modulator. The polarized laser light source can also be designed with multiple optical cavities, alternately outputting light of different polarization directions by controlling the excitation of different cavities. The polarized laser light source can also be a multimode laser with polarization characteristics, realizing different mode outputs through current modulation or a mode selector to control the polarization of the output light.
In this embodiment, the control module 210 turns on the polarized laser light source; all light emitting holes in the polarized laser light source emit light at the same time, and the source is controlled to output light beams of different polarization states at intervals. In some embodiments, the light emitting holes in the polarized laser light source can be divided into two types with different emitted polarization states, and the polarized laser light source 112 outputs beams of different polarization states at intervals by driving the different types of holes. In some embodiments, the polarized laser light source 112 may also be composed of two independent polarized laser arrays with different emitted polarization states; by controlling the emission of the different arrays, the source outputs beams of different polarization states at intervals.
Specifically, the polarized laser light source 112 is configured to emit, at intervals, first polarized laser light 1121 and second polarized laser light 1122 with different polarization directions. The microstructure surface of the optical element 113 then includes a first subunit 118 and a second subunit 119 connected in a spliced manner, the first subunit 118 adapted to the first polarized laser light 1121 and the second subunit 119 to the second polarized laser light 1122. The first subunit 118 and the second subunit 119 share the same substrate and are tightly spliced. The microstructure surfaces of the two subunits have different grating constants (diffraction periods), so the same laser light entering different diffraction subunit regions forms structured light spots with different speckle distributions. As shown in fig. 3 (a), the first subunit and the second subunit are two diffraction subunits, and the microstructure surface includes the spliced first and second diffraction subunits. As shown in fig. 3 (b), the first subunit and the second subunit are two collimating-diffracting integrated subunits, and the microstructure surface includes the spliced first and second collimating-diffracting integrated subunits.
In addition, the first subunit 118 and the second subunit 119 have polarization selectivity, i.e., each allows only a light beam of a certain polarization direction to pass: the first subunit 118 passes the first polarized laser light 1121 but not the second polarized laser light 1122, and the second subunit 119 passes the second polarized laser light 1122 but not the first polarized laser light 1121. In this embodiment, a polarizing film is attached to the lower surface of each subunit, e.g., a horizontal polarizing film on the first subunit 118 to pass horizontally polarized light and a vertical polarizing film on the second subunit 119 to pass vertically polarized light. In some embodiments, the substrate of the optical element 113 is a material with polarization characteristics, such as a crystal, an optical wafer, or a liquid crystal; different substrate areas, i.e., the first subunit 118 and the second subunit 119, have different polarization properties, and the combined polarization and diffraction functions are achieved by etching or imprinting differently designed diffraction microstructure surfaces on the differently polarized areas of the substrate.
In this embodiment, fig. 4 is a schematic view of a structured speckle pattern projected by the structured light projector according to the present invention, and fig. 5 is a schematic view of another such pattern; the polarization directions of the first polarized laser light 1121 and the second polarized laser light 1122 are perpendicular to each other.
In a specific implementation, the control module 210 controls the polarized laser light source to emit the first polarized laser light 1121 and the second polarized laser light 1122 at intervals, the first being horizontally polarized and the second vertically polarized. As shown in fig. 4 (a), the left-right arrows in the circles represent a horizontal polarization direction (parallel to the paper), and the black dots in the circles represent a polarization direction perpendicular to the paper. When the polarized laser light source emits the horizontally polarized light, the light is collimated by the collimating lens 115 and then diffracted and diffused by the first subunit 118, or passes directly through the first subunit 118 area of the collimating-diffracting integrated optical element, to form the first infrared speckle pattern 1123 shown in fig. 4 (b). The pixels of the event camera 120 respond to the light intensity changes at the positions corresponding to the spots and output event data; the calculation module 211 converts the event data into an event map, which is kept in the storage module 212; the image data processing module 213 retrieves the calibration parameters saved in the storage module 212 and uses them to calibrate the event map; the calibrated event map is matched with the reference map stored in the storage module 212, and a first depth map is obtained using the triangulation principle. When the polarized laser light source emits the vertically polarized light, the light is collimated by the collimating lens 115 and then diffracted and diffused by the second subunit 119, or passes directly through the second subunit 119 area of the collimating-diffracting integrated optical element, to form the second infrared speckle pattern 1124 shown in fig. 4 (b); the event camera pixels again output event data, the calculation module 211 converts them into an event map held in the storage module 212, the image data processing module 213 calibrates it with the stored calibration parameters, the calibrated event map is matched with the stored reference map, and a second depth map is obtained using the triangulation principle. Because the microstructures of the first subunit 118 and the second subunit 119 differ, polarized laser beams with the same laser spot distribution produce different spot distributions after passing through the different diffraction subunits, the spot distributions in the event images output by the event camera 120 differ, and the second depth map therefore differs from the first. A more accurate depth image of the target object can be obtained by fusing the first and second depth maps in the image data processing module 213. The corresponding distribution of the fused event data is shown in fig. 4 (c): compared with the data density of single event data, the fused distribution is denser, improving the accuracy of the depth image output.
It should be noted that the 3D structured light module 1000 using the structured light projector 110 of fig. 4 stores two reference event maps in advance: a first reference event map, obtained by converting and calibrating the event data collected by the event camera 120 while the polarized laser light source projects the first polarized laser light 1121, and a second reference event map, obtained in the same way while the source projects the second polarized laser light 1122. Both are kept in the storage module 212; during subsequent pixel matching, the event map obtained in response to the first polarized laser light is matched against the first reference event map, and the event map obtained in response to the second polarized laser light against the second reference event map.
In addition, the first polarized laser light 1121 and the second polarized laser light 1122 may have other polarization directions; it is only necessary that the two polarization directions differ by 90°, for example the first at 0° and the second at 90°, both parallel to the paper.
It should be noted that, as shown in fig. 4 (b), the diffraction microstructures of the optical element 113 corresponding to the first infrared speckle pattern 1123 and the second infrared speckle pattern 1124 are periodic phase distribution structures obtained by a vector design method; in this case the light emitting holes of the matching polarized laser light source must be pseudo-randomly distributed. Fig. 4 is only schematic for convenience: the replication order of the diffraction microstructure is set to 9, the speckle distribution within each order is consistent with the light emitting points of the laser light source 112, and the first infrared speckle pattern 1123 and the second infrared speckle pattern 1124 of fig. 4 (b) are each formed by splicing 9 replicated spot groups. The actual replication order can be set to any number according to product requirements.
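To make the replication order concrete, the toy sketch below tiles a pseudo-random emitter-spot pattern into a 3×3 grid of diffraction orders, standing in for the order-9 replication described above (purely illustrative; a real DOE replicates in angle space, not by pixel tiling):

```python
import numpy as np

def replicate_pattern(emitter_spots: np.ndarray, orders: int = 3) -> np.ndarray:
    """Tile a binary emitter-spot pattern into an orders x orders grid,
    mimicking a DOE that copies the zeroth-order spot group into each
    diffraction order (9 copies for a 3x3 grid)."""
    return np.tile(emitter_spots, (orders, orders))

rng = np.random.default_rng(0)
spots = (rng.random((8, 8)) < 0.15).astype(np.uint8)  # pseudo-random emitters
speckle_field = replicate_pattern(spots, orders=3)    # 24x24, 9 replicas
```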
Of course, the diffraction microstructure of the optical element 113 may also be a random phase distribution structure obtained by a scalar design method. The corresponding polarized laser light source may consist of a single light emitting hole or multiple holes, distributed pseudo-randomly or regularly. Referring to fig. 5, which shows another structured speckle pattern, for convenience only regularly distributed light emitting holes are shown; the control module 210 controls the polarized laser light source to emit, at intervals, the horizontally polarized light shown in fig. 5 (a) and the vertically polarized light shown in fig. 5 (b). When the light source 112 emits the horizontally polarized light, it is collimated by the collimating lens 115 and then diffracted and diffused by the first subunit 118 to form the infrared speckle pattern shown in fig. 5 (b); likewise, the vertically polarized light forms another infrared speckle pattern, and the fused event data distribution is shown in fig. 5 (c). Because a scalar design does not need to consider the influence of the polarization characteristics of light on the simulation result, its computation load is much lower than that of a vector design: the replication order of a scalar-designed diffraction phase surface can be set above ten thousand, and the order position of each replication point can be freely defined. Thus, after collimation, one luminous point passed through the diffractive element can yield tens of thousands of randomly distributed speckles, whose positions can be freely defined, and the speckle area can be defined according to the receiving field of view of the event camera 120, so that the speckles projected by the structured light projector 110 are received by the event camera 120 to the greatest extent, greatly improving speckle utilization.
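The random phase profile of a scalar-designed DOE is commonly computed with an iterative Fourier-transform (Gerchberg-Saxton style) algorithm. The patent does not name its design algorithm, so the following is only a sketch of that standard technique:

```python
import numpy as np

def design_doe_phase(target_spots: np.ndarray, iters: int = 100) -> np.ndarray:
    """Iterative Fourier-transform algorithm: find a phase-only DOE
    profile whose far field approximates a target spot pattern.
    target_spots -- 2D array of desired far-field amplitudes
                    (e.g. 1 at speckle positions, 0 elsewhere)."""
    rng = np.random.default_rng(1)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_spots.shape)
    field = np.exp(1j * phase)            # unit-amplitude illumination
    for _ in range(iters):
        far = np.fft.fft2(field)
        # impose the target amplitude in the far field, keep the phase
        far = target_spots * np.exp(1j * np.angle(far))
        near = np.fft.ifft2(far)
        # impose unit amplitude at the DOE plane (phase-only element)
        field = np.exp(1j * np.angle(near))
    return np.angle(field)                # the DOE phase map
```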
When the light source 112 is a single-point laser light source, each speckle in the target light field is generated by a portion of the rays of that single point; when the light source 112 includes a plurality of light emitting holes, each speckle in the target light field is generated by a portion of the superposed light of all emitting points of the array source. Because the distribution boundary of the speckle in the target light field is rectangular, and the imaging field of view of the event camera 120 is also rectangular, the speckle pattern shape can be well matched to the camera's field of view, greatly reducing the proportion of ineffective speckles projected by the structured light projector 110.
In this embodiment, since the laser light source 112 outputs polarized light, the design of the diffractive optical element 113 is simpler than when the input light is unpolarized, and the diffractive optical element 113 can achieve better diffraction performance, such as uniformity and signal-to-noise ratio.
Example 3
In this embodiment, the light source 112 is a multispectral laser light source, and the microstructure surface includes a plurality of spliced subunits, with lasers of different wavelengths corresponding to different subunits. The following takes multispectral laser light of two wavelengths as an example. Specifically, the multispectral laser source is controlled by the control module 210 to emit, at intervals, first wavelength laser light 1125 and second wavelength laser light 1126 of different wavelengths. The diffraction microstructure surface of the optical element 113 in this embodiment includes a spliced first subunit 118 and second subunit 119; the first wavelength laser light 1125 is adapted to the first subunit 118 and the second wavelength laser light 1126 to the second subunit 119.
Referring to fig. 6, a schematic view of a structured speckle pattern projected by the structured light projector according to the present invention: the multispectral laser light source 112 emits, at intervals, first wavelength laser light 1125 of wavelength λ1 and second wavelength laser light 1126 of wavelength λ2, and the corresponding first subunit 118 and second subunit 119 have integrated filtering functions, the first subunit 118 passing only wavelength λ1 and the second subunit 119 only wavelength λ2. When the first wavelength laser light 1125 of fig. 6 (a) is emitted, it is collimated by the collimating lens 115 and then diffracted and diffused by the first subunit 118 to form the first infrared speckle pattern 1123 of fig. 6 (b); as described above, the event camera 120 collects event data and outputs a first event map, from which a first depth map is calculated. When the second wavelength laser light 1126 of fig. 6 (a) is emitted, it is collimated by the collimating lens 115 and diffracted and diffused by the second subunit 119 to form the second infrared speckle pattern 1124 of fig. 6 (b); the event camera 120 collects event data and outputs a second event map, from which a second depth map is calculated. Because the microstructures of the first subunit 118 and the second subunit 119 differ, lasers with the same spot distribution produce different spot distributions after passing through different subunits, so the second depth map differs from the first. A more accurate depth image of the target object is finally obtained by fusing the first and second depth maps in the image data processing module 213; the corresponding distribution of the fused event data, shown in fig. 6 (c), is denser than that of single event data, improving the accuracy of the depth image output.
In this embodiment, the light source 112 may emit laser light of more than two wavelengths at intervals; the corresponding optical element 113 is then composed of more than two subunits, the number of subunits matching the number of output wavelengths, each subunit having a different grating period and integrating filtering and diffraction functions, with its pass wavelength designed according to the wavelength emitted by the laser source. In this way the measurement accuracy of the 3D structured light module 1000 can be further improved, and the number of wavelengths can be chosen according to the image output frame rate and accuracy actually required. Different laser wavelengths passing through different areas of the optical element 113 form different speckle light field distributions; different event images are acquired by the event camera 120 in a time-sharing manner and matched with different reference event maps to generate multiple different depth maps, which the image data processing module 213 fuses into one high-precision depth map for output.
As to how the light source 112 of fig. 6 outputs laser light of different wavelengths at intervals: in some embodiments, a tunable laser may be used, with a tunable filter selectively passing laser light of a specific wavelength, so that different wavelengths are output at intervals; in some embodiments, light emitting holes of different wavelengths can be integrated in the same laser array, and the light source 112 outputs different wavelengths at intervals by driving different holes; in some embodiments, the light source 112 may be composed of several independent laser arrays, each emitting a different wavelength, with the different arrays driven in turn.
As for how the selective transmission of the optical element 113 to different laser wavelengths in fig. 6 is achieved: in some embodiments, a narrow-band filter film of the relevant wavelength may be coated on the substrate region corresponding to the first subunit 118 and the second subunit 119; in some embodiments, narrow-band filters for different wavelengths can be attached to different areas of the substrate; in some embodiments, the substrate of the optical element 113 is a material with wavelength-selective transmission characteristics, such as a variable optical density material whose optical density can be changed under an applied voltage or electric field to adjust the transmitted laser wavelength, and by controlling different areas of the substrate, lasers with different wavelengths are diffracted and replicated by their corresponding diffraction subunits.
It should be noted that, if the light source 112 emits laser light with multiple wavelengths, the filter matched with the event camera 120 correspondingly needs multiple pass bands designed according to those wavelengths, so that the event camera 120 responds to the light emitted by the light source 112 while suppressing background light noise in other bands.
Example four
Based on the above embodiments, in some application scenarios the required field of view of the 3D structured light module 1000 differs with distance. For example, a field of view of 90° is required for short-distance measurement at 30 cm-1 m, while at long distance the proportion of the event camera 120 image occupied by the object becomes smaller, and for long-distance measurement at 2 m-5 m a field of view of 60° suffices. To cover both short-distance and long-distance measurement, the field of view of the spots projected by the structured light projector 110 could be set to >90° so as to cover the field required at every test distance; however, the energy of a single speckle in the pattern projected for long-distance measurement would then be small, affecting long-distance measurement accuracy. In some embodiments, the different requirements of short-distance and long-distance measurement can instead be matched by making the structured light projector 110 project a plurality of different speckle distributions: an infrared speckle pattern with a large field of view for short-distance measurement, and an infrared speckle pattern with a small field of view but higher speckle energy per unit area for long-distance measurement, so that the light received by the event camera 120 satisfies the corresponding requirement at each distance.
Referring to fig. 7, fig. 7 is a schematic view of another structured light spot projected by the structured light projector provided by the invention. When the object to be measured is close to the 3D structured light module 1000, the light beam emitted by the laser light source 112 as shown in (a) of fig. 7 is collimated by the collimating lens 115 and then replicated and diffracted by the first subunit 118 to obtain the infrared speckle pattern shown in (b) of fig. 7; the replication order of the first subunit 118 is N. When the object to be measured is far from the 3D structured light module 1000, the light beam emitted by the laser light source 112 as shown in (c) of fig. 7 is collimated by the collimating lens 115 and then replicated and diffracted by the second subunit 119 to obtain the infrared speckle pattern shown in (d) of fig. 7; the replication order of the second subunit 119 is M, with N > M. That is, the laser forms more speckle spots and a larger field of view after passing through the first subunit 118, while after passing through the second subunit 119 it forms fewer spots, a smaller field of view and a higher energy per speckle, meeting the long-distance measurement requirement, as sketched below. For how the various speckle patterns in fig. 7 are obtained, refer to the description of the foregoing embodiments, which is not repeated here.
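The selection between the two projected patterns can be sketched as a simple rule (an illustration only; the distance bounds follow the example ranges above, and the behavior between 1 m and 2 m is an assumption):

```python
def select_subunit(distance_m):
    """Pick which diffraction subunit to illuminate for a coarsely
    known target distance: the high-order, wide-FOV pattern near by,
    the low-order, energy-concentrated pattern far away."""
    if 0.3 <= distance_m <= 1.0:
        return "first_subunit"   # replication order N, FOV ~90 deg
    if 2.0 <= distance_m <= 5.0:
        return "second_subunit"  # replication order M < N, FOV ~60 deg
    return "second_subunit"      # assumed default: favor range over coverage
```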
Example five
Based on the above embodiments, in the present embodiment the laser light source 112 may output laser light in only one state, with no special design for the polarization characteristics of the laser, and the corresponding optical element 113 likewise has no polarization characteristics. In order to obtain different infrared speckle patterns in a time-sharing manner, an electro-oxidation film layer can be plated on the surface of the optical element 113; under an applied voltage or electric field this film layer has different transmittances to the laser beam, for example the laser beam can be blocked entirely or transmitted entirely, and by independently controlling the film layer at the positions of the different diffraction subunits, the laser beam can be made to pass through different diffraction subunits at intervals to form different infrared speckle patterns. In some embodiments, a material whose transmittance can be changed electrically may be used as the substrate of the optical element 113, such as a liquid crystal, a semiconductor material, an electro-optic crystal, a liquid crystal polymer or an electrochromic material; the transmittance of the substrate at the positions of the different diffraction subunits is individually controlled, so that the optical element 113 both selectively transmits and diffracts, that is, the laser beam is made to pass through different diffraction subunits at intervals to form different infrared speckle patterns.
Example six
Fig. 8 is a schematic diagram of another embodiment of a structured light projector according to the present invention, where the microstructure surface of the optical element 113 is a super-surface structure comprising a first microstructure 116 and a second microstructure 117 respectively disposed on the two surfaces of the substrate. Unlike (b) of fig. 3, the collimating and diffracting integrated optical element 113 is replaced by a super-surface structure, and the super-surface microstructure does not need to be designed in zones: the super-surface structure is formed by a substrate with one layer of microstructure on the upper surface and one on the lower surface, the first microstructure 116 and the second microstructure 117 being designed for incident lasers with different polarization directions. Specifically, when the light source 112 is a polarized laser light source, the first microstructure 116 is adapted to the first polarized laser 1121 and the second microstructure 117 is adapted to the second polarized laser 1122. When the first polarized laser 1121 with polarization state a is incident on the optical element 113, the second microstructure 117 transmits the laser beam to the first microstructure 116, which modulates the beam to form a first infrared speckle pattern; the event camera 120 collects the corresponding event data, and a first depth map is calculated from the converted event map. When the beam of the second polarized laser 1122 with polarization state b is incident on the optical element 113, the second microstructure 117 modulates the beam, which then transmits through the first microstructure 116 to form a second infrared speckle pattern. The first infrared speckle pattern is different from the second, so the above-mentioned beneficial effects (improving the depth measurement accuracy, or balancing short-distance and long-distance measurement) can be achieved, which will not be described herein.
For fig. 8, the laser light source 112 may also be configured as a multispectral laser light source so that lasers with different wavelengths are projected in a time-sharing manner; the two microstructures of the super-surface structure are then designed for different incident wavelengths, so that the structured light projector 110 projects different infrared speckle patterns when the different wavelengths are incident on the super-surface structure, achieving the above-mentioned beneficial effects. Specifically, when the light source 112 is a multispectral laser light source, the first wavelength laser 1125 is adapted to the first microstructure 116 and the second wavelength laser 1126 is adapted to the second microstructure 117.
The imaging chip of a conventional infrared camera is a frame-synchronous sensor, whereas the imaging chip of the event camera 120 is an asynchronous sensor, namely the dynamic vision sensor 122. The event camera 120 generates an event only when the light intensity changes, rather than capturing whole images at a fixed frame rate, which greatly increases the rate at which event data can be generated; the event camera 120 also has a high dynamic range (generally >120 dB), so it performs excellently under high speed and strong/low light, for example in application fields such as autonomous driving and unmanned aerial vehicles.
Example seven
Fig. 9 is a schematic structural diagram of an event camera according to the present invention. In this embodiment, the event camera 120 includes a camera circuit board 121, a dynamic vision sensor 122 and a camera structure support 123. The camera circuit board 121 is arranged alone, or the camera circuit board 121 and the module circuit board 130 are arranged on a common substrate; the dynamic vision sensor 122 is electrically connected to the camera circuit board 121 and is used for receiving the reflected light beam and generating event data; the camera structure support 123 is disposed on the camera circuit board 121 and is provided with an incident light channel, into which the speckle structured light reflected by the object to be measured enters.
In a specific implementation process, an incident light channel is formed in the camera structure support 123, and the dynamic vision sensor 122 is located at one end of the incident light channel. The speckle structured light is reflected by the object to be measured to form reflected speckle structured light, which enters through the incident light channel and is received by the dynamic vision sensor 122 to generate event data. Specifically, unlike the pixels of a conventional camera, the pixels of the dynamic vision sensor 122 not only include photosensitive elements but are also provided with event generating circuits for monitoring light intensity changes and generating event data when the light changes; only those pixels of the dynamic vision sensor 122 that detect a light intensity change exceeding a certain threshold output event data, and the other pixels output nothing.
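This thresholding behavior can be sketched with an idealized per-pixel model (an illustration only, not the actual circuit of the dynamic vision sensor 122; the contrast threshold value is an assumption):

```python
import numpy as np

def dvs_events(prev_log, frame, t, threshold=0.2):
    """Idealized DVS pixel model: emit (x, y, t, polarity) events where
    the log intensity changed by more than `threshold` since the last
    event at that pixel, then update the stored reference level."""
    log_i = np.log(frame.astype(np.float64) + 1e-6)
    diff = log_i - prev_log
    events = []
    for polarity, mask in ((+1, diff >= threshold), (-1, diff <= -threshold)):
        ys, xs = np.nonzero(mask)
        events += [(x, y, t, polarity) for x, y in zip(xs, ys)]
        prev_log[mask] = log_i[mask]   # reset reference at firing pixels
    return events
```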
When the event camera 120 includes the infrared narrowband filter 124, a first camera mounting boss is disposed in the incident light channel, and the infrared narrowband filter 124 is disposed on the first camera mounting boss. The infrared narrowband filter 124 is a narrowband filter matching the wavelength of the laser light source 112 within the structured light projector 110 and is used for filtering stray light, so that only the reflected speckle beams corresponding to the speckle structured light emitted by the structured light projector 110 enter the dynamic vision sensor 122, thereby obtaining accurate event data corresponding to the projected spot distribution.
As shown in (a) of fig. 9, the event camera 120 further includes an imaging lens, with the camera structure support 123 serving as the lens barrel of the imaging lens: at least one imaging lens 125 is disposed in the incident light channel and configured integrally with the lens barrel. The imaging lens 125 is fixed on the lens barrel and is used for receiving the reflected speckle structured light formed by the speckle structured light reflected from the object to be measured, focusing it onto the dynamic vision sensor 122 to generate the corresponding event data.
As shown in (b) of fig. 9, the camera structure support 123 may instead be provided with a super-surface lens 126: the incident light channel is provided with a second camera mounting boss, the super-surface lens 126 is disposed on the second camera mounting boss, and the infrared narrowband filter 124 is located between the super-surface lens 126 and the dynamic vision sensor 122; in this case no imaging lens 125 needs to be provided in the camera structure support 123. The reflected speckle structured light enters through the super-surface lens 126, is filtered by the infrared narrowband filter 124, and is sensed by the dynamic vision sensor 122. The super-surface lens 126 replaces the imaging lens 125 designed on the traditional geometrical-optics imaging principle. It is based on the generalized Snell's law: a surface of sub-wavelength-size unit structures is introduced to produce abrupt phase changes, giving the two-dimensional planar super-surface special electromagnetic properties and allowing flexible control of the amplitude, phase, polarization and the like of the incident light, a strong light-field control capability. It generally consists of a layer of microstructure surface, composed of many sub-wavelength-size units arranged according to a certain rule, formed on a highly transmissive substrate such as quartz, SiO2, polymer materials or PC. The phase distribution of the super-surface at different positions can be obtained by defining the light field distributions and light field information (amplitude, polarization and the like) of the input and output fields, and the structural distribution of the microstructure surface can then be calculated from this phase distribution together with the material choices for the super-surface substrate and the microstructure surface.
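For reference, the generalized Snell's law underlying the super-surface design can be written in its standard form from the metasurface literature (where $\theta_i,\theta_t$ are the angles of incidence and refraction, $n_i,n_t$ the refractive indices on the two sides, $\lambda_0$ the vacuum wavelength, and $d\Phi/dx$ the phase gradient imposed by the sub-wavelength units):

$$ n_t\sin\theta_t - n_i\sin\theta_i = \frac{\lambda_0}{2\pi}\cdot\frac{d\Phi}{dx} $$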
As shown in (c) of fig. 9, when the event camera 120 includes the infrared super-surface lens 127, the incident light channel is provided with a third camera mounting boss, and the infrared super-surface lens 127 is disposed on the third camera mounting boss. The infrared super-surface lens 127 integrates the imaging function of the imaging lens 125 and the filtering function of the infrared narrowband filter 124; in this case the reflected speckle structured light is sensed directly by the dynamic vision sensor 122 after passing through the infrared super-surface lens 127.
In this embodiment, the structured light projector 110 and the event camera 120 may be connected to the module circuit board 130 through connectors as separate devices, or may be directly attached to the module circuit board 130, that is, the module circuit board 130, the projector circuit board 111 and the camera circuit board 121 share a common substrate, with the structured light projector 110 and the event camera 120 arranged on that common substrate; both arrangements are within the scope of the present invention.
Example eight
Fig. 10 is a schematic structural diagram of another 3D structured light module provided by the present invention, including a module circuit board 130, a structured light projector 110 and two event cameras 120. Unlike fig. 1, this module includes two event cameras 120: the control module 210 first turns on the two event cameras 120 simultaneously and then starts the structured light projector 110, the time difference between the two starts being longer than the initialization response time of the event cameras 120. The structured light projector 110 projects structured light spots outwards, and the two event cameras 120 receive the light reflected by the object simultaneously (ignoring the differences in the time of flight of the light to the two event cameras 120 and in their responses); each event camera 120 responds to the change of light intensity at the spot positions, and by matching the event images between the two event cameras 120, the depth map of the object is obtained by the triangulation principle, as sketched below. The structured light projector 110 in fig. 10 may be any one of those in figs. 2, 3 and 8, with the same beneficial effects as described above. Compared with fig. 1, this embodiment does not need a reference infrared speckle pattern shot in advance at a fixed distance for matching: depth information of the object is obtained by comparing the offsets of the same object point on the images of the two event cameras 120, so the position requirement of the structured light projector 110 relative to the event cameras 120 is low, and therefore the assembly positioning and heat dissipation requirements of the 3D structured light module are low.
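The triangulation in this two-camera arrangement can be sketched minimally as follows (assuming a rectified stereo pair with focal length f in pixels and baseline B in meters; the pixel positions and parameters in the example are made-up values, and the matching itself is not shown):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classic stereo triangulation: Z = f * B / d. Returns depth in
    meters, or None when the point shows no offset between the two
    event images (i.e., it is effectively at infinity)."""
    if disparity_px <= 0:
        return None
    return focal_px * baseline_m / disparity_px

# Example: a speckle point seen at x=412 px in the left event image
# and x=398 px in the right one, with f=600 px and B=5 cm:
z = stereo_depth(412 - 398, 600.0, 0.05)   # ≈ 2.14 m
```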
Example nine
Fig. 11 is a schematic structural diagram of another 3D structured light module according to the present invention, where the 3D structured light module 1000 further includes an RGB camera 140 for outputting a color map. As shown in fig. 11, the 3D structured light module 1000 includes a module circuit board 130, a structured light projector 110, an event camera 120 and an RGB camera 140; unlike fig. 1, there is one more RGB camera 140. In some application scenes, besides the depth map, a color map of the scene needs to be obtained; the RGB camera 140 adopts a conventional imaging camera and outputs the color map while the depth map is output. Since the depth map is generally used for recognition or obstacle avoidance (where high-speed moving objects must be avoided) while the color map is generally used for interactive display (which only needs to look smooth to the human eye), the output frame rate of the depth map is generally greater than that of the color map; the depth map whose output time stamp is closest to the output time of the RGB camera 140 can be selected to be fused with the color map to obtain an RGBD map for three-dimensional reconstruction.
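The timestamp selection described above can be sketched as follows (a minimal illustration; the frame rates and timestamp values are made up):

```python
import bisect

def nearest_depth_frame(depth_stamps, color_stamp):
    """Pick the index of the depth frame whose timestamp is closest to
    the color frame's timestamp (depth_stamps must be sorted, which
    holds for a live stream)."""
    i = bisect.bisect_left(depth_stamps, color_stamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(depth_stamps)]
    return min(candidates, key=lambda j: abs(depth_stamps[j] - color_stamp))

# Example: depth at ~120 fps, color at ~30 fps (timestamps in ms).
depth_ts = [0.0, 8.3, 16.7, 25.0, 33.3]
print(nearest_depth_frame(depth_ts, 30.0))  # -> 4 (33.3 ms is closest)
```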
In this embodiment, the event camera 120 may further be fused with the RGB camera 140 to form an integrated receiving camera, where the dynamic vision sensor 122 of the integrated receiving camera includes a plurality of pixel units, each consisting of four pixel sub-units: a red photosensitive pixel, a green photosensitive pixel, a blue photosensitive pixel and an infrared photosensitive pixel. Specifically, the event camera 120 and the RGB camera 140 may be integrated in a single receiving camera. Fig. 12 is a schematic diagram of a partial pixel distribution of such an RGB & event camera: the imaging chip is composed of a plurality of pixel units, each composed of four pixel sub-units, namely red, green, blue and infrared photosensitive pixels. The red, green and blue photosensitive pixels receive visible light and output the corresponding brightness values, producing the color map of the object, while the infrared photosensitive pixels sense changes of infrared light intensity and output the event data of the object; the frame rate of the output depth map is generally greater than the output frame rate of the color map.
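A minimal sketch of separating such a mosaic into its color and event channels follows (the 2x2 layout R,G/B,IR assumed here is hypothetical; the actual arrangement in fig. 12 may differ):

```python
import numpy as np

def split_rgb_ir(raw):
    """Split a raw mosaic frame (H, W) into three color planes and an
    IR plane, each (H/2, W/2), for a hypothetical R,G/B,IR pixel unit."""
    r  = raw[0::2, 0::2]
    g  = raw[0::2, 1::2]
    b  = raw[1::2, 0::2]
    ir = raw[1::2, 1::2]   # the event-generating sub-pixels
    return np.stack([r, g, b], axis=-1), ir
```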
Example ten
Fig. 13 is a schematic structural diagram of another 3D structured light module provided by the present invention, including a module circuit board 130, a structured light projector 110, two event cameras 120 and an RGB camera 140. Unlike fig. 10, one RGB camera 140 is added: in some application scenes a color map of the scene is needed besides the depth map, and the RGB camera 140, a conventional imaging camera, outputs the color map while the depth map is output. Since the depth map is used for recognition or obstacle avoidance and the color map is generally used for interactive display, the output frame rate of the depth map is generally greater than that of the color map, and the depth map whose output time stamp is closest to the output time of the RGB camera 140 can be selected for fusion with the color map to obtain the RGBD map for three-dimensional reconstruction.
Example eleven
The invention also proposes a 3D imaging system, comprising a main chip 2000 and a 3D structured light module 1000. For the specific structure of the 3D structured light module 1000, refer to the above embodiments; since the present 3D imaging system adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
As shown in fig. 14, the main chip 2000 includes:
the control module 210, which is in control connection with the 3D structured light module 1000 and is configured to receive imaging information and to control the turning on of the structured light projector 110 and the event camera 120 according to the imaging information;
the computing module 211, which is in communication connection with the event camera 120 so as to receive the event data generated by the event camera 120, convert the event data into an event image and transmit the event image;
the storage module 212, which is in communication connection with the computing module 211 and is used for receiving and storing the event images transmitted by the computing module 211; the storage module 212 also stores calibration parameters and a reference map;
the image data processing module 213, which is connected to the storage module 212 and calibrates the event image according to the calibration parameters to form a calibrated event map; the image data processing module 213 is further configured to compare the calibrated event map with the reference map to obtain a depth map.
In a specific implementation process, the event camera 120 of the 3D structured light module 1000 is placed at a distance from the structured light projector 110, and a horizontal distance between the optical center of the event camera 120 and the optical center of the structured light projector 110 is a baseline distance.
The structured light projector 110 projects the structured light spots onto the surface of the object, and by controlling the structured light projector 110 to light up in a time-sharing manner, a dynamically changing illumination environment is formed; the event camera 120 replaces the original infrared receiving camera, responds to the light intensity changes to generate event data, an event image is formed from the event data, and the depth information of the object is calculated by the triangulation principle. On the one hand, because the event camera 120 is paired with a dynamic light source 112, the event camera 120 can acquire event data of a static scene; on the other hand, the high dynamic range, high temporal resolution and other advantages of the data acquired by the event camera 120 enable accurate 3D measurement of scenes containing high-speed objects and a large span of background illumination intensity.
Example twelve
When an existing event camera is used for 3D measurement, depth is estimated from the time differences of the event data output by the event camera; the measurement accuracy is not high, and depth measurement cannot be performed on a static object.
The present embodiment provides a method for obtaining a depth map of a target object, as shown in fig. 15, which is applied to the 3D imaging system described in the foregoing embodiment, and the method includes:
S1: calibrating the event camera and the structured light projector respectively to obtain the internal parameters and external parameters of the event camera and the internal parameters and external parameters of the structured light projector.
The specific calibration method comprises the following steps:
Calibrating the event camera: a calibration plate with a known geometric structure (such as a checkerboard) is shot with the event camera. Since the event camera only acquires pixels whose light intensity changes, a conventional static checkerboard produces no data output when captured by the event camera; a flickering checkerboard can therefore be used, in which the black squares rapidly disappear and reappear to generate light intensity changes, so that the event camera outputs a calibration image. With the corner or dot positions of the checkerboard known, the internal and external parameters of the event camera are obtained; the internal parameters are generally the focal length and distortion parameters of the event camera, and the external parameters are generally the rotation matrix R1 and translation matrix T1 of the camera coordinates relative to the world coordinates.
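This step can be sketched as follows (assuming the event stream has already been accumulated into grayscale frames of the flickering checkerboard, and using OpenCV's standard checkerboard pipeline; the pattern size and square size are placeholder values):

```python
import cv2
import numpy as np

def calibrate_event_camera(images, pattern=(9, 6), square=0.02):
    """Standard checkerboard calibration applied to accumulated event
    frames. Returns the camera matrix K, distortion coefficients, and
    per-view rotations/translations (the R1, T1 of the text)."""
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    obj_pts, img_pts = [], []
    for img in images:
        found, corners = cv2.findChessboardCorners(img, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, images[0].shape[::-1], None, None)
    return K, dist, rvecs, tvecs
```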
Calibrating the structured light projector: the structured light projector is controlled to emit light at intervals, the event camera acquires an image of the speckle spot distribution, and with the coordinate positions of the speckle spots known, the external parameters of the structured light projector are solved, namely the rotation matrix R2 and translation matrix T2 of the structured light projector coordinates relative to the world coordinates.
S2: starting the structured light projector to acquire an event image at a known distance, and taking that event image as the reference event image.
The 3D imaging system is placed in front of a plane with a known distance, low reflectivity and high flatness; the structured light projector is controlled to emit light at intervals, the event camera acquires images of the speckle spot distribution, the images are calibrated with the internal and external parameters of the event camera and the external parameters of the structured light projector obtained in step S1, and the calibrated image is stored in the storage module as the reference event image.
S1 and S2 are completed during production and assembly of the 3D imaging system; these operations need not be repeated in subsequent actual measurements.
S3: and starting an event camera, starting a structured light projector, collecting event data, converting the event data into an event image, calibrating the event image, and matching the event image with a reference event image to generate a depth image.
Specifically, the control module of the main chip starts the event camera first and then starts the structured light projector, wherein the time difference between the start of the event camera and the start of the structured light projector is larger than the initialization response time of the event camera.
When the 3D imaging system receives an imaging task, the control module of the main chip first starts the event camera and then starts the structured light projector. After being started, the structured light projector projects structured light with a certain characteristic structure onto the object to be measured; the light spots reflected by the object are incident on the event camera and their positions are captured because of the obvious light intensity change, and event data are generated in the corresponding pixel areas of the event camera. Each item of event data records the pixel location of the light intensity change, the time stamp, and the polarity of the light intensity change.
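A minimal sketch of such an event record, and of its later accumulation into a visualized event image, follows (the field names and pixel values are illustrative, not the sensor's actual output format):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    x: int         # pixel column of the intensity change
    y: int         # pixel row
    t: float       # time stamp
    polarity: int  # +1 brighter, -1 darker

def events_to_image(events, shape):
    """Accumulate events into a visualized event image: neutral-gray
    background, positive events pushed brighter, negative ones darker."""
    img = np.full(shape, 128, dtype=np.int32)
    for e in events:
        img[e.y, e.x] = np.clip(img[e.y, e.x] + 60 * e.polarity, 0, 255)
    return img.astype(np.uint8)
```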
The main chip obtains the depth image of the target object from the event data, the internal and external parameters of the structured light projector, the internal and external parameters of the event camera, and the reference image. Specifically, the computing module of the main chip converts the event data into a visualized event image according to the pixel coordinates and polarities of the event data; the output event image is stored in the storage module and passed to the image data processing module. The image data processing module calls the internal and external parameters of the event camera and the external parameters of the structured light projector to calibrate the event image, performs pixel matching between the calibrated event image and the pre-stored reference event image to find corresponding pixel point pairs, calculates the offset with which each pixel pair is imaged in the event camera, and obtains the depth image of the target object by the triangulation ranging principle, as sketched below.
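The matching and triangulation can be sketched as follows (a simplified illustration assuming a rectified monocular structured light geometry with focal length f in pixels, projector-camera baseline B, and a reference image captured at a known distance Z0; the block matcher is deliberately naive):

```python
import numpy as np

def disparity_to_depth(d_px, focal_px, baseline_m, z0_m):
    """Monocular structured light: a speckle seen at distance Z shifts
    by d = f*B*(1/Z - 1/Z0) pixels relative to the reference image,
    so Z is recovered by inverting that relation."""
    inv_z = d_px / (focal_px * baseline_m) + 1.0 / z0_m
    return 1.0 / inv_z

def match_offset(event_img, ref_img, y, x, block=7, search=32):
    """Find the horizontal offset of the block around (y, x) in the
    reference image by minimizing the sum of absolute differences."""
    h = block // 2
    patch = event_img[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_d, best_cost = 0, None
    for d in range(-search, search + 1):
        xr = x + d
        ref = ref_img[y - h:y + h + 1, xr - h:xr + h + 1].astype(np.int32)
        if ref.shape != patch.shape:
            continue  # window fell off the image edge
        cost = np.abs(patch - ref).sum()
        if best_cost is None or cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```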
In this way, imaging with the event camera combined with the structured light projector, instead of estimating the depth information of the object from the time differences of the event data output by the event camera, yields a depth image with less noise. In addition, by controlling the time-sharing lighting of the structured light projector, a dynamically changing illumination environment is formed, so that the event camera can acquire event data of a static scene and perform 3D measurement on static objects. Therefore, using the event camera in combination with the structured light projector overcomes the defects of traditional 3D structured light measurement, and the method for obtaining a depth map of a target object provided by this embodiment makes the 3D imaging system compatible with complex scenes such as high-speed motion, static scenes and high dynamic range, while measuring distance accurately.
The foregoing description is only of the optional embodiments of the present invention, and is not intended to limit the scope of the invention, and all the equivalent structural changes made by the description of the present invention and the accompanying drawings or the direct/indirect application in other related technical fields are included in the scope of the invention.

Claims (9)

1. A 3D structured light module, comprising:
the structured light projector is used for projecting structured light onto the surface of the object to be measured, so that the surface of the object to be measured forms a reflected light beam;
The event camera is used for sensing the reflected light beam and generating event data;
there are two event cameras, the two event cameras are started simultaneously, the starting time of the two event cameras is earlier than the starting time of the structured light projector, the time difference between the starting of the event cameras and the starting of the structured light projector is greater than the initialization response time of the event cameras, and the depth information of an object is obtained by calculation by comparing the offsets of the same object point on the images of the two event cameras;
the module circuit board is respectively connected with the structured light projector and the event cameras, and the structured light projector and the event cameras are arranged on the module circuit board at intervals;
The structured light projector includes:
the projector circuit board is arranged alone, or the projector circuit board and the module circuit board are arranged on a common substrate;
the light source is used for emitting light and is electrically connected to the projector circuit board;
The optical element comprises a base material and a microstructure surface, wherein the microstructure surface is arranged on the surface of the base material;
the projector structure support is provided with a light-emitting channel, the optical element is connected in the light-emitting channel, and the light passes through the optical element and is emitted from the light-emitting channel;
The micro-structure surface is a diffraction micro-structure surface, or the micro-structure surface comprises a diffraction micro-structure surface and a collimation micro-structure surface, or the micro-structure surface is a super-surface structure, the super-surface structure comprises a first micro-structure and a second micro-structure, and the first micro-structure and the second micro-structure are respectively arranged on two surfaces of the substrate; the light source is a polarized laser light source, the polarized laser light source is used for emitting first polarized laser and second polarized laser with perpendicular polarization directions at intervals, or the light source is a multispectral laser light source, and the multispectral laser light source is used for emitting at least two lasers with different wavelengths at intervals;
when the microstructure surface is a diffraction microstructure surface, or the microstructure surface comprises a diffraction microstructure surface and a collimation microstructure surface, and the light source is a polarized laser light source, the microstructure surface comprises a first subunit and a second subunit which have different microstructures and are connected in a spliced manner, wherein the first subunit is provided with a polarizing film of one polarization direction or adopts a substrate of one polarization direction, and the second subunit is provided with a polarizing film of the other polarization direction or adopts a substrate of the other polarization direction, so that the first subunit and the second subunit have a polarization function: the first subunit only allows the first polarized laser to pass through and diffracts and diffuses the first polarized laser, and the second subunit only allows the second polarized laser to pass through and diffracts and diffuses the second polarized laser; when the microstructure surface is a diffraction microstructure surface, or the microstructure surface comprises a diffraction microstructure surface and a collimation microstructure surface, and the light source is a multispectral laser light source, the microstructure surface comprises a first subunit and a second subunit which are connected in a spliced manner, the first subunit and the second subunit being respectively provided with narrow-band filter films or base materials corresponding to different wavelengths, so that the first subunit and the second subunit have a filtering function: each subunit only allows the laser of its corresponding wavelength to pass through and diffracts and diffuses it;
when the microstructure surface is a diffraction microstructure surface, or the microstructure surface comprises a diffraction microstructure surface and a collimation microstructure surface, and the light source is a polarized laser light source, the microstructure surface comprises a first subunit and a second subunit which have different microstructures and are connected in a spliced manner; or when the microstructure surface is a diffraction microstructure surface, or the microstructure surface comprises a diffraction microstructure surface and a collimation microstructure surface, and the light source is a multispectral laser light source, the microstructure surface comprises a first subunit and a second subunit which are connected in a spliced manner, the first subunit and the second subunit being respectively provided with narrow-band filter films or base materials corresponding to different wavelengths; the replication order of the first subunit is N, the replication order of the second subunit is M, and N > M, so that the scattered spots formed by the light after passing through the first subunit are more than those formed after passing through the second subunit, and the field of view formed by the light after passing through the first subunit is larger than the field of view formed after passing through the second subunit.
2. The 3D structured light module of claim 1, wherein when the microstructured surface is a diffractive microstructured surface, the projector structure support further comprises at least one collimating lens disposed in the light exit channel, and the light rays sequentially pass through the collimating lens and the optical element and exit the light exit channel.
3. The 3D structured light module of claim 1, wherein when the microstructured surface comprises a diffractive microstructured surface and a collimating microstructured surface, the collimating microstructured surface and the diffractive microstructured surface are located on two sides or on the same side of the substrate, respectively, or the collimating microstructured surface and the diffractive microstructured surface are integrated on the same side of the substrate to form an integrated microstructured surface.
4. The 3D structured light module of claim 1, wherein, when the microstructure surface is a super-surface structure and the light source is a multispectral laser light source, laser of one wavelength is adapted to the first microstructure, and laser of the other wavelength is adapted to the second microstructure.
5. A 3D structured light module as claimed in claim 3 wherein the diffractive micro-structured surface obtains a periodic phase distribution by vector design;
or, the diffraction microstructure surface obtains random phase distribution through scalar design.
6. The 3D structured light module of claim 1, wherein the event camera comprises:
the camera circuit board is arranged alone, or the camera circuit board and the module circuit board are arranged on a common substrate;
The dynamic vision sensor is electrically connected with the camera circuit board and is used for receiving the reflected light beams and generating event data;
The camera structure bracket is arranged on the camera circuit board and is provided with a light entering channel, and speckle structure light reflected by the object to be detected enters the light entering channel;
an infrared narrowband filter or an infrared super surface lens;
when the event camera comprises the infrared narrowband filter, a first camera mounting boss is arranged in the light entering channel, and the infrared narrowband filter is arranged on the first camera mounting boss; the camera structure support is provided with at least one imaging lens arranged in the light entering channel, or the camera structure support is provided with a super-surface lens, the light entering channel is provided with a second camera mounting boss, and the super-surface lens is arranged on the second camera mounting boss;
when the event camera comprises the infrared super-surface lens, the light inlet channel is provided with a third camera mounting boss, and the infrared super-surface lens is arranged on the third camera mounting boss.
7. The 3D structured light module of claim 1, wherein the 3D structured light module further comprises an RGB camera, the RGB camera configured to output a color map;
Or, the 3D structure optical module further comprises an RGB camera, the event camera and the RGB camera are fused to form an integrated receiving camera, the dynamic vision sensor of the integrated receiving camera comprises a plurality of pixel units, each pixel unit comprises four pixel sub-units, and the four pixel sub-units are respectively a red photosensitive pixel, a green photosensitive pixel, a blue photosensitive pixel and an infrared photosensitive pixel.
8. A 3D imaging system, comprising: the main chip and the 3D structure optical module;
wherein the 3D structured light module is a 3D structured light module as claimed in any one of claims 1 to 7, and the main chip comprises:
The control module is in control connection with the 3D structure light module and is used for receiving imaging information and controlling the opening of the structure light projector and the event camera according to the imaging information;
The computing module is in communication connection with the event camera so as to receive event data generated by the event camera, convert the event data into event images and transmit the event images;
the storage module is in communication connection with the calculation module and is used for receiving and storing the event images transmitted by the calculation module, and the storage module also stores calibration parameters and a reference picture;
The image data processing module is connected with the storage module and is used for calibrating the event images according to the calibration parameters to form a calibration event map, and the image data processing module is also used for comparing the calibration event map with the reference map to obtain a depth map.
9. A method of obtaining a depth map of a target object, for application in a 3D imaging system according to claim 8, the method comprising:
calibrating an event camera and a structured light projector respectively to obtain internal parameters and external parameters of the event camera and internal parameters and external parameters of the structured light projector;
The event camera is calibrated as follows: a calibration plate with a known geometric structure is shot by the event camera, the calibration plate being a flickering checkerboard in which the black squares quickly disappear and reappear to generate light intensity changes, so that the event camera outputs a calibration image; with the corner or dot coordinate positions of the checkerboard known, the internal parameters and external parameters of the event camera are obtained, the internal parameters generally being the focal length and distortion parameters of the event camera, and the external parameters generally being a rotation matrix R1 and a translation matrix T1 of the camera coordinates relative to the world coordinates;
Calibrating the structured light projector: controlling the structured light projector to emit light at intervals, acquiring images of speckle light spot distribution by the event camera, knowing the coordinate position of speckle, and solving the external parameters of the structured light projector, namely a rotation matrix R2 and a translation matrix T2 of the coordinates of the structured light projector relative to world coordinates;
the control module of the main chip firstly starts the event camera and then starts the structure light projector, wherein the time difference between the start of the event camera and the start of the structure light projector is larger than the initialization response time of the event camera;
Controlling the structured light projector to be lightened in a time-sharing way, so that the structured light projector projects structured light with a certain characteristic structure to an object to be detected, light spots reflected by the object to be detected are incident to the event camera, and the event camera captures scattered spots to generate event data;
The main chip obtains a depth image of the target object according to the event data, the internal parameters and the external parameters of the structured light projector, the internal parameters and the external parameters of the event camera and the reference image;
The computing module of the main chip converts the event data into a visualized event image according to the pixel coordinates and the polarities corresponding to the event data, the visualized event image is output, the output event image is stored in the storage module and is output to the image data processing module, the image data processing module calls internal parameters and external parameters of the event camera and external parameters of the structured light projector to calibrate the event image, the calibrated event image is subjected to pixel matching with a pre-stored reference event image, the same pixel point pair is found, the offset of the pixel point pair imaged in the event camera is calculated, and the depth image of the target object can be obtained by utilizing the triangle ranging principle.
CN202311659789.7A 2023-12-06 2023-12-06 3D structure optical module, imaging system and method for obtaining depth map of target object Active CN117369197B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311659789.7A CN117369197B (en) 2023-12-06 2023-12-06 3D structure optical module, imaging system and method for obtaining depth map of target object

Publications (2)

Publication Number Publication Date
CN117369197A CN117369197A (en) 2024-01-09
CN117369197B true CN117369197B (en) 2024-05-07

Family

ID=89400615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311659789.7A Active CN117369197B (en) 2023-12-06 2023-12-06 3D structure optical module, imaging system and method for obtaining depth map of target object

Country Status (1)

Country Link
CN (1) CN117369197B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998024242A1 (en) * 1996-11-27 1998-06-04 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
EP3282285A1 (en) * 2016-08-09 2018-02-14 Oculus VR, LLC Multiple emitter illumination source for depth information determination
CN209281093U (en) * 2019-01-15 2019-08-20 深圳市安思疆科技有限公司 A structured light projection module and 3D imaging device with controllable polarization state
CN110443856A (en) * 2019-08-12 2019-11-12 广州图语信息科技有限公司 A kind of 3D structure optical mode group scaling method, storage medium, electronic equipment
CN112525107A (en) * 2020-11-24 2021-03-19 革点科技(深圳)有限公司 Structured light three-dimensional measurement method based on event camera
WO2022052313A1 (en) * 2020-09-11 2022-03-17 苏州中科全象智能科技有限公司 Calibration method for 3d structured light system, and electronic device and storage medium
CN217085782U (en) * 2022-02-18 2022-07-29 深圳锐视智芯科技有限公司 Structured light three-dimensional imaging module and depth camera
WO2023010565A1 (en) * 2021-08-06 2023-02-09 中国科学院深圳先进技术研究院 Method and apparatus for calibrating monocular speckle structured light system, and terminal
CN116500799A (en) * 2023-06-29 2023-07-28 深圳市安思疆科技有限公司 Structured light projector and structured light module

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12265232B2 (en) * 2019-02-14 2025-04-01 Hangzhou Uphoton Optoelectronics Technology Co., Ltd. Beam-splitting optical module and manufacturing method thereof
CN118176739A (en) * 2021-08-27 2024-06-11 夏日机器人公司 Multi-sensor super-resolution scanning and capturing system

Also Published As

Publication number Publication date
CN117369197A (en) 2024-01-09

Similar Documents

Publication Publication Date Title
JP7581413B2 (en) Method and system for tracking eye movements in conjunction with an optical scanning projector - Patents.com
US20230392920A1 (en) Multiple channel locating
US10571668B2 (en) Catadioptric projector systems, devices, and methods
US10739607B2 (en) Light source module, sensing device and method for generating superposition structured patterns
US20140293011A1 (en) Scanner System for Determining the Three Dimensional Shape of an Object and Method for Using
WO2020057204A1 (en) Compensating display screen, under-screen optical system and electronic device
US11067692B2 (en) Detector for determining a position of at least one object
JP5966467B2 (en) Ranging device
KR20190075044A (en) Multiple emitter illumination for depth information determination
CN107783353B (en) Apparatus and system for capturing stereoscopic images
US10317684B1 (en) Optical projector with on axis hologram and multiple beam splitter
RU2608690C2 (en) Light projector and vision system for distance determination
CN109167905B (en) Image acquisition method, image acquisition device, structured light assembly and electronic device
CN110133853B (en) Method for adjusting adjustable speckle pattern and projection method thereof
KR20150090680A (en) Camera apparatus
CN110784694B (en) Structured light projector and 3D image sensing module
CN111721239A (en) Depth data measurement equipment and structured light projection device
EP3933489A1 (en) Eye-tracking using laser doppler interferometry
CN112004000A (en) Light-emitting device and image acquisition device using same
JP7508150B2 (en) Depth data measuring device and structured light projection unit
CN117369197B (en) 3D structure optical module, imaging system and method for obtaining depth map of target object
JP2023539040A (en) Variable focus display with wavelength tuning
US20050036780A1 (en) Focal point detection device and camera
TW202043846A (en) Light emitting device and image capture device using same
JP2021018081A (en) Imaging apparatus, measuring device, and measuring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20240819
Address after: Room 102, 1st Floor, Building 2, No. 199 Shui'an 1st Road, Xiangzhou District, Zhuhai City, Guangdong Province 519000
Patentee after: Zhuhai Minshi Microelectronics Co.,Ltd.
Country or region after: China
Address before: Building 3, Chongwen Park, Nanshan Zhiyuan, No. 3370 Liuxian Avenue, Fuguang Community, Taoyuan Street, Nanshan District, Shenzhen, Guangdong Province, 518000, 2201-2202
Patentee before: SHENZHEN ANGSTRONG TECHNOLOGY Co.,Ltd.
Country or region before: China