
CN111692987B - Depth data measuring head, measuring device and measuring method - Google Patents

Depth data measuring head, measuring device and measuring method

Info

Publication number
CN111692987B
CN111692987B (application CN201910199180.3A)
Authority
CN
China
Prior art keywords
light
imaging
image
stripe
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910199180.3A
Other languages
Chinese (zh)
Other versions
CN111692987A (en)
Inventor
王敏捷
梁雨时
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Tuyang Information Technology Co., Ltd.
Original Assignee
Shanghai Tuyang Information Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Tuyang Information Technology Co., Ltd.
Priority to CN201910199180.3A (granted as CN111692987B)
Priority to US17/437,512 (granted as US11885613B2)
Priority to PCT/CN2019/122667 (published as WO2020186825A1)
Priority to JP2022502318A (granted as JP7224708B6)
Priority to EP19919943.1A (granted as EP3943882B1)
Publication of CN111692987A
Application granted
Publication of CN111692987B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/22: Measuring arrangements characterised by the use of optical techniques for measuring depth

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A depth data measuring head, a measuring device and a measuring method are disclosed. The measuring head comprises: a projection device for scanning and projecting structured light with stripe codes onto a shooting area; first and second image sensors having a predetermined relative positional relationship, for photographing the shooting area to obtain first and second two-dimensional image frames under the structured-light illumination, respectively; and a synchronization device for, based on the scanning position of the projection device, synchronously turning on for imaging the pixel columns in the first and second image sensors that lie along the stripe direction and correspond to the current scanning position. A highly flexible depth imaging scheme is thereby provided, based on two properties: multiple stripe-coded patterns can be stacked for matching, and binocular imaging does not depend on a specific calibration plane. Meanwhile, the one-dimensional nature of the stripe image is exploited to restrict the pixel-array range used for imaging at each moment, reducing the adverse effect of ambient light on the measurement result.

Description

Depth data measuring head, measuring device and measuring method
Technical Field
The invention relates to the field of three-dimensional imaging, in particular to a depth data measuring head, a measuring device and a measuring method.
Background
A depth camera is a capture device that acquires depth information of a target object. It is widely used in fields such as three-dimensional scanning and three-dimensional modeling; for example, more and more smartphones are equipped with depth cameras for face recognition.
Although three-dimensional imaging has been a research hotspot in the field for many years, existing depth cameras still suffer from high power consumption, large size, poor immunity to interference, and an inability to achieve real-time imaging at the pixel or even sub-pixel level.
For this reason, an improved depth data measurement scheme is needed.
Disclosure of Invention
In view of this, the present invention proposes a depth data measuring head and measurement system that combine actively projected stripe-coded structured light with binocular imaging, providing a highly flexible, pixel-level depth imaging scheme based on two properties: multiple stripe-coded patterns can be stacked for matching, and binocular imaging does not depend on a specific calibrated imaging plane. The invention further removes the effect of ambient light on the depth measurement results through tight synchronization of imaging and scanning, extending the scenarios in which it can be used.
According to one aspect of the present invention, there is provided a depth data measuring head comprising: a projection device for scanning and projecting structured light with stripe codes onto a shooting area; first and second image sensors having a predetermined relative positional relationship, for photographing the shooting area to obtain first and second two-dimensional image frames under the structured-light illumination, respectively; and a synchronization device for, based on the scanning position of the projection device, synchronously turning on for imaging the pixel columns in the first and second image sensors that lie along the stripe direction and correspond to the current scanning position. The one-dimensional nature of the stripe image is thus used to restrict the pixel-column range imaged at each moment, reducing the adverse effect of ambient light on the measurement result. Since the correspondence between the pixel columns and the scanning light is affected by many factors, such as the width, power, and speed of the projected light and the photosensitive efficiency of the image sensor, the number of pixel columns turned on at a time in synchronization can be determined by, for example, a calibration operation.
Preferably, the synchronization device may include a measuring device for measuring the scanning position of the projection device, with the pixel columns being synchronously turned on for imaging based on the measurement result. Real-time measurement thereby provides the high-precision synchronization required for high frame rates.
Preferably, the projection device may comprise a laser generator for generating line-shaped and/or infrared laser light, with the laser generator switched on and off at high speed so that the scanned structured light is projected with bright and dark intervals corresponding to the stripe code. Precise control of the coding pattern is thus achieved simply by switching the laser generator.
Preferably, the projection device may include a micromirror device reciprocally vibrating at a predetermined frequency, for scanning and projecting the line-shaped laser light onto the shooting area at that frequency, wherein the length direction of the line-shaped laser light is the length direction of the projected stripes. The high speed, accuracy, compactness, and low power consumption of micromirror devices can thus be exploited to meet industry requirements for depth cameras.
Given the phase characteristics of the micromirror device's vibration, the synchronization device may include a measuring device for measuring the vibration phase of the micromirror device in real time, with the pixel columns synchronously turned on for imaging based on its measurement result, thereby ensuring synchronization of scanning and imaging at extremely high frequencies. Preferably, the measuring device may be one or more photoelectric sensors, arranged in any of the following ways: on different exit paths of the projection device; on different reflection paths within the projection device; or on exit and reflection paths outside and inside the projection device, respectively. A suitably chosen sensor arrangement measures the phase accurately without disturbing the normal projection of the structured light.
Preferably, each image sensor completes the imaging of one image frame after every predetermined number of scanning projections by the projection device, so that multi-scan imaging compensates for insufficient intensity of the projected structured light.
The pixels in each image sensor may each include a structured-light image frame storage unit that is synchronously turned on at the corresponding current scanning position, for example a unit that stores charge and outputs 0 or 1 based on the amount of charge stored, or a multi-level storage unit capable of outputting a gray-scale value.
Preferably, the pixels in each image sensor each comprise a plurality of such structured-light image frame storage units, each used to image a different stripe-coded structured-light pattern projected in sequence by the projection device, so as to generate a set of image frames for the different patterns, the set being used as a whole for one depth data calculation. This suits the characteristics of stripe-coded imaging: for example, after the set of image frames is generated, pixel matching between the first and second image sensors can be performed directly on the several 0-or-1 values stored in each pixel, or a digital operation module can operate directly on the several gray-scale values to achieve pixel matching.
In addition, for imaging ambient light (e.g., as a two-dimensional gray-scale image), the pixels in each image sensor may each further include an additional storage unit that is turned off while at least one structured-light image frame storage unit of the pixel is turned on, and turned on for at least part of the period during which the pixel is not receiving the structured-light illumination, so that the image sensor generates an ambient-light image frame based on the additional storage units.
According to another aspect of the present invention, there is provided a depth data measuring device comprising: a depth data measuring head as described above, and a processor connected to the depth data measuring head, for determining depth data of a photographed object in the shooting area based on the predetermined relative positions of the first and second image sensors and the first and second two-dimensional image frames obtained by imaging the structured light. Preferably, at least part of the synchronization function of the synchronization device is implemented by the processor.
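The processor's depth computation from the two image frames ultimately reduces to binocular triangulation. A minimal sketch, assuming rectified sensors (the function name and parameter units are illustrative, not from the patent):

```python
def depth_from_disparity(disparity_px: float, baseline_m: float, focal_px: float) -> float:
    """Classic rectified-binocular triangulation: Z = f * B / d.

    disparity_px: horizontal offset (in pixels) between matched pixels in
                  the first and second image frames.
    baseline_m:   distance between the two image sensors (their
                  predetermined relative position).
    focal_px:     focal length expressed in pixel units.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With a 5 cm baseline, a 1000-pixel focal length, and a 10-pixel disparity, the matched point lies 5 m from the baseline.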
According to another aspect of the present invention, there is provided a depth data measurement method, comprising: scanning and projecting structured light with stripe codes to a shooting area; photographing the photographing region using first and second image sensors having a predetermined relative positional relationship to obtain first and second two-dimensional image frames under the structured light irradiation, respectively, wherein pixel columns in a stripe direction corresponding to a current scanning position in the first and second image sensors are synchronously turned on to perform imaging based on scanning positions of structured light stripes; and obtaining depth data of the measured object in the shooting area based on the first two-dimensional image frame and the second two-dimensional image frame.
Preferably, projecting the structured light may include generating infrared light using a laser emitter; and reciprocally vibrating at a predetermined frequency using a micromirror device to scan-project the line-type laser light to the photographing region at the predetermined frequency, wherein a length direction of the line-type laser light is a length direction of the projected stripe.
Preferably, the measuring method may further include: and measuring the vibration phase of the micro-mirror device in real time to obtain the scanning position of the structural light stripe.
Preferably, the measuring method may further include: the number of pixel columns that are turned on at a time in synchronization is determined based on the calibration operation.
Preferably, the measuring method may further include: imaging the stripe-encoded structured light of different patterns projected in sequence, respectively, using a plurality of structured light image frame storage units each included in a pixel in each image sensor, to generate a set of image frames for the different patterns; and performing pixel matching between the first and second image sensors directly based on the plurality of values of 0 or 1 stored in each pixel.
Preferably, the measuring method may further include: imaging at least part of the period of time during which the structured light is not received using an additional storage unit comprised by each of the pixels in each image sensor such that the image sensor generates an ambient light image frame based on the additional storage unit.
The depth data measurement scheme of the present invention combines actively projected stripe-coded structured light with binocular imaging to provide a highly flexible, pixel-level depth imaging scheme, in which the stacking of stripe-coded patterns and binocular imaging do not depend on a specific imaging plane. Moreover, the invention removes the influence of ambient light on the depth measurement result through tight synchronization of imaging and scanning, and uses a DMD to realize high-speed scanning of the line light, further extending the usable scenarios and improving imaging speed and precision.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout exemplary embodiments of the disclosure.
Fig. 1 shows the principle of depth imaging with structured light encoded in stripes.
Fig. 2 shows another example of projecting stripe coded structured light.
Fig. 3 shows a schematic composition of a depth data measurement head according to one embodiment of the invention.
Fig. 4A-B show an enlarged operation example of the projection apparatus shown in fig. 3.
Fig. 5 shows a simplified perspective schematic diagram of a projection device for use with the present invention.
Fig. 6 shows a schematic diagram of pixel columns in an image sensor being turned on in turn.
Fig. 7 shows an example of a pixel structure of an image sensor used in the present invention.
Fig. 8 shows another example of the pixel structure of the image sensor used in the present invention.
Fig. 9 shows a schematic diagram of a depth data measurement device according to an embodiment of the invention.
Fig. 10 shows a schematic flow chart of a depth data measurement method according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In order to meet the requirements of the three-dimensional imaging field on high precision, high frame rate, low power consumption and miniaturization, the invention provides a depth data measuring head and a measuring system, which provide a pixel-level depth imaging scheme with high flexibility by combining stripe coding structured light which is actively projected and binocular imaging, and based on the characteristics that the superposition of stripe coding patterns and the binocular imaging do not depend on a specific imaging plane. The invention further extends the available scene of the invention by removing the effect of ambient light on the depth measurement results through a high synchronization of imaging and scanning.
According to the principle of structured-light measurement, precisely determining the scan angle α is the key to the whole measurement system. The scan angle can be calculated from mechanical devices such as turning mirrors; the purpose of image coding and decoding in an area structured-light system is likewise to determine the scan angle of the coded structured light. Fig. 1 shows the principle of depth imaging with stripe-coded structured light. For ease of understanding, the figure briefly illustrates the coding principle using a three-bit binary time code with two gray levels. The projection device sequentially projects the three patterns shown onto the measured object in the shooting area; the bright and dark gray levels of the three patterns divide the projection space into 8 regions, each corresponding to its own projection angle. Assume that bright regions correspond to code "1" and dark regions to code "0". Combining, in projection order, the code values of a point in the scene across the three coded patterns yields the region code of that point, which determines the region the point lies in; decoding then yields the scan angle of that point.
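The decoding step described above can be sketched as follows. This is a simplified illustration; the array shapes and the fixed threshold are assumptions, not the patent's implementation:

```python
import numpy as np

def decode_stripe_patterns(frames, threshold=0.5):
    """Turn a stack of binary stripe images into per-pixel region codes.

    frames: (n_patterns, H, W) array with values in [0, 1]; each frame is
    the image of one projected pattern (bright ~ 1, dark ~ 0).
    Returns an (H, W) integer array of region indices in [0, 2**n_patterns).
    """
    bits = (np.asarray(frames) > threshold).astype(np.int64)
    n = bits.shape[0]
    # The first projected pattern is taken as the most significant bit.
    weights = (2 ** np.arange(n - 1, -1, -1)).reshape(n, 1, 1)
    return (bits * weights).sum(axis=0)
```

With three patterns, the projection space falls into 2**3 = 8 regions, each mapping to one projection angle.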
In a binocular imaging system, the decoding process described above can be simplified by directly matching the code values of corresponding points in the first and second image sensors. To improve matching accuracy, the number of projected patterns in the time code can be increased. Fig. 2 shows another example of projected stripe-coded structured light, namely a five-bit binary time code with two gray levels. In a binocular imaging scenario this means that, for example, each pixel in each of the left and right image frames carries a five-bit region code of 0s and 1s, so that left-right image matching can be achieved with higher accuracy (e.g., at the pixel level). At a constant projection rate, the example of fig. 2 achieves higher matching accuracy than the three coded patterns of fig. 1, at a higher cost in time. This trade-off remains quite desirable when the projection device has an extremely high projection rate to begin with (e.g., the micromirror device preferably used in the present invention).
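The left-right matching on per-pixel code words can be sketched as a nearest-match search along a rectified row. This is a deliberate simplification: with only 2**5 distinct codes, real matching must also handle repeated code values and sub-pixel refinement:

```python
import numpy as np

def match_row_codes(left_codes, right_codes, max_disparity=16):
    """For each left-row pixel, find the nearest right-row pixel carrying
    the same region code, searching up to max_disparity pixels.
    Returns per-pixel disparity, or -1 where no match exists."""
    w = len(left_codes)
    disp = np.full(w, -1, dtype=np.int64)
    for x in range(w):
        for d in range(max_disparity + 1):
            if x - d >= 0 and right_codes[x - d] == left_codes[x]:
                disp[x] = d
                break
    return disp
```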
Fig. 3 shows a schematic composition of a depth data measurement head according to one embodiment of the invention. As shown in fig. 3, the depth data measuring head 300 includes a projection device 310 and two image sensors 320_1 and 320_2.
The projection device 310 is used for scanning and projecting structured light with stripe codes to a shooting area. For example, the projection device 310 may successively project three patterns as shown in fig. 1 during successive 3 image frame projection periods, the imaging results of which may be used for the generation of depth data. The image sensors 320_1 and 320_2, which may be referred to as first and second image sensors, respectively, have a predetermined relative positional relationship for photographing the photographing region to obtain first and second two-dimensional image frames under the structured light irradiation, respectively. For example, in the case where the projection device 310 projects three patterns as shown in fig. 1, the first and second image sensors 320_1 and 320_2 may image photographing regions (e.g., an imaging plane and regions within a certain range thereof in fig. 3) on which the three patterns are projected, respectively, in three synchronized image frame imaging periods.
As shown in fig. 3, the projection device 310 may project, in the z direction (i.e., toward the shooting area), line light extending in the x direction. In different embodiments, the line light may be shaped directly (i.e., the outgoing light itself is a line) or may be a light spot moving in the x direction (i.e., scanned line light). The projected line light moves continuously in the y direction to cover the entire imaging area. The lower part of fig. 3 illustrates the scanning of the line light more intuitively, in a perspective view of the shooting area.
In the embodiments of the invention, the direction in which light exits the measuring head is designated the z direction, the vertical direction of the shooting plane the x direction, and its horizontal direction the y direction. The stripe-structured light projected by the projection device is then the result of line light extending in the x direction moving in the y direction. Although in other embodiments synchronization and imaging could equally be performed for stripe-structured light obtained by moving line light extending in the horizontal y direction along the x direction, vertical stripe light is used for the explanations in the embodiments of the present invention.
Further, the measuring head 300 comprises a synchronization device 330, connected to the projection device 310 and the first and second image sensors 320_1 and 320_2 to achieve precise synchronization among them. Specifically, based on the scanning position of the projection device 310, the synchronization device 330 synchronously turns on for imaging the pixel columns in the first and second image sensors 320_1 and 320_2 that lie along the stripe direction and correspond to the current scanning position. As shown in fig. 3, the stripe is currently scanning the central area of the shooting area; accordingly, in image sensors 320_1 and 320_2, the pixel columns located in the central area (e.g., 3 adjacent columns) are turned on for imaging. As the stripe moves in the y direction (arrow in the lower perspective view of fig. 3), the pixel columns turned on for imaging in image sensors 320_1 and 320_2 move synchronously (arrows above the matrices in the upper-left block diagram of fig. 3). The one-dimensional nature of the stripe image is thus used to restrict the pixel-column range imaged at each moment, reducing the adverse effect of ambient light on the measurement result. To reduce that influence further, the projection device preferably projects light that is not easily confused with ambient light, such as infrared light. In addition, since the correspondence between pixel columns and the scanning light is affected by many factors, such as the width, power, and speed of the projected light and the photosensitive efficiency of the image sensor, the pixel-column range (and the corresponding number of columns) turned on at a time can be determined by, for example, a calibration operation.
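The synchronized column turn-on can be sketched as a mapping from scan position to an enabled column window. The normalized-position interface and fixed window width below stand in for the calibration operation just described:

```python
def active_columns(scan_pos: float, n_cols: int, window: int = 3) -> list:
    """Columns to enable for a scan position normalized to [0, 1] across
    the field of view. `window` (how many adjacent columns are on at once)
    would come from the calibration operation."""
    center = min(int(scan_pos * n_cols), n_cols - 1)
    half = window // 2
    return list(range(max(0, center - half), min(n_cols, center + half + 1)))
```

As the stripe sweeps from scan_pos 0.0 to 1.0, the enabled window slides across the sensor in step with it, exactly as the arrows in fig. 3 indicate.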
Fig. 4A-B show an enlarged operation example of the projection apparatus shown in fig. 3. Specifically, as shown in fig. 3, in the projection apparatus 310, laser light emitted from a laser generator (such as the laser generator 411 shown in detail in fig. 4A-B) is scanned and projected onto a photographing region (gray region in fig. 3) by a projection mechanism (such as the projection mechanism 412 shown in detail in fig. 4A-B) to perform active structured light projection on an object to be measured (e.g., a person in fig. 3) in the photographing region. The pair of image sensors 320_1 and 320_2 images a photographing region, thereby acquiring image frames required for depth data calculation. As shown in fig. 3, the dashed lines from the projection device 310 are used to represent the projection ranges thereof, while the dashed lines from the image sensors 320_1 and 320_2 are used to represent the respective imaging ranges thereof. The photographing region is generally located in an overlapping region of respective projection and imaging ranges of the three.
In one embodiment, the laser generator may continuously emit laser light of the same intensity, and the projected fringe pattern is achieved by turning the laser generator on and off. In this case, since the laser generator projects only light of one intensity, each pixel of the image sensor only needs to record the "presence or absence" of light, and thus the equipped image sensor may be a black-and-white image sensor.
In another embodiment, the laser generator itself may emit laser light of varying intensity, for example laser light whose emitted intensity varies sinusoidally with the applied power. Combining the sinusoidal laser with stripe projection scans out a pattern of alternating bright and dark stripes in which brightness also differs between bright stripes. In this case the image sensor needs to distinguish different light intensities, and may therefore be a multi-level gray-scale image sensor. Gray-scale projection and imaging clearly enable more accurate inter-pixel matching than black-and-white projection and imaging, improving the accuracy of the depth data measurement.
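One row of such a sinusoidally modulated gray-scale pattern might look like the following sketch (purely illustrative; the period and [0, 1] normalization are assumptions, not values from the patent):

```python
import numpy as np

def sinusoidal_stripe_row(width: int, period_px: float) -> np.ndarray:
    """One row of a gray-scale stripe pattern whose intensity varies
    sinusoidally across the scan direction, normalized to [0, 1]."""
    x = np.arange(width)
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * x / period_px))
```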
In one embodiment, the laser generator 411 may be a line laser generator that generates line light extending in the x-direction (the direction perpendicular to the paper surface in fig. 4A-B). The line light is then projected onto the imaging plane by a reflective mechanism 412 that is swingable along an axis in the x-direction. The swinging view of the reflecting mechanism 412 is shown in fig. 4B. Thus, a reciprocating line-type optical scanning can be performed within the AB range of the imaging plane.
In one embodiment, the reflective mechanism 412 may be a micromirror device (also referred to as a digital micromirror device, DMD), implemented as a MEMS (micro-electro-mechanical system). Fig. 5 shows a simplified perspective schematic diagram of a projection device for use with the present invention. As shown in fig. 5, the point laser generated by the laser source is turned into line light by a lens (corresponding to the line laser generator 411 of fig. 4), the line light is reflected by the MEMS micromirror device, and the reflected line light is projected into the external space through a light window. Micromirror devices offer extremely high performance; for example, commercially available DMDs can perform highly stable reciprocating vibration at 2 kHz, laying the foundation for high-performance depth imaging.
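At a 2 kHz vibration frequency, each half period sweeps the full field once, which bounds how long each pixel column can integrate per sweep. A back-of-the-envelope helper (the uniform-sweep assumption ignores the sinusoidal speed variation across the field, and the column count is illustrative):

```python
def column_dwell_time_us(f_mirror_hz: float = 2000.0, n_cols: int = 1000) -> float:
    """Average integration time per pixel column for one sweep.
    One half vibration period (250 us at 2 kHz) covers the full field."""
    half_period_us = 1e6 / (2.0 * f_mirror_hz)
    return half_period_us / n_cols
```

At 2 kHz and 1000 columns this is 0.25 us per column per sweep, which is why multi-scan accumulation (described later) is often needed.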
In other embodiments, the laser light projected by scanning may be a point laser light, and the projection mechanism is required to change the projection direction in two dimensions (xy two directions in the figure) accordingly. For example, the projection mechanism scans stripe light in the x-direction, then shifts in the y-direction, and continues scanning in the x-direction at a different y-position.
Whether stripe light moving in the y direction is projected directly, or spot light is projected that must move in the x direction to form a stripe and shift in the y direction, what appears on the shooting area is a stripe moving in the y direction over time. As the light moves in the y direction, specific pixel columns among all the pixels recording the image frame are turned on so that they can collect the light reflected back from the corresponding location. Fig. 6 shows a schematic diagram of pixel columns in an image sensor being turned on in turn. As shown in fig. 6, as the stripe projected by the projection device moves from the middle of the imaging area toward one side, the pixel columns turned on for imaging in the sensor's pixel array likewise move from the middle toward that side. Each pixel column thus records an image only during the period in which its corresponding shooting region is being scanned, and records nothing at other times. Because the intensity of the projected laser exceeds the ambient light intensity, the structured light itself can be imaged extremely accurately when ambient light cannot accumulate under the synchronous turn-on scheme of the invention. Since conventional image sensors are typically exposed row by row, the column-wise (or several-columns-at-once) exposed image sensor used in the invention can be obtained by transposing an existing image sensor by 90°, with control added for simultaneous exposure of a whole column.
It should be understood that the pixel matrices shown in fig. 3 and 6 are only examples illustrating the synchronization principle of the invention. In practice, the pixel matrix of an image sensor is usually much larger (e.g., 1000x1000), and the range of columns turned on simultaneously may vary with calibration (e.g., 3 columns each time, or different numbers of columns at different positions of the shooting area). Moreover, the turn-on of pixel columns in the image sensor may depend only on the scanning position of the projection mechanism in the projection device, regardless of whether stripe light is actually being projected at that instant. In other words, switching the laser emitter off and on according to the bright-dark stripe distribution affects neither the scanning projection action of the projection mechanism nor the synchronized turn-on of the image sensor's pixel columns.
The projection device described above may include a micromirror device (DMD) reciprocally vibrating at a predetermined frequency for scanning the line laser over the shooting area at that frequency. Because the micromirror's vibration frequency is extremely high, e.g., 2 kHz, so that each half period of about 250 μs sweeps out one complete projection of the structured light, the position of the light reflected by the micromirror must be synchronized extremely precisely. This level of accuracy makes it impossible to synchronize directly from the micromirror device's start signal (its delay is unreliable). Therefore, given the phase characteristics of the micromirror's vibration, the synchronization device may include a measuring device that measures the vibration phase of the micromirror device in real time, with the pixel columns synchronously turned on for imaging based on its measurement, thereby ensuring synchronization of scanning and imaging at extremely high frequencies.
In one embodiment, the above measurement may be based on the outgoing light itself. The measuring device may thus be one or more photosensors (e.g., two photodiodes, PDs), arranged in any of the following ways: on different exit paths of the projection device; on different reflection paths within the projection device; or on exit and reflection paths inside and outside the projection device, respectively. The arrangement of the photosensors can be chosen so that the phase is measured accurately without disturbing the normal projection of the structured light. As shown in fig. 5, a PD can be installed inside the projection apparatus, and the instantaneous vibration phase is determined by measuring the reflection angle at which the laser light exits the optical window. Since the vibration phase of the DMD follows a sinusoid, a single PD can pin down the sinusoid, while additional PDs contribute to a more accurate phase measurement. In other embodiments, the PD may instead be mounted outside the projection device, e.g., on the light window, preferably near its edge to avoid affecting the projection into the capture area. In still other embodiments, other ways of measuring the phase, such as capacitive measurement, may be used.
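A minimal sketch of how a PD pulse can anchor the sinusoidal phase (illustrative only; the zero-crossing model, normalization, and names are the editor's assumptions, and a real system would also track drift between successive pulses):

```python
import math

def scan_position(t: float, t_zero: float, freq_hz: float,
                  amplitude: float = 1.0) -> float:
    """Instantaneous scan position of a sinusoidally vibrating mirror.

    t_zero:    time of an upward zero crossing, recovered from a PD
               pulse on a known exit path.
    freq_hz:   mirror vibration frequency (e.g. 2000 Hz).
    amplitude: half-width of the sweep, in the same units as the
               desired output position.
    """
    return amplitude * math.sin(2.0 * math.pi * freq_hz * (t - t_zero))
```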
In one embodiment, each image sensor completes the imaging of one image frame after each scanning projection by the projection device. For example, the DMD completes a half-cycle of vibration to scan the stripe light in the x-direction from one side of the photographing region to the other, whereupon one image frame (e.g., one pattern in fig. 1 or 2) has been imaged. When the projection power of the projection device is limited, or the measured object is far from the measuring head, the amount of charge acquired by the image sensor after a single scan is generally insufficient for imaging, and multi-scan imaging is then required. In that case, each image sensor completes the imaging of one image frame after every predetermined number of scanning projections by the projection device. For example, the DMD may scan the same structured-light pattern for 5 consecutive vibration periods so that the image sensor accumulates enough charge for imaging, then scan the next structured-light pattern for the following 5 vibration periods, and so on.
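The multi-scan accumulation rule can be sketched as follows (illustrative; the charge units and threshold are arbitrary assumptions, not values from the patent):

```python
import math

def scans_per_frame(charge_per_scan: float, imaging_threshold: float) -> int:
    """How many repeated scans of the same pattern are needed before the
    charge accumulated in a pixel's storage unit suffices for one frame."""
    if charge_per_scan <= 0.0:
        raise ValueError("no charge is collected per scan")
    return max(1, math.ceil(imaging_threshold / charge_per_scan))
```

With a quarter of the threshold collected per scan this returns 4; the 5-period example in the text corresponds to one fifth of the threshold per scan.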
Specifically, the pixels in each image sensor may include a structured-light image frame storage unit that is synchronously turned on corresponding to the current scanning position. Fig. 7 shows an example of the pixel structure of an image sensor used in the present invention. As shown in fig. 7, one pixel column 721 may include k pixels P1-Pk. Each pixel has the same structure, i.e., one photosensitive unit, one switch, and one storage unit. In particular, the pixel P1 722 may include a photodiode 724 serving as the photosensitive unit, a switch 726, and a storage unit 728. The pixel Pk 723 may include a photodiode 725 serving as the photosensitive unit, a switch 727, and a storage unit 729. The storage unit is, for example, a unit that stores the charge generated by the photodiode from the received light and outputs 0 or 1 depending on the amount of stored charge. Thus, when the synchronization device determines, based on the measurement result of the measuring device, that the pixel columns of a certain region in the image sensor need to be turned on, it turns on the switch of each pixel in the corresponding pixel column 721 so that the charge converted by the photodiode can be stored by the storage unit; at other times, the charge-accumulation switch of the pixel is turned off, leaving the structured-light image frame storage unit of each pixel disconnected for most of the imaging frame and thereby minimizing the influence of ambient light.
In a preferred embodiment, the pixels in each image sensor each comprise a plurality of such structured-light image frame storage units. Fig. 8 shows another example of the pixel structure of an image sensor used in the present invention. As shown in fig. 8, one pixel column 821 may include k pixels P1-Pk. Each pixel has the same structure, i.e., one photosensitive unit, N switches, and N storage units, where each switch controls the charge storage of one storage unit. Specifically, the pixel P1 822 may include a photodiode 824 serving as the photosensitive unit, N switches 826, and N storage units 828. The pixel Pk 823 may include a photodiode 825 serving as the photosensitive unit, N switches 827, and N storage units 829.
The storage unit is, for example, a unit that stores the charge generated by the photodiode from the received light and outputs 0 or 1 depending on the amount of stored charge. Each structured-light image frame storage unit is used to image one of the differently patterned stripe-encoded structured-light projections made sequentially by the projection device, so as to generate a group of image frames for the different patterns. The group of image frames may then be used in its entirety for one depth data calculation.
Taking the group of five patterns of fig. 2 as an example, the projection device first projects the leftmost pattern of fig. 2, and the image sensor sequentially turns on the first group of switches and storage units in the corresponding pixel columns 821 during that pattern's scan. The projection device then projects the second pattern from the left in fig. 2, during whose scan the image sensor sequentially turns on the second group of switches and storage units in the corresponding pixel columns 821. The middle pattern of fig. 2 is then projected, with the third group of switches and storage units turned on; next the second pattern from the right, with the fourth group; and finally the rightmost pattern, with the fifth group. Imaging of the image frames for the group of five patterns is thereby completed, at which point the five storage units of each pixel each hold a value of 0 or 1. Pixel matching between the two image sensors, e.g., pixel-level matching, can then be performed directly by an on-board digital operation module based on the five values of each pixel. In other words, when the pixels of the image sensor are themselves provided with a plurality of storage units, pixel matching between images can be performed directly by converting the analog image signal into a digital signal and carrying out addition, subtraction, multiplication, and division directly in the digital operation module.
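The per-pixel matching enabled by the five binary storage units can be sketched as follows (an illustrative software sketch of code-based matching along one epipolar row, not the patented circuit; in hardware the comparison would be done by the digital operation module):

```python
def match_row(codes_left, codes_right):
    """Match pixels along one epipolar row by their 5-bit stripe codes.

    codes_*: one tuple of five 0/1 values per pixel column -- the
             contents of that pixel's five storage units.
    Returns a dict mapping each left column to the matching right
    column, or to -1 when no unambiguous match exists.
    """
    index = {}
    for col, code in enumerate(codes_right):
        # a code seen twice in the row is ambiguous and is discarded
        index[code] = -1 if code in index else col
    return {col: index.get(code, -1) for col, code in enumerate(codes_left)}
```

The column difference of each matched pair is the disparity from which depth is triangulated.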
Compared with the prior-art software approach, in which image frames must be read out one by one and pixel matching is then performed by a processor, the digital operation scheme of the present application can greatly increase the speed of image processing and thus the generation rate of depth data.
In a preferred embodiment, the storage unit may be one capable of storing a multi-level gray value. Accordingly, the laser generator can project stripe light whose intensity varies according to a certain rule, so that the storage unit images the stripe light in gray scale. By selecting a specific light-intensity-variation projection scheme and combining an image sensor having a plurality of storage units with a front-end digital operation module, high-resolution image processing based on digital operations on gray-scale images can be realized, further improving image clarity while preserving high-speed depth data calculation.
For imaging under ambient light (e.g., for separate two-dimensional imaging), the pixels in each image sensor may further each comprise an additional storage unit, which is turned off when at least one structured-light image frame storage unit of the pixel is turned on and is turned on for at least part of the period during which no structured light is received, so that the image sensor generates an ambient-light image frame based on the additional storage units.
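One plausible use of the ambient-light frame — not spelled out in the text, so treat it as the editor's assumption — is to subtract it from a structured-light frame before matching:

```python
def remove_ambient(structured_frame, ambient_frame):
    """Subtract the ambient-light frame (read from the additional
    storage units) from a structured-light frame, clamping at zero."""
    return [[max(0, s - a) for s, a in zip(s_row, a_row)]
            for s_row, a_row in zip(structured_frame, ambient_frame)]
```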
The invention also discloses a measuring device using the above measuring head. Specifically, a depth data measuring apparatus may include a depth data measuring head as described above and a processor connected to it, the processor determining depth data of a photographic subject in the photographing region based on the predetermined relative positions of the first and second image sensors and the first and second two-dimensional image frames obtained by imaging the structured light. In different embodiments, the measuring head may have a relatively independent package, or it may be packaged together with the processor in the measuring device.
Fig. 9 shows a schematic diagram of a depth data measurement device according to an embodiment of the invention. As shown, the measurement device 900 may include a measurement head and a processor 940 as described above. The measuring head comprises a projection device 910, two image sensors 920, and a synchronization device 930.
The processor 940 is connected to the measuring head, e.g., to the projection device 910, the two image sensors 920, and the synchronization device 930, and determines depth data of the object in the photographing area based on the predetermined relative positions of the first and second image sensors 920_1 and 920_2 and the first and second two-dimensional image frames obtained by imaging the structured light.
In one embodiment, at least part of the synchronization function of the synchronization device may be implemented by the processor. For example, the processor may determine the scanning position of the stripes in real time from the data measured by the measuring device included in the synchronization device, and take over the synchronization device's role of synchronously controlling the respective components, e.g., directly via delay-free electrical signals.
Fig. 10 shows a schematic flow chart of a depth data measurement method according to an embodiment of the invention. The method may be implemented by the depth data measuring head and measuring device of the present invention.
In step S1010, structured light having a stripe code is scan-projected to a photographing region. In step S1020, the photographing region is photographed using first and second image sensors having a predetermined relative positional relationship to obtain first and second two-dimensional image frames under the structured light irradiation, respectively, wherein pixel columns in a stripe direction corresponding to a current scanning position in the first and second image sensors are synchronously turned on for imaging based on scanning positions of structured light stripes. In step S1030, depth data of the object to be measured in the photographing region is obtained based on the first and second two-dimensional image frames.
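Step S1030 amounts to classic binocular triangulation once pixel matching has produced disparities; a minimal sketch (the rectified-geometry assumption and all names are the editor's, not the patent's):

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Depth of a matched pixel pair in a rectified stereo rig,
    from the classic relation Z = f * B / d."""
    if disparity_px <= 0.0:
        raise ValueError("a matched pixel pair must have positive disparity")
    return focal_px * baseline_m / disparity_px
```

For example, a 10-pixel disparity with a 1000-pixel focal length and a 50 mm baseline corresponds to a depth of 5 m.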
In one embodiment, step S1010 may include generating infrared light using a laser emitter; and reciprocally vibrating at a predetermined frequency using a micromirror device to scan-project the line-type laser light to the photographing region at the predetermined frequency, wherein a length direction of the line-type laser light is a length direction of the projected stripe.
In one embodiment, the measurement method may further include: and measuring the vibration phase of the micro-mirror device in real time to obtain the scanning position of the structural light stripe.
In one embodiment, the measurement method may further include: the number of pixel columns that are turned on at a time in synchronization is determined based on the calibration operation.
In one embodiment, the measurement method may further include: imaging the stripe-encoded structured light of different patterns projected in sequence, respectively, using a plurality of structured light image frame storage units each included in a pixel in each image sensor, to generate a set of image frames for the different patterns; and performing a digital operation to achieve pixel matching between the first and second image sensors directly based on the plurality of values stored in each pixel.
In one embodiment, the measurement method may further include: imaging at least part of the period of time during which the structured light is not received using an additional storage unit comprised by each of the pixels in each image sensor such that the image sensor generates an ambient light image frame based on the additional storage unit.
The depth data measuring head, measuring apparatus, and measuring method according to the present invention have been described in detail above with reference to the accompanying drawings. By combining actively projected stripe-coded structured light with binocular imaging, the depth data measurement scheme of the invention provides a highly flexible pixel-level depth imaging scheme, since matching the superposed stripe-coded patterns across the two views does not depend on specific features of the imaged surface. Furthermore, the tight synchronization between imaging and scanning removes the influence of ambient light on the depth measurement result, and the high-speed scanning of line light realized with the DMD further expands the usable scenes of the invention while improving imaging speed and accuracy.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for performing the steps defined in the above-mentioned method of the invention.
Alternatively, the invention may be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) that, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (22)

1. A depth data measurement head, comprising:
a projection device for scanning and projecting structured light with stripe codes to a shooting area;
First and second image sensors having a predetermined relative positional relationship for photographing the photographing region to obtain first and second two-dimensional image frames under the structured light irradiation, respectively; and
Synchronization means for synchronously turning on pixel columns in a stripe direction corresponding to a current scanning position in the first and second image sensors based on a scanning position of the projection means to image,
Wherein, the projection arrangement includes:
a laser generator for generating line-type light;
a reflecting mechanism swinging along an axis of the linear light extending direction for projecting the linear light to the photographing region,
The laser generator performs light intensity conversion or high-speed switching to scan and project structured light with alternate brightness corresponding to stripe codes.
2. The measuring head according to claim 1, wherein the synchronization means includes measurement means for measuring a scanning position of the projection means, and synchronization of the imaging of the pixel columns is turned on based on a measurement result of the measurement means.
3. The measurement head of claim 1, wherein the laser generator is used for generating linear infrared light.
4. A measuring head according to claim 3, wherein the laser generator generates linear light of varying brightness over time to project structured light of alternating brightness and varying bright stripe brightness.
5. A measuring head according to claim 3, wherein the reflecting mechanism is a micromirror device reciprocally vibrating at a predetermined frequency for scanning and projecting the line-shaped light toward the photographing region at the predetermined frequency, wherein a length direction of the line-shaped light is a length direction of a projected stripe.
6. The measuring head according to claim 5, wherein the synchronization means includes measurement means for measuring a vibration phase of the micromirror device in real time, and synchronization on of the pixel column imaging is performed based on a measurement result of the measurement means.
7. The measurement head of claim 6, wherein the measurement device is one or more photosensors and the one or more photosensors are arranged in any of the following ways:
Arranged on different exit paths of the projection device;
Arranged on different reflection paths within the projection device; and
Are arranged on the outgoing and reflection paths inside and outside the projection device, respectively.
8. The measurement head of claim 1, wherein the number of pixel columns that are turned on at a time is determined based on a calibration operation.
9. The measurement head of claim 1, wherein each of the image sensors completes imaging of one image frame after each predetermined number of scan projections by the projection device.
10. The measurement head of claim 1, wherein the pixels in each of the image sensors comprise structured light image frame storage elements that are synchronously turned on corresponding to the current scanning position.
11. The measurement head of claim 10, wherein the pixels in each of the image sensors each comprise a plurality of structured-light image frame storage units, each structured-light image frame storage unit being respectively for imaging a different pattern of fringe-encoded structured light projected sequentially by the projection device to generate a set of image frames for the different pattern, wherein the set of image frames are used in their entirety for performing one depth data calculation.
12. The measurement head of claim 11, wherein the storage unit is a binary memory storing a value of 0 or 1, and after generating the set of image frames, pixel matching between the first and second image sensors is performed directly based on a plurality of values of 0 or 1 stored in each pixel.
13. The measurement head of claim 11, wherein the storage unit is a multi-level memory for storing gray values, and further comprising a digital operation module that directly performs a digital operation on the gray values stored in each pixel for pixel matching between the first and second image sensors after generating the set of image frames.
14. The measurement head of claim 10, wherein the pixels in each of the image sensors further each include an additional memory unit for turning off when at least one structured light image frame memory unit of the pixel is on and for at least a portion of the period of time that the structured light is not being received, such that the image sensor generates an ambient light image frame based on the additional memory units.
15. A depth data measurement device, comprising:
the depth data measurement head of any one of claims 1-14, and
and a processor connected with the depth data measuring head, for determining depth data of the photographic subject in the photographing region according to the predetermined relative positions of the first and second image sensors and the first and second two-dimensional image frames obtained by imaging the structured light.
16. The apparatus of claim 15, wherein at least a portion of a synchronization function of the synchronization apparatus is implemented by the processor.
17. A depth data measurement method, comprising:
scanning and projecting structured light with stripe codes to a shooting area;
Photographing the photographing region using first and second image sensors having a predetermined relative positional relationship to obtain first and second two-dimensional image frames under the structured light irradiation, respectively, wherein pixel columns in a stripe direction corresponding to a current scanning position in the first and second image sensors are synchronously turned on to perform imaging based on scanning positions of structured light stripes; and
Depth data of a measured object in the photographing region is obtained based on the first and second two-dimensional image frames,
Wherein scanning and projecting the structured light with stripe codes to the shooting area comprises:
generating infrared light by a laser generator; and
The linear light is projected by a reflecting mechanism swinging along the axis of the linear light extending direction, wherein the laser generator performs light intensity conversion or high-speed switching to scan and project the structured light with alternate brightness and darkness corresponding to stripe codes.
18. The method of claim 17, wherein generating infrared light with a laser emitter comprises:
Generating infrared line light using a laser generator, and
Projecting the line-shaped light using a reflection mechanism that swings along an axis of the line-shaped light extending direction includes:
the line-type light is scanned and projected toward the photographing region at a predetermined frequency by reciprocating vibration of a micromirror device at the predetermined frequency, wherein a length direction of the line-type light is a length direction of a projected stripe.
19. The method of claim 18, further comprising:
and measuring the vibration phase of the micromirror device in real time to obtain the scanning position of the structured light stripes.
20. The method of claim 17, further comprising:
the number of pixel columns that are turned on at a time in synchronization is determined based on the calibration operation.
21. The method of claim 17, further comprising:
Imaging the stripe-encoded structured light of different patterns projected in sequence, respectively, using a plurality of structured light image frame storage units each included in a pixel in each image sensor, to generate a set of image frames for the different patterns; and
A digital operation is performed to achieve pixel matching between the first and second image sensors directly based on the plurality of values stored in each pixel.
22. The method of claim 17, further comprising:
Imaging at least part of the period of time during which the structured light is not received using an additional storage unit comprised by each of the pixels in each image sensor such that the image sensor generates an ambient light image frame based on the additional storage unit.
CN201910199180.3A 2019-03-15 2019-03-15 Depth data measuring head, measuring device and measuring method Active CN111692987B (en)

Publications (2)

Publication Number Publication Date
CN111692987A CN111692987A (en) 2020-09-22
CN111692987B true CN111692987B (en) 2024-10-11

Family

ID=72475395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910199180.3A Active CN111692987B (en) 2019-03-15 2019-03-15 Depth data measuring head, measuring device and measuring method

Country Status (1)

Country Link
CN (1) CN111692987B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115218820A (en) * 2021-04-20 2022-10-21 上海图漾信息科技有限公司 Structured light projection device, depth data measuring head, computing device and measuring method
CN113172624B (en) * 2021-04-23 2024-06-21 北京创源微致软件有限公司 Positioning guide device and method and electronic equipment
CN113703248B (en) * 2021-08-11 2022-09-09 深圳市睿识科技有限公司 3D structured light module and depth map point cloud image acquisition method based on same
JP2025518634A (en) * 2022-05-18 2025-06-18 上海図漾信息科技有限公司 Depth data measuring head, measuring device and measuring method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427326A (en) * 2015-12-08 2016-03-23 上海图漾信息科技有限公司 Image matching method and device as well as depth data measuring method and system
CN205336464U (en) * 2015-12-08 2016-06-22 上海图漾信息科技有限公司 Range data detecting system
CN205987149U (en) * 2016-01-16 2017-02-22 上海图漾信息科技有限公司 Range data monitoring device
CN206321237U (en) * 2016-12-24 2017-07-11 上海图漾信息科技有限公司 Linear optical range finding apparatus
CN109491074A (en) * 2017-09-11 2019-03-19 宏达国际电子股份有限公司 Optical base station
CN209927097U (en) * 2019-03-15 2020-01-10 上海图漾信息科技有限公司 Depth data measuring head
CN111829449A (en) * 2019-04-23 2020-10-27 上海图漾信息科技有限公司 Depth data measuring head, measuring device and measuring method
CN111854625A (en) * 2019-04-29 2020-10-30 上海图漾信息科技有限公司 Depth data measuring head, measuring device and measuring method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08136222A (en) * 1994-11-12 1996-05-31 Omron Corp Method and device for three-dimensional measurement
JP6335011B2 (en) * 2014-04-25 2018-05-30 キヤノン株式会社 Measuring apparatus and method
CN205156874U (en) * 2015-10-28 2016-04-13 苏州临点三维科技有限公司 Many breadths 3D scanning survey appearance
CN107607040B (en) * 2017-08-11 2020-01-14 天津大学 Three-dimensional scanning measurement device and method suitable for strong reflection surface




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant