CN108592886B - Image acquisition apparatus and image acquisition method
- Publication number: CN108592886B (application CN201810403358.7A)
- Authority: CN (China)
- Legal status: Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
Abstract
An image acquisition apparatus and an image acquisition method for acquiring depth information and chromaticity/luminance information of an object. The image acquisition apparatus comprises: a linear light generating unit for generating linear projection light and projecting it onto a photographed object; an imaging unit including an image sensor for receiving the ambient light and the linear projection light reflected from the photographed object to form an image of the photographed object; an optical path control unit for controlling whether the image sensor forms an image; a timing control unit for controlling the turn-on times of the optical path control unit and the linear light generating unit so that the optical path control unit is open while the linear projection light is being projected; and a processing unit for receiving, processing, and storing the images generated by the image sensor and determining, according to the signal of the timing control unit, whether each image is a first type image or a second type image. The processing unit obtains the depth information of the photographed object from the second type images together with the first type images.
Description
Technical Field
The present invention relates to the field of image acquisition, and in particular, to an image acquisition apparatus and an image acquisition method.
Background
In the field of image acquisition, acquiring the surface chromaticity and luminance information of a photographed object from the light it reflects has long been the conventional approach: ordinary photography collects and stores the chromaticity and luminance information of the object's surface.
Fig. 1 shows the basic principle of an electronic imaging device. As shown in fig. 1, the photographed object m reflects natural light; after passing through the lens 10' of the electronic imaging device 90', the light is imaged on the surface of the image sensor 30', and each pixel of the image sensor 30' senses the light intensity at its position and converts it into an electrical signal output to the processing unit 40'. To control the exposure time of the image sensor 30', a mechanical or electronic shutter 20' is typically provided on the electronic imaging device 90'; opening and closing the shutter 20' opens and breaks the optical path from the outside to the image sensor 30'.
As technology has progressed, the prior art can collect not only the chromaticity and luminance information of the object's surface but also local depth information of the photographed object, that is, the distance between one or more points and the lens. By stitching calculations on this local depth information, the three-dimensional shape of the photographed object can be obtained.
Fig. 2 is a schematic diagram of a method for measuring local depth information. As shown in fig. 2, local depth information is generally collected by triangulation, projecting light such as laser light onto the surface of the object to be photographed. In fig. 2, the straight line through point A represents a laser plane at distance d from the center line of the lens; point A is imaged through the lens as point a on the imaging plane, at distance x from the center line; and the distance from the lens to the imaging plane is f.
From the triangle geometry, the distance h of point A can be calculated from the known quantities f, d, and x. Thus, when a part of the photographed object lies at point A, the distance of that part of the object can be calculated. Once the distance h for every point of the photographed object has been acquired, the three-dimensional information of the photographed object can be obtained.
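By similar triangles through the lens center, x/f = d/D, so the distance D of point A from the lens is D = f·d/x (the distance to the imaging plane adds a further f). A minimal sketch of this calculation, assuming an ideal pinhole model and illustrative values for f, d, and x:

```python
# A sketch of the triangulation of fig. 2 under an ideal pinhole model.
# f: lens-to-imaging-plane distance; d: laser plane offset from the
# center line; x: offset of the imaged point a (all values illustrative).

def depth_from_triangulation(f: float, d: float, x: float) -> float:
    """Distance of point A from the lens by similar triangles:
    x / f = d / D  =>  D = f * d / x."""
    if x <= 0:
        raise ValueError("imaged offset x must be positive")
    return f * d / x

# Example with hypothetical values: f = 8 mm, d = 50 mm, x = 0.8 mm
print(depth_from_triangulation(f=8.0, d=50.0, x=0.8))  # -> 500.0 (mm)
```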
Currently, the main methods for measuring the depth information of a photographed object using laser light include the monocular laser measurement method, the binocular laser measurement method, and the binocular structured light measurement method.
In the monocular laser measurement method, an image sensor captures the image; the processing unit then reads the image from the image sensor, detects the sequence of pixels corresponding to the imaged laser line by their brightness, and calculates the depth corresponding to those pixels by triangulation from their positions.
In practice, an infrared laser is usually chosen, and a filter on the lens of the image sensor blocks visible light, so that pixels illuminated by the infrared laser are markedly brighter than pixels that are not. Alternatively, a laser beam of higher brightness is used and the exposure time of the image sensor is reduced, again making the laser-illuminated pixels markedly brighter than the rest.
Fig. 3A is a schematic diagram of the monocular laser measurement method, and fig. 3B is a schematic diagram of an image it measures. As shown in figs. 3A and 3B, when a line laser is used, each captured frame yields the depth information of the one line on the object surface illuminated by the laser. By moving the object or the capture device so that the laser line sweeps the whole object surface, depth information for every surface point can be obtained and three-dimensional surface data constructed.
Fig. 3C is a schematic diagram of an image measured by the binocular laser measurement method, and fig. 3D is a schematic diagram of its left and right sensor imaging. The binocular method adds a second lens and image sensor to the monocular arrangement. The monocular laser measurement method, however, has low precision and low efficiency, while the binocular laser measurement method must find, in the left and right images, the pixels corresponding to the same region, which is a significant limitation, and it too suffers from low precision.
Structured light projects texture features onto the surface of the photographed object; for background on structured light see US patent US8208719. Fig. 3E is a schematic diagram of binocular structured light measurement, and fig. 3F is a schematic diagram of its left and right sensor imaging. Binocular structured light measurement uses laser projection to cast texture features onto the object surface and requires two image sensors to collect, from different angles, the light projected onto that surface. The laser light passes through a lens to form structured light that illuminates the photographed object; the light returning through lens A and lens C is collected by image sensor A and image sensor C, whose captured images are shown, for example, in fig. 3F. The image information is sent to the processing unit, which matches and identifies corresponding features between the two image sets, calculates the depth of each point, and from that computes the three-dimensional information of the measured object.
In addition, there is a monocular structured light measurement method corresponding to the binocular one. It differs from the ordinary laser measurement method in that the laser is replaced with structured light. It differs from the binocular structured light method in that the binocular technique collects reflection information from two positions, left and right, forming two images, and computes depth from the positional offset of the same texture between them, whereas the monocular technique collects only one image, identifies each local texture from the texture coding of the structured light pattern, and computes depth from the imaging position of each local texture.
Because structured light technology must identify each local texture from the coding of the structured light, the texture pattern of the structured light is demanding to design.
As the need to obtain depth information and chrominance/luminance information at the same time has emerged, more and more effort has gone into obtaining both from a single set of equipment. Currently, the binocular structured light depth measurement scheme described above is commonly combined in industry with an ordinary chrominance/luminance measurement scheme. That is, as shown in fig. 3G, an RGB sensor for collecting chromaticity and luminance information is added to the binocular structured light scheme: image sensor A and image sensor C collect the infrared spectrum, while image sensor B collects the RGB information of the natural light reflected by the photographed object m. The processing unit computes depth information from the binocular structured light, obtains chromaticity and luminance information from the RGB sensor, and, after calibration and alignment, associates the chromaticity/luminance information of each pixel with the three-dimensional information to obtain the object's three-dimensional information together with its surface color.
Likewise, the monocular structured light scheme described above may be combined with a chrominance/luminance image sensor. As shown in figs. 3I and 3J, an RGB sensor for collecting chromaticity and luminance information is added to the monocular structured light setup; the processing unit computes depth information from the monocular structured light, obtains chromaticity and luminance information from the RGB sensor, and, after calibration and alignment, associates the chromaticity/luminance information of each pixel with the three-dimensional information to obtain the object's three-dimensional information together with its surface color.
However, both the binocular-structured-light-plus-RGB scheme and the monocular-structured-light-plus-RGB scheme suffer from complex equipment, high cost, and demanding manufacturing precision; and because the chromaticity information and the depth information of the object are acquired through different sensors and lenses, the mapping between the two must be established through a large amount of calculation.
Disclosure of Invention
In view of the problems of the existing measurement methods described above, embodiments of the present invention provide an image acquisition apparatus and an image acquisition method. The invention provides a device that simultaneously collects the three-dimensional shape information and the surface chromaticity/brightness information of a measured object, which can be used to reconstruct a three-dimensional model of the object and its surface texture.
To solve the above problems, an embodiment of the present invention provides an image capturing apparatus for capturing depth information and chromaticity and luminance information of an object, the image capturing apparatus including:
A projection light generating unit for generating projection light and projecting the projection light to a subject;
an imaging unit including an image sensor for receiving light reflected from a photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
the time sequence control unit is in signal connection with the light path control unit and/or the projection light generation unit and is used for controlling the opening time of the light path control unit and/or the projection light generation unit so as to open the light path control unit during the projection of the projection light;
A processing unit in signal connection with the time sequence control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor, and for determining, according to the signal of the time sequence control unit, whether each image is a first type image or a second type image, wherein a first type image is an image formed by the image sensor receiving first type light reflected by the photographed object, and a second type image is an image formed by the image sensor receiving second type light reflected by the photographed object; the second type light includes the first type light and the projection light;
the processing unit acquires the chromaticity and brightness information of the photographed object from the first type image, and acquires the local depth information of the photographed object from the second type image together with the first type image.
To solve the above problems, an embodiment of the present invention provides an image capturing apparatus for capturing depth information and chromaticity and luminance information of an object, the image capturing apparatus including:
A projection light generating unit for generating projection light and projecting the projection light to a subject;
an imaging unit including an image sensor for receiving light reflected from a photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
the time sequence control unit is in signal connection with the light path control unit and/or the projection light generation unit and is used for controlling the opening time of the light path control unit and/or the projection light generation unit so as to open the light path control unit during the projection of the projection light;
A processing unit in signal connection with the time sequence control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor, and for determining, according to the signal of the time sequence control unit, whether each image is a first type image or a second type image, wherein a first type image is an image formed by the image sensor receiving first type light reflected by the photographed object, and a second type image is an image formed by the image sensor receiving second type light reflected by the photographed object; the second type light includes the first type light and the projection light;
the processing unit acquires the chromaticity and brightness information of the photographed object from the first type image, and acquires the local depth information of the photographed object from the second type image, wherein the brightness of the projection light is more than twice the brightness of the ambient light.
To solve the above problems, an embodiment of the present invention provides an image capturing apparatus for capturing local depth information and chromaticity luminance information of an object, the image capturing apparatus including:
Two or more projection light generating units, each for generating projection light and projecting it onto a photographed object;
an imaging unit including an image sensor for receiving light reflected from a photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
The time sequence control unit is in signal connection with the light path control unit and/or the projection light generation unit and is used for controlling the opening time of the light path control unit and the projection light generation unit so as to open the light path control unit during the projection of the projection light;
A processing unit in signal connection with the time sequence control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor, and for determining, according to the signal of the time sequence control unit, whether each image is a first type image or a second type image, wherein a first type image is an image formed by the image sensor receiving first type light reflected by the photographed object, and a second type image is an image formed by the image sensor receiving second type light reflected by the photographed object; the second type light includes the first type light and the projection light;
the processing unit acquires the chromaticity and brightness information of the photographed object from the first type image, and acquires the local depth information of the photographed object from the second type image together with the first type image.
In order to solve the above problems, an embodiment of the present invention provides an image acquisition method, including:
In a first time period, acquiring a first type of image, wherein the first type of image is formed by an image sensor receiving first type of light reflected by a photographed object;
In a second time period, turning on a projection light generating unit and collecting a second type image, wherein the second type image is formed by the image sensor receiving second type light reflected by the photographed object;
Calculating the local depth information of the photographed object from the difference between the first type image and the second type image;
Correlating the local depth information with the chrominance and luminance information collected from the first type image;
wherein the second type light includes the first type light and the projection light generated by the projection light generating unit.
In summary, the image acquisition apparatus and image acquisition method provided by the invention solve the problems of the prior art with a simple system, low cost, and low manufacturing precision requirements. The invention can use simple light sources such as a line laser, with no need to design complex structured light. The scheme is not easily disturbed by ambient light, and projection sources in the infrared or visible bands can be used. Moreover, the chroma information and the depth information acquired by the method are naturally associated, so no large amount of calculation is needed to fuse them.
Drawings
Fig. 1 is a schematic diagram showing the basic principle of an electronic imaging device.
Fig. 2 is a schematic diagram of a method for measuring depth information.
Fig. 3A is a schematic diagram of a monocular laser measurement method.
Fig. 3B is a schematic diagram of an image measured by the monocular laser measurement method.
Fig. 3C is a schematic diagram of an image measured by the binocular laser measurement method.
Fig. 3D is a schematic diagram of left and right sensor imaging for a binocular laser measurement method.
Fig. 3E is a schematic diagram showing binocular structured light measurement.
Fig. 3F is a schematic diagram of left and right sensor imaging for binocular structured light measurement.
Fig. 3G is a schematic diagram of a combined measurement system of binocular structured light measurement and ambient light capture.
Fig. 3H is a schematic diagram showing an image obtained by photographing with the measurement system of fig. 3G.
Fig. 3I is a schematic diagram of a combined monocular structured light measurement and ambient light capture system.
Fig. 3J is a schematic diagram showing an image captured by the measurement system of fig. 3I.
Fig. 4 is a schematic diagram of an image capturing device according to an embodiment of the present invention.
Fig. 5 and 6 are schematic diagrams showing images acquired by the image acquisition apparatus.
Fig. 7 is a schematic diagram showing surrounding shooting with an image pickup apparatus.
Fig. 8 and 9 are schematic diagrams showing images captured with linear projection light of normal brightness and with linear projection light of high brightness.
Fig. 10 is a schematic diagram of an image capturing apparatus employing two imaging units.
Detailed Description
The image capturing apparatus and method according to the present invention will be described below by way of a plurality of embodiments.
First embodiment
An embodiment of the invention provides an image acquisition device for three-dimensional information and chromaticity/brightness. Fig. 4 is a schematic diagram of the image acquisition apparatus. As shown in fig. 4, the image acquisition apparatus includes: a projection light generation unit 10, an imaging unit 20, an optical path control unit 30, a timing control unit 40, and a processing unit 50. The imaging unit intermittently receives an image formed by the sum of the ambient light and the projection light reflected by the photographed object 100, and the local depth information of the photographed object is determined from the difference between that image and an earlier or later image formed by reflected ambient light alone. This is explained in detail below.
In the present embodiment, the projection light generating unit 10 that generates projection light includes, for example, a projection light source 11 and a projection lens 12. The projection light source 11 is for generating projection light; the projection lens 12 is used to convert the projection light emitted from the projection light source 11 into a simple, regular pattern.
In terms of shape, the projection light is, for example, linear, or of another shape such as a dot, wedge, rectangle, or trapezoid. The shape of the light may be determined by the projection lens 12, a technology well known in the art that will not be described in detail here. In terms of its nature, the projection light is, for example, laser light, infrared light, ultraviolet light, blue light, or structured light.
In one embodiment, "linear" projection light refers to light whose pattern has an aspect ratio greater than some threshold, e.g., 200:1. Such linear light can easily be generated from the projection light emitted by the projection light source 11 through conventional diffraction or similar methods.
The linear projection light is not limited to a single line; there may be several, and the linear projection light may be projected onto the subject 100 in various patterns such as parallel lines, a cross, a grid, or a radial pattern.
Further, when a plurality of projection light sources 11 and projection lenses 12 are included, the linear projection lights may have the same wavelength or different wavelengths. For example, the plurality of linear projection lights may be infrared light, ultraviolet light, blue light, structured light, etc., emitted from the plurality of projection light sources 11 and projection lenses 12, without limitation.
For example, the same projection light generating unit may simultaneously emit three projection lights of different wavelengths (such as red, green, and blue light) onto the photographed object 100; the light reflected from the object's reflection points returns to the imaging unit, which can collect the three reflected lights of different wavelengths and thereby collect three different images from one projection, improving acquisition efficiency and the accuracy of the subsequent calculation.
Structured light is well known in the art: a patterned light is projected onto the surface of the object. For related background see US patent US8208719; it will not be described in detail here.
The imaging unit 20 includes, for example, an imaging lens 21 and an image sensor 22. The imaging lens 21 optically images the ambient light and projection light reflected by the surface of the subject 100 onto the surface of the image sensor 22. The image sensor 22 converts this optical image into digital electrical signals, forming a digital image. There may be one or more imaging units 20; for example, fig. 10 shows an image acquisition device including two imaging units 20, which can improve image acquisition efficiency.
The optical path control unit 30 controls the opening or closing of the path of light projected by the imaging lens 21 of the imaging unit 20 onto the image sensor 22. The optical path control unit 30 may be a shutter, for example a physical mechanical shutter or an electronic shutter integrated on the image sensor 22; the invention imposes no particular limitation. The optical path control unit 30 may also be a filter placed on the projection optical path or the imaging optical path. When the filter is installed on the imaging optical path, for example between the imaging lens 21 and the image sensor 22, the projection light is filtered out while the filter is active and the projection light component cannot be imaged; while the filter is inactive, the projection light component is imaged. Similarly, such a filter may be installed on the projection optical path, for example between the projection light source 11 and the projection lens 12, for synchronously controlling the projection light source and the image sensor.
The projection light source 11 is controllable: it is connected to the timing control unit 40 and can receive its control signal and turn on or off according to the control instruction. The optical path control unit 30 is likewise connected to, and controlled by, the timing control unit 40.
In one embodiment, the timing control unit 40 is signal-connected to both the optical path control unit 30 and the projection light generating unit 10, to control them synchronously so that first type images and second type images are generated in an interleaved manner across successive acquisitions.
"Synchronously controlling the projection light generating unit 10 and the optical path control unit 30" does not mean turning the two units on or off at the same instant. It means that, through timing control, while a second type image is being acquired the linear projection light can reach the image sensor 22 during the period in which the projection light source 11 is on; that is, the optical path control unit 30 is open, the optical path between the imaging lens 21 and the image sensor 22 (or between the photographed object 100 and the image sensor 22) is open, and the linear projection light propagates normally. The timing control unit 40 also ensures that, while a first type image is being acquired, the projection light generating unit 10 stays off and no projected light is reflected to the image sensor 22.
In another embodiment, the timing control unit 40 may control only one of the optical path control unit 30 and the projection light generating unit 10; the scheme of the invention can be implemented just as well, as described below.
For example, the timing control unit 40 may control only the optical path control unit 30. In this case the projection light generating unit 10 emits projection light continuously, and the timing control unit 40 opens the optical path control unit 30 intermittently so that the image sensor 22 collects the projection light and ambient light reflected by the photographed object; during the periods in which the optical path control unit 30 is closed so as to filter out the projection light, the image sensor 22 collects only the reflected ambient light, even though the projection light is still being emitted.
For another example, the timing control unit 40 may control only the projection light generating unit 10, so that it emits projection light intermittently: the image sensor 22 collects the projection light and ambient light reflected by the photographed object during the periods in which the projection light is emitted, and collects only the reflected ambient light during the periods in which it is not.
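As an illustration of this second variant, the following minimal sketch (with hypothetical projector and sensor driver objects standing in for the projection light generating unit 10 and the image sensor 22) shows a controller that toggles only the projection light and tags each frame so the processing unit can classify it:

```python
import time

def capture_interleaved(projector, sensor, n_pairs: int,
                        exposure_s: float = 1 / 60):
    """Alternate projector-off and projector-on exposures, tagging each
    frame as first type (ambient only) or second type (ambient + projection)."""
    frames = []  # (tag, image) tuples in acquisition order
    for _ in range(n_pairs):
        projector.off()                       # first type: ambient light only
        time.sleep(exposure_s)
        frames.append(("first_type", sensor.read_frame()))

        projector.on()                        # second type: ambient + projection
        time.sleep(exposure_s)
        frames.append(("second_type", sensor.read_frame()))
    projector.off()
    return frames
```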
Note that the "ambient light" mentioned above refers to natural light, or the illumination of an ordinary laboratory or similar photographing environment; the term is used here only in contrast to "projected light". In the present invention, when the light received by the image sensor 22 includes projection light reflected by the photographed object, the received light is second type light; when it does not include reflected projection light, it is first type light. Second type light is thus first type light plus the projection light.
The processing unit 50 receives and stores the digital images formed by the image sensor 22 and determines, according to the signal of the timing control unit 40, whether each image is a first type image or a second type image. A first type image is formed by the image sensor receiving first type light reflected by the photographed object; a second type image is formed by the image sensor receiving second type light reflected by the photographed object. The second type light is the first type light plus the projection light: in one embodiment the second type light includes the projection light and the ambient light, and the first type light includes the ambient light.
The processing unit 50 is further configured to extract chromaticity and luminance information of the surface of the photographed object from the first type of image or the first type and the second type of image.
The processing unit 50 also identifies the pattern formed by the projection light by comparing the differences between first type and second type images that are adjacent, close, or consecutive in time, and calculates the local depth information of the photographed object. "Local" here refers to the part of the photographed object onto which the projection light is projected, that is, the part that receives and reflects the projection light. For example, when the projection light is linear, the linear light projected onto the photographed object illuminates a linear region, and that region is the "local" part of the photographed object.
The time sequence comprises first type and second type images shot one after another. For example, at 24 frames per second, frames that are consecutive or adjacent in time include the first frame shot at 1/24 s, the second at 2/24 s, the third at 3/24 s, and the fourth at 4/24 s. Likewise, the pair formed by the first frame (1/24 s) and the fourth frame (4/24 s), in which several frames are skipped in between, also falls within the scope of the invention, as long as the acquired images are regular in time.
In an optional embodiment, the image capturing apparatus may further include a motion and posture estimation unit 60. The motion and posture estimation unit 60 obtains the image sequence from the processing unit 50, recognizes changes such as translation, scaling, and rotation by comparing images that are adjacent or close in time, detects and calculates from them changes in the relative orientation, angle, and other aspects of posture between the capture device and the photographed object, and estimates the current shooting posture and motion trend. The motion and posture estimation unit 60 transmits the calculated posture information to the processing unit 50.
Correspondingly, the processing unit 50 receives the posture information, such as the shooting angle and direction, transmitted by the motion and posture estimation unit 60. The processing unit 50 then associates the chromaticity/brightness information of the object surface, the local depth information, and the posture information, and fuses and jointly optimizes the chromaticity/brightness, local depth, and posture information obtained by shooting the same object 100 from different angles, yielding complete three-dimensional surface information with associated chromaticity and brightness. The manner of calculating depth information from the captured images is described in the background section and is not repeated here.
The timing control unit 40 and the motion and posture estimation unit 60 may be independent from the processing unit 50, or may be integrated, located on the same circuit board, or in the same component. The present invention is not particularly limited.
Fig. 5 and 6 are schematic diagrams of the acquired first type and second type images. With the scheme of the invention, a sequence of interleaved first type and second type images is obtained. The interleaving may be strictly alternating (fig. 5) or follow another rule (fig. 6). From this sequence, the processing unit obtains the position of the laser projection in the image by comparing adjacent first type and second type images, and then calculates the local depth information of the pixels at the laser projection position by triangulation.
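A minimal sketch of this comparison step, assuming 8-bit grayscale frames and hypothetical calibration values (focal distance f, laser-plane offset d, pixel pitch): the first type image is subtracted from the second type image, the brightest pixel of each row locates the imaged laser line, and each offset is converted to depth by the triangulation of fig. 2.

```python
import numpy as np

def laser_line_depths(first_img, second_img, f_mm=8.0, d_mm=50.0,
                      pixel_pitch_mm=0.005, center_col=None, min_diff=30):
    """Per-row depth (mm) of the projected laser line, from one
    first type / second type image pair (2-D uint8 arrays)."""
    diff = second_img.astype(np.int16) - first_img.astype(np.int16)
    diff = np.clip(diff, 0, None)           # keep only the added projected light
    rows, cols = diff.shape
    if center_col is None:
        center_col = cols / 2.0
    depths = np.full(rows, np.nan)
    for r in range(rows):
        c = int(np.argmax(diff[r]))
        if diff[r, c] < min_diff:           # no laser light falls on this row
            continue
        x_mm = abs(c - center_col) * pixel_pitch_mm
        if x_mm > 0:
            depths[r] = f_mm * d_mm / x_mm  # similar triangles, as in fig. 2
    return depths
```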
By shooting continuously while moving the object or the capture device so that the light sweeps the entire object surface, the depth information of all surface points can be obtained and three-dimensional surface data constructed, as shown in fig. 7. In one embodiment the projection light is linear and sweeps the surface of the photographed object from top to bottom, or from left to right; the difference between each pair of first type and second type images then yields the depth information of the part of the object onto which the linear light is projected, that is, the local depth information of the photographed object.
In alternative embodiments, the image acquisition device of the invention may further comprise an internal attitude sensor 80, typically one or a combination of accelerometers, gyroscopes, magnetometers, and the like. The attitude sensor 80 is connected to the motion and posture estimation unit 60, which combines the sensor's information to estimate the current posture more accurately.
Alternatively, the apparatus may be coupled to an external sensor 70. For example, the apparatus may be fixed to a circular track for shooting around the photographed object, and the track may provide a positioning sensor that determines the relative shooting orientation at each moment more accurately. The readings of the external sensor 70 are acquired by the processing unit 50 and used, together with the relative shooting orientations, to determine the current posture with respect to the photographed object 100 more accurately.
In the image acquisition device provided by the invention, the chromaticity/brightness information of the photographed object and the local depth information obtained from the projection light are acquired by the same image sensor through the same optical path, so the chromaticity information and local depth information of each pixel are naturally associated and need not be matched up again as in the prior art. The image acquisition rate can reach tens to hundreds of frames per second, the object moves very little between two adjacent images, and the motion can be calculated from the image sequence, so the precision requirements on equipment manufacture are low and no calibration or similar process is needed. Cost is saved and acquisition precision improved.
Second embodiment
A second embodiment of the present invention proposes an image capturing apparatus for capturing depth information and chromaticity luminance information of an object, the image capturing apparatus including:
A projection light generating unit for generating projection light and projecting the projection light to a subject;
an imaging unit including an image sensor for receiving light reflected from a photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
the time sequence control unit is in signal connection with the light path control unit and/or the projection light generation unit and is used for controlling the opening time of the light path control unit and/or the projection light generation unit so as to open the light path control unit during the projection of the projection light;
A processing unit in signal connection with the time sequence control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor, and for determining, according to the signal of the time sequence control unit, whether each image is a first type image or a second type image, wherein a first type image is an image formed by the image sensor receiving first type light reflected by the photographed object, and a second type image is an image formed by the image sensor receiving second type light reflected by the photographed object; the second type light includes the first type light and the projection light;
the processing unit acquires the chromaticity and brightness information of the photographed object from the first type image, and acquires the local depth information of the photographed object from the second type image, wherein the brightness of the projection light is more than twice the brightness of the ambient light.
The image pickup apparatus described above is similar to that of the first embodiment; only the differences are described in this embodiment, and the first embodiment may be consulted for the rest.
In one embodiment, the projected light is one or more linear projected light.
In an embodiment, the processing unit calculates the local depth information of the photographed object according to a triangulation method by comparing differences between the first type of image and the second type of image which are continuous or adjacent in time.
In an embodiment, the linear projection light is a plurality of linear projection lights having the same or different wavelengths.
In an embodiment, the processing unit further calculates the three-dimensional information and chromaticity/brightness information of the photographed object's surface from the chromaticity/brightness information and local depth information acquired over multiple captures. As described above, the processing unit stitches the local depth information acquired across captures to form the three-dimensional information of the object surface together with its chromaticity/brightness information.
In an embodiment, the image acquisition apparatus further comprises:
A motion and gesture estimation unit for providing motion gesture information to the processing unit;
and the processing unit calculates the three-dimensional information of the photographed object's surface and its associated chromaticity/brightness information from the motion posture information, chromaticity/brightness information, and local depth information obtained over multiple shots, as illustrated in the sketch below.
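A minimal sketch of this stitching step, under the assumption that each capture yields camera-frame points with their chromaticity samples and that the posture is supplied as a 4x4 camera-to-world matrix:

```python
import numpy as np

def accumulate_point_cloud(captures):
    """captures: iterable of (points_cam, colors, pose), where points_cam
    is an (N, 3) array in the camera frame, colors is the (N, 3) chroma
    sampled from the first type image, and pose is a 4x4 camera-to-world
    transform from the motion and posture estimation unit."""
    all_pts, all_cols = [], []
    for points_cam, colors, pose in captures:
        homog = np.hstack([points_cam, np.ones((len(points_cam), 1))])
        all_pts.append((pose @ homog.T).T[:, :3])  # into the world frame
        all_cols.append(colors)        # chroma stays aligned with its points
    return np.vstack(all_pts), np.vstack(all_cols)
```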
In an embodiment, the projection light is a laser, or may be one of infrared light, ultraviolet light, and blue light.
In this embodiment, the projection light is identified by comparing two adjacent images, so the influence of the ambient light reflected by the object itself is small, and projection light of various wavelengths can serve as the light source. As a specific example, in some scenes a projection source of higher brightness may be used, so that the intensity of the projected light reflected by the object far exceeds that of the reflected ambient light; in that case the image formed by the reflected linear projection light can be identified even without comparing adjacent images. Fig. 8 and 9 show an image taken with linear projection light of normal brightness and one taken with linear projection light of high intensity. As the comparison shows, because the intensity of the projected light is much greater than that of the reflected ambient light, the image of the reflected linear projection light is easily recognized. In embodiments, the brightness of the linear projection light is, for example, 2, 10, 20, or more than 50 times the brightness of the ambient light.
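A minimal sketch of this high-brightness variant: because the projected line is several times brighter than the ambient light, it can be located in a single second type image by a plain threshold, with no adjacent-image comparison. The threshold factor and the per-row median as a rough ambient-light estimate are illustrative assumptions.

```python
import numpy as np

def bright_line_columns(second_img, factor=2.0, min_margin=1.0):
    """Column of the laser line per row of a single second type image,
    or -1 where no line is found."""
    ambient = np.median(second_img, axis=1, keepdims=True)
    mask = second_img > np.maximum(factor * ambient, ambient + min_margin)
    return np.where(mask.any(axis=1), np.argmax(mask, axis=1), -1)
```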
Third embodiment
A third embodiment of the present invention proposes an image capturing apparatus for capturing depth information and chromaticity luminance information of an object, the image capturing apparatus including:
Two or more projection light generating units, each for generating projection light and projecting it onto a photographed object;
an imaging unit including an image sensor for receiving light reflected from a photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
The time sequence control unit is in signal connection with the light path control unit and/or the projection light generation unit and is used for controlling the opening time of the light path control unit and the projection light generation unit so as to open the light path control unit during the projection of the projection light;
A processing unit in signal connection with the time sequence control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor, and for determining, according to the signal of the time sequence control unit, whether each image is a first type image or a second type image, wherein a first type image is an image formed by the image sensor receiving first type light reflected by the photographed object, and a second type image is an image formed by the image sensor receiving second type light reflected by the photographed object; the second type light includes the first type light and the projection light;
the processing unit acquires the chromaticity and brightness information of the photographed object from the first type image, and acquires the local depth information of the photographed object from the second type image together with the first type image.
The third embodiment is similar to the first or second embodiment, and only the differences will be described here.
In this embodiment there are two or more projection light generating units, and they are not limited to emitting linear projection light. In one embodiment there are two projection light generating units, generating linear projection light and structured light respectively. The imaging unit may then form three types of images: a first type image formed by receiving ambient light, a second type image formed by receiving ambient light plus the linear projection light, and a third type image formed by receiving ambient light plus the structured light. The processing unit receives the reflections of the two projection lights from the reflection points of the photographed object and determines the local depth information of the photographed object from the three resulting images.
For example, both projection light generating units may first be off while the imaging unit generates the first type image, and then be turned on in turn while the imaging unit generates the second type and third type images; no limitation is intended here.
In another embodiment, the projection light generating units may each project light of a different wavelength, such as infrared, ultraviolet, or blue light. The processing unit processes the different lights and can calculate the local depth information of the photographed object.
In one embodiment, the number of the projection light generating units is two, and the projection light generating units are respectively used for generating linear projection light and structural light.
In an embodiment, the number of the projection light generating units is two, and the two projection light generating units are used for generating linear projection light with different wavelengths.
In an embodiment, the projection light generating units are turned on simultaneously or alternately.
In an embodiment, the second type image includes a first sub-class image and a second sub-class image: the first sub-class image is generated by the image sensor receiving the ambient light together with the reflected projection light emitted by one projection light generating unit onto the photographed object; the second sub-class image is generated by the image sensor receiving the ambient light together with the reflected projection light emitted by the other projection light generating unit.
In this embodiment there are two or more projection light generating units, for example one projecting infrared light and the other projecting ultraviolet light. The image sensor may be arranged to receive three kinds of reflected light in succession: first type light containing neither reflected infrared nor ultraviolet light (e.g., ambient light only), then light containing reflected infrared light (e.g., ambient plus infrared), then light containing reflected ultraviolet light (e.g., ambient plus ultraviolet). The three kinds of light each form an image on the image sensor: the first type image reflects the ambient light; the first sub-class image of the second type reflects ambient plus infrared light; and the second sub-class image of the second type reflects ambient plus ultraviolet light. Local depth information is then obtained by computing the differences between the second type images and the first type image, that is, the difference of each sub-class image against the first type image.
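A minimal sketch of separating the two sub-class images, assuming the three frames arrive as 8-bit arrays in the order described: differencing each sub-class image against the first type image isolates each projector's contribution for its own triangulation pass.

```python
import numpy as np

def projector_contributions(ambient_img, infrared_img, ultraviolet_img):
    """Isolate each projector's light from the three successive frames:
    first type (ambient), first sub-class (ambient + IR), second
    sub-class (ambient + UV)."""
    base = ambient_img.astype(np.int16)
    ir_only = np.clip(infrared_img.astype(np.int16) - base, 0, None)
    uv_only = np.clip(ultraviolet_img.astype(np.int16) - base, 0, None)
    return ir_only, uv_only   # each feeds its own triangulation pass
```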
The triangulation algorithm for obtaining local depth information from image information is described in the background art and will not be described in detail here.
In this embodiment, the image information produced by several projection lights can be collected in a single pass, and this multi-pronged approach improves the efficiency of obtaining the local depth information and three-dimensional surface shape of the measured object.
Fourth embodiment
The fourth embodiment of the present invention provides an image acquisition method, including the following steps:
S401, in a first time period, collecting a first type image, wherein the first type image is formed by the image sensor receiving first type light reflected by the photographed object;
In this step, the image sensor 22 of the imaging unit 20 of the foregoing embodiments may acquire the first type image, for example an image formed by the image sensor 22 receiving the ambient light reflected by the photographed object 100.
S402, in a second time period, turning on the projection light generating unit and collecting a second type image, wherein the second type image is formed by the image sensor receiving second type light reflected by the photographed object;
In this step, the second time period is, for example, consecutive or adjacent in time to the first time period, including adjacent periods separated by an interval; during it the projection light generating unit is turned on and the second type image is acquired. The projection light is, for example, a linear laser, so the second type image is formed by the image sensor receiving both the projection light reflected by the photographed object and the ambient light. As noted above, although the ambient light may vary with the lighting or shooting environment, the second type light always comprises the first type light plus the projection light generated by the projection light generating unit.
S403, calculating the local depth information of the photographed object from the difference between the first type image and the second type image;
In this step, the processing unit 50 may obtain a difference image from the first type and second type images, for example by subtraction, and then obtain the local depth information of the photographed object by triangulation.
S404, associating the local depth information with the chrominance and brightness information acquired from the first type image;
In this step, the processing unit 50 maps the calculated local depth information to the corresponding chromaticity and brightness information.
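Putting steps S401 through S404 together, a minimal end-to-end sketch (reusing the hypothetical projector/sensor drivers and the laser_line_depths helper sketched earlier) could be:

```python
import numpy as np

def acquire_once(projector, sensor):
    """One interleaved capture: steps S401-S404 for a single image pair."""
    projector.off()
    first = sensor.read_frame()                # S401: first type image
    projector.on()
    second = sensor.read_frame()               # S402: second type image
    projector.off()

    depths = laser_line_depths(first, second)  # S403: depth per image row
    records = []                               # S404: depth + chroma together
    for row, depth in enumerate(depths):
        if not np.isnan(depth):
            records.append({"row": row, "depth_mm": float(depth),
                            "chroma": first[row]})  # same sensor, same path
    return records
```

Because both images come from the same sensor and optical path, the association in S404 is direct: no cross-sensor calibration is needed.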
The details of the method proposed in this embodiment are the same as or similar to those of the image capturing device in the foregoing embodiment, and will not be described here again.
In an embodiment, the projected light is linear light.
In one embodiment, the projection light is a plurality of types including linear light, structured light, and the like.
In an embodiment, the projected light is a plurality of linear light having the same or different wavelengths.
In an embodiment, the method further comprises:
S405, acquiring the local depth information and chromaticity/brightness information of the photographed object collected over multiple shots, and calculating the three-dimensional information of the object's surface and its associated chromaticity/brightness information.
In this step, the processing unit 50 obtains the overall three-dimensional information of the photographed object from the local depth information and chromaticity/brightness information acquired over multiple shots, and pairs it with the associated chromaticity/brightness information.
In the image acquisition method provided by the invention, the chromaticity/brightness information of the photographed object and the local depth information obtained from the projection light are acquired by the same image sensor through the same optical path, so the chromaticity information and local depth information of each pixel are naturally associated and need not be matched up again as in the prior art. The image acquisition rate can reach tens to hundreds of frames per second, the object moves very little between two adjacent images, and the motion can be calculated from the image sequence, so the precision requirements on equipment manufacture are low and no calibration or similar process is needed. Cost is saved and acquisition precision improved.
The image acquisition apparatus and method of the present application have been described in detail above, with specific examples used to illustrate their principles and embodiments; these examples are provided only to aid understanding of the method of the present application and its core ideas. Those skilled in the art may vary the specific embodiments and application scope in accordance with the ideas of the present application, and this description should therefore not be construed as limiting the present application.
Claims (16)
1. An image acquisition apparatus for acquiring depth information and chromaticity-luminance information of a photographed object, the image acquisition apparatus comprising:
a projection light generating unit for generating projection light and projecting it onto the photographed object;
an imaging unit including an image sensor for receiving light reflected from the photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
a timing control unit in signal connection with the optical path control unit and/or the projection light generating unit, for controlling the turn-on times of the optical path control unit and/or the projection light generating unit so that the optical path control unit is open while the projection light is being projected; and
a processing unit in signal connection with the timing control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor and for determining, according to the signals of the timing control unit, whether each image is a first-type image or a second-type image, wherein a first-type image is an image formed by the image sensor receiving first-type light reflected by the photographed object, a second-type image is an image formed by the image sensor receiving second-type light reflected by the photographed object, and the second-type light includes the first-type light and the projection light;
wherein the processing unit acquires the chromaticity-luminance information of the photographed object from the first-type image, and acquires the local depth information of the photographed object from the second-type image and the first-type image, the local depth information being the depth information corresponding to the part of the photographed object onto which the projection light is projected.
2. The image acquisition apparatus according to claim 1, wherein the processing unit calculates the local depth information of the photographed object by triangulation, comparing the differences between a first-type image and a second-type image that are temporally consecutive or adjacent.
3. The image acquisition apparatus according to claim 1, wherein the optical path control unit includes a shutter that controls whether the image sensor receives light by opening and closing the optical path within the imaging unit.
4. The image acquisition apparatus according to claim 1, wherein the optical path control unit includes an optical filter that controls whether the image sensor receives the light by opening and closing the optical path in the projection light generating unit.
5. The image acquisition apparatus according to claim 1, wherein the projection light generating unit includes a projection light source for generating the projection light and a projection lens for shaping the projection light into linear projection light.
6. The image acquisition apparatus according to claim 1, wherein the projection light is one or more beams of linear projection light.
7. The image acquisition apparatus according to claim 6, wherein the linear projection light is a plurality of beams of linear projection light arranged as parallel lines, a cross, a grid, or radial lines.
8. The image acquisition apparatus according to claim 6, wherein the linear projection light is a plurality of beams of linear projection light having the same or different wavelengths.
9. The image acquisition apparatus according to claim 1, wherein the processing unit further calculates three-dimensional information and chromaticity-luminance information of the photographed object's surface from the chromaticity-luminance information and local depth information of the photographed object obtained over multiple shots.
10. The image acquisition apparatus according to claim 1, further comprising:
a motion and pose estimation unit in signal connection with the processing unit, for providing motion and pose information to the processing unit;
wherein the processing unit obtains the three-dimensional information and chromaticity-luminance information of the photographed object's surface from the motion and pose information together with the chromaticity-luminance information and local depth information of the photographed object obtained over multiple shots.
11. The image acquisition apparatus according to claim 10, wherein the motion and pose estimation unit comprises one or more of an accelerometer, a gyroscope, and a magnetometer.
12. The image acquisition apparatus according to claim 1, wherein the projection light is laser light.
13. The image acquisition apparatus according to claim 10, further comprising an external sensor in signal connection with the motion and pose estimation unit, for determining the relative shooting orientations of the image acquisition apparatus at different moments in time.
14. An image acquisition apparatus for acquiring depth information and chromaticity-luminance information of a photographed object, the image acquisition apparatus comprising:
a projection light generating unit for generating projection light and projecting it onto the photographed object;
an imaging unit including an image sensor for receiving light reflected from the photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
a timing control unit in signal connection with the optical path control unit and/or the projection light generating unit, for controlling the turn-on times of the optical path control unit and/or the projection light generating unit so that the optical path control unit is open while the projection light is being projected; and
a processing unit in signal connection with the timing control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor and for determining, according to the signals of the timing control unit, whether each image is a first-type image or a second-type image, wherein a first-type image is an image formed by the image sensor receiving first-type light reflected by the photographed object, a second-type image is an image formed by the image sensor receiving second-type light reflected by the photographed object, and the second-type light includes the first-type light and the projection light;
wherein the processing unit acquires the chromaticity-luminance information of the photographed object from the first-type image, and acquires the local depth information of the photographed object from the second-type image, the brightness of the projection light being more than twice the brightness of the first-type light, and the local depth information being the depth information corresponding to the part of the photographed object onto which the projection light is projected.
15. An image acquisition apparatus for acquiring depth information and chromaticity-luminance information of a photographed object, the image acquisition apparatus comprising:
two or more projection light generating units, each for generating projection light and projecting it onto the photographed object;
an imaging unit including an image sensor for receiving light reflected from the photographed object to form an image corresponding to the photographed object;
an optical path control unit for controlling whether the image sensor forms an image;
a timing control unit in signal connection with the optical path control unit and/or the projection light generating units, for controlling the turn-on times of the optical path control unit and the projection light generating units so that the optical path control unit is open while the projection light is being projected; and
a processing unit in signal connection with the timing control unit and the image sensor, for receiving, processing, and storing the images generated by the image sensor and for determining, according to the signals of the timing control unit, whether each image is a first-type image or a second-type image, wherein a first-type image is an image formed by the image sensor receiving first-type light reflected by the photographed object, a second-type image is an image formed by the image sensor receiving second-type light reflected by the photographed object, and the second-type light includes the first-type light and the projection light;
wherein the processing unit acquires the chromaticity-luminance information of the photographed object from the first-type image, and acquires the local depth information of the photographed object from the second-type image and the first-type image, the local depth information being the depth information corresponding to the part of the photographed object onto which the projection light is projected.
16. An image acquisition method, comprising:
during a first time period, acquiring a first-type image, the first-type image being formed by an image sensor receiving first-type light reflected by a photographed object;
during a second time period, turning on a projection light generating unit and acquiring a second-type image, the second-type image being formed by the image sensor receiving second-type light reflected by the photographed object;
calculating local depth information of the photographed object from the difference between the first-type image and the second-type image; and
associating the local depth information with the chromaticity-luminance information acquired from the first-type image;
wherein the second-type light includes the first-type light and the projection light generated by the projection light generating unit, and the local depth information is the depth information corresponding to the part of the photographed object onto which the projection light is projected.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810403358.7A CN108592886B (en) | 2018-04-28 | 2018-04-28 | Image acquisition apparatus and image acquisition method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108592886A CN108592886A (en) | 2018-09-28 |
CN108592886B (en) | 2024-04-26 |
Family
ID=63619285
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810403358.7A (granted as CN108592886B, Active) | Image acquisition apparatus and image acquisition method | 2018-04-28 | 2018-04-28 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108592886B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110532935A (en) * | 2019-08-26 | 2019-12-03 | 李清华 | A kind of high-throughput reciprocity monitoring system of field crop phenotypic information and monitoring method |
CN113112444B (en) * | 2020-01-09 | 2022-05-31 | 舜宇光学(浙江)研究院有限公司 | Ghost image detection method and system, electronic equipment and ghost image detection platform |
CN111881719B (en) * | 2020-06-09 | 2024-04-16 | 青岛奥美克生物信息科技有限公司 | Non-contact type biological recognition guiding device, method and biological feature recognition system |
CN113691791B (en) * | 2021-09-17 | 2023-10-03 | 青岛海信激光显示股份有限公司 | Laser projection device and display method of projection image thereof |
CN116416178B (en) * | 2021-12-27 | 2024-07-12 | 广州镭晨智能装备科技有限公司 | Visual inspection equipment, visual inspection system and visual inspection method for product surface defects |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102074045A (en) * | 2011-01-27 | 2011-05-25 | 深圳泰山在线科技有限公司 | System and method for projection reconstruction |
CN102867328A (en) * | 2011-01-27 | 2013-01-09 | 深圳泰山在线科技有限公司 | Object surface reconstruction system |
CN103808305A (en) * | 2012-11-07 | 2014-05-21 | 原相科技股份有限公司 | Detection system |
CN208536839U (en) * | 2018-04-28 | 2019-02-22 | 朱炳强 | Image capture device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040125205A1 (en) * | 2002-12-05 | 2004-07-01 | Geng Z. Jason | System and a method for high speed three-dimensional imaging |
- 2018-04-28: CN application CN201810403358.7A filed; granted as CN108592886B (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |