
CN112540494A - Image forming apparatus and image forming method - Google Patents

Image forming apparatus and image forming method Download PDF

Info

Publication number
CN112540494A
Authority
CN
China
Prior art keywords
depth
lenses
lens
light
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910844558.0A
Other languages
Chinese (zh)
Other versions
CN112540494B (en)
Inventor
杨萌 (Yang Meng)
李建军 (Li Jianjun)
戴付建 (Dai Fujian)
赵烈烽 (Zhao Liefeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optics Co Ltd
Original Assignee
Zhejiang Sunny Optics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optics Co Ltd filed Critical Zhejiang Sunny Optics Co Ltd
Priority to CN201910844558.0A priority Critical patent/CN112540494B/en
Publication of CN112540494A publication Critical patent/CN112540494A/en
Application granted granted Critical
Publication of CN112540494B publication Critical patent/CN112540494B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/008Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras designed for infrared light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/14Optical objectives specially designed for the purposes specified below for use with infrared or ultraviolet radiation
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B41/00Special techniques not covered by groups G03B31/00 - G03B39/00; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B9/00Exposure-making shutters; Diaphragms
    • G03B9/02Diaphragms

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides an imaging apparatus and an imaging method. The imaging apparatus includes a light source device, a light diffusion device, a lens assembly and a processing device. The light source device is used for emitting initial light; the light diffusion device is located on one side of the light source device and is used for diffusing the initial light into a plurality of light spots located in different position areas; the lens assembly includes a plurality of lenses arranged at intervals, the plurality of lenses being used for respectively imaging the reflected light spots corresponding to the plurality of light spots to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, and the lenses differ in f-number, object distance and focal length such that, in any two lenses, the lens with the larger f-number has a larger object distance and a smaller focal length than the lens with the smaller f-number; the processing device is electrically connected with the lens assembly and is used for calculating the depth of the measured object according to a plurality of phase differences corresponding to the plurality of images.

Description

Image forming apparatus and image forming method
Technical Field
The present invention relates to the field of imaging, and in particular, to an imaging apparatus and an imaging method.
Background
3D structured light is a technology in which a laser (for example, one at a near-infrared wavelength) projects an optical pattern having certain spatial and/or temporal structural features onto an object to be measured; an infrared camera then collects the reflected light and estimates the depth of each pixel from the changes in the spatial and/or temporal structural features of the reflected light.
The depth is derived directly by substituting the change in the spatial and/or temporal structural features of the reflected light into a transformation matrix that is calibrated at the factory. Because 3D structured light must actively emit a pre-designed, high-precision fixed spot pattern, it offers advantages such as applicability in dark environments, high measurement precision and high resolution, but it also has the essential defect of a short measurement distance. Once the object to be measured is far away, for example in an outdoor scene or at a depth exceeding roughly 1-10 m, the projected spots keep enlarging as they scatter, which causes defocusing and a large error in the acquired depth.
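For orientation, the geometry behind spot-based depth recovery can be sketched with the classic triangulation relation. This is an illustrative simplification, not the patent's calibrated transformation-matrix method, and all names below are ours:

```python
# Illustrative only: depth from the displacement (disparity) of a projected
# spot in a projector-camera pair, Z = f * B / d. The pipeline described
# above instead substitutes pattern changes into a factory-calibrated matrix.
def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A spot shifted by 8 px with f = 800 px and a 5 cm baseline lies at 5 m;
# the same pixel error at long range causes a much larger depth error,
# which is the defocusing/accuracy problem described above.
print(depth_from_disparity(800.0, 0.05, 8.0))  # 5.0
```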
The above information disclosed in this background section is only for enhancement of understanding of the background of the technology described herein and, therefore, certain information may be included in the background that does not form the prior art that is already known in this country to a person of ordinary skill in the art.
Disclosure of Invention
The present application mainly aims to provide an imaging apparatus and an imaging method to solve the problem of large depth error of acquisition in the prior art.
In order to achieve the above object, an imaging apparatus according to the present application includes: a light source device for emitting initial light; a light diffusion device located on one side of the light source device and used for diffusing the initial light into a plurality of light spots located in different position areas; a lens assembly including a plurality of lenses arranged at intervals, the plurality of lenses being used for respectively imaging the reflected light spots corresponding to the plurality of light spots to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, and the plurality of lenses differ in f-number, object distance and focal length, such that, in any two lenses, the lens with the larger f-number has a larger object distance than the lens with the smaller f-number and a smaller focal length than the lens with the smaller f-number; and a processing device electrically connected with the lens assembly and used for calculating the depth of the measured object according to a plurality of phase differences corresponding to the plurality of images.
Further, the light diffusion device includes a plurality of diffusion portions, and the number of the light spots formed by at least two of the diffusion portions diffusing the initial light is different.
Further, there are three of the lenses in the lens assembly, the F-numbers of the three lenses are fno1, fno2 and fno3, respectively, and fno1< fno2< fno3, the object distances of the three lenses are U1, U2 and U3, respectively, and U1< U2< U3, the focal lengths of the three lenses are F1, F2 and F3, respectively, and F1> F2> F3.
Further, the centers of any two adjacent lenses are located on a predetermined straight line, and the projection of the light source device on the predetermined straight line is located between the two corresponding lenses.
Further, the light source equipment comprises a plurality of lasers distributed in an array, and the wavelength range of light emitted by the lasers is 800-1000 nm.
Further, the processing device comprises: the first acquisition module is used for acquiring corresponding preset depth according to each image; the first calculation module is used for weighting each preset depth to obtain a preset component; and the second calculation module is used for accumulating the preset components corresponding to the images to obtain the depth.
Further, the processing device further comprises: and the determining module is used for determining the weight of each preset depth before each preset depth is weighted to obtain a preset component.
Further, the determining module is further configured to determine a weight of the corresponding predetermined depth according to the sharpness of each of the images.
Further, the determining module is further configured to determine a weight of the corresponding predetermined depth according to the sharpness of different regions of each of the images.
According to another aspect of the present application, there is provided an imaging method including: controlling to emit initial light; diffusing the initial light into a plurality of light spots located in different position areas; imaging the reflected light spots corresponding to the plurality of light spots respectively with a lens assembly to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, and the plurality of lenses differ in f-number, object distance and focal length, such that, in any two lenses, the lens with the larger f-number has a larger object distance than the lens with the smaller f-number and a smaller focal length than the lens with the smaller f-number; and calculating the depth of the measured object according to a plurality of phase differences corresponding to the plurality of images.
With the technical solution of the present application, in the imaging apparatus described above, the initial light emitted by the light source device is diffused by the light diffusion device into a plurality of light spots located in different position areas; the measured object reflects the light spots to obtain a plurality of reflected light spots, and the plurality of lenses arranged at intervals image the reflected light spots respectively to form a plurality of images. Moreover, any two lenses differ in f-number, the lens with the larger f-number having a larger object distance and a smaller focal length than the lens with the smaller f-number; the depth-of-field ranges of the formed images therefore differ, and the processing device subsequently calculates the depth of the measured object from the plurality of phase differences corresponding to the images with different depth-of-field ranges. The imaging apparatus irradiates the measured object with a plurality of light spots located in different position areas to obtain a plurality of reflected light spots, and images the reflected light spots corresponding to the light spots in the different position areas with lenses of different parameters; the depth obtained from a plurality of different images is more accurate than the depth obtained from a single image, so the error of the depth obtained in this way is smaller or even eliminated, which solves the problem in the prior art that the acquired depth error is large.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 shows a schematic structural diagram of an imaging apparatus according to an embodiment of the present application;
FIG. 2 shows a schematic flow diagram of an imaging method according to an embodiment of the present application;
fig. 3 shows an electronic device including the imaging apparatus of the present application; and
fig. 4 shows another electronic device including the imaging apparatus of the present application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be used. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be understood that when an element such as a layer, film, region, or substrate is referred to as being "on" another element, it can be directly on the other element or intervening elements may also be present. Also, in the specification and claims, when an element is described as being "connected" to another element, the element may be "directly connected" to the other element or "connected" to the other element through a third element.
As described in the background, the depth error of the prior art acquisition is large. In order to solve the above technical problem, the present application proposes an imaging apparatus and an imaging method.
In a first embodiment of the present application, there is provided an image forming apparatus, as shown in fig. 1, including:
a light source device 10 for emitting initial light;
a light diffusing device 20 located at one side of the light source device 10, the light diffusing device 20 being configured to diffuse the initial light into a plurality of light spots located in different location areas;
a lens assembly 30, including a plurality of lenses arranged at intervals, wherein the plurality of lenses respectively image the reflected light spots corresponding to the plurality of light spots to form a plurality of images, wherein the light spots are reflected by the object to be measured to form the reflected light spots, wherein the plurality of lenses have different f-numbers, different object distances and different focal lengths, and in any two of the lenses, the object distance of the lens with a larger f-number is larger than the object distance of the lens with a smaller f-number, and the focal length of the lens with a larger f-number is smaller than the focal length of the lens with a smaller f-number, and the lens assembly 30 shown in fig. 1 includes three lenses;
and a processing device 40 electrically connected to the lens assembly, wherein the processing device 40 is configured to calculate a depth of the object according to a plurality of phase differences corresponding to a plurality of the images.
In the imaging apparatus described above, the initial light emitted by the light source device is diffused by the light diffusion device into a plurality of light spots located in different position areas; the measured object reflects the light spots to obtain a plurality of reflected light spots, and the plurality of lenses arranged at intervals image the reflected light spots respectively to form a plurality of images. Moreover, any two lenses differ in f-number, the lens with the larger f-number having a larger object distance and a smaller focal length than the lens with the smaller f-number; the depth-of-field ranges of the formed images therefore differ, and the processing device subsequently calculates the depth of the measured object from the plurality of phase differences corresponding to the images with different depth-of-field ranges. The imaging apparatus irradiates the measured object with a plurality of light spots located in different position areas to obtain a plurality of reflected light spots, and images the reflected light spots corresponding to the light spots in the different position areas with lenses of different parameters; the depth obtained from a plurality of different images is more accurate than the depth obtained from a single image, so the error of the depth obtained in this way is smaller or even eliminated, which solves the problem in the prior art that the acquired depth error is large.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
The light diffusing device may be a grating, or may be a diffractive optical element having a microstructure arranged periodically or aperiodically, and a person skilled in the art may use any suitable light diffusing device to provide different detection accuracies for different objects to be measured, so as to further improve the accuracy of the obtained depth. The microstructures in the diffusing parts may be different or the same, and the periodicity or density of the microstructures may be different, so as to ensure that the number of the spots formed by at least two diffusing parts for the initial light diffusion is different.
In the third embodiment of the present application, as shown in fig. 1, there are three lenses 31 in the lens assembly 30; the F-numbers of the three lenses 31 are fno1, fno2 and fno3, respectively, with fno1 < fno2 < fno3; the object distances of the three lenses 31 are U1, U2 and U3, respectively, with U1 < U2 < U3; and the focal lengths of the three lenses 31 are F1, F2 and F3, respectively, with F1 > F2 > F3. When the optical parameters of the three lenses meet these conditions, the imaging apparatus is suitable for depth detection of different measured objects, such as face recognition, AR navigation and AR games, and those skilled in the art can select lenses with appropriate optical parameters according to the actual situation.
Specifically, the structure of each lens can be adjusted according to the actual situation. For example, each lens may include a plurality of lens elements made of glass or plastic, and the optical parameters of the lens may be adjusted through the combination of the lens elements. The f-number of a lens may range from 1 to 20, with specific values such as 1.8, 2.0, 2.5, 3.0, 4.0 …; the object distance of a lens may be any value greater than 1 cm, with specific values such as 1 cm, 5 cm, 20 cm, 1 m, 5 m, 10 m …; and the focal length of a lens may range from 0.5 to 10 cm, with specific values such as 1 cm, 2 cm, 4 cm, 6 cm, 8 cm ….
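The ordering constraints of the third embodiment are straightforward to encode. A minimal sketch, with hypothetical values chosen inside the ranges just listed (the `Lens` type and all numbers are ours):

```python
from dataclasses import dataclass

@dataclass
class Lens:
    fno: float  # f-number
    u: float    # object distance, metres
    f: float    # focal length, metres

def satisfies_ordering(lenses) -> bool:
    """For every pair: the lens with the larger f-number must have the
    larger object distance and the smaller focal length."""
    s = sorted(lenses, key=lambda lens: lens.fno)
    return all(a.u < b.u and a.f > b.f for a, b in zip(s, s[1:]))

trio = [Lens(fno=1.8, u=0.20, f=0.04),
        Lens(fno=2.5, u=1.00, f=0.02),
        Lens(fno=4.0, u=5.00, f=0.01)]
print(satisfies_ordering(trio))  # True
```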
It should be noted that the number of lenses in the present application may be any natural number greater than or equal to 2, and those skilled in the art may select an appropriate number of lenses according to the actual situation.
In a fourth embodiment of the present application, the centers of any two adjacent lenses are located on a predetermined straight line, and the projection of the light source device on the predetermined straight line is located between the two corresponding lenses. This spatial arrangement of the lenses and the light source ensures that the plurality of lenses collect enough reflected light spots, so that the formed images do not differ excessively, which further improves the accuracy of the obtained depth.
The light source device in this application may emit light of any wavelength, and those skilled in the art may adopt any suitable light source device: it may be one of various laser light sources such as a VCSEL or a semiconductor LD, or a combined structure of an LED and an optical filter, in which the broad-spectrum light of the LED is filtered to obtain the initial light. The initial light may or may not have periodicity. The light source device may further include a micro-lens array to collimate and focus the laser light, and the laser may output multiple beams or a single beam. In order to further reduce the influence of ambient light on imaging, in a fifth embodiment of the present application the light source device includes a plurality of lasers distributed in an array, and the wavelength of the light emitted by the lasers is between 800 nm and 1000 nm. The light emitted by the light source device is thus infrared, which further reduces the influence of other ambient light on imaging.
In a sixth embodiment of the present application, the processing device includes a first obtaining module, a first calculating module, and a second calculating module, where the first obtaining module is configured to obtain a corresponding predetermined depth according to each of the images; the first calculation module is configured to weight each of the predetermined depths to obtain a predetermined component; the second calculating module is configured to accumulate a plurality of the predetermined components corresponding to a plurality of the images to obtain the depth. The processing equipment calculates the depth in a weighted accumulation mode, so that the depth error can be reduced, and the more accurate depth of the measured object can be obtained. Of course, the processing device of the present application is not limited to include the above module, and may include other modules that can perform "calculating the depth of the measured object from a plurality of phase differences corresponding to a plurality of the images".
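A minimal sketch of the acquire-weight-accumulate flow of the sixth embodiment (function and variable names are ours; the patent does not prescribe an implementation):

```python
def fuse_depth(predetermined_depths, weights):
    """Weight each per-image predetermined depth into a predetermined
    component, then accumulate the components into the final depth."""
    if len(predetermined_depths) != len(weights):
        raise ValueError("one weight per image is required")
    components = [w * d for w, d in zip(weights, predetermined_depths)]
    return sum(components)

# Three images whose predetermined depths roughly agree:
print(fuse_depth([2.02, 1.98, 2.10], [0.5, 0.3, 0.2]))  # 2.024
```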
In a seventh embodiment of the present application, the processing device further includes a determining module, where the determining module is configured to determine a weight of each of the predetermined depths before each of the predetermined depths is weighted to obtain a predetermined component. It should be noted that the predetermined depths corresponding to different images are also different, and the weights of the predetermined depths are reasonably distributed, so that a more accurate depth of the measured object is obtained.
In an eighth embodiment of the present application, the determining module is further configured to determine the weight of the corresponding predetermined depth according to the sharpness of each of the images. The sharpness can be determined from the distribution of gray levels and gray-level gradients in the image area; the higher the sharpness, the more accurate the predetermined depth corresponding to the image, so the more sharply imaged image is given the larger weight, which further improves the accuracy of depth detection.
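As one way to realize sharpness-based weights, here is a sketch using the mean gray-level gradient magnitude as the sharpness score; the score and the normalization are our assumptions, not formulas given in this application:

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Mean gray-level gradient magnitude over a 2-D grayscale image."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def weights_from_sharpness(images):
    """Normalize per-image sharpness so the weights sum to 1; the
    sharper the image, the larger its weight."""
    scores = np.array([sharpness(im) for im in images])
    return (scores / scores.sum()).tolist()  # assumes at least one
                                             # non-constant image
```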
In an actual detection process, the sharpness of different regions of each image differs. In order to further reduce the depth error, in a ninth embodiment of the present application the determining module is further configured to determine the weight of the corresponding predetermined depth according to the sharpness of the different regions of each image: the higher the sharpness of a region, the larger the weight of its corresponding predetermined depth, and the lower the sharpness, the smaller the weight. This reduces the error introduced by noise points, blank points or erroneous data points of the image when the depth is calculated, so that a more accurate depth of the measured object is obtained.
It should be noted that, when the determining module determines the weight of the predetermined depth corresponding to each image, the definition of different regions of each image is not the only criterion, and in some special cases, the weight of the predetermined depth needs to be adjusted to further reduce the depth error.
In a tenth embodiment of the present application, the determining module is further configured to compare errors or contrasts of the plurality of images, and according to the comparison result, increase the weight of the depth corresponding to a first partial image and decrease the weight of the depth corresponding to a second partial image, wherein the error of the first partial image is smaller than the error of the predetermined image and the error of the second partial image is larger than the error of the predetermined image; and, according to the comparison result, increase the weight of the depth corresponding to a third partial image and decrease the weight of the depth corresponding to a fourth partial image, wherein the contrast of the third partial image is greater than that of the predetermined image and the contrast of the fourth partial image is less than that of the predetermined image. The predetermined image is an image formed by a predetermined lens in the lens assembly; the focal length of the predetermined lens is greater than the minimum focal length in the lens assembly and less than the maximum focal length, and the object distance of the predetermined lens is greater than the minimum object distance in the lens assembly and less than the maximum object distance. Specifically, one lens is selected as the predetermined lens among the plurality of lenses. For example, with three lenses whose object distances are U1, U2 and U3 (U1 < U2 < U3) and whose focal lengths are F1, F2 and F3 (F1 > F2 > F3), the lens with object distance U2 and focal length F2 is selected as the predetermined lens. Let the error of the image formed by the predetermined lens be a2 and its contrast be b2, and let the errors of the images formed by the other two lenses be a1 and a3 and their contrasts be b1 and b3. Comparing a1 and a3 with a2: when a1 > a2 and a3 > a2, the weight of the depth corresponding to the image formed by the predetermined lens is increased; when a1 < a2 and a3 > a2, the weight of the depth corresponding to the image with error a1 is increased; when a1 > a2 and a3 < a2, the weight of the depth corresponding to the image with error a3 is increased; and when a1 < a2 and a3 < a2, the weights of the depths corresponding to the images with errors a1 and a3 are both increased. Comparing b1 and b3 with b2: when b1 < b2 and b3 < b2, the weight of the depth corresponding to the image formed by the predetermined lens is increased; when b1 < b2 and b3 > b2, the weight of the depth corresponding to the image with contrast b3 is increased; when b1 > b2 and b3 < b2, the weight of the depth corresponding to the image with contrast b1 is increased; and when b1 > b2 and b3 > b2, the weights of the depths corresponding to the images with contrasts b1 and b3 are both increased.
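A sketch of the error branch of this comparison; the step size `delta` is an arbitrary illustrative choice, and the contrast branch is the mirror image with the inequalities reversed, since a smaller error but a larger contrast is what earns extra weight:

```python
def adjust_weights_by_error(w, a1, a2, a3, delta=0.1):
    """w = [w1, w2, w3]: weights of the depths from the three images;
    a2 is the error of the predetermined lens's image."""
    if a1 > a2 and a3 > a2:      # predetermined image has the least error
        w[1] += delta
    elif a1 < a2 and a3 > a2:    # only image 1 beats it
        w[0] += delta
    elif a1 > a2 and a3 < a2:    # only image 3 beats it
        w[2] += delta
    else:                        # both side images beat it
        w[0] += delta
        w[2] += delta
    return w
```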
In an eleventh embodiment of the application, the determining module is further configured to obtain a corresponding depth image from each of the images, obtain the depth gradient of each depth image, compare the depth gradients, and increase or decrease the weight of the depth corresponding to part of the depth images according to the comparison result. Specifically, the depth gradient of each depth image is obtained, giving a maximum depth gradient Tmax and a minimum depth gradient Tmin, and the two are averaged to obtain T0 = (Tmax + Tmin)/2; the weight of the depth corresponding to a depth gradient smaller than T0 is increased, and the weight of the depth corresponding to a depth gradient larger than T0 is decreased. For example, suppose the depth gradients of the three depth images are T1, T2 and T3 with T1 < T2 < T3, so that T0 = (T1 + T3)/2. When T2 > T0, the weight of the depth corresponding to the image with depth gradient T1 is increased; when T2 < T0, the weights of the depths corresponding to the images with depth gradients T1 and T2 are both increased.
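A sketch of this depth-gradient rule (`delta` and the starting weights are arbitrary illustrative choices):

```python
def adjust_weights_by_gradient(weights, gradients, delta=0.1):
    """T0 is the mean of the extreme depth gradients; images whose depth
    map varies gently (gradient < T0) gain weight, steep ones lose it."""
    t0 = (max(gradients) + min(gradients)) / 2.0
    for i, t in enumerate(gradients):
        if t < t0:
            weights[i] += delta
        elif t > t0:
            weights[i] -= delta
    return weights

# T1=0.2, T2=0.9, T3=1.0 -> T0=0.6: only the first image gains weight.
print(adjust_weights_by_gradient([1/3, 1/3, 1/3], [0.2, 0.9, 1.0]))
```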
It should be noted that, taking three lenses whose object distances are U1, U2 and U3 (U1 < U2 < U3) and whose focal lengths are F1, F2 and F3 (F1 > F2 > F3), with the lens of object distance U2 and focal length F2 selected as the predetermined lens, the predetermined depth corresponding to the image formed by the predetermined lens is d2 and the predetermined depths corresponding to the images formed by the other two lenses are d1 and d3. When the determining module determines the weight of the predetermined depth corresponding to each image, it may give the predetermined depth d2 a larger initial weight, and then decide whether to relatively increase or decrease the weights of d1 and d3 (or of partial regions) according to the sharpness, the error, the depth gradient and the like. Specifically, the depth of the measured object is calculated as D = w1·d1 + w2·d2 + w3·d3, where D is the depth of the measured object and w1, w2 and w3 are the weights of the predetermined depths d1, d2 and d3, respectively.
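A quick numeric illustration of this formula, with hypothetical depths and the predetermined lens's weight w2 made largest by default:

```python
w1, w2, w3 = 0.25, 0.50, 0.25   # weights, summing to 1
d1, d2, d3 = 2.05, 2.00, 1.95   # predetermined depths in metres
D = w1 * d1 + w2 * d2 + w3 * d3
print(D)  # ≈ 2.0
```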
In a twelfth embodiment of the present application, there is also provided an imaging method, as shown in fig. 2, including:
step S101, controlling to emit initial light;
step S102, diffusing the initial light into a plurality of light spots located in different position areas;
step S103, imaging the reflected light spots corresponding to the plurality of light spots respectively with the lens assembly to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, and the plurality of lenses differ in f-number, object distance and focal length, such that, in any two lenses, the lens with the larger f-number has a larger object distance than the lens with the smaller f-number and a smaller focal length than the lens with the smaller f-number;
step S104 is to calculate the depth of the object based on the plurality of phase differences corresponding to the plurality of images.
In the imaging method, the initial light emitted by the light source device is diffused by the light diffusion device into a plurality of light spots located in different position areas; the measured object reflects the light spots to obtain a plurality of reflected light spots, and the plurality of lenses arranged at intervals image the reflected light spots respectively to form a plurality of images. Moreover, any two lenses differ in f-number, the lens with the larger f-number having a larger object distance and a smaller focal length than the lens with the smaller f-number; the depth-of-field ranges of the formed images therefore differ, and the processing device subsequently calculates the depth of the measured object from the plurality of phase differences corresponding to the images with different depth-of-field ranges. The imaging method irradiates the measured object with a plurality of light spots located in different position areas to obtain a plurality of reflected light spots, and images the reflected light spots corresponding to the light spots in the different position areas with lenses of different parameters; the depth obtained from a plurality of different images is more accurate than the depth obtained from a single image, so the error of the depth obtained in this way is smaller or even eliminated, which solves the problem in the prior art that the acquired depth error is large.
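A high-level sketch of steps S101-S104 as one pipeline; every callable below is a stand-in for hardware or signal processing that the method itself leaves unspecified:

```python
def imaging_method(emit, diffuse, capture_all, depths_from_phase, weigh):
    light = emit()                      # S101: emit initial light
    spots = diffuse(light)              # S102: diffuse into spots in
                                        #       different position areas
    images = capture_all(spots)         # S103: one image per lens
    depths = depths_from_phase(images)  # phase differences -> per-image
                                        #       predetermined depths
    weights = weigh(images, depths)     # e.g. sharpness-based weights
    return sum(w * d for w, d in zip(weights, depths))  # S104: fuse
```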
In a thirteenth embodiment of the present application, the density of the light spots is different in different position areas. Since the microstructures in the diffusing parts of the diffusing device may be different or the same, and the periodicity or density of the microstructures may be different, the different diffusing parts may have different location areas, different numbers and different densities for the light spots formed by the initial light diffusion.
In a fourteenth embodiment of the present application, calculating the depth of the object to be measured from a plurality of images formed by the lens assembly includes: acquiring corresponding preset depth according to each image; weighting each preset depth to obtain a preset component; and accumulating a plurality of preset components corresponding to a plurality of images to obtain the depth. The imaging method calculates the depth in a weighted accumulation mode, so that the more accurate depth of the measured object is obtained. Of course, the imaging method of the present application is not limited to the method of calculating the depth of the object, and may include another method of "calculating the depth of the object based on a plurality of phase differences corresponding to a plurality of images".
In a fifteenth embodiment of the present application, the weight of each of the predetermined depths is determined before each of the predetermined depths is weighted to obtain a predetermined component. It should be noted that the predetermined depths corresponding to different images are also different, and the weights of the predetermined depths are reasonably distributed, so that a more accurate depth of the measured object is obtained.
In a sixteenth embodiment of the present application, determining a weight for each of the predetermined depths includes: and determining the weight of the corresponding preset depth according to the definition of each image. The definition can be determined by the distribution of the gray level and the gray level gradient in the image area, and the higher the definition is, the more accurate the predetermined depth corresponding to the image is, so that the sharper the imaged image is, the higher the weight is, and the accuracy of the obtained depth is further improved.
In order to further reduce the depth error, in a seventeenth embodiment of the present application, determining the weight of each predetermined depth includes: determining the weight of the corresponding predetermined depth according to the definition of different regions of each image. That is, a region of higher definition receives a larger weight for its corresponding predetermined depth, and a region of lower definition receives a smaller weight, which reduces the error caused by noise points, blank points or data error points of the image when the depth is calculated, so that a more accurate depth of the measured object is obtained.
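One way to realize such region-wise weighting is a per-block sharpness map; the block size and the gradient-energy score below are our assumptions:

```python
import numpy as np

def regional_weight_map(img: np.ndarray, block: int = 16) -> np.ndarray:
    """Per-block weights for a 2-D grayscale image: each block's weight
    is proportional to its local gray-gradient energy, so noisy or blank
    regions contribute little to the fused depth."""
    gy, gx = np.gradient(img.astype(float))
    energy = np.hypot(gx, gy)
    hb, wb = img.shape[0] // block, img.shape[1] // block
    blocks = energy[:hb * block, :wb * block].reshape(hb, block, wb, block)
    m = blocks.mean(axis=(1, 3))
    return m / m.sum()  # assumes the image is not perfectly uniform
```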
It should be noted that, when the determining module determines the weight of the predetermined depth corresponding to each image, the definition of different regions of each image is not the only criterion, and in some special cases, the weight of the predetermined depth needs to be adjusted to further reduce the depth error.
In an eighteenth embodiment of the present application, determining the weight of each of the predetermined depths includes: when the predetermined depth corresponding to a predetermined image is larger than a predetermined threshold and has an error, increasing the weights of the predetermined depths smaller than the predetermined threshold, wherein the predetermined image is an image formed by a predetermined lens in the lens assembly, the focal length of the predetermined lens is larger than the minimum focal length of the lens assembly and smaller than the maximum focal length, and the object distance of the predetermined lens is larger than the minimum object distance of the lens assembly and smaller than the maximum object distance; or, when the depth corresponding to the predetermined image has an error and is smaller than the predetermined threshold, increasing the weights of the predetermined depths larger than the predetermined threshold. Specifically, one lens is selected from the plurality of lenses as the predetermined lens. For example, with three lenses whose object distances are U1, U2 and U3 (U1 < U2 < U3) and whose focal lengths are F1, F2 and F3 (F1 > F2 > F3), the lens with object distance U2 and focal length F2 is selected as the predetermined lens; the predetermined depth corresponding to the image formed by this lens is d2, and the predetermined depths corresponding to the images formed by the other two lenses are d1 and d3. d1 and d3 are compared with a predetermined threshold d. When d2 > d and d2 has an error, the weight of d1 is increased if d1 < d and d3 > d, the weight of d3 is increased if d1 > d and d3 < d, and the weights of d1 and d3 are increased if d1 < d and d3 < d. Alternatively, when d2 < d and d2 has an error, the weight of d3 is increased if d1 < d and d3 > d, the weight of d1 is increased if d1 > d and d3 < d, and the weights of d1 and d3 are increased if d1 > d and d3 > d.
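A sketch of this threshold rule (names and the step size are assumptions):

```python
def adjust_weights_by_threshold(w, d1, d2, d3, d, delta=0.1):
    """d2 is the predetermined lens's depth, known to be erroneous;
    shift weight to the depths lying on the other side of threshold d."""
    if d2 > d:        # erroneous and above the threshold
        if d1 < d:
            w[0] += delta
        if d3 < d:
            w[2] += delta
    elif d2 < d:      # erroneous and below the threshold
        if d1 > d:
            w[0] += delta
        if d3 > d:
            w[2] += delta
    return w
```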
In a nineteenth embodiment of the present application, determining the weight of each of the predetermined depths includes: acquiring a corresponding depth image according to each image; acquiring the depth gradient of each depth image; increasing the weight of the predetermined depth corresponding to the depth gradient smaller than a predetermined gradient when the other depth gradients are larger than the predetermined gradient; and increasing the weight of the predetermined depth corresponding to the depth gradient larger than the predetermined gradient when the other depth gradients are smaller than the predetermined gradient. Specifically, suppose the depth gradients of the three depth images are T1, T2 and T3; the predetermined depth corresponding to the depth image with gradient T1 is d1, that with gradient T2 is d2, that with gradient T3 is d3, and the predetermined gradient is T. The weight of d3 is increased when T1 > T, T2 > T and T3 < T; or the weight of d1 is increased when T3 < T, T2 < T and T1 > T.
It should be noted that, taking three lenses whose object distances are U1, U2 and U3 (U1 < U2 < U3) and whose focal lengths are F1, F2 and F3 (F1 > F2 > F3), with the lens of object distance U2 and focal length F2 selected as the predetermined lens, the predetermined depth corresponding to the image formed by the predetermined lens is d2 and the predetermined depths corresponding to the images formed by the other two lenses are d1 and d3. When the weight of the predetermined depth corresponding to each image is determined, the predetermined depth d2 may be given a larger initial weight, and it may then be decided whether to relatively increase or decrease the weights of d1 and d3 (or of partial regions) according to the sharpness, the error, the depth gradient and the like. Specifically, the depth of the measured object is calculated as D = w1·d1 + w2·d2 + w3·d3, where D is the depth of the measured object and w1, w2 and w3 are the weights of the predetermined depths d1, d2 and d3, respectively.
The processing device comprises a processor and a memory, wherein the first acquiring module, the first calculating module, the second calculating module, the determining module and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided, and the problem of large acquired depth error in the prior art is addressed by adjusting kernel parameters.
The memory may include forms of volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
A twentieth embodiment of the present application provides a storage medium having stored thereon a program that, when executed by a processor, implements the imaging method described above.
A twenty-first embodiment of the present application provides a processor for executing a program, wherein the program executes the imaging method.
A twenty-second embodiment of the present application provides an apparatus comprising a light source apparatus, a light diffusion apparatus, a lens assembly, a processor, a memory, and a program stored on the memory and executable on the processor, the processor implementing at least the following steps when executing the program:
calculating the depth of the object to be measured from a plurality of phase differences corresponding to a plurality of images, and specifically, calculating the depth of the object to be measured from a plurality of phase differences corresponding to a plurality of images includes:
acquiring corresponding preset depth according to each image;
weighting each preset depth to obtain a preset component;
and accumulating a plurality of preset components corresponding to the plurality of images to obtain the depth.
The device herein may be a server, a PC, a tablet (PAD), a mobile phone, etc., for example the mobile phone shown in fig. 3 or the tablet shown in fig. 4.
The present application further provides a computer program product adapted to perform a program of initializing at least the following method steps when executed on a data processing device:
calculating the depth of the object to be measured from a plurality of phase differences corresponding to a plurality of images, and specifically, calculating the depth of the object to be measured from a plurality of phase differences corresponding to a plurality of images includes:
acquiring corresponding preset depth according to each image;
weighting each preset depth to obtain a preset component;
and accumulating a plurality of preset components corresponding to the plurality of images to obtain the depth.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
From the above description, it can be seen that the above-described embodiments of the present application achieve the following technical effects:
1) In the imaging apparatus, the initial light emitted by the light source device is diffused by the light diffusion device into a plurality of light spots located in different position areas; the measured object reflects the light spots to obtain a plurality of reflected light spots, and the plurality of lenses arranged at intervals image the reflected light spots respectively to form a plurality of images, the lens with the larger f-number having a larger object distance and a smaller focal length than the lens with the smaller f-number; the processing device calculates the depth of the measured object according to the plurality of phase differences corresponding to the plurality of images. The imaging apparatus irradiates the measured object with a plurality of light spots located in different position areas to obtain a plurality of reflected light spots, and the reflected light spots corresponding to the light spots in the different position areas are imaged by lenses of different parameters, so the phases of the formed images reflect the depth of the measured object more accurately; this reduces the error in calculating the depth of the measured object from the plurality of phase differences corresponding to the plurality of images, and solves the problem of large acquired depth error in the prior art.
2) In the imaging method, the initial light emitted by the light source device is diffused by the light diffusion device into a plurality of light spots located in different position areas; the measured object reflects the light spots to obtain a plurality of reflected light spots, and the plurality of spaced lenses image the reflected light spots respectively, the lens with the larger f-number having a larger object distance and a smaller focal length than the lens with the smaller f-number, so that a plurality of images are formed; the depth of the measured object is calculated by the processing device according to the plurality of phase differences corresponding to the images. The imaging method irradiates the measured object with a plurality of light spots located in different position areas to obtain a plurality of reflected light spots, and the reflected light spots corresponding to the light spots in the different position areas are imaged by lenses of different parameters, so the phases of the formed images reflect the depth of the measured object more accurately; this reduces the error in calculating the depth of the measured object from the plurality of phase differences corresponding to the plurality of images, and solves the problem of large acquired depth error in the prior art.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An image forming apparatus, comprising:
a light source device for emitting initial light;
the light diffusion device is positioned on one side of the light source device and is used for diffusing the initial light into a plurality of light spots positioned in different position areas;
the lens assembly comprises a plurality of lenses arranged at intervals, the plurality of lenses being used for respectively imaging the reflected light spots corresponding to the plurality of light spots to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, and the plurality of lenses differ in f-number, object distance and focal length, such that, in any two lenses, the lens with the larger f-number has a larger object distance than the lens with the smaller f-number and a smaller focal length than the lens with the smaller f-number;
and the processing device is electrically connected with the lens assembly and used for calculating the depth of the measured object according to a plurality of phase differences corresponding to a plurality of images.
2. The imaging apparatus according to claim 1, wherein the light diffusing device includes a plurality of diffusing portions, and the number of the light spots formed by at least two of the diffusing portions diffusing the initial light is different.
3. The imaging apparatus according to claim 1 or 2, wherein there are three of said lenses in said lens assembly, the F-numbers of the three lenses are fno1, fno2 and fno3, respectively, with fno1 < fno2 < fno3; the object distances of the three lenses are U1, U2 and U3, respectively, with U1 < U2 < U3; and the focal lengths of the three lenses are F1, F2 and F3, respectively, with F1 > F2 > F3.
4. An imaging apparatus according to claim 3, wherein centers of any two adjacent lenses are located on a predetermined straight line, and a projection of the light source device on the predetermined straight line is located between the corresponding two lenses.
5. The imaging apparatus according to claim 1, wherein the light source device comprises a plurality of lasers distributed in an array, and the wavelength of the light emitted by the lasers is in the range of 800 nm to 1000 nm.
6. The imaging apparatus of claim 1, wherein the processing device comprises:
a first acquisition module for acquiring a corresponding preset depth from each of the images;
a first calculation module for weighting each preset depth to obtain a corresponding preset component;
and a second calculation module for accumulating the preset components corresponding to the images to obtain the depth.
7. The imaging apparatus of claim 6, wherein the processing device further comprises:
and a determining module for determining the weight of each preset depth before the preset depth is weighted to obtain the corresponding preset component.
8. The imaging apparatus of claim 7, wherein the determining module is further configured to determine the weight of the corresponding preset depth according to the sharpness of each of the images.
9. The imaging apparatus of claim 7, wherein the determining module is further configured to determine the weight of the corresponding preset depth according to the sharpnesses of different regions of each of the images.
10. An imaging method, comprising:
controlling emission of initial light;
diffusing the initial light into a plurality of light spots located in different position areas;
imaging, with a lens assembly, the reflected light spots corresponding to the plurality of light spots respectively to form a plurality of images, wherein the light spots are reflected by a measured object to form the reflected light spots, the plurality of lenses of the lens assembly differ from one another in f-number, object distance and focal length, and, for any two of the lenses, the lens with the larger f-number has the larger object distance and the smaller focal length;
and calculating the depth of the measured object according to a plurality of phase differences corresponding to the plurality of images.
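To make the parameter ordering of claim 3 concrete, the sketch below assigns hypothetical lens values satisfying fno1 < fno2 < fno3, U1 < U2 < U3 and F1 > F2 > F3, and applies the Gaussian thin-lens equation 1/F = 1/U + 1/V to compute each lens's in-focus image distance V. All numbers are illustrative assumptions; the disclosure does not specify them:

    # Hypothetical parameters consistent with claim 3 (distances in mm):
    # the larger f-number pairs with the larger object distance and the
    # smaller focal length.
    lenses = [
        {"fno": 1.4, "U": 300.0,  "F": 4.0},   # lens 1
        {"fno": 2.0, "U": 600.0,  "F": 3.5},   # lens 2
        {"fno": 2.8, "U": 1200.0, "F": 3.0},   # lens 3
    ]

    for i, lens in enumerate(lenses, start=1):
        # Thin-lens equation: 1/F = 1/U + 1/V  =>  V = 1 / (1/F - 1/U)
        V = 1.0 / (1.0 / lens["F"] - 1.0 / lens["U"])
        print(f"lens {i}: fno={lens['fno']}, in-focus image distance V = {V:.3f} mm")

Because each lens is thereby focused at a different object distance, each of the plurality of images is sharpest over a different depth band, which is what makes the sharpness-based weighting of claims 8 and 9 informative.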
CN201910844558.0A 2019-09-06 2019-09-06 Image forming apparatus and image forming method Active CN112540494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910844558.0A CN112540494B (en) 2019-09-06 2019-09-06 Image forming apparatus and image forming method

Publications (2)

Publication Number Publication Date
CN112540494A true CN112540494A (en) 2021-03-23
CN112540494B CN112540494B (en) 2022-05-03

Family

ID=75012167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910844558.0A Active CN112540494B (en) 2019-09-06 2019-09-06 Image forming apparatus and image forming method

Country Status (1)

Country Link
CN (1) CN112540494B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011164094A (en) * 2010-02-12 2011-08-25 Samsung Electronics Co Ltd Depth estimation method of depth sensor and recording medium of the same
JP2015132546A * 2014-01-14 2015-07-23 Sony Corporation Information processing apparatus and method
TW201607313A * 2014-08-15 2016-02-16 Lite-On Technology Corporation Image capturing system obtaining scene depth information and focusing method thereof
US20170069097A1 (en) * 2015-09-04 2017-03-09 Apple Inc. Depth Map Calculation in a Stereo Camera System
US20170078559A1 (en) * 2014-05-15 2017-03-16 Indiana University Research And Technology Corporation Three dimensional moving pictures with a single imager and microfluidic lens
CN107396080A (en) * 2016-05-17 2017-11-24 纬创资通股份有限公司 Method and system for generating depth information
CN108351199A (en) * 2015-11-06 2018-07-31 富士胶片株式会社 Information processing unit, information processing method and program
CN109544618A (en) * 2018-10-30 2019-03-29 华为技术有限公司 A kind of method and electronic equipment obtaining depth information

Also Published As

Publication number Publication date
CN112540494B (en) 2022-05-03

Similar Documents

Publication Publication Date Title
CN108957911B (en) Speckle structure light projection module and 3D degree of depth camera
EP3176601B1 (en) Illumination light projection for a depth camera
EP2885672B1 (en) Illumination light shaping for a depth camera
CN111263899B (en) Lighting device, time-of-flight system and method
US8773570B2 (en) Image processing apparatus and image processing method
KR20090117399A (en) Lenses with Extended Depth of Focus and Optical Systems Including the Same
KR102046944B1 (en) Sub-resolution optical detection
WO2018057711A1 (en) Non-contact coordinate measuring machine using hybrid cyclic binary code structured light
KR20170086570A (en) Multiple pattern illumination optics for time of flight system
JP6112909B2 (en) Shape measuring device and shape measuring method using Shack-Hartmann sensor
FR3092915B1 (en) Optical metasurfaces, associated manufacturing processes and systems
US10362235B2 (en) Processing apparatus, processing system, image pickup apparatus, processing method, and storage medium
CN112540494B (en) Image forming apparatus and image forming method
US20160041063A1 (en) Light spot centroid position acquisition method for wavefront sensor, wavefront measurement method, wavefront measurement apparatus and storage medium storing light spot centroid position acquisition program
CN110186387B (en) Depth detection method, device and system
EP3009790A1 (en) Slope data processing method, slope data processing apparatus and measurement apparatus
JP7662623B2 Projector for illuminating at least one object
CN111932598B (en) Depth image construction method
CN210639650U (en) Construction system and electronic equipment of depth image
CN115542537A (en) Super-surface design method, super-surface, projection device and sweeping robot
CN113052887A (en) Depth calculation method and system
KR102774465B1 Apparatus for three-dimensional shape measurement
US20250146901A1 (en) Phase detection device and method for optical element
CN114627174B (en) Depth map generation system, method and autonomous mobile device
EP3637044B1 (en) Multi-image projector and electronic device having the multi-image projector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant