
WO2021121540A1 - Depth estimation system - Google Patents

Depth estimation system

Info

Publication number
WO2021121540A1
Authority
WO
WIPO (PCT)
Prior art keywords
electromagnetic radiation
spectrum
estimation system
organic layer
depth estimation
Application number
PCT/EP2019/085292
Other languages
French (fr)
Inventor
Radu Ciprian Bilcu
Mikko Muukki
Original Assignee
Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to CN201980103040.XA (published as CN114829977A)
Priority to PCT/EP2019/085292 (published as WO2021121540A1)
Publication of WO2021121540A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/46 Indirect determination of position data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K 39/00 Integrated devices, or assemblies of multiple devices, comprising at least one organic radiation-sensitive element covered by group H10K30/00
    • H10K 39/30 Devices controlled by radiation
    • H10K 39/32 Organic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A depth estimation system (1) for capturing color and distance information from a point (P), the depth estimation system (1) comprising a light emitter (2) and a multi-layered imaging sensor (3), the multi-layered imaging sensor (3) being adapted for absorbing electromagnetic radiation (R) within a plurality of discrete electromagnetic spectrums. This enables capturing different types of information such as distance information and color information for a 3D data point.

Description

DEPTH ESTIMATION SYSTEM
TECHNICAL FIELD
The disclosure relates to a depth estimation system comprising a light emitter.
BACKGROUND
Depth sensing systems are becoming common imaging features in electronic devices such as smartphones. They enhance camera operations, for instance laser-based focusing, and are used for scene three-dimensional (3D) mapping, where a 3D model of the surrounding scene is built, in AR/VR applications, and in various scene-object analysis solutions. The 3D coordinates of the scene points can be calculated from the depth information when the intrinsic parameters of the capturing system are known.
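As a brief illustration of that last step, the sketch below back-projects a depth map to 3D camera-space points using a standard pinhole model; the intrinsic parameters fx, fy, cx, cy and the function name are illustrative assumptions, not values or interfaces from this disclosure.

```python
# Sketch: 3D scene coordinates from depth plus the capturing system's
# intrinsic parameters (standard pinhole relation; names are illustrative).
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """depth: (H, W) distances along the optical axis; returns (H, W, 3) camera-space points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # per-pixel column and row coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)
```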
One example of such depth sensing solutions comprises time-of-flight (ToF) sensors, where an infrared (IR) wave is projected onto the scene. The reflections from the scene are captured by an IR sensor, which outputs, for each pixel, the time taken for the light to travel from the IR generator to the IR sensor. A further example comprises structured light (SL) sensors. These modules calculate depth from the position, on an IR sensor, of a known IR pattern reflected by the scene, also known as the triangulation principle.
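Both principles reduce to short textbook relations, sketched below for orientation; this is background arithmetic rather than the system claimed here, and the parameter names are illustrative.

```python
# Sketch: the depth relations behind ToF and SL sensing (illustrative only).
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_time_s: float) -> float:
    """ToF: light travels emitter -> scene -> sensor, so halve the round trip."""
    return C * round_trip_time_s / 2.0

def sl_depth(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """SL triangulation: depth from the shift of a projected dot on the sensor."""
    return baseline_m * focal_px / disparity_px
```

For example, a measured round trip of 10 ns gives tof_depth(10e-9) ≈ 1.5 m.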
However, these solutions have disadvantages. ToF solutions often encounter multipath problems, due to the continuous-wave detection method used. SL solutions usually comprise a laser emitter and an IR camera which captures light only in a narrow wavelength band around the wavelength of the laser emitter, in order to minimize the effect of ambient light and prevent sensor saturation. As a consequence, the IR camera module, which contains the IR sensor, must be equipped with a narrow IR passband filter. To obtain other information, such as color information, a further imaging sensor is placed near the IR sensor. Because the IR sensor and the color sensor see slightly different views of the scene, a method is needed to identify the correspondence between the detected dots in the IR image and their positions in the color image. Calculating this correspondence, and the geometric calibration of the two cameras, introduces extra computations and extra sources of error.
SUMMARY
It is an object to provide an improved depth estimation system. The foregoing and other objects are achieved by the features of the independent claim. Further implementation forms are apparent from the dependent claims, the description, and the figures.
According to a first aspect, there is provided a depth estimation system for capturing color and distance information from a point, the depth estimation system comprising a light emitter and a multi-layered imaging sensor, the multi-layered imaging sensor being adapted for absorbing electromagnetic radiation within a plurality of discrete electromagnetic spectrums.
Such a system enables capturing different types of information for each 3D data point, without having to align the different types of information. By using the same sensor, there is no need for extra calibration, except the calibration steps of a standard SL system.
In a possible implementation form of the first aspect, layers of the multi-layered imaging sensor are stacked on top of each other, each discrete layer absorbing electromagnetic radiation within a predefined, discrete spectrum. This allows each color component, for a certain pixel, to be collected from one and the same horizontal and vertical position, but at different levels inside the sensor structure. As a consequence, the color information will be aligned with the distance information, and pixel saturation is eliminated.
In a further possible implementation form of the first aspect, the electromagnetic radiation comprises first electromagnetic radiation and second electromagnetic radiation, the first electromagnetic radiation originating directly from the object, and the second electromagnetic radiation being emitted by the light emitter and reflected off the point. This enables capturing color information for each 3D data point, without having to align the 3D data and the color image. By using the same sensor for capturing the color information and the distance information, a solution which is robust to the pixel saturation effect is achieved, since the detection of each color component happens at a different level of the pixel.
In a further possible implementation form of the first aspect, the first electromagnetic radiation is visible light. In a further possible implementation form of the first aspect, the depth estimation system further comprises a diffractive optical element, diffracting a beam of electromagnetic radiation emitted by the light emitter into a plurality of individual rays, one of the rays being reflected off the point towards the multi-layered imaging sensor.
In a further possible implementation form of the first aspect, the depth estimation system is a structured light depth sensing system.
In a further possible implementation form of the first aspect, the light emitter is a laser emitter.
In a further possible implementation form of the first aspect, the second electromagnetic radiation is infrared radiation.
In a further possible implementation form of the first aspect, the multi-layered imaging sensor comprises a plurality of organic layers.
In a further possible implementation form of the first aspect, the multi-layered imaging sensor comprises a first organic layer, a second organic layer, a third organic layer, and a silicon layer, the first organic layer absorbing electromagnetic radiation within a first spectrum, the second organic layer absorbing electromagnetic radiation within a second spectrum, the third organic layer absorbing electromagnetic radiation within a third spectrum, and the silicon layer absorbing electromagnetic radiation within a fourth spectrum.
In a further possible implementation form of the first aspect, the multi-layered imaging sensor comprises a first organic layer, a second organic layer, a third organic layer, and a fourth organic layer, the first organic layer absorbing electromagnetic radiation within a first spectrum, the second organic layer absorbing electromagnetic radiation within a second spectrum, the third organic layer absorbing electromagnetic radiation within a third spectrum, the fourth organic layer absorbing electromagnetic radiation within a fourth spectrum.
In a further possible implementation form of the first aspect, the first electromagnetic spectrum, the second electromagnetic spectrum, and the third electromagnetic spectrum are within the range of 400-700 nm. In a further possible implementation form of the first aspect, the fourth electromagnetic spectrum is within the range of 800-2500 nm.
According to a second aspect, there is provided an electronic device comprising the depth estimation system according to the above.
These and other aspects will be apparent from the embodiments described below.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following detailed portion of the present disclosure, the aspects, embodiments and implementations will be explained in more detail with reference to the example embodiments shown in the drawings, in which:
Fig. 1 shows a schematic illustration of a depth estimation system in accordance with an embodiment of the present invention;
Fig. 2 shows a schematic perspective view of a multi-layered imaging sensor comprised in a depth estimation system in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
The present invention relates to an electronic device, such as a camera, smartphone, or tablet, comprising a depth estimation system 1 used for capturing both color information and distance information from a point P, point P being a point located on an object in the surroundings. The depth estimation system 1 may be a structured light depth sensing system.
The depth estimation system 1 comprises a light emitter 2 and a multi-layered imaging sensor 3, as shown in Fig. 1. The light emitter 2 may be a laser emitter.
The multi-layered imaging sensor 3 is adapted for absorbing electromagnetic radiation R within a plurality of discrete electromagnetic spectrums, such as the visible light spectrum and the infrared spectrum. The multi-layered imaging sensor 3 comprises a plurality of layers stacked on top of each other, each layer absorbing light only at a certain wavelength, or within a narrow part of the electromagnetic spectrum. The sensor 3 may comprise any suitable number of layers; Fig. 2 shows embodiments comprising four layers, drawn from the set 3a, 3b, 3c, 3d, 3e. Each discrete layer absorbs electromagnetic radiation R within a predefined, discrete spectrum. Due to these layers, each color component of a certain pixel can be collected at one and the same horizontal and vertical position, but at a different level inside the sensor structure. As a consequence, the color information will be aligned with the distance information, and pixel saturation is eliminated.
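To make the alignment property concrete, the sketch below models the stacked sensor output as one image plane per layer, so every channel of a pixel is read at the same (row, column) and no registration between color and distance data is required. The top-to-bottom layer order is an assumption made for illustration.

```python
# Sketch: reading all components of one pixel from a stacked sensor.
# The layer order in LAYERS is an illustrative assumption.
import numpy as np

LAYERS = ("blue", "green", "red", "ir")

def read_pixel(stack: np.ndarray, row: int, col: int) -> dict:
    """stack: (4, H, W), one plane per absorbing layer; all channels share (row, col)."""
    return {name: stack[i, row, col] for i, name in enumerate(LAYERS)}
```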
The electromagnetic radiation R may comprise first electromagnetic radiation R1 and second electromagnetic radiation R2. The first electromagnetic radiation R1 originates directly from the object at point P, and may comprise color information. The second electromagnetic radiation R2 is initially emitted by the light emitter 2 and thereafter reflected off the point P, and may be used for generating distance information. The first electromagnetic radiation R1 may be visible light, and the second electromagnetic radiation R2 may be infrared radiation.
The depth estimation system 1 may comprise a diffractive optical element 4, which diffracts a beam of electromagnetic radiation, emitted by the light emitter 2, into a plurality of individual rays. One of these rays, R2, is subsequently reflected off the point P in a direction towards the multi-layered imaging sensor 3.
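Putting these pieces together, a minimal sketch of how such a system might estimate depth from the projected dots and read color at the very same pixels is given below; the reference dot positions, detection threshold, and geometry values are hypothetical inputs, and this is not presented as the patented method.

```python
# Sketch: structured-light depth plus aligned color from one stacked sensor.
# ref_cols, thresh, baseline_m, and focal_px are hypothetical inputs.
import numpy as np

def depth_and_color(stack: np.ndarray, ref_cols: np.ndarray,
                    baseline_m: float, focal_px: float, thresh: float):
    """stack: (4, H, W) with plane 3 holding the IR layer; ref_cols[r] is the
    expected dot column in row r for a reference plane."""
    ir = stack[3]
    rows, cols = np.nonzero(ir > thresh)              # dot pixels in the IR plane
    disparity = ref_cols[rows] - cols                 # shift versus the reference pattern
    depth = baseline_m * focal_px / np.maximum(disparity, 1e-6)
    color = stack[:3, rows, cols].T                   # same pixels: no re-registration step
    return rows, cols, depth, color
```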
The multi-layered imaging sensor 3 may comprise a plurality of organic layers 3a, 3b, 3c, 3d, i.e. any suitable number of organic layers. The multi-layered imaging sensor 3 may furthermore comprise at least one inorganic layer, such as a silicon layer.
In one embodiment, the multi-layered imaging sensor 3 comprises a first organic layer 3a, a second organic layer 3b, a third organic layer 3c, and a silicon layer 3e. The first organic layer 3a absorbs electromagnetic radiation within a first spectrum, the second organic layer 3b absorbs electromagnetic radiation within a second spectrum, the third organic layer 3c absorbs electromagnetic radiation within a third spectrum, and the silicon layer 3e absorbs electromagnetic radiation within a fourth spectrum.
In a further embodiment, the multi-layered imaging sensor 3 comprises a first organic layer 3a, a second organic layer 3b, a third organic layer 3c, and a fourth organic layer 3d. The first organic layer 3a absorbs electromagnetic radiation within a first spectrum, the second organic layer 3b absorbs electromagnetic radiation within a second spectrum, the third organic layer 3c absorbs electromagnetic radiation within a third spectrum, and the fourth organic layer 3d absorbs electromagnetic radiation within a fourth spectrum.
The fourth electromagnetic spectrum may be within the range of 800-2500 nm.
The first electromagnetic spectrum, the second electromagnetic spectrum, and the third electromagnetic spectrum may be within the range of 400-700 nm. The first spectrum may comprise blue or green wavelengths, the second spectrum green or blue wavelengths, the third spectrum red wavelengths, and the fourth spectrum infrared wavelengths.
In a further embodiment, the fourth layer may effectively record a bandpass wavelength, for example at a nominal wavelength of 850 nm, 940 nm, or 1340 nm, with a bandpass width of for example 20 nm, 50 nm, or 100 nm. This may be achieved by placing an infrared cut-off filter in front of the first organic layer 3a, where the filter passes blue, green, and red wavelengths and has an additional bandpass, e.g. at 850 nm, 940 nm, or 1340 nm, with a width of for example 20 nm, 50 nm, or 100 nm.
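One possible way to encode this band layout is sketched below; the exact split of the 400-700 nm range into blue, green, and red, and the choice of the 940 nm nominal, 50 nm wide IR option, are assumptions made purely for illustration.

```python
# Sketch: encoding the example absorption bands (illustrative values only).
from dataclasses import dataclass

@dataclass(frozen=True)
class Band:
    low_nm: float
    high_nm: float

    def contains(self, wavelength_nm: float) -> bool:
        return self.low_nm <= wavelength_nm <= self.high_nm

LAYER_BANDS = {
    "first organic (blue)": Band(400, 500),
    "second organic (green)": Band(500, 600),
    "third organic (red)": Band(600, 700),
    "fourth (IR, nominal 940 nm, 50 nm wide)": Band(915, 965),
}
```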
The various aspects and implementations have been described in conjunction with various embodiments herein. However, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject-matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
The reference signs used in the claims shall not be construed as limiting the scope. Unless otherwise indicated, the drawings are intended to be read (e.g., cross-hatching, arrangement of parts, proportion, degree, etc.) together with the specification, and are to be considered a portion of the entire written description of this disclosure. As used in the description, the terms “horizontal”, “vertical”, “left”, “right”, “up” and “down”, as well as adjectival and adverbial derivatives thereof (e.g., “horizontally”, “rightwardly”, “upwardly”, etc.), simply refer to the orientation of the illustrated structure as the particular drawing figure faces the reader. Similarly, the terms “inwardly” and “outwardly” generally refer to the orientation of a surface relative to its axis of elongation, or axis of rotation, as appropriate.

Claims

1. A depth estimation system (1) for capturing color and distance information from a point (P), said depth estimation system (1) comprising a light emitter (2) and a multi-layered imaging sensor (3), said multi-layered imaging sensor (3) being adapted for absorbing electromagnetic radiation (R) within a plurality of discrete electromagnetic spectrums.
2. The depth estimation system (1) according to claim 1, wherein layers (3a, 3b, 3c, 3d, 3e) of said multi-layered imaging sensor (3) are stacked on top of each other, each discrete layer (3a, 3b, 3c, 3d, 3e) absorbing electromagnetic radiation (R) within a predefined, discrete spectrum.
3. The depth estimation system (1) according to claim 1 or 2, wherein said electromagnetic radiation (R) comprises first electromagnetic radiation (R1) and second electromagnetic radiation (R2), said first electromagnetic radiation (R1) originating directly from said object (P), and said second electromagnetic radiation (R2) being emitted by said light emitter (2) and reflected off said point (P).
4. The depth estimation system (1) according to claim 3, wherein said first electromagnetic radiation (R1) is visible light.
5. The depth estimation system (1) according to any one of the previous claims, further comprising a diffractive optical element (4), diffracting a beam of electromagnetic radiation emitted by said light emitter (2) into a plurality of individual rays, one of said rays (R2) being reflected off said point (P) towards said multi-layered imaging sensor (3).
6. The depth estimation system (1) according to any one of the previous claims, wherein said depth estimation system (1) is a structured light depth sensing system.
7. The depth estimation system (1) according to any one of the previous claims, wherein said light emitter (2) is a laser emitter.
8. The depth estimation system (1) according to any one of claims 3 to 7, wherein said second electromagnetic radiation (R2) is infrared radiation.
9. The depth estimation system (1) according to any one of the previous claims, wherein said multi-layered imaging sensor (3) comprises a plurality of organic layers (3a, 3b, 3c, 3d).
10. The depth estimation system (1) according to any one of the previous claims, wherein said multi-layered imaging sensor (3) comprises a first organic layer (3a), a second organic layer (3b), a third organic layer (3c), and a silicon layer (3e), said first organic layer (3a) absorbing electromagnetic radiation within a first spectrum, said second organic layer (3b) absorbing electromagnetic radiation within a second spectrum, said third organic layer (3c) absorbing electromagnetic radiation within a third spectrum, and said silicon layer (3e) absorbing electromagnetic radiation within a fourth spectrum.
11. The depth estimation system (1) according to any one of claims 1 to 9, wherein said multi-layered imaging sensor (3) comprises a first organic layer (3a), a second organic layer (3b), a third organic layer (3c), and a fourth organic layer (3d), said first organic layer (3a) absorbing electromagnetic radiation within a first spectrum, said second organic layer (3b) absorbing electromagnetic radiation within a second spectrum, said third organic layer (3c) absorbing electromagnetic radiation within a third spectrum, said fourth organic layer (3d) absorbing electromagnetic radiation within a fourth spectrum.
12. The depth estimation system (1) according to claim 10 or 11, wherein said first electromagnetic spectrum, said second electromagnetic spectrum, and said third electromagnetic spectrum are within the range of 400-700 nm.
13. The depth estimation system according to at least one of claims 10 to 12, wherein said fourth electromagnetic spectrum is within the range of 800-2500 nm.
14. An electronic device comprising the depth estimation system (1) according to any one of claims 1 to 13.
PCT/EP2019/085292 (priority date 2019-12-16, filed 2019-12-16): Depth estimation system, published as WO2021121540A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980103040.XA CN114829977A (en) 2019-12-16 2019-12-16 Depth estimation system
PCT/EP2019/085292 WO2021121540A1 (en) 2019-12-16 2019-12-16 Depth estimation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2019/085292 WO2021121540A1 (en) 2019-12-16 2019-12-16 Depth estimation system

Publications (1)

Publication Number Publication Date
WO2021121540A1 (en)

Family

ID=69056012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/085292 WO2021121540A1 (en) 2019-12-16 2019-12-16 Depth estimation system

Country Status (2)

Country Link
CN (1) CN114829977A (en)
WO (1) WO2021121540A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008219370A (en) * 2007-03-02 2008-09-18 Canon Inc Imaging apparatus
KR101666600B1 (en) * 2009-11-12 2016-10-17 삼성전자주식회사 3D Color Image Sensor Using Stack Structure of Organic Photoelectric Conversion Layers
US20180041718A1 (en) * 2016-08-08 2018-02-08 Microsoft Technology Licensing, Llc Hybrid imaging sensor for structured light object capture

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE33729E (en) * 1987-09-11 1991-10-29 Coherent, Inc. Multilayer optical filter for producing colored reflected light and neutral transmission
WO2007035720A2 (en) * 2005-09-20 2007-03-29 Deltasphere, Inc. Methods, systems, and computer program products for acquiring three-dimensional range information
WO2011101036A1 (en) * 2010-02-19 2011-08-25 Iplink Limited Processing multi-aperture image data
US8542348B2 (en) * 2010-11-03 2013-09-24 Rockwell Automation Technologies, Inc. Color sensor insensitive to distance variations
KR101833269B1 (en) * 2011-03-10 2018-02-28 사이오닉스, 엘엘씨 Three dimensional sensors, systems, and associated methods
CN105308626A (en) * 2013-01-17 2016-02-03 西奥尼克斯股份有限公司 Biometric imaging devices and associated methods
KR102496483B1 (en) * 2017-11-23 2023-02-06 삼성전자주식회사 Avalanche photodetector and image sensor including the same

Also Published As

Publication number Publication date
CN114829977A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
US11300400B2 (en) Three-dimensional measurement device
US9928420B2 (en) Depth imaging system based on stereo vision and infrared radiation
US11015927B2 (en) Optical sensor and optical sensor system
KR102618542B1 (en) ToF (time of flight) capturing apparatus and method for processing image for decreasing blur of depth image thereof
US20140307100A1 (en) Orthographic image capture system
US9383549B2 (en) Imaging system
TWI693373B (en) Three-dimensional sensing module
US20170339396A1 (en) System and method for adjusting a baseline of an imaging system with microlens array
US20160134860A1 (en) Multiple template improved 3d modeling of imaged objects using camera position and pose to obtain accuracy
US11080874B1 (en) Apparatuses, systems, and methods for high-sensitivity active illumination imaging
WO2013012335A1 (en) Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
US10055881B2 (en) Video imaging to assess specularity
CN114073075B (en) Mapping 3D depth map data to 2D image
KR20240015643A (en) Automatic correction from the conjugate line distance of the projection pattern
KR20150029897A (en) Photographing device and operating method thereof
CN108345002A (en) Structure light measurement device and method
US9992472B1 (en) Optoelectronic devices for collecting three-dimensional data
KR101868293B1 (en) Apparatus for Providing Vehicle LIDAR
US11391843B2 (en) Using time-of-flight techniques for stereoscopic image processing
WO2021121540A1 (en) Depth estimation system
CN211060850U (en) Depth detection system and its bracket and electronics
JP6605244B2 (en) Overhead wire imaging apparatus and overhead wire imaging method
JP2005331413A (en) Distance image acquiring system
CN111089612A (en) Optical sensors and optical sensing systems
CN111433570A (en) Multi-sensor calibration system and multi-sensor calibration method

Legal Events

Code 121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number 19828657; Country of ref document: EP; Kind code of ref document: A1.
Code NENP (Non-entry into the national phase): Ref country code DE.
Code 122 (Ep: pct application non-entry in european phase): Ref document number 19828657; Country of ref document: EP; Kind code of ref document: A1.