EP3176601A1 - Illumination light projection for a depth camera - Google Patents
- Publication number
- EP3176601A1 (application number EP17151248.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- light
- depth camera
- homogenizing
- illumination
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features, e.g. arrangements of optical elements of transmitters alone
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features, e.g. arrangements of optical elements of transmitters alone
- G01S7/4815—Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4818—Constructional features, e.g. arrangements of optical elements using optical fibres
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- a TOF depth camera utilizes light pulses (e.g. infrared and/or visible light) projected from the TOF depth camera into an image environment.
- the illumination light pulses reflect from the various surfaces of objects in the image environment and are returned to an image sensor.
- the TOF depth camera generates distance data by quantifying time-dependent return light information. In other words, because light is detected sooner when reflected from a feature nearer to the photosensitive surface than from an object feature farther away, the TOF depth camera can determine distance information about the object's features.
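For illustration, the distance arithmetic behind this quantification can be sketched in a few lines of Python; the round-trip times below are illustrative values, not figures from the disclosure.

```python
# Sketch of time-of-flight ranging: depth is half the distance light
# travels during the measured round trip. Times below are illustrative.
C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Distance to a surface given the measured round-trip time of a pulse."""
    return C * t_seconds / 2.0

# Light returning from a feature ~1.0 m away arrives after ~6.7 ns,
# sooner than light from a feature ~2.5 m away (~16.7 ns), which is how
# the camera distinguishes near object features from far ones.
print(depth_from_round_trip(6.7e-9))   # ~1.0 m
print(depth_from_round_trip(16.7e-9))  # ~2.5 m
```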
- the intensity of the projected light may be somewhat greater in a region near a periphery of the image environment than in a center of the imaging environment, as light reflected from those regions may have a lower intensity at the image sensor due to the angle of incidence on the imaging optics.
- the imaging environment may have a different cross-sectional shape than light emitted by the light source.
- the imaging environment also may be relatively large to capture potentially large ranges of movements of potentially multiple users.
- Illumination sources used with TOF depth cameras may emit light in circular patterns or circularly-shaped emission envelopes. Therefore, overlaying a circularly-shaped emission pattern onto a non-circular image environment in a manner that achieves a relatively uniform illumination intensity across the entire non-circular image environment may result in the illumination of portions of the environment that are not used for depth analysis. This may waste light source power, and also may involve the use of a more powerful and expensive light source.
- Some previous approaches to reshaping illumination light employ random distributions of spherical microlenses. By randomly distributing the microlenses, the shape of the emitted light may be adjusted while avoiding the introduction of diffractive interference that may result from a periodic arrangement of microlenses. However, because the microlenses are randomly sized, the ability to control the distribution of light within the image environment, including the light's cross-sectional profile and the dimensions of the envelope that it illuminates within the room, may be compromised.
- a TOF depth camera includes a light source including a plurality of surface-emitting lasers configured to generate coherent light.
- the example TOF camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment.
- the plurality of surface-emitting lasers may be arranged in a desired illumination light shape, thereby allowing an image of the shape of the light source to be relayed into the image environment.
- a homogenizing light guide may be configured to provide a shaped light source for such use.
- FIG. 1 schematically shows an embodiment of a TOF depth camera 100.
- TOF depth camera 100 includes an illuminator 102 configured to illuminate a portion of an object 104 positioned in an image environment 106 with illumination light 108.
- For example, a ray of illumination light 108A striking a portion of object 104 is reflected as return light 112.
- Photons from return light 112 may be collected and used to generate depth information for object 104, as explained in detail below.
- FIG. 1 depicts a single illuminator 102 included within TOF depth camera 100, it will be appreciated that a plurality of illuminators 102 may be included within TOF depth camera 100 to illuminate an image environment.
- TOF depth camera 100 also includes an image sensor 110 configured to detect at least a portion of return light 112 reflected from image environment 106.
- Image sensor 110 includes a detector 114 for collecting return light 112 for use in generating depth information (such as a depth map) for the scene.
- illuminator 102 includes a light source 118 configured to generate coherent light and an optical assembly 120 configured to shape the coherent light and direct it toward image environment 106.
- Light source 118 may emit coherent light at any suitable wavelength(s), including but not limited to infrared and visible wavelengths.
- FIG. 2 schematically shows an embodiment of light source 118 including a laser array 200 comprising a plurality of individual surface-emitting lasers 202.
- laser array 200 may have any suitable shape without departing from the scope of the present disclosure.
- laser array 200 has a rectangular/oblong shape, which matches a desired illumination light cross-sectional shape.
- a plurality of surface-emitting lasers 202 may have any other suitable shape and/or pattern.
- Surface-emitting lasers 202 may be fabricated on a suitable substrate (e.g., GaAs) using large-scale integration techniques (e.g., film deposition and film patterning techniques).
- a die comprising a laser array 200 may include hundreds or more of surface-emitting lasers 202.
- a 1.5 mm square die including surface-emitting lasers 202 that have a center-to-center pitch of approximately 44 µm may include up to 1156 surface-emitting lasers 202.
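The 1156-laser figure is consistent with the stated die size and pitch, as the following check shows (assuming a regular square grid, which the disclosure does not mandate):

```python
# Check of the emitter count quoted above: a 1.5 mm square die populated
# on a regular square grid at a 44 µm center-to-center pitch. The square
# lattice is an assumption made for this check.
die_edge_um = 1500.0
pitch_um = 44.0

emitters_per_edge = int(die_edge_um // pitch_um)  # 34
total_emitters = emitters_per_edge ** 2           # 34 * 34 = 1156

print(emitters_per_edge, total_emitters)  # 34 1156
```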
- FIG. 3 schematically shows a cross-sectional view of an embodiment of a surface-emitting laser 202.
- the embodiment of surface-emitting laser 202 shown in FIG. 3 is a vertical-cavity surface-emitting laser (VCSEL).
- a VCSEL is a semiconductor laser diode that emits laser light perpendicular to the substrate surface on which the VCSEL is formed.
- Light or current is pumped into the VCSEL via a pump source to excite the active laser medium (e.g., the material suited to stimulated emission in response to the pump source - one non-limiting example includes InGaAs) in the gain region.
- the energy injected into the gain region resonates between two mirrors prior to emission.
- the light may reflect between two distributed Bragg reflectors formed from alternating layers of high- and low-refractive index films.
- the top and bottom mirrors may be isolated from the gain region by an insulating dielectric layer.
- Another embodiment of a surface-emitting laser 202 is shown in FIG. 4. Like FIG. 3, FIG. 4 depicts a VCSEL.
- the laser shown in FIG. 4 includes a free-space region between the top and bottom mirrors, a configuration sometimes referred to as a vertical external cavity surface-emitting laser (VECSEL). Because a VECSEL includes a free-space region, the diode may generate a higher power compared to a similar VCSEL.
- optical assembly 120 transmits light generated by light source 118 to illuminate a portion of image environment 106.
- the lit portion of image environment 106 may be broken down into an illumination depth region and an illumination envelope region.
- the illumination depth region refers to a depth of focus of the projected light.
- illumination light 108 is relayed to an illumination depth region 122 bounded by a near edge 124 and a far edge 126.
- illumination depth region 122 may be approximately 3.5 m deep.
- the illumination envelope region refers to a cross-sectional area that is lit with illumination light 108.
- a rectangularly-shaped illumination envelope region 128 is represented with horizontal dimension 130 and with vertical dimension 132.
- any suitably shaped illumination envelope region 128 (e.g., an elliptical shape, a polygon shape, or other closed shape) may be formed without departing from the scope of the present disclosure.
- the lasers included in light source 118 may be arranged in a shape that matches that of a desired emission envelope (e.g., a shape or pattern of light projected by the lasers), and optical assembly 120 may be configured to transmit or relay that shape to the far field.
- the emission envelope and illumination envelope region 128 may take the shape of the arrangement of the lasers.
- a rectangularly-shaped array of surface-emitting lasers may be used to generate a rectangularly-shaped light envelope in the far field.
- optical assembly 120 may be configured to re-shape the emission envelope. For example, light emitted from a square arrangement of surface-emitting lasers may be reshaped into a rectangularly-shaped light envelope in the far field.
- optical assembly 120 may shape the cross-sectional light intensity/irradiance profile of illumination light 108 from a Gaussian profile into a differently-shaped illumination profile.
- illumination light 108 may be shaped into an illumination profile exhibiting a flat-topped, mesa-like shape that is symmetrically oriented around an optical axis of illumination light 108.
- the irradiance of illumination light 108 may have a constant intensity, within an acceptable tolerance, in a region near the optical axis (e.g., a region corresponding to a top of the mesa). The irradiance may then decrease in intensity in regions farther from the optical axis (e.g., a region corresponding to sidewalls of the mesa).
- illumination light 108 may be characterized by a cross-sectional light profile that is more intense farther from an optical axis of illumination light 108 than closer to an optical axis of the illumination light.
- FIG. 5 shows an embodiment of a relationship 500 between incoherent irradiance and cross-sectional position within an example light profile 502 for illumination light.
- light profile 502 exhibits a greater irradiant intensity in a region farther from optical axis 504 than at positions closer to optical axis 504.
- Metaphorically, light profile 502 exhibits a cross-sectional irradiance profile somewhat resembling a capital letter "M" arranged about optical axis 504.
- generating an "M"-shaped profile for the illumination light may offset a "W"-shaped intensity profile received at image sensor 110 due to reflection effects caused by objects in the image environment.
- the net effect of supplying light with an "M"-shaped profile to image environment 106 may be that image sensor 110 detects return light having a mesa-shaped profile.
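To make the "M"-shaped profile concrete, the sketch below synthesizes one such cross-section as a broad pedestal plus an off-axis ring; the functional form and all parameters are illustrative assumptions rather than values from the disclosure.

```python
# Illustrative "M"-shaped irradiance cross-section: brighter at some
# offset on either side of the optical axis than on the axis itself.
# The pedestal-plus-ring form and its parameters are assumptions chosen
# only to reproduce the qualitative shape described above.
import numpy as np

x = np.linspace(-1.0, 1.0, 201)  # normalized cross-sectional position
peak_offset = 0.6                # assumed location of the "M" peaks

ring = np.exp(-((np.abs(x) - peak_offset) ** 2) / (2 * 0.15 ** 2))
pedestal = 0.5 * np.exp(-(x ** 2) / (2 * 0.5 ** 2))
profile = pedestal + ring

# Irradiance away from the axis exceeds irradiance on the axis, which
# can offset the "W"-shaped falloff observed at the image sensor.
on_axis = profile[np.abs(x).argmin()]
at_peak = profile[np.abs(x - peak_offset).argmin()]
assert at_peak > on_axis
```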
- FIG. 6 schematically shows an embodiment of a lens system 600 configured to relay an image of light source 118 into image environment 106.
- Lens system 600 includes a condenser lens stage 602, a relay lens stage 604, and an optional Schmidt plate 606, each of which is described in more detail below.
- FIG. 6 also depicts an example light source 118 comprising three light emitters.
- a light emitter may comprise one or more surface-emitting lasers.
- a single light emitter may comprise a single VCSEL, a single array of VCSELs (whether distributed in an ordered manner or a random fashion within the array), etc.
- Light from the three emitters is directed (shown as light paths 608A, 608B, and 608C in FIG. 6 ) via lens system 600 so that light from each emitter is collimated and then routed to different regions of the far field.
- lens system 600 fills illumination envelope region 128 with light by directing light from each surface-emitting laser element to different areas within illumination envelope region 128.
- Lens system 600 may utilize a high f-number aperture stop 610 to achieve a desired depth of field for the relayed image source light in the illumination depth region 122.
- f-numbers in a range of f/250 to f/1000 may be used to provide an illumination depth region having a depth of field in a corresponding range of 500 to 3500 mm.
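The two quoted endpoints happen to lie on a straight line (f/250 maps to 500 mm and f/1000 to 3500 mm), so a linear interpolation reproduces both; treating the relationship as linear across the whole range is an assumption made here purely for illustration.

```python
# Maps an aperture-stop f-number to an illumination depth of field by
# linear interpolation between the endpoints quoted above. Linearity
# between those endpoints is an assumption, not a statement of design.
def depth_of_field_mm(f_number: float) -> float:
    if not 250.0 <= f_number <= 1000.0:
        raise ValueError("outside the quoted f/250 to f/1000 range")
    # slope: (3500 - 500) mm / (1000 - 250) = 4 mm per unit of f-number
    return 500.0 + 4.0 * (f_number - 250.0)

print(depth_of_field_mm(250.0))   # 500.0 mm
print(depth_of_field_mm(1000.0))  # 3500.0 mm
print(depth_of_field_mm(625.0))   # 2000.0 mm, assuming linearity
```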
- Condenser lens stage 602 is positioned within lens system 600 to receive light from light source 118, condensing divergent rays of the emitted light and forming aperture stop 610.
- condenser lens stage 602 may be configured to condense the light received without magnifying or demagnifying the light beyond an acceptable tolerance.
- condenser lens stage 602 may be configured to impart or shape the light received into a selected light illumination profile. For example, condenser lens stage 602 may distort light received from light source 118 to generate the "M"-shaped profile described above, or any other suitable cross-sectional illumination profile.
- Relay lens stage 604 is positioned to receive light from condenser lens stage 602 and relay an image of light source 118 into illumination depth region 122. Stated differently, relay lens stage 604 provides the power within lens system 600 to transmit the image of light source 118 into image environment 106, forming and lighting illumination envelope region 128.
- an optional Schmidt plate 606 may be included within lens system 600, positioned at an entrance pupil 612 of lens system 600.
- Schmidt plate 606 may be used to introduce aberrations to illumination light to reduce the intensity of diffraction artifacts that may be introduced by surface-emitting lasers 202. Further, Schmidt plate 606 may help to achieve a desired light illumination profile. For example, including Schmidt plate 606 may emphasize peaks and valleys within an "M"-shaped illumination profile imparted by condenser lens stage 602.
- As the defocusing effect of Schmidt plate 606 may impact the collimating effect of condenser lens stage 602, potentially reducing the depth of illumination depth region 122, inclusion of Schmidt plate 606 may be accompanied by a compensatory adjustment to the f-number of lens system 600.
- While lens system 600 depicts classical lenses for clarity, it will be appreciated that any suitable embodiment of the lens stages described above may be included within lens system 600 without departing from the scope of the present disclosure.
- wafer-level optics may be employed for one or more of the lens stages.
- a wafer optic structure refers to an optical structure formed using suitable formation and/or patterning processes like those used in semiconductor patterning. Wafer-level optics may offer the potential advantage of cost-effective miniaturization of one or more of the lens stages and/or enhance manufacturing tolerances for such stages.
- FIG. 7 schematically shows another embodiment of an example lens system 700 for illuminator 102.
- wafer optic element 702 encodes a prescription for a portion of a condenser lens stage on a light receiving surface 704 and a prescription for a relay lens stage on light emitting surface 706.
- Wafer optic element 708 encodes a prescription for a Schmidt plate on light receiving surface 710.
- the light distributed by lens system 700 is less collimated relative to the light distributed by the embodiment of lens system 600 shown in FIG. 6 , leading to overlap of the light paths 712A, 712B, and 712C in the far field.
- a lens system may be formed using diffractive optics. If diffractive optical elements are employed for one or more of the lens elements/stages included in the lens system, a diffractive optic substrate will have a prescription for those stages encoded on a respective surface of the substrate. In some embodiments, for example, a single substrate may have a light receiving surface that encodes a prescription for one lens stage and a light emitting surface that encodes a prescription for another lens stage.
- the diffractive optic may offer similar potential miniaturization enhancements to wafer optics, but may also preserve collimation and depth of field. Moreover, in some embodiments, diffractive optics may permit one or more optical elements to be removed.
- FIG. 8 schematically shows another embodiment of a lens system 800 suitable for use with illuminator 102.
- diffractive optic element 802 encodes a prescription for a condenser lens stage on a light receiving surface 804 and a prescription for a relay lens stage on light emitting surface 806.
- a Schmidt plate is not included in the example illuminator 102 shown in FIG. 8 .
- the light distributed by lens system 800 may be more highly collimated relative to the light distributed by the embodiment of lens system 600 shown in FIG. 6 .
- one or more of the optical stages may be varied to increase the apparent size of light source 118.
- Increasing the size of light source 118 may reduce a user's ability to focus on the light source (e.g., by making the light source appear more diffuse) and/or may avoid directly imaging light source 118 on a user's retina.
- some systems may be configured so that an image of light source 118 may not be focused on a user's retina when the user's retina is positioned within 100 mm of light source 118.
- increasing the apparent source size may include positioning relay lens stage 604 closer to light source 118, which may cause illumination light 108 to diverge faster, depending upon the configuration of the relay lens stage 604 and light source 118. Because this adjustment may also lead to an increase in the field of view and a decrease in illumination depth region 122, a prescription and/or position for condenser lens stage 602 may also be adjusted to adjust the focal length of optical assembly 120 while the arrangement and pitch of surface-emitting lasers 202 included within light source 118 may be varied to adjust illumination envelope region 128. In some embodiments, optical assembly 120 may also be configured to transform the emission envelope into a different shape while relaying the light to image environment 106.
- FIG. 9 schematically shows a sectional view of another embodiment of an illuminator 102 in the form of a homogenizing light guide 902.
- Homogenizing light guide 902 is configured to increase an apparent size of light source 118 by receiving light from light source 118 via light receiving surface 904 and spreading it within the light guide.
- light source 118 may include an array of surface-emitting lasers 202, and/or may include any other suitable light emitting devices.
- light source 118 may include a long, thin array of surface-emitting lasers 202.
- Homogenizing light guide 902 takes the form of an optical wedge, though it will be appreciated that any suitable light guide configured to spread and smooth light may be employed without departing from the present disclosure.
- light is retained within homogenizing light guide 902 via total internal reflection in total reflection region 906.
- Upon leaving total reflection region 906, light encounters a light exit region 908 where the opposing faces of the wedge are angled with respect to light emission surface 910, which allows light to exceed the critical angle for total internal reflection relative to light emission surface 910 and thereby escape the optical wedge.
- Light passing along homogenizing light guide 902 may travel in a collimated or near-collimated path to light emission surface 910.
- light may fan out by 9 degrees or less while traveling between light receiving surface 904 and light emission surface 910.
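Total internal reflection in such a guide is governed by the critical angle of the guide material; a short sketch, assuming a PMMA-like refractive index of 1.49 (the disclosure does not fix the guide material here):

```python
# Critical angle at a guide/air interface: light striking an internal
# face at more than this angle from the surface normal is totally
# internally reflected. The refractive index is an assumed, PMMA-like
# value; the guide material is not specified in this passage.
import math

def critical_angle_deg(n_guide: float, n_outside: float = 1.0) -> float:
    return math.degrees(math.asin(n_outside / n_guide))

print(critical_angle_deg(1.49))  # ~42.2 degrees
```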
- light from light source 118 may blend and mingle while traveling through homogenizing light guide 902, so that the light emitted at light emission surface 910 causes the plurality of lasers to appear as a single, larger source located at light emission surface 910. In some embodiments, light leaving light emission surface 910 may then pass through a microlens array 912 that distributes the light into the image environment.
- FIG. 10 schematically shows a front view of a portion of an example microlens array 912 including a plurality of lens elements 1002 retained by a frame 1004.
- each lens element 1002 is defined with reference to a long-axis lens element pitch 1006 that is different from a short-axis lens element pitch 1008, so that each lens element 1002 has an oblong shape.
- the pitch is defined with reference to the center of each cell, which may correspond to an apex of each lens surface.
- Other suitable pitch definitions may be employed in other embodiments without departing from the scope of the present disclosure.
- Each of the lens elements 1002 included in microlens array 912 is configured to create the desired angular field of illumination for optical assembly 120. Put another way, each lens element 1002 is configured to impart a selected angular divergence to incoming light.
- divergent light refers to coherent light that is spread from a more collimated beam into a less collimated beam. Divergent light may have any suitable illumination intensity cross-section, as explained in more detail below, and may have any suitable divergence angle, as measured between an optical axis and an extreme ray of the divergent light. The divergence angle may be adjusted by adjusting the pitch of the lens elements 1002 within microlens array 912. By spreading the incoming light, microlens array 912 transmits light to all regions within illumination envelope region 128.
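As a rough model of the pitch-to-divergence relationship, a collimated beamlet filling one cell of pitch p and focused by an element of focal length f fans out with a half-angle of roughly atan((p/2)/f); both the thin-lens model and the numbers below are illustrative assumptions.

```python
# Rough thin-lens model of how microlens pitch sets divergence: the
# half-angle subtended by the cell half-width at the focal length.
# The model and the example pitch/focal-length values are assumptions.
import math

def divergence_half_angle_deg(pitch_um: float, focal_length_um: float) -> float:
    return math.degrees(math.atan((pitch_um / 2.0) / focal_length_um))

# A longer pitch along one axis fans light out more widely along that
# axis, which is how oblong cells can produce an oblong envelope.
print(divergence_half_angle_deg(60.0, 100.0))  # ~16.7 degrees
print(divergence_half_angle_deg(40.0, 100.0))  # ~11.3 degrees
```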
- FIG. 11 schematically shows a perspective view of an embodiment of an individual lens element 1002 having a convex lens surface 1102.
- Lens surface 1102 is shaped in part by pitch dimensions for lens element 1002 (e.g., cell dimensions for lens element 1002).
- the pitch dimensions for the cell may affect the aspheric nature of lens surface 1102. Consequently, the diverging power of lens element 1002 is established at least in part by the pitch dimensions.
- convex lens surface 1102 will have a first divergence angle 1104, defined between optical axis 1106 and extreme ray 1108, that will be different from a second divergence angle 1110, defined between optical axis 1106 and extreme ray 1112.
- lens elements 1002 may be made from optical grade poly(methyl methacrylate) (PMMA), which has a refractive index of approximately 1.49.
- lens elements 1002 may be made from optical grade polycarbonate (PC), having a refractive index of approximately 1.6. Lens elements 1002 made from PC may have less curvature to obtain the same divergence angle compared to elements made from PMMA. It will be appreciated that any suitable optical grade material may be used to make lens elements 1002, including the polymers described above, optical grade glasses, etc.
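The curvature comparison can be checked with the thin-lens lensmaker relation for a plano-convex element, f = R/(n - 1): holding focal length (and hence divergence) fixed, the required radius of curvature grows with index, so the PC surface may be flatter. A sketch under those thin-lens assumptions (the focal length is an arbitrary example value):

```python
# Plano-convex thin-lens relation: f = R / (n - 1), so for a fixed focal
# length the required radius is R = f * (n - 1). A higher-index material
# needs a larger radius, i.e. a flatter (less curved) surface. The
# thin-lens treatment and the 100 µm focal length are assumptions.
N_PMMA, N_PC = 1.49, 1.6
focal_length_um = 100.0

r_pmma = focal_length_um * (N_PMMA - 1.0)  # 49 µm
r_pc = focal_length_um * (N_PC - 1.0)      # 60 µm: flatter for the same f

print(round(r_pmma, 1), round(r_pc, 1))  # 49.0 60.0
```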
- convex lens surface 1102 may be positioned toward light source 118. Positioning convex lens surface 1102 to face light source 118 may result in comparatively higher angles of incidence before the light experiences total internal reflection within the lens element relative to examples where lens surface 1102 faces away from light source 118. In turn, the angular field of illumination, and thus the illumination envelope region, may be larger when lens surface 1102 faces light source 118. Further, positioning lens surface 1102 to face light source 118 may reduce or eliminate the need for some surface coatings (e.g., anti-reflective coatings such as MgF2) that might otherwise be applied if lens surface 1102 faced in another direction.
- the aggregate effect of spreading the coherent light at each lens element 1002 may be to shape the cross-sectional light intensity/irradiance profile from a Gaussian profile associated with incident coherent light into a differently-shaped illumination profile, such as the "M"-shaped illumination profile described above.
- FIG. 12 schematically shows another embodiment of illuminator 102 that includes a homogenizing light guide 1202 that has the form of a slab, rather than a wedge.
- Homogenizing light guide 1202 is configured to receive light from light source 118 via light receiving surface 1204.
- Light reflects off of a total internal reflection surface 1206 and is directed toward a light emission region 1208 where some of the light is emitted via light emission surface 1210.
- Light emission surface 1210 is configured as a partial internal reflection surface, reflecting a portion of the light toward total internal reflection surface 1206 for continued propagation while allowing another portion to escape.
- light emission surface 1210 may be configured to reflect approximately 95% of incident light at any individual reflection instance, allowing 5% to be emitted to microlens array 912.
- the reflected light may re-encounter light emission surface 1210 and again experience partial emission of the incident light. Such partial emission instances may be repeated until substantially all of the light received by homogenizing light guide 1202 is emitted via light emission surface 1210.
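These repeated partial emissions form a geometric series: with the 95%/5% split quoted above, the fraction still circulating after k encounters is 0.95^k, so emission approaches completeness after a few tens of bounces.

```python
# Cumulative emission from the slab guide when each encounter with the
# light emission surface releases ~5% of the remaining light (the
# 95%/5% split quoted above). Emission converges geometrically.
reflectance = 0.95

remaining = 1.0
for bounce in range(1, 91):
    remaining *= reflectance  # 5% of what is left escapes each time
    if bounce in (1, 14, 45, 90):
        print(f"after {bounce:2d} encounters: {1 - remaining:.1%} emitted")

# after  1 encounters: 5.0% emitted
# after 14 encounters: 51.2% emitted
# after 45 encounters: 90.1% emitted
# after 90 encounters: 99.0% emitted
```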
- homogenizing light guide 1202 may include a total internal reflection region positioned opposite total internal reflection surface 1206 to conserve and propagate received light until it reaches light emission region 1208.
- FIG. 13 schematically shows another embodiment of illuminator 102.
- the embodiment shown in FIG. 13 also depicts a cross-section of a reflective light guide 1302 that receives at least a portion of the light from light source 118 and emits the light received to microlens array 912.
- the light follows a folded light path (shown as light paths 1304A, 1304B, and 1304C in FIG. 13 ) while transiting reflective light guide 1302.
- the folded light path shown in FIG. 13 includes complementary angles that allow total internal reflection within reflective light guide 1302.
- a folded light path may include one or more mirrors suitably positioned to achieve the desired optical path.
- errors that may be introduced by horizontally misplacing reflective light guide 1302 may be canceled by reflection through these complementary angles.
- light traveling along light path 1304B is received at light entrance 1306 and strikes a first total internal reflection surface 1308, where it is reflected at a first angle 1312 toward a second total internal reflection surface 1310.
- At second total internal reflection surface 1310, the light is reflected at a second angle 1314 toward light emission surface 1316.
- the total internal reflection surfaces are arranged with respect to each other so that angle 1312 is complementary to angle 1314, and thus no angular error exists within the light exiting light emission surface 1316.
- potential manufacturing errors or impacts to optical assembly 120 may be self-correcting within an acceptable tolerance.
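The self-correcting behavior can be illustrated with elementary vector reflections: composing two reflections rotates a ray by twice the angle between the reflecting surfaces, independent of the pair's absolute orientation, so a common tilt introduced by misplacement cancels. A small sketch (the 2D model and the fold geometry are illustrative assumptions):

```python
# Demonstration that two successive reflections deviate a ray by twice
# the angle between the reflecting surfaces, independent of the pair's
# absolute orientation: tilting both surfaces together leaves the exit
# direction unchanged, so the fold is self-correcting. The 2D model and
# the example fold geometry are illustrative assumptions.
import math

def reflect(d, n):
    """Reflect direction vector d across a surface with unit normal n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def exit_direction(tilt_deg):
    d = (1.0, 0.0)  # incoming ray along +x
    # two surface normals 90 degrees apart, both rotated by the same tilt
    for base_deg in (135.0, 225.0):
        a = math.radians(base_deg + tilt_deg)
        d = reflect(d, (math.cos(a), math.sin(a)))
    return d

for tilt in (0.0, 1.0, 3.0):
    dx, dy = exit_direction(tilt)
    print(f"tilt {tilt:+.1f} deg -> exit ({dx:+.3f}, {dy:+.3f})")
# The exit direction is (-1, 0) for every tilt: the angular error added
# at the first surface is canceled at the second.
```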
- FIG. 14 shows a flowchart depicting an embodiment of a method 1400 of projecting illumination light into an image environment. It will be appreciated that method 1400 may be performed by any suitable hardware, including but not limited to the hardware described herein. Further, it will be appreciated that the embodiment of method 1400 shown in FIG. 14 and described below is presented for the purpose of example. In some embodiments, any of the processes described with reference to FIG. 14 may be supplemented with other suitable processes, omitted, and/or suitably reordered without departing from the scope of the present disclosure.
- method 1400 includes generating coherent light using a plurality of surface-emitting lasers.
- coherent visible, infrared, or near-infrared light may be generated using suitable surface-emitting lasers like the VCSELs and/or VECSELs described herein.
- method 1400 may include homogenizing the coherent light at 1404. Homogenizing the coherent light may increase the apparent size of the light source and/or may cause the plurality of surface-emitting lasers to appear as a single source.
- homogenizing the coherent light at 1404 may include, at 1406, homogenizing the illumination light using a homogenizing light guide.
- homogenizing light guides include homogenizing light wedges and homogenizing light slabs configured to emit light along one surface via partial reflection of the light while totally reflecting light from another surface within the light guide.
- homogenizing the coherent light at 1404 may include, at 1408, homogenizing the illumination light using a reflective light guide.
- Non-limiting examples of reflective light guides include guides that define folded light paths. In yet other embodiments, such homogenization may be omitted.
- method 1400 includes relaying the illumination light to the image environment.
- relaying the illumination light to the image environment may include, at 1412, relaying an image of the light source to the image environment via a lens system.
- the apparent size of the image source may be adjusted by adjusting the focal length, illumination depth region, and illumination envelope region of the lens system.
- relaying the illumination light to the image environment at 1410 may include, at 1414, relaying collimated light to the image environment.
- light from each laser of an array of surface-emitting lasers may be collimated, and then directed in a different direction than collimated light from other lasers in the array.
- a microlens array may be used to relay the light received from a suitable homogenizing light guide to different portions of the illumination envelope region.
- the methods and processes described above may be tied to a computing system of one or more computing devices.
- such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
- TOF depth camera 100 shown in FIG. 1 depicts an example of a non-limiting embodiment of a computing system that may perform one or more of the methods and processes described above.
- light generation module 150 may include instructions executable to operate illuminator 102
- depth information module 152 may include instructions executable to operate image sensor 110 and interpret image information detected by detector 114.
- the modules shown in FIG. 1 are illustrated as distinct, standalone entities within TOF depth camera 100, it will be appreciated that the functions performed by such modules may be integrated and/or distributed throughout TOF depth camera 100 and/or a computing device connected locally or remotely with TOF depth camera 100 without departing from the scope of the present disclosure.
- TOF depth camera 100 includes a logic subsystem 160 and a storage subsystem 162.
- TOF depth camera 100 may optionally include a display subsystem 164, input/output-device subsystem 166, and/or other components not shown in FIG. 1 .
- Logic subsystem 160 includes one or more physical devices configured to execute instructions.
- logic subsystem 160 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
- Logic subsystem 160 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 160 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 160 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing. Logic subsystem 160 may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.
- Storage subsystem 162 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by logic subsystem 160 to implement the herein-described methods and processes. When such methods and processes are implemented, the state of storage subsystem 162 may be transformed (e.g., to hold different data).
- Storage subsystem 162 may include removable media and/or built-in devices.
- Storage subsystem 162 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
- Storage subsystem 162 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
- logic subsystem 160 and storage subsystem 162 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC), or a system-on-a-chip.
- storage subsystem 162 includes one or more physical, non-transitory devices.
- in contrast, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
- data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
- the terms “module” and “program” may be used to describe an aspect of the computing system implemented to perform a particular function.
- a module or program may be instantiated via logic subsystem 160 executing instructions held by storage subsystem 162. It will be understood that different modules and/or programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, and/or program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc.
- the terms “module” and “program” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
- display subsystem 164 may be used to present a visual representation of data held by storage subsystem 162.
- This visual representation may take the form of a graphical user interface (GUI).
- the state of display subsystem 164 may likewise be transformed to visually represent changes in the underlying data.
- Display subsystem 164 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 160 and/or storage subsystem 162 in a shared enclosure, or such display devices may be peripheral display devices.
- input/output-device subsystem 166 may be configured to communicatively couple the computing system with one or more other computing devices.
- Input/output-device subsystem 166 may include wired and/or wireless communication devices compatible with one or more different communication protocols.
- input/output-device subsystem 166 may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network.
- input/output-device subsystem 166 may allow the computing system to send and/or receive messages to and/or from other devices via a network such as the Internet.
- Input/output-device subsystem 166 may also optionally include or interface with one or more user-input devices such as a keyboard, mouse, game controller, camera, microphone, and/or touch screen, for example.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Studio Devices (AREA)
- Non-Portable Lighting Devices Or Systems Thereof (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Description
- In a time-of-flight (TOF) depth camera, light pulses are projected from a light source to an object in an image environment that is focused onto an image sensor. It can be difficult to fill the image environment with illumination light, as the image environment may have a sizeable volume and may have a cross-sectional shape (e.g. rectangular) that can be difficult to achieve with a desired intensity profile. Further, the imaging optics may have a large depth of field in which a consistent projected light intensity is desired.
- Some previous approaches to filling image environments with light use high-order optics to shape diverging light emitted from side-emitting light sources. However, such approaches typically require precise design and manufacturing control of the angular distribution of the light in order to fill the image environment.
- Various embodiments related to illuminating image environments with illumination light for a TOF depth camera are provided herein. For example, one embodiment provides a TOF depth camera configured to collect image data from an image environment illuminated by illumination light. The TOF camera includes a light source including a plurality of surface-emitting lasers configured to generate coherent light. The TOF camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
FIG. 1 schematically shows an example time-of-flight (TOF) depth camera in an example use environment according to an embodiment of the present disclosure.
FIG. 2 schematically shows an example light source according to an embodiment of the present disclosure.
FIG. 3 schematically shows an example surface-emitting laser according to an embodiment of the present disclosure.
FIG. 4 schematically shows another example surface-emitting laser according to an embodiment of the present disclosure.
FIG. 5 shows an example illumination profile according to an embodiment of the present disclosure.
FIG. 6 schematically shows an example lens system according to an embodiment of the present disclosure.
FIG. 7 schematically shows another example lens system according to an embodiment of the present disclosure.
FIG. 8 schematically shows another example lens system according to an embodiment of the present disclosure.
FIG. 9 schematically shows an example homogenizing light guide according to an embodiment of the present disclosure.
FIG. 10 schematically shows a portion of an example microlens array according to an embodiment of the present disclosure.
FIG. 11 schematically shows a perspective view of an example lens element in a microlens array according to an embodiment of the present disclosure.
FIG. 12 schematically shows another example homogenizing light guide according to an embodiment of the present disclosure.
FIG. 13 schematically shows an example reflective light guide according to an embodiment of the present disclosure.
FIG. 14 shows a flowchart illustrating an example method of projecting illumination light into an image environment according to an embodiment of the present disclosure.
- It may be difficult to fill the image environment with illumination light of a desired intensity profile. For example, it may be desirable for the intensity of the project light to be somewhat greater in a region near a periphery of the image environment than in a center of the imaging environment, as light reflected from those regions may have a lower intensity at the image sensor due to the angle of incidence on the imaging optics.
- Further, as mentioned above, the imaging environment may have a different cross-sectional shape than light emitted by the light source. The imaging environment also may be relatively large to capture potentially large ranges of movements of potentially multiple users.
- Illumination sources used with TOF depth cameras may emit light in circular patterns or circularly-shaped emission envelopes. Therefore, overlaying a circularly-shaped emission pattern onto a non-circular image environment in a manner that achieves a relatively uniform illumination intensity across the entire non-circular image environment may result in the illumination of portions of the environment that are not used for depth analysis. This may waste light source power, and also may involve the use of a more powerful and expensive light source.
- Some previous approaches to reshaping illumination light employ random distributions of spherical microlenses. By randomly distributing the microlenses, the shape of the emitted light may be adjusted while avoiding the introduction of diffractive interference that may result from a periodic arrangement of microlenses. However, because the microlenses are randomly sized, the ability to control the distribution of light within the image environment, including the light's cross-sectional profile and the dimensions of the envelope that it illuminates within the room, may be compromised.
- Accordingly, various embodiments of TOF depth cameras and methods for illuminating image environments with illumination light are provided herein. For example, in some embodiments, a TOF depth camera includes a light source including a plurality of surface-emitting lasers configured to generate coherent light. The example TOF camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment. The plurality of surface-emitting lasers may be arranged in a desired illumination light shape, thereby allowing an image of the shape of the light source to be relayed into the image environment. In other embodiments, a homogenizing light guide may be configured to provide a shaped light source for such use.
-
FIG. 1 schematically shows an embodiment of aTOF depth camera 100. In the embodiment shown inFIG. 1 ,TOF depth camera 100 includes anilluminator 102 configured to illuminate a portion of anobject 104 positioned in animage environment 106 withillumination light 108. For example, a ray ofillumination light 108A striking a portion ofobject 104 is reflected asreturn light 112. Photons fromreturn light 112 may be collected and used to generate depth information forobject 104, as explained in detail below. - While the example shown in
FIG. 1 depicts asingle illuminator 102 included withinTOF depth camera 100, it will be appreciated that a plurality ofilluminators 102 may be included withinTOF depth camera 100 to illuminate an image environment. -
TOF depth camera 100 also includes animage sensor 110 configured to detect at least a portion ofreturn light 112 reflected fromimage environment 106.Image sensor 110 includes adetector 114 for collectingreturn light 112 for use in generating depth information (such as a depth map) for the scene. - In the embodiment shown in
FIG. 1 ,illuminator 102 includes alight source 118 configured to generate coherent light and anoptical assembly 120 configured to shape the coherent light and direct it towardimage environment 106.Light source 118 may emit coherent light at any suitable wavelength(s), including but not limited to infrared and visible wavelengths. -
FIG. 2 schematically shows an embodiment oflight source 118 including alaser array 200 comprising a plurality of individual surface-emittinglasers 202. It will be appreciated thatlaser array 200 may have any suitable shape without departing from the scope of the present disclosure. In the embodiment shown inFIG. 2 ,laser array 200 has a rectangular/oblong shape, which matches a desired illumination light cross-sectional shape. It will be appreciated that a plurality of surface-emittinglasers 202 may have any other suitable shape and/or pattern. - Surface-emitting
lasers 202 may be fabricated on a suitable substrate (e.g., GaAs) using large-scale integration techniques (e.g., film deposition and film patterning techniques). In some examples, a die comprising alaser array 200 may include hundreds or more of surface-emittinglasers 202. For example, a 1.5 mm square die including surface-emittinglasers 202 that have a center-to-center pitch of approximately 44 µm may include up to 1156 surface-emittinglasers 202. -
FIG. 3 schematically shows a cross-sectional view of an embodiment of a surface-emittinglaser 202. Specifically, the embodiment of surface-emittinglaser 202 shown inFIG. 3 is a vertical-cavity surface-emitting laser (VCSEL). A VCSEL is a semiconductor laser diode that emits laser light perpendicular from a substrate surface on which the VCSEL is formed. Light or current is pumped into the VCSEL via a pump source to excite the active laser medium (e.g., the material suited to stimulated emission in response to the pump source - one non-limiting example includes InGaAs) in the gain region. The energy injected into the gain region resonates between two mirrors prior to emission. For example, the light may reflect between two distributed Bragg reflectors formed from alternating layers of high- and low-refractive index films. In some embodiments, the top and bottom mirrors may be isolated from the gain region by an insulating dielectric layer. - Another embodiment of a surface-emitting
laser 202 is shown inFIG. 4 . LikeFIG. 3, FIG. 4 depicts a VCSEL. However, the laser shown inFIG. 4 includes a free-space region between the top and bottom mirrors, a configuration sometimes referred to as a vertical external cavity surface-emitting laser (VECSEL). Because a VECSEL includes a free-space region, the diode may generate a higher power compared to a similar VCSEL. - Turning back to
FIG. 1 ,optical assembly 120 transmits light generated bylight source 118 to illuminate a portion ofimage environment 106. For purposes of discussion, the lit portion ofimage environment 106 may be broken down into an illumination depth region and an illumination envelope region. The illumination depth region refers to a depth of focus of the projected light. In the embodiment shown inFIG. 1 ,illumination light 108 is relayed to anillumination depth region 122 bounded by anear edge 124 and afar edge 126. For example, in some embodiments,illumination depth region 122 may be approximately 3.5 m deep. - The illumination envelope region refers to a cross-sectional area that is lit with
illumination light 108. In the embodiment shown inFIG. 1 , a rectangularly-shapedillumination envelope region 128 is represented withhorizontal dimension 130 and withvertical dimension 132. However, it will be appreciated that any suitably shaped illumination envelope region 128 (e.g., an elliptical shape, a polygon shape, or other closed shape) may be formed without departing from the scope of the present disclosure. - As mentioned above, in some embodiments, the lasers included in
light source 118 may be arranged in a shape that matches that of a desired emission envelope (e.g., a shape or pattern of light projected by the lasers), andoptical assembly 120 may be configured to transmit or relay that shape to the far field. In such embodiments, the emission envelope andillumination envelope region 128 may take the shape of the arrangement of the lasers. Thus, as one specific example, a rectangularly-shaped array of surface-emitting lasers may be used to generate a rectangularly-shaped light envelope in the far field. In other embodiments,optical assembly 120 may be configured re-shape the emission envelope. For example, light emitted from square arrangement of surface-emitting lasers may be reshaped into a rectangularly-shaped light envelope in the far field. - Further, in some embodiments,
optical assembly 120 may shape the cross-sectional light intensity/irradiance profile of illumination light 108 from a Gaussian profile into a differently-shaped illumination profile. For example, in some embodiments,illumination light 108 may be shaped into an illumination profile exhibiting a flat-topped, mesa-like shape that is symmetrically oriented around an optical axis ofillumination light 108. In such embodiments, the irradiance ofillumination light 108 may have a constant intensity, within an acceptable tolerance, in a region near the optical axis (e.g., a region corresponding to a top of the mesa). The irradiance may then decrease in intensity in region farther from the optical axis (e.g., a region corresponding to sidewalls of the mesa). - In some other embodiments,
illumination light 108 may be characterized by a cross-sectional light profile that is more intense farther from an optical axis ofillumination light 108 than closer to an optical axis of the illumination light.FIG. 5 shows an embodiment of arelationship 500 between incoherent irradiance and cross-sectional position within an examplelight profile 502 for illumination light. In the example shown inFIG. 5 ,light profile 502 exhibits a greater irradiant intensity in a region farther fromoptical axis 504 than at positions closer tooptical axis 504. Metaphorically,light profile 502 exhibits cross-sectional irradiance profile somewhat resembling a capital letter "M" arranged aboutoptical axis 504. - Without wishing to be bound by theory, generating an "M"-shaped profile for the illumination light may offset a "W"-shaped intensity profile received at
image sensor 110 due to reflection effects caused by objects in the image environment. In other words, the net effect of supplying light with an "M"-shaped profile to imageenvironment 106 may be thatimage sensor 110 detects return light having a mesa-shaped profile. -
FIG. 6 schematically shows an embodiment alens system 600 configured to relay an image oflight source 118 intoimage environment 106.Lens system 600 includes acondenser lens stage 602, arelay lens stage 604, and anoptional Schmidt plate 606, each of which is described in more detail below. -
FIG. 6 also depicts an examplelight source 118 comprising three light emitters. As used herein, a light emitter may comprise one or more surface-emitting lasers. For example, a single light emitter may comprise a single VCSEL, a single array of VCSELs (whether distributed in an ordered manner or a random fashion within the array), etc. Light from the three emitters is directed (shown aslight paths FIG. 6 ) vialens system 600 so that light from each emitter is collimated and then routed to different regions of the far field. In this manner,lens system 600 fillsillumination envelope region 128 with light by directing light from each surface-emitting laser element to different areas withinillumination envelope region 128. -
- Lens system 600 may utilize a high f-number aperture stop 610 to achieve a desired depth of field for the relayed image source light in the illumination depth region 122. In some non-limiting embodiments, f-numbers in a range of f/250 to f/1000 may be used to provide an illumination depth region having a depth of field in a corresponding range of 500 to 3500 mm.
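- To put those stops in context (an illustrative aside, not part of the original disclosure), the f-number N is the ratio of focal length to aperture diameter, D = f/N, so the quoted stops correspond to very small effective apertures, which is what extends the depth of field of the relayed image; the focal length used below is an arbitrary assumed value.

```python
# Minimal sketch: aperture diameter implied by an f-number, D = f / N.
# The 10 mm focal length is an assumed illustrative value, not a figure
# from this disclosure.
f_mm = 10.0
for N in (250, 500, 1000):
    print(f"f/{N}: aperture diameter = {f_mm / N:.3f} mm")
```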
- Condenser lens stage 602 is positioned within lens system 600 to receive light from light source 118, condensing divergent rays of the emitted light and forming aperture stop 610. In some embodiments, condenser lens stage 602 may be configured to condense the light received without magnifying or demagnifying the light beyond an acceptable tolerance. Additionally or alternatively, in some embodiments, condenser lens stage 602 may be configured to impart or shape the light received into a selected light illumination profile. For example, condenser lens stage 602 may distort light received from light source 118 to generate the "M"-shaped profile described above, or any other suitable cross-sectional illumination profile.
- Relay lens stage 604 is positioned to receive light from condenser lens stage 602 and relay an image of light source 118 into illumination depth region 122. Stated differently, relay lens stage 604 provides the power within lens system 600 to transmit the image of light source 118 into image environment 106, forming and lighting illumination envelope region 128. - In some embodiments, an
optional Schmidt plate 606 may be included within lens system 600, positioned at an entrance pupil 612 of lens system 600. Schmidt plate 606 may be used to introduce aberrations to illumination light to reduce the intensity of diffraction artifacts that may be introduced by surface-emitting lasers 202. Further, Schmidt plate 606 may help to achieve a desired light illumination profile. For example, including Schmidt plate 606 may emphasize peaks and valleys within an "M"-shaped illumination profile imparted by condenser lens stage 602. As the defocusing effect of Schmidt plate 606 may impact the collimating effect of condenser lens stage 602, potentially reducing the depth of illumination depth region 122, inclusion of Schmidt plate 606 may be accompanied by a compensatory adjustment to the f-number of lens system 600. - While
lens system 600 depicts classical lenses for clarity, it will be appreciated that any suitable embodiment of the lens stages described above may be included within lens system 600 without departing from the scope of the present disclosure. For example, in some embodiments, wafer-level optics may be employed for one or more of the lens stages. As used herein, a wafer optic structure refers to an optical structure formed using suitable formation and/or patterning processes like those used in semiconductor patterning. Wafer-level optics may offer the potential advantages of cost-effective miniaturization of one or more of the lens stages and/or enhanced manufacturing tolerances for such stages.
- FIG. 7 schematically shows another embodiment of an example lens system 700 for illuminator 102. In the embodiment shown in FIG. 7, wafer optic element 702 encodes a prescription for a portion of a condenser lens stage on a light receiving surface 704 and a prescription for a relay lens stage on light emitting surface 706. Wafer optic element 708 encodes a prescription for a Schmidt plate on light receiving surface 710. In the example shown in FIG. 7, the light distributed by lens system 700 is less collimated relative to the light distributed by the embodiment of lens system 600 shown in FIG. 6, leading to overlap of the light paths. - While lower levels of collimation may spread
illumination light 108 over a greater area, that spreading may be accompanied by a reduction in illumination depth region 122. Accordingly, in some embodiments, a lens system may be formed using diffractive optics. If diffractive optical elements are employed for one or more of the lens elements/stages included in the lens system, a diffractive optic substrate will have a prescription for those stages encoded on a respective surface of the substrate. In some embodiments, for example, a single substrate may have a light receiving surface that encodes a prescription for one lens stage and a light emitting surface that encodes a prescription for another lens stage. Because the working surface of a diffractive optic is comparatively thinner than that of a classical lens analog, which may have a thickness set by a radius of curvature for the classical lens, the diffractive optic may offer miniaturization enhancements similar to those of wafer optics, but may also preserve collimation and depth of field. Moreover, in some embodiments, diffractive optics may permit one or more optical elements to be removed.
- FIG. 8 schematically shows another embodiment of a lens system 800 suitable for use with illuminator 102. In the embodiment shown in FIG. 8, diffractive optic element 802 encodes a prescription for a condenser lens stage on a light receiving surface 804 and a prescription for a relay lens stage on light emitting surface 806. A Schmidt plate is not included in the example illuminator 102 shown in FIG. 8. In the example shown in FIG. 8, the light distributed by lens system 800 may be more highly collimated relative to the light distributed by the embodiment of lens system 600 shown in FIG. 6. - It will be appreciated that the relative positions of the optical stages described above may be varied in any suitable manner without departing from the scope of the present disclosure. For example, in some embodiments, one or more of the optical stages may be varied to increase the apparent size of
light source 118. Increasing the size of light source 118 may reduce a user's ability to focus on the light source (e.g., by making the light source appear more diffuse) and/or may avoid directly imaging light source 118 on a user's retina. As a non-limiting example, some systems may be configured so that an image of light source 118 may not be focused on a user's retina when the user's retina is positioned within 100 mm of light source 118. - In some embodiments, increasing the apparent source size may include positioning
relay lens stage 604 closer to light source 118, which may cause illumination light 108 to diverge faster, depending upon the configuration of the relay lens stage 604 and light source 118. Because this adjustment may also lead to an increase in the field of view and a decrease in illumination depth region 122, a prescription and/or position for condenser lens stage 602 may also be adjusted to adjust the focal length of optical assembly 120, while the arrangement and pitch of surface-emitting lasers 202 included within light source 118 may be varied to adjust illumination envelope region 128. In some embodiments, optical assembly 120 may also be configured to transform the emission envelope into a different shape while relaying the light to image environment 106.
- FIG. 9 schematically shows a sectional view of another embodiment of an illuminator 102 in the form of a homogenizing light guide 902. Homogenizing light guide 902 is configured to increase an apparent size of light source 118 by receiving light from light source 118 via light receiving surface 904 and spreading it within the light guide. In some embodiments, light source 118 may include an array of surface-emitting lasers 202, and/or may include any other suitable light emitting devices. In one specific example, light source 118 may include a long, thin array of surface-emitting lasers 202. - Homogenizing
light guide 902 takes the form of an optical wedge, though it will be appreciated that any suitable light guide configured to spread and smooth light may be employed without departing from the scope of the present disclosure. In the embodiment shown in FIG. 9, light is retained within homogenizing light guide 902 via total internal reflection in total reflection region 906. Upon leaving total reflection region 906, light encounters a light exit region 908 where the opposing faces of the wedge are angled with respect to light emission surface 910, which allows light to exceed the critical angle for total internal reflection relative to light emission surface 910 and thereby escape the optical wedge.
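- For orientation (an illustrative aside, not part of the original disclosure), the escape condition at any guide surface is set by the critical angle for total internal reflection, arcsin(1/n), for the guide material; the index below is an assumed, typical optical-polymer value rather than one specified by this disclosure.

```python
import math

# Critical angle for total internal reflection at a guide/air interface,
# theta_c = arcsin(1/n). The wedge's angled faces change the incidence
# angle at each bounce until rays meet the escape condition at the
# emission surface. n = 1.49 (a PMMA-like value) is an assumption.
n = 1.49
theta_c = math.degrees(math.asin(1.0 / n))
print(f"n = {n}: critical angle ~ {theta_c:.1f} degrees")
```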
- Light passing along homogenizing light guide 902 may travel in a collimated or near-collimated path to light emission surface 910. In some non-limiting examples, light may fan out by 9 degrees or less while traveling between light receiving surface 904 and light emission surface 910. However, light from light source 118 may blend and mingle while traveling through homogenizing light guide 902, so that the light emitted at light emission surface 910 causes the plurality of lasers to appear as a single, larger source located at light emission surface 910. - After emission from
light emission surface 910, the light is received by a microlens array 912 and spread to fill illumination envelope region 128. Microlens array 912 includes a plurality of small lens elements configured to diverge the light and project it into image environment 106. For example, FIG. 10 schematically shows a front view of a portion of an example microlens array 912 including a plurality of lens elements 1002 retained by a frame 1004. As shown in FIG. 10, each lens element 1002 is defined with reference to a long-axis lens element pitch 1006 that is different from a short-axis lens element pitch 1008, so that each lens element 1002 has an oblong shape. In the embodiment shown in FIG. 10, the pitch is defined with reference to the center of each cell, which may correspond to an apex of each lens surface. Other suitable pitch definitions may be employed in other embodiments without departing from the scope of the present disclosure. - Each of the
lens elements 1002 included in microlens array 912 is configured to create the desired angular field of illumination for optical assembly 120. Put another way, each lens element 1002 is configured to impart a selected angular divergence to incoming light. As used herein, divergent light refers to coherent light that is spread from a more collimated beam into a less collimated beam. Divergent light may have any suitable illumination intensity cross-section, as explained in more detail below, and may have any suitable divergence angle, as measured between an optical axis and an extreme ray of the divergent light. The divergence angle may be adjusted by adjusting the pitch of the lens elements 1002 within microlens array 912. By spreading the incoming light, microlens array 912 transmits light to all regions within illumination envelope region 128.
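- As a rough illustration of the pitch-divergence relationship (an aside, not part of the original disclosure), treating each element as an ideal thin lens whose aperture is one pitch wide gives a divergence half-angle of roughly arctan(pitch / (2 f)); the pitches and focal length below are arbitrary assumed values.

```python
import math

# Idealized thin-lens sketch: with the element aperture one pitch wide,
# divergence half-angle ~ atan(pitch / (2 * focal length)). Unequal
# long- and short-axis pitches then give an oblong far-field envelope.
focal_mm = 1.0                               # assumed element focal length
for axis, pitch_mm in (("long axis", 0.50), ("short axis", 0.30)):
    half = math.degrees(math.atan(pitch_mm / (2.0 * focal_mm)))
    print(f"{axis}: pitch {pitch_mm:.2f} mm -> half-angle ~ {half:.1f} deg")
```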
- FIG. 11 schematically shows a perspective view of an embodiment of an individual lens element 1002 having a convex lens surface 1102. Lens surface 1102 is shaped in part by pitch dimensions for lens element 1002 (e.g., cell dimensions for lens element 1002). In turn, the pitch dimensions for the cell may affect the aspheric nature of lens surface 1102. Consequently, the diverging power of lens element 1002 is established at least in part by the pitch dimensions. In the embodiment shown in FIG. 11, where lens element 1002 is depicted as having an oblong cell shape, convex lens surface 1102 will have a first divergence angle 1104, defined between optical axis 1106 and extreme ray 1108, that will be different from a second divergence angle 1110, defined between optical axis 1106 and extreme ray 1112. When projected into image environment 106, the illumination light, spread in respective directions according to these divergence angles, will in turn establish the boundaries for illumination envelope region 128. - In some embodiments, the degree of divergence that may be realized by
lens elements 1002 may be affected by the refractive index of the material from which the lenses are formed. As the lens curvature increases, the light approaches a total internal reflection limit. However, by increasing the index of refraction, a selected divergence angle may be achieved with comparatively less light bending. For example, in some embodiments, lens elements 1002 may be made from optical grade poly(methyl methacrylate) (PMMA), which has a refractive index of approximately 1.49. In other embodiments, lens elements 1002 may be made from optical grade polycarbonate (PC), having a refractive index of approximately 1.6. Lens elements 1002 made from PC may have less curvature to obtain the same divergence angle compared to elements made from PMMA. It will be appreciated that any suitable optical grade material may be used to make lens elements 1002, including the polymers described above, optical grade glasses, etc.
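- A back-of-the-envelope Snell's-law sketch (an aside, not part of the original disclosure) makes the trade-off concrete: a ray exiting through a surface tilted by phi is deviated by arcsin(n sin phi) - phi, so a higher-index material reaches a given divergence target with less surface tilt, and the total-internal-reflection limit sits at arcsin(1/n). The 20-degree target below is an arbitrary assumed value.

```python
import math

def deviation_deg(phi_deg, n):
    """Bend, in degrees, of an axis-parallel ray leaving a medium of index n
    through an exit surface tilted by phi_deg: arcsin(n*sin(phi)) - phi.
    Returns None past the total-internal-reflection limit n*sin(phi) >= 1."""
    s = n * math.sin(math.radians(phi_deg))
    return None if s >= 1.0 else math.degrees(math.asin(s)) - phi_deg

def tilt_for(target_deg, n):
    """Bisect for the surface tilt giving the target deviation (monotonic)."""
    lo, hi = 0.0, math.degrees(math.asin(1.0 / n)) - 1e-9
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if deviation_deg(mid, n) < target_deg:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Same assumed 20-degree divergence: PC (n ~ 1.6) needs a shallower
# surface than PMMA (n ~ 1.49), consistent with "less curvature".
for name, n in (("PMMA", 1.49), ("PC", 1.60)):
    tir = math.degrees(math.asin(1.0 / n))
    print(f"{name}: tilt ~ {tilt_for(20.0, n):.1f} deg (TIR limit {tir:.1f} deg)")
```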
- While the embodiment of microlens array 912 shown in FIG. 10 depicts the convex lens surfaces included in the array as facing away from light emission surface 910, in some embodiments, convex lens surface 1102 may be positioned toward light source 118. Positioning convex lens surface 1102 to face light source 118 may result in comparatively higher angles of incidence before the light experiences total internal reflection within the lens element, relative to examples where lens surface 1102 faces away from light source 118. In turn, the angular field of illumination, and thus the illumination envelope region, may be larger when lens surface 1102 faces light source 118. Further, positioning lens surface 1102 to face light source 118 may reduce or eliminate the need for some surface coatings (e.g., anti-reflective coatings such as MgF2) that may otherwise be applied if lens surface 1102 faces in another direction. - The aggregate effect of spreading the coherent light at each
lens element 1002 may be to shape the cross-sectional light intensity/irradiance profile from a Gaussian profile associated with incident coherent light into a differently-shaped illumination profile. For example, in some embodiments, as few as six lens elements 1002 may be sufficient to form a desired illumination profile such as the "M"-shaped illumination profile described above.
- FIG. 12 schematically shows another embodiment of illuminator 102 that includes a homogenizing light guide 1202 that has the form of a slab, rather than a wedge. Homogenizing light guide 1202 is configured to receive light from light source 118 via light receiving surface 1204. Light reflects off of a total internal reflection surface 1206 and is directed toward a light emission region 1208 where some of the light is emitted via light emission surface 1210. Light emission surface 1210 is configured as a partial internal reflection surface, reflecting a portion of the light toward total internal reflection surface 1206 for continued propagation while allowing another portion to escape. In some non-limiting examples, light emission surface 1210 may be configured to reflect approximately 95% of incident light at any individual reflection instance, allowing 5% to be emitted to microlens array 912. The reflected light may re-encounter light emission surface 1210 and again experience partial emission of the incident light. Such partial emission instances may be repeated until substantially all of the light received by homogenizing light guide 1202 is emitted via light emission surface 1210. In some embodiments, homogenizing light guide 1202 may include a total internal reflection region positioned opposite total internal reflection surface 1206 to conserve and propagate received light until it reaches light emission region 1208.
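- Those repeated partial emissions follow a simple geometric series (an illustrative aside, not part of the original disclosure): with 95% reflected at each encounter, the fraction emitted after k encounters is 1 - 0.95^k, so on the order of ninety encounters are needed before 99% of the light has left the slab.

```python
# Geometric-series sketch of the slab's partial emission: r = 95% of the
# light is reflected at each encounter with the emission surface (the
# figure used in the example above), so 1 - r**k has escaped after k.
r = 0.95
for k in (1, 10, 45, 90):
    print(f"after {k:3d} surface encounters: {1.0 - r**k:6.1%} emitted")
```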
- Yet another approach to reshaping the emission envelope and increasing the apparent source size includes the use of a folded optical path within optical assembly 120. FIG. 13 schematically shows another embodiment of illuminator 102. The embodiment shown in FIG. 13 also depicts a cross-section of a reflective light guide 1302 that receives at least a portion of the light from light source 118 and emits the light received to microlens array 912. The light follows a folded light path (shown as light paths in FIG. 13) while transiting reflective light guide 1302. The folded light path shown in FIG. 13 includes complementary angles that allow total internal reflection within reflective light guide 1302. The use of complementary angles within reflective light guide 1302 may provide self-correction of one or more reflection errors caused by misplacement of the light guide. It will be appreciated that in some embodiments, a folded light path may include one or more mirrors suitably positioned to achieve the desired optical path. - In the example shown in
FIG. 13, errors that may be introduced by horizontally misplacing reflective light guide 1302 may be canceled by reflection through these complementary angles. For example, light traveling along light path 1304B is received at light entrance 1306 and strikes a first total internal reflection surface 1308, where it is reflected at a first angle 1312 toward a second total internal reflection surface 1310. At the second total internal reflection surface 1310, the light is reflected at a second angle 1314 toward light emission surface 1316. The total internal reflection surfaces are arranged with respect to each other so that angle 1312 is complementary with angle 1314, so that no angular error exists within the light exiting light emission surface 1316. Thus, potential manufacturing errors or impacts to optical assembly 120 may be self-correcting within an acceptable tolerance.
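- The invariance is easy to check numerically (an illustrative aside, not part of the original disclosure): a reflection changes a ray's direction based only on the face's orientation, never its position, so two faces turn a ray by exactly twice the angle between them wherever along the guide the ray strikes; the tilt values below are arbitrary.

```python
import math

def reflect(d, tilt_deg):
    """Reflect a 2-D unit direction off a flat face tilted tilt_deg from
    horizontal; only the face's orientation enters, so sliding the guide
    sideways (a placement error) cannot change the reflected direction."""
    t = math.radians(tilt_deg)
    nx, ny = -math.sin(t), math.cos(t)        # unit normal to the face
    k = 2.0 * (d[0] * nx + d[1] * ny)
    return (d[0] - k * nx, d[1] - k * ny)

# Two faces at complementary tilts turn the ray by a fixed amount,
# 2 * (b - a), independent of where along the guide the bounces occur.
a, b = 25.0, 65.0                              # arbitrary complementary tilts
out = reflect(reflect((0.0, 1.0), a), b)       # ray entering straight "up"
turn = math.degrees(math.atan2(out[1], out[0])) - 90.0
print(f"total turn: {turn:.1f} deg (expected {2 * (b - a):.1f} deg)")
```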
- FIG. 14 shows a flowchart depicting an embodiment of a method 1400 of projecting illumination light into an image environment. It will be appreciated that method 1400 may be performed by any suitable hardware, including but not limited to the hardware described herein. Further, it will be appreciated that the embodiment of method 1400 shown in FIG. 14 and described below is presented for the purpose of example. In some embodiments, any of the processes described with reference to FIG. 14 may be supplemented with other suitable processes, omitted, and/or suitably reordered without departing from the scope of the present disclosure. - At 1402,
method 1400 includes generating coherent light using a plurality of surface-emitting lasers. For example, coherent visible, infrared, or near-infrared light may be generated using suitable surface-emitting lasers like the VCSELs and/or VECSELs described herein. - In some embodiments,
method 1400 may include homogenizing the coherent light at 1404. Homogenizing the coherent light may increase the apparent size of the light source and/or may cause the plurality of surface-emitting lasers to appear as a single source. In some of such embodiments, homogenizing the coherent light at 1404 may include, at 1406, homogenizing the illumination light using a homogenizing light guide. Non-limiting examples of homogenizing light guides include homogenizing light wedges and homogenizing light slabs configured to emit light along one surface via partial reflection of the light while totally reflecting light from another surface within the light guide. In other embodiments, homogenizing the coherent light at 1404 may include, at 1408, homogenizing the illumination light using a reflective light guide. Non-limiting examples of reflective light guides include guides that define folded light paths. In yet other embodiments, such homogenization may be omitted. - At 1410,
method 1400 includes relaying the illumination light to the image environment. In some embodiments, relaying the illumination light to the image environment may include, at 1412, relaying an image of the light source to the image environment via a lens system. In some of such embodiments, the apparent size of the image source may be adjusted by adjusting the focal length, illumination depth region, and illumination envelope region of the lens system. - In some embodiments, relaying the illumination light to the image environment at 1410 may include, at 1414, relaying collimated light to the image environment. For example, as described above, light from each laser of an array of surface-emitting lasers may be collimated, and then directed in a different direction than collimated light from other lasers in the array. As another example, a microlens array may be used to relay the light received from a suitable homogenizing light guide to different portions of the illumination envelope region.
- In some embodiments, the methods and processes described above may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
-
TOF depth camera 100 shown in FIG. 1 depicts an example of a non-limiting embodiment of a computing system that may perform one or more of the methods and processes described above. For example, in the embodiment shown in FIG. 1, light generation module 150 may include instructions executable to operate illuminator 102, and depth information module 152 may include instructions executable to operate image sensor 110 and interpret image information detected by detector 114. While the modules shown in FIG. 1 are illustrated as distinct, standalone entities within TOF depth camera 100, it will be appreciated that the functions performed by such modules may be integrated and/or distributed throughout TOF depth camera 100 and/or a computing device connected locally or remotely with TOF depth camera 100 without departing from the scope of the present disclosure.
- TOF depth camera 100 includes a logic subsystem 160 and a storage subsystem 162. TOF depth camera 100 may optionally include a display subsystem 164, input/output-device subsystem 166, and/or other components not shown in FIG. 1.
- Logic subsystem 160 includes one or more physical devices configured to execute instructions. For example, logic subsystem 160 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
- Logic subsystem 160 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 160 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 160 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel, or distributed processing. Logic subsystem 160 may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.
- Storage subsystem 162 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by logic subsystem 160 to implement the herein-described methods and processes. When such methods and processes are implemented, the state of storage subsystem 162 may be transformed, e.g., to hold different data.
- Storage subsystem 162 may include removable media and/or built-in devices. Storage subsystem 162 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 162 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. In some embodiments, logic subsystem 160 and storage subsystem 162 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC), or a system-on-a-chip. - It will be appreciated that
storage subsystem 162 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal. - The terms "module" and "program" may be used to describe an aspect of the computing system implemented to perform a particular function. In some cases, a module or program may be instantiated via
logic subsystem 160 executing instructions held by storage subsystem 162. It will be understood that different modules and/or programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module and/or program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms "module" and "program" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. - When included,
display subsystem 164 may be used to present a visual representation of data held by storage subsystem 162. This visual representation may take the form of a graphical user interface (GUI). As the herein-described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 164 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 164 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 160 and/or storage subsystem 162 in a shared enclosure, or such display devices may be peripheral display devices. - When included, input/output-
device subsystem 166 may be configured to communicatively couple the computing system with one or more other computing devices. Input/output-device subsystem 166 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, input/output-device subsystem 166 may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, input/output-device subsystem 166 may allow the computing system to send and/or receive messages to and/or from other devices via a network such as the Internet. Input/output-device subsystem 166 may also optionally include or interface with one or more user-input devices such as a keyboard, mouse, game controller, camera, microphone, and/or touch screen, for example. - It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
- The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
- The following statements also form part of the disclosure:-
- Statement 1. A time-of-flight depth camera configured to collect image data from an image environment illuminated by illumination light, the time-of-flight depth camera comprising:
- a light source including a plurality of surface-emitting lasers configured to generate coherent light, the light source comprising an arrangement of the plurality of surface-emitting lasers in a selected shape;
- a lens system configured to project an image of the light source to the image environment, the lens system comprising:
- a first stage positioned to condense light received from the light source, and
- a second stage positioned to receive light from the first stage, the second stage being configured to relay the image of the light source into the image environment to illuminate the image environment; and
- an image sensor configured to detect at least a portion of return light reflected from the image environment.
- Statement 2. The time-of-flight depth camera of Statement 1, where the plurality of surface-emitting lasers are arranged in a rectangular shape.
- Statement 3. The time-of-flight depth camera of Statement 1, wherein the lens system is configured to transmit collimated light emitted from a first surface-emitting laser to the image environment in a different direction than collimated light emitted from a second surface-emitting laser.
- Statement 4. The time-of-flight depth camera of Statement 1, where at least one of the plurality of surface-emitting lasers is selected from the group consisting of vertical external cavity surface-emitting lasers (VECSELs) and vertical-cavity surface-emitting lasers (VCSELs).
- Statement 5. The time-of-flight depth camera of Statement 1, where the lens system has an f-number configured to provide an illumination depth region of at least 0.5 m.
- Statement 6. The time-of-flight depth camera of Statement 1, further comprising a Schmidt plate positioned at an entrance pupil of the lens system.
- Statement 7. The time-of-flight depth camera of Statement 1, where each of the plurality of surface-emitting lasers generates coherent light having an angular divergence of 2 degrees or more.
- Statement 8. The time-of-flight depth camera of Statement 1, where the lens system includes a single substrate having a light receiving surface configured to receive light from the light source and a light emitting surface configured to transmit light to the image environment, where the light receiving surface includes a first pattern encoding a first prescription for the first stage, and where the light emitting surface includes a second pattern encoding a second prescription for the second stage.
- Statement 9. The time-of-flight depth camera of Statement 1, where the first stage is further configured to shape the coherent light into light having a light profile that is more intense at a location farther from an optical axis of the light than at a location closer to the optical axis of the light.
- Statement 10. A peripheral time-of-flight depth camera system configured to collect image data from an image environment illuminated by illumination light, the peripheral time-of-flight depth camera comprising:
- a light source including a plurality of surface-emitting lasers configured to generate coherent light;
- a reflective light guide including a folded light path that receives at least a portion of the coherent light from the light source and emits all of the portion of the coherent light received, the reflective light guide configured to self-correct one or more reflection errors via total internal reflection;
- a microlens array positioned to receive at least a portion of the light emitted from the reflective light guide, the microlens array adapted to diverge the light received from the reflective light guide for projection into the image environment as illumination light;
- an image sensor configured to detect at least a portion of return light reflected from the image environment;
- a logic subsystem; and
- a storage subsystem holding instructions executable by the logic subsystem to generate depth information about the object based upon image information generated by the image sensor from detected return light and to output the depth information to a computing device.
Claims (8)
- A time-of-flight depth camera configured to collect image data from an image environment illuminated by illumination light, the time-of-flight depth camera comprising: a light source including an array of surface-emitting lasers configured to generate coherent light; a homogenizing light guide positioned to receive at least a portion of coherent light from the light source, the homogenizing light guide being configured to increase an apparent size of the light source; a microlens array positioned to receive at least a portion of the light emitted from the homogenizing light guide, the microlens array adapted to diverge the light received from the homogenizing light guide for projection into the image environment as illumination light; and an image sensor configured to detect at least a portion of return light reflected from the image environment.
- The time-of-flight depth camera of claim 1, where the homogenizing light guide is further configured to defocus the coherent light to generate collimated light.
- The time-of-flight depth camera of claim 1, where at least one of the plurality of surface-emitting lasers is selected from the group consisting of vertical external cavity surface-emitting lasers (VECSELs) and vertical-cavity surface-emitting lasers (VCSELs).
- The time-of-flight depth camera of claim 1, where each of the plurality of surface-emitting lasers generates coherent light having an angular divergence of 2 degrees or more.
- The time-of-flight depth camera of claim 1, where the homogenizing light guide includes a homogenizing light wedge having a total internal reflection surface opposite the light emitting surface, the total internal reflection surface being angled with respect to the light emitting surface.
- The time-of-flight depth camera of claim 1, where the homogenizing light guide includes a partial internal reflection surface at the light emitting surface and a total internal reflection surface opposite the partial internal reflection surface.
- The time-of-flight depth camera of claim 1, where the microlens array is further configured to shape the light received from the homogenizing light guide into light having a light profile that is more intense at a location farther from an optical axis of the light than at a location closer to the optical axis of the light.
- The time-of-flight depth camera of claim 1, where the microlens array is further configured to shape the light received from the light guide into a profile comprising a rectangular shape.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/585,638 US9297889B2 (en) | 2012-08-14 | 2012-08-14 | Illumination light projection for a depth camera |
EP13750224.1A EP2885650B1 (en) | 2012-08-14 | 2013-08-05 | Illumination light projection for a depth camera |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13750224.1A Division EP2885650B1 (en) | 2012-08-14 | 2013-08-05 | Illumination light projection for a depth camera |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3176601A1 true EP3176601A1 (en) | 2017-06-07 |
EP3176601B1 EP3176601B1 (en) | 2018-01-31 |
Family
ID=48986257
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17151248.6A Active EP3176601B1 (en) | 2012-08-14 | 2013-08-05 | Illumination light projection for a depth camera |
EP13750224.1A Active EP2885650B1 (en) | 2012-08-14 | 2013-08-05 | Illumination light projection for a depth camera |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13750224.1A Active EP2885650B1 (en) | 2012-08-14 | 2013-08-05 | Illumination light projection for a depth camera |
Country Status (4)
Country | Link |
---|---|
US (2) | US9297889B2 (en) |
EP (2) | EP3176601B1 (en) |
CN (1) | CN104583804B (en) |
WO (1) | WO2014028251A1 (en) |
Families Citing this family (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12226188B2 (en) | 2012-12-31 | 2025-02-18 | Omni Medsci, Inc. | Active illumination and time-of-flight camera system to evaluate facial blood flow, eye movements and physiological parameters |
US10660526B2 (en) | 2012-12-31 | 2020-05-26 | Omni Medsci, Inc. | Near-infrared time-of-flight imaging using laser diodes with Bragg reflectors |
US9134114B2 (en) * | 2013-03-11 | 2015-09-15 | Texas Instruments Incorporated | Time of flight sensor binning |
US9874638B2 (en) * | 2014-03-06 | 2018-01-23 | University Of Waikato | Time of flight camera system which resolves direct and multi-path radiation components |
JP6343972B2 (en) * | 2014-03-10 | 2018-06-20 | 富士通株式会社 | Illumination device and biometric authentication device |
US9841496B2 (en) | 2014-11-21 | 2017-12-12 | Microsoft Technology Licensing, Llc | Multiple pattern illumination optics for time of flight system |
US9918073B2 (en) | 2014-12-22 | 2018-03-13 | Google Llc | Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with movable illuminated region of interest |
US9854226B2 (en) | 2014-12-22 | 2017-12-26 | Google Inc. | Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element |
US9958758B2 (en) | 2015-01-21 | 2018-05-01 | Microsoft Technology Licensing, Llc | Multiple exposure structured light pattern |
US9992396B1 (en) | 2015-02-02 | 2018-06-05 | Apple Inc. | Focusing lighting module |
WO2016131658A1 (en) * | 2015-02-19 | 2016-08-25 | Koninklijke Philips N.V. | Infrared laser illumination device |
US20160255966A1 (en) * | 2015-03-03 | 2016-09-08 | Sealy Technology, Llc | Real time adaptable body support system and method of operation |
JP6753653B2 (en) * | 2015-06-23 | 2020-09-09 | ローム株式会社 | Proximity sensor and electronic devices using it |
US10444467B2 (en) | 2015-11-25 | 2019-10-15 | Himax Technologies Limited | Collimation lens module and light source module using the same |
EP3185060B1 (en) * | 2015-12-23 | 2018-12-12 | Himax Technologies Limited | Collimation lens module and light source module using the same |
JP2017120364A (en) * | 2015-12-28 | 2017-07-06 | 奇景光電股▲ふん▼有限公司 | Projector, electronic apparatus having projector, and related manufacturing method |
US10761195B2 (en) | 2016-04-22 | 2020-09-01 | OPSYS Tech Ltd. | Multi-wavelength LIDAR system |
EP4474867A3 (en) * | 2016-09-30 | 2025-02-19 | Magic Leap, Inc. | Projector with spatial light modulation |
CN106772431B (en) * | 2017-01-23 | 2019-09-20 | 杭州蓝芯科技有限公司 | A kind of Depth Information Acquistion devices and methods therefor of combination TOF technology and binocular vision |
FR3063374B1 (en) | 2017-02-27 | 2019-06-07 | Stmicroelectronics Sa | METHOD AND DEVICE FOR DETERMINING A DEPTH MAP OF A SCENE |
US10445893B2 (en) | 2017-03-10 | 2019-10-15 | Microsoft Technology Licensing, Llc | Dot-based time of flight |
KR102592139B1 (en) | 2017-03-13 | 2023-10-23 | 옵시스 테크 엘티디 | Eye-Safe Scanning LIDAR System |
US10451741B2 (en) | 2017-04-30 | 2019-10-22 | Microsoft Technology Licensing, Llc | Time of flight camera |
US10542245B2 (en) * | 2017-05-24 | 2020-01-21 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10591923B2 (en) * | 2017-06-27 | 2020-03-17 | GM Global Technology Operations LLC | Method and apparatus for parallel illumination by a VCSEL array |
KR102476404B1 (en) * | 2017-07-18 | 2022-12-12 | 엘지이노텍 주식회사 | Tof module and subject recogniging apparatus using the same |
KR102326508B1 (en) | 2017-07-28 | 2021-11-17 | 옵시스 테크 엘티디 | Vcsel array lidar transmitter with small angular divergence |
KR101988602B1 (en) * | 2017-09-14 | 2019-06-13 | 주식회사 나오텍 | Lens assembly for TOF |
CN107422571B (en) * | 2017-09-20 | 2020-08-21 | 京东方科技集团股份有限公司 | Display panel, device and control method thereof |
WO2019221776A2 (en) | 2017-11-15 | 2019-11-21 | OPSYS Tech Ltd. | Noise adaptive solid-state lidar system |
CN113325392A (en) * | 2017-12-08 | 2021-08-31 | 浙江舜宇智能光学技术有限公司 | Wide-angle TOF module and application thereof |
CN108490632B (en) | 2018-03-12 | 2020-01-10 | Oppo广东移动通信有限公司 | Laser projection module, depth camera and electronic device |
CN111919137A (en) | 2018-04-01 | 2020-11-10 | 欧普赛斯技术有限公司 | Noise adaptive solid state LIDAR system |
US10914823B2 (en) * | 2018-05-01 | 2021-02-09 | Qualcomm Incorporated | Time of flight ranging with varying fields of emission |
CN109459738A (en) * | 2018-06-06 | 2019-03-12 | 杭州艾芯智能科技有限公司 | A kind of more TOF cameras mutually avoid the method and system of interference |
CN112740666A (en) | 2018-07-19 | 2021-04-30 | 艾科缇弗外科公司 | System and method for multi-modal depth sensing in an automated surgical robotic vision system |
US11609313B2 (en) | 2018-07-31 | 2023-03-21 | Waymo Llc | Hybrid time-of-flight and imager module |
JP2021532368A (en) | 2018-08-03 | 2021-11-25 | オプシス テック リミテッド | Distributed modular solid-state lidar system |
CA3226819A1 (en) | 2018-08-10 | 2020-02-13 | Aurora Operations, Inc. | Method and system for scanning of coherent lidar with fan of collimated beams |
US11178392B2 (en) * | 2018-09-12 | 2021-11-16 | Apple Inc. | Integrated optical emitters and applications thereof |
US10901092B1 (en) | 2018-10-02 | 2021-01-26 | Facebook Technologies, Llc | Depth sensing using dynamic illumination with range extension |
US10896516B1 (en) | 2018-10-02 | 2021-01-19 | Facebook Technologies, Llc | Low-power depth sensing using dynamic illumination |
US11158074B1 (en) | 2018-10-02 | 2021-10-26 | Facebook Technologies, Llc | Depth sensing using temporal coding |
US10839536B2 (en) * | 2018-10-02 | 2020-11-17 | Facebook Technologies, Llc | Depth sensing using grid light patterns |
KR20200069096A (en) * | 2018-12-06 | 2020-06-16 | 삼성전자주식회사 | Electronic device and method for acquiring depth information of object by using the same |
GB2583172B (en) * | 2019-02-06 | 2023-09-13 | Rockley Photonics Ltd | Optical components for imaging |
KR102651647B1 (en) | 2019-03-12 | 2024-03-26 | 루머스 리미티드 | image projector |
CA3132350A1 (en) | 2019-04-08 | 2020-10-15 | Stephen Tully | Systems and methods for medical imaging |
KR102634887B1 (en) | 2019-04-09 | 2024-02-08 | 옵시스 테크 엘티디 | Solid-state LIDAR transmitter with laser control |
WO2020214821A1 (en) | 2019-04-19 | 2020-10-22 | Activ Surgical, Inc. | Systems and methods for trocar kinematics |
EP3956680A4 (en) * | 2019-05-23 | 2022-12-28 | Sense Photonics, Inc. | Optical aperture division for customization of far field pattern |
JP7603998B2 (en) | 2019-05-30 | 2024-12-23 | オプシス テック リミテッド | Eye-safe long-range LIDAR system using actuators |
KR102637658B1 (en) | 2019-06-10 | 2024-02-20 | 옵시스 테크 엘티디 | Eye-safe long-range solid-state LIDAR system |
JP7620330B2 (en) | 2019-06-25 | 2025-01-23 | オプシス テック リミテッド | Adaptive Multi-Pulse LIDAR System |
CN114174869A (en) | 2019-07-31 | 2022-03-11 | 欧普赛斯技术有限公司 | High Resolution Solid State LIDAR Transmitter |
CN112394526A (en) * | 2019-08-19 | 2021-02-23 | 上海鲲游光电科技有限公司 | Multi-dimensional camera device and application terminal and method thereof |
WO2021035094A1 (en) | 2019-08-21 | 2021-02-25 | Activ Surgical, Inc. | Systems and methods for medical imaging |
US20240085533A1 (en) * | 2019-10-24 | 2024-03-14 | Sony Semiconductor Solutions Corporation | Illumination device, light detection device and method |
WO2021087998A1 (en) * | 2019-11-08 | 2021-05-14 | 南昌欧菲生物识别技术有限公司 | Light emitting module, depth camera and electronic device |
CN111007523A (en) * | 2019-12-09 | 2020-04-14 | Oppo广东移动通信有限公司 | Time-of-flight transmitters, time-of-flight depth modules and electronics |
US20220357431A1 (en) * | 2019-12-30 | 2022-11-10 | Lumus Ltd. | Detection and ranging systems employing optical waveguides |
WO2021214745A1 (en) | 2020-04-20 | 2021-10-28 | Lumus Ltd. | Near-eye display with enhanced laser efficiency and eye safety |
WO2021256576A1 (en) * | 2020-06-16 | 2021-12-23 | 엘지전자 주식회사 | Device for generating depth image and operating method thereof |
US12224551B2 (en) | 2020-12-10 | 2025-02-11 | Osram Opto Semiconductors Gmbh | Laser package and projector with the laser package |
US12242672B1 (en) | 2022-10-21 | 2025-03-04 | Meta Platforms Technologies, Llc | Triggering actions based on detected motions on an artificial reality device |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5013133A (en) * | 1988-10-31 | 1991-05-07 | The University Of Rochester | Diffractive optical imaging lens systems |
US7028899B2 (en) | 1999-06-07 | 2006-04-18 | Metrologic Instruments, Inc. | Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target |
US20020071472A1 (en) | 1999-04-30 | 2002-06-13 | Metrologic Instruments, Inc. | DOE-based systems and devices for producing laser beams having modified beam characteristics |
JP2001264662A (en) | 2000-03-16 | 2001-09-26 | Fuji Photo Film Co Ltd | Color laser display |
US6606173B2 (en) | 2000-08-01 | 2003-08-12 | Riake Corporation | Illumination device and method for laser projector |
US6870650B2 (en) * | 2000-08-01 | 2005-03-22 | Riake Corporation | Illumination device and method for laser projector |
US20040037450A1 (en) * | 2002-08-22 | 2004-02-26 | Bradski Gary R. | Method, apparatus and system for using computer vision to identify facial characteristics |
EP1771767A4 (en) | 2004-07-30 | 2009-12-23 | Novalux Inc | Projection display apparatus, system, and method |
JP4681842B2 (en) | 2004-09-30 | 2011-05-11 | キヤノン株式会社 | Zoom lens and imaging apparatus having the same |
JP2007201642A (en) | 2006-01-24 | 2007-08-09 | Kyocera Mita Corp | Image reader and image-forming device having the same |
US8066378B2 (en) | 2006-10-06 | 2011-11-29 | Marc Lalley | Three-dimensional internal back-projection system and method for using the same |
JP5336475B2 (en) | 2007-05-20 | 2013-11-06 | スリーエム イノベイティブ プロパティズ カンパニー | Optical recycling hollow cavity type display backlight |
JP5042767B2 (en) | 2007-10-05 | 2012-10-03 | 富士フイルム株式会社 | Imaging lens and imaging apparatus |
DE102008045387B4 (en) | 2008-09-02 | 2017-02-09 | Carl Zeiss Ag | Apparatus and method for measuring a surface |
US8066389B2 (en) | 2009-04-30 | 2011-11-29 | Eastman Kodak Company | Beam alignment chamber providing divergence correction |
US8803967B2 (en) * | 2009-07-31 | 2014-08-12 | Mesa Imaging Ag | Time of flight camera with rectangular field of illumination |
TWM399332U (en) * | 2009-10-20 | 2011-03-01 | Fujifilm Corp | Photographic lens and photographic device |
US8320621B2 (en) | 2009-12-21 | 2012-11-27 | Microsoft Corporation | Depth projector system with integrated VCSEL array |
US8330804B2 (en) | 2010-05-12 | 2012-12-11 | Microsoft Corporation | Scanned-beam depth mapping to 2D image |
US8351120B2 (en) | 2010-09-15 | 2013-01-08 | Visera Technologies Company Limited | Optical device having extented depth of field and fabrication method thereof |
US8548270B2 (en) | 2010-10-04 | 2013-10-01 | Microsoft Corporation | Time-of-flight depth imaging |
US8803952B2 (en) | 2010-12-20 | 2014-08-12 | Microsoft Corporation | Plural detector time-of-flight depth mapping |
US8854426B2 (en) * | 2011-11-07 | 2014-10-07 | Microsoft Corporation | Time-of-flight camera with guided light |
-
2012
- 2012-08-14 US US13/585,638 patent/US9297889B2/en active Active
-
2013
- 2013-08-05 WO PCT/US2013/053539 patent/WO2014028251A1/en active Application Filing
- 2013-08-05 EP EP17151248.6A patent/EP3176601B1/en active Active
- 2013-08-05 CN CN201380043199.XA patent/CN104583804B/en active Active
- 2013-08-05 EP EP13750224.1A patent/EP2885650B1/en active Active
-
2016
- 2016-03-11 US US15/068,479 patent/US9891309B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7436494B1 (en) * | 2003-03-28 | 2008-10-14 | Irvine Sensors Corp. | Three-dimensional ladar module with alignment reference insert circuitry |
US20080278460A1 (en) * | 2007-05-11 | 2008-11-13 | Rpo Pty Limited | Transmissive Body |
WO2010104692A2 (en) * | 2009-03-13 | 2010-09-16 | Microsoft Corporation | Image display via multiple light guide sections |
EP2442134A1 (en) * | 2010-10-01 | 2012-04-18 | Jay Young Wee | Image acquisition unit, acquisition method, and associated control unit |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020075953A1 (en) * | 2018-10-08 | 2020-04-16 | 삼성전자 주식회사 | Method for generating depth information using structured light projected on external object, and electronic apparatus using same |
WO2020076454A1 (en) * | 2018-10-09 | 2020-04-16 | Waymo Llc | Multichannel monostatic rangefinder |
US10707195B2 (en) | 2018-10-09 | 2020-07-07 | Waymo Llc | Multichannel monostatic rangefinder |
US11088127B2 (en) | 2018-10-09 | 2021-08-10 | Waymo Llc | Multichannel monostatic rangefinder |
JP2022504206A (en) * | 2018-10-09 | 2022-01-13 | ウェイモ エルエルシー | Multi-channel monostatic rangefinder |
US11658166B2 (en) | 2018-10-09 | 2023-05-23 | Waymo Llc | Multichannel monostatic rangefinder |
Also Published As
Publication number | Publication date |
---|---|
EP3176601B1 (en) | 2018-01-31 |
WO2014028251A1 (en) | 2014-02-20 |
EP2885650A1 (en) | 2015-06-24 |
CN104583804A (en) | 2015-04-29 |
US9891309B2 (en) | 2018-02-13 |
CN104583804B (en) | 2018-03-06 |
US20160195610A1 (en) | 2016-07-07 |
EP2885650B1 (en) | 2017-01-18 |
US20140049610A1 (en) | 2014-02-20 |
US9297889B2 (en) | 2016-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9891309B2 (en) | Illumination light projection for a depth camera | |
US9958758B2 (en) | Multiple exposure structured light pattern | |
US20250020783A1 (en) | Lidar device comprising a plurality of beam steering cells for steering a laser beam | |
TWI838490B (en) | Time of flight-based three-dimensional sensing system | |
EP3221716B1 (en) | Multiple pattern illumination optics for time of flight system | |
TWI716533B (en) | Multi-mode illumination module and related method | |
WO2014028652A1 (en) | Illumination light shaping for a depth camera | |
JP7319690B2 (en) | Optical and detector design to improve the resolution of lidar systems | |
US10001583B2 (en) | Structured light projection using a compound patterned mask | |
US20250004286A1 (en) | Optical element and optical system | |
KR20210059591A (en) | An optic and method for producing the same | |
KR20210027041A (en) | A vcsel array and a lidar device employing thereof | |
JP7565342B2 (en) | Projector for solid-state LIDAR systems | |
JP2020106771A (en) | Diffraction optical element and optical system device using the same | |
CN216956618U (en) | Structured light projector, camera module and electronic equipment | |
US20250003739A1 (en) | Eye safety for projectors | |
CN214954356U (en) | Speckle projector and electronic equipment | |
CN113009705A (en) | Structured light assembly for eliminating zero-order diffraction influence | |
TW202407380A (en) | Lidar system and resolusion improvement method thereof | |
CN116893506A (en) | Scanner laser optics for LIDAR | |
KR20220133565A (en) | Camera module | |
CN113391514A (en) | 3D imaging device and method | |
JPWO2021043851A5 (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
17P | Request for examination filed |
Effective date: 20170112 |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2885650 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17Q | First examination report despatched |
Effective date: 20170519 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20171030 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2885650 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 967881 Country of ref document: AT Kind code of ref document: T Effective date: 20180215 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013032898 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 967881 Country of ref document: AT Kind code of ref document: T Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180430 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180501 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180430 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013032898 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20181102 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180805 |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180831 |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180831 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20180831 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180805 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180831 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R082 Ref document number: 602013032898 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180805 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
Ref country code: MK Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180131 |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130805 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180805 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230428 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20230720 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20240723 Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240723 Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240723 Year of fee payment: 12 |