CN110349985B - Image sensor and method of manufacturing the same - Google Patents
Image sensor and method of manufacturing the same
- Publication number
- CN110349985B (application CN201910150495.9A)
- Authority
- CN
- China
- Prior art keywords
- layer
- image sensor
- lens element
- barrier layer
- light
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/0006—Arrays
- G02B3/0037—Arrays characterized by the distribution or form of lenses
- G02B3/0056—Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/011—Manufacture or treatment of image sensors covered by group H10F39/12
- H10F39/014—Manufacture or treatment of image sensors covered by group H10F39/12 of CMOS image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/18—Complementary metal-oxide-semiconductor [CMOS] image sensors; Photodiode array image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/806—Optical elements or arrangements associated with the image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/806—Optical elements or arrangements associated with the image sensors
- H10F39/8063—Microlenses
Abstract
An image sensor and a method of manufacturing the image sensor are provided. The image sensor includes: a blocking layer including an absorption layer and a transparent layer; a lens element located below the blocking layer; and a sensing element arranged to face the lens element.
Description
The present application claims the benefit of Korean Patent Application No. 10-2018-0039290, filed on April 4, 2018, and Korean Patent Application No. 10-2018-0074099, filed on June 27, 2018, which are incorporated herein by reference for all purposes.
Technical Field
The following description relates to an image sensor and a method of manufacturing the same.
Background
An image sensor is a device configured to capture an image of an object, and may convert an optical signal including image information of the object into an electrical signal. Image sensors are included in various electronic devices. For example, image sensors such as Charge Coupled Device (CCD) image sensors and Complementary Metal Oxide Semiconductor (CMOS) image sensors are widely used.
A CMOS image sensor includes a plurality of pixels, each including a plurality of transistors and a photoelectric conversion device. The signal photoelectrically converted by the photoelectric conversion device may be processed and output by the transistors, and image data may be generated based on the pixel signals output from the pixels. Each of the plurality of pixels may photoelectrically convert light of a color in a wavelength range to generate and output a signal.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one general aspect, there is provided an image sensor including a blocking layer including an absorbing layer and a transparent layer, the blocking layer configured to transmit light through an opening formed in the absorbing layer and the transparent layer, and a lens element configured to transmit the light to a sensing element.
The sensing element may be spaced apart from the lens element, the sensing element being configured to receive light passing through the opening and the lens element.
The lens element may be configured to refract light and form a focal point on the sensing element.
The barrier layer may be spaced from the sensing element by a focal length of the lens element.
The image sensor may include a transparent substrate configured to transmit light.
The opening may be located in the barrier layer to correspond to the arrangement of the lens elements.
The image sensor may include a spacer configured to maintain a spacing between the barrier layer and the sensing element.
The absorbing layer and the transparent layer may be alternately arranged in the barrier layer.
The absorbing layer may comprise a circular iris opening centered at a point corresponding to the lens element.
The absorbing layer may comprise an iris opening having a first diameter centered at a point corresponding to the lens element and the blocking layer may comprise a further absorbing layer, wherein the further absorbing layer comprises a further iris opening having a second diameter different from the first diameter.
The diameter of the opening may gradually change from one surface of the barrier layer to the other surface.
The transparent layer may be configured to transmit light within a band of wavelengths.
The barrier layer may have a height based on a field of view (FOV).
The number of absorber layers stacked in the barrier layer may be determined based on the field of view (FOV).
The diameter of the opening may be based on the amount of light, and the barrier layer is spaced from the sensing element by a focal length determined based on the amount of light.
The transparent layer may include a transparent polymer configured to transmit light.
The absorbing layer may include a black matrix material configured to absorb light.
The barrier layer may include a plurality of absorber layers, and the circular iris opening formed in each of the plurality of absorber layers may have a diameter determined based on a field of view (FOV) and a refractive index determined by the transparent layer and a transparent substrate disposed on the barrier layer.
The lens elements and the sensing elements may be arranged in a planar array pattern and the absorbing layer may include iris openings arranged based on the planar array pattern.
The transparent substrate may be disposed on a first side of the barrier layer and the lens element may be disposed on a second side of the barrier layer opposite the first side.
In another general aspect, there is provided a method of manufacturing an image sensor, the method including providing a transparent substrate, providing a barrier layer including an absorption layer and a transparent layer, and providing a lens element based on a pattern of an opening portion formed in the barrier layer.
In another general aspect, a method of manufacturing an image sensor includes disposing a barrier layer including an absorbing layer and a transparent layer on a transparent substrate, and disposing a lens element corresponding to a pattern of an opening formed in the barrier layer.
The step of disposing the barrier layer may include coating the transparent substrate with an opaque polymer, disposing a mask having a pattern on portions of the opaque polymer, emitting Ultraviolet (UV) rays toward the opaque polymer exposed through the mask pattern, removing the portions of the opaque polymer covered by the mask, coating a transparent layer, which may include a negative photoresist, over and between the remaining opaque polymer, and exposing the transparent layer to UV rays.
The method may include increasing the bond between the transparent layer and the remaining opaque polymer by performing a hydrophilic treatment.
The mask may include a circular mask having a mesh pattern.
The coating may include a negative photoresist.
The step of disposing the lens element may include coating a thermoplastic polymer layer, which may include a positive photoresist, on the barrier layer, disposing a mask having a pattern on a portion of the thermoplastic polymer layer, exposing the thermoplastic polymer layer to Ultraviolet (UV) rays through the pattern of the mask, dissolving, using a developer, the portions of the thermoplastic polymer layer exposed to the UV rays, applying a hydrophobic coating to the patterned thermoplastic polymer layer, and heating the coated thermoplastic polymer layer to form a spherical lens.
The thermoplastic polymer layer may comprise a transparent material that is formable by heating.
In another general aspect, there is provided an image sensor including a blocking layer including an absorbing layer and a transparent layer alternately stacked together, an opening formed in the absorbing layer and the transparent layer to transmit light to a lens element, and a sensing element spaced apart from the lens element and configured to receive light from the lens element.
The diameter of the opening may gradually change between two opposite surfaces of the barrier layer.
The opening portion may include a circular opening of each of the plurality of absorber layers, a diameter of the circular opening being based on a field of view (FOV) and a refractive index determined by the transparent layer and the transparent substrate disposed on the barrier layer.
The image sensor may include a spacer located at an outer boundary of the barrier layer and the sensing element, the spacer configured to maintain a spacing substantially equal to a focal length of the lens element between the barrier layer and the sensing element.
Other features and aspects will be apparent from the following detailed description, the accompanying drawings, and the claims.
Drawings
Fig. 1 shows an example of a configuration of an image sensor.
Fig. 2 shows an example of a barrier layer and a lens element.
Fig. 3 shows an example of a cross section of a barrier layer and a lens element.
Fig. 4 shows an example of a field of view (FOV) of an image sensor.
Fig. 5 shows an example of FOV determined by an opening (aperture) of a barrier layer.
Fig. 6 shows another example of FOV determined by an opening of a barrier layer.
Fig. 7 is a diagram illustrating an example of a method of manufacturing an image sensor.
Fig. 8 shows an example of providing a barrier layer.
Fig. 9 shows an example of a method of setting a lens element.
Fig. 10 shows an example of an image obtained by capturing, using a scanning electron microscope, a structure in which a lens element is integrated on one surface of a barrier layer.
Fig. 11 shows an example of an image obtained by capturing a structure in which a lens element is integrated on one surface of a barrier layer using an optical microscope.
Fig. 12 shows an example of transmittance of light based on the number of absorption layers included in the blocking layer.
Fig. 13 shows an example of an image obtained by capturing the upper surface and the side surface of the image sensor using a confocal laser scanning microscope.
Fig. 14 shows an example of an intensity distribution of an image sensor.
Fig. 15 shows an example of an image acquired by the image sensor.
Fig. 16 shows an example of a result obtained by measuring the FOV of the image sensor.
Fig. 17 and 18 show examples of the influence of the blocking layer on the Modulation Transfer Function (MTF).
Throughout the drawings and detailed description, identical reference numerals will be understood to refer to identical elements, features and structures unless otherwise described or provided. The figures may not be drawn to scale and the relative sizes, proportions, and descriptions of elements in the figures may be exaggerated for clarity, illustration, and convenience.
Detailed Description
The following detailed description is provided to assist the reader in obtaining a comprehensive understanding of the methods, apparatus, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatus, and/or systems described herein will be apparent after an understanding of the present disclosure. For example, the order of operations described herein is merely an example, and the order of operations is not limited to the order set forth herein, but may be altered as would be apparent after an understanding of the disclosure of the application, except for operations that must occur in a specific order. In addition, descriptions of features known in the art may be omitted for the sake of clarity and conciseness.
The features described herein may be implemented in different forms and should not be construed as limited to the examples described herein. Rather, the examples described herein are provided to illustrate only some of the many possible ways to implement the methods, devices, and/or systems described herein, as will be apparent after an understanding of the present disclosure.
Hereinafter, examples will be described in detail with reference to the accompanying drawings. However, the scope of the claims should not be construed as limited to the examples set forth herein. Various modifications may be made to the examples. The examples are not to be construed as limited to the present disclosure and should be construed as including all changes, equivalents, and alternatives within the spirit and technical scope of the present disclosure.
The terminology used herein is for the purpose of describing particular examples only and is not intended to be limiting. As used herein, the singular is intended to include the plural unless the context clearly indicates otherwise. As used herein, the term "and/or" includes any one of the items listed in association or any combination of any two or more of the items listed in association.
Although the terms "first" or "second" are used to explain various components, the components are not limited to the terms. These terms should be only used to distinguish one element from another element. For example, within the scope of the claims in accordance with the concepts of the present disclosure, a "first" component may be referred to as a "second" component, or similarly, a "second" component may be referred to as a "first" component. It will be understood that when an element is referred to as being "connected to" another element, it can be directly connected or coupled to the other element or intervening elements may be present.
The use of the term "may" (e.g., the term "may" with respect to what the example or embodiment may include or implement) herein with respect to the example or embodiment indicates that there is at least one example or embodiment that includes or implements such feature, and all examples and embodiments are not so limited.
With regard to the reference numerals assigned to the elements in the drawings, it should be noted that identical elements will be denoted by the same reference numerals, wherever possible, even though they are illustrated in different drawings. In addition, in the description of the examples, when it is considered that a detailed description of well-known related structures or functions would lead to an ambiguous interpretation of the present disclosure, such description will be omitted.
Fig. 1 shows an example of a configuration of an image sensor 100.
The image sensor 100 is a device for capturing an image of an object. In one example, the image sensor 100 includes a lens element 110, a barrier layer 120, a transparent substrate 130, a sensing element 140, a spacer 150, and a camera chip 190.
The lens element 110 is an optical element that refracts light received from the outside and condenses the light. The lens element 110 refracts the light to form a focal point on the sensing element 140. In one example, one surface of the lens element 110 has a protrusion, while the other surface is flat. In one example, the lens element 110 is a microlens having a convex surface. Other shapes of the protruding surface of the lens element 110 (such as, for example, spherical, aspherical, concave, or Fresnel shapes) may be used without departing from the spirit and scope of the described illustrative examples.
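As a rough illustration of how the lens curvature determines where the focal point falls on the sensing element 140 (and hence how large the spacing maintained by the spacer must be), the following minimal Python sketch applies the thin-lens lensmaker's equation for a plano-convex microlens. The radius of curvature and refractive index are assumed example values, not figures taken from this patent.

```python
def plano_convex_focal_length(radius_of_curvature_um, n_lens, n_medium=1.0):
    """Thin-lens focal length of a plano-convex lens (one curved surface):
    f = R / (n_lens / n_medium - 1)."""
    return radius_of_curvature_um / (n_lens / n_medium - 1.0)

# Assumed values: a resist microlens with n ~ 1.6 and a 40 um radius of curvature.
f = plano_convex_focal_length(radius_of_curvature_um=40.0, n_lens=1.6)
print(f"focal length ~ {f:.1f} um")  # the barrier-to-sensor spacing would be set near this
```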
A group of lens elements 110 is referred to as a lens array. For example, a lens array includes a plurality of lens elements arranged in a planar array pattern (such as, for example, a grid pattern).
The blocking layer 120 is a layer that blocks light. In one example, the barrier layer 120 includes a transparent layer and an absorber layer that is patterned (e.g., hole pattern). In one example, the barrier layer 120 includes an opening formed by a pattern of an absorber layer. The blocking layer 120 transmits light received from the outside to the lens element 110 through the opening portion. Examples of transparent layers and absorbing layers included in the barrier layer 120 will be further described below with reference to fig. 2 and 3.
The transparent substrate 130 is a transparent substrate configured to transmit light. In one example, the transparent substrate 130 is disposed on the barrier layer 120. However, the arrangement of the transparent substrate 130 is not limited thereto, and the transparent substrate 130 is also arranged on the lens element 110 located on the barrier layer 120. In one example, the transparent substrate 130 includes a glass wafer, however, the type of the transparent substrate 130 is not limited thereto. Other types of transparent substrates 130 may be used without departing from the spirit and scope of the illustrative examples described.
In one example, the sensing element 140 is spaced apart from the lens element 110 and receives light through the opening of the barrier layer 120 and the lens element 110. The sensing element 140 receives light collected by the lens element 110 after the light passes through the transparent substrate 130 and the opening portion of the blocking layer 120. The sensing element 140 outputs a signal indicative of the intensity of the received light. In one example, the sensing element 140 senses light corresponding to a color channel and outputs a signal indicative of the intensity of the corresponding color. The color channel is a channel representing a color corresponding to a portion of the visible region, and includes, for example, a red channel, a green channel, and a blue channel. In one example, a separate sensing element 140 senses light corresponding to one of the red, green, and blue color channels. The wavelength that the sensing element 140 can sense is not limited to the visible region. In other examples, the sensing element 140 may sense infrared or ultraviolet rays depending on the design.
A set of sensing elements 140 is referred to as a sensor array. For example, the sensor array includes a plurality of sensing elements arranged in a planar array pattern (e.g., a grid pattern). In one example, the sensor array includes a sensing element 140 that senses red, a sensing element 140 that senses green, and a sensing element 140 that senses blue. In one example, the sensor array can differentially sense three colors.
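Because the lens elements, the iris openings, and the sensing elements share the same planar array (grid) pattern, each opening maps to exactly one lens element and one sensing element. A tiny sketch of that shared layout follows, with an assumed pitch and array size that are illustrative only.

```python
def grid_centers(rows, cols, pitch_um):
    """Center coordinates shared by the lens elements, iris openings,
    and sensing elements arranged in a planar grid pattern."""
    return [(col * pitch_um, row * pitch_um)
            for row in range(rows) for col in range(cols)]

# Assumed 4 x 4 array with a 70 um pitch (illustrative values only).
centers = grid_centers(rows=4, cols=4, pitch_um=70.0)
print(len(centers), "elements; first three centers:", centers[:3])
```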
The spacer 150 maintains the spacing between the barrier 120 and the sensing element 140. Here, maintaining the interval between the barrier layer 120 and the sensing element 140 may be understood as maintaining the interval between the barrier layer 120 and the sensing element 140 within a predetermined range of predetermined values. For example, the spacers 150 support the barrier layer 120 and the sensing element 140. As shown in fig. 1, in one example, spacers 150 are disposed along the outer boundary of the sensor array and support the outer boundary of the barrier 120.
The camera chip 190 is a chip implementing a sensor array. For example, camera chip 190 is implemented at the wafer level.
In one example, the lens element 110, the barrier layer 120, the transparent substrate 130, the sensing element 140, the spacer 150, and the camera chip 190 are combined by an integrated process.
Fig. 2 shows an example of a barrier layer 220 and a lens element 210.
The barrier layer 220 includes an absorber layer 221 and a transparent layer 222. The blocking layer 220 transmits light received from the outside through the opening part 229 formed in the absorbing layer 221 and the transparent layer 222. In one example, the blocking layer 220 provides light received from the outside to the lens element 210 through the opening 229.
The absorption layer 221 is a layer that absorbs light, and is also referred to as a "light absorption layer". In one example, the absorbing layer 221 includes a black matrix material that absorbs light. For example, the black matrix material includes black SU-8. However, the material of the absorption layer 221 is not limited thereto, and in another example, the absorption layer 221 includes a negative photoresist that absorbs light. In one example, the absorber layer 221 includes a circular iris opening (iris diaphragm) formed centered on a point corresponding to the lens element 210. The absorbing layer 221 includes iris openings arranged based on a planar array pattern.
The transparent layer 222 is a layer transmitting light. In one example, the transparent layer 222 includes a transparent polymer that transmits light. For example, the transparent polymer includes SU-8. However, the material of the transparent layer 222 is not limited thereto, and the transparent layer 222 includes any negative photoresist that transmits light. In one example, the transparent layer 222 transmits light within a band of wavelengths. In another example, the transparent layer 222 transmits light in the visible light band.
In one example, the barrier layer 220 forms the opening 229 by a structure in which the absorption layers 221 and the transparent layers 222 are alternately arranged. In one example, the surface of barrier layer 220 on which absorber layer 221 is disposed is opposite the surface of barrier layer 220 on which lens element 210 is disposed. However, the arrangement of the absorption layer 221 is not limited thereto, and the absorption layer 221 and the lens element 210 are located on the same surface.
Circular iris openings are formed in the absorbing layer 221 in a grid pattern. For example, when the barrier layer 220 includes a plurality of absorbent layers, iris openings of the plurality of absorbent layers form the opening 229. The opening 229 is a portion through which light of the barrier layer 220 passes. The opening 229 is formed based on the arrangement of the lens elements 210 in the barrier layer 220. The opening 229 will be further described below with reference to fig. 3.
The lens element 210 transmits light received from the outside to the sensing element. In one example, lens element 210 is positioned below barrier layer 220 and transmits light provided through opening 229. In one example, a protrusion is formed on one surface of the lens element 210, the protrusion being arranged such that the protrusion faces the sensing element. For example, the lens element 210 transmits light provided through the opening 229 to the sensing element corresponding to the opening 229. However, the above-described structure is merely an example, and the example is not limited thereto. In another example, lens element 210 is located above barrier layer 220 instead of below barrier layer 220, or lens element 210 is located above and below barrier layer 220. In another example, the flat portion of the lens element 210 is arranged to face the sensing element instead of the protrusion. In another example, the lens element 210 has recesses rather than protrusions formed on a surface.
The pattern (e.g., hole pattern) formed in the barrier layer 220 allows light passing through the opening 229 in the barrier layer 220 to be transmitted to the sensing element corresponding to the opening 229 and prevents the light from propagating toward another sensing element. Thus, the patterned barrier layer 220 reduces optical crosstalk. Further, a wide field of view (FOV) of the image sensor is designed based on, for example, the diameter of the hole pattern or the height of the barrier layer 220.
Fig. 3 shows an example of a cross-section of a barrier layer 320 and a lens element 310.
The image sensor includes a lens element 310 positioned below a barrier layer 320, and the barrier layer 320 forms an opening 329. As shown in fig. 3, the barrier layer 320 has a structure in which the absorption layers 321 and the transparent layers 322 are alternately stacked. Each absorption layer 321 includes a circular iris opening, which is filled with the same material as the transparent layer 322.
In the absorption layer 321, a region where an iris opening is formed transmits light, and other regions where an iris opening is not formed absorb light. Light passing through the iris opening of any of the absorbing layers 321 passes through the next transparent layer 322. Light passing through the next transparent layer 322 passes through the iris opening of the next absorbing layer. Accordingly, the iris opening of each of the plurality of absorption layers forms an opening portion 329 having a cylindrical shape and transmitting light received from the outside.
Light passing through the opening 329 is supplied to the lens element 310. The lens element 310 collects light passing through the opening 329 and transmits the light to the sensing element.
Fig. 4 shows an example of FOV of an image sensor.
The barrier layer 420 has a height H determined based on the FOV. For example, the FOV of the image sensor represents the maximum angle θ1, with respect to the transparent substrate 430, at which light received from the outside reaches the sensing element. Light incident on the transparent substrate 430 is refracted based on the refractive index of each of the transparent substrate 430 and the transparent layer 422. For example, in fig. 4, it is assumed that light is incident on the transparent substrate 430 at the angle θ1. Because light is absorbed by the absorption layer 421 in the blocking layer 420, there is a maximum angle θ2 at which light can pass through the blocking layer 420 through the opening. This maximum angle θ2 is determined based on the height and diameter of the opening formed in the barrier layer 420. The height of the opening corresponds to the height H of the barrier layer 420.
In one example, the relationship between the angles θ1 and θ2 and the height H of the barrier layer 420 is shown in Table 1 below.
TABLE 1

H (μm) | θ2 (°) | θ1 (°)
---|---|---
120 | 22.61 | 33.92
110 | 24.44 | 36.66
100 | 26.56 | 39.84
90 | 29.054 | 43.58
80 | 32.00 | 48.00
70 | 35.53 | 53.30
60 | 39.80 | 59.70
50 | 45 | 67.5
40 | 51.340 | 77.01
30 | 59.036 | 88.55
20 | 68.198 | 77.7
In table 1, when the height H of the barrier layer 420 is 110 μm, the FOV of the image sensor is about 70 °.
However, the configuration of the barrier layer 420 is not limited to the above description. The number of the absorber layers 421 stacked in the barrier layer 420 is determined based on the FOV. For example, the height of the barrier layer 420 is determined based on the FOV, and the number of stacked absorber layers 421 is determined based on the interval between the absorber layers 421 and the height of the barrier layer 420. The spacing between the absorber layers 421 corresponds to the thickness of the transparent layer 422.
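Table 1 is consistent with a simple geometric model: a straight opening of diameter D and height H passes rays up to θ2 = arctan(D/H), and the external angle is approximated as θ1 ≈ n·θ2. The sketch below reproduces most rows of Table 1 under the assumptions D = 50 μm and n = 1.5; these values are inferred here for illustration and are not stated explicitly in the text, and θ1 ≈ n·θ2 is only a small-angle approximation of Snell's law.

```python
import math

def barrier_angles(height_um, opening_diameter_um=50.0, refractive_index=1.5):
    """Maximum in-stack angle theta2 allowed by a straight opening
    (tan(theta2) = D / H) and the approximate external angle theta1 ~ n * theta2."""
    theta2 = math.degrees(math.atan(opening_diameter_um / height_um))
    return theta2, refractive_index * theta2

for height in (120, 110, 100, 90, 80, 70, 60, 50, 40, 30):
    theta2, theta1 = barrier_angles(height)
    print(f"H = {height:3d} um  theta2 = {theta2:5.2f} deg  theta1 = {theta1:5.2f} deg")
```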
Fig. 5 shows an example of FOV determined by an opening portion of a barrier layer.
In one example, barrier layer 520 includes an absorbing layer 521, wherein absorbing layer 521 includes an iris opening having a first diameter D1 formed centered on a point corresponding to lens element 510. The barrier layer 520 includes a second absorbent layer (i.e., absorbent layer 522) that includes an iris opening having a second diameter D2 that is different from the first diameter D1. The barrier layer 520 includes a third absorbent layer (i.e., absorbent layer 523) that includes an iris opening having a third diameter D3 that is different from the second diameter D2. In one example, the first diameter D1, the second diameter D2, and the third diameter D3 have progressively increasing or decreasing values. For example, the barrier layer 520 has a structure in which the diameter of the opening portion 529 gradually changes from one surface of the barrier layer 520 to the other surface of the barrier layer 520. The number of absorber layers in barrier layer 520 is a non-exhaustive example, and other numbers of absorber layers may be used without departing from the spirit and scope of the illustrative examples described.
Fig. 4 shows an example in which the diameter of the opening portion is constant, and fig. 5 shows an example in which the diameter of the opening portion 529 gradually increases. Referring to fig. 5, the diameter of the iris opening formed in each absorption layer gradually increases from the absorption layer near the lens element 510 to the absorption layer far from the lens element 510. As the diameter of the opening portion 529 gradually increases, the maximum angle θ2 at which light is allowed to pass through the blocking layer 520 through the opening portion 529 increases. In this example, the degree to which the diameter of the opening portion 529 increases is determined based on the desired FOV θ1, the refractive index of the transparent substrate 530, and the refractive index of each of the plurality of transparent layers: it is set based on the angle at which light incident on the transparent substrate 530 at the desired FOV θ1 is refracted into the barrier layer 520, which corresponds to the maximum angle θ2 at which light is allowed to pass through the blocking layer 520. Thus, the circular iris opening formed in each of the plurality of absorber layers in barrier layer 520 has a diameter determined based on the desired FOV θ1 and the refractive index determined by each of the transparent layer and the transparent substrate 530.
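One way to read this is as a per-layer sizing rule: refract the desired half-FOV θ1 into the stack with Snell's law, then let each iris diameter grow with its distance from the lens-side absorber layer. The sketch below is a hedged reconstruction of that rule; the numeric inputs (layer spacing, lens-side diameter, refractive index) are assumptions, since the text gives no explicit formula.

```python
import math

def tapered_opening_diameters(fov_deg, n_stack, lens_side_diameter_um, layer_depths_um):
    """Iris diameters for a tapered opening: the external half-FOV theta1 is
    refracted into the stack (Snell's law), and each absorber layer at depth z
    from the lens-side layer gets D(z) = D0 + 2 * z * tan(theta2)."""
    theta1 = math.radians(fov_deg / 2.0)
    theta2 = math.asin(math.sin(theta1) / n_stack)
    return [lens_side_diameter_um + 2.0 * z * math.tan(theta2) for z in layer_depths_um]

# Assumed values: 70 degree FOV, n = 1.5, three absorber layers 20 um apart.
print(tapered_opening_diameters(70.0, 1.5, 30.0, [0.0, 20.0, 40.0]))
```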
In one example, barrier 520 is spaced from sensing element 540 by the focal length of lens element 510. For example, as shown in fig. 5, the barrier layer 520 is arranged to face the sensing element 540 based on the lens element 510. However, the arrangement of the barrier layer 520 is not limited thereto, and the barrier layer 520 is arranged on the same side as the sensing element 540 based on the lens element 510.
For example, the opening has a diameter determined based on the amount of light, and the blocking layer 520 is spaced from the sensing element 540 by a focal length determined based on the amount of light.
An example in which the diameter of the opening portion 529 gradually increases has been described with reference to fig. 5, and an example in which the diameter of the opening portion gradually decreases is described below with reference to fig. 6.
Fig. 6 shows another example of FOV determined by an opening of a barrier layer.
Based on the lens element 610, the barrier layer 620 includes a first absorbing layer 621 including an iris opening having a first diameter D1, a second absorbing layer 622 including an iris opening having a second diameter D2, and a third absorbing layer 623 including an iris opening having a third diameter D3. The first diameter D1, the second diameter D2, and the third diameter D3 have gradually decreasing values. For example, the barrier layer 620 has a structure in which the diameter of the opening portion 629 gradually decreases from one surface of the barrier layer 620 to the other surface of the barrier layer 620. Based on this structure, the image sensor is better suited to observing distant objects.
Fig. 7 is a diagram illustrating an example of a method of manufacturing an image sensor. The operations in fig. 7 may be performed in the order and manner shown, but the order of some operations may be changed or some operations omitted without departing from the spirit and scope of the illustrative examples described. Many of the operations shown in fig. 7 can be performed in parallel or concurrently. One or more of the blocks of fig. 7, as well as combinations of blocks, may be implemented by special purpose hardware-based computers which perform the specified functions, or combinations of special purpose hardware and computer instructions. In addition to the description of fig. 7 below, the descriptions of fig. 1-6 also apply to fig. 7 and are incorporated herein by reference. Accordingly, the above description may not be repeated here.
In one example, the image sensor is formed by sequentially stacking a transparent substrate, a barrier layer, a lens element, and a sensing element.
Referring to fig. 7, in operation 710, a transparent substrate is provided. The transparent substrate is a glass wafer as described above, however, examples are not limited thereto.
In operation 720, a barrier layer including an absorber layer and a transparent layer is provided. For example, the barrier layer is disposed on a transparent substrate. In one example, as described above, the absorbing layer and the transparent layer are alternately stacked. An example of forming the barrier layer will be further described below with reference to fig. 8.
In operation 730, a lens element is disposed based on a pattern of an opening portion formed in the barrier layer. For example, the lens element is disposed on the barrier layer. An example of disposing a microlens array as a lens element will be further described below with reference to fig. 9.
Fig. 8 shows an example of providing a barrier layer.
Referring to fig. 8, in operation 821, an absorbing layer is disposed on a transparent substrate. For example, a transparent substrate is coated (coat) with an opaque polymer, such as a black polymer (e.g., black SU-8), as an absorbing layer.
In operation 822, a mask is aligned over the absorber layer. In one example, the mask is arranged in a planar array pattern on the absorber layer. For example, a circular mask is arranged based on a mesh pattern. Ultraviolet (UV) rays are emitted toward the aligned mask on the absorbing layer. The portion of the absorbing layer other than the mask is exposed to ultraviolet rays. The absorber layer includes a negative photoresist. Photoresists are photosensitive materials used in a variety of processes and form patterned coatings on surfaces.
In operation 823, the absorbing layer patterned by UV exposure is developed by a developer. In one example, a solvent called a "developer" is applied to the surface. The uv-exposed portions of the negative photoresist are insoluble in the developer. The portion of the negative photoresist that is not exposed to ultraviolet light is dissolved by the photoresist developer. Thus, as shown in fig. 8, a portion of the absorbing layer corresponding to the shape of the mask is dissolved and removed.
In operation 824, the patterned absorber layer is coated with a transparent layer. In the case where a transparent layer is coated on the patterned absorber layer, the barrier layer is exposed to ultraviolet light. The transparent layer further includes a negative photoresist, and the ultraviolet-exposed portion of the transparent layer is insoluble in a developer. The transparent layer is fixed because all portions of the transparent layer are exposed to ultraviolet light without a mask. In one example, a transparent layer comprising a negative photoresist is coated over and between the remaining (or undissolved) opaque polymer.
In operation 825, a hydrophilic treatment is applied to the transparent layer. By hydrophilic treatment, the binding force between the transparent layer and the absorbent layer is increased. For example, the hydrophilic treatment is an oxygen plasma treatment.
In operation 826, multiple layers (e.g., "N" layers) are stacked by repeating operations 821 through 825. "N" is an integer greater than or equal to "2". The absorption layer and the transparent layer are alternately stacked. While repeating operations 821 to 825, alignment between patterns formed for the absorption layer is maintained. Thus, a set of iris openings formed for the absorbent layer corresponds to the opening portions.
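The number of repetitions "N" follows directly from the target barrier height and the thickness deposited per absorber/transparent pair. Below is a minimal sketch of that bookkeeping; the 4.5 μm absorber coat is the value mentioned with fig. 12, while the transparent-layer thickness is an assumed illustrative value.

```python
import math

def pairs_for_barrier_height(target_height_um, absorber_um=4.5, transparent_um=20.0):
    """Number of absorber/transparent pairs ("N" in operation 826) needed for the
    stacked barrier to reach at least the target height. The transparent-layer
    thickness is an assumption; only the 4.5 um absorber coat appears in the text."""
    pair_um = absorber_um + transparent_um
    n = math.ceil(target_height_um / pair_um)
    return n, n * pair_um

n, achieved = pairs_for_barrier_height(110.0)  # 110 um gives roughly a 70 degree FOV per Table 1
print(f"N = {n} pairs -> barrier height ~ {achieved:.1f} um")
```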
Fig. 9 shows an example of providing a lens element.
Referring to fig. 9, in operation 931, a thermoplastic polymer (or thermoplastic polymer layer) is disposed on the barrier layer. Thermoplastic polymers are transparent materials used to make lenses and are materials that are formable by heating. For example, the thermoplastic polymer is AZ9260.
In operation 932, a mask having a pattern aligned with the planar array pattern formed in the absorber layer is disposed on the thermoplastic polymer. With the mask disposed on the thermoplastic polymer, the thermoplastic polymer is exposed to ultraviolet rays. In other words, the thermoplastic polymer is exposed to Ultraviolet (UV) rays through the pattern of the mask.
In operation 933, the thermoplastic polymer patterned by UV exposure is developed by a developer. For example, the thermoplastic polymer is a positive photoresist, and the portion of the positive photoresist exposed to ultraviolet light is dissolved by a photoresist developer. The portions of the positive photoresist that are not exposed to ultraviolet light are insoluble in the developer. Thus, as shown in fig. 9, the portion of the thermoplastic polymer corresponding to the shape of the mask remains unchanged after UV exposure. For example, when using a circular mask, the thermoplastic polymer is held in a cylindrical form based on a planar array pattern.
In operation 934, a hydrophobic coating 901 (e.g., a fluorocarbon nanofilm coating) is applied to the thermoplastic polymer patterned structure. The cohesion of the thermoplastic polymer is increased by the hydrophobic coating 901.
In operation 935, a microlens array is fabricated by a thermal reflow process. Because the cohesion of the thermoplastic polymer is increased by the hydrophobic coating 901 of operation 934, the thermoplastic polymer aggregates in the form of spheres.
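The reflowed lens shape can be estimated from volume conservation: the patterned cylinder (radius r, thickness t) melts into a spherical cap with the same base radius, whose height h satisfies (πh/6)(3r² + h²) = πr²t; the resulting radius of curvature R = (r² + h²)/(2h) then feeds the plano-convex focal-length estimate sketched earlier. The code below solves this numerically; the cylinder dimensions are assumed example values, not process data from this patent.

```python
def reflowed_cap(cylinder_radius_um, cylinder_thickness_um):
    """Spherical-cap height h and radius of curvature R after thermal reflow,
    assuming volume conservation and an unchanged base radius."""
    r, t = cylinder_radius_um, cylinder_thickness_um
    target = 6.0 * r * r * t                   # from (pi*h/6)*(3r^2 + h^2) = pi*r^2*t
    lo, hi = 1e-9, 2.0 * max(r, t)             # bracket the root of h*(3r^2 + h^2) = target
    for _ in range(100):                       # bisection (left side grows monotonically with h)
        h = 0.5 * (lo + hi)
        if h * (3.0 * r * r + h * h) < target:
            lo = h
        else:
            hi = h
    curvature_radius = (r * r + h * h) / (2.0 * h)
    return h, curvature_radius

h, R = reflowed_cap(cylinder_radius_um=25.0, cylinder_thickness_um=10.0)
print(f"cap height ~ {h:.1f} um, radius of curvature ~ {R:.1f} um")
```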
However, examples of manufacturing the image sensor are not limited to fig. 7 to 9, and may vary according to designs. Further, the operation sequence of each of the plurality of manufacturing processes described above with reference to fig. 7 to 9 is not limited to the above description, and the sequence may be changed. In some embodiments, a portion of the process may be omitted or added.
Fig. 10 shows an example of an image obtained by capturing, using a scanning electron microscope, a structure in which a lens element is integrated on one surface of a barrier layer.
Fig. 10 shows lens elements captured by a scanning electron microscope. In fig. 10, spherical lens elements are uniformly formed in a grid pattern under the barrier layer.
Fig. 11 shows an example of an image obtained by capturing a structure in which a lens element is integrated on one surface of a barrier layer using an optical microscope.
Fig. 11 shows a lens element captured by an optical microscope. In fig. 11, the focal points of the lens elements are uniformly formed and represented by white dots.
Fig. 12 shows an example of transmittance of light based on the number of absorption layers included in the blocking layer.
Fig. 12 shows a result indicating that, as the number of absorption layers included in the barrier layer increases, the transmittance of light passing through the barrier layer decreases. For example, when an absorption layer is coated to a thickness of 4.5 μm, about 40% to 50% of the light is transmitted through a single layer. About 10% of the light is transmitted through two layers. Light in the visible region is hardly transmitted through four layers.
Fig. 13 shows an example of an image acquired by capturing an upper surface (e.g., a plane formed by an x-axis and a y-axis) and a side surface (e.g., a plane formed by an x-axis and a z-axis) of an image sensor using a confocal laser scanning microscope.
The images of fig. 13, taken using a confocal laser scanning microscope, show a structure without a lens array (w/o MLA) 1310, a structure without a barrier layer (w/o MBL) 1320, and a structure with both a lens array and a barrier layer (w/ MLA+MBL) 1330.
In structure 1310, the path of the light is not focused. In structure 1320, light is not blocked.
In the structure 1330, the path of light is focused, and light is blocked in a portion other than the opening portion.
Fig. 14 shows an example of an intensity distribution of an image sensor.
Fig. 14 shows the result of comparing the intensity distributions of structures 1310 to 1330 of fig. 13. In fig. 14, the broken line w/o MLA represents the intensity distribution without a microlens array, the broken line w/o MBL represents the intensity distribution without the blocking layers, and the solid line w/ MLA+MBL represents the intensity distribution with both the microlens array and the blocking layers. The image sensor blocks light about 60% more effectively than a structure without a blocking layer. Further, the image sensor exhibits a uniform amount of light at each focus, which indicates that the lenses are uniformly manufactured.
Fig. 15 shows an example of an image acquired by the image sensor.
The image shown in the bottom of fig. 15 can be obtained by capturing the original image shown in the top of fig. 15 using an image sensor. The image sensor acquires a plurality of clear segmented images while preventing crosstalk using a blocking layer.
Fig. 16 shows an example of a result obtained by measuring the FOV of the image sensor.
When the image sensor captures an image of a target object, the FOV of the image sensor is measured from the size of the target object and the distance at which the target object fills the image captured by the image sensor. As shown in fig. 16, the FOV of the image sensor is about 70°.
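This measurement corresponds to a simple geometric relation: if a target of width W exactly fills the captured image at distance d, the FOV is 2·arctan(W / 2d). Below is a minimal sketch with made-up numbers; the patent does not report the raw measurement values.

```python
import math

def measured_fov_deg(target_width_mm, fill_distance_mm):
    """FOV from the distance at which a target of known width exactly fills the image."""
    return 2.0 * math.degrees(math.atan(target_width_mm / (2.0 * fill_distance_mm)))

# Assumed example: a 140 mm wide target fills the image at 100 mm -> ~70 degrees.
print(f"{measured_fov_deg(target_width_mm=140.0, fill_distance_mm=100.0):.1f} deg")
```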
Fig. 17 and 18 show examples of the influence of the blocking layer on the Modulation Transfer Function (MTF).
Fig. 17 shows MTF measurement images of structure 1710 without a barrier layer, and MTF measurement images of structure 1720 with a barrier layer. The MTF measurement image of structure 1710 has relatively low definition, whereas the MTF measurement image of structure 1720 has relatively high definition. As shown in fig. 18, for an image sensor having a structure of a blocking layer, the MTF can be measured.
According to an example, the image sensor disclosed herein is installed in an ultra-thin camera application product (e.g., a mobile digital camera). Furthermore, the image sensor is suitable for other imaging devices, such as a subminiature optical imaging device (e.g., an endoscopic camera or a drone).
According to an example, in the image sensor, the absorption layer preventing optical crosstalk is arranged to face the sensing element based on the lens element, and a blank space between the lens element and the sensing element is maintained by the spacer. Thus, a thin image sensor can be realized.
According to an example, an image sensor uses a barrier layer including an absorbing layer and a transparent layer to reduce optical crosstalk. Furthermore, the FOV of the image sensor is adjusted during fabrication of the image sensor.
According to an example, in an image sensor, a lens element is arranged on a lower surface of a barrier layer. Because the barrier layer is arranged on the lens element, optical crosstalk is reduced. Further, in the image sensor, a transparent substrate is disposed on the barrier layer. Because the transparent substrate is located on the barrier layer, the received light is refracted to effectively increase the FOV of the image sensor. The FOV is adjusted based on the thickness of the transparent layer and the number of absorptive layers stacked. Further, the diameter or width of the iris opening formed in one absorption layer is designed to be different from that of the iris opening formed in the other absorption layer, so that the FOV is determined.
According to an example, the image sensor is integrated in a semiconductor to be used as an ultra-thin camera. Further, the amount of light received by the image sensor is adjusted by designing the width of the iris opening and the f-number of the lens element formed in each of the plurality of stacked absorber layers.
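The trade-off described here can be expressed through the f-number: N = f / D, with the collected light scaling roughly as 1/N². A short sketch under assumed values follows; the patent does not give numeric iris widths or focal lengths.

```python
def relative_light(iris_diameter_um, focal_length_um):
    """f-number N = f / D and the relative light collection ~ 1 / N^2."""
    f_number = focal_length_um / iris_diameter_um
    return f_number, 1.0 / (f_number * f_number)

for diameter in (20.0, 35.0, 50.0):            # candidate iris widths (assumed)
    f_number, rel = relative_light(diameter, focal_length_um=100.0)
    print(f"D = {diameter:4.1f} um -> f/{f_number:.1f}, relative light {rel:.3f}")
```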
According to an example, a method of manufacturing an image sensor enables the use of a material capable of UV patterning to manufacture an accurate image sensor. In addition, the method of manufacturing the image sensor can manufacture an ultra-thin lens. The iris opening formed in the absorption layer is set to have different diameters, and thus, the FOV of the received light can be adjusted. Furthermore, the focal length based on the curvature of the lens element and the height of the spacers may vary depending on the design.
According to an example, the FOV of an image sensor for individual sensing elements included in a sensor array may be adjusted by designing a barrier layer and a lens element that provide light to the individual sensing elements. The image sensor acquires a plurality of divided images using a plurality of sensing elements. The image sensor acquires a segmented image with an adjusted degree of overlap based on the adjustment of the FOV by designing the barrier as described above. The image sensor enhances the MTF for high quality ultra thin cameras or extracts 3D depth information for three dimensional (3D) cameras.
While this disclosure includes particular examples, it will be apparent, after an understanding of the disclosure, that various changes in form and details may be made therein without departing from the spirit and scope of the claims and their equivalents. The examples described herein are to be considered in descriptive sense only and not for purposes of limitation. The descriptions of features or aspects in each example will be considered applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order and/or if components in the described systems, architectures, devices or circuits are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Therefore, the scope of the present disclosure is defined not by the detailed description but by the claims and their equivalents, and all changes within the scope of the claims and their equivalents are to be construed as being included in the present disclosure.
Claims (30)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
KR20180039290 | 2018-04-04 | |
KR10-2018-0039290 | 2018-04-04 | |
KR10-2018-0074099 | 2018-06-27 | |
KR1020180074099A (KR102614907B1) | 2018-04-04 | 2018-06-27 | Image sensor and method to produce image sensor
Publications (2)
Publication Number | Publication Date
---|---
CN110349985A (en) | 2019-10-18
CN110349985B (en) | 2025-01-24
Family
ID=68171832
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201910150495.9A (granted as CN110349985B, active) | Image sensor and method of manufacturing the same | 2018-04-04 | 2019-02-28
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7503367B2 (en) |
KR (1) | KR102614907B1 (en) |
CN (1) | CN110349985B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030724B2 (en) | 2018-09-13 | 2021-06-08 | Samsung Electronics Co., Ltd. | Method and apparatus for restoring image |
KR102429987B1 (en) * | 2020-01-06 | 2022-08-05 | 엘아이지넥스원 주식회사 | Micro lens array and Image sensor module including the same and Manufacturing method thereof |
KR102457628B1 (en) * | 2020-08-20 | 2022-10-24 | 한국과학기술원 | Lens structure and manufacturing method thereof |
CN112817076A (en) * | 2021-02-05 | 2021-05-18 | 京东方科技集团股份有限公司 | Grating structure and display device |
CN117497551B (en) * | 2023-12-25 | 2024-04-30 | 合肥晶合集成电路股份有限公司 | Image sensor and method for manufacturing the same |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0576144A1 (en) * | 1992-05-22 | 1993-12-29 | Matsushita Electronics Corporation | Solid state image sensor and manufacturing method thereof |
US6137535A (en) * | 1996-11-04 | 2000-10-24 | Eastman Kodak Company | Compact digital camera with segmented fields of view |
KR20210088375A (en) * | 2020-01-06 | 2021-07-14 | 엘아이지넥스원 주식회사 | Micro lens array and Image sensor module including the same and Manufacturing method thereof |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0645569A (en) * | 1992-05-22 | 1994-02-18 | Matsushita Electron Corp | Solid-state image pick-up device and its manufacturing method |
JP2002368235A (en) * | 2001-03-21 | 2002-12-20 | Canon Inc | Semiconductor device and manufacturing method thereof |
KR100685881B1 (en) * | 2004-06-22 | 2007-02-23 | 동부일렉트로닉스 주식회사 | CMOS image sensor and its manufacturing method |
US7829965B2 (en) * | 2005-05-18 | 2010-11-09 | International Business Machines Corporation | Touching microlens structure for a pixel sensor and method of fabrication |
KR100649031B1 (en) * | 2005-06-27 | 2006-11-27 | 동부일렉트로닉스 주식회사 | Manufacturing Method of CMOS Image Sensor |
DE102006004802B4 (en) * | 2006-01-23 | 2008-09-25 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Image acquisition system and method for producing at least one image capture system |
JP5147226B2 (en) * | 2006-12-15 | 2013-02-20 | 株式会社日立製作所 | Solid-state image sensor, photodetector, and authentication device using the same |
JP5076679B2 (en) * | 2007-06-28 | 2012-11-21 | ソニー株式会社 | Solid-state imaging device and camera module |
KR100896876B1 (en) * | 2007-11-16 | 2009-05-12 | 주식회사 동부하이텍 | Image sensor and its manufacturing method |
KR100982270B1 (en) * | 2008-08-08 | 2010-09-15 | 삼성전기주식회사 | Camera module and manufacturing method thereof |
JP5324890B2 (en) * | 2008-11-11 | 2013-10-23 | ラピスセミコンダクタ株式会社 | Camera module and manufacturing method thereof |
JPWO2014042178A1 (en) * | 2012-09-11 | 2016-08-18 | コニカミノルタ株式会社 | LENS ARRAY, LENS ARRAY LAMINATE, AND IMAGING DEVICE |
JP6055270B2 (en) * | 2012-10-26 | 2016-12-27 | キヤノン株式会社 | Solid-state imaging device, manufacturing method thereof, and camera |
JP2015173474A (en) * | 2015-04-30 | 2015-10-01 | セイコーエプソン株式会社 | Imaging apparatus |
JP2018046040A (en) * | 2016-09-12 | 2018-03-22 | ソニーセミコンダクタソリューションズ株式会社 | SOLID-STATE IMAGING DEVICE, MANUFACTURING METHOD THEREOF, AND ELECTRONIC DEVICE |
- 2018-06-27: Korean application KR1020180074099A filed; granted as KR102614907B1 (IP right grant, active)
- 2019-02-28: Chinese application CN201910150495.9A filed; granted as CN110349985B (active)
- 2019-04-02: Japanese application JP2019070487A filed; granted as JP7503367B2 (active)
Also Published As
Publication number | Publication date |
---|---|
JP2019186544A (en) | 2019-10-24 |
KR102614907B1 (en) | 2023-12-19 |
CN110349985A (en) | 2019-10-18 |
KR20190116026A (en) | 2019-10-14 |
JP7503367B2 (en) | 2024-06-20 |
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant