
CN109445002B - Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device - Google Patents


Info

Publication number
CN109445002B
CN109445002B (application CN201811416956.4A)
Authority
CN
China
Prior art keywords
image, meta, images, flat surface, fly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201811416956.4A
Other languages
Chinese (zh)
Other versions
CN109445002A (en)
Inventor
徐锐 (Xu Rui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811416956.4A
Publication of CN109445002A
Priority to PCT/CN2019/103994 (published as WO2020107984A1)
Application granted
Publication of CN109445002B
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 3/00 Simple or compound lenses
    • G02B 3/0006 Arrays
    • G02B 3/0012 Arrays characterised by the manufacturing method
    • G02B 3/0037 Arrays characterised by the distribution or form of lenses
    • G02B 3/005 Arrays characterised by the distribution or form of lenses arranged along a single direction only, e.g. lenticular sheets
    • G02B 3/0056 Arrays characterised by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026 Details of the structure or mounting of specific components
    • H04M 1/0264 Details of the structure or mounting of specific components for a camera module assembly

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a microlens array structure, a fly-eye lens, a manufacturing method of the microlens array structure, and an electronic device. The microlens array structure includes a first lens array and a second lens array. The first lens array includes a plurality of first bosses arranged in a first direction; the second lens array includes a plurality of second bosses arranged in a second direction. The plurality of first bosses and the plurality of second bosses overlap in a third direction to form a microlens array. The first direction intersects the second direction, and the third direction is perpendicular to the first direction and the second direction. In the microlens array structure, the fly-eye lens, the manufacturing method of the microlens array structure, and the electronic device, the plurality of first bosses and the plurality of second bosses overlap to form the microlens array; the manufacturing process is simple, the manufactured microlens array has good uniformity, and the cost is low.

Description

Micro-lens array structure and manufacturing method thereof, fly-eye lens and electronic device
Technical Field
The present disclosure relates to the field of consumer electronics, and more particularly, to a microlens array structure, a fly-eye lens, a method for manufacturing the microlens array structure, and an electronic device.
Background
A microlens array can be manufactured by photolithography followed by thermal reflow: tiny columnar structures are first patterned on the surface of a flat resin material by photolithography, and the columnar resin is then softened at high temperature. Under the action of surface tension the surface of the softened resin becomes curved, and after cooling the original columnar structures become tiny convex lenses. Alternatively, a microlens array can be manufactured by grinding or injection molding: a mold is designed according to the surface profile of the microlens array to be formed, and the microlenses are injection-molded in the mold or ground with a grinding machine.
However, microlenses fabricated by the photolithography-and-reflow method cannot be made large: as the size of a microlens increases, its surface cannot develop sufficient curvature, the diopter is limited, and the uniformity of the microlenses is poor. Injection molding or grinding requires a precision mold matched to the microlens surface profile, and such molds are very difficult to machine.
Disclosure of Invention
The embodiment of the application provides a micro-lens array structure, a fly-eye lens, a manufacturing method of the micro-lens array structure and an electronic device.
The microlens array structure of the embodiment of the present application includes a first lens array and a second lens array. The first lens array includes a plurality of first bosses arranged in a first direction; the second lens array includes a plurality of second bosses arranged in a second direction. The plurality of first bosses and the plurality of second bosses overlap in a third direction to form the microlens array; the first direction intersects the second direction, and the third direction is perpendicular to the first direction and the second direction.
The fly-eye lens of the embodiment of the present application includes the microlens array structure of the embodiment of the present application and an image sensor, the image sensor being disposed on the image side of the microlens array structure.
The method for manufacturing the microlens array structure of the embodiment of the present application includes: forming a first lens array including a plurality of first bosses arranged in a first direction; forming a second lens array including a plurality of second bosses arranged in a second direction; and combining the first lens array with the second lens array such that the plurality of first bosses overlap the plurality of second bosses in a third direction to form the microlens array, the first direction intersecting the second direction, and the third direction being perpendicular to the first direction and the second direction.
The electronic device of the embodiment of the present application includes a housing and the fly-eye lens of the embodiment of the present application, the fly-eye lens being disposed on the housing.
In the microlens array structure, the fly-eye lens, the manufacturing method of the microlens array structure, and the electronic device, the plurality of first bosses and the plurality of second bosses overlap to form the microlens array; the manufacturing process is simple, the manufactured microlens array has good uniformity, and the cost is low.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a perspective view of a state of an electronic device according to some embodiments of the present disclosure;
FIG. 2 is a schematic perspective view of another state of an electronic device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram of a fly-eye lens according to some embodiments of the present disclosure;
FIG. 4 is a schematic diagram of a fly-eye lens according to some embodiments of the present application;
FIG. 5 is a schematic perspective assembly view of a microlens array structure according to some embodiments of the present application;
FIG. 6 is a schematic cross-sectional view of the microlens array structure of FIG. 5 taken along line VI-VI;
FIG. 7 is an exploded schematic view of a microlens array structure according to some embodiments of the present application;
FIG. 8 is a schematic perspective assembly view of a microlens array structure according to some embodiments of the present application;
FIG. 9 is a schematic cross-sectional view of the microlens array structure of FIG. 8 taken along line IX-IX;
FIG. 10 is a schematic perspective assembly view of a microlens array structure according to some embodiments of the present application;
FIG. 11 is a schematic cross-sectional view of the microlens array structure of FIG. 10 taken along line XI-XI;
FIG. 12 is a schematic cross-sectional view of a microlens array structure of certain embodiments of the present application taken along a position corresponding to line VI-VI in FIG. 5;
FIG. 13 is a schematic perspective assembly view of a microlens array structure according to some embodiments of the present application;
FIG. 14 is a schematic perspective assembly view of a microlens array structure according to some embodiments of the present application;
FIG. 15 is a schematic perspective view of a portion of a fly-eye lens according to some embodiments of the present application;
FIG. 16 is a scene schematic of a meta-image stitching according to some embodiments of the present application;
FIG. 17 is a schematic view of a scene of depth information calculation according to some embodiments of the present application;
fig. 18 is a schematic flow chart of a method of fabricating a microlens array structure according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intervening medium. Also, a first feature "on," "over," or "above" a second feature may be directly or obliquely above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature "under," "below," or "beneath" a second feature may be directly or obliquely under the second feature, or may simply mean that the first feature is at a lower level than the second feature.
Referring to fig. 1 and fig. 2 together, an electronic device 200 according to an embodiment of the present disclosure includes a housing 210 and a fly-eye lens 100. The electronic device 200 may be a mobile phone, a tablet computer, a game console, a smart watch, a head-mounted display device, an unmanned aerial vehicle, or the like. In the embodiments of the present application the electronic device 200 is described taking a mobile phone as an example; it is understood that the specific form of the electronic device 200 is not limited to a mobile phone.
The housing 210 may serve as a mounting carrier for the functional elements of the electronic device 200 and may protect those elements, such as the display screen 230 or the receiver 250, against dust, water, and drops. In the embodiment of the present application, the housing 210 includes a main body 211 and a movable bracket 212. Driven by a driving device, the movable bracket 212 can move relative to the main body 211; for example, the movable bracket 212 can slide into the main body 211 (as shown in fig. 1) or slide out of the main body 211 (as shown in fig. 2). Some of the functional elements (e.g., the display screen 230) may be mounted on the main body 211, while others (e.g., the fly-eye lens 100 and the receiver 250) may be mounted on the movable bracket 212; movement of the movable bracket 212 retracts these elements into, or extends them out of, the main body 211.
The fly-eye lens 100 is mounted on the housing 210. Specifically, the housing 210 may have a collection window, and the fly-eye lens 100 is aligned with the collection window so that the fly-eye lens 100 can collect image information. In the embodiment of the present application, the fly-eye lens 100 is mounted on the movable bracket 212: when a user needs to use the fly-eye lens 100, the user can trigger the movable bracket 212 to slide out of the main body 211, driving the fly-eye lens 100 to extend out of the main body 211; when the fly-eye lens 100 is not needed, the user can trigger the movable bracket 212 to slide into the main body 211, driving the fly-eye lens 100 to retract. Of course, fig. 1 and 2 illustrate only one specific form of the housing 210 and should not be construed as limiting the housing 210 of the present application. For example, in another example the collection window formed in the housing 210 may be fixed, with the fly-eye lens 100 fixedly disposed and aligned with the collection window; in yet another example, the fly-eye lens 100 is fixedly disposed below the display screen 230.
Referring to fig. 3 and 4, the fly-eye lens 100 includes an image sensor 30 and a microlens array structure 10.
The image sensor 30 is disposed on the image side of the microlens array structure 10. The image sensor 30 may be a Complementary Metal Oxide Semiconductor (CMOS) image sensor or a Charge-Coupled Device (CCD) image sensor. The image sensor 30 is a monolithic structure, i.e., physically undivided. The image sensor 30 includes a plurality of photosensitive pixels 32 (as shown in fig. 15, the fly-eye lens 100 may further include a substrate 40, and the plurality of photosensitive pixels 32 are formed on the substrate 40), each photosensitive pixel 32 converting an optical signal into an electrical signal. When the fly-eye lens 100 is in operation, light outside the housing 210 enters from the collection window and passes through the microlens array structure 10 to be imaged on the image sensor 30.
In this embodiment, the fly-eye lens 100 may further include a filter 50 located on the side of the microlens array structure 10 facing away from the image sensor 30; that is, the microlens array structure 10 is located between the filter 50 and the image sensor 30. Light outside the housing 210 then enters from the collection window and passes through the filter 50 and the microlens array structure 10 in sequence to form an image on the image sensor 30. The filter 50 is also a monolithic, physically undivided structure. The filter 50 may be an infrared-pass filter, which adjusts the wavelength range of the imaging light by blocking visible light while allowing infrared light to enter the image sensor 30; alternatively, the filter 50 may be an infrared-cut filter, which blocks infrared light from entering the image sensor 30 so that it does not affect the color and clarity of normal imaging.
Referring to fig. 5 and 6, the microlens array structure 10 includes a first lens array 12 and a second lens array 14.
The first lens array 12 includes a plurality of first bosses 126 arranged along a first direction (e.g., the X-axis direction in fig. 5). Each first boss 126 is semi-cylindrical (or another suitable shape) to achieve an optical focusing effect. The first lens array 12 includes an opposing first flat surface 122 and first convex surface 124 (a wave-like surface). The plurality of first bosses 126 collectively form the first convex surface 124: each first boss 126 includes a first sub-convex surface, and the first sub-convex surfaces of the plurality of first bosses 126 are connected to form the first convex surface 124. The plurality of first bosses 126 likewise collectively form the first flat surface 122: each first boss 126 includes a first sub-flat surface, and the first sub-flat surfaces of the plurality of first bosses 126 are connected to form the first flat surface 122.
The second lens array 14 includes a plurality of second bosses 146 arranged along a second direction (e.g., the Y-axis direction in fig. 5). Each second boss 146 is semi-cylindrical (or another suitable shape) to achieve an optical focusing effect. The second lens array 14 includes an opposing second flat surface 142 and second convex surface 144 (a wave-like surface). The plurality of second bosses 146 collectively form the second convex surface 144: each second boss 146 includes a second sub-convex surface, and the second sub-convex surfaces of the plurality of second bosses 146 are connected to form the second convex surface 144. The plurality of second bosses 146 likewise collectively form the second flat surface 142: each second boss 146 includes a second sub-flat surface, and the second sub-flat surfaces of the plurality of second bosses 146 are connected to form the second flat surface 142.
The plurality of first bosses 126 and the plurality of second bosses 146 overlap in a third direction (e.g., the Z-axis direction in fig. 5) to form a microlens array. The first direction intersects the second direction, and the third direction is perpendicular to the first direction and the second direction. Taking fig. 5 as an example, the number of first bosses 126 and the number of second bosses 146 are both 20: the 20 first bosses 126 arranged along the X-axis direction form a 1 × 20 first lens array 12, the 20 second bosses 146 arranged along the Y-axis direction form a 20 × 1 second lens array 14, and the 20 first bosses 126 and 20 second bosses 146 overlap in the Z-axis direction to form a 20 × 20 microlens array. That the first direction intersects the second direction means that the two directions are neither coincident nor parallel; the included angle between them may be, for example, 30, 45, 60, 75, or 90 degrees (90 degrees in fig. 5).
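As a quick sanity check of this geometry, the crossed arrangement can be sketched in a few lines of Python. This is an illustrative model only: the function name, the pitch values, and the coordinate convention are assumptions, not from the patent.

```python
# Sketch (assumed geometry): one lenticular array with n_first semi-cylindrical
# bosses arrayed along X and one with n_second bosses arrayed along Y overlap
# to form an n_first x n_second grid of microlenses.

def microlens_centers(n_first, n_second, pitch_first, pitch_second):
    """Return the (x, y) center of every microlens in the overlapped grid.

    Each microlens sits where one first boss (running along Y, arrayed
    along X) crosses one second boss (running along X, arrayed along Y).
    """
    return [
        ((i + 0.5) * pitch_first, (j + 0.5) * pitch_second)
        for j in range(n_second)
        for i in range(n_first)
    ]

centers = microlens_centers(20, 20, 0.05, 0.05)  # 20 x 20 array, 50 um pitch
print(len(centers))  # 400 microlenses from only 20 + 20 bosses
```

The point of the sketch is the economy of the construction: an N × M microlens grid requires fabricating only N + M lenticular bosses.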
Referring to fig. 7, in manufacturing the microlens array structure 10 of the embodiment of the present application, a first lens array 12 including a plurality of first bosses 126 and a second lens array 14 including a plurality of second bosses 146 may be formed separately, and the first lens array 12 and the second lens array 14 are then combined so that the plurality of first bosses 126 and the plurality of second bosses 146 overlap in the third direction to form a microlens array. The microlens array includes a plurality of microlenses 101; as shown by the dotted line in fig. 6, each microlens 101 is formed by the overlapping portion of one first boss 126 (extending along the Y direction) and one second boss 146 (extending along the X direction). Specifically, a plurality of first bosses 126 arranged in the first direction may be formed by nanoimprinting on a monolithic cuboid (rectangular parallelepiped or cube) lens blank to serve as the first lens array 12. The plurality of first bosses 126 are connected in sequence without gaps between them, so a more closely packed microlens array can be formed and the fly-eye lens 100 can collect more image information. The second lens array 14 may be formed in the same manner, and the description is not repeated.
Combining the first lens array 12 with the second lens array 14 so that the plurality of first bosses 126 overlap the plurality of second bosses 146 in the third direction may be done by placing the first lens array 12 on the second lens array 14, or by placing the second lens array 14 on the first lens array 12; the plurality of first bosses 126 and the plurality of second bosses 146 are arranged crosswise (for example, in a crisscross manner) and are in interference fit or contact fit with each other in the third direction. In this embodiment, the material used to form the first lens array 12 and the second lens array 14 may be glass, plastic, or another material. The microlens array and the image sensor 30 can be aligned; specifically, the optical axis of each microlens 101 coincides with the normal of the image sensor 30 for a better optical imaging effect.
In the microlens array structure 10 of the embodiment of the application, the plurality of first bosses 126 and the plurality of second bosses 146 overlap to form the microlens array; the manufacturing process is simple, the uniformity of the manufactured microlens array is good (that is, the microlenses are arranged neatly and have a consistent structure), and the cost is low. Because neither photolithography-and-reflow nor mold injection molding or grinding is used, there is no problem of the microlens size being limited, of limited diopter and poor consistency at larger sizes, or of the great machining difficulty of a precision mold.
In addition, since the plurality of first bosses 126 are integrally formed and the plurality of second bosses 146 are integrally formed, the plurality of first bosses 126 do not need to be fixed together by glue or the like, and the plurality of second bosses 146 do not need to be fixed together by glue or the like. When the first lens array 12 and the second lens array 14 are assembled together, the first bosses 126 and the second bosses 146 are not offset from each other, and the assembly stability is high.
Moreover, when the first lens array 12 and the second lens array 14 are assembled together, the first bosses 126 and the second bosses 146 can be fixed without glue (such as optical glue): the first bosses 126 and the second bosses 146 only need to abut each other, with the first lens array 12 and the second lens array 14 held by a lens barrel or other element of the fly-eye lens 100, so the overall assembly of the fly-eye lens 100 is simple. When one of the lens arrays, such as the first lens array 12 or the second lens array 14, is damaged, only that array need be removed and replaced.
Finally, since the focal length of every microlens 101 in the microlens array is the same, the fly-eye lens 100 can be used as a fixed-focus lens (no focusing is involved): when the object to be shot is within the effective focal length range of the fly-eye lens 100, clear imaging can be achieved, making the fly-eye lens 100 suitable as a front lens of the electronic device 200.
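The equal-focal-length property follows from every boss sharing one radius of curvature and one material. A hedged thin-lens sketch (the lensmaker's equation with one flat side, 1/f = (n - 1)/R; the numeric radius and refractive index below are assumed for illustration, not taken from the patent):

```python
# Illustrative check (not from the patent): a plano-convex lenslet like each
# semi-cylindrical boss focuses with f = R / (n - 1) in the thin-lens
# approximation, so identical bosses imply one shared focal length --
# consistent with using the fly-eye lens as a fixed-focus lens.

def plano_convex_focal_length(radius_mm, refractive_index):
    """Thin-lens focal length of a plano-convex lenslet: 1/f = (n - 1) / R."""
    return radius_mm / (refractive_index - 1.0)

f = plano_convex_focal_length(0.075, 1.5)  # assumed 75 um radius, n = 1.5
print(f)  # 0.15 mm for every microlens in the array
```

Any object beyond roughly this focal distance lands near the common focal plane, which is why the array needs no per-lens focusing mechanism.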
Referring to fig. 8 and 9, in one embodiment, when the first lens array 12 is combined with the second lens array 14, the first flat surface 122 is combined with the second flat surface 142. The first flat surface 122 and the second flat surface 142 can then be fully attached over their bonding area; the combination is tight and highly stable, and no moisture or impurities can enter between the first lens array 12 and the second lens array 14, which helps ensure the service life of the microlens array structure 10 and the good imaging quality of the fly-eye lens 100.
Referring to fig. 5 and 6, in one embodiment, when the first lens array 12 is combined with the second lens array 14, the first convex surface 124 is combined with the second flat surface 142. At this time, when the micro lens array structure 10 is assembled with other structures (such as the lens barrel, the image sensor 30, etc.) of the fly-eye lens 100, the first flat surface 122 can be well mounted on the other structures.
In one embodiment, the first planar surface 122 is joined to the second convex surface 144 when the first lens array 12 is combined with the second lens array 14 (similar to the joining of the first convex surface 124 to the second planar surface 142 in fig. 5 and 6). In this case, when the micro lens array structure 10 is assembled with another structure (for example, a lens barrel, a filter 50, etc.) of the fly-eye lens 100, the second flat surface 142 can be well attached to the other structure.
Referring to fig. 10 and 11, in one embodiment, when the first lens array 12 is combined with the second lens array 14, the first convex surface 124 is combined with the second convex surface 144. At this time, when the micro lens array structure 10 is assembled with other structures (such as the lens barrel, the image sensor 30, the filter 50, etc.) of the fly-eye lens 100, the first flat surface 122 and the second flat surface 142 can be well mounted on the other structures, and since the first protruding surface 124 and the second protruding surface 144 are not exposed to the outside (the side combined with the other structures), the first protruding surface 124 and the second protruding surface 144 will not be worn by the other structures to affect the optical focusing effect.
In the embodiments above, combining the first flat surface 122 with the second flat surface 142, the first flat surface 122 with the second convex surface 144, the first convex surface 124 with the second flat surface 142, or the first convex surface 124 with the second convex surface 144 allows different focal lengths and different field angles to be realized.
Referring to fig. 5, fig. 8 and fig. 10, in the above embodiments, the first flat surface 122 and the second flat surface 142 may be both planar structures, and at this time, the image sensor 30 is also a planar structure, so as to achieve alignment between the microlens array and the image sensor 30, ensure good imaging quality of the fly-eye lens 100, and facilitate assembly of the fly-eye lens 100. It is understood that a planar structure means that the line connecting any two points on a surface entirely falls on the surface.
Referring to fig. 12, in the above embodiments, the first flat surface 122 and the second flat surface 142 may both be curved, in which case the image sensor 30 also has a curved structure, so as to achieve alignment between the microlens array and the image sensor 30 and ensure good imaging quality of the fly-eye lens 100; compared with the planar combination, the fly-eye lens 100 then has a larger field angle, can collect more light, and has a smaller volume. In this case, if the fly-eye lens 100 further includes the filter 50, the filter 50 may also have a curved structure to better filter light.
Further, when the first flat surface 122 is combined with the second flat surface 142, the first flat surface 122 is bent to the same degree as the second flat surface 142. When the first flat surface 122 is combined with the second convex surface 144, the first flat surface 122 is bent to the same degree as the second convex surface 144. When the first convex surface 124 is combined with the second flat surface 142, the first convex surface 124 is bent to the same degree as the second flat surface 142 (as shown in fig. 12). When the first convex surface 124 is combined with the second convex surface 144, the first convex surface 124 is bent to the same degree as the second convex surface 144. It should be noted that the degree of curvature of a convex surface may be taken as the curvature of its overall envelope arc: take, on each boss, the point farthest from the corresponding flat surface; the degree of curvature of the convex surface is the curvature of the arc formed by these points across the plurality of bosses.
The surface types of the first flat surface 122, the first convex surface 124, the second flat surface 142 and the second convex surface 144 may be any one of aspheric surfaces, spherical surfaces, fresnel surfaces or binary optical surfaces. For example, the first planar surface 122, the first convex surface 124, the second planar surface 142, and the second convex surface 144 are all aspheric; alternatively, the first flat surface 122 and the first convex surface 124 are aspheric surfaces, the second flat surface 142 is a spherical surface, and the second convex surface 144 is a fresnel surface; alternatively, the first flat surface 122 is aspheric, the first convex surface 124 is spherical, the second flat surface 142 is fresnel surface, and the second convex surface 144 is binary optical surface.
When the surface type is an aspheric surface, the aberration of the fly-eye lens 100 can be corrected, the problems of distorted vision and the like can be solved, and meanwhile, the lens is lighter, thinner and flatter and can still maintain excellent shock resistance; when the surface type is spherical, the manufacturing process of the micro-lens array structure 10 is simple; when the Fresnel surface is selected as the surface type, the fly-eye lens 100 is bright in imaging and uniform in brightness, and the problems of darkening and blurring of corners are not easy to occur; when the surface type adopts a binary optical surface, the lens has light weight and low cost, and can realize new functions of tiny, array, integration and the like which are difficult to be finished by the traditional optics.
Referring to fig. 5 again, when the first direction is perpendicular to the second direction and the width of each first boss 126 is equal to the width of each second boss 146, the microlens array formed by the plurality of first bosses 126 and the plurality of second bosses 146 is a square microlens array. Referring to fig. 13, when the first direction is perpendicular to the second direction and the width of each first boss 126 is greater than the width of each second boss 146, or the width of each second boss 146 is greater than the width of each first boss 126 (as shown in fig. 13), the microlens array formed by the plurality of first bosses 126 and the plurality of second bosses 146 is a rectangular microlens array. Referring to fig. 14, when the first direction intersects the second direction but is not perpendicular to it, the microlens array formed by the plurality of first bosses 126 and the plurality of second bosses 146 is a parallelogram microlens array. The microlens array of the present embodiment may thus be square, rectangular, or parallelogram-shaped, so as to suit fly-eye lenses 100 of various shapes, structures, or functions.
Referring to fig. 15, all of the photosensitive pixels 32 formed on the substrate 40 are divided into a plurality of photosensitive pixel sets, each including a plurality of photosensitive pixels 32. In one example, each photosensitive pixel set may include 70 × 70 photosensitive pixels 32. Of course, 70 × 70 is merely an example; in other examples, each photosensitive pixel set may instead include 60 × 60, 80 × 80, 100 × 100 photosensitive pixels 32, and so on, which is not limited herein. The greater the number of photosensitive pixels 32 in each photosensitive pixel set, the higher the resolution of the correspondingly formed meta-image. The plurality of photosensitive pixel sets may be arranged in a horizontal row, a vertical column, a 田-shaped (2 × 2) grid, and the like.
The microlens array structure 10 covers the plurality of photosensitive pixel sets. The plurality of photosensitive pixels 32 in each photosensitive pixel set may receive light incident after passing through the microlens array structure 10 to output a meta-image corresponding to that photosensitive pixel set, so that the multiple photosensitive pixel sets output multiple meta-images. In one example, the microlens array structure 10 includes a plurality of microlenses 101, each microlens 101 covering one photosensitive pixel set. After passing through a microlens 101, light from the scene is incident on the multiple photosensitive pixels 32 in the corresponding photosensitive pixel set; the multiple photosensitive pixels 32 receive the light and correspondingly output multiple electrical signals, and the multiple electrical signals output by the photosensitive pixels 32 of the same photosensitive pixel set form one meta-image.
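The mapping from photosensitive pixel sets to meta-images described above can be illustrated with a short sketch, assuming a square sensor readout and square pixel sets; the function name and the list-of-lists pixel representation are illustrative, not from the patent:

```python
def split_into_meta_images(sensor, tile):
    """Group a 2D sensor readout into per-lens meta-images.

    `sensor` is a list of rows of pixel values. Each microlens is assumed
    to cover a square `tile` x `tile` photosensitive pixel set, so the
    output is a flat list of meta-images in row-major order.
    """
    h, w = len(sensor), len(sensor[0])
    assert h % tile == 0 and w % tile == 0, "sensor must tile evenly"
    metas = []
    for r0 in range(0, h, tile):
        for c0 in range(0, w, tile):
            # Cut out one tile: the electrical signals of one pixel set.
            metas.append([row[c0:c0 + tile] for row in sensor[r0:r0 + tile]])
    return metas
```

For a 4 × 4 readout with 2 × 2 pixel sets, this yields four 2 × 2 meta-images, mirroring how each photosensitive pixel set forms one meta-image.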
Referring to fig. 2, the electronic device 200 may further include a processor (alternatively, the processor may be a processor of the fly-eye lens 100). The processor is mounted within the housing 210. The processor may be configured to control the exposure of the plurality of photosensitive pixels 32 to receive light passing through the microlens array structure 10 and to receive the electrical signals output by each photosensitive pixel 32 to form meta-images in one-to-one correspondence with the plurality of photosensitive pixel sets. The processor may also be used to fuse the plurality of meta-images to obtain a merged image, calculate the depth information of the scene from at least two meta-images, and perform predetermined processing on the merged image according to the depth information.
In one example, when the processor is configured to fuse the plurality of meta-images to obtain the merged image, the processor essentially performs the following operations: selecting two meta-images, one as the reference meta-image and the other as the meta-image to be matched; dividing the reference meta-image into a plurality of block images and selecting one of them as the reference block image; searching the meta-image to be matched for a matching block image that matches the reference block image to form a matching image pair; cyclically executing the dividing step and the searching step to traverse the plurality of block images in the reference meta-image and obtain a plurality of matching image pairs; fusing the reference block image and the matching block image in each matching image pair to obtain a fused sub-image, and stitching the plurality of fused sub-images to obtain a stitched sub-image; and taking the stitched sub-image as the new reference meta-image, selecting one meta-image from the remaining meta-images as the new meta-image to be matched, and cyclically executing the steps from dividing the reference meta-image into block images through obtaining the stitched sub-image, so that the merged image is obtained by fusion. In each pass of the loop, the previous stitched sub-image serves as the reference meta-image when it is divided into block images.
Specifically, as shown in fig. 16, assuming that there are N meta-images P1, P2, P3 … PN, the processor first selects two of them, for example the meta-image P1 and the meta-image P2, and takes the meta-image P1 as the reference meta-image and the meta-image P2 as the meta-image to be matched. The processor then divides the reference meta-image P1 into a plurality of block images, for example 9 block images: P1-00, P1-01, P1-02, P1-10, P1-11, P1-12, P1-20, P1-21, and P1-22, and selects one of them as the reference block image, for example the block image P1-00. After determining the reference block image P1-00, the processor searches the meta-image P2 to be matched for a matching block image that matches P1-00. Specifically, the processor first locates the region P2-00 at the position in the meta-image P2 corresponding to the reference block image P1-00 and performs a correlation calculation between P1-00 and P2-00 to judge whether P2-00 is the matching block image of P1-00. If the correlation is greater than a preset correlation value, the region P2-00 is determined to be the matching block image of P1-00 and is marked for subsequent image fusion. Otherwise, starting from the region P2-00, a rectangular frame of the same size as the reference block image P1-00 is moved in the x direction and/or the y direction by a predetermined step; after each move, the correlation between the region framed by the rectangular frame and P1-00 is calculated to judge whether the framed region is the matching block image of P1-00. If it is, the framed region is marked; otherwise the rectangular frame continues to move until the whole meta-image P2 has been traversed. As shown in fig. 16, because of the field-of-view difference between the photosensitive pixel set outputting the reference meta-image P1 and the photosensitive pixel set outputting the meta-image P2, the reference block image P1-00 cannot find a matching block image in the meta-image P2 to be matched; in this case an image matching pair is still output, but it contains only the reference block image P1-00.
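The correlation-based sliding search described above can be sketched as follows. This is an illustrative simplification: it uses normalized cross-correlation as the correlation measure and scans the whole target meta-image from the origin rather than starting at the co-located region; the function names, step size, and threshold are assumptions:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length pixel sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0


def find_matching_block(ref_block, target, min_corr=0.9, step=1):
    """Slide a rectangular frame the size of ref_block over the target
    meta-image; return (row, col) of the first frame whose correlation
    with ref_block exceeds min_corr, or None if the whole image is
    traversed without a match (the field-of-view difference case)."""
    bh, bw = len(ref_block), len(ref_block[0])
    th, tw = len(target), len(target[0])
    flat_ref = [p for row in ref_block for p in row]
    for r in range(0, th - bh + 1, step):
        for c in range(0, tw - bw + 1, step):
            window = [target[r + j][c + i] for j in range(bh) for i in range(bw)]
            if ncc(flat_ref, window) > min_corr:
                return (r, c)
    return None  # no matching block image: pair contains only ref_block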
After finding the matching block image of the reference block image P1-00, the processor replaces it with the block image P1-01 and searches for a matching block image of P1-01 in the same manner. As shown in fig. 16, the matching block image of the reference block image P1-01 is the block image P12-01, and an image matching pair containing P1-01 and P12-01 is output. The processor then continues to replace the reference block image and repeats the searching step until the matching block images of all block images in the reference meta-image P1 have been determined, outputting as many image matching pairs as there are block images. As shown in fig. 16, the image matching pairs corresponding to the block images of the meta-image P1 are: "P1-00", "P1-01 = P12-01", "P1-02 = P12-02", "P1-10", "P1-11 = P12-11", "P1-12 = P12-12", "P1-20", "P1-21 = P12-21", and "P1-22 = P12-22". The processor then fuses the reference block image and the matching block image in each image matching pair containing two block images to obtain a fused sub-image; for an image matching pair containing only one block image, the reference block image itself serves as the fused sub-image. In this way, a plurality of fused sub-images in one-to-one correspondence with the image matching pairs are obtained. The processor then stitches the plurality of fused sub-images to obtain an initial stitched sub-image.
Further, the processor extracts the unmatched region of the meta-image P2 to be matched, i.e., the region excluded from the matches with the reference meta-image P1, and stitches the initial stitched sub-image together with this unmatched region to obtain the final stitched sub-image Pm; the number of pixels of the stitched sub-image Pm is therefore greater than that of either the reference meta-image P1 or the meta-image P2 to be matched. It can be understood that, because of the field-of-view difference between the photosensitive pixel set outputting the reference meta-image P1 and the photosensitive pixel set outputting the meta-image P2 to be matched, the unmatched region is the part of P2 for which no matching image can be found in the reference meta-image P1. Stitching the image of the unmatched region into the stitched sub-image Pm therefore ensures the integrity of the picture captured by the fly-eye lens 100.
After forming the stitched sub-image Pm, the processor takes Pm as the new reference meta-image and picks one meta-image from the remaining meta-images P3, P4, P5 … PN as the new meta-image to be matched. Following the same fusion and stitching procedure used for the reference meta-image P1 and the meta-image P2 to be matched, the processor divides the reference meta-image Pm into a plurality of block images, finds in the meta-image P3 to be matched the matching block images of those block images, and fuses and stitches the reference meta-image Pm with the meta-image P3 into a new stitched sub-image Pm. The processor then takes the new stitched sub-image Pm as the new reference meta-image and picks one meta-image from the remaining meta-images P4, P5, P6 … PN as the new meta-image to be matched. The above steps are repeated until all the meta-images have been fused and stitched, finally yielding a merged image of higher resolution.
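The overall fusion loop described above — take the current result as the new reference, fold in the next meta-image, repeat until all meta-images are consumed — can be sketched in a heavily simplified form that ignores block matching and the stitching of unmatched regions and simply averages aligned images; the function names are illustrative, not from the patent:

```python
def fuse_pair(ref, other):
    """Fuse two aligned, equal-size meta-images pixel-by-pixel by averaging
    (a stand-in for the per-block fusion of each image matching pair)."""
    return [[(a + b) / 2 for a, b in zip(ra, rb)] for ra, rb in zip(ref, other)]


def merge_meta_images(metas):
    """Fold the list of meta-images into one merged image, mirroring the
    loop 'take the stitched sub-image as the new reference meta-image and
    pick the next meta-image to be matched'."""
    ref = metas[0]                 # initial reference meta-image (P1)
    for nxt in metas[1:]:          # P2, P3, ... PN in turn
        ref = fuse_pair(ref, nxt)  # result becomes the new reference
    return ref
```

A real implementation would replace `fuse_pair` with the block-matching, fusion, and stitching steps, so that the reference grows in pixel count at each pass.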
In one example, when the processor calculates the depth information of the scene from at least two meta-images, it specifically performs the following operations: dividing the multiple meta-images into a reference image set and an image set to be matched, each of which contains multiple meta-images; selecting one meta-image from the reference image set as the reference meta-image and one meta-image from the image set to be matched as the meta-image to be matched; dividing the reference meta-image into a plurality of block images and selecting one of them as the reference block image; searching the meta-image to be matched for a matching block image that matches the reference block image to form a matching image pair; calculating depth information from the parallax between the reference block image and the matching block image of the matching image pair; cyclically executing the dividing, searching, and calculating steps to traverse the plurality of block images in the reference meta-image and acquire a plurality of pieces of depth information; and cyclically executing the steps from selecting a reference meta-image through acquiring the pieces of depth information so as to traverse the plurality of meta-images in the reference image set.
Specifically, as shown in fig. 17, assuming that there are 16 meta-images P1, P2, P3 … P16, the processor divides them into two sets: a reference image set and an image set to be matched. The reference image set includes the meta-images P1, P2, P5, P6, P9, P10, P13, and P14; the image set to be matched includes the meta-images P3, P4, P7, P8, P11, P12, P15, and P16. The processor then selects one meta-image from the reference image set as the reference meta-image, for example the meta-image P1, and one meta-image from the image set to be matched as the meta-image to be matched, for example the meta-image P3. The processor then divides the reference meta-image P1 into a plurality of block images, for example 9 block images: P1-00, P1-01, P1-02, P1-10, P1-11, P1-12, P1-20, P1-21, and P1-22, and selects one of them as the reference block image, for example the block image P1-00. After determining the reference block image P1-00, the processor searches the meta-image P3 to be matched for a matching block image that matches P1-00.
Specifically, the processor first locates the region P3-00 at the position in the meta-image P3 to be matched corresponding to the reference block image P1-00 and performs a correlation calculation between P1-00 and P3-00 to judge whether P3-00 is the matching block image of P1-00. If the correlation is greater than a preset correlation value, the region P3-00 is determined to be the matching block image of P1-00 and is marked for the subsequent depth information calculation. Otherwise, starting from the region P3-00, a rectangular frame of the same size as the reference block image P1-00 is moved in the x direction and/or the y direction by a predetermined step; after each move, the correlation between the region framed by the rectangular frame and P1-00 is calculated to judge whether the framed region is the matching block image of P1-00. If it is, the framed region is marked; otherwise the rectangular frame continues to move until the whole meta-image P3 has been traversed. As shown in fig. 17, because of the field-of-view difference between the photosensitive pixel set outputting the reference meta-image P1 and the photosensitive pixel set outputting the meta-image P3, the reference block image P1-00 cannot find a matching block image in the meta-image P3 to be matched; in this case an image matching pair is still output, but it contains only the reference block image P1-00.
After finding the matching block image of the reference block image P1-00, the processor replaces it with the block image P1-01 and searches for a matching block image of P1-01 in the same manner. As shown in fig. 17, the matching block image of the reference block image P1-01 is the block image P13-01, and an image matching pair containing P1-01 and P13-01 is output. The processor then continues to replace the reference block image and repeats the searching step until the matching block images of all block images in the reference meta-image P1 have been determined, outputting as many image matching pairs as there are block images. As shown in fig. 17, the image matching pairs corresponding to the block images of the meta-image P1 are: "P1-00", "P1-01 = P13-01", "P1-02 = P13-02", "P1-10", "P1-11 = P13-11", "P1-12 = P13-12", "P1-20", "P1-21 = P13-21", and "P1-22 = P13-22". The processor then screens out the image matching pairs that contain two block images and performs a parallax calculation on the reference block image and the matching block image in each such pair to obtain at least one piece of depth information d. Specifically, the processor performs the parallax calculation based on the coordinate position of the reference block image in the reference meta-image P1, the coordinate position of the matching block image in the meta-image P3 to be matched, and the positional relationship between the photosensitive pixel set outputting the reference meta-image P1 and the photosensitive pixel set outputting the meta-image P3 to be matched.
In this way, by performing the parallax calculation on the reference block image and the matching block image of each image matching pair, a plurality of pieces of depth information d can be obtained.
Subsequently, the processor selects one meta-image from the remaining meta-images of the reference image set as the new reference meta-image, for example the meta-image P2, and one meta-image from the remaining meta-images of the image set to be matched as the new meta-image to be matched, for example the meta-image P4. The processor then processes the reference meta-image P2 and the meta-image P4 to be matched in the manner of the depth calculation described above to obtain a plurality of pieces of depth information d. The processor next selects, for example, the meta-image P5 as the new reference meta-image and the meta-image P7 as the new meta-image to be matched, and so on, until the depth calculation has been performed on the reference meta-image P14 and the meta-image P16 to be matched. In this way, a plurality of pieces of depth information d are obtained, and a depth image of the scene can be obtained by fusing them; the depth information d indicates the distance between each object in the scene and the fly-eye lens 100.
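The patent does not spell out the parallax formula, but the standard stereo triangulation relation depth = focal length × baseline / disparity fits the description: the disparity is the positional offset between the reference block image and the matching block image, and the baseline is the positional relationship between the two photosensitive pixel sets. A minimal sketch, with units and names as assumptions:

```python
def depth_from_disparity(ref_pos, match_pos, baseline_mm, focal_px):
    """Triangulate depth d from the offset between a reference block's
    (row, col) position and its matching block's position.

    baseline_mm: centre-to-centre distance between the two photosensitive
    pixel sets (their microlenses); focal_px: focal length in pixel units.
    Returns depth in mm, or None for zero disparity (object at infinity).
    """
    disparity = ((ref_pos[0] - match_pos[0]) ** 2 +
                 (ref_pos[1] - match_pos[1]) ** 2) ** 0.5
    if disparity == 0:
        return None
    return focal_px * baseline_mm / disparity
```

For example, a 4-pixel disparity with a 2 mm baseline and a 100-pixel focal length gives a depth of 50 mm; larger disparities correspond to nearer objects.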
In one example, when the processor is configured to perform the predetermined processing on the merged image according to the depth information, the processor specifically performs the following operations: determining a foreground region and a background region of the combined image according to the depth information; and blurring the background area according to the depth information.
The merged image and the depth image have a definite mapping relationship: each pixel in the merged image can find its corresponding depth information in the depth image. After acquiring the depth information of the scene, the processor can segment the merged image into a foreground region and a background region according to the depth information. Specifically, in one example, the processor may segment the merged image directly according to a preset depth, that is, assign pixels whose depth information is greater than the preset depth to the background region and pixels whose depth information is less than or equal to the preset depth to the foreground region. The processor then leaves the foreground region unprocessed or applies appropriate sharpening to it. Meanwhile, the processor performs blurring processing on the background region. When blurring the background region, the processor may apply the same degree of blurring to all pixels of the background region. Alternatively, the processor may further divide the background region into a plurality of sub-regions from near to far and increase the degree of blurring of the sub-regions successively along the near-to-far direction, the pixels within each sub-region sharing the same degree of blurring. Blurring the background region to different degrees in this way can improve the quality of the finally output merged image.
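The tiered background blurring described above — foreground untouched, background sub-regions blurred more strongly the farther they are — can be sketched with a per-pixel box blur whose radius grows with the depth band; the threshold and band width are illustrative parameters, not from the patent:

```python
def box_blur_at(img, y, x, r):
    """Mean of the (2r+1) x (2r+1) neighbourhood around (y, x), clipped
    at the image borders (a simple box blur of a single pixel)."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]
    return sum(vals) / len(vals)


def blur_background(img, depth, near_limit, band_width):
    """Leave pixels at or nearer than near_limit untouched (foreground);
    blur deeper pixels with a box-blur radius that grows by one per depth
    band of width band_width, so farther sub-regions are blurred more."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            d = depth[y][x]
            if d > near_limit:                       # background pixel
                radius = 1 + int((d - near_limit) // band_width)
                out[y][x] = box_blur_at(img, y, x, radius)
    return out
```

All pixels falling in the same depth band share one blur radius, matching the "same degree of blurring within each sub-region" behaviour.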
In one example, when the processor is configured to perform the predetermined processing on the merged image according to the depth information, the processor specifically performs the following operations: determining a region to be focused of the merged image according to user input; and blurring the region of the merged image other than the region to be focused according to the depth information. The user input may include the following: the user clicks, on the touch-enabled display screen 230, a position of the merged image previewed on the display screen 230, and the processor expands a region of predetermined size and shape outward with that position as the center point to obtain the region to be focused. Alternatively, the processor records the positions on the display screen 230 clicked by the user during previous uses of the electronic device 200 and takes the most frequently clicked position as a default position; when the user does not click the display screen 230, the processor expands a region of predetermined size and shape outward with the default position as the center point to obtain the region to be focused.
The merged image and the depth image have a certain mapping relation, and each pixel in the merged image can find corresponding depth information in the depth image. After the depth information of the scene is acquired and the area to be focused of the combined image is determined, the processor may not process the area to be focused or appropriately sharpen the area to be focused. Meanwhile, the processor performs blurring processing on an area except for the area to be focused (i.e., a non-focusing area). Specifically, the processor may perform the blurring processing of the same blurring degree on all pixels in the out-of-focus area. Or, the processor may further divide the non-focusing region into a plurality of sub-regions from near to far according to the depth information, and sequentially increase the degree of blurring of the sub-regions along the direction from near to far, where a plurality of pixels in each sub-region have the same degree of blurring. Therefore, the blurring of the non-focusing area is performed to different degrees, and the quality of the finally output combined image can be improved.
In the electronic device 200 according to the embodiment of the application, the microlens array structure 10 covers a plurality of photosensitive pixel sets, each photosensitive pixel set can output a meta-image, and the meta-images are fused by the processor to obtain a high-resolution merged image. The electronic device 200 can thus capture a merged image of higher resolution without arranging a plurality of conventional cameras; the overall size of the fly-eye lens 100 is smaller, which facilitates integration into an electronic device 200 with strict thickness requirements, and the fly-eye lens 100 also costs less than a plurality of conventional cameras, further reducing the manufacturing cost of the electronic device 200.
Referring to fig. 18, a method for manufacturing a microlens array structure 10 according to an embodiment of the present disclosure includes:
01: forming a first lens array 12, the first lens array 12 including a plurality of first bosses 126, the plurality of first bosses 126 being arranged in a first direction;
02: forming a second lens array 14, the second lens array 14 including a plurality of second bosses 146, the plurality of second bosses 146 being arranged in a second direction; and
03: the first lens array 12 is combined with the second lens array 14 such that the plurality of first mesas 126 overlap the plurality of second mesas 146 in a third direction, the first direction intersecting the second direction, the third direction being perpendicular to the first direction and the second direction to form a microlens array.
It is to be understood that the foregoing explanation of the microlens array structure 10 is applicable to the method for manufacturing the microlens array structure 10 of the present embodiment, and will not be further described herein.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (8)

1. A fly-eye lens, characterized in that the fly-eye lens comprises:
a microlens array structure including a first lens array and a second lens array, the first lens array including a plurality of first bosses arranged in a first direction; the second lens array comprises a plurality of second bosses, the second bosses are arranged along a second direction, the first bosses and the second bosses are overlapped in a third direction to form the micro lens array, the first direction intersects with the second direction, and the third direction is perpendicular to the first direction and the second direction;
an image sensor disposed on an image side of the microlens array structure, the image sensor including a plurality of photosensitive pixels, the microlens array structure covering a plurality of photosensitive pixel sets, each photosensitive pixel set including a plurality of the photosensitive pixels, the plurality of photosensitive pixels in each photosensitive pixel set receiving light incident through the microlens array structure to output a plurality of meta-images; and
the processor is used for fusing a plurality of the meta-images to obtain a merged image, calculating depth information of a scene according to at least two meta-images, and performing preset processing on the merged image according to the depth information;
the processor is further configured to:
selecting one meta image from the multiple meta images as a reference meta image and one meta image as a to-be-matched meta image, dividing the reference meta image into multiple block images, and selecting one block image from the multiple block images as a reference block image; searching a matching block image matched with the reference block image in the element image to be matched to form a matching image pair; circularly executing the dividing step and the searching step to traverse a plurality of block images in the reference element image to obtain a plurality of matching image pairs; fusing each pair of image matching pairs to obtain a fused sub-image, and splicing a plurality of fused sub-images to obtain an initial spliced sub-image;
and taking the spliced sub-image as the new reference meta-image, continuing to select one meta-image from the remaining meta-images as the new meta-image to be matched, and repeating the above steps cyclically until all the meta-images have been fused to obtain the combined image.
2. The fly-eye lens of claim 1, wherein the first lens array comprises a first flat surface and a first convex surface opposite to each other, the plurality of first bosses forming the first convex surface; the second lens array comprises a second flat surface and a second convex surface opposite to each other, the plurality of second bosses forming the second convex surface;
the first planar surface is bonded to the second planar surface; or
The first flat surface is combined with the second convex surface; or
The first convex surface is combined with the second flat surface; or
The first raised surface is joined to the second raised surface.
3. The fly-eye lens according to claim 2, wherein the first flat surface and the second flat surface are both planar structures.
4. The fly-eye lens according to claim 2, wherein the first flat surface and the second flat surface are both curved surface structures;
when the first flat surface is combined with the second flat surface, the degree of curvature of the first flat surface is the same as that of the second flat surface;
when the first flat surface is combined with the second convex surface, the degree of curvature of the first flat surface is the same as that of the second convex surface;
when the first convex surface is combined with the second flat surface, the degree of curvature of the first convex surface is the same as that of the second flat surface;
when the first convex surface is combined with the second convex surface, the degree of curvature of the first convex surface is the same as that of the second convex surface.
5. The fly-eye lens according to claim 2, wherein each of the first flat surface, the first convex surface, the second flat surface, and the second convex surface is any one of an aspherical surface, a spherical surface, a Fresnel surface, or a binary optical surface.
6. The fly-eye lens according to claim 2, wherein the first flat surface and the second flat surface are both planar structures, and the image sensor is also a planar structure.
7. The fly-eye lens of claim 2, wherein the first flat surface and the second flat surface are both curved structures, and the image sensor is also a curved structure.
8. An electronic device, comprising:
a housing; and
the fly-eye lens of any one of claims 1 to 7, which is disposed on the housing.
CN201811416956.4A 2018-11-26 2018-11-26 Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device Expired - Fee Related CN109445002B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811416956.4A CN109445002B (en) 2018-11-26 2018-11-26 Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device
PCT/CN2019/103994 WO2020107984A1 (en) 2018-11-26 2019-09-02 Micro lens array structure and manufacturing method therefor, fly-eye lens, and electronic device

Publications (2)

Publication Number Publication Date
CN109445002A (en) 2019-03-08
CN109445002B true CN109445002B (en) 2021-03-23

Family

ID=65554975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811416956.4A Expired - Fee Related CN109445002B (en) 2018-11-26 2018-11-26 Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device

Country Status (2)

Country Link
CN (1) CN109445002B (en)
WO (1) WO2020107984A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109348114A (en) * 2018-11-26 2019-02-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging device and electronic device
CN109445002B (en) * 2018-11-26 2021-03-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device
CN109743430A (en) * 2019-03-14 2019-05-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display module and electronic device
CN111025528A (en) * 2019-11-29 2020-04-17 Huawei Machine Co., Ltd. Imaging system, camera module and mobile terminal

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NO305728B1 (en) * 1997-11-14 1999-07-12 Reidar E Tangen Optoelectronic camera and method of image formatting in the same
JP2002122706A (en) * 2000-10-12 2002-04-26 Ngk Insulators Ltd Microlens array and method for manufacturing the same
JP2007516469A (en) * 2003-11-18 2007-06-21 マーリン テクノロジー リミテッド ライアビリティ カンパニー Variable optical arrangement and variable manufacturing method
CN102681046B (en) * 2012-05-17 2014-04-16 North University of China Method for preparing large-area NOA73 curved-surface micro lens array
TWI471610B (en) * 2013-07-02 2015-02-01 Chunghwa Picture Tubes Ltd Stereoscopic display device
CN105182552B (en) * 2015-09-10 2018-01-16 Zhangjiagang Kangde Xin Optronics Material Co., Ltd. Grating film and 3D display device
CN106094067B (en) * 2016-08-23 2018-01-09 Zhangjiagang Kangde Xin Optronics Material Co., Ltd. Lenticular lens array structure of electro-optical material and display device comprising same
CN106534693B (en) * 2016-11-25 2019-10-25 Nubia Technology Co., Ltd. Photo processing method, device and terminal
CN108230384B (en) * 2017-11-28 2021-08-24 Shenzhen SenseTime Technology Co., Ltd. Image depth calculation method and device, storage medium and electronic equipment
CN109348114A (en) * 2018-11-26 2019-02-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging device and electronic device
CN109445002B (en) * 2018-11-26 2021-03-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Microlens array structure and manufacturing method thereof, compound eye lens, and electronic device

Also Published As

Publication number Publication date
CN109445002A (en) 2019-03-08
WO2020107984A1 (en) 2020-06-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210323