Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that the relative arrangement of the components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
Fig. 1 is a diagram illustrating a camera module according to an embodiment of the present disclosure.
As shown in Fig. 1, a camera module 1 of the embodiment of the present disclosure includes a first imaging unit 10 and a second imaging unit 20 which are adjacently arranged. Each of the imaging units 10 and 20 includes a light sensor for capturing an image. That is, the first imaging unit 10 includes the photosensor 11, and the second imaging unit 20 includes the photosensor 21. In some embodiments of the present disclosure, the light sensors of the first imaging unit 10 and the second imaging unit 20 are two separate sensors, which may be coupled to the same substrate or disposed on separate substrates. In some embodiments of the present disclosure, the photosensors of the first and second imaging units 10 and 20 may be combined into one sensor, or may be two portions of one sensor, respectively. According to embodiments of the present disclosure, the light sensor may be of any type known in the art, such as a CCD (charge-coupled device) element, a CMOS (complementary metal-oxide-semiconductor) device, a photodiode, and the like. In addition, according to embodiments of the present disclosure, the light sensor may be a color light sensor, a monochromatic light sensor, an infrared light sensor, a grayscale sensor, or the like. In some embodiments of the present disclosure, the light sensors of the first and second imaging units 10 and 20 may be different types of sensors. For example, one light sensor may be a color light sensor while the other is a monochromatic light sensor. For example, the overall size of one light sensor may differ from that of the other. For example, one photosensor may have a different number of pixels than the other. For example, the pixel size of one photosensor may differ from that of the other. Hereinafter, the light sensor will be further described.
The first imaging unit 10 and the second imaging unit 20 of the camera module 1 of the embodiment of the present disclosure are both periscopic optical imaging units. That is, each imaging unit includes at least a first optical axis and a second optical axis, and a plurality of optical elements arranged along the first and second optical axes of that imaging unit. These optical elements include at least a reflective element and a refractive element. The reflective element is used for receiving light from an object field in front of the camera module along the first optical axis and reflecting the received light. The reflected light propagates along the second optical axis. The refractive element is for refracting the light propagating along the second optical axis from the reflective element and directing it toward the light sensor to form an image of the object field on the light sensor. Specifically, the first imaging unit 10 includes first and second optical axes AX11 and AX12, and a reflective element 12 and a refractive element 13 arranged along the first and second optical axes AX11 and AX12. The reflective element 12 is configured to receive light from an object field in front of the camera module 1 along the first optical axis AX11 and reflect the received light. The reflected light travels along the second optical axis AX12. The refractive element 13 is for refracting light traveling along the second optical axis AX12 and guiding the light toward the photosensor 11 to form an image of the object field on the photosensor 11. Similarly, the second imaging unit 20 includes first and second optical axes AX21 and AX22, and a reflective element 22 and a refractive element 23 arranged along the first and second optical axes AX21 and AX22. The reflective element 22 is configured to receive light from an object field in front of the camera module 1 along the first optical axis AX21 and reflect the received light.
The reflected light travels along the second optical axis AX22. The refractive element 23 serves to refract light traveling along the second optical axis AX22 and guide the light toward the photosensor 21 to form an image of the object field on the photosensor 21. As shown in Fig. 1, the first and second imaging units 10 and 20 are arranged side by side such that the first and second optical axes AX11 and AX12 of the first imaging unit 10 and the first and second optical axes AX21 and AX22 of the second imaging unit 20 are parallel to each other. In this embodiment, not only are the optical axes of the two imaging units parallel to each other, but the directions of incident light along the two optical axes are also the same.
In this application, "receiving light along a first optical axis" and "light propagating along a second optical axis" do not mean that all of the light rays in these light beams propagate in directions that are perfectly coincident with or parallel to the direction of the optical axis. For example, among the light beams entering the imaging unit along the first optical axis, rays coincident with or parallel to the direction of the first optical axis may be included, and rays at an angle to the first optical axis may also be included as long as the rays can enter the imaging unit and be reflected by the reflecting element. Similarly, among the light traveling along the second optical axis, rays coincident with or parallel to the direction of the second optical axis may be included, and rays at an angle to the second optical axis may also be included as long as the rays can be refracted by the refractive element and incident on the light sensor.
The reflective elements 12 and 22 of the first and second imaging units 10 and 20 of the embodiment of the present disclosure may include any optical element having a reflective function, for example, a mirror, a prism, a beam splitter, and the like.
In some embodiments of the present disclosure, the first optical axis and the second optical axis of the imaging unit are substantially perpendicular to each other, but the present disclosure is not limited thereto. It will also be apparent to those skilled in the art that the first optical axis and the second optical axis may form any other angle by changing the orientation of the reflective surface of the reflective element, for example, according to design requirements or the like.
In addition, the refractive elements 13 and 23 of the first and second imaging units 10 and 20 of the embodiment of the present disclosure may include any optical element having refractive power, for example, a spherical lens (e.g., a convex lens, a concave lens, a convex-concave lens, a meniscus lens, etc.), an aspherical lens, a free-form surface lens, or the like. In addition, the refractive elements 13 and 23 may include an element having a negative refractive power and an element having a positive refractive power. In addition, the material of the refractive elements 13 and 23 may be any transparent material, for example, glass, resin, plastic, etc. Furthermore, the surfaces of the refractive elements 13 and 23 may be provided with optical coatings for, e.g., increasing transmission, reducing reflection, or filtering. In one embodiment of the present disclosure, the refractive element may include at least one of: a zoom lens, a focus lens, a dispersion correction lens, and the like. Those skilled in the relevant art can implement various configurations of the refractive element according to actual needs, so as to realize optical functions including zooming, focusing, and the like. In addition, the refractive elements 13 and 23 shown in Fig. 1 are merely schematic and do not represent a specific configuration of the refractive elements of the embodiments of the present disclosure.
In addition, the optical elements of the imaging unit may also include optical elements such as filters, aperture stops, outer protective lenses, and the like.
In some embodiments of the present disclosure, the camera module further comprises one or more actuation motors. Motors may be used to displace optical elements, light sensors, etc. For example, by moving the focus lens and the zoom lens with the actuator motor, the operations of focusing and optical zooming can be achieved. In addition, by moving the optical element and/or the optical sensor with the actuator motor according to the movement of the camera module with respect to the object field, the movement of the camera module can be compensated to obtain a more stable picture. For example, the movement of the camera module may be compensated by adjusting at least one of a reflection angle of the reflection element, a position of the refraction element, and a position of the light sensor according to the movement of the camera module.
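The compensation strategy described above can be sketched, purely for illustration, as distributing a measured shake angle across the three adjustable elements named in the preceding paragraph. The function name, the actuator gains, and the sign convention below are assumptions for this sketch and are not part of the disclosed embodiments:

```python
def compensate_shake(shake_deg: float,
                     mirror_gain: float = 0.5,
                     lens_gain: float = 0.3,
                     sensor_gain: float = 0.2) -> dict:
    """Split a measured shake angle across three actuators.

    The gains (which sum to 1.0 so the shake is fully canceled) are
    illustrative; a real stabilization controller would derive them
    from the optical design of the module.
    """
    return {
        "mirror_tilt_deg": -shake_deg * mirror_gain,        # reflection angle of the reflective element
        "lens_shift_deg_equiv": -shake_deg * lens_gain,     # position of the refractive element
        "sensor_shift_deg_equiv": -shake_deg * sensor_gain, # position of the light sensor
    }

corrections = compensate_shake(1.0)  # a 1-degree shake is fully counteracted
```

Any split whose gains sum to one cancels the measured motion; in practice each actuator also has a limited travel range, which a real controller would respect.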
One of ordinary skill in the relevant art will appreciate that there may be differences between the images obtained by the two imaging units. For example, although the two imaging units capture the same object field, the two captured images differ in quality, so that a higher-quality image can be obtained by combining them. For example, when the two imaging units observe an object at different angles, there is parallax between the images obtained by the two imaging units. Parallax information can be obtained by comparing the two images, and from this parallax, the distance of each object in the object field from the camera module, that is, depth information of the field of view, can be calculated. Based on the calculated distances of the respective objects, operations such as blurring the background of a subject, quick focusing, and 3D imaging can be performed. One of ordinary skill in the relevant art will appreciate that these two scenarios are not mutually exclusive and may be implemented simultaneously in various embodiments of the present disclosure.
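The distance calculation from parallax described above can be sketched with the standard pinhole stereo relation, depth = focal length × baseline / disparity. The numerical values below are illustrative assumptions, not parameters of the disclosed module:

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_mm: float,
                         disparity_px: float) -> float:
    """Estimate object distance (mm) from stereo parallax.

    Standard pinhole stereo model: depth = f * B / d, where f is the
    focal length in pixels, B the spacing (baseline) between the two
    imaging units, and d the measured pixel disparity.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_mm / disparity_px

# Example: 2000 px focal length, 10 mm baseline, 20 px disparity -> 1000 mm
distance_mm = depth_from_disparity(2000.0, 10.0, 20.0)
```

The inverse dependence on disparity is why a larger spacing between the two imaging units, as discussed below, yields more accurate depth for distant objects.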
In some embodiments of the present disclosure, there is a spacing between the two imaging units on the image plane, so that there is a parallax between the images obtained by the two imaging units. It will be appreciated by one of ordinary skill in the relevant art that the distance between the two imaging units can be set as desired. In some embodiments of the present disclosure, when the distance between the two imaging units is large, the parallax between the images obtained by the two imaging units is large, thereby facilitating obtaining depth information of the captured object field by comparing the two images. In other embodiments of the present disclosure, when the distance between the two imaging units is small, the parallax between the images obtained by the two imaging units is small, thereby contributing to saving the amount of computation of image processing for eliminating the parallax when the two images are combined to obtain the final captured image.
In some embodiments of the present disclosure, the angles at which the two imaging units observe the object may be the same. Although such an embodiment cannot obtain depth information because no parallax is generated between the images obtained by the two imaging units, it completely omits the image processing operation of eliminating parallax between the two images, contributing to a reduction in the workload of the processor.
In a conventional camera module, light is refracted by the refractive element and is directly incident on the light sensor along the height direction of the camera module. However, since the height of the camera module is limited, the length of the optical path in the camera module is also limited, which greatly limits the number, thickness, shape, displacement distance, and the like of the refractive elements. Therefore, the conventional camera module has great limitations in the magnification value of the optical zoom, the magnification range of the optical zoom, the imaging quality, the anti-shake capability, and the like. In contrast, in the periscopic imaging unit included in the camera module according to the embodiment of the present disclosure, the imaging optical path is folded by the reflective element into the longitudinal direction of the camera module and is thereby extended. As a result, there is more space to arrange optical elements (e.g., refractive elements) and other elements and to set the moving distances of the elements, so that the magnification value of the optical zoom, the magnification range of the optical zoom, the imaging quality (e.g., enhanced resolution and contrast, reduced glare, and the like), and the anti-shake capability of the imaging unit can be greatly improved. In addition, the periscopic imaging unit can also reduce the height of the camera module and prevent the camera module from protruding from the housing of a mobile device or the like.
In addition, by combining the advantages of the two periscopic imaging units, the camera module of the present disclosure can improve the magnification value of the optical zoom, the magnification range of the optical zoom, the imaging quality, and the anti-shake capability of the imaging units compared to a camera module using two conventional imaging units. The above and other advantages of the present disclosure will be further explained in conjunction with the above embodiments and the following embodiments.
In some embodiments of the present disclosure, for example, the first imaging unit 10 is configured to perform optical zooming in a low magnification range, the second imaging unit 20 is configured to perform optical zooming in a high magnification range, and there is at least partial overlap between the optical zoom magnification ranges of the first imaging unit 10 and the second imaging unit 20. Therefore, the camera module according to the present embodiment can perform optical zooming in the low magnification range with the first imaging unit 10 and in the high magnification range with the second imaging unit 20, and can perform imaging with one or both of the two imaging units in the region where their optical zoom magnification ranges overlap, thereby transitioning from imaging with one imaging unit to imaging with the other. Accordingly, as the user moves from a low optical zoom magnification to a high optical zoom magnification, a smooth optical zoom result can be obtained, and the overall optical zoom magnification range of the entire camera module is the combination of the optical zoom magnification ranges of the two imaging units, thereby enabling optical zooming over a larger magnification range.
For example, the optical zoom magnification range of the first imaging unit 10 may be 1 to 6 times, and that of the second imaging unit may be 4 to 10 times, so that the overlapping region of the two optical zoom ranges is 4 to 6 times. In another embodiment, the optical zoom magnification range of the first imaging unit is 1 to 11 times, and that of the second imaging unit is 9 to 15 times, so that the overlapping region is 9 to 11 times. The lower limit of the low magnification range may be, for example, 0.5 times, 0.2 times, 0.01 times, etc., and the upper limit of the high magnification range may be, for example, 100 times, 2000 times, etc. In addition, the size of the overlapping region of the two optical zoom ranges may be, for example, 0.5 times, 1 time, 5 times, 10 times, or the like, in addition to the 2 times of the above examples. For example, the optical zoom magnification range of the first imaging unit 10 may be 1 to 5.25 times, and that of the second imaging unit may be 4.75 to 10 times, so that the overlapping region is 4.75 to 5.25 times, that is, the size of the overlapping region is 0.5 times. In addition, the overlap of the two optical zoom ranges also includes the case where only the end points of the two ranges coincide; for example, the optical zoom magnification range of the first imaging unit 10 may be 1 to 5 times and that of the second imaging unit may be 5 to 10 times, so that the overlapping region is only the single magnification of 5 times.
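The zoom handoff described in the preceding paragraphs can be sketched as a simple selection rule: each requested magnification maps to the unit(s) whose range covers it, and both units are available inside the overlap. The range values below match the first example above; the function name and selection logic are illustrative assumptions, not the disclosed control method:

```python
def select_units(magnification: float,
                 low_range=(1.0, 6.0),
                 high_range=(4.0, 10.0)) -> list:
    """Return which imaging unit(s) can serve a requested optical zoom.

    Inside the overlap of the two ranges (here 4x to 6x) both units
    can image, which allows a smooth transition from one unit to the
    other as the user zooms.
    """
    units = []
    if low_range[0] <= magnification <= low_range[1]:
        units.append("first")   # low-magnification periscopic unit
    if high_range[0] <= magnification <= high_range[1]:
        units.append("second")  # high-magnification periscopic unit
    return units
```

Zooming continuously from 1× to 10× therefore never leaves the user without at least one usable unit, which is the smoothness property the embodiment relies on.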
It will be understood by those of ordinary skill in the relevant art that the above numerical values are merely examples, and the present disclosure is not limited to these examples, and the magnification ranges of the optical zooms of the two imaging units may be arbitrarily set according to actual needs as long as the two ranges are different and overlap.
In addition, it can be understood by those of ordinary skill in the related art that although the magnifications of the optical zooms of the two imaging units are different, it is still possible to combine the images obtained by the two imaging units to obtain a higher quality image and/or depth information of the field of view in consideration of the difference in the magnifications of the images obtained by the two imaging units.
In some embodiments of the present disclosure, the zoom ranges of the optical zooms of the first imaging unit 10 and the second imaging unit 20 are the same, and the optical zooms of the first imaging unit 10 and the second imaging unit 20 are synchronized. For example, the magnifications of the optical zooms of the first imaging unit 10 and the second imaging unit 20 may always differ by a fixed magnification or remain the same. In this embodiment, it is possible to obtain a higher quality image and/or depth information of the field of view using the images captured by the first and second imaging units 10 and 20.
In some embodiments of the present disclosure, the optical zoom ranges of the first imaging unit 10 and the second imaging unit 20 are the same, and the optical zooming of the first imaging unit and the second imaging unit is synchronized. However, in this embodiment, the light sensors of the first and second imaging units 10 and 20 may be a color light sensor and a monochromatic light sensor, respectively. Generally, since the three-color filter array is eliminated, the amount of light reaching each pixel of the monochromatic light sensor is about three times that of the color light sensor; equivalently, for any single color, its effective photosensitive area is about three times that of the color light sensor. Therefore, the monochromatic light sensor can obtain a brighter and sharper image, which can help improve the quality of the image obtained with the color light sensor, particularly when shooting in a low-luminance environment. In this case, by combining the images captured by the first and second imaging units 10 and 20, a higher-quality image can be obtained.
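The factor-of-three light advantage stated above follows from a simple idealized model, assuming each pixel of the color sensor sits behind a filter that passes only one of three color bands while the monochromatic sensor has no color filter at all (real filters and sensors deviate from this ideal):

```python
# Idealized model of the light advantage of a monochromatic sensor.
# Assumption: a three-color filter passes ~1/3 of the incident light
# per pixel; a filterless monochromatic pixel receives all of it.
incident_light = 1.0
color_pixel_light = incident_light / 3   # only one of three bands passes
mono_pixel_light = incident_light        # no color filter in the path

light_advantage = mono_pixel_light / color_pixel_light  # ~3x
```

This is why the monochromatic image is brighter and less noisy in dim scenes, and why fusing it with the color image improves the final result.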
One of ordinary skill in the relevant art will appreciate that the features and advantages discussed above with respect to utilizing a color light sensor and a monochromatic light sensor may be incorporated into various embodiments of the present disclosure.
In addition, in this embodiment of the present disclosure, at least one of the overall size, the number of pixels, and the pixel size of the photosensors of the first and second imaging units 10 and 20 is different. For example, by reducing the size and number of pixels of the optical sensor, the amount of data that needs to be processed when focusing and obtaining the depth of field of a scene can be greatly reduced, thereby reducing the burden on the processor and increasing the processing speed. In addition, reducing the size of the light sensor of one imaging unit compared to the other imaging unit may increase the parallax between the two imaging units, helping to obtain depth information of the scene. In addition, increasing the pixel size can increase the amount of light entering each pixel, which helps to increase the image quality when capturing a darker scene.
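The data-reduction effect described above follows directly from the pixel count. As a rough illustration (the resolutions and bytes-per-pixel figure are assumed values for the sketch, not sensor specifications from this disclosure):

```python
def frame_data_bytes(width_px: int, height_px: int, bytes_per_px: int = 2) -> int:
    """Raw data volume of one frame, assuming a fixed number of bytes per pixel."""
    return width_px * height_px * bytes_per_px

full = frame_data_bytes(4000, 3000)   # hypothetical 12 MP sensor
small = frame_data_bytes(2000, 1500)  # hypothetical 3 MP sensor

# Halving each linear dimension quarters the data the processor must
# handle per frame when computing focus or scene depth.
reduction_factor = full // small
```

The quadratic dependence on linear resolution is what makes even a modestly smaller sensor significantly cheaper to process.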
One of ordinary skill in the relevant art will appreciate that the features and advantages discussed above with respect to overall size, number of pixels, and pixel size of the light sensor may be incorporated into various embodiments of the present disclosure.
In some embodiments of the present disclosure, the focal lengths of the first imaging unit 10 and the second imaging unit 20 are fixed, i.e., neither the first imaging unit 10 nor the second imaging unit 20 is capable of optical zooming. In these embodiments, the zoom lens is not included in the optical elements of the first and second imaging units 10 and 20, so that the magnification (focal length) of the optical zooming of the two imaging units is fixed, and the magnification of the optical zooming of the two imaging units may be the same or different. It will be appreciated by those of ordinary skill in the relevant art that images obtained by two imaging units may be combined to obtain a higher quality image and/or depth information of the field of view, regardless of whether the magnifications of the optical zooms of the two imaging units are the same. In such an embodiment, the advantages of high optical zoom magnification, high imaging quality, and strong anti-shake capability of the periscopic imaging unit are utilized, so that the obtained image is more excellent.
In some embodiments of the present disclosure, at least one of the first and second imaging units may further have a second reflective element for reflecting light from the refractive element onto a third optical axis and directing it toward the light sensor to form an image of the object field on the light sensor. Fig. 2 is a diagram illustrating a camera module according to an embodiment of the present disclosure. As shown in Fig. 2, second reflective elements 14 and 24 are provided in both the first and second imaging units 10 and 20 so as to reflect light from the refractive element onto the third optical axes AX13 and AX23, respectively, and guide the light toward the photosensor. By changing the direction of the optical path a second time, the photosensor can be arranged on the substrate of the camera module, increasing the convenience and stability of its arrangement.
In some embodiments of the present disclosure, the plurality of optical elements of the first and second imaging units further have a first refractive element in front of the reflective element along the optical path for refracting light from an object field positioned in front of the camera module along a first optical axis and transferring the refracted light to the reflective element. Fig. 3 is a diagram illustrating a camera module according to an embodiment of the present disclosure. As shown in fig. 3, in the first and second imaging units 10 and 20, in front of the reflective elements 12 and 22, there are also first refractive elements 15 and 25, respectively. The first refractive elements 15 and 25 take advantage of the space above the reflective elements 12 and 22 to facilitate the placement of the refractive elements along the optical path.
In some embodiments of the present disclosure, at least a portion of the optical elements of the first imaging unit 10 and the second imaging unit 20 are common. For example, the first and second imaging units 10 and 20 may share an optical filter, an outer protective lens, and the like. When the first and second imaging units 10 and 20 share the outer protective lens, it is possible to make the camera module 1 including the first and second imaging units 10 and 20 appear as if it is a camera module including only a single imaging unit, i.e., a single-lens camera module, in terms of appearance. Such a camera module 1 is more aesthetically pleasing. In addition, in the embodiment shown in fig. 1, the reflective elements 12 and 22 of the first and second imaging units 10 and 20 may be combined into one reflective element so that they share one reflective element. Furthermore, in some embodiments of the present disclosure, for example, when the optical zooming of the first imaging unit 10 and the second imaging unit 20 is synchronized, the two imaging units may even share a zoom lens and/or a focus lens. Therefore, since the number of components is reduced, it is possible to make the assembly of the camera module simpler and to increase the reliability of the camera module.
In addition to the arrangement of the two imaging units shown in Fig. 1, the two imaging units of the camera module according to the embodiment of the present disclosure may also adopt other arrangements. These arrangements are explained below.
Figs. 4 to 7 are diagrams illustrating arrangements of camera modules according to embodiments of the present disclosure. As shown in Figs. 4 to 6, the first and second imaging units 10 and 20 are arranged side by side such that the first and second optical axes AX11 and AX12 of the first imaging unit 10 and the first and second optical axes AX21 and AX22 of the second imaging unit 20 are substantially parallel to each other, similarly to Fig. 1. Specifically, in Figs. 4 and 5, the reflective elements 12 and 22 of the two imaging units are adjacently arranged such that the first optical axes AX11 and AX21 of the two imaging units are substantially parallel to each other and point in the same direction, while the second optical axes AX12 and AX22 are substantially parallel to each other but point in substantially opposite directions. That is, the two reflective elements 12 and 22 of the first and second imaging units 10 and 20 reflect light in opposite directions. Such an arrangement may, for example, make it easier to arrange accessories such as motors for the refractive optical elements, and may facilitate wiring and heat dissipation for the imaging elements. In addition, in Fig. 6, the reflective elements 12 and 22 of the two imaging units are not adjacent but are spaced apart by a certain distance. For example, as shown in Fig. 6, the two imaging units are arranged head-to-tail such that the refractive element of one imaging unit is adjacent to the photosensor of the other. Such an arrangement may increase the parallax between the images obtained by the two imaging units while keeping the camera module compact, and may thus further contribute to obtaining depth information of the field of view.
Fig. 7 is a diagram illustrating an arrangement of a camera module according to an embodiment of the present disclosure. As shown in Fig. 7, the first and second imaging units 10 and 20 are arranged to overlap such that the first optical axes AX11 and AX21 of the first and second imaging units 10 and 20 coincide with each other, and the second optical axes AX12 and AX22 are substantially parallel to each other. In this embodiment, the reflective element 12 of the first imaging unit 10 is a transflective (partially reflective, partially transmissive) element configured to reflect a part of the received light and transmit another part. The reflected part propagates along the second optical axis AX12 of the first imaging unit 10, and the transmitted part propagates along the first optical axis AX21 of the second imaging unit 20 and is received by the reflective element 22 of the second imaging unit 20. With this embodiment, the angle at which the two imaging units view the object is the same. Although such an embodiment cannot obtain depth information because no parallax is generated between the images obtained by the two imaging units, it completely omits the image processing operation of eliminating parallax between the two images, contributing to a reduction in the workload of the processor.
The arrangement of the imaging units discussed and illustrated above is merely a few examples of the adjacent arrangement of two imaging units, and the embodiments of the present disclosure are not limited to these. For example, the positions of the two imaging units may be interchanged, and this solution is still within the scope of the present disclosure.
According to further embodiments of the present disclosure, there is provided a mobile device including a camera module according to various embodiments of the present disclosure. The mobile device may be, for example, a cell phone, a camera, a personal digital assistant (PDA), etc. A mobile device incorporating the camera module of the present disclosure can obtain the various advantages of the camera module of the present disclosure.
The terms "front," "back," "top," "bottom," "over," "under," and the like in the description and in the claims, if any, are used for descriptive purposes and not necessarily for describing permanent relative positions. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are, for example, capable of operation in other orientations than those illustrated or otherwise described herein.
As used herein, the word "exemplary" means "serving as an example, instance, or illustration," and not as a "model" that is to be replicated accurately. Any implementation exemplarily described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the disclosure is not limited by any expressed or implied theory presented in the preceding technical field, background, brief summary or the detailed description.
As used herein, the term "substantially" is intended to encompass any minor variation resulting from design or manufacturing imperfections, device or component tolerances, environmental influences, and/or other factors. The word "substantially" also allows for differences from a perfect or ideal situation due to parasitic effects, noise, and other practical considerations that may exist in a practical implementation.
In addition, certain terminology may also be used in the following description for the purpose of reference only, and thus is not intended to be limiting. For example, the terms "first," "second," and other such numerical terms referring to structures or elements do not imply a sequence or order unless clearly indicated by the context.
It will be further understood that the terms "comprises/comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In the present disclosure, the term "providing" is used broadly to encompass all ways of obtaining an object, and thus "providing an object" includes, but is not limited to, "purchasing," "preparing/manufacturing," "arranging/setting," "installing/assembling," and/or "ordering" the object, and the like.
One of ordinary skill in the relevant art will appreciate that the boundaries between the above described operations are merely illustrative. Multiple operations may be combined into a single operation, single operations may be distributed in additional operations, and operations may be performed at least partially overlapping in time. Moreover, alternative embodiments may include multiple instances of a particular operation, and the order of operations may be altered in various other embodiments. However, other modifications, variations, and alternatives are also possible. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those of ordinary skill in the relevant art that the above examples are for illustration only and are not intended to limit the scope of the present disclosure. The various embodiments disclosed herein may be combined in any combination without departing from the spirit and scope of the present disclosure. It will also be appreciated by those of ordinary skill in the relevant art that various modifications may be made to the embodiments without departing from the scope and spirit of the disclosure. The scope of the present disclosure is defined by the appended claims.
The present disclosure relates to the following aspects:
1. A camera module, comprising:
a first imaging unit and a second imaging unit arranged adjacently, each imaging unit including a photosensor for capturing an image and a plurality of optical elements arranged along a first optical axis and a second optical axis of the imaging unit, the plurality of optical elements including:
a reflective element for receiving light from an object field in front of the camera module along the first optical axis and reflecting the received light, such that the reflected light propagates along the second optical axis; and
a refractive element for refracting the light propagating along the second optical axis and guiding it towards the photosensor to form an image of the object field on the photosensor.
2. The camera module according to item 1, wherein
the first imaging unit is configured to perform optical zooming in a low magnification range, the second imaging unit is configured to perform optical zooming in a high magnification range, and the magnification ranges of the optical zooming of the first imaging unit and the second imaging unit overlap.
3. The camera module according to item 2, wherein
the magnification range of the optical zoom of the first imaging unit is 1 to 6 times and the magnification range of the optical zoom of the second imaging unit is 4 to 10 times, or
the magnification range of the optical zoom of the first imaging unit is 1 to 11 times and the magnification range of the optical zoom of the second imaging unit is 9 to 15 times.
4. The camera module according to item 1, wherein
both the first imaging unit and the second imaging unit are capable of optical zooming, and during the optical zooming, the optical zooming of the first imaging unit and the second imaging unit is synchronized.
5. The camera module according to item 1, wherein
the focal lengths of the first imaging unit and the second imaging unit are fixed.
6. The camera module according to item 1, wherein
at least one of the first and second imaging units further has a second reflective element for reflecting the light propagating along the second optical axis, such that the reflected light propagates along a third optical axis and is directed towards the photosensor to form an image of the object field on the photosensor.
7. The camera module according to item 1, wherein
the plurality of optical elements of each of the first and second imaging units further includes a first refractive element located in front of the reflective element along the optical path, for refracting light from the object field in front of the camera module along the first optical axis and transmitting the refracted light to the reflective element.
8. The camera module according to any one of items 1-7, wherein
the photosensors of the first and second imaging units are a color photosensor and a monochromatic photosensor, respectively.
9. The camera module according to any one of items 1-7, wherein
at least one of the overall size, the number of pixels, and the pixel size differs between the photosensors of the first and second imaging units.
10. The camera module according to any one of items 1-7, wherein
at least a portion of the optical elements of the first and second imaging units are shared between the two imaging units.
11. The camera module according to any one of items 1-7, wherein
the first imaging unit and the second imaging unit are arranged side by side such that their first optical axes are parallel to each other and their second optical axes are parallel to each other.
12. The camera module according to any one of items 1-7, wherein
the first imaging unit and the second imaging unit are arranged to overlap such that the first optical axes of the first imaging unit and the second imaging unit overlap each other and the second optical axes are parallel to each other, and
the reflective element of the first imaging unit is a transflective element configured to reflect a portion of the received light and transmit another portion of the received light, the reflected portion propagating along the second optical axis of the first imaging unit, and the transmitted portion propagating along the first optical axis of the second imaging unit and being received by the reflective element of the second imaging unit.
13. A mobile device comprising the camera module according to any one of items 1-12.
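Items 2 and 3 above describe two imaging units whose optical-zoom ranges overlap. As a purely illustrative sketch, and not part of the claimed subject matter, the following shows how a controller might select which imaging unit serves a requested magnification; the range endpoints and the handover point are hypothetical example values chosen inside the claimed 4x-6x overlap:

```python
# Illustrative sketch only: selecting between two imaging units whose
# optical-zoom ranges overlap, as described in items 2 and 3 above.
# All numeric values below are hypothetical examples.

FIRST_UNIT_RANGE = (1.0, 6.0)    # low-magnification unit (e.g. 1x to 6x)
SECOND_UNIT_RANGE = (4.0, 10.0)  # high-magnification unit (e.g. 4x to 10x)
HANDOVER = 5.0                   # handover point chosen inside the 4x-6x overlap

def select_unit(magnification: float) -> str:
    """Return which imaging unit should serve the requested magnification."""
    lo_min, lo_max = FIRST_UNIT_RANGE
    hi_min, hi_max = SECOND_UNIT_RANGE
    if not (lo_min <= magnification <= hi_max):
        raise ValueError(f"magnification {magnification} outside {lo_min}x-{hi_max}x")
    # Below the overlap only the first unit can serve the request;
    # above the overlap only the second unit can; inside the overlap,
    # the handover point decides, avoiding a switch at a range edge.
    if magnification < hi_min:
        return "first"
    if magnification > lo_max:
        return "second"
    return "first" if magnification < HANDOVER else "second"
```

For example, a requested magnification of 3x falls below the overlap and is served by the first unit, 8x falls above it and is served by the second unit, and values inside the 4x-6x overlap are split at the handover point.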