
CN119653213A - Optical filter, camera module, camera device and electronic equipment - Google Patents

Optical filter, camera module, camera device and electronic equipment

Info

Publication number
CN119653213A
Authority
CN
China
Prior art keywords
camera module
transmittance
wavelength
equal
curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411203726.5A
Other languages
Chinese (zh)
Inventor
叶海水
侯昕彤
朱腾峰
黄剑辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202411203726.5A
Publication of CN119653213A
Legal status: Pending


Landscapes

  • Blocking Light For Cameras (AREA)

Abstract


The present application provides an optical filter, a camera module, a camera device and an electronic device. The camera device includes a first camera module, which includes a lens, an optical filter and an image sensor arranged in sequence along the optical axis. The optical filter passes visible light so that spectral information of the target object can be acquired on the image sensor. The optical filter has a first transmittance curve obtained under normal incidence and a second transmittance curve obtained under oblique incidence. On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength in the range of 600nm to 660nm; on the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and the difference between the first wavelength and the second wavelength is in the range of 0 to 20nm. This technical solution can improve the accuracy of the camera module for color calibration while ensuring its color restoration capability.

Description

Optical filter, camera module, camera device and electronic equipment
Technical Field
The present application relates to the field of imaging technologies, and more particularly, to an optical filter, a camera module, an imaging device, and an electronic apparatus.
Background
A multispectral camera module can obtain radiation or reflection information of a photographed target in multiple spectral bands through light-splitting technology, and thereby obtain a characteristic spectrum of the target, which can be used to restore the target's color.
Compared with an existing three-channel camera module, a multispectral camera module has stronger spectrum sensing capability and therefore stronger color restoration capability, so the colors of the acquired image are closer to the real colors. Accordingly, in order to improve the quality of the output image of a three-channel camera module, the image acquired by the multispectral camera module can be used to perform color calibration on the image acquired by the three-channel camera module.
However, the color restoration capability of the multispectral camera module is also affected by two key optical components in the module, namely the optical lens and the optical filter, and the optical response of the module needs to be designed by considering the chip, the optical lens and the optical filter together. A poor optical response design leads to poor accuracy when the multispectral camera module is used for color calibration: even if the multispectral camera module itself can acquire real colors, the three-channel camera module cannot accurately obtain color information from it, so the color performance of the three-channel camera module deviates significantly from what the human eye sees, and authenticity is poor.
Disclosure of Invention
The application provides an optical filter, a camera module, a camera device and electronic equipment, which can improve the accuracy of the camera module for color calibration while ensuring the color restoration capability of the camera module, thereby improving the color authenticity and consistency of images.
A first aspect provides an image pickup device comprising a first camera module. The first camera module comprises an optical lens, an optical filter and an image sensor arranged in sequence along an optical axis direction. The optical lens is used for receiving light rays from a target object, the optical filter is used for allowing visible light in the light rays to pass through, and the image sensor is used for receiving the visible light to obtain spectral information of the target object, the number of channels of the image sensor being greater than 3. The optical filter has a first transmittance curve and a second transmittance curve: the first transmittance curve is the correspondence between wavelength and transmittance for light normally incident on the optical filter, and the second transmittance curve is the correspondence between wavelength and transmittance for light incident on the optical filter at a first angle, the first angle being greater than 0° and less than or equal to 35°. On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength, the first wavelength being greater than or equal to 600nm and less than or equal to 660nm; on the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and the difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm.
In the embodiment of the application, an optical filter with the above parameters makes the spectral response curve of the first camera module similar to the spectral response curve of the human eye and to that of the three-channel camera module, so that after mapping from high dimension to low dimension, the spectral response curve of the first camera module can be respectively close or equal to the spectral response curve of the human eye and that of the three-channel camera module. Therefore, the color restoration capability of the first camera module and the accuracy of color correction for the three-channel camera module can be improved, color cast is reduced or avoided, the color authenticity of the three-channel camera module's image is improved, and the user experience is improved.
Specifically, at normal incidence, the range of the first wavelength corresponding to 50% transmittance of the optical filter is 600nm-660nm, which is approximately equivalent to the corresponding wavelength range of the optical filter in the three-channel camera module. This limits the spectral response range as a whole and helps the spectral response curve of the first camera module approach, after dimension reduction from high dimension, the spectral response curve of the human eye and that of the three-channel camera module. Moreover, with the difference between the first wavelength and the second wavelength in the range of 0-20nm, the transmittance curve of the optical filter ensures that, over the range of incidence angles, the spectral response curve of the first camera module remains highly similar to the spectral response curves of the human eye and the three-channel camera module, thereby reducing or avoiding color cast and improving the color authenticity of the output image.
In addition, keeping the difference between the first wavelength and the second wavelength within the range of 0-20nm limits the angular shift of the transmittance curve of the optical filter, which improves the consistency of color restoration between the image center and the image edge and reduces the color difference between them.
With reference to the first aspect, in one possible implementation manner, on the first transmittance curve, a wavelength corresponding to 50% transmittance includes a third wavelength, where the third wavelength is greater than or equal to 400nm and less than or equal to 440nm, and on the second transmittance curve, a wavelength corresponding to 50% transmittance includes a fourth wavelength, where a difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 20nm.
At normal incidence, the range of the third wavelength corresponding to 50% transmittance of the optical filter is 400nm-440nm, which is approximately equivalent to the corresponding wavelength range of the optical filter in the three-channel camera module and limits the spectral response range as a whole. This makes the spectral response curve of the first camera module more similar to the spectral response curves of the human eye and the three-channel camera module, improves the color restoration capability of the first camera module and the accuracy of color correction for the three-channel camera module, reduces or avoids color cast, and improves the color authenticity of the three-channel camera module's image.
Keeping the difference between the third wavelength and the fourth wavelength within the range of 0-20nm likewise limits the angular shift of the transmittance curve of the optical filter, improving the consistency of color restoration between the image center and the image edge and reducing the color difference between them.
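For illustration only, the following sketch shows one way the 50%-transmittance crossing wavelengths of measured transmittance curves, and the angular shift between the normal-incidence and oblique-incidence curves, might be checked against the ranges above. The NumPy-based helper names (crossing_wavelengths, red_edge_shift) are illustrative assumptions, not part of the patent.

```python
import numpy as np

def crossing_wavelengths(wl_nm: np.ndarray, t: np.ndarray, level: float = 0.5) -> np.ndarray:
    """Wavelengths (nm) at which the transmittance curve crosses `level`,
    found by linear interpolation between adjacent samples."""
    s = t - level
    idx = np.where(np.sign(s[:-1]) * np.sign(s[1:]) < 0)[0]   # intervals that bracket a crossing
    frac = -s[idx] / (s[idx + 1] - s[idx])                     # position of the crossing inside the interval
    return wl_nm[idx] + frac * (wl_nm[idx + 1] - wl_nm[idx])

def red_edge_shift(wl_nm, t_normal, t_oblique):
    """First wavelength (normal incidence, expected in 600-660 nm), the matching
    second wavelength (oblique incidence) and the angular shift between them."""
    red_normal = [w for w in crossing_wavelengths(wl_nm, t_normal) if 600.0 <= w <= 660.0]
    lam1 = red_normal[0]
    oblique = crossing_wavelengths(wl_nm, t_oblique)
    lam2 = oblique[np.argmin(np.abs(oblique - lam1))]          # nearest crossing on the oblique curve
    return lam1, lam2, abs(lam1 - lam2)                        # shift should stay within 0-20 nm
```

The same routine applied around the 400nm-440nm blue edge yields the third and fourth wavelengths.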
With reference to the first aspect, in one possible implementation manner, a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 10nm.
With reference to the first aspect, in one possible implementation manner, on the first transmittance curve, an average transmittance of 350nm to 390 nm bands is less than or equal to 3%, an average transmittance of 700nm to 780nm bands is less than or equal to 2%, and on the second transmittance curve, an average transmittance of 350nm to 390 nm bands is less than or equal to 4%, and an average transmittance of 700nm to 780nm bands is less than or equal to 3%.
In the embodiment of the application, the cut-off wavelengths of the optical filter correspond to the cut-off positions of the visual response curve of the human eye, so light outside the band perceived by the human eye can be filtered out by the optical filter. The effective information therefore has a better signal-to-noise ratio, which improves image quality.
With reference to the first aspect, in one possible implementation manner, on the first transmittance curve, the lowest transmittance of 440nm to 580nm bands is greater than or equal to 70%, the average transmittance of 440nm to 580nm bands is greater than or equal to 80%, and on the second transmittance curve, the lowest transmittance of 440nm to 580nm bands is greater than or equal to 70%, and the average transmittance of 440nm to 580nm bands is greater than or equal to 80%.
440nm-580nm is the stable section of the transmittance of the optical filter. Constraining the minimum value within this band prevents large transmittance fluctuations, ensuring consistent color rendition. Constraining the average transmittance of this band ensures the overall light sensitivity of the multispectral camera module, thereby improving imaging quality.
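A minimal sketch of how the band constraints above could be evaluated on sampled transmittance data follows; it assumes transmittance is expressed as a fraction between 0 and 1, and the function names are illustrative only.

```python
import numpy as np

def band_stats(wl_nm, t, lo, hi):
    """Average and minimum transmittance in the [lo, hi] nm band (transmittance as a 0-1 fraction)."""
    m = (wl_nm >= lo) & (wl_nm <= hi)
    return float(t[m].mean()), float(t[m].min())

def check_filter_bands(wl_nm, t_normal, t_oblique):
    """Evaluate the band constraints listed above; returns True/False per constraint group."""
    avg_uv_n, _ = band_stats(wl_nm, t_normal, 350, 390)
    avg_ir_n, _ = band_stats(wl_nm, t_normal, 700, 780)
    avg_uv_o, _ = band_stats(wl_nm, t_oblique, 350, 390)
    avg_ir_o, _ = band_stats(wl_nm, t_oblique, 700, 780)
    avg_vis_n, min_vis_n = band_stats(wl_nm, t_normal, 440, 580)
    avg_vis_o, min_vis_o = band_stats(wl_nm, t_oblique, 440, 580)
    return {
        "uv_cutoff": avg_uv_n <= 0.03 and avg_uv_o <= 0.04,
        "ir_cutoff": avg_ir_n <= 0.02 and avg_ir_o <= 0.03,
        "visible_passband": min(min_vis_n, min_vis_o) >= 0.70 and min(avg_vis_n, avg_vis_o) >= 0.80,
    }
```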
With reference to the first aspect, in one possible implementation manner, the first wavelength and the second wavelength belong to a red light band, and the third wavelength and the fourth wavelength belong to a blue light band.
With reference to the first aspect, in one possible implementation manner, the camera device further includes at least one second camera module, the second camera module is a three-channel camera, the first camera module is a C-channel camera with C greater than 3, a similarity v0 between the spectral response curve of the first camera module and the spectral response curve of the human eye is greater than or equal to 0.93, and a similarity v' between the spectral response curve of the first camera module and the spectral response curve of the second camera module is greater than or equal to 0.93, wherein:
Q_{N×C} is the spectral response matrix obtained by N-point sampling of each of the C channels of the spectral response curve of the first camera module, and P{Q_{N×C}} = Q_{N×C}[Q_{N×C}^T Q_{N×C}]^{-1} Q_{N×C}^T;
X_{N×3} is the spectral response matrix obtained by N-point sampling of each of the three channels of the spectral response curve of the human eye;
X'_{N×3} is the spectral response matrix obtained by N-point sampling of each of the three channels of the spectral response curve of the second camera module;
Tr(·) denotes the trace of a matrix.
When v0 is greater than or equal to 0.93, the spectral response curve of the first camera module can be as close as possible to the spectral response curve of the human eye after dimension reduction from high dimension, which improves the color restoration capability of the multispectral camera module, reduces or avoids color cast, and improves the color authenticity, or the accuracy of the color parameters, of the output image of the multispectral camera module.
When v' is greater than or equal to 0.93, the spectral response curve of the first camera module can be as close to the spectral response curve of the second camera module as possible after the dimension is reduced from high dimension, so that the accuracy of the transfer of the color parameters of the multispectral camera module to the second camera module can be improved, the color cast problem caused by the transfer error is reduced or avoided, and the color authenticity of the output image of the second camera module is improved.
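The text above defines the projection operator P{·}, the sampled response matrices and the trace, but does not spell out the closed form of v0 and v'. The sketch below therefore assumes a Vora-style normalized projection similarity, Tr(P{Q}·P{X})/3, purely to illustrate how such a metric could be computed.

```python
import numpy as np

def projector(A: np.ndarray) -> np.ndarray:
    """Orthogonal projector onto the column space of A: P{A} = A (A^T A)^-1 A^T."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

def similarity(Q: np.ndarray, X: np.ndarray) -> float:
    """Assumed Vora-style similarity Tr(P{Q} P{X}) / 3 between the N×C response
    matrix Q and an N×3 reference X; equals 1 when the column space of X lies
    entirely within the column space of Q."""
    return float(np.trace(projector(Q) @ projector(X)) / X.shape[1])

# Example usage (shapes only): Q is N×C with C > 3, X is the human-eye N×3
# tristimulus matrix, Xp the second module's N×3 response matrix.
# v0 = similarity(Q, X); v_prime = similarity(Q, Xp)   # target: both >= 0.93
```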
With reference to the first aspect, in one possible implementation manner, the spectral response matrix of the first camera module is
Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1}, where
M_{N×C} is the matrix obtained by N-point sampling of each of the C channels of the spectral response curve of the image sensor;
T_{N×1} is the matrix obtained by N-point sampling of the transmittance curve of the optical filter;
L_{N×1} is the matrix obtained by N-point sampling of the transmittance curve of the optical lens.
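A small sketch of this construction follows. How the N×1 filter and lens curves combine with the N×C sensor matrix is not spelled out in the text, so broadcasting the two column vectors across the C channels is an assumed reading of the ⊙ operator.

```python
import numpy as np

def build_Q(M: np.ndarray, T: np.ndarray, L: np.ndarray) -> np.ndarray:
    """Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1}: element-wise product, with the
    N-point filter (T) and lens (L) transmittance columns broadcast across the
    C sensor channels (this broadcasting is an assumed reading of ⊙)."""
    assert M.shape[0] == T.shape[0] == L.shape[0]
    return M * T.reshape(-1, 1) * L.reshape(-1, 1)

# M: N×C sampled spectral response of the image sensor's C channels
# T: N-point sampled transmittance curve of the optical filter
# L: N-point sampled transmittance curve of the optical lens
# The resulting Q can be fed to the similarity sketch above.
```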
The second aspect provides an optical filter applied to a camera module. The camera module further comprises an optical lens and an image sensor, and the optical filter is arranged between the optical lens and the image sensor. The optical lens is used for receiving light rays from a target object, the optical filter is used for allowing visible light in the light rays to pass through, and the image sensor is used for receiving the visible light to obtain spectral information of the target object, the number of channels of the image sensor being greater than 3. The optical filter has a first transmittance curve and a second transmittance curve: the first transmittance curve is the correspondence between wavelength and transmittance for light normally incident on the optical filter, and the second transmittance curve is the correspondence between wavelength and transmittance for light incident on the optical filter at a first angle, the first angle being greater than 0° and less than or equal to 35°. On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength, the first wavelength being greater than or equal to 600nm and less than or equal to 660nm; on the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and the difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm.
With reference to the second aspect, in one possible implementation manner, on the first transmittance curve, a wavelength corresponding to 50% transmittance includes a third wavelength, where the third wavelength is greater than or equal to 400nm and less than or equal to 440nm, and on the second transmittance curve, a wavelength corresponding to 50% transmittance includes a fourth wavelength, where a difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 20nm.
With reference to the second aspect, in one possible implementation manner, a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 10nm.
With reference to the second aspect, in one possible implementation manner, on the first transmittance curve, an average transmittance of 350nm to 390 nm bands is less than or equal to 3%, an average transmittance of 700nm to 780nm bands is less than or equal to 2%, and on the second transmittance curve, an average transmittance of 350nm to 390 nm bands is less than or equal to 4%, and an average transmittance of 700nm to 780nm bands is less than or equal to 3%.
With reference to the second aspect, in one possible implementation manner, on the first transmittance curve, the lowest transmittance of 440nm to 580nm bands is greater than or equal to 70%, the average transmittance of 440nm to 580nm bands is greater than or equal to 80%, and on the second transmittance curve, the lowest transmittance of 440nm to 580nm bands is greater than or equal to 70%, and the average transmittance of 440nm to 580nm bands is greater than or equal to 80%.
With reference to the second aspect, in one possible implementation manner, the first wavelength and the second wavelength belong to a red light band, and the third wavelength and the fourth wavelength belong to a blue light band.
In a third aspect, a camera module is provided, including an optical lens, an image sensor, and an optical filter in any implementation manner of the second aspect and the second aspect, where the optical filter is disposed between the optical lens and the image sensor, the optical lens is configured to receive light from a target object, the optical filter is configured to pass visible light in the light, the image sensor is configured to receive the visible light to obtain spectral information of the target object, and a number of channels of the image sensor is greater than 3.
In a fourth aspect, an electronic device is provided, including an image processing chip and the camera module in the third aspect, where the image processing chip is configured to process an image acquired by the camera module.
A fifth aspect provides an electronic device, including an image processing chip and the image capturing apparatus of the first aspect and any implementation manner of the first aspect, where the image processing chip is configured to process an image acquired by the image capturing apparatus.
With reference to the fifth aspect, in one possible implementation manner, the image capturing device further includes at least one second camera module, where the second camera module is a three-channel camera, and the image processing chip is configured to correct the color of the target object acquired by the second camera module based on the spectral information of the target object acquired by the first camera module.
With reference to the fifth aspect, in one possible implementation manner, the second camera module is a wide-angle camera, a tele camera or an ultra-wide-angle camera.
Advantageous effects of the device according to the second to fifth aspects are described with reference to the first aspect, and are not described in detail herein.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device to which an embodiment of the present application is applied.
Fig. 2 is a schematic exploded view of a camera module according to an embodiment of the present application.
Fig. 3 is a schematic cross-sectional view of a camera module according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a transmittance curve of an optical filter according to an embodiment of the present application.
Fig. 5 is a schematic diagram of transmittance curves of another optical filter according to an embodiment of the application.
Fig. 6 is a schematic diagram of transmittance curves of another optical filter according to an embodiment of the application.
Fig. 7 is a schematic diagram of a transmittance curve of another optical filter according to an embodiment of the application.
Fig. 8 is a schematic diagram of transmittance curves of another optical filter according to an embodiment of the application.
Fig. 9 is a schematic diagram of transmittance curves of another optical filter according to an embodiment of the application.
Fig. 10 is a schematic flowchart of a design method of a camera module according to an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
It should be noted that, in the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist together, or B exists alone.
In embodiments of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", etc. may explicitly or implicitly include one or more such feature. In addition, in the description of the embodiments of the present application, "plurality" means two or more, and "at least one" and "one or more" mean one, two or more. The singular expressions "a," "an," "the," and "such" are intended to include, for example, also "one or more" such expressions, unless the context clearly indicates to the contrary.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the description of the embodiments of the present application, the terms "upper," "lower," "inner," "outer," "vertical," "horizontal," and the like indicate an orientation or positional relationship defined with respect to the orientation or position in which the components are schematically disposed in the drawings, and it should be understood that these directional terms are relative concepts used for descriptive purposes and clarity with respect to each other, rather than for indicating or implying that the apparatus or component being referred to must have a particular orientation or be constructed and operated in a particular orientation, which may vary accordingly with respect to the orientation in which the components are disposed in the drawings and therefore should not be construed as limiting the present application. Furthermore, the term "vertical" referred to in the present application is not strictly vertical but is within the allowable error range. "parallel" is not strictly parallel but is within the tolerance of the error.
In the embodiments of the present application, the same reference numerals are used to denote the same components or the same parts. In addition, the various components in the drawings are not to scale, and the dimensions and sizes of the components shown in the drawings are merely illustrative and should not be construed as limiting the application.
For convenience of understanding, technical terms related to the present application will be explained and illustrated.
The optical axis, which is an imaginary line in the optical system, is understood to be the direction in which the optical system transmits light. For a symmetric transmission system, the optical axis generally coincides with the optical system rotation centerline. If the light beam coincides with the optical axis, the light will pass along the optical axis in the optical system.
Auto focus (AF) uses the principle of light reflection from the photographed object: reflected light from the object passes through the lens and is received on the image sensor, and after processing by the computer, a focusing device is driven to bring the object into focus.
Optical anti-shake (optical image stabilization, OIS) refers to the prevention or reduction of instrument shake phenomena occurring during capturing of optical signals by the arrangement of optical components in imaging instruments such as mobile phones or cameras, so as to improve imaging quality. One common approach is to do shake detection by a gyroscope (gyro), and then translate or rotate the entire lens in the opposite direction by OIS motor, compensating for image blur caused by imaging instrument shake during exposure.
A color space is a mathematical model for representing colors, used to quantize color. Common color spaces include the Lab color space, Luv color space, LCh color space, Yxy color space, XYZ color space, CMYK color space, RGB color space, Hex color space, YUV color space, and the like.
The parameters of the optical filter comprise a central wavelength, transmittance, peak transmittance, bandwidth, a cut-off band and an incident angle.
The center wavelength (CWL) refers to the midpoint between the two wavelengths at which the transmittance is 50% of the peak transmittance, i.e., the midpoint of the full width at half maximum. Typically, the center wavelength is given as the peak transmission wavelength of a bandpass or narrowband filter, or the peak reflection wavelength of a notch filter.
Transmittance is the ratio of the light allowed to pass through a filter to the incident light. It is generally expressed as a percentage and characterizes the loss of light after it enters the filter. Cut-off behavior is exhibited when the transmittance falls to 10% or less.
The peak transmittance is the highest transmittance in the passband of a bandpass filter, that is, the maximum fraction of light that can be transmitted after the losses in the filter. In a spectral curve, the region through which light actually passes is generally called the passband.
The bandwidth, also known as the full width at half maximum (FWHM) or half width, is the wavelength range between the two points of the spectrum at half of the peak transmittance. Filters with a half width of less than 20nm are referred to as narrowband filters, and filters with a half width greater than 20nm are referred to as bandpass filters.
The cut-off band (cut-off wavelength range) represents the range of wavelengths blocked by the filter.
The angle of incidence (angle of incidence, AOI) refers to the angle between the incident light and the normal to the filter surface. When light is perpendicularly incident, the incident angle is 0 °.
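As a rough illustration of the parameters defined above, the sketch below derives the peak transmittance, bandwidth (FWHM) and center wavelength from a sampled transmittance curve; it assumes a single passband and skips sub-sample interpolation.

```python
import numpy as np

def filter_parameters(wl_nm: np.ndarray, t: np.ndarray) -> dict:
    """Peak transmittance, bandwidth (FWHM) and center wavelength of a sampled
    single-passband transmittance curve (no sub-sample interpolation)."""
    t_peak = float(t.max())
    above = np.where(t >= 0.5 * t_peak)[0]        # samples at or above half of the peak
    lam_lo, lam_hi = float(wl_nm[above[0]]), float(wl_nm[above[-1]])
    return {
        "peak_transmittance": t_peak,
        "fwhm_nm": lam_hi - lam_lo,               # full width at half maximum
        "cwl_nm": 0.5 * (lam_lo + lam_hi),        # center wavelength: midpoint of the FWHM
    }
```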
An operator, which is a mapping, is able to transform one mathematical object into another. An operator is understood to be an operation that processes an input according to some rule and produces an output.
The trace of the matrix is the sum of the matrix eigenvalues. For a square matrix, the trace represents the sum of all elements on the main diagonal (diagonal from top left to bottom right) of the square matrix.
It should be noted that the above terms and concepts are presented only to aid in understanding the use and should not be construed as limiting the embodiments of the application.
Fig. 1 shows a schematic block diagram of an electronic device to which an embodiment of the present application is applied.
In an embodiment of the present application, the electronic device is an electronic device having an imaging function, such as a mobile phone, a personal digital assistant (PDA), a tablet computer, a laptop computer, a video camera, a video recorder, a camera, a smart watch, a smart bracelet (smart wristband), a point-of-sale (POS) terminal, an in-vehicle device, a television (e.g., a smart screen), a wearable device, and the like. The embodiment of the application does not limit the specific form of the electronic device. For convenience of explanation and understanding, the following description takes a mobile phone as an example of the electronic device.
By way of example, (a) and (b) in fig. 1 schematically show the front and back, respectively, of the electronic device 100. As shown in fig. 1, an electronic device 100 may include a housing 101, a display screen (DISPLAY PANEL, DP) 102, and a camera array 103.
The housing 101 is formed with an accommodating space for accommodating components of the electronic apparatus 100. The housing 101 may also serve to protect the electronic device 100 and support the whole machine. The display screen 102 and the camera array 103 are disposed in the accommodating space of the housing 101 and are connected to the housing 101. In some embodiments, the housing 101 may include a rear cover disposed opposite the display screen 102 and a middle frame disposed between the rear cover and the display screen 102, and the display screen 102 and the camera array 103 may be fixed to the middle frame. The material of the housing 101 may be metal, plastic, ceramic, glass, or the like.
The display screen 102 is used to display images, for example, images captured by the camera array 103. The display screen 102 may be a liquid crystal display (LCD) screen, an organic light-emitting diode (OLED) display screen, or the like. The display screen 102 may be a regular screen, a special-shaped screen, a folding screen, or the like. The display screen 102 may be disposed on the front and/or back of the electronic device 100. Here, the front side of the electronic device 100 may be understood as the side facing the user when the user uses the electronic device 100, and the back side of the electronic device 100 may be understood as the side facing away from the user when the user uses the electronic device 100.
The camera array 103 is used to capture still images or video. The camera array 103 includes a plurality of camera modules (compact camera modules, CCMs), or simply cameras. Illustratively, the camera array 103 includes at least one multispectral camera module (also referred to herein as a first camera module) and at least one second camera module, where the second camera module is a tele camera, a wide-angle camera, an ultra-wide-angle camera, or a depth camera. When the camera array 103 includes multiple multispectral camera modules (or multiple second camera modules), the multiple multispectral camera modules (or multiple second camera modules) may be identical or different; for example, their focal lengths, optical structures, or optical parameters may differ.
In some embodiments, each of the at least one multispectral camera module is capable of independent imaging, and the captured images can be presented directly to a user for viewing.
In some embodiments, each of the at least one second camera module is capable of independent imaging, and the captured images may be presented directly to a user for viewing. One or more of the at least one second camera module may be a main camera. Typically, the main camera is responsible for the main shooting task, usually has the highest pixel count, and can provide higher resolution and a stronger sensor, thereby meeting the shooting needs of the user in different scenes.
In some embodiments, the image acquired by the multispectral camera module may be used to color calibrate the image acquired by the second camera module. For example, color calibration may be performed on three-channel images (such as Red Green Blue (RGB) images or Red Yellow Blue (RYB) images) of the first scene acquired by the second camera module based on the multispectral image of the first scene acquired by the multispectral camera module.
The second camera module is a three-channel camera module. Limited by material capability, there is a certain difference between its spectral response curve and the spectral response curve of the human eye, and the number of spectral bands it perceives is small, so its color restoration capability is weak and color cast easily occurs for certain colors in the acquired image. Compared with the second camera module, the difference between the spectral response curve of the multispectral camera module and that of the human eye can be smaller, and the multispectral camera module has stronger spectrum sensing capability (that is, it can sense more than 3 spectral bands), so its color restoration capability is strong and the restored colors of the acquired image are closer to the real colors. By taking the image acquired by the multispectral camera module as a reference and performing color calibration on the second camera module, the color authenticity of the output image of the second camera module can be improved.
In some embodiments, the multispectral camera module or the second camera module may be an upright module or a folded module (periscope camera module). In an upright camera module, light entering the camera module is transmitted directly to the image sensor, and the optical path is not bent. In a folded camera module, light entering the camera module is transmitted to the image sensor via optical elements such as mirrors, lenses and prisms, so the optical path is folded.
In some embodiments, camera array 103 may be disposed on a front and/or back side of electronic device 100. When the camera array 103 is disposed on the front surface of the electronic device 100, it may also be referred to as a front camera. When the camera array 103 is disposed on the back of the electronic device 100, it may also be called a rear camera. In some embodiments, when the display screen 102 is foldable, the camera array 103 may act as a front camera or a rear camera as the display screen 102 is folded. It will be appreciated that the location of the camera array 103 may be determined according to actual requirements, and that the mounting location shown in fig. 1 is merely illustrative.
In some embodiments, the electronic device 100 may further include a protective lens 104 for protecting the camera modules in the camera array 103. The protective lens 104 is disposed on the housing 101 and covers the camera module. For example, when the protective lens 104 is used to protect a front camera, the protective lens 104 may cover only the front camera module or the entire front face of the electronic device 100. When the protective lens 104 covers the entire front surface of the electronic device 100, the protective lens 104 may be used to protect the display screen 102 at the same time, and the protective lens 104 is Cover Glass (CG). For another example, when the protection lens 104 is used for protecting the rear camera, the protection lens 104 may cover the entire back surface of the electronic device 100, or may be disposed only at a position corresponding to the rear camera module.
In some embodiments, the material of the protective lens 104 may be glass, sapphire, ceramic, etc., which is not particularly limited in the present application. Illustratively, the protective lens 104 is transparent, and light outside the electronic device 100 may enter the camera module through the protective lens 104.
In some embodiments, the electronic device 100 may further include a circuit board and an image processor (not shown in the drawings), which are located in the receiving space formed by the housing 101, and the image processor is fixed to and electrically connected with the circuit board. The image processor is in communication with a camera module (e.g., a multispectral camera module or a second camera module) in the camera array 103. The image processor is used for acquiring image data from the camera module and processing the image data. The communication connection between the camera module and the image processor can comprise data transmission through electrical connection modes such as wiring and the like, and can also comprise data transmission through coupling and the like. It can be understood that the camera module and the image processor can also be in communication connection through other modes capable of realizing data transmission.
In some embodiments, the electronic device 100 may also include an analog-to-digital converter (also referred to as an A/D converter, not shown). The analog-to-digital converter is connected between the camera module and the image processor. The analog-to-digital converter is used to convert the signal generated by the camera module into a digital image signal and transmit it to the image processor; the digital image signal is then processed by the image processor, and finally the image is displayed by the display screen 102.
In some embodiments, the electronic device 100 may further include a memory (not shown) in communication with the image processor, where the image processor processes the digital image signals and then transmits the image to the memory so that the image can be retrieved from the memory and displayed on the display screen 102 at any time when the image is subsequently viewed. In some embodiments, the image processor further compresses the processed image digital signal and stores the compressed image digital signal in the memory, so as to save the memory space.
It should be understood that the configuration illustrated in fig. 1 is not intended to be limiting in detail to the electronic device 100, and the electronic device 100 may include more or less components than illustrated, for example, the electronic device 100 may further include one or more of a battery, a flash, an earpiece, a key, a sensor, etc., or the electronic device 100 may not include the display screen 102, or the electronic device 100 may be provided with a different arrangement of components than illustrated.
Fig. 2 and fig. 3 show a schematic structural diagram of a camera module according to an embodiment of the present application. Wherein fig. 2 is a schematic exploded view of the camera module 200 and fig. 3 is a schematic cross-sectional view of the camera module 200. The camera module 200 of fig. 2 may be one exemplary configuration of the multispectral camera module or the second camera module of fig. 1. The structure of the camera module 200 is briefly described below with reference to fig. 2 and 3.
For convenience of description, the optical axis direction of the camera module 200 is defined as the Z direction, and the two directions perpendicular to the optical axis are the X direction and the Y direction, with the X direction perpendicular to the Y direction. In the Z direction, the side facing the object is the front side and the side facing away from the object is the rear side; in the X and Y directions, the direction toward the optical axis is inward and the direction away from the optical axis is outward. In the embodiment of the application, the optical axis direction is the direction in which the optical system conducts light.
Here, the X, Y, Z directions and the definitions of front, rear, inner and outer are equally applicable to the respective drawings to be described later. It should be noted that, the above definitions of the direction X, Y, Z and the front, rear, inner and outer directions are only for convenience in describing the positional relationship, the connection relationship or the movement relationship between the components in the embodiment of the present application, and should not be construed as limiting the embodiment of the present application.
As shown in fig. 2 and 3, the camera module 200 may include a housing 210, a lens assembly 220, a lens group actuator 230, and a light perception assembly 240.
The housing 210 is formed with an accommodating space for accommodating the lens assembly 220, the lens group actuator 230, the light sensing assembly 240, and the like. In addition, the housing 210 may also serve as a protection and support. It will be appreciated that the configuration of the housing 210 shown in fig. 2 and 3 is merely exemplary and should not be construed as limiting the application, and that one skilled in the art may design the shape of the housing 210 accordingly as desired.
The lens assembly 220 (or optical lens) mainly includes a lens group 221 and a lens barrel 222, and the lens group 221 is accommodated in an accommodating space formed by the lens barrel 222. The lens assembly 220 is used to image an object-side subject on an imaging plane on an image side. In some embodiments, lens assembly 220 may also perform certain processing on the received imaging beam, such as correcting aberrations, eliminating chromatic aberrations, and the like. Here, the imaging beam refers to a beam formed by light incident on the camera module 200.
The lens group 221 may include at least one lens. The at least one lens may be different or at least partially identical. The number of lenses included in the lens group 221 is not particularly limited in the embodiment of the present application, and a person skilled in the art may set the number of lenses according to actual needs, for example, 1,2, 3, 5, 8 or more.
The focal length of the lens group 221 may be fixed, in which case the lens assembly 220 is a fixed-focus lens. The focal length of the lens group 221 may also be adjustable, in which case the lens assembly 220 is a zoom lens. Adjustment of the focal length of the lens group 221 may be accomplished, for example, by adjusting the relative positions of the lenses in the lens group 221.
The lens barrel 222 is formed with an accommodating space for accommodating the lens group 221. In some embodiments, the lens barrel 222 may be a single body, and the lens group 221 is accommodated in the single body of the lens barrel 222. In other embodiments, the lens barrel 222 may also include a plurality of barrel portions into which the lenses of the lens group 221 are disposed in groups, wherein each barrel portion and the lenses received therein may be referred to as a lens group. Illustratively, the relative positions between the barrel portions may be adjustable, enabling adjustment of the relative positions between the lenses to achieve optical zoom.
It should be understood that the structure of the lens barrel 222, the connection manner of the lens group 221 and the lens barrel 222 in fig. 2 and 3, and the like are merely exemplary, and are not limited in any way.
The lens group actuator 230 is used to drive the lens assembly 220 to move, so as to achieve auto-focusing and/or optical anti-shake. In some embodiments, the lens group actuator 230 may also be referred to as a lens group motor, or simply a motor.
As shown in fig. 3, the lens group actuator 230 may include a motor (hereinafter, abbreviated as AF motor for convenience of description) 231 for moving the lens assembly 220 for AF and/or a motor (hereinafter, abbreviated as OIS motor for convenience of description) 232 for moving the lens assembly 220 for OIS. Specifically, the AF motor 231 is used to move the lens assembly 220 for auto-focusing in the Z direction, and the OIS motor 232 is used to move the lens assembly 220 for optical anti-shake in the X direction and/or the Y direction. In some embodiments, the AF motor 231 and OIS motor 232 may be two separate components that independently drive the lens assembly 220 for AF and OIS, respectively. Alternatively, the AF motor 231 and the OIS motor 232 may be integrated, and the lens assembly 220 is driven by one motor to perform AF and OIS. Fig. 3 exemplarily shows that the lens group actuator 230 includes the independent AF motor 231 and OIS motor 232, but it should be understood that the embodiment of the present application is not limited thereto.
In some embodiments, the AF motor 231 or OIS motor 232 may be used to move the entire lens assembly 220, or to move portions of the lens assembly 220. For example, if a portion (e.g., the first lens group) of the lens assembly 220 is fixed relative to another portion (e.g., the second lens group) is movable, the AF motor 231 or OIS motor 232 may move the movable portion to change the optical path to achieve the desired function.
In some embodiments, the AF motor 231 or OIS motor 232 may be a Voice Coil Motor (VCM), a shape memory alloy (shape memory alloy, SMA) motor, a stepper motor (stepping motor), a piezoelectric motor (piezoelectric motor), or the like. It should be understood that the specific structure of the AF motor 231 or OIS motor 232 may be correspondingly designed and selected according to the selected driving mode, which is not limited in this embodiment of the present application.
The light sensing component 240 is disposed at the rear side of the lens component 220, and is mainly used for imaging. Illustratively, the light perception component 240 may include a filter 241, an image sensor 242, a wiring board 243.
The filter 241 is disposed between the lens assembly 220 and the image sensor 242. The filter 241 can eliminate unnecessary light projected onto the image sensor 242, and prevent the image sensor from having problems of ghosting, parasitic light, color cast, and the like at the time of imaging. Illustratively, the filter 241 may be an infrared cut filter (filtering out long wave rays other than visible light), a band pass filter (for passing light of a specific wavelength, cutting off light outside a pass band), a filter for filtering out rays of other bands, or the like. The filter 241 may have different ranges of action for different usage scenarios.
The image sensor 242 is a semiconductor chip for converting the collected external light signal into an electrical signal. Specifically, the surface of the image sensor 242 includes hundreds of thousands to millions of photodiodes, which generate electric charge when irradiated with light, thereby converting the optical signals collected by the lens assembly 220 into electrical signals. The image sensor 242 may be, for example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) device.
The image sensor 242 includes a plurality of photosensitive cells that can convert light signals into electric charges to form an electronic image corresponding to a scene. Each photosensitive unit corresponds to one pixel (pixel), and the more pixels are, the clearer the imaging effect is.
In an image sensor used by the three-channel camera module, each photosensitive unit comprises three different filtering channels, such as a red filtering channel, a green filtering channel and a blue filtering channel, or a red filtering channel, a yellow filtering channel and a blue filtering channel, for example, different filtering channels can be formed by placing 3 different color filters on the photosensitive units.
In an image sensor used by the multispectral camera module (hereinafter referred to as a multispectral image sensor for convenience of description), each photosensitive unit includes more filtering channels (the number of filtering channels is greater than 3). For example, a color filter array (CFA) can be realized by chemical dyes, thin-film interference coatings, ultra-fine surface micro-nano structures and the like to form different filtering channels. The multispectral image sensor can thus extract spectral information of multiple channels to acquire images. Since the multispectral image sensor has more filtering channels and stronger spectrum sensing capability, the colors perceived in the multispectral image are more faithful. The multispectral image sensor can acquire multispectral images through the multiple filtering channels and can transmit the acquired image information to the color restoration module for analysis, thereby realizing color imaging or color information extraction.
Illustratively, the multispectral image sensor may include a spectrum modulation region and a photoelectric conversion region. The spectrum modulation region forms a plurality of filtering channels through a plurality of materials or optical structures to split incident light, and the photoelectric conversion region converts the split optical signal into an electrical signal and outputs a digital signal or code through analog-to-digital conversion or the like. The color reproduction module (for example, an image signal processor) is electrically connected with the multispectral image sensor and can calculate spectral information or color information according to the light signals and pixel position information input by the multispectral image sensor. The color reproduction module may also convert the multispectral signals acquired by the multispectral image sensor into color space information of the image (e.g., RGB information, XYZ tristimulus values, YUV information, etc.) based on a color transformation matrix from the signals of the multiple channels to the color space. Here, the color transformation matrix from the multi-channel signals to the color space is established according to the visual response curve (such as the spectral tristimulus value curves, i.e., the response curves of the human eye to red, green and blue light) and the spectral response curve of the multispectral camera module (i.e., the spectral response of the multispectral camera module at different wavelengths). In some embodiments, the color restoration module may further perform color calibration on other cameras using the spectral information obtained by the multispectral camera module, for example, by mapping the color space information corresponding to the multispectral camera module onto other camera modules, based on the transformation relationship between the spectral response curve of the multispectral camera module and the spectral response curves of the other camera modules, so as to correct the colors of the other camera modules and make their image colors more accurate.
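As a hedged illustration of the color transformation described above: one common way to obtain such a matrix is a least-squares fit from the module's sampled spectral response to the reference tristimulus curves. The patent does not specify how the matrix is established, so the sketch below is only one plausible construction.

```python
import numpy as np

def color_transform_matrix(Q: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Least-squares C×3 matrix A such that Q @ A ≈ X, mapping the module's
    C-channel responses to the 3-channel reference (e.g., XYZ tristimulus values)."""
    A, *_ = np.linalg.lstsq(Q, X, rcond=None)
    return A                                       # shape (C, 3)

def to_color_space(signals: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Apply the transform to per-pixel C-channel signals of shape (..., C)."""
    return signals @ A                             # result shape (..., 3)
```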
In some embodiments, camera modules of different focal segments may have respective image sensors, or camera modules of different focal segments may share the same image sensor.
In some embodiments, the light perception component 240 may include an Image Signal Processing (ISP) module for processing the signals collected by the image sensor 242, such as linear correction, noise cancellation, automatic white balancing, automatic exposure control, color correction, etc., to convert raw data collected by the image sensor 242 into an algorithm-supported format. For example, the image signal processing module may be integrated with the image sensor 242.
The wiring board 243 is for transmitting electrical signals, and may be a flexible circuit board (flexible printed circuit, FPC) or a printed circuit board (printed circuit board, PCB). The image sensor 242 may be electrically connected to the wiring board 243 through a wire to achieve extraction of signals.
In some embodiments, the light perception component 240 may further include a microelectromechanical system (microelectromechanical systems, MEMS) actuator 244, the MEMS actuator 244 configured to move the image sensor 242 in the direction of the optical axis and/or in the direction perpendicular to the optical axis, thereby achieving auto-focus and/or optical anti-shake. The MEMS actuator 244 may be driven by electrostatic force, magneto-electric, piezo-electric, thermo-electric, etc. It should be appreciated that the particular configuration of MEMS actuator 244 may be correspondingly designed and selected based on the selected actuation scheme, as the application is not limited in this regard.
It should be understood that the structures illustrated in fig. 2 and 3 do not constitute a specific limitation on the camera module 200. The camera module 200 may include more or less components than illustrated, for example, the camera module 200 may also include connectors, peripheral electronics, etc., or the camera module 200 may not include the lens group actuator 230, which is not described in detail herein.
As described above, the spectrum sensing capability and the color restoration capability of the multispectral camera module are stronger than those of the three-channel camera module, so the spectral information acquired by the multispectral camera module can be used to perform color calibration on the image acquired by the three-channel camera module. However, the accuracy of current multispectral camera modules for three-channel camera module color calibration needs to be improved.
This is because the accuracy of the multispectral camera module for color calibration is affected not only by the spectral sensing capability and color restoration capability of the multispectral camera module itself, but also by the degree of similarity (or equivalence) between the spectral response curve of the multispectral camera module and that of the three-channel camera module. Specifically, because the multispectral camera module senses more spectral bands, its spectral sensing function can be converted, by dimension reduction from high dimension, to the spectral sensing function of the three-channel camera module. The higher the degree of equivalence between the two spectral response curves, the higher the accuracy of this conversion, that is, the more accurate the process of fitting the spectral sensing function of the multispectral camera module to that of the three-channel camera module by dimension reduction. Correspondingly, the three-channel camera module can more accurately acquire real color information from the multispectral camera module based on a more accurate dimension-reduction conversion matrix between the two spectral sensing functions.
For example, if the spectral response curve of the multispectral camera module is completely equivalent to the spectral response curve of the three-channel camera module, the spectral response curve of the multispectral camera module is completely identical to the spectral response curve of the three-channel camera module after the dimension is reduced from high dimension, that is, the chromaticity sensing capability of the multispectral camera module is completely adapted to the chromaticity sensing capability of the three-channel camera module, the spectral sensing function of the multispectral camera module can be linearly converted into the spectral sensing function of the three-channel camera module, accordingly, based on the dimension reduction conversion matrix between the two, the direct conversion of the chromaticity space can be performed through the linear conversion, and when the color of the multispectral camera module is accurate, the three-channel camera module can obtain the real color information.
Conversely, if the degree of similarity between the spectral response curve of the multispectral camera module and that of the three-channel camera module is low, the difference between the two curves after dimension reduction from a high dimension is large, that is, the spectral sensing function of the three-channel camera module is fitted from that of the multispectral camera module with low accuracy. Correspondingly, the three-channel camera module cannot accurately acquire the color information of the multispectral camera module based on the dimension-reduction conversion matrix between the two.
By the same principle, for the multispectral camera module itself to achieve accurate color restoration, its spectral response curve should have as high a degree of equivalence as possible with the spectral response curve of the human eye (for example, the tristimulus value curve). For example, when the spectral response curve of the multispectral camera module, after dimension reduction from a high dimension, is identical to the tristimulus value curve of the human eye, the chromaticity sensing capability of the multispectral camera module is completely matched to that of the human eye. The spectral sensing function of the multispectral camera module can then be linearly converted into the tristimulus value curve of the human eye and, based on the dimension-reduction conversion matrix between the two, the chromaticity space can be converted directly by a linear transformation to obtain completely real color information.
However, the color restoration capability of the multispectral camera module is also affected by the optical filter in the module. Differences in the transmittance of the optical filter to light make the spectral information acquired by the multispectral camera module prone to deviation when the real image information is subsequently restored, so that the color of the final image deviates significantly from what the human eye perceives and the image is less authentic.
In view of this, the embodiment of the application provides an optical filter which, when applied to a multispectral camera module, can improve the accuracy of the multispectral camera module for color calibration while ensuring its color restoration capability, thereby improving the color authenticity and consistency of images.
The optical filter provided by the embodiment of the application is applied to a first camera module, and the first camera module is a multispectral camera module. The first camera module further comprises an optical lens and an image sensor, and the optical filter is arranged between the optical lens and the image sensor. The optical lens is used for receiving light rays from a target object. The optical filter is used for filtering out light other than visible light, that is, for passing the visible light in the received light. The image sensor is a multispectral image sensor, and the number of channels of the image sensor is greater than 3.
The filter has a first transmittance curve and a second transmittance curve. The first transmittance curve is the correspondence curve between wavelength and transmittance for light normally incident on the optical filter. The second transmittance curve is the correspondence curve between wavelength and transmittance for light incident on the optical filter at a first angle, the first angle being greater than 0° and less than or equal to 35°.
On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength that is greater than or equal to 600nm and less than or equal to 660nm. On the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm.
It will be appreciated that the first transmittance curve is obtained at normal incidence, wherein the angle of incidence of the light rays incident on the filter is 0 °. The second transmittance curve is obtained under oblique incidence condition, wherein the incidence angle of the light rays entering the optical filter is a first angle. The first transmittance curve and the second transmittance curve of the optical filter can be obtained through an optical experiment or a simulation experiment, and the application is not particularly limited.
It will be appreciated that an optical filter disposed in a camera module generally has a high transmittance for visible light. Based on this characteristic, the transmittance curve of the optical filter (with wavelength on the abscissa and transmittance on the ordinate) generally shows a common pattern: at least a portion of the curve rises as the wavelength moves from the ultraviolet band into the visible band, and for convenience of description this portion is referred to as the rising edge of the transmittance curve; at least a portion of the curve falls as the wavelength moves from the visible band into the infrared band, and for convenience of description this portion is referred to as the falling edge of the transmittance curve. Within the rising edge or the falling edge there may be small local fluctuations, and such fluctuations do not affect the overall trend of the rising edge or the falling edge.
In the embodiment of the application, the first wavelength is greater than or equal to 600nm and less than or equal to 660nm and is located on the falling edge of the first transmittance curve; the difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm, so the first wavelength and the second wavelength are close to each other and the second wavelength is located on the falling edge of the second transmittance curve. Thus, it can also be understood that the first wavelength corresponding to 50% transmittance on the falling edge of the first transmittance curve is greater than or equal to 600nm and less than or equal to 660nm, and that the difference between the first wavelength corresponding to 50% transmittance on the falling edge of the first transmittance curve and the second wavelength corresponding to 50% transmittance on the falling edge of the second transmittance curve is greater than or equal to 0 and less than or equal to 20nm.
When the optical filter with the above parameters is applied to the multispectral camera module, the degree of equivalence between the spectral response curve of the multispectral camera module and the spectral response curves of the human eye and of the three-channel camera module is high. Correspondingly, after the spectral response curve of the multispectral camera module is mapped from a high dimension to a low dimension, it can respectively approach, or be equal to, the spectral response curve of the human eye and that of the three-channel camera module. Therefore, the color restoration capability of the multispectral camera module and the accuracy with which it performs color correction for the three-channel camera module can both be improved, the color cast problem can be reduced or avoided, the color authenticity of the image output by the three-channel camera module can be improved, and user experience is improved.

Specifically, at normal incidence, the range of the first wavelength at which the transmittance of the filter is 50% is approximately equivalent to the corresponding wavelength range of the filter in the three-channel camera module. This constrains the spectral response range as a whole, which helps the spectral response curve of the multispectral camera module approach, after dimension reduction from a high dimension, the spectral response curve of the human eye and that of the three-channel camera module. In addition, the difference between the first wavelength and the second wavelength is in the range of 0~20nm, so that within the incidence range of the light, the transmittance curve of the optical filter ensures that the spectral response curve of the first camera module has a high degree of similarity with the spectral response curves of the human eye and of the three-channel camera module, thereby reducing or avoiding the color cast problem and improving the color authenticity of the output image.

In addition, when the light incident on the filter surface arrives at an angle, the optical path length of the light through the filter film layer increases, so the transmittance of light of the same wavelength in the filter changes and the transmittance curve shifts with angle (toward shorter wavelengths). That is, the transmittance curve of the filter shifts as the angle of the incident light changes, and in general the larger the angle of the incident light, the larger the shift. In the embodiment of the application, the difference between the first wavelength under the normal incidence condition and the second wavelength under the oblique incidence condition is in the range of 0~20nm. Therefore, when the incidence angle of the light varies within the range of the first angle (for example, 35°), even if the transmittance curve of the optical filter shifts, the spectral response curve of the multispectral camera module still has a high degree of similarity with the spectral response curves of the human eye and of the three-channel camera module. In other words, the shift of the transmittance curve of the filter is kept within 20nm, so that the spectral response curve of the multispectral camera module remains highly similar to the spectral response curves of the human eye and of the three-channel camera module, the color cast problem is reduced or avoided, and the consistency of color restoration between the center and the edge of the image is improved.
Generally, infrared light forms a virtual image on a target surface imaged by visible light, affecting the color and quality of the image. The difference value between the first wavelength and the second wavelength is within the range of 0-20nm, so that the optical filter has a good cut-off effect on infrared light rays within the incident angle range, the imaging quality is improved, and the risk of false color is reduced.
In addition, because the difference between the first wavelength and the second wavelength is in the range of 0~20nm, the angular shift of the transmittance curve is restrained, which can also reduce the color difference (color shading) of the module. Specifically, the incidence angles of light at the center and at the edge of the filter are different, so the transmittance curve at the edge of the filter shifts relative to the transmittance curve at the center. Correspondingly, the spectral response curves at the center and at the edge of the multispectral camera module deviate from each other, and there is a color difference between the center and the edge of the image. The multispectral camera module has more spectral channels than an ordinary three-channel camera module, so additional correction would be difficult. The optical filter provided by the application has a small angular shift of the transmittance curve and can fundamentally reduce the deviation between the spectral response curves of the camera module at the center and at the edge, thereby reducing the color difference between the center and the edge of the image. Therefore, the color shading of the multispectral camera module can be reduced without additional correction, ensuring the accuracy of color restoration over the whole picture and the accuracy of the multispectral camera module when it is used for color correction.
In some embodiments, on the first transmittance curve, the wavelength corresponding to 50% transmittance includes a third wavelength that is greater than or equal to 400nm and less than or equal to 440nm. On the second transmittance curve, the wavelength corresponding to 50% transmittance includes a fourth wavelength, and a difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 20nm.
In the embodiment of the application, the third wavelength is greater than or equal to 400nm and less than or equal to 440nm, the third wavelength is located at the rising edge of the first transmittance curve, and the fourth wavelength is located at the rising edge of the second transmittance curve. Thus, it can also be understood that the third wavelength corresponding to 50% transmittance in the rising edge of the first transmission curve is greater than or equal to 400nm and less than or equal to 440nm. The difference between the third wavelength corresponding to 50% transmittance in the rising edge of the first transmittance curve and the fourth wavelength corresponding to 50% transmittance in the rising edge of the second transmittance curve is greater than or equal to 0 and less than or equal to 20nm.
The third wavelength, at which the transmittance of the filter is 50%, is in the range of 400nm~440nm, which is approximately equivalent to the corresponding wavelength range of the filter in the three-channel camera module and constrains the spectral response range as a whole. In this way, the spectral response curve of the first camera module has a high degree of similarity with the spectral response curves of the human eye and of the three-channel camera module, which improves the color restoration capability of the first camera module and the accuracy with which it performs color correction for the three-channel camera module, reduces or avoids the color cast problem, and improves the color authenticity of the image of the three-channel camera module.
The difference between the third wavelength and the fourth wavelength is in the range of 0-20nm, so that the angular offset of the transmittance curve of the optical filter can be limited, the color reduction consistency of the image center and the image edge can be improved, and the color difference between the image center and the image edge is reduced.
In the embodiment of the application, the incident angle refers to an included angle between a light ray incident on the surface of the optical filter and a normal line of the surface of the optical filter. The 50% transmittance means a position on the transmittance curve where the transmittance is 50%. The rising edge is a section in which the transmittance is rising as a whole with an increase in wavelength on the transmittance curve, for example, a section in which the transmittance is rising from a minimum value to a maximum value with an increase in wavelength, or a section in which the transmittance is rising from 5% to 80%. The falling edge is a section on the transmittance curve in which the transmittance generally decreases with an increase in wavelength, for example, a section in which the transmittance decreases from the highest value to the lowest value with an increase in wavelength, or a section in which the transmittance decreases from 80% to 5%. The band corresponding to the rising edge overlaps with the blue light band (about 435 nm-450 nm), and the band corresponding to the falling edge overlaps with the red light band (about 622 nm-760 nm).
In some embodiments, the first wavelength and the second wavelength belong to the red light band.
In some embodiments, the third wavelength and the fourth wavelength belong to the blue light band.
In some embodiments, the difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 10nm. That is, the difference between the wavelength corresponding to 50% transmittance in the first transmittance curve on the falling edge and the wavelength corresponding to 50% transmittance in the second transmittance curve on the falling edge is greater than or equal to 0 and less than or equal to 10nm.
In some embodiments, the difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 10nm. That is, the difference between the wavelength corresponding to 50% transmittance in the rising edge of the first transmittance curve and the wavelength corresponding to 50% transmittance in the rising edge of the second transmittance curve is greater than or equal to 0 and less than or equal to 10nm.
As described above, when the angle of the incident light changes, the transmittance curve of the optical filter shifts, and the difference between the first wavelength and the second wavelength is limited to be within 10nm, so that when the incident angle changes within the range of the first angle (for example, 35 °), the transmittance curve of the optical filter can satisfy that the spectral response curve of the multispectral camera module has higher similarity with the spectral response curve of the human eye and the three-way camera module even if the transmittance curve shifts, and the color reduction consistency of the center and the edge of the image is further improved. In addition, by limiting the difference range, color shading of the module can be reduced, and accuracy of color restoration of the whole picture and accuracy in color correction are further improved.
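As an illustrative aid only (the array names and the linear-interpolation scheme below are assumptions, not part of the embodiment), the first to fourth wavelengths and the above difference limits can be checked from two sampled transmittance curves roughly as follows:

```python
def edge_50(wavelengths, transmittance, rising=True):
    # Linearly interpolate the wavelength(s) where the curve crosses 50% transmittance
    # (transmittance expressed as a fraction, 0.5 = 50%). Take the first crossing for
    # the rising edge and the last for the falling edge -- a simple choice when the
    # curve fluctuates around 50%.
    crossings = []
    for i in range(len(wavelengths) - 1):
        lo, hi = transmittance[i], transmittance[i + 1]
        if (lo - 0.5) * (hi - 0.5) <= 0 and lo != hi:
            w = wavelengths[i] + (0.5 - lo) * (wavelengths[i + 1] - wavelengths[i]) / (hi - lo)
            crossings.append(w)
    return crossings[0] if rising else crossings[-1]

def check_filter(wl, t_normal, t_oblique):
    # t_normal: first transmittance curve (0 deg incidence); t_oblique: second curve (first angle).
    lam1 = edge_50(wl, t_normal, rising=False)   # first wavelength
    lam2 = edge_50(wl, t_oblique, rising=False)  # second wavelength
    lam3 = edge_50(wl, t_normal, rising=True)    # third wavelength
    lam4 = edge_50(wl, t_oblique, rising=True)   # fourth wavelength
    return {
        "lam1_in_600_660nm": 600.0 <= lam1 <= 660.0,
        "lam3_in_400_440nm": 400.0 <= lam3 <= 440.0,
        "falling_edge_shift_le_20nm": abs(lam1 - lam2) <= 20.0,
        "rising_edge_shift_le_20nm": abs(lam3 - lam4) <= 20.0,
    }
```

The shift limits of 20nm can be tightened to 10nm to check the narrower ranges described above.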
In some embodiments, on the first transmittance curve, the average transmittance in the 350nm to 390 nm band is less than or equal to 3%, and the average transmittance in the 700nm to 780nm band is less than or equal to 2%.
In some embodiments, on the second transmittance curve, the average transmittance in the 350nm to 390 nm band is less than or equal to 4% and the average transmittance in the 700nm to 780nm band is less than or equal to 3%.
The color restoration of the multispectral camera module takes the human eye as its target. The wavelength range perceived by the human eye is 400nm~700nm, and light outside this range is a meaningless signal for the multispectral camera module. In the embodiment of the application, the cut-off wavelengths of the optical filter correspond to the cut-off positions of the visual response curve of the human eye, so light outside the band sensed by the human eye can be filtered out by the optical filter. The effective information therefore has a better signal-to-noise ratio, and the image quality is improved.
In some embodiments, on the first transmittance curve, the lowest transmittance in the 440 nm-580 nm band is greater than or equal to 70%, and the average transmittance in the 440 nm-580 nm band is greater than or equal to 80%.
In some embodiments, on the second transmittance curve, the lowest transmittance in the 440nm to 580nm band is greater than or equal to 70%, and the average transmittance in the 440nm to 580nm band is greater than or equal to 80%.
The band of 440nm~580nm is the stable section of the filter transmittance, and constraining the minimum transmittance in this band prevents the filter from having large transmittance fluctuations. A design with large transmittance fluctuations in the stable section easily leads to inconsistent transmittance among different filters in actual production, which affects the consistency of multispectral camera modules in application and thus the consistency of the resulting colors. Constraining the average transmittance in the 440nm~580nm band improves image quality, because an excessively low transmittance reduces the overall sensitivity of the multispectral camera module, so that the camera needs a longer exposure time in the same environment, which easily produces smear and blur and degrades image quality.
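Likewise for illustration only (the threshold values are taken from the embodiments above; the function names and sampled-curve representation are assumptions), the band-average and band-minimum constraints can be checked as:

```python
import numpy as np

def band_stats(wl, t, lo, hi):
    # Mean and minimum transmittance within [lo, hi] nm; t as a fraction (0.8 = 80%).
    wl = np.asarray(wl)
    t = np.asarray(t)
    band = t[(wl >= lo) & (wl <= hi)]
    return band.mean(), band.min()

def check_bands(wl, t, oblique=False):
    # oblique=False checks the first transmittance curve, oblique=True the second one.
    uv_avg, _ = band_stats(wl, t, 350, 390)
    ir_avg, _ = band_stats(wl, t, 700, 780)
    vis_avg, vis_min = band_stats(wl, t, 440, 580)
    return {
        "uv_avg_ok": uv_avg <= (0.04 if oblique else 0.03),
        "ir_avg_ok": ir_avg <= (0.03 if oblique else 0.02),
        "visible_min_ok": vis_min >= 0.70,
        "visible_avg_ok": vis_avg >= 0.80,
    }
```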
In some embodiments, the material of the filter may be resin or glass. The optical filter in the foregoing embodiment of the present application may be obtained by adjusting the thickness dimension of the optical filter, the type and/or thickness of the plating film, adding an organic material having a specific spectral absorption characteristic to the optical filter, and other various manners, and the present application is not limited to a specific implementation manner of the optical filter.
For further understanding, fig. 4 to 9 respectively show transmittance curve diagrams of several filters according to the embodiments of the present application.
Referring to fig. 4 to 9, a first wavelength corresponding to 50% transmittance in a falling edge of the first transmittance curve is denoted by λ1, and a third wavelength corresponding to 50% transmittance in a rising edge of the first transmittance curve is denoted by λ3. The second wavelength corresponding to 50% transmittance in the falling edge of the second transmittance curve is represented by λ2, and the fourth wavelength corresponding to 50% transmittance in the rising edge of the second transmittance curve is represented by λ4.
For example, referring to fig. 4 to 9, the approximate values of the corresponding parameters are shown in table 1 below.
TABLE 1
It is understood that, on the first transmittance curve, if the falling edge fluctuates around 50% transmittance, the wavelength corresponding to 50% transmittance may include a plurality of wavelengths in the 600nm~660nm range, and the first wavelength λ1 may be any one of them, for example the shortest, the longest, or approximately the central one of the plurality of wavelengths in the 600nm~660nm range. Likewise, on the first transmittance curve, if the rising edge fluctuates around 50% transmittance, the wavelength corresponding to 50% transmittance may include a plurality of wavelengths in the 400nm~440nm range, and the third wavelength λ3 may be any one of them, for example the shortest, the longest, or approximately the central one of the plurality of wavelengths in the 400nm~440nm range.

Similarly, on the second transmittance curve, if the falling edge fluctuates around 50% transmittance, the wavelength corresponding to 50% transmittance may include a plurality of wavelengths whose differences from the first wavelength λ1 satisfy the requirement, and the second wavelength λ2 may be any one of them, for example the one farthest from or closest to the first wavelength λ1. On the second transmittance curve, if the rising edge fluctuates around 50% transmittance, the wavelength corresponding to 50% transmittance may include a plurality of wavelengths whose differences from the third wavelength λ3 satisfy the requirement, and the fourth wavelength λ4 may be any one of them, for example the one farthest from or closest to the third wavelength λ3.
It has been verified that, with the optical filters shown in fig. 4 to 9, the spectral response curve of the multispectral camera module has a high degree of similarity with the spectral response curves of the human eye and of the three-channel camera module.
The effect of the optical filter provided by the application can be verified by means of experimental measurement, simulation and the like.
For example, in a manner based on experimental measurement, the spectral response curve of the multispectral camera module (including the optical filter provided by the application), hereinafter referred to as the first spectral response curve for convenience of description, and the spectral response curve of the three-channel camera module, hereinafter referred to as the second spectral response curve, can be measured experimentally. The spectral response curve of the human eye, hereinafter referred to as the third spectral response curve, may adopt an existing spectral tristimulus value curve. The color information of the target object captured by the multispectral camera module can be obtained based on the transformation relation between the first and third spectral response curves, and the color restoration capability of the multispectral camera module can be judged by human observation. The color space information corresponding to the multispectral camera module can be mapped onto the three-channel camera module based on the transformation relation between the first and second spectral response curves; whether the image presented by the three-channel camera module has a color cast can then be judged by human observation, which in turn indicates the accuracy of the color correction provided by the multispectral camera module.
For example, in a simulation-based approach, the first spectral response curve may be acquired through simulation, and the second spectral response curve may be acquired through simulation or experiment. The third spectral response curve may be a spectral tristimulus value curve. The dimension-reduction fitting accuracy between the first and second spectral response curves, and between the first and third spectral response curves, can be calculated by computer simulation. The color restoration capability of the multispectral camera module can be judged from the dimension-reduction fitting accuracy between the first and third spectral response curves, and the accuracy of the multispectral camera module in performing color correction for the three-channel camera module can be judged from the dimension-reduction fitting accuracy between the first and second spectral response curves.
For ease of understanding, the effect of the filter provided by the application can also be verified through formula derivation, which is described in detail below.
In some embodiments, the degree of equivalence (or similarity) between the spectral response curves of two camera modules (the human eye can be regarded as a special three-channel camera module) can be described by the Vora-Value. The Vora-Value is a number between 0 and 1; when the Vora-Value is 1, the spectral response curves of the two camera modules are completely equivalent, and the spectral sensing functions (or spectral response matrices) of the two camera modules can be converted into each other linearly. For ease of quantification and expression, the Vora-Value is denoted below as v, with different superscripts or subscripts for different comparison objects.
In some embodiments, the similarity between the spectral response curve of the multispectral camera module and the spectral response curve of the human eye is denoted by v_0, which may specifically be:

v_0 = tr( P{Q_N×C} · P{X̄_N×3} ) / 3    (1)

In formula (1):

Q_N×C is the spectral response matrix of the multispectral camera module, obtained based on the spectral response curve of the multispectral camera module;

X̄_N×3 is the spectral response matrix of the human eye, obtained based on the spectral response curve of the human eye;

P{ } is an operator: for any matrix A, P{A} = A[AᵀA]⁻¹Aᵀ;

P{Q_N×C} = Q_N×C[Q_N×CᵀQ_N×C]⁻¹Q_N×Cᵀ, of order N×N;

P{X̄_N×3} = X̄_N×3[X̄_N×3ᵀX̄_N×3]⁻¹X̄_N×3ᵀ, of order N×N;

tr() is the trace operation on a square matrix, i.e., the sum of all elements on the main diagonal of the square matrix;

N is the number of sampling points, and C is the number of channels of the multispectral camera module.
Because the multispectral camera module has more channels, its spectral response can be converted, by dimension reduction from a high dimension, to approach the spectral response curve of the human eye. When the spectral response curve of the multispectral camera module is suitable, v_0 can be made as close to 1 as possible. Correspondingly, in the color restoration process, the conversion of the chromaticity space between the multispectral camera module and the human eye becomes more and more accurate, which means that the multispectral camera module has better color restoration capability and can reduce or avoid the color cast problem.

When v_0 = 1, the spectral response curve of the multispectral camera module is completely equivalent to that of the human eye, the chromaticity sensing capability of the multispectral camera module is completely matched to that of the human eye, and the chromaticity space can be converted directly by a linear transformation (such as matrix addition and multiplication). Correspondingly, the multispectral camera module can accurately restore the colors seen by the human eye without color cast.

For the existing three-channel camera module, because of the limitations of the filter film material, its spectral response curve cannot be exactly the same as that of the human eye; the similarity between the two is usually less than 0.9, which means that the color restoration capability of the existing three-channel camera module is weak and certain colors are prone to color cast.

In the embodiment of the application, v_0 can be calculated using formula (1), and the color restoration capability of the multispectral camera module can be judged from the value of v_0. For example, if v_0 is greater than or equal to a certain threshold (e.g., 0.93), the spectral response curve of the multispectral camera module can be brought as close as possible to the spectral response curve of the human eye after dimension reduction from a high dimension, that is, the color restoration capability of the multispectral camera module is high.
In some embodiments, the similarity between the spectral response curve of the multispectral camera module and the spectral response curve of the three-channel camera module is denoted by v′, which may specifically be:

v′ = tr( P{Q_N×C} · P{X′_N×3} ) / 3    (2)

In formula (2):

Q_N×C is the spectral response matrix of the multispectral camera module, obtained based on the spectral response curve of the multispectral camera module;

X′_N×3 is the spectral response matrix of the three-channel camera module, obtained based on the spectral response curve of the three-channel camera module;

P{ } is an operator: for any matrix A, P{A} = A[AᵀA]⁻¹Aᵀ;

P{Q_N×C} = Q_N×C[Q_N×CᵀQ_N×C]⁻¹Q_N×Cᵀ, of order N×N;

P{X′_N×3} = X′_N×3[X′_N×3ᵀX′_N×3]⁻¹X′_N×3ᵀ, of order N×N;

tr() is the trace operation on a square matrix, i.e., the sum of all elements on the main diagonal of the square matrix;

N is the number of sampling points, and C is the number of channels of the multispectral camera module.
For example, if the camera device or electronic apparatus includes i three-channel camera modules, different three-channel camera modules have their own spectral response curves and spectral response matrices. To distinguish the spectral response matrices of different three-channel camera modules, for i = 1, 2, 3, …, n, where n is an integer greater than or equal to 1, the spectral response matrix of the i-th three-channel camera module may be denoted X′⁽ⁱ⁾_N×3, where different values of i represent different three-channel camera modules. For example, taking a triple-camera scenario as an example, i = 1, 2, 3 may represent a wide-angle camera module, a tele camera module and an ultra-wide-angle camera module, respectively, with corresponding spectral response matrices X′⁽¹⁾_N×3, X′⁽²⁾_N×3 and X′⁽³⁾_N×3. Similarly, the similarity v′ between the spectral response curve of the multispectral camera module and that of each three-channel camera module may be denoted v_1, v_2 and v_3, respectively.
Because the multispectral camera module has more channels, its spectral response can be converted, by dimension reduction from a high dimension, to approach the spectral response curve of the three-channel camera module. When the spectral response curve of the multispectral camera module is suitable, v′ can be made as close to 1 as possible. Correspondingly, in the color correction process, the multispectral camera module can be mapped from a high dimension onto the three-channel camera module more accurately, so that the three-channel camera module can accurately acquire real color information and the color cast problem is avoided.

When v′ = 1, the spectral response curve of the multispectral camera module is completely equivalent to that of the three-channel camera module, the chromaticity sensing capability of the multispectral camera module is completely matched to that of the three-channel camera module, and the chromaticity space can be converted directly by a linear transformation. Correspondingly, the three-channel camera module can accurately acquire the colors of the multispectral camera module, and when the colors of the multispectral camera module are accurate, the three-channel camera module will not exhibit color cast.

In the embodiment of the application, v′ can be calculated using formula (2), and the color correction capability of the multispectral camera module can be judged from the value of v′. For example, if v′ is greater than or equal to a certain threshold (e.g., 0.93), the spectral response curve of the multispectral camera module can be brought as close as possible to the spectral response curve of the three-channel camera module after dimension reduction from a high dimension, that is, the accuracy of the multispectral camera module in performing color correction for the three-channel camera module is high.
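For an intuitive illustration only, the following is a minimal numerical sketch of how v_0 in formula (1) and v′ in formula (2) could be evaluated from sampled spectral response matrices; the function and variable names are illustrative rather than part of the embodiment, and the matrices are assumed to have full column rank.

```python
import numpy as np

def projector(A):
    # P{A} = A [A^T A]^-1 A^T, the projection operator used in formulas (1) and (2);
    # assumes A (N x k) has full column rank.
    return A @ np.linalg.inv(A.T @ A) @ A.T

def vora_value(Q, X):
    # Q: N x C spectral response matrix of the multispectral camera module.
    # X: N x 3 reference matrix (human eye for v_0, a three-channel module for v').
    # Returns a value in [0, 1]; 1 means the two response curves are fully equivalent.
    return float(np.trace(projector(Q) @ projector(X)) / 3.0)
```

With this sketch, v_0 = vora_value(Q, X_eye) and v_i = vora_value(Q, X_i) for each three-channel module; comparing these values against a threshold such as 0.93 corresponds to the judgments described above.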
Q_N×C in formula (1) and formula (2) can be obtained in several ways, as follows.

In one example, the spectral response curve of the multispectral camera module may be obtained first by directly measuring or simulating the whole module. The multispectral camera module has spectral response curves for its C channels, and the spectral response curve of each channel already reflects the influence of the optical lens, the optical filter and the image sensor. Then, based on the spectral response curve of the multispectral camera module, N points are sampled on the spectral response curve of each of the C channels to obtain Q_N×C.

In another example, instead of acquiring the spectral response curve of the multispectral camera module, the response value of each channel to light of each wavelength may be acquired directly by measuring or simulating the whole module with light of discrete wavelengths, thereby obtaining Q_N×C.
In yet another example, Q_N×C may be obtained based on the following formula (3):

Q_N×C = M_N×C ⊙ T_N×1 ⊙ L_N×1    (3)

In formula (3):

M_N×C is a matrix obtained by N-point sampling of each of the C channels based on the spectral response curve of the image sensor;

T_N×1 is a matrix obtained by N-point sampling based on the transmittance curve of the optical filter;

L_N×1 is a matrix obtained by N-point sampling based on the transmittance curve of the optical lens;

⊙ denotes the dot product (element-wise multiplication).
In the embodiment of the application, the multispectral camera module comprises an optical lens, an optical filter and an image sensor. The optical lens is used for focusing light from the target object onto the image sensor, and the optical filter is located between the optical lens and the image sensor to perform optical filtering. The image sensor is a multispectral image sensor having a plurality of response channels (i.e., a plurality of filtering channels, also simply called channels) and can respond to light in a continuous wavelength range; each channel has a corresponding spectral response curve, and the set of the spectral response curves of the channels is the spectral response curve corresponding to the image sensor. The optical lens and the optical filter are both optical elements that affect the spectrum reaching the image sensor. Therefore, the spectral response characteristic of the multispectral camera module needs to comprehensively consider the transmittance curve of the optical lens, the transmittance curve of the optical filter and the spectral response curve of the image sensor.
In the embodiment of the application, the transmittance curve of the optical lens is a corresponding relation curve between the transmittance of the optical lens and the wavelength, the transmittance curve of the optical filter is a corresponding relation curve between the transmittance of the optical filter and the wavelength, and the spectral response curve of the image sensor is a corresponding relation curve between the spectral response of the image sensor and the wavelength. The transmittance curve of the optical lens, the transmittance curve of the optical filter and the spectral response curve of the image sensor are respectively measured. The spectral response curve of the multispectral camera module can be indirectly obtained through the transmittance curve of the optical lens, the transmittance curve of the optical filter and the spectral response curve of the image sensor.
It is understood that the spectral response curve of the image sensor includes a spectral response curve corresponding to each of the C channels, which can be said to be a set of the spectral response curves of the C channels.
In some embodiments, the transmittance matrix L of the optical lens, of order N×1 (i.e., N rows and 1 column), that is, L_N×1 in formula (3), may be obtained by discretely sampling N points on the transmittance curve of the optical lens. The transmittance matrix T of the optical filter, of order N×1, that is, T_N×1 in formula (3), may be obtained by discretely sampling N points on the transmittance curve of the optical filter. The spectral response matrix M of the image sensor, of order N×C (i.e., N rows and C columns), that is, M_N×C in formula (3), may be obtained by discretely sampling N points on the spectral response curve corresponding to each of the C channels of the image sensor.
Here, N sampling points selected when the transmittance curve of the optical filter is discretely sampled, N sampling points selected when the transmittance curve of the optical lens is discretely sampled, and N sampling points selected when each channel of the image sensor is discretely sampled are the same, that is, sampling positions are the same. For example, every 5 nanometers is a sampling point within a preset wavelength range.
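As an illustrative sketch only (the sampling grid, the curve representation and the function names below are assumptions, not part of the embodiment), formula (3) can be evaluated by sampling all curves at the same N wavelengths and combining them with an element-wise product:

```python
import numpy as np

def module_response(wl_grid, sensor_curves, filter_curve, lens_curve):
    # wl_grid: the N shared sampling wavelengths (e.g. one point every 5 nm).
    # sensor_curves: list of C (wavelengths, responses) pairs, one per sensor channel.
    # filter_curve, lens_curve: (wavelengths, transmittances) pairs.
    # Returns Q_N×C = M_N×C ⊙ T_N×1 ⊙ L_N×1.
    M = np.column_stack([np.interp(wl_grid, w, r) for (w, r) in sensor_curves])  # N x C
    T = np.interp(wl_grid, *filter_curve)[:, None]                               # N x 1
    L = np.interp(wl_grid, *lens_curve)[:, None]                                 # N x 1
    return M * T * L  # broadcasting applies T and L to every channel
```

The same sampling grid would be reused when building X′_N×3 for a three-channel camera module or X̄_N×3 for the human eye, so that all matrices in formulas (1) and (2) share the same N sampling positions.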
X′_N×3 in formula (2) can be obtained in several ways, as follows.

In one example, the spectral response curve of the three-channel camera module may be obtained first by directly measuring or simulating the whole module. The three-channel camera module has spectral response curves for its 3 channels, and the spectral response curve of each channel already reflects the influence of the optical lens, the optical filter and the image sensor in the module. Then, based on the spectral response curve of the three-channel camera module, N points are sampled on the spectral response curve of each of the three channels to obtain X′_N×3.

In another example, instead of acquiring the spectral response curves of the three-channel camera module, the response value of each channel to light of each wavelength may be acquired directly by measuring or simulating the whole module with light of discrete wavelengths, thereby obtaining X′_N×3.

In yet another example, similar to formula (3), X′_N×3 may be obtained as the dot product of the transmittance matrix of the optical lens in the three-channel camera module (acquired by N-point sampling of the transmittance curve of the optical lens), the transmittance matrix of the optical filter in the three-channel camera module (acquired by N-point sampling of the transmittance curve of the optical filter), and the spectral response matrix of the image sensor in the three-channel camera module (acquired by N-point sampling of each channel of the spectral response curve of the image sensor). The transmittance curve of the optical lens, the transmittance curve of the optical filter and the spectral response curve of the image sensor in the three-channel camera module are measured separately.
It can be understood that the spectral response curve of the image sensor of the three-channel camera module includes a spectral response curve corresponding to each of the 3 channels, which can be said to be a set of spectral response curves of the 3 channels. In addition, when the corresponding curve is discretely sampled, the selected N sampling points are the same, that is, the sampling positions are the same.
The human eye can be regarded as a special three-channel camera module, and the spectral response matrix of the human eye, X̄_N×3, of order N×3 (i.e., N rows and 3 columns), can be obtained by discretely sampling the spectral response curve corresponding to the human eye. X̄_N×3 may also be denoted X_0. Specifically, X̄_N×3 may be acquired by N-point sampling of each of the three channels of the spectral response curve of the human eye; that is, the spectral response curve of the human eye comprises three channel response curves, and X̄_N×3 is acquired by N-point sampling on each of the three curves.
It should be noted that, the sampling position selected when the spectral response curve of the three-channel camera module and the spectral response curve of the human eye are sampled is the same as the sampling position selected when the multispectral camera module is sampled. For example, every 5 nanometers is a sampling point within the same wavelength range.
Referring still to fig. 4 to 9, the similarity between the spectral response curves of the multispectral camera module and the spectral response curves of the human eye and the three-way camera module is calculated according to the formula (1) and the formula (2), respectively, as follows.
The transmittance curves of the filters shown in fig. 4 to 9 each include a transmittance curve at AOI = 0° and a transmittance curve at AOI = 35°. Taking i = 1, 2, 3 in formula (2) as representing the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module, with corresponding spectral response matrices X′⁽¹⁾_N×3, X′⁽²⁾_N×3 and X′⁽³⁾_N×3, the similarities calculated according to formula (2) may be denoted v_1, v_2 and v_3, respectively. The spectral response matrix of the human eye is X̄_N×3, and the similarity calculated according to formula (1) may be denoted v_0.
It has been verified that, with the transmittance curve of the optical filter (made of glass) shown in fig. 4, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9933; v_1 = 0.9935; v_2 = 0.9960; v_3 = 0.9926.

At AOI = 35°: v_0 = 0.9835; v_1 = 0.9950; v_2 = 0.9891; v_3 = 0.9882.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.98 over the incidence angle range of the filter.
It has been verified that, with the transmittance curve of the optical filter shown in fig. 5, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9799; v_1 = 0.9958; v_2 = 0.9866; v_3 = 0.9893.

At AOI = 35°: v_0 = 0.9743; v_1 = 0.9933; v_2 = 0.9809; v_3 = 0.9774.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.97 over the incidence angle range of the filter.
It has been verified that, with the transmittance curve of the optical filter shown in fig. 6, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9786; v_1 = 0.9911; v_2 = 0.9904; v_3 = 0.9939.

At AOI = 35°: v_0 = 0.9697; v_1 = 0.9941; v_2 = 0.9798; v_3 = 0.9839.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.96 over the incidence angle range of the filter.
It has been verified that, with the transmittance curve of the optical filter (made of resin) shown in fig. 7, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9841; v_1 = 0.9865; v_2 = 0.9942; v_3 = 0.9899.

At AOI = 35°: v_0 = 0.9768; v_1 = 0.9952; v_2 = 0.9845; v_3 = 0.9880.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.97 over the incidence angle range of the filter.
It has been verified that, with the transmittance curve of the optical filter shown in fig. 8, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9824; v_1 = 0.9872; v_2 = 0.9933; v_3 = 0.9907.

At AOI = 35°: v_0 = 0.9744; v_1 = 0.9947; v_2 = 0.9832; v_3 = 0.9828.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.97 over the incidence angle range of the filter.
It has been verified that, with the transmittance curve of the optical filter shown in fig. 9, the Vora values between the multispectral camera module and the human eye, the wide-angle camera module, the tele camera module and the ultra-wide-angle camera module are as follows:

At AOI = 0°: v_0 = 0.9799; v_1 = 0.9958; v_2 = 0.9866; v_3 = 0.9893.

At AOI = 35°: v_0 = 0.9727; v_1 = 0.9892; v_2 = 0.9768; v_3 = 0.9690.

It can be seen that v_0, v_1, v_2 and v_3 are all greater than 0.96 over the incidence angle range of the filter.
It can be seen from the value of v_0 that, after dimension reduction from a high dimension, the spectral response curve of the multispectral camera module can be made as close as possible to the spectral response curve of the human eye, which improves the color restoration capability of the multispectral camera module, reduces or avoids the color cast problem, and improves the color authenticity of the image output by the multispectral camera module.

It can be seen from the values of v_1, v_2 and v_3 that, after dimension reduction from a high dimension, the spectral response curve of the multispectral camera module can be made as close as possible to the spectral response curve of each three-channel camera module, which improves the accuracy of the multispectral camera module in correcting the colors of the three-channel camera modules, reduces or avoids the color cast problem, and improves the color authenticity of the images output by the three-channel camera modules.

It is difficult for existing three-channel camera modules to achieve consistent spectral response curves, so different camera modules produce color differences between images of the same scene, and the poor color consistency affects the experience when switching between cameras. In the application, different three-channel camera modules all correspond to a good Vora-Value, that is, the spectral response curve of the multispectral camera module can be made as close as possible, after dimension reduction from a high dimension, to the spectral response curve of each three-channel camera module. In this way, when the color information of the multispectral camera module is applied uniformly to the different three-channel camera modules, all of them are aligned to the same standard, that is, all of them acquire color information from the multispectral camera module, and the color consistency of the images output by the plurality of three-channel camera modules can be improved.

In addition, within the incidence angle range of the optical filter, a high Vora value is maintained even if the transmittance curve of the optical filter shifts. Therefore, the deviation between the spectral response curves at the center and at the edge of the multispectral camera module can be reduced, and the accuracy of color restoration over the whole picture is ensured.
The embodiment of the application also provides a camera module, which comprises an optical lens, an image sensor and the optical filter, wherein the optical filter is arranged between the optical lens and the image sensor, the optical lens is used for receiving light rays from a target object, the optical filter is used for passing visible light in the light rays, and the image sensor is used for receiving the visible light to acquire spectrum information of the target object, and the number of channels of the image sensor is more than 3.
The camera module is a multispectral camera module, and can also be abbreviated as multispectral module.
It can be appreciated that the number of channels of the camera module is determined by the number of channels of the image sensor. Specifically, the optical lens and the optical filter each provide only a single optical channel, so the number of channels of the camera module is the same as that of the image sensor. Correspondingly, since the number of channels of the image sensor is greater than 3, the number of channels of the camera module is also greater than 3; for example, if the number of channels is C, the camera module is a C-channel camera.
In some embodiments, the filter is coated with an infrared cut-off film.
The embodiment of the application also provides an image pickup device, which comprises a first camera module, wherein the first camera module comprises an optical lens, an optical filter and an image sensor which are sequentially arranged along the optical axis direction, the optical lens is used for receiving light rays from a target object, the optical filter is used for passing visible light in the light rays, the image sensor is used for receiving the visible light to acquire spectrum information of the target object, and the channel number of the image sensor is more than 3. The filter is the filter in the previous embodiment.
In some embodiments, the camera device further comprises at least one second camera module. The second camera module is a three-channel camera and the first camera module is a C-channel camera, C being greater than 3. The similarity v_0 between the spectral response curve of the first camera module and the spectral response curve of the human eye is greater than or equal to 0.93, and the similarity v′ between the spectral response curve of the first camera module and the spectral response curve of the second camera module is greater than or equal to 0.93, wherein:

v_0 = tr( P{Q_N×C} · P{X̄_N×3} ) / 3

v′ = tr( P{Q_N×C} · P{X′_N×3} ) / 3

Q_N×C is the spectral response matrix obtained by N-point sampling of each of the C channels based on the spectral response curve of the first camera module, and P{Q_N×C} = Q_N×C[Q_N×CᵀQ_N×C]⁻¹Q_N×Cᵀ;

X̄_N×3 is the spectral response matrix obtained by N-point sampling of each of the three channels based on the spectral response curve of the human eye, and P{X̄_N×3} = X̄_N×3[X̄_N×3ᵀX̄_N×3]⁻¹X̄_N×3ᵀ;

X′_N×3 is the spectral response matrix obtained by N-point sampling of each of the three channels based on the spectral response curve of the second camera module, and P{X′_N×3} = X′_N×3[X′_N×3ᵀX′_N×3]⁻¹X′_N×3ᵀ;

tr() is the trace.
For the specific calculation of v 0 and v' and the relevant content of each parameter, reference may be made to the descriptions of equation (1) and equation (2), and details thereof will not be given here for brevity.
In some embodiments, Q N×C may be obtained by measuring the spectral response curve of the entire first camera module and then performing discrete sampling on the spectral response curve of the module.
In other embodiments, Q_N×C = M_N×C ⊙ T_N×1 ⊙ L_N×1, wherein:
m N×C is a matrix obtained by respectively carrying out N-point sampling on each of the C channels based on the spectral response curve of the image sensor;
T N×1 is a matrix obtained by N-point sampling based on a transmittance curve of the optical filter;
L N×1 is a matrix obtained by sampling N points based on the transmittance curve of the optical lens.
That is, after the spectral response curve of the image sensor, the transmittance curve of the optical filter and the transmittance curve of the optical lens are measured respectively, discrete sampling is performed to obtain M_N×C, T_N×1 and L_N×1, and Q_N×C is then obtained as the dot product of the three.
For the specific calculation of Q N×C and the description of the related parameters with reference to equation (3), details are not described here for brevity.
In some embodiments, X′_N×3 may be obtained by measuring the spectral response curve of the entire second camera module and then discretely sampling the spectral response curve of the module. Alternatively, the spectral response curve of the image sensor, the transmittance curve of the optical filter and the transmittance curve of the optical lens in the second camera module may be measured separately and discretely sampled to obtain the corresponding matrices, and X′_N×3 may then be obtained as the dot product of the three matrices.
In some embodiments, X̄_N×3 may be obtained by discrete sampling of the spectral response curve of the human eye.
In some embodiments, the second camera module may be a tele camera module, a wide-angle camera module, or an ultra-wide-angle camera module. If the camera device comprises a plurality of second camera modules, the focal segments of the plurality of second camera modules may be the same or different. Illustratively, the plurality of second camera modules may together cover the full range of focal segments.
In the embodiment of the application, the second camera module also comprises an optical filter, for example an optical filter used for filtering out bands other than visible light; correspondingly, the second camera module can output visible-light images.
The embodiment of the application also provides electronic equipment, which comprises an image processing chip and the multispectral camera module (namely the first camera module) described in the foregoing embodiments, wherein the image processing chip is used for processing the image acquired by the multispectral camera module.
The embodiment of the application also provides electronic equipment, which comprises an image processing chip and the image pickup device related to the embodiment, wherein the image processing chip is used for processing the image acquired by the image pickup device.
The image processing chip may be, for example, the image processor mentioned in the foregoing embodiment.
In some embodiments, the image capturing device further includes at least one second camera module, the second camera module is a three-channel camera, and the image processing chip is configured to correct the color of the target object acquired by the second camera module based on the spectral information of the target object acquired by the multispectral camera module.
Fig. 10 is a schematic flowchart of a design method of a camera module according to an embodiment of the present application. The number of color channels perceived by the multi-spectral camera module (hereinafter also referred to as a first camera module) designed by the method 400 shown in fig. 10 is greater than the number of color channels perceived by the three-channel camera module (hereinafter also referred to as a second camera module). For example, a multispectral camera module has a number of channels greater than 3. The method 400 includes steps S410 to S420, which are described below with reference to the accompanying drawings.
S410, acquiring an initial spectral response matrix of the multispectral camera module, a spectral response matrix of a human eye and a spectral response matrix of each second camera module in the at least one second camera module.
S420, optimizing an initial spectral response matrix of the multispectral camera module based on the optimization target, and obtaining a target spectral response matrix of the multispectral camera module.
In the embodiment of the application, the multispectral camera module is a C-channel camera, where C is an integer greater than 3. Illustratively, on the image sensor of the multispectral camera module, each photosensitive unit includes C distinct filter channels, with C > 3. That is, the number of channels of the image sensor of the multispectral camera module is greater than 3.
In the embodiment of the application, the multispectral camera module comprises an optical lens, an optical filter and an image sensor. Both the optical lens and the optical filter are optical elements that affect the spectrum reaching the image sensor. Therefore, the spectral response characteristic of the multispectral camera module needs to comprehensively take into account the transmittance curve of the optical lens, the transmittance curve of the optical filter, and the spectral response curve of the image sensor.
The spectral response characteristic of the entire multispectral camera module can be represented by a spectral response matrix Q_{N×C} (also written as Q), specifically Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1}.
Here M_{N×C} is a matrix obtained by sampling N points on each of the C channels of the spectral response curve of the image sensor, T_{N×1} is a matrix obtained by sampling N points on the transmittance curve of the optical filter, L_{N×1} is a matrix obtained by sampling N points on the transmittance curve of the optical lens, and ⊙ denotes the dot (element-wise) product.
It is understood that the spectral response curve of the image sensor includes a spectral response curve corresponding to each of the C channels, which can be said to be a set of the spectral response curves of the C channels.
It can be seen that by adjusting at least one of the spectral response curve of the image sensor (corresponding to M_{N×C}), the transmittance curve of the optical filter (corresponding to T_{N×1}), and the transmittance curve of the optical lens (corresponding to L_{N×1}), the spectral response characteristic of the multispectral camera module (corresponding to Q_{N×C}) can be adjusted.
For example, taking optimization of the filter's transmittance curve as an example: before optimization, the filter has an initial transmittance matrix T'_{N×1} and the multispectral camera module has an initial spectral response matrix Q'_{N×C}, where Q'_{N×C} = M_{N×C} ⊙ T'_{N×1} ⊙ L_{N×1}. By optimizing the transmittance curve of the filter, the filter obtains a target transmittance matrix T''_{N×1}, and the multispectral camera module obtains a target spectral response matrix Q''_{N×C}, where Q''_{N×C} = M_{N×C} ⊙ T''_{N×1} ⊙ L_{N×1}. The spectral response matrix of the image sensor and the transmittance matrix of the optical lens remain unchanged.
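Continuing the same hypothetical setup, the Hadamard product Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1} maps directly onto NumPy broadcasting, and re-optimizing only the filter amounts to swapping in a new T column while M and L stay fixed; the shapes and random curves below are illustrative assumptions, not measured data.

```python
import numpy as np

N, C = 31, 8                              # hypothetical grid size and channel count (C > 3)
rng = np.random.default_rng(0)

M = rng.uniform(0.0, 1.0, (N, C))         # sampled sensor responses, one column per channel
L = rng.uniform(0.90, 0.98, (N, 1))       # sampled lens transmittance
T_init = rng.uniform(0.2, 1.0, (N, 1))    # initial filter transmittance T'
T_target = np.clip(T_init + rng.normal(0, 0.05, (N, 1)), 0.0, 1.0)  # optimized T''

def module_response(M, T, L):
    # Element-wise (Hadamard) product; (N, 1) columns broadcast across the C channels.
    return M * T * L

Q_init = module_response(M, T_init, L)      # Q'  before optimization
Q_target = module_response(M, T_target, L)  # Q'' after optimizing only the filter
```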
In the embodiment of the application, the second camera module is a three-channel camera. It can be understood that each photosensitive unit on the image sensor of the second camera module comprises 3 different filter channels, for example red, green, and blue filter channels or red, yellow, and blue filter channels. Each filter channel corresponds to a spectral response curve. That is, the number of channels of the image sensor of the second camera module is 3.
In the embodiment of the application, similar to the multispectral camera module, the human eye and the second camera module also have corresponding spectral response curves (or spectral response characteristics). The spectral response curves corresponding to the human eye comprise the response curves of the human eye to red, green, and blue light, and the spectral response curves corresponding to the second camera module comprise the response curves corresponding to three different wavelength bands (such as red, green, and blue light).
The method 400 treats the human eye as a special three-channel camera module. Let i = 0, 1, 2, 3, ..., n, where n is an integer greater than or equal to 1; i = 0 represents the human eye, and i = 1, 2, 3, ..., n represent the different second camera modules. The spectral response matrix X_i of the i-th three-channel camera module can be obtained by discretely sampling its spectral response curve, and its order is N×3 (i.e., N rows and 3 columns). That is, the spectral response matrix of the human eye is expressed as X_0, and the spectral response matrices of the different second camera modules are expressed as X_1, X_2, ..., X_n, respectively.
In the embodiment of the present application, there are various ways of obtaining these spectral response matrices, such as the X'_{N×3} described in connection with formula (2) above, and the present application is not limited in this regard.
Based on the above description, in step S410, the initial spectral response matrix Q'_{N×C} of the multispectral camera module, the spectral response matrix of the human eye, and the spectral response matrix of each second camera module may be obtained. Q'_{N×C} is specifically obtained by dot-multiplying the transmittance matrix L_{N×1} of the optical lens, the transmittance matrix T_{N×1} of the optical filter, and the spectral response matrix M_{N×C} of the image sensor.
For example, the separately obtained L_{N×1}, T_{N×1}, M_{N×C}, and X_i may be input directly into the device performing the method 400. Illustratively, L_{N×1}, M_{N×C}, T_{N×1}, or X_i (i > 0) may be obtained from data recorded in the specification of the corresponding hardware, from test data provided by the manufacturer, from self-test data, or the like. X_0 may be obtained by sampling the tristimulus value curves of the human eye.
For another example, the spectral response curve of the human eye, the spectral response curve of each second camera module, and the transmittance curve of the optical lens, the transmittance curve of the optical filter, and the spectral response curve of the image sensor in the multispectral camera module may be input into the apparatus performing the method 400; the apparatus performs discrete sampling to obtain the corresponding matrices and dot-multiplies L_{N×1}, T_{N×1}, and M_{N×C} to obtain Q'_{N×C}. That is, the apparatus performing the method 400 both samples the corresponding matrices and performs the optimization process of the method 400.
In the embodiment of the present application, the optimization targets involved in step S420 include:
the similarity between the spectral response matrix of the multispectral camera module and the spectral response matrix of the human eye is larger than or equal to a first preset threshold;
The similarity between the spectral response matrix of the multispectral camera module and the spectral response matrix of each second camera module is larger than or equal to a second preset threshold value.
Illustratively, the first predetermined threshold is greater than or equal to 0.93. For example, the first preset threshold may be 0.95, 0.96, or 0.97, etc.
Illustratively, the second predetermined threshold is greater than or equal to 0.93. For example, the second preset threshold may be 0.95, 0.96, or 0.97, etc.
The first preset threshold value and the second preset threshold value may be the same or different, and may be determined according to needs in practical application, which is not limited in the present application.
It can be understood that the similarity between the spectral response matrix of the multispectral camera module and the spectral response matrix of the human eye is used to represent the degree of similarity or equivalence between the spectral response curve of the multispectral camera module and the spectral response curve of the human eye, and specifically to represent how close the spectral response matrix (or spectral response curve) of the multispectral camera module is to the spectral response matrix (or spectral response curve) of the human eye after dimensionality reduction from the high-dimensional space.
It can be understood that the similarity between the spectral response matrix of the multispectral camera module and the spectral response matrix of each second camera module is used to represent the degree of similarity or equivalence between the spectral response curve of the multispectral camera module and the spectral response curve of that second camera module, and specifically to represent how close the spectral response matrix (or spectral response curve) of the multispectral camera module is to the spectral response matrix (or spectral response curve) of the second camera module after dimensionality reduction from the high-dimensional space.
In some embodiments, the similarity between the spectral response matrix of the multispectral camera module and the spectral response matrix of a three-channel camera module is denoted by v_i and may be computed according to formula (4), wherein the operator P{·} is specifically:
P{Q_{N×C}} = Q_{N×C}[Q_{N×C}^T Q_{N×C}]^{-1} Q_{N×C}^T, the order of which is N, with Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1};
P{X_i} = X_i[X_i^T X_i]^{-1} X_i^T, the order of which is N.
In formula (4), i = 0, 1, 2, 3, ..., n, where n is an integer greater than or equal to 1. When i = 0, the human eye is represented, and the corresponding spectral response matrix is X_0; when i = 1, 2, 3, ..., n, the different second camera modules are represented, and the corresponding spectral response matrices are X_1, X_2, ..., X_n, respectively.
The value of v_i may be greater than or equal to a preset value; as an example, the preset value may be greater than or equal to 0.93.
In the embodiment of the present application, in order to enable the multispectral camera module to assist the second camera module in achieving an effect most similar to that of the human eye, v_i should tend toward 1 as much as possible; illustratively, the value of v_i should be raised at least above 0.93.
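As an illustrative sketch of such a similarity computation, the following Python code implements the projection operator P{·} defined above together with a projection-trace (Vora-style) normalization that tends toward 1 when the reference color space is contained in the module's response space; since formula (4) is not written out here, this particular normalization is an assumption and may differ in detail from the actual formula (4).

```python
import numpy as np

def projector(A):
    """P{A} = A (A^T A)^-1 A^T, the orthogonal projector onto the column space of A."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

def similarity(Q, X):
    """Projection-based similarity between an N x C module matrix Q and an
    N x 3 reference matrix X (human eye or a second camera module).
    Normalizing the trace of the product of projectors by the trace of P{X}
    (which equals 3 for a full-rank N x 3 matrix) is an assumption; the value
    approaches 1 as the column space of X becomes contained in that of Q."""
    PQ, PX = projector(Q), projector(X)
    return float(np.trace(PQ @ PX) / np.trace(PX))

# Usage with the hypothetical matrices from the earlier sketches:
# v0 = similarity(Q_init, X_eye); v_i = similarity(Q_init, X_prime)
```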
Based on the above formula (4), before optimization, the initial spectral response matrix Q'_{N×C} of the multispectral camera module is substituted into formula (4) to obtain the degree of equivalence between the spectral response curve of the multispectral camera module and the spectral response curves of the human eye and of each second camera module. By adjusting at least one of the spectral response curve of the image sensor (corresponding to M_{N×C}), the transmittance curve of the filter (corresponding to T_{N×1}), and the transmittance curve of the optical lens (corresponding to L_{N×1}), Q'_{N×C} can be optimized and v_i varied accordingly. When the value of v_i meets the optimization target, the resulting spectral response matrix is the target spectral response matrix Q''_{N×C} of the multispectral camera module. The target spectral response matrix Q''_{N×C} is thereby adapted to the spectral response matrices of the three-channel camera modules (including the human eye and the second camera modules).
For example, taking optimization of the filter's transmittance curve as an example, the transmittance matrix of the filter can be adjusted by adjusting its transmittance curve. Accordingly, the spectral response matrix of the multispectral camera module changes, and v_i changes with it. When the value of v_i meets the optimization objective, for example is greater than or equal to 0.93, the optimized design of the multispectral camera module can be considered complete. The corresponding parameter setting of the optical filter can improve the color restoration capability and the color correction accuracy of the multispectral camera module.
In some embodiments, a corresponding weight F_i may also be set for each v_i; for example, v_0, v_1, v_2, v_3, ..., v_n correspond to the weights F_0, F_1, F_2, F_3, ..., F_n, respectively, wherein F_i ≥ 0 and m denotes a normalization constant for the weights (for example, their sum). The optimization objective in step S420 then further includes: the weighted sum of the v_i values, expanded below, is greater than or equal to a third preset threshold.
Illustratively, the third predetermined threshold is greater than or equal to 0.93. For example, the third preset threshold may be 0.95, 0.96, or 0.97, etc.
By weighting the v_i values in this way, a multispectral camera module that meets the actual needs (such as the relative importance of each camera module, the color balance strategy, and the like) can finally be selected.
It will be appreciated that the weighted sum expands as (F_0/m)·v_0 + (F_1/m)·v_1 + ... + (F_n/m)·v_n. For example, taking m = 1, the above optimization objective may be specified as F_0·v_0 + F_1·v_1 + ... + F_n·v_n being greater than or equal to the third preset threshold.
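A minimal sketch of the weighted objective follows; taking the normalization m as the sum of the weights is an assumption consistent with the m = 1 example above, and the numeric values are purely illustrative.

```python
import numpy as np

def weighted_objective(v, F):
    """Weighted combination of the similarities v = [v0, v1, ..., vn].
    F are the non-negative weights; m is taken here as their sum
    (an assumption, consistent with the m = 1 case when F sums to 1)."""
    v, F = np.asarray(v, float), np.asarray(F, float)
    m = F.sum()
    return float((F / m) @ v)

# Example: human eye weighted most heavily, plus two second camera modules.
v = [0.95, 0.91, 0.93]        # v0 (eye), v1, v2 -- hypothetical values
F = [0.5, 0.3, 0.2]           # weights, m = 1
meets_target = weighted_objective(v, F) >= 0.93   # third preset threshold
```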
In some embodiments, the weight allocation principle may be determined according to the importance degree and/or the color balance policy of each second camera module.
For example, the higher the importance of a second camera module, the larger the corresponding weight value. For example, if a certain second camera module is the main camera module, the weight corresponding to the v value calculated from the spectral response matrix of that camera module may be the maximum value among F_0, F_1, F_2, F_3, ..., F_n.
For another example, if the color balance policy is biased to the color reduction capability of the multispectral camera module, the weight corresponding to the v value calculated by the spectral response matrix of the human eye can be set to be the largest.
For another example, if some second camera modules are not considered in the color balance policy, the corresponding weight may be set to 0.
For another example, if the second camera modules are not prioritized, the weights corresponding to the second camera modules may be set equal.
In some embodiments, in step S420, a nonlinear optimization algorithm may be used to optimize the initial spectral response matrix of the multispectral camera module based on the optimization objective.
Exemplary nonlinear optimization algorithms include, but are not limited to, the dividing rectangles (DIRECT) algorithm, the controlled random search (CRS) algorithm, the multi-level single-linkage (MLSL) algorithm, the improved stochastic ranking evolution strategy (ISRES), the evolutionary algorithm ESCH, and the like.
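The algorithms listed above are available, for example, in the NLopt library; the sketch below uses DIRECT-L as a stand-in for whichever listed algorithm is chosen and optimizes the sampled filter transmittance T_{N×1} under box constraints. The random sensor/lens data, the weights, the projection-based similarity, and the evaluation budget are all assumptions made for illustration, not the application's actual objective.

```python
import numpy as np
import nlopt

rng = np.random.default_rng(0)
N, C = 31, 8                                   # hypothetical grid size and channel count
M = rng.uniform(0.0, 1.0, (N, C))              # sampled sensor responses
L = rng.uniform(0.90, 0.98, (N, 1))            # sampled lens transmittance
X_refs = [rng.uniform(0.0, 1.0, (N, 3)) for _ in range(2)]  # eye + one second module (toy)
F = np.array([0.6, 0.4])                       # weights, summing to 1

def projector(A):
    return A @ np.linalg.inv(A.T @ A) @ A.T

def objective(t, grad):                        # grad is unused by gradient-free algorithms
    Q = M * t[:, None] * L                     # Q = M ⊙ T ⊙ L
    v = [np.trace(projector(Q) @ projector(X)) / 3.0 for X in X_refs]  # assumed measure
    return float(F @ v)

opt = nlopt.opt(nlopt.GN_DIRECT_L, N)          # DIRECT-L; GN_ISRES, GN_ESCH etc. also exist
opt.set_lower_bounds(np.zeros(N))              # relative transmittance bounded to [0, 1]
opt.set_upper_bounds(np.ones(N))
opt.set_max_objective(objective)
opt.set_maxeval(5000)
T_opt = opt.optimize(np.full(N, 0.8))          # optimized sampled transmittance T''
```

A real design flow would typically constrain T to physically realizable filter responses rather than optimizing each sample independently.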
As described above, in the design of the multispectral camera module, the above-described optimization objective can be achieved by adjusting at least one of T_{N×1}, M_{N×C}, and L_{N×1}, so as to obtain the target spectral response matrix of the multispectral camera module.
Generally, the optical lens and the image sensor carry more design constraints: the optical lens must account for depth of field, focal length, field of view, optical resolution, maximum aperture, tolerance precision, aberration control, and the like, while the image sensor must account for pixel size, target surface size, resolution, color-filter process feasibility, and the like. The design of the optical lens and the image sensor therefore has to prioritize these basic requirements; once the basic design is complete, there is relatively little adjustable margin left for optimizing the spectral response curve of the multispectral camera module, and any adjustment incurs higher cost and a longer cycle.
The main functions of the optical filter are to filter light so as to improve color restoration accuracy, the effective utilization of the visible band, and the like. Optimizing the spectral response curve of the multispectral camera module by adjusting the parameters of the filter therefore offers a large adjustable space and a flexible adjustment method. Thus, based on the method 400, the spectral response curve of the multispectral camera module can be optimized by adjusting T_{N×1}.
In the embodiment of the application, the transmittance values in T_{N×1} are relative values; only the relative relationship between the transmittances at different wavelengths is constrained. In practical application, according to the actual process capability of the optical filter, the real transmittance matrix can be obtained by multiplying T_{N×1} by a constant, and the transmittance matrix of the filter can then be converted into a transmittance curve.
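As a final, hypothetical post-processing sketch: the optimized relative transmittance can be scaled by a process-dependent constant and interpolated back into a continuous transmittance curve; the peak transmittance value and the grid below are assumptions.

```python
import numpy as np
from scipy.interpolate import interp1d

wl_grid = np.arange(400, 701, 10)             # the N sampling wavelengths (hypothetical)
T_opt = np.linspace(0.4, 1.0, wl_grid.size)   # optimized relative transmittance (toy data)

peak_T = 0.92                                  # assumed achievable peak transmittance
T_real = T_opt / T_opt.max() * peak_T          # scale relative values by a constant

# Convert the sampled matrix back into a continuous transmittance curve.
T_curve = interp1d(wl_grid, T_real, kind="cubic")
print(T_curve(550.0))                          # e.g., transmittance at 550 nm
```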
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

1. The image pickup device is characterized by comprising a first camera module, wherein the first camera module comprises an optical lens, an optical filter and an image sensor which are sequentially arranged along the optical axis direction, the optical lens is used for receiving light rays from a target object, the optical filter is used for allowing visible light in the light rays to pass through, the image sensor is used for receiving the visible light to acquire spectrum information of the target object, and the channel number of the image sensor is more than 3;
The optical filter is provided with a first transmittance curve and a second transmittance curve, wherein the first transmittance curve is a corresponding relation curve of the wavelength and the transmittance of light rays which are normally incident on the optical filter, the second transmittance curve is a corresponding relation curve of the wavelength and the transmittance of light rays which are incident on the optical filter at a first angle, and the first angle is larger than 0 degree and smaller than or equal to 35 degrees;
on the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength, the first wavelength being greater than or equal to 600nm and less than or equal to 660nm;
on the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm.
2. The image pickup apparatus according to claim 1, wherein,
On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a third wavelength, the third wavelength being greater than or equal to 400nm and less than or equal to 440nm;
On the second transmittance curve, the wavelength corresponding to 50% transmittance includes a fourth wavelength, and a difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 20nm.
3. The image capturing apparatus according to claim 1 or 2, wherein a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 10nm.
4. The image pickup apparatus according to any one of claims 1 to 3, wherein,
On the first transmittance curve, the average transmittance of 350 nm-390 nm wave bands is less than or equal to 3%, and the average transmittance of 700 nm-780 nm wave bands is less than or equal to 2%;
on the second transmittance curve, the average transmittance of 350 nm-390 nm wave bands is less than or equal to 4%, and the average transmittance of 700 nm-780 nm wave bands is less than or equal to 3%.
5. The image pickup apparatus according to any one of claims 1 to 4, wherein,
On the first transmittance curve, the lowest transmittance of 440 nm-580 nm wave bands is more than or equal to 70%, and the average transmittance of 440 nm-580 nm wave bands is more than or equal to 80%;
On the second transmittance curve, the lowest transmittance of 440 nm-580 nm wave bands is greater than or equal to 70%, and the average transmittance of 440 nm-580 nm wave bands is greater than or equal to 80%.
6. The image pickup apparatus according to any one of claims 2 to 5, wherein the first wavelength and the second wavelength belong to a red light band, and the third wavelength and the fourth wavelength belong to a blue light band.
7. The camera device of any one of claims 1-6, further comprising at least one second camera module, the second camera module being a three-channel camera, the first camera module being a C-channel camera with C greater than 3, a similarity v_0 between a spectral response curve of the first camera module and a spectral response curve of a human eye being greater than or equal to 0.93, and a similarity v' between the spectral response curve of the first camera module and a spectral response curve of the second camera module being greater than or equal to 0.93, wherein
Q_{N×C} is a spectral response matrix obtained by respectively performing N-point sampling on each of the C channels based on the spectral response curve of the first camera module, and P{Q_{N×C}} = Q_{N×C}[Q_{N×C}^T Q_{N×C}]^{-1} Q_{N×C}^T;
the spectral response matrix of the human eye is obtained by performing N-point sampling on each of three channels based on the spectral response curve of the human eye;
X'_{N×3} is a spectral response matrix obtained by respectively performing N-point sampling on each of three channels based on the spectral response curve of the second camera module, and P{X'_{N×3}} = X'_{N×3}[X'_{N×3}^T X'_{N×3}]^{-1} X'_{N×3}^T;
Tr() denotes the trace operation.
8. The imaging apparatus of claim 7, wherein the spectral response matrix of the first camera module satisfies
Q_{N×C} = M_{N×C} ⊙ T_{N×1} ⊙ L_{N×1}, wherein
M_{N×C} is a matrix obtained by respectively performing N-point sampling on each of the C channels based on the spectral response curve of the image sensor;
T_{N×1} is a matrix obtained by N-point sampling based on the transmittance curve of the optical filter;
L_{N×1} is a matrix obtained by N-point sampling based on the transmittance curve of the optical lens.
9. The optical filter is characterized by being applied to a camera module, the camera module further comprises an optical lens and an image sensor, the optical filter is arranged between the optical lens and the image sensor, the optical lens is used for receiving light rays from a target object, the optical filter is used for passing visible light in the light rays, the image sensor is used for receiving the visible light to acquire spectrum information of the target object, and the channel number of the image sensor is more than 3;
The optical filter is provided with a first transmittance curve and a second transmittance curve, wherein the first transmittance curve is a corresponding relation curve of the wavelength and the transmittance of light rays which are normally incident on the optical filter, the second transmittance curve is a corresponding relation curve of the wavelength and the transmittance of light rays which are incident on the optical filter at a first angle, and the first angle is larger than 0 degree and smaller than or equal to 35 degrees;
on the first transmittance curve, the wavelength corresponding to 50% transmittance includes a first wavelength, the first wavelength being greater than or equal to 600nm and less than or equal to 660nm;
on the second transmittance curve, the wavelength corresponding to 50% transmittance includes a second wavelength, and a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 20nm.
10. The filter according to claim 9, wherein,
On the first transmittance curve, the wavelength corresponding to 50% transmittance includes a third wavelength, the third wavelength being greater than or equal to 400nm and less than or equal to 440nm;
On the second transmittance curve, the wavelength corresponding to 50% transmittance includes a fourth wavelength, and a difference between the third wavelength and the fourth wavelength is greater than or equal to 0 and less than or equal to 20nm.
11. The filter of claim 10, wherein a difference between the first wavelength and the second wavelength is greater than or equal to 0 and less than or equal to 10nm.
12. The filter according to any one of claims 9 to 11, wherein,
On the first transmittance curve, the average transmittance of 350 nm-390 nm wave bands is less than or equal to 3%, and the average transmittance of 700 nm-780 nm wave bands is less than or equal to 2%;
on the second transmittance curve, the average transmittance of 350 nm-390 nm wave bands is less than or equal to 4%, and the average transmittance of 700 nm-780 nm wave bands is less than or equal to 3%.
13. The filter according to any one of claims 8 to 10, wherein,
On the first transmittance curve, the lowest transmittance of 440 nm-580 nm wave bands is more than or equal to 70%, and the average transmittance of 440 nm-580 nm wave bands is more than or equal to 80%;
On the second transmittance curve, the lowest transmittance of 440 nm-580 nm wave bands is greater than or equal to 70%, and the average transmittance of 440 nm-580 nm wave bands is greater than or equal to 80%.
14. The filter according to any one of claims 10 to 13, wherein the first wavelength and the second wavelength belong to a red light band, and the third wavelength and the fourth wavelength belong to a blue light band.
15. A camera module comprising an optical lens, an image sensor and the optical filter according to any one of claims 9 to 14, wherein the optical filter is disposed between the optical lens and the image sensor, the optical lens is configured to receive light from a target object, the optical filter is configured to pass visible light in the light, the image sensor is configured to receive the visible light to obtain spectral information of the target object, and the number of channels of the image sensor is greater than 3.
16. An electronic apparatus comprising an image processing chip for processing an image acquired by the image pickup device, and the image pickup device according to any one of claims 1 to 8.
17. The electronic device of claim 16, wherein the camera device further comprises at least one second camera module, the second camera module is a three-channel camera, and the image processing chip is configured to correct a color of the target object acquired by the second camera module based on spectral information of the target object acquired by the first camera module.
18. The electronic device of claim 17, wherein the second camera module is a wide angle camera, a tele camera, or a super wide angle camera.
