
WO2022198436A1 - Image sensor, image data acquisition method, and imaging device - Google Patents


Info

Publication number
WO2022198436A1
WO2022198436A1 (PCT/CN2021/082350)
Authority
WO
WIPO (PCT)
Prior art keywords
color filter
current frame
measurement data
spectral measurement
electrical signal
Prior art date
Application number
PCT/CN2021/082350
Other languages
English (en)
French (fr)
Inventor
杨子路
尹玄武
李果
孙亚民
左文明
占云龙
张显斗
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2021/082350 (WO2022198436A1)
Priority to CN202180090558.1A (CN116724564A)
Priority to EP21932068.6A (EP4300936A4)
Publication of WO2022198436A1
Priority to US18/470,847 (US20240014233A1)

Classifications

    • H04N23/88 — Camera processing pipelines for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • G01J3/2803 — Investigating the spectrum using a photoelectric array detector
    • H04N23/12 — Cameras or camera modules comprising electronic image sensors, generating image signals from different wavelengths with one sensor only
    • H04N25/11 — Arrangement of colour filter arrays [CFA]; filter mosaics
    • H04N25/13 — Colour filter arrays characterised by the spectral characteristics of the filter elements
    • H04N25/135 — Colour filter arrays based on four or more different wavelength filter elements
    • H04N25/703 — SSIS architectures incorporating pixels for producing signals other than image signals
    • G01J2003/2806 — Array and filter array
    • G01J2003/2826 — Multispectral imaging, e.g. filter imaging
    • H10F39/8053 — Constructional details of image sensors: colour filter coatings

Definitions

  • the present application relates to the technical field of image processing, and in particular, to an image sensor, an image data acquisition method, and an imaging device.
  • a commonly used image sensor includes a color filter array.
  • the color filter array is composed of a plurality of identical color filter units, and each color filter unit includes a plurality of color filters.
  • because commonly used color filter arrays consist of identical color filter units, each containing 3 or 4 color filters, the array effectively has only 3 or 4 spectral sampling points.
  • a change of light intensity in any narrow spectral band may cause a perceptible color change to human vision, so commonly used image sensors can only achieve rough color restoration.
  • using only 3 or 4 sampling points to reconstruct the 400-700 nm spectrum leads to more metamerism problems, which complicates the calculation of the white balance coefficient and the color restoration matrix.
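To make the metamerism problem concrete, the following illustrative sketch (not part of the application; the three channel sensitivities are toy Gaussians, not real filter curves) constructs two different 400-700 nm spectra that produce identical responses through only three broadband channels:

```python
import numpy as np

# Wavelength grid: 400-700 nm, 31 samples.
wavelengths = np.linspace(400, 700, 31)

# Toy Gaussian channel sensitivities for B, G, R (illustrative only).
def gaussian(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

channels = np.stack([gaussian(450, 30), gaussian(550, 30), gaussian(610, 30)])

# Spectrum 1: a smooth broadband spectral power distribution.
spd1 = 1.0 + 0.3 * np.sin(wavelengths / 40.0)

# Spectrum 2: spd1 plus a component in the null space of the 3 channels,
# i.e. a spectral change the 3-channel sensor cannot see (a metamer pair).
_, _, vt = np.linalg.svd(channels)
null_component = vt[-1]          # direction invisible to all three channels
spd2 = spd1 + 0.5 * null_component

response1 = channels @ spd1
response2 = channels @ spd2
# Two distinct spectra, numerically identical 3-channel responses.
```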
  • the present application provides an image sensor, an image data acquisition method, and an imaging device, which are used to solve the problems that commonly used image sensors can only perform rough color restoration, suffer more metamerism, and make the calculation of the white balance coefficient and the color conversion matrix challenging.
  • in a first aspect, the present application provides an image sensor, comprising a pixel array and a color filter array covering the pixel array, wherein the color filter array includes a plurality of color filter units, the plurality of color filter units include at least one first color filter unit and at least one second color filter unit, each first color filter unit includes basic color filters and an extended color filter, and each second color filter unit includes a plurality of basic color filters; the color of the extended color filter is different from the color of the basic color filters, the optical signal passing through the basic color filters is at least used for imaging, and the optical signal passing through the extended color filter is used for spectral measurement.
  • since the first color filter unit includes basic color filters and an extended color filter, the second color filter unit includes a plurality of basic color filters, and the color of the extended color filter is different from that of the basic color filters, the color combination of the first color filter unit differs from that of the second color filter unit; that is, the color filter array includes two types of color filter units. Compared with a color filter array composed of identical color filter units, this increases the number of sampling points, improves the accuracy of color restoration, reduces the occurrence of metamerism problems, and reduces the difficulty of calculating the white balance coefficient and the color conversion matrix.
  • because the improvement lies in the color filter array itself, no additional devices are added, which is conducive to the miniaturization and low-cost popularization of the image sensor and makes it more suitable for mobile terminal devices such as mobile phones.
  • since the optical signal passing through the basic color filters is at least used for imaging and the optical signal passing through the extended color filter is used for spectral measurement, imaging and spectral measurement can be performed simultaneously in one exposure, which shortens the time required for spectral measurement and further improves the color restoration and imaging efficiency of the image sensor.
  • the number of basic color filters in each first color filter unit is 3, the number of extended color filters in each first color filter unit is 1, and the number of basic color filters in each second color filter unit is 4.
  • the colors of the basic color filters in the second color filter unit include red, green, and blue, and the colors of the basic color filters in the first color filter unit include red, green, and blue.
  • different extended color filters have different colors.
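The array structure described above can be sketched as follows. This is a hypothetical layout assuming 2x2-pixel units, with the extended filter denoted X and its placement chosen arbitrarily; the application does not fix the unit size, the ratio of the two unit types, or their positions:

```python
import numpy as np

# Hypothetical unit layouts: a "first" unit replaces one green of an RGGB
# unit with an extended filter X; a "second" unit keeps R, G, G, B.
FIRST_UNIT = np.array([["R", "G"],
                       ["X", "B"]])   # X: extended color filter
SECOND_UNIT = np.array([["R", "G"],
                        ["G", "B"]])  # basic color filters only

def build_cfa(unit_rows, unit_cols, first_unit_positions):
    """Tile the CFA from units; first_unit_positions marks which unit
    coordinates receive a first (extended) color filter unit."""
    rows = []
    for ur in range(unit_rows):
        row_units = []
        for uc in range(unit_cols):
            unit = FIRST_UNIT if (ur, uc) in first_unit_positions else SECOND_UNIT
            row_units.append(unit)
        rows.append(np.hstack(row_units))
    return np.vstack(rows)

# Example: 4x4 units (8x8 pixels) with extended units on one diagonal.
cfa = build_cfa(4, 4, {(0, 0), (1, 1), (2, 2), (3, 3)})
```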
  • the extended color filter includes a first extended color filter and a second extended color filter, wherein at least one of the thickness of the first extended color filter, the effective area of the pixels covered by the first extended color filter, and the exposure time of the pixels covered by the first extended color filter is determined according to the upper limit of the dynamic range of the image sensor; and at least one of the thickness of the second extended color filter, the effective area of the pixels covered by the second extended color filter, and the exposure time of the pixels covered by the second extended color filter is determined according to the lower limit of the dynamic range of the image sensor.
  • since the relevant parameters of the first extended color filter, the pixels covered by the first extended color filter, the second extended color filter, and the pixels covered by the second extended color filter can be determined according to the upper and lower limits of the dynamic range of the image sensor, the dynamic range of the image sensor is not restricted; both high-brightness and low-brightness objects can then be accommodated, improving the imaging effect.
  • in a second aspect, the present application provides an image data acquisition method, applied to an image sensor, where the image sensor includes a pixel array and a color filter array covering the pixel array; the color filter array includes a plurality of color filter units, the plurality of color filter units include at least one first color filter unit and at least one second color filter unit, each first color filter unit includes basic color filters and an extended color filter, each second color filter unit includes a plurality of basic color filters, and the color of the extended color filter is different from the color of the basic color filters.
  • the method includes: obtaining a first electrical signal and a second electrical signal; determining first spectral measurement data of a current frame according to the first electrical signal; and determining imaging data of the current frame according to the second electrical signal. The first electrical signal is obtained after a first pixel photoelectrically converts a first optical signal, the first optical signal is the optical signal passing through the extended color filter, and the first pixel is a pixel covered by the extended color filter; the second electrical signal is obtained after a second pixel photoelectrically converts a second optical signal, the second optical signal is the optical signal passing through the basic color filter, and the second pixel is a pixel covered by the basic color filter.
  • in this way, the imaging data and the first spectral measurement data of the current frame can be obtained simultaneously through one exposure, which shortens the time required for data acquisition.
  • in addition, the number of sampling points of the image sensor to which the image data acquisition method is applied is increased, the accuracy of color restoration is improved, and the difficulty of calculating the white balance coefficient and the color conversion matrix is reduced.
  • the determining the first spectral measurement data of the current frame according to the first electrical signal includes: determining motion information of a historical frame relative to the current frame according to the imaging data of the current frame and the imaging data of the historical frame; determining second spectral measurement data of the current frame according to the first electrical signal; and correcting the second spectral measurement data according to the motion information and the spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  • the determining the first spectral measurement data of the current frame according to the first electrical signal includes: generating the first spectral measurement data according to at least a part of the second electrical signal and the first electrical signal.
  • the determining the first spectral measurement data of the current frame according to the first electrical signal includes: determining motion information of a historical frame relative to the current frame according to the imaging data of the current frame and the imaging data of the historical frame; generating second spectral measurement data of the current frame according to at least a part of the second electrical signal and the first electrical signal; and correcting the second spectral measurement data according to the motion information and the spectral measurement data of the historical frame, to obtain the first spectral measurement data.
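The motion-corrected design described above can be sketched as a simple per-region blend of current and historical spectral data; the blend weighting rule below is an illustrative assumption:

```python
import numpy as np

def correct_spectral_data(second_spectral, historical_spectral, motion_mag,
                          motion_threshold=1.0):
    """second_spectral, historical_spectral: (regions, bands) arrays.
    motion_mag: per-region motion magnitude of the historical frame
    relative to the current frame. Assumed rule: the historical
    measurement's weight decays to 0 as motion grows."""
    w_hist = np.clip(1.0 - motion_mag / motion_threshold, 0.0, 1.0)[:, None]
    first_spectral = (1.0 - w_hist) * second_spectral + w_hist * historical_spectral
    return first_spectral

cur = np.full((3, 5), 2.0)          # current-frame (second) spectral data
hist = np.full((3, 5), 4.0)         # historical-frame spectral data
motion = np.array([0.0, 0.5, 2.0])  # static, mid, fast-moving regions
out = correct_spectral_data(cur, hist, motion)
# Static region follows history; fast-moving region keeps current data.
```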
  • the obtaining the first electrical signal and the second electrical signal includes: obtaining the first electrical signal according to the position of the extended color filter in the color filter array; and obtaining the second electrical signal according to the position of the basic color filter in the color filter array.
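A position-based readout of the two signal groups might look like the following sketch, assuming a known boolean map of extended-filter locations in the mosaic:

```python
import numpy as np

def split_signals(raw, extended_mask):
    """raw: (H, W) mosaic from one exposure; extended_mask: (H, W) bool,
    True at extended color filter positions. Returns the first signals
    (spectral measurement) and second signals (imaging)."""
    first = raw[extended_mask]    # pixels under extended color filters
    second = raw[~extended_mask]  # pixels under basic color filters
    return first, second

raw = np.arange(16.0).reshape(4, 4)
mask = np.zeros((4, 4), dtype=bool)
mask[1, 0] = mask[3, 2] = True    # two assumed extended-filter positions
first, second = split_signals(raw, mask)
```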
  • the method further includes: performing element segmentation on the current frame based on the imaging data of the current frame to obtain at least one object in the current frame; determining the original light source spectrum of the current frame based on the color distribution of the at least one object; determining the spectral measurement data of each object from the first spectral measurement data of the current frame, and correcting the original light source spectrum according to the spectral measurement data of each object to obtain the target light source spectrum; determining the white balance coefficient and/or color conversion matrix of the current frame according to the target light source spectrum; and processing the imaging data of the current frame according to the white balance coefficient and/or color conversion matrix of the current frame.
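The step of deriving white balance gains from the target light source spectrum can be sketched as below; the Gaussian channel sensitivities and the green-normalized gain convention are illustrative assumptions, not the application's computation:

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)

def sens(center, width):
    # Toy Gaussian channel sensitivity (illustrative, not a real curve).
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

SENSITIVITIES = {"R": sens(610, 30), "G": sens(550, 30), "B": sens(450, 30)}

def white_balance_gains(light_spectrum):
    # Channel response = inner product of sensitivity and illuminant spectrum.
    responses = {c: float(s @ light_spectrum) for c, s in SENSITIVITIES.items()}
    # Normalize so the green gain is exactly 1 (a common convention).
    return {c: responses["G"] / r for c, r in responses.items()}

gains = white_balance_gains(np.ones_like(wavelengths))  # equal-energy light
```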
  • the method further includes: performing element segmentation on the current frame based on the imaging data of the current frame to obtain at least one object in the current frame; determining the spectral measurement data of each object from the first spectral measurement data of the current frame, and determining the spectrum of each object according to its spectral measurement data; and diagnosing and/or classifying and/or identifying the corresponding object according to the spectrum of each object.
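The classification step could, for example, match each object's measured spectrum against a reference library by nearest neighbor; the reference spectra, band count, and distance metric below are purely illustrative:

```python
import numpy as np

# Hypothetical 4-band reference spectra for two object classes.
REFERENCE_SPECTRA = {
    "vegetation": np.array([0.10, 0.20, 0.60, 0.90]),  # rises toward red/NIR
    "sky":        np.array([0.90, 0.60, 0.30, 0.20]),  # blue-heavy
}

def classify_object(object_spectrum):
    def distance(a, b):
        # Compare spectral shapes, not absolute brightness: normalize by peak.
        return float(np.linalg.norm(a / a.max() - b / b.max()))
    return min(REFERENCE_SPECTRA,
               key=lambda name: distance(object_spectrum, REFERENCE_SPECTRA[name]))

label = classify_object(np.array([0.20, 0.30, 0.70, 1.00]))
```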
  • the extended color filter includes a first extended color filter and a second extended color filter, wherein at least one of the thickness of the first extended color filter, the effective area of the pixels covered by the first extended color filter, and the exposure time of the pixels covered by the first extended color filter is determined according to the upper limit of the dynamic range of the image sensor; and at least one of the thickness of the second extended color filter, the effective area of the pixels covered by the second extended color filter, and the exposure time of the pixels covered by the second extended color filter is determined according to the lower limit of the dynamic range of the image sensor.
  • in a third aspect, the present application provides an imaging device, comprising an image sensor and an image processor. The image sensor includes a pixel array and a color filter array covering the pixel array; the color filter array includes a plurality of color filter units, the plurality of color filter units include at least one first color filter unit and at least one second color filter unit, each first color filter unit includes basic color filters and an extended color filter, each second color filter unit includes a plurality of basic color filters, and the color of the extended color filter is different from the color of the basic color filters. The extended color filter is used to filter the incident light signal to obtain a first optical signal; the basic color filter is used to filter the incident light signal to obtain a second optical signal; the first pixel is used to photoelectrically convert the first optical signal to obtain a first electrical signal, and the first pixel is a pixel covered by the extended color filter; the second pixel is used to photoelectrically convert the second optical signal to obtain a second electrical signal, and the second pixel is a pixel covered by the basic color filter. The image processor is configured to obtain the first electrical signal and the second electrical signal, determine the first spectral measurement data of the current frame according to the first electrical signal, and determine the imaging data of the current frame according to the second electrical signal.
  • the image processor determines the first spectral measurement data in the following manner: determining motion information of a historical frame relative to the current frame according to the imaging data of the current frame and the imaging data of the historical frame; determining second spectral measurement data of the current frame according to the first electrical signal; and correcting the second spectral measurement data according to the motion information and the spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  • the image processor determines the first spectral measurement data in the following manner: generating the first spectral measurement data according to at least a part of the second electrical signal and the first electrical signal.
  • the image processor determines the first spectral measurement data in the following manner: determining motion information of a historical frame relative to the current frame according to the imaging data of the current frame and the imaging data of the historical frame; generating second spectral measurement data of the current frame according to at least a part of the second electrical signal and the first electrical signal; and correcting the second spectral measurement data according to the motion information and the spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  • the image processor obtains the first electrical signal and the second electrical signal in the following manner: obtaining the first electrical signal according to the position of the extended color filter in the color filter array; and obtaining the second electrical signal according to the position of the basic color filter in the color filter array.
  • the image processor is further configured to: perform element segmentation on the current frame based on the imaging data of the current frame to obtain at least one object in the current frame; determine the original light source spectrum of the current frame based on the color distribution of the at least one object; determine the spectral measurement data of each object from the first spectral measurement data of the current frame, and correct the original light source spectrum according to the spectral measurement data of each object to obtain the target light source spectrum; determine the white balance coefficient and/or color conversion matrix of the current frame according to the target light source spectrum; and process the imaging data of the current frame according to the white balance coefficient and/or color conversion matrix of the current frame.
  • the image processor is further configured to: perform element segmentation on the current frame based on the imaging data of the current frame to obtain at least one object in the current frame; determine the spectral measurement data of each object from the first spectral measurement data of the current frame, and determine the spectrum of each object according to its spectral measurement data; and diagnose and/or classify and/or identify the corresponding object according to its spectrum.
  • the extended color filter includes a first extended color filter and a second extended color filter, wherein at least one of the thickness of the first extended color filter, the effective area of the pixels covered by the first extended color filter, and the exposure time of the pixels covered by the first extended color filter is determined according to the upper limit of the dynamic range of the image sensor; and at least one of the thickness of the second extended color filter, the effective area of the pixels covered by the second extended color filter, and the exposure time of the pixels covered by the second extended color filter is determined according to the lower limit of the dynamic range of the image sensor.
  • in a fourth aspect, the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium; when the instructions are run on a computer or a processor, the computer or the processor is caused to perform the method of the second aspect or any one of its possible designs.
  • in a fifth aspect, the present application provides a computer program product comprising instructions which, when run on a computer or a processor, cause the computer or the processor to perform the method of the second aspect or any one of its possible designs.
  • FIG. 1 is a first schematic structural diagram of a commonly used color filter array;
  • FIG. 2 is a second schematic structural diagram of a commonly used color filter array;
  • FIG. 3 is a schematic structural diagram of an imaging device provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an image sensor provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram 1 of a color filter array in an image sensor provided by an embodiment of the present application.
  • FIG. 6 is a second structural schematic diagram of a color filter array in an image sensor provided by an embodiment of the present application.
  • FIG. 7 is a third structural schematic diagram of a color filter array in an image sensor according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a workflow of an image processor according to an embodiment of the present application.
  • "at least one (item)" refers to one or more, and "a plurality" refers to two or more.
  • "and/or" describes the relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural.
  • the character "/" generally indicates that the associated objects are in an "or" relationship.
  • "at least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • for example, "at least one of a, b, or c" may mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be singular or plural.
  • Commonly used image sensors include a pixel array and an array of color filters overlaid on the pixel array.
  • FIG. 1 is a schematic diagram 1 of the structure of a commonly used color filter array.
  • the color filter array includes 16 identical color filter units 101 , and the 16 color filter units 101 are arranged in four rows and four columns.
  • Each color filter unit 101 includes four color filters, and the four color filters are arranged in two rows and two columns.
  • the color filter unit 101 is a 3-channel RGB unit, that is, it is composed of color filters of three colors; the colors of the four color filters included in the color filter unit 101 are red R, green G, green G, and blue B.
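The FIG. 1 layout can be reproduced by tiling the standard RGGB unit:

```python
import numpy as np

# 16 identical 2x2 RGGB units, arranged in 4 rows and 4 columns of units
# (8x8 pixels total), as in the commonly used Bayer-style array of FIG. 1.
BAYER_UNIT = np.array([["R", "G"],
                       ["G", "B"]])
bayer_cfa = np.tile(BAYER_UNIT, (4, 4))
```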
  • FIG. 2 is a second structural schematic diagram of a commonly used color filter array.
  • the color filter array includes one color filter unit.
  • the color filter unit includes 64 color filters.
  • the color filter unit is 4-channel RGYB, that is, the color filter unit is composed of color filters of 4 colors.
  • the 64 color filters in the color filter unit are divided into 4 filter groups, and each filter group includes 16 color filters.
  • the 4 filter groups are arranged in two rows and two columns, and the 16 color filters in each filter group are arranged in 4 rows and 4 columns.
  • the color of the color filters in the first filter group is red R, in the second filter group green G, in the third filter group yellow Y, and in the fourth filter group blue B.
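The FIG. 2 unit can be expressed as four 4x4 single-color blocks; the exact placement of the R/G/Y/B groups within the two-row, two-column arrangement below is an assumption, since the figure itself is not reproduced here:

```python
import numpy as np

# 64 color filters divided into four 4x4 single-color groups, arranged
# in two rows and two columns (assumed placement).
def rgyb_unit():
    block = lambda color: np.full((4, 4), color)
    top = np.hstack([block("R"), block("G")])
    bottom = np.hstack([block("Y"), block("B")])
    return np.vstack([top, bottom])

unit = rgyb_unit()   # 8x8 array = 64 filters across 4 channels
```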
  • because the commonly used color filter array is composed of multiple identical color filter units, and each color filter unit usually has 3 or 4 channels, the color filter array effectively has only 3 or 4 sampling points. With so few sampling points, commonly used image sensors can only achieve rough color restoration; in addition, fewer sampling points make the metamerism problem more serious, which complicates the calculation of white balance coefficients and color conversion matrices.
  • the related art provides an image sensor.
  • this image sensor adds a grating device to the commonly used image sensor. While the image sensor is imaging, the incident light can also be split by the grating device to achieve spectral measurement, and the spectral measurement results can assist the image sensor's color restoration process and the calculation of white balance coefficients and color conversion matrices.
  • in this way, the number of sampling points is increased, the color restoration accuracy of the image sensor is improved, the occurrence of metamerism problems is reduced, and the difficulty of calculating the white balance coefficient and the color conversion matrix is reduced.
  • however, because the grating device is added to the common structure, and grating devices are large and expensive, this approach is not conducive to the miniaturization and low-cost popularization of image sensors. Moreover, vibration affects the reliability of the grating device, making it unsuitable for mobile terminal devices such as mobile phones. In addition, a spectral measurement with the grating device takes a long time, which leads to low color restoration and imaging efficiency of the image sensor.
  • FIG. 3 is a schematic structural diagram of an imaging device provided by an embodiment of the present application.
  • the imaging device includes: an image sensor 310 , a memory 320 , an image processor 330 , an I/O interface 340 , and a display 350 , wherein:
  • Image sensor 310 may also be referred to as an imaging sensor, which is a sensor that senses and communicates image information.
  • the image sensor 310 may be used in electronic imaging devices, including digital cameras, camera modules, and medical imaging devices such as thermal imaging devices, as well as radar, sonar, and other night vision devices.
  • the image sensor 310 may be an active pixel sensor in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS) technology; the image sensor 310 may also be a charge-coupled device (CCD) or the like.
  • FIG. 4 is a schematic structural diagram of an image sensor provided by an embodiment of the present application.
  • the image sensor 310 includes a pixel array 410 , a color filter array 420 covering the pixel array 410 , and a readout circuit 430 , wherein:
  • the color filter array 420 includes a plurality of color filter units, the plurality of color filter units include at least one first color filter unit and at least one second color filter unit, each first color filter unit includes basic color filters and an extended color filter, each second color filter unit includes a plurality of basic color filters, and the color of the extended color filter is different from the color of the basic color filters.
  • the extended color filter is used to filter the incident light (i.e., to let light within the passband of the extended color filter pass) to obtain the first optical signal, that is, the optical signal passing through the extended color filter; the first optical signal is used for spectral measurement.
  • the basic color filter is used to filter the incident light signal (i.e., to let light within the passband of the basic color filter pass) to obtain the second optical signal; the second optical signal (i.e., the optical signal passing through the basic color filter) is used for imaging, or for both imaging and spectral measurement.
  • the pixel array 410 includes a plurality of pixels, wherein the pixels covered by the extended color filter are called first pixels, and the pixels covered by the basic color filter are called second pixels.
  • the first pixel is used for photoelectric conversion of the first optical signal to obtain the first electrical signal
  • the second pixel is used for photoelectric conversion of the second optical signal to obtain the second electrical signal.
  • the readout circuit 430 is used to read out the first electrical signal from the first pixel and the second electrical signal from the second pixel.
  • the image processor 330 is configured to obtain the first electrical signal and the second electrical signal, determine the first spectral measurement data of the current frame according to the first electrical signal, and determine the imaging data of the current frame according to the second electrical signal.
  • the image processor 330 may also determine the white balance coefficient and/or color conversion matrix of the current frame, and/or diagnose and/or classify and/or identify objects in the current frame, according to the first spectral measurement data and imaging data of the current frame.
  • the memory 320 is used to store data corresponding to the signal output by the image sensor 310 .
  • Display 350 is used to display images and/or spectra according to data corresponding to signals output from image sensor 310 or stored in memory 320 .
  • the I/O interface 340 is used to communicate with other electronic devices, such as mobile phones, smartphones, phablets, tablet computers, or personal computers.
  • although described in the context of imaging devices, those skilled in the art will understand that the techniques disclosed herein are also applicable to other electronic devices with imaging capabilities, such as mobile phones, smart phones, phablets, tablet computers, personal digital assistants, and the like.
  • the first color filter unit includes basic color filters and an extended color filter, the second color filter unit includes a plurality of basic color filters, and the color of the extended color filter is different from the colors of the basic color filters. Therefore, the color combination of the color filters in the first color filter unit is different from that in the second color filter unit; that is, the color filter array includes two types of color filter units. Compared with the commonly used color filter array composed of multiple identical color filter units, this increases the number of sampling points, improves the accuracy of color restoration, reduces the occurrence of the metamerism problem, and reduces the calculation difficulty of the white balance coefficient and color conversion matrix.
  • because the improvement lies in the color filter array itself, no additional devices are added, which is conducive to miniaturization and low-cost popularization of the image sensor, and also makes the image sensor more suitable for mobile-terminal usage scenarios such as mobile phones.
  • because the optical signal passing through the basic color filter is used at least for imaging and the optical signal passing through the extended color filter is used for spectral measurement, imaging and spectral measurement can be performed simultaneously with one exposure, which shortens the time consumed by spectral measurement and further improves the color reproduction and imaging efficiency of the image sensor.
  • FIG. 5 is a first structural schematic diagram of a color filter array in an image sensor according to an embodiment of the present application.
  • the color filter array 420 includes a plurality of color filter units, and the plurality of color filter units include at least one first color filter unit 501 and at least one second color filter unit 502.
  • each first color filter unit 501 includes basic color filters and an extended color filter.
  • each second color filter unit 502 includes a plurality of basic color filters.
  • if the color filter unit is the first color filter unit 501, the number of color filters in the color filter unit refers to the sum of the numbers of basic color filters and extended color filters therein; if the color filter unit is the second color filter unit 502, the number of color filters refers to the number of basic color filters therein.
  • the number of color filters in the first color filter unit 501 and the number of color filters in the second color filter unit 502 may be the same or different.
  • the optical signal passing through the extended color filter is used for spectral measurement
  • the optical signal passing through the basic color filter is at least used for imaging
  • the color filters in the second color filter unit 502 are all basic color filters, while the color filters in the first color filter unit 501 include basic color filters and extended color filters. Therefore, the number of first color filter units 501 among the plurality of color filter units, the number of second color filter units 502, and the number of extended color filters in each first color filter unit 501 may be configured according to the imaging and spectral measurement requirements.
  • the number of extended color filters in the first color filter unit 501 is at least one.
  • for example, the number of basic color filters in each first color filter unit 501 is three, the number of extended color filters in each first color filter unit 501 is one, and the number of basic color filters in each second color filter unit 502 is four.
  • the colors and the number of color types of the basic color filters in the second color filter unit 502 are determined according to imaging requirements.
  • the number of color types of the basic color filters in the second color filter unit 502 may be, for example, 3 or 4 or more, which is not specifically limited in this application.
  • the colors of the basic color filters in the second color filter unit 502 may include, for example, three or four or more of red, green, blue, yellow, magenta, near-infrared, etc., which is not specifically limited in this application.
  • the number of color types of the basic color filters in the second color filter unit 502 is equal to or smaller than the number of basic color filters in the second color filter unit 502. Specifically, if the number of color types equals the number of basic color filters, the basic color filters in the second color filter unit 502 all have different colors; if the number of color types is smaller than the number of basic color filters, at least two basic color filters in the second color filter unit 502 have the same color.
  • the number of channels of the second color filter unit 502 is equal to the number of color types of the basic color filters therein. For example, if the colors of the basic color filters in the second color filter unit 502 include red, green, and blue, the number of channels of the second color filter unit 502 is three; if the colors include red, yellow, green, and blue, the number of channels of the second color filter unit 502 is four.
  • the basic color filters in the first color filter unit 501 are a subset of the plurality of basic color filters in the second color filter unit 502.
  • the colors of the basic color filters in different first color filter units 501 may be completely the same, partially the same, or completely different.
  • for example, the number of basic color filters in the second color filter unit 502 is four, and the colors of the four basic color filters are red, green, blue, and yellow, respectively. If the number of basic color filters in each first color filter unit 501 is three, the colors of the three basic color filters in each first color filter unit 501 may be red, green, and blue; obviously, the basic color filters in different first color filter units 501 then have the same colors.
  • if the number of basic color filters in each first color filter unit 501 is two, the colors of the two basic color filters in some first color filter units 501 may be red and green, in others blue and yellow, and in still others green and yellow; obviously, the colors of the basic color filters in different first color filter units 501 may then be completely the same, completely different, or partially the same.
  • the color of the extended color filter is any color except the color of the basic color filter. Specifically, it can be determined according to the spectral measurement requirements.
  • the colors of different extended color filters in the same first color filter unit 501 may be completely the same, may be completely different, or may not be completely the same.
  • the colors of the extended color filters in different first color filter units 501 may be exactly the same, may be completely different, or may not be exactly the same.
  • the first color filter unit 501 includes one extended color filter, and the colors of the extended color filters in different first color filter units 501 are different.
  • in another example, the first color filter unit 501 includes a plurality of extended color filters; the colors of different extended color filters in the same first color filter unit 501 are completely different, and the colors of the extended color filters in different first color filter units 501 are also completely different.
  • in both examples, different extended color filters have different colors.
  • the more color types the extended color filters have, the more sampling points there are. In the above two examples, since different extended color filters have different colors, the number of sampling points is further increased, the accuracy of color restoration is improved, the occurrence of the metamerism problem is reduced, and the computational difficulty of the white balance coefficients and color restoration matrices is reduced.
  • the number of channels of the first color filter unit 501 is the number of color types of the basic color filter and the extended color filter in the first color filter unit 501 .
  • the number of channels of the first color filter unit 501 and the number of channels of the second color filter unit 502 may be the same or different.
  • At least one first color filter unit 501 and at least one second color filter unit 502 are arranged in N rows and M columns.
  • N and M may be the same or different, and both N and M are integers greater than or equal to 1.
  • the first color filter units 501 may be regularly distributed in N rows and M columns, and the first color filter units 501 may also be randomly distributed in N rows and M columns, which are not specially limited in this application.
  • the extended color filter may be located at any position in the first color filter unit 501 to which it belongs, and the extended color filter may also be located at a fixed position in the first color filter unit 501 to which it belongs.
  • the location information of the extended color filters needs to be recorded to facilitate subsequent data processing. Specifically, if the extended color filters are regularly distributed in the color filter array, the position of the first extended color filter and the distribution rule of the extended color filters are recorded, and the position of each extended color filter is determined from the recorded position and distribution rule. If the extended color filters are randomly distributed in the color filter array, the position of each extended color filter needs to be recorded.
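The position-recording step for regularly distributed extended color filters can be sketched as follows; the function name and the assumption that the distribution rule is a fixed row/column period are illustrative, not taken from the patent text.

```python
# Sketch (assumption): recover every extended color filter position from the
# recorded first position plus a regular row/column tiling period.
def extended_filter_positions(first_pos, period, array_shape):
    """first_pos: (row, col) of the first extended color filter.
    period: (row_step, col_step) of the regular distribution rule.
    array_shape: (rows, cols) of the whole color filter array."""
    r0, c0 = first_pos
    dr, dc = period
    rows, cols = array_shape
    # Walk the tiling rule from the first recorded position.
    return [(r, c)
            for r in range(r0, rows, dr)
            for c in range(c0, cols, dc)]

# Example: an 8x8 filter array whose extended filters repeat every 4 rows
# and 4 columns, starting at (1, 1) -> four positions.
positions = extended_filter_positions((1, 1), (4, 4), (8, 8))
```

A randomly distributed array would instead store the full position list directly, as the text notes.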
  • FIG. 6 is a second schematic structural diagram of a color filter array in an image sensor according to an embodiment of the present application.
  • the color filter array includes 16 color filter units.
  • the 16 color filter units include 4 first color filter units 601 and 12 second color filter units 602 .
  • Each first color filter unit 601 includes three basic color filters and one extended color filter.
  • Each second color filter unit 602 includes 4 basic color filters.
  • the colors of the four basic color filters in each second color filter unit 602 are RGGB (red, green, green, and blue) respectively.
  • the colors of the three basic color filters in each first color filter unit 601 are RGB (red, green, and blue) respectively.
  • the colors of the extended color filters in different first color filter units 601 are different; the colors of the extended color filters in the four first color filter units 601 are cyan, orange, purple, and gray, respectively.
  • the second color filter unit 602 includes basic color filters of three colors
  • the number of channels of the second color filter unit 602 is three.
  • the first color filter unit 601 includes basic color filters of three colors and an extended color filter of one color. Therefore, the number of channels of the first color filter unit 601 is four.
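A minimal sketch of this FIG. 6-style layout follows. Which green filter of a unit the extended filter replaces and which four of the sixteen units are first color filter units are assumptions made for illustration; the patent does not fix them.

```python
# Sketch of a FIG. 6-style color filter array: 4x4 grid of 2x2 RGGB units,
# with four "first" units whose lower-left G is replaced by an extended
# filter (cyan, orange, purple, gray). Unit choice/placement are assumptions.
def build_cfa():
    extended = ['C', 'O', 'P', 'X']  # cyan, orange, purple, gray ('X')
    unit = [['R', 'G'], ['G', 'B']]  # basic RGGB second color filter unit
    cfa = [['' for _ in range(8)] for _ in range(8)]
    # Tile the 8x8 array with 2x2 RGGB units.
    for ur in range(4):
        for uc in range(4):
            for i in range(2):
                for j in range(2):
                    cfa[2 * ur + i][2 * uc + j] = unit[i][j]
    # Turn the four diagonal units into first color filter units.
    for k, (ur, uc) in enumerate([(0, 0), (1, 1), (2, 2), (3, 3)]):
        cfa[2 * ur + 1][2 * uc] = extended[k]  # replace one G filter
    return cfa
```

Each first unit then carries three basic colors (R, G, B) plus one extended color, matching the four-channel count described above.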
  • FIG. 7 is a third structural schematic diagram of a color filter array in an image sensor according to an embodiment of the present application.
  • the difference between the color filter array in FIG. 7 and the color filter array in FIG. 6 is that the colors of the four basic color filters in each second color filter unit 702 are RGYB (red, green, yellow, and blue) respectively.
  • the colors of the three basic color filters in each first color filter unit 701 are RGB (red, green, and blue) respectively.
  • the second color filter unit 702 includes basic color filters of four colors, the number of channels of the second color filter unit 702 is four.
  • the first color filter unit 701 includes basic color filters of three colors and an extended color filter of one color. Therefore, the number of channels of the first color filter unit 701 is four.
  • in some scenes, a light source and an object appear at the same time. Since the brightness of the light source is high and the brightness of the object is low, focusing the exposure on the light source displays the light source's details clearly in the image but loses the object's details, while focusing the exposure on the object displays the object's details clearly but overexposes the light source.
  • the present application improves the extended color filter in the color filter array.
  • the principle of improvement is as follows:
  • overexposure can be avoided by reducing the photon accumulation of the pixels, and the loss of object details can be avoided by increasing the photon accumulation of the pixels, which also improves the signal-to-noise ratio. The photon accumulation of a pixel changes when at least one of the effective area of the pixel, the exposure time, and the thickness of the color filter covering the pixel changes. Therefore, in the present application, a part of the extended color filters in the color filter array can be used as first extended color filters, and another part as second extended color filters. The dynamic range of the image sensor is then determined according to the dynamic range of the brightest object and the dynamic range of the darkest object in the scene to which the image sensor is applied.
  • a union of the dynamic range of the brightest object and the dynamic range of the darkest object is obtained, the maximum value of the union is determined as the upper limit of the dynamic range of the image sensor, and the minimum value of the union is determined as the dynamic range of the image sensor. The lower limit of the range.
  • at least one of the thickness of the first extended color filter, the effective area of the pixels covered by the first extended color filter, and the exposure time of the pixels covered by the first extended color filter is determined according to the upper limit of the dynamic range of the image sensor, and at least one of the thickness of the second extended color filter, the exposure time of the pixels covered by the second extended color filter, and the effective area of the pixels covered by the second extended color filter is determined according to the lower limit of the dynamic range of the image sensor.
  • the thickness of the extended color filter is inversely related to the photon accumulation effect of the pixel.
  • the exposure time of the pixel has a positive correlation with the photon accumulation effect of the pixel, and the effective area of the pixel has a positive correlation with the photon accumulation effect of the pixel.
  • ways to reduce the effective area of a pixel include reducing the fill factor and partially occluding the pixel.
  • the dynamic range of the image sensor can be determined according to the application scenario, and the related parameters of the first extended color filter, the pixels covered by the first extended color filter, the second extended color filter, and the pixels covered by the second extended color filter can be determined according to the upper and lower limits of that dynamic range. This keeps the dynamic range of the image sensor from being restricted, accommodates the dynamic ranges of both high-brightness and low-brightness objects, and improves the imaging effect.
  • the workflow of the image processor includes the following steps:
  • the image processor obtains the first electrical signal and the second electrical signal from the signals read out by the readout circuit.
  • the process of obtaining the first electrical signal and the second electrical signal may be: obtaining the first electrical signal according to the position of the extended color filter in the color filter array, and obtaining the second electrical signal according to the position of the basic color filter in the color filter array. Specifically, the position of the first pixel in the pixel array is determined according to the position of the extended color filter in the color filter array, and the first electrical signal is obtained, according to the position of the first pixel, from the electrical signals read out of the pixel array by the readout circuit.
  • similarly, the position of the second pixel in the pixel array is determined according to the position of the basic color filter in the color filter array, and the second electrical signal is obtained, according to the position of the second pixel, from the electrical signals read out of the pixel array by the readout circuit.
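Assuming the readout is available as a 2D array with one electrical-signal value per pixel, the position-based split described above can be sketched as follows (function and variable names are illustrative):

```python
import numpy as np

# Sketch: split the readout into first electrical signals (extended-filter
# pixels) and second electrical signals (basic-filter pixels) using the
# recorded extended color filter positions.
def split_signals(readout, extended_positions):
    """readout: 2D array of electrical-signal values, one per pixel.
    extended_positions: iterable of (row, col) of the first pixels."""
    mask = np.zeros(readout.shape, dtype=bool)
    for r, c in extended_positions:
        mask[r, c] = True
    # First electrical signals come from the first pixels only.
    first = [(r, c, readout[r, c]) for r, c in extended_positions]
    # Every remaining pixel is a second pixel (basic color filter).
    second = [(r, c, readout[r, c])
              for r in range(readout.shape[0])
              for c in range(readout.shape[1]) if not mask[r, c]]
    return first, second
```

The `(row, col, value)` triples mirror the text's pairing of pixel values with pixel-array positions for the spectral and imaging data.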
  • the process of determining the imaging data of the current frame according to the second electrical signal may be as follows:
  • the second electrical signal read from the second pixel is converted from the electrical signal to the pixel value, and the pixel value of the second pixel is obtained.
  • the pixel value of the second pixel and the position of the second pixel in the pixel array are determined as imaging data of the current frame.
  • the imaging data may also be preprocessed; the preprocessing here includes, but is not limited to, one or more of black level correction, lens correction, dead pixel compensation, and the like.
  • the preprocessed imaging data is determined as the imaging data obtained in step 802 .
  • the imaging data of the current frame may be the imaging data before preprocessing, or may be the imaging data after preprocessing.
  • the manners of determining the first spectral measurement data of the current frame according to the first electrical signal may include the following four:
  • the first electrical signal read from the first pixel is converted from the electrical signal to the pixel value, and the pixel value of the first pixel is obtained.
  • the pixel value of the first pixel and the position of the first pixel in the pixel array are determined as the first spectral measurement data of the current frame.
  • the motion information of the current frame relative to the historical frame is determined according to the imaging data of the current frame and the imaging data of the historical frame.
  • a historical frame is any frame that precedes the current frame.
  • for example, the historical frame is the video frame immediately preceding the current frame.
  • the imaging data of the historical frame may be the original imaging data, which includes the pixel values of the second pixels obtained when the historical frame was captured and the positions of the second pixels in the pixel array; the imaging data of the historical frame may also be the data obtained after the original imaging data undergoes the above-mentioned preprocessing and/or white balance coefficient processing and/or color conversion matrix processing.
  • the process of determining the motion information of the current frame relative to the historical frame may include: according to the pixel values and positions of the second pixels in the imaging data of the current frame and in the imaging data of the historical frame, calculating the position difference between the current frame and the historical frame for each second pixel indicating the same element, averaging the position differences over all second pixels indicating the same elements, and determining the average value as the motion information of the current frame relative to the historical frame.
  • Elements include, but are not limited to, semantics and features, among others. It should be noted that the foregoing manner of determining the motion information of the current frame relative to the historical frame is merely exemplary, and is not intended to limit the present application.
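The averaged position-difference computation described above might look like the following sketch; how second pixels indicating the same element are matched across frames (semantics, features, etc.) is assumed to be given.

```python
# Sketch: motion information as the average position difference of second
# pixels that indicate the same element in the historical and current frames.
def motion_info(matches):
    """matches: list of ((r_hist, c_hist), (r_cur, c_cur)) pairs, one per
    matched second pixel."""
    dr = sum(cur[0] - hist[0] for hist, cur in matches) / len(matches)
    dc = sum(cur[1] - hist[1] for hist, cur in matches) / len(matches)
    return dr, dc

# Two matched pixels, both shifted by one row and two columns,
# yield the motion vector (1.0, 2.0).
motion = motion_info([((0, 0), (1, 2)), ((3, 3), (4, 5))])
```

As the text notes, this averaging scheme is only one exemplary way of estimating the motion of the current frame relative to the historical frame.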
  • the second spectral measurement data of the current frame is determined according to the first electrical signal.
  • the second spectral measurement data here can be understood as the first spectral measurement data in the first manner.
  • the second spectral measurement data of the current frame is corrected according to the motion information of the current frame relative to the historical frame and the spectral measurement data of the historical frame to obtain the first spectral measurement data of the current frame.
  • the spectral measurement data of the historical frame may be the original spectral measurement data of the historical frame, and the original spectral measurement data includes the pixel value of the first pixel and the position of the first pixel in the pixel array obtained when the historical frame is captured.
  • the spectral measurement data of the historical frame can also be data obtained after processing the original spectral measurement data; the processing here includes, but is not limited to, correcting the original spectral measurement data of the historical frame by using the spectral measurement data of video frames before the historical frame, and performing black level correction, lens correction, dead pixel compensation, etc. on the original spectral measurement data of the historical frame.
  • the process of correcting the second spectral measurement data of the current frame may, for example, be as follows: according to the motion information, align the first pixels indicating the same element in the historical frame and the current frame, average the pixel values of each aligned first pixel over the historical frame and the current frame, determine the average value as the corrected pixel value of that first pixel in the current frame, and replace the pixel value of the aligned first pixel in the second spectral measurement data of the current frame with the corrected pixel value to obtain the first spectral measurement data of the current frame. It should be noted that the above process of correcting the second spectral measurement data of the current frame is only exemplary and is not intended to limit the present application.
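A minimal sketch of this correction step, assuming the spectral data is held as a mapping from pixel position to pixel value and that the motion vector has integer components (names and data shapes are illustrative):

```python
# Sketch: shift each historical first-pixel position by the motion vector to
# align it with the current frame, then average the aligned pixel values
# (a simple multi-frame noise reduction step).
def correct_spectral(current, historical, motion):
    """current, historical: dict mapping (row, col) -> pixel value.
    motion: (dr, dc) of the current frame relative to the historical frame."""
    dr, dc = motion
    corrected = dict(current)
    for (r, c), hist_val in historical.items():
        aligned = (r + dr, c + dc)  # historical position mapped into current frame
        if aligned in corrected:
            # Replace with the average of the aligned pixel values.
            corrected[aligned] = (corrected[aligned] + hist_val) / 2
    return corrected
```

Unaligned pixels keep their current-frame values, consistent with only the aligned first pixels being replaced.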
  • the first spectral measurement data is generated according to at least a part of the second electrical signal and the first electrical signal.
  • At least a portion of the second electrical signals refers to second electrical signals read from at least a portion of the second pixels. At least a portion of the second pixels refers to all of the second pixels or a portion of the second pixels. A part of the second pixels may be, for example, second pixels located around the first pixel.
  • the first electrical signal read from the first pixel is converted from the electrical signal to a pixel value to obtain the pixel value of the first pixel.
  • the second electrical signals read from at least a portion of the second pixels are converted from electrical signals to pixel values to obtain pixel values of at least a portion of the second pixels.
  • the pixel value of the first pixel and the position of the first pixel in the pixel array, the pixel value of at least a part of the second pixel and the position of at least a part of the second pixel in the pixel array are determined as the first spectral measurement data.
  • the motion information of the current frame relative to the historical frame is determined according to the imaging data of the current frame and the imaging data of the historical frame. Since this process has been described above, it will not be repeated here.
  • second spectral measurement data of the current frame is generated according to at least a part of the second electrical signal and the first electrical signal.
  • the second spectral measurement data here can be understood as the first spectral measurement data in the third manner.
  • the second spectral measurement data of the current frame is corrected according to the motion information and the spectral measurement data of the historical frame to obtain the first spectral measurement data of the current frame.
  • the spectral measurement data of the historical frame may be the original spectral measurement data of the historical frame, and the original spectral measurement data includes the pixel value of the first pixel obtained when the historical frame is captured, the position of the first pixel in the pixel array, and at least a part of the second pixel. The pixel value and the position of at least a portion of the second pixel in the pixel array.
  • the spectral measurement data of the historical frame can also be data obtained after processing the original spectral measurement data; the processing here includes, but is not limited to, correcting the original spectral measurement data of the historical frame by using the spectral measurement data of video frames before the historical frame, and performing black level correction, lens correction, dead pixel compensation, etc. on the original spectral measurement data of the historical frame.
  • the first spectral measurement data may also be preprocessed; the preprocessing here includes, but is not limited to, one or more of black level correction, lens correction, dead pixel compensation, and the like.
  • the preprocessed first spectral measurement data is determined as the first spectral measurement data obtained in step 802 .
  • the first spectral measurement data of the current frame may be the first spectral measurement data before preprocessing, or may be the first spectral measurement data after preprocessing.
  • the preprocessing may also be performed after the second spectral measurement data is obtained; that is, after the second spectral measurement data is obtained, the above-mentioned preprocessing is performed on the second spectral measurement data, and subsequent processing is performed according to the preprocessed second spectral measurement data.
  • in these manners, the second spectral measurement data of the current frame is corrected to obtain the first spectral measurement data of the current frame; that is, a multi-frame noise reduction method is used to obtain the first spectral measurement data. This correction improves the accuracy of the first spectral measurement data, improves the accuracy of color restoration, and improves the calculation accuracy of the white balance coefficient and the color conversion matrix.
  • the imaging data of the current frame and the first spectral measurement data can be obtained simultaneously through one exposure, which shortens the time-consuming of data acquisition.
  • the sampling points of the image sensor to which the image data acquisition method is applied are increased, the accuracy of color restoration is improved, and the calculation difficulty of the white balance coefficient and the color conversion matrix is reduced.
  • the application scenarios of the first spectral measurement data include the following two:
  • the first is to calculate the white balance coefficient and/or the color conversion matrix according to the first spectral measurement data and the imaging data of the current frame, so as to process the imaging data of the current frame through the white balance coefficient and/or the color conversion matrix, improve the accuracy of color restoration, and thereby improve the display effect.
  • the specific steps are as follows:
  • Elements include, but are not limited to, semantics, features, and the like.
  • element segmentation can be implemented, for example, by a model based on a segmentation network.
  • Objects include, but are not limited to, people, animals, flowers, grass, clouds, blue sky, tables, houses, faces, white blocks, gray blocks, etc.
  • the process of determining the spectral measurement data of an object may be: determining the positions of the boundary pixels of the object, and, according to the positions of the pixels in the first spectral measurement data of the current frame, determining the values and positions of the pixels located within the object's boundary as the object's spectral measurement data.
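Collecting an object's spectral measurement data can be sketched as selecting the first-spectral-data pixels that fall inside the object's boundary; for simplicity the boundary is assumed here to be an axis-aligned bounding box, whereas a real segmentation boundary would be arbitrary.

```python
# Sketch: an object's spectral measurement data as the subset of the first
# spectral measurement data whose positions lie inside the object's boundary
# (assumed to be an inclusive bounding box for illustration).
def object_spectral_data(spectral, bbox):
    """spectral: dict (row, col) -> pixel value.
    bbox: (r0, c0, r1, c1), inclusive boundary of the object."""
    r0, c0, r1, c1 = bbox
    return {(r, c): v for (r, c), v in spectral.items()
            if r0 <= r <= r1 and c0 <= c <= c1}
```

The returned mapping keeps both the pixel values and their positions, matching the text's definition of the object's spectral measurement data.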
  • the process of obtaining the spectrum of the target light source may be: analyzing the color distribution of at least one object according to the spectral measurement data of each object, determining a corrected color temperature according to the color distribution, and determining a corrected spectrum according to the corrected color temperature.
  • the original light source spectrum is corrected according to the corrected spectrum to obtain the target light source spectrum.
  • the light source spectrum is used to indicate the distribution of photometric or radiometric quantities of individual wavelengths in the light source.
  • the above process of determining the spectrum of the target light source is only exemplary, and is not intended to limit the present application. For example, after determining the color temperature of the original light source, determine the corrected color temperature, then correct the color temperature of the original light source according to the corrected color temperature to obtain the color temperature of the target light source, and finally determine the spectrum of the target light source according to the color temperature of the target light source. For another example, after obtaining at least one object in the current frame, the imaging data of each object and the spectral measurement data of each object may be determined, and then, the imaging data of the corresponding object may be corrected according to the spectral measurement data of each object.
  • the process of processing the imaging data by the white balance coefficients is as follows:
  • R = r × Rg, G = g × Gg, B = b × Bg
  • where R, G, and B are the processed pixel values of the pixels covered by the red, green, and blue basic color filters, respectively; Rg, Gg, and Bg are the pixel values of the pixels covered by the red, green, and blue basic color filters, respectively; and r, g, and b are the white balance coefficients.
  • sR is the processed pixel value of the pixel covered by the red basic color filter
  • sG is the processed pixel value of the pixel covered by the green basic color filter
  • sB is the pixel value covered by the blue basic color filter
  • Rg is the pixel value of the pixel covered by the red basic color filter
  • Gg is the pixel value of the pixel covered by the green basic color filter
  • Bg is the pixel value covered by the blue basic color filter
  • a, b, c, d, e, f, g, h, i are the parameters in the color conversion matrix.
  • by the same principle, a local light source spectrum and a local color conversion matrix can also be obtained for a region of the current frame, so that that region can be processed with the local light source spectrum and the local color conversion matrix.
  • the increase of sampling points makes the imaging data and the first spectral measurement data reflect the information of more sampling points.
  • the original light source spectrum of the current frame is determined from the imaging data, and is then corrected using the spectral measurement data of each object.
  • the target light source spectrum obtained after this correction therefore reflects the information of more sampling points, which improves its accuracy, reduces the occurrence of metamerism, lowers the computational difficulty of the white balance coefficients and the color conversion matrix, and improves the accuracy of color restoration.
  • the process of determining the spectral measurement data of each object has been described above, and will not be repeated here.
  • the process of determining the spectrum of the object may be, for example, analyzing the color distribution of the object according to the spectral measurement data of the object, evaluating the color temperature of the object according to the color distribution, and determining the spectrum of the object according to the color temperature of the object.
  • the spectrum of an object refers to the distribution of photometric or radiometric quantities of individual wavelengths reflected by the object.
  • for diagnosis, the spectra of the object under different diagnosis results can be obtained in advance; the spectrum of the object is matched against these spectra, and the diagnosis result of the object is determined according to the matching result.
  • for classification, the spectra of the object under different classification results can be obtained in advance; the spectrum of the object is matched against these spectra, and the classification result of the object is determined according to the matching result.
  • for recognition, the spectrum of each candidate object can be obtained in advance; the spectrum of the object is matched against these spectra, and the recognition result of the object is determined according to the matching result.
  • matching of spectra may refer to matching of spectra on certain mathematical features (eg, mean, variance, statistical distribution, etc.).
  • the object can be diagnosed and/or classified and/or identified according to the spectrum of the object, which is simple and easy to implement.
  • the user can also quickly and accurately diagnose and/or classify and/or identify the object in the above manner, thereby improving user experience.
  • the present application also provides an image data acquisition method, which is applied to the above-mentioned image sensor.
  • the image data acquisition method includes the following steps:
  • a first electrical signal and a second electrical signal are obtained. Then, the first spectral measurement data of the current frame is determined according to the first electrical signal, and the imaging data of the current frame is determined according to the second electrical signal.
  • the first electrical signal is the electrical signal obtained after a first pixel photoelectrically converts a first optical signal, where the first optical signal is an optical signal that has passed through the extended color filter and the first pixel is a pixel covered by the extended color filter;
  • the second electrical signal is the electrical signal obtained after a second pixel photoelectrically converts a second optical signal, where the second optical signal is an optical signal that has passed through the basic color filter and the second pixel is a pixel covered by the basic color filter.
  • the determining, according to the first electrical signal, the first spectral measurement data of the current frame includes: determining, according to the imaging data of the current frame and the imaging data of historical frames, relative to the current frame The motion information of the historical frame; the second spectral measurement data of the current frame is determined according to the first electrical signal; the second spectral measurement data is performed according to the motion information and the spectral measurement data of the historical frame correction to obtain the first spectral measurement data.
  • the determining the first spectral measurement data of the current frame according to the first electrical signal includes: generating the first electrical signal according to at least a part of the second electrical signal and the first electrical signal First spectral measurement data.
  • the determining, according to the first electrical signal, the first spectral measurement data of the current frame includes: determining, according to the imaging data of the current frame and the imaging data of historical frames, relative to the current frame motion information of the historical frame; generating second spectral measurement data of the current frame according to at least a part of the second electrical signal and the first electrical signal; according to the motion information and the spectrum of the historical frame The measurement data corrects the second spectral measurement data to obtain the first spectral measurement data.
  • the obtaining the first electrical signal and the second electrical signal includes: obtaining the first electrical signal according to the position of the extended color filter in the color filter array; The position of the basic color filter in the color filter array obtains the second electrical signal.
  • the method further includes: performing element segmentation on the current frame based on its imaging data to obtain at least one object in the current frame; determining the original light source spectrum of the current frame based on the color distribution of the at least one object; determining the spectral measurement data of each object in the first spectral measurement data of the current frame, and correcting the original light source spectrum according to the spectral measurement data of each object to obtain the target light source spectrum; determining the white balance coefficient and/or color conversion matrix of the current frame according to the target light source spectrum; and processing the imaging data of the current frame according to the white balance coefficient and/or color conversion matrix.
  • the method further includes: performing element segmentation on the current frame based on its imaging data to obtain at least one object in the current frame; determining the spectral measurement data of each object in the first spectral measurement data of the current frame, and determining the spectrum of each object according to its spectral measurement data; and diagnosing and/or classifying and/or identifying each object according to its spectrum.
  • the present application further provides a computer-readable storage medium, including a computer program, which, when executed on a computer, causes the computer to execute the technical solutions of any one of the above method embodiments.
  • the present application also provides a computer program, which, when the computer program is executed by a computer or a processor, is used to execute the technical solution of any one of the above method embodiments.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical functional division; in actual implementation there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some ports, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions for causing a computer device (a personal computer, server, network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, removable hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk, or optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

一种图像传感器,包括:像素阵列和覆盖在像素阵列上的彩色滤镜阵列(420);其中:彩色滤镜阵列(420)包括多个彩色滤镜单元,多个彩色滤镜单元包括至少一个第一彩色滤镜单元(501)和至少一个第二彩色滤镜单元(502),每个第一彩色滤镜单元(501)包括基本颜色滤镜和扩展颜色滤镜,每个第二彩色滤镜单元(502)包括多个基本颜色滤镜,其中,扩展颜色滤镜的颜色与基本颜色滤镜的颜色不同,经过基本颜色滤镜的光信号至少用于成像,经过扩展颜色滤镜的光信号用于光谱测量。所述图像传感器增加了采样点,提高了颜色还原的精度,降低了同色异谱问题的出现,降低了白平衡系数和颜色转换矩阵的计算难度。

Description

图像传感器、图像数据获取方法、成像设备 技术领域
本申请涉及图像处理技术领域。尤其涉及一种图像传感器、一种图像数据获取方法和一种成像设备。
背景技术
随着数码相机、手机的普及,图像传感器(CCD/CMOS)近年来得到了广泛的关注和应用。常用的图像传感器包括彩色滤镜阵列,彩色滤镜阵列由多个相同的彩色滤镜单元构成,每个彩色滤镜单元均包括多个颜色滤镜。
由于常用的彩色滤镜阵列包括的彩色滤镜单元相同,且每个彩色滤镜单元包含3种或者4种颜色的颜色滤镜,相当于,彩色滤镜阵列的采样点仅为3个或4个,但实际场景中,任何一个微小谱段上的光强变化都可能造成人眼视觉的颜色变化,因此通过常用的图像传感器仅能够做粗略的颜色还原。此外,用3或4个采样点还原400~700nm光谱的方式,会导致较多的同色异谱问题,为白平衡系数和颜色还原矩阵的计算带来挑战。
发明内容
本申请提供一种图像传感器、一种图像数据获取方法和一种成像设备,用于解决常用的图像传感器仅能做粗略的颜色还原、同色异谱较多、白平衡系数和颜色转换矩阵的计算挑战性变高的问题。
第一方面,本申请提供一种图像传感器,包括:像素阵列和覆盖在所述像素阵列上的彩色滤镜阵列;其中:所述彩色滤镜阵列包括多个彩色滤镜单元,所述多个彩色滤镜单元包括至少一个第一彩色滤镜单元和至少一个第二彩色滤镜单元,每个所述第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜,每个所述第二彩色滤镜单元包括多个基本颜色滤镜;其中,所述扩展颜色滤镜的颜色与所述基本颜色滤镜的颜色不同,经过所述基本颜色滤镜的光信号至少用于成像,经过所述扩展颜色滤镜的光信号用于光谱测量。
由于第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜，第二彩色滤镜单元包括多个基本颜色滤镜，且扩展颜色滤镜的颜色与基本颜色滤镜的颜色不同，因此，第一彩色滤镜单元中颜色滤镜的颜色组合与第二彩色滤镜单元中颜色滤镜的组合不同，即彩色滤镜阵列包括两种类型的彩色滤镜单元，相比于常用的由多个相同的彩色滤镜单元构成的彩色滤镜阵列，增加了采样点，提高了颜色还原的精度，降低了同色异谱问题的出现，降低了白平衡系数和颜色转换矩阵的计算难度。另外，相比于相关技术，由于对彩色滤镜阵列进行了改进，未增加任何额外的器件，有利于图像传感器的小型化和低成本普及，同时也使得图像传感器更便于应用在手机等移动终端设备的使用场景。此外，由于经过基本颜色滤镜的光信号至少用于成像，经过扩展颜色滤镜的光信号用于光谱测量，因此，通过一次曝光可以同时进行成像和光谱测量，缩短了光谱测量耗时，进而提高图像传感器的颜色还原和成像效率。
在一种可能的实现方式中,每个所述第一彩色滤镜单元中基本颜色滤镜的数量为3个,每个所述第一彩色滤镜单元中扩展颜色滤镜的数量为1个,每个所述第二彩色滤镜单元中基本颜色滤镜的数量为4个。
在一种可能的实现方式中,所述第二彩色滤镜单元中基本颜色滤镜的颜色包括红色、绿色和蓝色,所述第一彩色滤镜单元中基本颜色滤镜的颜色包括红色、绿色和蓝色。
在一种可能的实现方式中,不同的所述扩展颜色滤镜具有不同的颜色。
由于不同的扩展颜色滤镜具有不同的颜色,因此进一步增加了采样点的数量,提高了颜色还原的精度,降低了同色异谱问题的出现,降低了白平衡系数和颜色还原矩阵的计算难度。
在一种可能的实现方式中,所述扩展颜色滤镜包括第一扩展颜色滤镜和第二扩展颜色滤镜,其中:所述第一扩展颜色滤镜的厚度、被所述第一扩展颜色滤镜覆盖的像素的有效面积、被所述第一扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的上限确定;所述第二扩展颜色滤镜的厚度、被所述第二扩展颜色滤镜覆盖的像素的有效面积、被所述第二扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的下限确定。
由上可知,可以根据图像传感器的动态范围的上限和下限确定第一扩展颜色滤镜、被第一扩展颜色滤镜覆盖的像素、第二扩展颜色滤镜、被第二扩展颜色滤镜覆盖的像素的相关参数,使得图像传感器的动态范围不受限制,进而同时兼顾亮度较高的对象和亮度较低的对象的动态范围,提升成像效果。
第二方面,本申请提供一种图像数据获取方法,应用于图像传感器,所述图像传感器包括像素阵列、覆盖在所述像素阵列上的彩色滤镜阵列;其中:所述彩色滤镜阵列包括多个彩色滤镜单元,所述多个彩色滤镜单元包括至少一个第一彩色滤镜单元和至少一个第二彩色滤镜单元,每个所述第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜,每个所述第二彩色滤镜单元包括多个基本颜色滤镜,所述扩展颜色滤镜的颜色与所述基本颜色滤镜的颜色不同;
所述方法包括:获得第一电信号和第二电信号;根据所述第一电信号确定当前帧的第一光谱测量数据,根据所述第二电信号确定所述当前帧的成像数据;其中,所述第一电信号为第一像素对第一光信号进行光电转换后得到的电信号,所述第一光信号为经过所述扩展颜色滤镜的光信号,所述第一像素为被所述扩展颜色滤镜覆盖的像素,所述第二电信号为第二像素对第二光信号进行光电转换后得到的电信号,所述第二光信号为经过所述基本颜色滤镜的光信号,所述第二像素为被所述基本颜色滤镜覆盖的像素。
由上可知,通过一次曝光能够同时获得当前帧的成像数据和第一光谱测量数据,缩短了数据采集的耗时。此外,由于该图像数据获取方法应用的图像传感器的采样点得到了增加,因此,提升颜色还原的精度,降低白平衡系数和颜色转换精度的计算难度。
在一种可能的实现方式中，所述根据所述第一电信号确定当前帧的第一光谱测量数据包括：根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息；根据所述第一电信号确定所述当前帧的第二光谱测量数据；根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正，以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述根据所述第一电信号确定当前帧的第一光谱测量数据包括:根据所述第二电信号中的至少一部分和所述第一电信号生成所述第一光谱测量数据。
在一种可能的实现方式中,所述根据所述第一电信号确定当前帧的第一光谱测量数据包括:根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息;根据所述第二电信号中的至少一部分和所述第一电信号生成所述当前帧的第二光谱测量数据;根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正,以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述获得第一电信号和第二电信号包括:根据所述扩展颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第一电信号;根据所述基本颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第二电信号。
在一种可能的实现方式中,所述方法还包括:基于所述当前帧的成像数据,对所述当前帧进行要素分割,以得到所述当前帧中的至少一个对象;基于所述至少一个对象的颜色分布,确定所述当前帧的原始光源光谱;在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据,以及根据每个所述对象的光谱测量数据对所述原始光源光谱进行修正,以得到目标光源光谱;根据所述目标光源光谱确定所述当前帧的白平衡系数和/或颜色转换矩阵;根据所述当前帧的白平衡系数和/或颜色转换矩阵,对所述当前帧的成像数据进行处理。
在一种可能的实现方式中,所述方法还包括:基于所述当前帧的成像数据,对所述当前帧进行要素分割,以得到所述当前帧中的至少一个对象;在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据,以及根据每个所述对象的光谱测量数据确定每个所述对象的光谱;根据每个所述对象的光谱对对应的所述对象进行诊断和/或分类和/或识别。
在一种可能的实现方式中,所述扩展颜色滤镜包括第一扩展颜色滤镜和第二扩展颜色滤镜,其中:所述第一扩展颜色滤镜的厚度、被所述第一扩展颜色滤镜覆盖的像素的有效面积、被所述第一扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的上限确定;所述第二扩展颜色滤镜的厚度、被所述第二扩展颜色滤镜覆盖的像素的有效面积、被所述第二扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的下限确定。
第三方面,本申请提供一种成像设备,包括:图像传感器和图像处理器,其中:所述图像传感器包括像素阵列和覆盖在所述像素阵列上的彩色滤镜阵列;所述彩色滤镜阵列包括多个彩色滤镜单元,所述多个彩色滤镜单元包括至少一个第一彩色滤镜单元和至少一个第二彩色滤镜单元,每个所述第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜,每个所述第二彩色滤镜单元包括多个基本颜色滤镜,所述扩展颜色滤镜的颜色与所述基本颜色滤镜的颜色不同;所述扩展颜色滤镜,用于对入射光信号进行滤波,得到第一光信号;所述基本颜色滤镜,用于对入射光信号进行滤波,得到第二光信号;第一像素,用于对所述第一光信号进行光电转换得到第一电信号,所述第一像素为被所述扩展颜色滤镜覆盖的像素;第二像素,用于对所述第二光信号进行光电转换得到第二电信号,所述第二像素为被所述基本颜色滤镜覆盖的像素;所述图像处理器,用于获得所述第一电信号和所述第二电 信号,根据所述第一电信号确定当前帧的第一光谱测量数据,根据所述第二电信号确定所述当前帧的成像数据。
在一种可能的实现方式中,所述图像处理器具体通过下述方式确定所述第一光谱测量数据:根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息;根据所述第一电信号确定所述当前帧的第二光谱测量数据;根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正,以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述图像处理器具体通过下述方式确定所述第一光谱测量数据:根据所述第二电信号中的至少一部分和所述第一电信号生成所述第一光谱测量数据。
在一种可能的实现方式中,所述图像处理器具体通过下述方式确定所述第一光谱测量数据:根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息;根据所述第二电信号中的至少一部分和所述第一电信号生成所述当前帧的第二光谱测量数据;根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正,以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述图像处理器具体通过下述方式获得第一电信号和第二电信号:根据所述扩展颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第一电信号;根据所述基本颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第二电信号。
在一种可能的实现方式中,所述图像处理器还用于:基于所述当前帧的成像数据,对所述当前帧进行要素分割,以得到所述当前帧中的至少一个对象;基于所述至少一个对象的颜色分布,确定所述当前帧的原始光源光谱;在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据,以及根据每个所述对象的光谱测量数据对所述原始光源光谱进行修正,以得到目标光源光谱;根据所述目标光源光谱确定所述当前帧的白平衡系数和/或颜色转换矩阵;根据所述当前帧的白平衡系数和/或颜色转换矩阵,对所述当前帧的成像数据进行处理。
在一种可能的实现方式中,所述图像处理器还用于:基于所述当前帧的成像数据,对所述当前帧进行要素分割,以得到所述当前帧中的至少一个对象;在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据,以及根据每个所述对象的光谱测量数据确定每个所述对象的光谱;根据每个所述对象的光谱对对应的所述对象进行诊断和/或分类和/或识别。
在一种可能的实现方式中,所述扩展颜色滤镜包括第一扩展颜色滤镜和第二扩展颜色滤镜,其中:所述第一扩展颜色滤镜的厚度、被所述第一扩展颜色滤镜覆盖的像素的有效面积、被所述第一扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的上限确定;所述第二扩展颜色滤镜的厚度、被所述第二扩展颜色滤镜覆盖的像素的有效面积、被所述第二扩展颜色滤镜覆盖的像素的曝光时间中的至少一个根据所述图像传感器的动态范围的下限确定。
第四方面,本申请提供一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,当所述指令在计算机或处理器上运行时,使得所述计算机或处理器执行如第二方面中任一项所述的方法。
第五方面，本申请提供一种包含指令的计算机程序产品，当其在计算机或处理器上运行时，使得所述计算机或处理器执行如第二方面中任一项所述的方法。
附图说明
图1为常用的彩色滤镜阵列的结构示意图一;
图2为常用的彩色滤镜阵列的结构示意图二;
图3为本申请实施例提供的成像设备的结构示意图;
图4为本申请实施例提供的图像传感器的结构示意图;
图5为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图一;
图6为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图二;
图7为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图三;
图8为本申请实施例提供的图像处理器的工作流程示意图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
为使本申请的目的、技术方案和优点更加清楚,下面将结合本申请中的附图,对本申请中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书实施例和权利要求书及附图中的术语“第一”、“第二”等仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
应当理解,在本申请中,“至少一个(项)”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系,例如,“A和/或B”可以表示:只存在A,只存在B以及同时存在A和B三种情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,“a和b”,“a和c”,“b和c”,或“a和b和c”,其中a,b,c可以是单个,也可以是多个。
常用的图像传感器包括像素阵列和覆盖在像素阵列上的彩色滤镜阵列。
图1为常用的彩色滤镜阵列的结构示意图一。如图1所示,彩色滤镜阵列包括16个相同的彩色滤镜单元101,16个彩色滤镜单元101呈四行四列排布。每个彩色滤镜单元101包括四个颜色滤镜,四个颜色滤镜呈两行两列排布。彩色滤镜单元101为3通道RGB,即,彩色滤镜单元101由三种颜色的颜色滤镜构成,其中,彩色滤镜单元101包括的四个颜色滤镜的颜色分别为红色R、绿色G、绿色G、蓝色B。
图2为常用的彩色滤镜阵列的结构示意图二。如图2所示，彩色滤镜阵列包括1个彩色滤镜单元。彩色滤镜单元包括64个颜色滤镜。彩色滤镜单元为4通道RGYB，即彩色滤镜单元由4种颜色的颜色滤镜构成。其中，彩色滤镜单元中的64个颜色滤镜被划分为4个滤镜组，且每个滤镜组包括16个颜色滤镜。4个滤镜组呈两行两列排布，每个滤镜组中的16个颜色滤镜呈4行4列排布。第一个滤镜组中的颜色滤镜的颜色为红色R、第二个滤镜组中的颜色滤镜的颜色为绿色G、第三个滤镜组中的颜色滤镜的颜色为黄色Y、第四个滤镜组中的颜色滤镜的颜色为蓝色B。
综上,由于常用的彩色滤镜阵列由多个相同的彩色滤镜单元构成,且每个彩色滤镜单元通常为3通道或者4通道,相当于,彩色滤镜阵列中的采样点仅为三个或四个,采样点较少,导致通过常用的图像传感器仅能做粗略的颜色还原。此外,采样点较少,会导致同色异谱问题越发严重,为白平衡系数和颜色转换矩阵的计算带来挑战。
为了解决上述技术问题,相关技术提供了一种图像传感器。该图像传感器在常用的图像传感器的基础上,增加了光栅器件。这样,在图像传感器成像的同时,还可以通过光栅器件对入射光进行分光,实现光谱测量,以及通过光谱测量结果辅助图像传感器的颜色还原过程以及白平衡系数和颜色转换矩阵的计算。
显然,通过光栅器件,增加了采样点,提高了图像传感器的颜色还原精度,减少了同色异谱问题的出现,降低了白平衡系数和颜色转换矩阵的计算难度。
然而,通过光栅器件虽然能够增加采样点,但却在常用结构的基础上增加了光栅器件。由于光栅器件尺寸较大、价格较高,因此不利于图像传感器的小型化和低成本普及。另外,在振动情况下,光栅器件的可靠性会受到影响,不符合手机等移动终端设备的使用场景。此外,光栅器件进行一次光谱测量的耗时较长,导致图像传感器的颜色还原和成像效率变低。
为了解决上述技术问题,本申请提供了一种成像设备。图3为本申请实施例提供的成像设备的结构示意图。如图3所示,该成像设备包括:图像传感器310、存储器320、图像处理器330、I/O接口340和显示器350。其中:
图像传感器310还可以称为成像传感器,成像传感器是感测和传递图像信息的传感器。图像传感器310可以用于电子成像设备中,包括数码相机、相机模块、医用成像设备,例如热成像设备、雷达、声纳等夜视设备。例如,图像传感器310可以是互补金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS)或N型金属氧化物半导体(N-type Metal Oxide Semiconductor,NMOS)技术中的有源像素传感器,并且本发明的实施例并不限于此,例如,图像传感器310还可以是电荷耦合装置(Charge-Coupled Device,CCD)等。
图4为本申请实施例提供的图像传感器的结构示意图,如图4所示,图像传感器310包括像素阵列410、覆盖在像素阵列410上的彩色滤镜阵列420和读出电路430。其中:
彩色滤镜阵列420包括多个彩色滤镜单元,多个彩色滤镜单元包括至少一个第一彩色滤镜单元和至少一个第二彩色滤镜单元,每个第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜,每个第二彩色滤镜单元包括多个基本颜色滤镜,扩展颜色滤镜的颜色与基本颜色滤镜的颜色不同。
扩展颜色滤镜用于对入射光进行滤波（即使光线中位于扩展颜色滤镜的通带范围内的光线通过），得到第一光信号。第一光信号（即经过扩展颜色滤镜的光信号）用于光谱测量。
基本颜色滤镜用于对入射光信号进行滤波(即使光线中位于基本颜色滤镜的通带范围内的光线通过),得到第二光信号。第二光信号(即经过基本颜色滤镜的光信号)用于成像或者用于成像和光谱测量。
像素阵列410包括多个像素,其中,被扩展颜色滤镜覆盖的像素称为第一像素,被基本颜色滤镜覆盖的像素称为第二像素。第一像素用于对第一光信号进行光电转换,得到第一电信号,第二像素用于对第二光信号进行光电转换,得到第二电信号。
读出电路430用于从第一像素中读出第一电信号和从第二像素中读出第二电信号。
图像处理器330用于获得第一电信号和第二电信号,根据第一电信号确定当前帧的第一光谱测量数据,根据第二电信号确定当前帧的成像数据。图像处理器330还可以根据当前帧的第一光谱数据和成像数据确定当前帧的白平衡系数和/或颜色和转换矩阵和/或对当前帧中的对象进行诊断和/或分类和/或识别。
存储器320用于存储对应于图像传感器310所输出的信号的数据。显示器350用于根据与从图像传感器310输出或存储于存储器320中信号对应的数据显示图像和/或光谱。I/O接口340用于与其它电子装置通信,例如,移动电话、智能手机、平板手机、平板计算机或个人计算机。
除了以上成像设备之外,本领域的技术人员应理解,本文所揭示的技术还适用于具有成像功能的其它电子装置,例如,移动电话、智能手机、平板手机、平板计算机、个人助理等。
由上可知,由于第一彩色滤镜单元包括基本颜色滤镜和扩展颜色滤镜,第二彩色滤镜单元包括多个基本颜色滤镜,且扩展颜色滤镜的颜色与基本颜色滤镜的颜色不同,因此,第一彩色滤镜单元中颜色滤镜的颜色组合与第二彩色滤镜单元中颜色滤镜的组合不同,即彩色滤镜阵列包括两种类型的彩色滤镜单元,相比于常用的由多个相同的彩色滤镜单元构成的彩色滤镜阵列,增加了采样点,提高了颜色还原的精度,降低了同色异谱问题的出现,降低了白平衡系数和颜色转换矩阵的计算难度。另外,相比于相关技术,由于对彩色滤镜阵列进行了改进,未增加任何额外的器件,有利于图像传感器的小型化和低成本普及,同时也使得图像传感器更便于应用在手机等移动终端设备的使用场景。此外,由于经过基本颜色滤镜的光信号至少用于成像,经过扩展颜色滤镜的光信号用于光谱测量,因此,通过一次曝光可以同时进行成像和光谱测量,缩短了光谱测量耗时,进而提高了图像传感器的颜色还原和成像效率。
图5为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图一。如图5所示,彩色滤镜阵列420包括多个彩色滤镜单元,多个彩色滤镜单元包括至少一个第一彩色滤镜单元501和至少一个第二彩色滤镜单元502,每个第一彩色滤镜单元501包括基本颜色滤镜和扩展颜色滤镜,每个第二彩色滤镜单元502包括多个基本颜色滤镜。
在彩色滤镜阵列中彩色滤镜单元的数量越多，分辨率就越高，反之亦然。彩色滤镜单元中颜色滤镜的颜色种类越多，彩色滤镜单元的通道数就越高，反之亦然。彩色滤镜单元中颜色滤镜的数量越多，在彩色滤镜单元中才能够包括更多种颜色的颜色滤镜。在彩色滤镜阵列中颜色滤镜的数量一定的情况下，彩色滤镜单元的数量越多，彩色滤镜单元中颜色滤镜的数量越少，反之亦然。因此，可以根据对分辨率和彩色滤镜单元的通道数的要求，设置彩色滤镜单元的数量和彩色滤镜单元中颜色滤镜的数量。
需要说明的是,若彩色滤镜单元为第一彩色滤镜单元501,则彩色滤镜单元中颜色滤镜的数量指其中的基本颜色滤镜和扩展颜色滤镜的数量总和。若彩色滤镜单元为第二彩色滤镜单元502,则彩色滤镜单元中颜色滤镜的数量指其中的基本颜色滤镜的数量。
第一彩色滤镜单元501中颜色滤镜的数量与第二彩色滤镜单元502中颜色滤镜的数量可以相同,也可以不同。
由于经过扩展颜色滤镜的光信号用于光谱测量,经过基本颜色滤镜的光信号至少用于成像,而第二彩色滤镜单元502中的颜色滤镜均为基本颜色滤镜,第一彩色滤镜单元501中的颜色滤镜包括基本颜色滤镜和扩展颜色滤镜,因此,可以根据成像效果和光谱测量需求,确定多个彩色滤镜单元中第一彩色滤镜单元501的数量、第二彩色滤镜单元502的数量和第一彩色滤镜单元501中扩展颜色滤镜的数量。第一彩色滤镜单元501中扩展颜色滤镜的数量为至少一个。
例如,每个第一彩色滤镜单元501中基本颜色滤镜的数量为3个,每个第一彩色滤镜单元501中扩展颜色滤镜的数量为1个,每个第二彩色滤镜单元502中基本颜色滤镜的数量为4个。
第二彩色滤镜单元502中基本颜色滤镜的颜色和颜色种类数量根据成像需求确定。第二彩色滤镜单元502中基本颜色滤镜的颜色种类数量例如可以为3或4或者更多等,本申请对此不作特殊限定。第二彩色滤镜单元502中基本颜色滤镜的颜色例如可以包括红色、绿色、蓝色、黄色、品红、近红外等中的三种或者四种或者更多等,本申请对此不作特殊限定。
第二彩色滤镜单元502中基本颜色滤镜的颜色种类的数量等于或者小于第二彩色滤镜单元502中基本颜色滤镜的数量。具体的,若第二彩色滤镜单元502中基本颜色滤镜的颜色种类的数量等于第二彩色滤镜单元502中基本颜色滤镜的数量,则第二彩色滤镜单元502中不同基本颜色滤镜的颜色不同,若第二彩色滤镜单元502中基本颜色滤镜的颜色种类的数量小于第二彩色滤镜单元502中基本颜色滤镜的数量,则第二彩色滤镜单元502中部分基本颜色滤镜的颜色相同。
第二彩色滤镜单元502的通道数等于其中的基本颜色滤镜的颜色种类数量。例如,若第二彩色滤镜单元502中基本颜色滤镜的颜色包括红色、绿色、蓝色,则第二彩色滤镜单元502的通道数为3,若第二彩色滤镜单元502中基本颜色滤镜的颜色包括红色、黄色、绿色、蓝色,则第二彩色滤镜单元502的通道数为4。
第一彩色滤镜单元501中基本颜色滤镜为第二彩色滤镜单元502中多个基本颜色滤镜中的一部分。不同的第一彩色滤镜单元501中基本颜色滤镜的颜色可以完全不同、也可以不完全相同、也可以完全相同。例如，第二彩色滤镜单元502中基本颜色滤镜的数量为4个，且该四个基本颜色滤镜的颜色分别为：红色、绿色、蓝色、黄色。若第一彩色滤镜单元501中基本颜色滤镜的数量为3个，则每个第一彩色滤镜单元501中的3个基本颜色滤镜的颜色可以分别为：红色、绿色、蓝色，显然，不同的第一彩色滤镜单元501中基本颜色滤镜的颜色相同。若第一彩色滤镜单元501中基本颜色滤镜的数量为2个，则一部分第一彩色滤镜单元501中的2个基本颜色滤镜的颜色可以分别为：红色、绿色，一部分第一彩色滤镜单元501中的两个基本颜色滤镜的颜色可以分别为：蓝色、黄色，一部分第一彩色滤镜单元501中的两个基本颜色滤镜的颜色可以分别为：绿色、黄色，显然，不同的第一彩色滤镜单元501中基本颜色滤镜的颜色可以完全相同或者完全不同或者部分相同。
由于基本颜色滤镜的颜色与扩展颜色滤镜的颜色不同,因此,扩展颜色滤镜的颜色为除基本颜色滤镜的颜色之外的任何颜色,具体的,可以根据光谱测量需求确定。
同一个第一彩色滤镜单元501中的不同扩展颜色滤镜的颜色可以完全相同、也可以完全不同、还可以不完全相同。不同的第一彩色滤镜单元501中扩展颜色滤镜的颜色可以完全相同、也可以完全不同、也可以不完全相同。
例如,第一彩色滤镜单元501包括一个扩展颜色滤镜,不同的第一彩色滤镜单元501中的扩展颜色滤镜的颜色不同。再例如,第一彩色滤镜单元501包括多个扩展颜色滤镜,同一个第一彩色滤镜单元501中的不同扩展颜色滤镜的颜色完全不同,不同的第一彩色滤镜单元501中的扩展颜色滤镜的颜色完全不同。显然,在上述两个例子中,针对彩色滤镜阵列,不同的扩展颜色滤镜具有不同的颜色。
在彩色滤镜阵列中,扩展颜色滤镜的颜色种类越多,采样点就越多。基于此,在上述两个例子中,由于不同的扩展颜色滤镜具有不同的颜色,因此进一步增加了采样点的数量,提高了颜色还原的精度,降低了同色异谱问题的出现,降低了白平衡系数和颜色还原矩阵的计算难度。
第一彩色滤镜单元501的通道数为第一彩色滤镜单元501中基本颜色滤镜和扩展颜色滤镜的颜色种类数量。第一彩色滤镜单元501的通道数与第二彩色滤镜单元502的通道数可以相同,也可以不同。
至少一个第一彩色滤镜单元501和至少一个第二彩色滤镜单元502呈N行M列排布。其中,N和M可以相同,也可以不相同,且N和M均为大于或者等于1的整数。第一彩色滤镜单元501可以规律的分布在N行M列中,第一彩色滤镜单元501也可以随机的分布在N行M列中,本申请对此不作特殊限定。
扩展颜色滤镜可以位于其所属的第一彩色滤镜单元501中的任意位置,扩展颜色滤镜还可以位于其所属的第一彩色滤镜单元501中的固定位置。
需要说明的是,还需记录扩展颜色滤镜的位置信息,以便于后续数据处理。具体的,若扩展颜色滤镜规律的分布在彩色滤镜阵列中,则记录第一个扩展颜色滤镜的位置和扩展颜色滤镜的分布规律,这样就可以根据第一个扩展颜色滤镜的位置和分布规律确定每个扩展颜色滤镜的位置。若扩展颜色滤镜随机分布在彩色滤镜阵列中,则需记录每个扩展颜色滤镜的位置。
图6为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图二。如图6所示,彩色滤镜阵列包括16个彩色滤镜单元。其中,16个彩色滤镜单元包括4个第一彩色滤镜单元601和12个第二彩色滤镜单元602。每个第一彩色滤镜单元601包括3个基本颜色滤镜和1个扩展颜色滤镜。每个第二彩色滤镜单元602包括4个基本颜色滤镜。
每个第二彩色滤镜单元602中的4个基本颜色滤镜的颜色分别为RGGB（红绿绿蓝）。每个第一彩色滤镜单元601中的3个基本颜色滤镜的颜色分别为RGB（红绿蓝）。不同的第一彩色滤镜单元601中的扩展颜色滤镜的颜色不同，且4个第一彩色滤镜单元601中的扩展颜色滤镜的颜色分别为：青色、橙色、紫色、灰色。
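图6所示的彩色滤镜阵列布局可用如下示意性Python代码草图构造（第一彩色滤镜单元在阵列中的位置、以及各扩展颜色的对应关系仅为本示例假设）：

```python
import numpy as np

# 基本颜色滤镜: R/G/B; 扩展颜色滤镜(示例假设): C=青色, O=橙色, P=紫色, X=灰色
BASE_UNIT = np.array([["R", "G"],
                      ["G", "B"]])  # 第二彩色滤镜单元(RGGB)

def first_unit(ext):
    # 第一彩色滤镜单元: 3个基本颜色滤镜(RGB) + 1个扩展颜色滤镜
    return np.array([["R", "G"],
                     [ext, "B"]])

def build_cfa():
    # 4x4个彩色滤镜单元, 其中4个为第一彩色滤镜单元(位置仅为示例假设)
    ext_units = {(0, 1): "C", (1, 3): "O", (2, 0): "P", (3, 2): "X"}
    rows = []
    for i in range(4):
        row = [first_unit(ext_units[(i, j)]) if (i, j) in ext_units else BASE_UNIT
               for j in range(4)]
        rows.append(np.hstack(row))
    return np.vstack(rows)  # 8x8 颜色滤镜阵列

cfa = build_cfa()
```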
由上可知,由于第二彩色滤镜单元602包括3种颜色的基本颜色滤镜,因此,第二彩色滤镜单元602的通道数为3。第一彩色滤镜单元601包括3种颜色的基本颜色滤镜和1种颜色的扩展颜色滤镜,因此,第一彩色滤镜单元601的通道数为4。
图7为本申请实施例提供的图像传感器中的彩色滤镜阵列的结构示意图三。图7中的彩色滤镜阵列与图6中的彩色滤镜阵列的区别在于:每个第二彩色滤镜单元702中的4个基本颜色滤镜的颜色分别为RGYB(红绿黄蓝)。每个第一彩色滤镜单元701中的3个基本颜色滤镜的颜色分别为RGB(红绿蓝)。
由上可知,由于第二彩色滤镜单元702包括4种颜色的基本颜色滤镜,因此,第二彩色滤镜单元702的通道数为4。第一彩色滤镜单元701包括3种颜色的基本颜色滤镜和1种颜色的扩展颜色滤镜,因此,第一彩色滤镜单元701的通道数为4。
需要说明的是,上述图6和图7中对彩色滤镜阵列的说明仅为示例性的,并不用于限定本申请。
由于常用的图像传感器的动态范围受限,因此,不能同时兼顾亮度较高的对象和亮度较低的对象的动态范围,导致成像效果差。典型的一个由于动态范围受限,而导致成像效果差的场景如下:
在一个拍摄场景中同时出现光源和物体,由于光源的亮度较高,物体的亮度较低,因此,若对准光源聚焦,则在图像中可以清晰的显示光源的细节,但是物体的细节会被丢失,若对准物体聚焦,则在图像中可以清晰的显示物体的细节,但是光源会出现过度曝光的现象。
为了解决该技术问题,本申请对彩色滤镜阵列中的扩展颜色滤镜进行了改进。改进的原理如下:
由于通过减少像素的光子积累，能够避免过曝光，通过增加像素的光子积累，能够提高信噪比，避免丢失物体细节。又由于像素的有效面积、曝光时间和覆盖像素的颜色滤镜的厚度中的至少一种的改变，会改变像素的光子积累情况。因此，本申请可以将彩色滤镜阵列中的一部分扩展颜色滤镜作为第一扩展颜色滤镜，将另一部分扩展颜色滤镜作为第二扩展颜色滤镜。然后，根据图像传感器所应用的场景中最亮对象的动态范围和最暗对象的动态范围确定图像传感器的动态范围。示例性的，对最亮对象的动态范围和最暗对象的动态范围求并集，将并集的最大值确定为图像传感器的动态范围的上限，将并集的最小值确定为图像传感器的动态范围的下限。最后，根据图像传感器的动态范围的上限确定第一扩展颜色滤镜的厚度、被第一扩展颜色滤镜覆盖的像素的有效面积、被第一扩展颜色滤镜覆盖的像素的曝光时间中的至少一种，根据图像传感器的动态范围的下限确定第二扩展颜色滤镜的厚度、被第二扩展颜色滤镜覆盖的像素的曝光时间、被第二扩展颜色滤镜覆盖的像素的有效面积中的至少一种。
扩展颜色滤镜的厚度与像素的光子积累效果成负相关关系。像素的曝光时间与像素的光子积累效果成正相关关系,像素的有效面积与像素的光子积累效果成正相关关系。
减小像素的有效面积的方式包括:缩小填充因子和半遮挡等方式。
由上可知，可以根据应用场景确定图像传感器的动态范围，以及根据图像传感器的动态范围的上限和下限确定第一扩展颜色滤镜、被第一扩展颜色滤镜覆盖的像素、第二扩展颜色滤镜、被第二扩展颜色滤镜覆盖的像素的相关参数，使得图像传感器的动态范围不受限制，以同时兼顾亮度较高的对象和亮度较低的对象的动态范围，提升成像效果。
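上述对最亮对象与最暗对象的动态范围求并集、进而确定传感器动态范围上下限的过程，可示意如下（亮度单位、数值以及"曝光时间与亮度上/下限成反比"的换算关系均仅为示例假设，并非本申请限定的参数确定方式）：

```python
def sensor_dynamic_range(brightest_range, darkest_range):
    # 对最亮对象与最暗对象的动态范围求并集:
    # 并集的最大值为动态范围上限, 最小值为下限
    lo = min(brightest_range[0], darkest_range[0])
    hi = max(brightest_range[1], darkest_range[1])
    return lo, hi

lo, hi = sensor_dynamic_range((1e2, 1e6), (1e-1, 1e3))

# 示例假设: 曝光时间与需覆盖的亮度界限成反比
t_first = 1.0 / hi   # 第一扩展颜色滤镜像素: 按上限确定(避免过曝)
t_second = 1.0 / lo  # 第二扩展颜色滤镜像素: 按下限确定(提高信噪比)
```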
下来,结合图8对图像处理器的工作流程进行说明。如图8所示,图像处理器的工作流程包括以下步骤:
801、获得第一电信号和第二电信号。
示例性的,图像传感器从读出电路读出的信号中获得第一电信和第二电信号。
获得第一电信号和第二电信号的过程例如可以为:根据扩展颜色滤镜在彩色滤镜阵列中的位置,获得第一电信号,根据基本颜色滤镜在彩色滤镜阵列中的位置,获得第二电信号。具体的,根据扩展颜色滤镜在彩色滤镜阵列中的位置,确定第一像素在像素阵列中的位置,根据第一像素在像素阵列中的位置,从读出电路从像素阵列中读出的电信号中获得第一电信号。同理,根据基本颜色滤镜在彩色滤镜阵列中的位置,确定第二像素在像素阵列中的位置,根据第二像素在像素阵列中的位置,从读出电路从像素阵列中读出的电信号中获得第二电信号。
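上述按扩展颜色滤镜/基本颜色滤镜在彩色滤镜阵列中的位置分别获得第一电信号与第二电信号的过程，可用如下示意性代码表示（掩膜布局与数值仅为示例假设）：

```python
import numpy as np

def split_signals(readout, ext_mask):
    # readout: 读出电路读出的全部像素电信号(2D阵列)
    # ext_mask: 扩展颜色滤镜在阵列中的位置(True表示第一像素)
    first = readout[ext_mask]    # 第一电信号(被扩展颜色滤镜覆盖的像素)
    second = readout[~ext_mask]  # 第二电信号(被基本颜色滤镜覆盖的像素)
    return first, second

readout = np.arange(16.0).reshape(4, 4)
ext_mask = np.zeros((4, 4), dtype=bool)
ext_mask[1, 1] = True  # 示例假设: 阵列中仅一个扩展颜色滤镜
first, second = split_signals(readout, ext_mask)
```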
802、根据第一电信号确定当前帧的第一光谱测量数据,根据第二电信号确定当前帧的成像数据。
示例性的,根据第二电信号确定当前帧的成像数据的过程可以如下:
将从第二像素中读取的第二电信号从电信号转换为像素值,得到第二像素的像素值。将第二像素的像素值和第二像素在像素阵列中的位置确定为当前帧的成像数据。
在本申请的另一可能的实现方式中,在得到成像数据后,还可以对成像数据进行预处理,此处的预处理包括但不限于黑电平校正、镜头校正、坏点补偿等中的一种或多种。将预处理后的成像数据确定为步骤802中得到的成像数据。
由上可知,即当前帧的成像数据可以为预处理前的成像数据,也可以为预处理后的成像数据。
示例性的,根据第一电信号确定当前帧的第一光谱测量数据的方式可以包括以下四种:
第一种,将从第一像素中读取的第一电信号从电信号转换为像素值,得到第一像素的像素值。将第一像素的像素值和第一像素在像素阵列中的位置确定为当前帧的第一光谱测量数据。
第二种,首先,根据当前帧的成像数据和历史帧的成像数据确定当前帧相对于历史帧的运动信息。
历史帧为位于当前帧之前的任意帧。例如,历史帧为位对于当前帧之前且与当前帧相邻的视频帧。历史帧的成像数据可以为原始成像数据,原始成像数据包括拍摄历史帧时获得的第二像素的像素值和第二像素在像素阵列中的位置,历史帧的成像数据还可以为对历史帧的原始成像数据进行上述预处理和/或经过白平衡系数处理和/或进行颜色转换矩阵处理后的数据。
确定当前帧相对于历史帧的运动信息的过程可以包括：根据当前帧的成像数据中第二像素的像素值和第二像素在像素阵列中的位置以及历史帧的成像数据中第二像素在像素阵列中的位置和第二像素的像素值，计算指示同一要素的第二像素在当前帧与历史帧中的位置差，对所有指示同一要素的第二像素在当前帧与历史帧中的位置差求平均值，将该平均值确定为当前帧相对于历史帧的运动信息。要素包括但不限于语义和特征等。需要说明的是，上述确定当前帧相对于历史帧的运动信息的方式仅为示例性的，并不用于限定本申请。
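上述对指示同一要素的第二像素求平均位置差得到运动信息的过程，可示意如下（像素位置与数值仅为示例假设）：

```python
def motion_info(cur_pos, hist_pos):
    # cur_pos/hist_pos: 指示同一要素的第二像素在当前帧/历史帧中的位置 [(row, col), ...]
    # 对所有位置差求平均值, 作为当前帧相对于历史帧的运动信息
    n = len(cur_pos)
    dr = sum(c[0] - h[0] for c, h in zip(cur_pos, hist_pos)) / n
    dc = sum(c[1] - h[1] for c, h in zip(cur_pos, hist_pos)) / n
    return dr, dc

mv = motion_info([(2, 3), (4, 6)], [(1, 2), (3, 5)])
```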
然后,根据第一电信号确定当前帧的第二光谱测量数据。此处的第二光谱测量数据可以理解为第一种方式中的第一光谱测量数据。
最后,根据当前帧相对于历史帧的运动信息和历史帧的光谱测量数据对当前帧的第二光谱测量数据进行修正,以得到当前帧的第一光谱测量数据。
历史帧的光谱测量数据可以为历史帧的原始光谱测量数据,原始光谱测量数据包括拍摄历史帧时获得的第一像素的像素值和第一像素在像素阵列中的位置。历史帧的光谱测量数据还可以为对原始光谱测量数据进行处理后得到的数据,此处的处理包括但不限于通过历史帧之前的视频帧的光谱测量数据对历史帧的原始光谱测量数据进行修正、对历史帧的原始光谱测量数据进行黑电平校正、镜头校正、坏点补偿等。
对当前帧的第二光谱测量数据进行修正的过程例如可以为:根据运动信息,将历史帧与当前帧中指示同一要素的第一像素对齐,并对对齐的第一像素在历史帧和当前帧中的像素值求平均值,以及将该平均值确定为第一像素在当前帧中的修正像素值,将当前帧的第二光谱测量数据中对齐的第一像素的像素值替换为修正像素值,即可得到当前帧的第一光谱测量数据。需要说明的是,上述关于对当前帧的第二光谱测量数据进行修正的过程仅为示例性的,并不用于限定本申请。
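上述按运动信息对齐第一像素并对两帧像素值取平均的修正过程，可示意如下（数据结构与数值仅为示例假设）：

```python
def correct_spectral(cur, hist, motion):
    # cur/hist: 当前帧/历史帧第一像素的光谱测量数据 {位置: 像素值}
    # motion: (dr, dc) 当前帧相对于历史帧的运动信息
    dr, dc = motion
    corrected = dict(cur)
    for (r, c), v in hist.items():
        key = (r + dr, c + dc)  # 对齐后在当前帧中的位置
        if key in corrected:
            # 对齐的第一像素在两帧中的像素值取平均, 作为修正像素值
            corrected[key] = (corrected[key] + v) / 2.0
    return corrected

cur = {(2, 2): 100.0, (2, 6): 80.0}
hist = {(1, 1): 90.0, (5, 5): 70.0}
out = correct_spectral(cur, hist, (1, 1))
```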
第三种,根据第二电信号中的至少一部分和第一电信号生成第一光谱测量数据。
第二电信号中的至少一部分指从至少一部分第二像素中读取的第二电信号。至少一部分第二像素指全部第二像素或一部分第二像素。一部分第二像素例如可以为位于第一像素周边的第二像素。
将从第一像素读取的第一电信号从电信号转换为像素值,得到第一像素的像素值。将从至少一部分第二像素中读取的第二电信号从电信号转换为像素值,以得到至少一部分第二像素的像素值。将第一像素的像素值和第一像素在像素阵列中的位置、至少一部分第二像素的像素值和至少一部分第二像素在像素阵列中的位置确定为第一光谱测量数据。
第四种,首先,根据当前帧的成像数据和历史帧的成像数据确定当前帧相对于历史帧的运动信息。由于该过程已经在上文中进行了说明,因此此处不再赘述。
然后,根据第二电信号中的至少一部分和第一电信号生成当前帧的第二光谱测量数据。此处的第二光谱数据可以理解为第三种方式中的第一光谱测量数据。
最后,根据运动信息和历史帧的光谱测量数据对当前帧的第二光谱测量数据进行修正,以得到当前帧的第一光谱测量数据。历史帧的光谱测量数据可以为历史帧的原始光谱测量数据,原始光谱测量数据包括拍摄历史帧时获得的第一像素的像素值和第一像素在像素阵列中的位置和至少一部分第二像素的像素值和至少一部分第二像素在像素阵列中的位置。历史帧的光谱测量数据还可以为对原始光谱测量数据进行处理后得到的数据,此处的处理包括但不限于通过历史帧之前的视频帧的光谱测量数据对历史帧的原始光谱数据进行修正、对历史帧的原始光谱测量数据进行黑电平校正、镜头校正、坏点补偿等。
由于根据运动信息和历史帧的光谱测量数据对当前帧的第二光谱测量数据进行修正的原理已经在上文中说明，因此此处不再赘述。
在本申请的另一可能的实现方式中,在得到第一光谱测量数据后,还可以对第一光谱测量数据进行预处理,此处的预处理包括但不限于黑电平校正、镜头校正、坏点补偿等中的一种或者多种。将预处理后的第一光谱测量数据确定为步骤802中得到的第一光谱测量数据。
由上可知,当前帧的第一光谱测量数据可以为预处理前的第一光谱测量数据,也可以为预处理后的第一光谱测量数据。
需要说明的是，在当前帧的第一光谱测量数据为预处理后的第一光谱测量数据的情况下，针对方式二和方式四，还可以将预处理的过程提前至得到第二光谱测量数据后执行，即在得到第二光谱测量数据后，对第二光谱测量数据进行上述预处理，以及根据预处理后得到的第二光谱测量数据进行后续处理。
由于在上述方式二和方式四中,通过历史帧的光谱测量数据对当前帧的第二光谱测量数据进行修正后,以得到当前帧的第一光谱测量数据,即通过多帧降噪的方式对当前帧的第二光谱测量数据进行了修正,提升了第一光谱测量数据的准确率,提升了颜色还原的精度,提升了白平衡系数和颜色转换精度的计算精度。
综上,通过一次曝光能够同时获得当前帧的成像数据和第一光谱测量数据,缩短了数据采集的耗时。此外,由于该图像数据获取方法应用的图像传感器的采样点得到了增加,因此,提升颜色还原的精度,降低白平衡系数和颜色转换精度的计算难度。
在获得当前帧的第一光谱测量数据后,第一光谱测量数据的应用场景包括以下两种:
第一种,根据当前帧的第一光谱测量数据和成像数据计算白平衡系数和/或颜色转换矩阵,以通过白平衡系数和/或颜色转换矩阵对当前帧的成像数据进行处理,以提升颜色还原精度,进而提升显示效果。具体执行步骤如下:
803、基于当前帧的成像数据,对当前帧进行要素分割,得到当前帧中的至少一个对象。
要素包括但不限于语义、特征等。要素分割例如可以通过神经网络形成的模型来实现。对象包括但不限于人、动物、花、草、云朵、蓝天、桌子、房屋、人脸、白色块、灰色块等。
804,基于至少一个对象的颜色分布,确定当前帧的原始光源光谱。具体的,根据当前帧的成像数据以及每个对象的位置,分析至少一个对象的颜色分布,根据颜色分布,评估当前帧的原始光源色温,根据原始光源色温确定原始光源光谱。
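作为示意，下述草图采用灰度世界（gray-world）假设由颜色分布粗估光源的RGB分量，这仅是一种可能的实现示例，并非本申请限定的原始光源色温/光谱评估方法：

```python
def estimate_illuminant(pixels):
    # 示例假设: 灰度世界方法, 以各通道均值作为光源分量的粗估
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

illum = estimate_illuminant([(200.0, 100.0, 50.0), (100.0, 100.0, 150.0)])
```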
805,在当前帧的第一光谱测量数据中确定每个对象的光谱测量数据,以及根据每个对象的光谱测量数据对原始光源光谱进行修正,以得到目标光源光谱。
确定对象的光谱测量数据的过程可以为:确定对象的边界像素的位置,根据当前帧的第一光谱测量数据中像素的位置,将第一光谱测量数据中位于对象的边界像素内的像素的像素值和位置确定为对象的光谱测量数据。
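上述在当前帧的第一光谱测量数据中按对象边界筛选出对象光谱测量数据的过程，可示意如下（此处以矩形区域近似对象边界，仅为示例假设；实际边界可为任意形状）：

```python
def object_spectral_data(first_spec, region):
    # first_spec: 第一光谱测量数据 {位置: 像素值}
    # region: 对象边界像素围成的矩形区域 (rmin, rmax, cmin, cmax)
    rmin, rmax, cmin, cmax = region
    return {(r, c): v for (r, c), v in first_spec.items()
            if rmin <= r <= rmax and cmin <= c <= cmax}

spec = {(0, 0): 1.0, (2, 3): 2.0, (5, 5): 3.0}
obj = object_spectral_data(spec, (1, 4, 1, 4))
```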
得到目标光源光谱的过程可以为:根据每个对象的光谱测量数据,分析至少一个对象的颜色分布,以及根据颜色分布确定修正色温,根据修正色温确定修正光谱。根据修正光谱对原始光源光谱进行修正,得到目标光源光谱。
光源光谱用于指示光源中的各个波长的光度量或辐射度量的分布。
上述关于目标光源光谱的确定过程仅为示例性的，并不用于限定本申请。例如，还可以在确定原始光源色温后，确定修正色温，然后，根据修正色温对原始光源色温进行修正，得到目标光源色温，最后，根据目标光源色温确定目标光源光谱。再例如，还可以在得到当前帧中的至少一个对象后，确定每个对象的成像数据和每个对象的光谱测量数据，然后，根据每个对象的光谱测量数据对对应对象的成像数据进行修正，得到每个对象的目标成像数据，最后，根据每个对象的目标成像数据，确定至少一个对象的颜色分布，根据颜色分布评估目标光源色温，根据目标光源色温确定目标光源光谱。
806,根据目标光源光谱确定当前帧的白平衡系数和/或颜色转换矩阵。
807，根据当前帧的白平衡系数和/或颜色转换矩阵，对当前帧的成像数据进行处理。
示例性的,若当前帧为RGB图像,即图像传感器中彩色滤镜阵列中的第二彩色滤镜单元为3通道RGB,则通过白平衡系数对成像数据进行处理的过程如下:
R = r × Rg，G = g × Gg，B = b × Bg
其中,R为被红色的基本颜色滤镜覆盖的像素处理后的像素值,G为被绿色的基本颜色滤镜覆盖的像素处理后的像素值,B为被蓝色的基本颜色滤镜覆盖的像素处理后的像素值,Rg为被红色的基本颜色滤镜覆盖的像素的像素值,Gg为被绿色的基本颜色滤镜覆盖的像素的像素值,Bg为被蓝色的基本颜色滤镜覆盖的像素的像素值,r、g、b为白平衡系数。
通过颜色转换矩阵对成像数据进行处理的原理如下:
sR = a × Rg + b × Gg + c × Bg，sG = d × Rg + e × Gg + f × Bg，sB = g × Rg + h × Gg + i × Bg
其中,sR为被红色的基本颜色滤镜覆盖的像素处理后的像素值,sG为被绿色的基本颜色滤镜覆盖的像素处理后的像素值,sB为被蓝色的基本颜色滤镜覆盖的像素处理后的像素值,Rg为被红色的基本颜色滤镜覆盖的像素的像素值,Gg为被绿色的基本颜色滤镜覆盖的像素的像素值,Bg为被蓝色的基本颜色滤镜覆盖的像素的像素值,a、b、c、d、e、f、g、h、i为颜色转换矩阵中的各个参数。
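上述白平衡系数与颜色转换矩阵的处理过程，可用如下Python代码草图表示（各系数数值仅为示例假设）：

```python
import numpy as np

def white_balance(rgb, r, g, b):
    # 按白平衡系数逐通道缩放: R = r*Rg, G = g*Gg, B = b*Bg
    return rgb * np.array([r, g, b])

def color_convert(rgb, ccm):
    # 颜色转换: [sR, sG, sB]^T = CCM(3x3) @ [Rg, Gg, Bg]^T
    return ccm @ rgb

raw = np.array([100.0, 200.0, 50.0])
balanced = white_balance(raw, 2.0, 1.0, 4.0)
ccm = np.array([[1.5, -0.3, -0.2],
                [-0.2, 1.6, -0.4],
                [-0.1, -0.5, 1.6]])  # 各行系数和为1, 保持中性色不变
converted = color_convert(balanced, ccm)
```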
需要说明的是,在本申请的其他实施例中,还可以通过上述原理获得当前帧中的局部光源光谱、局部颜色转换矩阵,以通过局部光源光谱和局部颜色转换矩阵对当前帧中的局部进行处理。
由上可知,采样点的增加,使成像数据和第一光谱测量数据体现了更多采样点的信息,这样,通过成像数据确定当前帧的原始光源光谱,以及根据每个对象的光谱测量数据对原始光源光谱进行修正后得到的目标光源光谱就体现了更多采样点的信息,提高了目标光源光谱的准确性,降低了同色异谱问题的出现,进而降低了白平衡系数和颜色转换矩阵的计算难度,提高了颜色还原的精度。
第二种,根据当前帧的第一光谱测试数据对当前帧中的对象进行诊断和/或分类和/或识别。具体执行步骤如下:
808、基于当前帧的成像数据,对当前帧进行要素分割,以得到当前帧中的至少一个对象。由于该步骤已经在上文中进行了说明,因此此处不再赘述。
809、在当前帧的第一光谱测量数据中确定每个对象的光谱测量数据,以及根据每个对象的光谱测量数据确定每个对象的光谱。
确定每个对象的光谱测量数据的过程已经在上文中进行了说明,此处不再赘述。确定对象的光谱的过程例如可以为:根据对象的光谱测量数据分析对象的颜色分布,根据颜色分布评估对象的色温,根据对象的色温确定对象的光谱。
对象的光谱指经过对象反射的各个波长的光度量或辐射度量的分布。
810、根据每个对象的光谱对对应的对象进行诊断和/或分类和/或识别。
针对对象的诊断,可以预先获得对象针对不同的诊断结果的光谱,将对象的光谱与对象针对不同诊断结果的光谱进行匹配,根据匹配结果确定对象的诊断结果。
针对对象的分类,可以预先获得对象针对不同的分类结果的光谱,将对象的光谱与对象针对不同分类的光谱进行匹配,根据匹配结果确定对象的分类结果。
针对对象的识别,可以预先获得各个对象的光谱,将对象的光谱与各个对象的光谱进行匹配,根据匹配结果确定对象的识别结果。
需要说明的是,上述光谱的匹配可以指光谱在某个数学特征(例如均值、方差、统计分布等)上的匹配。
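上述按数学特征（例如均值、方差）进行光谱匹配的过程，可示意如下（参考光谱、标签与距离度量均仅为示例假设）：

```python
def spectrum_features(spec):
    # 数学特征: 均值与方差
    n = len(spec)
    mean = sum(spec) / n
    var = sum((x - mean) ** 2 for x in spec) / n
    return mean, var

def match(spec, references):
    # references: {标签: 参考光谱}; 在特征空间中取距离最近的标签作为匹配结果
    fm, fv = spectrum_features(spec)
    def dist(ref):
        rm, rv = spectrum_features(ref)
        return (fm - rm) ** 2 + (fv - rv) ** 2
    return min(references, key=lambda k: dist(references[k]))

refs = {"healthy": [1.0, 2.0, 3.0], "diseased": [5.0, 6.0, 7.0]}
label = match([1.1, 2.0, 2.9], refs)
```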
由上可知，通过确定当前帧中的每个对象的光谱，即可根据对象的光谱对对象进行诊断和/或分类和/或识别，方式简单，易于执行。此外，也可使用户通过上述方式快速准确地对对象进行诊断和/或分类和/或识别，提升用户体验。
本申请还提供了一种图像数据获取方法,应用于上述图像传感器。该图像数据获取方法包括以下步骤:
首先,获得第一电信号和第二电信号。然后,根据所述第一电信号确定当前帧的第一光谱测量数据,根据所述第二电信号确定所述当前帧的成像数据。
其中,所述第一电信号为第一像素对第一光信号进行光电转换后得到的电信号,所述第一光信号为经过所述扩展颜色滤镜的光信号,所述第一像素为被所述扩展颜色滤镜覆盖的像素,所述第二电信号为第二像素对第二光信号进行光电转换后得到的电信号,所述第二光信号为经过所述基本颜色滤镜的光信号,所述第二像素为被所述基本颜色滤镜覆盖的像素。
在一种可能的实现方式中,所述根据所述第一电信号确定当前帧的第一光谱测量数据包括:根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息;根据所述第一电信号确定所述当前帧的第二光谱测量数据;根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正,以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述根据所述第一电信号确定当前帧的第一光谱测量数据包括:根据所述第二电信号中的至少一部分和所述第一电信号生成所述第一光谱测量数据。
在一种可能的实现方式中,所述根据所述第一电信号确定当前帧的第一光谱测量数据包括:根据所述当前帧的成像数据和历史帧的成像数据确定所述当前帧相对于所述历史帧的运动信息;根据所述第二电信号中的至少一部分和所述第一电信号生成所述当前帧的第二光谱测量数据;根据所述运动信息和所述历史帧的光谱测量数据对所述第二光谱测量数据进行修正,以得到所述第一光谱测量数据。
在一种可能的实现方式中,所述获得第一电信号和第二电信号包括:根据所述扩展颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第一电信号;根据所述基本颜色滤镜在所述彩色滤镜阵列中的位置,获得所述第二电信号。
在一种可能的实现方式中，所述方法还包括：基于所述当前帧的成像数据，对所述当前帧进行要素分割，以得到所述当前帧中的至少一个对象；基于所述至少一个对象的颜色分布，确定所述当前帧的原始光源光谱；在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据，以及根据每个所述对象的光谱测量数据对所述原始光源光谱进行修正，以得到目标光源光谱；根据所述目标光源光谱确定所述当前帧的白平衡系数和/或颜色转换矩阵；根据所述当前帧的白平衡系数和/或颜色转换矩阵，对所述当前帧的成像数据进行处理。
在一种可能的实现方式中,所述方法还包括:基于所述当前帧的成像数据,对所述当前帧进行要素分割,以得到所述当前帧中的至少一个对象;在所述当前帧的第一光谱测量数据中确定每个所述对象的光谱测量数据,以及根据每个所述对象的光谱测量数据确定每个所述对象的光谱;根据每个所述对象的光谱对对应的所述对象进行诊断和/或分类和/或识别。
本申请的上述种图像数据获取方法的实现原理和技术效果已经在上文中进行了说明,此处不再赘述。
本申请还提供一种计算机可读存储介质,包括计算机程序,所述计算机程序在计算机上被执行时,使得所述计算机执行上述中的任一种方法实施例的技术方案。
本申请还提供一种计算机程序,当所述计算机程序被计算机或处理器执行时,用于执行上述中的任一种方法实施例的技术方案。
A person of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented in electronic hardware or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division into units is only a logical functional division, and other divisions may be used in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some ports, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement that a person skilled in the art could readily conceive of within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (23)

  1. An image sensor, comprising: a pixel array and a color filter array covering the pixel array, wherein:
    the color filter array comprises a plurality of color filter units, the plurality of color filter units comprise at least one first color filter unit and at least one second color filter unit, each first color filter unit comprises basic color filters and an extended color filter, and each second color filter unit comprises a plurality of basic color filters;
    wherein a color of the extended color filter is different from colors of the basic color filters, a light signal passing through a basic color filter is used at least for imaging, and a light signal passing through the extended color filter is used for spectral measurement.
  2. The image sensor according to claim 1, wherein the number of basic color filters in each first color filter unit is three, the number of extended color filters in each first color filter unit is one, and the number of basic color filters in each second color filter unit is four.
  3. The image sensor according to claim 2, wherein the colors of the basic color filters in the second color filter unit include red, green, and blue, and the colors of the basic color filters in the first color filter unit include red, green, and blue.
  4. The image sensor according to any one of claims 1 to 3, wherein different extended color filters have different colors.
  5. The image sensor according to any one of claims 1 to 4, wherein the extended color filter comprises a first extended color filter and a second extended color filter, wherein:
    at least one of a thickness of the first extended color filter, an effective area of a pixel covered by the first extended color filter, and an exposure time of the pixel covered by the first extended color filter is determined according to an upper limit of a dynamic range of the image sensor; and
    at least one of a thickness of the second extended color filter, an effective area of a pixel covered by the second extended color filter, and an exposure time of the pixel covered by the second extended color filter is determined according to a lower limit of the dynamic range of the image sensor.
  6. An image data acquisition method, applied to an image sensor, the image sensor comprising a pixel array and a color filter array covering the pixel array, wherein the color filter array comprises a plurality of color filter units, the plurality of color filter units comprise at least one first color filter unit and at least one second color filter unit, each first color filter unit comprises basic color filters and an extended color filter, each second color filter unit comprises a plurality of basic color filters, and a color of the extended color filter is different from colors of the basic color filters;
    the method comprising:
    obtaining a first electrical signal and a second electrical signal; and
    determining first spectral measurement data of a current frame according to the first electrical signal, and determining imaging data of the current frame according to the second electrical signal;
    wherein the first electrical signal is an electrical signal obtained by a first pixel through photoelectric conversion of a first light signal, the first light signal is a light signal passing through the extended color filter, and the first pixel is a pixel covered by the extended color filter; the second electrical signal is an electrical signal obtained by a second pixel through photoelectric conversion of a second light signal, the second light signal is a light signal passing through a basic color filter, and the second pixel is a pixel covered by the basic color filter.
  7. The method according to claim 6, wherein determining the first spectral measurement data of the current frame according to the first electrical signal comprises:
    determining motion information of the current frame relative to a historical frame according to the imaging data of the current frame and imaging data of the historical frame;
    determining second spectral measurement data of the current frame according to the first electrical signal; and
    correcting the second spectral measurement data according to the motion information and spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  8. The method according to claim 6, wherein determining the first spectral measurement data of the current frame according to the first electrical signal comprises:
    generating the first spectral measurement data according to at least a part of the second electrical signal and the first electrical signal.
  9. The method according to claim 6, wherein determining the first spectral measurement data of the current frame according to the first electrical signal comprises:
    determining motion information of the current frame relative to a historical frame according to the imaging data of the current frame and imaging data of the historical frame;
    generating second spectral measurement data of the current frame according to at least a part of the second electrical signal and the first electrical signal; and
    correcting the second spectral measurement data according to the motion information and spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  10. The method according to any one of claims 6 to 9, wherein obtaining the first electrical signal and the second electrical signal comprises:
    obtaining the first electrical signal according to a position of the extended color filter in the color filter array; and
    obtaining the second electrical signal according to positions of the basic color filters in the color filter array.
  11. The method according to any one of claims 6 to 10, further comprising:
    performing element segmentation on the current frame based on the imaging data of the current frame, to obtain at least one object in the current frame;
    determining an original light source spectrum of the current frame based on a color distribution of the at least one object;
    determining spectral measurement data of each object in the first spectral measurement data of the current frame, and correcting the original light source spectrum according to the spectral measurement data of each object, to obtain a target light source spectrum;
    determining a white balance coefficient and/or a color conversion matrix of the current frame according to the target light source spectrum; and
    processing the imaging data of the current frame according to the white balance coefficient and/or the color conversion matrix of the current frame.
  12. The method according to any one of claims 6 to 11, further comprising:
    performing element segmentation on the current frame based on the imaging data of the current frame, to obtain at least one object in the current frame;
    determining spectral measurement data of each object in the first spectral measurement data of the current frame, and determining a spectrum of each object according to the spectral measurement data of that object; and
    diagnosing and/or classifying and/or recognizing each object according to its spectrum.
  13. The method according to any one of claims 6 to 12, wherein the extended color filter comprises a first extended color filter and a second extended color filter, wherein:
    at least one of a thickness of the first extended color filter, an effective area of a pixel covered by the first extended color filter, and an exposure time of the pixel covered by the first extended color filter is determined according to an upper limit of a dynamic range of the image sensor; and
    at least one of a thickness of the second extended color filter, an effective area of a pixel covered by the second extended color filter, and an exposure time of the pixel covered by the second extended color filter is determined according to a lower limit of the dynamic range of the image sensor.
  14. An imaging device, comprising an image sensor and an image processor, wherein:
    the image sensor comprises a pixel array and a color filter array covering the pixel array; the color filter array comprises a plurality of color filter units, the plurality of color filter units comprise at least one first color filter unit and at least one second color filter unit, each first color filter unit comprises basic color filters and an extended color filter, each second color filter unit comprises a plurality of basic color filters, and a color of the extended color filter is different from colors of the basic color filters;
    the extended color filter is configured to filter an incident light signal to obtain a first light signal;
    the basic color filter is configured to filter an incident light signal to obtain a second light signal;
    a first pixel is configured to perform photoelectric conversion on the first light signal to obtain a first electrical signal, the first pixel being a pixel covered by the extended color filter;
    a second pixel is configured to perform photoelectric conversion on the second light signal to obtain a second electrical signal, the second pixel being a pixel covered by the basic color filter; and
    the image processor is configured to obtain the first electrical signal and the second electrical signal, determine first spectral measurement data of a current frame according to the first electrical signal, and determine imaging data of the current frame according to the second electrical signal.
  15. The device according to claim 14, wherein the image processor determines the first spectral measurement data specifically by:
    determining motion information of the current frame relative to a historical frame according to the imaging data of the current frame and imaging data of the historical frame;
    determining second spectral measurement data of the current frame according to the first electrical signal; and
    correcting the second spectral measurement data according to the motion information and spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  16. The device according to claim 14, wherein the image processor determines the first spectral measurement data specifically by:
    generating the first spectral measurement data according to at least a part of the second electrical signal and the first electrical signal.
  17. The device according to claim 14, wherein the image processor determines the first spectral measurement data specifically by:
    determining motion information of the current frame relative to a historical frame according to the imaging data of the current frame and imaging data of the historical frame;
    generating second spectral measurement data of the current frame according to at least a part of the second electrical signal and the first electrical signal; and
    correcting the second spectral measurement data according to the motion information and spectral measurement data of the historical frame, to obtain the first spectral measurement data.
  18. The device according to any one of claims 14 to 17, wherein the image processor obtains the first electrical signal and the second electrical signal specifically by:
    obtaining the first electrical signal according to a position of the extended color filter in the color filter array; and
    obtaining the second electrical signal according to positions of the basic color filters in the color filter array.
  19. The device according to any one of claims 14 to 18, wherein the image processor is further configured to:
    perform element segmentation on the current frame based on the imaging data of the current frame, to obtain at least one object in the current frame;
    determine an original light source spectrum of the current frame based on a color distribution of the at least one object;
    determine spectral measurement data of each object in the first spectral measurement data of the current frame, and correct the original light source spectrum according to the spectral measurement data of each object, to obtain a target light source spectrum;
    determine a white balance coefficient and/or a color conversion matrix of the current frame according to the target light source spectrum; and
    process the imaging data of the current frame according to the white balance coefficient and/or the color conversion matrix of the current frame.
  20. The device according to any one of claims 14 to 19, wherein the image processor is further configured to:
    perform element segmentation on the current frame based on the imaging data of the current frame, to obtain at least one object in the current frame;
    determine spectral measurement data of each object in the first spectral measurement data of the current frame, and determine a spectrum of each object according to the spectral measurement data of that object; and
    diagnose and/or classify and/or recognize each object according to its spectrum.
  21. The device according to any one of claims 14 to 20, wherein the extended color filter comprises a first extended color filter and a second extended color filter, wherein:
    at least one of a thickness of the first extended color filter, an effective area of a pixel covered by the first extended color filter, and an exposure time of the pixel covered by the first extended color filter is determined according to an upper limit of a dynamic range of the image sensor; and
    at least one of a thickness of the second extended color filter, an effective area of a pixel covered by the second extended color filter, and an exposure time of the pixel covered by the second extended color filter is determined according to a lower limit of the dynamic range of the image sensor.
  22. A computer-readable storage medium storing instructions that, when run on a computer or a processor, cause the computer or the processor to perform the method according to any one of claims 6 to 13.
  23. A computer program product comprising instructions that, when run on a computer or a processor, cause the computer or the processor to perform the method according to any one of claims 6 to 13.
PCT/CN2021/082350 2021-03-23 2021-03-23 Image sensor, image data acquisition method, and imaging device WO2022198436A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2021/082350 WO2022198436A1 (zh) 2021-03-23 2021-03-23 Image sensor, image data acquisition method, and imaging device
CN202180090558.1A CN116724564A (zh) 2021-03-23 2023-09-08 Image sensor, image data acquisition method, and imaging device
EP21932068.6A EP4300936A4 (en) 2021-03-23 2021-03-23 IMAGE SENSOR, IMAGE DATA ACQUISITION METHOD AND IMAGING DEVICE
US18/470,847 US20240014233A1 (en) 2021-03-23 2023-09-20 Image sensor, image data obtaining method, and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/082350 WO2022198436A1 (zh) 2021-03-23 2021-03-23 Image sensor, image data acquisition method, and imaging device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/470,847 Continuation US20240014233A1 (en) 2021-03-23 2023-09-20 Image sensor, image data obtaining method, and imaging device

Publications (1)

Publication Number Publication Date
WO2022198436A1 true WO2022198436A1 (zh) 2022-09-29

Family

ID=83396175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/082350 WO2022198436A1 (zh) 2021-03-23 2021-03-23 Image sensor, image data acquisition method, and imaging device

Country Status (4)

Country Link
US (1) US20240014233A1 (zh)
EP (1) EP4300936A4 (zh)
CN (1) CN116724564A (zh)
WO (1) WO2022198436A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6687003B1 (en) * 1998-10-20 2004-02-03 Svend Erik Borre Sorensen Method for recording and viewing stereoscopic images in color using multichrome filters
US20090256927A1 (en) * 2008-04-11 2009-10-15 Olympus Corporation Image capturing apparatus
CN102834309A (zh) * 2010-02-26 2012-12-19 Gentex Corporation Automatic vehicle equipment monitoring, warning, and control system
US20140307122A1 (en) * 2011-12-28 2014-10-16 Fujifilm Corporation Image processing device and method, and imaging device
US20200400570A1 (en) * 2019-06-20 2020-12-24 Ethicon Llc Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4702635B2 (ja) * 2007-07-17 2011-06-15 Fujifilm Corporation Auto white balance correction value calculation device, method, and program, and imaging device
US8659670B2 (en) * 2009-04-20 2014-02-25 Qualcomm Incorporated Motion information assisted 3A techniques
JP5687676B2 (ja) * 2012-10-23 2015-03-18 Olympus Corporation Imaging device and image generation method
US9521385B2 (en) * 2014-03-27 2016-12-13 Himax Imaging Limited Image sensor equipped with additional group of selectively transmissive filters for illuminant estimation, and associated illuminant estimation method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4300936A4 *

Also Published As

Publication number Publication date
CN116724564A (zh) 2023-09-08
US20240014233A1 (en) 2024-01-11
EP4300936A1 (en) 2024-01-03
EP4300936A4 (en) 2024-01-24

Similar Documents

Publication Publication Date Title
US11632525B2 (en) Image processing method and filter array including wideband filter elements and narrowband filter elements
US8199229B2 (en) Color filter, image processing apparatus, image processing method, image-capture apparatus, image-capture method, program and recording medium
US10136107B2 (en) Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
CN103327342B (zh) 具有透明滤波器像素的成像系统
US8730357B2 (en) Image processing device, image processing method, and program
WO2021196554A1 (zh) 图像传感器、处理系统及方法、电子设备和存储介质
US6646246B1 (en) Method and system of noise removal for a sparsely sampled extended dynamic range image sensing device
CN205726019U (zh) 成像系统、成像设备和图像传感器
CN109685853B (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
JP5186517B2 (ja) 撮像装置
JP2020024103A (ja) 情報処理装置、情報処理方法、及び、プログラム
CN115239550A (zh) 图像处理方法、图像处理装置、存储介质与电子设备
WO2012153532A1 (ja) 撮像装置
JPWO2017222021A1 (ja) 画像処理装置、画像処理システム、画像処理方法及びプログラム
JPWO2018116972A1 (ja) 画像処理方法、画像処理装置および記録媒体
CN115100085A (zh) 图像颜色校正方法、装置、存储介质与电子设备
TW202220431A (zh) 攝像元件及電子機器
WO2022198436A1 (zh) 图像传感器、图像数据获取方法、成像设备
CN115187559A (zh) 用于图像的光照检测方法、装置、存储介质与电子设备
US11140370B2 (en) Image processing device, image processing system, image processing method, and program recording medium for generating a visible image and a near-infrared image
WO2013111824A1 (ja) 画像処理装置、撮像装置及び画像処理方法
CN118433487B (zh) 一种图像数据的处理方法及装置
CN120034750A (zh) 图像处理方法、装置、电子设备、存储介质和程序产品
Dikbas et al. Impact of Photometric Space Linearity on Demosaicing Image Quality
WO2022185345A2 (en) Optimal color filter array and a demosaicing method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21932068; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 202180090558.1; Country of ref document: CN)
WWE Wipo information: entry into national phase (Ref document number: 2021932068; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2021932068; Country of ref document: EP; Effective date: 20230926)
NENP Non-entry into the national phase (Ref country code: DE)