
CN114930136B - Method and device for determining wavelength deviation of image shot by multi-lens shooting system - Google Patents

Method and device for determining wavelength deviation of image shot by multi-lens shooting system

Info

Publication number
CN114930136B
CN114930136B
Authority
CN
China
Prior art keywords
image
image sensor
pixels
determined
spectral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202080092968.5A
Other languages
Chinese (zh)
Other versions
CN114930136A (en)
Inventor
R·海涅
A·R·布兰德斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cobot Co ltd
Original Assignee
Cobot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cobot Co ltd
Publication of CN114930136A
Application granted
Publication of CN114930136B
Status: Active

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0205 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0278 Control or determination of height or angle information for sensors or receivers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/957 Light-field or plenoptic cameras or camera modules

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)

Abstract


This invention relates to a method for determining wavelength deviation in an image captured by a multi-lens camera system (1), comprising the steps of: - determining a color gamut (F) for a region of a filter element (4) of the multi-lens camera system (1), said region being associated with a predetermined region of an image sensor (3) of the multi-lens camera system (1); - determining a center wavelength (Z) for at least two pixels (P1, P2) of the predetermined region of the image sensor (3) based on the determined color gamut (F); - correcting the region of the image sensor (3) and/or an image (B, B1) captured by said image sensor (3) based on the determined center wavelength (Z); and/or generating a supplementary dataset for said image or for imaging control. The invention also relates to a corresponding apparatus and a multi-lens camera system.

Description

Method and device for determining wavelength deviation of image shot by multi-lens shooting system
Technical Field
The invention relates to a method and a device for determining the wavelength deviation of images captured by a multi-lens camera system, and to the use thereof, in particular for dispersion calibration. The multi-lens camera system is preferably an imaging system for (hyper)spectral image capture.
Background
In many commercial and scientific fields, cameras are used which, in addition to spatial resolution, also provide spectral resolution (spectral cameras), often extending beyond the visible spectrum (multispectral cameras). For example, when surveying the earth's surface from the air, the cameras commonly used not only have normal RGB color resolution, but also provide high-resolution spectra, possibly even into the ultraviolet or infrared range. By means of such measurements it is possible, for example, to identify individual cultivated areas in agricultural land, and from these, for example, the growth or health state of the plants, or the distribution of chemical constituents such as chlorophyll or lignin, can be determined.
In recent decades, a spectrally high-resolution imaging technique called "hyperspectral imaging" has proven suitable for such measurements. With hyperspectral imaging, for example, different chemical constituents can be identified and distinguished on the basis of spatially resolved recorded spectra.
In a typical construction of a (hyper)spectral camera system, a lens array is arranged in front of the image sensor and maps the subject onto the image sensor in the form of a plurality of different images, one per lens. Such an imaging system is also referred to as a "multi-lens camera system". Each of the images is captured in a different spectral range by means of a filter element, such as a mosaic filter or a linearly variable filter, located between the lens array and the image sensor. A large number of mappings of the subject in different spectral ranges ("channels") are thus obtained.
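The channel structure described above can be illustrated with a minimal Python sketch. This is not part of the patent: the grid dimensions and the assumption of equally sized, offset-free sub-images are illustrative, whereas a real system would need per-lens offsets from a geometric calibration.

```python
import numpy as np

def split_into_channels(frame, rows, cols):
    """Split a raw sensor frame into the per-lens sub-images ("channels").

    Assumes the lens array maps the subject onto a regular rows x cols grid
    of equally sized sub-images (an illustrative simplification).
    """
    h, w = frame.shape
    sub_h, sub_w = h // rows, w // cols
    channels = []
    for r in range(rows):
        for c in range(cols):
            channels.append(frame[r * sub_h:(r + 1) * sub_h,
                                  c * sub_w:(c + 1) * sub_w])
    return channels

# Example: a 4x4 lens array on a 400x400-pixel sensor yields 16 channels,
# each a 100x100-pixel mapping of the subject in its own spectral range.
frame = np.zeros((400, 400))
channels = split_into_channels(frame, 4, 4)
```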
However, the prior art has the disadvantage that the captured images cannot be optimally compared with one another; the images also require calibration. Particularly when capturing objects at different distances from the camera system (e.g. an object in front of a background, or two or more objects at different distances), the different light paths lead to inaccurate spectral classification of the objects when captured through the filter element of the camera system.
Disclosure of Invention
The object of the present invention is to overcome the disadvantages of the prior art and to provide a method for determining the wavelength deviation of a multi-lens camera system, in particular for calibrating the system and/or for improving its spectral resolution.
This object is achieved by a method and a device according to the claims.
The method according to the invention for determining (and advantageously utilizing) the wavelength deviation of images captured by a multi-lens camera system, in particular for dispersion calibration and/or for improving spectral resolution, comprises the steps of:
- determining a spectral sensitivity ("color gamut") for a region of a filter element of the multi-lens camera system, said region corresponding to a predetermined region of an image sensor of the multi-lens camera system,
- determining a center wavelength for at least two pixels of the predetermined region of the image sensor based on the determined spectral sensitivity (the determined color gamut),
- correcting the region of the image sensor and/or the image captured with the image sensor (or the corresponding region of the image sensor) based on the determined center wavelength, and/or generating a supplementary dataset for the image or for imaging control.
The spectral sensitivity of a region of the filter element reflects which color (wavelength) each pixel of the image sensor behind the filter element captures. In the ideal case this is light of a single (center) wavelength; in practice it is a wavelength distribution with a center wavelength. The spectral sensitivity is also referred to here as a "color gamut", because it comprises a domain (range) in which different colors predominate depending on the filter element. For ideal filter elements the color gamut extends either linearly (linearly variable filter) or in constant steps (mosaic filter), so a region of the image sensor used for imaging should have a well-defined (linearly varying or stepwise constant) gamut extension. In practice, however, for many filter elements the decisive factor is the path the light travels through the filter element. In a region of a mosaic filter designed for a single center wavelength but with angle-dependent filtering characteristics, the measured center wavelength differs between straight and oblique incidence. Although this difference is limited to a few nm in practice, it leads to systematic measurement errors and has a negative effect on the evaluation of spectral images. Furthermore, as described below, this effect can be exploited to improve the spectral resolution.
It should be noted that the considerations explained here relate mainly to effects caused by different optical paths in the filter. However, the invention can handle not only these effects but also non-uniformities in the filter element, as well as wavelength deviations in the image caused by a linearly variable filter element.
Thus, the color gamut comprises information about the center wavelength of the light that reaches the pixels of the predetermined region of the image sensor through the filter element. Preferably, the color gamut also includes, for a pixel or group of pixels, information on the angular dependence of the center wavelength of the transmitted light. This further improves accuracy, since light from objects captured at a short distance propagates through the filter system at a slightly different angle than light from objects farther away.
The color gamut is determined for the specific region of the filter element through which the light passes when capturing an image, or several images, on the predetermined region of the image sensor. If only a single image (or the corresponding region of the image sensor) is to be corrected by the method, this is the region used for capturing that image; the method may, however, also be used to correct several images simultaneously, in which case the predetermined region on the image sensor is correspondingly larger.
The color gamut may be determined by direct measurement. To this end, an image of a subject is captured and/or provided, recorded by the multi-spectral multi-lens camera system in a known spectral range. A more detailed description of such capturing or providing is given below.
Alternatively or additionally, the color gamut may also be determined by calculating the filter characteristics for wavelengths incident at different angles of incidence, taking into account the imaging properties of the optics of the multi-lens camera system corresponding to the region of the image sensor.
As described above, it should be noted that when an image is captured through a region of the image sensor, the light for different pixels of the image passes through the filter at different angles of incidence. Thus, and possibly also due to non-uniformity of the filter thickness over the relevant region, the light for different pixels of the image sensor traverses the filter element along different paths. Depending on the type of filter element (or region of the filter element), this may result in a shift of the filter characteristics: if z is the center wavelength of the filter for light incident at angle a, then a center wavelength z ± Δz results for light passing through the filter at angle b.
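The shift z ± Δz can be illustrated with the standard blue-shift model for thin-film interference filters. This is a hedged sketch, not taken from the patent; the effective refractive index `n_eff` is an assumed coating property, and real filters may deviate from this simple model.

```python
import math

def center_wavelength(lambda_0, theta_deg, n_eff=2.0):
    """Center wavelength of an interference filter for light incident at
    angle theta_deg (degrees from the surface normal), using the common
    thin-film blue-shift model; n_eff is the effective refractive index
    of the coating stack (assumed to be 2.0 here for illustration)."""
    theta = math.radians(theta_deg)
    return lambda_0 * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)

# Straight incidence leaves the center wavelength unchanged; oblique
# incidence shifts it by a few nm toward blue, as described above.
z_0 = center_wavelength(850.0, 0.0)    # 850.0 nm
z_15 = center_wavelength(850.0, 15.0)  # shifted a few nm below 850 nm
```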
The center wavelength is now determined for at least two pixels of the predetermined region of the image sensor, preferably for all pixels of a predetermined region of interest ("Region of Interest", ROI). It should be noted that the (at least) two pixels are preferably located in a region used for capturing a single image. If the region of the image sensor is used to capture several images, this step should be performed for at least two pixels per image.
If the center wavelengths of two pixels per "image" are observed, a correction function ("support function") for all pixels can be determined from these, assuming a linear extension of the center wavelength. For a nonlinear extension, the center wavelength is preferably determined for more than two pixels and the correction function is derived from them. If the center wavelengths of all pixels of the region are observed, very accurate results are obtained. The advantage of the correction function is that the center wavelength does not need to be determined for every pixel, but can also be obtained from the correction function.
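Under the linear assumption, such a support function can be sketched as follows (illustrative pixel positions and wavelengths; not part of the patent):

```python
def make_support_function(x1, z1, x2, z2):
    """Linear correction ("support") function: given the measured center
    wavelengths z1, z2 at two pixel positions x1, x2, return a function
    that interpolates/extrapolates the center wavelength for any pixel
    position along the same axis, assuming a linear extension."""
    slope = (z2 - z1) / (x2 - x1)
    return lambda x: z1 + slope * (x - x1)

# Center wavelengths measured at pixels 10 and 110 (illustrative values):
z = make_support_function(10, 848.0, 110, 852.0)
z(60)   # center wavelength halfway between the two support pixels, ~850.0 nm
```

With a nonlinear extension, the same idea applies with more support pixels and a higher-order fit instead of the straight line.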
By means of the determined center wavelength, the region of the image sensor and/or the corresponding image captured by the image sensor is corrected. Alternatively or additionally, a supplementary dataset is generated for the image or for imaging control. Correction and generation of the dataset mostly amount to the same thing, since in a digital system the correction can be based on the dataset, which is read in and used for correcting the image sensor or the data.
The correction may be an adaptation, in particular a calibration. It may, however, also comprise adding image information to the image, for example supplementary spectral information, or information about the radiation characteristics of an object. Such supplementary information may be present directly in the image data or as an additional dataset, i.e. "outside" the image but associated with it; the dataset can then be used when viewing the image. As long as the corresponding data exist and a computing unit "knows" where they are located, it makes no difference whether the supplementary data reside in the image file or in a separate dataset. The same applies to supplementary data for the image sensor: for correcting the image sensor it is essentially irrelevant whether the calibration is performed directly during capture or on the raw (or, as the case may be, reconstructed) data of the capture. The data for correcting the image sensor may therefore also be present in the form of a supplementary dataset.
For calibration, the captured image should show a uniform subject, so that each pixel of the image essentially sees the same content. A preferred subject is a uniform surface, or light from a Lambertian radiator, or light from a light source incident on a Lambertian diffuser. If only one image is captured, the light preferably has approximately the wavelength of the spectral channel of that image (preferably narrowband radiation is used); if several images are captured, the subject should include the wavelengths of all spectral channels. However, several images can also be captured in succession, with the subject or the wavelength changed accordingly.
Capturing (at least) one image of a subject in one spectral range by means of a multi-spectral multi-lens camera system, i.e. a measurement in one spectral channel of the system, is known to the person skilled in the art and is carried out in a known manner with a (known) multi-spectral multi-lens camera system. A multi-lens camera system for (multi/hyper)spectral image capture comprises a planar image sensor, a position-sensitive spectral filter unit and an imaging system. The imaging system comprises a planar lens array with a plurality of individual lenses arranged such that, at a first capture time, a first plurality of grid-like arranged mappings of the subject is generated within a first region on the image sensor. The lenses may be, for example, spherical lenses, cylindrical lenses, holographic lenses or Fresnel lenses, or lens systems composed of several such lenses, such as objectives.
Planar image sensors are basically known to the person skilled in the art. Particularly preferred is a pixel detector that enables electronic recording of image points ("pixels"). Preferred pixel detectors are CCD sensors ("charge-coupled device") or CMOS sensors ("complementary metal-oxide-semiconductor"). Silicon-based sensors are particularly preferred, but InGaAs sensors and lead-oxide- or graphene-based sensors may also be used, especially for wavelength ranges outside the visible range.
Spectral filter elements designed to transmit different spectral fractions of the incident light at different locations on the filter surface, while blocking other spectral fractions, are referred to here as "position-sensitive spectral filter elements" (they may also be called "position-dependent spectral filter elements"). They serve to filter the mappings generated by the imaging system on the image sensor according to different (smaller) spectral ranges.
There are several possible arrangements of the filter element. It may be positioned, for example, directly in front of the lens array or between the lens array and the image sensor. It is also preferred to design a component of the imaging system, in particular the lens array, as the filter element; for example, the substrate of the lens array may be designed as a filter element.
A lens array within the scope of the invention comprises a large number of lenses arranged in a grid, i.e. in a regular arrangement relative to one another, in particular on a carrier. The lenses are preferably arranged in regular rows and columns, or offset from one another; rectangular, square or hexagonal arrangements are particularly preferred. The lenses may be, for example, spherical or cylindrical, but for some applications aspherical lenses are also preferred.
Typically, (multi/hyper)spectral captures always show similar images of the same subject. Through the filter element, the images are each recorded by the image sensor at a different (light) wavelength or in a different wavelength range.
The images exist in digital form; their picture elements are referred to as "pixels". The pixels correspond to predetermined positions on the image sensor, so that each image has a coordinate system of pixel positions. In the context of multispectral imaging, the images are often also referred to as "channels".
Since the captured images are usually stored in an image memory, they can also be retrieved from it for the method. The method can of course also operate on previously recorded "old" images, whose data are read in from the memory and thus provided.
The center wavelength or the correction function may be represented in an (x, y, λ) space, where x and y are the image coordinates (or the corresponding coordinates of the image sensor) and λ(x, y) is the center wavelength or correction value at the respective coordinates.
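The (x, y, λ) representation can be sketched as a simple per-pixel lookup array (shapes and values here are illustrative assumptions, not from the patent):

```python
import numpy as np

# A 2-D array holding the center wavelength (or correction value) for
# every pixel coordinate of the sensor region: lam[y, x] = lambda(x, y).
height, width = 4, 6
lam = np.empty((height, width))
for y in range(height):
    for x in range(width):
        # e.g. filled from the support function or per-pixel measurement;
        # the linear ramp below is purely illustrative
        lam[y, x] = 850.0 + 0.1 * x - 0.05 * y

# Looking up the center wavelength for pixel (x=3, y=2):
lam[2, 3]   # ~850.2 nm for the illustrative ramp above
```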
The device according to the invention for determining the wavelength deviation of images captured by a multi-lens camera system, in particular for dispersion calibration and/or for improving spectral resolution, comprises the following components:
- a determination unit for determining a color gamut for a region of a filter element of the multi-lens camera system, the region corresponding to a predetermined region of an image sensor of the multi-lens camera system. The determination unit may comprise (or be present as a software module in) a computing system that determines the color gamut from sensor information and/or from information given for the filter, and optionally also from the subject to be mapped (e.g. its distance).
- a determination unit designed to determine a center wavelength for at least two pixels of the predetermined region of the image sensor based on the determined color gamut. This determination unit may also be implemented by a computing unit (or be present as a software module in a computing system) that evaluates the sensor data of the image sensor and determines the center wavelength therefrom.
- a processing unit designed to correct the region of the image sensor and/or an image captured with the image sensor (or the region of the image sensor) based on the determined center wavelength, and/or to generate a supplementary dataset for the image or for imaging control. The processing unit may also be implemented by a computing unit (or be present as a software module in a computing system) that can access the image sensor or its image memory, or modify the image data. If the processing unit is used only for correcting an image or the image sensor, it may also be referred to as a "correction unit".
The multi-lens imaging system of the invention comprises a device according to the invention and/or is designed to implement the method of the invention.
Preferred embodiments of the present invention are described below. It should be noted that the preferred multi-lens camera system can also be designed in accordance with the description of the method, and in particular that features of the different embodiments can be combined with one another.
In a preferred method, the color gamut is determined by capturing and/or providing an image of a subject with the multi-spectral multi-lens camera system, and/or by calculating the filter characteristics for wavelengths incident at different angles of incidence, taking into account the imaging properties of the optics of the multi-lens camera system corresponding to the region. That is, the color gamut is based on measurement or calculation, and the two can also be combined.
In a preferred method, a correction function ("support function") for the pixels of the predetermined region of the image sensor is determined based on the determined center wavelengths of the at least two pixels. The correction function may then be used to correct (in particular to adapt, for example to calibrate) the region of the image sensor and/or an image captured with it (or the region of the image sensor). This also allows correction, in particular calibration, of image regions for which no color gamut has been determined. Other images whose color gamut should be similar can also be adapted via the correction function.
In a preferred method, when determining the color gamut, information about the angular dependence of the center wavelength of the transmitted light is added to the color gamut. This allows correction even when the optical paths differ (e.g. different distances to the subject or object). The distances of the elements of the subject are then taken into account in the correction, wherein the color gamut is preferably determined for the pixels of the image sensor as a function of the distance of the mapped subject region from the camera system.
In a preferred method, the parallax of the capture is also determined, comprising the steps of:
- capturing and/or providing at least two images of an object in different spectral ranges by means of the multi-spectral multi-lens camera system,
- identifying objects in the captured images by means of a digital object recognition system and creating a virtual object from the identified object mappings,
- determining the coordinates of the virtual object in the images, as absolute image coordinates and/or as coordinates relative to other elements in the images,
- determining the parallax of the object from the determined coordinates and the known positions of the image capture points on the image sensor of the multi-lens camera system,
- preferably, determining the distance of the object from the multi-lens camera system and/or from another object in the image based on the determined parallax, and using it to determine the color gamut and/or the center wavelength.
According to a preferred method, the correction (here: adaptation) comprises a calibration of the region of the image sensor or of the image captured with this region. That is, the aforementioned correction measures are used here to calibrate the image sensor or the image so that the spectral information is adjusted accordingly for each pixel. If the color gamut is not constant, it is now known which spectral information the pixels of the relevant region reflect. In this case, the correction function is preferably a calibration function comprising gamut-based calibration values for the pixels of the region of the image sensor, or for the pixels of the image captured with that region, for calibrating the relevant pixels.
According to a preferred method, the correction comprises improving the spectral information of an image captured by the image sensor, wherein at least two pixels are determined in an image whose center wavelengths are known and which belong to an image region for which an approximately identical spectrum can be assumed. That is, the pixels are located at different coordinates of the image and have different center wavelengths due to the aforementioned dispersion effects, filter errors, or the use of a linearly variable filter. It is assumed here that these pixels are each assigned to a single object of the subject, the object being taken to be a region with approximately the same spectrum ("approximately" meaning: within the desired measurement accuracy). In this case the spectral information reflected by all these pixels is, as expected, the same at all pixel coordinates (due to the approximately identical spectrum). For this purpose, object recognition is preferably performed in the image, as described elsewhere herein. The spectral information of the pixels is then combined into total spectral information for at least one of the pixels, preferably for a group of pixels or for all of them.
Of course, this (and subsequent) approach can be implemented for different spectral channels (spectral images), which can improve spectral resolution.
According to another preferred method, the correction likewise comprises improving the spectral information of the image captured by the image sensor. The principle is the same as in the previous method: two pixels have individual (different) center wavelengths and originate from a region with approximately the same spectrum, so they complement the total spectral information. In the previous method the pixels are located in the same image but at different positions of the object; here, the pixels are observed at the same position of the object but in different images. Preferably, both methods are used together to further refine the spectral information. A multi-lens camera system comprising a linearly variable filter element is particularly suitable for both methods; in this case the notion of "different images" can also refer to different image regions or different sub-images, since the filter element exhibits a linear extension of the center wavelength across the image region.
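Combining pixels with individual center wavelengths into denser total spectral information, as in both methods above, can be sketched as follows (the sample wavelengths and intensities are illustrative assumptions):

```python
def merge_spectral_samples(samples):
    """Merge per-pixel measurements of the same (approximately uniform)
    object region into one denser spectrum. Each sample is a pair
    (center_wavelength, intensity); pixels with different center
    wavelengths, whether from one image or from several, each contribute
    one support point. Samples at identical wavelengths are averaged."""
    by_wavelength = {}
    for wl, intensity in samples:
        by_wavelength.setdefault(wl, []).append(intensity)
    return sorted((wl, sum(v) / len(v)) for wl, v in by_wavelength.items())

# Samples of the same object region from pixels with individual center
# wavelengths (illustrative values, e.g. from a linearly variable filter):
spectrum = merge_spectral_samples([(852.0, 0.40), (848.0, 0.35),
                                   (850.0, 0.38), (850.0, 0.42)])
# three support points at 848, 850 (averaged) and 852 nm
```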
The preferred further method comprises the additional steps of:
- Taking at least two images of the same subject through the same area of the image sensor, from different perspectives or with the subject in motion. These images can be taken at different times, for example during relative movement of the camera system and the subject, or the subject can be made to move across the region of the image sensor. It is important that the images are not identical, but that at least one object of the subject is mapped onto different coordinates in the image and/or from different perspectives. The latter alone already produces different center wavelengths at the same image location owing to the dispersion effect. Of course, several images can be taken, each additional image contributing to an improved spectrum. The change in viewing angle can also be produced by tilting the filter element, or by lateral relative movement of the filter element and the lens array, since the viewing angle depends on the angle at which incident light passes through the filter element. Strictly speaking, the concept of a "filter passage angle" could be used instead of the "viewing angle", but "viewing angle" is easier to grasp.
- Determining the pixels of the images that correspond to the same area of the subject. That is, the pixels map the same location on the subject (but not necessarily the same location in the images). As described above, the pixels typically lie on different image coordinates and, owing to the different viewing angles, have different center wavelengths even at identical image coordinates. Some of these pixels may of course share the same center wavelength, but those contribute no additional information.
- Determining the center wavelength of each of these pixels. Typically a set of pixels is obtained, each having a different center wavelength.
- Integrating the spectral information of the pixels into total spectral information for at least one of the pixels. That is, the relevant pixel now also carries the spectral information of the other pixels.
Alternatively or additionally, the viewing angle under which a pixel was mapped is determined from its center wavelength, and the intensity of the pixel is determined as a function of that viewing angle.
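The pixel-matching and integration steps above can be sketched as follows; the function name and the representation of a spectrum as (center wavelength, intensity) samples are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: merge spectral samples of the same scene point seen
# through different filter regions (hence different center wavelengths Z).
def integrate_spectra(samples):
    """samples: list of (center_wavelength_nm, intensity) pairs from
    corresponding pixels P1, P2, ... in one or more images.
    Returns one combined spectrum, sorted by wavelength; duplicate
    wavelengths are averaged (they add no new information, see above)."""
    merged = {}
    for wavelength, intensity in samples:
        merged.setdefault(wavelength, []).append(intensity)
    return sorted((w, sum(v) / len(v)) for w, v in merged.items())

# Two pixels mapping the same tree-crown point at different image positions:
p1 = [(550.0, 0.42)]          # pixel P1, center wavelength 550 nm
p2 = [(563.0, 0.40)]          # pixel P2, shifted center wavelength
combined = integrate_spectra(p1 + p2)
```

Each corresponding pixel thus contributes one additional sampling point, which is exactly how the spectral resolution improves with every extra image.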
The radiation characteristic of an object (or pixel) can also be determined from the viewing angles of the object (or pixel) in different spectral ranges, i.e. over different areas of the image sensor. For this purpose, the intensities of objects (or pixels) are recorded in different channels and the intensity is plotted as a characteristic against the viewing angle, in particular in the form of a bidirectional reflectance distribution function (BRDF). The viewing angle results here, for example, from the position on the image sensor. It is particularly preferred to observe objects (pixels) from two different camera perspectives.
Although the objects in the images are typically shown in different spectral channels, in the simplest case it can be assumed that the radiation characteristic is the same for all wavelengths, so that the (wavelength-dependent) intensity of each channel can be normalized using the known spectrum of the object. Alternatively, the object can of course be photographed in the same channel from two or more perspectives, thereby normalizing the intensities of the spectral channels.
It is particularly preferred that the mapping of the subject is shifted on the image sensor after the first image is taken, for example by tilting a mirror or the lens array of the multi-lens imaging system. A whole series of images can also be captured, in particular by means of such tilting. Through this shift, the same point or area of the subject is mapped onto different pixels of the image sensor, and the resulting shift of the center wavelength can be used to improve the spectral resolution as described above. For example, a first image may be taken and then, in particular in the form of a film, a sequence of several images may be captured during the movement at lower resolution or with reduced exposure time, in order to improve the spectral resolution of the first image. Alternatively or additionally, a lateral relative movement of the filter element and the lens array may be performed as described above, and/or the filter element may be exchanged, for example by a filter wheel or a replacement filter.
According to a preferred embodiment, the filter element comprises a mosaic filter. Preferably, the tiles of the mosaic filter are arranged such that larger wavelength steps lie on the outside and smaller intervals on the inside. In a preferred form of the mosaic filter, a color mosaic (in particular a colored-glass mosaic) is applied, in particular vapor-deposited, on one side of a substrate (preferably glass). According to one advantageous embodiment, the filter element (mosaic filter or another filter) is applied on the front side of the substrate, while the lens array is applied on the back side of the substrate. Preferably, the mosaic filter transmits a different wavelength for each individual lens.
According to a preferred embodiment, the filter element comprises a linearly variable filter ("graded filter") with filter lines, the filter preferably being rotated, with respect to the orientation of the filter lines, at an angle between 1° and 45° relative to the lens array. Alternatively or additionally, the filter element comprises a filter array, particularly preferably a mosaic filter.
According to a preferred embodiment, the multi-lens camera system comprises an aperture cover between the lens array and the image sensor, wherein the apertures in the aperture cover are positioned to match the lenses of the lens array, and the aperture cover is positioned such that the light mapped by each lens passes through an aperture of the aperture cover. That is, the aperture cover has the same pattern as the lens array, with apertures in place of lenses.
In order to determine the wavelength deviation, in particular to improve the result of the dispersion calibration or the spectral resolution, a correction method, in particular a calibration step, can also be implemented; this offers advantages for recordings of a multi-lens imaging system even independently of the method according to the invention described above. The problem here is that a multi-lens imaging system that is not optimally calibrated provides less than optimal imaging. Corrections by each of the following alternatives, applied alone or in combination with one another, improve the imaging of the multi-lens imaging system.
A preferred calibration method for a multispectral multi-lens imaging system is used to identify a region of interest (ROI). The method comprises the following steps:
- Capturing an image by means of the multispectral multi-lens camera system, wherein the image preferably has a uniform brightness distribution or a brightness that overexposes the image sensor. The advantage of overexposure is that areas of the captured image which receive less light due to shadowing effects (e.g. from aperture edges) then become visible. These regions are not assigned to the ROI, since (owing to the shadowing effect) optimal image information cannot be guaranteed there. The result is an image in which only the regions eligible as ROI appear illuminated.
- Cross-correlating said image with a comparison image, in particular with a reference image (e.g. an ideal sensor image).
- Optionally performing a Hough transform. The orientation of the image can be rectified by means of the Hough transform.
- Selecting from the image the region with the highest similarity according to the autocorrelation and/or cross-correlation. The selection preferably includes either cropping out the selected region or limiting image capture to the selected region. Preferably, a corresponding predefinition is made in the reference image in this framework, and an object segmentation of the captured image is performed.
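A minimal sketch of this ROI identification under stated assumptions: an overexposed frame, a binary reference mask standing in for the "ideal sensor image", and a brute-force cross-correlation. All names and the saturation threshold are illustrative, not taken from the patent:

```python
import numpy as np

# Hypothetical sketch of the ROI step: threshold an (over)exposed frame so
# that shadowed areas (e.g. aperture edges) drop out, then cross-correlate
# the lit mask with a reference mask and keep the best-matching position.
def find_roi(frame, ref, saturation=250):
    lit = (frame >= saturation).astype(float)      # keep only fully lit pixels
    rh, rw = ref.shape
    best, best_pos = -1.0, (0, 0)
    for y in range(frame.shape[0] - rh + 1):       # brute-force cross-correlation
        for x in range(frame.shape[1] - rw + 1):
            score = float((lit[y:y+rh, x:x+rw] * ref).sum())
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos

frame = np.zeros((8, 8)); frame[2:6, 3:7] = 255    # lit region, shadowed border
ref = np.ones((4, 4))                              # ideal sensor (reference) mask
pos = find_roi(frame, ref)
```

A production system would use an FFT-based correlation instead of the double loop, but the selection logic is the same.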
A preferred calibration method for a multi-spectral multi-lens imaging system is used to correct lens errors. The method comprises the following steps:
- Capturing an image of a previously known optical target through the lens to be inspected of the multi-lens camera system. Preferably, every channel of the multi-lens camera system is photographed, so that the entire system can be examined for lens errors.
- Calculating the optical lens errors from these captures and the known dimensions of the optical target, by determining lens parameters, preferably by means of the Levenberg-Marquardt algorithm.
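As a hedged illustration of this parameter fit, the sketch below estimates a single radial-distortion coefficient with a minimal hand-rolled Levenberg-Marquardt loop (fixed damping). The one-parameter model r_observed = r_true·(1 + k·r_true²) and all names are assumptions for illustration, not the patent's actual lens-error model:

```python
import numpy as np

# Hypothetical sketch: estimate one radial-distortion coefficient k of a
# lens from a known optical target, using a damped Gauss-Newton step
# (the core of the Levenberg-Marquardt algorithm).
def fit_distortion(r_true, r_obs, k0=0.0, lam=1e-3, iters=50):
    k = k0
    for _ in range(iters):
        residual = r_obs - r_true * (1 + k * r_true**2)
        jacobian = -r_true**3                      # d(residual)/dk
        # damped normal equation: the Levenberg-Marquardt update
        step = -(jacobian @ residual) / (jacobian @ jacobian + lam)
        k += step
    return k

r_true = np.array([0.1, 0.3, 0.5, 0.7, 0.9])       # known target radii
k_real = 0.2                                       # simulated lens error
r_obs = r_true * (1 + k_real * r_true**2)          # "measured" radii
k_est = fit_distortion(r_true, r_obs)
```

A real calibration would fit several intrinsic and distortion parameters jointly and adapt the damping term, but the update rule is the one shown.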
A preferred calibration method for a multispectral multi-lens imaging system is used to correct projection errors. The method comprises the following steps:
- Providing (capturing, or reading from a data memory) an image of a previously known optical target by means of the multispectral multi-lens camera system.
- Identifying the target in the image and preferably determining the coordinates of at least three characterizing points of the target. The characterizing points are, for example, the corners of the target.
- Applying an inverse homography. A homography is a collineation of the 2-dimensional real projective space onto itself, i.e. a projective mapping of the image in space, which can be applied algorithmically to the (digital) image. The target is thereby adjusted to match the target as it would be mapped, free of projection errors, at a specific distance. This ideal mapping is known, because the properties of the mapping optics and the target are known. Preferably, only the coordinates of characterizing reference points are used, and the coordinates of the corresponding characterizing points of the target identified in the image are registered to those reference points. With this preferred method, translation (movement), rotation, shearing and scaling can be corrected by means of the homography.
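A sketch of the homography step: the standard direct linear transform (DLT) estimated from point correspondences, then applied to map observed target corners onto their known error-free positions. Four correspondences are used here (a full projective homography needs four; three suffice only for affine corrections); all names and coordinates are illustrative:

```python
import numpy as np

# Hypothetical sketch: estimate the homography H that maps observed target
# corners onto their ideal (projection-error-free) positions via the DLT,
# then apply it to correct a point.
def homography(src, dst):
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u*x, u*y, u])
        rows.append([0, 0, 0, -x, -y, -1, v*x, v*y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)                    # null-space vector as 3x3 H

def apply_h(h, p):
    q = h @ np.array([p[0], p[1], 1.0])            # homogeneous coordinates
    return (q[0] / q[2], q[1] / q[2])

observed = [(0, 0), (10, 1), (11, 12), (1, 11)]    # corners with projection error
ideal = [(0, 0), (10, 0), (10, 10), (0, 10)]       # known error-free target
h = homography(observed, ideal)
corrected = apply_h(h, observed[2])                # distorted corner -> ideal
```

Translation, rotation, shearing and scaling are all special cases of this mapping, which is why one homography suffices to correct them jointly.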
A preferred calibration method for a multispectral multi-lens imaging system is used to improve its resolution. The method comprises the following steps:
- Capturing at least two images, one image having a higher spatial resolution and the other a higher spectral resolution. Preferably, a large number of images of lower spatial resolution are taken in different spectral ranges, together with a panchromatic (Pan) image or gray-scale image of higher spatial resolution.
- Optionally, additionally calibrating the multi-lens imaging system, in particular by the method for correcting projection errors described above. It is particularly preferred that the parallax of objects in the images is determined beforehand and compensated.
- Deriving an image of higher spatial or spectral resolution from the at least two images. The higher spatial resolution of one image is used to improve the spatial resolution of the image with higher spectral resolution, and/or the higher spectral resolution of the other image is used to improve the spectral resolution of the image with higher spatial resolution.
In particular, in the preferred case of several single channels (spectral images), the spatial resolution is increased based on the information of the image with the higher spatial resolution, preferably using the known pan-sharpening principle. Use is made here of the fact that, when observing the subject, one pixel of a spectral-channel image corresponds to a group of pixels in the image of higher spatial resolution, for example 10 x 10 pixels in the Pan image. Preferably, the corresponding group of pixels (e.g. 10 x 10 spectral pixels) is generated from one pixel of the (spectral) image, in particular by taking the spectral shape from the original spectral pixel but the brightness from the Pan image.
Alternatively or additionally, the spectral resolution of an image with higher spatial resolution can be improved. This is possible if, as described above, there is a first image of higher spatial resolution and lower spectral resolution (but with more than three channels) and a second image of lower spatial resolution and higher spectral resolution. By interpolating the missing spectral channels of the first image from the information of the second image, the spectral resolution of the first image is improved until it (nearly) reaches that of the second image. This is preferably achieved by assigning one pixel of the second image to one pixel group of the first image and transferring the spectral information of that pixel of the second image to this pixel group. For example, if the spatial resolution of the first image is Z times higher, each block (pixel group) of Z x Z pixels of the first image can be assigned to one pixel of the second image at the corresponding (block-wise) image position.
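The pixel-group idea can be sketched as follows, under the assumption that the Pan brightness corresponds to the mean over the spectral channels; function names and the normalization are illustrative, not prescribed by the patent:

```python
import numpy as np

# Hypothetical pan-sharpening sketch: each low-resolution spectral pixel
# supplies the spectral *shape* for a Z x Z block of the high-resolution
# Pan image, while the Pan image supplies the brightness of each pixel.
def sharpen(spectral, pan, z):
    h, w, channels = spectral.shape
    out = np.zeros((h * z, w * z, channels))
    for i in range(h):
        for j in range(w):
            shape = spectral[i, j] / spectral[i, j].sum()   # spectral form
            block = pan[i*z:(i+1)*z, j*z:(j+1)*z]           # brightness from Pan
            # scale so a Pan value equal to the channel mean reproduces
            # the original spectral pixel exactly
            out[i*z:(i+1)*z, j*z:(j+1)*z] = block[..., None] * shape * channels
    return out

spectral = np.array([[[1.0, 3.0]]])                # 1x1 spectral image, 2 channels
pan = np.array([[2.0, 4.0], [4.0, 6.0]])           # 2x2 Pan image (Z = 2)
result = sharpen(spectral, pan, 2)
```

The converse "spectral sharpening" direction simply transposes the roles: each Z x Z block of the first image receives the full spectrum of the corresponding pixel of the second image.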
The result can be further improved by performing object recognition within the first image (or within the pixel groups). Here it is assumed that the spectra within one object are approximately the same. The object recognition can also be refined into an object segmentation, in which parts of an object with different spectra are separated from one another and treated as independent objects. In the first image there can then be, for example, different adjoining regions separated from one another by edges. Regions of different objects within a pixel group of the first image are treated differently, in particular by assigning different spectra to these regions. For a pixel group covering a uniform object this is easy to achieve: the spectral information of the corresponding pixel of the second image is assigned to the region (see the method described above). If parts of different objects lie within one pixel group, the information of the second image for one of the objects is used for that object, and the information of the second image for the other object is used for the other object; each further object part is treated correspondingly. In this case, the spectral information of the corresponding pixel in the second image may be a mixture of the spectra of the different objects, so that here the spectra of neighboring pixels in the second image which contain information about the object concerned may be drawn upon. Alternatively or additionally, an additional object recognition can be performed in the second image, matching the objects of the first image, and the corresponding spectral information of the second image is assigned to these objects.
This information can also be used to improve the spectrum if the color channels are non-uniform, i.e. if the center wavelength of the pixels within one channel (image) follows a trend across the image plane. For this purpose it is again assumed that the spectrum within an object is uniform, at least for two adjacent points within the object. With non-uniform color channels, the spectra of two adjacent pixels differ slightly when a uniform subject is photographed: one pixel "views" the subject at a wavelength w, the second at a wavelength w ± Δw. If it is found in an image that both pixels show the same (assumed uniform) object, then not only the wavelength w but also the wavelength w ± Δw can be assigned to the object, and the spectrum of the object is refined in this respect.
In an exemplary case, one image has a spatial resolution of 50×50 pixels and a spectral resolution of 500 channels, while the other image has 500×500 pixels and a spectral resolution of 10 channels. The spatial resolution of the resulting image can then be raised to 500×500 by means of (partial) pan-sharpening and the spectral resolution to 500 channels by means of "spectral sharpening", yielding a spatial resolution of 500×500 pixels with a spectral resolution of 500 channels.
The following method can be combined particularly simply with the aforementioned method, but in its basic form it does not require two images; more accurate spectral information can also be determined from a single image.
Preferably, at least the spectra of the immediately adjacent pixels are added to each pixel spectrum. As mentioned above, these neighboring spectra contain additional wavelength information (additional sampling points due to other center wavelengths). This simple approach increases the spectral resolution, but can lead to errors in the transition regions between different objects of the subject.
Even in this case, however, the result can be improved by performing object recognition within the image. It is again assumed that the spectrum within an object is approximately uniform. If the spectral differences within an object are large, an object segmentation can again be performed, in which parts of the object with different spectra are separated from one another and treated as independent objects. In the image there can then be, for example, different adjoining regions separated from one another by edges. The spectra of pixels within an object, in particular of pixels from the center of the object, are now combined with one another. This can be a combination of the spectra of adjacent pixels or a combination of the spectra of all pixels. The first variant achieves an acceptable improvement even for slightly non-uniform objects, while the second achieves a very high spectral resolution for uniform objects. The two alternatives can also be combined, by dividing the object into concentric regions and combining the spectra of the pixels within each region.
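A minimal sketch of pooling the spectra of pixels within one segmented object, under the uniform-spectrum assumption from above; the function name and the dict representation of a pixel spectrum are illustrative assumptions:

```python
# Hypothetical sketch: within one segmented object the spectrum is assumed
# (approximately) uniform, so the slightly shifted spectra of neighbouring
# pixels can be pooled into one finer-sampled spectrum for that object.
def pool_object_spectrum(pixels):
    """pixels: list of dicts {center_wavelength: intensity}, one dict per
    pixel of the object.  Returns the pooled spectrum as sorted (w, i)
    pairs; coinciding wavelengths are averaged."""
    pooled = {}
    for spectrum in pixels:
        for w, i in spectrum.items():
            pooled.setdefault(w, []).append(i)
    return sorted((w, sum(v) / len(v)) for w, v in pooled.items())

# Three adjacent crown pixels, each "viewing" at slightly shifted wavelengths:
pix = [{500.0: 0.5, 600.0: 0.3},
       {505.0: 0.5, 605.0: 0.3},
       {510.0: 0.5, 610.0: 0.3}]
spectrum = pool_object_spectrum(pix)
```

Restricting `pixels` to adjacent pixels gives the robust first variant; passing all pixels of the object gives the high-resolution second variant.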
Other preferred calibration methods for multispectral multi-lens imaging systems are the known methods for dark-current calibration and/or white-balance calibration and/or radiometric calibration and/or calibration of the photo-response non-uniformity ("PRNU").
Drawings
Preferred exemplary embodiments of the invention are explained with reference to the figures.
Fig. 1 shows a multi-lens imaging system according to the prior art.
Fig. 2 shows a shooting scene.
Fig. 3 shows the shooting scene from above, with a multi-lens imaging system comprising an embodiment of the device according to the invention.
Fig. 4 shows an example of a dispersion effect.
Fig. 5 shows examples for the center wavelength and for the correction function.
Fig. 6 shows an example of spectral improvement for a captured image.
Fig. 7 shows another example of spectral improvement for two images taken.
Fig. 8 shows an exemplary block diagram of a method according to the present invention.
Detailed Description
Fig. 1 is a schematic perspective view of a multi-lens imaging system 1 for hyperspectral image capture according to the prior art. The multi-lens imaging system 1 comprises a planar image sensor 3 and a planar lens array 2 of uniform individual lenses 2a, arranged in such a way that a large number of first mappings AS of a subject M are produced on the image sensor 3 in a grid-like arrangement (see, merely by way of example, the smaller first mappings AS in fig. 5). For clarity, only one of the individual lenses 2a is denoted by a reference numeral.
In order to improve the quality of the first mappings AS, an aperture cover 5 is provided between the image sensor 3 and the lens array 2. Each aperture 5a of the aperture cover 5 corresponds to one individual lens 2a and is arranged directly behind it. In order to obtain spectral information, a filter element 4 is arranged between the aperture cover 5 and the image sensor 3. In other embodiments, the filter element 4 may also be arranged in front of the lens array (see for example fig. 8). In the case shown here, the filter element 4 is a linearly variable filter which is slightly rotated with respect to the image sensor. The center of each mapping therefore lies at a different wavelength range of the filter element. Each first mapping AS on the image sensor thus provides different spectral information, and the totality of the first mappings AS is used to create an image containing the spectral information.
Fig. 2 shows a shooting scene of the subject M. The subject comprises a house, serving here as background H, and a tree as object O in the foreground. The subject is photographed by the multi-lens imaging system 1.
Fig. 3 shows the subject M of fig. 2 from above. The multi-lens imaging system 1 here comprises an embodiment of the device 6 according to the invention. The device comprises a data interface 7, a determination unit 8, a determination unit 9 and a processing unit 10.
The data interface 7 is designed to receive images captured by the multi-lens imaging system 1. The data interface may for example directly access the image sensor, access the storage unit or communicate with a network.
The determination unit 8 is designed to determine a color gamut F of an area of the filter element 4 of the multi-lens imaging system 1, which area corresponds to a predetermined area of the image sensor 3 of the multi-lens imaging system 1.
The determination unit 9 is designed to determine the center wavelength Z for at least two pixels P1, P2 of a predetermined area of the image sensor 3 based on the determined color gamut F.
The processing unit 10 is designed to modify the settings of the area of the image sensor 3 and/or to modify the image B captured with the image sensor 3. The correction is performed by means of the determined center wavelength Z. Alternatively or additionally, the processing unit is designed to generate a supplementary data set for the image or for imaging control.
Fig. 4 shows an example of a dispersion effect. The two light beams (arrows) with a broad spectrum pass through the filter element 4 at different angles and thus travel different paths within the filter element 4. This results in a shift of the center wavelength Z. Furthermore, the filter element 4 may be transparent to different wavelengths at different locations, which likewise results in different center wavelengths (even for parallel beams).
Fig. 5 shows examples of the center wavelength Z (left) and of a correction function, here the adaptation function A (right). A light beam with a broad spectrum passing through the filter element 4 (see e.g. fig. 4) yields, after passage, a narrow spectrum with a center wavelength Z. If, for a pixel domain with coordinates x and y, all center wavelengths Z, or calibration values derived from these center wavelengths Z, are entered into a diagram combining the coordinates x and y with the wavelength λ (e.g. the center wavelength Z), a distribution is obtained as shown on the right of the figure. This is an exemplary illustration of an adaptation function A as an example of a correction function.
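One way to realize such an adaptation function A is a least-squares fit of the measured center wavelengths over the pixel coordinates. The plane model Z = a·x + b·y + c below is a deliberately simple illustrative assumption; the patent leaves the functional form of the correction open:

```python
import numpy as np

# Hypothetical sketch: fit a smooth adaptation function A(x, y) to measured
# center wavelengths Z over the pixel domain, as in the right-hand plot of
# fig. 5 -- here a plane Z = a*x + b*y + c fitted by least squares.
def fit_adaptation(xs, ys, zs):
    design = np.column_stack([xs, ys, np.ones_like(xs)])
    coeffs, *_ = np.linalg.lstsq(design, zs, rcond=None)
    return coeffs                                  # (a, b, c)

x = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
z = 2.0 * x + 3.0 * y + 500.0                      # synthetic center wavelengths
a, b, c = fit_adaptation(x, y, z)
```

Evaluating the fitted function at any pixel (x, y) then yields the calibration value with which that pixel's recorded intensity can be corrected.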
Fig. 6 shows an example of spectral improvement for a captured image B. In this example it is assumed that the entire crown of the tree, here representing the object O, has the same spectrum, i.e. each pixel P1, P2 imaging the crown should reflect values of the same spectrum. But the pixels P1, P2 are located at different positions and therefore have different center wavelengths Z (see e.g. the distribution on the right of fig. 5). This means that the intensities of the two pixels P1, P2 carry information from different parts of the total spectrum. The center wavelengths Z of both pixels P1, P2 can therefore be assigned to each pixel P1, P2.
Fig. 7 shows another example of spectral improvement, here for two captured images. The principle is very similar to fig. 6, except that the two pixels lie at different coordinates of two different images, but on the same object coordinates of the tree (object O). Here, too, each pixel P1, P2 should reflect values of the same spectrum. But since the pixels P1, P2 lie on different areas of the image sensor 3, they also have different center wavelengths Z (see e.g. the distribution on the right of fig. 5). This means that the intensities of the two pixels P1, P2 carry information from different parts of the total spectrum. The center wavelengths Z of both pixels P1, P2 can therefore be assigned to each pixel P1, P2.
Fig. 8 is an exemplary block diagram of a method according to the invention for determining the wavelength deviation of an image captured by the multi-lens imaging system 1.
In step I, a color gamut F is determined for one region of the filter element 4 of the multi-lens imaging system 1, which corresponds to one predetermined region of the image sensor 3 of the multi-lens imaging system 1.
In step II, a center wavelength Z is determined for at least two pixels (P1, P2) of a predetermined area of the image sensor 3 based on the determined color gamut F.
In step III, the area of the image sensor 3 and/or the image B, B1 recorded with the image sensor 3 is corrected on the basis of the determined center wavelength Z, and/or a supplementary data set is generated for the image or for controlling the image acquisition.
The preferred correction is further subdivided in step III.
In step IIIa, a calibration (as an example of a correction) of the area of the image sensor 3, or of the image B recorded with this area, is performed (see also fig. 6).
In step IIIb, the spectral information of the capture of the image sensor 3 is improved: at least two pixels P1, P2 are determined in a number of images B, B1, whose center wavelengths Z are known and which belong to an image region for which the same spectrum is assumed, and the spectral information of the pixels P1, P2 is integrated into total spectral information for at least one of the pixels P1, P2 (see also fig. 7). Such an improvement of the spectral information may be a modification of the image or may take the form of a supplementary data set.
Finally, it is pointed out that the use of words such as "a" or "an" does not exclude the possibility of multiple occurrences of the feature in question; accordingly, "a" may also be understood as "at least one". Concepts such as "unit" or "device" do not exclude the possibility that the element concerned consists of several co-acting components, which are not necessarily mounted in a common housing, although an enclosing housing is preferred. Within the scope of the optics, a lens element can in particular be formed by a single lens, a lens system or an objective, without a precise distinction being necessary.
Reference numeral table
1 Multi-lens imaging system
2 Lens array
2a Single lens
3 Image sensor
4 Filter element
5 Aperture cover
5a Aperture
6 Device
7 Data interface
8 Determination unit
9 Determination unit
10 Processing unit
A Adaptation function
B Image
B1 Image
F Color gamut
H Background
M Subject
O Object
Z Center wavelength

Claims (13)

1. A method of determining a wavelength deviation of a captured image of a multi-lens imaging system (1), comprising the steps of:
Determining a color gamut (F) for a region of the filter element (4) of the multi-lens imaging system (1) by direct measurement, which corresponds to a predetermined region of the image sensor (3) of the multi-lens imaging system (1), wherein the color gamut (F) represents the spectral sensitivity of the region and comprises information about the central wavelength (Z) of the light, which is incident through the filter element (4) to the pixels of the region of the image sensor (3), wherein the color gamut (F) is determined by capturing and/or providing an image (B, B1) of the subject (M) with the multi-spectral multi-lens imaging system (1),
-Determining a center wavelength (Z) for at least two pixels (P1, P2) of a predetermined area of the image sensor (3) based on the determined color gamut (F),
-Modifying the area of the image sensor (3) and/or the image (B, B1) taken with the image sensor (3) based on the determined center wavelength (Z), and/or generating a supplementary dataset for the image or for imaging control.
2. The method according to claim 1, characterized in that the color gamut (F) is also determined by calculating the characteristics of the filter element (4) at wavelengths incident at different angles of incidence taking into account the imaging properties of the optics (2) of the multi-lens imaging system (1) corresponding to the region.
3. Method according to any one of the preceding claims, characterized in that, in the range of determining the center wavelength (Z), a correction function (a) is determined for pixels (P1, P2) of a predetermined area of the image sensor (3) based on the determined center wavelength (Z) of at least two pixels (P1, P2), and that the area of the image sensor (3) and/or the image (B, B1) are corrected using the correction function (a).
4. A method according to claim 1, characterized in that in determining the color gamut (F) information about the angular dependence of the central wavelength (Z) of the transmitted light is added to the color gamut, and in taking into account the distance of the elements of the theme (M) in the correction.
5. Method according to claim 1, wherein a color gamut (F) is determined for the pixels (P1, P2) of the image sensor (3) depending on the distance of the area of the mapped subject (M) from the multi-lens imaging system (1).
6. The method according to claim 1, further comprising determining a parallax of the capture, comprising the steps of:
- capturing and/or providing at least two images (B, B1) of an object (O) in different spectral ranges by means of the multispectral multi-lens imaging system (1),
- identifying the object (O) in the captured images (B, B1) by means of a digital object identification and creating a virtual object (O) using the mapping of the identified object (O),
- determining the coordinates of the virtual object (O) in the images (B, B1), in the form of absolute image coordinates and/or coordinates relative to other elements in the respective image (B, B1),
- determining the parallax of the object (O) from the determined coordinates and the known position of the capture points of the images (B, B1) on the image sensor (3) of the multi-lens camera system (1).
7. Method according to claim 6, characterized in that, based on the determined parallax, a distance of the object (O) from the multi-lens imaging system (1) and/or from another object (O) in the images (B, B1) is determined, and the distance is used to determine a color gamut (F) and/or a center wavelength (Z).
8. The method according to claim 1, characterized in that the correction comprises a calibration of the area of the image sensor (3) or of the image (B, B1) taken with the image sensor (3).
9. A method according to claim 3, wherein the correction function (a) is a calibration function comprising a color gamut (F) based calibration value for the pixels (P1, P2) of the area of the image sensor (3) or the pixels of the image (B, B1) taken with said area for calibrating the relevant pixels (P1, P2),
And/or wherein for calibration purposes the radiation characteristics of a point of the object and/or subject are determined, wherein in the different images recorded by the image sensor the angle of incidence of the light beam of the point of the object or subject is determined and from this the characteristic curve of the intensity of the corresponding image point as a function of the angle of incidence is determined.
10. The method according to claim 1, characterized in that the corrected and/or generated dataset comprises spectral information improving the capturing of the image sensor (3), wherein at least two pixels (P1, P2) are determined in the image (B, B1), whose central wavelengths (Z) are known and which belong to an image region for which the same spectrum is assumed, and wherein the spectral information of the pixels (P1, P2) is integrated into total spectral information for at least one of the pixels (P1, P2).
11. Method according to claim 1, characterized in that for improving the captured spectral information of the image sensor (3) the following steps are carried out:
-taking at least two images (B, B1) of the same subject (M) from different perspectives or in case of a subject movement through the same area of the image sensor (3),
-Determining pixels (P1, P2) of said image (B, B1) which correspond to the same area of said subject (M),
Determining the center wavelength (Z) of the respective pixel (P1, P2),
A) Integrating the spectral information of the pixels (P1, P2) into total spectral information for at least one of the pixels (P1, P2),
And/or
B) The viewing angle from which the pixel is mapped is determined based on the center wavelength, and the intensity of the pixel is determined based on the viewing angle.
12. An apparatus (6) for determining a wavelength deviation of a captured image of a multi-lens imaging system (1), comprising:
- a determination unit (8) designed to determine a color gamut (F) of an area of the filter element (4) of the multi-lens imaging system (1) by direct measurement, said area corresponding to a predetermined area of the image sensor (3) of the multi-lens imaging system (1), wherein the color gamut (F) represents the spectral sensitivity of said area and comprises information about the central wavelength (Z) of light which is incident through the filter element (4) onto the pixels of the area of the image sensor (3), wherein the color gamut (F) is determined by capturing and/or providing an image (B, B1) of the subject (M) with the multispectral multi-lens imaging system (1),
A determination unit (9) designed to determine a center wavelength (Z) for at least two pixels (P1, P2) of a predetermined area of the image sensor (3) based on the determined color gamut (F),
-A processing unit (10) designed to correct an area of an image sensor (3) and/or an image (B, B1) taken with the image sensor (3) based on the determined center wavelength (Z) and/or to generate a complementary dataset for the image or for imaging control.
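The chain of the three units in claim 12 can be sketched as follows; the sensitivity-weighted mean used for the center wavelength (Z) and the linear intensity rescaling in the processing step are hypothetical choices for illustration, not the patented correction:

```python
# Illustrative pipeline mirroring the three units of claim 12:
# determine the spectral profile, derive a center wavelength, correct a pixel.
# Both formulas below are assumptions for illustration only.

def center_wavelength(wavelengths, sensitivities):
    """Center wavelength (Z) of a filter-element region, taken here as the
    sensitivity-weighted mean of the measured spectral profile."""
    total = sum(sensitivities)
    return sum(w * s for w, s in zip(wavelengths, sensitivities)) / total

def correct_pixel(intensity, measured_z, nominal_z, slope=0.001):
    """Toy correction: rescale a pixel intensity by the deviation of its
    measured center wavelength from the nominal one."""
    return intensity * (1.0 + slope * (nominal_z - measured_z))

# A symmetric sensitivity profile around 550 nm yields Z near 550 nm:
wl = [540.0, 550.0, 560.0]
sens = [0.2, 0.6, 0.2]
z = center_wavelength(wl, sens)
corrected = correct_pixel(0.5, z, 550.0)
```

A real implementation would apply such a correction per sensor region, since each region of the filter element has its own measured profile.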
13. A multi-lens camera system (1) comprising the apparatus of claim 12 and/or designed to implement the method of any one of claims 1 to 11.
CN202080092968.5A 2019-12-09 2020-12-07 Method and device for determining wavelength deviations of images captured by a multi-lens camera system Active CN114930136B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102019133516.7A DE102019133516B4 (en) 2019-12-09 2019-12-09 Method and device for determining wavelength deviations from recordings of a multi-lens camera system
DE102019133516.7 2019-12-09
PCT/DE2020/101034 WO2021115532A1 (en) 2019-12-09 2020-12-07 Method and device for determining wavelength deviations of images captured by a multi-lens camera system

Publications (2)

Publication Number Publication Date
CN114930136A CN114930136A (en) 2022-08-19
CN114930136B true CN114930136B (en) 2025-11-11

Family

ID=74187063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080092968.5A Active CN114930136B (en) 2019-12-09 2020-12-07 Method and device for determining wavelength deviations of images captured by a multi-lens camera system

Country Status (4)

Country Link
EP (1) EP4073478A1 (en)
CN (1) CN114930136B (en)
DE (1) DE102019133516B4 (en)
WO (1) WO2021115532A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8665440B1 (en) * 2011-02-10 2014-03-04 Physical Optics Corporation Pseudo-apposition eye spectral imaging system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE0402576D0 (en) * 2004-10-25 2004-10-25 Forskarpatent I Uppsala Ab Multispectral and hyperspectral imaging
DE102010041569B4 (en) * 2010-09-28 2017-04-06 Leica Geosystems Ag Digital camera system, color filter element for digital camera system, method for determining deviations between the cameras of a digital camera system and image processing unit for digital camera system
GB2488519A (en) * 2011-02-16 2012-09-05 St Microelectronics Res & Dev Multi-channel image sensor incorporating lenslet array and overlapping fields of view.
IN2014CN03038A (en) 2011-11-04 2015-07-03 Imec
CN104350732B (en) 2012-05-28 2018-04-06 株式会社尼康 Camera device
US10648960B2 (en) * 2015-05-29 2020-05-12 Rebellion Photonics, Inc. Hydrogen sulfide imaging system


Also Published As

Publication number Publication date
DE102019133516B4 (en) 2021-07-15
CN114930136A (en) 2022-08-19
EP4073478A1 (en) 2022-10-19
DE102019133516A1 (en) 2021-06-10
WO2021115532A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
US10560684B2 (en) System and methods for calibration of an array camera
US10127682B2 (en) System and methods for calibration of an array camera
KR102040368B1 (en) Hyper spectral image sensor and 3D Scanner using it
US9030528B2 (en) Multi-zone imaging sensor and lens array
Lin et al. Radiometric calibration from a single image
US10412286B2 (en) Multicamera imaging system and method for measuring illumination
US8717485B2 (en) Picture capturing apparatus and method using an image sensor, an optical element, and interpolation
EP3535553A1 (en) Calibration method and apparatus for active pixel hyperspectral sensors and cameras
WO2007133898A1 (en) Compensating for non-uniform illumination of object fields captured by a camera
US9638575B2 (en) Measuring apparatus, measuring system, and measuring method
KR20030028553A (en) Method and apparatus for image mosaicing
US12436032B2 (en) System, method and apparatus for wide wavelength range imaging with focus and image correction
CN107077722B (en) Image recording apparatus and method for recording images
US10609361B2 (en) Imaging systems with depth detection
JP6606231B2 (en) Camera and method for generating color images
US8243162B2 (en) Automatic white balancing using meter sensors
JP6942480B2 (en) Focus detector, focus detection method, and focus detection program
CN114930136B (en) Method and device for determining wavelength deviation of image shot by multi-lens shooting system
KR102655377B1 (en) Apparatus and method for measuring spectral radiance luminance and image
Cavanaugh et al. VNIR hypersensor camera system
US9699394B2 (en) Filter arrangement for image sensor
US7394541B1 (en) Ambient light analysis methods, imaging devices, and articles of manufacture
CN111551251A (en) Ordered spectral imaging
JP5409158B2 (en) Image pickup apparatus including single-plate color two-dimensional image pickup device
WO2024167808A1 (en) Sub-pixel sensor alignment in optical systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant