CN110383803A - Compensating for vignetting - Google Patents
- Publication number: CN110383803A
- Application number: CN201780087293.3A
- Authority: CN (China)
- Prior art keywords
- illumination
- image data
- illuminator
- light
- field of view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/81 — Camera processing pipelines; components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/10 — Cameras or camera modules comprising electronic image sensors; control thereof for generating image signals from different wavelengths
- H04N23/56 — Cameras or camera modules comprising electronic image sensors; control thereof, provided with illuminating means
- H04N23/74 — Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
- H04N25/441 — Extracting pixel data from image sensors by partially reading an SSIS array, reading contiguous pixels from selected rows or columns of the array, e.g. interlaced scanning
- H04N25/447 — Extracting pixel data from image sensors by partially reading an SSIS array, preserving the colour pattern with or without loss of information
- H04N25/46 — Extracting pixel data from image sensors by combining or binning pixels
- H04N25/534 — Control of the integration time by using differing integration times for different sensor regions depending on the spectral component
- H04N25/61 — Noise processing, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
Abstract
A system that includes an image capture system having a sensing efficiency that varies across the field of view of the image capture system can compensate for the variation in sensing efficiency using shaped illumination. An illuminator is configured to illuminate the field of view of the image capture system with illumination shaped to have higher intensity where the sensing efficiency is lower (for example, at the periphery of the field of view). The imaging system can thus provide image data with a more uniform signal-to-noise ratio. Image data from the unilluminated scene can be used to manipulate image data from the illuminated scene to produce improved image data.
Description
Cross-reference to related applications
This application claims the benefit of U.S. Provisional Patent Application No. 62/438,956, filed December 23, 2016, European Patent Application No. 17168779.1, filed April 28, 2017, and U.S. Non-Provisional Application No. 15/851,240, filed December 21, 2017, each of which is incorporated herein by reference as if fully set forth.
Background
In photography and imaging, vignetting is an effect in which one part of an image (such as the periphery) is darker or less saturated than another part. In some photography, vignetting may be intentional or desired to achieve a particular image effect or aesthetic. In other cases, unintended and undesired vignetting arises from camera limitations or improper settings. Digital image correction can reduce or eliminate undesirable vignetting; however, although digital image processing can improve the appearance of an image, it may not improve the accuracy of the underlying image data. Accordingly, in some applications (such as time-of-flight cameras or machine vision imaging), image post-processing may be ineffective, because the additional processing does not improve, and may degrade, the signal-to-noise ratio (SNR) of the image data.
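The point that post-processing gain cannot recover SNR can be sketched numerically. The toy simulation below is illustrative only (the Gaussian noise model and the specific values are assumptions, not from the patent): applying digital gain to a dim, vignetted region rescales signal and noise together, so the SNR of the "corrected" pixels does not improve.

```python
import random

# Toy illustration (assumed noise model): digital gain applied to a
# vignetted corner scales signal and noise equally, leaving SNR unchanged.
random.seed(0)

signal = 40.0        # mean intensity in a vignetted corner (arbitrary units)
noise_sigma = 8.0    # noise standard deviation (assumed)
gain = 2.5           # post-processing gain to "undo" the vignetting

samples = [random.gauss(signal, noise_sigma) for _ in range(100_000)]
corrected = [gain * s for s in samples]

def snr(xs):
    """Mean divided by standard deviation of a sample list."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return mean / var ** 0.5

print(round(snr(samples), 2))    # ~ signal / noise_sigma, about 5.0
print(round(snr(corrected), 2))  # same ratio: the gain cancels out
```

The gain brightens the corner, but both SNR values come out essentially identical, which is why the patent pursues shaped illumination at capture time instead.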
Summary of the invention
According to an aspect of the present invention, a scene or object can be illuminated so as to compensate for vignetting that would otherwise arise in an imaging system. As an example, the illumination corresponding to regions at the corners of the imaging system's field of view can be stronger than the illumination corresponding to the region at the center of the field of view. Further, the imaging system can employ an illuminator that provides illumination specifically shaped according to the capabilities of the image capture system. Some systems can manipulate image data based on two or more versions of the image data collected for a scene, for example by removing data from a first version based on image data from a second version. Some systems can use image capture techniques sensitive only to the emission of the system's own illuminator, which can avoid the influence of ambient lighting that is not under the system's control. Some of these imaging systems may be particularly well suited for 3D cameras, gesture-control cameras, or camera systems used in dark settings.
According to one embodiment, an imaging system includes an image capture system and an illuminator. The image capture system has a sensing efficiency that varies across the field of view of the image capture system, and the illuminator illuminates the field of view of the image capture system with illumination having a distribution shaped to have higher intensity where the sensing efficiency is lower.
According to another embodiment, a method for collecting image data includes illuminating a scene with illumination having a shaped distribution, the distribution having higher intensity where the sensing efficiency of an image capture system is lower. The image capture system can then capture image data representing the scene as illuminated with the shaped distribution. The image data can have a substantially uniform signal-to-noise ratio, or can be used to generate image data having a substantially uniform signal-to-noise ratio.
Brief description of the drawings
The accompanying drawings, which are included to provide a further understanding of the disclosed subject matter, are incorporated in and constitute a part of this specification. The drawings also illustrate embodiments of the disclosed subject matter and, together with the detailed description, serve to explain the principles of embodiments of the disclosed subject matter. No attempt is made to show structural details in more detail than is necessary for a fundamental understanding of the disclosed subject matter and the various ways in which it may be practiced.
Fig. 1 shows a block diagram of a system including an illuminator that provides illumination shaped to reduce or avoid the effects of vignetting on image data;
Fig. 2A shows a contour plot illustrating how the sensing efficiency of an image capture system may decline toward the edges of the image capture system's field of view;
Fig. 2B shows a contour plot illustrating the angular dependence of illumination shaped to compensate for the sensing-efficiency variation illustrated in Fig. 2A;
Fig. 3A shows a perspective view of an illuminator that uses optics to provide rotationally symmetric shaped illumination;
Fig. 3B shows a perspective view of an illuminator that uses a molded lens to provide shaped illumination efficiently limited to the field of view of the imaging system;
Fig. 4 shows a cross-sectional view of an illuminator that uses a divergent light source to produce illumination with a shaped distribution;
Figs. 5A and 5B show exploded and assembled views, respectively, of an illuminator using an array of semiconductor lasers;
Fig. 6 shows an illuminator using a light source that produces light with a spatially non-uniform intensity distribution in order to produce illumination with an angularly non-uniform intensity distribution; and
Fig. 7 shows a block diagram of a processing system that uses an illuminator providing illumination shaped to compensate for characteristics of an image capture system.
Detailed description
According to embodiments described herein, an imaging system can use an illuminator configured to provide illumination shaped to reduce or avoid vignetting or other variations in an image, and/or to provide image data with a more uniform signal-to-noise ratio (SNR), particularly in comparison with conventional image data in which vignetting can be corrected during a post-processing stage at the cost of a non-uniform and/or degraded SNR. The illuminator may be specifically adapted to provide illumination with an intensity distribution whose shape is based on characteristics of the image capture system (for example, the capabilities of its imaging optics and sensing array). The illuminator can provide a fixed illumination shape (for example, a shape produced by the illuminator's hardware, lenses, and light source), or an illumination that can be programmed or changed (for example, according to current ambient lighting conditions or current settings of the image capture system). In some embodiments, the imaging system may include a matched illuminator and image capture system, so that the illumination compensates for the non-uniform sensing efficiency of the image capture system, and so that the image capture system is sensitive only to light from the illuminator.
Fig. 1 is a block diagram of a system 100 according to one embodiment of the invention. System 100 can be any type of device, or set of devices, that captures image data. For example, system 100 can be a camera such as a security camera, 3D camera, depth-sensing camera, night-vision camera, or biometric identification or biometric sensing camera; can be, or be part of, a multifunction device (such as a mobile phone, computer, robot, industrial system, or vehicle employing imaging capability); or can be any type of system that needs to represent or acquire data on a scene in its field of view. System 100 can in particular be, or include, a system that processes image data, and as described further below, system 100 can capture or generate image data whose SNR is uniform across the different regions of an image. It should be understood that although implementations using a system are described, the system, as disclosed herein, can be a single device or a combination of devices.

As shown, system 100 includes an illuminator 110, an image capture system 120, and a processing system 130, and system 100 operates to capture images or image data from a field of view 150. The separation between components of imaging system 100 (in particular, between illuminator 110 and image capture system 120) can be small relative to the objects or scene in field of view 150, so that the optical axes of illuminator 110 and image capture system 120 can be approximately collinear with each other.
Illuminator 110 can be a lighting system capable of providing illumination of at least part of an object or scene covering the field of view 150 of image capture system 120. Illuminator 110 may include one or more light sources, such as light-emitting diodes (LEDs) or semiconductor lasers, together with optics configured to provide shaped illumination (for example, illumination modified so that the intensity profile of light from illuminator 110 varies with angle relative to the optical axis of illuminator 110) as a flash or pulse or as continuous illumination. Some examples of embodiments of suitable illuminators are described herein, and illuminator 110 is used herein as a general reference that can apply to one or more suitable illuminators.
Image capture system 120 can be a camera capable of capturing still images or image sequences (for example, video). Image capture system 120 can be of any design, including designs known in the camera art. In the general configuration illustrated in Fig. 1, image capture system 120 includes imaging optics 122 and an image sensor 124. In a typical configuration, imaging optics 122 may include, for example, one or more lenses, a focusing system, and an aperture control system that operate to form an image on image sensor 124. Image sensor 124, which may be, for example, a charge-coupled device (CCD) or a CMOS sensor array, senses light from the image formed by imaging optics 122 and provides image data to processing system 130. The image data may, for example, include one or more arrays of pixel values, where each pixel value represents the intensity, color, shade, reflectance, or spectral content that a pixel sensor senses for a corresponding area or angular range in field of view 150.
Processing system 130 may provide general control of image capture system 120, for example to set parameters for image capture or to initiate image capture, and processing system 130 may include a conventional microcontroller with appropriate software or firmware for those functions. Processing system 130 may in particular set or detect the settings or characteristics of image capture system 120 that control the boundaries of field of view 150, and the efficiency with which pixel sensors in image sensor 124 sense light from corresponding regions of field of view 150. Processing system 130 can further control illuminator 110 to provide illumination appropriate for the current configuration of image capture system 120 (for example, the focus or magnification of image capture system 120) or for ambient lighting that may be present during image capture. Processing system 130 can further process the image data from image capture system 120, for example, simply storing the image data in data storage (not shown), or implementing functions such as identifying objects, extracting depth information, performing color correction, or performing spectral analysis. Processing system 130 can further process image data based on manipulation of two or more captured sets of image data, such as by subtracting first image data from second image data, as further disclosed herein.
The configuration of imaging optics 122 and the size and configuration of image sensor 124 may determine or limit the boundaries of field of view 150. In many configurations, however, image capture system 120 may not provide uniform sensitivity, or uniform sensing efficiency, over the entirety of field of view 150. For example, the size of the aperture in imaging optics 122 may make image capture system 120 less able to collect light from the periphery of field of view 150, which leads to vignetting. Fig. 2A illustrates an example of the sensing efficiency of an example image capture system. The sensing efficiency of a pixel sensor in an imaging system can be defined as, or determined from, the ratio of the intensity sensed by the pixel sensor to the intensity of light originating from the area or angular range of the field of view corresponding to that pixel. This sensing efficiency can be illustrated for an ideal image capture system by considering a uniformly illuminated, uniform field (for example, a white screen that is evenly lit and sensed by the sensing array in the image capture system). In that case, the sensor array in an ideal image capture system would measure uniform intensity across the whole area of the sensor array. In a real image capture system, different pixel sensors in the image sensor may measure different intensities, for example because of limitations in the image capture system's ability to collect light from different regions of the field of view. As shown in Fig. 2A, a typical image capture system subject to vignetting may have its highest sensing efficiency at the center of the image capture system's field of view, perhaps because the system can capture a larger fraction of the light from the center of the field of view. Fig. 2A shows the center of the field of view with peak sensing efficiency (assigned a relative value of 1.0). The contours in Fig. 2A show how the sensing efficiency declines for pixel sensors located closer to the edges of the image sensor, which leads to vignetting in images and to lower signal-to-noise ratios in the pixel data associated with pixels near the edges of the field of view. With an image capture system having sensing efficiency as shown in the example of Fig. 2A, an image of a uniformly illuminated scene would, at the corners of the field of view, have measured intensity levels about 40% of the intensity levels that an ideal imaging system would measure.
Lower measured intensity levels, such as those shown toward the corners in the example of Fig. 2A, can lead to degradation of the signal-to-noise ratio, because the signal-to-noise ratio generally depends on the measured intensity level. Here, the lower intensity levels may correspond to lower signal-to-noise ratios at the corners of the field of view of Fig. 2A. Attempting to improve the image quality at the corners of the field of view using post-processing techniques may result in even lower (worse) signal-to-noise ratios. This may be particularly problematic in certain imaging systems, such as security cameras, time-of-flight cameras, depth-sensing cameras, 3D imagers, night-vision cameras, and the like.
Illuminator 110 can provide illumination with an intensity distribution shaped to compensate for vignetting or other variations in sensing sensitivity. In particular, the illumination can have a pattern shaped in angular space. Fig. 2B shows a contour plot illustrating an intensity distribution, according to an embodiment of the disclosed subject matter, that compensates for the non-uniform sensing efficiency illustrated in Fig. 2A. The illumination intensity across the field of view shown in Fig. 2B can be inversely proportional to the corresponding sensing efficiency across the field of view shown in Fig. 2A. The illumination in Fig. 2B is plotted as a function of the angle of light from illuminator 110 relative to the center of the field of view of illuminator 110. For light from a scene in the field of view of the image capture system, the corresponding pixel sensors in the image capture system can detect light from the corresponding angles. Fig. 2B does not show illumination of regions outside the field of view of the imaging system. Outside the field of view of the imaging system, no illumination is needed for imaging, so outside the range of field-of-view angles the illumination intensity from illuminator 110 is not critical and can be zero.

The illumination provided by illuminator 110, as represented in Fig. 2B, has its minimum illumination intensity at the center of the field of view (assigned a relative intensity factor of 1.0). The illumination increases in intensity toward the edges of the field of view. For example, at the corners of the field of view, the illumination has a relative intensity of 2.5, which is 250% of the illumination at the center of the field of view. If the increase in illumination intensity compensates for the decrease in the sensing efficiency of image capture, an imaging system using such illumination can achieve an effective sensing efficiency that is uniform across the sensing array. In the illustrated example, for each pixel sensor, the product of the relative sensing factor from Fig. 2A and the corresponding relative intensity factor from Fig. 2B is uniform. At the center, the product of the relative capture factor 1.0 and the relative illumination factor 1.0 is 1, and at the corners, the product of the relative capture factor 0.4 and the relative illumination factor 2.5 is also 1.
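The inverse relationship described above can be sketched numerically. The radial falloff model below is an assumption chosen only so that the center and corner values match the figures (relative efficiency 1.0 at the center, 0.4 at a corner); the patent does not prescribe a particular falloff function.

```python
# Sketch (assumed falloff model): shape the illumination profile as the
# reciprocal of the sensing efficiency, so that efficiency * illumination
# is constant across the field of view.

def sensing_efficiency(x, y):
    """Toy radial falloff: 1.0 on the optical axis, 0.4 at corners (±1, ±1)."""
    r2 = x * x + y * y           # squared distance from the optical axis
    return 1.0 - 0.3 * r2        # 1 - 0.3 * 2 = 0.4 at a corner

def shaped_illumination(x, y):
    """Relative illumination intensity compensating the falloff."""
    return 1.0 / sensing_efficiency(x, y)

# Effective sensing efficiency is uniform: the product is 1.0 everywhere.
for (x, y) in [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]:
    product = sensing_efficiency(x, y) * shaped_illumination(x, y)
    print(f"({x}, {y}): illum={shaped_illumination(x, y):.2f} product={product:.2f}")
```

At the corner (1.0, 1.0) this reproduces the figures' values: efficiency 0.4, relative illumination 2.5, product 1.0.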
The illumination illustrated in Fig. 2B can completely, or substantially completely, compensate for the vignetting or SNR variation in an image or image data that would otherwise be caused by the variable sensing efficiency illustrated in Fig. 2A. According to embodiments of the disclosed subject matter, the image capture system can capture only reflected light for which illuminator 110 is the original source. This can be realized (at least approximately) by using an image capture system sensitive only to light with the characteristics (for example, the polarization, frequency, or wavelength) of the light from illuminator 110. For example, illuminator 110 can produce illumination in a narrow band, and the image capture system may include a spectral filter tuned to the illumination band so as to pass most of the light from illuminator 110. Further, illuminator 110 can be switched on and off to capture images of the scene under "dark" and "bright" illumination, and the processing system can subtract the image data corresponding to the dark image from the bright image data, to produce image data corresponding to the image that would result if the scene were only under illumination from illuminator 110. Synchronous image-detection techniques can similarly modulate (switch on and off) the light from illuminator 110 with a frequency and phase selected according to the camera's frame rate or image-capture timing. For example, the flashes of illumination from illuminator 110 can have a frequency much higher than the camera frame rate and/or can provide a phase difference relative to image capture in the camera, so that time-of-flight calculations can use the image data to determine the distance of objects that illuminator 110 illuminates. Variations on synchronous image-detection techniques can be used for time-of-flight imaging.
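The distance relation underlying the time-of-flight calculation mentioned above can be sketched briefly. This is standard physics, not a patent-specific algorithm: the light travels from the illuminator to the object and back, so the distance is half the round-trip path.

```python
# Standard time-of-flight relation (not from the patent):
# distance = speed_of_light * round_trip_time / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(seconds):
    """Distance to a reflecting object given the measured round-trip time."""
    return C * seconds / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
print(round(distance_from_round_trip(10e-9), 3))
```

In a real modulated-illumination system, the round-trip time would be derived from the measured phase difference between the emitted flashes and the captured signal rather than timed directly.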
A camera or image capture system may capture both ambient light and light from illuminator 110 reflected from objects in the field of view. According to embodiments of the disclosed subject matter, illuminator 110 can provide illumination that changes based on, or is adapted to, the ambient lighting that is expected or measured for the field of view. For example, for a uniformly lit scene, illuminator 110 can be operated to provide illumination that, when added to the uniform ambient light, provides the overall intensity shown in Fig. 2B, and therefore compensates for the relative sensing efficiency illustrated in Fig. 2A. In general, the ambient light in a scene (such as from conventional lighting sources, e.g., room lighting or a conventional camera flash system) may not be uniform and may aggravate vignetting, and an illuminator 110 used with such ambient light may need to provide illumination shaped to compensate for the effects of both the limitations of the image capture system and the ambient lighting. In particular, illuminator 110 may therefore need to compensate for both illumination vignetting and imaging vignetting, requiring illuminator 110 to produce a stronger compensation than that illustrated by the example of Figs. 2A and 2B.
The image capture system can capture two or more sets of image data of a given scene. The two or more sets of image data can be collected, for example, by capturing image data back to back. First image data can be collected while illuminator 110 illuminates the scene. Second image data can be captured without illumination from illuminator 110, that is, image data of the scene captured while ambient light illuminates the scene. A processing component, such as a microprocessor, microcontroller, or any other applicable processing component, may be configured to manipulate the two or more sets of image data to generate improved image data usable for security systems, 3D systems, depth sensing, object identification, night vision, biometric identification and sensing, and the like. The improved image data can combine the first and second image data so that the resulting improved image data has a uniform signal-to-noise ratio. Alternatively, the improved image data can be the result of subtracting the second image data, captured under ambient light, from the first image data, captured while illuminator 110 illuminates the scene. For example, such improved image data can allow the processing component to collect image data in the same format in which comparable data is stored in the system. The improved image data can be compared with comparable image data, such as image data used for facial recognition or object identification. More specifically, subtracting the second image data captured under ambient light from the first image data captured under the illuminator's light can result in improved image data that corresponds to a standardized data format. In this example, the influence of ambient light on the scene can be removed from the improved image data, allowing it to be compared with other standard image data.
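The bright-minus-dark subtraction described above can be sketched as follows. The frame shapes and pixel values are illustrative assumptions; the patent does not prescribe an implementation.

```python
# Sketch of the bright/dark frame subtraction described above
# (illustrative values; clamping at zero guards against noise making
# a "dark" pixel brighter than its "bright" counterpart).

def subtract_ambient(bright, dark):
    """Subtract the ambient-only ('dark') frame from the illuminated
    ('bright') frame, pixel by pixel, leaving only the contribution
    of the system's own illuminator."""
    return [
        [max(b - d, 0) for b, d in zip(brow, drow)]
        for brow, drow in zip(bright, dark)
    ]

# Bright frame: illuminator + ambient light; dark frame: ambient only.
bright_frame = [[90, 120], [110, 95]]
ambient_frame = [[30, 25], [40, 35]]

illuminator_only = subtract_ambient(bright_frame, ambient_frame)
print(illuminator_only)  # [[60, 95], [70, 60]]
```

The resulting frame depends only on the illuminator's light, so it can be compared against reference data captured under the same standardized illumination.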
A variety of different architectures can be used to implement illuminator 110 of Fig. 1. For example, Fig. 3A illustrates components of an illuminator 300 including a light source 310 and optics 320. Light source 310 can be any component arranged to emit light, such as an LED or LED array, and optics 320 may include one or more optical elements, such as a Fresnel lens, a grating, or other structures that alter or control the angular distribution or shape of the illumination emitted from optics 320. In the assembled illuminator 300, optics 320 can be attached to light source 310.

The optics 320 of illuminator 300 of Fig. 3A can produce a rotationally symmetric illumination pattern and may be easier to design and produce, for example using circular optical elements. When the image capture system has a rectangular field of view, however, rotationally symmetric illumination may result in a large amount of wasted light, that is, light outside the field of view. Fig. 3B shows an illuminator 350 that uses optics 370 including rectangular optical elements, to use the radiation from a light source 360 more efficiently in providing illumination to a rectangular field of view.
A luminaire that uses an LED as its light source (e.g., light source 310 or 360) may need an optical system (e.g., optics 320 or 370) that reshapes the angular dependence of the intensity distribution of the light emitted from the light source. The optical system may therefore be designed according to the light-emission characteristics of the light source. For example, Fig. 4 shows a cross-sectional view of a luminaire 400 including a light source 410, which may be an LED or another device emitting diverging light 415 from a source area. The intensity of the light 415 from the light source 410 depends on the construction of the light source 410, and a typical light source may have, for example, a uniform or Lambertian angular intensity distribution. As noted above, the desired angular distribution of the illumination 425 from the luminaire 400 may need to differ, for example, a distribution with greater intensity at larger angles from the optical axis of the luminaire. To increase the intensity at the edges of the field of view, an optical element 420 can fold the light diverging from the LED so that the light is denser (i.e., brighter) in regions or directions where more light is desired, for example, brighter at larger divergence angles.
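One common model for the falloff being compensated (an assumption here; the disclosure does not specify a profile) is that image irradiance drops roughly as cos^4 of the field angle, so the target emission profile grows as 1/cos^4:

```python
import math

def target_intensity(theta_deg, i_center=1.0):
    """Relative luminaire intensity that offsets a cos^4(theta) lens
    falloff, so the captured image stays uniformly exposed."""
    theta = math.radians(theta_deg)
    return i_center / math.cos(theta) ** 4

# The shaped beam must be brightest at the largest divergence angles.
for angle in (0, 15, 30):
    print(f"{angle:2d} deg -> {target_intensity(angle):.3f}x")
```

At 30 degrees off-axis the luminaire would need to emit roughly 1.78 times its on-axis intensity under this model.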
The same principle illustrated in Fig. 4 for a single source 410 of diverging light can be repeated for an array of such light sources. For example, in another alternative configuration, the luminaire 110 includes an LED array and an array of optical elements so that the illumination is shaped and the illumination from the LEDs is directed into a rectangular field of view. For example, U.S. Patent No. 8,729,571, entitled "Multiple die LED and lens optical system," discloses architectures for LED arrays or systems with controlled light distributions.
For example, Fig. 5A illustrates the components of a luminaire 500 that uses a semiconductor laser array with a diffuser as primary optics. The luminaire 500 includes an integrated circuit package 510 containing an array 512 of vertical-cavity surface-emitting lasers (VCSELs) or other semiconductor lasers. A diffuser 520 is shown in Fig. 5A on a carrier 522, which may be used during the manufacture of the diffuser 520 or to attach the diffuser 520 to the integrated circuit package 510 in the assembled luminaire 500, as shown in Fig. 5B. The diffuser 520 may be an engineered diffuser with finely controlled characteristics, for example, an array of microlenses or other optical elements with predefined shapes and positions. Individual lasers in the array 512 may generate light with a divergence of about 20 degrees. Depending on the spacing of the lasers in the array 512 and the spacing of the lenslets or other optical elements in the diffuser 520, each laser may illuminate one or more lenslets or optical elements of the diffuser 520. In one configuration, the diffuser 520 may individually shape the light of each laser in the array 512. More generally, the diffuser 520 transforms the light from the array 512 into a diverging beam with a desired angular intensity distribution, such as a rectangular angular fan. As shown in Fig. 5B, the diffuser 520 may be attached to the package 510 so that the diffuser 520 receives the light from the array 512.
The distribution of illumination from the luminaire 500 can be further controlled through the electrical power or current applied individually to the lasers in the array 512. More generally, while some light emitters approximate point sources, each light-emitting area will still have some size or extent, and an optical system tends to produce a distorted image of the light-emitting area. In general, light from the periphery of the light-emitting area (e.g., the edges of an LED chip) tends to illuminate the periphery of a site or scene. Accordingly, the spatial distribution of light from an extended source such as the array 512 can be set or altered to influence the angular radiation pattern of the illumination beyond optics such as the diffuser 520.
In accordance with an embodiment of the disclosed subject matter, Fig. 6 shows a luminaire 600 that uses a light source 610 that is extended and spatially non-uniform in order to generate a desired angular distribution of intensity in the output illumination. The light source 610 may be made up of multiple, possibly individually addressable, emitters (e.g., a VCSEL array), or may be a single extended source that generates a spatial distribution of illumination that varies across the light-emitting area of the extended source. In the illustrated configuration, the light source 610 is positioned to generate light originating in the focal plane of an optical system 620. Fig. 6 illustrates a particularly simple embodiment in which the optical system 620 is a lens, but other or more complex optical systems may be used to convert spatial positions on the light source 610 to far-field angles or to otherwise associate the spatial and angular distributions of the light. With the illustrated configuration, the optical system 620 directs or projects light from different regions of the light source 610 to different far-field angles. For example, the optical system 620 directs light originating from a region 612 of the light source 610 on the optical axis of the optical system 620 as light 622 parallel to the optical axis, but directs light 624 from an off-axis region 614 of the light source 610 along a direction that depends on the focal length f of the optical system 620 and the position of the region 614 relative to the optical axis of the optical system 620. The angular distribution of light from the luminaire 600 is thus tied to the spatial distribution of light emitted from the light source 610, and brighter areas of the light source 610 produce brighter angular regions in the illumination radiation pattern from the luminaire 600. Accordingly, to generate brighter illumination at larger angles, the light source 610 can be designed or operated to be brighter at its periphery.
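The mapping described for Fig. 6 follows the standard focal-plane relation: a source point at height x in the focal plane of a lens of focal length f exits as a collimated beam at angle atan(x/f). The numbers below are illustrative, not taken from the disclosure:

```python
import math

def far_field_angle_deg(x, f):
    """Far-field angle of the beam produced by a source point at
    height x in the focal plane of a lens of focal length f."""
    return math.degrees(math.atan2(x, f))

# An on-axis region (like 612) exits parallel to the optical axis;
# a region (like 614) 1 mm off-axis behind a 5 mm lens exits tilted.
print(far_field_angle_deg(0.0, 5.0))
print(round(far_field_angle_deg(1.0, 5.0), 2))
```

Because of this one-to-one mapping, making the source brighter at a given height directly brightens the corresponding far-field angle.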
In one embodiment, the light source 610 may be a set or array of individually controllable or individually addressable optical elements whose intensities can be programmed or controlled to achieve a desired shape of the emitted illumination. For example, with an array of individually controllable optical elements in the light source 610, a control or processing system can operate the optical elements in the array independently so that the light source 610 generates a spatial distribution of light that produces the desired far-field illumination from the luminaire 600. The illumination from the luminaire 600 can therefore be tailored to the sensing efficiency of the image capture system and, if necessary, changed as the image capture system changes due to aging or changed settings. In one embodiment, a processing system can execute software that alters the illumination from the light source 610 according to changes in the focus, magnification, or another characteristic of the image capture system used with the luminaire 600.
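With individually addressable emitters, the control software described above could set each emitter's drive level inversely to the camera's sensing efficiency at the field position that emitter illuminates. This sketch assumes a simple linear emitter response; `drive_levels` and the efficiency values are hypothetical, not from the disclosure:

```python
def drive_levels(efficiencies, max_drive=1.0):
    """Per-emitter drive levels inversely proportional to sensing
    efficiency, normalized so the least efficient field position
    (typically a corner) receives max_drive."""
    low = min(efficiencies)
    return [max_drive * low / e for e in efficiencies]

# Hypothetical sensing efficiencies from field center to corner:
print(drive_levels([1.0, 0.5, 0.25]))  # corners driven hardest
```

Recomputing these levels whenever the camera's focus or magnification changes would implement the adaptive behavior described above.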
Fig. 7 is a block diagram of another embodiment of a processing system 700 that uses a luminaire 710 to shape the illumination directed into the field of view 750 of an image capture system 720. The processing system 700 may be a computer (e.g., a general-purpose computing system) or a system primarily directed at implementing a specific set of functions (e.g., a mobile phone, tablet computer, wearable device, or time-of-flight camera), and the components of the processing system 700 may be integrated into a single device (e.g., a portable or handheld device) or made up of multiple detachable parts (e.g., a computer with one or more peripheral devices). The processing system 700 may specifically include a processor 730 with associated processing hardware that allows the processor 730 to access data (e.g., image data 745) and execute instructions (e.g., software or firmware) that may be stored in memory 740. For example, the processor 730 may include one or more central processing units (CPUs) or processing cores capable of executing program instructions, and may include hardware enabling the CPUs or cores to control the operation of connected equipment such as the luminaire 710 and the image capture system 720.
The memory 740 may include volatile or non-volatile random access memory (RAM) forming part of the address space of the processor 730. Fig. 7 illustrates an example implementation in which the memory 740 for the processor 730 includes a set of executable program modules 742, 744, 746, and 748. (Instead of, or in addition to, residing in addressable memory, such modules 742, 744, 746, and 748 may also be stored on a storage medium or storage device 770, which may include, for example, a hard disk drive or a removable memory device.) The modules 742, 744, 746, and 748 may serve various purposes and may be selectively executed when the system 700 implements particular processes or functions. For example, the processor 730 may execute a user interface 742 to control input and output devices in the system 700 and to receive commands or information or to provide information or content. In particular, in addition to the luminaire 710 and the image capture system 720, the system 700 further includes interface hardware 760, which may include input devices that a user can operate to provide input or commands, such as switches, buttons, a keypad, a keyboard, a mouse, a touch screen, or a microphone, and the processor 730 can execute the user interface 742 to control the input devices and interpret user actions. The interface hardware 760 may also include conventional output devices, such as a speaker, an audio system, or a touch screen or other display, and the processor 730 can execute the user interface 742 to output information via the output devices. The interface hardware 760 may also include a network interface that enables the system 700 to receive or send information over a network such as a local area network, a wide area network, a telecommunications network, or the Internet.
The processor 730 can execute an image capture control module 744 when capturing data with the image capture system 720, for example, in response to a command to capture an image with the image capture system 720 or a command to change a setting (e.g., focus, aperture, filter, or lens) used in the image capture system 720. According to an aspect disclosed herein, when image capture begins, the luminaire 710 generates illumination that is shaped to compensate for limitations of the image capture system 720, for example, to reduce vignetting in the captured image data. In the system 700, the processor 730 can execute a luminaire control module 746 to control when and how the luminaire 710 operates. The particular luminaire control process will generally depend on the capabilities of the luminaire 710. For example, if the luminaire 710 provides illumination with one fixed shape, a light control process can operate the luminaire 710 in synchrony with the image capture system 720 (e.g., at the same time as image capture or at a specified time offset relative to image capture). If the luminaire 710 can produce illumination with a programmable shape, for example, if the luminaire 710 is similar or identical to the luminaire 600 of Fig. 6, the processor 730 can execute the luminaire control module 746 to select and generate a desired shape of the illumination from the luminaire 710. In such an embodiment, execution of the luminaire control module 746 can identify target illumination based on the current settings of the image capture system 720 (i.e., based on the focus, aperture, filter, or other features currently used by the image capture system 720). A process produced by execution of the luminaire control module 746 can further detect ambient lighting in the field of view 750, and the luminaire 710 can be operated to direct into the field of view 750 illumination that, alone or in combination with the ambient lighting, achieves the identified target illumination of the field of view 750. With either fixed or programmable illumination, the captured image data can have a more uniform SNR than would be achieved with ambient lighting or with a conventional flash or lighting system.
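The SNR claim can be illustrated with a shot-noise model (an assumption; the disclosure does not specify a noise model): boosting the luminaire signal at dim field positions brings their SNR up to that of the center.

```python
import math

def snr(signal, background):
    """Shot-noise-limited SNR: signal electrons divided by the noise
    of all collected electrons (signal plus ambient background)."""
    return signal / math.sqrt(signal + background)

# Unshaped flash: an edge pixel collects less luminaire light.
center, dim_edge = snr(1000, 100), snr(400, 100)
# Shaped illumination restores the edge signal to the center's level.
bright_edge = snr(1000, 100)
print(round(center, 1), round(dim_edge, 1), round(bright_edge, 1))
```

Under this model the shaped case yields the same SNR at center and edge, which is the "substantially uniform signal-to-noise ratio" recited in the claims.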
The image data 745 stored in the memory 740 (or in the storage 770) can represent one or more images or frames captured by the system 700. In addition to capturing the image data 745, the system 700 can also process the image data. In particular, the processor 730 can execute an image processing module 748 to implement processing functions, such as identifying objects represented in the image data 745, extracting depth or distance information for objects represented in the image data 745, performing color correction of the image data 745, or performing spectral analysis of the image data 745.
The systems and methods described above can use shaped illumination to compensate for undesired effects such as vignetting and can provide image data with a more uniform SNR across the image. Such systems and methods can be particularly useful where the image data may need to be further processed for machine vision, object recognition, 3D modeling, or other purposes. Some applications that can particularly benefit from these capabilities include, but are not limited to, security cameras, 3D cameras, depth sensing, object recognition, night-vision cameras, biometric identification cameras, and biometric sensing cameras.
Although particular embodiments or implementations of the present invention have been shown and described, it will be apparent to those skilled in the art that changes and modifications can be made in its broader aspects without departing from the present invention, and the appended claims should therefore cover within their scope all such changes and modifications as fall within the true spirit and scope of the present invention.
Although the features and elements are described above in particular combinations, those of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, read-only memory (ROM), random access memory (RAM), registers, cache memory, semiconductor memory devices, magnetic media (such as internal hard disks and removable disks), magneto-optical media, and optical media (such as CD-ROM disks and digital versatile discs (DVDs)).
Claims (20)
1. A system, comprising:
an image capture system having a sensing efficiency that varies across a field of view of the image capture system;
a luminaire configured to illuminate the field of view of the image capture system, wherein the luminaire generates illumination having a distribution shaped to have higher intensity where the sensing efficiency is lower; and
a processing component configured to use second image data corresponding to a scene under ambient lighting to manipulate first image data corresponding to the scene illuminated by the luminaire.
2. The system of claim 1, wherein image data representing a scene and captured by the image capture system has, as a result of the illumination, a substantially uniform signal-to-noise ratio.
3. The system of claim 1, wherein the first image data is combined with the second image data.
4. The system of claim 1, wherein the second image data is removed from the first image data.
5. The system of claim 1, wherein the illumination has an intensity at a center of the field of view that is lower than an intensity of the illumination at an edge of the field of view.
6. The system of claim 1, wherein the luminaire comprises:
a light source; and
shaping optics that direct light of a first intensity from the light source to a first area of the field of view and direct light of a second intensity from the light source to a second area of the field of view, wherein the second area corresponds to where the sensing efficiency of the image capture system is lower; and
the second intensity is greater than the first intensity.
7. The system of claim 6, wherein the light source generates light from a source area, and the light from the source area is spatially non-uniform.
8. The system of claim 7, wherein the light source comprises multiple optical elements distributed across the source area and operated to create the spatially non-uniform illumination.
9. The system of claim 6, wherein the shaping optics alter the illumination from the light source so that the illumination from the luminaire has a rectangular cross-section.
10. The system of claim 6, wherein the shaping optics comprise a lens, and the source area lies in a focal plane of the lens.
11. The system of claim 1, wherein the luminaire generates rotationally symmetric illumination.
12. The system of claim 1, wherein the luminaire comprises an array of semiconductor lasers.
13. The system of claim 12, wherein the luminaire further comprises a diffuser positioned to control divergence of beams from the semiconductor lasers.
14. The system of claim 1, wherein the system is selected from a group consisting of a camera, a mobile phone, a computer, an object identification system, and a time-of-flight measurement system.
15. A method for collecting image data, comprising:
determining a sensing efficiency of an image capture system at multiple points;
illuminating a scene with illumination having a shaped distribution, the shaped distribution having higher intensity where the sensing efficiency of the image capture system is lower;
capturing, with the image capture system, first image data representing the scene illuminated with the illumination having the shaped distribution; and
capturing, with the image capture system, second image data representing the scene in ambient light without the illumination having the shaped distribution.
16. The method of claim 15, wherein the illumination at a periphery of a field of view of the imaging system is brighter than the illumination at a center of the field of view of the imaging system.
17. The method of claim 15, further comprising combining the first image data and the second image data to produce third image data having a substantially uniform signal-to-noise ratio.
18. The method of claim 15, further comprising subtracting, from the first image data, at least a portion of the second image data, the first image data representing the scene illuminated with the illumination having the shaped distribution and the second image data representing the scene in ambient light, to produce third image data having a substantially uniform signal-to-noise ratio.
19. The method of claim 15, wherein the image capture system is configured to capture the first image data based on a characteristic of the illumination, the first image data representing the scene illuminated with the illumination having the shaped distribution.
20. The method of claim 19, wherein the characteristic of the illumination is selected from a group consisting of polarization, wavelength, and frequency.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110678977.9A CN113329140A (en) | 2016-12-23 | 2017-12-22 | Compensating vignetting |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662438956P | 2016-12-23 | 2016-12-23 | |
US62/438956 | 2016-12-23 | ||
EP17168779 | 2017-04-28 | ||
EP17168779.1 | 2017-04-28 | ||
US15/851240 | 2017-12-21 | ||
US15/851,240 US10582169B2 (en) | 2016-12-23 | 2017-12-21 | Compensating for vignetting |
PCT/US2017/068087 WO2018119345A1 (en) | 2016-12-23 | 2017-12-22 | Compensating for vignetting |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110678977.9A Division CN113329140A (en) | 2016-12-23 | 2017-12-22 | Compensating vignetting |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110383803A true CN110383803A (en) | 2019-10-25 |
Family
ID=60955412
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780087293.3A Pending CN110383803A (en) | 2016-12-23 | 2017-12-22 | Compensate vignetting |
CN202110678977.9A Pending CN113329140A (en) | 2016-12-23 | 2017-12-22 | Compensating vignetting |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110678977.9A Pending CN113329140A (en) | 2016-12-23 | 2017-12-22 | Compensating vignetting |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3560184B1 (en) |
JP (1) | JP7386703B2 (en) |
KR (1) | KR102258568B1 (en) |
CN (2) | CN110383803A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111555116A (en) * | 2020-06-19 | 2020-08-18 | 宁波舜宇奥来技术有限公司 | Inverted vcsel light source and TOF module |
US10893244B2 (en) | 2016-12-23 | 2021-01-12 | Lumileds Llc | Compensating for vignetting |
CN114207471A (en) * | 2021-05-21 | 2022-03-18 | 深圳市汇顶科技股份有限公司 | Launching device and electronic equipment for time-of-flight depth detection |
CN114502985A (en) * | 2021-05-21 | 2022-05-13 | 深圳市汇顶科技股份有限公司 | Emitting device for flight time depth detection and electronic equipment |
CN116648639A (en) * | 2020-10-26 | 2023-08-25 | 捷普光学德国有限公司 | Surround View Imaging System |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1233606A3 (en) * | 2001-02-16 | 2004-03-31 | Hewlett-Packard Company, A Delaware Corporation | Digital cameras |
WO2007078961A2 (en) * | 2005-12-29 | 2007-07-12 | Motorola, Inc. | Illumination mechanism for mobile digital imaging |
US20080106636A1 (en) * | 2006-11-08 | 2008-05-08 | Sony Ericsson Mobile Communications Ab | Camera and method in a camera |
US20100020227A1 (en) * | 2008-07-25 | 2010-01-28 | Research In Motion Limited | Electronic device having a camera and method of controlling a flash |
CN101803392A (en) * | 2007-09-13 | 2010-08-11 | 皇家飞利浦电子股份有限公司 | Illumination device for pixelated illumination |
WO2014087301A1 (en) * | 2012-12-05 | 2014-06-12 | Koninklijke Philips N.V. | Illumination array with adapted distribution of radiation |
US20150116586A1 (en) * | 2008-01-03 | 2015-04-30 | Apple Inc. | Illumination Systems and Methods for Computer Imagers |
WO2016105698A1 (en) * | 2014-12-22 | 2016-06-30 | Google Inc. | Time-of-flight camera system and method to improve measurement quality of weak field-of-view signal regions |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050046739A1 (en) * | 2003-08-29 | 2005-03-03 | Voss James S. | System and method using light emitting diodes with an image capture device |
JP4747065B2 (en) * | 2006-09-29 | 2011-08-10 | 富士通株式会社 | Image generation apparatus, image generation method, and image generation program |
JP2009176471A (en) | 2008-01-22 | 2009-08-06 | Stanley Electric Co Ltd | LED light source lens |
JP5756722B2 (en) * | 2011-06-22 | 2015-07-29 | 株式会社エンプラス | Luminous flux control member, light emitting device, and illumination device |
DE102013202890B4 (en) | 2013-02-22 | 2018-12-06 | Schaeffler Technologies AG & Co. KG | A method for generating an audio signal for a synthetic noise of a motor vehicle |
- 2017-12-22: KR application KR1020197021424A, granted as KR102258568B1 (Active)
- 2017-12-22: CN application CN201780087293.3A (Pending)
- 2017-12-22: CN application CN202110678977.9A (Pending)
- 2017-12-22: JP application JP2019534379A, granted as JP7386703B2 (Active)
- 2017-12-22: EP application EP17828824.7A, granted as EP3560184B1 (Active)
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10893244B2 (en) | 2016-12-23 | 2021-01-12 | Lumileds Llc | Compensating for vignetting |
CN111555116A (en) * | 2020-06-19 | 2020-08-18 | 宁波舜宇奥来技术有限公司 | Inverted vcsel light source and TOF module |
CN116648639A (en) * | 2020-10-26 | 2023-08-25 | 捷普光学德国有限公司 | Surround View Imaging System |
CN114207471A (en) * | 2021-05-21 | 2022-03-18 | 深圳市汇顶科技股份有限公司 | Launching device and electronic equipment for time-of-flight depth detection |
CN114502985A (en) * | 2021-05-21 | 2022-05-13 | 深圳市汇顶科技股份有限公司 | Emitting device for flight time depth detection and electronic equipment |
WO2022241778A1 (en) * | 2021-05-21 | 2022-11-24 | 深圳市汇顶科技股份有限公司 | Transmitting apparatus for time-of-flight depth detection and electronic device |
Also Published As
Publication number | Publication date |
---|---|
EP3560184A1 (en) | 2019-10-30 |
KR20190099044A (en) | 2019-08-23 |
KR102258568B1 (en) | 2021-05-28 |
EP3560184B1 (en) | 2021-04-21 |
CN113329140A (en) | 2021-08-31 |
JP7386703B2 (en) | 2023-11-27 |
JP2020515100A (en) | 2020-05-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110383803A (en) | Compensate vignetting | |
TWI731206B (en) | Systems and methods for compensating for vignetting | |
JP6139017B2 (en) | Method for determining characteristics of light source and mobile device | |
RU2763756C1 (en) | Obtaining images for use in determining one or more properties of the subject's skin | |
US9245332B2 (en) | Method and apparatus for image production | |
US9300931B2 (en) | Image pickup system | |
TWI565323B (en) | Imaging device for distinguishing foreground and operation method thereof, and image sensor | |
CN110248056A (en) | Image testing device | |
JP2022107533A (en) | Systems, methods and apparatuses for focus selection using image disparity | |
WO2007005018A1 (en) | Projection of subsurface structure onto an object's surface | |
JP2017107309A (en) | Finger vein authentication device | |
JP2023093574A (en) | Information processing device, control device, method of processing information and program | |
JP6877936B2 (en) | Processing equipment, processing systems, imaging equipment, processing methods, programs, and recording media | |
JP2019200140A (en) | Imaging apparatus, accessory, processing device, processing method, and program | |
CN109154974A (en) | Determine the arrangement and method of target range and the reading parameter based on target range adjustment imaging reader | |
JP6368593B2 (en) | Image processing program, information processing system, and image processing method | |
JP2017158977A (en) | Skin image generation device, operation method of skin image generation device, and skin image generation processing program | |
JP6386837B2 (en) | Image processing program, information processing system, information processing apparatus, and image processing method | |
JP7080724B2 (en) | Light distribution control device, light projection system and light distribution control method | |
JP2018010116A (en) | Processor, processing system, imaging apparatus, processing method, program, and record medium | |
JP2024105178A (en) | Imaging device and ranging system | |
JP6512806B2 (en) | Imaging device | |
CN119343926A (en) | Irradiation adaptation method and image recording device | |
Blasinski | Camera Design Optimization Using Image Systems Simulation | |
KR20190142654A (en) | Apparatus and method of acquiring three-dimensional information of imaging object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20191025 ||