
CN108010071B - System and method for measuring brightness distribution by using 3D depth measurement - Google Patents


Info

Publication number
CN108010071B
CN108010071B · CN201711252506.1A
Authority
CN
China
Prior art keywords
measurement
brightness
depth
imaging
measuring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711252506.1A
Other languages
Chinese (zh)
Other versions
CN108010071A (en)
Inventor
易斌
甄江龙
袁韬
杨静
叶盛
毛龙波
米红菊
王海龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Logistical Engineering University of PLA
Original Assignee
Logistical Engineering University of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Logistical Engineering University of PLA
Priority to CN201711252506.1A
Publication of CN108010071A
Application granted
Publication of CN108010071B
Legal status: Active

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00: Photometry, e.g. photographic exposure meter
    • G01J1/42: Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00: Photometry, e.g. photographic exposure meter
    • G01J1/42: Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J2001/4247: Photometry, e.g. photographic exposure meter using electric radiation detectors for testing lamps or other light sources
    • G01J2001/4252: Photometry, e.g. photographic exposure meter using electric radiation detectors for testing lamps or other light sources for testing LED's
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract



The invention belongs to the technical field of lighting quality detection and discloses a brightness distribution measurement system and method using 3D depth measurement, provided with a calibration object, a brightness distribution measuring device, background processing equipment, a pan/tilt head, and a data transmission line. A calibration block is mounted on the measured object, the measuring device is connected to the background processing equipment through the data transmission line, and the pan/tilt head is mounted on the measuring device. The invention uses a 3D depth measurement unit to obtain depth information of the measured object, uses an area-array image sensor imaging device to obtain brightness information, and corrects the vignetting effect in software processing to obtain the scene brightness distribution, thereby reducing measurement difficulty and cost. Compared with existing imaging-based brightness distribution measurement methods, correcting the vignetting effect with scene depth information greatly improves measurement accuracy.


Description

System and method for measuring brightness distribution by using 3D depth measurement
Technical Field
The invention belongs to the technical field of illumination quality detection, and particularly relates to a brightness distribution measuring system and method by utilizing 3D depth measurement.
Background
For application fields such as testing the light distribution curve of LED lamps, testing the color and brightness uniformity of display screens, measuring the sky brightness light environment, detecting cockpit glare, and checking the illumination quality of large scenes such as road traffic and stadiums, the brightness distribution within the scene must be obtained. However, a traditional aiming-point luminance meter can only measure the mean brightness of a small light-emitting area, and obtaining a brightness distribution by point-by-point measurement wastes time and labor, is extremely inefficient, and carries a large measurement error.
The focal-plane exposure H of a luminance meter (the product of the image-plane illuminance E and the exposure time t) and the object brightness L_s satisfy the following formula:
H = E·t = (π·T·t·q / (4·A²)) · (1 − f/u)² · L_s
in the formula:
A is the lens F-number (lens focal length / aperture diameter); E is the focal-plane illuminance (lx); f is the lens focal length (m);
H is the total focal-plane exposure (lx·s); i is the image distance (m); L_s is the scene luminance (cd/m²); t is the exposure time (s);
T is the lens transmittance; u is the object distance (m); υ is the vignetting coefficient; θ is the angle between the object and the lens normal; q = υ·cos⁴θ.
In practice the measured object distance u ≫ f, and an aperture diaphragm is usually designed into the luminance meter; by adjusting the diaphragm, the solid angle subtended by the lens at the light detector is kept constant, so within an acceptable error range it can be taken that (1 − f/u) ≈ 1. In the formula, q = υ·cos⁴θ, where θ is the angle between the normal vector of the measured surface and the optical axis of the lens. When θ ≈ 0, cos⁴θ ≈ 1; but when the off-axis angle of the measured object is larger, cos⁴θ cannot be ignored, which is why a conventional luminance meter measures the average brightness of a minute light-emitting surface near the optical axis. Such an "aiming point" measurement cannot observe the target in a low-brightness scene, and its error is large for light-emitting surfaces of small area and poor brightness uniformity. Analysis shows that the image-plane illuminance around an area-array imaging unit is attenuated as the fourth power of the cosine of the off-axis angle from the optical-axis center; this inherent phenomenon is called the vignetting effect. To avoid the influence of the vignetting effect, the object distance and so on, the traditional photometer measures the average brightness of a tiny light-emitting surface, so brightness distribution measurement over a large field of view, such as road lighting, tunnel lighting or a stadium, can only proceed point by point, which wastes time and labor and carries a large error. Moreover, because the measurement takes so long, the illumination environment may change during the measurement process, making the test results inaccurate.
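The fourth-power cosine falloff described above is the quantity the later correction divides out. A minimal numeric sketch (the angles are illustrative, not taken from the patent):

```python
import math

def vignetting_factor(theta_deg: float) -> float:
    """Relative image-plane illuminance at off-axis angle theta (cos^4 law)."""
    return math.cos(math.radians(theta_deg)) ** 4

# On axis the factor is 1; it falls off quickly with the off-axis angle.
for theta in (0, 10, 20, 30, 40):
    print(f"theta = {theta:2d} deg -> cos^4(theta) = {vignetting_factor(theta):.4f}")
```

At 30 degrees off axis the image-plane illuminance has already dropped to 9/16 of its on-axis value, which is why the effect cannot be ignored for wide-field measurement.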
To solve the efficiency problem of measuring the brightness distribution of a large scene, research has turned to using a (CCD/CMOS) area-array imaging unit as an imaging luminance measurement device (LMD) to obtain the brightness values of an entire large field of view at once. For example, CN201731940U discloses an image brightness meter and measurement method that use a pre-calibrated consumer-grade digital camera to obtain the ambient brightness distribution, simplifying the workload of brightness measurement and reducing its cost. In that method, however, the standard white (gray) board used as the calibration object is geometrically small, so after imaging it can still be assumed that cos⁴θ ≈ 1; the imaging luminance measurement device thus obtains, in the calibration stage, the ratio K between the brightness L_s of the measured object and the image-plane illuminance value E (or the Y stimulus value obtained by color-space conversion of the RGB values from the CCD imaging unit). When the brightness distribution is then measured in an actual large scene, cos⁴θ for the measured-surface points corresponding to pixels at larger angles off the optical axis cannot be ignored. In addition, no data report or physical device remedying these shortcomings has been found.
In summary, the prior art has the following problems: an aiming-point luminance meter requires point-by-point measurement of the brightness distribution, with low efficiency and large error; and the existing technology for measuring large-scene brightness with an area-array imaging unit considers neither the influence of the vignetting effect on equipment calibration nor the measurement errors it causes.
Disclosure of Invention
The invention aims to provide a brightness distribution measuring system and method based on 3D depth measurement that correct the vignetting effect of the brightness distribution using scene depth information, thereby improving measurement precision and reducing measurement error.
The present invention is achieved as such, a brightness distribution measuring system using 3D depth measurement, provided with:
the brightness measuring unit adopts an area-array image sensor imaging unit and is used for acquiring measured values proportional to the brightness values, recorded as Y(i, j), from the image-plane illuminance of the scene image pixels and the pre-calibrated characteristic parameters of the imaging equipment, wherein i and j are the pixel position coordinates;
a 3D depth measuring unit for obtaining scene 3D depth information U(i, j) by laser ranging scanning or binocular vision imaging, together with the scene depth U_0 corresponding to the optical axis of the imaging brightness measuring unit;
the data processing and communication unit is used for calculating the vignetting-corrected brightness distribution result L(i, j) from the measured scene depth U(i, j) and the calculated brightness Y(i, j), and for displaying the result on the background terminal equipment;
the background terminal equipment is used for connecting to the measuring equipment through a data transmission line or WiFi;
the control terminal is used for setting the measurement region ROI, setting the measurement parameters, and displaying and storing the measurement results;
and the calibration object is an LED light source whose luminous brightness and light distribution curve have preset values.
In the brightness distribution measuring system, the processing unit evaluates the brightness dynamic range of the measured scene: when the dynamic range is not high (< 10⁴), direct imaging measurement is used; when the dynamic range is large (> 10⁴), or when higher measurement accuracy is desired, measurement is performed by capturing a time-exposure sequence and synthesizing a high-dynamic-range image in the background.
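The exposure-sequence synthesis mentioned here can be sketched as follows, assuming a linear sensor response and a simple exclude-saturated-pixels weighting; the thresholds and weighting scheme are illustrative choices, not specified by the patent:

```python
import numpy as np

def merge_exposures(frames, times, sat_level=0.95):
    """Merge linear-response frames (values in [0, 1]) taken at different
    exposure times into one relative-radiance map.

    Saturated or near-black pixels are excluded from the average."""
    frames = np.asarray(frames, dtype=float)
    times = np.asarray(times, dtype=float).reshape(-1, 1, 1)
    weight = ((frames > 0.01) & (frames < sat_level)).astype(float)
    radiance = frames / times                      # per-frame radiance estimate
    w_sum = weight.sum(axis=0)
    merged = (weight * radiance).sum(axis=0) / np.where(w_sum == 0, 1, w_sum)
    return merged

# Two synthetic exposures of the same scene (true relative radiance 0.2 and 2.0):
scene = np.array([[0.2, 2.0]])
short = np.clip(scene * 0.4, 0, 1)   # t = 0.4 s
long_ = np.clip(scene * 2.0, 0, 1)   # t = 2.0 s (bright pixel saturates)
hdr = merge_exposures([short, long_], [0.4, 2.0])
print(hdr)   # both pixels recovered near [0.2, 2.0]
```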
Further, the background terminal device is a smart phone, a handheld device or a notebook computer.
Another object of the present invention is to provide a luminance distribution measuring method using 3D depth measurement of the luminance distribution measuring system using 3D depth measurement, the luminance distribution measuring method using 3D depth measurement including the steps of:
step one, arranging measurement equipment, connecting background terminal equipment with a system, and selecting an interested measurement region ROI on the background terminal equipment according to the requirement of a measurement task.
Step two, more than 4 calibration objects are placed in the brightness measurement area of interest; the calibration objects should be distributed uniformly over the ROI, with peripheral positions given priority.
Step three, brightness imaging measurement: calibrated image data are read out from the brightness measurement unit; the processing unit extracts the image calculation measurement data Y(i, j) corresponding to the measurement region ROI, where i, j are image pixel coordinates and the Y value is the calculated brightness value, obtained by inverse calculation from the read output level of the area-array sensor imaging unit without considering the vignetting effect. According to the imaging principle, the output level of the sensor unit is proportional to the image-plane exposure H within the sensor unit's dynamic range; on the premise that the vignetting effect is ignored, the brightness value Y can therefore be calculated from the imaging formula.
Step four, scene depth measurement: from the measured data of the 3D depth measuring unit, extract the scene depth U(i, j) of the corresponding measurement region ROI and the scene depth U_0 corresponding to the optical axis of the imaging brightness measuring unit; obtain the off-axis angle θ(i, j) of the scene measurement position corresponding to each image pixel in the ROI from the scene depth measurement data, and calculate the corresponding vignetting correction coefficient cos⁴θ(i, j).
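This step can be sketched as below. The geometry is an assumption of this sketch: it takes U(i, j) as the slant range along the ray through pixel (i, j) and U_0 as the on-axis depth of the same surface, so that cos θ = U_0/U; the patent does not spell out how θ is derived from the depth data. For illustration the sketch also applies the step-six correction L = K·Y/cos⁴θ, with K = 1:

```python
import numpy as np

def vignetting_correction(U, U0):
    """Per-pixel cos^4(theta) correction coefficients from a range map U
    and the on-axis scene depth U0.

    Assumes U(i, j) is the slant range along the ray through pixel (i, j)
    of a surface whose on-axis depth is U0, so cos(theta) = U0 / U."""
    U = np.asarray(U, dtype=float)
    cos_theta = np.clip(U0 / U, 0.0, 1.0)
    return cos_theta ** 4

def corrected_luminance(Y, U, U0, K=1.0):
    """Apply the correction L(i, j) = K * Y(i, j) / cos^4(theta(i, j))."""
    return K * np.asarray(Y, dtype=float) / vignetting_correction(U, U0)

# A pixel whose slant range is U0 / cos(30 deg) sits 30 degrees off-axis:
U0 = 10.0
U = np.array([[10.0, 10.0 / np.cos(np.radians(30))]])
Y = np.array([[100.0, 56.25]])          # raw values, off-axis one attenuated
print(corrected_luminance(Y, U, U0))    # both pixels recover ~100
```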
Step five, extract from the brightness image measurement data and the depth measurement data the brightness measurement result Y_n corresponding to each calibration-object light source, where n is the calibration object number; calculate the theoretical brightness value L_n from the position parameters of the calibration object and its known brightness and light distribution curve. When vignetting caused by other factors is not considered, the L_n and Y_n data sequences may be regarded as linearly correlated, and the brightness measurement calibration coefficient K can be calculated by the least-squares method; when vignetting caused by other factors such as the lens group of the imaging device is considered, the ratio of the L_n and Y_n data sequences is regarded as a power function, and the calibration coefficient K is then calculated by corresponding curve fitting.
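For the linear case, the calibration reduces to a one-parameter least-squares fit through the origin, K = Σ L_n·Y_n / Σ Y_n². A minimal sketch (the sample values are made up for illustration):

```python
import numpy as np

def calibrate_K(L_theory, Y_measured):
    """Least-squares calibration coefficient for the linear model L = K * Y,
    fitted through the origin over the calibration-object readings."""
    L = np.asarray(L_theory, dtype=float)
    Y = np.asarray(Y_measured, dtype=float)
    return float((L * Y).sum() / (Y * Y).sum())

# Four calibration objects (n >= 4 as the method requires), exact ratio 2.5:
L_n = [250.0, 500.0, 125.0, 1000.0]   # theoretical brightness values
Y_n = [100.0, 200.0, 50.0, 400.0]     # measured values
print(calibrate_K(L_n, Y_n))          # -> 2.5
```

For the power-function case, fitting a straight line to the log-log data (e.g. with np.polyfit) would give the exponent and coefficient instead.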
Step six, calculating the brightness measurement result L(i, j) = K × Y(i, j) / cos⁴θ(i, j).
Step seven, data post-processing. This includes performing a trapezoidal transformation on the measurement region according to the measurement task (converting the perspective relationship into a geometrically linear one), generating the ROI measurement-region brightness contour lines after the trapezoidal transformation, calculating the average brightness L_av and the brightness uniformity (L_min/L_av), saving the brightness calibration coefficient values, and saving and sharing the measurement results.
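The post-processing statistics are direct reductions over the corrected luminance map; a minimal sketch with made-up values:

```python
import numpy as np

def luminance_statistics(L_roi):
    """Average luminance L_av and uniformity L_min / L_av over a corrected
    ROI luminance map, as computed in the data post-processing step."""
    L = np.asarray(L_roi, dtype=float)
    L_av = float(L.mean())
    uniformity = float(L.min() / L_av)
    return L_av, uniformity

L_roi = [[80.0, 100.0], [120.0, 100.0]]   # corrected luminance values (cd/m^2)
L_av, u = luminance_statistics(L_roi)
print(L_av, u)   # -> 100.0 0.8
```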
The invention has the following advantages and positive effects. A traditional aiming-point luminance meter can only measure the brightness distribution of a large scene point by point, which wastes time and labor, while existing imaging brightness distribution measurement technology does not correct the vignetting effect in real time: the calibration coefficient is obtained at a small scale where the imaging angle is small and the vignetting effect is not obvious, and when brightness measurement is then carried out directly at large scene scale, the imaging angle far exceeds the laboratory calibration conditions, and neglecting the vignetting caused by the scene off-axis angle introduces a large error. With the measuring method and device of the invention, the brightness distribution of a large-field scene can be obtained quickly, reducing measurement difficulty and cost; and compared with existing imaging brightness distribution measurement methods, correcting the vignetting effect using scene depth information greatly improves measurement accuracy.
Drawings
Fig. 1 is a functional block diagram of a brightness distribution measuring system using 3D depth measurement according to an embodiment of the present invention.
Fig. 2 is a flow chart of brightness distribution measurement using 3D depth measurement according to an embodiment of the present invention.
In the figure: 1, calibration object; 2, brightness distribution measuring device; 2-1, 3D depth measurement unit; 2-2, imaging brightness measuring unit; 2-3, data processing and communication unit; 3, background processing device; 4, pan/tilt head; 5, data transmission line; 6, measured object.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The following detailed description of the principles of the invention is provided in connection with the accompanying drawings.
As shown in fig. 1, a brightness distribution measuring system using 3D depth measurement according to an embodiment of the present invention includes: a calibration object 1, a measuring device 2, a background processing device 3, a pan/tilt head 4, a data transmission line 5, and a measured object 6.
The calibration object 1 is arranged on the measured object 6, the measuring device 2 is connected with the background processing device 3 through the data transmission line 5, and the pan/tilt head 4 is mounted on the measuring device 2.
The measuring device 2 further comprises: a 3D depth measuring unit 2-1, an imaging brightness measuring unit 2-2, and a data processing and communication unit 2-3.
The background processing device 3 is a handheld device or a notebook computer.
As shown in fig. 2, the method for measuring a luminance distribution using 3D depth measurement according to an embodiment of the present invention includes the following steps:
step 1, arranging measurement equipment, connecting background terminal equipment with a system, and selecting an interested measurement region ROI on the background terminal equipment.
Step 2, according to the requirements of the measurement task, more than 4 calibration objects are placed in the brightness measurement area of interest. The calibration objects should be distributed uniformly over the ROI, with peripheral positions given priority.
Step 3, brightness imaging measurement: calibrated image data are read out from the brightness measurement unit; the processing unit extracts the image calculation measurement data Y(i, j) corresponding to the measurement region ROI, where i, j are image pixel coordinates and the Y value is the calculated brightness value, obtained by inverse calculation from the read output level of the area-array sensor imaging unit without considering the vignetting effect. According to the imaging principle, the output level of the sensor unit is proportional to the image-plane exposure H within the sensor unit's dynamic range; on the premise that the vignetting effect is ignored, the brightness value Y can therefore be calculated from the imaging formula.
Step 4, scene depth measurement: from the measured data of the 3D depth measuring unit, extract the scene depth U(i, j) of the corresponding measurement region ROI and the scene depth U_0 corresponding to the optical axis of the imaging brightness measuring unit. Obtain the off-axis angle θ(i, j) of the scene measurement position corresponding to each image pixel in the ROI from the scene depth measurement data, and calculate the corresponding vignetting correction coefficient cos⁴θ(i, j).
Step 5, extract from the brightness image measurement data and the depth measurement data the brightness measurement result Y_n corresponding to each calibration-object light source, where n is the calibration object number; calculate the theoretical brightness value L_n from the position parameters of the calibration object and its known brightness and light distribution curve. When vignetting caused by other factors is not considered, the L_n and Y_n data sequences may be regarded as linearly correlated, and the brightness measurement calibration coefficient K can be calculated by the least-squares method; when vignetting caused by other factors such as the lens group of the imaging device is considered, the ratio of the L_n and Y_n data sequences is regarded as a power function, and the calibration coefficient K is then calculated by corresponding curve fitting.
Step 6, calculate the brightness measurement result L(i, j) = K × Y(i, j) / cos⁴θ(i, j).
Step 7, data post-processing. This includes performing a trapezoidal transformation on the measurement region according to the measurement task (converting the perspective relationship into a geometrically linear one), generating the ROI measurement-region brightness contour lines after the trapezoidal transformation, calculating the average brightness L_av and the brightness uniformity (L_min/L_av), saving the brightness calibration coefficient values, and saving and sharing the measurement results.
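The trapezoidal transformation of the last step (converting the perspective relationship into a geometrically linear one) amounts to a planar homography fitted from four corner correspondences; a minimal sketch, with made-up corner coordinates:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit the 3x3 planar homography H mapping four src points to four dst
    points (h33 fixed to 1), by solving the standard 8x8 linear system."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pts):
    """Map 2D points through H with the projective divide."""
    P = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))]) @ H.T
    return P[:, :2] / P[:, 2:3]

# Rectify a trapezoidal image region (e.g. a road surface seen in
# perspective) onto a unit square; the corner coordinates are made up.
trapezoid = [(100.0, 0.0), (300.0, 0.0), (400.0, 200.0), (0.0, 200.0)]
rectangle = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
H = fit_homography(trapezoid, rectangle)
print(apply_homography(H, trapezoid))   # maps onto the unit-square corners
```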
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (4)

1. A luminance distribution measuring method using 3D depth measurement, characterized in that the luminance distribution measuring method using 3D depth measurement derives calibrated image data; extracts image calculation measurement data corresponding to the measurement area; extracts, from the depth measurement data, the scene depth of the corresponding measurement area and the scene depth corresponding to the optical axis of the imaging brightness measuring unit, and calculates the corresponding vignetting correction coefficient; and extracts the brightness measurement result corresponding to each calibration-object light source from the brightness image measurement data and the depth measurement data, calculating the calibration coefficient by corresponding curve fitting;
the brightness distribution measuring method using 3D depth measurement includes the steps of:
step one, arranging measurement equipment, connecting background terminal equipment with a system, and selecting an interested measurement region ROI on the background terminal equipment according to the requirement of a measurement task;
step two, more than 4 calibration objects are placed in the brightness measurement area of interest; the calibration objects are distributed uniformly in the ROI, with peripheral positions given priority;
step three, brightness imaging measurement: calibrated image data are read out from the brightness measurement unit; the processing unit extracts the image calculation measurement data Y(i, j) corresponding to the measurement region ROI, wherein i, j are image pixel coordinates and the Y value is the calculated brightness value, obtained by inverse calculation from the read output level of the area-array sensor imaging unit without considering the vignetting effect; according to the imaging principle, the output level of the imaging sensor unit is in direct proportion to the image-plane exposure H within its dynamic range;
step four, scene depth measurement: from the measured data of the 3D depth measuring unit, the scene depth U(i, j) of the corresponding measurement region ROI and the scene depth U_0 corresponding to the optical axis of the imaging brightness measuring unit are extracted; the off-axis angle θ(i, j) of the scene measurement position corresponding to each image pixel in the ROI is obtained from the scene depth measurement data, and the corresponding vignetting correction coefficient cos⁴θ(i, j) is calculated;
step five, the brightness measurement result Y_n corresponding to each calibration-object light source is extracted from the brightness image measurement data and the depth measurement data, wherein n is the calibration object number; the theoretical brightness value L_n is calculated from the position parameters of the calibration object and its known brightness and light distribution curve; when vignetting caused by other factors is not considered, the L_n and Y_n data sequences are regarded as linearly correlated, and the brightness measurement calibration coefficient K is calculated by the least-squares method; when vignetting caused by the lens group of the imaging device is considered, the ratio of the L_n and Y_n data sequences is regarded as a power function, and the calibration coefficient K is calculated by corresponding curve fitting;
step six, the brightness measurement result L(i, j) = K × Y(i, j) / cos⁴θ(i, j) is calculated;
step seven, data post-processing: a trapezoidal transformation is performed on the measurement region according to the measurement task, the ROI measurement-region brightness contour lines are generated after the trapezoidal transformation, the average brightness L_av and the brightness uniformity (L_min/L_av) are calculated, the brightness calibration coefficient values are saved, and the measurement results are saved and shared.
2. A brightness distribution measuring system implementing the luminance distribution measuring method using 3D depth measurement according to claim 1, characterized in that the brightness distribution measuring system using 3D depth measurement is provided with:
the brightness measuring unit adopts an area-array imaging unit and is used for acquiring a measured value Y(i, j) proportional to the brightness value from the image-plane illuminance of the scene image pixels and the pre-calibrated characteristic parameters of the imaging equipment, wherein i and j are the pixel position coordinates;
a 3D depth measuring unit for obtaining scene 3D depth information U(i, j) by laser ranging scanning or binocular vision imaging, together with the scene depth U_0 corresponding to the optical axis of the imaging brightness measuring unit;
the data processing and communication unit is used for calculating the vignetting-corrected brightness distribution result L(i, j) from the measured scene depth U(i, j) and the calculated brightness Y(i, j), and for displaying the result on the background terminal equipment;
the background terminal equipment is used for connecting to the measuring equipment through a data transmission line or WiFi;
the control terminal is used for setting the measurement region ROI, setting the measurement parameters, and displaying and storing the measurement results;
and the calibration object is an LED light source whose luminous brightness and light distribution curve have preset values.
3. The luminance distribution measurement system using 3D depth measurement according to claim 2, wherein the measurement apparatus further includes: a distance measuring module and an imaging device.
4. The brightness distribution measurement system using 3D depth measurement according to claim 2, wherein the background processing device is a handheld device or a notebook computer.
CN201711252506.1A 2017-12-01 2017-12-01 System and method for measuring brightness distribution by using 3D depth measurement Active CN108010071B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711252506.1A CN108010071B (en) 2017-12-01 2017-12-01 System and method for measuring brightness distribution by using 3D depth measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711252506.1A CN108010071B (en) 2017-12-01 2017-12-01 System and method for measuring brightness distribution by using 3D depth measurement

Publications (2)

Publication Number Publication Date
CN108010071A CN108010071A (en) 2018-05-08
CN108010071B (en) 2022-03-25

Family

ID=62056162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711252506.1A Active CN108010071B (en) 2017-12-01 2017-12-01 System and method for measuring brightness distribution by using 3D depth measurement

Country Status (1)

Country Link
CN (1) CN108010071B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110319899A (en) * 2019-08-12 2019-10-11 深圳市知维智能科技有限公司 Volume measuring method, device and system
CN110807813B (en) * 2019-10-22 2022-09-23 歌尔光学科技有限公司 TOF module calibration method, device and system
CN111210406B (en) * 2019-12-27 2023-05-23 中国航空工业集团公司西安飞机设计研究所 Cockpit glare source position calculation method
CN112767527B (en) * 2021-01-28 2024-11-22 武汉海微科技股份有限公司 A method for detecting luminous intensity and uniformity based on CCD perception

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4962425A (en) * 1988-10-27 1990-10-09 National Research Council Of Canada/Conseil National Deresherches Canada Photometric device
CN101922968A (en) * 2010-07-26 2010-12-22 杭州远方光电信息有限公司 Automatic distance error correction luminance meter
CN104185777A (en) * 2012-03-06 2014-12-03 岩崎电气株式会社 Brightness-measuring apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
《成像式亮度测量装置虚光效应校正》 ("Vignetting-effect correction for an imaging luminance measurement device"); 易斌 et al.; 《照明工程学院》; 2012-12-15; 51-54 *

Also Published As

Publication number Publication date
CN108010071A (en) 2018-05-08

Similar Documents

Publication Publication Date Title
US10931924B2 (en) Method for the generation of a correction model of a camera for the correction of an aberration
WO2022213427A1 (en) Light source and arrangement method therefor, and optical testing method and system
CN108010071B (en) System and method for measuring brightness distribution by using 3D depth measurement
US9500526B2 (en) High-throughput and high resolution method for measuring the color uniformity of a light spot
CN109525840B (en) Method for detecting weak defects on imaging chip
CN107256689B (en) Uniformity repairing method for LED display screen after brightness correction
CN111105365B (en) Color correction method, medium, terminal and device for texture image
WO2010031252A1 (en) Method for correcting lightness of electrical display screen
RU2013101791A (en) OBTAINING SPATIAL TOPOGRAPHIC IMAGES OF TRACES FROM THE TOOL USING A NONLINEAR PHOTOMETRIC STEREO METHOD
US20090021526A1 (en) Determination method for white-point and correction method of the white balance
CN104977155B (en) A kind of LED distribution curve fluxs method for fast measuring
CN110458964B (en) A Real-time Calculation Method for Dynamic Illumination in Real Environment
CN108088658A (en) A kind of dazzle measuring method and its measuring system
CN108986170A (en) A kind of line-scan digital camera method for correcting flat field suitable for field working conditions
CN109238461A (en) Room lighting dazzle measurement method based on digital camera images
CN113963065A (en) Lens internal reference calibration method and device based on external reference known and electronic equipment
CN107071371B (en) The camera lens colour brightness calibration method and device of panoramic shooting mould group
CN110769229A (en) Method, device and system for detecting color brightness of projection picture
CN109186941A (en) A kind of detection method and system of light source uniformity
CN114503097A (en) Method and apparatus for color lookup using mobile device
CN110493540B (en) A method and device for real-time acquisition of scene dynamic lighting
TW200426348A (en) Spatial 2-D distribution measuring method of light source emitting power and its device
CN106817542B (en) imaging method and imaging device of microlens array
JP7524735B2 (en) Image measurement device and image measurement method
CN110300291A (en) Determine device and method, digital camera, application and the computer equipment of color-values

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant