System and method for measuring brightness distribution by using 3D depth measurement
Technical Field
The invention belongs to the technical field of illumination quality detection, and particularly relates to a system and method for measuring brightness distribution using 3D depth measurement.
Background
In application fields such as testing the light distribution curve of LED lamps, testing the color and brightness uniformity of display screens, measuring the sky-brightness light environment, detecting cockpit glare, and assessing the illumination quality of large scenes such as road traffic and stadiums, the brightness distribution within the scene must be obtained. However, a traditional aiming-point luminance meter can only measure the mean brightness of a small light-emitting area, and obtaining a brightness distribution by point-type measurement is time-consuming and labor-intensive, extremely inefficient, and subject to large measurement error.
The luminance meter focal-plane exposure H (the product of image-plane illuminance E and exposure time t) and the object luminance Ls satisfy the following formula:

H = E·t = (π·T·q·t / (4·A²))·(1 − f/u)²·Ls
in the formula:
A is the lens F-number (lens focal length / aperture diameter); E is the focal-plane illuminance (lx); f is the focal length of the lens (m);
H is the total focal-plane exposure (lx·s); i is the image distance (m); Ls is the scene luminance (cd/m²); t is the exposure time (s);
T is the lens transmittance; u is the object distance (m); υ is the vignetting coefficient; θ is the angle between the object and the lens normal; q = υ·cos⁴θ.
In practice the measured object distance satisfies u ≫ f, and an aperture diaphragm is usually designed into the luminance meter; by adjusting the diaphragm, the solid angle subtended by the lens at the light detector can be kept constant, so (1 − f/u) ≈ 1 may be assumed within an acceptable error range. In the formula q = υ·cos⁴θ, θ is the angle between the normal vector of the measured surface and the optical axis of the lens. When θ ≈ 0, cos⁴θ ≈ 1 can be assumed; but when the off-axis angle of the measured object is large, cos⁴θ cannot be ignored, so a conventional luminance meter measures only the average brightness of a minute light-emitting surface near the optical axis. Such "aiming-point" measurement cannot observe the target in a low-brightness scene, and its error is large for a light-emitting surface of small area and poor brightness uniformity. Analysis shows that the image-plane illuminance around an area-array imaging unit is attenuated as the fourth power of the cosine of the off-axis angle from the optical-axis center; this inherent phenomenon is called the virtual light effect (natural vignetting). To avoid the influence of the virtual light effect, object distance and the like on measurement, the traditional photometer measures the average brightness of a tiny light-emitting surface, so brightness distribution measurement over a large field of view, such as road lighting, tunnel lighting or a stadium, can only be carried out point by point, which is time-consuming, labor-intensive and error-prone. Moreover, because the measurement takes so long, the illumination environment can change during the process, making the test results inaccurate.
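For a sense of scale, consider the falloff implied by q = υ·cos⁴θ alone (taking υ = 1): at an off-axis angle θ = 30°, cos⁴θ = (√3/2)⁴ = 9/16 ≈ 0.56, so a scene point 30° off-axis images roughly 44% dimmer than an equally bright on-axis point. An error of this magnitude is far outside acceptable photometric tolerances, which is why wide-field measurement cannot ignore the effect.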
To solve the efficiency problem of measuring brightness distribution in large scenes, research has used (CCD/CMOS) area-array imaging units as imaging Luminance Measurement Devices (LMDs) to obtain the brightness values of an entire field of view at once. For example, CN201731940U discloses an image luminance meter and measurement method that use a pre-calibrated consumer-grade digital camera to obtain the ambient brightness distribution, simplifying the workload and reducing the cost of brightness measurement. In that method, however, the standard white (gray) board serving as the calibration object is geometrically small, so cos⁴θ ≈ 1 can still be assumed after imaging; the calibration stage thus yields only the ratio K between the luminance Ls of the measured object and the image-plane illuminance value E (or the Y stimulus value obtained by color-space conversion of the RGB values of the CCD imaging unit). When the brightness distribution is then measured in an actual large scene, the cos⁴θ factor of the measured surface corresponding to pixels at large off-axis angles cannot be ignored. No data report or physical device remedying this shortcoming has been found.
In summary, the prior art has the following problems: the aiming-point luminance meter requires point-by-point measurement of brightness distribution, with low efficiency and large error; and existing techniques for measuring large-scene brightness with an area-array imaging unit do not consider the influence of the virtual light effect on device calibration or the measurement errors the effect causes.
Disclosure of Invention
The invention aims to provide a brightness distribution measuring system and method based on 3D depth measurement that use scene depth information to correct the virtual light effect in the brightness distribution, thereby improving measurement precision and reducing measurement error.
The present invention is achieved as follows. A brightness distribution measuring system using 3D depth measurement is provided with:
the brightness measuring unit, which adopts an area-array image sensor imaging unit and is used for acquiring, from the image-plane illuminance at each scene-image pixel and the pre-calibrated characteristic parameters of the imaging equipment, measured values proportional to the brightness values, denoted Y(i, j), where i and j are the position coordinates of the pixel;
the 3D depth measuring unit, used for obtaining the scene 3D depth information U(i, j) by laser-ranging scanning or binocular vision imaging, together with the scene depth U0 along the optical axis of the imaging brightness measuring unit;
the data processing and communication unit, used for calculating the virtual-light-corrected brightness distribution result L(i, j) from the measured scene depth U(i, j) and the calculated brightness Y(i, j), and displaying the result on the background terminal equipment;
the background terminal equipment, used for connecting to the measuring equipment through a data transmission line or WiFi;
the control terminal, used for setting the measurement region ROI and the measurement parameters, and for displaying and storing the measurement results;
and the calibration object, an LED light source whose luminance and light distribution curve have preset values.
In the brightness distribution measuring system, the processing unit evaluates the brightness dynamic range of the measured scene: when the dynamic range is not high (<10⁴), direct imaging measurement is used; when the dynamic range is large (>10⁴), or higher measurement accuracy is desired, measurement is performed by synthesizing a high-dynamic-range image in the background from a time-exposure sequence.
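As an illustration only, a minimal sketch of one common way to synthesize such a high-dynamic-range image from a time-exposure sequence, assuming a linear sensor response (the patent does not prescribe a specific fusion algorithm, and the function name is hypothetical):

```python
import numpy as np

def synthesize_hdr(images, exposure_times):
    """Fuse a bracketed exposure sequence into one radiance-like map.

    images: list of float arrays scaled to [0, 1], same scene, varying exposure.
    exposure_times: matching exposure times in seconds.
    With a linear sensor, pixel_value / exposure_time estimates relative
    radiance; a hat-shaped weight discounts pixels near saturation or noise.
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # peak trust at mid-gray
        num += w * (img / t)
        den += w
    return num / np.maximum(den, 1e-9)
```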
Further, the background terminal device is a smart phone, a handheld device or a notebook computer.
Another object of the present invention is to provide a brightness distribution measuring method using 3D depth measurement, applied in the above brightness distribution measuring system, the method including the following steps:
Step one: arrange the measurement equipment, connect the background terminal equipment to the system, and select a measurement region of interest (ROI) on the background terminal equipment according to the requirements of the measurement task.
Step two: place more than four calibration objects in the brightness measurement region of interest. The calibration objects should be distributed uniformly over the ROI, with peripheral positions considered preferentially.
Step three: brightness imaging measurement. Calibrated image data are read out from the brightness measurement unit, and the processing unit extracts the measurement data Y(i, j) for the measurement region ROI, where i, j are the coordinates of the image pixel points and Y is the calculated brightness value obtained by inverting the output level of the area-array sensor imaging unit without considering the virtual light effect. According to the imaging principle, within the sensor unit's dynamic range its output level is proportional to the image-plane exposure H; on the premise that the virtual light effect is ignored, the brightness value Y can therefore be calculated from the imaging formula.
Step four: scene depth measurement. From the measured data of the 3D depth measuring unit, extract the scene depth U(i, j) for the measurement region ROI and the scene depth U0 along the optical axis of the imaging brightness measuring unit. From the scene depth data, obtain the off-axis angle θ(i, j) of the scene position corresponding to each image pixel in the ROI, and calculate the corresponding virtual-light correction coefficient cos⁴θ(i, j).
Step five: extract from the brightness image data and the depth data the brightness measurement result Yn for each calibration-object light source, where n is the calibration object number, and calculate the theoretical brightness Ln from the calibration object's position parameters and its known brightness and light distribution curve. When vignetting from other factors is not taken into account, the Ln and Yn data sequences may be regarded as linearly correlated, and the brightness calibration coefficient K can be calculated by least squares; when vignetting from other factors such as the lens group of the imaging device is considered, the ratio of the Ln and Yn sequences is treated as a power function and K is obtained by the corresponding curve fit.
Step six: calculate the brightness measurement result L(i, j) = K·Y(i, j)/cos⁴θ(i, j).
Step seven: data post-processing. This includes applying a trapezoidal transformation (converting the perspective view into a geometrically linear plan view) to the measurement region according to the measurement task, generating brightness contour lines for the ROI after the trapezoidal transformation, calculating the average brightness Lav and the brightness uniformity (Lmin/Lav), storing the brightness calibration coefficient, and storing and sharing the measurement results.
The invention has the following advantages and positive effects. A traditional aiming-point luminance meter can only measure the brightness distribution of a large scene point by point, which is time-consuming and laborious, while the existing imaging brightness distribution measurement technology does not correct the virtual light effect in real time: when the calibration coefficient is obtained at a small scale, where the imaging angle is small and the virtual light effect is inconspicuous, and brightness is then measured directly at a large scene scale, the imaging angles far exceed the laboratory calibration conditions, and neglecting the scene off-axis angle causes large errors. With the measuring method and device of the invention, the brightness distribution of a wide-field scene can be obtained quickly, reducing measurement difficulty and cost; and compared with existing imaging brightness distribution measurement methods, correcting the virtual light effect with scene depth information greatly improves measurement accuracy.
Drawings
Fig. 1 is a functional block diagram of a brightness distribution measuring system using 3D depth measurement according to an embodiment of the present invention.
Fig. 2 is a flow chart of brightness distribution measurement using 3D depth measurement according to an embodiment of the present invention.
In the figure: 1. calibration object; 2. brightness distribution measuring device; 2-1. 3D depth measuring unit; 2-2. imaging brightness measuring unit; 2-3. data processing and communication unit; 3. background processing equipment; 4. holder; 5. data transmission line; 6. measured object.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The following detailed description of the principles of the invention is provided in connection with the accompanying drawings.
As shown in fig. 1, a brightness distribution measuring system using 3D depth measurement according to an embodiment of the present invention includes: a calibration object 1, a measuring device 2, background processing equipment 3, a holder 4, a data transmission line 5, and a measured object 6.
The measured object 6 is provided with a calibration object 1, the measuring device 2 is connected with the background processing equipment 3 through a data transmission line 5, and the measuring device 2 is provided with a holder 4.
The measuring device 2 further comprises: a 3D depth measuring unit 2-1, an imaging brightness measuring unit 2-2, and a data processing and communication unit 2-3.
The background processing device 3 is a handheld device or a notebook computer.
As shown in fig. 2, the method for measuring a luminance distribution using 3D depth measurement according to an embodiment of the present invention includes the following steps:
Step 1: arrange the measurement equipment, connect the background terminal equipment to the system, and select a measurement region of interest (ROI) on the background terminal equipment.
Step 2: place more than four calibration objects in the brightness measurement region of interest according to the requirements of the measurement task. The calibration objects should be distributed uniformly over the ROI, with peripheral positions considered preferentially.
Step 3: brightness imaging measurement. Calibrated image data are read out from the brightness measurement unit, and the processing unit extracts the measurement data Y(i, j) for the measurement region ROI, where i, j are the coordinates of the image pixel points and Y is the calculated brightness value obtained by inverting the output level of the area-array sensor imaging unit without considering the virtual light effect. According to the imaging principle, within the sensor unit's dynamic range its output level is proportional to the image-plane exposure H; on the premise that the virtual light effect is ignored, the brightness value Y can therefore be calculated from the imaging formula.
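Under the linear-sensor assumption just stated, step 3 can be sketched as follows. This is a sketch only: the function name and transmittance default are illustrative, and the absolute scale is deferred to the calibration coefficient K of step 5.

```python
import numpy as np

def calculated_luminance(raw_level, exposure_time, f_number, transmittance=0.9):
    """Invert the imaging formula to get the uncorrected brightness Y(i, j).

    raw_level: linear sensor output array, proportional to exposure H.
    Ignoring the virtual light effect (q = 1) and taking (1 - f/u) ~ 1,
    H = (pi * T * t / (4 * A**2)) * Ls, so Ls is proportional to
    raw_level * 4 * A**2 / (pi * T * t). Any remaining constant factor
    is absorbed by the calibration coefficient K in step 5.
    """
    return raw_level * 4.0 * f_number**2 / (np.pi * transmittance * exposure_time)
```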
Step 4: scene depth measurement. From the measured data of the 3D depth measuring unit, extract the scene depth U(i, j) for the measurement region ROI and the scene depth U0 along the optical axis of the imaging brightness measuring unit. From the scene depth data, obtain the off-axis angle θ(i, j) of the scene position corresponding to each image pixel in the ROI, and calculate the corresponding virtual-light correction coefficient cos⁴θ(i, j).
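A minimal sketch of step 4, assuming the 3D depth unit delivers Cartesian scene points in the camera frame with the z axis along the optical axis of the brightness unit (an assumption; the patent leaves the depth representation open):

```python
import numpy as np

def vignetting_correction(points_xyz):
    """Per-pixel cos^4(theta) from 3D scene points in the camera frame.

    points_xyz: (H, W, 3) array of scene points; z axis = optical axis.
    theta(i, j) is the angle between the line of sight to the scene point
    and the optical axis, so cos(theta) = z / |P|.
    """
    z = points_xyz[..., 2]
    norm = np.linalg.norm(points_xyz, axis=-1)
    cos_theta = z / np.maximum(norm, 1e-9)
    return cos_theta**4  # divide Y by this in step 6 to undo the falloff
```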
Step 5: extract from the brightness image data and the depth data the brightness measurement result Yn for each calibration-object light source, where n is the calibration object number, and calculate the theoretical brightness Ln from the calibration object's position parameters and its known brightness and light distribution curve. When vignetting from other factors is not taken into account, the Ln and Yn data sequences may be regarded as linearly correlated, and the brightness calibration coefficient K can be calculated by least squares; when vignetting from other factors such as the lens group of the imaging device is considered, the ratio of the Ln and Yn sequences is treated as a power function and K is obtained by the corresponding curve fit.
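The step-5 fit can be sketched as follows (array names are illustrative; the linear branch is a least-squares fit through the origin, and the power-law branch is a log-log line fit, one reasonable reading of "corresponding curve fitting"):

```python
import numpy as np

def fit_calibration(L_n, Y_n, power_law=False):
    """Fit the calibration coefficient K from the >= 4 calibration LEDs.

    L_n: theoretical luminances of the calibration objects.
    Y_n: measured (uncorrected) brightness values at the same points.
    Linear case: L = K * Y, K by least squares through the origin.
    Power-law case (other vignetting sources): L = K * Y**p, fitted in
    log-log space; returns (K, p).
    """
    L_n, Y_n = np.asarray(L_n, float), np.asarray(Y_n, float)
    if not power_law:
        return np.sum(L_n * Y_n) / np.sum(Y_n**2)
    p, log_k = np.polyfit(np.log(Y_n), np.log(L_n), 1)
    return np.exp(log_k), p
```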
Step 6: calculate the brightness measurement result L(i, j) = K·Y(i, j)/cos⁴θ(i, j).
Step 7: data post-processing. This includes applying a trapezoidal transformation (converting the perspective view into a geometrically linear plan view) to the measurement region according to the measurement task, generating brightness contour lines for the ROI after the trapezoidal transformation, calculating the average brightness Lav and the brightness uniformity (Lmin/Lav), storing the brightness calibration coefficient, and storing and sharing the measurement results.
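A sketch of the step-7 rectification and uniformity metrics using OpenCV (the corner inputs and output size are hypothetical; any equivalent perspective transform would serve):

```python
import cv2
import numpy as np

def postprocess(L, src_corners, out_size=(400, 300)):
    """Trapezoidal (perspective) rectification and uniformity metrics.

    L: corrected luminance map L(i, j) from step 6.
    src_corners: four (x, y) image corners of the measured surface, e.g. a
    road section seen in perspective, ordered to match dst below.
    """
    w, h = out_size
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(np.float32(src_corners), dst)
    rect = cv2.warpPerspective(L.astype(np.float32), M, (w, h))
    L_av = float(rect.mean())
    uniformity = float(rect.min() / L_av)  # Lmin / Lav
    return rect, L_av, uniformity
```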
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.