CN118945477B - A method, device and storage medium for adjusting consistency of multiple cameras of a spliced screen
- Publication number
- CN118945477B (application number CN202411419942.3A)
- Authority
- CN
- China
- Prior art keywords: standard, camera, image set, gray, exposure time
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N23/67 — Focus control based on electronic image sensor signals
- H04N23/82 — Camera processing pipelines for controlling camera response irrespective of the scene brightness, e.g. gamma correction
- H04N23/951 — Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
The application discloses a method, a device and a storage medium for adjusting the consistency of multiple cameras of a spliced screen, used to reduce the overall detection time of the spliced screen. The method comprises: assembling and debugging a multi-industrial-camera system; lighting up the spliced screen and adjusting the exposure time of the cameras; shooting the spliced screen with a standard camera and a camera to be corrected to generate a first standard gray-scale image set, a second standard gray-scale image set, a first reference gray-scale image set and a second reference gray-scale image set; determining the central area of each image in these sets; carrying out central-area fusion processing on images with the same gray scale and the same exposure time; fitting the central-area gray average values against exposure time to generate a first standard fitting polynomial, a second standard fitting polynomial, a first reference fitting polynomial set and a second reference fitting polynomial set; calculating correction coefficients for the camera to be corrected from the polynomial coefficients of these polynomials; and correcting the initial exposure time of the camera to be corrected according to the correction coefficients.
Description
Technical Field
Embodiments of the application relate to the field of spliced-screen detection, and in particular to a method, a device and a storage medium for adjusting the consistency of multiple cameras of a spliced screen.
Background
With the development of industry and internet technology, requirements on the quality of industrial products have become increasingly refined and standardized. In the display panel industry, product quality directly affects competitiveness, and the precision of product form and dimension specifications is the most basic technological index of quality. The display screen serves as the display component of various high-end devices, such as mobile phones, televisions and tablet computers. With people's increasing demands on picture display, display screens have gradually become technically precise products.
A spliced screen is a display screen that can be tiled: several spliced screens form independent boxes, and multiple independent boxes are spliced into a final ultra-large display screen. Such a spliced ultra-large display has the great advantages that it can be flexibly applied to various use scenes and displays without visible seams. Alongside these advantages, however, the spliced screen is also prone to problems in the process, such as screen non-uniformity caused by manufacturing differences between display panels. Specifically, multi-station industrial cameras are used for image acquisition of spliced screens during manufacturing, with the spliced screens imaged by cameras at different stations; but because the operating environments of the cameras at different stations differ, the acquired image gray levels are easily inconsistent, which affects subsequent links such as defect detection. In the prior art, the industrial cameras at the multiple stations are usually adjusted for consistency of acquisition parameters such as shooting angle, aperture, shooting distance and ambient light source. However, because of equipment aging differences and part-assembly differences among the industrial cameras, the image gray levels can still differ when different industrial cameras shoot the same spliced screen.
In one prior-art approach, a standard board and the spliced screen are acquired simultaneously, average-gray detection is performed on the captured images of both, and when the average gray levels of the spliced screen and the standard board are inconsistent, the gray level of the spliced-screen image is corrected using the standard-board image. This method requires additionally photographing standard-board images, is computationally intensive, and corrects poorly. To solve this problem, the related art performs gray adjustment by adjusting the exposure time: an average gray range is preset, the average gray is calculated after the industrial camera shoots the spliced screen, and if it is not within the range, the exposure time is adjusted by a preset step and the spliced screen is re-acquired until the average gray reaches the preset range. However, this approach still has a problem. Before a certain type of spliced screen is shot, the industrial cameras at the stations default to the same initial exposure time; but because the pixel points of different types of spliced screens exhibit single-frame noise differences between the edge area and the central area, the initial exposure time is not necessarily the best fit for the spliced screen. Industrial cameras at some stations therefore have to keep stepping the exposure time, the total number of exposure-time adjustments per camera is uncertain, the gray-adjustment time on the acquired spliced-screen images increases, and the overall detection time of the spliced screen increases accordingly.
Disclosure of Invention
The application discloses a method and a device for adjusting consistency of multiple cameras of a spliced screen and a storage medium, which are used for reducing overall detection time of the spliced screen.
The first aspect of the application discloses a method for adjusting consistency of multiple cameras of a spliced screen, which comprises the following steps:
Assembling and debugging a multi-industrial-camera system with a spliced screen, so that the operating environment and the initial exposure time of each industrial camera of the same model in the multi-industrial-camera system are the same, the multi-industrial-camera system comprising a standard camera and at least one camera to be corrected;
Lighting up the spliced screen so that it reaches a standard gray scale and N reference gray scales respectively, and adjusting the exposure time of the standard camera and the camera to be corrected during shooting according to a preset exposure time set, N being an integer greater than 0;
Shooting the spliced screen with the standard camera and the camera to be corrected, and generating a first standard gray-scale image set and a first reference gray-scale image set for the standard camera and a second standard gray-scale image set and a second reference gray-scale image set for the camera to be corrected, wherein each display gray scale in each image set has at least 2 images at the same exposure time;
Determining the central area of each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set according to the type of the spliced screen;
Carrying out central-area fusion processing on images with the same gray scale and the same exposure time in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set;
Fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera;
Fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set to generate a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected;
calculating correction coefficients of the camera to be corrected according to polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set;
and correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
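The core of the method, fitting each camera's central-area gray average against exposure time and deriving an exposure correction, can be sketched as follows. The measurement values, the polynomial degree and the ratio-of-leading-coefficients correction rule are all illustrative assumptions; the patent does not fix the exact polynomial order or comparison operation.

```python
import numpy as np

# Hypothetical measurements: mean central-area gray values of the standard
# camera and one camera to be corrected at several exposure times (ms),
# all at the same display gray scale. Values are illustrative.
exposure_times = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
gray_standard = np.array([40.0, 78.0, 118.0, 155.0, 196.0])
gray_corrected = np.array([36.0, 70.0, 106.0, 140.0, 176.0])

# Fit gray = f(exposure_time) as a low-order polynomial per camera
# (degree 1 here; the patent fits a polynomial of unspecified order).
p_std = np.polyfit(exposure_times, gray_standard, deg=1)
p_cor = np.polyfit(exposure_times, gray_corrected, deg=1)

# One plausible correction coefficient: the ratio of leading (slope)
# coefficients, i.e. how much faster the standard camera accumulates gray.
k = p_std[0] / p_cor[0]

# Scale the initial exposure time of the camera to be corrected so its
# gray response matches the standard camera.
initial_exposure = 12.0            # ms, assumed initial setting
corrected_exposure = initial_exposure * k
print(corrected_exposure)
```

With these numbers the corrected camera responds more slowly than the standard one, so its initial exposure time is scaled up before any step-wise adjustment is attempted.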
Optionally, determining a center area of each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set according to the type of the spliced screen includes:
Selecting an image to be extracted from the first standard gray-scale image set;
Determining a plurality of undetermined thresholds according to the spliced screen, and determining the intra-threshold area and the extra-threshold area of the image to be extracted under each undetermined threshold;
Generating an intra-threshold area proportion, an intra-threshold gray average value, an extra-threshold area proportion and an extra-threshold gray average value from the intra-threshold area and the extra-threshold area corresponding to each undetermined threshold;
Generating a weight value from the intra-threshold area proportion, the intra-threshold gray average value, the extra-threshold area proportion and the extra-threshold gray average value of each undetermined threshold;
Determining a target threshold according to the weight value of each undetermined threshold;
Determining the central area of the image to be extracted according to the target threshold;
and calculating a target threshold value and extracting a central region for each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set.
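The weighted threshold search above can be sketched as follows. The patent does not give the exact weight formula, so the Otsu-style between-class score used here is an assumption, and the synthetic image and candidate thresholds are illustrative.

```python
import numpy as np

def center_region_mask(img, candidate_thresholds):
    """Pick a target threshold by scoring each candidate, then binarize."""
    scores = []
    for T in candidate_thresholds:
        inside = img >= T
        ratio_in = inside.mean()              # intra-threshold area proportion
        ratio_out = 1.0 - ratio_in            # extra-threshold area proportion
        mean_in = img[inside].mean() if inside.any() else 0.0
        mean_out = img[~inside].mean() if (~inside).any() else 0.0
        # Assumed weight: favor thresholds separating a bright, compact
        # center from a dark surround (between-class separation).
        scores.append(ratio_in * ratio_out * (mean_in - mean_out) ** 2)
    best_T = candidate_thresholds[int(np.argmax(scores))]
    return img >= best_T, best_T

# Synthetic image: bright 4x4 center (gray 200) on a dark border (gray 30).
img = np.full((8, 8), 30.0)
img[2:6, 2:6] = 200.0
mask, T = center_region_mask(img, [50, 100, 150])
print(T, mask.sum())
```

On this two-level image every candidate separates the same 16-pixel center, so the first candidate wins; on real captures with gradual edge falloff the weight would discriminate between candidates.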
Optionally, performing a center region fusion process on the images with the same gray scale and the same exposure time in the first standard gray scale image set, the first reference gray scale image set, the second standard gray scale image set and the second reference gray scale image set, including:
determining the central-area range and central-area pixel values of each image at one exposure time in the first standard gray-scale image set;
Determining an inscribed rectangular area or a completely overlapping area according to the central-area range of each image, wherein the completely overlapping area lies within the central-area range of every image, and the inscribed rectangular area lies within the central-area range of at least one image;
According to the central-area pixel values of each image, carrying out mean-value fusion processing on the pixel points in the inscribed rectangular area or the completely overlapping area through a pixel averaging algorithm to generate a mean-value fusion image, the mean-value fusion image replacing the images at that exposure time;
Generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on images with the same exposure time in a first standard gray-scale image set and a second standard gray-scale image set;
and generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on the images with the same gray scale and the same exposure time in the first reference gray scale image set and the second reference gray scale image set.
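The mean-value fusion step can be sketched as follows, assuming the fully overlapping region of the per-image central areas has already been extracted. The image size, noise level and random seed are illustrative.

```python
import numpy as np

# Two shots at the same gray scale and exposure time, each with independent
# single-frame noise on top of the same ideal central-area response.
rng = np.random.default_rng(0)
true_center = np.full((4, 4), 120.0)       # ideal central-area gray values
shot_a = true_center + rng.normal(0, 3, size=(4, 4))
shot_b = true_center + rng.normal(0, 3, size=(4, 4))

# Pixel-wise averaging over the overlapping region: the fused image
# replaces the individual shots at this exposure time.
fused = np.mean(np.stack([shot_a, shot_b]), axis=0)

# Averaging K frames shrinks independent noise by roughly 1/sqrt(K),
# so the fused gray average is a more reliable point for the later fit.
print(shot_a.std(), fused.std())
```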
Optionally, calculating the correction coefficient of the camera to be corrected according to the polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set includes:
performing an M-term coefficient comparison between the first standard fitting polynomial and the first reference fitting polynomial set according to the size of N to generate a first comparison coefficient, M being an integer greater than 1;
performing an M-term coefficient comparison between the second standard fitting polynomial and the second reference fitting polynomial set according to the size of N to generate a second comparison coefficient;
performing an M-term coefficient comparison between the first standard fitting polynomial and the second standard fitting polynomial according to the size of N to generate a third comparison coefficient;
and generating correction coefficients of standard gray scales and N reference gray scales for the camera to be corrected according to the first comparison coefficient, the second comparison coefficient and the third comparison coefficient by taking the standard camera as a reference.
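One plausible reading of the term-by-term coefficient comparison is a ratio of corresponding polynomial coefficients, sketched below for the cross-camera (third) comparison. The patent leaves the exact comparison operation unspecified, and the polynomial values are hypothetical.

```python
import numpy as np

def term_ratio(p_ref, p_other):
    """Term-by-term ratio of polynomial coefficients (highest power first,
    following the numpy.polyfit convention)."""
    return np.asarray(p_other) / np.asarray(p_ref)

# Hypothetical degree-1 fits gray = a*t + b at the standard gray scale.
p_std_standard = np.array([7.8, 1.5])   # standard camera
p_cor_standard = np.array([7.0, 1.2])   # camera to be corrected

third_cmp = term_ratio(p_cor_standard, p_std_standard)

# Correction coefficient for the standard gray scale, with the standard
# camera as reference: scale exposure so the leading terms match.
correction = third_cmp[0]
print(correction)
```

The same ratio applied to the reference-gray-scale fits would yield the per-gray-scale correction coefficients; combining the first, second and third comparison coefficients is left as the patent describes.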
Optionally, assembling and debugging the multi-industrial-camera system with the spliced screen comprises:
assembling a multi-industry camera system, and selecting a spliced screen;
Adjusting the shooting angles, apertures and external light sources of the industrial cameras in the multi-industrial-camera system to be consistent;
adjusting the image capturing height of the industrial cameras in the multi-industrial camera system according to the spliced screen;
generating an initial exposure time for the industrial cameras in the multi-industrial-camera system according to the spliced screen.
Optionally, adjusting the image capturing height of the industrial camera in the multi-industrial camera system according to the spliced screen includes:
placing the spliced screen on a station corresponding to the industrial camera;
Determining an acquisition range and a step length, wherein the acquisition range is the distance range between an industrial camera and a spliced screen, and the step length is an acquisition interval;
Controlling an industrial camera to acquire images of the spliced screen in an acquisition range by taking the step length as an acquisition interval to generate a spliced screen image group;
Carrying out bilateral filtering on the images in the spliced screen image group to generate a bilateral filtering image;
acquiring the mean value and standard deviation of the spliced screen image, the mean value and standard deviation of the bilateral filtering image and the covariance of the spliced screen image and the bilateral filtering image;
acquiring a pre-designed brightness formula, a contrast formula and a structural formula;
Calculating a focusing value according to the average value, standard deviation and covariance of the spliced screen image and the bilateral filtering image and a brightness formula, a contrast formula and a structural formula, and generating a focusing value set;
And determining the acquisition point position corresponding to the maximum focusing value as the imaging height of the industrial camera.
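The focus-value computation above resembles the luminance/contrast/structure decomposition of SSIM built from the means, standard deviations and covariance of an image and its filtered copy. The sketch below makes that reading explicit; a box blur stands in for the bilateral filter to keep it dependency-free, the focus value is taken as 1 − SSIM (a well-focused image loses the most structure under smoothing), and the stabilizer constants are the usual SSIM defaults. All of these choices are assumptions.

```python
import numpy as np

def box_blur(img, k=3):
    """Naive k x k mean filter with edge clamping (stand-in for bilateral)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def focus_value(img):
    C1, C2 = 6.5025, 58.5225            # standard SSIM stabilizers (assumed)
    blur = box_blur(img)
    mx, my = img.mean(), blur.mean()    # brightness terms
    vx, vy = img.var(), blur.var()      # contrast terms
    cov = ((img - mx) * (blur - my)).mean()   # structure term
    ssim = ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx * mx + my * my + C1) * (vx + vy + C2))
    return 1.0 - ssim                   # sharp images lose more structure

# A sharp checkerboard should score higher than a flat (defocused) field,
# so the acquisition height maximizing this value is the imaging height.
sharp = (np.indices((16, 16)).sum(axis=0) % 2) * 255.0
flat = np.full((16, 16), 128.0)
print(focus_value(sharp), focus_value(flat))
```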
Optionally, generating an initial exposure time for the industrial cameras in the multi-industrial-camera system according to the spliced screen includes:
determining one industrial camera from the multi-industrial-camera system as the standard camera, and determining the remaining industrial cameras as cameras to be corrected;
Placing the spliced screen on the station corresponding to the standard camera and lighting it up;
Shooting the spliced screen through a standard camera to generate a spliced screen gray-scale image;
calculating a real-time gray average value of the gray-scale images of the spliced screen;
When the real-time gray average value is not in the preset gray range, the exposure time of the standard camera is adjusted to shoot the spliced screen again until the real-time gray average value reaches the preset gray range, and the unified initial exposure time is determined.
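The step-wise exposure search above can be sketched with a toy sensor model; the linear gain, step size, start value and target gray range are all illustrative assumptions.

```python
def capture_mean_gray(exposure_ms, gain=9.0):
    """Toy sensor model: mean gray grows linearly with exposure, capped at 255."""
    return min(255.0, gain * exposure_ms)

def find_initial_exposure(start_ms=5.0, step_ms=1.0,
                          target=(115.0, 135.0), max_iters=100):
    """Step the exposure time until the real-time gray average reaches the
    preset gray range; the result is the unified initial exposure time."""
    exposure = start_ms
    for _ in range(max_iters):
        gray = capture_mean_gray(exposure)
        if target[0] <= gray <= target[1]:
            return exposure
        exposure += step_ms if gray < target[0] else -step_ms
    raise RuntimeError("exposure did not converge")

print(find_initial_exposure())
```

This is exactly the uncertainty the patent targets: the number of iterations depends on how far the start value is from the target range, which the later polynomial-based correction is designed to minimize.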
The second aspect of the application discloses a consistency adjusting device for a plurality of cameras of a spliced screen, which comprises the following components:
The assembly unit is used for assembling and debugging the multi-industrial-camera system with the spliced screen, so that the operating environment and the initial exposure time of each industrial camera of the same model in the multi-industrial-camera system are the same, the multi-industrial-camera system comprising a standard camera and at least one camera to be corrected;
The starting unit is used for lighting the spliced screen to enable the spliced screen to respectively reach standard gray scales and N reference gray scales, and adjusting exposure time of the standard camera and the camera to be corrected in the shooting process according to a preset exposure time set, wherein N is an integer larger than 0;
the first generation unit is used for shooting the spliced screen with the standard camera and the camera to be corrected, and generating a first standard gray-scale image set and a first reference gray-scale image set for the standard camera and a second standard gray-scale image set and a second reference gray-scale image set for the camera to be corrected, wherein each display gray scale in each image set has at least 2 images at the same exposure time;
The determining unit is used for determining the central area of each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set according to the type of the spliced screen;
The fusion unit is used for carrying out central area fusion processing on the images with the same gray scale and the same exposure time in the first standard gray scale image set, the first reference gray scale image set, the second standard gray scale image set and the second reference gray scale image set;
The second generating unit is used for fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera;
The third generating unit is used for fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set, and generating a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected;
The calculating unit is used for calculating the correction coefficient of the camera to be corrected according to the polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set;
And the correction unit is used for correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
Optionally, the determining unit includes:
Selecting an image to be extracted from the first standard gray-scale image set;
Determining a plurality of undetermined thresholds according to the spliced screen, and determining the intra-threshold area and the extra-threshold area of the image to be extracted under each undetermined threshold;
Generating an intra-threshold area proportion, an intra-threshold gray average value, an extra-threshold area proportion and an extra-threshold gray average value from the intra-threshold area and the extra-threshold area corresponding to each undetermined threshold;
Generating a weight value from the intra-threshold area proportion, the intra-threshold gray average value, the extra-threshold area proportion and the extra-threshold gray average value of each undetermined threshold;
Determining a target threshold according to the weight value of each undetermined threshold;
Determining the central area of the image to be extracted according to the target threshold;
and calculating a target threshold value and extracting a central region for each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set.
Optionally, the fusion unit includes:
determining the central-area range and central-area pixel values of each image at one exposure time in the first standard gray-scale image set;
Determining an inscribed rectangular area or a completely overlapping area according to the central-area range of each image, wherein the completely overlapping area lies within the central-area range of every image, and the inscribed rectangular area lies within the central-area range of at least one image;
According to the central-area pixel values of each image, carrying out mean-value fusion processing on the pixel points in the inscribed rectangular area or the completely overlapping area through a pixel averaging algorithm to generate a mean-value fusion image, the mean-value fusion image replacing the images at that exposure time;
Generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on images with the same exposure time in a first standard gray-scale image set and a second standard gray-scale image set;
and generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on the images with the same gray scale and the same exposure time in the first reference gray scale image set and the second reference gray scale image set.
Optionally, the computing unit includes:
performing an M-term coefficient comparison between the first standard fitting polynomial and the first reference fitting polynomial set according to the size of N to generate a first comparison coefficient, M being an integer greater than 1;
performing an M-term coefficient comparison between the second standard fitting polynomial and the second reference fitting polynomial set according to the size of N to generate a second comparison coefficient;
performing an M-term coefficient comparison between the first standard fitting polynomial and the second standard fitting polynomial according to the size of N to generate a third comparison coefficient;
and generating correction coefficients of standard gray scales and N reference gray scales for the camera to be corrected according to the first comparison coefficient, the second comparison coefficient and the third comparison coefficient by taking the standard camera as a reference.
Optionally, the assembly unit includes:
the assembly module is used for assembling the multi-industry camera system and selecting a spliced screen;
The adjusting module is used for adjusting the shooting angle, the aperture and the external light source of the industrial camera in the multi-industrial camera system to be consistent;
The height adjusting module is used for adjusting the image capturing height of the industrial camera in the multi-industrial camera system according to the spliced screen;
and the generation module is used for generating initial exposure time for the industrial cameras in the multi-industrial camera system according to the spliced screen.
Optionally, the height adjustment module includes:
placing the spliced screen on a station corresponding to the industrial camera;
Determining an acquisition range and a step length, wherein the acquisition range is the distance range between an industrial camera and a spliced screen, and the step length is an acquisition interval;
Controlling an industrial camera to acquire images of the spliced screen in an acquisition range by taking the step length as an acquisition interval to generate a spliced screen image group;
Carrying out bilateral filtering on the images in the spliced screen image group to generate a bilateral filtering image;
acquiring the mean value and standard deviation of the spliced screen image, the mean value and standard deviation of the bilateral filtering image and the covariance of the spliced screen image and the bilateral filtering image;
acquiring a pre-designed brightness formula, a contrast formula and a structural formula;
Calculating a focusing value according to the average value, standard deviation and covariance of the spliced screen image and the bilateral filtering image and a brightness formula, a contrast formula and a structural formula, and generating a focusing value set;
And determining the acquisition point position corresponding to the maximum focusing value as the imaging height of the industrial camera.
Optionally, the generating module includes:
determining one industrial camera from the multi-industrial-camera system as the standard camera, and determining the remaining industrial cameras as cameras to be corrected;
Placing the spliced screen on the station corresponding to the standard camera and lighting it up;
Shooting the spliced screen through a standard camera to generate a spliced screen gray-scale image;
calculating a real-time gray average value of the gray-scale images of the spliced screen;
When the real-time gray average value is not in the preset gray range, the exposure time of the standard camera is adjusted to shoot the spliced screen again until the real-time gray average value reaches the preset gray range, and the unified initial exposure time is determined.
A third aspect of the present application provides a multi-camera consistency adjustment apparatus for a tiled screen, including:
a processor, a memory, an input-output unit, and a bus;
The processor is connected with the memory, the input/output unit and the bus;
the memory stores a program, and the processor invokes the program to perform the consistency adjustment method of the first aspect or any optional implementation of the first aspect.
A fourth aspect of the application provides a computer-readable storage medium having a program stored thereon which, when executed on a computer, performs the consistency adjustment method of the first aspect or any optional implementation of the first aspect.
From the above technical solutions, the embodiment of the present application has the following advantages:
In the application, a multi-industrial-camera system comprising several industrial cameras of the same model is first assembled and debugged with a spliced screen so that the detection parameters of the system are as uniform as possible. One industrial camera of the system is set as the standard camera, and all the others are determined as cameras to be corrected. A standard gray-scale picture and N reference gray-scale pictures are input to the spliced screen, the standard camera and the cameras to be corrected shoot it, and their exposure times are adjusted during shooting. At a given gray scale, the standard camera and a camera to be corrected each take at least 2 images at every exposure time. This yields a first standard gray-scale image set of the standard camera at the standard gray scale and a first reference gray-scale image set of the standard camera at the N reference gray scales, as well as a second standard gray-scale image set of the camera to be corrected at the standard gray scale and a second reference gray-scale image set at the N reference gray scales. The central area of each image in the four image sets is then determined according to the type of the spliced screen, and central-area fusion processing is carried out on images with the same gray scale and the same exposure time.
And then fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera. And fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set, and generating a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected. And calculating correction coefficients of the camera to be corrected according to polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set. And finally, correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
The method comprises the steps of setting consistent initial exposure time for an industrial camera on a multi-station, carrying out image acquisition on standard gray scales and reference gray scales for multiple times under different exposure time by using a standard camera, screening out edge regions with weak gray scale induction by determining a central region with strong gray scale induction of each image, and reducing single-frame noise difference generated by pixel points on the edge regions and the central region by fusion processing of the central region, wherein more accurate gray average data can be extracted from images. The subsequent polynomial fitting effect on the exposure time and the gray average value of the image is improved. And then calculating correction coefficients through polynomials of the standard camera and the camera to be corrected, and finally correcting the initial exposure time of the camera to be corrected through the correction coefficients. According to the method, each camera to be corrected can take the standard camera as a reference, the initial exposure time of each camera to be corrected is adjusted aiming at a spliced screen of a certain type, and before the spliced screen of the certain type is acquired, the initial exposure time of each camera to be corrected accords with the spliced screen of the certain type as much as possible, so that the total adjustment times of the exposure time of each industrial camera are reduced, even the initial exposure time is not required to be adjusted, the gray adjustment time of the industrial camera on the acquired image of the spliced screen is reduced, and the integral detection time of the spliced screen is further reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of one embodiment of a multi-camera consistency adjustment method for a spliced screen of the present application;
FIG. 2 is a schematic diagram of one embodiment of a first stage of a multi-camera consistency adjustment method for a spliced screen of the present application;
FIG. 3 is a schematic diagram of one embodiment of a second stage of a multi-camera consistency adjustment method for a spliced screen of the present application;
FIG. 4 is a schematic diagram of one embodiment of a third stage of a multi-camera consistency adjustment method for a spliced screen of the present application;
FIG. 5 is a schematic diagram of one embodiment of a fourth stage of a multi-camera consistency adjustment method for a spliced screen of the present application;
FIG. 6 is a schematic diagram of one embodiment of a multi-camera consistency adjustment device for a spliced screen of the present application;
FIG. 7 is a schematic diagram of another embodiment of a multi-camera consistency adjustment device for a spliced screen of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when", "once", "in response to a determination" or "in response to detection", depending on the context. Similarly, the phrases "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the prior art, gray-scale adjustment of spliced-screen images is performed by adjusting the exposure time. An average gray-scale range is preset; after the industrial camera photographs the spliced screen, the average gray scale is calculated, and if it is not within the preset range, the exposure time is adjusted by a preset step and the spliced screen is photographed again, until the average gray scale reaches the preset range. However, the exposure time must be adjusted step by step, and the initial exposure time of the industrial camera at every station is identical. Because different types of spliced screens display differently in the edge area and the central area and exhibit single-frame noise differences, the initial exposure time is not adapted to the internal differences of the screen type. The total number of exposure-time adjustments of each industrial camera is therefore uncertain, which increases the gray-adjustment time for the acquired spliced-screen images and thus the overall detection time of the spliced screen.
Based on this, the present application discloses a multi-camera consistency adjustment method and device for a spliced screen, and a storage medium, which are used for reducing the overall detection time of the spliced screen.
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The method of the present application may be applied to a server, a device, a terminal, or other devices having logic processing capabilities, and the present application is not limited thereto. For convenience of description, the following description will take an execution body as an example of a terminal.
Referring to fig. 1, the present application provides an embodiment of a method for adjusting consistency of multiple cameras of a spliced screen, including:
101. assembling and debugging a multi-industrial-camera system through a spliced screen, so that the operating environment and the initial exposure time of each industrial camera of the same model in the multi-industrial-camera system are the same, the multi-industrial-camera system comprising a standard camera and at least one camera to be corrected;
In this embodiment, the terminal assembles and debugs the multi-industrial-camera system through the spliced screen, so that the operating environments of the industrial cameras of the same model in the system are as uniform as possible.
Specifically, the requirements for the relevant elements of the industrial camera on each station to influence the gray scale of the image are as follows:
First, the industrial cameras of all stations must be of the same brand and the same model. Second, the industrial camera of each station must photograph the same standard spliced-screen sample throughout assembly up to the subsequent calibration step. The industrial cameras of all stations must also share the same operating environment, specifically the same photographing angle, photographing center, external light-source brightness, external light-source angle, darkroom environment, camera aperture, initial exposure time, photographing height, and so on. Some of these parameters must be set using a standard spliced screen, so that the debugged multi-industrial-camera system is adapted to spliced screens to be detected of the same type. It should be noted that the standard spliced screen used during debugging can be replaced by a spliced screen to be detected, as its model and specification are the same as those of the spliced screens to be detected.
In this embodiment, the aperture of the industrial cameras at all stations can be uniformly adjusted, for example to 8.0, before capturing images, and the cameras are fixed at the same height on the camera support of each station. Meanwhile, the camera lens of each station is kept at the same horizontal or vertical angle to the detection table of that station. After this, the industrial camera at each station is focused and adjusted for normal image capture. In short, differentiated influence of external factors on the quality of the images taken at each station is avoided as much as possible. Note that different usage scenarios may call for different f-numbers; what matters is that all stations use the same aperture in the same scenario, the value of 8.0 above being only one example.
It should also be noted that the initial exposure times of the cameras to be corrected are kept consistent before correction; after correction, the initial exposure time of every camera except the standard camera may change.
When the terminal completes the assembly and debugging of the multi-industrial-camera system, it then determines one industrial camera of the system to be the standard camera and the remaining industrial cameras to be cameras to be corrected.
102. lighting up the spliced screen so that it reaches the standard gray level and N reference gray levels in turn, and adjusting the exposure time of the standard camera and the cameras to be corrected during shooting according to a preset exposure time set, N being an integer greater than 0;
103. shooting the spliced screen with the standard camera and the cameras to be corrected, and generating a first standard gray-scale image set and a first reference gray-scale image set of the standard camera and a second standard gray-scale image set and a second reference gray-scale image set of the cameras to be corrected, each image set containing at least 2 images for the same displayed gray level at the same exposure time;
The terminal lights up the spliced screen so that the display shows the standard gray level and the N reference gray levels.
In this embodiment, the N reference gray levels are different from the standard gray level, and the reference gray levels are determined according to the gray-scale images to be acquired from the spliced screen to be detected.
In this embodiment, the value of the standard gray level depends on the gray-scale display effect of the standard spliced screen of that type: when the display effect of that screen type reaches a preset condition at a certain gray level or range of gray levels, one such gray level is selected as the standard gray level for detection, so that the subsequent fitting works better.
The terminal adjusts the exposure time of the standard camera and the camera to be corrected in the shooting process according to a preset exposure time set, so that a plurality of images are shot at each exposure time.
For example, suppose a type-A spliced screen has a standard gray level of 128 and a reference gray level of B, with the exposure time set {T1, T2, T3, T4, T5}. The standard camera is set to exposure time T1 while the spliced screen displays gray level 128 and 5 images are shot; the exposure time is then adjusted to T2 and 5 more images are shot, and so on, generating the first standard gray-scale image set of the standard camera, 25 images in total.
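A minimal sketch of this acquisition loop; `set_exposure` and `capture` are placeholders for whatever interface the industrial camera actually provides, and the frame count of 5 matches the example:

```python
def acquire_gray_image_set(camera, exposure_times, frames_per_exposure=5):
    """Capture frames_per_exposure images at each exposure time.

    Returns a dict mapping exposure time -> list of captured images,
    e.g. the first standard gray-scale image set (5 exposures x 5 frames
    = 25 images in the example above).
    """
    image_set = {}
    for t in exposure_times:
        camera.set_exposure(t)   # assumed SDK call
        image_set[t] = [camera.capture() for _ in range(frames_per_exposure)]
    return image_set
```

The same loop is repeated for each reference gray level and for each camera to be corrected, producing the four image sets named in step 103.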
The first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set are generated similarly and are not described again.
104. determining the central area of each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set according to the type of the spliced screen;
In the prior art, only one image is shot, its central area is determined, the gray average of that central area is calculated directly, and fitting is performed on that gray average and the exposure time. However, different spliced screens differ in display capability, which is better in the central area than in the edge area (display capability here means the display effect and quality at different gray levels). When external or system noise is present, a single-frame noise difference easily arises between the edge area and the central area of a single spliced-screen image: the gray level of edge pixels is low relative to the central pixels, the transition gray difference is large, and the gray level displayed by a pixel may not be optimal. The determination of the central area then deviates, the gray average of the central area is in error, and the subsequent fitting becomes problematic.
In this embodiment, a plurality of spliced-screen images is captured at the same gray level and the same exposure time, and the central area of each of these images is determined in turn. At this stage, the central area of some images deviates, and the deviation must be reduced by the subsequent steps.
105. Carrying out central region fusion processing on images with the same gray scale and the same exposure time in a first standard gray scale image set, a first reference gray scale image set, a second standard gray scale image set and a second reference gray scale image set;
And the terminal performs center region fusion processing on the images with the same gray scale and the same exposure time in the first standard gray scale image set, the first reference gray scale image set, the second standard gray scale image set and the second reference gray scale image set.
For example, suppose a type-A spliced screen has a standard gray level of 128 and a reference gray level of B, with the exposure time set {T1, T2, T3, T4, T5} and 5 images at each exposure time. The 5 images with exposure time T1 are selected; their display-screen areas are identical. After the central area and the edge area of each are determined, the central areas of the 5 images are overlapped to determine a new central-area range (a blank image), and this range picks out the corresponding pixels in the central area of each of the 5 images. The gray values of pixels at overlapping positions are averaged and filled into the new central-area range, generating a target image (the new central area) at gray level 128 and exposure time T1, which is used for the subsequent fitting.
Note that after the central-area range is determined, it may not lie entirely inside the central area of some of the images. For example, with only two images, the overlap of their two central areas is selected as the new central-area range; to compute the gray average, the gray values of the two pixels corresponding to each blank pixel of the range (one from each central area, at the same position) are averaged and filled into the range.
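A minimal numpy sketch of this fusion step, assuming each image's central area is supplied as a rectangle `(top, bottom, left, right)` and that the new central-area range is the intersection of those rectangles:

```python
import numpy as np

def fuse_center_regions(images, regions):
    """Average the pixel values of several same-gray, same-exposure frames
    over the intersection of their per-frame central areas.

    images  - list of 2-D arrays with identical shapes
    regions - list of (top, bottom, left, right) central-area rectangles
    Returns the fused central area and its gray average.
    """
    # new central-area range: intersection of all per-frame rectangles
    top    = max(r[0] for r in regions)
    bottom = min(r[1] for r in regions)
    left   = max(r[2] for r in regions)
    right  = min(r[3] for r in regions)
    # stack the corresponding pixels of every frame and average them
    stack = np.stack([img[top:bottom, left:right].astype(np.float64)
                      for img in images])
    fused = stack.mean(axis=0)
    return fused, float(fused.mean())
```

The returned gray average is the value later paired with the exposure time for polynomial fitting.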
106. Fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera;
107. Fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set to generate a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected;
And the terminal fits according to the gray average value and the exposure time of the central area of the first standard gray image set, and generates a first standard fitting polynomial of the standard camera.
The terminal runs multiple experiments on the two variables, exposure time and average gray. In this embodiment several fitting methods were computed and compared, such as exponential, trigonometric and polynomial equations, and a polynomial fitting algorithm was finally adopted to fit the relation between exposure time and gray average accurately.
The polynomial fitting equation has the form:
G = a_0 + a_1·T + a_2·T^2 + … + a_n·T^n
where G is the gray average of each central area in the first standard gray-scale image set, T is the exposure time in the first standard gray-scale image set, a_0, …, a_n are the polynomial coefficients, and n is the highest order. To minimize the error between the fitted curve and the data points, the following quantity is minimized:
R = Σ_j ( G_j − (a_0 + a_1·T_j + … + a_n·T_j^n) )^2
where R is the error parameter and (T_j, G_j) are the measured exposure-time/gray-average pairs. Taking the partial derivative of R with respect to each coefficient a_i and setting it to 0 yields a system of linear equations.
Solving this simultaneous matrix equation gives the value of each coefficient a_0, …, a_n.
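As an illustrative sketch only (the patent does not prescribe an implementation), this least-squares solution is what `numpy.polyfit` computes; the sample data in the test are invented for the example:

```python
import numpy as np

def fit_exposure_gray(exposure_times, gray_means, degree):
    """Least-squares fit of G = a_0 + a_1*T + ... + a_n*T^n.

    Returns the coefficients in ascending order a_0..a_n
    (np.polyfit returns highest order first, so the result is reversed).
    """
    return np.polyfit(exposure_times, gray_means, degree)[::-1]
```

The same call, applied per gray level, yields the first and second standard fitting polynomials and each reference fitting polynomial.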
And the terminal fits according to the gray average value and the exposure time of the central area in the first reference gray image set, and a first reference fitting polynomial set of the standard camera is generated.
The same operation flow as for the standard gray level is applied to each reference gray level, and the corresponding coefficients a'_0, …, a'_n are calculated under each reference gray level.
And the terminal fits according to the gray average value and the exposure time of the central area in the second standard gray image set, and generates a second standard fitting polynomial of the camera to be corrected.
The second standard fitting polynomial has the same form, G' = b_0 + b_1·T' + … + b_n·T'^n, where G' is the gray average of each central area in the second standard gray-scale image set, T' is the exposure time in the second standard gray-scale image set, b_0, …, b_n are the polynomial coefficients, and n is the highest order.
And the terminal performs fitting according to the gray average value and the exposure time of the central area in the second reference gray image set, and generates a second reference fitting polynomial set of the camera to be corrected.
The same operation flow as for the standard gray level is likewise applied to each reference gray level, and the corresponding coefficients b'_0, …, b'_n are calculated under each reference gray level.
It should be noted that each reference gray level generates a dedicated reference fitting polynomial with its own polynomial coefficients.
108. Calculating correction coefficients of the camera to be corrected according to polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set;
In this embodiment, after each polynomial is calculated, the correction coefficient of the reference gray level of the camera to be corrected is calculated according to the coefficients of different levels by using the standard gray level of the standard camera as the reference system. Specific calculation modes are described in the examples.
109. And correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
And finally, the terminal corrects the initial exposure time of the cameras to be corrected according to the correction coefficients, and corrects the initial exposure time of each camera to be corrected in this way.
In this embodiment, a multi-industrial-camera system comprising a plurality of industrial cameras of the same model is assembled first, and the system is debugged through the spliced screen so that its detection parameters are as uniform as possible. One industrial camera of the system is set as the standard camera, and every other industrial camera is determined to be a camera to be corrected. A standard gray-scale picture and N reference gray-scale pictures are input to the spliced screen, which is photographed by the standard camera and the cameras to be corrected while their exposure time is adjusted during shooting. At each gray level, the standard camera and each camera to be corrected capture at least 2 images at every exposure time. This yields a first standard gray-scale image set of the standard camera at the standard gray level and a first reference gray-scale image set of the standard camera at the N reference gray levels, as well as a second standard gray-scale image set of the camera to be corrected at the standard gray level and a second reference gray-scale image set of the camera to be corrected at the N reference gray levels. The central area of each image in the four image sets is determined according to the type of the spliced screen, and central-area fusion processing is carried out on images of the same gray level and the same exposure time.
Fitting is then performed on the gray average and the exposure time of the central areas in the first standard gray-scale image set and the first reference gray-scale image set to generate a first standard fitting polynomial and a first reference fitting polynomial set for the standard camera, and likewise on the second standard gray-scale image set and the second reference gray-scale image set to generate a second standard fitting polynomial and a second reference fitting polynomial set for the camera to be corrected. A correction coefficient for the camera to be corrected is calculated from the polynomial coefficients of the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial and the second reference fitting polynomial set, and finally the initial exposure time of the camera to be corrected is corrected according to the correction coefficient.
A consistent initial exposure time is set for the industrial cameras at all stations, and the standard camera acquires images of the standard and reference gray levels repeatedly at different exposure times. By determining, for each image, the central region where gray-scale response is strong and screening out the edge region where it is weak, and by fusing the central regions, the single-frame noise difference between edge and central pixels is reduced, so that more accurate gray-average data can be extracted from the images and the subsequent polynomial fit between exposure time and gray average is improved. Correction coefficients are then calculated from the polynomials of the standard camera and each camera to be corrected, and the initial exposure time of each camera to be corrected is finally corrected through these coefficients. In this way, each camera to be corrected takes the standard camera as its reference, and its initial exposure time is tuned for a given type of spliced screen before screens of that type are acquired, so that the initial exposure time matches that screen type as closely as possible. The total number of exposure-time adjustments of each industrial camera is therefore reduced, or the initial exposure time need not be adjusted at all, which shortens the gray-adjustment time for the acquired spliced-screen images and hence the overall detection time of the spliced screen.
Referring to FIGS. 2, 3, 4 and 5, the present application provides another embodiment of a method for adjusting the consistency of multiple cameras of a spliced screen, including:
201. assembling a multi-industrial-camera system and selecting a spliced screen;
202. adjusting the shooting angle, aperture and external light source of each industrial camera in the multi-industrial-camera system to be consistent;
203. placing the spliced screen on the station corresponding to each industrial camera;
In this embodiment, the terminal assembles the multi-industrial-camera system and then selects a spliced screen as the standard spliced screen for the debugging and correction stages. The terminal adjusts the shooting angle, aperture and external light source of each industrial camera in the system to be consistent.
The terminal then places the spliced screen on the station corresponding to each industrial camera.
204. Determining an acquisition range and a step length, wherein the acquisition range is the distance range between an industrial camera and a spliced screen, and the step length is an acquisition interval;
205. Controlling an industrial camera to acquire images of the spliced screen in an acquisition range by taking the step length as an acquisition interval to generate a spliced screen image group;
Because the thickness of different spliced screens is inconsistent, the capture height between the industrial camera and the spliced screen usually has to be adjusted so that the captured image is well focused. Conventionally, the height of every industrial camera is adjusted to the same empirical value; this ignores the differences between stations and amplifies the differences between the captured spliced-screen images.
In this embodiment, the terminal first determines an acquisition range and a step length, where the acquisition range is the range of distances between the industrial camera and the spliced screen, and the step length is the acquisition interval. The acquisition range extends in both directions around a height set by an empirical value.
The terminal controls the industrial camera to acquire images of the spliced screen within the acquisition range at intervals of the step length, generating a spliced-screen image group, and binds the corresponding capture height to each image in the group.
206. Carrying out bilateral filtering on the images in the spliced screen image group to generate a bilateral filtering image;
207. acquiring the mean value and standard deviation of the spliced screen image, the mean value and standard deviation of the bilateral filtering image and the covariance of the spliced screen image and the bilateral filtering image;
208. acquiring a pre-designed brightness formula, contrast formula and structure formula;
209. Calculating a focusing value according to the average value, standard deviation and covariance of the spliced screen image and the bilateral filtering image and a brightness formula, a contrast formula and a structural formula, and generating a focusing value set;
210. determining the acquisition point position corresponding to the maximum focusing value as the imaging height of the industrial camera;
The terminal performs bilateral filtering on the spliced-screen images in the spliced-screen image group to generate bilateral filtered images. Next, the terminal calculates the mean μ_x and standard deviation σ_x of the spliced-screen image and the mean μ_y and standard deviation σ_y of the bilateral filtered image, and then calculates the covariance σ_xy of the two. Finally, the focus value F is calculated from these statistics combined with the brightness formula, the contrast formula and the structure formula:
F = l(x, y) · c(x, y) · s(x, y)
where the brightness formula is:
l(x, y) = (2·μ_x·μ_y + C1) / (μ_x^2 + μ_y^2 + C1)
the contrast formula is:
c(x, y) = (2·σ_x·σ_y + C2) / (σ_x^2 + σ_y^2 + C2)
and the structure formula is:
s(x, y) = (σ_xy + C3) / (σ_x·σ_y + C3)
Here x denotes the acquired spliced-screen image and y denotes the bilateral filtered image; μ_x and σ_x are the mean and standard deviation of the spliced-screen image; μ_y and σ_y are the mean and standard deviation of the bilateral filtered image; σ_xy is the covariance of the two; and C1, C2, C3 are constants determined according to a preset bit depth.
And finally, the terminal determines the acquisition point position corresponding to the maximum focusing value as the imaging height of the industrial camera.
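The focusing-value computation described above can be sketched in Python. The naive bilateral filter and the SSIM-style constants C1 = (0.01·L)², C2 = (0.03·L)², C3 = C2/2 with L = 2^bit_depth − 1 are assumptions; the patent only states that the constants are determined from a preset bit depth.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Naive bilateral filter: spatial Gaussian times range Gaussian."""
    h, w = img.shape
    pad = np.pad(img.astype(np.float64), radius, mode="edge")
    out = np.empty((h, w), dtype=np.float64)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma_s ** 2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            wgt = spatial * rng
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def focus_value(x, y, bit_depth=8):
    """Focus value F = l * c * s between image x and its bilateral-
    filtered version y (SSIM-style; constants are assumptions)."""
    L = 2 ** bit_depth - 1
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    c3 = c2 / 2
    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    sxy = ((x - mx) * (y - my)).mean()
    l = (2 * mx * my + c1) / (mx ** 2 + my ** 2 + c1)
    c = (2 * sx * sy + c2) / (sx ** 2 + sy ** 2 + c2)
    s = (sxy + c3) / (sx * sy + c3)
    return l * c * s
```

In use, the terminal would compute `focus_value(img, bilateral_filter(img))` for the image captured at each acquisition point and keep the height whose value is largest.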
211. Determining one industrial camera from the multi-industrial camera system as the standard camera, and determining the remaining industrial cameras as cameras to be corrected;
212. The spliced screen is placed on a station corresponding to the standard camera to be lightened;
213. Shooting the spliced screen through a standard camera to generate a spliced screen gray-scale image;
214. calculating a real-time gray average value of the gray-scale images of the spliced screen;
215. when the real-time gray average value is not in the preset gray range, adjusting the exposure time of the standard camera to shoot the spliced screen again until the real-time gray average value reaches the preset gray range, and determining the uniform initial exposure time;
In this embodiment, the terminal determines one industrial camera from the multi-industrial camera system as the standard camera and determines all the other industrial cameras as cameras to be corrected. The terminal then places the spliced screen on the station corresponding to the standard camera and lights it up, shoots the spliced screen through the standard camera to generate a spliced screen gray-scale image, and calculates the real-time gray average value of that image. When the real-time gray average value is not within the preset gray range, the exposure time of the standard camera is adjusted and the spliced screen is shot again until the real-time gray average value reaches the preset gray range. Finally, this exposure time is set as the uniform initial exposure time for each camera to be corrected.
The terminal adjusts the initial exposure time of the standard camera in advance so that it best matches the standard camera, and then applies this initial exposure time to each of the other cameras to be corrected.
If the initial exposure time were not calculated for the standard camera, its default exposure time, which suits most screen bodies, would not necessarily be suitable for this type of spliced screen. In the subsequent exposure time correction process, if the default exposure time of the standard camera deviates greatly, each camera to be corrected would retain a deviation even after correction is completed.
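The search for the uniform initial exposure time can be sketched as a feedback loop. The proportional update rule, the default target range and the bounds below are illustrative assumptions, and `capture` stands in for the real camera interface.

```python
import numpy as np

def find_initial_exposure(capture, t0=10.0, target=(110.0, 140.0),
                          t_min=0.1, t_max=1000.0, max_iters=50):
    """Adjust exposure time until the real-time gray average of the
    captured image falls inside `target`; capture(t) is assumed to
    return a 2-D gray-scale image taken at exposure time t."""
    lo, hi = target
    t = t0
    for _ in range(max_iters):
        mean = max(float(np.mean(capture(t))), 1e-6)  # avoid divide-by-zero
        if lo <= mean <= hi:
            return t
        # Proportional step toward the middle of the target range,
        # assuming a roughly linear sensor response to exposure time.
        t = min(max(t * (lo + hi) / (2.0 * mean), t_min), t_max)
    raise RuntimeError("preset gray range not reached")
```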
216. The spliced screen is lightened, so that the spliced screen respectively reaches standard gray scales and N reference gray scales, the exposure time of a standard camera and a camera to be corrected in the shooting process is adjusted according to a preset exposure time set, and N is an integer larger than 0;
217. Shooting the spliced screen with the standard camera and the camera to be corrected, generating a first standard gray-scale image set and a first reference gray-scale image set of the standard camera, and a second standard gray-scale image set and a second reference gray-scale image set of the camera to be corrected, wherein at least 2 images are collected at the same exposure time for the same display gray scale in each image set;
Step 216 and step 217 in this embodiment are similar to step 102 and step 103 in the previous embodiment, and will not be repeated here.
218. Selecting an image to be extracted from the first standard gray-scale image set;
219. determining a plurality of undetermined thresholds according to the spliced screen, and respectively determining an intra-threshold area and an extra-threshold area of the image to be extracted under each undetermined threshold;
In this embodiment, the terminal selects an image to be extracted from the first standard gray-scale image set and uses it as a reference example. The terminal determines a plurality of undetermined thresholds according to the spliced screen, and determines the intra-threshold area and extra-threshold area of the image to be extracted under each undetermined threshold. The undetermined thresholds are determined according to the quality of the spliced screen's gray-scale display: the terminal first determines the gray-scale range in which the spliced screen's display quality is best, generates threshold ranges from the upper-limit gray scale and lower-limit gray scale of that range together with gray scale 0 and gray scale 255, and then selects a plurality of undetermined thresholds from the range between gray scale 0 and the lower-limit gray scale and a plurality of undetermined thresholds from the range between the upper-limit gray scale and gray scale 255.
The terminal determines the intra-threshold and extra-threshold regions of the image to be extracted under each of the undetermined thresholds.
220. Generating an intra-threshold ratio, an intra-threshold gray level average value, an extra-threshold ratio and an extra-threshold gray level average value according to the intra-threshold area and the extra-threshold area corresponding to each undetermined threshold;
At this time, for each undetermined threshold the terminal obtains the corresponding intra-threshold area and extra-threshold area in the image to be extracted, generates the intra-threshold area ratio and the intra-threshold gray average value from the pixel data in the intra-threshold area, and generates the extra-threshold area ratio and the extra-threshold gray average value from the pixel data in the extra-threshold area.
221. Generating a weight value according to the intra-threshold area ratio, the intra-threshold gray average value, the extra-threshold area ratio and the extra-threshold gray average value of each undetermined threshold;
The terminal generates a weight value for each undetermined threshold from these four quantities. Let the intra-threshold area ratio be P_in, the extra-threshold area ratio be P_out, the intra-threshold gray average value be G_in and the extra-threshold gray average value be G_out; the weight value W is then calculated from P_in, G_in, P_out and G_out.
222. Determining a target threshold according to the weight value of each undetermined threshold;
223. Determining a central area of an image to be extracted according to a target threshold;
224. Performing target threshold calculation and center region extraction for each image in the first standard gray level image set, the first reference gray level image set, the second standard gray level image set and the second reference gray level image set;
In this embodiment, the terminal generates a corresponding weight value for each undetermined threshold and determines the undetermined threshold with the largest weight value as the target threshold. The terminal then determines the central region and edge region of the image to be extracted according to the target threshold, and screens out the edge region to obtain the central region of the image to be extracted.
And performing target threshold calculation and extraction of a central region for each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set in the above manner.
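Steps 218 to 224 can be sketched as follows. The patent does not disclose the exact weight formula, so the weight function is left pluggable; the Otsu-style between-class-variance weight used in the test is purely a hypothetical stand-in, and treating the intra-threshold region as the pixels at or above the threshold is also an assumption.

```python
import numpy as np

def threshold_stats(img, threshold):
    """Intra-/extra-threshold area ratios and gray averages for one
    undetermined threshold (intra = pixels at or above the threshold,
    an assumed convention)."""
    mask = img >= threshold
    p_in = mask.sum() / img.size
    p_out = 1.0 - p_in
    g_in = float(img[mask].mean()) if mask.any() else 0.0
    g_out = float(img[~mask].mean()) if (~mask).any() else 0.0
    return p_in, g_in, p_out, g_out

def pick_target_threshold(img, candidates, weight):
    """Pick the undetermined threshold with the largest weight value.
    `weight` maps (p_in, g_in, p_out, g_out) to a scalar; the patent's
    exact weight formula is not reproduced here."""
    scores = [weight(*threshold_stats(img, t)) for t in candidates]
    return candidates[int(np.argmax(scores))]
```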
225. Determining a center region range and a center region pixel value of each image of one exposure time in the first standard gray scale image set;
In this embodiment, since a plurality of images are acquired at the same exposure time for the same gray scale, the central areas of these images need to be integrated first to generate a single fused image.
First, the terminal determines the central area range and the central area pixel values of each image of one exposure time in the first standard gray-scale image set, namely the extent of each image's central area and the gray value of each pixel point within it.
226. Determining an inscribed rectangular area or a complete overlapping area according to the central area range of each image, wherein the complete overlapping area is in the central area range of each image, and the inscribed rectangular area is in the central area range of at least one image;
The terminal determines an inscribed rectangular region or a full overlap region according to the central region range of each image.
Assume there are 5 images. After the central area of each of the 5 images is determined, the 5 central areas are placed on the same spliced screen image; they may partially or completely overlap. The completely overlapped area is the position covered by all 5 central areas. The inscribed rectangular area is obtained by shrinking the outline of the spliced screen (generally rectangular) in equal proportion within the maximum range covered by the 5 central areas, so that the inscribed rectangular area lies inside that maximum range.
The central area of each image within the completely overlapped area needs to be large enough so that the obtained completely overlapped area is also large enough. If the overlapped area is small, the inscribed rectangular area is used instead to increase the number of pixels. As long as the number of pixels is sufficient, the gray average value in the subsequent fusion processing is more accurate and the single-frame noise difference is reduced.
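A minimal sketch of the two region choices, representing each central area as an axis-aligned bounding box (x0, y0, x1, y1). The equal-proportion shrink of the inscribed rectangle is a simplified reading of the text, keeping only the screen's aspect ratio.

```python
def full_overlap(regions):
    """Intersection of the center-region boxes, i.e. the area covered
    by every image's central area; None if there is no common area."""
    x0 = max(r[0] for r in regions)
    y0 = max(r[1] for r in regions)
    x1 = min(r[2] for r in regions)
    y1 = min(r[3] for r in regions)
    if x1 <= x0 or y1 <= y0:
        return None  # fall back to an inscribed rectangle instead
    return (x0, y0, x1, y1)

def inscribed_rect(regions, aspect):
    """Largest rectangle with the screen's aspect ratio (width/height)
    centered inside the union box of the central areas, a simplified
    version of the patent's equal-proportion shrink."""
    x0 = min(r[0] for r in regions); y0 = min(r[1] for r in regions)
    x1 = max(r[2] for r in regions); y1 = max(r[3] for r in regions)
    w, h = x1 - x0, y1 - y0
    if w / h > aspect:      # union too wide: limit the width
        w = h * aspect
    else:                   # union too tall: limit the height
        h = w / aspect
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
```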
227. According to the pixel value of the central area of each image, carrying out mean value fusion processing on the pixel points on the inscribed rectangular area or the completely overlapped area through a pixel averaging algorithm to generate a mean value fusion image, wherein the mean value fusion image replaces each image with the exposure time;
228. Generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on images with the same exposure time in a first standard gray-scale image set and a second standard gray-scale image set;
229. generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on images with the same gray scale and the same exposure time in a first reference gray scale image set and a second reference gray scale image set;
In this embodiment, the terminal performs mean value fusion processing on the pixel points in the inscribed rectangular area or the completely overlapped area through a pixel averaging algorithm according to the central area pixel values of each image, generating a mean fusion image that replaces each image of that exposure time, where the formula is as follows:

F(i, j) = (1/K) · Σ_{k=1}^{K} I_k(i, j)

Wherein F is the mean fusion image, K is the number of captured images, and I_k(i, j) is the gray value of the pixel point at position (i, j) on the inscribed rectangular area or the completely overlapped area within the central area of the k-th captured image.
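The mean value fusion step can be sketched as:

```python
import numpy as np

def mean_fusion(images, region):
    """Average the K same-exposure captures over the chosen region
    (x0, y0, x1, y1): F(i, j) = (1/K) * sum_k I_k(i, j), which
    suppresses single-frame noise differences."""
    x0, y0, x1, y1 = region
    stack = np.stack([np.asarray(img, dtype=np.float64)[y0:y1, x0:x1]
                      for img in images])
    return stack.mean(axis=0)
```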
230. Fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera;
231. Fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set to generate a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected;
step 230 and step 231 in this embodiment are similar to step 106 and step 107 in the previous embodiment, and will not be repeated here.
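The fitting of steps 230 and 231 can be sketched with a least-squares polynomial fit, one polynomial per camera and display gray scale, relating exposure time to the gray average of the fused central area. The degree M = 2 is an assumption; the patent only requires M > 1.

```python
import numpy as np

def fit_gray_vs_exposure(exposure_times, gray_means, degree=2):
    """Fit the central-area gray average as a degree-M polynomial of
    exposure time; returns coefficients, highest degree first."""
    return np.polyfit(np.asarray(exposure_times, dtype=np.float64),
                      np.asarray(gray_means, dtype=np.float64), degree)
```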
232. Performing M times of item coefficient comparison on the first standard fitting polynomial and the first reference fitting polynomial set according to the size of N to generate a first comparison coefficient, wherein M is an integer greater than 1;
233. Performing M times of item coefficient comparison on the second standard fitting polynomial and the second reference fitting polynomial set according to the size of N to generate a second comparison coefficient;
234. Performing M times of item coefficient comparison on the first standard fitting polynomial and the second standard fitting polynomial according to the size of N to generate a third comparison coefficient;
235. generating correction coefficients of standard gray scales and N reference gray scales for the camera to be corrected according to the first comparison coefficient, the second comparison coefficient and the third comparison coefficient by taking the standard camera as a reference;
In this embodiment, the terminal performs M-degree term coefficient comparison on the first standard fitting polynomial and the first reference fitting polynomial set according to the size of N, generating a first comparison coefficient, where M is an integer greater than 1, and performs M-degree term coefficient comparison on the second standard fitting polynomial and the second reference fitting polynomial set according to the size of N, generating a second comparison coefficient. The terminal then performs M-degree term coefficient comparison on the first standard fitting polynomial and the second standard fitting polynomial according to the size of N, generating a third comparison coefficient.
For example, if N is 1, i.e. there are only the standard gray scale and one reference gray scale, the data of the standard gray scale and the one reference gray scale are analyzed and compared: the M-degree term coefficients of the first standard fitting polynomial and the first reference fitting polynomial are selected and compared to obtain the first comparison coefficient.
Likewise, the M-degree term coefficients of the second standard fitting polynomial and the second reference fitting polynomial are selected and compared to obtain the second comparison coefficient.
Meanwhile, this patent uses the 128 gray scale of each station as the reference gray scale, so the M-degree term coefficients of the standard station and the correction station at the 128 gray scale need to be compared to calculate the third comparison coefficient.
And finally, the terminal generates correction coefficients of standard gray scales and reference gray scales for the camera to be corrected according to the first comparison coefficient, the second comparison coefficient and the third comparison coefficient by taking the standard camera as a reference.
For the standard camera, the correction coefficients of the standard gray scale and the reference gray scale are normalized: the correction coefficient of the standard camera at the standard gray scale and its correction coefficient at the reference gray scale are both normalized to 1.
The correction coefficients of the standard gray scale and the reference gray scale of the camera to be corrected are then each calculated with the coefficients of the standard camera as reference, giving the standard-gray-scale correction coefficient and the reference-gray-scale correction coefficient of the camera to be corrected.
These two correction coefficients can subsequently be used to correct the exposure time of the camera to be corrected at the standard gray scale and the reference gray scale respectively.
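One plausible reading of steps 232 to 236, assuming each "comparison" is the ratio of the degree-M term coefficients; the patent does not spell out the ratios or how the three comparison coefficients combine, so the combination and the final exposure correction below are assumptions for illustration.

```python
def correction_coefficients(p1_std, p1_ref, p2_std, p2_ref):
    """Sketch of the three coefficient comparisons. Each p* is a
    polyfit coefficient array, highest degree first, so index 0 is the
    degree-M term: p1_* for the standard camera, p2_* for the camera
    to be corrected, *_std at the standard gray scale, *_ref at the
    reference gray scale."""
    k1 = p1_ref[0] / p1_std[0]   # standard camera: reference vs standard gray
    k2 = p2_ref[0] / p2_std[0]   # camera to be corrected: reference vs standard
    k3 = p1_std[0] / p2_std[0]   # standard camera vs camera to be corrected
    # Normalize the standard camera's coefficients to 1 and express the
    # camera to be corrected relative to it (assumed combination).
    k_std = k3                   # standard-gray-scale correction coefficient
    k_ref = k3 * k1 / k2         # reference-gray-scale correction coefficient
    return k_std, k_ref

def correct_exposure(t_init, k):
    """Corrected initial exposure time for the camera to be corrected."""
    return k * t_init
```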
236. And correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
Step 236 in this embodiment is similar to step 109 in the previous embodiment, and will not be described again here.
In this embodiment, a multi-industrial camera system is first assembled and a spliced screen is selected. The photographing angle, aperture and external light source of the industrial cameras in the multi-industrial camera system are adjusted to be consistent. The spliced screen is placed on the station corresponding to an industrial camera. An acquisition range and a step length are determined, where the acquisition range is the distance range between the industrial camera and the spliced screen and the step length is the acquisition interval. The industrial camera is controlled to acquire images of the spliced screen within the acquisition range at intervals of the step length, generating a spliced screen image group. Bilateral filtering is performed on the images in the spliced screen image group to generate bilateral filtering images. The mean value and standard deviation of the spliced screen image, the mean value and standard deviation of the bilateral filtering image, and the covariance of the spliced screen image and the bilateral filtering image are acquired. A pre-designed brightness formula, contrast formula and structural formula are acquired. A focusing value is calculated from the mean values, standard deviations and covariance in combination with the brightness formula, contrast formula and structural formula, generating a focusing value set. The acquisition point corresponding to the maximum focusing value is determined as the imaging height of the industrial camera. One industrial camera is determined as the standard camera from the multi-industrial camera system, and the remaining industrial cameras are determined as cameras to be corrected. The spliced screen is placed on the station corresponding to the standard camera and lit up.
And shooting the spliced screen through a standard camera to generate a spliced screen gray-scale image. And calculating the real-time gray average value of the gray-scale images of the spliced screen. When the real-time gray average value is not in the preset gray range, the exposure time of the standard camera is adjusted to shoot the spliced screen again until the real-time gray average value reaches the preset gray range.
A standard gray-scale picture and N reference gray-scale pictures are input to the spliced screen, and the standard camera and the camera to be corrected shoot it while their exposure times are adjusted during shooting. At a given gray scale, the standard camera and the camera to be corrected each take at least 2 images at every exposure time. This generates the first standard gray-scale image set of the standard camera under the standard gray scale and the first reference gray-scale image set of the standard camera under the N reference gray scales, as well as the second standard gray-scale image set of the camera to be corrected under the standard gray scale and the second reference gray-scale image set of the camera to be corrected under the N reference gray scales.
An image to be extracted is selected from the first standard gray-scale image set. A plurality of undetermined thresholds are determined according to the spliced screen, and the intra-threshold area and extra-threshold area of the image to be extracted are determined under each undetermined threshold. The intra-threshold area ratio, intra-threshold gray average value, extra-threshold area ratio and extra-threshold gray average value are generated according to the intra-threshold area and extra-threshold area corresponding to each undetermined threshold. A weight value is generated from these four quantities for each undetermined threshold, and the target threshold is determined according to the weight values. The central area of the image to be extracted is determined according to the target threshold. Target threshold calculation and central area extraction are performed for each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set. The central area range and central area pixel values of each image of one exposure time in the first standard gray-scale image set are determined. An inscribed rectangular area or a completely overlapped area is determined according to the central area range of each image, where the completely overlapped area lies within the central area range of every image and the inscribed rectangular area lies within the central area range of at least one image.
And carrying out mean value fusion processing on pixel points on the inscribed rectangular area or the completely overlapped area according to the pixel value of the central area of each image through a pixel averaging algorithm to generate a mean value fusion image, wherein the mean value fusion image replaces each image with the exposure time. And generating inscribed rectangular areas or complete overlapping areas for images with the same exposure time in the first standard gray-scale image set and the second standard gray-scale image set and carrying out mean value fusion processing. And generating inscribed rectangular areas or complete overlapping areas and carrying out mean value fusion processing on the images with the same gray scale and the same exposure time in the first reference gray scale image set and the second reference gray scale image set.
And then fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generating a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera. And fitting according to the gray average value and the exposure time of the central area in the second standard gray image set and the second reference gray image set, and generating a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected. And carrying out M times of coefficient comparison on the first standard fitting polynomial and the first reference fitting polynomial set according to the size of N to generate a first comparison coefficient, wherein M is an integer greater than 1. And carrying out M times of item coefficient comparison on the second standard fitting polynomial and the second reference fitting polynomial set according to the size of N to generate a second comparison coefficient. And carrying out M times of item coefficient comparison on the first standard fitting polynomial and the second standard fitting polynomial according to the size of N to generate a third comparison coefficient. And generating correction coefficients of standard gray scales and N reference gray scales for the camera to be corrected according to the first comparison coefficient, the second comparison coefficient and the third comparison coefficient by taking the standard camera as a reference. And finally, correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
In the method, consistent initial exposure times are set for the industrial cameras on the multiple stations, and the standard camera acquires images of the standard gray scale and the reference gray scales multiple times under different exposure times. By determining the central region of each image where the gray-scale response is strong, the edge regions with weak gray-scale response are screened out, and fusion processing of the central regions reduces the single-frame noise difference of the pixel points in the edge and central regions, so that more accurate gray average data can be extracted from the images. This improves the subsequent polynomial fitting of exposure time against the image gray average value. Correction coefficients are then calculated from the polynomials of the standard camera and the camera to be corrected, and the initial exposure time of the camera to be corrected is finally corrected with these coefficients. In this way, each camera to be corrected takes the standard camera as its reference, and its initial exposure time is adjusted for a given type of spliced screen so that, before screens of that type are acquired, the initial exposure time matches the screen type as closely as possible. This reduces the total number of exposure time adjustments of each industrial camera, and may even make adjustment of the initial exposure time unnecessary, which shortens the gray adjustment time on the acquired spliced screen images and thus the overall detection time of the spliced screen.
Referring to fig. 6, an embodiment of a multi-camera uniformity adjustment device for a tiled screen is provided in the present application, including:
The assembling unit 601 is configured to assemble and debug a multi-industrial camera system through a spliced screen, so that the operating environment and initial exposure time of each industrial camera of the same model in the multi-industrial camera system are the same, the multi-industrial camera system including a standard camera and at least one camera to be corrected;
optionally, the assembly unit 601 includes:
the assembly module 6011 is used for assembling the multi-industrial camera system and selecting a spliced screen;
an adjusting module 6012 for adjusting the photographing angle, aperture and external light source of the industrial camera in the multi-industrial camera system to be consistent;
the height adjusting module 6013 is used for adjusting the image capturing height of the industrial camera in the multi-industrial camera system according to the spliced screen;
Optionally, the height adjustment module 6013 includes:
placing the spliced screen on a station corresponding to the industrial camera;
Determining an acquisition range and a step length, wherein the acquisition range is the distance range between an industrial camera and a spliced screen, and the step length is an acquisition interval;
Controlling an industrial camera to acquire images of the spliced screen in an acquisition range by taking the step length as an acquisition interval to generate a spliced screen image group;
Carrying out bilateral filtering on the images in the spliced screen image group to generate a bilateral filtering image;
acquiring the mean value and standard deviation of the spliced screen image, the mean value and standard deviation of the bilateral filtering image and the covariance of the spliced screen image and the bilateral filtering image;
acquiring a pre-designed brightness formula, contrast formula and structural formula;
Calculating a focusing value according to the average value, standard deviation and covariance of the spliced screen image and the bilateral filtering image and a brightness formula, a contrast formula and a structural formula, and generating a focusing value set;
And determining the acquisition point position corresponding to the maximum focusing value as the imaging height of the industrial camera.
The generating module 6014 is configured to generate an initial exposure time for the industrial cameras in the multi-industrial camera system according to the spliced screen.
Optionally, the generating module 6014 includes:
determining one industrial camera from the multi-industrial camera system as the standard camera, and determining the remaining industrial cameras as cameras to be corrected;
The spliced screen is placed on a station corresponding to the standard camera to be lightened;
Shooting the spliced screen through a standard camera to generate a spliced screen gray-scale image;
calculating a real-time gray average value of the gray-scale images of the spliced screen;
When the real-time gray average value is not in the preset gray range, the exposure time of the standard camera is adjusted to shoot the spliced screen again until the real-time gray average value reaches the preset gray range, and the unified initial exposure time is determined.
The starting unit 602 is configured to light up the spliced screen, so that the spliced screen respectively reaches a standard gray level and N reference gray levels, and adjust exposure times of the standard camera and the camera to be corrected in a shooting process according to a preset exposure time set, where N is an integer greater than 0;
A first generating unit 603, configured to take pictures of the spliced screen using the standard camera and the camera to be corrected, and generate a first standard gray-scale image set and a first reference gray-scale image set of the standard camera, and a second standard gray-scale image set and a second reference gray-scale image set of the camera to be corrected, where at least 2 images are collected at the same exposure time for the same display gray scale in each image set;
a determining unit 604, configured to determine a central area of each of the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set, and the second reference gray-scale image set according to a type of the stitched screen;
Optionally, the determining unit 604 includes:
Selecting an image to be extracted from the first standard gray-scale image set;
Determining a plurality of undetermined thresholds according to the spliced screen, and respectively determining an intra-threshold area and an extra-threshold area of the image to be extracted under each undetermined threshold;
Generating an intra-threshold ratio, an intra-threshold gray level average value, an extra-threshold ratio and an extra-threshold gray level average value according to the intra-threshold area and the extra-threshold area corresponding to each undetermined threshold;
Generating a weight value according to the intra-threshold area ratio, the intra-threshold gray average value, the extra-threshold area ratio and the extra-threshold gray average value of each undetermined threshold;
determining a target threshold according to the weight value of each undetermined threshold;
Determining a central area of an image to be extracted according to a target threshold;
and calculating a target threshold value and extracting a central region for each image in the first standard gray-scale image set, the first reference gray-scale image set, the second standard gray-scale image set and the second reference gray-scale image set.
A fusion unit 605, configured to perform a center region fusion process on images that have the same gray scale and the same exposure time in the first standard gray scale image set, the first reference gray scale image set, the second standard gray scale image set, and the second reference gray scale image set;
optionally, the fusing unit 605 includes:
determining the center region range and the center region pixel values of each image of one exposure time in the first standard gray-scale image set;
determining an inscribed rectangular region or a fully overlapping region from the center region range of each image, wherein the fully overlapping region lies within the center region range of every image, and the inscribed rectangular region lies within the center region range of at least one image;
performing, according to the center region pixel values of each image, mean fusion processing on the pixels of the inscribed rectangular region or the fully overlapping region through a pixel averaging algorithm to generate a mean fusion image, wherein the mean fusion image replaces the images of that exposure time;
generating inscribed rectangular regions or fully overlapping regions and performing mean fusion processing for the images of the same exposure time in the first standard gray-scale image set and the second standard gray-scale image set;
and generating inscribed rectangular regions or fully overlapping regions and performing mean fusion processing for the images of the same gray scale and the same exposure time in the first reference gray-scale image set and the second reference gray-scale image set.
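A minimal sketch of the fully-overlapping-region branch of the fusion unit, under the assumption that each image's center region can be summarized by an axis-aligned bounding box (top, bottom, left, right); the function name and box representation are illustrative, not taken from the patent.

```python
import numpy as np

def fuse_center_regions(images, bboxes):
    """Mean-fuse same-exposure images over their fully overlapping
    center region. `bboxes` holds one (top, bottom, left, right)
    center-region bound per image; the fully overlapping region is the
    intersection of all bounds, i.e. it lies within every image's
    center region, as the embodiment requires."""
    top = max(b[0] for b in bboxes)
    bottom = min(b[1] for b in bboxes)
    left = max(b[2] for b in bboxes)
    right = min(b[3] for b in bboxes)
    if top >= bottom or left >= right:
        raise ValueError("center regions do not fully overlap")
    # pixel-averaging algorithm over the common region
    stack = np.stack([img[top:bottom, left:right] for img in images])
    return stack.mean(axis=0), (top, bottom, left, right)
```

The returned mean fusion image would then replace the individual images of that exposure time in the corresponding image set.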
A second generating unit 606, configured to perform fitting according to the gray average value and the exposure time of the central area in the first standard gray image set and the first reference gray image set, and generate a first standard fitting polynomial and a first reference fitting polynomial set of the standard camera;
a third generating unit 607, configured to perform fitting according to the gray average value and the exposure time of the center areas in the second standard gray image set and the second reference gray image set, and generate a second standard fitting polynomial and a second reference fitting polynomial set of the camera to be corrected;
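The fitting performed by units 606 and 607 can be sketched with an ordinary least-squares polynomial fit of the center-region gray average against exposure time. The degree (2) and the function names are assumptions; the patent only requires a polynomial with more than one term per camera and per gray scale.

```python
import numpy as np

def fit_response(exposure_times, gray_means, degree=2):
    """Fit the center-region gray average as a polynomial of exposure
    time. Returns coefficients from the highest-order term down,
    matching numpy.polyfit's convention."""
    return np.polyfit(exposure_times, gray_means, degree)

def fit_reference_set(exposure_times, gray_means_per_gray_scale, degree=2):
    """One fitted polynomial per reference gray scale, forming the
    reference fitting polynomial set for one camera."""
    return [np.polyfit(exposure_times, m, degree)
            for m in gray_means_per_gray_scale]
```

Running `fit_response` on the standard gray scale and `fit_reference_set` on the N reference gray scales, once per camera, produces the four objects named in the text: the first/second standard fitting polynomials and the first/second reference fitting polynomial sets.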
A calculating unit 608, configured to calculate a correction coefficient of the camera to be corrected according to polynomial coefficients in the first standard fitting polynomial, the first reference fitting polynomial set, the second standard fitting polynomial, and the second reference fitting polynomial set;
optionally, the calculating unit 608 includes:
performing M term-coefficient comparisons between the first standard fitting polynomial and the first reference fitting polynomial set for each of the N reference gray scales to generate a first comparison coefficient, wherein M is an integer greater than 1;
performing M term-coefficient comparisons between the second standard fitting polynomial and the second reference fitting polynomial set for each of the N reference gray scales to generate a second comparison coefficient;
performing M term-coefficient comparisons between the first standard fitting polynomial and the second standard fitting polynomial to generate a third comparison coefficient;
and generating correction coefficients for the standard gray scale and the N reference gray scales of the camera to be corrected from the first comparison coefficient, the second comparison coefficient and the third comparison coefficient, with the standard camera as the reference.
A correction unit 609 for correcting the initial exposure time of the camera to be corrected according to the correction coefficient.
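One plausible concrete form of the coefficient comparison in unit 608 and the exposure-time correction in unit 609. The ratio-based comparison, the use of only the leading coefficient, and the multiplicative correction are all assumptions for illustration; the patent does not spell out these exact formulas.

```python
import numpy as np

def compare_coefficients(poly_a, poly_b):
    """Term-by-term ratio of corresponding coefficients of two fitted
    polynomials (highest-order term first): one hypothetical reading of
    the M 'term-coefficient comparisons'."""
    return np.asarray(poly_a) / np.asarray(poly_b)

def correction_coefficient(std_poly_reference_cam, std_poly_corrected_cam):
    """Illustrative correction coefficient with the standard camera as
    reference: ratio of the leading (highest-order) term coefficients."""
    return compare_coefficients(std_poly_reference_cam,
                                std_poly_corrected_cam)[0]

def correct_exposure(initial_exposure, k):
    """Scale the initial exposure time of the camera to be corrected by
    the correction coefficient, as performed by the correction unit."""
    return initial_exposure * k
```

Under this reading, a camera whose response polynomial has half the slope of the standard camera's receives a correction coefficient of 2 and thus twice the exposure time.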
Referring to fig. 7, the present application provides a consistency adjusting device for multiple cameras of a spliced screen, comprising:
a processor 701, a memory 702, an input-output unit 703, and a bus 704.
The processor 701 is connected to a memory 702, an input-output unit 703, and a bus 704.
The memory 702 stores a program, and the processor 701 invokes the program to execute the consistency adjustment method of any of figs. 1 to 5.
The present application further provides a computer-readable storage medium having a program stored thereon, which, when executed on a computer, performs the consistency adjustment method of any of figs. 1 to 5.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium capable of storing program code.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202411419942.3A CN118945477B (en) | 2024-10-12 | 2024-10-12 | A method, device and storage medium for adjusting consistency of multiple cameras of a spliced screen |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118945477A CN118945477A (en) | 2024-11-12 |
CN118945477B (en) | 2024-12-27
Family
ID=93363149
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202411419942.3A Active CN118945477B (en) | 2024-10-12 | 2024-10-12 | A method, device and storage medium for adjusting consistency of multiple cameras of a spliced screen |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118945477B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107799084A (en) * | 2017-11-21 | 2018-03-13 | 武汉华星光电半导体显示技术有限公司 | Luminance compensation device and method, and memory |
CN114359055A (en) * | 2022-03-21 | 2022-04-15 | 湖南大学 | Image stitching method and related device for multi-camera shooting screen body |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100589583C (en) * | 2007-06-05 | 2010-02-10 | 广东威创视讯科技股份有限公司 | A multi-screen video wall correction method |
US8558923B2 (en) * | 2010-05-03 | 2013-10-15 | Canon Kabushiki Kaisha | Image capturing apparatus and method for selective real time focus/parameter adjustment |
CN118018715B (en) * | 2024-04-10 | 2024-07-26 | 深圳精智达技术股份有限公司 | Method and related device for adjusting consistency of exposure time of multiple cameras |
2024-10-12: application CN202411419942.3A filed in China; granted as CN118945477B (en), status Active.
Also Published As
Publication number | Publication date |
---|---|
CN118945477A (en) | 2024-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022057670A1 (en) | Real-time focusing method, apparatus and system, and computer-readable storage medium | |
CN108156369B (en) | Image processing method and device | |
US20140099042A1 (en) | Method and system for modifying image quality of an image | |
TW201515433A (en) | Image calibration system and calibration method of a stereo camera | |
CN114302121B (en) | Image correction checking method, device, electronic equipment and storage medium | |
JP2017182038A (en) | Projection system and correction method of projection screen | |
CN111242858B (en) | Distortion correction method and system for camera lens | |
WO2019232793A1 (en) | Two-camera calibration method, electronic device and computer-readable storage medium | |
KR20150109177A (en) | Photographing apparatus, method for controlling the same, and computer-readable recording medium | |
CN114359055A (en) | Image stitching method and related device for multi-camera shooting screen body | |
CN115278103B (en) | Security monitoring image compensation processing method and system based on environment perception | |
US20230276034A1 (en) | Method and system for adjusting projection dithering | |
US9894339B2 (en) | Image processing apparatus, image processing method and program | |
CN107742310B (en) | Method and device for testing included angle of double cameras and storage device | |
CN118018715B (en) | Method and related device for adjusting consistency of exposure time of multiple cameras | |
CN115609915A (en) | Light intensity adjusting method and device of 3D printer | |
CN106603929A (en) | Screen fill-in light camera method and system based on mobile terminal | |
CN116337412A (en) | Screen detection method, device and storage medium | |
CN118945477B (en) | A method, device and storage medium for adjusting consistency of multiple cameras of a spliced screen | |
TW202145776A (en) | Projection system and adaptive adjustment method thereof | |
CN117097872A (en) | Automatic trapezoid correction system and method for projection equipment | |
CN113364935A (en) | Camera lens shadow compensation method, device and equipment and camera equipment | |
CN114071109B (en) | Method for improving white balance instability and camera module | |
US20040114198A1 (en) | Image processing system and method | |
CN114359397A (en) | Image optimization method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||