
CN118762626B - Screen brightness uniformity detection method and detection equipment - Google Patents


Info

Publication number
CN118762626B
CN118762626B
Authority
CN
China
Prior art keywords
screen
image
brightness
gray
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411245380.5A
Other languages
Chinese (zh)
Other versions
CN118762626A (en)
Inventor
侯增福
李想
赵天宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202411245380.5A
Publication of CN118762626A
Application granted
Publication of CN118762626B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006 Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30121 CRT, LCD or plasma display
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/02 Improving the quality of display appearance
    • G09G2320/0233 Improving the luminance or brightness uniformity across the screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Computer Hardware Design (AREA)
  • Image Processing (AREA)

Abstract


An embodiment of the present application provides a method and a device for detecting the brightness uniformity of a screen. In the method, after determining the grayscale image of the screen to be tested, the detection device can determine the screen area image to be tested through the grayscale image of the screen to be tested. The screen area image to be tested can be a partial screen grayscale image to be tested or the entire screen grayscale image to be tested. The detection device determines the brightness uniformity of the screen area image to be tested. Then, the brightness uniformity is compared with a uniformity detection threshold to determine the brightness uniformity of the screen area corresponding to the screen area image to be tested. The implementation of the technical solution provided by the present application can make the brightness uniformity detection of the screen more objective and reduce the dependence on detection personnel.

Description

Screen brightness uniformity detection method and detection equipment
Technical Field
The application relates to the field of terminals, in particular to a method and equipment for detecting brightness uniformity of a screen.
Background
The brightness uniformity of a screen in an electronic device (e.g., a cell phone or tablet) indicates whether the brightness across the screen is uniform when the screen displays an image of a single color (e.g., a full white image), i.e., whether the white represented by the pixels in each region of the screen looks the same. If the brightness of the screen is uniform, the pixels in each region of the screen appear equally bright, and a more stable visual appearance can be provided. If the brightness of the screen is not uniform, the brightness of the pixels in different areas of the screen differs, which may cause poor visual performance.
Therefore, it is worth discussing how to detect the brightness uniformity of a screen and determine whether its brightness is uniform.
Disclosure of Invention
The application provides a method and equipment for detecting brightness uniformity of a screen, which are used for optimizing the process of detecting the brightness uniformity of the screen.
In a first aspect, the application provides a method for detecting the luminance uniformity of a screen, applied to a detection device used for detecting the luminance uniformity of a first screen. The method includes: acquiring a screen gray image of the first screen, the screen gray image of the first screen being obtained by shooting the first screen while it displays a first image; determining a luminance maximum value and a luminance minimum value corresponding to a first screen area image based on at least two sampling images in the first screen area image, where the first screen area image includes part or all of the screen gray image of the first screen, the luminance maximum value corresponding to the first screen area image indicates the maximum of the luminances of the at least two sampling images in the first screen area image, and the luminance minimum value corresponding to the first screen area image indicates the minimum of the luminances of the at least two sampling images in the first screen area image; determining the luminance uniformity of the first screen area image based on the difference between the luminance maximum value corresponding to the first screen area image and the luminance minimum value corresponding to the first screen area image; and determining the luminance uniformity of the screen area corresponding to the first screen area image based on the luminance uniformity of the first screen area image and a uniformity detection threshold. The uniformity detection threshold indicates the luminance uniformity of a second screen area image, which is determined from a sample screen while the sample screen displays the first image. When the first screen area image includes part of the screen gray image of the first screen, the position of the second screen area image in the screen gray image of the sample screen is the same as the position of the first screen area image in the screen gray image of the first screen; when the first screen area image includes all of the screen gray image of the first screen, the second screen area image is the screen gray image of the sample screen.
In the above-described embodiments, the first screen can be regarded as the screen to be tested referred to in the embodiments described below. When the first screen area image includes part of the screen gray image of the first screen, local luminance uniformity detection is performed for the first screen. When the first screen area image includes all of the screen gray image of the first screen, global luminance uniformity detection is performed for the first screen. The luminance uniformity of the first screen area image is compared with a uniformity detection threshold to determine the luminance uniformity of the screen area corresponding to the first screen area image. In this way, the brightness uniformity detection of the screen can be made more objective, and the dependence on detection personnel is reduced.
In combination with the first aspect, in some embodiments, the method includes: obtaining screen brightness data of the first screen while the first screen displays the first image, where the screen brightness data is obtained by shooting the first screen with a camera while the first image is displayed and includes the illumination intensity information collected by each pixel unit in the camera during shooting; converting the illumination intensity information in the screen brightness data into a gray image; and determining the image in the area where the first screen is located in the gray image as the screen gray image of the first screen.
In the above embodiment, an interference picture may exist in the gray image, so that the gray image contains brightness from the interference picture in addition to the brightness of the screen to be tested. Determining the image in the area where the first screen is located in the gray image as the screen gray image of the first screen can eliminate this interference. For a description of this part, refer to fig. 3 below.
In combination with the first aspect, in some embodiments, determining the luminance uniformity of the first screen area image based on the difference between the luminance maximum value corresponding to the first screen area image and the luminance minimum value corresponding to the first screen area image specifically includes: dividing the luminance minimum value corresponding to the first screen area image by the luminance maximum value corresponding to the first screen area image to obtain the luminance uniformity of the first screen area image. Determining the luminance uniformity of the screen area corresponding to the first screen area image based on the luminance uniformity of the first screen area image and a uniformity detection threshold specifically includes: determining that the luminance of the screen area corresponding to the first screen area image is uniform when the luminance uniformity of the first screen area image is greater than the uniformity detection threshold, and determining that the luminance of the screen area corresponding to the first screen area image is non-uniform when the luminance uniformity of the first screen area image is less than the uniformity detection threshold.
With reference to the first aspect, in some embodiments, the method further includes: taking the screen gray image of the first screen as the first screen area image, or dividing the screen gray image of the first screen into M rows by N columns of screen area images, each of the M rows by N columns of screen area images being individually taken as the first screen area image, where M and N are integers greater than 1.
In the above-described embodiment, the M rows by N columns of screen area images can be regarded as the M rows by N columns of sub-screen gray images involved in the following implementation. In order to make the detection result more reliable, the M rows by N columns of screen area images need to be of equal size, i.e., the screen gray image of the first screen is divided uniformly into M rows by N columns of screen area images.
In combination with the first aspect, in some embodiments, in the case that each of the M rows by N columns of screen area images is individually the first screen area image, the method further includes: determining that the brightness of the screen is uniform when the brightness of the M rows by N columns of screen areas corresponding to the M rows by N columns of screen area images is uniform; determining that the brightness of the screen is non-uniform when a first screen area image with non-uniform brightness exists among the M rows by N columns of screen area images; and determining that the brightness of the screen area corresponding to the first screen area image with non-uniform brightness is non-uniform.
In combination with the first aspect, in some embodiments, in the case that the screen gray image of the first screen is taken as the first screen area image, before determining the luminance uniformity of the screen area corresponding to the first screen area image, the method further includes: dividing the screen gray image of the sample screen uniformly into at least two screen area images; and determining the uniformity detection threshold from the average value of the luminance uniformities of the at least two screen area images, the uniformity detection threshold being used to indicate the luminance uniformity of the screen gray image of the sample screen. The luminance uniformity of a single screen area image among the at least two screen area images is determined by the luminance maximum value and the luminance minimum value corresponding to that single screen area image; the luminance maximum value corresponding to the single screen area image indicates the maximum of the luminances of at least two sampling images in the single screen area image, and the luminance minimum value corresponding to the single screen area image indicates the minimum of the luminances of at least two sampling images in the single screen area image.
In the above-described embodiment, when the screen gray image of the first screen is taken as the first screen area image, the uniformity detection threshold is the uniformity detection threshold b referred to in the following embodiments. The process of determining the uniformity detection threshold from the average value of the luminance uniformities of the at least two screen area images may refer to the following formula (6); determining it in this way can make the uniformity detection threshold more representative.
In combination with the first aspect, in some embodiments, in the case that the screen gray image of the first screen is taken as the first screen area image, before determining the luminance uniformity of the screen area corresponding to the first screen area image, the method further includes: determining at least two sampling images in the screen gray image of the sample screen; and determining the uniformity detection threshold based on the ratio of the luminance minimum value corresponding to the screen gray image of the sample screen to the luminance maximum value, the uniformity detection threshold being used to indicate the luminance uniformity of the screen gray image of the sample screen. The luminance minimum value corresponding to the screen gray image of the sample screen indicates the minimum of the luminances of the at least two sampling images in the screen gray image of the sample screen, and the luminance maximum value corresponding to the screen gray image of the sample screen indicates the maximum of the luminances of the at least two sampling images in the screen gray image of the sample screen.
In the above-described embodiment, when the screen gray-scale image of the first screen is taken as the first screen region image, the uniformity detection threshold value is the uniformity detection threshold value b referred to in the following embodiment.
In combination with the first aspect, before determining the luminance uniformity of the screen area corresponding to the first screen area image, the method further includes: determining at least two sampling images in the second screen area image; and determining the uniformity detection threshold based on the ratio of the luminance minimum value corresponding to the second screen area image to the luminance maximum value, the uniformity detection threshold being used to indicate the luminance uniformity of the second screen area image. The luminance minimum value corresponding to the second screen area image indicates the minimum of the luminances of the at least two sampling images in the second screen area image, and the luminance maximum value corresponding to the second screen area image indicates the maximum of the luminances of the at least two sampling images in the second screen area image.
In the above-described embodiment, in the case where each of M rows by N columns of screen area images is the first screen area image alone, the uniformity detection threshold value may be the uniformity detection threshold value a referred to below.
With reference to the first aspect, in some embodiments, the shape of the sampled image is a circular image.
In the above embodiment, in a circular sampling image, the pixel points in every direction are arranged uniformly around the center of the sampling image, which is beneficial for determining more accurate brightness uniformity.
In combination with the first aspect, in some embodiments, the arrangement of the sampling images in the screen gray image of the first screen is the same as the arrangement of the sampling images in the screen gray image of the sample screen. The screen gray image of the first screen includes T rows by H columns of sampling images. On the same row, the first-type sampling images are aligned with each other and the non-first-type sampling images are aligned with each other; a first-type sampling image belongs to a first-type screen area image, and the first-type screen area image indicates a vertex angle (corner) of the first screen. The non-first-type sampling images include second-type sampling images and non-second-type sampling images; a second-type sampling image belongs to a second-type screen area image, and the second-type screen area image indicates a non-display area of the first screen. On the same column, the first-type sampling images are aligned with each other, the second-type sampling images are aligned with each other, and the non-second-type sampling images are aligned with each other. The screen gray image of the first screen is divided into M rows by N columns of screen area images, where T is an integer greater than 2M and H is an integer greater than 2N.
The sampling manner in the above embodiment may refer to (2) in fig. 4 below; during sampling, the four corners and the non-display area (i.e., the bang area) of the screen to be tested can be avoided.
In a second aspect, an embodiment of the application provides a detection device, comprising one or more processors and a memory coupled to the one or more processors. The memory is used for storing computer program code, the computer program code comprises computer instructions, and the one or more processors invoke the computer instructions to cause the detection device to perform the method as implemented in the first aspect.
In a third aspect, embodiments of the present application provide a computer readable storage medium comprising instructions which, when run on a detection device, cause the detection device to perform a method as implemented in the first aspect.
In a fourth aspect, embodiments of the present application provide a system on a chip for use in a detection device, the system on a chip comprising one or more processors for invoking computer instructions to cause the detection device to perform the method as implemented in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a detection device, cause the detection device to perform the method as implemented in the first aspect.
It will be appreciated that the detection device provided in the second aspect, the computer storage medium provided in the third aspect, the chip system provided in the fourth aspect and the computer program product provided in the fifth aspect are all adapted to perform the method provided by the embodiments of the present application. Therefore, the advantages they can achieve may refer to the advantages of the corresponding method and are not described again here.
Drawings
FIG. 1 illustrates an exemplary scene graph involved in acquiring a screen gray image;
FIG. 2 illustrates an exemplary flow chart involved in implementing a method of detecting luminance uniformity of a screen;
FIG. 3 shows a schematic diagram involved in determining a screen gray image to be tested;
FIG. 4 shows a schematic diagram involved in dividing and sampling a gray-scale image of a screen to be tested;
FIG. 5 shows a schematic diagram involved in setting sampling parameters by a detection device;
FIG. 6 shows a schematic diagram of a sample screen gray scale image;
FIG. 7A shows a schematic diagram of detecting the sampling result;
Fig. 7B shows a schematic diagram of simulation of the degree of coincidence obtained when the sampling result is detected;
FIG. 8 shows a schematic diagram involved in the presence of anomalies in the brightness of a sampled image;
FIG. 9 illustrates an exemplary flow chart for obtaining a sample database;
Fig. 10 is a schematic structural diagram of a detection device according to an embodiment of the present application.
Detailed Description
In one method for detecting the luminance uniformity of a screen, an inspector selects a sample screen as a reference and compares the picture shown when an image is displayed on the screen to be tested (picture 1) with the picture shown when the same image is displayed on the sample screen (picture 2). If a region whose luminance expression differs from picture 2 exists in picture 1, the luminance of the screen region corresponding to that region in picture 1 is determined to be non-uniform. If the brightness of picture 1 is determined to be the same as that of picture 2, the brightness of the screen to be tested is determined to be uniform.
The picture of the screen to be tested when it displays an image can also be called the screen gray image of the screen to be tested. The picture of the sample screen when it displays an image can also be called the screen gray image of the sample screen. The screen gray image of a screen is obtained by photographing the screen while it displays the image, and the brightness uniformity behavior of a screen when displaying an image can be described by its screen gray image.
It should be noted that the sample screen may be a screen of the same type as the screen to be tested. Through professional screening, a screen with good working quality and good brightness performance can be selected as the sample screen. The screen to be tested and the sample screen need to display the same image. In general, the pixel values of the pixels in that image are all the same, so that the expected brightness produced by the pixels of the screen during display is the same and the comparison result is accurate.
Hereinafter, for convenience of description, the screen gray image of the screen to be measured is simply referred to as the screen gray image to be measured, and the screen gray image of the sample screen is simply referred to as the sample screen gray image.
In one possible implementation, a special camera (e.g., CX-50C camera) may be used to capture a screen under test when the image is displayed, resulting in a screen gray image under test, and a sample screen when the image is displayed, resulting in a sample screen gray image. Here, taking the acquisition of the gray-scale image of the screen to be tested as an example for illustration, the process of acquiring the gray-scale image of the sample screen may refer to the related description, and will not be repeated.
The related content of the gray image of the screen to be measured can be obtained by referring to fig. 1.
As shown in fig. 1, the screen to be tested is placed in the device to be tested, and the detection device is connected with the camera. When the screen to be tested displays images, the screen to be tested is shot through a camera, and screen brightness data of the screen to be tested are captured. Then, the detection device receives the screen brightness data transmitted by the camera. The screen brightness data comprises illumination intensity information collected by each pixel unit in the camera. The unit of illumination intensity is candela per square meter (cd/m²).
Subsequently, the detection device converts the illumination intensity information in the screen luminance data into a grayscale image (refer to the grayscale image 101 shown in fig. 1). And then determining the gray image of the screen to be tested by the image in the area of the screen to be tested in the gray image.
Each pixel point in the gray image corresponds one-to-one to a pixel unit in the camera, and the gray value of a pixel point in the gray image is obtained by converting the illumination intensity information of the pixel unit corresponding to that pixel point. Gray values are typically between 0 and 255, where 0 represents black, 255 represents white, and the remaining values represent different degrees of gray. The smaller the gray value of a pixel point, the darker the pixel point appears when displayed, which also means that the brightness of the screen area corresponding to that pixel point is darker. The screen area corresponding to one pixel point is extremely small and its brightness is difficult to observe with the human eye, but it can be resolved by the camera used.
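To illustrate this conversion step, the following minimal Python sketch (using NumPy) maps per-pixel illumination intensities to 8-bit gray values. The text does not specify the exact mapping used by the detection device, so the min-max normalization below, as well as the function name and the demo values, are assumptions for illustration only.

```python
import numpy as np

def luminance_to_gray(luminance: np.ndarray) -> np.ndarray:
    """Map per-pixel illumination intensity (cd/m^2) to an 8-bit gray image.

    The text only states that each pixel unit's intensity is converted to a
    gray value in [0, 255]; the min-max normalization here is an assumption
    used for illustration, not the mapping prescribed by the source.
    """
    lum = luminance.astype(np.float64)
    span = lum.max() - lum.min()
    if span == 0:                      # completely flat capture
        return np.zeros(lum.shape, dtype=np.uint8)
    gray = (lum - lum.min()) / span * 255.0
    return np.round(gray).astype(np.uint8)

# Example: a synthetic 4 x 4 luminance map from the camera (hypothetical values).
demo = np.array([[0.5, 0.6, 0.7, 0.8],
                 [0.5, 0.6, 0.7, 0.8],
                 [0.5, 0.6, 0.7, 0.8],
                 [0.5, 0.6, 0.7, 0.8]])
print(luminance_to_gray(demo))
```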
Here, it should be noted that the connection between the detection device and the camera shown in fig. 1 is optional. In practice, the camera may also be provided inside the detection device as the camera of the detection device, as long as the detection device can acquire the data collected by the camera. The detection device shown in fig. 1 may be connected to the camera through a data line, a network connection or another connection method, which is not limited in the embodiments of the present application.
It is also noted that in case the camera is a CX-50C camera, the detection device may be referred to as CX-50C device.
In the above method, whether the brightness uniformity of the screen to be tested is good depends on the observation of an inspector, which is often subjective rather than objective.
In addition, when the brightness of the images displayed on the screen to be tested and the sample screen is low (for example, near-black or pure black images), it is also difficult for an inspector to observe areas with different brightness expression in the screen gray image to be tested and the sample screen gray image.
The scenario in which the brightness of the images displayed on the screen to be tested and the sample screen is very low can be called a low-brightness, low-gray-scale detection scenario.
In order to solve the problems involved in the foregoing methods, another method for detecting brightness uniformity of a screen is provided, which can make the detection of brightness uniformity of the screen more objective and reduce the dependence on detection personnel. In the method, after determining a screen gray image of a screen to be measured (screen gray image to be measured), the detection device may determine a screen region image to be measured through the screen gray image to be measured, and the screen region image to be measured may be a part of the screen gray image to be measured or all of the screen gray image to be measured. The detection device determines the brightness uniformity of the screen area image to be detected. Then, the luminance uniformity is compared with a uniformity detection threshold to determine the luminance uniformity of a screen region to be measured (simply referred to as a screen region) corresponding to the screen region image to be measured.
The process for determining the brightness uniformity of the screen area image to be tested includes: determining a brightness maximum value and a brightness minimum value corresponding to the screen area image to be tested based on at least two sampling images in the screen area image to be tested; and determining the difference between the brightness maximum value and the brightness minimum value, taking the difference itself as the brightness uniformity of the screen area image to be tested, or rounding the difference up or down and taking the result as the brightness uniformity of the screen area image to be tested. The brightness maximum value corresponding to the screen area image to be tested is the maximum of the brightnesses of the at least two sampling images in the screen area image to be tested, and the brightness minimum value corresponding to the screen area image to be tested is the minimum of those brightnesses. One sampling image in the screen area image to be tested refers to a local image sampled from the screen area image to be tested and is used to represent the local brightness in the screen area image to be tested.
In some possible implementations (implementation 1), the difference between the brightness maximum value and the brightness minimum value may be expressed as the ratio of the brightness minimum value divided by the brightness maximum value. The larger the ratio is, the more uniform the brightness of the screen area corresponding to the screen area image to be tested.
In implementation 1, the process of comparing the brightness uniformity with the uniformity detection threshold to determine the brightness uniformity of the screen area corresponding to the screen area image to be tested may include: determining that the brightness of the screen area corresponding to the screen area image to be tested is uniform if the brightness uniformity of the screen area image to be tested is greater than the uniformity detection threshold, and determining that the brightness of the screen area corresponding to the screen area image to be tested is non-uniform if the brightness uniformity of the screen area image to be tested is less than or equal to the uniformity detection threshold.
The difference between the brightness maximum value and the brightness minimum value is not limited to the ratio of the brightness minimum value divided by the brightness maximum value; it has other representations as well. For example, in other implementations (implementation 2), the difference between the brightness maximum value and the brightness minimum value may also be expressed as the brightness maximum value minus the brightness minimum value. The smaller this difference value is, the more uniform the brightness of the screen area corresponding to the screen area image to be tested.
In implementation 2, comparing the brightness uniformity with the uniformity detection threshold to determine the brightness uniformity of the screen area corresponding to the screen area image to be tested may include: determining that the brightness of the screen area corresponding to the screen area image to be tested is uniform if the brightness uniformity of the screen area image to be tested is less than the uniformity detection threshold, and determining that the brightness of the screen area corresponding to the screen area image to be tested is non-uniform if the brightness uniformity of the screen area image to be tested is greater than or equal to the uniformity detection threshold.
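The following minimal Python sketch contrasts the two representations of the difference; the function names, brightness values and thresholds are hypothetical, and only the comparison directions follow the text (implementation 1: a larger ratio means more uniform; implementation 2: a smaller difference means more uniform).

```python
def is_uniform_ratio(l_min: float, l_max: float, threshold: float) -> bool:
    """Implementation 1: uniformity = l_min / l_max; larger means more uniform."""
    return (l_min / l_max) > threshold

def is_uniform_delta(l_min: float, l_max: float, threshold: float) -> bool:
    """Implementation 2: uniformity = l_max - l_min; smaller means more uniform."""
    return (l_max - l_min) < threshold

# Hypothetical brightness extremes of the sampling images in one screen area image.
print(is_uniform_ratio(118.0, 125.0, threshold=0.93))   # True: 118/125 = 0.944 > 0.93
print(is_uniform_delta(118.0, 125.0, threshold=10.0))   # True: 125 - 118 = 7 < 10
```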
Wherein the uniformity detection threshold is used to indicate brightness uniformity of the sample screen area image. The uniformity detection threshold is determined from a sample screen at the time of displaying the image. When the screen region image to be measured comprises a part of the screen gray image to be measured, the position of the sample screen region image in the sample screen gray image is the same as the position of the screen region image to be measured in the screen gray image to be measured. When the screen area image to be measured comprises all the screen gray images to be measured, namely, when the screen area image to be measured is the screen gray image to be measured, the sample screen area image is the sample screen gray image.
In some possible implementations, the uniformity detection threshold may be determined in a manner similar to the manner in which the brightness uniformity of the screen area image to be tested is determined. For example, the brightness uniformity of the sample screen area image is determined by the difference between the brightness maximum value and the brightness minimum value corresponding to the sample screen area image, and the uniformity detection threshold is determined based on the brightness uniformity of the sample screen area image. The brightness maximum value corresponding to the sample screen area image is the maximum of the brightnesses of at least two sampling images in the sample screen area image, and the brightness minimum value corresponding to the sample screen area image is the minimum of those brightnesses. One sampling image in the sample screen area image refers to a local image sampled from the sample screen area image and is used to represent the local brightness in the sample screen area image.
Regarding the brightness uniformity of the screen to be tested, it should be noted that when the screen area image to be tested is the screen gray image to be tested, the screen area corresponding to the screen area image to be tested is the whole screen to be tested. If the brightness of the screen area corresponding to the screen area image to be tested is uniform, this indicates that the brightness of the screen to be tested is uniform, and the uniformity at this time is the brightness uniformity of the whole screen to be tested. If the brightness of the screen area corresponding to the screen area image to be tested is non-uniform, this indicates that the brightness of the screen to be tested is non-uniform, and the non-uniformity at this time is the brightness non-uniformity of the whole screen to be tested.
When the screen area image to be tested includes part of the screen gray image to be tested, if the brightness of the screen area corresponding to the screen area image to be tested is non-uniform, this can also indicate that the brightness of the screen to be tested is non-uniform. The detection device may divide the screen gray image to be tested into M rows by N columns of sub-screen gray images, each of which may be individually regarded as the aforementioned screen area image to be tested. If the brightness of the screen areas corresponding to all M rows by N columns of sub-screen gray images is uniform, the brightness of the screen to be tested can be determined to be uniform. If a sub-screen gray image with non-uniform brightness exists among the M rows by N columns of sub-screen gray images, the brightness of the screen to be tested is determined to be non-uniform, and the brightness of the screen area corresponding to the sub-screen gray image with non-uniform brightness expression is non-uniform. M and N are integers greater than 1; they may be the same or different, and the embodiments of the present application do not limit this.
Based on the foregoing, with the other brightness uniformity detection method, the detection device can take the gray image of the whole screen to be tested as the screen area image to be tested and perform global brightness uniformity detection on the screen to be tested. It can also divide the screen gray image to be tested into M rows by N columns of sub-screen gray images, take each sub-screen gray image individually as the screen area image to be tested, and perform local brightness uniformity detection on the screen to be tested. For the relevant contents involved in implementing this brightness uniformity detection method, refer to the following description of step S101 to step S107 in fig. 2.
S101, the detection device acquires screen brightness data of a screen to be tested.
The detection equipment can obtain screen brightness data of a screen to be detected by shooting the screen to be detected when displaying the image through the camera. The screen brightness data comprises illumination intensity information collected by each pixel unit in the camera. For other descriptions of the screen brightness data, reference may be made to the foregoing related contents, and no further description is given here.
S102, the detection device obtains a gray image based on the screen brightness data.
The detection equipment converts illumination intensity information of each pixel unit in the screen brightness data into a gray value of each pixel point to obtain a gray image. Each pixel point in the gray scale image and each pixel unit in the camera can be in one-to-one correspondence. For the description of the gray value, reference may be made to the foregoing related content, and no further description is given here.
S103, the detection device determines a screen gray image to be detected based on the gray image.
The detection device identifies an image in the area where the screen to be detected is located in the gray level image, and the image is used as the gray level image of the screen to be detected.
As shown in (1) of fig. 3, an interference picture may exist in the gray image, so that the gray image contains brightness from the interference picture in addition to the brightness of the screen to be tested. Causes of the interference picture include, but are not limited to, interference signals or luminous impurities around the screen to be tested when the camera photographs the screen while it displays the image.
As shown in (2) of fig. 3, within the white dotted line, the screen gray image to be tested is the image in the area where the screen to be tested is located, which can be understood as the smallest rectangular area containing the screen to be tested. Detecting the brightness uniformity of the screen through the screen gray image to be tested can therefore eliminate the influence of the interference picture on the brightness uniformity detection of the screen.
The manner of determining the gray image of the screen to be measured includes, but is not limited to, the following two determination manners.
Manner 1: the gray image is binarized to determine target pixel points and non-target pixel points. A target pixel point is obtained by binarizing a pixel point whose gray value is greater than a preset gray value in the gray image, which means that the target pixel point is not a black pixel point and has brightness. A non-target pixel point is obtained by binarizing a pixel point whose gray value is smaller than the preset gray value in the gray image, which means that the non-target pixel point can be regarded as a black pixel point.
Then, the detection device removes the non-target regions in the gray image. A non-target region consists of connected target pixel points whose number is less than a preset number (e.g., 100). After that, the area where the minimum circumscribed rectangle is located is identified as the area where the screen to be tested is located, and the image in the area where the minimum circumscribed rectangle is located is taken as the screen gray image to be tested. The minimum circumscribed rectangle includes all target pixel points in the gray image that have not been removed.
In some possible cases, the detection device may binarize the gray image using the Otsu method.
Manner 2: the detection device can identify the largest area formed by connected target pixel points in the gray image as the area where the screen to be tested is located, and take the image in that largest area as the screen gray image to be tested. Connectivity here indicates that no non-target pixel points lie between adjacent target pixel points; the non-target pixel points are the black pixel points in the gray image.
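A minimal Python sketch of Manner 1 (using OpenCV and NumPy) might look like the following. The Otsu binarization, the removal of connected bright regions smaller than a preset number of target pixel points (100 here, following the example above) and the minimal circumscribed rectangle follow the description, but the function itself, its name and the 8-connectivity setting are illustrative assumptions rather than the detection device's actual implementation.

```python
import cv2
import numpy as np

def extract_screen_region(gray: np.ndarray, min_area: int = 100) -> np.ndarray:
    """Manner 1 sketch: Otsu binarization, removal of small bright regions,
    then the minimal circumscribed rectangle of the remaining target pixels.
    `gray` is expected to be an 8-bit single-channel gray image."""
    # Otsu chooses the binarization threshold automatically.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Remove connected bright regions with fewer than `min_area` target pixels.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    cleaned = np.zeros_like(binary)
    for label in range(1, num):                       # label 0 is the background
        if stats[label, cv2.CC_STAT_AREA] >= min_area:
            cleaned[labels == label] = 255

    # Minimal circumscribed rectangle of all remaining target pixel points.
    ys, xs = np.nonzero(cleaned)
    if ys.size == 0:
        raise ValueError("no screen-like region found in the gray image")
    top, bottom, left, right = ys.min(), ys.max(), xs.min(), xs.max()
    return gray[top:bottom + 1, left:right + 1]       # screen gray image to be tested
```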
For the screen gray image to be measured, the operations shown in the following steps S104 to S107 are performed to realize the luminance uniformity detection of the screen.
S104, uniformly dividing the gray level image of the screen to be detected into a plurality of sub-screen gray level images by the detection equipment.
This step S104 is optional.
The detection device divides the screen gray image to be detected into M rows by N columns of sub-screen gray images. Wherein M and N are integers greater than 1.
As shown in (1) of fig. 4, the screen gray image to be tested is divided into 3 rows by 5 columns of sub-screen gray images. The 15 sub-screen gray images are numbered sequentially as (1, 1), (1, 2), ..., (3, 4), (3, 5), where the sub-screen gray image in the kth row and the uth column is denoted (k, u).
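As a minimal Python sketch (using NumPy) of this division step, the screen gray image can be split into M rows by N columns of sub-screen gray images keyed by their (row, column) numbers; the function name and the example image size are illustrative.

```python
import numpy as np

def split_into_subimages(screen_gray: np.ndarray, m: int, n: int) -> dict:
    """Uniformly divide the screen gray image into m rows x n columns of
    sub-screen gray images, keyed by their 1-based (row, column) numbers."""
    rows, cols = screen_gray.shape
    row_edges = np.linspace(0, rows, m + 1, dtype=int)
    col_edges = np.linspace(0, cols, n + 1, dtype=int)
    return {(k + 1, u + 1): screen_gray[row_edges[k]:row_edges[k + 1],
                                        col_edges[u]:col_edges[u + 1]]
            for k in range(m) for u in range(n)}

# Example: a 300 x 500 gray image divided into 3 x 5 sub-screen gray images.
tiles = split_into_subimages(np.zeros((300, 500), dtype=np.uint8), 3, 5)
print(len(tiles), tiles[(3, 5)].shape)   # 15 (100, 100)
```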
S105, the detection device determines a plurality of uniformly arranged sampling images in the screen gray image to be tested, and the sampling images are uniformly distributed among the sub-screen gray images.
In step S105, the uniform distribution of the sampling images among the sub-screen gray images applies when step S104 is performed.
The detection device determines a plurality of uniformly arranged sampling images in the gray level image of the screen to be detected, for example, T rows multiplied by H columns of sampling images, wherein T is an integer greater than 2M, and H is an integer greater than 2N.
In the case where step S104 is performed, it can also be described that the detection device determines at least two sample images, for example, R rows by C columns of sample images, R being T/M and C being H/N, respectively, in each sub-screen gray scale image.
As shown in (2) of fig. 4, the detection device determines 15 lines by 20 columns of sampling images in the screen gray-scale image to be measured. Each sub-screen gray scale image comprises 5 lines and 4 columns of sampling images.
In determining the sampling images, the detection device needs to avoid the non-display area of the screen to be tested and the areas containing the top corners of the screen in the screen gray image to be tested, while the T rows by H columns of sampling images still remain uniform as a whole. The T rows by H columns of sampling images conform to a preset arrangement rule: as shown in (2) in fig. 4, among the T rows by H columns of sampling images, the first-type sampling images on the same row are aligned with each other and the non-first-type sampling images on the same row are aligned with each other. The non-first-type sampling images include second-type sampling images and non-second-type sampling images; the first-type sampling images on the same column are aligned with each other, the second-type sampling images on the same column are aligned with each other, and the non-second-type sampling images on the same column are aligned with each other. A first-type sampling image belongs to a first-type screen area image in which a top corner of the screen to be tested is depicted, for example the sampling images in the sub-screen gray images (1, 1), (1, 5), (3, 1) and (3, 5). A second-type sampling image belongs to a second-type screen area image in which the non-display area (which may also be referred to as the bang area) of the screen to be tested is depicted, for example the sampling image in the sub-screen gray image (2, 5).
In some possible implementations, the detection device includes a configuration file for performing the luminance uniformity detection. The configuration file may be used to determine sampling parameters for determining a sampling image in a screen gray scale image to be measured. The sampling parameters are used for determining sampling positions in the gray level image of the screen to be detected so as to obtain a sampling image.
The detection device may display a sampling setting interface based on the configuration file. (1) in fig. 5 shows an exemplary sampling setting interface, which may include the sampling parameters; through the sampling setting interface, the values of the sampling parameters can be set. In some possible cases, the sampling parameters include one or more of the following: the left side margin (val_left), the upper side margin (val_top), the right side margin (val_right), the lower side margin (val_bottom), the number of division rows (val_seg_row), the number of division columns (val_seg_col), the number of sampling rows (val_sample_row), the number of sampling columns (val_sample_col), the sampling diameter (val_diameter), the non-display area width (val_bangs), and the non-display area position (str_bangs).
Here, the left side margin represents the distance (in mm) from the left boundary of the screen to be tested of the sampling image closest to the top left corner of the screen to be tested (e.g., sampling image 501) and of the sampling image closest to the bottom left corner (e.g., sampling image 502). The upper side margin represents the distance (in mm) from the upper boundary of the screen to be tested of the sampling image closest to the top left corner (e.g., sampling image 501) and of the sampling image closest to the top right corner (e.g., sampling image 503). The right side margin represents the distance (in mm) from the right boundary of the screen to be tested of the sampling image closest to the top right corner (e.g., sampling image 503) and of the sampling image closest to the bottom right corner (e.g., sampling image 504). The lower side margin represents the distance (in mm) from the lower boundary of the screen to be tested of the sampling image closest to the bottom left corner (e.g., sampling image 502) and of the sampling image closest to the bottom right corner (e.g., sampling image 504). The number of division rows M and the number of division columns N respectively indicate that the screen gray image to be tested is divided into M rows by N columns of sub-screen gray images. The number of sampling rows R and the number of sampling columns C respectively indicate that R rows by C columns of sampling images are determined in one sub-screen gray image. The sampling diameter represents the diameter of the sampling image when it is circular. The non-display area width and the non-display area position are used for determining the image where the non-display area is located in the screen gray image to be tested.
As shown in (1) in fig. 5 and (2) in fig. 5, in response to an operation for the confirmation control, the detection device may determine a sampling image in the screen gray image to be detected according to a preset arrangement rule.
In some possible cases, in addition to the sampling parameters mentioned above, other parameters for detecting brightness uniformity may be included in the configuration file. For example, the uniformity detection threshold (val_threshod) involved in performing the overall luminance uniformity detection may also be included.
The foregoing reference to the sampling parameters is illustrative, and in fact, the sampling parameters may have other manifestations, which may be used to determine the sampling image in the gray-scale image of the screen to be tested, and should not be construed as limiting the embodiments of the present application.
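The following minimal Python sketch (using NumPy) derives evenly spaced centers for T rows by H columns of circular sampling images from the side margins and the image size. It is an illustration only: the margins are taken directly in pixels rather than converted from mm, and the corner and non-display-area (bang area) avoidance described above is omitted.

```python
import numpy as np

def sample_centers(rows: int, cols: int, t: int, h: int,
                   top: int, bottom: int, left: int, right: int) -> np.ndarray:
    """Centers of t rows x h columns of evenly spaced sampling images inside a
    screen gray image of rows x cols pixels, given the four side margins."""
    ys = np.linspace(top, rows - 1 - bottom, t)       # row coordinates of centers
    xs = np.linspace(left, cols - 1 - right, h)       # column coordinates of centers
    centers = [(int(round(y)), int(round(x))) for y in ys for x in xs]
    return np.array(centers, dtype=int)

# Example: a 3000 x 2000 pixel screen gray image, 15 x 20 sampling images,
# 40-pixel margins on every side (all values hypothetical).
print(sample_centers(3000, 2000, 15, 20, 40, 40, 40, 40).shape)   # (300, 2)
```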
S106, determining whether the brightness of the screen area corresponding to each sub-screen gray level image is uniform or not based on the sampling images included in each sub-screen gray level image, wherein the brightness uniformity of the screen area corresponding to one sub-screen gray level image indicates that the difference between the darkest sampling image and the brightest sampling image in the sub-screen gray level image is smaller than a uniformity detection threshold value a.
Step S106 is used for detecting local brightness uniformity of the screen to be detected.
The difference between the darkest sampling image and the brightest sampling image in a sub-screen gray image can be understood as the difference between the brightness maximum value and the brightness minimum value corresponding to the sub-screen gray image, and it is used for determining the brightness uniformity of the sub-screen gray image. A sub-screen gray image can be understood as the screen area image to be tested referred to above. In step S106, the difference is described taking the ratio of the brightness minimum value divided by the brightness maximum value as an example. For further description of the difference and the brightness uniformity, refer to the foregoing related contents, which are not repeated here.
The detection device may determine whether the brightness of the screen area corresponding to each sub-screen gray image is uniform. Here, determining whether the brightness of the screen area corresponding to the sub-screen gray image in the kth row and the uth column is uniform is taken as an example. The process of determining whether the brightness of the screen areas corresponding to the other sub-screen gray images is uniform may refer to the related description and is not repeated here.
Hereinafter, for convenience of description, the sub-screen gray image in the kth row and the uth column is denoted as sub-screen gray image (k, u).
First, the detection device determines the brightness of each of the sampling images in the sub-screen gray scale image (k, u). Wherein, the formula for determining the brightness of the sampled image may refer to the following formula (1).
Formula (1):
$$L^{(k,u)}_{i,j}=\frac{1}{n_c}\sum_{f=1}^{n_c} v^{(k,u)}_{i,j,f}$$
In formula (1), $L^{(k,u)}_{i,j}$ represents the brightness of the j-th sampling image of the i-th row in the sub-screen gray image (k, u), and $v^{(k,u)}_{i,j,f}$ represents the brightness value (i.e., gray value) of the f-th pixel point in the j-th sampling image of the i-th row. The value of f ranges from 1 to $n_c$, where $n_c$ represents the total number of pixel points in the j-th sampling image of the i-th row. i is an integer from 1 to R, and j is an integer from 1 to C.
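A minimal Python sketch of formula (1) (using NumPy), assuming the brightness of one sampling image is the average gray value of its n_c pixel points; the mask, patch and values below are illustrative.

```python
import numpy as np

def sample_brightness(screen_gray: np.ndarray, mask: np.ndarray) -> float:
    """Formula (1) sketch: brightness of one sampling image, taken as the mean
    gray value of the n_c pixel points selected by the boolean mask."""
    values = screen_gray[mask]                 # gray values v of the n_c pixel points
    return float(values.sum()) / values.size   # average brightness of the sampling image

# Example: a flat 5 x 5 patch, with the mask selecting the central 3 x 3 pixel points.
patch = np.full((5, 5), 120, dtype=np.uint8)
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
print(sample_brightness(patch, mask))          # 120.0
```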
Then, a luminance maximum value and a luminance minimum value corresponding to the sub-screen gray-scale image (k, u) are determined. The luminance uniformity of the sub-screen gray scale image (k, u) is determined by the difference between the luminance maximum value and the luminance minimum value. And comparing the brightness uniformity with a uniformity detection threshold value a corresponding to the sub-screen gray level image (k, u) to determine whether the brightness of the screen area corresponding to the sub-screen gray level image (k, u) is uniform. The formula involved in this process can be referred to as the following formula (2).
Formula (2):
$$R(k,u)=\begin{cases}\text{Pass}, & \dfrac{L^{(k,u)}_{\min}}{L^{(k,u)}_{\max}} > thr(a)\\[4pt]\text{Failed}, & \dfrac{L^{(k,u)}_{\min}}{L^{(k,u)}_{\max}} \le thr(a)\end{cases}$$
In formula (2), R(k, u) represents the luminance uniformity result of the screen area corresponding to the sub-screen gray image (k, u): R(k, u) = Pass indicates that the luminance of that screen area is uniform, and R(k, u) = Failed indicates that the luminance of that screen area is non-uniform. thr(a) represents the uniformity detection threshold a corresponding to the sub-screen gray image (k, u); thr(a) is determined based on the sample screen, and the process of determining it may refer to the description of formula (4) below. $L^{(k,u)}_{\min}$ is the luminance minimum value corresponding to the sub-screen gray image (k, u), i.e., the luminance of the darkest sampling image in it, and $L^{(k,u)}_{\max}$ is the luminance maximum value, i.e., the luminance of the brightest sampling image. The ratio $L^{(k,u)}_{\min}/L^{(k,u)}_{\max}$ represents the difference between the luminance maximum value and the luminance minimum value corresponding to the sub-screen gray image (k, u); formula (2) takes this difference as an example of the luminance uniformity of the sub-screen gray image (k, u).
In the above formula (1), the method involved in determining whether a pixel in the gray-scale image of the screen to be measured is a pixel in the j-th sampling image of the i-th line may refer to the following formula (3).
Formula (3):
$$\left(x^{i,j}_{f}-x^{i,j}_{c}\right)^2+\left(y^{i,j}_{f}-y^{i,j}_{c}\right)^2 \le \left(\frac{d}{2}\right)^2$$
The pixel points in the screen gray image to be tested that satisfy formula (3) are the pixel points in the j-th sampling image of the i-th row. In formula (3), $x^{i,j}_{f}$ and $y^{i,j}_{f}$ respectively represent the pixel row number and the pixel column number, in the screen gray image to be tested, of the f-th pixel point of the j-th sampling image in the i-th row. All pixel points included in the screen gray image to be tested are distributed as rows times cols; $x^{i,j}_{f}$ takes any integer value from 1 to rows, and $y^{i,j}_{f}$ takes any integer value from 1 to cols. $x^{i,j}_{c}$ and $y^{i,j}_{c}$ respectively represent the pixel row number and the pixel column number, in the screen gray image to be tested, of the central pixel point of the j-th sampling image in the i-th row, and d represents the sampling diameter.
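A minimal Python sketch of formula (3) (using NumPy) builds the boolean mask of pixel points that belong to one circular sampling image, given its center pixel coordinates and the sampling diameter; the concrete numbers are illustrative.

```python
import numpy as np

def circular_mask(rows: int, cols: int, center_row: int, center_col: int,
                  diameter: float) -> np.ndarray:
    """Formula (3) sketch: True for pixel points whose squared distance to the
    sampling image's central pixel point is at most (diameter / 2) ** 2."""
    y, x = np.ogrid[:rows, :cols]
    dist_sq = (y - center_row) ** 2 + (x - center_col) ** 2
    return dist_sq <= (diameter / 2.0) ** 2

mask = circular_mask(rows=100, cols=100, center_row=50, center_col=50, diameter=11)
print(int(mask.sum()))   # n_c: number of pixel points in this sampling image
```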
The uniformity detection threshold a corresponding to the sub-screen gray image (k, u), i.e., thr(a) in the foregoing formula (2), may be determined by the detection device from the sample screen area image corresponding to the sub-screen gray image (k, u) in the sample screen gray image. As shown in fig. 6, the arrangement of the sample screen area images obtained by dividing the sample screen gray image is the same as the arrangement of the sub-screen gray images obtained by dividing the screen gray image to be tested. There are M rows by N columns of sample screen area images, and the sample screen area image corresponding to the sub-screen gray image (k, u) is located in the kth row and the uth column of the sample screen gray image.
For convenience of description, the sample screen area image corresponding to the sub-screen gray image (k, u) is referred to as the sample screen area image (k, u). As shown in fig. 6, the sample screen area image (1, 1) to the sample screen area image (3, 5) correspond to the sub-screen gray image (1, 1) to the sub-screen gray image (3, 5), respectively. A uniformity detection threshold a can be determined based on each of the sample screen area images (1, 1) to (3, 5). For example, a uniformity detection threshold a of 0.93 can be determined based on the sample screen area image (1, 1) as the uniformity detection threshold a corresponding to the sub-screen gray image (1, 1).
The process of determining the uniformity detection threshold a may refer to the following equation (4).
In formula (4), thr(a) represents the uniformity detection threshold a, and floor( ) is the rounding-down operation; that is, floor(x) represents rounding x down. Smin(k, u)/Smax(k, u) represents the difference between the brightness maximum value and the brightness minimum value corresponding to the sample screen region image (k, u), which can also be understood as the brightness uniformity of the sample screen region image (k, u); here, Smin(k, u) and Smax(k, u) denote the brightness minimum value and the brightness maximum value corresponding to the sample screen region image (k, u). The arrangement of the sampling images in the sample screen region image (k, u) is the same as the arrangement of the sampling images in the sub-screen gray-scale image (k, u). The process of determining Smin(k, u)/Smax(k, u) is the same as the process of determining R(k, u); referring to the process of determining R(k, u), it suffices to replace the sub-screen gray-scale image (k, u) with the sample screen region image (k, u), and details are not repeated in the embodiments of the present application.
It should be noted that the rounding-down operation in formula (4) is optional; in practical cases, the brightness uniformity of the sample screen region image (k, u), that is, Smin(k, u)/Smax(k, u), may be used directly as thr(a).
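As a minimal sketch of formula (4), the uniformity detection threshold a can be derived from the brightness of the sampling images in the sample screen region image (k, u). The rounding to two decimal places below is an assumption made to match the 0.93 example above and may differ from the actual granularity of formula (4); the function and variable names are likewise illustrative.

import math

def uniformity_threshold_a(sample_region_brightness, decimals=2, use_floor=True):
    # Sketch of formula (4): thr(a) derived from the sample screen region image (k, u).
    # sample_region_brightness: brightness of each sampling image in that region image.
    # The rounding-down step is optional, as noted in the text.
    ratio = min(sample_region_brightness) / max(sample_region_brightness)
    if not use_floor:
        return ratio
    scale = 10 ** decimals
    return math.floor(ratio * scale) / scale

# Example (toy values): yields 0.93, as in the example for sample screen region image (1, 1)
print(uniformity_threshold_a([187, 195, 190, 201, 189]))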
S107, determining, based on the sampling images included in the gray-scale image of the screen to be detected, whether the brightness of the screen area corresponding to the gray-scale image of the screen to be detected is uniform, wherein uniform brightness of the screen area corresponding to the gray-scale image of the screen to be detected indicates that the difference between the darkest sampling image and the brightest sampling image in the gray-scale image of the screen to be detected is smaller than a uniformity detection threshold b.
Step S107 is used for detecting the brightness uniformity of the whole screen to be tested.
The difference between the darkest sampling image and the brightest sampling image in the gray-scale image of the screen to be detected can be understood as the difference between the brightness maximum value and the brightness minimum value corresponding to the gray-scale image of the screen to be detected, that is, the brightness uniformity of the gray-scale image of the screen to be detected. In step S107, the difference is described by taking the ratio of the brightness minimum value divided by the brightness maximum value as an example. For further description of the difference and the brightness uniformity, reference may be made to the foregoing related contents, and details are not repeated here.
In the case where step S106 is performed, the distribution of the sampled images in the screen gray image to be measured in step S107 is the same as the distribution of the sampled images in the screen gray image to be measured in step S105. After determining the sampling image in step S105, step S107 may be performed to determine luminance uniformity of a screen area corresponding to the screen gray image to be measured. The formula involved in this process can be referred to as the following formula (5).
In formula (5), P represents the brightness uniformity detection result of the screen area (the entire screen area) corresponding to the gray-scale image of the screen to be detected. P = pass indicates that the brightness of the screen area (the entire screen area) corresponding to the gray-scale image of the screen to be detected is uniform, and P = failed indicates that the brightness of the screen area corresponding to the gray-scale image of the screen to be detected is non-uniform. Lmin represents the brightness minimum value corresponding to the gray-scale image of the screen to be detected, that is, the brightness of the darkest sampling image in the gray-scale image of the screen to be detected. Lmax represents the brightness maximum value corresponding to the gray-scale image of the screen to be detected, that is, the brightness of the brightest sampling image in the gray-scale image of the screen to be detected. Lmin/Lmax represents the difference between the brightness maximum value and the brightness minimum value corresponding to the gray-scale image of the screen to be detected. thr(b) represents the uniformity detection threshold b corresponding to the gray-scale image of the screen to be detected; thr(b) is determined based on the sample screen, and the process of determining thr(b) may refer to the description of formula (4) above, with the sample screen region image (k, u) replaced by the screen gray-scale image of the sample screen; details are not repeated here.
In other possible cases, in order to make the uniformity detection threshold b more representative, the uniformity detection threshold b may be determined without reference to the foregoing equation (4). At this time, the uniformity detection threshold b may be determined using an average value of luminance uniformity of at least two sample screen region images divided in the sample screen gray-scale image. The description of this process may refer to the following formula (6).
In formula (6), thr(b) represents the uniformity detection threshold b, and floor( ) is the rounding-down operation, which here indicates that a rounding-down operation is performed on the average value of the brightness uniformity of the sample screen region images.
In formula (6), M1 and N1 respectively indicate that the detection device divides the sample screen gray-scale image into M1 rows by N1 columns of sample screen region images, where M1 and N1 are integers greater than 1. In the case where step S104 is performed, M1 may be the same as the aforementioned M, and N1 may be the same as the aforementioned N. U(q, s) represents the difference between the brightness maximum value and the brightness minimum value corresponding to the sample screen region image (q, s) in the qth row and the sth column, which can also be understood as the brightness uniformity of the sample screen region image (q, s); the value of q ranges from 1 to M1, and the value of s ranges from 1 to N1. The process of determining U(q, s) is the same as the process of determining R(k, u); referring to the process of determining R(k, u), it suffices to replace the content related to the sub-screen gray-scale image (k, u) with the content related to the sample screen region image (q, s), and details are not repeated in the embodiments of the present application.
It should be noted that the rounding-down operation in formula (6) is optional; in practical cases, the average value of the brightness uniformity of the sample screen region images may be used directly as thr(b).
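For illustration, formula (6) can be sketched as averaging the brightness uniformity of the M1 × N1 sample screen region images and optionally rounding the average down. The function name, the input layout, and the rounding granularity below are assumptions introduced for this sketch.

import math

def uniformity_threshold_b(region_brightness_lists, decimals=2, use_floor=True):
    # Sketch of formula (6): thr(b) as the average brightness uniformity of the
    # M1 x N1 sample screen region images of the sample screen gray-scale image.
    # region_brightness_lists: one list of sampling-image brightness values per
    # sample screen region image (q, s).
    uniformities = [min(b) / max(b) for b in region_brightness_lists]
    average = sum(uniformities) / len(uniformities)
    if not use_floor:
        return average
    scale = 10 ** decimals
    return math.floor(average * scale) / scale

# Example with 3 x 5 = 15 sample screen region images (toy values)
print(uniformity_threshold_b([[200, 190, 195, 205]] * 15))  # 0.92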
Here, it should be noted that step S107 is performed independently of steps S104 to S106. In the case where steps S104 to S106 are performed, the parameters in the related formulas in step S107 (e.g., M1, N1, q, s, etc.) may be the same as the parameters in the related formulas in steps S104 to S106. In the case where steps S104 to S106 are not performed, the parameters in the related formulas in step S107 may be set based on actual conditions, including the manner of dividing the sampling images in the gray-scale image of the screen to be detected and in the sample screen gray-scale image, and the manner of dividing the sample screen region images in the sample screen gray-scale image. However, the arrangement of the sampling images in the gray-scale image of the screen to be detected and in the sample screen gray-scale image is required to be the same.
In some possible cases, the detection device may also output a brightness uniformity detection result for reference. The case where both local brightness uniformity detection (local detection for short) and global brightness uniformity detection (global detection for short) are performed is taken as an example for explanation, and the obtained brightness uniformity detection result may refer to Table 1 below.
TABLE 1
Local detection, sub-screen gray-scale images (1, 1) to (1, 5): pass, pass, pass, pass, pass
Local detection, sub-screen gray-scale images (2, 1) to (2, 5): pass, pass, pass, pass, failed
Local detection, sub-screen gray-scale images (3, 1) to (3, 5): pass, pass, pass, pass, pass
Global detection, entire screen: pass
Table 1 shows the detection results obtained when brightness uniformity detection is performed based on each sub-screen gray-scale image in the case where the gray-scale image of the screen to be detected is divided into the 15 sub-screen gray-scale images in fig. 4. It can be seen from the detection results that the brightness of the screen area corresponding to the sub-screen gray-scale image in the 2nd row and the 5th column is non-uniform, the brightness of the screen areas corresponding to the other sub-screen gray-scale images is uniform, and the brightness of the entire screen is uniform.
Table 1 is merely illustrative, and in actual cases, table 1 may have more or less content, for example, the foregoing sampling parameters, uniformity detection threshold, and the like may also be included, which is not limited in this embodiment of the present application.
Here, it should be noted that the uniformity detection thresholds referred to above (including the uniformity detection threshold a and the uniformity detection threshold b) may be determined through the sample screen, or may be empirical values selected according to the actual situation rather than determined through the sample screen. The uniformity detection threshold a and the uniformity detection threshold b may be the same empirical value, for example 0.9, or may be different empirical values. The manner of determining the uniformity detection thresholds should not be taken as a limitation of the embodiments of the present application.
Based on the foregoing, it can be seen that the arrangement of the sampling images in the gray-scale image of the screen to be detected needs to be the same as that in the sample screen gray-scale image, which is important for determining the brightness uniformity. Therefore, it is necessary to verify the accuracy with which the detection device determines the sampling images in the gray-scale image of the screen to be detected and in the sample screen gray-scale image. After the verification is passed, the detection device can be used to determine the brightness uniformity of the screen.
(1) in fig. 7A shows the 135 sampling images determined by the detection device in the gray-scale image of the screen to be detected. (2) in fig. 7A shows 135 sampling images determined in the gray-scale image of the screen to be detected by using a dot pattern; these sampling images are determined by the detection personnel and are accurate. Fig. 7B shows a coincidence degree simulation of the 135 sampling images shown in (1) in fig. 7A and the 135 sampling images shown in (2) in fig. 7A. If the coincidence degree simulation is qualified, that is, after the verification is passed, the detection device can be used to determine the brightness uniformity of the screen; otherwise, the process by which the detection device determines the sampling images is adjusted, and the coincidence degree simulation is continued until it is qualified.
Here, the luminance uniformity detection of the screen is described based on one screen to be detected, and in actual situations, the luminance uniformity detection may be performed on a plurality of screens to be detected in batches. The uniformity detection threshold used may be determined based on one sample screen.
It should be noted that, after the brightness of each sampling image is determined, the detection device may further determine a sampling image whose brightness is abnormal, that is, a sampling image whose corresponding screen area has non-uniform brightness. Fig. 8 is a schematic diagram of the brightness of each sampling image in the sub-screen gray-scale images of a gray-scale image of a screen to be detected. Here, the gray-scale image of the screen to be detected being divided into 3 rows by 5 columns of sub-screen gray-scale images is taken as an example, and the 15 sub-screen gray-scale images are numbered sequentially as (1, 1), (1, 2), ..., (3, 4), (3, 5). One sub-screen gray-scale image includes 20 sampling images. As can be seen from fig. 8, the brightness of the 18th sampling image in the sub-screen gray-scale image (2, 4) is abnormal, and the brightness of the screen area corresponding to the 18th sampling image is not uniform. The sub-screen gray-scale image (2, 4) is the sub-screen gray-scale image in the 2nd row and the 4th column of the gray-scale image of the screen to be detected.
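One possible way to flag a sampling image with abnormal brightness, sketched below, is to compare each sampling image with the brightest sampling image in the same sub-screen gray-scale image using the same ratio convention as above; this criterion is an assumption made for illustration and is not necessarily the criterion used to obtain fig. 8.

def abnormal_sampling_images(sample_brightness, thr_a):
    # Assumed criterion: a sampling image is flagged as abnormal when its brightness,
    # divided by the brightness of the brightest sampling image in the same
    # sub-screen gray-scale image, falls below the uniformity detection threshold a.
    # Returns the 1-based indices of the flagged sampling images.
    l_max = max(sample_brightness)
    return [idx + 1 for idx, b in enumerate(sample_brightness) if b / l_max < thr_a]

# Example: 20 sampling images, the 18th noticeably darker than the others
brightness = [200.0] * 20
brightness[17] = 150.0
print(abnormal_sampling_images(brightness, thr_a=0.9))  # [18]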
In some possible cases, in the process of detecting the brightness uniformity of the screen, a sample database can be obtained, and the sample database is used for deep learning to construct a brightness uniformity detection model of the screen. Subsequently, the luminance uniformity of the screen may be detected using the luminance uniformity detection model.
The process of determining the sample database may refer to the description of step S201 to step S204 in fig. 9 described below.
S201, setting an interception threshold value and an interception area.
In some possible cases, the interception area is at least one of a sub-screen gray-scale image in local brightness uniformity detection and a gray-scale image of the screen to be detected in global brightness uniformity detection.
The interception threshold may be a uniformity detection threshold corresponding to the interception area.
S202, respectively acquiring screen data which does not meet the interception threshold value and screen data which meets the interception threshold value.
If the interception threshold is not satisfied, it can be determined that the brightness of the interception area is non-uniform, and the screen data that does not satisfy the interception threshold enters the negative sample database.
If the interception threshold is satisfied, it can be determined that the brightness of the interception area is uniform, and the screen data that satisfies the interception threshold enters the positive sample database.
The screen data here may include the image of the interception area (or the screen data generating the image) and the interception mark of the image. In the case where the image is a gray-scale image of the screen to be detected, the interception mark is labeled label 0. In the case where the image is a sub-screen gray-scale image, the interception mark is labeled label e, where e is the row number of the sub-screen gray-scale image multiplied by the number of columns in the gray-scale image of the screen to be detected. The screen data may further include the brightness uniformity of the area corresponding to the image.
S203, classifying based on the screen data to obtain a positive sample database and a negative sample database.
The screen data that does not satisfy the interception threshold is divided into the negative sample database according to the interception mark, and the screen data corresponding to one interception mark belongs to one sub-negative-sample database in the negative sample database.
The screen data that satisfies the interception threshold is divided into the positive sample database according to the interception mark, and the screen data corresponding to one interception mark belongs to one sub-positive-sample database in the positive sample database.
S204, processing the positive sample database and the negative sample database to obtain a sample database.
The processing in step S204 includes formatting, where the positive sample database and the negative sample database are processed according to a unified format, so as to obtain a sample database.
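As a minimal sketch of steps S201 to S204, the following Python snippet classifies screen data by the interception threshold and merges the results into one sample database with a unified record format. The record fields and names are assumptions introduced here; in particular, each record is assumed to already carry the brightness uniformity of its interception area.

def build_sample_database(interception_records, interception_threshold):
    # S202/S203: screen data meeting the interception threshold goes to the positive
    # sample database, screen data not meeting it goes to the negative sample database;
    # data with the same interception mark belong to the same sub-database.
    # S204: both databases are kept in one unified record format.
    positive, negative = [], []
    for record in interception_records:
        entry = {
            "image": record["image"],
            "interception_mark": record["interception_mark"],        # e.g. "label 0" or "label e"
            "brightness_uniformity": record["brightness_uniformity"],
        }
        if record["brightness_uniformity"] >= interception_threshold:
            positive.append(entry)
        else:
            negative.append(entry)
    return {"positive": positive, "negative": negative}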
An exemplary detection apparatus provided by an embodiment of the present application is described below.
Fig. 10 is a schematic structural diagram of a detection device according to an embodiment of the present application.
It should be understood that the detection device may have more or fewer components than shown in fig. 10, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 10 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The detection devices may include a processor 171, a memory 172, a camera 173, a chipset 174 (including one or more of a north bridge 174A and a south bridge 174B), a power switch 175, an input output device 176, a USB communication processing module 177, a wireless communication processing module 178 (including one or more of a bluetooth communication processing module 178A and a WLAN communication processing module 178B), and the like.
The camera 173 may be a camera for capturing a gray-scale image of a screen to be tested as described above.
The processor 171 may be used to read and execute computer readable instructions. In particular implementations, the processor 171 may include primarily controllers, operators, and registers. The controller is mainly responsible for instruction decoding and sending out control signals for operations corresponding to the instructions. The arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, and the like, and may also perform address operations and conversions. The register is mainly responsible for storing register operands, intermediate operation results and the like temporarily stored in the instruction execution process.
In an embodiment of the present application, the memory 172 may be understood as an internal memory of the detection device, such as a flash ROM chip or another type of ROM chip, such as an EEPROM chip. The memory 172 may be integrated with the processor 171 or may be provided separately from the processor 171, which is not limited in the embodiments of the present application.
In an embodiment of the present application, the processor 171 may invoke computer instructions stored in the memory 172 to cause the detection device to perform the method in the embodiment of the present application.
The application also provides a chip system comprising at least one processor for implementing the functions involved in the method performed by the detection device in any of the embodiments described above.
In one possible design, the system on a chip further includes a memory to hold program instructions and data, the memory being located either within the processor or external to the processor.
The chip system may be formed of a chip or may include a chip and other discrete devices.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, the memory in the system-on-chip may be one or more. The memory may be integral with the processor or separate from the processor, and embodiments of the present application are not limited.
The memory may be a non-transitory memory, such as a ROM, which may be integrated on the same chip as the processor or may be separately provided on a different chip; the type of the memory and the manner in which the memory and the processor are provided are not particularly limited in the embodiments of the present application.
The present application also provides a computer program product comprising a computer program (which may also be referred to as code, or instructions) which, when executed, causes a computer to perform the method performed by the detection device in any of the embodiments described above.
The present application also provides a computer-readable storage medium storing a computer program (which may also be referred to as code, or instructions). The computer program, when executed, causes a computer to perform the method performed by the detection device in any of the embodiments described above.
While the application has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that the foregoing embodiments may be modified or equivalents may be substituted for some of the features thereof, and that the modifications or substitutions do not depart from the spirit of the embodiments.
As used in the above embodiments, the term "when" may be interpreted to mean "if", "after", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "when it is determined" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)", depending on the context.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this application refers to and encompasses any or all possible combinations of one or more of the listed items.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. The storage medium includes a ROM or a random access memory RAM, a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (10)

1. A brightness uniformity detection method for a screen, applied to a detection device for detecting brightness uniformity of a first screen, the method comprising:
Acquiring screen brightness data when the first screen displays a first image, wherein the screen brightness data is obtained by shooting, through a camera, the first screen when the first screen displays the first image, and comprises illumination intensity information acquired by each pixel unit in the camera during shooting, the pixel value of each pixel in the first image is the same, and the first image comprises a low-brightness image whose pixel value is equal to or close to that of black;
converting the illumination intensity information in the screen brightness data into a gray level image;
performing binarization processing on the gray level image to determine target pixels and non-target pixels in the gray level image;
removing an interference picture in the gray-scale image based on the target pixels and the non-target pixels, and determining an image of a region where the first screen is located in the gray-scale image as a screen gray-scale image of the first screen, wherein the interference picture comprises the non-target pixels and target pixels which are connected but whose number is smaller than a preset number;
Determining a brightness maximum value and a brightness minimum value corresponding to a first screen area image based on at least two sampling images in the first screen area image; the brightness maximum value corresponding to the first screen area image indicates the maximum value in the brightness of at least two sampling images in the first screen area image, and the brightness minimum value corresponding to the first screen area image indicates the minimum value in the brightness of at least two sampling images in the first screen area image;
taking the ratio of the brightness minimum value corresponding to the first screen area image to the brightness maximum value corresponding to the first screen area image as the brightness uniformity of the first screen area image;
Determining the brightness uniformity of a screen area corresponding to the first screen area image based on the brightness uniformity and uniformity detection threshold value of the first screen area image;
Wherein, in the case that the first screen region image comprises a part of the screen gray image of the first screen, the position of a second screen region image in the screen gray image of the sample screen is the same as the position of the first screen region image in the screen gray image of the first screen;
in the case that the first screen region image comprises all of the screen gray image of the first screen, the second screen region image is the screen gray image of the sample screen, and the process of determining the uniformity detection threshold comprises the steps of uniformly dividing the screen gray image of the sample screen into at least two screen region images and determining the uniformity detection threshold through the average value of the brightness uniformity of the at least two screen region images, the uniformity detection threshold indicating the brightness uniformity of the screen gray image of the sample screen.
2. The method of claim 1, wherein the step of determining the position of the substrate comprises,
Based on the brightness uniformity and uniformity detection threshold of the first screen area image, determining the brightness uniformity of the screen area corresponding to the first screen area image specifically includes:
Determining that the brightness of the screen area corresponding to the first screen area image is uniform under the condition that the brightness uniformity of the first screen area image is larger than the uniformity detection threshold value;
and determining that the brightness of the screen area corresponding to the first screen area image is uneven under the condition that the brightness uniformity of the first screen area image is smaller than the uniformity detection threshold.
3. The method according to claim 2, wherein the method further comprises:
And taking the screen gray image of the first screen as a first screen area image, or dividing the screen gray image of the first screen into M rows and N columns of screen area images, wherein each screen area image in the M rows and N columns of screen area images is independently the first screen area image, and M and N are integers larger than 1.
4. A method according to claim 3, wherein in case each of the M rows by N columns of screen area images is the first screen area image alone, the method further comprises:
under the condition that the brightness of the M rows and N columns of screen areas corresponding to the M rows and N columns of screen area images is uniform, determining that the brightness of the screen is uniform;
when a first screen area image with uneven brightness exists in the M rows and N columns of screen area images, determining that the brightness of the screen is uneven, wherein the screen area corresponding to the first screen area image with uneven brightness is a screen area with uneven brightness.
5. The method of any one of claims 1-4, wherein the shape of the sampled image is a circular image.
6. The method of claim 5, wherein the arrangement of the sample images in the screen gray image of the first screen is the same as the arrangement of the sample images in the screen gray image of the sample screen, the screen gray image of the first screen comprises T rows and H columns of sample images, the first sample images on the same row are aligned and the non-first sample images on the same row are aligned, the first sample image belongs to a first type screen region image, the first type screen region image indicates the vertex angle of the first screen, the non-first type sample image comprises a second type sample image and a non-second type sample image, the second type sample image belongs to a second type screen region image, the second type screen region image indicates the non-display region of the first screen, the first sample images on the same column are aligned, the second sample images on the same column are aligned, the non-first sample images on the same column are aligned, the screen images of the first screen are divided into M rows and N columns of sample images, the first sample images on the first column and the second column of sample images are divided into N columns of M rows, and the first sample images on the first column and the second column of sample images on the same column are divided into N columns of M numbers, and the first sample images are corresponding to the N columns of the N gray images are larger than the integer numbers 2.
7. A detection device comprising one or more processors and a memory coupled to the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors invoke to cause the detection device to perform the method of any of claims 1-6.
8. A computer readable storage medium comprising computer instructions which, when run on a detection device, cause the detection device to perform the method of any one of claims 1 to 6.
9. A chip system for application to a detection device, the chip system comprising one or more processors for invoking computer instructions to cause the detection device to perform the method of any of claims 1 to 6.
10. A computer program product containing instructions which, when run on a detection device, cause the detection device to perform the method of any of claims 1 to 6.
CN202411245380.5A 2024-09-06 2024-09-06 Screen brightness uniformity detection method and detection equipment Active CN118762626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411245380.5A CN118762626B (en) 2024-09-06 2024-09-06 Screen brightness uniformity detection method and detection equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411245380.5A CN118762626B (en) 2024-09-06 2024-09-06 Screen brightness uniformity detection method and detection equipment

Publications (2)

Publication Number Publication Date
CN118762626A CN118762626A (en) 2024-10-11
CN118762626B true CN118762626B (en) 2025-02-07

Family

ID=92945891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411245380.5A Active CN118762626B (en) 2024-09-06 2024-09-06 Screen brightness uniformity detection method and detection equipment

Country Status (1)

Country Link
CN (1) CN118762626B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448524A (en) * 2016-12-14 2017-02-22 深圳Tcl数字技术有限公司 Display brightness uniformity test method and device
CN116543669A (en) * 2023-05-04 2023-08-04 苏州佳世达电通有限公司 Display panel detection method and detection system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140095793A (en) * 2013-01-25 2014-08-04 삼성디스플레이 주식회사 Method and system for evaluating panel mura
CN106713903B (en) * 2016-12-08 2019-05-07 广州视源电子科技股份有限公司 Method and system for detecting screen brightness uniformity
CN108091288B (en) * 2017-12-08 2021-08-03 深圳Tcl新技术有限公司 Display screen uniformity testing method, terminal and computer readable storage medium
CN110299114A (en) * 2019-06-25 2019-10-01 深圳Tcl新技术有限公司 Judgment method, device and the storage medium of show uniformity
CN116912181B (en) * 2023-06-30 2024-04-26 深圳市圆周检测技术有限公司 Screen uniformity detection method, system and readable storage medium


Also Published As

Publication number Publication date
CN118762626A (en) 2024-10-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee after: Honor Terminal Co.,Ltd.

Country or region after: China

Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong

Patentee before: Honor Device Co.,Ltd.

Country or region before: China