CN114187458A - Interface abnormality detection method, device and equipment - Google Patents
- Publication number
- CN114187458A (application CN202111556013.3A)
- Authority
- CN
- China
- Prior art keywords
- image
- difference
- pixel
- interface
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The application discloses a method, an apparatus and a device for detecting an interface abnormality. The method includes: obtaining a first image and a second image of an interface to be detected, where the first image includes N first pixel groups, the second image includes N second pixel groups, and the N first pixel groups and the N second pixel groups are obtained by division according to a preset pixel division rule; calculating a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group; determining the degree of difference between the first image and the second image according to the first pixel values and the second pixel values; and determining that the detection result of the interface to be detected is abnormal when the degree of difference between the first image and the second image is less than or equal to a target difference threshold. In the embodiments of the application, full-scale per-pixel computation is avoided, and the amount and complexity of computation are reduced, so that the interface detection rate can be effectively improved and computing resources can be saved.
Description
Technical Field
The present application belongs to the technical field of interface detection, and in particular, to a method, an apparatus, and a device for detecting an interface abnormality.
Background
The system interface is the medium for information interaction between the system and the user: it displays information visually so that the user can conveniently operate the hardware to exchange information. Therefore, while the system is running, it is vital to detect whether the system interface exhibits abnormal conditions such as response timeout, lag or freezing, in order to ensure the user's interaction experience. At present, interface detection is generally performed by capturing and comparing images of the interface to be detected and judging whether the interface is abnormal according to their similarity. However, the resolution of current system interfaces is often very high, so image comparison has high computational complexity; as a result, the interface detection process is slow and occupies a large amount of computing resources.
Disclosure of Invention
The embodiments of the application provide an interface abnormality detection method, apparatus and device, aiming to solve the technical problem that the interface detection process occupies a large amount of computing resources.
In a first aspect, an embodiment of the present application provides an interface anomaly detection method, where the method includes:
acquiring a first image and a second image of an interface to be detected, wherein the first image comprises N first pixel groups, the N first pixel groups are obtained by dividing pixel points in the first image according to a preset pixel division rule, the second image comprises N second pixel groups, the N second pixel groups are obtained by dividing pixel points in the second image according to the preset pixel division rule, and N is an integer greater than 1;
calculating a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group;
determining the difference degree of the first image and the second image according to the first pixel value and the second pixel value;
and determining that the detection result of the interface to be detected is abnormal under the condition that the difference degree of the first image and the second image is less than or equal to a target difference degree threshold value.
In a second aspect, an embodiment of the present application provides an interface abnormality detection apparatus, including:
a first acquisition module, configured to acquire a first image and a second image of an interface to be detected, where the first image comprises N first pixel groups, the N first pixel groups are obtained by dividing pixel points in the first image according to a preset pixel division rule, the second image comprises N second pixel groups, the N second pixel groups are obtained by dividing pixel points in the second image according to the preset pixel division rule, and N is an integer greater than 1;
a first calculating module, configured to calculate a first pixel value corresponding to each of the first pixel groups and a second pixel value corresponding to each of the second pixel groups;
a first determining module, configured to determine a difference between the first image and the second image according to the first pixel value and the second pixel value;
and the second determining module is used for determining that the detection result of the interface to be detected is abnormal under the condition that the difference degree of the first image and the second image is smaller than or equal to a target difference degree threshold value.
In a third aspect, an embodiment of the present application provides an electronic device, where the device includes:
a processor and a memory storing program instructions;
the processor, when executing the program instructions, implements the method described above.
In a fourth aspect, the present application provides a storage medium, on which program instructions are stored, and when the program instructions are executed by a processor, the method described above is implemented.
In a fifth aspect, the present application provides a computer program product; when the instructions of the computer program product are executed by a processor of an electronic device, they cause the electronic device to perform the above method.
The interface abnormality detection method, apparatus and device provided in the embodiments of the application can acquire a first image and a second image of an interface to be detected, where the first image includes N first pixel groups, the second image includes N second pixel groups, and the N first pixel groups and the N second pixel groups are obtained by division according to a preset pixel division rule; calculate a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group; determine the degree of difference between the first image and the second image according to the first pixel values and the second pixel values; and determine that the detection result of the interface to be detected is abnormal when the degree of difference between the first image and the second image is less than or equal to the target difference threshold. The pixel points of the first image and the second image are thus divided into groups, and the degree of difference between the two images is calculated from the grouped pixel values to judge whether the interface is abnormal. This avoids computing over the full set of pixel values, reduces the amount and complexity of computation, and therefore effectively improves the interface detection rate and saves computing resources.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below; those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic flow chart diagram illustrating a method for detecting interface anomalies according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an interface abnormality detection apparatus according to another embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to still another embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application are described in detail below. To make the objects, technical solutions and advantages of the present application clearer, the application is further described with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and not limiting. It will be apparent to those skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by way of example.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
In order to solve the prior art problems, embodiments of the present application provide a method, an apparatus, and a device for detecting an interface abnormality. First, a method for detecting an interface abnormality provided in an embodiment of the present application is described below.
Fig. 1 shows a schematic flowchart of an interface anomaly detection method according to an embodiment of the present application. The interface anomaly detection method can be applied to the interface detection scenario of any system.
As shown in fig. 1, the interface abnormality detection method may include the following steps:
101, acquiring a first image and a second image of an interface to be detected, wherein the first image comprises N first pixel groups, the N first pixel groups are obtained by dividing pixel points in the first image according to a preset pixel division rule, the second image comprises N second pixel groups, the N second pixel groups are obtained by dividing pixel points in the second image according to the preset pixel division rule, and N is an integer greater than 1;
102, calculating a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group;
103, determining the difference degree between the first image and the second image according to the first pixel value and the second pixel value;
and 104, determining that the detection result of the interface to be detected is abnormal under the condition that the difference degree between the first image and the second image is smaller than or equal to the target difference degree threshold value.
Specific implementations of the above steps are described below.
In step 101, the interface may become abnormal after each click or start-up; if the interface does not change for some time after the click or start-up, the interface may be considered to have abnormal conditions such as response timeout, lag or freezing. Whether the interface is abnormal can therefore be judged by comparing how the interface is displayed before and after it is clicked or started. Accordingly, obtaining the first image and the second image of the interface to be detected may mean obtaining images of the interface to be detected before and after the click or start-up.
After the first image and the second image are obtained, the pixel points in the first image and the second image can be grouped according to a preset pixel division rule. The preset pixel division rule may be set according to the actual situation. For example, the first image and the second image may each be divided into a plurality of small images of a preset size, and the pixel points in each small image may be used as one pixel group. For another example, if the size of the first image and the second image is W × H, that is, their pixel points are arranged in W columns and H rows, then the pixel points of each row may be used as one pixel group, or, taking a row as the unit, every preset number of pixel points in a row may be used as one pixel group.
For example, the pixel points in the first image may be grouped first. Taking the grouping of the first row as an example, the pixel points in the first row may be grouped by a preset number, with the remaining pixel points at the end of the row still forming one group even if their number is smaller than the preset number. For example, if the size of the first image is 1024 × 768, every 100 pixel points in the first row may be regarded as one first pixel group, so that the first row includes 11 first pixel groups. Each subsequent row is then grouped in the same way as the first row; since each row of the first image includes 11 first pixel groups and there are 768 rows in total, the pixel points in the first image can be divided into 8448 first pixel groups. It can be understood that the same pixel division rule may be employed to divide the pixel points in the second image into 8448 second pixel groups.
In step 102, a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group may be calculated. For example, if each pixel group includes 100 pixel points, a pixel value average of the 100 pixel points in each first pixel group may be calculated, and the pixel value average may be a first pixel value corresponding to each first pixel group. Similarly, the pixel value average of 100 pixels in each second pixel group may be calculated, and the pixel value average may be the second pixel value corresponding to each second pixel group.
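As an illustration of the grouping and group-value calculation in steps 101 and 102, the following minimal sketch (in Python with NumPy) groups the pixels of a grayscale frame row by row and computes the mean pixel value of every group. The library choice, the function name group_means and the group size of 100 are assumptions for illustration only; the patent merely requires that both images be divided by the same preset pixel division rule.

```python
import numpy as np

def group_means(image: np.ndarray, group_size: int = 100) -> np.ndarray:
    """Divide each row of a 2-D grayscale image into groups of `group_size`
    pixels (the trailing group of a row may be shorter) and return the mean
    pixel value of every group as a flat array of N group values."""
    means = []
    for row in image:
        for start in range(0, image.shape[1], group_size):
            means.append(float(row[start:start + group_size].mean()))
    return np.asarray(means)

# A 1024 x 768 frame yields 11 groups per row (10 full groups plus the
# remaining 24 pixels) and 768 * 11 = 8448 groups in total, as in the
# example above.
frame = np.random.randint(0, 256, size=(768, 1024), dtype=np.uint8)
print(group_means(frame).shape)  # (8448,)
```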
In step 103, after calculating the first pixel value corresponding to each first pixel group and the second pixel value corresponding to each second pixel group, N first pixel values of the first image and N second pixel values of the second image may be obtained. The N first pixel values and the N second pixel values are compared in a one-to-one correspondence manner, and when a difference value between the first pixel value and the second pixel value is greater than a first threshold value, the comparison result is considered to be dissimilar, where the first threshold value may be set according to an empirical value in combination with an actual situation, and is not specifically limited herein.
After the N first pixel values and the N second pixel values are compared in one-to-one correspondence, the number of dissimilar comparison results can be obtained, and the degree of difference between the first image and the second image can be obtained by dividing the number of dissimilar comparison results by N.
In step 104, after the degree of difference between the first image and the second image is determined, it may be compared with the target difference threshold. When the degree of difference between the first image and the second image is less than or equal to the target difference threshold, the first image and the second image may be considered similar; in other words, the interface to be detected may not have changed before and after the click or start-up, and at this time it may be determined that the detection result of the interface to be detected is abnormal. The target difference threshold may be set according to empirical values in combination with the actual situation, and is not specifically limited here.
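Continuing the sketch above (and reusing its group_means helper), steps 103 and 104 can be illustrated as follows. The per-group threshold of 10 and the target difference threshold of 0.05 are placeholder values; the patent leaves both to be set empirically.

```python
import numpy as np

def difference_degree(means_a: np.ndarray, means_b: np.ndarray,
                      group_threshold: float = 10.0) -> float:
    """Compare the N first and N second pixel values in one-to-one
    correspondence; a pair whose absolute difference exceeds
    `group_threshold` counts as dissimilar. The difference degree is the
    dissimilar count divided by N."""
    dissimilar = np.abs(means_a - means_b) > group_threshold
    return float(dissimilar.sum()) / len(means_a)

def interface_abnormal(first_frame: np.ndarray, second_frame: np.ndarray,
                       target_threshold: float = 0.05) -> bool:
    """The interface is judged abnormal when the two frames are too similar,
    i.e. their difference degree is less than or equal to the target
    difference threshold."""
    # group_means is the helper defined in the previous sketch
    degree = difference_degree(group_means(first_frame), group_means(second_frame))
    return degree <= target_threshold
```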
In some embodiments, the step 102 may specifically perform the following steps:
performing grayscale processing on the first image and the second image respectively;
and calculating a first pixel value corresponding to each first pixel group in the grayscale-processed first image, and a second pixel value corresponding to each second pixel group in the grayscale-processed second image.
It can be understood that the first image and the second image may be richly colored images whose pixel values differ greatly between pixel points. In this case, in order to improve the accuracy of interface detection, when calculating the first pixel values and the second pixel values, the embodiment of the application may first perform grayscale processing on the first image and the second image, and then calculate the first pixel value corresponding to each first pixel group in the grayscale-processed first image and the second pixel value corresponding to each second pixel group in the grayscale-processed second image.
In this way, the pixel values of the pixel points in the grayscale-processed images differ less from one another, so the difference between the first pixel value corresponding to each first pixel group and the second pixel value corresponding to each second pixel group can be kept within a smaller range. It is then easier and more accurate to judge, during the subsequent one-to-one comparison, whether a first pixel group is similar to the corresponding second pixel group, which improves the accuracy of interface detection.
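As a small illustration of the grayscale step, assuming the captured frames are H x W x 3 RGB arrays, the conversion could use the common BT.601 luminance weights; any conventional grayscale conversion would serve here, and the function name is an assumption.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB frame to a single-channel grayscale frame
    using the ITU-R BT.601 luminance weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(np.float64) @ weights).astype(np.uint8)
```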
In some embodiments, the step 103 may specifically perform the following steps:
determining a first equivalent value corresponding to each first pixel group according to the first pixel value and a preset equivalent rule;
determining a second equivalent value corresponding to each second pixel group according to the second pixel values and a preset equivalent rule;
determining the difference degree of the first image and the second image according to the first equivalent value and the second equivalent value;
wherein the preset equivalence rule is: when the pixel value is greater than or equal to a preset pixel threshold, the equivalent value corresponding to the pixel value is a first identifier; when the pixel value is less than the preset pixel threshold, the equivalent value corresponding to the pixel value is a second identifier; and each of the first equivalent value and the second equivalent value is either the first identifier or the second identifier.
In this embodiment of the application, in order to further improve the rate of interface detection and save computational resources, the first equivalent value corresponding to each first pixel group may be determined according to the first pixel value corresponding to each first pixel group and a preset equivalent rule. For example, when the first pixel value is greater than or equal to the preset pixel threshold, the first pixel value may be equivalent to the first identifier, that is, the first equivalent value corresponding to the first pixel group may be the first identifier at this time; when the first pixel value is smaller than the preset pixel threshold, the first pixel value may be equivalent to the second identifier, that is, the first equivalent value corresponding to the first pixel group may be the second identifier. In other words, the first equivalent value corresponding to each first pixel group may be the first identifier or the second identifier. The preset pixel threshold may be set according to an empirical value in combination with an actual situation, and is not limited in detail here.
For example, the preset pixel threshold may be 127, the first identifier may be "1", and the second identifier may be "0". That is, if the first pixel value is greater than or equal to 127, the first pixel value is equivalent to "1", and the first equivalent value corresponding to the first pixel group is recorded as "1"; if the first pixel value is less than 127, the first pixel value is equivalent to "0", and the first equivalent value corresponding to the first pixel group is recorded as "0". After determining the first equivalent value corresponding to each first pixel group, N first equivalent values may be obtained, such as: [1, 1, 0, 1, 0, 0, 1, ..., 0].
It will be appreciated that the same preset equivalence rule described above may be used to determine the second equivalent value for each second pixel group. In other words, after determining the second equivalent value corresponding to each second pixel group, N second equivalent values may be obtained, such as: [1, 0, 0, 1, 0, 0, 1, ..., 1].
After the N first equivalent values and the N second equivalent values are obtained, the N first equivalent values and the N second equivalent values may be compared in a one-to-one correspondence manner, if the first equivalent value and the second equivalent value are both "1", the comparison result may be considered to be similar, if the first equivalent value and the second equivalent value are both "0", the comparison result may also be considered to be similar, and if one of the first equivalent value and the second equivalent value is "1", and the other one is "0", the comparison result may be considered to be dissimilar.
After the one-to-one correspondence comparison of the N first equivalent values and the N second equivalent values is completed, the number of the dissimilar comparison results can be obtained, and the difference degree between the first image and the second image can be obtained by dividing the number of the dissimilar comparison results by N.
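A minimal sketch of this equivalence rule follows, using 127 as the preset pixel threshold and 1/0 as the first and second identifiers to match the example above; the difference degree is then the fraction of positions at which the two identifier sequences disagree. The function names are assumptions for illustration.

```python
import numpy as np

def equivalent_values(pixel_values: np.ndarray, pixel_threshold: int = 127) -> np.ndarray:
    """Map each group pixel value to the first identifier (1) when it is
    greater than or equal to the preset pixel threshold, otherwise to the
    second identifier (0)."""
    return (pixel_values >= pixel_threshold).astype(np.uint8)

def difference_degree_from_identifiers(eq_a: np.ndarray, eq_b: np.ndarray) -> float:
    """Positions at which the identifiers differ count as dissimilar; the
    difference degree is the number of dissimilar positions divided by N."""
    return float(np.count_nonzero(eq_a != eq_b)) / len(eq_a)
```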
In the embodiment of the application, the pixel values can be made equivalent to the first identifier or the second identifier, and whether a first pixel group is similar to the corresponding second pixel group can be determined directly according to whether the identifiers match, so that the interface detection rate can be further improved and computing resources can be saved.
In some embodiments, before the step 104, the interface abnormality detecting method may further include the steps of:
acquiring an initial difference threshold and M image groups, wherein each image group comprises two images, and M is an integer greater than 1;
determining the difference degree of the two images in each image group;
determining Q image groups with the difference degree smaller than or equal to the initial difference degree threshold value as a target image group, wherein Q is a positive integer and is smaller than or equal to M;
calculating the difference mean value and the difference standard deviation of the target image group according to the Q difference degrees corresponding to the target image group;
and determining a target difference threshold according to the difference mean value and the difference standard deviation of the target image group.
It can be understood that, when detecting whether the interface is abnormal, the detection result is affected by the target difference threshold, so the more accurate the target difference threshold is, the more accurate the interface detection result is.
Based on this, in the embodiment of the present application, an initial disparity threshold and M image groups may be obtained, where the initial disparity threshold may be set according to an empirical value, and each image group may include two images of the same interface captured randomly.
The difference degree of the two images in each image group can be determined according to the difference degree calculation method described above. The image groups whose difference degree is less than or equal to the initial difference threshold may then be labeled "T", and the image groups whose difference degree is greater than the initial difference threshold may be labeled "F"; that is, the target image group may consist of the Q image groups labeled "T".
The difference mean and the difference standard deviation of the target image group can be calculated from the difference degrees of the image groups labeled "T", and the target difference threshold can then be calculated from them.
A specific calculation formula of the target difference degree threshold K may be as shown in formula (1):
K=k(avg)+3*k(sd) (1)
wherein, K is the target difference threshold, K (avg) is the difference mean, and K (sd) is the difference standard deviation.
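A sketch of the calibration in formula (1), assuming the M sample image groups have already been reduced to their M difference degrees; the function name and the use of the population standard deviation are assumptions, since the patent does not specify the estimator.

```python
import numpy as np

def target_difference_threshold(sample_degrees, initial_threshold: float) -> float:
    """Keep the Q difference degrees that do not exceed the initial difference
    threshold (the target image group) and return K = k(avg) + 3 * k(sd), as in
    formula (1). Assumes at least one sample passes the initial threshold."""
    target = np.asarray([d for d in sample_degrees if d <= initial_threshold])
    return float(target.mean() + 3.0 * target.std())
```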
In the embodiment of the application, the target image group with the difference degree smaller than or equal to the initial difference degree threshold value can be determined on the basis of presetting the initial difference degree threshold value, and then the target difference degree threshold value is calculated based on the difference degree of the target image group, so that the initial difference degree threshold value can be adjusted according to the difference degree meeting the conditions, a more accurate target difference degree threshold value is obtained, and the accuracy of interface detection is improved.
In some embodiments, the determining the target difference threshold according to the difference mean and the difference standard deviation of the target image group may specifically perform the following steps:
calculating a first difference threshold according to the difference mean value and the difference standard deviation of the target image group;
and under the condition that the Q difference degrees corresponding to the target image group are all smaller than or equal to the first difference threshold value, determining the first difference threshold value as the target difference threshold value.
In the embodiment of the application, after the first difference threshold is obtained by calculation according to the difference mean value and the difference standard deviation of the target image group, the first difference threshold can be verified again, so that the accuracy of the target difference threshold is further ensured.
If all of the Q difference degrees corresponding to the target image group are less than or equal to the first difference degree threshold, the interface detection result may be accurate when the first difference degree threshold is regarded as the target difference degree threshold, and at this time, the first difference degree threshold may be determined as the target difference degree threshold.
In some embodiments, the determining the target difference threshold according to the difference mean and the difference standard deviation of the target image group may specifically perform the following steps:
calculating a first difference threshold according to the difference mean value and the difference standard deviation of the target image group;
under the condition that the difference degree of a first image group in a target image group is larger than a first difference degree threshold value, calculating a difference degree mean value and a difference degree standard deviation of a second image group according to P difference degrees corresponding to the second image group, wherein the second image group is the other image group except the first image group in the target image group, P is a positive integer and is smaller than Q;
calculating a second difference threshold according to the difference mean value and the difference standard deviation of the second image group;
and determining the second difference threshold as the target difference threshold when the P difference degrees corresponding to the second image group are all smaller than or equal to the second difference threshold.
In the embodiment of the application, after the first difference threshold is obtained by calculation according to the difference mean value and the difference standard deviation of the target image group, the first difference threshold can be verified again, so that the accuracy of the target difference threshold is further ensured.
If the difference degree of a first image group in the target image group is greater than the first difference threshold, the interface detection result would be inaccurate if the first difference threshold were taken as the target difference threshold. In this case, the P image groups in the target image group whose difference degree is less than or equal to the first difference threshold can be determined as the second image group.
The mean value of the difference degree and the standard deviation of the difference degree of the second image group can be calculated according to the corresponding P difference degrees in the second image group, and then the second threshold value of the difference degree can be calculated according to the mean value of the difference degree and the standard deviation of the difference degree of the second image group, wherein the second threshold value of the difference degree can be calculated by referring to the formula (1).
If the P difference degrees corresponding to the second image group are all smaller than or equal to the second difference degree threshold, the interface detection result may be accurate when the second difference degree threshold is regarded as the target difference degree threshold, and at this time, the second difference degree threshold may be determined as the target difference degree threshold.
It can be understood that, if there are still image groups in the second image group whose difference degree is greater than the second difference threshold, the above steps may be repeated to determine a third image group whose difference degrees are less than or equal to the second difference threshold, and so on; a difference threshold is determined as the target difference threshold once the difference degrees of all image groups participating in its calculation are less than or equal to that threshold.
Therefore, the calculated difference threshold can be continuously adjusted according to the difference meeting the conditions until the verification result of the difference threshold participating in verification indicates that the interface detection is accurate, the difference threshold participating in verification is determined as the target difference threshold, so that the target difference threshold is more accurate, and the accuracy of the interface detection is further improved.
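The verification and refinement described above can be read as an iteration that recomputes the threshold until every difference degree that participated in its calculation is at or below the result. The loop below is one such reading, not the patent's literal wording; it again uses K = mean + 3 * standard deviation at each round.

```python
import numpy as np

def calibrate_threshold(sample_degrees, initial_threshold: float) -> float:
    """Iteratively shrink the target image group until the computed threshold
    K = mean + 3 * std is an upper bound for every remaining difference degree.
    Assumes at least one sample passes the initial threshold."""
    degrees = np.asarray([d for d in sample_degrees if d <= initial_threshold])
    while True:
        k = degrees.mean() + 3.0 * degrees.std()
        kept = degrees[degrees <= k]
        if len(kept) == len(degrees):  # every participating group passes verification
            return float(k)
        degrees = kept                 # drop the groups that exceeded K and recompute
```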
In some embodiments, the step 101 may specifically perform the following steps:
performing maximization processing on the interface to be detected;
and acquiring a first image and a second image of the interface to be detected after the maximization processing.
In the embodiment of the application, when acquiring the first image and the second image of the interface to be detected, maximization processing may first be performed on the interface to be detected, and the first image and the second image of the maximized interface may then be acquired. This ensures that the first image and the second image contain only the content of the interface to be detected, eliminates the influence of other factors on the degree of difference between the first image and the second image, and thus ensures the accuracy of interface detection.
In some embodiments, the step 101 may specifically perform the following steps:
acquiring a first time difference, wherein the first time difference is a standard time difference of loading completion of an interface to be detected;
under the condition that a first input of an interface to be detected is received, responding to the first input, acquiring a first image of the interface to be detected, and recording first time for acquiring the first image;
calculating a second time according to the first time and the first time difference;
and acquiring a second image of the interface to be detected according to the second time.
It can be understood that, when acquiring the first image and the second image of the interface to be detected, there is a time difference between capturing the first image and capturing the second image. If this capture interval is too short, an interface that is not abnormal is easily detected as abnormal, making interface detection inaccurate; in addition, a short capture interval means more images are captured in the same time period, which increases the amount of computation for the difference degree between adjacent images and wastes computing resources. If the capture interval is too long, the accuracy of interface detection can be ensured, but the interface detection time also becomes long, and when the interface is abnormal, a long detection time affects the user's interaction experience.
Based on this, in the embodiment of the present application, a first time difference may be obtained, where the first time difference may be a standard time difference of the loading completion of the interface to be detected, and then the first image and the second image of the interface to be detected may be obtained based on the first time difference.
For example, a first input by the user on the interface to be detected may be received. The first input may be a click input by the user on the interface to be detected, a voice instruction input by the user, or a specific gesture input by the user, and may be determined specifically according to actual use requirements, which is not limited in the embodiment of the application. The specific gesture in the embodiment of the application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture and a double-tap gesture; the click input in the embodiment of the application may be a single click, a double click, any number of clicks, a long-press input or a short-press input. It can be understood that, when the interface to be detected operates normally, the interface loading associated with the first input is completed after the first time difference has elapsed since the first input was received.
After the first input on the interface to be detected is received, the first image of the interface to be detected can be acquired in response to the first input, and the first time at which the first image is acquired is recorded. The sum of the first time and the first time difference is then calculated to obtain the second time for acquiring the second image of the interface to be detected; that is, the second image of the interface to be detected is acquired at the second time.
In the embodiment of the application, the first image and the second image of the interface to be detected can be obtained according to the standard time difference of the loading completion of the interface to be detected, so that the accuracy of interface detection is ensured, and the interaction experience of a user is improved.
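A sketch of this capture schedule, assuming a capture() callable that returns a screenshot of the interface to be detected and a first time difference obtained as described in the following paragraphs; both names are placeholders, not the patent's terminology.

```python
import time

def capture_pair(capture, first_time_difference: float):
    """On receiving the first input, capture the first image immediately and
    record the first time; the second image is captured at
    second_time = first_time + first_time_difference."""
    first_time = time.monotonic()
    first_image = capture()
    # wait until the first time difference has elapsed since the first time
    time.sleep(max(0.0, first_time_difference - (time.monotonic() - first_time)))
    second_time = first_time + first_time_difference
    second_image = capture()
    return first_image, second_image, first_time, second_time
```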
In some embodiments, the obtaining the first time difference may specifically perform the following steps:
acquiring a plurality of second time differences associated with a plurality of controls, wherein the plurality of controls are arranged in at least one interface of a target system, and the interface to be detected is any interface in the target system, wherein the second time differences are obtained based on the time of receiving a second input to the controls and the time of completing the loading of the interface associated with the controls in response to the second input;
calculating a time difference standard deviation according to the plurality of second time differences;
and calculating the first time difference according to the minimum time difference and the standard deviation of the time differences in the plurality of second time differences.
In this embodiment of the application, the interface to be detected may be any interface in the target system, in other words, the standard time difference of the completion of loading of the interface to be detected may be the standard time difference of the completion of loading of the page of the target system. Based on this, a plurality of controls may be determined from the target system as samples, and the plurality of controls may be disposed in at least one interface in the target system.
A second input to one of the controls is received; for example, when the user clicks the control, the time t1 of the click is recorded, the interface associated with the control is awaited until loading completes, the loading completion time t2 is recorded, and the difference between t2 and t1 is calculated to obtain the response time difference, which may be the second time difference associated with that control. A plurality of second time differences associated with the plurality of controls can then be acquired in the same way.
The time difference standard deviation may be calculated from the plurality of second time differences, and then the first time difference may be calculated from a smallest time difference of the plurality of second time differences and the time difference standard deviation.
A specific calculation formula of the first time difference T1 may be as shown in formula (2):
T1=T(min)-3*T(sd) (2)
wherein T1 is the first time difference, T (min) is the minimum time difference of the second time differences, and T (sd) is the standard deviation of the time differences.
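Formula (2) could be computed from the sampled control response times as below; the list of durations is assumed to have been collected beforehand as t2 - t1 for each sampled control, and the population standard deviation is used since the patent does not specify the estimator.

```python
import statistics

def first_time_difference(load_durations) -> float:
    """Each duration is t2 - t1 for one sampled control: the time from the
    second input on the control to the completion of the associated interface
    loading. Returns T1 = T(min) - 3 * T(sd), as in formula (2)."""
    t_min = min(load_durations)
    t_sd = statistics.pstdev(load_durations)
    return t_min - 3.0 * t_sd
```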
In some embodiments, the step 104 may specifically perform the following steps:
under the condition that the difference degree of the first image and the second image is smaller than or equal to the target difference degree threshold value, calculating a third time according to the second time and the first time difference;
acquiring a third image of the interface to be detected according to the third time, wherein the third image comprises N third pixel groups, and the N third pixel groups are obtained by dividing pixel points in the third image according to a preset pixel division rule;
calculating a third pixel value corresponding to each third pixel group;
determining the difference degree of the second image and the third image according to the second pixel value and the third pixel value;
and determining that the detection result of the interface to be detected is abnormal under the condition that the difference degree of the second image and the third image is less than or equal to the target difference degree threshold value.
In the embodiment of the application, in order to further improve the accuracy of interface detection, when the degree of difference between the first image and the second image is less than or equal to the target difference threshold, it may be considered that the interface to be detected has not changed after the first time difference, that is, page loading has not completed. At this time, a third image of the interface to be detected may be obtained, and whether the interface to be detected has abnormal conditions such as response timeout or freezing is further judged according to the degree of difference between the second image and the third image.
For example, the sum of the second time and the first time difference may be calculated to obtain a third time for acquiring the third image of the interface to be detected, that is, the third image of the interface to be detected is acquired at the third time. The pixel points in the third image may be divided into N third pixel groups using the same pixel division rule as the first image and the second image. Then, a pixel value average of a plurality of pixel points in each third pixel group may be calculated, and the pixel value average may be a third pixel value corresponding to each third pixel group. And comparing the N second pixel values with the N third pixel values in a one-to-one correspondence manner, obtaining the number of the compared results which are dissimilar, and dividing the number of the compared results which are dissimilar by N to obtain the difference between the second image and the third image.
After the difference between the second image and the third image is determined, the difference between the second image and the third image may be compared with a target difference threshold, and when the difference between the second image and the third image is also smaller than or equal to the target difference threshold, it may be determined again that the interface to be detected is not changed, and at this time, it may be determined that the detection result of the interface to be detected is abnormal.
In this way, whether the interface to be detected has finished loading is determined multiple times, misjudgment in the interface detection process is effectively avoided, and the accuracy of interface detection is further improved.
In some embodiments, the obtaining of the third image of the interface to be detected according to the third time may specifically perform the following steps:
calculating a mean time difference value according to the plurality of second time differences;
calculating a third time difference according to the maximum time difference and the average value of the time differences in the plurality of second time differences;
and under the condition that the time difference between the first time and the third time is less than or equal to the third time difference, acquiring a third image of the interface to be detected according to the third time.
It can be understood that, in the interface detection process, images need to be captured from the interface to be detected continuously, the captured images are compared, and whether the interface to be detected is abnormal is determined according to the comparison results. In order to avoid the waste of resources caused by capturing images indefinitely, a third time difference for stopping image capture can be set. In other words, images of the interface to be detected may be captured within a third time difference from the reception of the first input on the interface to be detected; for example, during this period the image of the interface to be detected may be captured continuously within the second time difference. After the third time difference has elapsed since the first input on the interface to be detected was received, capturing images of the interface to be detected is stopped.
A mean time difference may be calculated from the plurality of second time differences, and then a third time difference may be calculated from a maximum time difference of the plurality of second time differences and the mean time difference.
A specific calculation formula of the third time difference T2 may be as shown in formula (3):
T2=2*T(max)-T(avg) (3)
wherein T2 is the third time difference, T (max) is the largest time difference among the second time differences, and T (avg) is the average time difference.
The time difference between the first time and the third time is calculated, and if it is less than or equal to the third time difference, the third image of the interface to be detected is captured at the third time. In other words, it is judged whether the third time for capturing the third image falls within the third time difference: if so, the third image may be acquired; if not, the third image cannot be acquired, because capturing images of the interface to be detected has stopped by then.
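Putting the timing rules together, the detection flow can be sketched as a loop that captures a frame every first time difference T1, compares it with the previous frame, and stops once the third time difference T2 = 2 * T(max) - T(avg) of formula (3) has elapsed since the first input. The group_means and difference_degree helpers are the earlier sketches; the harness itself is an assumed illustration, not the patent's literal implementation.

```python
import time

def detect_interface_abnormality(capture, t1: float, t2: float,
                                 target_threshold: float) -> bool:
    """Return True (abnormal) when every pair of consecutive frames captured
    within the window T2 stays too similar, i.e. their difference degree never
    exceeds the target difference threshold."""
    start = time.monotonic()
    previous = group_means(capture())            # first image, taken at the first input
    while time.monotonic() - start + t1 <= t2:   # the next capture still falls inside T2
        time.sleep(t1)
        current = group_means(capture())         # next image, another T1 later
        if difference_degree(previous, current) > target_threshold:
            return False                         # the interface changed: loading completed
        previous = current
    return True                                  # never changed within T2: abnormal
```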
Based on the interface abnormality detection method provided by the embodiment, the application also provides an embodiment of an interface abnormality detection device.
Fig. 2 is a schematic structural diagram of an interface abnormality detection apparatus according to another embodiment of the present application, and only a part related to the embodiment of the present application is shown for convenience of description.
Referring to fig. 2, the interface abnormality detection apparatus 200 may include:
the first obtaining module 201 is configured to obtain a first image and a second image of the interface to be detected, where the first image includes N first pixel groups, the N first pixel groups are obtained by dividing pixel points in the first image according to a preset pixel division rule, the second image includes N second pixel groups, the N second pixel groups are obtained by dividing pixel points in the second image according to the preset pixel division rule, and N is an integer greater than 1;
a first calculating module 202, configured to calculate a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group;
a first determining module 203, configured to determine a difference between the first image and the second image according to the first pixel value and the second pixel value;
the second determining module 204 is configured to determine that a detection result of the interface to be detected is abnormal when the difference between the first image and the second image is smaller than or equal to the target difference threshold.
The interface abnormality detection device 200 in the embodiment of the application can acquire a first image and a second image of an interface to be detected, where the first image includes N first pixel groups, the second image includes N second pixel groups, and the N first pixel groups and the N second pixel groups are obtained by dividing according to a preset pixel division rule; calculating a first pixel value corresponding to each first pixel group and a second pixel value corresponding to each second pixel group; then, the difference degree of the first image and the second image can be determined according to the first pixel value and the second pixel value; and under the condition that the difference degree of the first image and the second image is smaller than or equal to the target difference degree threshold value, determining that the detection result of the interface to be detected is abnormal. Therefore, the pixel points of the first image and the second image are divided and grouped, and the difference degree of the first image and the second image is calculated according to the grouped pixel values to judge whether the interface is abnormal or not, so that the adoption of full-scale pixel value calculation is avoided, the calculation amount is reduced, the calculation complexity is reduced, the interface detection rate can be effectively improved, and the calculation force resource can be saved.
In some embodiments, the first computing module 202 may include:
the first processing unit is used for respectively carrying out gray processing on the first image and the second image;
and the first calculating unit is used for calculating a first pixel value corresponding to each first pixel group in the first image after the gray processing and a second pixel value corresponding to each second pixel group in the second image after the gray processing.
In some embodiments, the first determining module 203 may include:
the first determining unit is used for determining a first equivalent value corresponding to each first pixel group according to the first pixel value and a preset equivalent rule;
a second determining unit, configured to determine, according to the second pixel value and according to a preset equivalence rule, a second equivalence value corresponding to each second pixel group;
a third determining unit, configured to determine a difference between the first image and the second image according to the first equivalent value and the second equivalent value;
wherein, the preset equivalent rule is as follows: under the condition that the pixel value is greater than or equal to a preset pixel threshold value, the equivalent value corresponding to the pixel value is a first identifier; and under the condition that the pixel value is smaller than the preset pixel threshold value, the equivalent value corresponding to the pixel value is a second identifier, and the first equivalent value and the second equivalent value respectively comprise a first identifier or a second identifier.
In some embodiments, the interface abnormality detection apparatus 200 may further include:
the second acquisition module is used for acquiring the initial difference threshold and M image groups, wherein the image group comprises two images, and M is an integer greater than 1;
the third determining module is used for determining the difference degree of the two images in each image group;
a fourth determining module, configured to determine Q image groups with a disparity smaller than or equal to the initial disparity threshold as a target image group, where Q is a positive integer and Q is smaller than or equal to M;
the second calculation module is used for calculating the difference degree mean value and the difference degree standard deviation of the target image group according to the Q difference degrees corresponding to the target image group;
and the fifth determining module is used for determining a target difference threshold according to the difference mean value and the difference standard deviation of the target image group.
In some embodiments, the fifth determining module may include:
the second calculating unit is used for calculating a first difference threshold according to the difference mean value and the difference standard deviation of the target image group;
and a fourth determining unit, configured to determine the first difference threshold as the target difference threshold when all of the Q difference degrees corresponding to the target image group are less than or equal to the first difference threshold.
In some embodiments, the fifth determining module may further include:
a third calculating unit, configured to calculate a mean disparity value and a standard disparity standard deviation of a second image group according to P disparity degrees corresponding to the second image group when the disparity degree of the first image group in the target image group is greater than the first disparity threshold, where the second image group is another image group in the target image group except the first image group, P is a positive integer, and P is smaller than Q;
a fourth calculating unit, configured to calculate a second disparity threshold according to the mean disparity and standard deviation of the second image group;
a fifth determining unit configured to determine the second difference threshold as the target difference threshold in a case where the P difference corresponding to the second image group are all less than or equal to the second difference threshold.
In some embodiments, the first obtaining module 201 may include:
the second processing unit is used for carrying out maximum processing on the interface to be detected;
the first acquisition unit is used for acquiring the first image and the second image of the interface to be detected after maximization processing.
In some embodiments, the first obtaining module 201 may include:
the second obtaining unit is used for obtaining a first time difference, and the first time difference is a standard time difference of loading completion of the interface to be detected;
the third acquisition unit is used for responding to the first input under the condition that the first input of the interface to be detected is received, acquiring a first image of the interface to be detected and recording the first time for acquiring the first image;
a fifth calculating unit, configured to calculate a second time according to the first time and the first time difference;
and the fourth acquisition unit is used for acquiring a second image of the interface to be detected according to the second time.
In some embodiments, the second obtaining unit may include:
the first obtaining subunit is configured to obtain a plurality of second time differences associated with the plurality of controls, where the plurality of controls are arranged in at least one interface of the target system, and the interface to be detected is any interface in the target system, where the second time difference is obtained based on a time when a second input to the control is received and a time when the loading of the interface associated with the control is completed in response to the second input;
a first calculating subunit, configured to calculate a standard deviation of the time difference according to the plurality of second time differences;
and the second calculating subunit is used for calculating the first time difference according to the minimum time difference and the standard deviation of the time differences in the plurality of second time differences.
In some embodiments, the second determination module 204 may include:
a sixth calculating unit, configured to calculate a third time according to the second time and the first time difference in a case where the difference degree between the first image and the second image is less than or equal to the target difference degree threshold;
the fifth obtaining unit is used for obtaining a third image of the interface to be detected according to the third time, where the third image includes N third pixel groups, and the N third pixel groups are obtained by dividing pixel points in the third image according to the preset pixel division rule;
a seventh calculating unit configured to calculate a third pixel value corresponding to each of the third pixel groups;
a sixth determining unit configured to determine a degree of difference between the second image and the third image according to the second pixel value and the third pixel value;
and the seventh determining unit is used for determining that the detection result of the interface to be detected is abnormal under the condition that the difference degree of the second image and the third image is less than or equal to the target difference degree threshold value.
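The comparison of the second and third images mirrors the comparison of the first and second images. The sketch below illustrates one possible pixel-group comparison under assumed choices: a fixed 8x8 grid as the pixel division rule, the mean grey level as the group pixel value, and the mean absolute difference of the group values as the difference degree; none of these specific choices is mandated by the embodiment.

```python
import numpy as np

def group_pixel_values(image, grid=(8, 8)):
    """Split the image into grid[0] x grid[1] rectangular pixel groups (an assumed
    pixel division rule) and reduce each group to one pixel value: its mean grey level."""
    img = np.asarray(image, dtype=np.float64)
    if img.ndim == 3:                       # collapse colour channels to a grey level
        img = img.mean(axis=2)
    h, w = img.shape
    rows, cols = grid
    values = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = img[r * h // rows:(r + 1) * h // rows,
                       c * w // cols:(c + 1) * w // cols]
            values[r, c] = tile.mean()
    return values

def difference_degree(image_a, image_b, grid=(8, 8)):
    # Assumed difference degree: mean absolute difference of the group pixel values.
    return float(np.abs(group_pixel_values(image_a, grid)
                        - group_pixel_values(image_b, grid)).mean())

# A low difference degree means the interface barely changed between captures,
# which in this scheme indicates a possible anomaly (e.g. the interface is stuck):
# abnormal = difference_degree(second_image, third_image) <= target_threshold
```

Because each image is reduced to N group values rather than compared pixel by pixel, the comparison stays cheap even for high-resolution interfaces, which is the point of the pixel-group division.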
In some embodiments, the fifth obtaining unit may include:
the third calculating subunit is used for calculating a time difference mean value according to the plurality of second time differences;
a fourth calculating subunit, configured to calculate a third time difference according to the maximum time difference among the plurality of second time differences and the time difference mean value;
and the second acquiring subunit is used for acquiring a third image of the interface to be detected according to the third time under the condition that the time difference between the first time and the third time is less than or equal to the third time difference.
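As a final illustration, the sketch below shows one assumed way to combine the maximum time difference and the time difference mean into a third time difference and to gate the capture of the third image; the additive combination is an assumption of the sketch, not a rule stated by the embodiment.

```python
import statistics

def third_time_difference(second_time_differences):
    # Assumed combination: the maximum of the second time differences plus their mean.
    return max(second_time_differences) + statistics.mean(second_time_differences)

def may_capture_third_image(first_time, third_time, second_time_differences):
    """The third image is captured only while the gap between the first time and
    the third time is still within the third time difference."""
    return third_time - first_time <= third_time_difference(second_time_differences)
```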
It should be noted that the information interaction and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application, and the devices correspond to the interface abnormality detection method described above. All implementations in the method embodiments are applicable to the device embodiments; for their specific functions and technical effects, reference may be made to the method embodiments, and details are not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
Fig. 3 shows a hardware structure diagram of an electronic device according to still another embodiment of the present application.
The device may include a processor 301 and a memory 302 in which program instructions are stored.
When the processor 301 executes the program, the steps in any of the method embodiments described above are implemented.
Illustratively, the program may be divided into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to complete the present application. The one or more modules/units may be a series of program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution of the program on the device.
Specifically, the processor 301 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
The memory may include read-only memory (ROM), random access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, and electrical, optical, or other physical/tangible memory storage devices. Thus, in general, the memory includes one or more tangible (non-transitory) readable storage media (e.g., memory devices) encoded with software comprising computer-executable instructions; when the software is executed (e.g., by one or more processors), it is operable to perform the operations described with reference to the methods of the present application.
The processor 301 implements any of the above-described embodiments by reading and executing program instructions stored in the memory 302.
In one example, the electronic device may also include a communication interface 303 and a bus 310. The processor 301, the memory 302, and the communication interface 303 are connected via a bus 310 to complete communication therebetween.
The communication interface 303 is mainly used for implementing communication between modules, apparatuses, units and/or devices in the embodiment of the present application.
In addition, in combination with the methods in the foregoing embodiments, the embodiments of the present application may be implemented by providing a storage medium. The storage medium has program instructions stored thereon; when the program instructions are executed by a processor, any of the methods in the above embodiments is implemented.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a system chip, a chip system, or a system-on-a-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing method embodiments, and achieve the same technical effects, and in order to avoid repetition, details are not described here again.
It is to be understood that the present application is not limited to the particular arrangements and instrumentality described above and shown in the attached drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications, and additions or change the order between the steps after comprehending the spirit of the present application.
The functional blocks shown in the above structural block diagrams may be implemented as hardware, software, firmware, or a combination thereof. When implemented in hardware, they may be, for example, electronic circuits, application-specific integrated circuits (ASICs), suitable firmware, plug-ins, function cards, and so on. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The programs or code segments may be stored in a machine-readable medium or transmitted over a transmission medium or communication link by a data signal carried in a carrier wave. A "machine-readable medium" may include any medium that can store or transfer information. Examples of machine-readable media include electronic circuits, semiconductor memory devices, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, radio frequency (RF) links, and so forth. The code segments may be downloaded via a computer network such as the internet or an intranet.
It should also be noted that the exemplary embodiments mentioned in this application describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware for performing the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above are only specific embodiments of the present application. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, modules, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. It should be understood that the protection scope of the present application is not limited thereto; any person skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed in the present application, and such modifications or substitutions shall fall within the protection scope of the present application.
Claims (15)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111556013.3A CN114187458B (en) | 2021-12-17 | 2021-12-17 | Interface anomaly detection method, device and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111556013.3A CN114187458B (en) | 2021-12-17 | 2021-12-17 | Interface anomaly detection method, device and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114187458A true CN114187458A (en) | 2022-03-15 |
CN114187458B CN114187458B (en) | 2025-03-25 |
Family
ID=80544448
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111556013.3A Active CN114187458B (en) | 2021-12-17 | 2021-12-17 | Interface anomaly detection method, device and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114187458B (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111601105A (en) * | 2020-06-01 | 2020-08-28 | 上海申际轨道交通设备发展有限公司 | A kind of video display state abnormal detection method, device and electronic equipment |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118277270A (en) * | 2024-04-26 | 2024-07-02 | 中国电子产品可靠性与环境试验研究所((工业和信息化部电子第五研究所)(中国赛宝实验室)) | Method, device, computer equipment and storage medium for obtaining execution results |
Also Published As
Publication number | Publication date |
---|---|
CN114187458B (en) | 2025-03-25 |
Similar Documents
Publication | Title |
---|---|
CN113918376B (en) | Fault detection method, device, equipment and computer readable storage medium |
CN109587008B (en) | Method, device and storage medium for detecting abnormal flow data |
CN109886127B (en) | Fingerprint identification method and terminal equipment |
US10586335B2 (en) | Hand segmentation in a 3-dimensional image |
CN117538677A (en) | Magnetic bearing coil fault detection method, device, equipment and medium |
CN114187458A (en) | Interface abnormity detection method, device and equipment |
CN112418089A (en) | Gesture recognition method and device and terminal |
US20250078303A1 (en) | Method for calculating intersection over union between target region and designated region in an image and electronic device using the same |
CN115564790A (en) | Target object detection method, electronic device and storage medium |
CN106484441B (en) | Controller initialization method and electronic device applying same |
CN110880023A (en) | Method and device for detecting certificate picture |
CN107515821B (en) | Control testing method and device |
CN116501637A (en) | Printing test method and device, electronic equipment and storage medium |
CN108629219A (en) | A kind of method and device of identification one-dimension code |
CN111026989B (en) | Page loading time detection method and device and electronic equipment |
JP6611963B2 (en) | Program analysis apparatus, program analysis system, program analysis method, and analysis program |
CN109993022B (en) | Height detection method and method for establishing height detection equation |
CN114298990A (en) | Detection method and device for vehicle-mounted camera device, storage medium and vehicle |
CN112487466A (en) | Featureless encrypted file detection method, terminal equipment and storage medium |
CN106446902B (en) | non-character image recognition method and device |
CN111629005A (en) | Anti-cheating method and device, electronic equipment and storage medium |
CN112668660B (en) | Abnormal point detection method and device based on time sequence data |
CN115436899B (en) | Millimeter wave radar detection data processing method, device, equipment and storage medium |
US20170139796A1 (en) | Bus interface card detection system and method for detecting bus interface card |
CN109670519B (en) | Image processing apparatus and image processing method |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |