CN107506300B - User interface testing method, device, server and storage medium - Google Patents
- Publication number
- CN107506300B (application CN201710675850.5A / CN201710675850A)
- Authority
- CN
- China
- Prior art keywords
- controls
- page
- user interface
- operated
- attribute information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The embodiment of the invention discloses a user interface testing method, apparatus, server and storage medium. The user interface testing method comprises the following steps: traversing each page of the user interface to be tested; acquiring, during the traversal, attribute information of all controls and of the operated controls in each page; and calculating the coverage rate of the operated controls of each page from the attribute information. By using the operated-control coverage rate, the embodiment effectively quantifies the effect of automated UI testing, guiding testers to optimize the automated UI test and improving UI testing efficiency.
Description
Technical Field
The embodiments of the invention relate to Internet technologies, and in particular to a user interface testing method, apparatus, server and storage medium.
Background
With the wide adoption of mobile intelligent terminals, mobile devices are developing toward richer functionality, diversified form factors, customization and open platforms, and mobile applications (APPs) developed on the Android or iOS system affect and change people's lives. After an APP is developed, it must undergo traversal testing to avoid problems such as a function failing to respond when the user clicks it, the user interface (UI) failing to jump, or logic errors occurring while the user uses the APP.
Currently, the common methods for quantifying the effect of automated UI testing are code coverage and activity (APP page) coverage. Code coverage is the ratio of the code triggered during an automated UI test run to the total code. Activity coverage is the ratio of the number of activities entered during the run to the total number of activities of the APP under test.
However, code coverage mainly reflects how thoroughly functions or code are covered; it is tied to the code's logical structure and maps poorly onto the APP's page UI, so testers unfamiliar with the source code struggle to quickly identify, from coverage data, which parts of the UI the automated test missed. Moreover, dead code can distort code coverage figures. Activity coverage tends to reflect the depth of automated UI testing but cannot characterize its breadth. Neither metric effectively quantifies the effect of automated UI testing, and neither directly guides testers in optimizing it.
Disclosure of Invention
The embodiments of the invention provide a user interface testing method, apparatus, server and storage medium, aiming to solve the prior-art problems that the effect of automated UI testing cannot be effectively quantified and that testers cannot be directly guided to optimize the automated UI test.
In a first aspect, an embodiment of the present invention provides a user interface testing method, where the method includes:
traversing each page of the user interface to be tested;
respectively acquiring attribute information of all controls and operated controls in each page in the traversal process;
and calculating the coverage rate of the operated control of each page according to the attribute information.
In a second aspect, an embodiment of the present invention further provides a user interface testing apparatus, where the apparatus includes:
the traversal module is used for traversing each page of the user interface to be tested;
the information acquisition module is used for respectively acquiring attribute information of all controls and operated controls in each page in the traversal process;
and the calculating module is used for calculating the coverage rate of the operated control of each page according to the attribute information.
In a third aspect, an embodiment of the present invention further provides a server, where the server includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement a user interface testing method as described in any embodiment of the invention.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the user interface testing method described in any embodiment of the present invention.
According to the embodiments of the invention, each page of the user interface to be tested is traversed; during the traversal, attribute information of all controls and of the operated controls in each page is acquired; and the coverage rate of the operated controls of each page is calculated from the acquired attribute information. Using the operated-control coverage rate effectively quantifies the effect of automated UI testing, thereby directly guiding testers to optimize the automated UI test and improving UI testing efficiency.
Drawings
FIG. 1 is a flowchart of a user interface testing method according to a first embodiment of the present invention;
FIG. 2a is a flowchart of a user interface testing method according to a second embodiment of the present invention;
FIG. 2b is a schematic diagram of a marked screenshot in the user interface testing method according to the second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a user interface testing apparatus according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a server in the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a user interface testing method according to an embodiment of the present invention. This embodiment is applicable to user interface testing scenarios; the method may be executed by a user interface testing apparatus, which may be implemented in software and/or hardware. As shown in fig. 1, the method specifically includes:
and S110, traversing each page of the user interface to be tested.
When the UI test is performed on the mobile APP, each user interface in the APP to be tested and each control in each user interface need to be tested, and therefore, the user interface to be tested needs to be automatically traversed.
Generally, automated UI traversal is performed either depth-first or breadth-first. Viewed from the UI, an APP consists of pages and the controls within them, which resembles a tree structure. Starting from the root page (the root node), a depth-first search (DFS) or breadth-first search (BFS) algorithm enters each page in turn, operates the controls on it, and tests and verifies the stability of the APP. A single UI traversal task may run on one device.
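The breadth-first variant of the traversal described above can be sketched as follows. This is a hedged illustration: the callback helpers `get_controls`, `operate` and `get_target_page` are hypothetical stand-ins for the device-automation calls an actual implementation would use, and are not part of the patent.

```python
from collections import deque

def bfs_traverse(root_page, get_controls, operate, get_target_page):
    """Breadth-first traversal of an APP's page tree, operating every
    control on each page before descending to the pages they lead to."""
    visited = set()
    queue = deque([root_page])
    while queue:
        page = queue.popleft()
        if page in visited:
            continue
        visited.add(page)
        for control in get_controls(page):
            operate(control)                   # e.g. click or input
            target = get_target_page(control)  # page the control opens, if any
            if target is not None and target not in visited:
                queue.append(target)
    return visited
```

Replacing the queue with a stack would yield the depth-first variant.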
Preferably, a distributed UI traversal method is adopted: a single traversal task is split, according to page characteristics, into several sub-traversal tasks that are executed separately to complete the user interface traversal test. This greatly shortens the time consumed by the traversal and alleviates the problem of low coverage.
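Under the simplest assumption of a round-robin splitting policy (the embodiment does not specify one), dividing a traversal task into sub-tasks might look like:

```python
def split_traversal_task(pages, num_workers):
    """Split one traversal task into per-device sub-tasks by distributing
    pages round-robin. A sketch only; a real policy could instead group
    pages by depth or by page characteristics."""
    buckets = [[] for _ in range(num_workers)]
    for i, page in enumerate(pages):
        buckets[i % num_workers].append(page)
    return buckets
```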
It should be noted that the embodiment of the present invention does not limit the method for traversing the UI, as long as the testing of each user interface of the APP to be tested and the control on each user interface can be implemented.
And S120, respectively acquiring attribute information of all the controls and the operated controls in each page in the traversal process.
The controls in a page may be untested, partly tested, or fully tested; attribute information of all controls and of the operated (i.e., tested) controls is needed to determine how far the testing has progressed.
Illustratively, the attribute information includes control counts (the number of all controls in a page and the number of operated controls) and control positions (the positions of all controls within their page and the positions of the operated controls in the same page).
Further, in the embodiment of the present invention, attribute information of all the controls and operated controls in each page in the traversal process is obtained in a dump manner.
Dump is a file backup instruction that can obtain information about a corresponding file. Applied to a mobile terminal, dump can: obtain from the mobile device an XML structure file of the current page, which describes the overall layout of the current page, including layout controls, UI controls and the like; and parse the XML structure file to obtain the attribute values of each control.
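As an illustration, a dump in Android's uiautomator XML format (an assumption; other dump tools emit different schemas) can be parsed with the standard library alone:

```python
import re
import xml.etree.ElementTree as ET

def parse_dump(xml_text):
    """Extract the clickable controls and their on-screen bounds from a
    uiautomator-style page dump. The attribute names (class, bounds,
    clickable) follow Android's uiautomator format."""
    controls = []
    for node in ET.fromstring(xml_text).iter("node"):
        if node.get("clickable") != "true":
            continue
        # bounds are serialized as "[left,top][right,bottom]"
        l, t, r, b = map(int, re.findall(r"\d+", node.get("bounds", "")))
        controls.append({"class": node.get("class"), "bounds": (l, t, r, b)})
    return controls
```

The returned counts and positions are exactly the attribute information used in S130 and S240.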
And S130, calculating the coverage rate of the operated control of each page according to the attribute information.
The coverage rate of the operated controls of each page is calculated according to all the controls in the page and the attribute information of the operated controls.
Further, the attribute information includes the number of all the controls and the number of the operated controls, and accordingly, the ratio of the number of the operated controls to the total number of the controls of each page is used as the coverage rate of the operated controls of each page.
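The ratio described above is a one-line computation; a minimal sketch, with the empty-page behavior being an assumption the patent does not define:

```python
def control_coverage(all_controls, operated_controls):
    """Per-page operated-control coverage: operated count / total count.
    Returns 0.0 for a page with no controls (an assumed edge case)."""
    if not all_controls:
        return 0.0
    return len(operated_controls) / len(all_controls)
```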
In the technical scheme of this embodiment, each page of the user interface to be tested is traversed; attribute information of all controls and of the operated controls in each page is acquired during the traversal; and the coverage rate of the operated controls of each page is calculated from the acquired attribute information. Using the operated-control coverage rate effectively quantifies the effect of automated UI testing, thereby directly guiding testers to optimize the automated UI test and improving testing efficiency.
Example two
Fig. 2a is a flowchart of a user interface testing method according to a second embodiment of the present invention, and the second embodiment further optimizes a user interface testing process based on the first embodiment. As shown in fig. 2a, the method comprises:
s210, traversing each page of the user interface to be tested.
And S220, respectively acquiring attribute information of all the controls and operated controls in each page in the traversal process.
And S230, calculating the coverage rate of the operated control of each page according to the attribute information.
And S240, screenshot is carried out on each page according to the positions of all the controls in each page and the positions of the operated controls.
The attribute information of all the controls and the operated controls obtained in S220 includes the number and position information of all the controls and the number and position information of the operated controls, where the number is used to calculate the coverage of the operated controls; the positions of all the controls in each page and the positions of the operated controls are used for carrying out screenshot on each page, and the layout of each control in the page can be seen.
And S250, marking the position of the operated control on the screenshot.
In the screenshot obtained in S240, the position of the operated control is marked for distinguishing from the control not operated, so that it is convenient to distinguish which controls have been operated and which controls are still to be operated.
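Marking the operated controls can be sketched with a stdlib-only stand-in in which the screenshot is a 2D grid of flags rather than a real bitmap; in practice an imaging library such as Pillow would draw the boxes, but that choice is an assumption not stated in the patent.

```python
def mark_screenshot(width, height, operated_bounds, thickness=2):
    """Draw a hollow rectangle around each operated control's bounds
    (left, top, right, bottom) on a width x height grid of 0/1 flags."""
    img = [[0] * width for _ in range(height)]
    for l, t, r, b in operated_bounds:
        for y in range(t, b):
            for x in range(l, r):
                on_border = (x - l < thickness or r - x <= thickness or
                             y - t < thickness or b - y <= thickness)
                if on_border:
                    img[y][x] = 1
    return img
```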
And S260, displaying the coverage rate of the operated control and the marked screenshot.
And displaying the calculated coverage rate of the operated control and the page screenshot marked with the operated control on a server or other test platforms for UI test, so that testers can visually see the progress and the result of the UI test, and the testers are directly guided to optimize the UI automatic test.
Exemplarily, as shown in fig. 2b, a marked page screenshot is obtained from the positions of all controls in a page and the positions of the operated controls. In the screenshot, controls a to g are all the controls in the page; controls a, c and e, in dashed boxes, are operated controls, and controls b, d, f and g, in solid boxes, are unoperated controls. From the screenshot, testers can intuitively see the state of the UI test, including which controls have not yet been operated, and can optimize the UI test accordingly. It should be noted that the embodiment of the present invention does not limit the way operated controls are marked: a dashed box, a thickened border or a different border color may be used; the above is merely an example.
In the technical scheme of this embodiment, a screenshot of each page is taken according to the positions of all controls in the page and the positions of the operated controls, and the positions of the operated controls are marked on the screenshot, so that testers can intuitively see the progress and result of the UI test. This directly guides testers to optimize the automated UI test and improves UI testing efficiency.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a user interface testing apparatus in a third embodiment of the present invention. As shown in fig. 3, the user interface test apparatus includes:
and a traversing module 310 for traversing each page of the user interface to be tested.
The information obtaining module 320 is configured to obtain attribute information of all the controls and the operated controls in each page in the traversal process.
And the calculating module 330 is configured to calculate the coverage rate of the operated control of each page according to the attribute information.
Further, the acquired attribute information includes the number of controls; correspondingly, the calculating module is specifically configured to use a ratio of the number of the operated controls of each page to the total number of the controls as the coverage rate of the operated controls of each page.
Further, the acquired attribute information also comprises the position of the control on the page; correspondingly, the user interface testing device further comprises:
the screenshot module 340 is configured to perform screenshot on each page according to the positions of all the controls in each page and the positions of the operated controls;
a marking module 350, configured to mark, on the screenshot, a position of the operated control.
Preferably, the user interface testing apparatus further comprises:
and the display module 360 is used for displaying the coverage rate of the operated control and the marked screenshot.
Further, the information acquiring module is specifically configured to:
and acquiring attribute information of all the controls and operated controls in each page in the traversal process by using the dump mode.
In the technical scheme of this embodiment, the traversal module traverses each page of the user interface to be tested; the information acquisition module acquires attribute information of all controls and of the operated controls in each page during the traversal; the calculating module calculates the coverage rate of the operated controls of each page from the acquired attribute information; and the coverage rate and the page screenshots marked with the operated controls are then displayed. Using the operated-control coverage rate effectively quantifies the effect of automated UI testing, thereby directly guiding testers to optimize the automated UI test and improving testing efficiency.
The user interface testing device provided by the embodiment of the invention can execute the user interface testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 4 is a schematic structural diagram of a server in the fourth embodiment of the present invention. FIG. 4 illustrates a block diagram of an exemplary server 412 suitable for use in implementing embodiments of the present invention. The server 412 shown in fig. 4 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 4, the server 412 is in the form of a general purpose computing device. Components of server 412 may include, but are not limited to: one or more processors or processing units 416, a system memory 428, and a bus 418 that couples the various system components including the system memory 428 and the processing unit 416.
The system memory 428 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 430 and/or cache memory 432. The server 412 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 418 by one or more data media interfaces. Memory 428 can include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 440 having a set (at least one) of program modules 442 may be stored, for instance, in memory 428, such program modules 442 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 442 generally perform the functions and/or methodologies of the described embodiments of the invention.
The server 412 may also communicate with one or more external devices 414 (e.g., keyboard, pointing device, display 424, etc.), with one or more devices that enable a user to interact with the server 412, and/or with any devices (e.g., network card, modem, etc.) that enable the server 412 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 422. Also, server 412 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) through network adapter 420. As shown, network adapter 420 communicates with the other modules of server 412 over bus 418. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in conjunction with the server 412, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 416 executes programs stored in the system memory 428 to perform various functional applications and data processing, such as implementing user interface testing methods provided by embodiments of the present invention.
EXAMPLE five
The fifth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the user interface testing method provided in the fifth embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (8)
1. A user interface testing method, comprising:
traversing each page of the user interface to be tested;
respectively acquiring attribute information of all controls and operated controls in each page in a traversal process, wherein the attribute information comprises the number of the controls and the positions of the controls on the page;
calculating the coverage rate of the operated controls of each page according to the attribute information, wherein the ratio of the number of the operated controls of each page to the total number of the controls is taken as the coverage rate of the operated controls of each page;
screenshot is carried out on each page according to the positions of all the controls in each page and the positions of the operated controls;
and marking the position of the operated control on the screenshot.
2. The user interface testing method of claim 1, further comprising:
and displaying the coverage rate of the operated control and the marked screenshot.
3. The method for testing the user interface according to claim 1, wherein the obtaining of the attribute information of all the controls and the operated controls in each page in the traversal process includes:
and acquiring attribute information of all the controls and operated controls in each page in the traversal process by using the dump mode.
4. A user interface testing apparatus, comprising:
the traversal module is used for traversing each page of the user interface to be tested;
the information acquisition module is used for respectively acquiring attribute information of all controls and operated controls in each page in the traversal process, wherein the attribute information comprises the number of the controls and the positions of the controls on the page;
the calculating module is used for calculating the coverage rate of the operated controls of each page according to the attribute information, and specifically, the ratio of the number of the operated controls of each page to the total number of the controls is used as the coverage rate of the operated controls of each page;
the screenshot module is used for screenshot for each page according to the positions of all the controls in each page and the positions of the operated controls;
and the marking module is used for marking the position of the operated control on the screenshot.
5. The user interface testing device of claim 4, wherein said device further comprises:
and the display module is used for displaying the coverage rate of the operated control and the marked screenshot.
6. The user interface testing device of claim 4, wherein the information acquisition module is specifically configured to:
and acquiring attribute information of all the controls and operated controls in each page in the traversal process by using the dump mode.
7. A server, characterized in that the server comprises:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement a user interface testing method as recited in any of claims 1-3.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the user interface testing method according to any one of claims 1-3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710675850.5A CN107506300B (en) | 2017-08-09 | 2017-08-09 | User interface testing method, device, server and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710675850.5A CN107506300B (en) | 2017-08-09 | 2017-08-09 | User interface testing method, device, server and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107506300A CN107506300A (en) | 2017-12-22 |
CN107506300B true CN107506300B (en) | 2020-10-13 |
Family
ID=60689729
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710675850.5A Active CN107506300B (en) | 2017-08-09 | 2017-08-09 | User interface testing method, device, server and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107506300B (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107197070B (en) * | 2017-06-08 | 2020-03-24 | 杭州友声科技股份有限公司 | Automatic test method for mobile phone application UI breadth traversal based on event sequence |
CN108229485B (en) * | 2018-02-08 | 2022-05-17 | 百度在线网络技术(北京)有限公司 | Method and apparatus for testing user interface |
CN111209183A (en) * | 2018-11-22 | 2020-05-29 | 中国电信股份有限公司 | UI function traversal test method and device |
CN111290940B (en) * | 2018-12-10 | 2023-04-28 | 中国移动通信集团江西有限公司 | Automated testing method, device, equipment and medium for APP |
CN110334014A (en) * | 2019-06-12 | 2019-10-15 | 北京大米科技有限公司 | For user interface automated testing method, system, server and storage medium |
CN112685285B (en) * | 2019-10-18 | 2025-03-28 | 北京奇虎科技有限公司 | User interface test case generation method and device |
CN111694752B (en) * | 2020-07-28 | 2023-09-05 | 中移(杭州)信息技术有限公司 | Application testing method, electronic device and storage medium |
CN111694755B (en) * | 2020-07-31 | 2023-07-18 | 抖音视界有限公司 | Application program testing method and device, electronic equipment and medium |
CN112069070A (en) * | 2020-09-04 | 2020-12-11 | 中国平安人寿保险股份有限公司 | A page detection method, apparatus, server, and computer-readable storage medium |
CN112363918B (en) * | 2020-11-02 | 2024-03-08 | 北京云聚智慧科技有限公司 | User interface AI automatic test method, device, equipment and storage medium |
CN112506782B (en) * | 2020-12-08 | 2024-10-11 | 北京指掌易科技有限公司 | Application program testing method, device, equipment and storage medium |
CN112631933A (en) * | 2020-12-30 | 2021-04-09 | 上海中通吉网络技术有限公司 | Method, device and equipment for page tracking test coverage |
CN112817864B (en) * | 2021-02-23 | 2024-04-16 | 北京字节跳动网络技术有限公司 | Method, device, equipment and medium for generating test file |
CN113076242A (en) * | 2021-02-24 | 2021-07-06 | 西安闻泰电子科技有限公司 | User interface testing method and device, storage medium and electronic equipment |
CN113032264B (en) * | 2021-03-29 | 2024-09-06 | 杭州网易数之帆科技有限公司 | Page view control detection method and device |
CN113190453B (en) * | 2021-05-10 | 2025-03-21 | 北京沃东天骏信息技术有限公司 | User interface testing method, device, server and medium |
CN114218078A (en) * | 2021-12-07 | 2022-03-22 | 中信银行股份有限公司 | UI page testing method, device and equipment and readable storage medium |
CN114780376A (en) * | 2022-03-03 | 2022-07-22 | 浙江吉利控股集团有限公司 | Application function traversal test method, apparatus and storage medium |
CN114610609A (en) * | 2022-03-04 | 2022-06-10 | 中信银行股份有限公司 | Evaluation method, device, equipment and readable storage medium for testing web pages |
CN114968687B (en) * | 2022-06-09 | 2024-07-02 | 腾讯科技(深圳)有限公司 | Traversal test method, apparatus, electronic device, program product, and storage medium |
CN117931652B (en) * | 2024-01-16 | 2024-09-10 | 镁佳(北京)科技有限公司 | Automatic test script generation method and device, computer equipment and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6871327B2 (en) * | 2002-03-04 | 2005-03-22 | Sun Microsystems, Inc. | Method and apparatus for extending coverage of GUI tests |
CN101916225B (en) * | 2010-09-02 | 2012-05-02 | 于秀山 | Graphical user interface software function coverage testing method |
CN105446884A (en) * | 2015-12-16 | 2016-03-30 | 北京奇虎科技有限公司 | Code coverage rate test method and apparatus |
CN105630686B (en) * | 2016-03-24 | 2018-12-18 | 厦门美图移动科技有限公司 | Application traversal test method, device and mobile terminal |
CN106294152B (en) * | 2016-08-09 | 2019-03-12 | 努比亚技术有限公司 | Traversal test device and method for an application user interface |
2017-08-09: Application CN201710675850.5A filed in China (CN); granted as patent CN107506300B (en), status Active
Also Published As
Publication number | Publication date |
---|---|
CN107506300A (en) | 2017-12-22 |
Similar Documents
Publication | Title |
---|---|
CN107506300B (en) | User interface testing method, device, server and storage medium |
CN110059009B (en) | Method and apparatus for testing code files | |
CN109302522B (en) | Test method, test device, computer system, and computer medium | |
CN109992498B (en) | Test case generation method and system and computer system | |
US20130117855A1 (en) | Apparatus for automatically inspecting security of applications and method thereof | |
CN110597704B (en) | Pressure test method, device, server and medium for application program | |
US8752023B2 (en) | System, method and program product for executing a debugger | |
CN106550038B (en) | Data configuration diagnosis system and method of digital control system | |
CN110990411B (en) | Data structure generation method and device, and calling method and device | |
US10721152B2 (en) | Automated analysis and recommendations for highly performant single page web applications | |
US20130179867A1 (en) | Program Code Analysis System | |
CN112199261A (en) | Application program performance analysis method and device and electronic equipment | |
CN113590495B (en) | A method, device, equipment and storage medium for determining test coverage | |
CN114490337A (en) | Commissioning method, commissioning platform, equipment and storage medium | |
US11119892B2 (en) | Method, device and computer-readable storage medium for guiding symbolic execution | |
CN111741046B (en) | Data reporting method, data acquisition method, device, equipment and medium | |
CN110716859A (en) | Method for automatically pushing test cases for modified codes and related device | |
CN110968519A (en) | Game testing method, device, server and storage medium | |
CN117667716A (en) | Page testing method and device and electronic equipment | |
CN112597041B (en) | Cross-branch merging method, system, equipment and storage medium for code coverage rate | |
CN112965910B (en) | Automatic regression testing method and device, electronic equipment and storage medium | |
CN115481025A (en) | Script recording method and device for automatic test, computer equipment and medium | |
CN115080113A (en) | Item code detection method and device, readable storage medium and electronic equipment | |
CN114968687B (en) | Traversal test method, apparatus, electronic device, program product, and storage medium | |
CN113190453B (en) | User interface testing method, device, server and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||