CN112506807B - Automatic test system for interface serving multiple systems
- Publication number
- CN112506807B CN112506807B CN202110165898.8A CN202110165898A CN112506807B CN 112506807 B CN112506807 B CN 112506807B CN 202110165898 A CN202110165898 A CN 202110165898A CN 112506807 B CN112506807 B CN 112506807B
- Authority
- CN
- China
- Prior art keywords
- module
- test
- data
- test case
- execution
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
The invention discloses an automated interface test system serving multiple systems, which comprises a multi-system API request response module, a data format conversion module, a precondition data generation module, a data query and operation module, a data verification module, an execution log recording module, a test case module, a code management module, a specified-condition task execution module, a compiling module, a log parsing and warehousing module, a timed task execution module and a platform log report display module. By executing test cases on a schedule, the system meets the automated interface test and regression test needs of numerous back-end system services. The whole process, from case execution to report generation, is completed automatically, which greatly reduces repetitive work for testers, makes data feedback more timely, and improves the overall efficiency of the testing effort.
Description
Technical Field
The invention relates to the field of computer technology, and in particular to an automated interface test system serving multiple systems.
Background
As network services grow in scale, more and more systems are needed for technical support, so the back end becomes complicated, and every time a program change occurs a great deal of manpower is required to verify whether the change affects the current systems. To better test the many back-end system services, a platform is needed that can execute interface test work automatically and in batches. Such a platform must be able to complete interface tests and regression tests for a large number of background services; provide a convenient way to view and analyze test results; allow test work to be managed and counted; offer a unified technical scheme shared by all systems of the whole team; and fit the constraints of the existing test environment.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an automated interface test system serving multiple systems. By executing test cases on a schedule, the system meets the automated interface test and regression test needs of numerous back-end system services; the whole process, from case execution to report generation, is completed automatically, which greatly reduces repetitive work for testers, makes data feedback more timely, and improves the overall efficiency of the testing effort.
To solve this technical problem, the automated interface test system serving multiple systems comprises a multi-system API request response module, a data format conversion module, a precondition data generation module, a data query and operation module, a data verification module, an execution log recording module, a test case module, a code management module, a specified-condition task execution module, a compiling module, a log parsing and warehousing module, a timed task execution module and a platform log report display module;
the multi-system API request response module issues network requests by sending HTTP messages to obtain the response data of the interface under test, and comprises a sending data structure, a return data structure and a calling interface, wherein the sending and return data structures define the data fields and their types, and the calling interface defines the interface address URL, the calling method, the data formatting mode and the request header information;
the data format conversion module converts a sending data structure into a specified character string, or converts a character string into a specified return data structure;
the data query and operation module queries and manipulates data in the database, returning a data result set by calling the database query component, which serves to verify the correctness of the test data;
the precondition data generation module produces the precondition data required by a test by calling the pre-packaged service functions of the multi-system API request response module and the data query and operation module;
the data verification module uses program assertions to automatically compare whether the expected result matches the actually returned data content;
the execution log recording module automatically records the precondition data of the precondition data generation module, the request and response data of the multi-system API request response module, and the comparison results of the data verification module in xml format and saves them as a log file;
the test case module marks the attributes of test cases with annotations and package names so that cases of different systems and different conditions can be distinguished, and is implemented by writing a test script program that prepares precondition data, sends requests to the system under test and obtains the returned messages, queries the database to verify whether the returned results are correct, and records the case log;
the code management module uploads, maintains and manages code in a unified way through a code management tool;
the specified-condition task execution module searches the test attributes marked by annotations and class names in the test case module against a specified condition, finds the test cases that satisfy the condition, and executes them;
the compiling module completes the compilation of the specified code through a script, and invokes the specified-condition task execution module with a specified condition to execute the matching test cases;
the log parsing and warehousing module obtains the execution data of the test cases by parsing the xml log files recorded by the execution log recording module and stores the data in a database;
the timed task execution module, driven by a timed task, calls the code management module to download the test case script program, calls the compiling module to compile it, and then distributes the build artifact to an execution server;
and the platform log report display module reads the test case execution data stored in the database, processes it and displays it on a page.
Further, before sending a request, the multi-system API request response module's calling interface automatically converts the sending data to the corresponding format according to the data formatting mode, then loads the request header information, sends the request content using the corresponding interface address and calling method, and obtains the returned result; the interface address URL is divided into a domain definition and a path definition, where the domain is read from a configuration file and the path is self-defined by the interface declaration class; the data formatting mode supports several automatic format conversions.
Furthermore, the test case package names of the test case module follow a specific rule, so the corresponding test cases can be looked up in reverse from a package name; each test case carries a self-defined annotation mark, through which the attributes of the test case can be labeled.
Further, the specified-condition task execution module searches for test cases matching a specified condition through the annotations and package-name marks of the test cases: the search function queries all executable test cases under the package name corresponding to the given site under test, filters the list of test cases to finally execute by annotation according to the given search condition, and then executes those test cases.
Further, the database in which the log parsing and warehousing module stores test case execution data consists of test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records; an xml parsing tool parses the log file format and obtains the test interface name and description, the test case name and description, the test case execution time, the input and output information automatically recorded by the multi-system API request response module, and whether the test case's program assertions succeeded; when the test case information is not yet recorded in the test suite and test case tables, the test case is inserted into those tables, a test case execution plan is created, and the test suite execution result, the test case execution result, the detailed execution record and whether the program assertions succeeded are stored.
Further, the platform log report display module queries the database for the test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records, obtains the warehoused test result data of the test cases, processes it and displays it on a query list page.
Since the automated interface test system serving multiple systems adopts the above technical scheme, i.e. the test system comprises a multi-system API request response module, a data format conversion module, a precondition data generation module, a data query and operation module, a data verification module, an execution log recording module, a test case module, a code management module, a specified-condition task execution module, a compiling module, a log parsing and warehousing module, a timed task execution module and a platform log report display module, the system meets the automated interface test and regression test needs of numerous back-end system services by executing test cases on a schedule; the whole process, from case execution to report generation, is completed automatically, which greatly reduces repetitive work for testers, makes data feedback more timely, and improves the overall efficiency of the testing effort.
Drawings
The invention is described in further detail below with reference to the following figures and embodiments:
FIG. 1 is a functional block diagram of the automated interface test system serving multiple systems according to the present invention;
FIG. 2 is a block diagram of the test system;
FIG. 3 is a schematic block diagram of test case execution and report generation in the test system.
Detailed Description
As shown in fig. 1, the automated interface test system serving multiple systems of the present invention includes a multi-system API request response module, a data format conversion module, a precondition data generation module, a data query and operation module, a data verification module, an execution log recording module, a test case module, a code management module, a specified-condition task execution module, a compiling module, a log parsing and warehousing module, a timed task execution module and a platform log report display module;
the multi-system API request response module issues network requests by sending HTTP messages to obtain the response data of the interface under test, and comprises a sending data structure, a return data structure and a calling interface, wherein the sending and return data structures define the data fields and their types, and the calling interface defines the interface address URL, the calling method, the data formatting mode and the request header information;
the data format conversion module converts a sending data structure into a specified character string, or converts a character string into a specified return data structure;
the data query and operation module queries and manipulates data in the database, returning a data result set by calling the database query component, which serves to verify the correctness of the test data;
the precondition data generation module produces the precondition data required by a test by calling the pre-packaged service functions of the multi-system API request response module and the data query and operation module;
the data verification module uses program assertions to automatically compare whether the expected result matches the actually returned data content; the program assertions take five specific forms, including equal, not-equal, null and non-null assertions, and when an assertion completes, the execution log recording module is called to record the assertion result;
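As a minimal illustration of these assertion forms, the following Java sketch compares expected and actual values and hands every outcome to a stand-in for the execution log recording module; all class and method names here are hypothetical, not taken from the patent:

```java
import java.util.Objects;

// Minimal sketch of the data verification module's assertion forms (all names hypothetical).
public final class Verify {
    // Stand-in for the execution log recording module: collects assertion results.
    public interface AssertionLog { void record(String point, boolean passed, String detail); }

    private final AssertionLog log;
    public Verify(AssertionLog log) { this.log = log; }

    public void assertEqual(String point, Object expected, Object actual) {
        log.record(point, Objects.equals(expected, actual), "expected=" + expected + " actual=" + actual);
    }
    public void assertNotEqual(String point, Object expected, Object actual) {
        log.record(point, !Objects.equals(expected, actual), "expected!=" + expected + " actual=" + actual);
    }
    public void assertIsNull(String point, Object actual) {
        log.record(point, actual == null, "actual=" + actual);
    }
    public void assertNotNull(String point, Object actual) {
        log.record(point, actual != null, "actual=" + actual);
    }
}
```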
the execution log recording module automatically records the precondition data of the precondition data generation module, the request and response data of the multi-system API request response module, and the comparison results of the data verification module in xml format and saves them as a log file;
the test case module marks the attributes of test cases with annotations and package names so that cases of different systems and different conditions can be distinguished, and is implemented by writing a test script program that prepares precondition data, sends requests to the system under test and obtains the returned messages, queries the database to verify whether the returned results are correct, and records the case log;
the code management module uploads, maintains and manages code in a unified way through a code management tool;
the specified-condition task execution module searches the test attributes marked by annotations and class names in the test case module against a specified condition, finds the test cases that satisfy the condition, and executes them;
the compiling module completes the compilation of the specified code through a script, and invokes the specified-condition task execution module with a specified condition to execute the matching test cases;
the log parsing and warehousing module obtains the execution data of the test cases by parsing the xml log files recorded by the execution log recording module and stores the data in a database;
the timed task execution module, driven by a timed task, calls the code management module to download the test case script program, calls the compiling module to compile it, and then distributes the build artifact to an execution server;
and the platform log report display module reads the test case execution data stored in the database, processes it and displays it on a page.
Preferably, before the multi-system API request response module sends a request, the calling interface automatically converts the sending data to the corresponding format according to the data formatting mode, then loads the request header information, sends the request content using the corresponding interface address and calling method, and obtains the returned result. The interface address URL is divided into a domain definition and a path definition: the domain is read from a configuration file, and the path is self-defined by the interface declaration class. Automatic data format conversion supports several modes, including PARAM, JSON and String. PARAM calls the data format conversion module to automatically convert the sending data structure into the form format name1=value1&name2=value2, with optional urlEncode and upper/lower-case conversion of the final result; JSON parses the sending data structure and automatically converts it into a JSON structure; String places the incoming data into the outgoing message as-is without processing. The calling method supports GET, POST, DELETE, PUT, FILE and similar methods, where the FILE method saves the returned result to the local hard disk as a file stream. The request header information can specify header content for the request, with support for choosing whether to override existing header content.
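The following Java sketch illustrates such a calling-interface declaration and the PARAM conversion; the class layout and field names are invented for illustration, and only the URL split into domain and path, the method and format declarations, and the name1=value1&name2=value2 serialization with optional urlEncode follow the description above:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a calling-interface declaration (names hypothetical). The domain is read
// from a configuration file; path, method and data format are declared per interface.
class CallInterface {
    enum DataFormat { PARAM, JSON, STRING }

    String domain;          // e.g. from config: "userservice.iapi.ymatou.com"
    String path;            // e.g. "bindMobile"
    String method;          // GET / POST / DELETE / PUT / FILE
    DataFormat format;
    Map<String, String> headers = new LinkedHashMap<>();

    String url() { return "http://" + domain + "/" + path; }

    // PARAM mode: convert the sending data into "name1=value1&name2=value2",
    // optionally applying urlEncode to each value.
    static String toParamString(Map<String, String> data, boolean urlEncode)
            throws UnsupportedEncodingException {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : data.entrySet()) {
            if (sb.length() > 0) sb.append('&');
            String v = urlEncode ? URLEncoder.encode(e.getValue(), "UTF-8") : e.getValue();
            sb.append(e.getKey()).append('=').append(v);
        }
        return sb.toString();
    }
}
```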
Preferably, the test case package names of the test case module follow a specific rule, so the corresponding test cases can be looked up in reverse from a package name; each test case carries a self-defined annotation mark, through which the attributes of the test case can be labeled. The specific package-name rule is the reversed string of the owning system's domain name: for example, if the domain name of the wharf user service is userservice.iapi.ymatou.com, its package name is com.ymatou.iapi.userservice.
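A sketch of what the self-defined annotation and the reverse-domain package rule could look like in Java follows; the annotation name and its attributes are assumptions, while the domain example comes from the description:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Sketch (hypothetical names): a self-defined annotation for marking case attributes,
// plus the reverse-domain package rule described above.
public class CaseMarking {

    // RUNTIME retention so the search module can read the mark via reflection.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface TestCaseMark {
        String priority() default "P2";   // e.g. "P1" for high-priority cases
        String description() default "";
    }

    // userservice.iapi.ymatou.com -> com.ymatou.iapi.userservice
    public static String packageFromDomain(String domain) {
        String[] parts = domain.split("\\.");
        StringBuilder pkg = new StringBuilder();
        for (int i = parts.length - 1; i >= 0; i--) {
            if (pkg.length() > 0) pkg.append('.');
            pkg.append(parts[i]);
        }
        return pkg.toString();
    }
}
```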
Preferably, the specified-condition task execution module searches for test cases matching a specified condition through the annotations and package-name marks of the test cases: the search function queries all executable test cases under the package name corresponding to the given site under test, filters the list of test cases to finally execute by annotation according to the given search condition, and then executes those test cases.
The search function derives the directory holding a system's test cases from the incoming system name and the package-name rule; for example, if the domain name of the wharf user service is userservice.iapi.ymatou.com, the system's cases are in the com.ymatou.iapi.userservice directory. After obtaining the directory, it traverses all files under the corresponding compiled build output, checks whether each file is a class file, and then obtains the method names and annotations of all the class files through reflection; it judges from the annotations whether a method is a test case and matches the incoming search condition, filters out the matching test cases, and adds their class names and method names to a queue to be tested; all queued entries are then added to a thread pool that executes the test tasks, and the test cases are executed in turn.
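The sketch below illustrates this discovery step, reusing the hypothetical TestCaseMark annotation from the earlier sketch; scanning a directory of class files and filtering methods by annotation via reflection mirrors the flow just described, while the directory layout and the priority attribute are assumptions:

```java
import java.io.File;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Sketch of the search step (names hypothetical): scan the compiled output directory
// for the package derived from the system name, then keep methods whose annotation
// matches the search condition.
public class CaseFinder {

    public static List<String> findCases(File classesRoot, String pkg, String wantedPriority)
            throws Exception {
        List<String> queue = new ArrayList<>();
        File dir = new File(classesRoot, pkg.replace('.', File.separatorChar));
        File[] files = dir.listFiles();
        if (files == null) return queue;
        for (File f : files) {
            if (!f.getName().endsWith(".class")) continue;   // only class files
            String className = pkg + "." + f.getName().replace(".class", "");
            Class<?> cls = Class.forName(className);
            for (Method m : cls.getMethods()) {
                CaseMarking.TestCaseMark mark = m.getAnnotation(CaseMarking.TestCaseMark.class);
                // a method is a test case iff it carries the mark and matches the condition
                if (mark != null && mark.priority().equals(wantedPriority)) {
                    queue.add(className + "#" + m.getName()); // class name + method name
                }
            }
        }
        return queue;
    }
}
```

The queued class/method entries would then be submitted to the thread pool that runs the test tasks in sequence.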
Preferably, the database in which the log parsing and warehousing module stores test case execution data consists of test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records; an xml parsing tool parses the log file format and obtains the test interface name and description, the test case name and description, the test case execution time, the input and output information automatically recorded by the multi-system API request response module, and whether the test case's program assertions succeeded; when the test case information is not yet recorded in the test suite and test case tables, the test case is inserted into those tables, a test case execution plan is created, and the test suite execution result, the test case execution result, the detailed execution record and whether the program assertions succeeded are stored.
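Since the patent only states that the logs are xml and which fields are extracted, the following sketch uses the JDK's built-in DOM parser with assumed element and attribute names to show the parsing step:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.File;

// Minimal sketch of parsing one execution log file; "testcase" and its attributes
// are assumed names, standing in for the fields listed above.
public class LogParser {
    public static void parse(File xmlLog) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(xmlLog);
        NodeList cases = doc.getElementsByTagName("testcase"); // assumed tag name
        for (int i = 0; i < cases.getLength(); i++) {
            Element c = (Element) cases.item(i);
            String name = c.getAttribute("name");
            String suite = c.getAttribute("suite");
            String time = c.getAttribute("time");
            boolean passed = "true".equals(c.getAttribute("passed"));
            // ... insert into test suite / test case / execution result tables
            System.out.printf("%s.%s time=%s passed=%s%n", suite, name, time, passed);
        }
    }
}
```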
Preferably, the platform log report display module queries the database for the test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records, obtains the warehoused test result data of the test cases, processes it and displays it on the query list page. The home page of the query list is the list of test case execution plans, showing each plan's information and its numbers of successful and failed test cases; clicking through opens a second-level page with the test suite execution results. The test suite execution result page shows the success and failure counts of the current test suite; clicking through opens a third-level page with the test case execution results. The test case execution result page shows whether each specific case succeeded or failed; clicking through opens the detailed execution record page. The detailed execution log page shows the detailed execution log information and the pass/fail records of the program assertions, where the specific assertion content can be viewed.
In addition, in the test system, the data format conversion module converts a data structure into a specified character string or a character string into a specified data structure, covering four conversions: Bean to form format, form format to Bean, Bean to JSON, and JSON to Bean. Bean-to-form uses the reflection mechanism to traverse the Bean's information, extract its keys and values, and convert them into the form format name1=value1&name2=value2; form-to-Bean parses out the keys and values with a regular expression and assigns them to the corresponding Bean fields using reflection; Bean-to-JSON and JSON-to-Bean implement the format conversion between the sending data structure and the JSON structure using the third-party component GSON.
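A condensed Java sketch of these conversions follows: Bean-to-form walks public fields by reflection, and both JSON directions delegate to GSON, the third-party component named above. The BindMobileBean fields echo the worked example later in the description; everything else is an assumption for illustration:

```java
import com.google.gson.Gson;
import java.lang.reflect.Field;
import java.util.StringJoiner;

// Sketch of two of the conversion directions described above.
public class Converters {
    public static class BindMobileBean {
        public String userid;
        public String mobile;
    }

    // Bean -> "name1=value1&name2=value2" form format via reflection
    public static String toForm(Object bean) throws IllegalAccessException {
        StringJoiner sj = new StringJoiner("&");
        for (Field f : bean.getClass().getFields()) {
            sj.add(f.getName() + "=" + f.get(bean));
        }
        return sj.toString();
    }

    public static void main(String[] args) throws Exception {
        BindMobileBean bean = new BindMobileBean();
        bean.userid = "10086";
        bean.mobile = "13800000000";
        System.out.println(toForm(bean));   // userid=10086&mobile=13800000000

        Gson gson = new Gson();
        String json = gson.toJson(bean);    // Bean -> JSON
        System.out.println(json);           // {"userid":"10086","mobile":"13800000000"}
        BindMobileBean back = gson.fromJson(json, BindMobileBean.class); // JSON -> Bean
        System.out.println(back.mobile);
    }
}
```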
The data query and operation module encapsulates various database drivers, so data queries and operations can be performed against multiple database types. Before use, general create, delete, update and query statements need to be written in advance; the corresponding data operations are then completed by passing in the query conditions.
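As an illustration, a pre-written parameterized query might be wrapped as below with JDBC; the table name, column names and connection details are assumptions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Sketch of a pre-written, parameterized query run by the data query and operation
// module (schema and JDBC URL are hypothetical).
public class UserDao {
    private static final String QUERY_MOBILE =
            "SELECT mobile FROM user_info WHERE userid = ?"; // pre-written statement

    public static String queryMobile(String jdbcUrl, String user, String pass, String userid)
            throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, pass);
             PreparedStatement ps = conn.prepareStatement(QUERY_MOBILE)) {
            ps.setString(1, userid);                 // fill in the input condition
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("mobile") : null;
            }
        }
    }
}
```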
Through ant configuration, the compiling module cleans old build artifacts, sets the code compilation mode and source code path, specifies the output path and the content and format of the final artifact, and then runs the configuration to complete the compilation process.
The timed task execution module invokes the specified-condition task execution module with a specified condition to run the test tasks, then calls the log parsing and warehousing module to store the execution results in the database. A timed task named after the system under test is created, and a timed trigger is configured to periodically call the code management module to download the code and then call the compiling module to compile it; once compilation completes, the build artifact is distributed to the execution server, the specified-condition task execution module is invoked with the task name and preset parameters to run the test tasks, and after the tasks finish the log parsing and warehousing module is called to store the execution results in the database.
As shown in fig. 2, the several test developers on the test system upload code to the code management server through the code management component; the timed task execution module pulls the code, compiles the build artifact on the compiling server and pushes it to the execution server; after the execution server runs the test program, the log files are parsed and the results written into the database; and the data display server reads the database and publishes a site for Internet users to view the reports.
As shown in fig. 3, the task execution flow of the test system includes the following steps:
and S1, the test developer writes the test case script program locally and uploads the test case script program to the code server. The test case comprises a preposed data generation module, a multi-system API request response module and a data query and operation module, wherein the data verification module verifies whether the result is correct and calls an execution log recording module to record the execution log. The packet name of the test case uses a specific rule, and the case of the corresponding test system can be reversely searched through the packet name. The test case method comprises a self-defined annotation mark, and the property of the test case can be marked by the self-defined annotation. Wherein the specific package name rule refers to the reverse character string of the domain name of the system. For example: currently, an interface of a wharf user service for binding a user mobile phone number is to be tested, the interface address is http:// user service. iapi.ymatou.com/bindMobile, the data format is json, and parameters include userid and mobile, then: firstly, creating a catalog with a package name com.ymatou.iapi.userservice, and storing a subsequently created data structure, an interface call and a test case in the catalog; thirdly, creating a calling interface named as BindMobileCall and declaring the url of the calling interface as user service, initial, ymatou, com, the path as bindMobile, the method as POST and the data structure as JSON; creating a test suite named BindMobileTest, and compiling a test method BindMobileTestCase1 to bind the mobile phone successfully; the test suite requires the use of a description and identification of the log declaration suite; the test method needs to add annotation statement that the test method is a P1 priority test case, and the test method needs to state the description of the test method; and fifthly, calling the precondition to create a user return userid, setting the userid and the user-defined mobile to the BindMobileBeam, transmitting the BindMobileBeam into the BindMobileCall, automatically converting parameters of the BindMobileCall into json character strings to be transmitted, returning a message result, calling a data query and operation module to query a mobile field in the database, and verifying whether the database query result is consistent with the transmitted mobile field by using a data verification module.
S2, the timed task fires on schedule according to its crontab entry; when triggered, it pulls the latest code from the code server and then calls the ant script to compile it.
S3, after compilation completes, all files under the build artifact package are transferred remotely to a specified directory on the execution server using sftp, and the task is started over ssh with the domain name and priority as parameters. After execution finishes, the log files under the relative path are parsed. For example: given the parameter domain name userservice.iapi.ymatou.com and priority P1, all files in the artifact directory com.ymatou.iapi.userservice are scanned, the class, method and annotation information of the cases is obtained, and all test methods are retrieved and checked for the P1 annotation; once all matching cases are found, the execution method is called to run them, and after the cases execute an xml log file is produced that records the requests and responses of the request response module, the comparison results of the result verification module and the custom log entries; these files are then parsed to obtain the result set that needs to be inserted into the database.
S4, judge whether the obtained test case log is being inserted into the database for the first time; if not, jump to S5; if it is the first time, initialize the test suite and test case tables: read the name and description of the test suite corresponding to the site and insert them into the test suite table, and read the description information of the test cases under that suite and insert it into the test case table; test suites and test cases are associated through the test suite id. For example: the bind-mobile case is a new test case, so a record is written to the test suite table with the next newly generated test suite id 1, suite name BindMobileTest, suite description "bind mobile phone" and owning system userservice.iapi.ymatou.com, and a record is written to the test case table with the next newly generated test case id 2, case name BindMobileTestCase1, case description "mobile phone bound successfully" and owning test suite id 1.
S5, the operations performed to write the test results into the database are: insert a case execution plan describing which system the current plan executes; insert the test suite execution results describing how each suite executed; insert the test case execution results describing how each case executed; and insert the detailed execution log information describing the specific messages, program assertions and custom log entries. For example: 1. add a record to the case execution plan with plan id 1; 2. insert a test suite execution result with testsuiteid 1, using the plan id as the associated field; 3. insert the test case execution results, using testcaseid and testsuiteid as the associated fields; 4. insert the detailed execution records, using testcaseid as the associated field, and if a record inserted here shows a verification point marked as failed, update the corresponding test case execution result to the failed state; 5. after the data insertion completes, count the test case failures and update the success and failure counts of the test suite execution results; 6. after the suite counts are updated, update the total case success and failure counts of the case execution plan.
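Under the assumption of hypothetical table and column names (the patent does not give a schema), the association chain of step S5 could look like this with JDBC, with each insert linked to its parent by a generated id:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of step S5's insert chain: plan -> suite result -> case result -> detail,
// committed as one transaction. All identifiers are assumptions for illustration.
public class ResultWriter {
    public static void write(Connection conn, String system, int suiteId, int caseId,
                             boolean passed, String detailXml) throws Exception {
        conn.setAutoCommit(false);
        try (PreparedStatement plan = conn.prepareStatement(
                "INSERT INTO case_exec_plan(system_name) VALUES (?)",
                Statement.RETURN_GENERATED_KEYS)) {
            plan.setString(1, system);
            plan.executeUpdate();
            ResultSet keys = plan.getGeneratedKeys();
            keys.next();
            long planId = keys.getLong(1);   // associated field for the child rows

            try (PreparedStatement suite = conn.prepareStatement(
                    "INSERT INTO suite_exec_result(plan_id, suite_id) VALUES (?, ?)")) {
                suite.setLong(1, planId);
                suite.setInt(2, suiteId);
                suite.executeUpdate();
            }
            try (PreparedStatement cse = conn.prepareStatement(
                    "INSERT INTO case_exec_result(plan_id, suite_id, case_id, passed) VALUES (?, ?, ?, ?)")) {
                cse.setLong(1, planId);
                cse.setInt(2, suiteId);
                cse.setInt(3, caseId);
                cse.setBoolean(4, passed);
                cse.executeUpdate();
            }
            try (PreparedStatement det = conn.prepareStatement(
                    "INSERT INTO exec_detail(case_id, plan_id, content) VALUES (?, ?, ?)")) {
                det.setInt(1, caseId);
                det.setLong(2, planId);
                det.setString(3, detailXml);
                det.executeUpdate();
            }
            conn.commit();
        } catch (Exception e) {
            conn.rollback();
            throw e;
        }
    }
}
```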
S6, the user views the test report through the platform log report display module, which queries the database tables for the case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records, obtains the warehoused test result data, then processes it and displays it on the query list page. The home page is the list of case execution plans, showing each plan's information and its numbers of successful and failed cases; clicking through opens a second-level page with the test suite execution results; the test suite execution result page shows the success and failure counts of the current suite, and clicking through opens a third-level page with the test case execution results; the test case execution result page shows whether each specific case succeeded or failed, and clicking through opens the detailed execution record page; the detailed execution log page shows the detailed execution log information and the pass/fail records of the program assertions, where the specific assertion content can be viewed.
By executing test cases on a schedule, the test system meets the automated interface test and regression test needs of numerous back-end services, and the whole process, from case execution to report generation, is completed automatically, greatly reducing repetitive work for testers; historical reports can be queried to help development and testing restore a problem scene from the historical logs and locate the problem, improving the overall efficiency of the testing effort.
Claims (6)
1. An automated interface test system serving multiple systems, comprising: a multi-system API request response module, a data format conversion module, a precondition data generation module, a data query and operation module, a data verification module, an execution log recording module, a test case module, a code management module, a specified-condition task execution module, a compiling module, a log parsing and warehousing module, a timed task execution module and a platform log report display module;
the multi-system API request response module issues network requests by sending HTTP messages to obtain the response data of the interface under test, and comprises a sending data structure, a return data structure and a calling interface, wherein the sending and return data structures define the data fields and their types, and the calling interface defines the interface address URL, the calling method, the data formatting mode and the request header information;
the data format conversion module converts a sending data structure into a specified character string, or converts a character string into a specified return data structure;
the data query and operation module queries and manipulates data in the database, returning a data result set by calling the database query component, which serves to verify the correctness of the test data;
the precondition data generation module produces the precondition data required by a test by calling the pre-packaged service functions of the multi-system API request response module and the data query and operation module;
the data verification module uses program assertions to automatically compare whether the expected result matches the actually returned data content;
the execution log recording module automatically records the precondition data of the precondition data generation module, the request and response data of the multi-system API request response module, and the comparison results of the data verification module in xml format and saves them as a log file;
the test case module marks the attributes of test cases with annotations and package names so that cases of different systems and different conditions can be distinguished, and is implemented by writing a test script program that prepares precondition data, sends requests to the system under test and obtains the returned messages, queries the database to verify whether the returned results are correct, and records the case log;
the code management module uploads, maintains and manages the script program of the test case module in a unified way through a code management tool;
the specified-condition task execution module searches the test attributes marked by annotations and class names in the test case module against a specified condition, finds the test cases that satisfy the condition, and executes them;
the compiling module completes the compilation of the specified test case script program through a script, and invokes the specified-condition task execution module with a specified condition to execute the matching test cases;
the log parsing and warehousing module obtains the execution data of the test cases by parsing the xml log files recorded by the execution log recording module and stores the data in a database;
the timed task execution module, driven by a timed task, calls the code management module to download the test case script program, calls the compiling module to compile it, and then distributes the build artifact to an execution server;
and the platform log report display module reads the test case execution data stored in the database, processes it and displays it on a page.
2. The automated interface test system serving multiple systems of claim 1, wherein: before the multi-system API request response module sends a request, the calling interface automatically converts the sending data to the corresponding format according to the data formatting mode, then loads the request header information, sends the request content using the corresponding interface address and calling method, and obtains the returned result; the interface address URL is divided into a domain definition and a path definition, where the domain is read from a configuration file and the path is self-defined by the interface declaration class; the data formatting mode supports several automatic format conversions.
3. The automated interface test system serving multiple systems of claim 1, wherein: the test case package names of the test case module follow a specific rule, so the corresponding test cases can be looked up in reverse from a package name; each test case carries a self-defined annotation mark, through which the attributes of the test case can be labeled.
4. The automated interface test system serving multiple systems of claim 1, wherein: the specified-condition task execution module searches for test cases matching a specified condition through the annotations and package-name marks of the test cases; the search function queries all executable test cases under the package name corresponding to the given site under test, filters the list of test cases to finally execute by annotation according to the given search condition, and then executes those test cases.
5. The automated interface test system serving multiple systems of claim 1, wherein: the database in which the log parsing and warehousing module stores test case execution data consists of test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records; an xml parsing tool parses the log file format and obtains the test interface name and description, the test case name and description, the test case execution time, the input and output information automatically recorded by the multi-system API request response module, and whether the test case's program assertions succeeded; when the test case information is not yet recorded in the test suite and test case tables, the test case is inserted into those tables, a test case execution plan is created, and the test suite execution result, the test case execution result, the detailed execution record and whether the program assertions succeeded are stored.
6. The automated interface test system serving multiple systems of claim 5, wherein: the platform log report display module queries the database for the test case execution plans, test suites, test cases, test suite execution results, test case execution results and detailed execution records, obtains the warehoused test result data of the test cases, processes it and displays it on a query list page.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110165898.8A CN112506807B (en) | 2021-02-07 | 2021-02-07 | Automatic test system for interface serving multiple systems |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110165898.8A CN112506807B (en) | 2021-02-07 | 2021-02-07 | Automatic test system for interface serving multiple systems |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112506807A CN112506807A (en) | 2021-03-16 |
CN112506807B (en) | 2021-05-11
Family
ID=74952737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110165898.8A Expired - Fee Related CN112506807B (en) | 2021-02-07 | 2021-02-07 | Automatic test system for interface serving multiple systems |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112506807B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113438126B (en) * | 2021-06-17 | 2022-05-10 | 中国科学院计算机网络信息中心 | A distributed online testing system applied to blockchain domain name system |
CN113434404B (en) * | 2021-06-24 | 2024-03-19 | 北京同创永益科技发展有限公司 | Automatic service verification method and device for verifying reliability of disaster recovery system |
CN113852610B (en) * | 2021-09-06 | 2024-03-05 | 招银云创信息技术有限公司 | Message processing method, device, computer equipment and storage medium |
CN114546858B (en) * | 2022-02-23 | 2024-04-30 | 霖久智慧(广东)科技有限公司 | Automatic interface test platform based on property industry |
CN114880239B (en) * | 2022-05-31 | 2024-05-24 | 成都秦川物联网科技股份有限公司 | Data-driven-based interface automation test framework and method |
CN115118792A (en) * | 2022-06-27 | 2022-09-27 | 中国银行股份有限公司 | Message interface format conversion method, device and system |
CN115328758A (en) * | 2022-06-30 | 2022-11-11 | 浙江中控技术股份有限公司 | A performance testing method and system for large data volume of industrial software |
CN115269400A (en) * | 2022-07-22 | 2022-11-01 | 康键信息技术(深圳)有限公司 | Interface test coverage statistical method and device, computer equipment and storage medium |
CN115827480B (en) * | 2022-12-20 | 2023-05-12 | 中船重工奥蓝托无锡软件技术有限公司 | Automatic test method, device and system for ship performance prediction APP |
CN116594912A (en) * | 2023-07-14 | 2023-08-15 | 中航金网(北京)电子商务有限公司 | Data testing method, device, equipment and medium of server |
CN118606067B (en) * | 2024-04-15 | 2025-02-07 | 湖南长银五八消费金融股份有限公司 | A method, device, equipment and storage medium for implementing a retry mechanism |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105373469B (en) * | 2014-08-25 | 2018-09-04 | 广东金赋科技股份有限公司 | A kind of software automated testing system and method based on interface |
CN109298997A (en) * | 2018-08-08 | 2019-02-01 | 平安科技(深圳)有限公司 | Interface test method, system, computer equipment and storage medium |
CN111679982A (en) * | 2020-06-08 | 2020-09-18 | 广东赛百威信息科技有限公司 | Automatic testing method for REST API (representational State transfer) interface software |
CN112069064B (en) * | 2020-08-31 | 2024-02-02 | 北京首汽智行科技有限公司 | API (application program interface) testing method for short message service provider |
- 2021-02-07: Application CN202110165898.8A filed in China; granted as CN112506807B (status: not active, Expired - Fee Related)
Also Published As
Publication number | Publication date |
---|---|
CN112506807A (en) | 2021-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112506807B (en) | Automatic test system for interface serving multiple systems | |
US12130732B2 (en) | System and method for performing automated API tests | |
CN107908541B (en) | Interface testing method and device, computer equipment and storage medium | |
US5903897A (en) | Software documentation release control system | |
US7676816B2 (en) | Systems and methods for integrating services | |
CN109474488A (en) | Interface test method, device and computer equipment | |
US20020188890A1 (en) | System and method for testing an application | |
US8930772B2 (en) | Method and system for implementing a test automation results importer | |
CN108108297A (en) | The method and apparatus of automatic test | |
CN110334326B (en) | A kind of method and system for identifying recipe file and being converted into XML file | |
CN111563041B (en) | Test case on-demand accurate execution method | |
WO1998027488A1 (en) | Software release metric reporting system and method | |
CN112069073B (en) | Test case management method, terminal and storage medium | |
CN113515297B (en) | Version updating method and device, electronic equipment and storage medium | |
EP2557499A1 (en) | A system and method for automatic impact variable analysis and field expansion in mainframe systems | |
CN107003931B (en) | Decoupling test validation from test execution | |
US11436133B2 (en) | Comparable user interface object identifications | |
CN112540924A (en) | Interface automation test method, device, equipment and storage medium | |
CN111240981A (en) | Interface testing method, system and platform | |
CN108073511B (en) | Test code generation method and device | |
CN110674024A (en) | Electronic equipment integration test system and method thereof | |
CN115269387A (en) | Automatic interface testing method and device | |
CN113050925B (en) | Block chain intelligent contract repairing method and device | |
CN115203306A (en) | Data exporting method and device, computer equipment and readable storage medium | |
CN116955140A (en) | SDK test method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20210511 |