

Test method, apparatus, device, medium and program product

Info

Publication number
CN117707982A
CN117707982A (application CN202410003371.9A)
Authority
CN
China
Prior art keywords
test
target
browser
target test
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410003371.9A
Other languages
Chinese (zh)
Inventor
杨光宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CCB Finetech Co Ltd
Original Assignee
CCB Finetech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CCB Finetech Co Ltd filed Critical CCB Finetech Co Ltd
Priority to CN202410003371.9A priority Critical patent/CN117707982A/en
Publication of CN117707982A publication Critical patent/CN117707982A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668: Testing of software
    • G06F11/3672: Test management
    • G06F11/3684: Test management for test design, e.g. generating new test cases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668: Testing of software
    • G06F11/3696: Methods or tools to render software testable
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present application relates to a test method, apparatus, device, medium and program product. The test method comprises the following steps: receiving a control instruction; determining a target test set from at least one pre-integrated test set according to the control instruction, the target test set comprising a target test browser and a target test language; converting at least one preset test data set into at least one test instruction using the target test language; and invoking the corresponding at least one target test browser to perform parallel testing of the at least one test instruction. The method relies on testers writing test scripts in advance rather than on pre-recording test sessions, and the packaged target test set can be invoked directly for automated testing. This avoids the complicated recording workflow of common test methods and improves automated testing efficiency.

Description

Test method, apparatus, device, medium and program product
Technical Field
The present application relates to the field of automated testing technology, and in particular, to a testing method, apparatus, device, medium, and program product.
Background
Web function testing (which focuses on the functionality of a Web application rather than its page layout or styling) is the most important testing dimension for Web programs and is usually performed by verifying each function manually. It can also be automated; the conventional approach is to write scripts or programs that describe the execution steps of the test cases. This requires testers to master several back-end programming languages, tightly couples the test cases to the code, yields poor reusability and maintainability, and results in low automated-testing efficiency.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a testing method, apparatus, device, medium, and program product that can improve automated testing efficiency.
In a first aspect, the present application provides a test method comprising:
receiving a control instruction;
determining a target test set from at least one test set integrated in advance according to the control instruction; the target test set comprises a target test browser and a target test language;
converting at least one preset test data set into at least one test instruction by adopting the target test language;
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction.
In one embodiment, the method for generating the test set includes:
and randomly combining at least one preset test browser and at least one test language to obtain at least one test set.
In one embodiment, the control instruction carries a target test engine tag;
the test browser corresponds to at least one test engine label one by one;
the determining, according to the control instruction, a target test set from at least one test set integrated in advance includes:
according to the target test engine label, matching the target test engine label with a target test browser corresponding to the control instruction;
and taking the test set containing the target test browser as the target test set.
In one embodiment, the setting the test set including the target test browser as the target test set includes:
when a plurality of test sets containing the target test browser exist, prompt information is sent out;
receiving a selection instruction aiming at the prompt information;
and determining the target test set from a plurality of test sets containing the target test browser according to the selection instruction.
In one embodiment, before the invoking the corresponding at least one target test browser performs the parallel test of the at least one test instruction, the method further includes:
determining a target test tool from at least one test tool according to the control instruction;
and the calling the corresponding at least one target test browser to perform parallel test of at least one test instruction comprises the following steps:
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting the target test tool.
In one embodiment, the test tools are in one-to-one correspondence with tool tags;
the control instruction carries a target tool label;
and determining a target test tool from at least one test tool according to the control instruction, wherein the target test tool comprises:
and taking the test tool corresponding to the target tool label as the target test tool.
In one embodiment, the test browser corresponds to the engine driver one by one;
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting the target test tool, wherein the method comprises the following steps:
taking an engine driver corresponding to the target test browser as a target engine driver;
and pulling the target engine driver to execute at least one test instruction in parallel by adopting the target test tool.
In one embodiment, after the invoking the corresponding at least one target test browser performs the parallel test of the at least one test instruction, the method further includes:
and collecting and recording test data generated in the target test browser.
In a second aspect, the present application further provides a test apparatus, including:
the receiving module is used for receiving the test instruction;
the determining module is used for determining a target test set from at least one test set integrated in advance according to the test instruction; the target test set comprises a target test browser and a target test language;
the conversion module is used for converting at least one preset test data set into at least one test instruction by adopting the target test language;
and the execution module is used for calling at least one corresponding target test browser to carry out parallel test of at least one test instruction.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory and a processor, wherein the memory stores a computer program, and the processor executes the computer program to implement the test method according to any one of the embodiments.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the test method described in any of the above embodiments.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the test method according to any of the embodiments described above.
According to the testing method, apparatus, device, medium and program product, at least one browser engine and at least one dynamic programming language are randomly combined and packaged in advance. After the terminal receives a control instruction, it directly invokes a packaged target test set to convert the test data set and run the simulated test. The method relies on testers writing test scripts in advance rather than on pre-recording test sessions, and the packaged target test set can be invoked directly for automated testing, which avoids the complicated recording workflow of common test methods and improves automated testing efficiency.
Drawings
FIG. 1 is a diagram of an application environment for a test method in one embodiment;
FIG. 2 is a flow chart of a test method in one embodiment;
FIG. 3 is a flow chart of a test method in one embodiment;
FIG. 4 is a flow chart of a test method in one embodiment;
FIG. 5 is a flow chart of a test method in one embodiment;
FIG. 6 is a flow chart of a test method in one embodiment;
FIG. 7 is a block diagram of a test apparatus in one embodiment;
fig. 8 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The test method provided by the embodiment of the application can be applied to an application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network.
For example, when the test method is applied to the terminal 102: on receiving a control instruction, the terminal 102 obtains at least one pre-integrated test set from the data storage system of the server 104 according to the control instruction and determines a target test set from the at least one pre-integrated test set, the target test set comprising a target test browser and a target test language; the terminal 102 then converts the preset at least one test data set into at least one test instruction using the target test language, and invokes the corresponding at least one target test browser to perform parallel testing of the at least one test instruction. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, Internet-of-Things device or portable wearable device; the Internet-of-Things device may be a smart speaker, smart television, smart air conditioner, smart in-vehicle device, and so on, and the portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers. The terminal 102 and the server 104 may be connected directly or indirectly through wired or wireless communication, for example through a network connection.
For another example, when the test method is applied to the server 104: on receiving a control instruction, the terminal 102 sends the control instruction to the server 104; the server 104 obtains at least one pre-integrated test set from the data storage system according to the control instruction and determines a target test set from the at least one pre-integrated test set, the target test set comprising a target test browser and a target test language; the server 104 then converts the preset at least one test data set into at least one test instruction using the target test language, and invokes the corresponding at least one target test browser to perform parallel testing of the at least one test instruction. It will be appreciated that the data storage system may be a stand-alone storage device, may be located on the server, or may be located on another terminal.
In one embodiment, a test method is provided. For illustration, the method is described as applied to a terminal; it will be understood that the test method may also be applied to a server, or to a system comprising the terminal and the server and implemented through interaction between the terminal and the server. As shown in fig. 2, the test method includes:
step 202, receiving a control instruction.
The control instruction may be an instruction for starting one or more test processes to test the browser to be tested, and in this embodiment of the present application, the test process is specifically a rendering process (Render process). As an example, the control instructions may be issued by a test-related staff member.
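For illustration only, the following is a minimal sketch of how a control instruction carrying the engine and tool tags described in the embodiments below might be represented in Python; the field names and tag values are assumptions for the example, not defined by this application.

```python
# Minimal sketch of a control instruction; field names and tag values are
# illustrative assumptions, not part of this application.
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    engine_tag: str  # identifies the target test browser engine (see the embodiments below)
    tool_tag: str    # identifies the target test tool (see the embodiments below)

instruction = ControlInstruction(engine_tag="blink", tool_tag="selenium")
print(instruction)
```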
Step 204, determining a target test set from at least one test set integrated in advance according to the control instruction; the target test set includes a target test browser and a target test language.
The test set refers to a set formed by combining a browser engine and a dynamic programming language.
In this embodiment, the browser engine used to drive the application of the browser under test and the dynamic programming language used to parse code during testing are packaged together, so that they can be invoked as a whole during testing, which improves test efficiency.
The browser engine usually refers to the rendering engine of a Web browser, which is responsible for parsing code and presenting the constructed Web pages of the simulated browser to the tester. Common browser engines include: Blink, the rendering engine used by the Google Chrome and Opera browsers, developed from the WebKit engine; WebKit, the rendering engine used by the Safari browser; Gecko, the rendering engine used by the Mozilla Firefox browser; and Trident, the rendering engine used by the Internet Explorer browser. The dynamic programming language may include python, java, javascript, and so on.
As an example, the target test browser may refer to a simulated browser corresponding to the browser that needs to be tested.
As an example, the target test browser may employ the Chromium framework. Chromium is an open-source Web browser engine responsible for parsing various kinds of code and presenting the constructed Web pages of the simulated browser to the tester; during testing, the target test browser may be used to simulate the browser behavior of the browser that needs to be tested.
The target test language may be JavaScript, which is a dynamic programming language for adding interactivity and dynamic effects to web pages. During the test, the target test language may be used to write test scripts, simulate user behavior, and verify that the behavior and functionality of the web page are correct. It can be used with a target test browser to implement automated testing.
Step 206, converting the preset at least one test data set into at least one test instruction by using the target test language.
The test data set may specifically be a preset test script, which is usually written by a test engineer and used to simulate a user's behavior on the browser under test, so as to verify whether the functions and performance of the application of the browser under test meet expectations. A test script generally comprises the following contents. Test cases: the test script contains a series of test cases, each of which is a group of test steps used to verify a specific function or scenario of the application. Test data: the test data contained in the test script is used to simulate user input and operations, including entered text, clicked buttons, selected options, and so on. Assertion statements: the assertion statements contained in the test script are used to verify that the behavior and functionality of the application meet expectations; an assertion statement typically includes a judgment condition and an expected result. Exception handling: the exception-handling code contained in the test script is used to handle possible abnormal conditions such as timeouts and network errors. Logging: the logging code contained in the test script records key information during the test, such as the test start time, test end time and test results. By way of example, test scripts may be written using Robot Framework, Cucumber, TestNG, Electron, and similar frameworks.
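As a hedged illustration of such a test script, the sketch below combines the five elements just listed (test case, test data, assertion, exception handling and logging) using Python with the Selenium library; the URL, element IDs and credentials are hypothetical, and Selenium with ChromeDriver is assumed to be installed.

```python
# Illustrative test script: test case, test data, assertion, exception
# handling and logging. Page URL, element IDs and credentials are hypothetical.
import logging
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.common.exceptions import TimeoutException, WebDriverException

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("login-test")

TEST_DATA = {"username": "alice", "password": "secret"}  # test data (simulated user input)

def test_login():
    """Test case: verify that a valid login reaches the dashboard page."""
    driver = webdriver.Chrome()  # assumes ChromeDriver is available
    try:
        log.info("test started")                              # logging
        driver.get("https://example.com/login")               # hypothetical page
        driver.find_element(By.ID, "username").send_keys(TEST_DATA["username"])
        driver.find_element(By.ID, "password").send_keys(TEST_DATA["password"])
        driver.find_element(By.ID, "submit").click()
        assert "dashboard" in driver.current_url              # assertion statement
        log.info("test passed")
    except (TimeoutException, WebDriverException) as exc:     # exception handling
        log.error("test failed: %s", exc)
        raise
    finally:
        driver.quit()

if __name__ == "__main__":
    test_login()
```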
Test instructions are instructions that can be executed or run and that, after being executed, produce certain test results; for example, a test instruction may be test code.
Step 208, calling at least one corresponding target test browser to perform parallel test of at least one test instruction.
The terminal can execute at least one test instruction in parallel in at least one simulated browser corresponding to the browser to be tested, so that the test efficiency of the automatic test is improved.
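A minimal sketch of such parallel execution follows, assuming Selenium-driven browser instances dispatched through a thread pool; the specific instructions (simple page checks) and the one-browser-per-instruction strategy are illustrative assumptions rather than the method of this application.

```python
# Sketch of executing several test instructions in parallel, each in its own
# simulated browser instance; instructions and strategy are illustrative.
from concurrent.futures import ThreadPoolExecutor
from selenium import webdriver

def run_instruction(url: str) -> str:
    """Execute one test instruction in a dedicated browser instance."""
    driver = webdriver.Chrome()  # one simulated browser per instruction
    try:
        driver.get(url)
        return f"{url}: {driver.title}"
    finally:
        driver.quit()

test_instructions = ["https://example.com/a", "https://example.com/b"]  # hypothetical

with ThreadPoolExecutor(max_workers=len(test_instructions)) as pool:
    for result in pool.map(run_instruction, test_instructions):
        print(result)
```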
According to the test method, at least one browser engine and at least one dynamic programming language can be randomly combined and packaged in advance. After the terminal receives a control instruction, it directly invokes a packaged target test set to convert the test data set and run the simulated test. The method relies on testers writing test scripts in advance rather than on pre-recording test sessions, and the packaged target test set can be invoked directly for automated testing, which avoids the complicated recording workflow of common test methods and improves automated testing efficiency.
In some alternative embodiments, the method for generating the test set includes:
and randomly combining at least one preset test browser and at least one test language to obtain at least one test set.
The test browser refers to a simulated browser corresponding to a browser that can be tested.
The test language may refer to a dynamic programming language such as python, java, javascript.
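A sketch of this generation step follows, under the assumption that a test set is simply a (browser, language) pair; the browser and language names are the examples mentioned in this description, and the dictionary structure is an assumption for illustration.

```python
# Sketch of generating test sets by combining preset test browsers (engines)
# with test languages; names are taken from the examples in the description.
from itertools import product

test_browsers = ["Blink", "WebKit", "Gecko", "Trident"]
test_languages = ["python", "java", "javascript"]

# Each (browser, language) pair is packaged as one test set.
test_sets = [{"browser": b, "language": l} for b, l in product(test_browsers, test_languages)]
print(len(test_sets), "test sets, e.g.", test_sets[0])
```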
As shown in fig. 3, in some alternative embodiments, the control instructions carry a target test engine tag;
the test browser corresponds to at least one test engine label one by one;
step 204 includes:
step 2042, matching the target test engine label with a target test browser corresponding to the control instruction;
step 2044, using the test set including the target test browser as the target test set.
A test engine label may be at least one of letters, characters or numbers, and is used to uniquely identify the simulated browser corresponding to a browser that can be tested; in this embodiment, a one-to-one mapping between test engine labels and a plurality of browser engines is pre-stored in the server.
The target test engine label refers to the browser engine corresponding to the browser under test indicated by the control instruction. Through the test engine label, the terminal can match the corresponding target browser engine, and the matched target browser engine can then be used to drive the application of the browser under test.
After determining the target browser engine corresponding to the control instruction, the terminal can take the test set containing the target browser engine as the target test set.
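The matching step might look like the following sketch; the tag values and the mapping between engine labels and browsers are assumptions for illustration.

```python
# Sketch of matching the target test engine label to a target test browser and
# filtering the packaged test sets; tag values and mapping are assumptions.
ENGINE_LABEL_TO_BROWSER = {"1": "Blink", "2": "WebKit", "3": "Gecko"}  # one-to-one mapping

def select_candidate_sets(engine_label: str, test_sets: list) -> list:
    """Return every packaged test set whose browser matches the labelled engine."""
    target_browser = ENGINE_LABEL_TO_BROWSER[engine_label]
    return [s for s in test_sets if s["browser"] == target_browser]
```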
As shown in fig. 4, in some alternative embodiments, step 2044 comprises:
step 20442, when there are multiple test sets including the target test browser, sending out prompt information;
step 20444, receiving a selection instruction aiming at prompt information;
step 20446, determining a target test set from a plurality of test sets including the target test browser according to the selection instruction.
When no browser engine in any test set matches the target browser engine, the terminal may conclude that the current control instruction cannot invoke any packaged browser engine and dynamic programming language, and the terminal may issue an alarm. When two or more browser engines match the target browser engine, there may be several applicable packaged test sets, so the terminal generates prompt information asking the tester to make a manual selection; for example, while issuing the prompt, the terminal may also display all the test sets containing the target test browser through its human-machine interaction interface.
The selection instruction may be an instruction for selecting one test set from the plurality of test sets containing the target test browser as the target test set; for example, it may be issued when the tester clicks one of the displayed test sets containing the target test browser on the terminal's human-machine interaction interface.
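A sketch of this prompt-and-select branch follows; a simple console prompt stands in for the terminal's human-machine interaction interface, which is an assumption for illustration.

```python
# Sketch of the prompt-and-select behaviour described above; the console
# prompt is a stand-in for the human-machine interaction interface.
def resolve_target_set(candidates: list) -> dict:
    if not candidates:
        raise RuntimeError("no packaged test set matches the target browser engine")  # alarm case
    if len(candidates) == 1:
        return candidates[0]
    for i, s in enumerate(candidates):                   # prompt information
        print(f"[{i}] browser={s['browser']}, language={s['language']}")
    choice = int(input("select target test set: "))      # selection instruction from the tester
    return candidates[choice]
```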
In some alternative embodiments, as shown in fig. 5, prior to step 208, the method further comprises:
step 207, determining a target test tool from at least one test tool according to the control instruction;
step 208 includes:
and step 208a, calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting a target test tool.
The target test tool is used to test Web applications: it can simulate a user's operations in the browser, such as clicking, entering text and submitting forms, and verify whether the behavior and functions of the browser's Web pages are correct. During testing, the target test tool may be used together with a browser engine to execute test scripts. The target test tool may employ, for example, the Selenium framework. Besides the Selenium framework, the target test tool may also employ other types of test tools, for example the JMeter framework for performance testing, the LoadRunner framework for load testing, or the Burp Suite framework for security testing.
In some alternative embodiments, the test tools are in one-to-one correspondence with the tool tags;
the control instruction carries a target tool label;
step 207 comprises: and taking the test tool corresponding to the target tool label as a target test tool.
The test tool refers to software or a program for performing various types of tests such as an automation test, a performance test, a load test, a security test, and the like.
A tool label may be at least one of letters, characters or numbers, and is used to uniquely identify a test tool that performs simulated operations in the simulated browser corresponding to the browser under test; in this embodiment, a one-to-one mapping between tool labels and a plurality of test tools is pre-stored in the server.
The target tool label refers to the test tool indicated by the control instruction. Through the tool label, the terminal can match the corresponding target test tool, which can then be used to perform the simulated operations in the target test browser.
As shown in fig. 6, in some alternative embodiments, the test browser is in one-to-one correspondence with the engine driver;
step 208a includes:
step 2082, taking an engine driver corresponding to the target test browser as a target engine driver;
step 2084, using the target test tool, pulling the target engine driver to execute at least one test instruction in parallel.
An engine driver is a software driver that connects a browser engine and a test tool; it converts the content of a test script into instructions that the browser engine can execute. The engine driver may be, for example, ChromeDriver, GeckoDriver or SafariDriver.
In this embodiment, after determining the target test browser, the terminal may use the target test tool to invoke and drive the engine driver corresponding to the target test browser. Each test instruction is converted by that engine driver into instructions the browser engine can execute, thereby realizing the test in the simulated browser.
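A sketch of the one-to-one browser/engine-driver correspondence and of starting the matched driver through the test tool (Selenium here) follows; the mapping keys are assumptions, and the matching driver binary must be available for the corresponding entry to work.

```python
# Sketch of pulling the engine driver that corresponds to the target test
# browser; requires ChromeDriver, GeckoDriver or SafariDriver as appropriate.
from selenium import webdriver

ENGINE_DRIVERS = {
    "Blink": webdriver.Chrome,    # driven through ChromeDriver
    "Gecko": webdriver.Firefox,   # driven through GeckoDriver
    "WebKit": webdriver.Safari,   # driven through SafariDriver
}

def launch_target_driver(target_browser: str):
    """Instantiate the engine driver corresponding to the target test browser."""
    return ENGINE_DRIVERS[target_browser]()
```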
After step 208, the terminal can also collect and record the test data generated in the target test browser; for example, the terminal can display the generated test data through its human-machine interaction interface so that the tester can review the test results.
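A sketch of collecting and recording the generated test data follows; the result fields and the report file name are assumptions for illustration.

```python
# Sketch of recording test data generated in the target test browser;
# field names and the report file are illustrative assumptions.
import csv
import time

def record_results(results: list, path: str = "test_report.csv") -> None:
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["case", "status", "recorded_at"])
        writer.writeheader()
        for r in results:
            writer.writerow({**r, "recorded_at": time.strftime("%Y-%m-%d %H:%M:%S")})

record_results([{"case": "login", "status": "passed"}])  # the report can then be shown to the tester
```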
According to the test method, at least one browser engine and at least one dynamic programming language can be randomly combined and packaged in advance. After the terminal receives a control instruction, it directly invokes a packaged target test set to convert the test data set and run the simulated test. The method relies on testers writing test scripts in advance rather than on pre-recording test sessions, and the packaged target test set can be invoked directly for automated testing, which avoids the complicated recording workflow of common test methods and improves automated testing efficiency.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily executed in the order indicated. Unless explicitly stated herein, the steps are not strictly limited to that order and may be executed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include several sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; nor do these sub-steps or stages have to be performed sequentially, and they may be performed in turn or alternately with at least some of the other steps, sub-steps or stages.
Based on the same inventive concept, an embodiment of the present application further provides a test apparatus for implementing the test method described above. The implementation of the solution provided by the apparatus is similar to that described for the method above, so for the specific limitations in the one or more test apparatus embodiments provided below, reference may be made to the limitations of the test method above, which are not repeated here.
In one embodiment, as shown in fig. 7, a test apparatus 700 is provided, including a receiving module 702, a determining module 704, a conversion module 706 and an execution module 708:
the receiving module 702 is configured to receive a test instruction;
the determining module 704 is configured to determine, according to the test instruction, a target test set from at least one test set that is integrated in advance; the target test set comprises a target test browser and a target test language;
the conversion module 706 is configured to convert at least one preset test data set into at least one test instruction by using a target test language;
the execution module 708 is configured to invoke the corresponding at least one target test browser to perform a parallel test of the at least one test instruction.
In some alternative embodiments, the method for generating the test set includes:
and randomly combining at least one preset test browser and at least one test language to obtain at least one test set.
In some alternative embodiments, the control instructions carry a target test engine tag;
the test browser corresponds to at least one test engine label one by one;
the determination module 704 is further configured to:
matching the target test engine label with a target test browser corresponding to the control instruction;
and taking the test set containing the target test browser as a target test set.
In some alternative embodiments, the determination module 704 is further configured to:
when a plurality of test sets containing target test browsers exist, prompt information is sent out;
receiving a selection instruction aiming at prompt information;
and determining a target test set from a plurality of test sets containing target test browsers according to the selection instruction.
In some alternative embodiments, the execution module 708 is further configured to:
determining a target test tool from at least one test tool according to the control instruction;
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting a target test tool.
In some alternative embodiments, the test tools are in one-to-one correspondence with the tool tags;
the control instruction carries a target tool label;
the execution module 708 is further configured to:
and taking the test tool corresponding to the target tool label as a target test tool.
In some alternative embodiments, the test browser is in one-to-one correspondence with the engine driver;
the execution module 708 is further configured to:
taking an engine driver corresponding to the target test browser as a target engine driver;
and pulling the target engine driver to execute at least one test instruction in parallel by adopting the target test tool.
In some alternative embodiments, the execution module 708 is further configured to:
and collecting and recording test data generated in the target test browser.
The various modules in the test apparatus described above may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal whose internal structure may be as shown in fig. 8. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit and an input device. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used to exchange information between the processor and external devices. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode may be implemented through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by the processor, implements a test method. The display unit of the computer device is used to form a visual picture and may be a display screen, a projection device or a virtual-reality imaging device; the display screen may be a liquid crystal display or an electronic ink display. The input device of the computer device may be a touch layer covering the display screen, a key, a track ball or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the test method in the above embodiments.
In one embodiment, a computer program product is provided, comprising a computer program that, when executed by a processor, implements the test method in the above embodiments.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the flows of the embodiments of the methods described above. Any reference to memory, database or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric memory (Ferroelectric Random Access Memory, FRAM), phase change memory (Phase Change Memory, PCM), graphene memory and the like. The volatile memory may include random access memory (Random Access Memory, RAM), external cache memory and the like. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM). The databases referred to in the embodiments provided herein may include at least one of relational and non-relational databases; non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum-computing-based data processing logic units, and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to fall within the scope of this specification.
The above embodiments represent only a few implementations of the present application, and although they are described in considerable detail, they are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art may make various modifications and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Accordingly, the protection scope of the present application shall be subject to the appended claims.

Claims (11)

1. A method of testing, comprising:
receiving a control instruction;
determining a target test set from at least one test set integrated in advance according to the control instruction; the target test set comprises a target test browser and a target test language; the control instruction carries a target test engine label; the test browser corresponds to at least one test engine label one by one;
the method for generating the test set comprises the following steps: randomly combining at least one preset test browser and at least one test language to obtain at least one test set;
converting at least one preset test data set into at least one test instruction by adopting the target test language;
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction.
2. The method of claim 1, wherein said determining a target test set from at least one test set pre-integrated according to said control instructions comprises:
according to the target test engine label, matching the target test engine label with a target test browser corresponding to the control instruction;
and taking the test set containing the target test browser as the target test set.
3. The method of claim 2, wherein the taking the test set containing the target test browser as the target test set comprises:
when a plurality of test sets containing the target test browser exist, prompt information is sent out;
receiving a selection instruction aiming at the prompt information;
and determining the target test set from a plurality of test sets containing the target test browser according to the selection instruction.
4. The method of claim 1, wherein before the invoking the corresponding at least one of the target test browsers performs the parallel testing of at least one of the test instructions, further comprising:
determining a target test tool from at least one test tool according to the control instruction;
and the calling the corresponding at least one target test browser to perform parallel test of at least one test instruction comprises the following steps:
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting the target test tool.
5. The method of claim 4, wherein the test tools are in one-to-one correspondence with tool tags;
the control instruction carries a target tool label;
and determining a target test tool from at least one test tool according to the control instruction, wherein the target test tool comprises:
and taking the test tool corresponding to the target tool label as the target test tool.
6. The method of claim 4, wherein the test browser corresponds one-to-one to an engine driver;
and calling at least one corresponding target test browser to perform parallel test of at least one test instruction by adopting the target test tool, wherein the method comprises the following steps:
taking an engine driver corresponding to the target test browser as a target engine driver;
and pulling the target engine driver to execute at least one test instruction in parallel by adopting the target test tool.
7. The method of claim 1, wherein after the invoking the corresponding at least one of the target test browsers performs the parallel testing of at least one of the test instructions, further comprising:
and collecting and recording test data generated in the target test browser.
8. A test device, comprising:
the receiving module is used for receiving the test instruction;
the determining module is used for determining a target test set from at least one test set integrated in advance according to the test instruction; the target test set comprises a target test browser and a target test language; the control instruction carries a target test engine label; the test browser corresponds to at least one test engine label one by one; the determining module is further used for randomly combining at least one preset test browser and at least one preset test language to obtain at least one test set;
the conversion module is used for converting at least one preset test data set into at least one test instruction by adopting the target test language;
and the execution module is used for calling at least one corresponding target test browser to carry out parallel test of at least one test instruction.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the test method of any one of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
11. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202410003371.9A 2024-01-02 2024-01-02 Test method, apparatus, device, medium and program product Pending CN117707982A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410003371.9A CN117707982A (en) 2024-01-02 2024-01-02 Test method, apparatus, device, medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410003371.9A CN117707982A (en) 2024-01-02 2024-01-02 Test method, apparatus, device, medium and program product

Publications (1)

Publication Number Publication Date
CN117707982A 2024-03-15

Family

ID=90155355

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410003371.9A Pending CN117707982A (en) 2024-01-02 2024-01-02 Test method, apparatus, device, medium and program product

Country Status (1)

Country Link
CN (1) CN117707982A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118377726A (en) * 2024-06-24 2024-07-23 北京辰信领创信息技术有限公司 Web application automatic test method and computer device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination