CN114143235B - NFV automatic testing method, device, equipment and storage medium - Google Patents
- Publication number
- CN114143235B CN114143235B CN202010810322.8A CN202010810322A CN114143235B CN 114143235 B CN114143235 B CN 114143235B CN 202010810322 A CN202010810322 A CN 202010810322A CN 114143235 B CN114143235 B CN 114143235B
- Authority
- CN
- China
- Prior art keywords
- test
- topology
- case
- task
- description file
- Prior art date
- Legal status: Active (assumed; not a legal conclusion)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/50—Testing arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/455—Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
- G06F9/45533—Hypervisors; Virtual machine monitors
- G06F9/45558—Hypervisor-specific management and integration aspects
- G06F2009/45591—Monitoring or debugging support
- G06F2009/45595—Network integration; Enabling network access in virtual machine instances
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
The invention discloses an NFV automatic test method, apparatus, equipment, and storage medium. The method comprises: obtaining a test case set required by a test, the test case set comprising at least two test cases; constructing a test task based on the case description file of each test case in the set, the case description file comprising dependency information describing the relevance between that test case and other test cases; optimizing the execution sequence of the test task based on the dependency information of each test case related to the test task; and executing the test task in the optimized execution sequence. Embodiments of the invention can reduce repeated deployment of the test environment and thereby improve test efficiency.
Description
Technical Field
The present invention relates to the field of network function virtualization (Network Function Virtualization, NFV), and in particular, to an NFV automatic test method, apparatus, device, and storage medium.
Background
NFV refers to a technology that, by means of virtualization, carries a wide variety of software-based network functions on general-purpose servers, storage, and switches to replace conventional proprietary hardware devices (e.g., routers, firewalls, content distribution networks). The NFV architecture includes a virtualized infrastructure manager (VIM) responsible for managing the software and hardware resources of the NFV infrastructure (NFVI) layer, a virtual network function manager (VNFM) responsible for managing the lifecycle of virtual network functions (VNFs), an element management system (EMS) responsible for managing the configuration, faults, alarms, etc. of the virtual network elements, and an NFV orchestrator (NFVO) responsible for managing the lifecycle of network services (NSs).
Because the NFV architecture involves many components, and the interfaces within and between those components are complex and numerous, testing and integration are labor intensive. In the related art, the process of testing a network element comprises four links: test case design, test environment construction, test task execution, and test result analysis. Apart from the test task execution link, which can be optimized with automated scripts for some test cases (such as interface conformance tests), the other steps are usually performed manually, which is costly, inefficient, error-prone, and difficult to scale.
Disclosure of Invention
In view of this, the embodiment of the invention provides an NFV automatic test method, an apparatus, a device and a storage medium, which aims to reduce test cost and improve test efficiency.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides an NFV automatic test method, which comprises the following steps:
acquiring a test case set required by a test, wherein the test case set comprises at least two test cases;
constructing a test task based on a case description file of each test case in the test case set, wherein the case description file comprises dependency relationship information for describing the relevance between the test case and other test cases;
optimizing the execution sequence of the test task based on the dependency information of each test case related to the test task;
And executing the test task based on the optimized execution sequence.
The embodiment of the invention also provides an NFV automatic test device, which comprises:
the system comprises an acquisition module, a test case acquisition module and a test program acquisition module, wherein the acquisition module is used for acquiring a test case set required by a test, and the test case set comprises at least two test cases;
The task construction module is used for constructing a test task based on a case description file of each test case in the test case set, wherein the case description file comprises dependency relationship information for describing the relevance between the test case and other test cases;
The task scheduling module is used for optimizing the execution sequence of the test task based on the dependency relationship information of each test case related to the test task;
and the task execution module is used for executing the test task based on the optimized execution sequence.
The embodiment of the invention also provides NFV automatic test equipment, which comprises a processor and a memory for storing a computer program capable of running on the processor, wherein the processor is used for executing the steps of the method of the embodiment of the invention when the computer program is run.
The embodiment of the invention further provides a storage medium, and the storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the method of the embodiment of the invention are realized.
In the technical scheme provided by the embodiments of the invention, the case description file of each test case in the test case set comprises dependency information describing the relevance between that test case and other test cases. The dependency information of each test case related to a test task can therefore be obtained, and the execution sequence of the test task optimized on that basis. Because the execution sequence of the test cases in the test task can be optimized and adjusted, repeated deployment of the test environment can be reduced, improving test efficiency.
Drawings
FIG. 1 is a schematic diagram of an NFV automatic test architecture according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of an NFV automatic test method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an NFV automatic test equipment according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an NFV automatic test equipment according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an NFV automatic test equipment according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Before the NFV automatic test method of the embodiments of the invention is explained, the NFV automatic test architecture is described. As shown in fig. 1, the NFV automatic test architecture includes an NFV automatic test device 101, general-purpose devices 102, instrument devices 103, a component or system under test 104, and surrounding components 105, where the NFV automatic test device 101 is communicatively connected to the general-purpose devices 102, the instrument devices 103, the component or system under test 104, and the surrounding components 105. The general-purpose devices 102 are mainly general servers, centralized storage, and switching devices, and carry the hardware of the NFV automatic test device; the instrument devices 103 include physical instruments and virtual instruments. The system under test may be an end-to-end integrated system under test (SUT) conforming to the NFV architecture, and the component under test may be a specific software functional component (FUT) in the NFV architecture; a system under test may include a number of FUTs that, together with the instrument devices 103 and the surrounding components 105, form the test environment. The surrounding components 105 are components of the NFV system other than the SUT that are neither controlled nor observed during testing, but are combined with the instrument devices 103 into the test environment to facilitate testing of the component or system under test 104.
The embodiment of the invention provides an NFV automatic test method, which can be applied to the NFV automatic test equipment 101, as shown in fig. 2, and includes:
Step 201, obtaining a test case set required by a test, wherein the test case set comprises at least two test cases;
Here, the NFV automatic test equipment may receive the test cases selected by a test executor and determine the test case set, which comprises the at least two selected test cases.
In the embodiment of the invention, the test cases are independent units for testing the components to be tested, and the test task can be understood as a set of a plurality of test cases.
Here, a test case may be described using a case description file that, in some embodiments, includes four types of information elements: a test environment (also referred to as a test topology or topology configuration), preconditions (also referred to as input conditions or a priori conditions), test steps (also referred to as a test flow or execution steps), and verdict conditions (also referred to as checkpoints or observation items). Wherein:
The test environment (Configuration) describes the test topology that needs to be built before the test steps are performed, and typically contains the components under test, the general-purpose devices, the instrument devices, the surrounding components, and the network (wired or wireless) connections between them.
The preconditions (Pre-test Conditions) are a priori conditions that must hold after the test environment is set up and before the test steps are performed, for example, service or resource parameter configuration applied over the network connections (by means of commands, scripts, or user interfaces (UIs)), or other manual/automatic verification items the test depends on (such as connectivity and functional verification before a performance test is run).
The test steps (Test Sequence) use a series of operations on elements in the test environment to simulate specific situations occurring in the actual environment, such as business processes, information interaction, or equipment faults, and verify whether the system/component under test meets the expected function, performance, reliability, interface-compliance, and other requirements.
The verdict conditions (Test Verdict) determine whether the test case passes, i.e., whether the behavior of the component under test meets expectations. Specifically, they can be embodied as the pre-, in-, and post-step observation items involved in each test step, and logical combinations thereof.
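The four information elements can be pictured as one structured record. The sketch below models a case description file as a Python dict; the field names (`configuration`, `preconditions`, etc.) are illustrative assumptions, since the patent does not fix a concrete schema at this point:

```python
# Hypothetical sketch of a case description file as a Python dict.
# Field names are assumptions, not a format mandated by the patent.
case_description = {
    "uuid": "case-0001",                    # globally unique case ID (assumed)
    "configuration": {                      # test environment / topology
        "topology_ref": "topo-0001",        # UUID reference to a topology file
    },
    "preconditions": [
        "configure DHCP service address and port",
        "verify connectivity before performance test",
    ],
    "test_sequence": [
        "simulate service flow between VNF and instrument",
        "inject equipment fault",
    ],
    "verdict": "all observation items meet expected values",
}

def required_elements_present(case: dict) -> bool:
    """Check that all four information-element types are described."""
    return {"configuration", "preconditions", "test_sequence", "verdict"} <= case.keys()
```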
Step 202, constructing a test task based on a case description file of each test case in the test case set, wherein the case description file comprises dependency relationship information for describing the relevance between the test case and other test cases;
Here, the NFV automatic test equipment extracts a case description file of each test case in the case set based on the acquired test case set, and constructs a test task including each test case. The case description file of each test case may include the dependency information in addition to the foregoing four types of information elements, and the dependency information may include, for example, a topology description file for describing a test topology of the test case or reference information of the topology description file. In some embodiments, the dependency information may also include configuration information or the like for specifying dependencies between different test cases.
After extracting the case description file of each test case in the case set, the NFV automatic test apparatus performs parameter configuration on each test case based on input from the test executor. Test parameters that relate to the actual test environment and cannot be specified in the case description file in advance must be supplemented by the test executor; for example, the test executor configures the address and port of the DHCP (Dynamic Host Configuration Protocol) service to be used by the test task, the mobile-terminal SIM (Subscriber Identity Module) card number used in the test task, and so on, to complete the construction of the test task.
Step 203, optimizing the execution sequence of the test task based on the dependency information of each test case related to the test task;
The NFV automatic test equipment parses the case description file of each test case in the test task; because each case description file comprises a topology description file describing the test topology, or reference information for that topology description file, the test topology related to the test task can be determined. Illustratively, each topology description file adopts a unified format specification defined by the test framework and describes all network nodes, network connections, and their relevant configurations in a specific test topology; each topology description file has a unique UUID (Universally Unique Identifier), and the reference information may be that UUID.
The NFV automatic test equipment may determine all test topologies included in a test task based on UUIDs of topology description files related to the test task, and optimize an execution sequence of each test case in the test task based on each test topology.
And step 204, executing the test task based on the optimized execution sequence.
The NFV automatic test apparatus may execute the corresponding test cases based on the execution order of the optimized test cases.
In the related art, the test topology of each test case is often deployed independently. Although this keeps each test case simple in function, clear in logic, and mutually independent, it leads to unnecessary repeated deployment of the test environment, which reduces test efficiency.
In the embodiment of the invention, the case description file of each test case in each test case set comprises the topology description file for describing the test topology or the reference information of the topology description file, so that the test topology related to the test task can be obtained, the execution sequence of the test task is optimized based on each test topology, the execution sequence of each test case in the test task can be optimized and adjusted, repeated deployment of the test environment can be reduced, and the test efficiency is improved.
In practical application, before constructing the test task, the method further includes:
constructing the topology description file for the test case;
and constructing the case description file for the test case based on the topology description file.
Illustratively, the NFV automatic test equipment may construct a normalized topology description file based on the input of the test designer. For example, a test designer converts a test topology (including a plurality of systems under test, a plurality of test instruments, and their network connection relationships) and its specific configuration information into a formalized topology description file through a graphical interface. The topology description file may adopt the YAML format defined by the TOSCA standard and may include the UUIDs of the components under test, the UUIDs of the other network nodes involved in the test topology, and the description of the network connections and related configuration; each topology description file also has a globally unique UUID.
Illustratively, the NFV automatic test equipment may convert the test script, input parameters, and output parameters of each test case into a formalized case description file according to the input of the test designer, where each case description file may contain a UUID reference to the topology description file on which it depends. The NFV automatic test equipment may then package the case description file (including the UUID reference to the topology description file, the test script, the test flow file, etc.) in a specified format and issue it to the test case database, so that the test executor can call it when constructing a test task. It can be appreciated that the NFV automatic test equipment may also assign a globally unique UUID to each case description file, i.e., each case description file may carry its own UUID.
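The UUID linkage between the two file types can be illustrated as follows. The dict structure and field names are assumptions (the actual files are TOSCA-style YAML); the point is only that the case file carries a reference that a topology store keyed by UUID can resolve:

```python
import uuid

# Illustrative sketch: a formalized topology description file and a case
# description file that references it by UUID. Structures are assumed.
topology_uuid = str(uuid.uuid4())

topology_description = {
    "uuid": topology_uuid,
    "nodes": ["vnf-under-test", "virtual-instrument", "dhcp-server"],
    "connections": [("vnf-under-test", "virtual-instrument")],
}

case_description = {
    "uuid": str(uuid.uuid4()),
    "topology_ref": topology_uuid,   # UUID reference to the dependent topology
    "test_script": "run_interface_conformance.py",
}

# The packaged case carries only the reference; the executor resolves the
# full topology later from a store keyed by topology UUID.
topology_store = {topology_description["uuid"]: topology_description}
resolved = topology_store[case_description["topology_ref"]]
```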
In some embodiments, step 203 comprises:
Analyzing each use case description file in the test task, and determining the test topology related to the test task;
The test cases sharing the same test topology are divided into a first level case group;
Accordingly, step 204 includes:
Sequentially executing each test case in the same first-level case group in series, or
And executing each test case in the same first-level case group in parallel.
Here, when a case description file includes the topology description file, the case description file may be parsed and the UUID of the topology description file obtained from it. Case description files sharing the same topology description file may then be divided into the same first-level case group (also referred to as a child case group), based on the correspondence between the UUIDs of the topology description files and the UUIDs of the case description files.
In this way, the NFV test apparatus can implement parallel execution or sequential serial execution of test cases sharing the same test environment in the process of executing the test task, so that repeated deployment of the test environment can be reduced, and the test efficiency is improved.
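The UUID-based division into first-level case groups takes only a few lines; in this sketch the field names `uuid` and `topology_ref` are illustrative assumptions:

```python
from collections import defaultdict

def group_by_topology(cases):
    """Divide case description files that share one topology UUID into
    first-level case groups (sketch of the grouping in step 203)."""
    groups = defaultdict(list)
    for case in cases:
        groups[case["topology_ref"]].append(case["uuid"])
    return dict(groups)

cases = [
    {"uuid": "case-1", "topology_ref": "topo-A"},
    {"uuid": "case-2", "topology_ref": "topo-B"},
    {"uuid": "case-3", "topology_ref": "topo-A"},
]
# Cases 1 and 3 share topo-A, so they form one first-level group and can
# run serially or in parallel without redeploying the environment.
first_level = group_by_topology(cases)
```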
It can be understood that, in practical application, if there is no resource conflict or limitation (for example, a limit on the number of VNF or instrument licenses, or on the total amount of resources the test may occupy), that is, test resources are sufficient, and the goal is to shorten the total execution time of the test task as much as possible, multiple parallel test environments can be constructed simultaneously for the same test topology and the corresponding test cases run in parallel.
In some embodiments, step 203 may further comprise:
analyzing the topology description file of each test topology, and identifying network nodes included in each test topology;
performing topology clustering based on network nodes included in each test topology;
Grouping the first level use case group of each test topology in the same topology cluster into a second level use case group;
Accordingly, step 204 includes:
and executing each first-level use case group in the same second-level use case group in sequence in series.
Here, the NFV automatic test equipment may analyze topology description files of each test topology in the test task, identify UUIDs of network nodes included in each topology description file, and perform topology clustering based on the UUIDs of the network nodes.
Illustratively, the NFV automatic test equipment classifies test topologies including the same network node into the same topological cluster based on UUIDs of the network nodes, and further classifies each of the first-level use case groups under the same topological cluster into the same second-level use case group (also referred to as parent use case group).
The NFV automatic test equipment sequentially executes each of the first level case groups in the same second level case group in series, wherein when the test cases in the same first level case group are tested, the NFV automatic test equipment can be implemented in a mode of sequentially executing in series or executing in parallel without replacing a test topology, and therefore details are not repeated here.
It can be appreciated that, between different first-level case groups, only the network connection configuration of the test topology changes. The NFV automatic test equipment may therefore insert a topology switching case before executing each subsequent first-level case group; this case is responsible for reconfiguring the network connections but does not need to pull up or destroy VNF (virtualized network function) nodes.
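The node-based topology clustering can be sketched as follows, under the assumption that topologies sharing any network-node UUID (transitively) belong to one cluster; the patent itself only states that topologies containing the same nodes are placed in one class:

```python
def cluster_topologies(topologies):
    """Cluster test topologies that share network-node UUIDs (transitively).
    topologies maps topology UUID -> list of network-node UUIDs."""
    clusters = []  # each cluster: {"topos": set of topo IDs, "nodes": set of node IDs}
    for topo_id, nodes in topologies.items():
        merged = {"topos": {topo_id}, "nodes": set(nodes)}
        rest = []
        for c in clusters:
            if c["nodes"] & merged["nodes"]:   # shares a node: merge clusters
                merged["topos"] |= c["topos"]
                merged["nodes"] |= c["nodes"]
            else:
                rest.append(c)
        clusters = rest + [merged]
    return [sorted(c["topos"]) for c in clusters]

topologies = {
    "topo-A": ["vnf-1", "meter-1"],
    "topo-B": ["vnf-1", "vnf-2"],   # shares vnf-1 with topo-A
    "topo-C": ["vnf-3"],            # disjoint: its own cluster
}
clusters = cluster_topologies(topologies)
```

The first-level case groups of topo-A and topo-B would then form one second-level case group, executed serially with only connection-level switching in between.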
In some embodiments, step 203 may further comprise:
arranging switching paths among different topological clusters;
determining the execution sequence of the second-level use case group of different topological clusters based on the switching path;
Accordingly, step 204 includes:
and executing each second-level use case group in series based on the execution sequence.
Here, the NFV automatic test equipment may schedule the switching paths between different topology clusters, so that different second-level use case groups perform the test according to the test topology corresponding to the switching paths. The NFV automatic test equipment may orchestrate the switching paths based on switching overhead between different topological clusters such that the cumulative switching overhead of the switching paths is minimal. The handover overhead may be a topology handover overhead set based on preset information, or may also be a topology handover overhead determined based on historical test results.
In some embodiments, the NFV automatic test equipment may directly construct a super topology cluster containing the VNFs of all network nodes involved in all test topologies, and insert a super-cluster case before any test cases are executed; the super-cluster case completes the deployment of all network nodes in the super topology cluster, after which every test topology switch degenerates into network connection construction and configuration operations, with no cluster switching or VNF node pull-up/destroy operations required. Optionally, the NFV automatic test equipment may check, at each topology switch during execution of a test task, whether a VNF is still used by any remaining test topology; if not, that VNF may be destroyed in time to save resources.
In some embodiments, the orchestrating the switching paths between different topological clusters comprises:
taking each topological cluster as a node, taking switching overhead among different topological clusters as edge weight, and constructing a directed graph;
and selecting a node traversal path with the minimum accumulated switching overhead as the switching path based on the directed graph.
Illustratively, the NFV automatic test equipment takes the topology clusters as nodes and the switching overhead between different topology clusters (i.e., the operation cost of node pull-up or destroy) as edge weights, constructs a directed graph, solves for the node traversal path P = {N1, N2, ...} with the smallest cumulative switching overhead on the directed graph, and schedules the ordering of the test topology clusters according to this path. When switching between different topology clusters, i.e., updating the test topology, the NFV automatic test equipment may perform the pull-up or destroy operations of the VNF nodes, so as to execute the second-level case groups corresponding to the different topology clusters.
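Since the number of topology clusters in one test task is typically small, the minimal-cumulative-overhead traversal path can be found by exhaustive search over orderings. This is a sketch under the assumption of a complete pairwise overhead table, not the patent's prescribed solver:

```python
from itertools import permutations

def min_overhead_path(clusters, overhead):
    """Brute-force the cluster traversal order with minimal cumulative
    switching overhead. overhead[(a, b)] is the cost of switching from
    cluster a to cluster b (node pull-up/destroy operations)."""
    best_path, best_cost = None, float("inf")
    for order in permutations(clusters):
        cost = sum(overhead[(a, b)] for a, b in zip(order, order[1:]))
        if cost < best_cost:
            best_path, best_cost = list(order), cost
    return best_path, best_cost

overhead = {
    ("C1", "C2"): 5, ("C2", "C1"): 5,
    ("C1", "C3"): 1, ("C3", "C1"): 1,
    ("C2", "C3"): 2, ("C3", "C2"): 2,
}
path, cost = min_overhead_path(["C1", "C2", "C3"], overhead)
# Visiting C1 -> C3 -> C2 costs 1 + 2 = 3, the minimum here.
```

Exhaustive search is factorial in the cluster count; for larger tasks a heuristic (e.g., nearest-neighbor) would be substituted.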
In practical applications, the NFV automatic test equipment performs switching between different topological clusters, which may include one of the following:
the NFV automatic test equipment removes the existing test topology and reconstructs the new test topology;
the NFV automatic test equipment parses the topology description files, compares the network nodes of the topology description file to be deployed with those of the existing test topology, retains the shared network nodes, destroys the network nodes no longer used in the existing test topology, and pulls up the network nodes newly added in the test topology to be deployed;
the NFV automatic test equipment parses the topology description files, compares the network nodes of the topology description file to be deployed with those of the existing test topology, and only pulls up the newly added network nodes, leaving the existing nodes in place.
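The difference-based switching option reduces to a set computation over the node UUIDs of the old and new topology description files; a minimal sketch:

```python
def plan_topology_switch(current_nodes, target_nodes):
    """Sketch of difference-based switching: keep shared nodes, destroy
    nodes no longer used, pull up only the newly added ones."""
    current, target = set(current_nodes), set(target_nodes)
    return {
        "keep": sorted(current & target),
        "destroy": sorted(current - target),
        "pull_up": sorted(target - current),
    }

plan = plan_topology_switch(
    current_nodes=["vnf-1", "vnf-2", "meter-1"],
    target_nodes=["vnf-1", "vnf-3", "meter-1"],
)
# keep vnf-1 and meter-1, destroy vnf-2, pull up vnf-3
```

The third option in the list above would simply drop the `destroy` step.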
In some embodiments, the method further comprises:
acquiring indication information for optimizing the use time length of a target network node;
Determining a test topology using the target network node based on the indication information;
The executing the test task based on the optimized execution sequence comprises the following steps:
And preferentially executing the test cases corresponding to the test topology of the target network node.
Here, the NFV automatic test equipment may preferentially execute the test cases of the test topology of the target network node, so that the use duration of the target network node may be reduced.
For example, the NFV automatic test equipment receives indication information input by a test executor indicating that the usage duration of a specific VNF or instrument device should be shortened. The NFV automatic test equipment then matches, among the test topologies related to the test task, those that use the specific VNF or instrument device, and optimizes the execution sequence of the test cases based on at least one of test topology sharing, topology clustering, and topology cluster path switching, so that the usage duration of the specific VNF or instrument device is reduced. For a specific VNF or instrument device that is charged by usage duration, this can effectively reduce test cost.
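Prioritizing the case groups whose topology uses a duration-charged node can be as simple as a stable sort. This sketch assumes first-level groups keyed by topology UUID and a node-list lookup; both names are illustrative:

```python
def prioritize_target(groups, topologies, target_node):
    """Order first-level case groups so that those whose topology uses the
    target node (e.g., a per-hour-licensed meter) run first. A sketch;
    real scheduling would also respect the earlier optimizations."""
    def uses_target(topo_id):
        return target_node in topologies[topo_id]
    # False sorts before True, and sorted() is stable, so target-using
    # groups move to the front while relative order is preserved.
    return sorted(groups, key=lambda topo_id: not uses_target(topo_id))

topologies = {"topo-A": ["vnf-1"], "topo-B": ["licensed-meter"], "topo-C": ["vnf-2"]}
order = prioritize_target(["topo-A", "topo-B", "topo-C"], topologies, "licensed-meter")
# topo-B runs first, so the licensed meter can be released early.
```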
The present invention will be described in further detail below with reference to an application example.
As shown in fig. 3, in this application example, the NFV automatic test equipment includes a test design subsystem 301 and a test execution subsystem 302, wherein the test design subsystem 301 includes a topology design module 3011, a case design module 3012, and a test case library 3013, and the test execution subsystem 302 includes a task construction module 3021, a case scheduling module 3022, and a test execution module 3023.
The test design subsystem 301 provides a graphical network topology design interface on which a test designer designs test cases with reference to a test specification (textual description); it converts the descriptions of the test cases and their corresponding test topologies into formalized description files that the system can process automatically, and issues them to a test case library for use by test executors.
The topology design module 3011 is configured to convert a corresponding test topology (including a plurality of systems under test, a plurality of test meters, and network connection relationships thereof) and specific configuration information thereof into a formalized topology description file according to an input of a test designer on a graphical interface. Wherein each topology description file has a globally unique UUID.
The case design module 3012 is configured to convert the test script, input and output parameters of each test case into a formalized case description file according to the input of the test designer. Wherein each use case description file contains a UUID reference of the topology description file on which it depends.
The test case library 3013 is configured to package case description files (including the topology description files of the test topologies, test scripts, test flow files, etc.) in a specified format and issue them to the test execution subsystem 302.
The test execution subsystem 302 is responsible for triggering a test task execution flow for a test case selected by a test executor from the test case library.
The task construction module 3021 extracts the corresponding case description files according to the test case set selected by the test executor for a specific test task, and fills in the specific test parameters.
The case orchestration module 3022 is responsible for optimizing the execution order of the test case set in each test task, and delivering the optimized test case set to the test execution module 3023 for sequential execution.
The test execution module 3023 is responsible for sequentially executing test cases in the test task.
The following describes a specific implementation of the case orchestration module 3022:
First step, topology identification and use case grouping
a) Analyze the case description files of the test cases one by one and identify the UUID of the test topology each depends on;
b) Group test cases with the same test topology together, yielding groups {T1, T2, ...}, where Ti = {Ti1, Ti2, ...} is a set of test cases sharing the same test topology (i.e., the aforementioned first-level case group), and Ti1, Ti2, etc. are the individual test cases in that group;
c) Test cases in the same group are executed in sequence without switching the test topology.
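The grouping in steps a) and b) can be sketched as a dictionary keyed by topology UUID; representing each test case as a (case ID, topology UUID) pair is an assumption for illustration:

```python
from collections import defaultdict

def group_by_topology(cases):
    """Group test cases into first-level case groups by topology UUID.

    `cases` is an iterable of (case_id, topology_uuid) pairs — an
    illustrative representation of parsed case description files.
    """
    groups = defaultdict(list)
    for case_id, topo_uuid in cases:
        groups[topo_uuid].append(case_id)
    return dict(groups)

# Cases in the same group can run back-to-back without topology switching.
cases = [("T11", "topo-A"), ("T12", "topo-A"), ("T21", "topo-B")]
```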
Second step, topology clustering
a) Analyze the topology description files related to the test task one by one and identify the set of network-node UUIDs each contains;
b) Cluster the different test topologies into topology clusters {C1, C2, ...}, where test topologies comprising the same network nodes (software and hardware network element devices, etc.) are placed in one cluster, i.e., Cj = {Tj1, Tj2, ...}, meaning the topology cluster Cj contains the test topologies Tj1, Tj2, etc. that share network nodes;
c) When test cases of different test topologies in the same topology cluster are executed in sequence, a topology switching case is inserted between them; it is responsible for reconfiguring the network connections, but no operation of instantiating or destroying VNF nodes is needed.
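One plausible reading of step b) — merging test topologies that share network nodes into one cluster — can be sketched with a union-find structure; treating any single shared node as grounds for merging is an interpretive assumption:

```python
def cluster_topologies(topologies):
    """Cluster test topologies that share at least one network node.

    `topologies` maps a topology ID to its set of network-node UUIDs;
    this representation is an assumption for illustration.
    """
    parent = {t: t for t in topologies}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    node_owner = {}
    for t, nodes in topologies.items():
        for n in nodes:
            if n in node_owner:
                union(t, node_owner[n])  # shared node -> same cluster
            else:
                node_owner[n] = t

    clusters = {}
    for t in topologies:
        clusters.setdefault(find(t), set()).add(t)
    return list(clusters.values())
```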
Third step, clustering switching paths
In a first mode, a super topology cluster is constructed directly, containing the union of the VNF sets of all test topologies related to the test task. A special super-cluster case is inserted before any test case is executed; it is responsible for deploying all nodes in the super topology cluster. All subsequent test-topology switches then degrade to network connection construction and configuration operations, with no cluster switching and no VNF node instantiation/destruction. Optionally, during execution of the test task, each time the test topology is switched it can be checked whether a VNF is no longer used by any remaining topology; if so, it is destroyed promptly to save resources.
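The first mode can be sketched as follows: the super topology cluster is the union of the VNF sets of all test topologies, and the optional early destruction checks which deployed VNFs no remaining topology uses. Both helpers are illustrative assumptions about the data representation:

```python
def super_cluster(topologies):
    """Union of the VNF node sets of all test topologies (mode one)."""
    nodes = set()
    for vnfs in topologies:
        nodes |= vnfs
    return nodes

def destroyable(deployed, remaining):
    """VNFs among those currently deployed that no remaining test
    topology in the execution order still uses, and that can therefore
    be destroyed early to save resources."""
    still_needed = set().union(*remaining) if remaining else set()
    return deployed - still_needed
```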
In a second mode, a directed graph is constructed with the topology clusters as nodes and the switching overhead between different clusters (i.e., the overhead of VNF node instantiation/destruction operations) as edge weights. A node traversal path P = {N1, N2, ...} with the minimum cumulative overhead is solved on this graph, and the topology clusters are ordered for testing according to that path. Switching between different topology clusters requires instantiating/destroying VNF nodes.
Illustratively, a greedy algorithm is given for the node traversal of the directed graph described above.
The case orchestration module 3022 starts path planning from the current node X of the directed graph F(n): it selects an adjacent node Y, deletes node X from the node set of F(n) to obtain a directed graph F(n-1) with n-1 nodes, and then recursively calls the path-planning function with F(n-1) and node Y. This repeats until no nodes remain in the directed graph, yielding the planned path.
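The greedy traversal described above can be sketched iteratively (deleting the visited node and continuing from its cheapest unvisited neighbor); note that a greedy heuristic does not guarantee the globally minimal cumulative overhead:

```python
def greedy_path(graph, start):
    """Greedy node traversal of a directed overhead graph.

    `graph[u][v]` is the switching overhead from cluster u to cluster v
    (an assumed adjacency-dict representation). From the current node,
    always move to the cheapest unvisited neighbor — mirroring the
    delete-and-recurse description, but written iteratively.
    """
    path = [start]
    remaining = set(graph) - {start}
    current = start
    total = 0
    while remaining:
        # pick the cheapest edge to an unvisited node
        nxt = min(remaining, key=lambda v: graph[current].get(v, float("inf")))
        total += graph[current].get(nxt, float("inf"))
        path.append(nxt)
        remaining.discard(nxt)
        current = nxt
    return path, total
```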
According to the NFV automatic test method of this application example, the test design subsystem 301 designates the test topology corresponding to each test case. The topology description file of a test topology can be designed independently of the test cases, and a case description file can either directly include the topology description file or reference it; the case description files are then issued to the test case library 3013 for use by test executors. A test executor selects the case description files required for a test task in the test execution subsystem 302; the task construction module 3021 generates the test task from the selected test case set, the case orchestration module 3022 optimizes the execution order of the test cases in the set based on the test topologies related to the task, and the test execution module 3023 executes the test cases in the optimized order, thereby reducing repeated deployment of test environments and improving test efficiency.
In order to implement the method according to the embodiment of the present invention, the embodiment of the present invention further provides an NFV automatic test apparatus, where the NFV automatic test apparatus corresponds to the above-mentioned NFV automatic test method, and each step in the above-mentioned NFV automatic test method embodiment is also fully applicable to the present NFV automatic test apparatus embodiment.
As shown in fig. 4, the NFV automatic test apparatus 400 includes an acquisition module 401, a task construction module 402, a task orchestration module 403, and a task execution module 404. The acquisition module 401 is configured to acquire a test case set required for a test, where the test case set includes at least two test cases; the task construction module 402 is configured to construct a test task based on a case description file of each test case in the test case set, where the case description file includes dependency information describing the relevance between the test case and other test cases; the task orchestration module 403 is configured to optimize the execution order of the test task based on the dependency information of each test case related to the test task; and the task execution module 404 is configured to execute the test task based on the optimized execution order.
In some embodiments, the dependency information includes a topology description file for describing a test topology of the test case or reference information of the topology description file, and the task orchestration module 403 is specifically configured to:
Analyzing each use case description file in the test task, and determining the test topology related to the test task;
The test cases sharing the same test topology are divided into a first level case group;
accordingly, the task execution module 404 is specifically configured to:
Sequentially executing each test case in the same first-level case group in series, or
And executing each test case in the same first-level case group in parallel.
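The two execution alternatives for a first-level case group can be sketched as follows; `runner` is a hypothetical callable standing in for the test execution module:

```python
from concurrent.futures import ThreadPoolExecutor

def run_group(cases, runner, parallel=False):
    """Execute all test cases of one first-level case group.

    `runner` is a hypothetical per-case execution callable; serial vs
    parallel execution corresponds to the two alternatives above.
    """
    if parallel:
        with ThreadPoolExecutor() as pool:
            return list(pool.map(runner, cases))
    return [runner(c) for c in cases]
```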
In some embodiments, the task orchestration module 403 is further configured to:
analyzing the topology description file of each test topology, and identifying network nodes included in each test topology;
performing topology clustering based on network nodes included in each test topology;
Grouping the first level use case group of each test topology in the same topology cluster into a second level use case group;
Accordingly, the task execution module 404 is further configured to:
and executing each first-level use case group in the same second-level use case group in sequence in series.
In some embodiments, the task orchestration module 403 is further configured to:
arranging switching paths among different topological clusters;
determining the execution sequence of the second-level use case group of different topological clusters based on the switching path;
Accordingly, the task execution module 404 is further configured to:
and executing each second-level use case group in series based on the execution sequence.
In some embodiments, the task orchestration module 403 is specifically configured to:
taking each topological cluster as a node, taking switching overhead among different topological clusters as edge weight, and constructing a directed graph;
and selecting a node traversal path with the minimum accumulated switching overhead as the switching path based on the directed graph.
In some embodiments, the NFV automatic test equipment further comprises:
The construction module 405 is configured to construct the topology description file for the test case, and to construct the case description file for the test case based on the topology description file.
In some embodiments, the acquisition module 401 is further configured to acquire indication information for optimizing the usage duration of a target network node; the task orchestration module 403 is further configured to determine the test topology using the target network node based on the indication information; and the task execution module 404 is further configured to preferentially execute the test cases corresponding to the test topology of the target network node.
It can be understood that the task construction module 402 corresponds to the task construction module 3021, the task orchestration module 403 corresponds to the case orchestration module 3022, the task execution module 404 corresponds to the test execution module 3023, and the construction module 405 corresponds to the topology design module 3011 and the case design module 3012.
In practical applications, the acquisition module 401, the task construction module 402, the task orchestration module 403, the task execution module 404, and the construction module 405 may be implemented by a processor in the NFV automatic test equipment 400. Of course, the processor needs to run a computer program in memory to implement their functions.
It should be noted that, in the NFV automatic test apparatus provided in the foregoing embodiment, the division into the above program modules is merely illustrative; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules to complete all or part of the processing described above. In addition, the NFV automatic test apparatus provided in the above embodiment and the NFV automatic test method embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not described herein again.
Based on the hardware implementation of the program module, and in order to implement the method of the embodiment of the present invention, the embodiment of the present invention further provides an NFV automatic test equipment. Fig. 5 shows only an exemplary structure of the NFV automatic test equipment, not all of which, part or all of the structure shown in fig. 5 may be implemented as desired.
As shown in fig. 5, an NFV automatic test equipment 500 provided by an embodiment of the present invention includes at least one processor 501, a memory 502, a user interface 503, and at least one network interface 504. The various components in the NFV automatic test equipment 500 are coupled together by a bus system 505. It is understood that the bus system 505 is used to enable communication connections among these components. In addition to a data bus, the bus system 505 includes a power bus, a control bus, and a status signal bus. However, for clarity of illustration, the various buses are all labeled as the bus system 505 in fig. 5.
The user interface 503 may include, among other things, a display, keyboard, mouse, trackball, click wheel, keys, buttons, touch pad, or touch screen, etc.
The memory 502 in embodiments of the present invention is used to store various types of data to support the operation of NFV automatic test equipment. Examples of such data include any computer program for operating on NFV automatic test equipment.
The NFV automatic test method disclosed in the embodiment of the present invention may be applied to the processor 501 or implemented by the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the NFV automatic test method may be completed by integrated logic circuits of hardware in the processor 501 or by instructions in the form of software. The processor 501 may be a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 501 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of the method disclosed in the embodiments of the present invention may be directly embodied as being completed by a hardware decoding processor, or completed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium; the storage medium is located in the memory 502, and the processor 501 reads the information in the memory 502 and, in combination with its hardware, completes the steps of the NFV automatic test method provided in the embodiment of the present invention.
In an exemplary embodiment, the NFV automatic test equipment may be implemented by one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), FPGAs, general-purpose processors, controllers, Micro Controller Units (MCUs), microprocessors, or other electronic components, for performing the aforementioned methods.
It is to be appreciated that the memory 502 can be either volatile memory or non-volatile memory, and can include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory (such as disk or tape memory), an optical disk, or a Compact Disc Read-Only Memory (CD-ROM). The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory described in the embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.
In an exemplary embodiment, the present invention further provides a storage medium, i.e., a computer storage medium, which may be specifically a computer readable storage medium, for example, including a memory 502 storing a computer program, where the computer program may be executed by a processor 501 of an NFV automatic test equipment to perform the steps described in the method of the embodiment of the present invention. The computer readable storage medium may be ROM, PROM, EPROM, EEPROM, flash Memory, magnetic surface Memory, optical disk, or CD-ROM.
It should be noted that "first," "second," etc. are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
In addition, the embodiments of the present invention may be arbitrarily combined without any collision.
The foregoing is merely a specific embodiment of the present invention, and the protection scope of the present invention is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. An automatic NFV testing method, comprising:
acquiring a test case set required by a test, wherein the test case set comprises at least two test cases;
constructing a test task based on a case description file of each test case in the test case set, wherein the case description file comprises dependency relationship information for describing the relevance between the test case and other test cases;
optimizing the execution sequence of the test task based on the dependency information of each test case related to the test task;
Executing the test task based on the optimized execution sequence;
the dependency information comprises a topology description file for describing the test topology of the test case or reference information of the topology description file, and the optimizing the execution sequence of the test task based on the dependency information of each test case related to the test task comprises the following steps:
Analyzing each use case description file in the test task, and determining the test topology related to the test task;
The test cases sharing the same test topology are divided into a first level case group;
analyzing the topology description file of each test topology, and identifying network nodes included in each test topology;
Performing topology classification based on network nodes included in each test topology;
The first level use case group of each test topology in the same topology class is partitioned into a second level use case group.
2. The method of claim 1, wherein
The executing the test task based on the optimized execution sequence comprises the following steps:
Sequentially executing each test case in the same first-level case group in series, or
And executing each test case in the same first-level case group in parallel.
3. The method of claim 1, wherein
The executing the test task based on the optimized execution sequence comprises the following steps:
and executing each first-level use case group in the same second-level use case group in sequence in series.
4. The method according to claim 1, wherein optimizing the execution sequence of the test tasks based on the dependency information of each test case related to the test tasks further comprises:
Arranging switching paths among different topology classifications;
Determining an execution order of the second-level use case group of different topology classifications based on the switching path;
The executing the test task based on the optimized execution sequence comprises the following steps:
and executing each second-level use case group in series based on the execution sequence.
5. The method of claim 4, wherein the orchestrating the switching paths between different topology classifications comprises:
Taking each topology classification as a node, taking switching overhead among different topology classifications as edge weight, and constructing a directed graph;
and selecting a node traversal path with the minimum accumulated switching overhead as the switching path based on the directed graph.
6. The method according to claim 1, wherein the method further comprises:
constructing the topology description file for the test case;
and constructing the case description file for the test case based on the topology description file.
7. The method according to claim 1, wherein the method further comprises:
acquiring indication information for optimizing the use time length of a target network node;
Determining a test topology using the target network node based on the indication information;
The executing the test task based on the optimized execution sequence comprises the following steps:
And preferentially executing the test cases corresponding to the test topology of the target network node.
8. An NFV automatic test equipment, comprising:
the system comprises an acquisition module, a test case acquisition module and a test program acquisition module, wherein the acquisition module is used for acquiring a test case set required by a test, and the test case set comprises at least two test cases;
The task construction module is used for constructing a test task based on a case description file of each test case in the test case set, wherein the case description file comprises dependency relationship information for describing the relevance between the test case and other test cases;
The task scheduling module is used for optimizing the execution sequence of the test task based on the dependency relationship information of each test case related to the test task;
the task execution module is used for executing the test task based on the optimized execution sequence;
The dependency relationship information comprises a topology description file for describing the test topology of the test case or reference information of the topology description file, and the task orchestration module is specifically used for:
Analyzing each use case description file in the test task, and determining the test topology related to the test task;
The test cases sharing the same test topology are divided into a first level case group;
analyzing the topology description file of each test topology, and identifying network nodes included in each test topology;
Performing topology classification based on network nodes included in each test topology;
The first level use case group of each test topology in the same topology class is partitioned into a second level use case group.
9. An NFV automatic test equipment, comprising a processor and a memory for storing a computer program capable of running on the processor, wherein,
The processor being adapted to perform the steps of the method of any of claims 1 to 7 when the computer program is run.
10. A storage medium having a computer program stored thereon, which, when executed by a processor, implements the steps of the method of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010810322.8A CN114143235B (en) | 2020-08-13 | 2020-08-13 | NFV automatic testing method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114143235A CN114143235A (en) | 2022-03-04 |
CN114143235B true CN114143235B (en) | 2024-12-31 |
Family
ID=80438025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010810322.8A Active CN114143235B (en) | 2020-08-13 | 2020-08-13 | NFV automatic testing method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114143235B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114721918A (en) * | 2022-03-31 | 2022-07-08 | 宁畅信息产业(北京)有限公司 | Pressure testing method and device, electronic equipment and storage medium |
CN114884856B (en) * | 2022-07-11 | 2022-09-30 | 中国科学技术大学 | Reconfigurable network test system and method based on test function virtualization |
CN118885347B (en) * | 2024-09-29 | 2024-12-20 | 苏州元脑智能科技有限公司 | Server testing method and device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102609352A (en) * | 2011-01-19 | 2012-07-25 | 阿里巴巴集团控股有限公司 | Parallel testing method and parallel testing server |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101917306B (en) * | 2010-08-20 | 2012-08-15 | 北京星网锐捷网络技术有限公司 | Method, system and device for automatic test |
CN103645989B (en) * | 2013-12-26 | 2017-04-19 | 大唐移动通信设备有限公司 | Device and method for analyzing test resource required by test case during test |
CN105824746B (en) * | 2015-01-05 | 2018-09-25 | 中移信息技术有限公司 | A kind of method and apparatus that test dispatching is automatically generated based on use-case dependence |
CN107124326B (en) * | 2017-04-05 | 2020-05-05 | 烽火通信科技股份有限公司 | Automatic testing method and system |
CN107678951A (en) * | 2017-09-21 | 2018-02-09 | 平安科技(深圳)有限公司 | Test exemple automation management method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114143235A (en) | 2022-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9037915B2 (en) | Analysis of tests of software programs based on classification of failed test cases | |
CN114143235B (en) | NFV automatic testing method, device, equipment and storage medium | |
CN102576311B (en) | For being improved the method and system of software execution time by Optimal performance model | |
KR101410099B1 (en) | Function Test Apparatus based on Unit Test Cases Reusing and Function Test Method thereof | |
US10354031B2 (en) | Information processing by interpenetrating signal transmission channel in design for testability of chip | |
CN109032850B (en) | On-site device debugging system and on-site device debugging method | |
Rojas et al. | Are we ready to drive software-defined networks? A comprehensive survey on management tools and techniques | |
US10592703B1 (en) | Method and system for processing verification tests for testing a design under test | |
CN109901985B (en) | Distributed test apparatus and method, storage medium, and electronic device | |
CN108989153A (en) | A kind of performance test methods and device | |
US11734141B2 (en) | Dynamic testing of systems | |
CN111382065B (en) | Verification flow management system and method based on test template | |
CN115176233B (en) | Performing tests in deterministic order | |
CN114818565A (en) | Simulation environment management platform, method, equipment and medium based on python | |
CN116569147A (en) | System test infrastructure with hidden variables, hidden attributes, and hidden value detection | |
US10073938B2 (en) | Integrated circuit design verification | |
CN112698974A (en) | Fault injection test method, device and storage medium | |
CN117725869A (en) | Assertion development method, chip verification method, device, equipment and medium | |
WO2019222941A1 (en) | Method for evaluating application deployment, apparatus, computer program product, and readable medium | |
Bosse et al. | Predicting availability and response times of IT services | |
US11442839B2 (en) | Runtime metrics based test ordering | |
US7831879B2 (en) | Generating test coverage bin based on simulation result | |
CN115373696B (en) | Low code configuration method, system, equipment and storage medium for software resource generation | |
US11645193B2 (en) | Heterogeneous services for enabling collaborative logic design and debug in aspect oriented hardware designing | |
Scriven et al. | Resource evaluation and node monitoring in service oriented ad-hoc grids |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||