CN116643977A - Automatic performance model testing method, device, equipment and storage medium - Google Patents
- Publication number
- CN116643977A (application number CN202310545640.XA)
- Authority
- CN
- China
- Prior art keywords
- test
- configuration information
- performance model
- task
- Prior art date: 2023-05-12
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
- G06F11/3692—Test management for test results analysis
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application discloses an automatic performance model testing method, apparatus, device, and storage medium, relating to the technical field of model testing. The method comprises the following steps: acquiring a compiling instruction and compiling a performance model project according to it to generate a corresponding executable file; acquiring test case configuration information and executing the corresponding test tasks through the executable file according to that information; and, after each test task is completed, automatically extracting key parameters from the task's log file and performing result analysis based on them. The method automates both testing and analysis, reduces the complex manual operations and overall complexity of the test process, and gives testers a concise test flow; because test tasks are executed from the test case configuration information against a single executable file, the same performance model no longer needs to be compiled repeatedly under different parameter configurations, and test efficiency improves.
Description
Technical Field
The present application relates to the field of model testing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for automatically testing a performance model.
Background
At present, testing and result analysis of electronic system level (Electronic System Level, ESL) performance models are mainly performed manually. When multiple groups of parameters must be tested, a developer has to modify the model parameters by hand, recompile the model, run the simulation tests one by one, write a corresponding log-analysis script to capture and analyze the test results, and write the test report manually. Although computer processing power keeps improving, this test mode cannot make full use of computer resources, and a single case test usually requires a wait of several hours; test efficiency is therefore low, the test cycle is long, and computer-resource utilization is poor. How to improve the test efficiency of performance models is thus a problem to be solved.
Disclosure of Invention
Therefore, the application aims to provide a method, an apparatus, a device, and a medium for automatically testing a performance model, which can automate performance model simulation testing and test analysis and improve test efficiency. The specific scheme is as follows:
in a first aspect, the application discloses a performance model automatic test method, which comprises the following steps:
acquiring a compiling instruction, compiling a performance model project according to the compiling instruction to generate a corresponding executable file;
acquiring test case configuration information, and executing corresponding test tasks through the executable file according to the test case configuration information;
and automatically extracting key parameters from the log file corresponding to the test task after the test task is completed, and analyzing results based on the key parameters.
Optionally, the obtaining the test case configuration information includes:
acquiring operation configuration information aiming at a test case; the operation configuration information comprises the number of cases and operation modes; the operation modes comprise a parallel mode and a serial mode;
correspondingly, the executing the corresponding test task according to the test case configuration information through the executable file includes:
if the operation mode is a parallel mode, a plurality of test tasks are executed in parallel by starting a plurality of terminal windows in the background and submitting a plurality of test cases at the same time;
if the operation mode is a serial mode, a terminal window is opened in the background and each test case is submitted automatically and sequentially until all the test tasks are executed.
Optionally, the obtaining the test case configuration information includes:
acquiring parameter configuration information aiming at a test case; the parameter configuration information comprises any one or more of model parameters, simulation scene information, output file paths, input file paths and engine parameters;
correspondingly, the executing the corresponding test task according to the test case configuration information through the executable file includes:
configuring model parameters for a performance model based on the parameter configuration information, and executing test tasks by running the executable file;
and recording a simulation log generated in the performance model task execution process, and generating a simulation file.
Optionally, in the process of executing the corresponding test task through the executable file according to the test case configuration information, the method further includes:
monitoring the task execution state of each test task;
generating a test task progress prompt message in real time according to the task execution state; the test task progress prompt information comprises any one or more of the number of completed test tasks, the number of currently operated test tasks and the number of remaining test tasks.
Optionally, the acquiring of the compiling instruction and of the test case configuration information includes:
acquiring the compiling instruction and the test case configuration information through a preset interactive interface;
correspondingly, the process of executing the corresponding test task through the executable file according to the test case configuration information further comprises:
and dynamically generating a performance bottleneck report of the performance model on the preset interactive interface.
Optionally, after the result analysis based on the key parameters, the method further includes:
generating a test report of the performance model according to the analysis result and a preconfigured report template; the test report includes any one or more of a performance change curve and a text report.
Optionally, the automatically extracting the key parameter from the log file corresponding to the test task, and performing result analysis based on the key parameter includes:
after the test task is completed, reading and analyzing a log file corresponding to the test task, and extracting key parameters from the analyzed log file;
and determining the performance parameters of the performance model based on the key parameters according to a preset performance calculation rule.
In a second aspect, the present application discloses an automatic performance model testing device, comprising:
the engineering compiling module is used for acquiring compiling instructions, and compiling performance model engineering according to the compiling instructions to generate corresponding executable files;
the test configuration module is used for acquiring test case configuration information and executing corresponding test tasks through the executable file according to the test case configuration information;
and the result analysis module is used for automatically extracting key parameters from the log files corresponding to the test tasks after the test tasks are completed, and carrying out result analysis based on the key parameters.
In a third aspect, the present application discloses an electronic device, comprising:
a memory for storing a computer program;
and the processor is used for executing the computer program to realize the automatic performance model testing method.
In a fourth aspect, the present application discloses a computer-readable storage medium for storing a computer program; wherein the computer program when executed by the processor implements the aforementioned performance model automatic test method.
According to the method provided by the application, a compiling instruction is acquired, and the performance model project is compiled according to it to generate a corresponding executable file; test case configuration information is acquired, and the corresponding test tasks are executed through the executable file according to that information; after each test task is completed, key parameters are automatically extracted from the task's log file and result analysis is performed based on them. Thus, by automatically executing each step of the performance model simulation test process, both simulation testing and test analysis of the performance model are automated, complex operations in the test process are reduced, the complexity of the test process is lowered, and testers are given a concise test flow; because test tasks are executed from the test case configuration information against a single executable file, the same performance model no longer needs to be compiled repeatedly under different parameter configurations, and test efficiency improves.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required by the embodiments or the prior-art description are briefly introduced below. The drawings in the following description are only embodiments of the present application; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of an automatic test method for a performance model provided by the application;
FIG. 2 is a flow chart of a parallel test of a specific performance model provided by the application;
FIG. 3 is a flow chart of a specific performance model serial test provided by the present application;
FIG. 4 is a schematic diagram of exemplary test case configuration information provided by the present application;
FIG. 5 is a flow chart of a specific test task execution provided by the present application;
FIG. 6 is a flow chart of a specific test result analysis provided by the present application;
FIG. 7 is a schematic diagram of an exemplary automated performance model test system provided by the present application;
FIG. 8 is a schematic diagram of a performance model automatic test equipment according to the present application;
fig. 9 is a block diagram of an electronic device according to the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Facing increasingly complex system-on-chip (SoC) designs and severe time constraints, hardware and software design faces serious challenges, and electronic system level design methodologies have been developed in response. System-level design works at a very high level of abstraction and can be applied to performance models, functional models, system designs, and so on, allowing the design flow to enter the software design stage early, perform hardware design at a higher level, and realize hardware/software co-design. At present, testing and result analysis of electronic system level performance models mainly rely on manual testing: when facing multi-group parameter tests, a developer must manually modify model parameters, recompile the model, run simulation tests one by one, write corresponding log-analysis scripts to capture and analyze the test results, and write test reports by hand. Computer resources are not fully used, and a single case test usually requires a wait of several hours, so test efficiency is low, the test cycle is long, and computer-resource utilization is poor. To overcome these technical problems, the application provides an automatic performance model testing method that automates performance model simulation testing and test analysis and improves test efficiency.
The embodiment of the application discloses an automatic performance model testing method, which is shown in fig. 1, and can comprise the following steps:
step S11: and acquiring a compiling instruction, and compiling a performance model project according to the compiling instruction to generate a corresponding executable file.
In this embodiment, a compiling instruction issued by a tester is first acquired; it may be obtained through a preset interactive interface, which may be a graphical user interface (Graphical User Interface, GUI). The performance model project is then compiled according to the compiling instruction, that is, the system model is automatically compiled and linked to generate the corresponding executable file.
Specifically, the tester selects a preset compiling option on the GUI. Taking the CMake build tool as an example, the background processes the corresponding CMakeLists.txt file according to the compiling instruction; after the corresponding Makefile is generated, it runs the make command, compiles and links the model project, generates the corresponding executable file, and prompts the user to proceed to the next operation. The model needs to be recompiled only once, before the tester sets up the test cases; after the executable file is generated, all configured test cases are tested by running that single executable.
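As an illustration of this compile-once flow, the following Python sketch drives CMake and make from a script, much as the background of the GUI might; the build directory layout and the executable name `perf_model` are assumptions for illustration, not names taken from the application.

```python
import subprocess
from pathlib import Path

def compile_model(project_dir: str, build_dir: str = "build") -> Path:
    """Compile and link the performance model project once, returning the executable path."""
    build = Path(project_dir) / build_dir
    build.mkdir(exist_ok=True)
    # Process CMakeLists.txt to generate the Makefile, then build with make.
    subprocess.run(["cmake", ".."], cwd=build, check=True)
    subprocess.run(["make", "-j"], cwd=build, check=True)
    # "perf_model" is a hypothetical target name; the real one comes from the project.
    return build / "perf_model"
```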
Step S12: and acquiring test case configuration information, and executing corresponding test tasks through the executable file according to the test case configuration information.
In this embodiment, the test case configuration information, i.e., the configuration information of the test cases used for the performance model simulation test, may include the running configuration and the parameter configuration of the test cases, covering the number of cases, the running mode, model parameters, simulation scene information, the output file path, the input file path, engine parameters, and the like; the corresponding test tasks are then executed through the executable file according to the acquired test case configuration information. In this embodiment, the test case configuration information may be obtained through the preset interactive interface.
In this embodiment, the obtaining the test case configuration information may include: acquiring operation configuration information aiming at a test case; the operation configuration information comprises the number of cases and operation modes; the operation modes comprise a parallel mode and a serial mode; correspondingly, the executing, according to the test case configuration information, the corresponding test task through the executable file may include: if the operation mode is a parallel mode, a plurality of test tasks are executed in parallel by starting a plurality of terminal windows in the background and submitting a plurality of test cases at the same time; if the operation mode is a serial mode, a terminal window is opened in the background and each test case is submitted automatically and sequentially until all the test tasks are executed.
That is, the tester configures the number of test cases on the GUI and selects a running mode; two modes, parallel testing and serial testing, are provided to the user. In parallel testing, a plurality of terminal windows are opened in the background simultaneously and a plurality of test tasks are submitted at the same time; several test cases run at once, saving test time, improving test efficiency, and raising computer-resource utilization. In serial testing, the next test task is submitted only after the previous one has completed; under limited computing resources, serial testing keeps the machine working without interruption until all test cases are done, saving computer resources and leaving computing headroom for other tasks or personnel. In short, choosing the parallel or serial mode according to the state of the equipment serves either the goal of improving test efficiency or that of saving computer resources.
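The two run modes can be sketched as follows in Python; the `--config` flag is a hypothetical way of passing a case's configuration to the executable (the application does not specify the mechanism), and one background process per case stands in for the "terminal windows" described above.

```python
import subprocess

def run_all(executable: str, case_configs: list[str], mode: str = "parallel") -> list[int]:
    """Run one test task per case configuration, in parallel or serially."""
    # "--config" is a hypothetical flag; the real parameter-passing scheme is unspecified.
    cmds = [[executable, "--config", cfg] for cfg in case_configs]
    if mode == "parallel":
        # Launch every case at once, one background process per "terminal window".
        procs = [subprocess.Popen(cmd) for cmd in cmds]
        return [p.wait() for p in procs]
    # Serial mode: submit the next case only after the previous one finishes.
    return [subprocess.run(cmd).returncode for cmd in cmds]
```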
In this embodiment, the obtaining the test case configuration information may include: acquiring parameter configuration information aiming at a test case; the parameter configuration information comprises any one or more of model parameters, simulation scene information, output file paths, input file paths and engine parameters; correspondingly, the executing, according to the test case configuration information, the corresponding test task through the executable file may include: configuring model parameters for a performance model based on the parameter configuration information, and executing test tasks by running the executable file; and recording a simulation log generated in the performance model task execution process, and generating a simulation file.
It can be understood that, when the performance model needs to use experimental results as input parameters, the input file path corresponding to the test case is selected through the GUI. For example, when experimentally measured I/O (Input/Output) processing-delay information of an engine, or a parameter file such as the IOPS (Input/Output Operations Per Second) of the CPU, serves as a model parameter, the experimental result file is read and configured into the system model as input. The system model outputs its test results as a log file, which is parsed to obtain the changes of the key parameters; the log-file output path and the simulation-report output path generated when the model runs can be set on the GUI, so that logs are written to the corresponding paths once generated. The engine parameters may include the buffer depth within an engine, switches for particular functions, and so on. The model parameters are the model and test parameters: when a task is executed, the performance model is configured with the user-set model parameters, simulation scene information, and the like as input parameters in order to test the target scene, and the model running log is saved to the user-set path.
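As a hedged illustration of turning the GUI parameter configuration into model input, the sketch below writes a flat key=value configuration file for one test case (pairing with the hypothetical `--config` flag used earlier); the file format and every key name are assumptions for illustration, not the application's actual scheme.

```python
def write_case_config(path: str, model_params: dict, scene: str,
                      input_path: str, output_path: str, engine_params: dict) -> None:
    """Serialize one test case's parameter configuration as key=value lines."""
    lines = [f"scene={scene}",            # simulation scene selected in the GUI
             f"input_file={input_path}",  # e.g. measured I/O delay or CPU IOPS file
             f"log_dir={output_path}"]    # where the simulation log will be written
    lines += [f"{k}={v}" for k, v in model_params.items()]
    lines += [f"engine.{k}={v}" for k, v in engine_params.items()]  # e.g. buffer_depth
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```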
In this embodiment, the process of executing the corresponding test task according to the test case configuration information through the executable file may further include: and dynamically generating a performance bottleneck report of the performance model on the preset interactive interface. The performance bottleneck of the performance model can be dynamically displayed in the simulation process or after the simulation process is finished, and the simulation result is efficiently and intuitively displayed.
In this embodiment, the process of executing the corresponding test task according to the test case configuration information through the executable file may further include: monitoring the task execution state of each test task; generating a test task progress prompt message in real time according to the task execution state; the test task progress prompt information comprises any one or more of the number of completed test tasks, the number of currently operated test tasks and the number of remaining test tasks. It may be understood that the test case configuration information is configuration information for a plurality of test cases, each test case is used as a test task, in this embodiment, the task execution state may be monitored during the task execution process, and task progress information such as the number of completed test tasks, the number of currently running test tasks, and the number of remaining test tasks may be displayed in the interactive interface.
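A minimal sketch of the bookkeeping behind such progress prompts, assuming task states are kept in a per-case dictionary that a GUI timer could poll; the state names are illustrative.

```python
def progress_summary(states: dict) -> str:
    """Summarize task progress from a {case_id: state} map kept by the test manager."""
    done = sum(1 for s in states.values() if s == "done")
    running = sum(1 for s in states.values() if s == "running")
    remaining = len(states) - done - running
    return f"completed: {done} | running: {running} | remaining: {remaining}"

# e.g. progress_summary({"case1": "done", "case2": "running", "case3": "waiting"})
# -> "completed: 1 | running: 1 | remaining: 1"
```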
Step S13: and automatically extracting key parameters from the log file corresponding to the test task after the test task is completed, and analyzing results based on the key parameters.
In this embodiment, after a test task is completed, the key parameters may be automatically extracted from the task's log file according to a key-parameter template, and result analysis is then performed on them automatically; automatic extraction plus automatic analysis together realize the automated analysis of the test results.
In this embodiment, the automatic extraction of the key parameters from the log file corresponding to the test task and the result analysis based on those parameters may include: after the test task is completed, reading and parsing the log file corresponding to the test task and extracting the key parameters from the parsed log; and determining the performance parameters of the performance model based on the key parameters according to preset performance calculation rules, i.e., the performance parameters are computed by rules fixed in advance. For example, when a test case has run on the performance model, a preset script is executed to read the saved log file and record the key information, including buffer-occupancy information and the timestamps of I/Os inside the engines. Once the key information is extracted, performance parameters such as the in-engine delay of each I/O, the overall delay of each I/O, and the utilization of each buffer are calculated. The change curves the user cares about are drawn into charts and temporarily stored at the output-file location. It can be understood that, in this embodiment, the choice of key parameters and the performance calculation rules may be defined according to the actual use case.
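For concreteness, the following sketch shows one way to extract in-engine I/O timestamps from a log and compute per-I/O in-engine delay; the log-line format and the regular expression are assumptions, since the application does not specify a log schema.

```python
import re

# Hypothetical log lines: "io=42 enter_engine t=1200ns" / "io=42 exit_engine t=1890ns"
LOG_RE = re.compile(r"io=(\d+)\s+(enter|exit)_engine\s+t=(\d+)ns")

def engine_latencies(log_path: str) -> dict:
    """Return {io_id: in-engine delay in ns} from paired enter/exit timestamps."""
    enter, latency = {}, {}
    with open(log_path) as f:
        for line in f:
            m = LOG_RE.search(line)
            if not m:
                continue
            io_id, event, ts = int(m.group(1)), m.group(2), int(m.group(3))
            if event == "enter":
                enter[io_id] = ts
            elif io_id in enter:
                latency[io_id] = ts - enter[io_id]
    return latency
```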
In this embodiment, after the result analysis based on the key parameters, the method may further include: generating a test report of the performance model according to the analysis results and a preconfigured report template, where the test report includes any one or more of a performance change curve and a text report. After the result analysis of a test case is completed, a preset script writes the user-configured test case information, the analysis results, and the drawn charts into the test report corresponding to that test case, and the report is saved under the user-set path.
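A minimal sketch of such a report writer, assuming the python-docx package is used to produce the word-format report and that a chart image has already been drawn; the section titles are illustrative, not part of the application's template.

```python
from docx import Document  # python-docx package

def write_report(case_info: dict, results: dict, chart_png: str, out_path: str) -> None:
    """Fill a word-format test report from case configuration, results, and a chart."""
    doc = Document()
    doc.add_heading("Performance model test report", level=1)
    doc.add_heading("Test case configuration", level=2)
    for key, value in case_info.items():
        doc.add_paragraph(f"{key}: {value}")
    doc.add_heading("Analysis results", level=2)
    for key, value in results.items():
        doc.add_paragraph(f"{key}: {value}")
    doc.add_picture(chart_png)  # performance change curve drawn beforehand
    doc.save(out_path)
```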
The application thus provides an automatic test method for electronic system level performance models that addresses the complex test procedures, low test efficiency, and low resource utilization of existing ESL performance-model test and analysis methods: test cases are submitted automatically in a parallel or serial mode, improving test efficiency, and test parameters and test steps are reduced to intuitive interface operations, cutting the testers' complex manual work and improving the test efficiency of the system model.
From the above, in this embodiment a compiling instruction is acquired and the performance model project is compiled according to it to generate the corresponding executable file; test case configuration information is acquired and the corresponding test tasks are executed through the executable file according to that information; after each test task is completed, key parameters are automatically extracted from the task's log file and result analysis is performed based on them. By automatically executing each step of the performance model simulation test process, testing and analysis are automated, complex operations in the test process are reduced, the complexity of the test process is lowered, and testers are given a concise test flow; because test tasks are executed from the test case configuration information against a single executable file, the same performance model no longer needs to be compiled repeatedly under different parameter configurations, and test efficiency improves.
The embodiment of the application discloses a specific automatic performance model testing method, which comprises the following steps:
1. The tester selects a compiling option on the GUI; the system model is automatically compiled and linked to generate an executable file.
Specifically, the tester selects a compiling option on the operation interface; the background executes the cmake command and processes the corresponding CMakeLists.txt to generate the Makefile, then runs make to compile and link the model project and produce the executable file. The model needs to be recompiled only once, before the tester sets up the test cases; after the executable file is generated, all configured test cases are tested by running that executable.
2. A parallel or serial test mode is selected according to the user's requirements.
Specifically, the tester selects the parallel or serial test mode on the operation interface. In the parallel test, several terminal windows are opened in the background simultaneously and an independent test task is created in each window; the parallel-mode schematic is shown in fig. 2, and the multiple test tasks submitted by the user proceed concurrently. The parallel mode saves test time and improves test efficiency. In the serial test, when one test task completes, the next is created and submitted, until all test tasks have been executed; the serial-mode schematic is shown in fig. 3. The serial mode saves computer resources and leaves computing headroom for other tasks or personnel, and it guarantees the machine is never idle while tasks remain; compared with manual testing, the automated serial mode still raises computer-resource utilization. The tester selects the appropriate mode according to personal and machine requirements.
3. Parameter configuration information for the test cases is set.
The parameter configuration information includes the number of test cases, test scene options, model parameter configuration, and the like. The detailed information of the test tasks is added as follows: configure the number of test cases, select a case number, and configure the model and test parameters under each case option in turn; configure the simulation scene by selecting one of the scenes provided in the performance model; and configure the engine parameters, such as the buffer depth within an engine and switches for particular functions.
4. An input file path and an output file path for each test case are set.
When the performance model needs experimental results as input parameters, the input file path corresponding to the test case is selected through the GUI. For example, when experimentally measured I/O processing-delay information of an engine, or a parameter file such as the IOPS of the CPU, serves as a model parameter, the experimental result file is read and configured into the system model as input. The system model outputs its test results as a log file, which is parsed to obtain the changes of the key parameters; the log-file output path and the simulation-report output path generated when the model runs are set on the GUI. Steps 3 and 4 may be performed after the user selects a test mode, following the flow shown in fig. 4 on the GUI.
5. Executing the test task and recording the log file.
Specifically, the executable file is run, the simulation parameters are configured, the simulation log is recorded, and the test progress is monitored. Once the test tasks have been set up, the tester selects "run"; the background submits the test tasks, runs the executable file, configures the performance model with the user-set simulation parameters as inputs, records the performance model's simulation log, and writes it to the user-set file path. The execution state of the test tasks is monitored, and the numbers of completed, currently running, and remaining test tasks are displayed in the GUI window. The flow of executing the test tasks is shown in fig. 5.
6. The performance of the simulation model is analyzed: the simulation test log file is read and parsed, and the key performance information is calculated.
When a test task completes, a preset script is invoked to read the log file and parse its statements, obtaining and recording the key parameters, including buffer-occupancy information, in-engine I/O timestamps, and other information of interest to the designers. The preset script then uses the recorded key information to calculate the performance parameters: for example, in-engine I/O delay and overall I/O delay from the timestamp statistics, and the utilization of each buffer from the buffer-occupancy information.
7. A performance parameter change curve is drawn and a test report is output.
Corresponding charts are drawn from the key performance information calculated in step 6 and temporarily stored under the output-report path, awaiting the call of the report-writing module. The user configuration information, simulation test output, key-parameter calculations, and drawn charts are automatically written into a word-format report, which includes the model parameter settings, input data sources, the relevant parameter-change figures, the key-parameter calculation results, the simulation test duration, and so on; the test report is written to the output path set by the user. Steps 6 and 7 may be performed after the test case's simulation model has run, as shown in fig. 6.
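A minimal sketch of the curve-drawing step using matplotlib, assuming the per-I/O latencies computed by the earlier extraction sketch; the axis labels and the choice of chart are illustrative.

```python
import matplotlib.pyplot as plt

def plot_latency_curve(latencies: dict, out_png: str) -> None:
    """Draw a per-I/O in-engine latency curve and park it for the report writer."""
    ids = sorted(latencies)
    plt.figure()
    plt.plot(ids, [latencies[i] for i in ids])
    plt.xlabel("I/O id")
    plt.ylabel("in-engine latency (ns)")
    plt.title("Performance change curve")
    plt.savefig(out_png)
    plt.close()
```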
8. After testing is finished, the generated executable file and configuration files are cleaned up.
Specifically, after testing is finished, the generated executable file and configuration files are cleaned up, redundant charts produced during result analysis are removed, the background terminal windows are closed, and "test complete" is shown to the user on the GUI. After each test case completes, its redundant charts are cleaned up: when the test report is generated automatically, the drawn charts are embedded into the word-format document, so the chart files under the storage path become redundant and are deleted once the simulation ends. After all test cases have completed, the generated executable file and configuration files are cleaned up.
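A sketch of this cleanup step, assuming per-case `.cfg` configuration files and `.png` chart files under a case directory; the file extensions and directory layout are assumptions for illustration.

```python
from pathlib import Path

def clean_artifacts(executable: str, case_dir: str) -> None:
    """Remove the generated executable, per-case configs, and redundant chart files."""
    Path(executable).unlink(missing_ok=True)
    for cfg in Path(case_dir).glob("*.cfg"):
        cfg.unlink()
    for chart in Path(case_dir).glob("*.png"):
        chart.unlink()  # charts are already embedded in the word-format report
```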
For example, fig. 7 shows a corresponding automated performance model test system composed of a GUI module, an engineering compiling module, a test configuration module, a test progress management module, a result analysis module, and a report writing module; the user operates the GUI module, and the background invokes the other modules to run the model case tests. The GUI module handles human-machine interaction; the engineering compiling module compiles the project and generates the executable file; the test configuration module configures the test case configuration information; the test progress management module monitors task progress; the result analysis module automatically extracts the key parameters and performs result analysis based on them; and the report writing module automatically generates the analysis report from the analysis results.
Correspondingly, the embodiment of the application also discloses an automatic performance model testing device, which is shown in fig. 8 and comprises:
the engineering compiling module 11 is used for acquiring compiling instructions, and compiling performance model engineering according to the compiling instructions to generate corresponding executable files;
the test configuration module 12 is configured to obtain test case configuration information, and execute a corresponding test task according to the test case configuration information through the executable file;
and the result analysis module 13 is used for automatically extracting key parameters from the log files corresponding to the test tasks after the test tasks are completed, and carrying out result analysis based on the key parameters.
From the above, in this embodiment a compiling instruction is acquired and the performance model project is compiled according to it to generate the corresponding executable file; test case configuration information is acquired and the corresponding test tasks are executed through the executable file according to that information; after each test task is completed, key parameters are automatically extracted from the task's log file and result analysis is performed based on them. By automatically executing each step of the performance model simulation test process, testing and analysis are automated, complex operations in the test process are reduced, the complexity of the test process is lowered, and testers are given a concise test flow; because test tasks are executed from the test case configuration information against a single executable file, the same performance model no longer needs to be compiled repeatedly under different parameter configurations, and test efficiency improves.
In some embodiments, the test configuration module 12 may specifically include:
the configuration information acquisition unit is used for acquiring operation configuration information aiming at the test case; the operation configuration information comprises the number of cases and operation modes; the operation modes comprise a parallel mode and a serial mode;
accordingly, the test configuration module 12 includes:
the parallel execution unit is used for executing a plurality of test tasks in parallel by starting a plurality of terminal windows in the background and submitting a plurality of test cases at the same time if the operation mode is a parallel mode;
and the serial execution unit is used for opening a terminal window in the background and automatically submitting each test case in sequence until all the test tasks are executed if the operation mode is a serial mode.
In some embodiments, the test configuration module 12 may specifically include:
the configuration information acquisition unit is used for acquiring parameter configuration information aiming at the test case; the parameter configuration information comprises any one or more of model parameters, simulation scene information, output file paths, input file paths and engine parameters;
accordingly, the test configuration module 12 includes:
the execution unit is used for configuring model parameters for the performance model based on the parameter configuration information and executing test tasks by running the executable file;
the recording unit is used for recording the simulation log generated in the performance model task execution process and generating a simulation file.
In some specific embodiments, the performance model automatic test equipment may specifically further include:
the monitoring unit is used for monitoring the task execution state of each test task;
the progress prompt unit is used for generating test task progress prompt information in real time according to the task execution state; the test task progress prompt information comprises any one or more of the number of completed test tasks, the number of currently operated test tasks and the number of remaining test tasks.
In some embodiments, the performance model automatic test equipment may specifically include:
the configuration information acquisition unit is used for acquiring the compiling instruction and the test case configuration information through a preset interactive interface;
correspondingly, the automatic performance model testing device further comprises:
and the performance report dynamic generation unit is used for dynamically generating a performance bottleneck report of the performance model on the preset interactive interface.
In some embodiments, the result analysis module 13 may specifically include:
the test report generating unit is used for generating a test report of the performance model according to the analysis result and a pre-configured report template; the test report includes any one or more of a performance change curve and a text report.
In some embodiments, the result analysis module 13 may specifically include:
the key parameter extraction unit is used for reading and analyzing the log file corresponding to the test task after the test task is completed, and extracting key parameters from the analyzed log file;
and the performance parameter determining unit is used for determining the performance parameters of the performance model based on the key parameters according to a preset performance calculation rule.
Further, the embodiment of the application also discloses an electronic device, and referring to fig. 9, the content in the drawing should not be considered as any limitation on the application scope of the application.
Fig. 9 is a schematic structural diagram of an electronic device 20 according to an embodiment of the present application. The electronic device 20 may specifically include: at least one processor 21, at least one memory 22, a power supply 23, a communication interface 24, an input output interface 25, and a communication bus 26. Wherein the memory 22 is used for storing a computer program, and the computer program is loaded and executed by the processor 21 to implement relevant steps in the performance model automatic test method disclosed in any of the foregoing embodiments.
In this embodiment, the power supply 23 is configured to provide an operating voltage for each hardware device on the electronic device 20; the communication interface 24 can create a data transmission channel between the electronic device 20 and an external device, and the communication protocol to be followed is any communication protocol applicable to the technical solution of the present application, which is not specifically limited herein; the input/output interface 25 is used for acquiring external input data or outputting external output data, and the specific interface type thereof may be selected according to the specific application requirement, which is not limited herein.
The memory 22 may be a carrier for storing resources, such as a read-only memory, a random access memory, a magnetic disk, or an optical disk, and the resources stored thereon include an operating system 221, a computer program 222, and data 223 including test case configuration information, which may be stored in a temporary manner or a permanent manner.
The operating system 221 is used to manage and control the hardware devices on the electronic device 20 and the computer program 222, enabling the processor 21 to operate on and process the data 223 in the memory 22; it may be Windows Server, Netware, Unix, Linux, etc. Besides the computer program that performs the automatic performance model testing method executed by the electronic device 20 as disclosed in any of the foregoing embodiments, the computer program 222 may further include computer programs for other specific tasks.
Further, the embodiment of the application also discloses a computer storage medium, wherein the computer storage medium stores computer executable instructions, and when the computer executable instructions are loaded and executed by a processor, the steps of the performance model automatic test method disclosed in any embodiment are realized.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, so that the same or similar parts between the embodiments are referred to each other. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in Random Access Memory (RAM), memory, read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above detailed description of the method, the device, the equipment and the medium for automatically testing the performance model provided by the application applies specific examples to illustrate the principle and the implementation of the application, and the above examples are only used for helping to understand the method and the core idea of the application; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present application, the present description should not be construed as limiting the present application in view of the above.
Claims (10)
1. An automatic performance model testing method is characterized by comprising the following steps:
acquiring a compiling instruction, compiling a performance model project according to the compiling instruction to generate a corresponding executable file;
acquiring test case configuration information, and executing corresponding test tasks through the executable file according to the test case configuration information;
and automatically extracting key parameters from the log file corresponding to the test task after the test task is completed, and analyzing results based on the key parameters.
2. The method of claim 1, wherein the obtaining test case configuration information comprises:
acquiring operation configuration information aiming at a test case; the operation configuration information comprises the number of cases and operation modes; the operation modes comprise a parallel mode and a serial mode;
correspondingly, the executing the corresponding test task according to the test case configuration information through the executable file includes:
if the operation mode is a parallel mode, a plurality of test tasks are executed in parallel by starting a plurality of terminal windows in the background and submitting a plurality of test cases at the same time;
if the operation mode is a serial mode, a terminal window is opened in the background and each test case is submitted automatically and sequentially until all the test tasks are executed.
3. The method of claim 1, wherein the obtaining test case configuration information comprises:
acquiring parameter configuration information aiming at a test case; the parameter configuration information comprises any one or more of model parameters, simulation scene information, output file paths, input file paths and engine parameters;
correspondingly, the executing the corresponding test task according to the test case configuration information through the executable file includes:
configuring model parameters for a performance model based on the parameter configuration information, and executing test tasks by running the executable file;
and recording a simulation log generated in the performance model task execution process, and generating a simulation file.
4. The method for automatically testing a performance model according to claim 1, wherein the process of executing the corresponding test task through the executable file according to the test case configuration information further comprises:
monitoring the task execution state of each test task;
generating a test task progress prompt message in real time according to the task execution state; the test task progress prompt information comprises any one or more of the number of completed test tasks, the number of currently operated test tasks and the number of remaining test tasks.
5. The method of claim 1, wherein the obtaining compiling instructions and the obtaining test case configuration information comprise:
acquiring the compiling instruction and the test case configuration information through a preset interactive interface;
correspondingly, the process of executing the corresponding test task through the executable file according to the test case configuration information further comprises:
and dynamically generating a performance bottleneck report of the performance model on the preset interactive interface.
6. The method for automatically testing a performance model according to claim 1, further comprising, after the analyzing the results based on the key parameters:
generating a test report of the performance model according to the analysis result and a preconfigured report template; the test report includes any one or more of a performance change curve and a text report.
7. The method for automatically testing a performance model according to any one of claims 1 to 6, wherein the automatically extracting key parameters from the log file corresponding to the test task and performing result analysis based on the key parameters includes:
after the test task is completed, reading and analyzing a log file corresponding to the test task, and extracting key parameters from the analyzed log file;
and determining the performance parameters of the performance model based on the key parameters according to a preset performance calculation rule.
8. An automatic performance model testing device, comprising:
the engineering compiling module is used for acquiring compiling instructions, and compiling performance model engineering according to the compiling instructions to generate corresponding executable files;
the test configuration module is used for acquiring test case configuration information and executing corresponding test tasks through the executable file according to the test case configuration information;
and the result analysis module is used for automatically extracting key parameters from the log files corresponding to the test tasks after the test tasks are completed, and carrying out result analysis based on the key parameters.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program to implement the performance model automatic test method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program; wherein the computer program when executed by a processor implements the performance model automatic test method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310545640.XA | 2023-05-12 | 2023-05-12 | Automatic performance model testing method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116643977A true CN116643977A (en) | 2023-08-25 |
Family
ID=87614616
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118672903A (en) * | 2024-05-31 | 2024-09-20 | 中国科学院软件研究所 | System automatic test method and device based on discrete event simulation engine |
CN119046171A (en) * | 2024-10-30 | 2024-11-29 | 中国民用航空飞行学院 | Aircraft performance software calculation kernel test analysis tool, method, computer equipment, medium and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |