CN112527878A - Automatic tool for IO performance test of database - Google Patents
- Publication number
- CN112527878A (application CN202011476670.2A)
- Authority
- CN
- China
- Prior art keywords
- module
- database
- user interaction
- processing module
- tested
- Prior art date: 2020-12-15
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/25—Integrating or interfacing systems involving database management systems
- G06F16/21—Design, administration or maintenance of databases
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention provides an automation tool for database IO performance testing, which relates to the field of databases and comprises: a user interaction module for receiving user operations and returning operation results; a processing module, connected to at least one database to be tested, for operating the database to be tested according to an operation instruction and obtaining an operation result; a first conversion module, connected to both the user interaction module and the processing module, for converting a user operation received by the user interaction module into a corresponding operation instruction and sending the operation instruction to the processing module; and a second conversion module, connected to both the user interaction module and the processing module, for sending the operation result to the user interaction module. The tool is simple to operate: parameter configuration and test execution are performed through a visual user interaction interface, which lowers the learning cost of the automation tool and reduces wasted labor. Several commonly used databases are supported for testing, which reduces the cost of arranging environment dependencies and improves testing efficiency.
Description
Technical Field
The invention relates to the field of databases, in particular to an automation tool for testing IO performance of a database.
Background
With the development of new and high technologies, the database has become one of the most important parts of an application system and a major focus of performance testing. In operating system testing, compatibility with third-party database systems has become a necessary test item, and database IO performance is one of the indexes used to measure how well an operating system supports a database.
The database performance test tools commonly found on the market today share the following characteristics: they are installed from source code, have complex functions, lack visualization, require many configuration items and produce test results with poor readability, so they must be configured and used by highly specialized testers.
When testing database IO performance on an operating system there are many test scenarios, so tools must be installed, configured and run frequently; the existing tools on the market are complex to install and configure, and labor is wasted on repetitive work.
There are currently few tools on the market dedicated to database IO performance testing.
In terms of operation, the current tools on the market require source-code installation, manual modification of configuration files and manual execution of test commands; interaction is inconvenient, and more specialized personnel are needed to operate and use them.
In terms of result readability, the test results of existing test tools are hard to read: the test data must be collated and the results extracted manually, which wastes labor.
In terms of cross-platform support, existing tools only run tests in a Linux environment and do not run on Windows.
Disclosure of Invention
To address the problems in the prior art, the invention provides an automation tool for testing the IO performance of a database, comprising:
a user interaction module for receiving user operations and returning operation results;
a processing module, connected to at least one database to be tested, for operating the database to be tested according to an operation instruction and obtaining an operation result;
a first conversion module, connected to both the user interaction module and the processing module, for converting a user operation received by the user interaction module into a corresponding operation instruction and sending the operation instruction to the processing module;
and a second conversion module, connected to both the user interaction module and the processing module, for sending the operation result to the user interaction module.
Preferably, the processing module is connected to the database to be tested through a cross-platform interface.
Preferably, the tool further comprises a configuration module connected to the user interaction module for forming and/or editing test cases according to the user operation.
Preferably, the tool further comprises a result output module connected to the processing module for analyzing and saving the operation result.
Preferably, the result output module parses the operation result through a regular expression and stores the parsed result in a text file.
Preferably, the processing module comprises:
the process management unit is used for creating and scheduling at least one process for operating the database to be tested;
the function management unit is connected with the process management unit and used for managing at least one pre-configured function for the process to call;
and the summarizing unit is connected with the process management unit and is used for summarizing the result obtained by operating the to-be-tested database by each process to form the operation result.
Preferably, the processing module is implemented in Python.
Preferably, the user interaction module is implemented with a Qt interface, which forms text in XML format according to the user operation.
Preferably, the first conversion module is implemented with the pyuic plug-in of the PyQt5 toolkit, which converts the XML text into Python objects.
Preferably, the second conversion module is implemented with the PyQt5 toolkit and controls the user interaction module using the operation result obtained from the Python run.
The technical scheme has the following advantages or beneficial effects:
(1) the technical scheme supports testing of several commonly used databases, which helps reduce the cost of arranging environment dependencies and improves testing efficiency;
(2) the technical scheme is simple to operate, and parameter configuration and test execution for the database to be tested are carried out through a visual user interaction interface, which lowers the learning cost of the automation tool and reduces wasted labor;
(3) the technical scheme can be used across platforms: any test environment with Python and the relevant third-party libraries installed can run the automation tool of the technical scheme, which reduces its limitations.
Drawings
FIG. 1 is a schematic diagram of an automated tool according to a preferred embodiment of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The present invention is not limited to the embodiment, and other embodiments may be included in the scope of the present invention as long as the gist of the present invention is satisfied.
To address the above problems in the prior art, an automation tool for testing the IO performance of a database is provided, as shown in Fig. 1, comprising:
a user interaction module 1 for receiving user operations and returning operation results;
a processing module 2, connected to at least one database 2A to be tested, for operating the database 2A to be tested according to an operation instruction and obtaining an operation result;
a first conversion module 3, connected to both the user interaction module 1 and the processing module 2, for converting a user operation received by the user interaction module 1 into a corresponding operation instruction and sending the operation instruction to the processing module 2;
and a second conversion module 4, connected to both the user interaction module 1 and the processing module 2, for sending the operation result to the user interaction module 1.
Specifically, in this embodiment, the user interaction module 1 may be a visualized user interaction interface designed based on Qt, and the Qt interface is used to form a text in an XML format according to a user operation. According to the technical scheme, the parameter configuration and execution of each database 2A to be tested are carried out on the visual user interaction interface, so that the learning cost of an automation tool is reduced, and the labor waste is reduced.
In this embodiment, the user interaction module 1 receives a user operation, the first conversion module 3 converts it into a corresponding operation instruction and sends the instruction to the processing module 2, and the processing module 2 performs a read-write performance test on the database 2A to be tested according to the operation instruction. The read-write performance of each database 2A can be tested in terms of both data length and data content. In the user interaction interface, the user can enter the data length and the number of requests in turn and then press the test button to test the read-write performance of each database 2A with respect to data length; alternatively, the user can enter the data to add, the data to modify and the number of requests to test the read-write performance of each database 2A with respect to data content. The operation result is fed back to the user interaction interface through the second conversion module 4 for analysis, display and storage.
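As an illustrative sketch only (the widget names, default values and the on_run callback are assumptions, not part of the disclosure), a minimal Qt panel of this kind, turning the on-screen input into a plain operation instruction for a processing callback, could look as follows:

```python
# Minimal sketch of a parameter-entry panel (assumed names): the user enters a
# data length and request count, and clicking "Test" hands the parameters on.
import sys
from PyQt5.QtWidgets import (QApplication, QFormLayout, QLineEdit,
                             QPushButton, QWidget)

class TestPanel(QWidget):
    def __init__(self, on_run):
        super().__init__()
        self._on_run = on_run                    # callback into the processing side
        self.length_edit = QLineEdit("128")      # data length in bytes
        self.requests_edit = QLineEdit("10000")  # number of read/write requests
        run_button = QPushButton("Test")
        run_button.clicked.connect(self._submit)
        layout = QFormLayout(self)
        layout.addRow("Data length", self.length_edit)
        layout.addRow("Requests", self.requests_edit)
        layout.addRow(run_button)

    def _submit(self):
        # Turn the on-screen user operation into a plain operation instruction.
        self._on_run({"data_length": int(self.length_edit.text()),
                      "requests": int(self.requests_edit.text())})

if __name__ == "__main__":
    app = QApplication(sys.argv)
    panel = TestPanel(on_run=print)   # print stands in for the processing module
    panel.show()
    sys.exit(app.exec_())
```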
In a preferred embodiment of the invention, the processing module 2 is implemented in Python.
In a preferred embodiment of the invention, the first conversion module 3 is implemented with the pyuic plug-in of the PyQt5 toolkit, which converts the XML text into Python objects.
In a preferred embodiment of the invention, the second conversion module 4 is implemented with the PyQt5 toolkit and controls the user interaction module 1 using the operation result obtained from the Python run.
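As a hedged illustration of these two conversions (the .ui file name, the generated class name and the resultTextEdit widget are assumptions): pyuic5 turns the Qt Designer XML into a Python class, and the operation result is written back into the interface at run time.

```python
# First conversion: generate Python code from the Designer XML, e.g.
#   pyuic5 io_test_window.ui -o ui_io_test_window.py
from PyQt5.QtWidgets import QMainWindow
from ui_io_test_window import Ui_IoTestWindow   # module produced by pyuic5 (assumed name)

class IoTestWindow(QMainWindow):
    def __init__(self):
        super().__init__()
        self.ui = Ui_IoTestWindow()
        self.ui.setupUi(self)              # builds the widgets described by the XML

    def show_result(self, text: str):
        # Second conversion: push the processing module's result back to the UI.
        self.ui.resultTextEdit.setPlainText(text)
```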
The technical scheme is realized in a test environment in which Python and the relevant third-party libraries are installed, and it can be used across platforms: any test environment with Python and the relevant third-party libraries installed can run the automation tool of the technical scheme, which reduces the limitations on the test environment.
The usage flow of the automation tool in the technical scheme is as follows: obtain the type of the database 2A to be tested through the user interaction interface, the type being one or more of redis, memcache, mysql and postgresql; check the installation and configuration status of the database 2A to be tested through the Python os module; complete the connection to the database 2A to be tested through Python; obtain the parameters of the data to be tested through the user interaction interface and configure the parameters of each database 2A so as to execute the test; and, after the test finishes, save the log of the test process and the test result locally.
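A minimal sketch of the installation check and connection steps for one database type follows; the use of shutil.which, the redis-py client and the default host and port are assumptions made for illustration rather than the exact calls of the embodiment.

```python
# Rough installation check and connection for redis (assumed approach).
import shutil
import redis   # third-party client: pip install redis

def redis_installed() -> bool:
    # Is a redis-server binary available on PATH?
    return shutil.which("redis-server") is not None

def connect_redis(host: str = "127.0.0.1", port: int = 6379) -> redis.Redis:
    client = redis.Redis(host=host, port=port, socket_timeout=3)
    client.ping()          # raises an exception if the server is unreachable
    return client

if __name__ == "__main__":
    if redis_installed():
        connect_redis()
        print("redis connection OK")
```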
The technical scheme supports testing of various commonly used databases 2A, which helps reduce the cost of arranging environment dependencies and improves testing efficiency; the performance of read, write and other operations on these commonly used databases 2A can be tested.
In a preferred embodiment, the read-write performance test of the redis database 2A is performed with the automation tool of the technical scheme as follows (a sketch of the underlying measurement appears after the list):
1. in the user interaction interface, select redis as the database 2A;
2. select "connect to redis";
3. check the connection state of redis;
4. configure the test parameters for redis;
5. select the output path for the test report;
6. execute the test;
7. enter the corresponding directory to view the test result and the log.
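The sketch below illustrates the kind of read-write measurement behind steps 4 to 6, assuming a locally reachable redis instance; the key naming and the simple loop-based timing are illustrative assumptions, not the exact procedure of the embodiment.

```python
# Time a batch of writes and reads of a given payload length against redis.
import time
import redis

def redis_rw_benchmark(client: redis.Redis, data_length: int, requests: int) -> dict:
    payload = b"x" * data_length
    start = time.perf_counter()
    for i in range(requests):
        client.set(f"iotest:{i}", payload)        # write phase
    write_seconds = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(requests):
        client.get(f"iotest:{i}")                 # read phase
    read_seconds = time.perf_counter() - start

    return {"write_ops_per_s": requests / write_seconds,
            "read_ops_per_s": requests / read_seconds}

if __name__ == "__main__":
    print(redis_rw_benchmark(redis.Redis(), data_length=128, requests=10000))
```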
In the preferred embodiment of the present invention, the processing module 2 is connected to the database 2A to be tested through a cross-platform interface.
Specifically, in this embodiment, the cross-platform interface may be implemented through the cross-platform nature of Python.
In the preferred embodiment of the present invention, the tool further comprises a configuration module 6 connected to the user interaction module 1 for forming and/or editing test cases according to the user operation.
Specifically, in this embodiment, configuring the data to be tested in the database 2A through the configuration module 6 covers adding data, modifying data, deleting data and querying data; for each operation the length of the data content can be edited, and the default configuration is supplied through the Python configparser module.
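A hedged sketch of such configparser-based defaults is shown below; the section names, option names and file name are assumptions used only for illustration.

```python
# Built-in defaults for the test cases, optionally overridden by a user file.
import configparser

DEFAULTS = """
[add]
data_length = 128
requests = 10000

[query]
data_length = 128
requests = 10000
"""

def load_case_defaults(user_file: str = "io_test.ini") -> configparser.ConfigParser:
    config = configparser.ConfigParser()
    config.read_string(DEFAULTS)   # default configuration
    config.read(user_file)         # user edits override the defaults if the file exists
    return config

cfg = load_case_defaults()
print(cfg.getint("add", "data_length"))   # 128 unless overridden
```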
In the preferred embodiment of the present invention, the tool further comprises a result output module 5 connected to the processing module 2 for analyzing and saving the operation result.
Specifically, in this embodiment, the operation result is analyzed and stored by the result output module 5. Preferably, the operation result may be stored in the local storage space as an Excel table, forming an operation log.
In a preferred embodiment of the present invention, the result output module 5 parses the operation result through a regular expression and stores the parsed result in a text file.
Specifically, in this embodiment, the test results are processed with regular expressions and openpyxl as follows: the test results of the multiple processes are summarized through a Queue; the parsed results are stored in a log file; and the test data are analyzed and computed with Python to produce an Excel report, which is written out through openpyxl.
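As an illustrative sketch (the log line format and the regular expression are assumptions), parsing raw result lines with re and writing them into an .xlsx report with openpyxl could look roughly as follows:

```python
# Parse raw result lines and write them to an Excel report.
import re
from openpyxl import Workbook

LINE_RE = re.compile(r"op=(\w+)\s+ops_per_s=([\d.]+)")   # assumed log format

def write_report(log_lines, path="io_test_report.xlsx"):
    book = Workbook()
    sheet = book.active
    sheet.append(["operation", "ops_per_s"])              # header row
    for line in log_lines:
        match = LINE_RE.search(line)
        if match:
            sheet.append([match.group(1), float(match.group(2))])
    book.save(path)

write_report(["op=write ops_per_s=51230.5", "op=read ops_per_s=68870.2"])
```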
In a preferred embodiment of the present invention, the processing module 2 comprises:
a process management unit 21 for creating and scheduling at least one process for operating the database to be tested 2A;
the function management unit 22 is connected with the process management unit 21 and used for managing at least one function which is pre-configured for being called by a process;
and the summarizing unit 23 is connected with the process management unit 21 and is used for summarizing the result obtained by each process operating the database to be tested 2A to form an operation result.
Specifically, in this embodiment, the process management unit 21 creates a process pool through Pool and uses the pool to control the amount of concurrent data; the function management unit 22 holds the read-write functions written for the databases 2A, which each read-write process calls to execute read and write operations on its database 2A; and the summarizing unit 23 may be a Queue, through which the test results of the multiple Python processes are summarized.
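A minimal sketch of this process-management and summarizing arrangement is given below; run_case stands in for the pre-configured read-write functions, and the use of a Manager queue (which, unlike a plain multiprocessing.Queue, can be passed to Pool workers as an argument) is an implementation assumption.

```python
# A Pool of worker processes runs the test cases; a managed Queue collects results.
from multiprocessing import Manager, Pool

def run_case(args):
    case_id, result_queue = args
    # ... call the configured read/write function for this process here ...
    result_queue.put({"case": case_id, "ops_per_s": 0.0})   # placeholder result

if __name__ == "__main__":
    manager = Manager()
    result_queue = manager.Queue()            # proxy queue, safe to share with Pool workers
    with Pool(processes=4) as pool:
        pool.map(run_case, [(i, result_queue) for i in range(4)])
    summary = [result_queue.get() for _ in range(4)]   # summarize per-process results
    print(summary)
```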
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011476670.2A CN112527878A (en) | 2020-12-15 | 2020-12-15 | Automatic tool for IO performance test of database |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112527878A true CN112527878A (en) | 2021-03-19 |
Family
ID=75000051
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011476670.2A Pending CN112527878A (en) | 2020-12-15 | 2020-12-15 | Automatic tool for IO performance test of database |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112527878A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102075381A (en) * | 2010-12-14 | 2011-05-25 | 云海创想信息技术(北京)有限公司 | Automatic test platform server and system applied to cloud storage |
US20140215078A1 (en) * | 2013-01-29 | 2014-07-31 | Qualcomm Incorporated | Cross-platform module that is shared by client applications for access to rich communications suite resources on a client device |
CN107291611A (en) * | 2016-04-11 | 2017-10-24 | 中兴通讯股份有限公司 | Call the test system and method for third-party testing instrument |
CN107402854A (en) * | 2016-05-19 | 2017-11-28 | 中兴通讯股份有限公司 | Test information management method, apparatus, test case execution system and equipment |
CN107203473A (en) * | 2017-05-26 | 2017-09-26 | 四川长虹电器股份有限公司 | The automatization test system and method for automatic expansion interface test case |
CN108519932A (en) * | 2018-01-24 | 2018-09-11 | 中国电子信息产业集团有限公司第六研究所 | A kind of more performance testing tools based on homemade chip platform |
CN112015715A (en) * | 2019-05-28 | 2020-12-01 | 清华大学 | Industrial Internet data management service testing method and system |
Non-Patent Citations (2)
Title |
---|
王小科 (Wang Xiaoke): "Python GUI设计PyQt5 从入门到实践" (Python GUI Design with PyQt5: From Beginner to Practice), Jilin University Press, 31 July 2020, pages 29-32 *
霍亚飞 (Huo Yafei): "Qt Creator快速入门" (Qt Creator Quick Start), Beihang University Press, 31 January 2017, page 397 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113505126A (en) * | 2021-06-18 | 2021-10-15 | 山东师范大学 | Information management security visualization method and system based on domestic database |
Similar Documents
Publication | Title |
---|---|
CN109885488B (en) | Automatic test method and system for satellite orbit calculation software driven by case table |
CN106294673B (en) | Method and system for analyzing log data in real time by user-defined rule |
US9141521B2 (en) | Method and apparatus for automatically generating a test script for a graphical user interface |
CN103186460B (en) | A kind of method, Apparatus and system of generating test use case script |
CN108509556B (en) | Data migration method and device, server and storage medium |
US20120174061A1 (en) | Code suggestion in a software development tool |
CN110457277A (en) | Service process performance analysis method, device, equipment and storage medium |
US9009175B2 (en) | System and method for database migration and validation |
CN107943689B (en) | Automatic test method and test system based on parameterized test script |
CN104778124A (en) | Automatic testing method for software application |
CN110020021B (en) | Visualization-based data stream processing method |
CN110543299A (en) | Cloud computing management platform code generation method and device |
US10157057B2 (en) | Method and apparatus of segment flow trace analysis |
CN111611127A (en) | Processing method, device and equipment for task running log and storage medium |
CN115543290A (en) | Visual programming method and system |
CN112527878A (en) | Automatic tool for IO performance test of database |
CN108427645B (en) | Method and system for realizing unattended operation in automatic test platform without command line interface |
CN114185791A (en) | Method, device and equipment for testing data mapping file and storage medium |
KR101476536B1 (en) | The method and system for inspecting program |
CN109408391A (en) | Software System Integrated Testing system based on continuous integrating technology |
CN101165637A (en) | Input system and method |
CN103220186A (en) | Communication equipment testing method and system |
KR100591354B1 (en) | Apparatus for writing a program using a computer |
CN114356337B (en) | A method, device and equipment for executing a parameter-expandable program organization unit |
KR101940719B1 (en) | Task graph construct apparatus and method of conversational processing system based on task graph |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||