Disclosure of Invention
The present disclosure provides an automated testing method and apparatus, a server, and a storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an automated testing method, applied to a server, including:
receiving test task information, wherein the test task information comprises scene setting parameters and task parameters;
constructing a test scene based on the scene setting parameters;
sending, according to the task parameters, an operation instruction to a test terminal under the built test scene;
and detecting an execution result of the operation instruction executed by the test terminal.
Optionally, the test task comprises an image acquisition task, wherein the test scene is an image acquisition scene, and the operation instruction comprises an image acquisition instruction;
the detecting the execution result of the operation instruction executed by the test terminal comprises:
performing quality evaluation on the image acquired by the test terminal.
Optionally, the scene setting parameters comprise equipment identifiers of environment building equipment and first working parameters of the environment building equipment;
the constructing a test scene based on the scene setting parameters comprises:
determining, according to the equipment identifier, the environment building equipment for building the test scene;
and controlling the environment building equipment to operate according to the first working parameter.
Optionally, the environment construction device comprises at least one of:
the scenery equipment is used for constructing scenery in the space where the test scene is located;
the environment adjusting device is used for providing a test environment required by the test scene;
and the displacement equipment is used for adjusting the position of the test terminal in the test scene.
Optionally, the environment adjusting device includes at least one of:
the illumination adjusting device is used for providing an illumination environment required by the test environment;
the temperature adjusting device is used for providing a temperature environment required by the test environment;
the humidity adjusting device is used for providing a humidity environment required by the test environment;
and the air pressure adjusting device is used for providing an air pressure environment required by the test environment.
Optionally, the first working parameter of the illumination adjusting device includes at least one of:
illumination brightness;
an illumination angle;
and illumination chromaticity.
Optionally, the sending an operation instruction to the test terminal located in the built test scene according to the task parameter includes:
sending, according to the task parameter, a driving instruction to the displacement equipment, wherein the driving instruction is used for driving the displacement equipment to move so as to drive the test terminal to move to a preset position of the test scene;
and sending the operation instruction to the test terminal located at the preset position in the test scene.
Optionally, the environment setting up device further includes a photographing device, and the sending, according to the task parameter, a driving instruction to the displacement device includes:
sending an acquisition instruction for acquiring an image of the environment where the test terminal is located to the photographing equipment;
receiving an image returned by the photographing equipment;
analyzing the image and determining the position of a test target of the test terminal in the image;
determining the preset position of the test terminal for testing the test target in the test scene according to the position of the test target in the image;
and sending a driving instruction carrying the preset position to the displacement equipment.
Optionally, the task parameters include:
the function identifier is used for indicating the function to be tested;
and the test item identifier is used for indicating the test item of the function to be tested.
Optionally, the image acquisition task includes:
portrait acquisition;
night scene acquisition;
sea scene acquisition;
static scene acquisition;
moving scene acquisition;
telephoto acquisition;
wide-angle acquisition;
and macro acquisition.
Optionally, the task parameter includes a second working parameter when the test terminal executes the operation instruction;
the sending an operation instruction to the test terminal under the built test scene comprises the following steps:
and sending the operation instruction carrying the second working parameter to the test terminal under the built test scene.
Optionally, the second operating parameter includes at least one of:
focusing parameters;
white balance parameters;
and sensitivity parameters.
According to a second aspect of embodiments of the present disclosure, there is provided an automated testing apparatus, for application in a server, comprising:
a receiving module, configured to receive test task information, wherein the test task information comprises scene setting parameters and task parameters;
a building module, configured to build a test scene based on the scene setting parameters;
a sending module, configured to send an operation instruction to the test terminal under the built test scene according to the task parameters;
and a detection module, configured to detect an execution result of the operation instruction executed by the test terminal.
Optionally, the scene setting parameters comprise equipment identifiers of environment building equipment and first working parameters of the environment building equipment;
the building module is specifically configured to determine the environment building equipment for building the test scene according to the equipment identifier, and control the environment building equipment to operate according to the first working parameter.
Optionally, the environment construction device comprises at least one of:
the scenery equipment is used for constructing scenery in the space where the test scene is located;
the environment adjusting device is used for providing a test environment required by the test scene;
and the displacement equipment is used for adjusting the position of the test terminal in the test scene.
Optionally, the environment adjusting device includes at least one of:
the illumination adjusting device is used for providing an illumination environment required by the test environment;
the temperature adjusting device is used for providing a temperature environment required by the test environment;
the humidity adjusting device is used for providing a humidity environment required by the test environment;
and the air pressure adjusting device is used for providing an air pressure environment required by the test environment.
Optionally, the first working parameter of the illumination adjusting device includes at least one of:
illumination brightness;
an illumination angle;
and illumination chromaticity.
Optionally, the sending module is specifically configured to send a driving instruction to the displacement device according to the task parameter, where the driving instruction is used to drive the displacement device to move so as to drive the test terminal to move to a preset position of the test scene, and send the operation instruction to the test terminal located at the preset position in the test scene.
Optionally, the environment setting up device further comprises a photographing device,
The sending module is specifically configured to send an acquisition instruction for acquiring an image of an environment where the test terminal is located to the photographing equipment, receive the image returned by the photographing equipment, analyze the image, determine the position of a test target of the test terminal in the image, determine the preset position of the test terminal for testing the test target in the test scene according to the position of the test target in the image, and send a driving instruction carrying the preset position to the displacement equipment.
Optionally, the task parameters include:
the function identifier is used for indicating the function to be tested;
and the test item identifier is used for indicating the test item of the function to be tested.
Optionally, the image acquisition task includes:
portrait acquisition;
night scene acquisition;
sea scene acquisition;
static scene acquisition;
moving scene acquisition;
telephoto acquisition;
wide-angle acquisition;
and macro acquisition.
Optionally, the task parameter includes a second working parameter when the test terminal executes the operation instruction;
The sending module is specifically configured to send the operation instruction carrying the second working parameter to the test terminal under the built test scene.
Optionally, the second operating parameter includes at least one of:
focusing parameters;
white balance parameters;
and sensitivity parameters.
According to a third aspect of the disclosed embodiments, an automated testing system is provided, comprising a server, a testing terminal and an environment construction device, wherein,
the server is used for receiving test task information, sending a construction instruction to the environment construction equipment based on scene setting parameters in the test task information, and sending an operation instruction to the test terminal according to task parameters in the test task information;
the environment construction equipment is used for constructing a test scene according to the construction instruction;
and the test terminal is in communication connection with the server, and is used for executing the operation instruction in the built test scene and returning a test result to the server.
Optionally, the scene setting parameters include a first working parameter of the environment construction device;
The environment construction equipment is used for receiving the construction instruction carrying the first working parameter sent by the server based on an Arduino hardware interface.
Optionally, the server and the environment setting up device perform remote communication based on a Web service.
Optionally, the environment construction device comprises a displacement device,
the server is further used for sending a driving instruction to the displacement equipment according to the task parameters;
and the displacement equipment is used for driving the test terminal to move to a preset position of the test scene according to the driving instruction.
Optionally, the test task comprises an image acquisition task, the environment setting-up device further comprises a photographing device,
the server is further used for sending an acquisition instruction for acquiring an image of the environment where the test terminal is located to the photographing equipment;
the photographing equipment is used for acquiring the image of the environment where the test terminal is located according to the acquisition instruction and sending the image to the server;
and the server is used for analyzing the image, determining the position of the shooting target of the test terminal in the image, determining the preset position of the test terminal for shooting the shooting target in the test scene according to the position of the shooting target in the image, and sending a driving instruction carrying the preset position to the displacement equipment.
According to a fourth aspect of embodiments of the present disclosure, there is provided a server comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the automated test method as described in the first aspect above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a storage medium, wherein
instructions in the storage medium, when executed by a processor of a server, enable the server to perform the automated testing method as described in the first aspect above.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
In the embodiments of the present disclosure, the server builds the test scene according to the scene setting parameters included in the received test task information, so that the test terminal can work, in the built test scene, according to the task parameters included in the test task information. On one hand, no manpower is consumed in searching for an external scene and manually operating the test terminal to execute the test task, so manpower is saved and test efficiency is improved. On the other hand, the test scene is formed by instructing equipment to work in a certain state according to the scene setting parameters, and is therefore more stable than an external natural scene, so the reliability of the test can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
FIG. 1 is a flowchart of an automated testing method according to an embodiment of the present disclosure, and as shown in FIG. 1, the automated testing method applied to a server includes the following steps:
S11, receiving test task information, wherein the test task information comprises scene setting parameters and task parameters;
S12, constructing a test scene based on the scene setting parameters;
S13, sending an operation instruction to a test terminal under the built test scene according to the task parameters;
S14, detecting an execution result of the operation instruction executed by the test terminal.
In an embodiment of the present disclosure, the automated testing method is applied in a server. In step S11, the server may receive test task information, which is set by a technician and includes scene setting parameters and task parameters. The scene setting parameters comprise the equipment identifiers and working parameters of the equipment required by the scene environment corresponding to the test task to be executed, and the task parameters refer to the working parameters used by the test terminal when executing the test task.
For example, the test tasks include an image acquisition task, a voice acquisition task, or a network test task. Taking an image acquisition task as an example, the scene setting parameters may include an identifier of the lighting equipment and a corresponding illumination brightness, and may also include an identifier of an acquisition target. Taking a voice acquisition task as an example, the scene setting parameters may include an equipment identifier of voice blocking equipment used during voice acquisition and the azimuth of the voice blocking equipment, and may also include a voice strength parameter of voice pronunciation equipment. Taking a network test task as an example, the scene setting parameters may include an identifier of a network signal transmitting device, the network type to be used, a network signal transmission intensity parameter, and the like.
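Purely as an illustration of how such test task information could be organized for an image acquisition task, a possible layout is sketched below; all field names (for example "scene_params", "task_params") and values are hypothetical assumptions, not a format prescribed by this disclosure.

    # Hypothetical layout of test task information for an image acquisition task.
    # All field names and values are illustrative only.
    test_task_info = {
        "scene_params": {                 # scene setting parameters
            "devices": [
                {"device_id": "light_01",
                 "params": {"brightness": 800, "angle": 45, "chromaticity": 5500}},
                {"device_id": "rail_01",
                 "params": {"position_mm": 0}},
            ],
        },
        "task_params": {                  # task parameters
            "function_id": "1",           # e.g. image acquisition function test
            "test_item_id": "11",         # e.g. portrait acquisition test
            "working_params": {           # second working parameters of the test terminal
                "focus": "auto",
                "white_balance": "daylight",
                "iso": 100,
            },
        },
    }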
In step S12, the server may build a test scenario based on the scenario setting parameters in the test task information. It should be noted that, in the embodiment of the present disclosure, when the server builds a test scene, an instruction is sent to each device in the test scene according to the scene setting parameters, so that each device in the test scene works with the corresponding scene setting parameters.
In step S13, after the test scene is set up, an operation instruction is sent to the test terminal in the test scene according to the task parameters, so that the test terminal can work according to the task parameters. The test terminal may be an electronic device such as a smart phone, a tablet computer, or a smart speaker, and the task parameters may indicate which function the test terminal is to test, for example an image acquisition function test, a voice acquisition function test, or a network function test. In addition, the task parameters may further include the working parameters with which the test terminal executes the corresponding function, for example a focusing parameter, a white balance parameter, and a sensitivity parameter of the test terminal when performing the image acquisition function.
In step S14, the server may detect the execution result of the operation instruction executed by the test terminal, so that the execution result can be saved for manual analysis or analyzed automatically by the server. It should be noted that the server can detect the execution result because the test terminal uploads the execution result to the server after executing the operation instruction.
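A minimal sketch of the S11-S14 flow on the server side is given below; the helper callables (receive_task_info, build_scene, send_operation_instruction, collect_result) are hypothetical placeholders standing in for whatever transport and control layer an implementation actually uses, and are not part of this disclosure.

    # Minimal sketch of steps S11-S14; all helpers are hypothetical placeholders.
    def run_automated_test(receive_task_info, build_scene,
                           send_operation_instruction, collect_result):
        task_info = receive_task_info()                       # S11: receive test task information
        build_scene(task_info["scene_params"])                # S12: build the test scene
        send_operation_instruction(task_info["task_params"])  # S13: instruct the test terminal
        result = collect_result()                             # S14: detect the execution result
        return result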
In the related art, when a terminal product needs to be tested, an external scene is found manually and the terminal is operated manually to execute the test task. This approach consumes considerable manpower, is inefficient, and the manually found external scene is unstable, so the test may not be executed as expected.
It can be understood that, in the embodiment of the disclosure, the server builds the test scene according to the scene setting parameters included in the received test task information, so that the test terminal can work, in the built test scene, according to the task parameters included in the test task information. On one hand, no manpower is consumed in searching for an external scene and manually operating the test terminal to execute the test task, so manpower is saved and test efficiency is improved. On the other hand, the test scene is formed by instructing equipment to work in a certain state according to the scene setting parameters, and is therefore more stable than an external natural scene, so the reliability of the test can be improved.
In one embodiment, the test task comprises an image acquisition task, wherein the test scene is an image acquisition scene, and the operation instruction comprises an image acquisition instruction;
the detecting the execution result of the operation instruction executed by the test terminal comprises:
performing quality evaluation on the image acquired by the test terminal.
In this embodiment, the test task includes an image acquisition task, the test scene is an image acquisition scene, and the corresponding operation instruction executed by the test terminal is an image acquisition instruction. After the server sends the image acquisition instruction to the test terminal, the test terminal can upload the acquired image to the server, and the server can evaluate the quality of the acquired image after receiving it.
Conventionally, image quality assessment is performed manually after the test terminal completes image acquisition; such manual review is inefficient and time-consuming. In this embodiment, the server can automatically evaluate the quality of the image acquired by the test terminal, for example based on artificial intelligence, and generate an evaluation report. It can be appreciated that this greatly improves test efficiency.
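As one possible, non-authoritative way to automate such an evaluation, the server could compute simple no-reference metrics with OpenCV, for example sharpness (variance of the Laplacian) and mean brightness; the threshold below is an illustrative assumption, not a value prescribed by this disclosure.

    import cv2

    def evaluate_image_quality(image_path, sharpness_threshold=100.0):
        """Illustrative no-reference quality check: sharpness and brightness."""
        image = cv2.imread(image_path)
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()   # variance of the Laplacian
        brightness = float(gray.mean())                      # average gray level
        return {
            "sharpness": sharpness,
            "brightness": brightness,
            "sharp_enough": sharpness >= sharpness_threshold,  # assumed acceptance threshold
        }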
In one embodiment, the image acquisition task includes:
portrait acquisition;
night scene acquisition;
sea scene acquisition;
static scene acquisition;
moving scene acquisition;
telephoto acquisition;
wide-angle acquisition;
and macro acquisition.
In this embodiment, after the server receives an image acquisition task such as portrait acquisition, night scene acquisition, or sea scene acquisition, the corresponding test scene can be set up based on the corresponding scene setting parameters.
In one embodiment, the scene setting parameters comprise an equipment identifier of an environment construction equipment and a first working parameter of the environment construction equipment;
the constructing a test scene based on the scene setting parameters comprises:
determining, according to the equipment identifier, the environment construction equipment for constructing the test scene;
and controlling the environment construction equipment to operate according to the first working parameter.
In an embodiment of the disclosure, the scene setting parameters include the equipment identifier of the environment construction equipment and the first working parameter of the environment construction equipment. The environment construction equipment is, for example, the lighting equipment, the voice pronunciation equipment, or the network signal transmitting equipment described above. The server can determine, according to the equipment identifier, the environment construction equipment for constructing the test scene, and send a control signal to the environment construction equipment corresponding to the equipment identifier so as to control it to operate with the first working parameter.
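A schematic dispatch of this step is sketched below, assuming a hypothetical registry that maps equipment identifiers to controller objects exposing a set_params() method; both the registry and the method name are illustrative assumptions.

    # Hypothetical dispatch: look up each device by its identifier and apply
    # its first working parameters. The controller objects and their
    # set_params() method are assumptions for illustration.
    def build_test_scene(scene_params, device_registry):
        for entry in scene_params["devices"]:
            controller = device_registry[entry["device_id"]]  # environment construction equipment
            controller.set_params(entry["params"])            # operate with the first working parameters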
In one embodiment, the environment construction device comprises at least one of:
the scenery equipment is used for constructing scenery in the space where the test scene is located;
the environment adjusting device is used for providing a test environment required by the test scene;
and the displacement equipment is used for adjusting the position of the test terminal in the test scene.
In embodiments of the present disclosure, the environment construction equipment may include scenery equipment for constructing the scenery located in the space where the test scene is located. For example, for an image acquisition scene, the scenery equipment may be a device that sets up a backdrop of a specific background color or a character of a specific type, or a device that sets up a sea, sunshine, or rain scene.
The environment construction equipment may comprise an environment adjusting device for providing the test environment required by the test scene. In one embodiment of the present disclosure, the environment adjusting device includes at least one of:
the illumination adjusting device is used for providing an illumination environment required by the test environment;
the temperature adjusting device is used for providing a temperature environment required by the test environment;
the humidity adjusting device is used for providing a humidity environment required by the test environment;
and the air pressure adjusting device is used for providing an air pressure environment required by the test environment.
In this embodiment, the environment adjusting device may comprise an illumination adjusting device, a temperature adjusting device, a humidity adjusting device, and an air pressure adjusting device. For example, in an image acquisition scene, the daytime or night environment required by the test terminal to perform the image acquisition task is provided by the illumination adjusting device, and the special temperature environment, humidity environment, and air pressure environment required by the test terminal to perform the image acquisition task are provided by the temperature adjusting device, the humidity adjusting device, and the air pressure adjusting device respectively.
The environment construction equipment may further comprise a displacement device, which may be a slide rail or a robot and which can adjust the position of the test terminal within the test scene. For example, in an image acquisition scene, if the test terminal needs to photograph an acquisition target such as a "child", the displacement device is required to move the test terminal to a position from which the "child" can be captured.
In one embodiment, the first working parameter of the illumination adjusting device comprises at least one of:
illumination brightness;
an illumination angle;
and illumination chromaticity.
In embodiments of the present disclosure, different illumination brightness, illumination angles, and illumination chromaticity of the illumination adjustment device may all affect the test environment. For example, in an image acquisition scene, different illumination brightness, illumination angle and illumination chromaticity of the illumination adjusting device have a certain influence on an image acquired by the test terminal.
In one embodiment, step S13 includes:
sending, according to the task parameter, a driving instruction to the displacement equipment, wherein the driving instruction is used for driving the displacement equipment to move so as to drive the test terminal to move to a preset position of the test scene;
and sending the operation instruction to the test terminal located at the preset position in the test scene.
In this embodiment, as described above, the server may adjust the position of the test terminal in the test scene through the displacement device, and specifically, the server may send a driving instruction to the displacement device according to the task parameter, for example, the position of the acquisition target indicated in the task parameter, so as to drive the displacement device to drive the test terminal to move to a predetermined position where the acquisition target can be acquired. After the test terminal moves to the preset position, the server can send an operation instruction to the test terminal positioned at the preset position in the test scene.
It should be noted that, in an embodiment of the present disclosure, a correspondence between task parameters and predetermined positions may be stored in the server. Based on the pre-stored corresponding relation, the server can search the corresponding preset position after receiving the task parameters so as to send a driving instruction to the displacement equipment, and the displacement equipment drives the test terminal to move to the preset position of the test scene.
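A sketch of such a pre-stored correspondence is given below, under the assumption that the predetermined positions are keyed by the test item identifier and expressed in millimetres; the table contents and command format are purely illustrative.

    # Assumed mapping from task parameters (here the test item identifier)
    # to a predetermined position for the displacement equipment, in millimetres.
    PRESET_POSITIONS = {"11": {"x": 120, "y": 40}, "12": {"x": 300, "y": 40}}

    def make_driving_instruction(task_params):
        position = PRESET_POSITIONS[task_params["test_item_id"]]
        return {"command": "move", "target": position}   # sent to the displacement equipment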
In another embodiment of the present disclosure, the environment setting up device further includes a photographing device, and the sending, according to the task parameter, a driving instruction to the displacement device includes:
sending an acquisition instruction for acquiring an image of the environment where the test terminal is located to the photographing equipment;
receiving an image returned by the photographing equipment;
analyzing the image and determining the position of a test target of the test terminal in the image;
determining the preset position of the test terminal for testing the test target in the test scene according to the position of the test target in the image;
and sending a driving instruction carrying the preset position to the displacement equipment.
In this embodiment, the server may also determine, based on image analysis, the predetermined position of the test terminal for testing the test target in the test scene. Specifically, the environment setting-up device further comprises a photographing device; the server can send an acquisition instruction for acquiring an image of the environment where the test terminal is located to the photographing device, receive the image returned by the photographing device, and analyze the image to determine the position of the test target of the test terminal in the image. Once the position of the test target in the image is determined, the position of the test target in actual environment coordinates can be determined based on a stored scale relation between the photographed image and the actual environment, and the predetermined position of the test terminal for testing the test target in the test scene can be further determined.
Taking an image acquisition task as an example, the test target may be a specific person; after determining the position of the specific person in the image acquired of the environment where the test terminal is located, the server can determine the predetermined position at which the test terminal photographs the specific person in the image acquisition scene.
Taking a voice acquisition task as an example, the test target may be a voice blocking device; after determining the position of the voice blocking device in the image acquired of the environment where the test terminal is located, the server can determine the predetermined position at which the test terminal tests voice reception, with the voice blocking device present, in the voice acquisition scene.
After determining the preset position of the test terminal for testing the test target in the test scene, the server can send a driving instruction carrying the preset position to the displacement equipment so that the displacement equipment moves to drive the test terminal to move to the preset position of the test scene.
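One hedged way to realize this step is template matching with OpenCV followed by a pixel-to-physical conversion using the stored scale relation; the scale factor (mm_per_pixel), the template image, and the command format below are assumptions made for illustration only.

    import cv2

    def locate_target_and_make_instruction(scene_image_path, template_path, mm_per_pixel):
        """Find the test target in the environment image and derive a driving instruction.

        mm_per_pixel stands in for the stored scale relation between the
        photographed image and the actual environment (an assumption here).
        """
        scene = cv2.imread(scene_image_path, cv2.IMREAD_GRAYSCALE)
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)           # top-left corner of the best match
        target_x_mm = max_loc[0] * mm_per_pixel            # pixel position -> physical position
        target_y_mm = max_loc[1] * mm_per_pixel
        return {"command": "move", "target": {"x": target_x_mm, "y": target_y_mm}}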
In one embodiment, the task parameters include:
the function identifier is used for indicating the function to be tested;
and the test item identifier is used for indicating the test item of the function to be tested.
In embodiments of the present disclosure, the task parameters may include a function identifier indicating the function to be tested, and may also include a test item identifier indicating a test item of the function to be tested.
For example, the function identifier "1" indicates an image acquisition function test, "11" indicates a portrait acquisition test, "12" indicates a sea scene acquisition test, and the like. The function identifier "2" indicates a network function test, "21" indicates a Bluetooth network, "22" indicates a wireless network (Wireless Fidelity, Wi-Fi), and the like.
In one embodiment, the task parameters include a second operating parameter when the test terminal executes the operation instruction;
the sending an operation instruction to the test terminal under the built test scene comprises the following steps:
and sending the operation instruction carrying the second working parameter to the test terminal under the built test scene.
In the embodiment of the disclosure, the task parameters further include a second working parameter when the test terminal executes the operation instruction. Taking an image acquisition task as an example, in one embodiment, the second operating parameter includes at least one of:
focusing parameters;
white balance parameters;
and sensitivity parameters.
It can be appreciated that the second working parameter when the test terminal executes the operation instruction is included in the task parameter, so that the test terminal can work with the designated parameter, thereby realizing customization and having more flexibility.
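Combining the task parameters discussed above, an operation instruction sent to the test terminal might be assembled as in the following sketch; the JSON layout and the transport (an HTTP POST to an assumed endpoint on the terminal) are illustrative assumptions rather than the protocol of this disclosure.

    import json
    import urllib.request

    def send_operation_instruction(terminal_url, task_params):
        """Illustrative only: pack the function/test item identifiers and the
        second working parameters into one operation instruction."""
        instruction = {
            "function_id": task_params["function_id"],       # e.g. "1" = image acquisition test
            "test_item_id": task_params["test_item_id"],     # e.g. "11" = portrait acquisition
            "working_params": task_params["working_params"], # focus, white balance, sensitivity
        }
        request = urllib.request.Request(
            terminal_url,
            data=json.dumps(instruction).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:    # assumed HTTP endpoint on the terminal
            return json.loads(response.read().decode("utf-8"))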
Fig. 2 is a block diagram of an automated testing system in an embodiment of the disclosure. As shown in Fig. 2, the system includes a server 201, a test terminal 202, and an environment setting-up device 203, wherein,
the server 201 is configured to receive test task information, send a setting-up instruction to the environment setting-up device 203 based on scene setting parameters in the test task information, and send an operation instruction to the test terminal 202 according to task parameters in the test task information;
the environment setting-up device 203 is in communication connection with the server 201, and is configured to set up a test scene according to the setting-up instruction;
and the test terminal 202 is in communication connection with the server 201, and is configured to execute the operation instruction in the set-up test scene and return a test result to the server 201.
In this embodiment, the server 201 establishes communication connections with the test terminal 202 and the environment setting-up device 203 respectively, so that the environment setting-up device 203 can set up the test scene based on the setting-up instruction sent by the server, and the test terminal 202 can execute the operation instruction in the set-up test scene based on the operation instruction sent by the server and return the test result to the server 201.
It can be understood that, in the embodiment of the disclosure, the server sends the setting-up instruction to the environment setting-up device according to the scene setting parameters included in the received test task information, so that the environment setting-up device sets up the test scene and the test terminal can work according to the operation instruction sent by the server in the set-up test scene. On one hand, no manpower is consumed in searching for an external scene and manually operating the test terminal to execute the test task, so manpower is saved and test efficiency is improved. On the other hand, the test scene is formed by instructing the environment setting-up device to work in a certain state according to the scene setting parameters, and is therefore more stable than an external natural scene, so the reliability of the test can be improved.
In one embodiment, the scene setting parameters comprise a first operating parameter of the environment construction device;
the environment setting-up device 203 is configured to receive, based on an Arduino hardware interface, the setting-up instruction carrying the first working parameter sent by the server 201.
In this embodiment, the interface through which the environment setting-up device 203 communicates with the server 201 may be an Arduino hardware interface. A technician compiles the program for the hardware interface into a binary file and burns the binary file into the environment setting-up device 203 to form a microcontroller, which can then communicate remotely with the server 201 based on a predetermined network protocol. Fig. 3 is an exemplary diagram of an Arduino program in the environment setting-up device in an embodiment of the present disclosure.
It should be noted that, after the Arduino program is burned, the environment setting-up device 203 further needs a corresponding host computer (upper computer) program to form a complete microcontroller setup, where the host computer program is developed according to a certain logic based on a serial port or another communication medium. Meanwhile, an automation program that can communicate with the microcontroller is integrated at the server side to call the Arduino hardware interface, so that the environment setting-up device 203 can be automatically set up, changed, and adjusted, achieving the effect that the environment setting-up device 203 changes on demand to influence the physical environment.
For example, the environment setting up device 203 is an illumination lamp or a sliding rail, and the illumination lamp or the sliding rail can receive the setting up instruction carrying the first working parameter sent by the server 201 based on the Arduino hardware interface, so as to realize light brightness adjustment, movement of the sliding rail, angle change and the like.
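On the server side, the automation program that calls the Arduino hardware interface could, for instance, talk to the burned microcontroller over a serial port as sketched below; the command strings ("LIGHT ...", "RAIL ...") and port settings are hypothetical and depend entirely on the firmware actually burned into the equipment.

    import serial  # pyserial

    def send_setup_instruction(port="/dev/ttyUSB0", baudrate=9600):
        """Illustrative serial exchange with an Arduino-based environment setting-up device."""
        with serial.Serial(port, baudrate, timeout=2) as link:
            link.write(b"LIGHT BRIGHTNESS 800\n")   # hypothetical command: set illumination brightness
            link.write(b"RAIL MOVE 120\n")          # hypothetical command: move the slide rail
            return link.readline().decode("ascii", errors="ignore").strip()  # device acknowledgement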
In one embodiment, the server 201 and the environment set-up device 203 communicate remotely based on a Web service.
In an embodiment of the present disclosure, the server 201 and the environment setting-up device 203 communicate remotely based on a Web service. Specifically, the client program of the environment setting-up device 203 is integrated into a Web framework (the Web service), the server 201 controls remote client serial-port communication through the Web framework, and the Web framework backend (server 201) does not operate the laboratory hardware (environment setting-up device 203) directly. The remote client serial-port communication based on the Web framework can be realized as follows: a Node service is automatically installed in the client environment when the laboratory Web is accessed; the client uses Node.js to communicate with the Arduino, implementing Electron-like client-side functions; the service side uses JavaScript, HTML, and CSS to build cross-platform application functions; and local services such as HTTP and WebSocket are used to realize the hardware-related automatic communication of the Web page (client).
In one embodiment, the environment construction device 203 comprises a displacement device 204,
The server 201 is further configured to send a driving instruction to the displacement device 204 according to the task parameter;
The displacement device 204 is configured to drive the test terminal 202 to move to a predetermined position of the test scene according to the driving instruction.
In this embodiment, the server 201 may adjust the position of the test terminal 202 in the test scene through the displacement device 204, and specifically, the server may send a driving instruction to the displacement device 204 according to the task parameter, for example, the position of the acquisition target indicated in the task parameter, so as to drive the displacement device 204 to drive the test terminal 202 to move to a predetermined position where the acquisition target can be acquired.
In one embodiment, the test tasks include image acquisition tasks, the environment construction device 203 further includes a photographing device 205,
The server 201 is further configured to send an acquisition instruction for acquiring an image of an environment where the test terminal 202 is located to the photographing apparatus 205;
The photographing device 205 is configured to collect, according to the collection instruction, an image of an environment where the test terminal 202 is located and send the image to the server 201;
The server 201 is configured to analyze the image, determine a position of a shooting target of the test terminal 202 in the image, determine the predetermined position of the test terminal 202 for shooting the shooting target in the test scene according to the position of the shooting target in the image, and send a driving instruction carrying the predetermined position to the displacement device 204.
In this embodiment, taking an image acquisition task as an example, the server 201 may further determine a predetermined position of the test terminal 202 for capturing a capturing target in the test scene based on the manner of image analysis. Specifically, the environment setting-up device 203 further includes a photographing device 205, where the server 201 may send an acquisition instruction for acquiring an image of the environment where the test terminal 202 is located to the photographing device 205, and receive an image returned by the photographing device 205, so as to analyze the image to determine a position of a photographing target of the test terminal 202 in the image, and further determine a predetermined position of the test terminal 202 for photographing the photographing target in the test scene.
In the image analysis, the server 201 may use visual recognition techniques such as OpenCV. OpenCV is an open-source computer vision library from Intel, composed of a series of C functions and a small number of C++ classes, and implements many general algorithms in image processing and computer vision.
In embodiments of the present disclosure, to perform content-based image analysis, meaningful features such as contours, lines, and blobs must be extracted from the image. Taking the detection of image contours as an example, the Canny operator may be used. The core idea of the Canny operator is to use two different thresholds, a low threshold and a high threshold, to determine which points belong to a contour. The low threshold is chosen so that all edge pixels belonging to important image contours are retained. The steps are as follows (a code sketch follows the steps below):
1. The input image is Gaussian-smoothed to blur the details of the image. Gaussian smoothing gives pixels at different positions different weights when averaging the gray levels of the pixels in a neighborhood; for example, in a 3×3 Gaussian template, the closer a position is to the center of the neighborhood, the higher its weight on the template. The significance of arranging the weights in this way is that, when the template is used to smooth the image, more of the gray-level distribution characteristics of the whole image are preserved while the details of the image are blurred.
2. The gradient amplitude and direction are calculated to estimate the edge strength and direction at each point; optional templates include the Sobel operator, the Prewitt operator, and the Roberts template. Taking the Sobel operator as an example, the horizontal operator Sobel_x and the vertical operator Sobel_y are convolved with the input image f(x, y) to obtain dx and dy, as in formulas (1)-(3):

    Sobel_x = [ -1 0 1; -2 0 2; -1 0 1 ]    (1)
    Sobel_y = [ -1 -2 -1; 0 0 0; 1 2 1 ]    (2)
    dx = f(x, y) * Sobel_x(x, y),  dy = f(x, y) * Sobel_y(x, y)    (3)

The amplitude M(x, y) of the image gradient is then obtained as in formula (4):

    M(x, y) = sqrt(dx^2 + dy^2)    (4)
Fig. 4 is an exemplary diagram showing the gradient vector, azimuth angle, and edge direction of a center point. As shown in Fig. 4, the edge direction at any point is orthogonal to the gradient direction.
3. Non-maximum suppression is performed on the gradient amplitude according to the gradient direction.
Assuming that, within a 3×3 region, the edge can be divided into the four directions vertical, horizontal, 45°, and 135°, the gradient direction is likewise divided into four directions (each orthogonal to the corresponding edge direction). Fig. 5 is an exemplary diagram of non-maximum suppression; as shown in Fig. 5, all possible directions are quantized into these four directions: a vertical edge corresponds to a horizontal gradient direction, a 45° edge to a 135° gradient direction, a horizontal edge to a vertical gradient direction, and a 135° edge to a 45° gradient direction. Non-maximum suppression then compares the magnitude of each point with the corresponding neighborhood values in the 3×3 neighborhood along these four gradient directions.
4. Edges are detected and connected using a double-threshold algorithm. The high threshold TH and the low threshold TL may be selected in a ratio of 2:1 or 3:1. Points smaller than the low threshold are discarded and assigned 0; points larger than the high threshold are marked and assigned 1 (or 255), and these points are taken as edge points.
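The four steps above correspond closely to what OpenCV's Canny implementation already performs internally; a brief sketch of contour extraction along these lines is given below, with an illustrative low threshold and the high threshold chosen at the 3:1 ratio mentioned above (the OpenCV 4.x return signature of findContours is assumed).

    import cv2

    def extract_contours(image_path, low_threshold=50):
        """Gaussian smoothing, Canny edge detection (double threshold), contour extraction."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        blurred = cv2.GaussianBlur(gray, (3, 3), 0)                   # step 1: Gaussian smoothing
        edges = cv2.Canny(blurred, low_threshold, low_threshold * 3)  # steps 2-4 inside Canny (TL, TH = 3*TL)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return contours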
In the embodiment of the disclosure, the photographing device 205 is installed to photograph, so that the server 201 can acquire an image and perform image analysis, thereby determining the position of a photographing target in the image, and further determining the predetermined position of the test terminal 202 for photographing the photographing target in the test scene, so that the server 201 can drive the displacement device 204 to move the test terminal 202 to the predetermined position, and the method has the characteristic of being intelligent.
Fig. 6 is a diagram of a system topology. As shown in Fig. 6, an automated unmanned laboratory arranges a slide rail or robot to control the direction and position of a camera phone, and a plurality of test machines, including machines used for comparison testing, can be placed in the laboratory. The cloud can control each piece of open-source hardware through the Arduino hardware interface, realizing remote creation, automatic construction, and simulation of the laboratory environment setup, for example configuring prop scenes such as light brightness and angle in the environment under remote automatic control. The cloud can also remotely control high-definition cameras at all angles of the laboratory and, based on OpenCV and artificial intelligence (AI) technologies, perform image recognition, gray-scale processing, binarization processing, and the like, so as to recognize the scene and schedule the corresponding open-source hardware or manipulator to control the laboratory. The cloud can also issue tasks and transmit data based on the Transmission Control Protocol/Internet Protocol (TCP/IP) and control the remote terminal devices (test terminals). The automated unmanned laboratory is connected with the manipulator (slide rail or robot), instruments and meters, and various open-source hardware through serial/parallel ports. In Fig. 6, the slide rail or robot can control the mobile phone to aim at the target object, and the cloud triggers an automation program on the mobile phone, for example an Android phone, to execute actions such as focusing and photographing. It should be noted that, in the above process, all scenes, such as scene switching, brightness adjustment, distance, and angle, are simulated through physical devices, and the physical devices provide an Internet control Application Programming Interface (API) based on the Arduino hardware interface, so that the cloud can control each physical device.
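As a hedged illustration of how the cloud might trigger the test terminal, the server could drive an Android phone through adb as sketched below; the key codes used (KEYCODE_FOCUS = 80, KEYCODE_CAMERA = 27) and the still-image camera intent are standard Android values, while the overall command sequence is only an assumption about how such an automation program could work, not the mechanism of this disclosure.

    import subprocess

    def trigger_focus_and_capture(serial_no):
        """Illustrative adb-based trigger of focus and shutter on the test terminal."""
        base = ["adb", "-s", serial_no, "shell"]
        # Open the camera via the standard still-image intent.
        subprocess.run(base + ["am", "start", "-a", "android.media.action.STILL_IMAGE_CAMERA"], check=True)
        subprocess.run(base + ["input", "keyevent", "80"], check=True)  # KEYCODE_FOCUS
        subprocess.run(base + ["input", "keyevent", "27"], check=True)  # KEYCODE_CAMERA (shutter)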
Fig. 7 is a schematic diagram of an unmanned shooting-scene laboratory system based on a cloud distributed system. As shown in Fig. 7, the cloud system mainly adopts a distributed architecture and performs automated unmanned cloud operation on the external-scene-simulation laboratory environment nodes through a network. The cloud server adopts a database as the data storage medium, maintains the database and data tables internally, and fully considers search efficiency in the structural design of the database tables. The cloud server also acquires terminal equipment information based on the TCP/IP protocol and uses a heartbeat technique to judge the state of the remote test terminal in real time, so as to perform data transmission and interaction. The distributed system can adopt a Celery architecture to issue, dispatch, and manage distributed tasks, and can also store and analyze data based on a Hadoop distributed mode. In addition, the cloud can perform load balancing based on an Nginx proxy server, coordinating the backend pressure caused by access from different areas, and the like.
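A minimal sketch of how distributed task issuing could look with Celery is given below; the broker URL, task name, and return value are purely illustrative assumptions, not part of this disclosure.

    from celery import Celery

    # Illustrative broker URL; the actual message middleware is an implementation choice.
    app = Celery("lab_tasks", broker="redis://localhost:6379/0")

    @app.task
    def run_test_task(test_task_info):
        """Executed on a laboratory node: build the scene and drive the terminal."""
        # ... build the test scene and send operation instructions here ...
        return {"status": "done", "task": test_task_info.get("task_params", {})}

    # On the cloud side, a test task would be dispatched asynchronously, e.g.:
    # run_test_task.delay(test_task_info)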
Fig. 8 is a diagram illustrating an automated testing apparatus according to an exemplary embodiment. Referring to Fig. 8, the automated testing apparatus applied to a server includes:
a receiving module 301, configured to receive test task information, wherein the test task information comprises scene setting parameters and task parameters;
a building module 302, configured to build a test scene based on the scene setting parameters;
a sending module 303, configured to send an operation instruction to the test terminal under the built test scene according to the task parameters;
and a detection module 304, configured to detect an execution result of the operation instruction executed by the test terminal.
Optionally, the test task comprises an image acquisition task, wherein the test scene is an image acquisition scene, and the operation instruction comprises an image acquisition instruction;
the detection module 304 is specifically configured to perform quality evaluation on the image acquired by the test terminal.
Optionally, the scene setting parameters comprise equipment identifiers of environment building equipment and first working parameters of the environment building equipment;
the building module 302 is specifically configured to determine the environment building equipment for building the test scene according to the equipment identifier, and control the environment building equipment to operate according to the first working parameter.
Optionally, the environment construction device comprises at least one of:
the scenery equipment is used for constructing scenery in the space where the test scene is located;
the environment adjusting device is used for providing a test environment required by the test scene;
and the displacement equipment is used for adjusting the position of the test terminal in the test scene.
Optionally, the environment adjusting device includes at least one of:
the illumination adjusting device is used for providing an illumination environment required by the test environment;
the temperature adjusting device is used for providing a temperature environment required by the test environment;
the humidity adjusting device is used for providing a humidity environment required by the test environment;
and the air pressure adjusting device is used for providing an air pressure environment required by the test environment.
Optionally, the first working parameter of the illumination adjusting device includes at least one of:
illumination brightness;
an illumination angle;
and illumination chromaticity.
Optionally, the sending module 303 is specifically configured to send a driving instruction to the displacement device according to the task parameter, where the driving instruction is configured to drive the displacement device to move so as to drive the test terminal to move to a predetermined position of the test scene, and send the operation instruction to the test terminal located at the predetermined position in the test scene.
Optionally, the environment setting up device further comprises a photographing device,
The sending module 303 is specifically configured to send an acquisition instruction for acquiring an image of an environment where the test terminal is located to the photographing device, receive an image returned by the photographing device, analyze the image, determine a position of a test target of the test terminal in the image, determine the preset position of the test terminal for testing the test target in the test scene according to the position of the test target in the image, and send a driving instruction carrying the preset position to the displacement device.
Optionally, the task parameters include:
the function identifier is used for indicating the function to be tested;
and the test item identifier is used for indicating the test item of the function to be tested.
Optionally, the image acquisition task includes:
portrait acquisition;
night scene acquisition;
sea scene acquisition;
static scene acquisition;
moving scene acquisition;
telephoto acquisition;
wide-angle acquisition;
and macro acquisition.
Optionally, the task parameter includes a second working parameter when the test terminal executes the operation instruction;
the sending module 303 is specifically configured to send the operation instruction carrying the second working parameter to the test terminal located under the built test scenario.
Optionally, the second operating parameter includes at least one of:
focusing parameters;
white balance parameters;
and sensitivity parameters.
The specific manner in which the various modules perform operations in the apparatus of the above embodiment has been described in detail in the embodiments of the method, and will not be elaborated here.
Fig. 9 is a block diagram of a server apparatus 900 according to an exemplary embodiment. Referring to Fig. 9, the apparatus 900 includes a processing component 922, which further includes one or more processors, and memory resources represented by a memory 932 for storing instructions, such as application programs, executable by the processing component 922. The application programs stored in the memory 932 may include one or more modules, each corresponding to a set of instructions. In addition, the processing component 922 is configured to execute the instructions to perform the automated testing method described above.
The apparatus 900 may also include a power component 926 configured to perform power management of the apparatus 900, a wired or wireless network interface 950 configured to connect the apparatus 900 to a network, and an input/output (I/O) interface 958. The apparatus 900 may operate based on an operating system stored in the memory 932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 932, which includes instructions executable by the processing component 922 of the apparatus 900 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium is provided; instructions in the storage medium, when executed by a processing component of a server, enable the server to perform an automated testing method, the method comprising:
receiving test task information, wherein the test task information comprises scene setting parameters and task parameters;
constructing a test scene based on the scene setting parameters;
sending, according to the task parameters, an operation instruction to a test terminal under the built test scene;
and detecting an execution result of the operation instruction executed by the test terminal.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.