
CN113468042A - Human-computer interaction test system and method - Google Patents


Info

Publication number
CN113468042A
Authority
CN
China
Prior art keywords
equipment
bionic
test
control
tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010243078.1A
Other languages
Chinese (zh)
Inventor
许晨光
黄宗明
王烨
伍玉英
李思佳
Current Assignee
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Banma Zhixing Network Hongkong Co Ltd
Priority date
Filing date
Publication date
Application filed by Banma Zhixing Network Hongkong Co Ltd filed Critical Banma Zhixing Network Hongkong Co Ltd
Priority to CN202010243078.1A
Publication of CN113468042A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Prevention of errors by analysis, debugging or testing of software
    • G06F11/362 Debugging of software
    • G06F11/3648 Debugging of software using additional hardware
    • G06F11/3652 Debugging of software using additional hardware in-circuit-emulation [ICE] arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract



The present application provides a human-computer interaction test solution that uses a system comprising a control device, a bionic output device, and a bionic input device to perform multi-modal human-computer interaction tests, where each bionic output device and bionic input device corresponds to a particular modality. The control device can send a corresponding control instruction to the bionic output device according to a first test script, so that the bionic output device outputs interaction information of the corresponding modality to the device under test, causing the device under test to generate interaction feedback from that information. The bionic input device can collect the interaction feedback of the device under test and provide it to the control device, which determines the test result from it. In this way, the control device and the associated bionic devices extend the testing capability of the device under test, and the bionic devices simulate interaction across modalities, thereby enabling complete multi-modal automated testing.


Description

Human-computer interaction test system and method
Technical Field
The application relates to the technical field of information, in particular to a human-computer interaction testing system and method.
Background
Multi-modal human-computer interaction is increasingly used in various intelligent devices. In an Internet-connected car, for example, a user can control the air conditioner via the touch screen, control the sunroof by voice, answer a phone call with a gesture, and so on.
Some current smart devices support multi-modal human-computer interaction but provide no support for automated testing. To test such a device's multi-modal interaction functions, testers either run the tests manually or automate only one aspect of the interaction: for example, voice functions can be tested automatically through a Text To Speech (TTS) simulation test framework, and operations such as clicking and text input can be simulated in software. However, none of these approaches enables complete multi-modal automated testing.
Disclosure of Invention
An object of the present application is to provide a human-computer interaction testing method, so as to solve the problem in the prior art that a complete multi-modal automated test cannot be implemented.
The embodiment of the application provides a multi-modal human-computer interaction test system, which comprises a control device, a bionic output device, and a bionic input device, where each bionic output device and bionic input device corresponds to a particular modality;
the control equipment is used for sending a corresponding control instruction to the bionic output equipment according to the first test script, acquiring interactive feedback of the tested equipment and determining a test result according to the interactive feedback;
the bionic output equipment is used for outputting interaction information of a corresponding mode to the tested equipment according to a control instruction from the control equipment so that the tested equipment generates interaction feedback according to the interaction information;
the bionic input device is used for collecting interactive feedback of the tested device and providing the interactive feedback to the control device.
The embodiment of the application also provides a human-computer interaction test system, which comprises control equipment, output equipment and input equipment;
the control equipment is used for sending a corresponding control instruction to the output equipment according to the first test script, acquiring interactive feedback of the tested equipment and determining a test result according to the interactive feedback;
the output equipment is used for outputting interaction information to the tested equipment according to a control instruction from the control equipment so that the tested equipment generates interaction feedback according to the interaction information;
the input device is used for collecting the interactive feedback of the tested device and providing the interactive feedback to the control device.
The embodiment of the application provides a multi-modal human-computer interaction testing method, which uses a system comprising a control device, a bionic output device, and a bionic input device, where each bionic output device and bionic input device corresponds to a particular modality. The method comprises the following steps:
the control equipment sends a corresponding control instruction to the bionic output equipment according to the first test script;
the bionic output equipment outputs interaction information of a corresponding mode to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information;
the bionic input equipment acquires the interactive feedback of the equipment to be tested and provides the interactive feedback to the control equipment;
the control equipment acquires the interactive feedback of the tested equipment and determines a test result according to the interactive feedback.
The embodiment of the application also provides a human-computer interaction testing method, which adopts a system comprising control equipment, output equipment and input equipment, and comprises the following steps:
the control equipment sends a corresponding control instruction to the output equipment according to the first test script;
the output equipment outputs interaction information to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information;
the input equipment acquires interactive feedback of the tested equipment and provides the interactive feedback to the control equipment;
the control equipment acquires the interactive feedback of the tested equipment and determines a test result according to the interactive feedback.
Some embodiments of the present application also provide a computing device, wherein the device comprises a memory for storing computer program instructions and a processor for executing the computer program instructions, wherein the computer program instructions, when executed by the processor, trigger the device to perform the aforementioned multimodal human machine interaction testing method.
Still other embodiments of the present application provide a computer readable medium having computer program instructions stored thereon that are executable by a processor to implement the multimodal human machine interaction test method.
In the multi-mode human-computer interaction test scheme provided by the embodiment of the application, a system comprising control equipment, bionic output equipment and bionic input equipment is adopted, the bionic output equipment and the bionic input equipment correspond to corresponding modes, wherein the control equipment can send corresponding control instructions to the bionic output equipment according to a first test script, so that the bionic output equipment outputs interaction information of the corresponding modes to tested equipment according to the control instructions, the tested equipment generates interaction feedback according to the interaction information, the bionic input equipment can acquire the interaction feedback of the tested equipment and provide the interaction feedback to the control equipment, and the control equipment determines a test result according to the interaction feedback. Therefore, the testing capability of the tested equipment is expanded through the control equipment and the related bionic equipment, and the interaction under multiple modes is simulated by using the bionic equipment, so that the complete multi-mode automatic test can be realized.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
Fig. 1 is a schematic structural diagram of a multi-modal human-computer interaction test system according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of another multi-modal human-computer interaction test system provided in an embodiment of the present application;
Fig. 3 shows the interaction process when a test scheme provided by an embodiment of the present application tests the voice interaction function of music playing;
Fig. 4 is a flowchart illustrating a process for performing a human-computer interaction test according to an embodiment of the present application;
Fig. 5 is a processing flowchart of a multi-modal human-computer interaction testing method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a device provided in an embodiment of the present application;
the same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal and the devices serving the network each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The embodiment of the application provides a multi-modal human-computer interaction test system comprising a control device, a bionic output device, and a bionic input device. The control device and the associated bionic devices extend the testing capability of the device under test, and the bionic devices simulate interaction across modalities, so that complete multi-modal automated testing can be achieved.
Fig. 1 shows a structure of a multi-modal human-computer interaction test system provided by an embodiment of the present application, where the test system includes a control device 110, a bionic output device 120, and a bionic input device 130, where the bionic output device and the bionic input device correspond to respective modalities. In the testing process, the control device 110 is configured to send a corresponding control instruction to the bionic output device according to a first test script, obtain an interaction feedback of the device under test 140, and determine a testing result according to the interaction feedback; the bionic output device 120 is configured to output interaction information of a corresponding modality to a device under test according to a control instruction from a control device, so that the device under test generates interaction feedback according to the interaction information; the bionic input device 130 is used for collecting the interactive feedback of the device to be tested and providing the interactive feedback to the control device.
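The closed loop just described (test script → control instruction → bionic output → device under test → bionic input → verdict) can be condensed into a short sketch. This is illustrative Python only; every class and method name here is hypothetical, not from the patent:

```python
# Illustrative sketch of the Fig. 1 test loop; all names are hypothetical.
class BionicOutput:
    """Stands in for a bionic mouth/hand; a real device would act here."""
    def emit(self, instruction):
        return f"emitted:{instruction}"

class BionicInput:
    """Stands in for a bionic eye/ear; returns the DUT's observed feedback."""
    def __init__(self, observed):
        self.observed = observed
    def collect(self):
        return self.observed

class ControlDevice:
    def run_step(self, script_step, output_dev, input_dev):
        output_dev.emit(script_step["instruction"])   # drive the modality
        feedback = input_dev.collect()                # capture the DUT reaction
        return "pass" if feedback == script_step["expected"] else "fail"

step = {"instruction": "play music", "expected": "player_ui_open"}
result = ControlDevice().run_step(step, BionicOutput(), BionicInput("player_ui_open"))
```

A real control device would additionally time-stamp the instruction and wait for the feedback window to close before judging.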
The interactive information is information corresponding to interactive operation in different modalities, for example, specific voice information corresponding to voice interactive operation, click object or click position information corresponding to click operation, hand action information corresponding to gesture operation, and the like. The interactive feedback is feedback made by the device under test after the interactive information is acquired, for example, the device under test switches the user interface to a specific interface, plays specific feedback voice, and the like.
Different bionic output devices simulate control of the device under test in the corresponding modality, and different bionic input devices collect the interaction feedback of the device under test in the corresponding modality. For example, the bionic output devices may include a bionic hand 121 and a bionic mouth 122. The bionic hand can simulate the motions of a user's hand, so as to simulate a user controlling the device under test through various gestures and hand actions in a real scenario; the bionic mouth can simulate the user's voice, so as to simulate a user controlling the device under test by sound. In an actual scenario, the bionic hand can be any manipulator shaped like a human hand, with servo motors as actuators so that different postures and motions of a human hand can be simulated, and the bionic mouth can be an audio device such as a loudspeaker, which simulates the user's voice by emitting different sounds.
The bionic input device 130 may include a bionic eye 131 and a bionic ear 132, where the bionic eye may collect visual interaction feedback generated by the device under test, and the visual interaction feedback may be feedback information in any visual aspect, such as feedback content displayed on a display screen of the device under test after receiving the interaction information and performing corresponding processing, or feedback action performed by a movable component of the device under test after receiving the interaction information and performing corresponding processing, and the visual interaction feedback may be collected by the bionic eye 131. The bionic ear 132 may collect an auditory interaction feedback generated by the device under test, where the auditory interaction feedback may be any auditory feedback information, such as a sound output by an audio device of the device under test after receiving the interaction information and performing corresponding processing. In an actual scene, the bionic eye 131 may employ a video capture device such as a camera to capture a relevant image to obtain visual interaction feedback, and the bionic ear may employ an audio capture device such as a microphone to capture a relevant sound to obtain audio interaction feedback.
Fig. 2 shows the structure of another multi-modal human-computer interaction test system in the embodiment of the present application, in which a bionic hand 121 and a bionic mouth 122 serve as the bionic output devices, and a bionic eye 131 and a bionic ear 132 serve as the bionic input devices. The control device 110 may be any computing device with data processing functions, for example a locally deployed computer or a server deployed on the network side. With a local computer, a user can perform the multi-modal human-computer interaction test through the computer's user interface; with a network-side server, the user can use a client or a browser to interact remotely with the server and thereby perform the test. For example, the control device 110 in Fig. 2 may expose a socket interface and a User Interface (UI): the client 150 used by the user can exchange data with the control device 110 through the socket interface, or the user can connect to the control device with the browser 160 and interact through the displayed user interface, for example to start a test, edit a test script, obtain a test result, or view a test report.
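As a rough illustration of such a socket interface, a client command such as "start a test run" could be serialized as newline-delimited JSON. The wire format below is an assumption for illustration, not the one the system actually uses:

```python
import json

# Hypothetical wire format for the control device's socket interface.
def encode_command(action, **params):
    """Serialize a client command (e.g. start a test) as one JSON line."""
    return (json.dumps({"action": action, **params}) + "\n").encode("utf-8")

def decode_command(raw):
    """Parse one received line back into a command dictionary."""
    return json.loads(raw.decode("utf-8"))

msg = encode_command("start_test", script="test_MusicPlay")
cmd = decode_command(msg)
```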
The first test script is a test script that controls the bionic output device to output interaction information of the corresponding modality. Taking a first test script that controls the bionic mouth as an example, its content may be as follows:
*test_MusicPlay()
{
    yield RobotMouth.sendText("I want to listen to Liu Dehua's songs");
    Assert(UIDevice.getCurrentPageUri() == "page://music.xxx.cn", "open music failed");
}
Here, "I want to listen to Liu Dehua's songs" indicates that the interaction information is sent through the bionic mouth connected to the control device: the interaction information is a voice instruction whose content is "I want to listen to Liu Dehua's songs", used to make the device under test play specific music. In an actual scenario, the control device may generate an audio file from this text in a TTS manner and then play the audio file through the connected bionic mouth to implement the voice interaction.
The assertion operation in the test script judges whether the device under test has generated the correct interaction feedback, that is, whether it has opened the URI (Uniform Resource Identifier) for playing music, here "page://music.xxx.cn". If the device under test has opened the music-playing URI, no operation is executed; if it has not, the message "open music failed" is generated. In this way, the interaction feedback corresponding to the current interaction information can be confirmed, and whether the human-computer interaction is abnormal can be judged.
The result of the assertion operation may be sent to the control device so that the control device can determine whether the interaction feedback generated by the device under test is correct. In addition, the interaction feedback of the device under test can also be collected through a bionic input device such as the bionic eye 131 or the bionic ear 132. Taking the bionic eye as an example: if the device under test correctly processes the voice interaction information "I want to listen to Liu Dehua's songs", it opens "page://music.xxx.cn" and enters the corresponding user interface for playback. Thus, if the device under test enters that user interface, it has generated the correct interaction feedback; otherwise, it has not.
Fig. 3 shows an interaction process when testing a voice interaction function of music playing, which includes the following steps:
in step S301, the control device 110 generates a control command a1 according to the first test script and sends the control command a1 to the bionic mouth 122 to send out a voice "i want to listen to the song of liudebua".
In step S302, the bionic mouth 122 sends a voice "i want to listen to the song of liudelhi" to the device under test 140 according to the control instruction a 1.
Step S303, when the device under test 140 recognizes the voice "i want to listen to the song of liu de hua", the interaction information is processed to generate corresponding interaction feedback. If the whole processing process is correct, the interactive feedback is to enter the corresponding playing interface a2 to start playing the Liudebua song. If the processing process of the interactive information is incorrect or the interactive information is not correctly identified, the corresponding playing interface cannot be entered.
In step S304, the bionic eye 131 shoots the user interface of the device under test 140, and sends the user interface to the control device for determination. If the user interface of the device under test enters the playback interface a2 in step S303, the bionic eye 131 may capture the playback interface a2, and send the playback interface a2 to the control device for analysis.
In step S305, the control device 110 may perform analysis according to the user interface captured by the bionic eye 131 to obtain a test result. If the user interface is found to enter the playing interface a2 correctly within the preset time, it can be determined that the voice interaction function of music playing is normal.
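Steps S301 to S305 above can be condensed into a linear sketch, with the device under test stubbed out; all names and the stubbed behaviour are illustrative only:

```python
# Steps S301–S305 as a linear Python sketch; the DUT is a stub.
def run_voice_music_test(device_under_test, timeout_frames=3):
    command = "I want to listen to Liu Dehua's songs"   # from the first test script
    # S301/S302: the bionic mouth plays the voice command to the DUT
    device_under_test.hear(command)
    # S304: the bionic eye repeatedly photographs the DUT's user interface
    for _ in range(timeout_frames):
        ui = device_under_test.current_ui()
        # S305: check whether playback interface A2 appeared within the window
        if ui == "playback_interface_A2":
            return "voice interaction OK"
    return "voice interaction FAILED"

class FakeDUT:
    """Stub device: either recognizes speech and opens the player, or not."""
    def __init__(self, recognizes_speech):
        self.recognizes = recognizes_speech
        self.ui = "home"
    def hear(self, text):
        # S303: correct processing opens the playback interface
        if self.recognizes:
            self.ui = "playback_interface_A2"
    def current_ui(self):
        return self.ui
```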
The control device is also used to generate a test report from the test results. The report may be generated as soon as a single test script finishes, or the control device may continue executing the remaining test scripts and generate a report covering all of them once they have finished. In addition, report pages can be produced in combination with third-party test platforms, so that users can conveniently view the test results.
In an actual scenario, the interaction information during human-computer interaction of the device under test can be simulated not only through the bionic output devices but also through an Application Programming Interface (API) encapsulated in the device under test, for example the interaction APIs encapsulated by a preset test framework such as UIAutomator. Calling these APIs simulates interactive operations a user would input on the device under test, such as clicking, dragging, sliding, and text input. During the human-computer interaction test, the test script used to control the device under test's simulated interactive operations is the second test script, while the first test script is the one used to control the bionic output devices to output interaction information of the corresponding modality.
Therefore, in some embodiments of the present application, the control device is also used to import a second test script into the device under test, so that the device under test runs the second test script and generates the corresponding interaction feedback after simulating the corresponding interactive operation. A user can simulate multi-modal interactive operations both through the various bionic output devices and by importing test scripts directly into the device under test to run, obtaining the corresponding interaction feedback and verifying the test result. Developers can thus construct multi-modal test scenarios more flexibly and implement multi-modal human-computer interaction tests more conveniently.
If an assisted manual test (amt) tool is integrated in the device under test, corresponding amt action instructions can be generated. The control device can obtain the amt action instructions corresponding to the second test script and import them into the device under test through the socket interface, so that the amt tool in the device under test executes them to generate interaction feedback. Because the amt tool can replay repeated user input operations, it improves both the efficiency and the stability of the test.
In an actual scenario, a test case set may be preconfigured in the control device, where the test case set includes test scripts for performing various human-computer interaction tests on the device under test, and the test scripts include the first test script or the second test script. The test script in the test case set can be written or imported by a developer, a tester and other users through a socket interface or a user interface provided by the control device by using a browser or a client. Or, the test script may also come from the device under test, and at this time, the control device may also obtain the first test script or the second test script from the device under test through the socket, so that the test script used for the human-computer interaction test is uniformly managed on the control device.
Fig. 4 shows a processing flow when performing a human-computer interaction test according to an embodiment of the present application, where the control device is a PC (Personal Computer). When performing a human-computer interaction test on the device under test, the interaction between the PC side and the device-under-test side is as follows:
step S401, starting a test component of the PC side. The test component of the PC end comprises a test script management function, so that a test script set can be conveniently configured and the execution of the test script can be controlled, and meanwhile, a test result can be determined and a test report can be generated.
In step S402, after the test component on the PC side is started, test scripts can be controlled through the socket interface. The control process works as follows: interaction with the device under test is implemented via adb (Android Debug Bridge), for example by sending adb shell commands; through adb port mapping, the PC side listens on a socket port and receives instruction messages from the device under test, which may be amt instruction strings. In addition, the PC side can send amt action instructions through the socket port to the amt tool in the device under test. These instructions simulate interactive operations a user would input, such as clicking, dragging, sliding, and text input; the amt tool executes them, drives the hardware in the device under test to perform the corresponding processing, and returns a response to the PC side.
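A minimal sketch of the PC-side plumbing in step S402: building the `adb forward` port-mapping command and a hypothetical amt action string. The port number and the amt command syntax are assumptions; only the `adb forward tcp:… tcp:…` form is standard adb usage:

```python
# Sketch of PC-side adb/amt plumbing; port and amt syntax are assumed.
def adb_forward_command(local_port, device_port):
    """'adb forward' maps a PC TCP port onto a port on the device under test."""
    return ["adb", "forward", f"tcp:{local_port}", f"tcp:{device_port}"]

def amt_action(kind, **params):
    """Build a hypothetical amt action instruction, e.g. a simulated tap."""
    fields = ";".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"amt:{kind};{fields}"

cmd = adb_forward_command(9008, 9008)       # command list for subprocess.run
tap = amt_action("click", x=120, y=300)     # string sent over the socket port
```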
In step S403, control of the bionic devices is started. Under the control of the PC side, bionic output devices such as the manipulator and the bionic mouth can perform various actions or emit various voices to simulate the interactive operations under test, while bionic input devices such as the camera sense the interaction feedback from the device under test.
In step S404, a test report is generated. After a single test script finishes, the PC side can continue executing the other test cases in the case set; after all test scripts have finished, the PC side generates the test report.
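The report generation in step S404 amounts to aggregating per-script outcomes into a summary. A sketch, with the field names assumed for illustration:

```python
# Sketch: aggregate per-script results into a summary report (step S404).
def make_report(results):
    """results: list of (script_name, passed) pairs from the whole case set."""
    passed = sum(1 for _, ok in results if ok)
    return {
        "total": len(results),
        "passed": passed,
        "failed": len(results) - passed,
        "failures": [name for name, ok in results if not ok],
    }

report = make_report([("test_MusicPlay", True), ("test_Gesture", False)])
```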
Control of the bionic devices relies on three core components: an image analysis component, a gesture control component, and a voice control component.
The image analysis component senses interaction feedback on the user interface of the device under test. The PC side can be connected to a camera that photographs changes in the user interface of the device under test. When interaction feedback appears on the user interface, it can be captured as consecutive image frames; the time the test action was sent is taken as the start of the interaction feedback, and the time the user interface image stops changing is taken as its end, yielding the interaction feedback corresponding to the interaction information.
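The end-of-feedback detection described above (the feedback window closes once consecutive UI frames stop changing) can be sketched as a pure function over sampled frames; the stability threshold of two consecutive unchanged frames is an assumption:

```python
# Sketch: find where the UI stopped changing in a sequence of frame snapshots.
def feedback_end_index(frames, stable_needed=2):
    """frames: sequence of hashable UI snapshots sampled after the test action.
    Returns the index of the first frame of the stable run, or None if the UI
    never settled within the sampled window."""
    stable = 0
    for i in range(1, len(frames)):
        if frames[i] == frames[i - 1]:
            stable += 1          # one more unchanged consecutive pair
        else:
            stable = 0           # UI changed again; restart the count
        if stable >= stable_needed:
            return i - stable_needed
    return None
```

With real images, the equality test would be replaced by a frame-difference metric below some threshold.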
The gesture control component simulates the motion interactions of a user's hand. The mechanical hardware of the bionic hand can be manufactured by 3D printing, and the manipulator is driven by connected servo motors to achieve motion control. The PC side generates control instructions according to the test script, drives the servos accordingly, and the bionic manipulator performs the actions that form the gesture operation to be tested.
For example, in the embodiment of the present application, a test script corresponding to a static gesture may be as follows:
yield MultiMode.sendGestureToServer(MultiMode.GestureEvent.GESTURECODE_PALM);
the test script corresponding to the dynamic gesture may be as follows:
yield MultiMode.sendGestureToServer([{thumb:180,index:0,middle:180,ring:180,pinky:150,wrist:180,bicep:60,rotator:40,shoulder:30,omoplate:10,interval:2000},{thumb:180,index:0,middle:180,ring:180,pinky:150,wrist:180,bicep:60,rotator:120,shoulder:30,omoplate:10,interval:0}]);
the gesture made by the manipulator can be recognized by a gesture recognition module at the tested device end, and a gesture event is sent to the system application, so that the event is responded. After the event responds, the state of the tested equipment end can be changed correspondingly, and the test script of the equipment end can judge whether the state change meets the expectation through an image analysis component or UIAutomator and the like.
The voice control component is used for simulating the voice interaction of a user. The voice test script is executed on the tested device or on the PC end; if it runs on the tested device, the voice operation instruction text in the script is sent to the PC end. After receiving the voice operation instruction text, the PC end calls a TTS interface to generate an audio file from the text, the system then automatically processes the audio file to remove invalid data, and finally the audio file is automatically played through a device such as the bionic mouth connected to the PC end.
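The "remove invalid data" step can be pictured as trimming near-silent samples from the synthesized audio before playback. The following is a minimal sketch under that assumption; the threshold and the plain-list PCM representation are illustrative:

```python
def trim_silence(samples, threshold=100):
    """Strip leading and trailing near-silent samples from a TTS audio
    buffer before it is played through the bionic mouth. `samples` is a
    list of signed PCM amplitude values."""
    start = 0
    end = len(samples)
    while start < end and abs(samples[start]) < threshold:
        start += 1
    while end > start and abs(samples[end - 1]) < threshold:
        end -= 1
    return samples[start:end]
```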
Therefore, the scheme provided by the embodiment of the application can extend the testing capability of the tested device through the control device and the associated bionic devices. With the bionic devices simulating interaction in multiple modalities, a complete multi-modal automated test can be realized.
In addition, the embodiment of the application can also provide a human-computer interaction test system applicable to other scenes. The system comprises a control device, an output device and an input device, where the output device and the input device are not limited to the various bionic devices and may be output and input devices of any other form. The human-computer interaction function of the tested device is tested through the interaction among the control device, the output device, the input device and the tested device.
In the human-computer interaction test system of this embodiment, the control device is configured to send a corresponding control instruction to the output device according to the first test script. The output equipment is used for outputting interaction information to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information. The input device is used for collecting the interactive feedback of the tested device and providing the interactive feedback to the control device. And after the control equipment acquires the interactive feedback provided by the input equipment, the control equipment can determine a test result according to the interactive feedback to finish the test of the man-machine interactive function.
Based on the same inventive concept, the embodiment of the application also provides a multi-modal human-computer interaction testing method. The system corresponding to the method is the multi-modal human-computer interaction test system of the foregoing embodiment, and its problem-solving principle is similar.
Fig. 5 shows the processing flow of a multi-modal human-computer interaction testing method provided by an embodiment of the application, which relies on a test system comprising a control device, a bionic output device and a bionic input device. The method comprises the following processing steps:
step S501, the control device sends a corresponding control instruction to the bionic output device according to the first test script.
Step S502, the bionic output device outputs interaction information of a corresponding mode to the tested device according to a control instruction from the control device, so that the tested device generates interaction feedback according to the interaction information.
Step S503, the bionic input device collects the interactive feedback of the tested device and provides the interactive feedback to the control device.
Step S504, the control device obtains the interactive feedback of the tested device, and determines the test result according to the interactive feedback.
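Steps S501 to S504 can be sketched as a single control loop in which the bionic devices are modeled as minimal interfaces. All names below are assumptions made for illustration, not the actual implementation:

```python
def run_test(first_test_script, bionic_output, bionic_input, expected_feedback):
    """One pass of the S501-S504 flow on the control device."""
    # S501: derive the control instruction from the first test script
    instruction = first_test_script["instruction"]
    # S502: the bionic output device outputs interaction information of the
    # corresponding modality to the tested device
    bionic_output.emit(instruction)
    # S503: the bionic input device collects the interactive feedback
    feedback = bionic_input.collect()
    # S504: the control device determines the test result from the feedback
    return "pass" if feedback == expected_feedback else "fail"
```

Here `bionic_output` and `bionic_input` stand for, e.g., a bionic mouth and a bionic eye; in a real deployment `emit` would drive hardware and `collect` would analyze captured frames or audio.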
The interactive information is information corresponding to interactive operation in different modalities, for example, specific voice information corresponding to voice interactive operation, click object or click position information corresponding to click operation, hand action information corresponding to gesture operation, and the like. The interactive feedback is feedback made by the device under test after the interactive information is acquired, for example, the device under test switches the user interface to a specific interface, plays specific feedback voice, and the like.
Different bionic output devices are used to simulate the corresponding form of control over the tested device, and different bionic input devices are used to collect the corresponding form of interactive feedback from the tested device. For example, the bionic output device may include a bionic hand 121 and a bionic mouth 122: the bionic hand can simulate the hand actions of a user, thereby simulating a user controlling the tested device through various gestures and hand actions in an actual scene, and the bionic mouth can simulate the user's voice, thereby simulating a user controlling the tested device through sound. In an actual scene, the bionic hand can be any manipulator imitating the shape of a human hand, using steering engines (servos) as actuators so that different postures and actions of a human hand can be simulated, and the bionic mouth can be an audio device such as a loudspeaker that simulates the user's voice by emitting different sounds.
The bionic input device 130 may include a bionic eye 131 and a bionic ear 132. The bionic eye may collect visual interactive feedback generated by the tested device, which may be feedback information of any visual kind, such as content displayed on the display screen of the tested device after it receives and processes the interaction information, or a feedback action performed by a movable component of the tested device. The bionic ear 132 may collect auditory interactive feedback generated by the tested device, which may be any auditory feedback information, such as a sound output by an audio device of the tested device after it receives and processes the interaction information. In an actual scene, the bionic eye 131 may employ a video capture device such as a camera to capture relevant images and obtain the visual interactive feedback, and the bionic ear may employ an audio capture device such as a microphone to capture relevant sounds and obtain the auditory interactive feedback.
Fig. 2 shows the structure of another multi-modal human-computer interaction test system in the embodiment of the present application, in which a bionic hand 121 and a bionic mouth 122 are adopted as the bionic output device, and a bionic eye 131 and a bionic ear 132 are adopted as the bionic input device. The control device 110 may be any computing device with data-processing capability, for example a locally deployed computer or a server deployed on the network side. If a locally deployed computer is used, a user can carry out the multi-modal human-computer interaction test through a user interface on the computer; if a network-side server is used, the user can employ a client or a browser to interact remotely with the server and thereby carry out the test. For example, the control device 110 in Fig. 2 may expose a Socket interface and a User Interface (UI): the client 150 used by the user can exchange data with the control device 110 through the Socket interface, or the browser 160 can connect to the control device and exchange data with it in the displayed user interface, for example to start a test, edit a test script, obtain a test result, or view a test report.
The first test script is a test script for controlling the bionic output device to output the interaction information of the corresponding modality, for example, taking a first test script for controlling the bionic mouth as an example, the content of the first test script may be as follows:
*test_MusicPlay()
{
    yield robotmouth.say("I want to listen to the songs of Liu Dehua");
    Assert(UiDevice.getCurrentPageUri() == "page://music.xxx.cn", "open music failed");
}
Here, "I want to listen to the songs of Liu Dehua" indicates that the interactive information is sent through the bionic mouth connected to the control device: the interactive information is a voice instruction whose specific content is "I want to listen to the songs of Liu Dehua", used to control the tested device to play specific music. In an actual scene, the control device may generate an audio file from the text "I want to listen to the songs of Liu Dehua" by TTS, and then play the audio file through the connected bionic mouth to implement the voice interaction.
In the test script, an assertion operation judges whether the tested device produces the correct interactive feedback, namely whether the tested device opens the URI (Uniform Resource Identifier) for playing music, "page://music.xxx.cn". If the tested device has opened the URI, no operation is executed; if it has not, the message "open music failed" is generated. In this way the interactive feedback corresponding to the current interactive information can be confirmed, and whether the human-computer interaction is abnormal can be judged.
The result of the assertion operation may be sent to the control device so that the control device can determine whether the interactive feedback generated by the tested device is correct. In addition, the interactive feedback of the tested device can also be acquired through a bionic input device such as the bionic eye 131 or the bionic ear 132. Taking the bionic eye as an example: if the tested device performs the correct processing based on the voice interactive information "I want to listen to the songs of Liu Dehua", it opens the URI "page://music.xxx.cn" for playing music and enters the corresponding user interface. Therefore, if the tested device enters that user interface, it has generated the correct interactive feedback; otherwise, if it does not enter that user interface, it has not generated the correct interactive feedback.
Fig. 3 shows an interaction process when testing a voice interaction function of music playing, which includes the following steps:
In step S301, the control device 110 generates a control instruction a1 according to the first test script and sends it to the bionic mouth 122, instructing it to emit the voice "I want to listen to the songs of Liu Dehua".
In step S302, the bionic mouth 122 plays the voice "I want to listen to the songs of Liu Dehua" to the tested device 140 according to control instruction a1.
Step S303, when the tested device 140 recognizes the voice "I want to listen to the songs of Liu Dehua", it processes the interactive information and generates the corresponding interactive feedback. If the whole process is correct, the interactive feedback is to enter the corresponding playing interface a2 and start playing Liu Dehua's songs. If the interactive information is processed incorrectly or not correctly recognized, the corresponding playing interface will not be entered.
In step S304, the bionic eye 131 captures the user interface of the tested device 140 and sends it to the control device for judgment. If the user interface entered the playing interface a2 in step S303, the bionic eye 131 captures the playing interface a2 and sends it to the control device for analysis.
In step S305, the control device 110 analyzes the user interface captured by the bionic eye 131 to obtain the test result. If the user interface is found to have correctly entered the playing interface a2 within the preset time, it can be determined that the voice interaction function of music playing is normal.
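The "within the preset time" judgment in step S305 amounts to polling the page recognized from the captured frames until it matches the expected playing interface or a timeout expires. A minimal sketch, where `capture` stands for the whole capture-and-analyze pipeline and the function name is an assumption:

```python
import time

def wait_for_interface(capture, target_uri, timeout=10.0, poll=0.5):
    """Poll the page URI recognized from the bionic eye's frames until it
    matches the expected playback interface, or the preset time runs out.
    `capture` is any callable returning the currently recognized page URI."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if capture() == target_uri:
            return True
        time.sleep(poll)
    return False
```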
The multi-mode man-machine interaction test method can also generate a test report according to the test result. When the test report is generated, after the execution of a single test script is completed, the control device may generate the test report for the single test script, or may continue to control the execution of other test scripts, and after the execution of all the test scripts is completed, the control device generates the test report for all the test scripts. In addition, a page related to the test report can be output in combination with other third-party test platforms, so that a user can conveniently view the test result.
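Aggregating the per-script results into a report can be as simple as counting passes and listing failures; the report fields below are illustrative assumptions rather than the actual report format:

```python
def build_report(results):
    """Aggregate per-script results (each a dict with 'name' and a
    'pass'/'fail' 'status') into a summary report, as the control device
    might do after all test scripts finish."""
    passed = sum(1 for r in results if r["status"] == "pass")
    return {
        "total": len(results),
        "passed": passed,
        "failed": len(results) - passed,
        "failures": [r["name"] for r in results if r["status"] != "pass"],
    }
```

Such a summary could then be rendered as a page on a third-party test platform for users to view.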
In an actual scene, the interaction information for the human-computer interaction of the tested device can be simulated through the bionic output device, or through an Application Programming Interface (API) encapsulated in the tested device. For example, various application programming interfaces for interaction may be encapsulated in the tested device through a preset test framework; interactive operations input by a user on the tested device, such as clicking, dragging, sliding and text input, are simulated by calling the application programming interfaces encapsulated in tools such as UIAutomator in the tested device. In the human-computer interaction test, the test script used to control the tested device to simulate interactive operations is the second test script, while the first test script is the test script used to control the bionic output device to output the interaction information of the corresponding modality.
Therefore, in some embodiments of the application, the multi-modal human-computer interaction testing method may further import a second test script into the tested device, so that the tested device runs the second test script and generates the corresponding interactive feedback after simulating the corresponding interactive operations. A user can simulate multi-modal interactive operations both through the various bionic output devices and by directly importing test scripts into the tested device to run, obtaining the corresponding interactive feedback and verifying the test result. This lets developers construct multi-modal test scenarios more flexibly and realize multi-modal human-computer interaction tests more conveniently.
If an auxiliary manual test (AMT) tool is integrated in the tested device, corresponding auxiliary manual test action instructions can be generated. The control device can thus obtain the auxiliary manual test action instruction corresponding to the second test script and import it into the tested device through the socket interface, so that the auxiliary manual test tool in the tested device executes the instruction to generate the interactive feedback. Because the auxiliary manual test tool can repeat interactive operations that a user would otherwise input by hand, both the efficiency and the stability of the test can be improved.
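As a hedged sketch of the socket import described above, the control device might serialize each auxiliary manual test action instruction as one JSON line and push it to the tested device's socket interface. The wire format here is an assumption for illustration, not the actual AMT protocol:

```python
import json
import socket

def send_amt_instruction(host, port, instruction):
    """Push one auxiliary manual test action instruction to the tested
    device over its socket interface, encoded as a newline-terminated
    JSON object (assumed wire format)."""
    payload = (json.dumps(instruction) + "\n").encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)
```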
In an actual scenario, a test case set may be preconfigured in the control device, where the test case set includes test scripts for performing various human-computer interaction tests on the device under test, and the test scripts include the first test script or the second test script. The test script in the test case set can be written or imported by a developer, a tester and other users through a socket interface or a user interface provided by the control device by using a browser or a client. Or, the test script may also come from the device under test, and at this time, the control device may also obtain the first test script or the second test script from the device under test through the socket, so that the test script used for the human-computer interaction test is uniformly managed on the control device.
In addition, the embodiment of the application can also provide a human-computer interaction testing method applicable to other scenes, the method adopts a system comprising control equipment, output equipment and input equipment, wherein the output equipment and the input equipment are not limited to various bionic equipment and can be output equipment and input equipment in any other forms, and the human-computer interaction function of the tested equipment is tested through interaction among the control equipment, the output equipment, the input equipment and the tested equipment.
In the human-computer interaction test method of this embodiment, the control device first sends a corresponding control instruction to the output device according to a first test script. The output device may output interaction information to the device under test according to a control instruction from the control device, so that the device under test generates interaction feedback according to the interaction information. The input device collects the interactive feedback of the device under test and provides the interactive feedback to the control device. And after the control equipment acquires the interactive feedback provided by the input equipment, the control equipment can determine a test result according to the interactive feedback to finish the test of the man-machine interactive function.
In addition, some embodiments of the present application may be implemented as a computer program product, such as computer program instructions which, when executed by a computer, may invoke or provide methods and/or technical solutions according to the present application through the operation of the computer. Program instructions that invoke the methods of the present application may be stored on a fixed or removable recording medium, and/or transmitted via a data stream on a broadcast or other signal-bearing medium, and/or stored in a working memory of a computer device operating according to the program instructions. Some embodiments according to the present application include a computing device as shown in Fig. 6, which includes one or more memories 610 storing computer-readable instructions and a processor 620 for executing the computer-readable instructions, wherein the computer-readable instructions, when executed by the processor, cause the device to perform the methods and/or technical solutions based on the embodiments of the present application.
Furthermore, some embodiments of the present application also provide a computer readable medium, on which computer program instructions are stored, the computer readable instructions being executable by a processor to implement the methods and/or aspects of the foregoing embodiments of the present application.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In some embodiments, the software programs of the present application may be executed by a processor to implement the above steps or functions. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (18)

1. A multi-mode human-computer interaction test system comprises control equipment, bionic output equipment and bionic input equipment, wherein the bionic output equipment and the bionic input equipment correspond to corresponding modes;
the control equipment is used for sending a corresponding control instruction to the bionic output equipment according to the first test script, acquiring interactive feedback of the tested equipment and determining a test result according to the interactive feedback;
the bionic output equipment is used for outputting interaction information of a corresponding mode to the tested equipment according to a control instruction from the control equipment so that the tested equipment generates interaction feedback according to the interaction information;
the bionic input device is used for collecting interactive feedback of the tested device and providing the interactive feedback to the control device.
2. The system of claim 1, wherein the control device is further configured to import a second test script into the device under test, so that the device under test runs the second test script to generate the interactive feedback.
3. The system of claim 2, wherein the control device is configured to obtain an auxiliary manual test action instruction corresponding to the second test script, and import the auxiliary manual test action instruction into the device under test through a socket interface, so that an auxiliary manual test tool in the device under test executes the auxiliary manual test action instruction to generate the interactive feedback.
4. The system of claim 2, wherein the control device is configured to obtain the first test script or the second test script from the device under test via a socket.
5. The system of claim 1, wherein the bionic output device comprises a bionic hand and a bionic mouth, the bionic hand is used for simulating hand motions of a user to output motion interaction information to the device to be tested according to control instructions from the control device, and the bionic mouth is used for simulating voice of the user to output voice interaction information to the device to be tested according to the control instructions from the control device.
6. The system of claim 1, wherein the biomimetic input device comprises a biomimetic eye and a biomimetic ear, the biomimetic eye configured to acquire visual interactive feedback generated by the device under test and provide the visual interactive feedback to the control device, the biomimetic ear configured to acquire audible interactive feedback generated by the device under test and provide the audible interactive feedback to the control device.
7. The system of any one of claims 1 to 6, wherein the control device is further configured to generate a test report based on the test results.
8. A human-computer interaction test system, wherein the system comprises a control device, an output device and an input device;
the control equipment is used for sending a corresponding control instruction to the output equipment according to the first test script, acquiring interactive feedback of the tested equipment and determining a test result according to the interactive feedback;
the output equipment is used for outputting interaction information to the tested equipment according to a control instruction from the control equipment so that the tested equipment generates interaction feedback according to the interaction information;
the input device is used for collecting the interactive feedback of the tested device and providing the interactive feedback to the control device.
9. A multi-modal human-computer interaction testing method, wherein a system comprising a control device, a bionic output device and a bionic input device is adopted, the bionic output device and the bionic input device correspond to corresponding modalities, and the method comprises the following steps:
the control equipment sends a corresponding control instruction to the bionic output equipment according to the first test script;
the bionic output equipment outputs interaction information of a corresponding mode to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information;
the bionic input equipment acquires the interactive feedback of the equipment to be tested and provides the interactive feedback to the control equipment;
the control equipment acquires the interactive feedback of the tested equipment and determines a test result according to the interactive feedback.
10. The method of claim 9, wherein the method further comprises:
and the control equipment leads a second test script into the tested equipment so that the tested equipment runs the second test script to generate interactive feedback.
11. The method of claim 10, wherein the importing, by the control device, a second test script into the device under test to cause the device under test to run the second test script to generate the interactive feedback comprises:
and the control equipment acquires an auxiliary manual test action instruction corresponding to the second test script, and introduces the auxiliary manual test action instruction into the tested equipment through a socket interface, so that an auxiliary manual test tool in the tested equipment executes the auxiliary manual test action instruction to generate interactive feedback.
12. The method of claim 10, wherein the control device obtains the first test script or the second test script from the device under test through a socket.
13. The method of claim 9, wherein the bionic output device comprises a bionic hand and a bionic mouth, the bionic hand is configured to simulate hand motions of a user according to an action control instruction output by the control device, and the bionic mouth is configured to simulate voice of a user according to a voice control instruction output by the control device;
the bionic output equipment outputs interaction information of a corresponding mode to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information, and the bionic output equipment comprises the following steps:
the bionic hand and the bionic mouth respectively output action interaction information and voice interaction information to the tested device according to the action control instruction and the voice control instruction from the control device, so that the tested device generates interaction feedback according to the action interaction information and the voice interaction information.
14. The method of claim 9, wherein the biomimetic input device comprises a biomimetic eye and a biomimetic ear;
the bionic input device collects interactive feedback of the tested device and provides the interactive feedback to the control device, and the bionic input device comprises:
the bionic eye and the bionic ear are used for respectively acquiring visual interactive feedback and auditory interactive feedback generated by the tested equipment and providing the visual interactive feedback and the auditory interactive feedback to the control equipment.
15. The method of any of claims 9 to 14, wherein the method further comprises:
and the control equipment generates a test report according to the test result.
16. A human-computer interaction testing method, wherein the method employs a system comprising a control device, an output device and an input device, the method comprising:
the control equipment sends a corresponding control instruction to the output equipment according to the first test script;
the output equipment outputs interaction information to the tested equipment according to a control instruction from the control equipment, so that the tested equipment generates interaction feedback according to the interaction information;
the input equipment acquires interactive feedback of the tested equipment and provides the interactive feedback to the control equipment;
the control equipment acquires the interactive feedback of the tested equipment and determines a test result according to the interactive feedback.
17. A computing device comprising a memory for storing computer program instructions and a processor for executing the computer program instructions, wherein the computer program instructions, when executed by the processor, trigger the device to perform the method of any of claims 9 to 16.
18. A computer readable medium having stored thereon computer program instructions executable by a processor to implement the method of any one of claims 9 to 16.
CN202010243078.1A 2020-03-31 2020-03-31 Human-computer interaction test system and method Pending CN113468042A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010243078.1A CN113468042A (en) 2020-03-31 2020-03-31 Human-computer interaction test system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010243078.1A CN113468042A (en) 2020-03-31 2020-03-31 Human-computer interaction test system and method

Publications (1)

Publication Number Publication Date
CN113468042A true CN113468042A (en) 2021-10-01

Family

ID=77865381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010243078.1A Pending CN113468042A (en) 2020-03-31 2020-03-31 Human-computer interaction test system and method

Country Status (1)

Country Link
CN (1) CN113468042A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114756449A (en) * 2022-03-08 2022-07-15 深圳市普渡科技有限公司 Robot interactive component testing system and robot interactive component testing method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203241253U (en) * 2013-05-06 2013-10-16 东莞市卓安精机自动化设备有限公司 Automatic test system of interactive interface of touch-screen electronic device
US20140019860A1 (en) * 2012-07-10 2014-01-16 Nokia Corporation Method and apparatus for providing a multimodal user interface track
CN106201796A (en) * 2016-07-04 2016-12-07 珠海市魅族科技有限公司 The collocation method of a kind of test and device
US20170052527A1 (en) * 2015-08-19 2017-02-23 Fmr Llc Intelligent mobile device test fixture
CN107608609A (en) * 2016-07-11 2018-01-19 阿里巴巴集团控股有限公司 A kind of event object sending method and device
CN108184109A (en) * 2017-12-19 2018-06-19 北京经纬恒润科技有限公司 The test platform and test method of a kind of vehicle-mounted information and entertainment system
CN109361916A (en) * 2018-12-05 2019-02-19 东风汽车集团有限公司 Cross-platform automatic test bench for multimedia intelligent terminals
CN210166153U (en) * 2019-08-01 2020-03-20 上海怿星电子科技有限公司 Automatic test system of car information entertainment function

Similar Documents

Publication Publication Date Title
JP6129073B2 (en) Humanoid robot with natural dialogue interface, method for controlling the robot, and corresponding program
CN110659091B (en) Conversation agent conversation flow user interface
US10860345B2 (en) System for user sentiment tracking
RU2690071C2 (en) Methods and systems for managing robot dialogs
CN109521927B (en) Robot interaction method and equipment
CN109584868A (en) Natural Human-Computer Interaction for virtual personal assistant system
US10275341B2 (en) Mobile application usability testing
CN107423049A (en) Realize method, browser and the terminal device of online programming
CN109669831A (en) A kind of test method and electronic equipment
US12161942B2 (en) Videogame telemetry data and game asset tracker for session recordings
CN112115038A (en) Application testing method and device and storage medium
JP2019032719A (en) Information processing system, information processing method, and program
US20190050686A1 (en) Methods and apparatus to add common sense reasoning to artificial intelligence in the context of human machine interfaces
JP2016181018A (en) Information processing system and information processing method
CN113468042A (en) Human-computer interaction test system and method
WO2016206647A1 (en) System for controlling machine apparatus to generate action
JP2013077159A (en) Test automation system
JP6798258B2 (en) Generation program, generation device, control program, control method, robot device and call system
JP2021089576A (en) Information processor, information processing method and program
CN117076635A (en) Information processing method, apparatus, device and storage medium
CN114936358A (en) Human-computer interaction based verification method and verification system
CN111290944B (en) Script generation method, script generation device and storage medium
US20210201139A1 (en) Device and method for measuring a characteristic of an interaction between a user and an interaction device
KR20090001681A (en) Contents / Service Scenario Development Chart Modeling Method for Intelligent Robots
CN119635622A (en) A method and device for intelligent connection of measurement and control instrument system based on multi-modal large model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination