
CN119621550A - Touch simulation method, device, electronic device and storage medium - Google Patents


Info

Publication number
CN119621550A
CN119621550A
Authority
CN
China
Prior art keywords
touch
original
event
coordinate
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411622223.1A
Other languages
Chinese (zh)
Inventor
索文涛
李广辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Snow Software Development Co ltd
Original Assignee
Beijing Snow Software Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Snow Software Development Co ltd filed Critical Beijing Snow Software Development Co ltd
Priority to CN202411622223.1A priority Critical patent/CN119621550A/en
Publication of CN119621550A publication Critical patent/CN119621550A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Prevention of errors by analysis, debugging or testing of software
    • G06F11/3668Testing of software
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186Touch location disambiguation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Position Input By Displaying (AREA)

Abstract

The disclosure provides a touch simulation method, a touch simulation apparatus, an electronic device, and a storage medium. The method includes: reading a touch event device file of an original device based on a system debugging tool, so as to detect touch events occurring on the original device; when a touch event occurring on the original device is detected, updating an original operation set according to the touch event, where the original operation set is used to record touch data, and the touch data characterizes the touch operations received by the original device; correcting the contact coordinates contained in the touch data recorded in the original operation set according to the display configuration information of the target device and the original device; and determining a touch instruction according to the corrected original operation set, where the touch instruction is used to simulate, on the target device, the touch operations received by the original device.

Description

Touch simulation method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of software testing, in particular to a touch simulation method, a touch simulation device, electronic equipment and a storage medium.
Background
Touch operation testing is an important part of the software testing process, and the same touch test procedure often has to be executed on multiple devices, so generating touch instructions that complete the test procedure automatically is particularly important. Currently, it is not uncommon for software systems that are relatively closed, or specialized for a certain class of products, to expose few callable system interfaces (so that some software functions cannot be implemented) and to grant users limited permissions (e.g., not allowing third-party applications to be installed), with the result that performance-statistics software and automated test software cannot run properly on them, or cannot even be installed.
In the related art, automated testing in this situation can only be achieved by manually writing a low-level instruction set. However, writing such an instruction set is difficult to complete in time during software development, the process is far from intuitive, and both the labor cost and the time cost are high.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a touch simulation method, a touch simulation device, an electronic device, and a storage medium.
A first aspect of the present disclosure provides a touch simulation method, the method including:
reading a touch event device file of original equipment based on a system debugging tool to detect a touch event occurring in the original equipment;
When a touch event occurs in the original equipment, updating an original operation set according to the touch event, wherein the original operation set is used for recording touch data, and the touch data are used for representing touch operations received by the original equipment;
correcting contact coordinates contained in touch data recorded by the original operation set according to display configuration information of the target equipment and the original equipment;
and determining a touch instruction according to the corrected original operation set, wherein the touch instruction is used for simulating the touch operation received by the original equipment on the target equipment.
Optionally, the updating the original operation set according to the touch event includes:
creating a data structure for storing contact coordinates having a time sequence relationship in the case that the touch event is a contact tracking event or a touch down event;
recording coordinate values indicated by the touch point coordinate event under the condition that the touch event is the touch point coordinate event;
under the condition that the touch event is a synchronous report event, pairing the recorded coordinate values to obtain a current contact coordinate, and storing the current contact coordinate into the data structure;
and under the condition that the touch event is a touch lifting event, taking the data structure as touch data and recording the touch data into the original operation set.
Optionally, the pairing the recorded coordinate values to obtain the current contact coordinate includes:
in the case that coordinate values corresponding to some of the coordinate axes are not recorded, compensating the current contact coordinate based on at least one historical contact coordinate stored in the data structure.
Optionally, the method further comprises:
when the data structure contains a plurality of contact coordinates that are adjacent in time sequence and the coordinate distance between each pair of them is smaller than a distance threshold, unifying the plurality of contact coordinates or discarding some of them.
Optionally, the correcting, according to the display configuration information of the target device and the original device, the contact coordinates contained in the touch data recorded in the original operation set includes:
determining a coordinate scaling coefficient between the original device and the target device according to the display configuration information of the two devices, wherein the display configuration information includes screen size and/or display resolution;
and scaling the contact coordinates contained in the touch data recorded in the original operation set according to the coordinate scaling coefficient.
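The scaling described above can be sketched as follows. This is an illustrative sketch only: it assumes a per-axis linear coefficient derived from display resolution (one of the options named by "screen size and/or display resolution"); the function name and tuple layout are not from the source.

```python
def scale_point(x, y, src_res, dst_res):
    """Scale a contact coordinate from the original device's resolution to the
    target device's resolution (illustrative per-axis linear scaling)."""
    sx = dst_res[0] / src_res[0]  # X-axis scaling coefficient
    sy = dst_res[1] / src_res[1]  # Y-axis scaling coefficient
    return round(x * sx), round(y * sy)

# Map a point recorded on a 1080x2400 screen onto a 720x1600 screen.
print(scale_point(664, 2342, (1080, 2400), (720, 1600)))  # (443, 1561)
```

A physical-size-based coefficient would follow the same shape, with millimetre dimensions substituted for pixel resolutions.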
Optionally, the determining the touch instruction according to the corrected original operation set includes:
for each item of touch data in the corrected original operation set, determining the contact coordinates contained in the touch data;
when the number of contact coordinates contained in the touch operation is smaller than or equal to a number threshold, determining a click instruction according to the contact coordinates, and taking the click instruction as the touch instruction corresponding to the touch operation;
and when the number of contact coordinates contained in the touch operation is greater than the number threshold, determining a touch pressing instruction from the first contact coordinate of the touch operation in the time sequence, a touch lifting instruction from the last contact coordinate in the time sequence, and at least one touch moving instruction from the remaining contact coordinates, and taking the touch pressing instruction, the at least one touch moving instruction and the touch lifting instruction as the touch instructions corresponding to the touch operation.
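The instruction-selection rule above can be sketched as follows. This is a minimal sketch under stated assumptions: a number threshold of 1, Android-debug-bridge-style command strings, and a point format of `{"x": ..., "y": ...}`; the function name is illustrative.

```python
def to_instructions(points, count_threshold=1):
    """Turn one recorded touch operation (a time-ordered list of contact
    coordinates) into debug-bridge-style command strings (sketch)."""
    if len(points) <= count_threshold:
        # Few points: treat as a single click at the recorded coordinate.
        p = points[0]
        return [f"input tap {p['x']} {p['y']}"]
    # Many points: press at the first, move through the middle, lift at the last.
    cmds = [f"input motionevent DOWN {points[0]['x']} {points[0]['y']}"]
    for p in points[1:-1]:
        cmds.append(f"input motionevent MOVE {p['x']} {p['y']}")
    cmds.append(f"input motionevent UP {points[-1]['x']} {points[-1]['y']}")
    return cmds

print(to_instructions([{"x": 664, "y": 2342}]))
# ['input tap 664 2342']
```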
Optionally, the method further comprises:
Inputting the touch instruction to the target equipment based on a system debugging tool so as to simulate touch operation received by the original equipment on the target equipment;
or inputting the touch instruction to a mechanical arm so that the mechanical arm simulates the touch operation received by the original equipment on the target equipment.
A second aspect of the present disclosure provides a touch simulation apparatus, the apparatus comprising:
an event detection module, configured to read a touch event device file of the original device based on a system debugging tool, so as to detect touch events occurring on the original device;
a touch recording module, configured to update an original operation set according to a touch event when the touch event occurs on the original device, wherein the original operation set is used to record touch data, and the touch data characterizes the touch operations received by the original device;
a coordinate correction module, configured to correct the contact coordinates contained in the touch data recorded in the original operation set according to the display configuration information of the target device and the original device;
and an instruction generation module, configured to determine a touch instruction according to the corrected original operation set, wherein the touch instruction is used to simulate, on the target device, the touch operations received by the original device.
Optionally, when updating the original operation set according to the touch event, the touch recording module is specifically configured to:
creating a data structure for storing contact coordinates having a time sequence relationship in the case that the touch event is a contact tracking event or a touch down event;
recording coordinate values indicated by the touch point coordinate event under the condition that the touch event is the touch point coordinate event;
under the condition that the touch event is a synchronous report event, pairing the recorded coordinate values to obtain a current contact coordinate, and storing the current contact coordinate into the data structure;
and under the condition that the touch event is a touch lifting event, taking the data structure as touch data and recording the touch data into the original operation set.
Optionally, when pairing the recorded coordinate values to obtain the current contact coordinate, the touch recording module is specifically configured to:
in the case that coordinate values corresponding to some of the coordinate axes are not recorded, compensate the current contact coordinate based on at least one historical contact coordinate stored in the data structure.
Optionally, the apparatus further includes:
a filtering module, configured to unify a plurality of contact coordinates or discard some of them when the data structure contains a plurality of contact coordinates that are adjacent in time sequence and the coordinate distance between each pair of them is smaller than a distance threshold.
Optionally, when correcting, according to the display configuration information of the target device and the original device, the contact coordinates contained in the touch data recorded in the original operation set, the coordinate correction module is specifically configured to:
determine a coordinate scaling coefficient between the original device and the target device according to the display configuration information of the two devices, wherein the display configuration information includes screen size and/or display resolution;
and scale the contact coordinates contained in the touch data recorded in the original operation set according to the coordinate scaling coefficient.
Optionally, when determining the touch instruction according to the corrected original operation set, the instruction generation module is specifically configured to:
for each item of touch data in the corrected original operation set, determine the contact coordinates contained in the touch data;
when the number of contact coordinates contained in the touch operation is smaller than or equal to a number threshold, determine a click instruction according to the contact coordinates, and take the click instruction as the touch instruction corresponding to the touch operation;
and when the number of contact coordinates contained in the touch operation is greater than the number threshold, determine a touch pressing instruction from the first contact coordinate of the touch operation in the time sequence, a touch lifting instruction from the last contact coordinate in the time sequence, and at least one touch moving instruction from the remaining contact coordinates, and take the touch pressing instruction, the at least one touch moving instruction and the touch lifting instruction as the touch instructions corresponding to the touch operation.
Optionally, the device further includes a touch simulation module, configured to:
Inputting the touch instruction to the target equipment based on a system debugging tool so as to simulate touch operation received by the original equipment on the target equipment;
or inputting the touch instruction to a mechanical arm so that the mechanical arm simulates the touch operation received by the original equipment on the target equipment.
A third aspect of the present disclosure provides a computer program product comprising computer programs/instructions which, when executed by a processor, implement the method of the first aspect.
A fourth aspect of the present disclosure provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method according to the first aspect when executing the program.
A fifth aspect of the present disclosure provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the method according to the first aspect.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
By recording touch operations, the embodiments of the disclosure generate test instructions in an intuitive way that is not easily restricted by system permissions, so the method remains applicable to relatively closed software systems designed for a specific class of products, greatly improving the usability and universality of touch-based software testing. At the same time, the method provides developers with a highly intuitive testing approach, which greatly reduces the operators' development workload and thus the labor and time costs of software testing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is an application scenario diagram of a touch simulation method according to some exemplary embodiments.
Fig. 2 is a flow chart of a touch simulation method according to some exemplary embodiments.
Fig. 3 is a flow chart of another touch simulation method according to some exemplary embodiments.
Fig. 4 is a flow chart of yet another touch simulation method according to some exemplary embodiments.
Fig. 5 is a block diagram of a touch simulation device according to some exemplary embodiments.
Fig. 6 is a hardware configuration diagram of an electronic device according to some exemplary embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The same product generally must work on devices with different screen sizes and screen resolutions (hereinafter, the screen size, screen resolution, and other parameters that may affect the software's control layout are called the display configuration information of a device). To faithfully simulate the user's everyday operations and exclude potential interference factors as far as possible, touch testing is an important acceptance step: it reproduces the user's actual touch operations on different devices, so that the process under test considers these touch operations to have been received at the adapted positions. For touch testing, one exemplary embodiment generates a system-level instruction set and enters the instructions into the system console, or runs them through a debug bridge, to simulate the touch operations; another exemplary embodiment generates an instruction set for a robotic arm (or another instrument independent of the device under test) and then controls the motion of the robotic arm based on that instruction set, so that the arm performs the touch operations on the device under test.
As described in the background, it is not uncommon for software systems that are relatively closed, or specialized for a certain class of products, to expose few callable system interfaces (so that some software functions are unavailable) and to grant users limited permissions (e.g., not allowing third-party applications to be installed), with the result that performance-statistics software and automated test software cannot run properly on them, or cannot even be installed.
It should be understood that, unless otherwise specified, the disclosure describes parameters such as touch coordinates, control positions, screen resolution, etc., in terms of pixels, and the origin of coordinates is the upper left corner.
In the related art, automated testing in this situation can only be achieved by manually writing a low-level instruction set. However, writing such an instruction set is difficult to complete in time during software development, the process is far from intuitive, and both the labor cost and the time cost are high.
In view of this, the disclosure provides a touch simulation method, a touch simulation device, an electronic device, and a storage medium.
The exemplary events appearing in this disclosure (using the Android system as an example) are briefly described below:
the contact tracking event, EV_ABS ABS_MT_TRACKING_ID, generally indicates that tracking of a new contact ID (contact identifier) has started, i.e., the device has just recognized the contact;
the contact coordinate events, EV_ABS ABS_MT_POSITION_X (X-axis coordinate event), EV_ABS ABS_MT_POSITION_Y (Y-axis coordinate event), and the like, generally represent the current coordinate value of the contact on a given axis;
the touch down event, EV_KEY BTN_TOUCH DOWN, typically marks the start of a touch;
the touch lifting event, EV_KEY BTN_TOUCH UP, generally indicates that one touch process has finished, i.e., the contact has disappeared;
the synchronous report event, EV_SYN SYN_REPORT, typically indicates the end of one touch frame (or scan frame, i.e., one complete touch scan), which is a logical concept. For example, if the X-axis coordinate of the contact is unchanged in the current frame, no X-axis coordinate event occurs, yet the current touch frame still ends and the next touch scan begins; in that case only one contact coordinate event may occur (i.e., only the coordinate event for part of the coordinate axes), followed by the synchronous report event. A scan frame can therefore be considered finished once the synchronous report event is detected.
The exemplary instructions appearing in this disclosure (using the Android debug bridge command line as an example) are briefly described below:
the single-click instruction, such as the input tap command, used to simulate a single click on the touch screen;
the touch pressing instruction, such as the input motionevent DOWN command, used to simulate a press on the touch screen;
the touch moving instruction, such as the input motionevent MOVE command, used to simulate a sliding operation on the touch screen;
and the touch lifting instruction, such as the input motionevent UP command, used to simulate a lift from the touch screen.
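The commands above are run on a device through the Android debug bridge. As a minimal sketch of dispatching one such instruction (assuming `adb` is on PATH and a device is attached; the device id shown is illustrative):

```python
import subprocess

def adb_argv(device_id, instruction):
    """Build the argv for running one touch instruction on a device via the
    Android debug bridge (e.g. instruction = "input tap 664 2342")."""
    return ["adb", "-s", device_id, "shell"] + instruction.split()

def send(device_id, instruction):
    # Actually runs the command; assumes adb is on PATH and the device is attached.
    subprocess.run(adb_argv(device_id, instruction), check=True)

print(adb_argv("emulator-5554", "input tap 664 2342"))
# ['adb', '-s', 'emulator-5554', 'shell', 'input', 'tap', '664', '2342']
```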
The embodiments of the present disclosure are described in detail below.
A first aspect of the present disclosure provides a touch simulation method. Referring to fig. 1, the method may include steps S101 to S104.
Step S101, reading a touch event device file of an original device based on a system debugging tool to detect a touch event occurring in the original device.
Here, a system debugging tool is a software tool for monitoring, analyzing, and resolving various problems in a computer system, such as the Android Debug Bridge (ADB) on Android systems and WinDbg (Windows Debugger) on Windows systems.
For example, the method may be applied on a device other than the original device (i.e., the device under test), such as the debug device in fig. 2. (The device applying the method and the device under test may of course be physically the same device: if the device under test is a virtual machine running on a physical device, the method may also be applied on the physical device's own operating system, on the software supporting the virtual machine, or on another virtual machine.) The device applying the method can connect to the original device through the system debugging tool. Furthermore, the method may be performed by one software module or asynchronously by several software modules, which is not limited by this disclosure.
On this basis, touch events can be monitored without writing any script on the original device, which avoids interference from the system's installation permissions with the software testing process, reduces developers' workload, and removes the need to optimize test-script compatibility for different devices (the original device's touch information is recorded on another device directly through the system debug bridge).
A touch event device file (often abbreviated as a TouchEvent file) is a device file in the operating system used to capture (and process) input events from the touch module (or, of course, from peripherals such as a mouse). These files can be regarded as an interface that typically interacts with the hardware through a device driver and converts touch events into input signals the system can understand. Note that there may be one or more touch event device files, and their paths may differ between systems and between versions of the same system; for example, on an Android system (e.g., Android 13), the TouchEvent file is typically located at the /dev/input/event6 path (as in the example described later).
This process reads the touch event device file on the device through the system debugging tool and obtains the touch event information directly, avoiding the development difficulty and the permission restrictions that come with monitoring event information by developing scripts that acquire low-level system privileges. In other words, the process is a simple read-only procedure: the touch events can be captured without affecting the normal operation of the operating system and processes on the original device, and without being limited by operating system permissions.
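The read-only monitoring described above can be sketched as follows (an illustrative sketch: it assumes `adb` is on PATH, an attached device, and a getevent-compatible event stream; the device file path is device-specific, as noted above):

```python
import subprocess

TOUCH_DEV = "/dev/input/event6"  # device-specific; differs across systems/versions

def is_touch_line(line, dev_file=TOUCH_DEV):
    """Keep only event lines emitted for the touch event device file."""
    return line.startswith(dev_file + ":")

def touch_event_lines(device_id):
    """Stream raw event lines from the device via `adb shell getevent -l`
    (read-only; nothing is written to the device under test)."""
    proc = subprocess.Popen(
        ["adb", "-s", device_id, "shell", "getevent", "-l"],
        stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        line = line.strip()
        if is_touch_line(line):
            yield line

print(is_touch_line("/dev/input/event6: EV_KEY BTN_TOUCH DOWN"))  # True
```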
In some embodiments, the touch events may include a contact tracking event, a touch down event, contact coordinate events, a synchronous report event, a touch lifting event, and the like.
Step S102, when a touch event is detected in the original device, updating an original operation set according to the touch event, where the original operation set is used to record touch data, and the touch data is used to characterize a touch operation received by the original device.
The original operation set in the above step may be a character string that directly records the event content (for example, recording the data stream from the touch event device file entry by entry), or an array that records the contact coordinates in time order, for example "[{x:1001, y:200}, {x:1101, y:200}, ...]", which stores the X-axis and Y-axis coordinates of successive frames in sequence, each of {x:1001, y:200} and {x:1101, y:200} being one element of the array. This saves the most space, because information other than the contact coordinates can generally be derived from reference quantities in other dimensions; for example, when the original operation set contains only one element, the user performed a click operation, and the contact coordinate of that click is the coordinate recorded in the element, as described later. In the following, the original operation set is taken to be an array that records contact coordinates in time order.
As an exemplary embodiment, the updating the original operation set according to the touch event may include:
creating a data structure for storing contact coordinates having a time sequence relationship in the case that the touch event is a contact tracking event or a touch down event;
recording coordinate values indicated by the touch point coordinate event under the condition that the touch event is the touch point coordinate event;
under the condition that the touch event is a synchronous report event, pairing the recorded coordinate values to obtain a current contact coordinate, and storing the current contact coordinate into the data structure;
and under the condition that the touch event is a touch lifting event, taking the data structure as touch data and recording the touch data into the original operation set.
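The four-branch update rule above can be sketched as a small state machine. This is a simplified single-contact sketch under stated assumptions: events arrive as (instruction, value) pairs mirroring the getevent output shown below, and hexadecimal values are converted on the fly; function and variable names are illustrative.

```python
def update_operation_set(op_set, events):
    """Apply the four-branch update rule to a stream of (instruction, value)
    pairs (simplified single-contact sketch)."""
    trace, pending = None, {}
    for name, value in events:
        if name == "ABS_MT_TRACKING_ID" and value != "ffffffff":
            trace = []                        # contact tracking: new data structure
        elif name == "BTN_TOUCH" and value == "DOWN":
            trace = [] if trace is None else trace   # touch down also opens one
        elif name == "ABS_MT_POSITION_X":
            pending["x"] = int(value, 16)     # record X until the sync report
        elif name == "ABS_MT_POSITION_Y":
            pending["y"] = int(value, 16)     # record Y until the sync report
        elif name == "SYN_REPORT" and trace is not None and pending:
            point = dict(pending)
            if len(point) < 2 and trace:      # a missing axis is compensated
                point = {**trace[-1], **point}  # from the previous coordinate
            trace.append(point)
            pending = {}
        elif name == "BTN_TOUCH" and value == "UP" and trace is not None:
            op_set.append(trace)              # one complete touch operation
            trace = None
    return op_set

events = [("ABS_MT_TRACKING_ID", "00000253"),
          ("ABS_MT_POSITION_X", "00000298"),
          ("ABS_MT_POSITION_Y", "00000926"),
          ("BTN_TOUCH", "DOWN"),
          ("SYN_REPORT", "00000000"),
          ("ABS_MT_POSITION_X", "00000296"),
          ("SYN_REPORT", "00000000"),
          ("ABS_MT_TRACKING_ID", "ffffffff"),
          ("BTN_TOUCH", "UP"),
          ("SYN_REPORT", "00000000")]
print(update_operation_set([], events))
# [[{'x': 664, 'y': 2342}, {'x': 662, 'y': 2342}]]
```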
The following is an exemplary data stream from a touch event device file as it appears on an Android 13 device:
/dev/input/event6:EV_ABS ABS_MT_TRACKING_ID 00000253
......
/dev/input/event6:EV_ABS ABS_MT_POSITION_X 00000298
/dev/input/event6:EV_ABS ABS_MT_POSITION_Y 00000926
/dev/input/event6:EV_KEY BTN_TOUCH DOWN
/dev/input/event6:EV_SYN SYN_REPORT 00000000
......
/dev/input/event6:EV_ABS ABS_MT_POSITION_X 00000296
/dev/input/event6:EV_SYN SYN_REPORT 00000000
/dev/input/event6:EV_ABS ABS_MT_TRACKING_ID ffffffff
......
/dev/input/event6:EV_KEY BTN_TOUCH UP
/dev/input/event6:EV_SYN SYN_REPORT 00000000
Each data line in the above example can be regarded as one touch event: /dev/input/event6 is the touch event device file, EV_ABS is the event type, ABS_MT_TRACKING_ID is the event instruction, and 00000253 is the value carried by the event. The data lines follow a "{device file}: {event type} {event instruction} {instruction value}" format. For example, the data lines can be obtained by listening for device events with the Android debug bridge command line "adb -s {deviceId} shell getevent -l" and then filtering for the "/dev/input/event6" device file.
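A line in that format can be parsed as follows (an illustrative sketch; whitespace handling is deliberately lenient, since getevent output may or may not put a space after the colon):

```python
import re

LINE_RE = re.compile(
    r"^(?P<dev>/dev/input/event\d+):\s*"        # {device file}
    r"(?P<type>\S+)\s+(?P<instr>\S+)"           # {event type} {event instruction}
    r"(?:\s+(?P<value>\S+))?$")                 # optional {instruction value}

def parse_event_line(line):
    """Parse one getevent data line into its four fields, or None."""
    m = LINE_RE.match(line.strip())
    return m.groupdict() if m else None

print(parse_event_line("/dev/input/event6: EV_ABS ABS_MT_TRACKING_ID 00000253"))
# {'dev': '/dev/input/event6', 'type': 'EV_ABS', 'instr': 'ABS_MT_TRACKING_ID', 'value': '00000253'}
```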
First, the mark of the start of an operation can be obtained from the monitored real-time data. Referring to fig. 3, in the Android 13 example above, it can generally be considered that a contact tracking event has occurred when the value of the "ABS_MT_TRACKING_ID" event is not equal to "ffffffff" (i.e., the event carries a valid value, usually the ID of the contact). In the data lines above, a line whose event type is "EV_ABS", whose event instruction is "ABS_MT_TRACKING_ID", and whose instruction value is not "ffffffff" therefore marks the start of an operation.
Then, subsequent events whose event type is "EV_ABS" and whose event instruction is "ABS_MT_POSITION_X" (X coordinate) or "ABS_MT_POSITION_Y" (Y coordinate) indicate changes of the touch point coordinates during the touch process; these are the touch point coordinate events. The coordinate values indicated by the touch point coordinate events may be recorded: for example, the hexadecimal instruction value is converted into a decimal value, then the X coordinate and the Y coordinate are paired, and the pair is finally stored in an array.
The pairing and storing may be performed when a "SYN_REPORT" event (synchronization report event) occurs; a synchronization report event generally represents a change in the X or Y coordinate while the user completes a touch trajectory. At this point, the X and Y coordinates indicated by the touch point coordinate events that occurred before the synchronization report event may be paired. For example, before the first "SYN_REPORT" event in the data lines, the recorded X coordinate is 00000298 and the Y coordinate is 00000926, so a decimal coordinate pair {x: 664, y: 2342} may be determined and added to the array as the current touch point coordinate.
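The hex-to-decimal conversion and the pairing-on-SYN_REPORT logic described above can be sketched as follows; the `track` and `pending` structures are our assumptions, not the disclosure's actual data structures:

```python
def hex_to_dec(value):
    """Convert a getevent hex value such as "00000298" to decimal."""
    return int(value, 16)

track = []    # the per-gesture array (the "original operation set")
pending = {}  # latest coordinate values seen for each axis

def on_event(instruction, value):
    """Record X/Y updates and snapshot a paired point on each SYN_REPORT."""
    if instruction == "ABS_MT_POSITION_X":
        pending["x"] = hex_to_dec(value)
    elif instruction == "ABS_MT_POSITION_Y":
        pending["y"] = hex_to_dec(value)
    elif instruction == "SYN_REPORT" and pending:
        track.append(dict(pending))  # store the current paired coordinate

# Replay the events before the first SYN_REPORT in the trace
on_event("ABS_MT_POSITION_X", "00000298")
on_event("ABS_MT_POSITION_Y", "00000926")
on_event("SYN_REPORT", "00000000")
```

Because `pending` keeps the last value per axis, a frame that only reports one axis naturally inherits the other axis from earlier events.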
Before a "SYN_REPORT" instruction, only a single coordinate change instruction may be received, depending on how the user's finger moves along the operation path, and the monitored X (or Y) coordinate then needs to be paired with the most recently recorded Y (or X) coordinate. That is, pairing the recorded coordinate values to obtain the current contact point coordinate may include, in the case where no coordinate value was recorded for some coordinate axis, compensating the current contact point coordinate based on at least one historical contact point coordinate stored in the data structure.
For example, before the second "SYN_REPORT" event in the data lines there is only one "ABS_MT_POSITION_X" instruction, with value "00000296" (an X coordinate of 662 in decimal), and no "ABS_MT_POSITION_Y" instruction (i.e., the Y coordinate is missing). This X coordinate may then be paired with the Y coordinate of the previous frame (or an earlier frame), i.e., a historical contact point coordinate, and stored in the array (e.g., yielding the array element {x: 662, y: 2342}).
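The compensation step can be illustrated with a small helper that falls back to the last stored point for any axis that was not reported; the function name and its zero-point default are illustrative assumptions:

```python
def pair_with_history(new_x, new_y, history):
    """Build the current point; missing axes fall back to the last stored point.

    new_x/new_y are the values seen since the last SYN_REPORT (None if that
    axis was not reported); history is the list of points recorded so far.
    """
    last = history[-1] if history else {"x": 0, "y": 0}
    return {
        "x": new_x if new_x is not None else last["x"],
        "y": new_y if new_y is not None else last["y"],
    }

history = [{"x": 664, "y": 2342}]
# Second SYN_REPORT of the trace: only an X update (0x296 = 662), Y missing
history.append(pair_with_history(662, None, history))
```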
Finally, when the monitored event instruction is "BTN_TOUCH" and the value is "UP" (i.e., a touch lift event occurs), this generally represents that a stretch of touch operation has finished. At this point, the array recorded in the above process (the original operation set) can be used to indicate a complete touch track (i.e., the single-event independent array described in the figure), and the array may be stored in another data structure or a database.
In some embodiments, during the above process (i.e., the process of creating and updating the original operation set) or after it finishes (i.e., after the original operation set corresponding to one touch track is obtained), the contact point coordinates recorded in the original operation set may be further modified. For example, where the data structure contains a plurality of contact point coordinates that are adjacent in time sequence and whose pairwise coordinate distance is smaller than a distance threshold, those contact point coordinates may be unified into one value, or some of them may be discarded.
The above procedure can be regarded as applying a proximity filter to the generated original operation set (tiny changes during the operation are treated as no change and unified into one stable coordinate value). When the change in X or Y between adjacent frames is smaller than a preset threshold (which can usually be tuned to the actual situation), the coordinates are considered effectively unchanged between those frames, which avoids a click operation being misjudged as a drag operation due to errors in the touch module.
In execution, the logic of this process may be, for example: read in a contact point coordinate, compute its distance to the contact point coordinate of the previous frame, and, when the distance is small, take the previous frame's contact point coordinate as the contact point coordinate of the current frame. Alternatively, the logic may be: select a contact point coordinate as a reference coordinate, then determine whether several frames before and after it have an average frame-to-frame displacement smaller than a preset threshold; if so, take the mean of those frames' contact point coordinates as the corrected contact point coordinate, or directly take the reference coordinate or the coordinate of some other frame (such as the middle frame in time sequence) as the contact point coordinate of those frames, or directly discard some of the contact point coordinates.
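The first execution logic (replacing a near-duplicate point with the previous frame's point) can be sketched like this; the threshold value and function name are placeholders:

```python
def proximity_filter(points, threshold):
    """Collapse consecutive points closer than `threshold` onto one value.

    A point whose distance to the previously kept point is below the
    threshold is replaced by that previous point, so a jittery tap is not
    misread as a drag.
    """
    filtered = []
    for p in points:
        if filtered:
            prev = filtered[-1]
            dist = ((p["x"] - prev["x"]) ** 2 + (p["y"] - prev["y"]) ** 2) ** 0.5
            if dist < threshold:
                filtered.append(dict(prev))  # treat the jitter as "no movement"
                continue
        filtered.append(dict(p))
    return filtered

jittery = [{"x": 100, "y": 100}, {"x": 101, "y": 101}, {"x": 300, "y": 300}]
out = proximity_filter(jittery, threshold=5)
```

Here the 1-pixel wobble between the first two frames is flattened, while the genuine jump to (300, 300) is kept.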
And step S103, according to the display configuration information of the target equipment and the original equipment, correcting the contact coordinates contained in the touch data recorded by the original operation set.
In particular, since these tests may involve devices with different display configuration information, the size and layout of the controls also need to be considered when simulating the same user touch operation on different devices.
For example, in the example shown in fig. 2, assume that device A and device B have the same screen size, but both the horizontal and vertical pixel counts of device B's screen are twice those of device A. Under one common zoom logic, for the same software interface, a button may be located at (100, 100) with size (20, 20) when the interface runs full-screen on device A, i.e., a single click event within that range can trigger the button. On device B, the location may change based on an adaptation algorithm deployed in the system layer or the application under test; for example, under one common logic the button may be located at (200, 200) with size (40, 40), and likewise a single click event within that range can trigger the button.
Then, if touch testing needs to be completed on both device A and device B, the touch operation directed at the button on device A (touch coordinates in the range (100, 100) to (120, 120)) needs to be mapped to the touch operation directed at the button on device B (touch coordinates in the range (200, 200) to (240, 240)). "Directed at" here refers to direction at the level of user operation, that is, the position on device B's display module to which this touch operation on device A should theoretically correspond for the user.
That is, optionally, the correcting the coordinates included in the touch data recorded in the original operation set according to the display configuration information of the target device and the original device may include determining a coordinate scaling factor between the original device and the target device according to the display configuration information of the target device and the original device, where the display configuration information includes a screen size and/or a display resolution, and scaling the coordinates of the touch point included in the touch data recorded in the original operation set according to the coordinate scaling factor.
In general, under the software interface scaling logic currently in common use, the touch data recorded in the original operation set (the contact point coordinates therein) may be modified to adapt to the target device as follows. Let the pixel density in the X-axis direction of the original device be w1, the pixel density in its Y-axis direction be h1, and the corresponding values of the target device be w2 and h2. Then each X coordinate recorded in the original operation set may be converted as X = POSITION_X × (w2/w1), and each Y coordinate as Y = POSITION_Y × (h2/h1), where POSITION_X and POSITION_Y are the X and Y values before conversion, and w2/w1 and h2/h1 may be regarded as the coordinate scaling coefficients.
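A direct transcription of these conversion formulas, assuming the common linear scaling logic described above (the rounding choice is ours):

```python
def scale_point(point, w1, h1, w2, h2):
    """Map a point recorded on the original device onto the target device.

    w1/h1 and w2/h2 describe the original and target devices in the X and Y
    directions; w2/w1 and h2/h1 are the coordinate scaling coefficients.
    """
    return {
        "x": round(point["x"] * w2 / w1),
        "y": round(point["y"] * h2 / h1),
    }

# Device B has twice the pixels of device A in both directions (fig. 2 example)
scaled = scale_point({"x": 100, "y": 100}, w1=1080, h1=2400, w2=2160, h2=4800)
```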
It should be appreciated that if the software or operating system has its own unique interface scaling logic, the modification logic described above may be adapted to that logic so that the same touch operation can still be simulated on device B. The "same touch operation" here refers to a touch operation that can theoretically produce the same operation result; since, for the touch module, a touch operation is directed at the interface, the above modification of the original operation set may be performed based on the display configuration information (such as the size, resolution, and pixel density of the display module) of the original device. It should also be understood that the present disclosure does not exclude input devices such as a mouse, a pointing stick (e.g., TrackPoint or TrackStick), or a touch pad (common on notebook computers), i.e., cases where the above touch operations on the original device are entered through such input devices. Regardless of the device, the cursor controlled by the input device (which may correspond to a contact point) has a correspondence with the screen coordinate system of the original device, so the above principle also applies to the coordinate conversion of these input devices; the additional conversion logic specific to external devices is not repeated here.
Step S104, determining a touch instruction according to the corrected original operation set, wherein the touch instruction is used for simulating the touch operation received by the original equipment on the target equipment.
After the original operation set has been modified, the touch operation recorded in it can be analyzed to determine touch instructions; the specific instruction correspondence can be determined according to the device that will execute the instructions.
In some embodiments, the determining the touch instruction according to the modified original operation set may include:
Determining contact coordinates contained in touch data aiming at each touch data in the corrected original operation set;
Determining a clicking instruction according to the contact coordinates contained in the touch operation under the condition that the number of the contact coordinates contained in the touch operation is smaller than or equal to a number threshold, and determining the clicking instruction as a touch instruction corresponding to the touch operation;
And under the condition that the number of the contact coordinates included in the touch operation is larger than the number threshold, determining a touch pressing instruction according to the first contact coordinate included in the touch operation on a time sequence relation, determining a touch lifting instruction according to the last contact coordinate included in the touch operation on the time sequence relation, determining at least one touch moving instruction according to other contact coordinates included in the touch operation, and determining the touch pressing instruction, the at least one touch moving instruction and the touch lifting instruction as touch instructions corresponding to the touch operation.
Referring to fig. 4, "filtering" in the "filtered array" refers to the correction process and the proximity filtering process exemplarily described above. Since the number of touch point coordinates contained in the original operation set actually reflects the time this touch operation took (for most devices, the reporting of touch coordinates is related to the number of scan frames and to the specific changes of the touch points), when the array length is 1 (or another value not greater than the number threshold), the user (i.e., the developer entering the touch operation on the original device) can be considered to have actually intended a single click during this touch process (because of the short duration or small coordinate change). A click instruction can then be generated based on these coordinates (the coordinates of this instruction may be determined by mean filtering or taken directly), for example an "input tap {x} {y}" instruction (in the form of an ADB command).
When the number of touch point coordinates in the original operation set is large, the user can be considered to have performed a sliding (dragging) operation, which is generally divided into three phases: finger press, finger move, and finger lift. The 0th data item is then taken (0 being the index of the item in the array, so the 0th item is the first one; "the first" may also mean the first N items, where N may be related to the number threshold or to characteristics of the original and target devices, for example taking the number threshold directly as N), and a touch down instruction is generated based on that data.
Then, the touch up instruction may be generated from the last contact point coordinate (i.e., the element at index len-1, where len is the length of the array; as before, "the last" may also mean the last M items, where M may likewise be related to the number threshold or to characteristics of the original and target devices, or be some other preset value). The remaining data in the array represent the contact point movement within the touch track, so at least one touch move instruction may be generated, in time sequence, from the remaining data (i.e., the data left in the original operation set after removing the data used for the touch down instruction and the data used for the touch up instruction). After the above instructions are arranged in time sequence, they can be input to the original device or to another device (for example, the target device or a mechanical arm) to play back the touch track and automatically complete touch tests on different devices. The only thing the operator needs to do in this process is the example operation performed on the original device (as shown in fig. 2), which is extremely efficient.
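The tap-versus-swipe decision and instruction generation can be sketched as follows. Note that "input tap" is a real ADB shell command, but the "input touch down/move/up" strings here are illustrative placeholders; the actual playback instruction set depends on the target device and toolchain:

```python
def track_to_commands(track, count_threshold=1):
    """Turn a recorded track into shell-style playback commands.

    Tracks no longer than `count_threshold` become a single tap; longer
    tracks become a down / move... / up sequence in time order.
    """
    if len(track) <= count_threshold:
        p = track[0]
        return ["input tap {} {}".format(p["x"], p["y"])]
    cmds = ["input touch down {} {}".format(track[0]["x"], track[0]["y"])]
    for p in track[1:-1]:
        cmds.append("input touch move {} {}".format(p["x"], p["y"]))
    cmds.append("input touch up {} {}".format(track[-1]["x"], track[-1]["y"]))
    return cmds

tap_cmds = track_to_commands([{"x": 664, "y": 2342}])
swipe_cmds = track_to_commands([{"x": 0, "y": 0}, {"x": 5, "y": 5}, {"x": 10, "y": 10}])
```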
For example, in some embodiments, as previously described, the method may further include inputting the touch instruction to the target device based on a system debug tool to simulate a touch operation received by the original device on the target device. Or inputting the touch instruction to a mechanical arm so that the mechanical arm simulates the touch operation received by the original equipment on the target equipment. The above two exemplary modes can achieve complete decoupling of the target device and the test script, that is, the test script does not need to be installed on the target device in addition to the original device (in which the example of the mechanical arm does not even need to establish a connection with the target device), thereby further improving the usability of the method. The description of the scene may be referred to above, and will not be repeated here.
The embodiments of the present disclosure realize test instruction generation intuitively by recording touch operations. Because the test instructions are generated from detected touch events, and touch events are detected by reading the touch event device file through a system debugging tool (a read-only process with respect to the file), no application or script needs to be installed on the original device. The method is therefore not easily limited by system permissions and can still be applied to relatively closed software systems customized for a particular product, which greatly improves the usability and universality of this touch-based software testing method. Meanwhile, based on this implementation, an operator only needs to act as a real user on the original device to obtain the touch instructions for playing back the operation process on the target device, which is intuitive, offers a good user experience, greatly reduces the operator's development workload, and further reduces the labor and time costs of software testing.
Corresponding to the embodiments of the aforementioned method, the present disclosure also provides embodiments of the apparatus and the terminal to which it is applied.
A second aspect of the present disclosure provides a touch simulation device, referring to fig. 5, the device includes:
the event detection module 501 is configured to read a touch event device file of an original device based on a system debug tool, so as to detect a touch event occurring in the original device;
The touch recording module 502 is configured to update an original operation set according to a touch event when the touch event is detected to occur in the original device, where the original operation set is used to record touch data, and the touch data is used to characterize a touch operation received by the original device;
A coordinate correcting module 503, configured to correct coordinates of a contact point included in the touch data recorded in the original operation set according to display configuration information of the target device and the original device;
the instruction generating module 504 is configured to determine a touch instruction according to the modified original operation set, where the touch instruction is used to simulate, on the target device, a touch operation received by the original device.
Optionally, the touch recording module is configured to, when the original operation set is updated according to the touch event, specifically:
creating a data structure for storing contact coordinates having a time sequence relationship in the case that the touch event is a contact tracking event or a touch down event;
recording coordinate values indicated by the touch point coordinate event under the condition that the touch event is the touch point coordinate event;
under the condition that the touch event is a synchronous report event, pairing the recorded coordinate values to obtain a current contact coordinate, and storing the current contact coordinate into the data structure;
And under the condition that the touch event is a touch lifting event, taking the data structure as touch data and recording the touch data into the original operation set.
Optionally, the touch recording module is configured to, when the recorded coordinate values are paired to obtain the current contact coordinate, specifically:
And under the condition that the coordinate values corresponding to part of coordinate axes are not recorded, compensating the current contact point coordinate based on at least one historical contact point coordinate stored in the data structure.
Optionally, the apparatus further includes:
And the filtering module is used for, when the data structure contains a plurality of contact coordinates that are adjacent in time sequence and whose pairwise coordinate distance is smaller than the distance threshold, unifying those contact coordinates or discarding some of them.
Optionally, the coordinate correcting module is configured to correct coordinates included in the touch data recorded in the original operation set according to display configuration information of the target device and the original device, and specifically is configured to:
Determining a coordinate scaling coefficient between the original equipment and the target equipment according to display configuration information of the target equipment and the original equipment, wherein the display configuration information comprises screen size and/or display resolution;
and scaling the touch point coordinates contained in the touch data recorded by the original operation set according to the coordinate scaling coefficient.
Optionally, the instruction determining module is configured to, when determining the touch instruction according to the modified original operation set, specifically:
Determining contact coordinates contained in touch data aiming at each touch data in the corrected original operation set;
Determining a clicking instruction according to the contact coordinates contained in the touch operation under the condition that the number of the contact coordinates contained in the touch operation is smaller than or equal to a number threshold, and determining the clicking instruction as a touch instruction corresponding to the touch operation;
And under the condition that the number of the contact coordinates included in the touch operation is larger than the number threshold, determining a touch pressing instruction according to the first contact coordinate included in the touch operation on a time sequence relation, determining a touch lifting instruction according to the last contact coordinate included in the touch operation on the time sequence relation, determining at least one touch moving instruction according to other contact coordinates included in the touch operation, and determining the touch pressing instruction, the at least one touch moving instruction and the touch lifting instruction as touch instructions corresponding to the touch operation.
Optionally, the device further includes a touch simulation module, configured to:
Inputting the touch instruction to the target equipment based on a system debugging tool so as to simulate touch operation received by the original equipment on the target equipment;
or inputting the touch instruction to a mechanical arm so that the mechanical arm simulates the touch operation received by the original equipment on the target equipment.
The implementation process of the functions and roles of each module in the above device is specifically shown in the implementation process of the corresponding steps in the above method, and will not be described herein again.
A third aspect of the present disclosure provides a computer program product comprising computer programs/instructions which, when executed by a processor, implement the method as described in the first aspect.
For apparatus embodiments and computer program product embodiments, reference is made to the description of method embodiments for relevance, since they correspond essentially to the method embodiments. Furthermore, the apparatus embodiments described above are merely illustrative, wherein the modules illustrated as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the objectives of the disclosed solution. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
In a fourth aspect, an embodiment of a touch simulation apparatus provided by the present disclosure may be applied to an electronic device. Referring to fig. 6, a hardware schematic of an electronic device is schematically shown. For example, device 600 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
The device 600 may include one or more of a processing component 601, a memory 602, a power component 603, a multimedia component 604, an audio component 605, an input/output (I/O) interface 606, a sensor component 607, and a communication component 608.
The processing component 601 generally controls overall operation of the device 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 601 may include one or more processors 609 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 601 may include one or more modules that facilitate interactions between the processing component 601 and other components. For example, the processing component 601 may include a multimedia module to facilitate interaction between the multimedia component 604 and the processing component 601.
The memory 602 is configured to store various types of data to support operations at the device 600. Examples of such data include instructions for any application or method operating on device 600, contact data, phonebook data, messages, pictures, videos, and the like. The memory 602 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 603 provides power to the various components of the apparatus 600. Power component 603 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 600.
The multimedia component 604 includes a screen between the device 600 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or sliding action, but also the duration and pressure associated with the touch or sliding operation. In some embodiments, the multimedia component 604 includes a front camera and/or a rear camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 600 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 605 is configured to output and/or input audio signals. For example, the audio component 605 includes a Microphone (MIC) configured to receive external audio signals when the device 600 is in an operational mode, such as a call mode, a recording mode, and a speech recognition mode. The received audio signals may be further stored in the memory 602 or transmitted via the communication component 608. In some embodiments, the audio component 605 also includes a speaker for outputting audio signals.
The I/O interface 606 provides an interface between the processing component 601 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to, a home button, a volume button, an activate button, and a lock button.
The sensor assembly 607 includes one or more sensors for providing status assessment of various aspects of the device 600. For example, the sensor assembly 607 may detect the on/off state of the device 600 and the relative positioning of components, such as the display and keypad of the device 600. The sensor assembly 607 may also detect a change in position of the device 600 or a component of the device 600, the presence or absence of user contact with the device 600, the orientation or acceleration/deceleration of the device 600, and a change in temperature of the device 600. The sensor assembly 607 may also include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 607 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 607 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 608 is configured to facilitate communication between the device 600 and other devices, either wired or wireless. The device 600 may access a wireless network based on a communication standard, such as WiFi,2G or 3G,4G or 5G, or a combination thereof. In one exemplary embodiment, the communication component 608 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 608 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the touch simulation method of an electronic apparatus as described above.
In a fifth aspect, the present disclosure also provides, in an exemplary embodiment, a non-transitory computer-readable storage medium, such as memory 602, comprising instructions executable by processor 609 of device 600 to perform the touch simulation method of the electronic device described above. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
The foregoing has described certain embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles, including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The foregoing description of the preferred embodiments of the present disclosure is not intended to limit the disclosure, but rather to cover all modifications, equivalents, improvements and alternatives falling within the spirit and principles of the present disclosure.

Claims (10)

1. A touch simulation method, the method comprising:
reading a touch event device file of an original device using a system debugging tool, to detect a touch event occurring on the original device;
when a touch event occurs on the original device, updating an original operation set according to the touch event, wherein the original operation set is used for recording touch data, and the touch data represents touch operations received by the original device;
correcting contact coordinates contained in the touch data recorded in the original operation set according to display configuration information of a target device and of the original device; and
determining a touch instruction according to the corrected original operation set, wherein the touch instruction is used for simulating, on the target device, the touch operation received by the original device.
2. The touch simulation method according to claim 1, wherein updating the original operation set according to the touch event comprises:
in a case where the touch event is a contact tracking event or a touch-down event, creating a data structure for storing contact coordinates having a time-sequence relationship;
in a case where the touch event is a contact coordinate event, recording the coordinate value indicated by the contact coordinate event;
in a case where the touch event is a synchronization report event, pairing the recorded coordinate values to obtain a current contact coordinate, and storing the current contact coordinate in the data structure; and
in a case where the touch event is a touch-up event, recording the data structure into the original operation set as touch data.
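The event handling described above can be sketched as a small state machine. The following is a minimal, illustrative sketch assuming Linux-style multi-touch input semantics (touch-down, per-axis coordinate events, a synchronization report, touch-up); the `TouchRecorder` class and the event names are assumptions for illustration, not part of the patent.

```python
class TouchRecorder:
    def __init__(self):
        self.trace = None     # data structure holding time-ordered contact coordinates
        self.pending = {}     # coordinate values seen since the last sync report
        self.operations = []  # the "original operation set"

    def on_event(self, kind, value=None):
        if kind in ("touch_down", "contact_tracking"):
            self.trace = []                      # create the per-contact data structure
        elif kind in ("abs_x", "abs_y"):
            self.pending[kind] = value           # record a contact coordinate event
        elif kind == "sync_report":
            if self.trace is not None and self.pending:
                # pair the recorded axis values into a contact coordinate;
                # a missing axis is compensated from the last stored coordinate
                last = self.trace[-1] if self.trace else (0, 0)
                x = self.pending.get("abs_x", last[0])
                y = self.pending.get("abs_y", last[1])
                self.trace.append((x, y))
            self.pending = {}
        elif kind == "touch_up":
            if self.trace is not None:
                self.operations.append(self.trace)  # record touch data into the set
            self.trace = None
```

For example, a down event followed by two coordinate/sync pairs and an up event yields one recorded touch operation; when the second sync report carries only an x value, the y value is compensated from the preceding coordinate (as in claim 3).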
3. The touch simulation method according to claim 2, wherein pairing the recorded coordinate values to obtain the current contact coordinate comprises:
in a case where a coordinate value corresponding to one of the coordinate axes has not been recorded, compensating the current contact coordinate based on at least one historical contact coordinate stored in the data structure.
4. The touch simulation method according to claim 2, further comprising:
when the data structure contains a plurality of contact coordinates that are adjacent in time sequence and whose pairwise coordinate distances are smaller than a distance threshold, unifying the plurality of contact coordinates or discarding some of the plurality of contact coordinates.
5. The touch simulation method according to claim 1, wherein correcting the contact coordinates contained in the touch data recorded in the original operation set according to the display configuration information of the target device and the original device comprises:
determining a coordinate scaling coefficient between the original device and the target device according to the display configuration information of the target device and of the original device, wherein the display configuration information comprises a screen size and/or a display resolution; and
scaling the contact coordinates contained in the touch data recorded in the original operation set according to the coordinate scaling coefficient.
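The coordinate correction above amounts to a per-axis scaling between the two displays. A minimal sketch, assuming the scaling coefficient is derived from display resolutions (the claim also allows screen size to be used); the function name and rounding choice are illustrative:

```python
def scale_coordinates(trace, src_res, dst_res):
    """Map contact coordinates recorded on the original device (resolution
    src_res = (width, height)) onto the target device (resolution dst_res)."""
    sx = dst_res[0] / src_res[0]   # horizontal scaling coefficient
    sy = dst_res[1] / src_res[1]   # vertical scaling coefficient
    return [(round(x * sx), round(y * sy)) for x, y in trace]
```

For example, a tap at the center of a 1080x1920 screen maps to the center of a 720x1280 screen.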
6. The touch simulation method according to claim 1, wherein determining the touch instruction according to the corrected original operation set comprises:
for each piece of touch data in the corrected original operation set, determining the contact coordinates contained in the touch data;
in a case where the number of contact coordinates contained in the touch operation is smaller than or equal to a number threshold, determining a tap instruction according to the contact coordinates contained in the touch operation, and taking the tap instruction as the touch instruction corresponding to the touch operation; and
in a case where the number of contact coordinates contained in the touch operation is greater than the number threshold, determining a touch-down instruction according to the first contact coordinate of the touch operation in the time-sequence relationship, determining a touch-up instruction according to the last contact coordinate of the touch operation in the time-sequence relationship, determining at least one touch-move instruction according to the other contact coordinates contained in the touch operation, and taking the touch-down instruction, the at least one touch-move instruction, and the touch-up instruction as the touch instructions corresponding to the touch operation.
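The instruction generation above can be sketched as a simple classification on the length of a contact trace. The instruction tuples and the default threshold below are illustrative assumptions, not values from the patent:

```python
def to_instructions(trace, count_threshold=2):
    """Turn one touch operation (a time-ordered list of contact coordinates)
    into touch instructions: a short trace becomes a tap, a longer trace
    becomes a touch-down / touch-move(s) / touch-up sequence."""
    if len(trace) <= count_threshold:
        x, y = trace[0]                    # tap at the (first) contact coordinate
        return [("tap", x, y)]
    first, *middle, last = trace           # requires len(trace) >= 3 here
    return ([("touch_down", *first)]
            + [("touch_move", *p) for p in middle]
            + [("touch_up", *last)])
```

A four-point trace thus yields one down instruction, two move instructions, and one up instruction, reproducing a swipe.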
7. The touch simulation method according to claim 1, further comprising:
inputting the touch instruction to the target device using a system debugging tool, to simulate, on the target device, the touch operation received by the original device; or
inputting the touch instruction to a robotic arm, so that the robotic arm simulates, on the target device, the touch operation received by the original device.
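For the first replay path, one concrete possibility is the Android Debug Bridge's `input` command; whether the patent's "system debugging tool" is adb is an assumption. The sketch below only builds the command strings (taps and swipes, which `adb shell input` supports) rather than executing them; down/move/up sequences would need a lower-level mechanism such as `sendevent`:

```python
def to_adb_command(instruction):
    """Render an illustrative instruction tuple as an adb shell input command."""
    kind = instruction[0]
    if kind == "tap":
        _, x, y = instruction
        return f"adb shell input tap {x} {y}"
    if kind == "swipe":
        _, x1, y1, x2, y2, duration_ms = instruction
        return f"adb shell input swipe {x1} {y1} {x2} {y2} {duration_ms}"
    raise ValueError(f"unsupported instruction: {kind}")
```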
8. A touch simulation device, the device comprising:
an event detection module, configured to read a touch event device file of an original device using a system debugging tool, to detect a touch event occurring on the original device;
a touch recording module, configured to update an original operation set according to the touch event when a touch event occurs on the original device, wherein the original operation set is used for recording touch data, and the touch data represents touch operations received by the original device;
a coordinate correction module, configured to correct contact coordinates contained in the touch data recorded in the original operation set according to display configuration information of a target device and of the original device; and
an instruction generation module, configured to determine a touch instruction according to the corrected original operation set, wherein the touch instruction is used for simulating, on the target device, the touch operation received by the original device.
9. An electronic device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202411622223.1A 2024-11-13 2024-11-13 Touch simulation method, device, electronic device and storage medium Pending CN119621550A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411622223.1A CN119621550A (en) 2024-11-13 2024-11-13 Touch simulation method, device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN119621550A true CN119621550A (en) 2025-03-14

Family

ID=94897164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411622223.1A Pending CN119621550A (en) 2024-11-13 2024-11-13 Touch simulation method, device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN119621550A (en)

Similar Documents

Publication Publication Date Title
US11023363B2 (en) Performance test application sequence script
CN110134600B (en) Test script recording method, device and storage medium
CN112256563B (en) Android application stability testing method and device, electronic equipment and storage medium
CN106294158A (en) Terminal test method, Apparatus and system
US20090228873A1 (en) Display breakpointing based on user interface events
CN110837473A (en) Application program debugging method, device, terminal and storage medium
CN111737100A (en) Data acquisition method, device, equipment and storage medium
CN107562500B (en) Debugging device, method and equipment
CN105117337A (en) Application debugging method, client and debugging platform
KR20170072164A (en) Method, device and terminal for optimizing air mouse remote control
WO2018076309A1 (en) Photographing method and terminal
CN112363950A (en) Application program debugging method and device
CN109254908A (en) Visualize regression testing method, device, terminal device and readable storage medium storing program for executing
CN110781080A (en) Program debugging method and device, and storage medium
CN109426504B (en) Program processing method, program processing device, electronic device and storage medium
CN109684123B (en) Problem resource positioning method, device, terminal and storage medium
CN115982024A (en) Test script generation method, device, storage medium and program product
CN119690609B (en) A method, device, electronic device, and storage medium for processing process abnormality
CN112560399B (en) Page link replacement method and device
CN105335200A (en) System upgrading method and device
CN119621550A (en) Touch simulation method, device, electronic device and storage medium
CN118138817B (en) Television equipment mouse control method and device, television equipment and medium
CN112286392A (en) Touch detection method and device of touch screen and storage medium
CN111338961A (en) Application debugging method and device, electronic device and storage medium
CN113849075B (en) Touch screen point reporting event processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination