CN115686328A - Contactless interaction method and device in free space, electronic equipment and storage medium - Google Patents
- Publication number: CN115686328A
- Application number: CN202211337463.8A
- Authority: CN (China)
- Prior art keywords: coordinate system, acceleration data, acceleration, moment, data
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classification: Position Input By Displaying (AREA)
Abstract
The invention discloses a contactless interaction method and device in free space, an electronic device, and a storage medium. The method comprises: in response to a contactless interaction request, collecting acceleration data of a wearable device during contactless interaction in free space at the current moment, the acceleration data being acceleration data of the wearable device in its rectangular coordinate system; if the contactless interaction request is a text input request, converting the acceleration data of the wearable device in the rectangular coordinate system into acceleration data in a world coordinate system based on a preset unit quaternion; determining, from the acceleration data in the world coordinate system, the three-dimensional position coordinates of the current moment in the world coordinate system; converting the three-dimensional position coordinates into two-dimensional spatial position coordinates of the wearable device at the current moment; and generating a trajectory from the two-dimensional spatial position coordinates within a preset time period and performing character recognition based on the trajectory. By using a quaternion in the conversion to two-dimensional coordinates, the technical solution of the application avoids the accumulated error of the acceleration.
Description
Technical Field
The embodiment of the invention relates to the technical field of wearable equipment, in particular to a contactless interaction method and device in free space, electronic equipment and a storage medium.
Background
All intelligent devices (including AR and VR glasses, computers, tablets, etc.) involve interaction between people and the devices. In addition to display functions, interactive operations such as text input and mouse selection require input devices, of which the keyboard and mouse are the most typical. However, for intelligent devices, especially smart glasses, there are many limitations on portability and operation mode if they are to be used at any place and time. For example, when a user wants to input text outdoors, the input device must be portable and easy to use, which a keyboard and a mouse cannot satisfy.
The prior art mainly adopts the 6DoF operating handles used by various VR and AR glasses: a virtual handle synchronized with the physical handle can be displayed in the glasses, the user moves a light spot through the operating handle to perform various operations, and text is input by displaying a virtual keyboard in the glasses and moving the light spot over the virtual keyboard.
The 6DoF operating handles used by VR and AR glasses are relatively large and must be held in one hand; operating on the virtual keyboard displayed by the glasses is very inconvenient and does not match common input habits. At the data level, because the actually acquired acceleration information is not continuous, accumulated errors easily arise during use. Simple operations can be corrected by the human eye in general use without problems, but the requirements cannot be met for operations with higher precision requirements, such as handwriting trajectory recognition and input.
Disclosure of Invention
The invention provides a contactless interaction method and device in free space, an electronic device and a storage medium, aiming to solve the technical problems that the input mode of existing wearable devices is inconvenient to operate and cannot support operations with high precision requirements such as handwriting trajectory recognition and input.
In a first aspect, an embodiment of the present invention provides a method for contactless interaction in free space, where the method includes:
responding to the contactless interaction request, and acquiring acceleration data of the wearable device in the contactless interaction process in the free space at the current moment; the acceleration data is acceleration data of the wearable equipment under a rectangular coordinate system;
if the non-contact interaction request is a character input request, converting the acceleration data of the wearable equipment under the rectangular coordinate system into the acceleration data under the world coordinate system based on a preset unit quaternion;
determining the three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system;
converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable device at the current moment;
and generating a track according to the two-dimensional space position coordinates in a preset time period, and performing character recognition based on the track.
Optionally, the converting, based on a preset unit quaternion, acceleration data in a three-dimensional coordinate system of the wearable device into acceleration data in a world coordinate system includes:
determining the acceleration vector and the angular velocity data of each axis direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment,
determining a rotation matrix of the current sampling moment based on the unit quaternion of the current sampling moment;
and converting the acceleration sensor data under the triaxial coordinate system at the current sampling moment into the acceleration sensor data under the space coordinate system based on the rotation matrix at the current sampling moment.
Optionally, the determining, according to the acceleration data in the world coordinate system, a three-dimensional position coordinate in the world coordinate system at the current time includes:
and performing integral operation on the acceleration sensor data in the world coordinate before the current sampling moment to obtain the three-dimensional position coordinate of the wearable device at the current sampling moment.
Optionally, the converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable device at the current time includes:
and converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable equipment at the current moment based on a preset projection matrix.
Optionally, after collecting, in response to the contactless interaction request, the acceleration data of the wearable device during contactless interaction in free space at the current moment, where the acceleration data is the acceleration data of the wearable device in the rectangular coordinate system, the method further comprises the following steps:
and if the non-contact interaction request is a mouse input request, determining the rotation displacement in the horizontal direction and the rotation displacement in the vertical direction based on the acceleration data of the wearable device in the three-axis directions at the current moment and the previous moment.
Optionally, after determining the rotational displacement in the horizontal direction and the rotational displacement in the vertical direction, the method further includes:
and determining whether the wearable equipment has deviation or not according to the rotation displacement in the horizontal direction and/or the rotation displacement in the vertical direction, and if so, identifying the deviation as mouse movement.
In a second aspect, an embodiment of the present invention further provides a contactless interaction apparatus in free space, where the apparatus includes:
the data acquisition module is used for responding to the contactless interaction request and acquiring acceleration data of the wearable equipment in the contactless interaction process in the free space at the current moment; the acceleration data is acceleration data of the wearable equipment in a rectangular coordinate system;
the data conversion module is used for converting the acceleration data of the wearable device under the rectangular coordinate system into the acceleration data under the world coordinate system based on a preset unit quaternion if the non-contact interaction request is a character input request;
the coordinate calculation module is used for determining the three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system;
the coordinate conversion module is used for converting the three-dimensional position coordinate into a two-dimensional space position coordinate of the wearable device at the current moment;
and the character recognition module is used for generating a track according to the two-dimensional space position coordinates in the preset time period and recognizing characters based on the track.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method for contactless interaction in free space as described in any of the embodiments of the present application.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is configured to, when executed by a processor, implement a method for contactless interaction in free space according to any one of the embodiments of the present application.
The method comprises: in response to a contactless interaction request, collecting acceleration data of the wearable device during contactless interaction in free space at the current moment, the acceleration data being acceleration data of the wearable device in a rectangular coordinate system; if the contactless interaction request is a text input request, converting the acceleration data of the wearable device in the rectangular coordinate system into acceleration data in the world coordinate system based on a preset unit quaternion; determining the three-dimensional position coordinates of the current moment in the world coordinate system according to the acceleration data in the world coordinate system; converting the three-dimensional position coordinates into two-dimensional spatial position coordinates of the wearable device at the current moment; and generating a trajectory according to the two-dimensional spatial position coordinates within a preset time period and performing character recognition based on the trajectory. This solves the problems that the input mode of existing wearable devices is inconvenient to operate, introduces large data calculation errors, and makes high-precision operations such as character recognition difficult to perform, while also enabling mouse movement and selection.
Drawings
Fig. 1 is a schematic flowchart of a method for contactless interaction in free space according to an embodiment of the present invention;
fig. 2 is a schematic flowchart illustrating a contactless interaction method in free space according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a contactless interaction device in free space according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flow chart of a contactless interaction method in free space according to an embodiment of the present invention. The method is applicable to contactless interaction in free space, and is particularly applicable to contactless text input to a VR/AR device in free space. The method can be executed by a contactless interaction apparatus in free space, which can be configured in an electronic terminal device, and specifically comprises the following steps:
S110, responding to the non-contact interaction request, and collecting acceleration data of the wearable device in the non-contact interaction process in free space at the current moment; the acceleration data is acceleration data of the wearable device in a rectangular coordinate system;
the non-contact interaction means that a user does not contact with the equipment to complete man-machine interaction with the equipment; the non-contact interaction in the free space refers to the human-computer interaction with equipment after moving in the space of the real world.
In this embodiment, the method may be performed by a device 1, where device 1 is a wearable device supporting human-computer interaction. The wearable device includes, but is not limited to, a smart watch, a smart bracelet, a smart ring, and the like, and the present application is not limited herein. Because of limitations such as the computing power and battery size of the wearable device, device 1 may also perform only part of the steps of the method, such as the acquisition of acceleration data, while the remaining steps are performed by a device 2. Device 2 may be a smart computer, a smart phone, or a smart helmet; preferably, device 2 may be the smart device with which the user needs to perform contactless interaction in the embodiment of the present application. It should be understood that other existing or future wearable devices that support human-computer interaction, where applicable to the present application, are also intended to be encompassed by the present application.
In this embodiment of the application, the wearable device may be configured with a triaxial acceleration sensor and can acquire the acceleration sensor data in each axis direction of its own rectangular coordinate system. Specifically, the acceleration data of the wearable device at the current moment may be collected at a preset time interval, where the time interval can be set based on the range supported by the triaxial acceleration sensor. It should be understood that the shorter the time interval, the more acceleration data is collected and the more accurate the calculated result; the embodiment of the present application may provide multiple levels of collection interval to the user to meet different requirements.
S120, if the non-contact interaction request is a character input request, converting acceleration data of the wearable equipment under a rectangular coordinate system into acceleration data under a world coordinate system based on a preset unit quaternion;
For example, an operation button may be provided on device 1, and when the user triggers the operation button, the contactless interaction request of the user may be determined to be a text input request. Alternatively, a detection module may run on device 2 to detect whether the contactless interaction request of the user is a text input request. Typically, if device 2 is a pair of VR glasses, it may detect whether the user's visual center point is in a text input box of the display interface, pop up a prompt box for the user to confirm when the visual center point is in the text input box, determine after the user confirms that the contactless interaction request is a text input request, and feed this back to device 1.
Optionally, if device 1 includes a gyroscope, in step S110 device 1 may further obtain gyroscope data in each axis direction of its own rectangular coordinate system. The gyroscope data are used to represent the angular velocities of the movement of device 1 in the three axis directions; in this case the acceleration data and the gyroscope data can be combined to determine the movement state of device 1.
Optionally, if the device does not include a gyroscope, the embodiment of the present application determines the moving state of the device 1 through acceleration data, specifically, converts acceleration data of the wearable device in a rectangular coordinate system into acceleration data in a world coordinate system based on a preset unit quaternion, and includes:
determining the acceleration vector and the angular velocity data of each axis direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment,
determining a rotation matrix of the current sampling moment based on the unit quaternion of the current sampling moment;
and converting the acceleration sensor data under the triaxial coordinate system at the current sampling moment into the acceleration sensor data under the space coordinate system based on the rotation matrix at the current sampling moment.
For example, based on the acceleration sensor data in each axis direction of the triaxial acceleration sensor at the current moment acquired in step S110, the resultant acceleration vector in the rectangular coordinate system of device 1 at the current moment may be determined using the Pythagorean theorem; further, the included angle between the resultant acceleration vector and each axis may be determined using trigonometric or inverse trigonometric functions. Repeating the calculation for the previous moment gives the included angle between the resultant acceleration vector and each axis at the previous moment, from which the angular velocity data of the motion of device 1 between the previous moment and the current moment in the three axis directions can be determined.
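The calculation just described can be illustrated with a short Python sketch (illustrative only; the function names axis_angles and angular_velocities are assumptions, not identifiers from the patent):

```python
import math

def axis_angles(ax, ay, az):
    """Angles between the resultant acceleration vector and the x, y, z axes."""
    magnitude = math.sqrt(ax ** 2 + ay ** 2 + az ** 2)  # Pythagorean theorem
    # Inverse trigonometric function: arccos of each component over the magnitude.
    return tuple(math.acos(c / magnitude) for c in (ax, ay, az))

def angular_velocities(prev_sample, curr_sample, dt):
    """Approximate per-axis angular velocity from the change of the axis angles
    between the previous and the current sample, divided by the time interval."""
    prev_angles = axis_angles(*prev_sample)
    curr_angles = axis_angles(*curr_sample)
    return tuple((c - p) / dt for p, c in zip(prev_angles, curr_angles))

# Example: two accelerometer samples 10 ms apart (values are purely illustrative).
w_x, w_y, w_z = angular_velocities((0.1, 9.7, 0.3), (0.2, 9.6, 0.5), dt=0.01)
```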
The unit quaternion is used to effect a transformation of the device 1 rectangular coordinate system into the world coordinate system, i.e. to effect a rotation and translation specified by the unit quaternion, such that the coordinates of the device 1 rectangular coordinate system can be represented by the coordinates of the world coordinate system. At the initial time, the unit quaternion is preset, and since the device 1 has an angular change in three-axis directions, which results in a change in the rectangular coordinate system of the device 1, in the embodiment of the present application, when the time advances, and the device 1 has an angular change in three-axis directions, the unit quaternion also changes correspondingly, and specifically, the unit quaternion at the current time is calculated from the quaternion at the previous time and the angular velocity data.
For example, the unit quaternion of the current time may be calculated as:
where Δt represents the time interval, t + Δt represents the current moment, t represents the previous moment, and ω_x, ω_y, ω_z represent the angular velocity data in the three axis directions of the rectangular coordinate system of device 1.
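A standard first-order propagation of the unit quaternion from the angular velocities, consistent with the variables just defined (an assumption rather than the patent's exact expression), is:

```latex
q(t+\Delta t) = \Big(I_{4} + \tfrac{\Delta t}{2}\,\Omega(\omega)\Big)\, q(t),
\qquad
\Omega(\omega) =
\begin{bmatrix}
0 & -\omega_x & -\omega_y & -\omega_z\\
\omega_x & 0 & \omega_z & -\omega_y\\
\omega_y & -\omega_z & 0 & \omega_x\\
\omega_z & \omega_y & -\omega_x & 0
\end{bmatrix}
```

with q(t + Δt) renormalized to unit length after each update.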
In this embodiment, the world coordinate system may be a left-handed coordinate system, and the rotation matrix at the current sampling moment may be determined as:
where q represents the unit quaternion at the current moment, its modulus is 1, and q = [ω, x, y, z]^T.
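A commonly used quaternion-to-rotation-matrix form for q = [ω, x, y, z]^T, where ω denotes the scalar part (an assumption; the sign conventions of the patent's left-handed system may differ), is:

```latex
R(q) =
\begin{bmatrix}
1-2(y^{2}+z^{2}) & 2(xy-\omega z) & 2(xz+\omega y)\\
2(xy+\omega z) & 1-2(x^{2}+z^{2}) & 2(yz-\omega x)\\
2(xz-\omega y) & 2(yz+\omega x) & 1-2(x^{2}+y^{2})
\end{bmatrix},
\qquad
a_{\text{world}} = R(q)\, a_{\text{device}}
```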
Thus, the acceleration sensor data in the three-axis coordinate system at the current sampling time can be converted into the acceleration sensor data in the spatial coordinate system based on the rotation matrix.
S130, determining a three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system;
Optionally, in step S130, the acceleration sensor data in the world coordinate system before the current sampling moment may be integrated to obtain the three-dimensional position coordinates of the wearable device at the current sampling moment.
For example, because the time interval is short, an integral calculation can be performed to obtain the instantaneous velocity in each axis direction in the world coordinate system, and a second integral calculation gives the displacement in each axis direction in the world coordinate system, from which the three-dimensional position coordinates of the wearable device in the world coordinate system at the current sampling moment can be obtained.
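A minimal sketch of this double integration, assuming a constant sampling interval dt and zero initial velocity and position (integrate_position is an illustrative name, not from the patent):

```python
def integrate_position(world_accels, dt, v0=(0.0, 0.0, 0.0), p0=(0.0, 0.0, 0.0)):
    """Twice-integrate world-frame acceleration samples into a position estimate.

    world_accels: iterable of (ax, ay, az) tuples in the world coordinate system.
    Uses simple rectangular (Euler) integration, which is reasonable when dt is short.
    """
    vx, vy, vz = v0
    px, py, pz = p0
    for ax, ay, az in world_accels:
        # First integration: acceleration -> instantaneous velocity on each axis.
        vx, vy, vz = vx + ax * dt, vy + ay * dt, vz + az * dt
        # Second integration: velocity -> displacement on each axis.
        px, py, pz = px + vx * dt, py + vy * dt, pz + vz * dt
    return px, py, pz
```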
S140, converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable equipment at the current moment;
Optionally, in step S140, the three-dimensional position coordinates may be converted into the two-dimensional spatial position coordinates of the wearable device at the current moment based on a preset projection matrix.
For example, if the origin of the two-dimensional coordinate system coincides with the origin of the world coordinate system, the coordinate value in the y-axis direction may be directly discarded through a preset projection matrix to obtain the two-dimensional spatial position coordinates; if the origin of the two-dimensional coordinate system differs from the origin of the world coordinate system, the three-dimensional position coordinates are first rotated and/or translated through the preset projection matrix to obtain the two-dimensional spatial position coordinates. The preset projection matrix is a coordinate transformation matrix, commonly used in the prior art, that can transform three-dimensional coordinates into two-dimensional coordinates.
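One possible form of such a projection (an illustrative assumption that takes the writing plane to be the x–z plane of the world coordinate system) is:

```latex
\begin{bmatrix} u \\ v \end{bmatrix}
= P \begin{bmatrix} x \\ y \\ z \end{bmatrix},
\qquad
P = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}
```

which keeps the x and z values and erases the y-axis value; when the two origins differ, the same P is applied after a rotation R and translation t, i.e. [u, v]^T = P(R[x, y, z]^T + t).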
S150, generating a track according to the two-dimensional space position coordinates in the preset time period, and performing character recognition based on the track.
In the embodiment of the application, a time period can be preset as the time for the user to input one character. The two-dimensional spatial position coordinate at the current moment is obtained by the above process; repeating the process yields the two-dimensional spatial position coordinates within the preset time period, the movement trajectory of the wearable device is fitted through a fitting algorithm, and the character input by the user can be recognized through a character recognition algorithm based on the trajectory.
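A rough sketch of how one input window could be buffered before recognition (read_point_2d, fit_curve and recognize are placeholders for the sampling, fitting and recognition steps, not interfaces defined by the patent):

```python
import time

def capture_trajectory(read_point_2d, window_s=1.0, interval_s=0.01):
    """Collect the two-dimensional position samples produced during one
    preset input window (i.e. one character)."""
    points = []
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        points.append(read_point_2d())   # current (u, v) coordinate
        time.sleep(interval_s)           # preset sampling interval
    return points

# The buffered points would then be fitted into a smooth trajectory and handed
# to any handwriting-recognition backend, e.g. recognize(fit_curve(points)).
```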
The method comprises the steps of responding to a contactless interaction request, and collecting acceleration data of the wearable device in the contactless interaction process in the free space at the current moment; the acceleration data is acceleration data of the wearable equipment under a rectangular coordinate system; if the non-contact interaction request is a character input request, converting the acceleration data of the wearable equipment under the rectangular coordinate system into the acceleration data under the world coordinate system based on a preset unit quaternion; determining the three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system; converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable device at the current moment; the method comprises the steps of generating a track according to two-dimensional space position coordinates in a preset time period, and carrying out character recognition based on the track, so that the problems of inconvenience in operation, large data calculation error, difficulty in character recognition and other high-precision operations caused by an input mode of the conventional wearable equipment are solved.
Example two
Fig. 2 is a schematic flow chart of a contactless interaction method in free space according to a second embodiment of the present invention. The method is applicable to contactless interaction in free space, and is particularly applicable to contactless mouse input to a VR/AR device in free space. The method can be executed by a contactless interaction apparatus in free space, which can be configured in an electronic terminal device. The present embodiment is a further optimization of the above embodiment, and the same terms have definitions, principles, procedures and technical effects similar to those of the above embodiment. The method specifically comprises the following steps:
S210, responding to the contactless interaction request, and collecting acceleration data of the wearable device in the contactless interaction process in free space at the current moment; the acceleration data is acceleration data of the wearable device in a rectangular coordinate system;
S220, if the non-contact interaction request is a mouse input request, determining the rotational displacement in the horizontal direction and the rotational displacement in the vertical direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment;
the user's mouse input in free space may have both inputs in the horizontal plane or in the vertical plane. Taking the horizontal plane input as an example, since the mouse input is performed in a free space, it is likely that the angular movement of the rectangular coordinate system of the wearable device in three axis directions is included, and therefore, the output data of the three-axis acceleration sensor needs to be corrected rotationally so that the x axis is parallel to the horizontal direction. Correspondingly, taking a vertical plane input as an example, the z-axis needs to be parallel to the vertical direction. The embodiment of the present application takes horizontal plane input as an example, and it should be noted that the technical principle of the input is consistent whether horizontal plane input or vertical plane input is included in the claimed technical solution of the present application.
In the embodiment of the present application, the left-right rotational displacement in the horizontal direction and the rotational displacement in the vertical direction are determined. Similarly to the calculation of the three-dimensional position coordinates in step S130, since the time interval between the current moment and the previous moment is short, the acceleration on the x axis can be treated as a constant value; the linear velocity of left-right rotation in the horizontal direction is then obtained by integration, from which the left-right rotational displacement in the horizontal direction can be further obtained. Similarly to the calculation of the angular velocity data in step S120, the included angles between the y axis of the wearable device and the horizontal direction at the current moment and at the previous moment are obtained through trigonometric and inverse trigonometric functions, giving the angular velocity of up-and-down rotation in the vertical direction over the unit time between the previous moment and the current moment; multiplying this angular velocity by the rotation radius gives the linear velocity of the up-and-down rotation. The rotation radius may be the finger length, hand length, etc., corresponding to the part of the body on which the wearable device is worn.
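The two displacements could be approximated as in the following sketch (an illustration under simplifying assumptions — constant x-axis acceleration over the interval and a tilt angle estimated from the gravity direction — and not the patent's exact formulas):

```python
import math

def mouse_displacements(prev, curr, dt, radius):
    """Approximate the horizontal and vertical displacements for mouse mode.

    prev, curr: (ax, ay, az) accelerometer samples at the previous and current moments.
    radius: rotation radius in metres, e.g. finger or hand length.
    """
    # Horizontal left/right motion: treat the x-axis acceleration as constant
    # over the short interval and integrate twice to obtain a displacement.
    ax = 0.5 * (prev[0] + curr[0])
    horizontal = 0.5 * ax * dt ** 2

    def tilt(sample):
        # Angle between the device y axis and the horizontal plane, estimated
        # from the y component of the (gravity-dominated) accelerometer reading.
        sx, sy, sz = sample
        mag = math.sqrt(sx ** 2 + sy ** 2 + sz ** 2)
        return math.asin(max(-1.0, min(1.0, sy / mag)))

    # Vertical up/down rotation: the change of the tilt angle times the rotation
    # radius approximates the arc length travelled over the interval
    # (angular velocity x radius x dt).
    vertical = (tilt(curr) - tilt(prev)) * radius
    return horizontal, vertical
```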
S230, determining whether the wearable device has shifted according to the rotational displacement in the horizontal direction and/or the rotational displacement in the vertical direction, and if so, identifying the shift as mouse movement.
For example, once the left-right rotational displacement in the horizontal direction and/or the up-and-down rotational displacement in the vertical direction are obtained, whether a shift has occurred may be determined through a mapping relationship, and the left-right rotational displacement in the horizontal direction and the up-and-down rotational displacement in the vertical direction are converted into the offsets of the cursor on the x axis and the y axis of the screen of the device to be interacted with.
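A minimal sketch of such a mapping relationship, assuming a simple linear scale factor and a small dead zone for deciding whether a shift occurred at all (the sensitivity constant and function name are illustrative):

```python
def to_cursor_offset(horizontal_m, vertical_m, sensitivity_px_per_m=4000, dead_zone_m=0.001):
    """Map physical displacements (metres) to cursor offsets in screen pixels.

    Displacements below the dead zone are treated as no shift, so small sensor
    noise does not move the cursor.
    """
    def map_axis(d):
        return 0 if abs(d) < dead_zone_m else round(d * sensitivity_px_per_m)
    return map_axis(horizontal_m), map_axis(vertical_m)   # (x-axis, y-axis) offsets
```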
During use, the cursor offset can be sent to the intelligent terminal device requiring interaction via wireless transmission, and the intelligent terminal device moves the cursor to the target position according to the cursor offset. Further, the wearable device may include a key: when the mouse has moved to the target position, the user can click the key to form a click event, which is transmitted to the intelligent terminal device requiring interaction through a wireless network, thereby realizing a click on the target position.
According to the technical scheme of the embodiment, the acceleration data of the wearable device in the non-contact interaction process in the free space at the current moment is acquired by responding to the non-contact interaction request; the acceleration data are acceleration data of the wearable device in a rectangular coordinate system, if the non-contact interaction request is a mouse input request, based on the acceleration data of the wearable device in the three-axis directions at the current moment and the previous moment, the rotation displacement in the horizontal direction and the displacement in the vertical direction are determined, whether the wearable device deviates or not is determined according to the rotation displacement in the horizontal direction and/or the displacement in the vertical direction, if the deviation occurs, the deviation is recognized as mouse movement, and the effect of performing mouse movement in a non-contact manner in a free space is achieved.
Preferably, the embodiment of the present application provides an intelligent ring device, which can be used to implement the technical solutions of the above embodiments. It specifically includes a ring portion and an interaction portion on the intelligent terminal device that needs to interact, the ring portion and the interaction portion being connected via a Bluetooth Low Energy protocol.
The ring portion can be worn on the index finger as a ring and comprises a power supply module, a Bluetooth Low Energy communication module, an acceleration sensor module, a trajectory calculation module, a button module and a function selection module. The interaction portion is adapted to various intelligent devices and comprises a trajectory conversion module, a handwriting recognition module and an operation module.
When the ring enters mouse input mode, the user moves the index finger in space; the acceleration sensor module generates accelerations in each direction of three-dimensional space, the communication module sends the acceleration information and button information to the interaction portion of the intelligent device, the trajectory conversion module converts the acceleration information into mouse movement information and the button information into mouse button information, and the operation module converts the information related to the mouse operation into mouse operations on the intelligent device.
When the ring enters text input mode, the user uses the index finger as a pen to simulate writing in free space; the acceleration sensor module generates accelerations in each direction of three-dimensional space, the trajectory calculation module converts the sensor acceleration information into a quaternion and then into a two-dimensional spatial motion trajectory, the communication module sends the two-dimensional spatial motion trajectory and key information to the interaction portion of the intelligent device, the handwriting recognition module recognizes the two-dimensional spatial motion trajectory as text input (recognition may be performed in the cloud or by a local handwriting recognition module), and the operation module inputs the recognition result as text.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a contactless interaction device in free space according to a third embodiment of the present invention, where the specific structure of the device is as follows:
including a test data acquisition module 310, a data transformation module 320, a coordinate calculation module 330, a coordinate transformation module 340, and a text recognition module 350.
The data acquisition module 310 is configured to acquire acceleration data of the wearable device in a contactless interaction process in a free space at a current moment; the acceleration data is acceleration data of the wearable equipment under a rectangular coordinate system;
the data conversion module 320 is configured to convert, in response to a text input request, acceleration data of the wearable device in a rectangular coordinate system into acceleration data in a world coordinate system based on a preset unit quaternion;
the coordinate calculation module 330 is configured to determine a three-dimensional position coordinate of the current time in the world coordinate system according to the acceleration data in the world coordinate system;
the coordinate conversion module 340 is configured to convert the three-dimensional position coordinate into a two-dimensional space position coordinate of the wearable device at the current time;
the character recognition module 350 is configured to generate a track according to the two-dimensional spatial position coordinates in the preset time period, and perform character recognition based on the track.
The technical solution of the present application solves the problems that the input mode of existing wearable devices is inconvenient to operate, introduces large data calculation errors, and makes high-precision operations such as character recognition difficult to perform.
As an optional implementation, the data conversion module includes:
an angular velocity calculation unit for determining the vector acceleration and the angular velocity data of each axis direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment,
the rotation matrix determining unit is used for determining a rotation matrix at the current sampling moment based on the unit quaternion at the current sampling moment;
and the data conversion unit is used for converting the acceleration sensor data under the triaxial coordinate system at the current sampling moment into the acceleration sensor data under the space coordinate system based on the rotation matrix at the current sampling moment.
As an optional implementation manner, the coordinate calculation module is specifically configured to perform an integration operation on acceleration sensor data in world coordinates before the current sampling time to obtain a three-dimensional position coordinate of the wearable device at the current sampling time.
As an optional implementation manner, the coordinate transformation module is specifically configured to transform the three-dimensional position coordinate into a two-dimensional space position coordinate of the wearable device at the current time based on a preset projection matrix.
In an alternative embodiment, the apparatus further comprises a displacement calculation module,
and the displacement calculation module is used for determining the rotation displacement in the horizontal direction and the displacement in the vertical direction based on the acceleration data of the wearable device in the three-axis directions at the current moment and the previous moment if the non-contact interaction request is a mouse input request.
As an alternative embodiment, the device further comprises a mouse movement recognition module,
the mouse movement identification module determines whether the wearable device deviates according to the rotation displacement in the horizontal direction and/or the displacement in the vertical direction, and if so, identifies the deviation as mouse movement.
The contactless interaction device in the free space provided by the embodiment of the invention can execute the contactless interaction method in the free space provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 4 is a schematic structural diagram of an electronic apparatus according to a fourth embodiment of the present invention, as shown in the fourth embodiment, the electronic apparatus includes a processor 410, a memory 420, an input device 430, and an output device 440; the number of the processors 410 in the electronic device may be one or more, and one processor 410 is taken as an example in fig. 4; the processor 410, the memory 420, the input device 430 and the output device 440 in the electronic apparatus may be connected by a bus or other means, and the bus connection is exemplified in fig. 4.
The memory 420 serves as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the contactless interaction method in free space in the embodiment of the present invention (for example, the data acquisition module 310, the data conversion module 320, the coordinate calculation module 330, the coordinate conversion module 340, and the character recognition module 350 in the contactless interaction device in free space). The processor 410 executes various functional applications of the device/terminal/server and performs data processing by running the software programs, instructions and modules stored in the memory 420, that is, implements the above-described contactless interaction method in free space.
The memory 420 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 420 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 420 may further include memory located remotely from the processor 410, which may be connected to the device/terminal/server via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 430 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the device/terminal/server. The output device 440 may include a display device such as a display screen.
EXAMPLE five
A fifth embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a computer processor, is used to execute the method for contactless interaction in free space according to any embodiment of the present application.
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the operations of the method described above, and may also execute the relevant operations in the method for contactless interaction in free space provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention or portions thereof contributing to the prior art may be embodied in the form of a software product, which can be stored in a computer readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the above apparatus embodiment, the included units and modules are merely divided according to functional logic, but the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (10)
1. A method of contactless interaction in free space, comprising:
responding to the non-contact interaction request, and acquiring acceleration data of the wearable equipment in a non-contact interaction process in free space at the current moment; the acceleration data is acceleration data of the wearable equipment in a rectangular coordinate system;
if the non-contact interaction request is a character input request, converting the acceleration data of the wearable equipment under the rectangular coordinate system into the acceleration data under the world coordinate system based on a preset unit quaternion;
determining the three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system;
converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable device at the current moment;
and generating a track according to the two-dimensional space position coordinates in a preset time period, and performing character recognition based on the track.
2. The method of claim 1, wherein the converting acceleration data in a three-dimensional coordinate system of the wearable device into acceleration data in a world coordinate system based on a preset unit quaternion comprises:
determining the acceleration vector and the angular velocity data of each axis direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment,
determining a rotation matrix of the current sampling moment based on the unit quaternion of the current sampling moment;
and converting the acceleration sensor data under the triaxial coordinate system at the current sampling moment into the acceleration sensor data under the space coordinate system based on the rotation matrix at the current sampling moment.
3. The method of claim 1, wherein determining three-dimensional position coordinates in a world coordinate system at a current time from the acceleration data in the world coordinate system comprises:
and performing integral operation on the acceleration sensor data in the world coordinate before the current sampling moment to obtain the three-dimensional position coordinate of the wearable device at the current sampling moment.
4. The method of claim 1, wherein translating the three-dimensional location coordinates into two-dimensional spatial location coordinates of the wearable device at the current time comprises:
and converting the three-dimensional position coordinates into two-dimensional space position coordinates of the wearable device at the current moment based on a preset projection matrix.
5. The method of claim 1, wherein after the collecting, in response to the contactless interaction request, of the acceleration data of the wearable device during contactless interaction in free space at the current moment, the acceleration data being acceleration data of the wearable device in the rectangular coordinate system, the method further comprises the following steps:
and if the non-contact interaction request is a mouse input request, determining the rotation displacement in the horizontal direction and the rotation displacement in the vertical direction based on the acceleration data of the wearable device in the three-axis directions at the current moment and the previous moment.
6. The method of claim 5, wherein after determining the rotational displacement in the horizontal direction and the rotational displacement in the vertical direction, further comprising:
and determining whether the wearable device is shifted or not according to the rotation displacement in the horizontal direction and/or the rotation displacement in the vertical direction, and if so, identifying the shift as the mouse movement.
7. A contactless interactive device in free space, comprising:
the data acquisition module is used for responding to the contactless interaction request and acquiring acceleration data of the wearable equipment in the contactless interaction process in the free space at the current moment; the acceleration data is acceleration data of the wearable equipment under a rectangular coordinate system;
the data conversion module is used for converting the acceleration data of the wearable device under the rectangular coordinate system into the acceleration data under the world coordinate system based on a preset unit quaternion if the non-contact interaction request is a character input request;
the coordinate calculation module is used for determining the three-dimensional position coordinate of the current moment in the world coordinate system according to the acceleration data in the world coordinate system;
the coordinate conversion module is used for converting the three-dimensional position coordinate into a two-dimensional space position coordinate of the wearable device at the current moment;
and the character recognition module is used for generating a track according to the two-dimensional space position coordinates in the preset time period and recognizing characters based on the track.
8. The apparatus of claim 7, wherein the data conversion module comprises:
an angular velocity calculation unit for determining the vector acceleration and the angular velocity data of each axis direction based on the acceleration data of the wearable device in the three axis directions at the current moment and the previous moment,
the rotation matrix determining unit is used for determining a rotation matrix of the current sampling moment based on the unit quaternion of the current sampling moment;
and the data conversion unit is used for converting the acceleration sensor data under the triaxial coordinate system at the current sampling moment into the acceleration sensor data under the space coordinate system based on the rotation matrix at the current sampling moment.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device to store one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method for contactless interaction in free space as recited in any of claims 1-6.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method for contactless interaction in free space according to any one of claims 1 to 6.
Priority Applications (1)
- CN202211337463.8A — priority date 2022-10-28, filing date 2022-10-28 — Contactless interaction method and device in free space, electronic equipment and storage medium
Publications (1)
- CN115686328A — publication date 2023-02-03
Family
ID=85046913
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination