
CN116136736A - Information processing method, apparatus, electronic device, storage medium, and program product - Google Patents


Info

Publication number
CN116136736A
CN116136736A (application CN202310152452.0A)
Authority
CN
China
Prior art keywords
finger
target object
straight line
sliding
sliding operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310152452.0A
Other languages
Chinese (zh)
Inventor
杨天翼
尹子硕
陈昊芝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Positive Negative Infinite Technology Co ltd
Original Assignee
Beijing Positive Negative Infinite Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Positive Negative Infinite Technology Co ltd filed Critical Beijing Positive Negative Infinite Technology Co ltd
Priority to CN202310152452.0A priority Critical patent/CN116136736A/en
Publication of CN116136736A publication Critical patent/CN116136736A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide an information processing method, an apparatus, an electronic device, a storage medium, and a program product. When a target object is to perform a sliding operation, the method establishes a straight line based on a first finger, and the straight line serves as a reference for the sliding of a second finger end. A first positional relationship between the second finger end and the straight line on entering a sliding mode, and a second positional relationship between them in the sliding mode, are easy to detect, and the sliding operation is performed on the target object based on the first and second positional relationships, so that the ease of use and usability of the sliding operation can be improved.

Description

Information processing method, apparatus, electronic device, storage medium, and program product
Technical Field
The present application relates to the field of information processing technology, and in particular to an information processing method, an apparatus, an electronic device, a storage medium, and a program product.
Background
With the continuous development of virtual reality and somatosensory interaction technologies, how to use interaction gestures to perform operations in a virtual interface has also gradually become a research hotspot. Among these, the sliding operation is one of the common interaction gestures in the virtual interface.
Currently, a main technology of performing a sliding operation on a virtual interface by using gestures is to translate a hand along a plane where the virtual interface is located after sensing that the hand is in contact with the virtual interface. However, such interactive gestures generally require a large hand motion amplitude to ensure performance of the sliding operation, which can lead to fatigue over time.
Disclosure of Invention
The embodiments of the present application aim to solve the problem that sliding operations on a virtual interface easily cause fatigue.
According to an aspect of the embodiments of the present application, there is provided an information processing method, including:
when the target object is to perform sliding operation, a straight line is established based on the first finger, wherein the straight line is used as a reference for sliding the tail end of the second finger;
if the distance between the tail end of the second finger and the straight line is detected to be smaller than a first preset value, determining a first position relation between the tail end of the second finger and the straight line, and entering a sliding mode aiming at the target object;
In the sliding mode, determining a second positional relationship of the second finger tip to the straight line;
a sliding operation is performed with respect to the target object based on the first positional relationship and the second positional relationship.
In an alternative embodiment, the method further comprises:
in the sliding mode, if the distance between the tail end of the second finger and the straight line is detected to be greater than a second preset value, the sliding mode is exited.
In an alternative embodiment, establishing a straight line based on the first finger includes:
a straight line is established with the tip of the first finger and the joint at the base of the first finger's proximal phalanx (the base knuckle) as its endpoints.
In an alternative embodiment, the first positional relationship comprises: a first distance between the second finger end and a target endpoint of the straight line when entering the sliding mode;
the second positional relationship comprises: a second distance between the second finger end and the target endpoint of the straight line in the sliding mode;
based on the first positional relationship and the second positional relationship, performing a sliding operation with respect to the target object, including:
a sliding operation is performed for the target object based on a difference between the first distance and the second distance.
In an alternative embodiment, performing a sliding operation on the target object based on the first positional relationship and the second positional relationship includes:
Determining a direction and/or an amplitude of a sliding operation performed on the target object based on the first positional relationship and the second positional relationship;
based on the direction and/or amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, performing a sliding operation on the target object based on a difference between the first distance and the second distance includes:
acquiring the association relation between the sliding amplitude of the target object and the difference value;
determining the amplitude of the sliding operation for the target object based on the association relationship and the difference value;
based on the amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, performing a sliding operation on the target object based on a difference between the first distance and the second distance includes:
determining a direction in which a sliding operation is performed with respect to the target object based on whether the difference is a positive number or a negative number;
based on the direction, a sliding operation is performed for the target object.
In an alternative embodiment, the second finger is a thumb and the first finger is any other finger than the thumb.
According to another aspect of the embodiments of the present application, there is provided an information processing apparatus including:
the establishing module is used for establishing a straight line based on the first finger when the target object is to perform sliding operation, wherein the straight line is used as a reference for sliding the tail end of the second finger;
The first determining and entering module is used for determining a first position relation between the tail end of the second finger and the straight line and entering a sliding mode aiming at the target object if the distance between the tail end of the second finger and the straight line is detected to be smaller than a first preset value;
the second determining module is used for determining a second position relation between the tail end of the second finger and the straight line in the sliding mode;
and a sliding module for performing a sliding operation with respect to the target object based on the first positional relationship and the second positional relationship.
In an alternative embodiment, the apparatus further comprises:
and the exit module is used for exiting the sliding mode if, in the sliding mode, the distance between the tail end of the second finger and the straight line is detected to be greater than a second preset value.
In an alternative embodiment, the establishing module, when configured to establish a straight line based on the first finger, is specifically configured to:
a straight line is established with the tip of the first finger and the joint at the base of the first finger's proximal phalanx (the base knuckle) as its endpoints.
In an alternative embodiment, the first positional relationship comprises: a first distance between the second finger end and a target endpoint of the straight line when entering the sliding mode;
the second positional relationship comprises: a second distance between the second finger end and the target endpoint of the straight line in the sliding mode;
The sliding module is used for executing sliding operation on the target object based on the first position relation and the second position relation, and is specifically used for:
a sliding operation is performed for the target object based on a difference between the first distance and the second distance.
In an alternative embodiment, the sliding module, when configured to perform a sliding operation on the target object based on the first positional relationship and the second positional relationship, is specifically configured to:
determining a direction and/or an amplitude of a sliding operation performed on the target object based on the first positional relationship and the second positional relationship;
based on the direction and/or amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, when configured to perform a sliding operation on the target object based on a difference between the first distance and the second distance, the sliding module is specifically configured to:
acquiring the association relation between the sliding amplitude of the target object and the difference value;
determining the amplitude of the sliding operation for the target object based on the association relationship and the difference value;
based on the amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, when configured to perform a sliding operation on the target object based on a difference between the first distance and the second distance, the sliding module is specifically configured to:
Determining a direction in which a sliding operation is performed with respect to the target object based on whether the difference is a positive number or a negative number;
based on the direction, a sliding operation is performed for the target object.
In an alternative embodiment, the second finger is a thumb and the first finger is any other finger than the thumb.
According to still another aspect of the embodiments of the present application, there is provided an electronic device including: the information processing device comprises a memory, a processor and a computer program stored on the memory, wherein the processor executes the computer program to realize the information processing method provided by the embodiment of the application.
According to still another aspect of the embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method provided by the embodiments of the present application.
According to still another aspect of the embodiments of the present application, there is provided a computer program product, including a computer program, which when executed by a processor implements the information processing method provided by the embodiments of the present application.
According to the information processing method, apparatus, electronic device, storage medium, and program product provided above, when the target object is to perform a sliding operation, a straight line is established based on the first finger and serves as a reference for the sliding of the second finger end. The first positional relationship between the second finger end and the straight line on entering the sliding mode, and the second positional relationship in the sliding mode, are easy to detect, and the sliding operation is performed on the target object based on these two relationships. This improves the ease of use and usability of the sliding operation; moreover, compared with moving the whole hand, only a small hand motion amplitude is needed to ensure accurate execution of the sliding operation.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flow chart of an information processing method according to an embodiment of the present application;
FIG. 2a is a schematic diagram of one way of establishing a straight line provided in an embodiment of the present application;
FIG. 2b is a schematic diagram of another way of creating a straight line according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a change in positional relationship according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of yet another way of establishing a straight line provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of calculating a distance between a coordinate position and a straight line according to an embodiment of the present application;
fig. 6 is a schematic flow chart of a sliding operation according to an embodiment of the present application;
fig. 7a is a schematic diagram of a sliding operation of a volume adjustment control according to an embodiment of the present application;
fig. 7b is a schematic diagram of another sliding operation of a volume adjustment control according to an embodiment of the present application;
fig. 7c is a schematic diagram of still another sliding operation of a volume adjustment control according to an embodiment of the present application;
fig. 7d is a schematic diagram of still another sliding operation of a volume adjustment control according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the drawings in the present application. It should be understood that the embodiments described below with reference to the drawings are exemplary descriptions for explaining the technical solutions of the embodiments of the present application, and the technical solutions of the embodiments of the present application are not limited.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "comprises" and "comprising," when used in this application, specify the presence of stated features, information, data, steps, operations, elements, and/or components, but do not preclude the presence or addition of other features, information, data, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" indicates at least one of the items it joins; e.g., "A and/or B" may be implemented as "A", as "B", or as "A and B".
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Several terms which are referred to in this application are first introduced and explained:
(1) AR (Augmented Reality): virtual information is overlaid on the real world for display, i.e., a real scene is combined with a virtual scene.
(2) VR (Virtual Reality): a virtual world is simulated by a VR device; a user enters the virtual scene by wearing the device.
(3) MR (Mixed Reality): the fusion of AR and VR, i.e., a mixing of the real world and the virtual world that creates a new visual environment containing both real objects and virtual information.
(4) XR (Extended Reality): also referred to as artificial reality (Artificial Reality); a general term covering AR, VR, MR, and the like. For example, a VR device may also be called an XR device. XR content may be entirely generated, or generated content combined with captured content (e.g., a photograph of the real world).
Currently, there are two main technologies for performing a sliding operation on a virtual interface using gestures:
1. After sensing that the hand is contacted with the virtual interface, translating the hand along a plane where the virtual interface is located;
2. after the non-contact control such as the hand ray is perceived to point to the virtual interface, the non-contact control is controlled to slide in the virtual interface by moving the hand.
Both of these interaction gestures require a large hand motion amplitude to ensure that the sliding operation is performed, and long-term use can lead to fatigue.
Also, since it is difficult to perceive the state of sliding of the hand with respect to the virtual interface without physical tactile feedback, the accuracy of the sliding operation is low.
In addition, since it is difficult to sense the sliding state of the hand with respect to the virtual interface, the sliding state of the hand may be separated from the virtual interface during the sliding operation performed by moving the hand, resulting in lower stability of the sliding operation.
The information processing method, apparatus, electronic device, storage medium, and program product provided in this application aim to provide a novel sliding operation method, improve the comfort, ease of use, and usability of the sliding operation, and reduce user fatigue.
The technical solutions of the embodiments of the present application and technical effects produced by the technical solutions of the present application are described below by describing several exemplary embodiments. It should be noted that the following embodiments may be referred to, or combined with each other, and the description will not be repeated for the same terms, similar features, similar implementation steps, and the like in different embodiments.
An embodiment of the present application provides an information processing method, as shown in fig. 1, including:
step S101: when the target object is to perform sliding operation, a straight line is established based on the first finger, wherein the straight line is used as a reference for sliding the tail end of the second finger;
in the embodiment of the present application, the target object refers to a virtual object supporting a sliding operation, for example, a sliding control such as a volume adjustment control, a brightness adjustment control, or a desktop, a page, or the like, but is not limited thereto.
In this embodiment of the present application, before step S101, it may be determined that the target object is to perform the sliding operation upon detecting a selection operation and/or an activation operation for the target object. As an example, if the volume adjustment control is selected, it may be determined that the volume adjustment control is to perform a sliding operation; or, if a certain page is entered and the page is activated by default, it may be determined that the page is to perform a sliding operation. In practical applications, those skilled in the art may set the selection and/or activation manner of the target object according to the actual situation, so as to detect that the target object is to perform the sliding operation, which is not limited herein.
In this embodiment of the present application, for convenience of operation, the first finger and the second finger may refer to any two fingers of five fingers of one hand. For example, the first finger is an index finger, and the second finger is a thumb, so that convenience of operation can be improved to a greater extent. Or the first finger may be any other finger than the thumb, such as a middle finger, a ring finger, a little finger, etc., so as to be suitable for people with different special situations.
In this embodiment of the present application, the second finger end may refer to the fingertip of the second finger, or to the knuckle of the second finger farthest from the palm; this may be set in combination with the hand-tracking algorithm in use, which is not limited herein.
In this embodiment of the present application, a straight line is established based on the first finger, and specifically, a straight line in a corresponding direction may be established based on a placement manner of the first finger. As an example, taking the first finger as an index finger and the second finger as a thumb as an example, as shown in fig. 2a, a straight line in the up-down direction can be established based on the vertical arrangement mode of the first finger, and the tail end of the second finger can slide up and down with reference to the straight line; alternatively, as shown in fig. 2b, a straight line in the left-right direction may be established based on the horizontal arrangement of the first finger, and the second finger end may slide left-right with reference to the straight line. In practical applications, those skilled in the art may set the straight direction and the association with the first finger, the sliding direction and the association with the second finger end according to practical situations, and the embodiments of the present application are not limited herein.
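As a sketch of how such a reference line might be constructed from tracked joint positions (the joint coordinates, units, and NumPy representation are illustrative assumptions, not part of the application), the line can be stored as an origin plus a unit direction vector:

```python
import numpy as np

def establish_line(fingertip, base_joint):
    """Build the reference straight line for the first finger from two
    tracked joints: the finger tip and the joint at the base of the
    finger. Returns (origin, unit direction, segment length)."""
    p0 = np.asarray(base_joint, dtype=float)
    p1 = np.asarray(fingertip, dtype=float)
    direction = p1 - p0
    length = float(np.linalg.norm(direction))
    if length == 0.0:
        raise ValueError("degenerate line: the two joints coincide")
    return p0, direction / length, length
```

Depending on whether the first finger is held vertically (as in Fig. 2a) or horizontally (as in Fig. 2b), the resulting direction vector points up-down or left-right.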
Step S102: if the distance between the tail end of the second finger and the straight line is detected to be smaller than a first preset value, determining a first position relation between the tail end of the second finger and the straight line, and entering a sliding mode aiming at the target object;
In this embodiment, if it is detected that the distance between the end of the second finger and the straight line is smaller than the first preset value, it is judged that the end of the second finger is in contact with, or is about to come into contact with, the first finger, and the sliding mode is entered for the target object, so that the user's sliding operation can be detected in real time in the sliding mode and erroneous responses to other operations are avoided. The specific value of the first preset value may be set according to the actual situation and is not limited herein. For example, the first preset value may be set to 1.5 cm, but is not limited thereto.
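A minimal sketch of this entry test, assuming 3D joint positions in metres and the 1.5 cm first preset value mentioned above (the function names and the metric representation are assumptions for illustration):

```python
import numpy as np

FIRST_PRESET = 0.015  # first preset value: 1.5 cm, expressed in metres

def distance_to_line(point, line_origin, line_dir):
    """Perpendicular distance from a 3D point to the straight line
    given by an origin and a unit direction vector."""
    v = np.asarray(point, dtype=float) - np.asarray(line_origin, dtype=float)
    d = np.asarray(line_dir, dtype=float)
    perp = v - np.dot(v, d) * d  # remove the component along the line
    return float(np.linalg.norm(perp))

def should_enter_sliding_mode(second_finger_end, line_origin, line_dir):
    """Enter the sliding mode when the second finger end is closer to
    the reference line than the first preset value."""
    return distance_to_line(second_finger_end, line_origin, line_dir) < FIRST_PRESET
```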
In the embodiment of the present application, entering the sliding mode may be invisible to the user, i.e., only the detection behavior differs from the non-sliding mode while the display in the interface is unchanged; alternatively, entering the sliding mode may be made apparent to the user, for example by displaying a corresponding sliding control or sliding-mode effect. Those skilled in the art may set this according to the actual situation, which is not limited herein.
In this embodiment, when the target object enters the sliding mode, an initial positional relationship (i.e., a first positional relationship) between the end of the second finger and the straight line is recorded and used as a reference for determining the sliding behavior in the sliding mode.
Step S103: in the sliding mode, determining a second positional relationship of the second finger tip to the straight line;
in this embodiment of the present application, in the sliding mode, the second position relationship between the end of the second finger and the straight line is detected in real time, that is, the position relationship after the end of the second finger slides on the straight line compared with the first position relationship is detected in real time, as shown in fig. 3.
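Each of the first and second positional relationships can be captured as a scalar: the distance, measured along the line, from a chosen target endpoint of the line to the projection of the second finger end onto it. A sketch (the signed-projection convention is an assumption):

```python
import numpy as np

def distance_along_line(point, target_endpoint, line_dir):
    """Project `point` onto the line and return the signed distance
    from the target endpoint to that projection, measured along the
    unit direction vector `line_dir`."""
    v = np.asarray(point, dtype=float) - np.asarray(target_endpoint, dtype=float)
    return float(np.dot(v, np.asarray(line_dir, dtype=float)))
```

The value recorded when the sliding mode is entered plays the role of the first distance; the value sampled each frame afterwards plays the role of the second distance.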
Step S104: a sliding operation is performed with respect to the target object based on the first positional relationship and the second positional relationship.
In this embodiment of the present application, based on the first positional relationship and the second positional relationship, a sliding behavior of the user with respect to the target object may be determined, and may be mapped to a corresponding sliding operation with respect to the target object. In practical applications, the specific mapping relationship may be set according to practical situations, for example, defined in a system or a program, which is not limited herein.
Specifically, the direction and/or the magnitude of the sliding operation performed on the target object may be determined based on the first positional relationship and the second positional relationship, and the sliding operation may be performed on the target object based on the direction and/or the magnitude.
Based on the change between the first positional relationship and the second positional relationship, it is possible to determine in which direction the second finger end slides along the straight line, for example whether it slides left or right, or up or down, and the sliding operation in the corresponding direction can then be performed on the target object. For example, in one example, if the right thumb slides leftward along the right index finger by more than a third preset value, the desktop switches to the left; if the right thumb slides rightward along the right index finger by more than the third preset value, the desktop switches to the right.
Similarly, based on the change between the first positional relationship and the second positional relationship, it is possible to determine the distance by which the second finger end slides along the straight line, and a sliding operation of the corresponding magnitude can be performed on the target object. In practical applications, the correspondence between the distance the second finger end slides along the straight line and the magnitude of the sliding operation may be set according to the actual situation; for different target objects, the magnitude may be interpreted as a sliding distance, a percentage position, or the like, but is not limited thereto. For example, when the second finger end slides 1 cm along the straight line, the volume adjustment control may adjust by 10%; or when the second finger end slides 1 cm along the straight line, a page may scroll by 5 cm. This is not limited herein.
It can be understood that, based on the change between the first positional relationship and the second positional relationship, the sliding operations in the corresponding direction and of the corresponding magnitude may be combined; for example, when the second finger end slides rightward by 1 cm along the straight line, the volume adjustment control increases by 10%, or when the second finger end slides downward by 1 cm along the straight line, the page scrolls down by 5 cm. Those skilled in the art may set this according to the actual situation, and the embodiments of the present application are not limited herein.
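Combining direction and magnitude, the difference between the first and second distances can drive a single update step. A sketch in which units are centimetres and the 10%-per-centimetre gain follows the volume example above (the sign convention, gain, and direction labels are illustrative assumptions):

```python
def sliding_update(first_distance_cm, second_distance_cm, percent_per_cm=10.0):
    """Map the change in the along-line distance to a sliding direction
    and magnitude, e.g. for a volume adjustment control."""
    diff = second_distance_cm - first_distance_cm
    if diff > 0:
        direction = "increase"   # e.g. sliding toward the target endpoint
    elif diff < 0:
        direction = "decrease"   # e.g. sliding away from the target endpoint
    else:
        direction = "none"
    magnitude = abs(diff) * percent_per_cm  # 1 cm of slide -> 10% adjustment
    return direction, magnitude
```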
According to the method provided in the embodiments of the present application, when a sliding operation is to be performed on the target object, a straight line is established based on the first finger and used as the reference for the sliding of the second finger end. The first positional relationship between the second finger end and the straight line upon entering the sliding mode, and the second positional relationship in the sliding mode, are both easy to detect, and the sliding operation is performed on the target object based on these two relationships. This improves the usability and ease of use of the sliding operation; compared with moving the whole hand, only a small hand movement is needed to guarantee that the sliding operation is performed accurately.
The method can accurately perceive the sliding state of the second finger end relative to the first finger, so the sliding operation is executed effectively and its accuracy is significantly improved. Moreover, because the second finger end moves with the first finger as a reference when performing the sliding operation, the operation is also highly stable.
Compared with moving the whole hand, sliding the second finger end with the first finger as a reference is more convenient and takes less time, which improves the execution efficiency of the sliding operation.
The information processing method provided in the embodiments of the present application may further include step S105: in the sliding mode, if the distance between the second finger end and the straight line is detected to be greater than a second preset value, the sliding mode is exited.
Further, in the sliding mode, the sliding mode may be exited only if the distance between the second finger end and the straight line is detected to remain greater than the second preset value for more than a certain duration.
The second preset value may be set by those skilled in the art according to the actual situation, and the embodiments of the present application are not limited here. The second preset value may be the same as the first preset value, for example 1.5 cm; alternatively, it may differ from the first preset value. For example, making the second preset value greater than the first preset value reduces the influence of accidental operations: when the first preset value is set to 1.5 cm, the second preset value may be set to 2 cm, but the values are not limited thereto.
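The enter/exit hysteresis described here (a larger exit threshold than entry threshold) can be sketched as follows; the 1.5 cm and 2.0 cm values are the example presets from the text, and the function name is an assumption.

```python
# Sketch of the enter/exit hysteresis for the sliding mode. A larger exit
# threshold than entry threshold prevents jitter near the boundary from
# rapidly toggling the mode.

ENTER_THRESHOLD_CM = 1.5  # first preset value: enter when distance < this
EXIT_THRESHOLD_CM = 2.0   # second preset value: exit when distance > this

def update_mode(in_slide_mode: bool, distance_cm: float) -> bool:
    """Return the new sliding-mode state for the current finger-to-line distance."""
    if not in_slide_mode:
        return distance_cm < ENTER_THRESHOLD_CM
    return not (distance_cm > EXIT_THRESHOLD_CM)
```

Note the band between 1.5 cm and 2.0 cm: a finger already in sliding mode stays in it there, while an idle finger does not enter it.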
In this embodiment of the present application, before step S101, the method may further include step S100: monitoring the position information of at least two joints of the first finger and the position information of the second finger end. Step S101 may then specifically include: when a sliding operation is to be performed on the target object, establishing a straight line based on the position information of the at least two joints of the first finger, the straight line serving as the reference for the sliding of the second finger end. Step S102 may specifically include: based on the position information of the second finger end, if the distance between the second finger end and the straight line is detected to be smaller than the first preset value, determining the first positional relationship between the second finger end and the straight line and entering the sliding mode for the target object.
The position information of the second finger end may likewise be obtained by monitoring, in real time, the position information of the hand and/or of at least one finger joint, for example the joint of the distal knuckle.
Alternatively, the position information may be specifically expressed in terms of a coordinate position, but is not limited thereto.
Optionally, this scheme can be applied to sliding operations on terminal devices, and also to sliding operations on XR devices such as AR devices, VR devices, and MR devices.
As an example, if the present solution is applied to an AR device, the position information of the at least two joints of the first finger and of the second finger end may refer to real-time position information. If the AR device is an all-in-one device, the real-time position information can be obtained by an IMU (Inertial Measurement Unit), SMI (Self-Mixing Interferometry), camera-based vision computation, and other methods; if the AR device is an external device, besides IMU, SMI, and camera-based vision, the position information can also be computed via infrared, radar, ultrasound, and the like, but is not limited thereto.
As another example, if the present solution is applied to a VR device, the position information of the at least two joints of the first finger and of the second finger end may refer to virtual position information in a virtual scene. If the VR device is an all-in-one or PC (Personal Computer) type device, the positioning may be computed by IMU, camera-based vision computation, infrared, radar, ultrasound, and the like, but is not limited thereto.
In practice, the XR device may process video, audio, tactile feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (e.g., stereoscopic video producing a three-dimensional effect for a viewer). Further, the XR device may be associated with an application, product, accessory, service, or some combination thereof, including but not limited to creating content in an XR environment and/or a real-world environment. The XR system used by the XR device may be implemented on a variety of platforms, including a head-mounted display (HMD) connected to a host system, a standalone HMD, a mobile device or computing system or other projection system, or any other platform capable of providing XR content to one or more viewers.
It will be appreciated by persons skilled in the art that the above-described several application scenarios are only illustrative, and not limiting of the embodiments of the present application, and that appropriate modifications based on these examples are also applicable to the present application and are also included within the scope of the present application. The following embodiments are applicable to these application scenarios, and the processing procedure based on these scenario extensions will not be described in detail.
In the embodiments of the present application, both the first finger and the second finger may use default settings; or the second finger may be default (for example, the thumb) while the first finger is user-configurable; or both the first finger and the second finger may be user-configurable.
Based on this, step S100 may specifically include: acquiring setting information of the first finger and/or the second finger; and, based on the configured and/or default first and second fingers, monitoring the position information of the at least two joints of the first finger and the position information of the second finger end, so as to identify whether the user intends to perform a sliding operation.
In this embodiment of the present application, a feasible implementation is provided for "establish a straight line based on the first finger" in step S101, which may specifically include: establishing a straight line with the first finger end and the proximal interphalangeal joint of the first finger as endpoints, as shown in fig. 4.
Because it is more comfortable and convenient for the second finger end to slide between the first finger end and the proximal interphalangeal joint of the first finger, establishing the straight line with these two points as endpoints, and using it as the reference for the sliding of the second finger end, improves the comfort and usability of the sliding operation to a greater extent.
In other embodiments, a straight line may be established based on the first finger in other ways, for example with the distal interphalangeal joint and the proximal interphalangeal joint of the first finger as endpoints, or with the first finger end and the first metacarpophalangeal joint as endpoints. Those skilled in the art may make appropriate changes, which are also applicable to the present application and therefore fall within its scope.
In this embodiment, an optional implementation is provided for step S104. Specifically, the first positional relationship in step S102 may include: upon entering the sliding mode, a first distance between the second finger end and a target endpoint of the straight line.
The second positional relationship in step S103 may include: in the sliding mode, a second distance between the second finger end and the target endpoint of the straight line.
The target endpoint of the straight line may be determined based on the manner in which the straight line is established. As an example, if the straight line is established with the first finger end and the proximal interphalangeal joint of the first finger as endpoints, the target endpoint may be either the first finger end or the proximal interphalangeal joint, but is not limited thereto. Those skilled in the art can make appropriate changes, which are also applicable to the present application and therefore fall within its scope.
Step S104 may specifically include step S1041: a sliding operation is performed for the target object based on a difference between the first distance and the second distance.
In this embodiment of the present application, based on the difference between the first distance and the second distance, the direction and/or the distance by which the second finger end slides along the straight line may be determined, and thereby the direction and/or magnitude of the sliding operation performed on the target object.
Specifically, step S1041 may include: acquiring an association relationship between the sliding magnitude of the target object and the difference; determining the magnitude of the sliding operation performed on the target object based on the association relationship and the calculated difference; and performing the sliding operation on the target object based on that magnitude. That is, from the calculated difference between the first distance and the second distance, the distance by which the second finger end has slid along the straight line can be determined, so a sliding operation of the corresponding magnitude can be performed on the target object. In practical applications, the association relationship between the sliding magnitude of the target object and the difference may be set according to the actual situation, and the magnitude may be interpreted as a sliding distance or a percentage position, depending on the target object, but is not limited thereto. For example, a calculated difference of 1 cm may adjust a volume control by 10%, or scroll a page by 5 cm; the embodiments are not limited here.
And/or, step S1041 may further include: determining the direction of the sliding operation performed on the target object based on whether the difference is positive or negative; and performing the sliding operation on the target object based on that direction.
Based on whether the calculated difference between the first distance and the second distance is positive or negative, the direction in which the second finger end slides along the straight line can be determined. For example, suppose a left-right straight line is established with the right index finger end and its proximal interphalangeal joint as endpoints, and the target endpoint is the right index finger end (see fig. 4). A positive difference indicates that the distance between the second finger end and the right index finger end has increased, i.e., the second finger end has slid to the right; a negative difference indicates that this distance has decreased, i.e., the second finger end has slid to the left; and the sliding operation in the corresponding direction can be performed on the target object. Correspondingly, if the straight line is established with the left index finger end and its proximal interphalangeal joint as endpoints and the target endpoint is the left index finger end, a positive difference indicates that the second finger end has slid to the left, and a negative difference indicates that it has slid to the right, and the sliding operation in the corresponding direction can be performed on the target object.
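A minimal sketch of this sign-to-direction mapping, assuming the target endpoint is the index finger tip of the given hand; the function name and string labels are illustrative assumptions.

```python
# Map the signed distance difference to a screen direction. Whether a
# growing distance from the fingertip means "left" or "right" depends on
# which hand's index finger defines the reference line.

def slide_direction(diff_cm: float, hand: str) -> str:
    """Return the slide direction implied by the sign of the difference."""
    if diff_cm == 0:
        return "none"
    moving_away = diff_cm > 0  # second finger end moved away from the fingertip
    if hand == "right":
        return "right" if moving_away else "left"
    if hand == "left":
        return "left" if moving_away else "right"
    raise ValueError(f"unknown hand: {hand}")
```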
It should be understood by those skilled in the art that the above-mentioned several straight line establishment methods are only illustrative, and not limiting to the embodiments of the present application, and other straight line establishment methods may be used in the same manner and are also included in the scope of the present application.
It will be appreciated that, based on the calculated difference between the first distance and the second distance, the direction-based and magnitude-based sliding operations may be combined: for example, a difference of 1 cm increases the volume control by 10% while a difference of -1 cm decreases it by 10%; or a difference of 1 cm scrolls the page down 5 cm while a difference of -1 cm scrolls it up 5 cm. Those skilled in the art may set this according to the actual situation.
In this embodiment of the present application, the following calculation method may be used to calculate the distance between two coordinate positions:

d = √((x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)²)

where M₁(x₁, y₁, z₁) and M₂(x₂, y₂, z₂) are the two coordinates whose distance is to be calculated, such as the coordinates of the target endpoint of the straight line and the coordinates of the second finger end.
In this embodiment of the present application, the distance between a coordinate position and a straight line — taking the distance between the coordinates of the second finger end and the established straight line as an example — may be calculated as follows:
1. Calculate the vector corresponding to the straight line.
For example, for a straight line established with the first finger end and the proximal interphalangeal joint of the first finger as endpoints, subtracting the coordinates N₃(x₃, y₃, z₃) of the proximal interphalangeal joint from the coordinates N₁(x₁, y₁, z₁) of the first finger end yields the vector V₁.
2. Calculate the vector from either endpoint of the straight line to the second finger end.
For example, subtracting the coordinates N₂(x₂, y₂, z₂) of the second finger end from the coordinates N₁(x₁, y₁, z₁) of the first finger end yields the vector V₂.
3. Calculate the projection of V₂ onto V₁ and the corresponding projection point P.
For example, the projection V₃ of V₂ onto V₁ can be expressed as:

V₃ = ((V₂ · V₁) / |V₁|²) · V₁

Based on the vector V₃ and the coordinates N₁(x₁, y₁, z₁) of the first finger end, the coordinates of the projection point P can be calculated, as shown in fig. 5. In other embodiments, other ways of calculating the coordinates of the projection point P may be used.
4. Using the method above for the distance between two coordinate positions, calculate the distance between the coordinates of the second finger end and the coordinates of the projection point P; this is the distance from the second finger end to the established straight line.
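The four steps above can be sketched as follows. Note that this sketch forms both vectors pointing away from N₁ (the opposite subtraction order from the worked example in the text), which yields the same projection point and the same distance.

```python
import math

def point_to_line_distance(n1, n3, n2):
    """Distance from n2 (second finger end) to the line through
    n1 (first finger end) and n3 (proximal interphalangeal joint)."""
    v1 = tuple(a - b for a, b in zip(n3, n1))     # step 1: line vector V1
    v2 = tuple(a - b for a, b in zip(n2, n1))     # step 2: endpoint-to-finger vector V2
    t = (sum(a * b for a, b in zip(v2, v1))
         / sum(a * a for a in v1))                # step 3: (V2 . V1) / |V1|^2
    p = tuple(a + t * d for a, d in zip(n1, v1))  # projection point P on the line
    return math.dist(n2, p)                       # step 4: |N2 - P|
```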
Based on at least one of the embodiments described above, this embodiment shows an example flow of a sliding operation in fig. 6. Specifically, it may include the following steps:
1. Select or activate a target object that supports a sliding operation.
2. Monitor in real time the position coordinates of the distal knuckle of the thumb and of each joint of the index finger.
3. Establish a straight line with the index finger tip and the proximal interphalangeal joint as endpoints, and determine whether the distance between the position coordinates of the distal knuckle of the thumb and the straight line is smaller than a preset value.
4. If so, enter the sliding mode.
5. Upon entering the sliding mode, record the initial distance between the position coordinates of the distal knuckle of the thumb and the index finger tip.
6. In the sliding mode, calculate in real time the latest distance between the position coordinates of the distal knuckle of the thumb and the index finger tip.
7. Calculate in real time the difference between the initial distance and the latest distance, and map it to a sliding direction and magnitude.
8. Once the distance between the position coordinates of the distal knuckle of the thumb and the straight line becomes greater than the preset value, exit the sliding mode.
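The flow above can be sketched as a small per-frame state machine, assuming a stream of tracked joint coordinates. The class and parameter names are assumptions, and the single 1.5 cm threshold stands in for the preset value used for both entering and exiting the mode.

```python
import math

THRESHOLD_CM = 1.5  # illustrative preset value

def point_line_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    v1 = tuple(x - y for x, y in zip(b, a))
    v2 = tuple(x - y for x, y in zip(p, a))
    t = sum(x * y for x, y in zip(v2, v1)) / sum(x * x for x in v1)
    proj = tuple(ax + t * dx for ax, dx in zip(a, v1))
    return math.dist(p, proj)

class SlideTracker:
    """Tracks the thumb end relative to the index-finger line (steps 2-8)."""

    def __init__(self):
        self.sliding = False
        self.initial = 0.0  # step 5: distance recorded on entering the mode

    def update(self, thumb, index_tip, index_pip):
        """Feed one frame; return the signed slide delta, or None if idle."""
        if point_line_distance(thumb, index_tip, index_pip) > THRESHOLD_CM:
            self.sliding = False            # step 8: exit the sliding mode
            return None
        d = math.dist(thumb, index_tip)     # distance to the target endpoint
        if not self.sliding:
            self.sliding = True             # step 4: enter the sliding mode
            self.initial = d                # step 5: record the initial distance
            return 0.0
        return d - self.initial             # step 7: difference -> direction/magnitude
```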
In the embodiment of the present application, an example of a sliding operation on a volume adjustment control is shown in fig. 7a to 7d. Specifically, it may include:
1. Select and activate the volume adjustment control.
2. The distal knuckle of the right thumb approaches the index finger (distance smaller than the preset value), entering the sliding mode.
3. Record the initial distance between the distal knuckle of the right thumb and the index finger tip.
4. Each time the distal knuckle of the right thumb moves 1 cm to the right (its distance from the index finger tip increases by 1 cm), the volume increases by 10%.
5. The thumb moves away from the index finger (distance greater than the preset value), exiting the sliding mode.
An embodiment of the present application provides an information processing apparatus. As shown in fig. 8, the information processing apparatus 80 may include: an establishing module 801, a first determining and entering module 802, a second determining module 803, and a sliding module 804, wherein:
the establishing module 801 is configured to establish a straight line based on the first finger when the target object is to perform the sliding operation, where the straight line is used as a reference for sliding the end of the second finger;
the first determining and entering module 802 is configured to determine a first positional relationship between the second finger end and the straight line and enter a sliding mode for the target object if it is detected that the distance between the second finger end and the straight line is smaller than a first preset value;
the second determining module 803 is configured to determine, in the sliding mode, a second positional relationship between the second finger end and the straight line;
the sliding module 804 is configured to perform a sliding operation with respect to the target object based on the first positional relationship and the second positional relationship.
In an alternative embodiment, the apparatus further comprises:
An exit module 805, configured to exit the sliding mode if, in the sliding mode, the distance between the second finger end and the straight line is detected to be greater than the second preset value.
In an alternative embodiment, the establishing module 801, when configured to establish a straight line based on the first finger, is specifically configured to:
establish a straight line with the first finger end and the proximal interphalangeal joint of the first finger as endpoints.
In an alternative embodiment, the first positional relationship comprises: upon entering the sliding mode, a first distance between the second finger end and a target endpoint of the straight line;
the second positional relationship comprises: in the sliding mode, a second distance between the second finger end and the target endpoint of the straight line;
the sliding module 804 is specifically configured to, when configured to perform a sliding operation on a target object based on the first positional relationship and the second positional relationship:
a sliding operation is performed for the target object based on a difference between the first distance and the second distance.
In an alternative embodiment, the sliding module 804 is specifically configured to, when configured to perform a sliding operation on the target object based on the first positional relationship and the second positional relationship:
determining a direction and/or an amplitude of a sliding operation performed on the target object based on the first positional relationship and the second positional relationship;
Based on the direction and/or amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, when performing a sliding operation on the target object based on the difference between the first distance and the second distance, the sliding module 804 is specifically configured to:
acquiring the association relation between the sliding amplitude of the target object and the difference value;
determining the amplitude of the sliding operation for the target object based on the association relationship and the difference value;
based on the amplitude, a sliding operation is performed for the target object.
In an alternative embodiment, when performing a sliding operation on the target object based on the difference between the first distance and the second distance, the sliding module 804 is specifically configured to:
determining a direction in which a sliding operation is performed with respect to the target object based on whether the difference is a positive number or a negative number;
based on the direction, a sliding operation is performed for the target object.
In an alternative embodiment, the second finger is a thumb and the first finger is any other finger than the thumb.
The apparatus of the embodiments of the present application can perform the methods provided by the embodiments of the present application based on similar implementation principles. The actions performed by each module of the apparatus correspond to the steps in the methods of the embodiments; for detailed functional descriptions and the beneficial effects of each module, refer to the descriptions of the corresponding methods above, which are not repeated here.
An electronic device is provided in an embodiment of the present application, including a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the steps of the foregoing method embodiments. Optionally, the electronic device may be an XR device, including but not limited to an AR device, a VR device, an MR device, and the like. Alternatively, the electronic device may be, but is not limited to, a mobile terminal or smart terminal, such as a mobile phone, smart phone, tablet, notebook, personal digital assistant, portable multimedia player, or navigation device. It will be appreciated by those skilled in the art that, apart from elements used particularly for mobile purposes, the configurations according to the embodiments of the present disclosure can also be applied to fixed terminals such as digital TVs and desktop computers.
In an alternative embodiment, an electronic device is provided, as shown in fig. 9, the electronic device 900 shown in fig. 9 includes: a processor 901 and a memory 903. The processor 901 is coupled to a memory 903, such as via a bus 902. Optionally, the electronic device 900 may further include a transceiver 904, where the transceiver 904 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data, etc. It should be noted that, in practical applications, the transceiver 904 is not limited to one, and the structure of the electronic device 900 is not limited to the embodiments of the present application.
The processor 901 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor 901 may also be a combination that implements computing functionality, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 902 may include a path for transferring information between the components. Bus 902 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in fig. 9, but this does not mean that there is only one bus or one type of bus.
The memory 903 may be a ROM (Read-Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store a computer program and can be read by a computer, without limitation.
The memory 903 is used to store a computer program for executing the embodiments of the present application, and is controlled to be executed by the processor 901. The processor 901 is arranged to execute a computer program stored in the memory 903 to implement the steps shown in the foregoing method embodiments.
Embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, may implement the steps and corresponding content of the foregoing method embodiments.
The embodiments of the present application also provide a computer program product, which includes a computer program, where the computer program can implement the steps of the foregoing method embodiments and corresponding content when executed by a processor.
The terms "first," "second," "third," "1," "2," "3" and the like in the description and in the claims of this application and in the above drawings are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the present application described herein may be implemented in other sequences than those illustrated or otherwise described.
It should be understood that, although the flowcharts of the embodiments of the present application indicate the respective operation steps by arrows, the order of implementation of these steps is not limited to the order indicated by the arrows. In some implementations of embodiments of the present application, the implementation steps in the flowcharts may be performed in other orders as desired, unless explicitly stated herein. Furthermore, some or all of the steps in the flowcharts may include multiple sub-steps or multiple stages based on the actual implementation scenario. Some or all of these sub-steps or phases may be performed at the same time, or each of these sub-steps or phases may be performed at different times, respectively. In the case of different execution time, the execution sequence of the sub-steps or stages may be flexibly configured according to the requirement, which is not limited in the embodiment of the present application.
The foregoing describes only optional implementations of some implementation scenarios of the present application. It should be noted that other similar implementations adopted by those skilled in the art on the basis of the technical ideas of the present application, without departing from those ideas, also fall within the protection scope of the embodiments of the present application.

Claims (12)

1. An information processing method, characterized by comprising:
when a sliding operation is to be performed on a target object, establishing a straight line based on a first finger, the straight line serving as a reference for the sliding of a tip of a second finger;
if the distance between the tip of the second finger and the straight line is detected to be smaller than a first preset value, determining a first positional relationship between the tip of the second finger and the straight line, and entering a sliding mode for the target object;
determining, in the sliding mode, a second positional relationship between the tip of the second finger and the straight line;
and performing a sliding operation on the target object based on the first positional relationship and the second positional relationship.
2. The information processing method according to claim 1, characterized by further comprising:
and in the sliding mode, if the distance between the tip of the second finger and the straight line is detected to be greater than a second preset value, exiting the sliding mode.
3. The information processing method according to claim 1, wherein establishing a straight line based on the first finger includes:
a straight line is established by taking the tip of the first finger and the joint of the proximal phalanx of the first finger as endpoints.
4. An information processing method according to any one of claims 1 to 3, wherein the first positional relationship includes: a first distance between the tip of the second finger and a target endpoint of the straight line upon entering the sliding mode;
the second positional relationship includes: a second distance between the tip of the second finger and the target endpoint of the straight line in the sliding mode;
the performing a sliding operation with respect to the target object based on the first positional relationship and the second positional relationship includes:
a sliding operation is performed for the target object based on a difference between the first distance and the second distance.
5. The information processing method according to any one of claims 1 to 3, characterized in that the performing a sliding operation for the target object based on the first positional relationship and the second positional relationship includes:
determining a direction and/or magnitude of a sliding operation performed with respect to the target object based on the first positional relationship and the second positional relationship;
based on the direction and/or the amplitude, a sliding operation is performed for the target object.
6. The information processing method according to claim 4, wherein the performing a sliding operation for the target object based on the difference between the first distance and the second distance comprises:
acquiring an association relation between a sliding amplitude of the target object and the difference;
determining an amplitude of the sliding operation performed on the target object based on the association relation and the difference;
based on the amplitude, a sliding operation is performed for the target object.
7. The information processing method according to claim 4, wherein the performing a sliding operation for the target object based on the difference between the first distance and the second distance comprises:
determining a direction of the sliding operation performed on the target object based on whether the difference is positive or negative;
based on the direction, a sliding operation is performed for the target object.
8. The information processing method according to claim 1, wherein the second finger is a thumb and the first finger is any finger other than the thumb.
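For illustration only, the flow of claims 1, 4, 6, and 7 might be sketched as follows. This is a minimal 2-D sketch, not an implementation from the application: the threshold values, the scale factor standing in for the "association relation" of claim 6, and all names are hypothetical.

```python
import math

def point_to_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through
    distinct endpoints a and b (claim 1's reference straight line)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    den = math.hypot(bx - ax, by - ay)
    return num / den

class SlideDetector:
    """Enter a sliding mode when the second finger tip comes within
    ENTER_THRESHOLD of the reference line; then map the change in the
    tip's distance from a target endpoint to a signed slide amplitude."""
    ENTER_THRESHOLD = 0.5   # hypothetical "first preset value"
    SCALE = 2.0             # hypothetical amplitude/difference association

    def __init__(self, line_start, line_end):
        self.line_start = line_start  # e.g. a joint of the first finger
        self.line_end = line_end      # e.g. the first finger tip (target endpoint)
        self.sliding = False
        self.first_distance = None    # claim 4's "first distance"

    def update(self, finger_tip):
        """Per-frame update; returns a signed slide amplitude, or None
        while not (or just now) in sliding mode."""
        if not self.sliding:
            if point_to_line_distance(finger_tip, self.line_start,
                                      self.line_end) < self.ENTER_THRESHOLD:
                self.sliding = True
                self.first_distance = math.dist(finger_tip, self.line_end)
            return None
        second_distance = math.dist(finger_tip, self.line_end)
        diff = second_distance - self.first_distance
        # sign of diff gives the direction (claim 7);
        # |diff| scaled gives the amplitude (claim 6)
        return diff * self.SCALE
```

In use, a tip far from the line yields `None`; once the tip approaches within the threshold, subsequent motion along the line is reported as a positive or negative amplitude depending on whether the tip moves away from or toward the target endpoint.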
9. An information processing apparatus, characterized by comprising:
an establishing module, configured to establish a straight line based on the first finger when a sliding operation is to be performed on the target object, the straight line serving as a reference for the sliding of the tip of the second finger;
a first determining and entering module, configured to determine a first positional relationship between the tip of the second finger and the straight line and enter a sliding mode for the target object if the distance between the tip of the second finger and the straight line is detected to be smaller than a first preset value;
a second determining module, configured to determine, in the sliding mode, a second positional relationship between the tip of the second finger and the straight line;
and a sliding module configured to perform a sliding operation with respect to the target object based on the first positional relationship and the second positional relationship.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the method of any one of claims 1-8.
11. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any of claims 1-8.
12. A computer program product comprising a computer program, characterized in that the computer program, when executed by a processor, implements the method of any of claims 1-8.
CN202310152452.0A 2023-02-16 2023-02-16 Information processing method, apparatus, electronic device, storage medium, and program product Pending CN116136736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310152452.0A CN116136736A (en) 2023-02-16 2023-02-16 Information processing method, apparatus, electronic device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310152452.0A CN116136736A (en) 2023-02-16 2023-02-16 Information processing method, apparatus, electronic device, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN116136736A true CN116136736A (en) 2023-05-19

Family

ID=86333460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310152452.0A Pending CN116136736A (en) 2023-02-16 2023-02-16 Information processing method, apparatus, electronic device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN116136736A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025031043A1 (en) * 2023-08-07 2025-02-13 中兴通讯股份有限公司 Control method, electronic device, input device, control system, and storage medium


Similar Documents

Publication Publication Date Title
US10732725B2 (en) Method and apparatus of interactive display based on gesture recognition
Memo et al. Head-mounted gesture controlled interface for human-computer interaction
EP3218781B1 (en) Spatial interaction in augmented reality
US8760395B2 (en) Gesture recognition techniques
JP6165485B2 (en) AR gesture user interface system for mobile terminals
EP2558924B1 (en) Apparatus, method and computer program for user input using a camera
US11641460B1 (en) Generating a volumetric representation of a capture region
US20190050132A1 (en) Visual cue system
CN104081307A (en) Image processing apparatus, image processing method, and program
CN111383345B (en) Virtual content display method and device, terminal equipment and storage medium
CN112068698A (en) An interaction method, device, electronic device, and computer storage medium
US11397478B1 (en) Systems, devices, and methods for physical surface tracking with a stylus device in an AR/VR environment
WO2014116166A1 (en) Scalable input from tracked object
CN106662923B (en) Information processing apparatus, information processing method, and program
WO2014194148A2 (en) Systems and methods involving gesture based user interaction, user interface and/or other features
CN109960404B (en) Data processing method and device
CN104851134A (en) Augmented Reality System and Method Combining Virtual Trigger and Real Object Trigger
CN116136736A (en) Information processing method, apparatus, electronic device, storage medium, and program product
Kim et al. Oddeyecam: A sensing technique for body-centric peephole interaction using wfov rgb and nfov depth cameras
CN104834410A (en) Input apparatus and input method
US20130201157A1 (en) User interface device and method of providing user interface
CN112068699A (en) Interaction method, interaction device, electronic equipment and storage medium
CN117130518A (en) Control display method, head display device, electronic device and readable storage medium
CN112578983B (en) Finger orientation touch detection
Ki et al. 3D gaze estimation and interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination