CN112416115B - Method and equipment for performing man-machine interaction in control interaction interface - Google Patents
- Publication number
- CN112416115B (application number CN201910785670.1A)
- Authority
- CN
- China
- Prior art keywords
- head
- information
- motion information
- angular velocity
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Abstract
The application aims to provide a method and a device for performing human-machine interaction in a control interaction interface. The method specifically comprises: presenting a control interaction interface corresponding to a head-mounted device, wherein the control interaction interface comprises a plurality of pieces of control identification information; collecting head action information of a user and determining a corresponding interaction instruction based on the head action information; and executing the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface. By generating interaction instructions from the user's head actions on the control interaction interface, the application performs control-based human-machine interaction, improves the accuracy of action recognition, and provides a better look and feel and an improved user experience.
Description
Technical Field
The application relates to the field of intelligent interaction, in particular to a technology for performing man-machine interaction in a control interaction interface.
Background
Head-mounted devices are becoming increasingly popular, for example augmented reality helmets, augmented reality glasses, and virtual reality headsets. The existing interaction modes for head-mounted devices mainly comprise touch pads, voice recognition, external keyboard and mouse, and gesture recognition. Voice recognition has poor interference immunity: its accuracy is strongly affected by environmental noise, so it places high demands on the use environment. Touch pads and external keyboard/mouse interaction tie up both of the user's hands, and a touch pad cannot be operated while wearing gloves (a particular drawback for users in factory environments). Gesture recognition likewise occupies the user's hands, so the hands still cannot be freed. A wearable device is meant to free the hands, and interaction via keyboard, mouse, or touch pad clearly fails to achieve that purpose.
Disclosure of Invention
The application aims to provide a method and equipment for human-computer interaction in a control interaction interface.
According to one aspect of the present application, there is provided a method for performing man-machine interaction in a control interaction interface, applied to a head-mounted device, the method comprising:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
collecting head action information of a user, and determining a corresponding interaction instruction based on the head action information;
and executing the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface.
According to another aspect of the present application, there is provided a method for performing man-machine interaction in a control interaction interface, applied to a head-mounted device, the method comprising:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises confirmation selection information and cancellation selection information;
collecting head action information of a user, determining triaxial angular velocity template information matched with the triaxial angular velocity corresponding to the head action information, and determining a corresponding interaction instruction according to the angular velocity template information, wherein the interaction instruction comprises a confirmation selection instruction or a cancellation selection instruction;
and executing the corresponding interaction instruction.
According to one aspect of the present application, there is provided a head-mounted device for human-machine interaction in a control interaction interface, the device comprising:
a first module, used for presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of pieces of control identification information;
a second module, used for collecting head action information of a user and determining a corresponding interaction instruction based on the head action information;
and a third module, used for executing the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface.
According to one aspect of the present application, there is provided a head-mounted device for human-machine interaction in a control interaction interface, the device comprising:
a first module, used for presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises confirmation selection information and cancellation selection information;
a second module, used for collecting head action information of a user, determining triaxial angular velocity template information matched with the triaxial angular velocity corresponding to the head action information, and determining a corresponding interaction instruction according to the angular velocity template information, wherein the interaction instruction comprises a confirmation selection instruction or a cancellation selection instruction;
and a third module, used for executing the corresponding interaction instruction.
According to one aspect of the present application, there is provided an apparatus for human-machine interaction in a control interaction interface, wherein the apparatus includes:
a processor; and
a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform the operations of any of the methods described above.
According to one aspect of the application there is provided a computer readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods described above.
Compared with the prior art, the application presents a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of pieces of control identification information; collects head action information of a user and determines a corresponding interaction instruction based on the head action information; and executes the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface. By generating interaction instructions from the user's head actions on the control interaction interface, the application performs control-based human-machine interaction, improves the accuracy of action recognition, and provides a better look and feel and an improved user experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 illustrates a flow chart of a method for human-machine interaction in a control interaction interface, according to one embodiment of the application;
FIG. 2 illustrates an example of a control interaction interface arranged in a single row according to one embodiment of the application;
FIG. 3 illustrates an example of a control interaction interface arranged in multiple rows and columns according to another embodiment of the present application;
FIG. 4 shows an example of determining a rotation angle standard value according to an embodiment of the present application;
FIG. 5 illustrates an example of a head swing action according to one embodiment of the application;
FIG. 6 shows an example of an angular velocity curve according to one embodiment of the application;
FIG. 7 shows an example of a curve of the squared modulus of the angular velocity according to one embodiment of the application;
FIG. 8 illustrates a flow chart of a method for human-machine interaction in a control interaction interface, according to one embodiment of the application;
FIG. 9 illustrates an example of a confirm/cancel control interaction interface according to one embodiment of the application;
FIG. 10 illustrates functional modules of a headset according to one embodiment of the application;
FIG. 11 illustrates functional modules of a headset according to another embodiment of the application;
FIG. 12 illustrates an exemplary system that can be used to implement various embodiments described in the present application.
The same or similar reference numbers in the drawings refer to the same or similar parts.
Detailed Description
The application is described in further detail below with reference to the accompanying drawings.
In one exemplary configuration of the application, the terminal, the device of the service network, and the trusted party each include one or more processors (e.g., central processing units (Central Processing Unit, CPU)), input/output interfaces, network interfaces, and memory.
The memory may include non-permanent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PCM), programmable random access memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The device includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-machine interaction with a user (for example, via a touch pad), such as a smartphone or a tablet computer; the mobile electronic product may run any operating system, such as Android or iOS. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions; its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (ASIC), programmable logic devices (PLD), field-programmable gate arrays (FPGA), digital signal processors (DSP), embedded devices, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, where cloud computing is a kind of distributed computing: a virtual supercomputer composed of a group of loosely coupled computers. The network includes, but is not limited to, the internet, wide area networks, metropolitan area networks, local area networks, VPN networks, wireless ad hoc networks, and the like. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device with the network device, a touch terminal, or the network device with a touch terminal through a network.
Of course, those skilled in the art will appreciate that the above-described devices are merely examples, and that other devices now known or hereafter may be present as applicable to the present application, and are intended to be within the scope of the present application and are incorporated herein by reference.
In the description of the present application, the meaning of "a plurality" is two or more unless explicitly defined otherwise.
Fig. 1 shows a method for performing human-machine interaction in a control interaction interface according to an aspect of the present application, which is applied to a head-mounted device and specifically includes step S101, step S102, and step S103. In step S101, the head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of pieces of control identification information; in step S102, the head-mounted device collects head action information of a user and determines a corresponding interaction instruction based on the head action information; in step S103, the head-mounted device executes the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface. The head-mounted device comprises a display device, such as a display screen, used for presenting the human-machine interaction interface and displaying the corresponding control information, control identification information, and the like. The head-mounted device further comprises a collecting device for collecting attitude information corresponding to the action information of the user's head, such as an inertial measurement unit or a gyroscope (a triaxial gyroscope, several uniaxial gyroscopes, etc.); those skilled in the art will appreciate that these collecting devices are merely examples, and other collecting devices, existing or hereafter developed, that are applicable to the present application are also included within its scope and are incorporated herein by reference. The head-mounted device further comprises a data processing device used for judging the user's head action according to the attitude information corresponding to the head action information and for generating or executing the corresponding instruction information. The head-mounted device includes, but is not limited to, any head-mounted mobile electronic device capable of human-machine interaction with a user, such as augmented reality glasses, an augmented reality helmet, or virtual reality glasses. Controls are used to encapsulate data and methods, e.g., various types of files (documents, forms, etc.), folders, applications, etc.; control identification information is used to represent a control so that a user can conveniently access the control's data, e.g., the name, icon, or access link of a file, folder, or application in the interface.
Specifically, in step S101, the head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of pieces of control identification information. For example, the head-mounted device may present the control interaction interface through its display device, either rendering it on a display screen or superimposing it in the user's line of sight, for the user to operate on. In some embodiments, the control interaction interface includes an application interaction interface: the head-mounted device presents the multiple applications installed on it through the display device, so that the user can select or start an application on the application interaction interface.
In some embodiments, the control interaction interface includes, but is not limited to: a control interaction interface arranged in a single row; a control interaction interface arranged in a single column; and a control interaction interface arranged in multiple rows and columns. For example, when the number of controls is small, the control interaction interface may be arranged as a single row or a single column. As shown in fig. 2, multiple application icons are arranged in a single row, and different application icons are selected by moving left and right, or the application corresponding to the currently selected icon is started; of course, only the 4 icons of the row are shown, and the row may further include other icons not displayed on the current screen, which are brought into view by moving left or right. When the number of controls in the control interaction interface is larger, the interface can be presented as an arrangement of multiple rows and columns. As shown in fig. 3, multiple application icons are arranged in rows and columns; different icons are selected by moving up, down, left, or right, or the currently selected application is started. Only the icons of the current page are shown in the figure; the application interaction interface also comprises other icons not on the current screen, and the current interface can be switched to the page where the other applications are located, or the off-screen icons can be displayed, by moving up, down, left, or right.
In step S102, the head-mounted device collects head action information of a user and determines a corresponding interaction instruction based on the head action information. For example, the head action information includes attitude change information corresponding to the user's head action, such as the attitude change of the current action computed from the attitude information of the initial head position and the attitude information of the head position at the time of the action. In some embodiments, the data update frequency of the gyroscope of the head-mounted device is 100 Hz, and the head-mounted device can receive the gyroscope data in real time and calculate on it (for example, integrating the angular velocity, or computing the squared modulus of the angular velocity) to obtain the attitude change of the head action. For example, a user wears the head-mounted device, whose control interaction interface holds a plurality of pieces of control identification information presented through the display device; the user selects and accesses control identification information through head actions, the head-mounted device acquires the attitude change information corresponding to the user's head action through an inertial measurement unit, gyroscope, acceleration sensor, or the like, and the corresponding interaction instruction is determined from that attitude change information, where the interaction instruction is used to determine the control the user intends, e.g., selecting the file or application identification information of interest, accessing the corresponding file, or starting the selected application.
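As an illustrative aid (not part of the original disclosure), the following Python sketch shows one way the attitude change of a head action could be computed from 100 Hz gyroscope data as described above; the names GYRO_RATE_HZ, integrate_rotation, and squared_modulus are hypothetical.

```python
import numpy as np

GYRO_RATE_HZ = 100           # gyroscope update frequency stated above
DT = 1.0 / GYRO_RATE_HZ      # sampling interval in seconds

def integrate_rotation(gyro_samples):
    """Approximate per-axis rotation angles (rad) of a head action by
    numerically integrating three-axis angular velocities (rad/s)."""
    samples = np.asarray(gyro_samples, dtype=float)   # shape (n, 3)
    return samples.sum(axis=0) * DT                   # rectangular integration

def squared_modulus(gyro_sample):
    """Squared modulus of one angular-velocity vector; used later as an
    activity measure when judging whether an action is valid."""
    wx, wy, wz = gyro_sample
    return wx * wx + wy * wy + wz * wz
```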
In some embodiments, the interaction instructions include, but are not limited to: moving a preset distance from the current control identification information along the movement direction of the head action; and starting the control corresponding to the current control identification information. For example, an interaction instruction may indicate that the selection marker (a cursor, pointer, or focus) moves a preset distance from the currently selected identification information along the movement direction of the head action: an instruction of "move one cell to the right" moves the selection from the currently selected application identification information to the one immediately to its right, changing the selected application identification; an instruction of "move N cells to the left" moves the selection N cells to the left. An interaction instruction may also start the control corresponding to the current control identification information: if the currently selected application identification information corresponds to an e-book application, the start instruction is determined from the user's head action information and the application is started accordingly. In some embodiments, when the control interaction interface is arranged as a single row or single column, moving the selection only requires head actions in the two directions of one axis, so head actions on the other axis can trigger the start instruction; for example, turning or swinging the head left or right on the horizontal axis moves the selection, while raising or lowering the head on the vertical axis corresponds to a start instruction. Of course, the start instruction may also be implemented by other start operations of the user.
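To make the movement instructions concrete, here is a minimal, hypothetical model of the selection cursor; it covers both single-row (1 x N) and multi-row arrangements, and clamping at the edge stands in for the page scrolling described above.

```python
class ControlGrid:
    """Minimal model of a control interaction interface arranged in rows and
    columns; the cursor moves a preset number of cells in the direction of
    the head action. A single-row interface is simply ControlGrid(1, n)."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.row, self.col = 0, 0    # currently selected control

    def move(self, direction, cells=1):
        """direction is one of 'left', 'right', 'up', 'down'."""
        dr, dc = {'left': (0, -1), 'right': (0, 1),
                  'up': (-1, 0), 'down': (1, 0)}[direction]
        # Clamp to the grid; a real interface might scroll to off-screen icons.
        self.row = min(max(self.row + dr * cells, 0), self.rows - 1)
        self.col = min(max(self.col + dc * cells, 0), self.cols - 1)
        return self.row, self.col

grid = ControlGrid(3, 4)          # e.g. the multi-row interface of fig. 3
print(grid.move('right', 2))      # move two cells right -> (0, 2)
```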
In some embodiments, the control interaction interface is arranged in multiple rows and columns, and the interaction instruction comprises moving a preset distance from the current control identification information along the movement direction of the head action. For example, the control interaction interface includes a plurality of pieces of control identification information arranged in rows and columns, and the selection is moved by head actions in four directions on two axes: turning or swinging the head left or right on the horizontal axis moves the selection left or right, and turning or pitching the head up or down on the vertical axis moves the selection up or down. In some embodiments, the method further includes step S106 (not shown): if a determination operation of the user about the control is obtained, the control corresponding to the currently selected control identification information is started, e.g., the corresponding file is accessed, the corresponding folder is opened, the corresponding application is started, the corresponding table is opened, and so on. In some embodiments, the determination operation includes, but is not limited to: a movement of the user's head in the front-rear direction; the user's head staying at rest for a time greater than or equal to a rest time threshold; voice instruction information from the user; gesture instruction information from the user; touch instruction information from the user; and eye movement instruction information from the user. For example, the head-mounted device collects attitude information on the user's head action, determines the displacement of the head in the front-rear direction from the attitude change in that direction, and, if the displacement is greater than or equal to a distance threshold, treats it as the determination operation and generates a determination instruction.
Alternatively, after the corresponding control identification information is selected, the head-mounted device keeps collecting attitude information about the user's head; if the head stays at rest for a time greater than or equal to a rest time threshold (e.g., 500 ms), the device treats it as the determination operation, generates a determination instruction, and starts or accesses the corresponding control. Or the head-mounted device holds a voice template and collects the user's voice through a microphone; if the voice information matches the template (e.g., the similarity reaches a threshold), the device generates a determination instruction and starts or accesses the corresponding control. Or the device holds a gesture template and collects the user's gestures through a camera; if the gesture information matches the template (e.g., the similarity reaches a threshold), the device generates a determination instruction and starts or accesses the corresponding control. Or the device is provided with a touch pad and collects the user's touch information through it; if the touch information matches a preset touch action (e.g., a single or double tap), the device generates a determination instruction and starts or accesses the corresponding control. Or the device holds an eye movement template and collects the user's eye movements through a camera; if the eye movement information matches the template (e.g., the similarity reaches a threshold), the device generates a determination instruction and starts or accesses the corresponding control.
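For the rest-time variant of the determination operation, a minimal sketch might look as follows; is_at_rest is a hypothetical input (e.g., the squared angular-velocity modulus staying below a noise floor), and the patent does not prescribe this structure.

```python
import time

DWELL_THRESHOLD_S = 0.5    # the 500 ms rest-time threshold mentioned above

class DwellDetector:
    """Emits True once the user's head has been at rest for the threshold."""

    def __init__(self):
        self.rest_started = None

    def update(self, is_at_rest, now=None):
        now = time.monotonic() if now is None else now
        if not is_at_rest:
            self.rest_started = None    # movement resets the dwell timer
            return False
        if self.rest_started is None:
            self.rest_started = now     # rest has just begun
        return (now - self.rest_started) >= DWELL_THRESHOLD_S
```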
In step S103, the head-mounted device executes the corresponding interaction instruction based on the control identification information currently selected in the control interaction interface. For example, the head-mounted device executes the interaction instruction in combination with the currently selected control identification information: an instruction of "move N cells to the left" moves the selection from the currently selected application identification information to the one N cells to its left, changing the selected application identification; an instruction may also start the control corresponding to the current control identification information, e.g., when the currently selected identification information corresponds to an e-book application, the start instruction is determined from the user's head action information and the application is started accordingly.
In some embodiments, the head action information includes, but is not limited to: head lateral rotation information, where the rotation direction is left or right relative to the initial head position; head longitudinal rotation information, where the rotation direction is up or down relative to the initial head position; head lateral swing information, where the swing direction is left or right relative to the initial head position; and head longitudinal swing information, where the swing direction is up or down relative to the initial head position. For example, with the head-mounted device in its normal wearing state, a spatial coordinate system is established with the center of the display screen as origin, the horizontal direction of the screen plane as the lateral axis, the vertical direction as the longitudinal axis, and the perpendicular through the screen center as the third axis; head actions are determined as movements in this coordinate system from the attitude change information corresponding to the user's head action, and include, but are not limited to, lateral rotation, longitudinal rotation, lateral swing, and longitudinal swing. A rotation is a unidirectional movement of the head from one position to another, e.g., from the initial position to the user's left/right/up/down, and the rotation direction is the movement direction relative to the initial head position, such as left, right, up, or down. A swing is a reciprocating movement of the head within some range, e.g., moving from the initial position to the left/right/up/down and then returning, and the swing direction is likewise the movement direction relative to the initial position. The lateral and longitudinal head actions may be judged singly, e.g., one action is judged in only one direction; or laterally and longitudinally at the same time, e.g., the current head action is decomposed into its axis components on the lateral and longitudinal axes, and the user's lateral and longitudinal movements are determined from those components, as in the sketch below.
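A hypothetical decomposition, offered only as an illustration: the axis order and the sign conventions below are assumptions, not something the patent specifies.

```python
import numpy as np

def dominant_direction(gyro_samples, dt=0.01):
    """Decompose a head action into its lateral and longitudinal axis
    rotations and report the dominant movement direction. The axis order
    (pitch first, yaw second) and the sign conventions are assumed."""
    angles = np.asarray(gyro_samples, dtype=float).sum(axis=0) * dt
    pitch, yaw = angles[0], angles[1]
    if abs(yaw) >= abs(pitch):
        return 'right' if yaw > 0 else 'left'
    return 'up' if pitch > 0 else 'down'
```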
In some embodiments, determining the corresponding interaction instruction based on the head action information comprises: determining angular velocity template information matched with the angular velocity corresponding to the head action information, and determining the corresponding interaction instruction according to that template information. For example, the head-mounted device stores attitude data related to the head actions corresponding to the interaction instructions, such as triaxial angular velocity templates preset by the user, or generalized matching templates obtained by a clustering method over a large amount of statistical data; triaxial angular velocity templates are only an example, and uniaxial or other multiaxial angular velocity templates are also possible. The head-mounted device collects the triaxial gyroscope data corresponding to the user's head action and matches it against the stored triaxial angular velocity templates (for example, the angular velocity correlation coefficient of each axis reaches a threshold, or the correlation coefficient of any axis reaches a threshold); if matching angular velocity template information exists, the interaction instruction corresponding to that template is taken as the interaction instruction for the head action.
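A minimal sketch of the template matching described above, assuming the recording and the template have been resampled to the same length; the 0.8 correlation threshold is a placeholder, since the text only says the coefficient must reach a threshold.

```python
import numpy as np

CORRELATION_THRESHOLD = 0.8   # placeholder; the text only requires "a threshold"

def matches_template(gyro_samples, template, require_all_axes=True):
    """Match a three-axis angular-velocity recording against a template
    using the per-axis Pearson correlation coefficient."""
    a = np.asarray(gyro_samples, dtype=float)   # shape (n, 3)
    b = np.asarray(template, dtype=float)       # shape (n, 3), same length
    coeffs = [np.corrcoef(a[:, i], b[:, i])[0, 1] for i in range(3)]
    if require_all_axes:
        return all(c >= CORRELATION_THRESHOLD for c in coeffs)
    return any(c >= CORRELATION_THRESHOLD for c in coeffs)

def classify(gyro_samples, templates):
    """templates maps an instruction name to its template; first match wins."""
    for instruction, template in templates.items():
        if matches_template(gyro_samples, template):
            return instruction
    return None
```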
In some embodiments, determining the corresponding interaction instruction based on the head action information comprises: determining angular velocity template information matched with the angular velocity corresponding to the head action information, determining acceleration template information matched with the acceleration corresponding to the head action information, and determining the corresponding interaction instruction according to both. For example, the head-mounted device stores attitude data related to the head actions corresponding to the interaction instructions, such as triaxial acceleration and triaxial angular velocity templates preset by the user, or generalized matching templates obtained by a clustering method over a large amount of statistical data. The acceleration and angular velocity template information may form one unified template (containing both acceleration and angular velocity information, e.g., (grox, groy, groz, x, y, z)), or two independent templates (containing angular velocity and acceleration information respectively, e.g., (grox, groy, groz) and (x, y, z)). When the angular velocity and acceleration template information are held in two independent templates, those two templates correspond to the same interaction instruction, which is the same instruction as the corresponding unified template (template information containing all six parameters of angular velocity and acceleration) would map to. Template information is a set of head movement data: when angular velocity and acceleration information are included in a unified template, the template contains the assigned values of the six parameters, e.g., each parameter corresponds to an array after assignment; each unified template has a corresponding interaction instruction, and different unified templates have different assigned arrays and different corresponding instructions. Triaxial angular velocity and triaxial acceleration templates are only examples; uniaxial or other multiaxial angular velocity or acceleration templates are also possible.
The head-mounted device collects the triaxial gyroscope data and triaxial acceleration data corresponding to the user's head action and matches them against the stored triaxial angular velocity and acceleration template information (e.g., the correlation coefficient of each axis reaches a threshold, or the correlation coefficient of any axis reaches a threshold). If the angular velocity and acceleration template information form one unified template and a matching template exists, the interaction instruction corresponding to that template is taken as the interaction instruction for the head action. If they form two independent templates, and matching angular velocity template information and matching acceleration template information both exist and correspond to the same interaction instruction, that instruction is taken as the interaction instruction for the head action.
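For the unified six-parameter template (grox, groy, groz, x, y, z), the same per-parameter correlation idea might be sketched as follows; again, the threshold value is a placeholder.

```python
import numpy as np

def matches_unified_template(motion, template, threshold=0.8):
    """Match a six-parameter recording (three-axis angular velocity plus
    three-axis acceleration) against a unified template by per-parameter
    Pearson correlation; the threshold value is assumed."""
    a = np.asarray(motion, dtype=float)      # shape (n, 6)
    b = np.asarray(template, dtype=float)    # shape (n, 6), same length
    coeffs = [np.corrcoef(a[:, i], b[:, i])[0, 1] for i in range(6)]
    return all(c >= threshold for c in coeffs)
```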
In some embodiments, the method further comprises step S104 (not shown): the head-mounted device collects a plurality of action samples related to head action information and determines the corresponding angular velocity template information from those samples. For example, many action samples are collected, such as movement data from people of different body types, sexes, and ages, forming a learning set. Each action is completed within a period of time, so each sample in the learning set is a set of angular velocity data over a preset period (the angular velocity collecting device can be a gyroscope or an inertial measurement unit); one action collected under a given time threshold yields one set of gyroscope or inertial measurement unit data. A clustering method is then used to produce generalized matching templates, each corresponding to a different interaction instruction; a sketch of this step follows.
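As one very simple stand-in for the clustering step (not the patented method itself), each action's generalized template could be the centroid of the resampled learning-set recordings:

```python
import numpy as np

def resample(sample, length=50):
    """Resample one recording of shape (n, 3) to a fixed length so that
    recordings of different durations become comparable."""
    sample = np.asarray(sample, dtype=float)
    idx = np.linspace(0, len(sample) - 1, length)
    return np.stack([np.interp(idx, np.arange(len(sample)), sample[:, i])
                     for i in range(sample.shape[1])], axis=1)

def build_template(samples, length=50):
    """Generalized template for one head action: the centroid (mean curve)
    of many users' recordings, i.e. a one-cluster special case of the
    clustering described above."""
    return np.mean([resample(s, length) for s in samples], axis=0)
```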
In some embodiments, the method further comprises step S105 (not shown): the head-mounted device collects a plurality of action samples related to head action information and determines the corresponding angular velocity and acceleration template information from those samples. The angular velocity and acceleration template information may be one template (containing both acceleration and angular velocity information) or two templates (containing acceleration and angular velocity information respectively). For example, the learning set and templates may include acceleration data in addition to angular velocity data, and the current state of the head-mounted device can be determined more accurately from the acceleration data: an acceleration sensor provides the acceleration data, the static data of the three axes are judged comprehensively to determine whether the device is being worn, and only if it is worn is the corresponding interaction instruction determined further, and so on.
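One plausible reading of the wearing-state check, offered only as an assumption: when the headset is worn and roughly static, the mean acceleration magnitude stays near gravity.

```python
import numpy as np

GRAVITY = 9.81   # m/s^2

def seems_worn(accel_samples, tolerance=1.5):
    """Rough wearing-state heuristic (an assumption, not the patented test):
    judge the static three-axis acceleration data by checking that its mean
    magnitude stays near gravity."""
    magnitudes = np.linalg.norm(np.asarray(accel_samples, dtype=float), axis=1)
    return abs(magnitudes.mean() - GRAVITY) <= tolerance
```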
In some embodiments, the head action information includes the head lateral rotation information or the head longitudinal rotation information, and determining the corresponding interaction instruction based on the head action information comprises: if the rotation angle of the head movement is greater than or equal to a rotation angle standard value, generating the corresponding interaction instruction. For example, based on the rotation of the user's head about the horizontal or vertical axis, the gyroscope data in the head-mounted device are integrated to determine the rotation angle of the action; if that angle is greater than or equal to the rotation angle standard value, an interaction instruction is generated instructing the currently selected control identification information to move by one cell, or by N cells, in the movement direction consistent with the rotation direction. As another example, the gyroscope data are integrated in real time (e.g., at the gyroscope's collection frequency, such as 100 Hz); whenever the current rotation angle is determined in real time to exceed the rotation angle standard value, a corresponding interaction instruction is generated and executed. After one interaction instruction is determined, the device continues to acquire and judge gyroscope data in real time, thereby determining one or more further interaction instructions, i.e., one or more single-cell or N-cell movements of the selected control identification information, with the movement direction consistent with the rotation direction. Obtaining interaction instructions by integrating the gyroscope data in real time helps the user control the head action: if the action deviates, it can be adjusted in time, which gives good interaction feel. In addition, before the rotation angle is calculated, the gyroscope data acquired by the head-mounted device can be filtered (e.g., mean filtering) to remove sensor noise, and the filtered data are then integrated to obtain the rotation angle. Fig. 4 shows an example of determining the rotation angle standard value: θ is the rotation angle standard value, obtained by dividing the user rotation angle ω by the number of controls along the rotation direction of the control interaction interface; the user rotation angle may be preset in the head-mounted device by the user, or be template data obtained from statistics over a large number of users.
For example, when the user's head moves left and right, the bounded included angle swept by rotating about the cervical vertebra is measured; computing this over many users yields a widely applicable included angle ω, and dividing ω by the number of controls along the rotation direction of the control interaction interface gives the corresponding rotation angle standard value θ. In some embodiments, determining the corresponding interaction instruction based on the head action information comprises: if the rotation angle of the head movement is greater than or equal to the rotation angle standard value and the movement time of the head movement meets a movement time threshold, generating the corresponding interaction instruction. For example, the head-mounted device is provided with threshold information for head actions, including, but not limited to, the rotation angle standard value and a movement time threshold or confidence interval (e.g., a time threshold of 0.5 s, or a confidence interval of 0.4 s to 0.6 s); if the rotation angle of the user's head action within the movement time threshold is greater than or equal to the rotation angle standard value, the corresponding interaction instruction is generated. The movement time threshold may be a single value or an interval; a sketch follows below.
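Putting the standard value and the time window together, a hypothetical real-time loop could integrate one axis and emit a move instruction per θ crossed; the 0.4 s to 0.6 s interval echoes the confidence interval mentioned above, and all names are illustrative.

```python
def rotation_standard_value(user_angle_rad, controls_in_direction):
    """theta = omega / N: the user's comfortable rotation span divided by
    the number of controls along the rotation direction (see fig. 4)."""
    return user_angle_rad / controls_in_direction

def stream_move_instructions(gyro_axis_samples, theta, dt=0.01,
                             min_time_s=0.4, max_time_s=0.6):
    """Integrate one gyroscope axis in real time and yield one single-cell
    move instruction each time the accumulated angle crosses +/- theta
    within the assumed movement-time confidence interval."""
    angle, elapsed = 0.0, 0.0
    for w in gyro_axis_samples:        # one sample per dt seconds
        angle += w * dt
        elapsed += dt
        if abs(angle) >= theta and min_time_s <= elapsed <= max_time_s:
            yield 'right' if angle > 0 else 'left'
            angle, elapsed = 0.0, 0.0  # keep judging subsequent movement
```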
In some embodiments, in step S102 the head-mounted device collects head action information of the user and detects it; only if the detection passes is the corresponding interaction instruction determined based on the head action information. For example, when interaction instructions are generated from head action information, the information may be pre-detected, and the corresponding instruction determined only if the detection passes. Whether a head action is valid may be detected, and low-frequency drift and high-frequency noise signals removed, by a preset angle threshold, a preset threshold on the squared modulus of the angular velocity, a preset movement time threshold or confidence interval, or an angular-velocity (or squared-modulus) threshold combined with movement time. For example, the gyroscope data of the head action are integrated to obtain its angle; if the angle does not meet the preset angle threshold, the data are discarded and no interaction instruction is judged for that action. Likewise, the squared modulus of the gyroscope angular velocity is computed; if it does not meet the preset squared-modulus threshold, the data are discarded. Likewise, the movement time of the head action is computed; if it does not meet the preset movement time threshold or confidence interval (e.g., it is shorter than the collection interval of the gyroscope data), the data are discarded. As another example, both the angle (from integrating the gyroscope data) and the movement time are computed; if the time does not meet the preset movement time threshold, or the angle does not meet the preset angle threshold, the data are discarded and no interaction instruction is judged for that action.
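The pre-detection described above might be sketched as follows; the mean filter and every threshold value here are placeholders.

```python
import numpy as np

def mean_filter(samples, window=5):
    """Moving-average filter to suppress sensor noise before integration."""
    samples = np.asarray(samples, dtype=float)
    kernel = np.ones(window) / window
    return np.stack([np.convolve(samples[:, i], kernel, mode='same')
                     for i in range(samples.shape[1])], axis=1)

def is_valid_action(gyro_samples, dt=0.01, angle_min=np.deg2rad(5.0),
                    w_sq_min=0.2, t_min=0.1, t_max=2.0):
    """Pre-detection: discard low-amplitude or ill-timed signals before any
    interaction instruction is judged. All threshold values are assumed."""
    filtered = mean_filter(gyro_samples)
    duration = len(filtered) * dt
    if not (t_min <= duration <= t_max):
        return False                                 # fails the time check
    angles = np.abs(filtered.sum(axis=0) * dt)       # per-axis rotation angle
    w_sq = (filtered ** 2).sum(axis=1)               # squared modulus per sample
    return bool(angles.max() >= angle_min and w_sq.max() >= w_sq_min)
```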
In some embodiments, the head action information includes the head lateral rotation information or the head longitudinal rotation information, and the detection of the head action information includes, but is not limited to: whether the squared modulus of the angular velocity of the head action is greater than or equal to a preset first angular-velocity-squared threshold; and whether the time difference between the instants at which the squared modulus of the angular velocity is greater than or equal to a second angular-velocity-squared threshold meets a preset first time difference threshold, where the first and second angular-velocity-squared thresholds may be the same or different. For example, a swing action is a reciprocating movement, and the angular velocity of the corresponding gyroscope forms a curve resembling a sine wave. As shown in fig. 5, the head-mounted device moves from rest at point A to point B about the axis O, then moves from B back to A; this produces a sine-like pattern as in fig. 6, where times t1 to t2 correspond to the outgoing swing (e.g., from A to B) and times t2 to t3 correspond to the return (e.g., from B to A). Plotting the squared modulus of the angular velocity against time gives the relation shown in fig. 7, where the level w1 determines the corresponding time points t1 and t5. If the squared modulus of the swing angular velocity reaches a value greater than or equal to w1, or the time difference between the qualifying instants (e.g., t1 and t5) meets a preset first time threshold or confidence interval, the action is judged valid; the first time threshold may be a single value or an interval. In some embodiments, the head action information includes the head lateral swing information or the head longitudinal swing information, and its detection includes: detecting whether the squared modulus of the angular velocity of the head action is greater than or equal to a third angular-velocity-squared threshold while the time difference between the qualifying instants meets a preset second time difference threshold; and, if so, detecting whether there exist four distinct time nodes during the movement at which the squared modulus of the angular velocity equals a fourth angular-velocity-squared threshold.
For example, the head-mounted device is provided with a fourth angular-velocity-squared threshold, greater than the third, chosen from the swing curve; if the swing curve corresponding to the head action intersects the level of the fourth threshold at four points, a corresponding interaction instruction is generated. As shown in fig. 7, the curve of the squared modulus of the instantaneous swing angular velocity intersects the line w2 at four points, so the head-mounted device confirms that the current head action is a valid action, as sketched below.
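A sketch of the swing check against the two levels of fig. 7, with w1, w2, and the time window as placeholder values:

```python
import numpy as np

def crossing_times(w_sq, level, dt=0.01):
    """Times at which the squared angular-velocity modulus crosses `level`."""
    w_sq = np.asarray(w_sq, dtype=float)
    above = w_sq >= level
    changes = np.flatnonzero(above[1:] != above[:-1]) + 1
    return changes * dt

def is_valid_swing(w_sq, w1=0.5, w2=0.8, dt=0.01,
                   span_min_s=0.3, span_max_s=1.0):
    """Swing check: the w1 crossings must span a plausible duration (t1..t5
    in fig. 7), and the curve must cross the higher level w2 at four points,
    one pair per half of the reciprocating movement."""
    t1 = crossing_times(w_sq, w1, dt)
    if len(t1) < 2 or not (span_min_s <= t1[-1] - t1[0] <= span_max_s):
        return False
    return len(crossing_times(w_sq, w2, dt)) == 4
```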
Fig. 8 illustrates a method for performing man-machine interaction in a control interaction interface according to an aspect of the present application, applied to a head-mounted device, wherein the method includes step S201, step S202, and step S203. In step S201, the head-mounted device presents a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes confirmation selection information and cancellation selection information. In step S202, the head-mounted device collects head motion information of a user, determines angular velocity template information matching the angular velocity corresponding to the head motion information, and determines a corresponding interaction instruction according to the angular velocity template information, where the interaction instruction includes a confirm-selection instruction or a cancel-selection instruction. In step S203, the head-mounted device executes the corresponding interaction instruction.

For example, the head-mounted device presents a control interaction interface to the user, such as the confirm/cancel interface shown in fig. 9, which includes confirmation selection information and cancellation selection information. The head-mounted device then collects head motion information and acquires the corresponding angular velocity data. On the device side, angular velocity data associated with the head motions that correspond to interaction instructions is stored, such as three-axis angular velocity template information preset by the user, or three-axis angular velocity template information generalized for matching by a clustering method or the like over a large amount of statistical data; three-axis angular velocity template information is used here only as an example, and single-axis or other multi-axis angular velocity template information is equally possible. The head-mounted device collects the three-axis gyroscope data corresponding to the user's head motion information and matches it against the stored three-axis angular velocity template information (for example, requiring the angular velocity correlation coefficient of every axis to reach a threshold, or that of any axis to reach a threshold). If matching three-axis angular velocity template information exists, the interaction instruction corresponding to that template is taken as the interaction instruction corresponding to the head motion information, for example a nod motion corresponding to a confirm instruction and a head-shake motion corresponding to a cancel instruction. Subsequently, the head-mounted device executes the corresponding interaction instruction, such as a confirm instruction or a cancel instruction.
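As an illustration of the template-matching in step S202, the sketch below matches a collected three-axis gyroscope window against stored templates using per-axis correlation coefficients, following the "every axis reaches a threshold" policy mentioned above. The names CORR_THRESHOLD, matches_template, and resolve_instruction, and the assumption that the collected window has been resampled to the template length, are illustrative and not part of this application.

```python
import numpy as np

CORR_THRESHOLD = 0.8   # assumed per-axis correlation threshold

def matches_template(gyro, template):
    """Per-axis Pearson correlation between the collected 3-axis gyroscope
    window and a stored template of the same length; this follows the
    'every axis reaches the threshold' policy (the text also allows an
    'any axis' variant)."""
    gyro, template = np.asarray(gyro), np.asarray(template)
    for axis in range(3):
        r = np.corrcoef(gyro[:, axis], template[:, axis])[0, 1]
        if not (r >= CORR_THRESHOLD):   # also rejects NaN from flat signals
            return False
    return True

def resolve_instruction(gyro, templates):
    """templates: dict mapping an instruction (e.g. 'confirm' for a nod,
    'cancel' for a head shake) to a (T, 3) angular-velocity template.
    Returns the instruction of the first matching template, or None."""
    for instruction, template in templates.items():
        if matches_template(gyro, template):
            return instruction
    return None
```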
In some embodiments, in step S202, the head-mounted device collects head motion information of a user, determines angular velocity template information matching the angular velocity corresponding to the head motion information, determines acceleration template information matching the acceleration corresponding to the head motion information, and determines a corresponding interaction instruction according to both the angular velocity template information and the acceleration template information. For example, a plurality of motion samples related to the head motion information are collected, such as motion data from people of different body types, sexes, and ages, to form a learning set. Each motion is completed within a preset period of time, so each motion sample in the learning set is a set of angular velocity and acceleration data over that period; that is, one motion is collected under a certain time threshold, yielding a set of angular velocity and acceleration data such as inertial measurement unit (IMU) data. A clustering method is then used to build generalized matching templates, with the generalized templates corresponding to different interaction instructions respectively.
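One plausible realization of this clustering-based template preparation is sketched below, assuming k-means over fixed-length resampled IMU windows; the parameters n_clusters and window, and the use of scikit-learn's KMeans, are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_generalized_templates(samples, n_clusters=4, window=100):
    """Cluster a learning set of IMU recordings into generalized templates.

    samples: list of (T, 6) arrays, one motion each, holding 3-axis angular
    velocity plus 3-axis acceleration collected within the preset time
    threshold. Every sample is resampled to a fixed window so the flattened
    vectors are comparable; the k-means centroids then serve as generalized
    matching templates, to be mapped to interaction instructions by hand.
    """
    def resample(a):
        t_old = np.linspace(0.0, 1.0, len(a))
        t_new = np.linspace(0.0, 1.0, window)
        return np.column_stack(
            [np.interp(t_new, t_old, a[:, k]) for k in range(6)])

    X = np.stack([resample(np.asarray(s)).ravel() for s in samples])
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(X)
    return km.cluster_centers_.reshape(n_clusters, window, 6)
```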
Fig. 10 illustrates a head-mounted device for man-machine interaction in a control interaction interface according to an aspect of the present application, which includes a one-one module 101, a one-two module 102, and a one-three module 103. The one-one module 101 is configured to present a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes a plurality of control identification information. The one-two module 102 is configured to collect head action information of a user and determine a corresponding interaction instruction based on the head action information. The one-three module 103 is configured to execute the corresponding interaction instruction based on the currently selected control identification information in the control interaction interface. Here, the specific embodiments of the one-one module 101, the one-two module 102, and the one-three module 103 are the same as or similar to those of step S101, step S102, and step S103 described above, and are not repeated here but are incorporated by reference.
In some embodiments, the control interaction interface includes, but is not limited to: a control interaction interface arranged in a single row; a control interaction interface arranged in a single column; a control interaction interface arranged in a plurality of rows and columns; and the like. The specific embodiments of the control interaction interface are the same as or similar to those described above and are not repeated here but are incorporated by reference.
In some embodiments, the interaction instruction includes, but is not limited to: moving a preset distance from the current control identification information along the movement direction of the head action information; and activating the control corresponding to the current control identification information. In some embodiments, the control interaction interface includes a control interaction interface arranged in a plurality of rows and columns, and the interaction instruction includes moving a preset distance from the current control identification information along the movement direction of the head action information, as in the sketch below. The specific embodiments of these interaction instructions are the same as or similar to those described above and are not repeated here but are incorporated by reference.
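A minimal sketch of the movement instruction on a grid-arranged interface follows; the grid dimensions, step size, and clamping behavior at the interface edge are illustrative assumptions rather than part of this application.

```python
GRID_ROWS, GRID_COLS = 3, 4   # assumed interface layout
STEP = 1                      # assumed preset distance, in grid cells

def move_selection(selected, direction):
    """selected: (row, col) of the currently selected control; direction:
    'left'/'right' from lateral head motion, 'up'/'down' from longitudinal
    head motion. The move is clamped at the edge of the interface."""
    row, col = selected
    if direction == 'left':
        col -= STEP
    elif direction == 'right':
        col += STEP
    elif direction == 'up':
        row -= STEP
    elif direction == 'down':
        row += STEP
    return (min(max(row, 0), GRID_ROWS - 1),
            min(max(col, 0), GRID_COLS - 1))
```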
In some embodiments, the device further includes a one-six module 106 (not shown), configured to activate the control corresponding to the currently selected control identification information if a determination operation of the user with respect to the control is obtained. In some embodiments, the determination operation includes, but is not limited to: a movement of the user's head in the front-rear direction; a head rest time greater than or equal to a rest time threshold (see the dwell sketch below); voice instruction information related to the user; gesture instruction information related to the user; touch instruction information related to the user; and eye movement instruction information related to the user. Here, the specific embodiment of the determination operation of the one-six module 106 is the same as or similar to that of step S106 described above, and is not repeated here but is incorporated by reference.
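As one hypothetical realization of the rest-time determination operation, the sketch below treats the head as at rest while the squared modulus of the gyroscope angular velocity stays below a small level, and triggers once the current rest run reaches the threshold; REST_W2 and REST_SECONDS are assumed values, not values from this application.

```python
import numpy as np

REST_W2 = 0.05       # assumed "at rest" level for |w|^2 (rad^2/s^2)
REST_SECONDS = 1.5   # assumed rest-time threshold (seconds)

def head_is_resting(gyro, t):
    """True once the most recent contiguous run of near-zero angular
    velocity has lasted at least the rest-time threshold; gyro is an
    (N, 3) array of angular-velocity samples, t the sample times."""
    s = np.sum(np.asarray(gyro) ** 2, axis=1)
    at_rest = s < REST_W2
    if not at_rest[-1]:
        return False
    start = len(at_rest) - 1
    while start > 0 and at_rest[start - 1]:   # walk back to run start
        start -= 1
    return (t[-1] - t[start]) >= REST_SECONDS
```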
In some embodiments, the head motion information includes, but is not limited to: head lateral rotation motion information, where the corresponding rotation direction includes left or right relative to the initial head position; head longitudinal rotation motion information, where the corresponding rotation direction includes upward or downward relative to the initial head position; head lateral swing motion information, where the corresponding swing direction includes left or right relative to the initial head position; and head longitudinal swing motion information, where the corresponding swing direction includes upward or downward relative to the initial head position. Here, the specific embodiments of the head motion information are the same as or similar to those described above and are not repeated here but are incorporated by reference.
In some embodiments, determining the corresponding interaction instruction based on the head action information includes: determining angular velocity template information matching the angular velocity corresponding to the head motion information, and determining a corresponding interaction instruction according to the angular velocity template information. In some embodiments, it includes: determining angular velocity template information matching the angular velocity corresponding to the head motion information, determining acceleration template information matching the acceleration corresponding to the head motion information, and determining a corresponding interaction instruction according to both. The specific embodiments of determining the interaction instruction are the same as or similar to those described above and are not repeated here but are incorporated by reference.
In some embodiments, the apparatus further comprises a one-four module 104 (not shown), configured to collect a plurality of motion samples related to the head motion information and determine corresponding angular velocity template information from the plurality of motion samples. Here, the embodiment of the one-four module 104 is the same as or similar to that of step S104 described above, and is not repeated here but is incorporated by reference.
In some embodiments, the apparatus further comprises a one-five module 105 (not shown), configured to collect a plurality of motion samples related to the head motion information and determine corresponding angular velocity template information and acceleration template information from the plurality of motion samples. Here, the embodiment of the one-five module 105 is the same as or similar to that of step S105 described above, and is not repeated here but is incorporated by reference.
In some embodiments, the head motion information includes the head lateral rotation motion information or the head longitudinal rotation motion information, and determining the corresponding interaction instruction based on the head action information includes: generating a corresponding interaction instruction if the rotation angle of the head movement is greater than or equal to a rotation angle standard value. In some embodiments, it includes: generating a corresponding interaction instruction if the rotation angle of the head movement is greater than or equal to the rotation angle standard value and the movement time of the head movement meets a movement time threshold, as in the sketch below. The specific embodiments of determining the interaction instruction from head rotation are the same as or similar to those described above and are not repeated here but are incorporated by reference.
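The rotation-angle condition can be sketched by numerically integrating the gyroscope's angular velocity about the relevant axis; the constants ANGLE_STANDARD and MOTION_TIME_MAX and the sign-to-direction mapping are assumptions for illustration only.

```python
import numpy as np

ANGLE_STANDARD = np.deg2rad(20.0)   # assumed rotation angle standard value
MOTION_TIME_MAX = 1.0               # assumed motion time threshold (seconds)

def rotation_instruction(omega_axis, t):
    """omega_axis: angular velocity (rad/s) about the lateral or
    longitudinal axis, sampled at times t. Integrates to a rotation angle
    and applies both the angle and motion-time conditions."""
    angle = np.trapz(omega_axis, t)   # numeric integration of the gyro axis
    duration = t[-1] - t[0]
    if abs(angle) >= ANGLE_STANDARD and duration <= MOTION_TIME_MAX:
        return 'right' if angle > 0 else 'left'   # assumed sign convention
    return None
```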
In some embodiments, in step S102, the head-mounted device collects head motion information of a user and detects the head motion information; if the detection passes, it determines a corresponding interaction instruction based on the head motion information. For example, when generating an interaction instruction from head motion information, the head motion information may first be pre-detected, and the corresponding interaction instruction determined only if the pre-detection passes. In some embodiments, the head motion information includes the head lateral rotation motion information or the head longitudinal rotation motion information, and the detection includes, but is not limited to: whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a preset first angular-velocity-square threshold; and whether the difference between the times at which the squared modulus of the angular velocity is greater than or equal to the second angular-velocity-square threshold satisfies a preset first time-difference threshold. In some embodiments, the head motion information includes the head lateral swing motion information or the head longitudinal swing motion information, and the detection includes: detecting whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a third angular-velocity-square threshold and, at the same time, whether the difference between the qualifying times satisfies a preset second time-difference threshold; if so, detecting whether there exist four different time nodes during the movement at which the squared modulus of the angular velocity equals a fourth angular-velocity-square threshold. The specific embodiments of this pre-detection are the same as or similar to those described above and are not repeated here but are incorporated by reference.
Fig. 11 illustrates a head-mounted device for man-machine interaction in a control interaction interface according to an aspect of the present application, where the device includes a two-one module 201, a two-two module 202, and a two-three module 203. The two-one module 201 is configured to present a control interaction interface corresponding to the head-mounted device, where the control interaction interface includes confirmation selection information and cancellation selection information. The two-two module 202 is configured to collect head motion information of a user, determine three-axis angular velocity template information matching the three-axis angular velocity corresponding to the head motion information, and determine a corresponding interaction instruction according to the three-axis angular velocity template information, where the interaction instruction includes a confirm-selection instruction or a cancel-selection instruction. The two-three module 203 is configured to execute the corresponding interaction instruction. Here, the embodiments of the two-one module 201, the two-two module 202, and the two-three module 203 are the same as or similar to those of step S201, step S202, and step S203 described above, and are not repeated here but are incorporated by reference.
In some embodiments, the two-two module 202 collects head motion information of a user, determines angular velocity template information matching the angular velocity corresponding to the head motion information, determines acceleration template information matching the acceleration corresponding to the head motion information, and determines a corresponding interaction instruction according to both. Here, this embodiment of matching against angular velocity and acceleration template information in the two-two module 202 is the same as or similar to that of step S202 described above, and is not repeated here but is incorporated by reference.
In addition to the methods and apparatus described in the above embodiments, the present application also provides a computer-readable storage medium storing computer code which, when executed, performs the method as described in any of the foregoing.

The present application also provides a computer program product which, when executed by a computer device, performs the method as described in any of the foregoing.
The present application also provides a computer device comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the foregoing.
FIG. 12 illustrates an exemplary system that may be used to implement various embodiments described in the present disclosure.
in some embodiments, as shown in fig. 12, the system 300 can function as any of the above-described devices of the various described embodiments. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement the modules to perform the actions described in the present application.
For one embodiment, the system control module 310 may include any suitable interface controller to provide any suitable interface to at least one of the processor(s) 305 and/or any suitable device or component in communication with the system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
The system memory 315 may be used, for example, to load and store data and/or instructions for the system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as, for example, a suitable DRAM. In some embodiments, the system memory 315 may comprise double data rate type-four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable nonvolatile memory (e.g., flash memory) and/or may include any suitable nonvolatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or which may be accessed by the device without being part of the device. For example, NVM/storage 320 may be accessed over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. The system 300 may wirelessly communicate with one or more components of a wireless network in accordance with any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic of one or more controllers (e.g., memory controller module 330) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic of one or more controllers of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die as logic of one or more controllers of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic of one or more controllers of the system control module 310 to form a system on chip (SoC).
In various embodiments, the system 300 may be, but is not limited to being: a server, workstation, desktop computing device, or mobile computing device (e.g., laptop computing device, handheld computing device, tablet, netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, keyboards, liquid crystal display (LCD) screens (including touch screen displays), non-volatile memory ports, multiple antennas, graphics chips, application-specific integrated circuits (ASICs), and speakers.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, e.g., using Application Specific Integrated Circuits (ASIC), a general purpose computer or any other similar hardware device. In one embodiment, the software program of the present application may be executed by a processor to perform the steps or functions described above. Likewise, the software programs of the present application (including associated data structures) may be stored on a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. In addition, some steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
Furthermore, portions of the present application may be provided as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or solutions according to the present application by way of operation of the computer. Those skilled in the art will appreciate that the forms in which computer program instructions exist in a computer-readable medium include, but are not limited to, source files, executable files, installation package files, etc., and accordingly, the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Herein, a computer-readable medium may be any available computer-readable storage medium or communication medium that can be accessed by a computer.
Communication media includes media whereby a communication signal containing, for example, computer readable instructions, data structures, program modules, or other data, is transferred from one system to another. Communication media may include conductive transmission media such as electrical cables and wires (e.g., optical fibers, coaxial, etc.) and wireless (non-conductive transmission) media capable of transmitting energy waves, such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules, or other data may be embodied as a modulated data signal, for example, in a wireless medium, such as a carrier wave or similar mechanism, such as that embodied as part of spread spectrum technology. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory, such as random access memory (RAM, DRAM, SRAM); and nonvolatile memory such as flash memory, various read only memory (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memory (MRAM, feRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed computer-readable information/data that can be stored for use by a computer system.
An embodiment according to the application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to operate a method and/or a solution according to the embodiments of the application as described above.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is evident that the word "comprising" does not exclude other elements or steps, and that the singular does not exclude a plurality. A plurality of units or means recited in the apparatus claims can also be implemented by means of one unit or means in software or hardware. The terms first, second, etc. are used to denote a name, but not any particular order.
Claims (17)
1. A method for man-machine interaction in a control interaction interface, applied to a head-mounted device, wherein the method comprises:
presenting a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
collecting head action information of a user, and determining a corresponding interaction instruction based on the head action information;
executing a corresponding interaction instruction based on the currently selected current control identification information in the control interaction interface;
wherein the collecting head action information of a user and determining a corresponding interaction instruction based on the head action information comprises:
collecting head action information of a user, and detecting the head action information;
if the detection is passed, determining that the head motion information is a valid motion, and determining a corresponding interaction instruction based on the head motion information;
wherein the head motion information includes head lateral swing motion information or head longitudinal swing motion information; wherein the detecting of the head motion information includes:
detecting whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a third angular-velocity-square threshold and, at the same time, whether the difference between the times corresponding to the angular velocities meeting the condition satisfies a preset second time-difference threshold;
if so, detecting whether there exist four different time nodes during the movement process of the head motion information at which the squared modulus of the angular velocity is equal to a fourth angular-velocity-square threshold.
2. The method of claim 1, wherein the control interaction interface comprises at least any one of:
a control interaction interface arranged in a single row;
a control interaction interface arranged in a single column;
a control interactive interface arranged in a plurality of rows and columns.
3. The method of claim 2, wherein the interaction instruction comprises at least any one of:
moving a preset distance along the movement direction of the head action information from the current control identification information;
and starting the control corresponding to the current control identification information.
4. The method of claim 3, wherein the control interaction interface information comprises a control interaction interface arranged in a plurality of rows and columns, and the interaction instruction comprises moving a preset distance from the current control identification information along the movement direction of the head action information.
5. The method of any of claims 1 to 4, wherein the head motion information comprises at least any of:
the head lateral rotation motion information, wherein the corresponding rotation direction includes left or right with respect to the initial head position;
the head longitudinal rotation motion information, wherein the corresponding rotation direction includes upward or downward relative to the initial head position;
the head lateral swing motion information, wherein the corresponding swing direction includes left or right with respect to the initial head position;
the head longitudinal swing motion information, wherein the corresponding swing direction includes upward or downward relative to the initial head position.
6. The method of claim 5, wherein the determining the corresponding interaction instruction based on the head action information comprises:
and determining angular velocity template information matched with the angular velocity corresponding to the head motion information, and determining a corresponding interaction instruction according to the angular velocity template information.
7. The method of claim 6, wherein the determining the corresponding interaction instruction based on the head action information comprises:
angular velocity template information matched with the angular velocity corresponding to the head motion information is determined, acceleration template information matched with the acceleration corresponding to the head motion information is determined, and corresponding interaction instructions are determined according to the angular velocity template information and the acceleration template information.
8. The method according to claim 6 or 7, wherein the method further comprises:
and acquiring a plurality of motion samples related to the head motion information, and determining corresponding angular velocity template information according to the plurality of motion samples.
9. The method of claim 7, wherein the method further comprises:
and acquiring a plurality of motion samples related to the head motion information, and determining corresponding angular velocity template information and acceleration template information according to the plurality of motion samples.
10. The method of claim 5, wherein the head motion information comprises the head lateral rotation motion information or the head longitudinal rotation motion information; wherein the determining the corresponding interaction instruction based on the head action information includes:
and if the rotation angle of the head action information is larger than or equal to the rotation angle standard value, generating a corresponding interaction instruction.
11. The method of claim 10, wherein the determining the corresponding interaction instruction based on the head action information comprises:
and if the rotation angle of the head motion information is greater than or equal to the rotation angle standard value and the motion time of the head motion information meets the motion time threshold, generating a corresponding interaction instruction.
12. The method of claim 1, wherein the head motion information further comprises the head lateral rotation motion information or the head longitudinal rotation motion information; wherein the head motion information detection further includes at least any one of:
whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a preset first angular-velocity-square threshold;
whether the difference between the times at which the squared modulus of the angular velocity of the head motion information is greater than or equal to the second angular-velocity-square threshold satisfies a preset first time-difference threshold.
13. The method of any one of claims 1 to 4, wherein the method further comprises:
and if the determining operation of the user on the control is obtained, starting the control corresponding to the currently selected control identification information.
14. The method of claim 13, wherein the determining operation comprises at least any one of:
a movement of the user's head in the front-rear direction;
the user's head rest time is greater than or equal to a rest time threshold;
voice instruction information related to the user;
gesture instruction information related to the user;
touch instruction information related to the user;
eye movement instruction information related to the user.
15. A headset for human-machine interaction in a control interaction interface, wherein the headset comprises:
a one-one module, configured to present a control interaction interface corresponding to the head-mounted device, wherein the control interaction interface comprises a plurality of control identification information;
a one-two module, configured to collect head action information of a user and determine a corresponding interaction instruction based on the head action information;
a one-three module, configured to execute a corresponding interaction instruction based on the current control identification information currently selected in the control interaction interface;
wherein the collecting head action information of a user and determining a corresponding interaction instruction based on the head action information comprises:
collecting head action information of a user, and detecting the head action information;
if the detection is passed, determining that the head motion information is a valid motion, and determining a corresponding interaction instruction based on the head motion information;
wherein the head motion information includes head lateral swing motion information or head longitudinal swing motion information; wherein the detecting of the head motion information includes:
detecting whether the squared modulus of the angular velocity of the head motion information is greater than or equal to a third angular-velocity-square threshold and, at the same time, whether the difference between the times corresponding to the angular velocities meeting the condition satisfies a preset second time-difference threshold;
if so, detecting whether there exist four different time nodes during the movement process of the head motion information at which the squared modulus of the angular velocity is equal to a fourth angular-velocity-square threshold.
16. An apparatus for human-machine interaction in a control interaction interface, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to operate in accordance with the method of any one of claims 1 to 14.
17. A computer readable medium storing instructions that, when executed, cause a system to perform the operations of the method of any one of claims 1 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910785670.1A CN112416115B (en) | 2019-08-23 | 2019-08-23 | Method and equipment for performing man-machine interaction in control interaction interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910785670.1A CN112416115B (en) | 2019-08-23 | 2019-08-23 | Method and equipment for performing man-machine interaction in control interaction interface |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112416115A CN112416115A (en) | 2021-02-26 |
CN112416115B true CN112416115B (en) | 2023-12-15 |
Family
ID=74779451
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910785670.1A Active CN112416115B (en) | 2019-08-23 | 2019-08-23 | Method and equipment for performing man-machine interaction in control interaction interface |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112416115B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113655927B (en) * | 2021-08-24 | 2024-04-26 | 亮风台(上海)信息科技有限公司 | Interface interaction method and device |
CN113777791B (en) * | 2021-09-14 | 2025-07-04 | 北京乐驾科技有限公司 | View display method of AR glasses and AR glasses |
CN114048726B (en) * | 2022-01-13 | 2022-04-08 | 北京中科汇联科技股份有限公司 | Computer graphic interface interaction method and system |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2808752A1 (en) * | 2013-05-28 | 2014-12-03 | BlackBerry Limited | Performing an action associated with a motion based input |
CN104536654A (en) * | 2014-12-25 | 2015-04-22 | 小米科技有限责任公司 | Menu selecting method and device on intelligent wearable device and intelligent wearable device |
CN105824409A (en) * | 2016-02-16 | 2016-08-03 | 乐视致新电子科技(天津)有限公司 | Interactive control method and device for virtual reality |
KR20160133328A (en) * | 2015-05-12 | 2016-11-22 | 삼성전자주식회사 | Remote control method and device using wearable device |
CN106527722A (en) * | 2016-11-08 | 2017-03-22 | 网易(杭州)网络有限公司 | Interactive method and system in virtual reality and terminal device |
CN106970697A (en) * | 2016-01-13 | 2017-07-21 | 华为技术有限公司 | Interface alternation device and method |
CN108008873A (en) * | 2017-11-10 | 2018-05-08 | 亮风台(上海)信息科技有限公司 | A kind of operation method of user interface of head-mounted display apparatus |
US9996149B1 (en) * | 2016-02-22 | 2018-06-12 | Immersacad Corporation | Method for one-touch translational navigation of immersive, virtual reality environments |
CN108170279A (en) * | 2015-06-03 | 2018-06-15 | 塔普翊海(上海)智能科技有限公司 | The eye of aobvious equipment is moved moves exchange method with head |
CN108304075A (en) * | 2018-02-11 | 2018-07-20 | 亮风台(上海)信息科技有限公司 | A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment |
JP2019136066A (en) * | 2018-02-06 | 2019-08-22 | グリー株式会社 | Application processing system, application processing method, and application processing program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102668725B1 (en) * | 2017-10-27 | 2024-05-29 | 매직 립, 인코포레이티드 | Virtual reticle for augmented reality systems |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
GB2576905B (en) * | 2018-09-06 | 2021-10-27 | Sony Interactive Entertainment Inc | Gaze input System and method |
2019-08-23: CN application CN201910785670.1A filed, granted as patent CN112416115B (status: Active)
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2808752A1 (en) * | 2013-05-28 | 2014-12-03 | BlackBerry Limited | Performing an action associated with a motion based input |
CN104536654A (en) * | 2014-12-25 | 2015-04-22 | 小米科技有限责任公司 | Menu selecting method and device on intelligent wearable device and intelligent wearable device |
KR20160133328A (en) * | 2015-05-12 | 2016-11-22 | 삼성전자주식회사 | Remote control method and device using wearable device |
CN108170279A (en) * | 2015-06-03 | 2018-06-15 | 塔普翊海(上海)智能科技有限公司 | The eye of aobvious equipment is moved moves exchange method with head |
CN106970697A (en) * | 2016-01-13 | 2017-07-21 | 华为技术有限公司 | Interface alternation device and method |
CN105824409A (en) * | 2016-02-16 | 2016-08-03 | 乐视致新电子科技(天津)有限公司 | Interactive control method and device for virtual reality |
US9996149B1 (en) * | 2016-02-22 | 2018-06-12 | Immersacad Corporation | Method for one-touch translational navigation of immersive, virtual reality environments |
CN106527722A (en) * | 2016-11-08 | 2017-03-22 | 网易(杭州)网络有限公司 | Interactive method and system in virtual reality and terminal device |
CN108008873A (en) * | 2017-11-10 | 2018-05-08 | 亮风台(上海)信息科技有限公司 | A kind of operation method of user interface of head-mounted display apparatus |
JP2019136066A (en) * | 2018-02-06 | 2019-08-22 | グリー株式会社 | Application processing system, application processing method, and application processing program |
CN108304075A (en) * | 2018-02-11 | 2018-07-20 | 亮风台(上海)信息科技有限公司 | A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment |
Also Published As
Publication number | Publication date |
---|---|
CN112416115A (en) | 2021-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112416115B (en) | Method and equipment for performing man-machine interaction in control interaction interface | |
CN102262476B (en) | Tactile Communication System And Method | |
US20170242495A1 (en) | Method and device of controlling virtual mouse and head-mounted displaying device | |
CN106605202A (en) | Handedness detection from touch input | |
CN111452044A (en) | Robot system architecture and robot thereof | |
CN109828672B (en) | Method and equipment for determining man-machine interaction information of intelligent equipment | |
KR101228336B1 (en) | Personalization Service Providing Method by Using Mobile Terminal User's Activity Pattern and Mobile Terminal therefor | |
CN101676861A (en) | Tablet computer equipped with microphones | |
CN107924286A (en) | The input method of electronic equipment and electronic equipment | |
KR20150024247A (en) | Method and apparatus for executing application using multiple input tools on touchscreen device | |
CN112965592B (en) | Device interaction method, device and system | |
CN106062679A (en) | Display screen controlling method and apparatus, and terminal | |
WO2012087309A1 (en) | Touch sensor gesture recognition for operation of mobile devices | |
HK1215736A1 (en) | Method, apparatus and system for controlling browser with somatosensory remote control device | |
CN109324741A (en) | An operation control method, device and system | |
CN113805770B (en) | A cursor moving method and electronic device | |
CN110413183B (en) | Method and equipment for presenting page | |
CN105242780A (en) | Interactive control method and apparatus | |
CN113655927A (en) | Interface interaction method and device | |
CN112416140B (en) | Method and equipment for inputting characters | |
CN103177245A (en) | Gesture recognition method and device | |
CN103854026A (en) | Recognition method and electronic device | |
CN103279304B (en) | Method and device for displaying selected icon and mobile device | |
CN111380527A (en) | Navigation method and navigation controller of indoor service robot | |
CN104156058B (en) | Generate the method and system of control instruction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai Applicant after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. Address before: Room 501 / 503-505, 570 shengxia Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203 Applicant before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. |
|
GR01 | Patent grant | ||