CN109669538B - Object grabbing interaction method under complex motion constraint in virtual reality - Google Patents

Object grabbing interaction method under complex motion constraint in virtual reality

Info

Publication number
CN109669538B
CN109669538B
Authority
CN
China
Prior art keywords
virtual
hand
virtual hand
motion
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811479678.7A
Other languages
Chinese (zh)
Other versions
CN109669538A (en)
Inventor
陈学文
王京涛
张炎
黄鹏
杜芳
马飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING HUARU TECHNOLOGY CO LTD
China Astronaut Research and Training Center
Original Assignee
BEIJING HUARU TECHNOLOGY CO LTD
China Astronaut Research and Training Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING HUARU TECHNOLOGY CO LTD, China Astronaut Research and Training Center filed Critical BEIJING HUARU TECHNOLOGY CO LTD
Priority to CN201811479678.7A priority Critical patent/CN109669538B/en
Publication of CN109669538A publication Critical patent/CN109669538A/en
Application granted granted Critical
Publication of CN109669538B publication Critical patent/CN109669538B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an object grabbing interaction method under complex motion constraints in virtual reality, mainly comprising initial setup, grabbing operation, determination of the current state of the object, judgment of object grabbing, adjustment of the virtual palm position, motion trend calculation, motion trend judgment, and motion trend validity judgment. The invention correctly handles the simulation problem of a user, who is unconstrained in the real world, grabbing an object that is subject to complex motion constraints in the virtual environment; it ensures that operated objects in the virtual space satisfy the mutual-blocking constraints between entities during interactive operation, and avoids both the problem of a grabbed object passing through its constraint limits and the problem of jumps while the virtual hand grabs the object.

Description

Object grabbing interaction method under complex motion constraint in virtual reality
Technical Field
The invention relates to the technical field of virtual reality, in particular to an object grabbing interaction method under the constraint of complex motion in a virtual reality system.
Background
In current virtual reality applications, whether games or professional industry systems (prototype principle demonstration, engineering design and verification, training, and the like), the most common hardware configuration is a virtual reality headset together with an operating handle or data glove with position and pose tracking. The user wears the headset and holds the handle or wears the glove to control an object in the virtual environment; that virtual object acts as the user's agent in the virtual environment and interacts with the other objects there. The user agent (in this invention, a virtual hand) is driven in real time by the tracking data of the real-world user. Because the user wears a headset, the handle or glove is moved according to the virtual scene seen by the eyes in order to manipulate objects in the virtual environment, and the real-world hand motion is not limited by the constraints of the virtual environment. When the user grabs and moves an object whose motion is restricted to a certain range, two phenomena that violate real physical laws can appear if the operation is not handled properly: 1) the virtual hand follows the user's hand, so the grabbed object passes through its motion limits; 2) at the instant the virtual hand grabs the object, the object suddenly jumps to the virtual hand, making the grabbing process discontinuous. Neither behavior conforms to how objects are manipulated in the real world.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide an object grabbing interaction method under complex motion constraints in a virtual reality system. The core idea is to establish several limit triggers and a state trigger around the motion range of the constrained object in the virtual environment, and to set up the obstacle that constrains the motion. If the object is inside the state trigger, it is in a motion-constrained state. If the object is in a constrained state and, while moving within its range, touches a limit trigger, the motion trend is used to judge whether the object would keep touching that limit trigger in the next frame. If so, the virtual object cannot continue to move forward and its motion state remains unchanged; at the same time, the virtual hand holding the object no longer follows the real-world hand, and the continuous state of the virtual hand grasping the object is maintained, so that the simulated interaction matches the motion relationship of a hand holding an object in the real world. If the user keeps moving the handle in the blocked direction, the handle's vibration function reminds the user that the motion of the virtual hand is limited; the vibration amplitude and frequency grow with the distance, and when the distance exceeds a certain value, the virtual hand is disconnected from the virtual object.
In order to achieve the purpose, the invention adopts the following technical scheme:
an object grabbing interaction method under complex motion constraint in virtual reality comprises the following steps:
1) setting a relation between a virtual hand and a tracking positioning point of the interactive equipment, setting an object limiting constraint condition, and establishing a standard position hanging point of the object and the hand;
2) controlling a virtual hand to grab an object;
3) judging whether the object is in a constrained state or a free unconstrained state at present;
4) if the object is in the constrained state, judging that the virtual hand grabs the object when the distance between the virtual hand and the object is smaller than a set value, and then proceeding to step 5); if the object is in a free, unconstrained state, attaching the object directly to the virtual hand so that the virtual object moves completely with the virtual hand, and the grabbing interaction is completed;
5) after the virtual hand grabs the object, displaying a proper gesture for grabbing the object by adjusting the posture of the virtual hand;
6) calculating the motion trend of the next frame of the hand in real time according to the positions of the grabbed objects of the previous frame and the current frame;
7) judging whether the next frame of object is in a motion constraint state or a free motion state according to the motion trend;
8) according to the judgment result, when the next frame of object is in the motion constraint state, the virtual hand stops moving, and when the next frame of object is in the free motion state, the virtual hand continues moving.
9) The virtual hand releases the object.
Preferably, the object limitation constraint conditions are set in step 1) as follows: a plurality of limit triggers are provided in the range and direction to which the object is to be limited, each limit trigger having a direction of movement that inhibits passage.
Preferably, the object limitation constraint conditions are set in step 1) as follows: a state trigger is provided for determining whether the object is in a constrained state.
Preferably, in step 1), the relationship between the virtual hand and the tracking positioning point of the interactive device is set as follows: a virtual hand calculation and management class HandAnchor is established in the virtual environment, whose position and posture directly follow the hand motion of the real-world user; a child object HandOffset is created under the HandAnchor node, whose position and posture represent the position and posture of the virtual hand; when a freely moving object is operated, the positions of HandAnchor and HandOffset coincide, i.e. the virtual hand and the operated object completely follow the real-world hand motion; when an object in the constrained state is operated, the positions of HandAnchor and HandOffset do not coincide, and within the constrained region the virtual hand and the operated object cannot follow the real-world hand motion.
Preferably, in the step 5), the virtual hand posture is adjusted as follows: firstly, acquiring hanging point position information of an operated object, and converting the hanging point position information into values under a world coordinate system, wherein the values comprise positions and postures; then, acquiring position information of HandOffset, and converting the position information into a value under a world coordinate; and finally, calculating a HandOffset intermediate value by using a linear interpolation algorithm for the HandOffset under the world coordinates, and gradually moving the virtual hand to the hanging point position through assignment operation.
Preferably, the motion trend of the hand in the next frame is calculated in step 6) as follows: the position of the virtual palm in the previous frame is retained, and the previous-frame position is subtracted from the current-frame position; the resulting vector is the motion trend of the virtual palm in the current frame, including direction and distance.
Preferably, in step 7), the object in the next frame is judged to be in the motion-constrained state as follows: the object is currently in the constrained state, the obstacle constraining the motion is already touching a limit trigger, and the angle between the motion trend of the virtual hand and the prohibited motion direction of that limit trigger is smaller than 90 degrees. The angle test is performed by taking the dot product of the virtual hand's motion trend vector and the limit trigger's prohibited motion direction vector and checking its sign: if the dot product is greater than 0, the angle between the two vectors is smaller than 90 degrees.
Preferably, in step 9), the virtual hand releases the object as follows: after the grabbing operation is finished, the virtual hand can actively release the object; when the mismatch between the virtual hand and the tracking positioning point makes their distance exceed a limit value, the virtual hand passively releases the object.
Due to the adoption of the above technical scheme, the invention has the following technical advantages: it correctly handles the simulation of a user, who is unconstrained in reality, grabbing an object under complex motion constraints in the virtual environment; it ensures that the operated object in the virtual space satisfies the mutual-blocking constraints between entities during interactive operation; it avoids both the problem of the grabbed object passing through its constraint limits and the problem of jumps during grabbing; and it enhances the realism of interactive operation in the virtual environment. The method is suitable for continuous operation simulation, with simple interactive devices, of objects that can move with multiple degrees of freedom within a constrained range in a virtual environment.
Drawings
FIG. 1 is a flow chart of the technique for operating a complex-motion-constrained object with a virtual hand;
FIG. 2 is an exemplary diagram of a flip-flop setup;
FIG. 3 is a flow chart of virtual grab object determination and virtual hand position adjustment;
FIG. 4 is a schematic diagram of a virtual hand movement trend calculation;
FIG. 5 is a schematic diagram illustrating the determination of the motion trend of the object.
Detailed Description
The invention is described in detail below with reference to the figures and examples.
In a virtual reality scenario, a user wears a virtual reality helmet, an operating handle, or a data glove (hereinafter, an operating handle) with position and posture tracking. The user moves the operation handle according to the virtual scene image observed by eyes, and operates the object limited to move in a certain range in the virtual environment. The technical flow for implementing the interactive operation is shown in fig. 1.
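To make the flow of fig. 1 concrete, the following is a minimal per-frame sketch in Python of steps 3 to 8. It is illustrative only: it omits the trigger bookkeeping and the pose adjustment of step 5, and the names GRAB_DISTANCE and RELEASE_DISTANCE stand in for the "set value" and the release limit, which the method leaves as configurable parameters.

```python
import numpy as np

GRAB_DISTANCE = 0.05      # assumed "set value" for the grab test (step 4)
RELEASE_DISTANCE = 0.30   # assumed limit at which the virtual hand lets go

def frame_update(tracked_palm_pos, prev_palm_pos, hang_point,
                 object_constrained, forbidden_dir):
    """One frame of steps 3-8. tracked_palm_pos is where the handle's tracker
    would place the palm this frame; forbidden_dir is the prohibited direction
    of the limit trigger the obstacle currently touches (None if no contact).
    Returns (palm position actually used this frame, whether the hand releases)."""
    tracked = np.asarray(tracked_palm_pos, dtype=float)
    prev = np.asarray(prev_palm_pos, dtype=float)
    hang = np.asarray(hang_point, dtype=float)

    if not object_constrained:                              # step 3: free object
        return tracked, False                               # hand and object follow the tracker

    if np.linalg.norm(tracked - hang) >= GRAB_DISTANCE:     # step 4: not close enough to grab
        return tracked, False

    trend = tracked - prev                                   # step 6: motion trend vector
    blocked = forbidden_dir is not None and \
        float(np.dot(trend, np.asarray(forbidden_dir, dtype=float))) > 0.0   # step 7

    if blocked:                                              # step 8: hand stops moving
        released = float(np.linalg.norm(tracked - prev)) > RELEASE_DISTANCE
        return prev, released
    return tracked, False                                    # step 8: hand keeps moving
```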
Step 1: initial setting
First, the relationship between the virtual hand and the tracking positioning point of the interactive device is set. A virtual hand calculation and management class, HandAnchor, is established in the virtual environment; its position and posture are driven directly by the tracking data of the operating handle. A child object, HandOffset, is created under the HandAnchor node; the position and posture of HandOffset represent the position and posture of the virtual hand. When a freely moving object is operated, HandAnchor and HandOffset coincide, i.e. the virtual hand and the operated object completely follow the real-world hand motion. When an object in the constrained state is operated, the virtual hand and the operated object are constrained, HandAnchor and HandOffset no longer coincide, and within a certain range the real-world hand motion cannot drive the constrained virtual hand.
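A minimal sketch of this two-level structure follows, assuming simple position/quaternion poses; the class and field names beyond HandAnchor and HandOffset are illustrative, not prescribed by the method.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)   # quaternion (x, y, z, w)

@dataclass
class HandAnchor:
    """Driven directly by the handle's tracking data every frame."""
    pose: Pose = field(default_factory=Pose)

@dataclass
class HandOffset:
    """Child of HandAnchor; its local pose is the pose of the visible virtual hand
    relative to the anchor. A zero offset means the hand follows the handle exactly."""
    local_pose: Pose = field(default_factory=Pose)

# Free object:        local_pose stays at the identity, so the hand follows the tracker.
# Constrained object: local_pose is adjusted each frame so the hand can lag behind
#                     the tracker instead of passing through the constraint.
```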
Second, object limit constraints are set. The movement of the operated virtual object is limited by other objects in the virtual environment within a certain range, and the limited range and direction are defined by designing a plurality of limiting triggers. The position, size and number of the limiting triggers depend on the range which can not be moved; the Z-axis orientation of each limit trigger depends on the direction of motion that cannot be passed through. In addition, a state trigger is added for judging whether the object is in a constraint state.
For example, consider a square ring-shaped object whose motion is limited by a central post, as shown in fig. 2. Four limit triggers (1) and one state trigger (2) are added: the limit triggers (1) lie on the four edges of the ring, the post is the obstacle (3), and the Z axis (4) of each limit trigger points from inside to outside, meaning that when the post touches an edge, motion outward across that edge is forbidden. The state trigger (2) lies inside the square without touching any edge; if the post is inside the state trigger, the post is inside the square, i.e. the object is constrained.
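A possible data layout for the triggers of this example is sketched below, assuming axis-aligned boxes and numpy vectors; the concrete names, shapes, and numeric sizes are illustrative assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class LimitTrigger:
    center: np.ndarray          # placed on one edge of the square ring
    forbidden_dir: np.ndarray   # local Z axis: motion direction that may not pass

@dataclass
class StateTrigger:
    center: np.ndarray
    half_size: np.ndarray       # axis-aligned box filling the inside of the ring

    def contains(self, point: np.ndarray) -> bool:
        return bool(np.all(np.abs(point - self.center) <= self.half_size))

# Fig. 2 layout: four limit triggers, one per edge, each with forbidden_dir pointing
# from the inside of the ring outward, plus one state trigger around the post.
limit_triggers = [
    LimitTrigger(np.array([0.0,  0.5, 0.0]), np.array([0.0,  1.0, 0.0])),
    LimitTrigger(np.array([0.0, -0.5, 0.0]), np.array([0.0, -1.0, 0.0])),
    LimitTrigger(np.array([ 0.5, 0.0, 0.0]), np.array([ 1.0, 0.0, 0.0])),
    LimitTrigger(np.array([-0.5, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])),
]
state_trigger = StateTrigger(np.zeros(3), np.array([0.45, 0.45, 0.1]))
```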
And then establishing a hanging point of the object and the hand standard position. In order to ensure that the virtual palm is attached to the grasped object when the virtual hand grasps the object, a standard position hanging point of the object and the hand needs to be established and used as a reference standard, the hanging point is located on the object, and when the position of the virtual palm is completely overlapped with the position of the hanging point, the virtual palm is perfectly attached to the object.
The hanging point is established as follows: in the editing state, the virtual palm and the object are placed at the same hierarchy level and adjusted to a matching pose; an empty object is then created as the hanging point, with the same position as the virtual palm, and is placed under the object in the hierarchy as a child of the object, so that the relative pose of the hanging point and the object remains fixed.
Step 2: start to grab
The user holds the operating handle and moves, presses the button on the handle, and controls the virtual hand to open and grasp, so as to realize the grabbing action of the virtual object.
Step 3: Object constraint state determination
If the obstacle constraining the motion is in contact with the state trigger, the object is in the constrained state; otherwise the object is in a free motion state. If the object is in the constrained state, grabbing proceeds according to the following steps of this method; if the object is in a free motion state, the virtual object is directly attached to the virtual hand and moves completely with it.
Step 4: Determination of object grabbing
This step judges whether the virtual hand has grasped the object. If it has, the virtual hand is adjusted to a suitable position and posture so that the hand shape matches the object, and the virtual hand is attached to the virtual object to complete the grasping effect, avoiding the jump of a constrained virtual object at the moment it is grasped.
The principle of judging whether the virtual hand grabs the object is shown in fig. 3: after the program receives the grab command in the constrained state, it calculates the distance between the virtual hand and the hanging point of the operated object; if the distance is smaller than a set value, the operated object is within the grabbing range, and the position of the virtual palm is adjusted according to the method of the next step so that the palm attaches to the object and the object is grabbed without jumping.
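A minimal sketch of this distance test, with GRAB_THRESHOLD standing in for the unspecified set value:

```python
import numpy as np

GRAB_THRESHOLD = 0.05   # assumed set value, in scene units

def is_grab_successful(palm_pos, hang_point_pos, threshold=GRAB_THRESHOLD):
    """Step 4: the grab succeeds when the virtual palm is close enough to the
    hanging point of the operated object."""
    gap = np.linalg.norm(np.asarray(hang_point_pos, float) - np.asarray(palm_pos, float))
    return bool(gap < threshold)
```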
Step 5: Virtual hand position adjustment
First, the hanging point pose of the operated object is obtained and converted into values in the world coordinate system, including position and posture. Next, the pose of HandOffset is obtained and converted into world-coordinate values. Finally, Lerp linear interpolation is applied to HandOffset in world coordinates to compute intermediate values, and the virtual hand is moved gradually and continuously to the hanging point by assignment, achieving a continuous, jump-free grab. (Lerp linear interpolation provides smooth transitions for integers, floats, vectors and quaternions; its parameters are a current value (From), a target value (To) and a transition time (Time), and it smoothly changes the current value toward the target value over the transition time.)
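A minimal sketch of the per-frame interpolation for the position component; the posture would be blended analogously (e.g. with a quaternion slerp), and the duration value is an assumed transition time rather than one specified by the method.

```python
import numpy as np

def lerp(current, target, t):
    """Linear interpolation from a current value (From) toward a target value (To)."""
    t = float(np.clip(t, 0.0, 1.0))
    return np.asarray(current, float) + (np.asarray(target, float) - np.asarray(current, float)) * t

def move_palm_toward_hang_point(hand_offset_world_pos, hang_point_world_pos, dt, duration=0.2):
    """One frame of the smooth attachment: nudge the HandOffset world position a
    fraction of the way toward the hanging point of the operated object."""
    return lerp(hand_offset_world_pos, hang_point_world_pos, dt / duration)
```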
Step 6: motion trend calculation
The motion trend of the hand is calculated by comparing the position of the virtual palm in the previous frame with its position in the current frame; before the trend is turned into actual motion, its validity is judged by the method of the next step. The motion trend is computed as follows: the previous-frame position of the virtual palm is retained, and the previous-frame position is subtracted from the current-frame position; the resulting vector is the motion trend of the virtual palm in this frame, including direction and distance, as shown in fig. 4.
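A minimal sketch of this frame-difference computation:

```python
import numpy as np

def motion_trend(current_palm_pos, previous_palm_pos):
    """Step 6: trend = current-frame position minus previous-frame position.
    Returns the trend vector together with its direction and length (distance)."""
    trend = np.asarray(current_palm_pos, float) - np.asarray(previous_palm_pos, float)
    distance = float(np.linalg.norm(trend))
    direction = trend / distance if distance > 0.0 else trend
    return trend, direction, distance
```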
Step 7: Judgment of motion trend validity
Three conditions must all hold for the motion trend to be judged as constrained: first, the object is currently in the constrained state; second, the obstacle constraining the motion is already touching a limit trigger; third, the angle between the motion trend of the hand and the Z axis of the limit trigger touched by the obstacle is smaller than 90 degrees. Whether the angle between the two vectors exceeds 90 degrees is determined by the sign of their dot product: if the dot product is greater than 0, the angle is between 0 and 90 degrees; if it is less than 0, the angle is between 90 and 180 degrees. If all three conditions are satisfied, the object is constrained and cannot move.
The judgment principle is illustrated by the case in fig. 5. The black circle represents a rod-shaped obstacle, which carries a state trigger used to detect whether it is in contact with the frame. The arrow on each side of the frame represents the limiting direction of that side, i.e. the Z-axis orientation of that side's limit trigger. The displacement from the thick black frame 1 to the dashed frame 2 is the displacement commanded by the real interaction device, i.e. the expected displacement of the frame in the virtual environment, while the thin black frame 3 is the actual virtual displacement computed after being blocked by the circular obstacle. An expected displacement is computed every frame, but the frame cannot actually move by that amount because of the rod-shaped obstacle: if the frame is in contact with the obstacle and the motion points in the limited direction of the touched side, the motion is blocked.
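A minimal sketch of the three-condition validity test, where forbidden_dir is the prohibited (Z-axis) direction of the limit trigger the obstacle currently touches, or None if it touches none:

```python
import numpy as np

def trend_is_blocked(object_constrained, forbidden_dir, hand_trend):
    """Step 7: the motion trend is judged invalid (blocked) only when all three
    conditions hold: the object is constrained, the obstacle touches a limit
    trigger, and the trend points into that trigger's forbidden direction
    (dot product > 0, i.e. angle smaller than 90 degrees)."""
    if not object_constrained:        # condition 1
        return False
    if forbidden_dir is None:         # condition 2: no limit trigger is touched
        return False
    dot = float(np.dot(np.asarray(hand_trend, float), np.asarray(forbidden_dir, float)))
    return dot > 0.0                  # condition 3
```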
Step 8: Virtual hand position and pose setting
According to the judgment result of the previous step, the virtual hand either can or cannot move, and its position and posture are set for each case:
the virtual hand can move: if the hand can move, the value of HandOffset is not changed, the virtual hand is made to move along with the real world operation handle, and the virtual hand and the real hand are kept relatively still.
Virtual hand cannot move: if the hand cannot move, the value of HandOffset is adjusted dynamically according to the motion trend (see step 5, "Virtual hand position adjustment") so that the virtual hand does not follow the tracking positioning point of the operating handle. At the same time, the distance between the virtual hand and the tracking positioning point is computed and the user is reminded, e.g. by vibration, that the position and posture of the virtual hand and of the tracking positioning point no longer match; the larger the distance, the higher the vibration frequency and intensity, and when the distance exceeds a preset value, the virtual hand automatically releases the object.
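A minimal sketch of this branch; the controller.vibrate call and the RELEASE_DISTANCE value are assumptions standing in for the handle's actual vibration API and the preset release distance.

```python
import numpy as np

RELEASE_DISTANCE = 0.30   # assumed preset value at which the hand lets go

def constrained_hand_update(anchor_pos, frozen_palm_pos, controller):
    """Step 8, "hand cannot move" branch: keep the palm where it is, scale the
    handle vibration with the anchor-to-palm distance, and report whether the
    distance has exceeded the preset value so the object should be released.
    controller.vibrate is a hypothetical wrapper around the handle's API."""
    gap = float(np.linalg.norm(np.asarray(anchor_pos, float) - np.asarray(frozen_palm_pos, float)))
    strength = min(gap / RELEASE_DISTANCE, 1.0)              # grows with the distance
    controller.vibrate(amplitude=strength, frequency=50.0 + 150.0 * strength)
    return gap > RELEASE_DISTANCE                            # True -> auto-release the object
```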
Step 9: Releasing the object
There are two situations that trigger releasing the object: first, the user actively puts the object down; second, the distance between the virtual hand and the tracking positioning point of the operating handle grows until it exceeds a preset value.
When the object is released, the Lerp linear interpolation algorithm is used to gradually return HandOffset to its initial value, so that the virtual hand smoothly returns to the tracking positioning point of the operating handle and its position and posture again match that point.

Claims (9)

1. An object grabbing interaction method under complex motion constraints in virtual reality, characterized by comprising the following steps: 1) setting the relationship between the virtual hand and the tracking positioning point of the interactive device, setting the object limitation constraint conditions, and establishing a standard-position hanging point of the object and the hand; 2) controlling the virtual hand to grab the object; 3) judging whether the object is currently in a constrained state or a free, unconstrained state; 4) if the object is in the constrained state, judging that the virtual hand grabs the object when the distance between the virtual hand and the object is smaller than a set value, and then proceeding to step 5); if the object is in the free, unconstrained state, attaching the object directly to the virtual hand so that the virtual object moves completely with the virtual hand, and the grabbing interaction is completed; 5) after the virtual hand grabs the object, displaying a suitable grabbing gesture by adjusting the posture of the virtual hand; 6) calculating the motion trend of the hand in the next frame in real time according to the positions of the grabbed object in the previous frame and the current frame; 7) judging, according to the motion trend, whether the object in the next frame is in the motion-constrained state or the free-motion state; 8) according to the judgment result, stopping the motion of the virtual hand when the object in the next frame is in the motion-constrained state, and continuing to move the virtual hand when the object in the next frame is in the free-motion state; 9) releasing the object from the virtual hand.
2. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 1, characterized in that in step 1) the object limitation constraint conditions are set as follows: a plurality of limit triggers are set in the range and directions in which the object is limited, each limit trigger having one motion direction through which passage is prohibited.
3. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 2, characterized in that in step 1) the object limitation constraint conditions are set as follows: a state trigger is set for judging whether the object is in the constrained state.
4. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 1, characterized in that in step 1) the relationship between the virtual hand and the tracking positioning point of the interactive device is set as follows: a virtual hand calculation and management class HandAnchor is established in the virtual environment, whose position and posture directly follow the real-world hand motion of the user; a child object HandOffset is created under the HandAnchor node, whose position and posture represent the position and posture of the virtual hand; when a freely moving object is operated, the positions of HandAnchor and HandOffset coincide, i.e. the virtual hand and the operated object completely follow the real-world hand motion; when an object in the constrained state is operated, the positions of HandAnchor and HandOffset do not coincide, and within the constrained region the virtual hand and the operated object cannot follow the real-world hand motion.
5. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 4, characterized in that in step 5) the virtual hand posture is adjusted as follows: first, the hanging point pose of the operated object is obtained and converted into values in the world coordinate system, including position and posture; next, the pose of HandOffset is obtained and converted into world-coordinate values; finally, a linear interpolation algorithm is applied to HandOffset in world coordinates to compute intermediate values, and the virtual hand is moved gradually to the hanging point by assignment.
6. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 1, characterized in that in step 6) the motion trend of the hand in the next frame is calculated as follows: the previous-frame position of the virtual palm is retained, and the previous-frame position is subtracted from the current-frame position; the resulting vector is the motion trend of the virtual palm in this frame, including direction and distance.
7. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 2, characterized in that in step 7) the object in the next frame is judged to be in the motion-constrained state as follows: the object is currently in the constrained state, the obstacle constraining the motion is already touching a limit trigger, and the angle between the motion trend of the virtual hand and the prohibited motion direction of the limit trigger touched by the obstacle is smaller than 90 degrees.
8. The object grabbing interaction method under complex motion constraints in virtual reality according to claim 7, characterized in that the angle between the motion trend of the virtual hand and the prohibited motion direction of the limit trigger touched by the obstacle is judged to be smaller than 90 degrees as follows: the dot product of the virtual hand's motion trend vector and the limit trigger's prohibited motion direction vector is computed, and the judgment is made by the sign of the result; if the dot product is greater than 0, the angle between the two vectors is smaller than 90 degrees.
9. The object grabbing interaction method under complex motion constraints in virtual reality according to any one of claims 1 to 8, characterized in that in step 9) the virtual hand releases the object as follows: the virtual hand can actively release the object; when the mismatch between the virtual hand and the tracking positioning point makes their distance exceed a limit value, the virtual hand passively releases the object.
CN201811479678.7A 2018-12-05 2018-12-05 Object grabbing interaction method under complex motion constraint in virtual reality Active CN109669538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811479678.7A CN109669538B (en) 2018-12-05 2018-12-05 Object grabbing interaction method under complex motion constraint in virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811479678.7A CN109669538B (en) 2018-12-05 2018-12-05 Object grabbing interaction method under complex motion constraint in virtual reality

Publications (2)

Publication Number Publication Date
CN109669538A CN109669538A (en) 2019-04-23
CN109669538B true CN109669538B (en) 2021-06-04

Family

ID=66144729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811479678.7A Active CN109669538B (en) 2018-12-05 2018-12-05 Object grabbing interaction method under complex motion constraint in virtual reality

Country Status (1)

Country Link
CN (1) CN109669538B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112000228B (en) * 2020-09-04 2024-04-05 河北大学 Method and system for controlling movement in immersive virtual reality
CN116110535B (en) * 2023-04-13 2023-08-15 北京康爱医疗科技股份有限公司 Breathing biofeedback method based on virtual reality, feedback equipment and storage medium
CN116449963A (en) * 2023-06-15 2023-07-18 沙核科技(北京)有限公司 Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment
CN118710856B (en) * 2024-09-02 2024-11-26 江西科骏实业有限公司 VR scene anti-threading method and system based on virtual proxy movement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101426446A (en) * 2006-01-17 2009-05-06 马科外科公司 Apparatus and method for haptic rendering
CN104867177A (en) * 2014-12-23 2015-08-26 上海电机学院 Parallel collision detection method based on bounding box tree method
CN106371573A (en) * 2015-12-04 2017-02-01 北京智谷睿拓技术服务有限公司 Tactile feedback method and apparatus, and virtual reality interaction system
CN107430437A (en) * 2015-02-13 2017-12-01 厉动公司 The system and method that real crawl experience is created in virtual reality/augmented reality environment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10209775B2 (en) * 2015-11-09 2019-02-19 Facebook Technologies, Llc Using a magnetic actuation mechanism to provide tactile feedback to a user interacting with a virtual environment
US10281982B2 (en) * 2016-10-17 2019-05-07 Facebook Technologies, Llc Inflatable actuators in virtual reality

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101426446A (en) * 2006-01-17 2009-05-06 马科外科公司 Apparatus and method for haptic rendering
CN104867177A (en) * 2014-12-23 2015-08-26 上海电机学院 Parallel collision detection method based on bounding box tree method
CN107430437A (en) * 2015-02-13 2017-12-01 厉动公司 The system and method that real crawl experience is created in virtual reality/augmented reality environment
CN106371573A (en) * 2015-12-04 2017-02-01 北京智谷睿拓技术服务有限公司 Tactile feedback method and apparatus, and virtual reality interaction system

Also Published As

Publication number Publication date
CN109669538A (en) 2019-04-23

Similar Documents

Publication Publication Date Title
CN109669538B (en) Object grabbing interaction method under complex motion constraint in virtual reality
CN104640677B (en) Train and operate industrial robots
JP6722624B2 (en) Control method of human-computer interaction and its operation
JP5246672B2 (en) Robot system
US6072466A (en) Virtual environment manipulation device modelling and control
US6529210B1 (en) Indirect object manipulation in a simulation
CN105589553A (en) Gesture control method and system for intelligent equipment
WO2018196552A1 (en) Method and apparatus for hand-type display for use in virtual reality scene
CN106313049A (en) Somatosensory control system and control method for apery mechanical arm
JP2014510336A5 (en)
CN107132917A (en) For the hand-type display methods and device in virtual reality scenario
Matsas et al. Effectiveness and acceptability of a virtual environment for assessing human–robot collaboration in manufacturing
TWI707251B (en) Operating method of interacting with a virtual reality for depending on a wearable device and operating device thereof
CN115635482B (en) Vision-based robot-to-human object transfer method, device, medium and terminal
Toh et al. Dexterous telemanipulation with a multi-touch interface
KR102438347B1 (en) Smart wearable devices and smart wearable equipment
Regenbrecht et al. A robust and intuitive 3D interface for teleoperation of autonomous robotic agents through immersive virtual reality environments
CN105630176B (en) A kind of method and device of intelligence motion sensing control
JP2000308985A (en) Robot teaching method and teaching system
Katsuki et al. Development of fast-response master-slave system using high-speed non-contact 3D sensing and high-speed robot hand
Bolano et al. Advanced usability through constrained multi modal interactive strategies: the cookiebot
CN112306234B (en) Hand operation identification method and system based on data glove
Sato et al. Tele-Operation of a Legged Robot by a Virtual Marionette System-First report: The first prototype and the usefulness of the reaching task
Koh et al. Enhancing the robot avateering metaphor discreetly with an assistive agent and its effect on perception
CN117908664A (en) Object selection method, device and storage medium for virtual reality scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant