CN110355750A - Interactive control method for teleoperation hand-eye coordination - Google Patents
- Publication number
- CN110355750A (application CN201811270020.5A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- camera
- world coordinate
- hand
- matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of leader-follower type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention discloses an interactive control method for hand-eye coordination in teleoperation, which solves the technical problem of the poor practicality of existing human-computer interaction control methods. The technical solution is based on coordinate-system transformation: when the operator controls the motion of a scene object through an interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding new pose coordinates in the camera frame; finally, the new pose coordinates are transformed from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the operator's viewing angle and remains consistent with the motion of the interaction device, which reduces the difficulty of teleoperation and gives good practicality.
Description
Technical Field
The invention relates to a human-computer interaction control method, and in particular to an interactive control method for hand-eye coordination in teleoperation.
Background Art
The document "Master-slave control strategy of a teleoperation robot system, Journal of Jiangsu University of Science and Technology (Natural Science Edition), 2013, Vol. 27(8), pp. 643-647" discloses a control method for a master-slave teleoperation robot system. The method adopts incremental position control, using increments of the master hand to control the motion of the slave hand, which effectively avoids the cumbersome initial return to the origin. Because the master-slave position correspondence differs between the case where the operator observes the operated environment directly and the case where the operator observes it through an imaging device, the method adjusts a proportional control gain matrix to match the coordinates of the master hand with the actual environment or with the imaging device; the resulting workspace mapping between the master and slave ends gives a good visual sense of presence.
However, the method described in this document only distinguishes between direct observation of the environment and observation through an imaging device, and simply establishes the workspace mapping between the master hand and the actual environment or the imaging device through a proportional control gain coefficient. It gives no way to establish the master-slave workspace mapping when the operator's viewing angle on the operated environment changes, so the mapping is not invariant to viewing-angle changes and the motions do not remain consistent; its scope of application is narrow and it is difficult to operate.
Summary of the Invention
To overcome the poor practicality of existing human-computer interaction control methods, the present invention provides an interactive control method for hand-eye coordination in teleoperation. The method is based on coordinate-system transformation: when the operator controls the motion of a scene object through an interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding new pose coordinates in the camera frame; finally, the new pose coordinates are transformed from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the operator's viewing angle and remains consistent with the motion of the interaction device, which reduces the difficulty of teleoperation and gives good practicality.
The technical solution adopted by the present invention to solve this technical problem is an interactive control method for hand-eye coordination in teleoperation, characterized by the following steps:
Step 1: Interaction-device data acquisition. During the motion of the hand controller, its real-time position P_c^n is sampled at equal time intervals, and the hand-controller increment ΔP is obtained as the difference between the current sample P_c^n and the pose P_c^(n-1) sampled at the previous instant. The increment ΔP is then mapped to the initial motion command ΔP_0 of the manipulator end, where k is the operating scale factor:

ΔP_0 = k·ΔP  (1)
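As a sketch, Step 1 amounts to differencing consecutive samples and scaling. The function name and the default value of k below are illustrative (the embodiment later uses k = 1000 to convert the controller's meters into the scene's millimeters):

```python
import numpy as np

def acquire_increment(p_curr, p_prev, k=1000.0):
    """Map a hand-controller increment to an initial motion command (Eq. 1).

    p_curr, p_prev: hand-controller positions sampled at consecutive
    equal time intervals (meters); k: operating scale factor (also
    converts meters to the manipulator's millimeter units here).
    """
    delta_p = np.asarray(p_curr, float) - np.asarray(p_prev, float)
    return k * delta_p  # initial motion command ΔP_0

# Example: the controller end moves up by 0.1 m between two samples.
dp0 = acquire_increment([0.0, 0.0, 0.1], [0.0, 0.0, 0.0])  # -> [0, 0, 100]
```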
Step 2: Using the transformation matrix R_x from the interaction coordinate system to the camera coordinate system, transform the initial motion command ΔP_0 to obtain the motion command ΔP_1 in the camera coordinate system:

ΔP_1 = ΔP_0·R_x  (2)
Step 3: Using the pose matrix C of the camera coordinate system in the world coordinate system, transform the motion command ΔP_1 in the camera coordinate system to obtain the motion command ΔP_2 in the world coordinate system.

During interaction, the view observed by the operator is determined by the pose of the camera in the world coordinate system, and the camera's pose matrix is determined by three elements: the direction of the camera's line of sight, the position of the camera center, and the camera's up orientation. These three elements fix the camera's pose in the world coordinate system. When the viewing angle changes, what actually changes is the camera's pose in the world coordinate system, and the orientation of the camera coordinate system in the world coordinate system changes with it.

Transforming the motion obtained in Step 2 from the camera coordinate system into the world coordinate system requires the inverse C^(-1) of the camera matrix. In addition, since the increment of the interaction device is a 1x3 matrix, a 1x4 matrix D must be constructed:

D = [ΔP_1 0]  (4)

ΔP_2 = D·C^(-1)  (5)
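Steps 2 and 3 reduce to the two matrix products of equations (2), (4) and (5), applied to a row vector. The sketch below assumes NumPy and placeholder values for R_x and C; setting the homogeneous component to 0 in equation (4) makes the increment a pure direction, so the camera's translation drops out when multiplying by C^(-1):

```python
import numpy as np

def to_world(dp0, R_x, C):
    """Transform an interaction-frame increment into the world frame.

    dp0: 1x3 initial motion command; R_x: 3x3 interaction-to-camera
    rotation; C: 4x4 camera pose matrix (row-vector convention, Eq. 5).
    """
    dp1 = np.asarray(dp0, float) @ R_x        # Eq. (2): camera frame
    D = np.append(dp1, 0.0)                   # Eq. (4): w = 0, a direction
    dp2 = D @ np.linalg.inv(C)                # Eq. (5): world frame
    return dp2
```

Because the fourth component of D is zero, a camera matrix that differs only in position yields the same world-frame increment, which is exactly the behavior the method needs.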
Step 4: Map the motion command ΔP_2 in the world coordinate system to the final motion amount ΔP_3 of the manipulator end through motion mapping, generating the teleoperation command for the whole interaction.

Expanding the world-frame motion command ΔP_2 obtained in Step 3 shows that it is a 1x4 matrix whose first three elements are the final motion of the scene-object end; the motion mapping takes these first three elements of ΔP_2 to form ΔP_3, written simply as

ΔP_3 = [Δx Δy Δz]  (6)

where Δx, Δy and Δz are the motion amounts of the scene-object end along the three coordinate axes of the world coordinate system. This completes the generation of the teleoperation command that realizes hand-eye coordination.
Step 5: Drive the manipulator with the teleoperation command generated in Step 4 to achieve hand-eye coordination.

Let the real-time position of the scene-object end be P_j^n and its position at the next instant be P_j^(n+1); then

P_j^(n+1) = P_j^n + ΔP_3

Driving the scene object through this real-time end position realizes hand-eye coordination during teleoperation.
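Taken together, Steps 1-5 form one control cycle per sample. A minimal end-to-end sketch (the function name and default k are illustrative; R_x and C come from the interaction-device mounting and the current camera view):

```python
import numpy as np

def teleop_step(p_curr, p_prev, pj, R_x, C, k=1000.0):
    """One hand-eye-coordinated control cycle (Steps 1-5).

    p_curr/p_prev: consecutive hand-controller samples; pj: current
    scene-object end position; returns the next end position.
    """
    dp0 = k * (np.asarray(p_curr, float) - np.asarray(p_prev, float))  # Step 1
    dp1 = dp0 @ R_x                                # Step 2: camera frame
    dp2 = np.append(dp1, 0.0) @ np.linalg.inv(C)   # Step 3: world frame
    dp3 = dp2[:3]                                  # Step 4: motion mapping
    return np.asarray(pj, float) + dp3             # Step 5: drive the end
```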
The beneficial effects of the invention are as follows. The method is based on coordinate-system transformation: when the operator controls the motion of a scene object through the interaction device, the pose of the scene object is first transformed from the world coordinate system into the camera coordinate system; the increment of the interaction device is then added to the expected motion in the camera coordinate system, yielding new pose coordinates in the camera frame; finally, the new pose coordinates are transformed from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination during teleoperation: the motion of the scene object is unaffected by changes of the operator's viewing angle and remains consistent with the motion of the interaction device, which reduces the difficulty of teleoperation and gives good practicality.
The present invention is described in detail below with reference to specific embodiments.
Detailed Description
To describe the invention in detail, the definitions of three commonly used coordinate systems must first be set out:
(1) World coordinate system O_world: the world coordinate system is the reference coordinate system of the whole system and is used to describe the actual motion of object models in the scene; it is equivalent to a base coordinate system, and the motion of scene objects is described with respect to it.
(2) Camera coordinate system O_camera: the camera coordinate system is used to describe the teleoperation view. When the operator's viewing angle changes, what actually changes is the camera's pose in the world coordinate system: the pose of the camera coordinate system in the world coordinate system changes, but its orientation relative to the computer screen does not.
(3) Interaction coordinate system (i.e. the hand-controller coordinate system) O_interaction: the interaction-device coordinate system is used to describe the motion of the interaction device during teleoperation. When the viewing angle changes, the pose of the interaction device in the world coordinate system changes, but its orientation relative to the computer screen does not.
To verify the effectiveness of the proposed hand-eye-coordinated teleoperation technique, the invention was demonstrated and verified in simulation by controlling the motion of an IRB120 manipulator in a virtual scene, using the 3D graphics development environment OSG (OpenSceneGraph) together with a Novint Falcon hand controller as the interaction tool. The specific embodiment is as follows:
Step 1: Interaction-device data acquisition. Taking upward motion of the hand controller as an example, the real-time position P_c^n of the hand controller is sampled at equal time intervals during its motion; the increment ΔP is the difference between the current sample P_c^n and the pose P_c^(n-1) sampled at the previous instant; the increment is mapped to the initial motion command ΔP_0 of the manipulator end according to ΔP_0 = k·ΔP, where k is the operating scale factor. The motion of the hand-controller end is measured in meters and that of the manipulator end in the virtual scene in millimeters, so the operating scale factor is k = 1000. When the hand-controller end moves upward by 0.1 m, the initial motion command is

ΔP_0 = k·ΔP = [0 0 100]  (1)

Step 2: Using the transformation matrix R_x from the interaction coordinate system to the camera coordinate system, transform the initial motion command ΔP_0 to obtain the motion command ΔP_1 in the camera coordinate system.

Analysis of the positional relationship between the interaction-device coordinate system and the camera coordinate system shows that when the viewing angle changes, their relative orientation in the world coordinate system does not change: rotating the interaction-device coordinate system by 90 degrees about its X axis brings it into alignment with the camera coordinate system.
Step 3: Using the pose matrix C of the camera coordinate system in the world coordinate system, transform the motion command ΔP_1 in the camera coordinate system to obtain the motion command ΔP_2 in the world coordinate system.

When the scene is viewed head-on, the pose matrix of the camera in the world coordinate system is C.

Since the increment of the interaction device is a 1x3 matrix, a 1x4 matrix D is constructed:

D = [ΔP_1 0] = [0 100 0 0]  (4)

ΔP_2 = D·C^(-1) = [0 0 100 0]  (5)
Step 4: Map the motion command ΔP_2 in the world coordinate system to the final motion amount ΔP_3 of the manipulator end through motion mapping, generating the teleoperation command for the whole interaction:

ΔP_3 = [Δx Δy Δz] = [0 0 100]  (6)

where Δx, Δy and Δz are the motion amounts of the scene-object end along the three coordinate axes of the world coordinate system. This completes the generation of the teleoperation command that realizes hand-eye coordination.
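The numbers in this walkthrough can be reproduced with a concrete choice of R_x and C. Both matrices below are assumptions chosen to match the stated conventions (a 90-degree rotation about X applied to a row vector, and a head-on camera whose screen-up direction is the world Z axis); the patent itself elides the actual matrix C:

```python
import numpy as np

# Interaction -> camera: 90 degrees about X, row-vector convention (Eq. 2).
c, s = 0.0, 1.0  # cos 90°, sin 90°
R_x = np.array([[1.0, 0.0, 0.0],
                [0.0,   c,  -s],
                [0.0,   s,   c]])

# Assumed head-on camera matrix C: screen-right is world X, screen-up is
# world Z (the translation part is irrelevant here because D has w = 0).
C = np.array([[1.0, 0.0,  0.0, 0.0],
              [0.0, 0.0, -1.0, 0.0],
              [0.0, 1.0,  0.0, 0.0],
              [0.0, 0.0,  0.0, 1.0]])

dp0 = np.array([0.0, 0.0, 100.0])  # Eq. (1): controller moved up 0.1 m
dp1 = dp0 @ R_x                    # Eq. (2) -> [0, 100, 0]
D = np.append(dp1, 0.0)            # Eq. (4) -> [0, 100, 0, 0]
dp2 = D @ np.linalg.inv(C)         # Eq. (5) -> [0, 0, 100, 0]
dp3 = dp2[:3]                      # Eq. (6) -> [0, 0, 100]
```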
Step 5: Drive the manipulator with the teleoperation command generated in Step 4 to achieve hand-eye coordination. Let the real-time position of the scene-object end be P_j^n and its position at the next instant be P_j^(n+1); the next position is obtained by adding the final motion amount ΔP_3 to the current one.
This embodiment is based on interactive control of the IRB120 manipulator. To carry out Step 5, a 3D model of the IRB120 manipulator is first built with tools such as 3D MAX to construct the virtual scene; the kinematic model of the manipulator is then established on the basis of the D-H coordinate system.
First, forward kinematics analysis is performed. Each pair of adjacent coordinate frames can be transformed into one another through four successive homogeneous transformations; let the resulting matrix be the link transform, where a_i, α_i, d_i and θ_i are the link length, link twist angle, link offset and joint angle, respectively.
For a six-degree-of-freedom manipulator, once the angle of each joint is determined, the pose of the manipulator end is determined as well; let the end pose be T, the product of the successive link transforms.
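The elided link transform and end pose can be sketched with the standard D-H convention, which matches the parameters a_i, α_i, d_i and θ_i defined above (the patent's own matrix is not shown, so treat the exact form as an assumption):

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Standard D-H homogeneous transform between adjacent link frames:
    Rot(z, theta) · Trans(0, 0, d) · Trans(a, 0, 0) · Rot(x, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(dh_params, thetas):
    """End pose T as the product of the link transforms.

    dh_params: list of (a, alpha, d) per joint; thetas: joint angles."""
    T = np.eye(4)
    for (a, alpha, d), theta in zip(dh_params, thetas):
        T = T @ dh_transform(a, alpha, d, theta)
    return T
```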
Next, inverse kinematics analysis is carried out by the analytical method, and each joint angle is obtained from the end pose.
From the above equation θ_1 can be obtained, and the other joint angles are obtained in turn by the analytical method.
The initial position of the manipulator end is given for the head-on view of the scene.
After motion mapping by the above hand-eye-coordinated interactive control method, when the hand controller moves upward, the manipulator moves along the Z axis when the virtual scene is viewed head-on, and along the Y axis when it is viewed from above. Thus, when the operator's viewing angle changes, the hand-eye coordination method proposed here keeps the motion of the scene object consistent with the motion of the interaction device, verifying the effectiveness of the proposed hand-eye coordination method.
Claims (1)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811270020.5A CN110355750B (en) | 2018-10-29 | 2018-10-29 | Interactive control method for hand-eye coordination in teleoperation |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811270020.5A CN110355750B (en) | 2018-10-29 | 2018-10-29 | Interactive control method for hand-eye coordination in teleoperation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN110355750A true CN110355750A (en) | 2019-10-22 |
| CN110355750B CN110355750B (en) | 2022-05-10 |
Family
ID=68214781
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201811270020.5A Active CN110355750B (en) | 2018-10-29 | 2018-10-29 | Interactive control method for hand-eye coordination in teleoperation |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN110355750B (en) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111590537A (en) * | 2020-05-23 | 2020-08-28 | 西北工业大学 | Teleoperation interactive operation method based on force position feedback |
| CN111640189A (en) * | 2020-05-15 | 2020-09-08 | 西北工业大学 | A teleoperation enhanced display method based on artificial landmarks |
| WO2022002159A1 (en) * | 2020-07-01 | 2022-01-06 | 北京术锐技术有限公司 | Master-slave motion control method, robot system, device, and storage medium |
| CN115570558A (en) * | 2022-10-28 | 2023-01-06 | 武汉恒新动力科技有限公司 | Somatosensory cooperative teleoperation system and method for controlled object cluster |
| CN115639910A (en) * | 2022-10-28 | 2023-01-24 | 武汉恒新动力科技有限公司 | All-dimensional somatosensory interaction method facing operation space of operation object and operation device |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20010046313A1 (en) * | 1992-01-21 | 2001-11-29 | Green Philip S. | Method and apparatus for transforming coordinate systems in a telemanipulation system |
| CN103991077A (en) * | 2014-02-19 | 2014-08-20 | 吉林大学 | Robot hand controller shared control method based on force fusion |
| CN104950885A (en) * | 2015-06-10 | 2015-09-30 | 东南大学 | UAV (unmanned aerial vehicle) fleet bilateral remote control system and method thereof based on vision and force sense feedback |
| CN106444861A (en) * | 2016-11-21 | 2017-02-22 | 清华大学深圳研究生院 | Space robot teleoperation system based on three-dimensional gestures |
| CN107662195A (en) * | 2017-09-22 | 2018-02-06 | 中国东方电气集团有限公司 | A kind of mechanical hand principal and subordinate isomery remote operating control system and control method with telepresenc |
| CN107748496A (en) * | 2017-09-25 | 2018-03-02 | 北京邮电大学 | Impedance controller algorithm based on parameter adaptive regulation |
- 2018-10-29: CN201811270020.5A granted as CN110355750B (status: Active)
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20010046313A1 (en) * | 1992-01-21 | 2001-11-29 | Green Philip S. | Method and apparatus for transforming coordinate systems in a telemanipulation system |
| CN103991077A (en) * | 2014-02-19 | 2014-08-20 | 吉林大学 | Robot hand controller shared control method based on force fusion |
| CN104950885A (en) * | 2015-06-10 | 2015-09-30 | 东南大学 | UAV (unmanned aerial vehicle) fleet bilateral remote control system and method thereof based on vision and force sense feedback |
| CN106444861A (en) * | 2016-11-21 | 2017-02-22 | 清华大学深圳研究生院 | Space robot teleoperation system based on three-dimensional gestures |
| CN107662195A (en) * | 2017-09-22 | 2018-02-06 | 中国东方电气集团有限公司 | A kind of mechanical hand principal and subordinate isomery remote operating control system and control method with telepresenc |
| CN107748496A (en) * | 2017-09-25 | 2018-03-02 | 北京邮电大学 | Impedance controller algorithm based on parameter adaptive regulation |
Non-Patent Citations (1)
| Title |
|---|
| Tang Qing, et al.: "Design of a teleoperation control system based on a KUKA industrial robot and research on heterogeneous master-slave control methods", Journal of Sichuan University (Engineering Science Edition) * |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111640189A (en) * | 2020-05-15 | 2020-09-08 | 西北工业大学 | A teleoperation enhanced display method based on artificial landmarks |
| CN111590537A (en) * | 2020-05-23 | 2020-08-28 | 西北工业大学 | Teleoperation interactive operation method based on force position feedback |
| CN111590537B (en) * | 2020-05-23 | 2023-01-24 | 西北工业大学 | A teleoperation interactive operation method based on force-position feedback |
| WO2022002159A1 (en) * | 2020-07-01 | 2022-01-06 | 北京术锐技术有限公司 | Master-slave motion control method, robot system, device, and storage medium |
| CN115570558A (en) * | 2022-10-28 | 2023-01-06 | 武汉恒新动力科技有限公司 | Somatosensory cooperative teleoperation system and method for controlled object cluster |
| CN115639910A (en) * | 2022-10-28 | 2023-01-24 | 武汉恒新动力科技有限公司 | All-dimensional somatosensory interaction method facing operation space of operation object and operation device |
| CN115639910B (en) * | 2022-10-28 | 2023-08-15 | 武汉恒新动力科技有限公司 | Omnidirectional somatosensory interaction method and equipment for operation space of controlled object |
Also Published As
| Publication number | Publication date |
|---|---|
| CN110355750B (en) | 2022-05-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Pan et al. | Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device | |
| CN103302668B (en) | Based on control system and the method thereof of the Space teleoperation robot of Kinect | |
| CN110355750A (en) | Interactive control method for teleoperation hand-eye coordination | |
| CN110238831B (en) | Robot teaching system and method based on RGB-D image and teaching device | |
| US10751877B2 (en) | Industrial robot training using mixed reality | |
| CN108214445B (en) | ROS-based master-slave heterogeneous teleoperation control system | |
| CN103481285B (en) | Based on robot for high-voltage hot-line work control system and the method for virtual reality technology | |
| CN110385694A (en) | Action teaching device, robot system and the robot controller of robot | |
| CN100484726C (en) | Flexible and remote-controlled operation platform for robot based on virtual reality | |
| CN104570731A (en) | Uncalibrated human-computer interaction control system and method based on Kinect | |
| CN206326605U (en) | A kind of intelligent teaching system based on machine vision | |
| CN115469576A (en) | A Teleoperation System Based on Hybrid Mapping of Human-Robot Arm Heterogeneous Motion Space | |
| CN105014677A (en) | Visual mechanical arm control device and method based on Camshift visual tracking and D-H modeling algorithms | |
| CN110695988A (en) | Method and system for cooperative motion of double mechanical arms | |
| CN113282173B (en) | Double-arm robot remote real-time control system and method based on virtual reality | |
| CN104842356B (en) | A multi-palletizing robot teaching method based on distributed computing and machine vision | |
| CN102221884A (en) | Visual tele-existence device based on real-time calibration of camera and working method thereof | |
| CN114536346A (en) | Mechanical arm accurate path planning method based on man-machine cooperation and visual detection | |
| CN110039561A (en) | Hot line robot remote operating staff training system and method based on cloud | |
| CN108153957A (en) | Space manipulator kinetics simulation analysis method, system and storage medium | |
| CN117415821A (en) | A robot control method, device, system and controller based on force feedback | |
| Frank et al. | Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet | |
| CN111113414B (en) | Robot three-dimensional space scale prompting method and system based on screen identification | |
| CN110142769A (en) | ROS platform online robotic arm teaching system based on human gesture recognition | |
| CN115488880A (en) | Simulation test system for mechanical arm visual servo grabbing in space environment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |