
CN104827474A - Intelligent programming method and auxiliary device of virtual teaching robot for learning person - Google Patents


Info

Publication number
CN104827474A
Authority
CN
China
Prior art keywords
robot
feature point
teaching
space
tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510221175.XA
Other languages
Chinese (zh)
Other versions
CN104827474B (en)
Inventor
刘永
顾伟国
闫瑾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN201510221175.XA priority Critical patent/CN104827474B/en
Publication of CN104827474A publication Critical patent/CN104827474A/en
Application granted granted Critical
Publication of CN104827474B publication Critical patent/CN104827474B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention provides an intelligent programming auxiliary device for a virtual teaching robot that learns from a human operator, a working tool being mounted at the end of the robot. The device comprises: a teaching tool that is identical to the working tool mounted at the robot end and carries a first feature point; a camera unit that captures and records, in chronological order, the motion trajectory of the feature points on the teaching tool, the joint-angle information of the teaching tool, and the working points of the work target; and a robot controller that generates the robot work instructions. The working tool carries a second feature point corresponding to the first feature point of the teaching tool.

Description

Intelligent programming method and auxiliary device for a virtual teaching robot that learns from a human operator

Technical Field

The invention relates to intelligent programming technology for industrial robots, in particular to an intelligent programming method and auxiliary device for a virtual teaching robot that learns from a human operator.

Background Art

With the development of science and technology, more and more industrial robots have been widely applied in industrial fields such as robot painting, robot handling, and robot assembly. In these fields, industrial robots can now replace humans in performing repeatable, precise work, thus ensuring product quality. When a robot replaces a human in an operation, instructions must be issued to the robot in advance, specifying the actions the robot should perform and the specific content of the operation. Along with the development of industrial robots, robot programming technology has also developed and matured; robot programming has become an important part of robotics, and a considerable portion of a robot's functionality depends on the robot language in addition to the robot hardware.

There are many robot programming methods; according to where programming takes place, they divide into online programming and offline programming:

Online programming controls the robot's motion through its handheld teach pendant. The online teaching process consists of moving the end of the tool mounted on the robot to its operating position, recording the robot coordinates at that position, and then having the robot move autonomously along the recorded trajectory to complete the specific task. The accuracy of teaching depends entirely on the experience and visual judgment of the teaching operator; hand-over-hand teaching cannot achieve the expected results and may injure the teaching personnel, which is dangerous.

Offline programming offers higher accuracy. Programming does not occupy the robot, so the robot can perform other work in the meantime; the operation plan and cycle time can be optimized in advance through simulation test programs; previously completed processes or subroutines can be incorporated into the program being written; and injury to the operator is avoided. Compared with teach-and-playback programming, however, offline programming systems place higher demands on operators, who need specialized robotics knowledge and programming skills; they are also less convenient to use, in particular because robot tasks cannot be described simply and directly. If a teach-and-playback capability were introduced into the robot offline programming system, so that the operator could guide an online teaching robot on the computer screen through a suitable human-machine interface, generate the robot's working trajectory, and then produce the robot motion program for simulation and optimization, this would greatly enhance the operability of offline programming and the friendliness of the user interface, make robot programming simpler and more convenient, and promote the application of robotics in real production.

Summary of the Invention

The purpose of the present invention is to provide an intelligent robot programming method that learns a human's virtual operation, with the advantages of simple operation, high flexibility, and intelligence.

An intelligent programming auxiliary device for a virtual teaching robot that learns from a human operator: a working tool is provided at the end of the robot, and the device comprises a teaching tool, a camera unit, and a robot controller. The teaching tool is identical to the working tool provided at the robot end and carries a first feature point; the camera unit captures and records, in chronological order, the motion trajectory of the feature points on the teaching tool and the joint-angle information of the teaching tool; the robot controller generates the robot work instructions. The working tool carries a second feature point corresponding to the first feature point of the teaching tool.

A method for intelligent programming of a virtual teaching robot that learns from a human operator, using the above device, comprises:

the teaching tool performs the virtual operation, and the camera unit captures, in chronological order, the motion trajectory and joint angles of the first feature point and generates the camera-space two-dimensional coordinates of the first feature point;

the robot controller obtains, from the camera-space coordinates and joint angles of the first feature point, an instruction program containing the robot-space three-dimensional coordinates and joint angles of the first feature point;

the robot controller inputs the instruction program into the robot;

the robot controller controls the motion trajectory of the second feature point on the working tool according to the robot-space three-dimensional coordinate information of the first feature point in the instruction program.

Before the teaching tool performs the virtual operation, the mapping relationship between the robot space and the camera space is first established, as follows:

the camera unit captures three or more sets of images of the second feature point during operation and obtains its camera-space two-dimensional coordinates;

the robot joint-angle parameters are recorded, and the robot-space three-dimensional coordinates of the second feature point are obtained through the robot's forward kinematics;

the mapping relationship between the robot space and the camera space is established:

x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5

y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6

where (x_c, y_c) are the camera-space two-dimensional coordinates of the second feature point (11), (p_x, p_y, p_z) are the robot-space three-dimensional coordinates of the second feature point, and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters.
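For concreteness, the mapping above is straightforward to evaluate once the six parameters are known. The following sketch is illustrative only; the function name and the use of NumPy are assumptions, not part of the patent:

```python
import numpy as np

def robot_to_camera(p, C):
    """Map a robot-space 3-D point p = (px, py, pz) to camera-space
    2-D coordinates (xc, yc) using the six-parameter model
    C = [C1, C2, C3, C4, C5, C6] from the patent's equations."""
    C1, C2, C3, C4, C5, C6 = C
    px, py, pz = p
    xc = ((C1**2 + C2**2 - C3**2 - C4**2) * px
          + 2 * (C2 * C3 + C1 * C4) * py
          + 2 * (C2 * C4 - C1 * C3) * pz + C5)
    yc = (2 * (C2 * C3 - C1 * C4) * px
          + (C1**2 - C2**2 + C3**2 - C4**2) * py
          + 2 * (C3 * C4 + C1 * C2) * pz + C6)
    return np.array([xc, yc])
```

With the identity-like parameters C = [1, 0, 0, 0, 10, 20], the point (3, 4, 5) maps to (13, 24): the first two robot-space axes pass through unchanged, shifted by the image offsets C_5 and C_6.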

Compared with the prior art, the present invention has the following advantages: (1) previously the operator had to hold the robot end hand-over-hand for teaching, which required considerable force to simulate the working posture and made it hard to teach positions accurately; now teaching is effortless and dexterous; (2) the operator experiences an "immersive" feeling during human-robot interaction, providing the user with a new and harmonious human-machine working environment; (3) several feature points arranged on the handheld teaching device uniquely determine its pose, and the feature points can be LEDs, which overcomes restrictions imposed by the working environment.

The present invention is further described below in conjunction with the accompanying drawings.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the system structure of the present invention.

Fig. 2 is a schematic structural diagram of the handheld virtual teaching tool of the present invention.

Fig. 3 is a schematic diagram of the present invention welding a car door.

Fig. 4 is a flowchart of the method of the present invention.

Detailed Description of the Embodiments

Referring to Fig. 1 and Fig. 2, an intelligent programming auxiliary device for a virtual teaching robot that learns from a human operator comprises a teaching tool 2, a camera unit 3, and a robot controller 4. The teaching tool 2 is identical to the working tool 1 provided at the end of the robot and is provided with first feature points 21; the camera unit 3 captures and records, in chronological order, the motion trajectory of the feature points on the teaching tool 2 and the joint-angle information of the teaching tool 2; the robot controller 4 generates the robot work instructions. The working tool 1 is provided with second feature points 11 corresponding to the first feature points 21 of the teaching tool 2.

Referring to Fig. 1 and Fig. 2, the robot used in the present invention is a six-degree-of-freedom industrial robot 6 with a welding tool at its end; the welding tool comprises a welding-tool upper jaw 12 and a movable welding-tool lower jaw 13. The handheld teaching tool 2 simulates the welding-tool upper jaw 12 and is identical to it in shape and size, but the teaching tool 2 is very light, so the operator can manipulate it flexibly and adjust its posture easily.

Referring to Fig. 2, the teaching tool 2 carries several first feature points 21, which can be located on a feature-point mounting block protruding from the rear end of the simulated welding-tool upper jaw. The arrangement of the first feature points 21 must uniquely identify the pose of the teaching tool 2. The present invention takes an L-shaped arrangement as an example: there are three first feature points 21, the line A connecting the first and second feature points is perpendicular to the line B connecting the second and third feature points, and the length of line A is twice the length of line B. The front end of the teaching tool 2 simulates the welding torch tip.
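The perpendicularity and 2:1 length conditions on the example L-shaped arrangement can be verified numerically. This is a hypothetical helper (the function name and the tolerance value are assumptions, not part of the patent); an asymmetric L shape like this has no rotational self-symmetry, which is why it can identify the tool pose uniquely:

```python
import numpy as np

def is_valid_l_arrangement(p1, p2, p3, tol=1e-6):
    """Check the example arrangement of three feature points:
    line A (p1 -> p2) must be perpendicular to line B (p2 -> p3),
    and |A| must be twice |B|."""
    A = np.asarray(p2, float) - np.asarray(p1, float)
    B = np.asarray(p3, float) - np.asarray(p2, float)
    perpendicular = abs(np.dot(A, B)) < tol
    ratio_ok = abs(np.linalg.norm(A) - 2.0 * np.linalg.norm(B)) < tol
    return perpendicular and ratio_ok
```

For example, points (0, 0, 0), (2, 0, 0), (2, 1, 0) satisfy both conditions, while an equal-armed L such as (0, 0, 0), (1, 0, 0), (1, 1, 0) fails the length ratio.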

The welding-tool upper jaw 12 is likewise provided with three second feature points 11 corresponding to the three first feature points 21. The distance from each first feature point 21 to the simulated torch tip at the front of the teaching tool equals the distance from the corresponding second feature point 11 of the welding tool on the robot end to the torch tip at the front of the welding-tool upper jaw 12.

As shown in Fig. 3, the field of view of the camera unit 3 covers the working area of the teaching tool 2, so the teaching tool 2 can be photographed from all directions without blind spots. The camera unit 3 consists of two or more cameras 31; the distances and angles between the cameras need not be fixed. Taking the welding of a car door 51 as an example, the car door 51 carries welding spots 52, and the teaching tool 2 reaches each welding spot 52 in the working posture; it suffices that all welding spots 52 (i.e., working points) on the car door 51 and the feature points 21 on the teaching tool 2 can be observed by at least some of the cameras 31.

Referring to Fig. 4, an intelligent programming method for a virtual teaching robot that learns from a human operator comprises the following steps.

Step 1: establish the mapping relationship between the robot space and the camera space.

The field of view of the camera unit 3 covers the working area. Several groups of images of the second feature point 11 of the welding tool on the robot end within the working area are collected, the camera-space coordinates of the feature point are obtained through image recognition, the corresponding robot joint-angle information is recorded, and the robot-space coordinates of the feature point are obtained through the robot's forward kinematics. The mapping relationship between the robot space and the camera space is then established as follows:

x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5

y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6

where (x_c, y_c) are the two-dimensional coordinates of the feature point in the camera space, (p_x, p_y, p_z) are the three-dimensional coordinates of the feature point in the robot space, and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters. This process is the training step for the mapping parameters: the more groups of second feature points are captured, the more accurate the mapping parameter C becomes.
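The patent does not spell out how the parameters are estimated from the captured groups. One hedged sketch: because the mapping is linear in (p_x, p_y, p_z) for fixed C, the equivalent affine form can be fitted by ordinary linear least squares instead of solving for C_1..C_4 directly. This is a relaxation (it does not enforce the quaternion-like structure of the coefficients) and needs at least four non-coplanar correspondences rather than three; all function names are invented for illustration:

```python
import numpy as np

def fit_affine_mapping(robot_pts, cam_pts):
    """Estimate the robot-space -> camera-space mapping from point
    correspondences by fitting the equivalent affine form
        xc = a . p + C5,   yc = b . p + C6
    with linear least squares (a relaxation of the six-parameter model)."""
    P = np.asarray(robot_pts, float)            # (n, 3) robot-space points
    Q = np.asarray(cam_pts, float)              # (n, 2) camera-space points
    A = np.hstack([P, np.ones((len(P), 1))])    # (n, 4) rows: [px py pz 1]
    coeffs, *_ = np.linalg.lstsq(A, Q, rcond=None)  # (4, 2) solution
    return coeffs

def predict(coeffs, p):
    """Apply a fitted mapping to a robot-space point, returning (xc, yc)."""
    return np.append(np.asarray(p, float), 1.0) @ coeffs
```

The same least-squares fit also illustrates the patent's remark that more captured groups give a more accurate parameter estimate: extra rows in A average out measurement noise.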

Step 2: the teaching tool 2 performs the virtual operation.

The operator holds the teaching tool 2, brings it to the position of a welding spot 52 of the robot's job, simulates the robot's working posture, and ensures both that the torch tip at the front of the teaching tool rests on the welding spot 52 and that the first feature points 21 on the teaching tool can be observed by the cameras 31 in the camera unit 3; the operator then moves the teaching tool 2 to the different welding-spot positions in sequence.

Step 3: collect the virtual teaching tool information.

All cameras in the camera unit 3 capture the first feature points 21 on the teaching tool 2 from step 2, save the image of the feature points at each working position in the camera space, and generate the camera-space two-dimensional coordinates of the first feature points 21.

Step 4: compute the position coordinates in the robot space.

Based on the mapping relationship between the robot space and the camera space established in step 1, the camera-space coordinates of the first feature points 21 on the teaching tool 2 (obtained through image recognition of the feature points) are converted into robot-space three-dimensional coordinate information or joint-angle information, and the joint-angle information is recorded.
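The patent leaves the inverse conversion of step 4 implicit. One possible reading: with the per-camera parameters C fixed, each camera's two mapping equations are linear in (p_x, p_y, p_z), so observations from two or more cameras form an overdetermined linear system that can be solved by least squares. A sketch under that assumption (all function names are invented):

```python
import numpy as np

def camera_rows(C):
    """Row vectors of the mapping for one camera: xc = a . p + C5,
    yc = b . p + C6, with a and b built from C1..C4."""
    C1, C2, C3, C4, C5, C6 = C
    a = np.array([C1**2 + C2**2 - C3**2 - C4**2,
                  2 * (C2 * C3 + C1 * C4),
                  2 * (C2 * C4 - C1 * C3)])
    b = np.array([2 * (C2 * C3 - C1 * C4),
                  C1**2 - C2**2 + C3**2 - C4**2,
                  2 * (C3 * C4 + C1 * C2)])
    return a, b, C5, C6

def camera_to_robot(observations):
    """Recover a robot-space point from two or more camera observations.
    `observations` is a list of (C, (xc, yc)) pairs; each camera
    contributes two linear equations in (px, py, pz)."""
    M, rhs = [], []
    for C, (xc, yc) in observations:
        a, b, C5, C6 = camera_rows(C)
        M.extend([a, b])
        rhs.extend([xc - C5, yc - C6])
    p, *_ = np.linalg.lstsq(np.array(M), np.array(rhs), rcond=None)
    return p
```

A single camera provides only two equations for three unknowns, which is why the device specifies two or more cameras whose combined views cover every working point.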

Step 5: automatically generate the robot program.

After teaching is finished, the robot controller 4 saves the joint-angle information recorded in step 4 in sequence and generates an instruction program containing the robot-space three-dimensional coordinates of the first feature points 21 and the joint angles.

Finally, the robot controller 4 executes the program generated in step 5 and controls the robot to move automatically to all working positions with the positions and postures taught by the operator, completing the welding task. After the job is finished, the robot returns to its original position and waits for the operator's next teaching session, thereby achieving intelligent robot programming.
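Step 5 amounts to serializing the chronologically recorded waypoints into a motion program. The sketch below invents a toy instruction format (the MOVEJ/WELD mnemonics and the data layout are illustrative assumptions, not any real robot language or the patent's actual program format):

```python
def generate_program(waypoints):
    """Turn recorded teaching data into a simple textual instruction
    program. Each waypoint carries the robot-space position of the
    feature point and the joint angles recorded in step 4."""
    lines = []
    for i, (xyz, joints) in enumerate(waypoints, start=1):
        j = " ".join(f"{a:.2f}" for a in joints)
        x, y, z = xyz
        lines.append(f"MOVEJ P{i} XYZ=({x:.1f},{y:.1f},{z:.1f}) J=[{j}]")
        lines.append(f"WELD P{i}")
    lines.append("MOVEJ HOME")  # return to the original position after the job
    return "\n".join(lines)
```

For one recorded welding spot, this emits a move to the taught pose, a weld command, and a final move home, mirroring the execution sequence described above.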

Claims (6)

1. An intelligent programming auxiliary device for a virtual teaching robot that learns from a human operator, a working tool (1) being provided at the end of the robot, characterized by comprising: a teaching tool (2) identical to the working tool (1) provided at the end of the robot and provided with a first feature point (21); a camera unit (3) that captures and records, in chronological order, the motion trajectory of the feature points on the teaching tool (2), the joint-angle information of the teaching tool (2), and the working points of the work target; and a robot controller (4) that generates robot work instructions; the working tool (1) being provided with a second feature point (11) corresponding to the first feature point (21) of the teaching tool (2).

2. The intelligent programming auxiliary device according to claim 1, wherein the first feature point (22) comprises two or more feature points whose arrangement uniquely identifies the pose of the teaching tool (2).

3. The intelligent programming auxiliary device according to claim 2, wherein there are three first feature points (22), the line A connecting the first and second first feature points is perpendicular to the line B connecting the second and third first feature points, and the length of line A is twice the length of line B.

4. The intelligent programming auxiliary device according to claim 2 or 3, wherein the feature points among the first feature points (22) and the second feature points (11) are LEDs, black dots, or white dots.

5. An intelligent programming method for a virtual teaching robot that learns from a human operator, using the auxiliary device according to any one of the preceding claims, characterized by comprising: the teaching tool (2) performs the virtual operation, and the camera unit (3) captures, in chronological order, the motion trajectory of the first feature point (21), the joint angles, and the working points of the work target and generates the camera-space two-dimensional coordinates of the first feature point (21); the robot controller (4) obtains, from the camera-space coordinates of the first feature point (21), the joint angles, and the working points of the work target, an instruction program containing the robot-space three-dimensional coordinates of the first feature point (21) and the joint angles; the robot controller (4) inputs the instruction program into the robot; and the robot controller (4) controls the motion trajectory of the second feature point (11) on the working tool (1) according to the robot-space three-dimensional coordinate information of the first feature point (21) in the instruction program.

6. The intelligent programming method according to claim 5, characterized in that, before the teaching tool (2) performs the virtual operation, the mapping relationship between the robot space and the camera space is first established as follows: the camera unit (3) captures three or more sets of images of the second feature point (11) during operation and obtains the camera-space two-dimensional coordinates of the second feature point (11); the robot joint-angle parameters are recorded, and the robot-space three-dimensional coordinates of the second feature point (11) are obtained through the robot's forward kinematics; and the mapping relationship between the robot space and the camera space is established:

x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5

y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6

where (x_c, y_c) are the camera-space two-dimensional coordinates of the second feature point (11), (p_x, p_y, p_z) are the robot-space three-dimensional coordinates of the second feature point (11), and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters.
CN201510221175.XA 2015-05-04 2015-05-04 Intelligent programming method and auxiliary device for learning human virtual teaching robot Active CN104827474B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510221175.XA CN104827474B (en) 2015-05-04 2015-05-04 Intelligent programming method and auxiliary device for learning human virtual teaching robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510221175.XA CN104827474B (en) 2015-05-04 2015-05-04 Intelligent programming method and auxiliary device for learning human virtual teaching robot

Publications (2)

Publication Number Publication Date
CN104827474A true CN104827474A (en) 2015-08-12
CN104827474B CN104827474B (en) 2017-06-27

Family

ID=53805845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510221175.XA Active CN104827474B (en) 2015-05-04 2015-05-04 Intelligent programming method and auxiliary device for learning human virtual teaching robot

Country Status (1)

Country Link
CN (1) CN104827474B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444739A (en) * 2016-07-15 2017-02-22 鹿龙 Multi-industrial-robot virtual offline co-simulation system and method
CN107220099A (en) * 2017-06-20 2017-09-29 华中科技大学 A kind of robot visualization virtual teaching system and method based on threedimensional model
CN107331279A (en) * 2017-08-16 2017-11-07 嘉兴锐视智能科技有限公司 Teaching apparatus and system
CN108000499A (en) * 2016-10-27 2018-05-08 广明光电股份有限公司 Programming method of robot visual coordinate
CN108427282A (en) * 2018-03-30 2018-08-21 华中科技大学 A kind of solution of Inverse Kinematics method based on learning from instruction
CN108705536A (en) * 2018-06-05 2018-10-26 雅客智慧(北京)科技有限公司 A kind of the dentistry robot path planning system and method for view-based access control model navigation
CN109937118A (en) * 2016-11-22 2019-06-25 松下知识产权经营株式会社 Picking system and control method thereof
WO2025086177A1 (en) * 2023-10-25 2025-05-01 Abb Schweiz Ag Teaching system and corresponding method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09225872A (en) * 1996-02-23 1997-09-02 Yaskawa Electric Corp Robot teaching device
CN101100061A (en) * 2006-07-03 2008-01-09 发那科株式会社 Measuring device and calibration method
JP4449410B2 (en) * 2003-10-27 2010-04-14 ソニー株式会社 Robot apparatus and object learning method thereof
CN102004485A (en) * 2009-08-27 2011-04-06 本田技研工业株式会社 Off-line robot teaching method
JP2011224745A (en) * 2010-04-21 2011-11-10 Yaskawa Electric Corp Robot teaching device and controller for the same, and program
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense
CN102470530A (en) * 2009-11-24 2012-05-23 株式会社丰田自动织机 Method of producing teaching data of robot and robot teaching system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09225872A (en) * 1996-02-23 1997-09-02 Yaskawa Electric Corp Robot teaching device
JP4449410B2 (en) * 2003-10-27 2010-04-14 ソニー株式会社 Robot apparatus and object learning method thereof
CN101100061A (en) * 2006-07-03 2008-01-09 发那科株式会社 Measuring device and calibration method
CN102004485A (en) * 2009-08-27 2011-04-06 本田技研工业株式会社 Off-line robot teaching method
CN102470530A (en) * 2009-11-24 2012-05-23 株式会社丰田自动织机 Method of producing teaching data of robot and robot teaching system
JP2011224745A (en) * 2010-04-21 2011-11-10 Yaskawa Electric Corp Robot teaching device and controller for the same, and program
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444739A (en) * 2016-07-15 2017-02-22 鹿龙 Multi-industrial-robot virtual offline co-simulation system and method
CN108000499A (en) * 2016-10-27 2018-05-08 广明光电股份有限公司 Programming method of robot visual coordinate
CN109937118A (en) * 2016-11-22 2019-06-25 松下知识产权经营株式会社 Picking system and control method thereof
CN109937118B (en) * 2016-11-22 2023-02-03 松下知识产权经营株式会社 Picking system and control method thereof
CN107220099A (en) * 2017-06-20 2017-09-29 华中科技大学 A kind of robot visualization virtual teaching system and method based on threedimensional model
CN107331279A (en) * 2017-08-16 2017-11-07 嘉兴锐视智能科技有限公司 Teaching apparatus and system
CN108427282A (en) * 2018-03-30 2018-08-21 华中科技大学 A kind of solution of Inverse Kinematics method based on learning from instruction
CN108705536A (en) * 2018-06-05 2018-10-26 雅客智慧(北京)科技有限公司 A kind of the dentistry robot path planning system and method for view-based access control model navigation
WO2025086177A1 (en) * 2023-10-25 2025-05-01 Abb Schweiz Ag Teaching system and corresponding method

Also Published As

Publication number Publication date
CN104827474B (en) 2017-06-27

Similar Documents

Publication Publication Date Title
CN104827474B (en) Intelligent programming method and auxiliary device for learning human virtual teaching robot
Ong et al. Augmented reality-assisted robot programming system for industrial applications
Pan et al. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device
CN107309882B (en) A robot teaching programming system and method
CN100484726C (en) Flexible and remote-controlled operation platform for robot based on virtual reality
Gaschler et al. Intuitive robot tasks with augmented reality and virtual obstacles
CN104936748B (en) Free-hand robot path teaching
CN110561430B (en) Robot assembly track optimization method and device for offline example learning
SE526119C2 (en) Method and system for programming an industrial robot
WO2011039542A1 (en) Method and system of programming a robot
KR102001214B1 (en) Apparatus and method for dual-arm robot teaching based on virtual reality
Thomas et al. Intuitive work assistance by reciprocal human-robot interaction in the subject area of direct human-robot collaboration
Fang et al. Robot path and end-effector orientation planning using augmented reality
CN104325268A (en) Industrial robot three-dimensional space independent assembly method based on intelligent learning
Hoang et al. Virtual barriers in augmented reality for safe and effective human-robot cooperation in manufacturing
CN114536346A (en) Mechanical arm accurate path planning method based on man-machine cooperation and visual detection
Melchiorre et al. Influence of Human Limb Motion Speed in a Collaborative Hand-over Task.
Ekrekli et al. Co-speech gestures for human-robot collaboration
CN115723133B (en) A robot space weld automatic positioning and correction system based on virtual and real combination
Waymouth et al. Demonstrating cloth folding to robots: Design and evaluation of a 2d and a 3d user interface
Frank et al. Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet
Wu et al. Improving human-robot interactivity for tele-operated industrial and service robot applications
Zieliński et al. A study of cobot practitioners needs for augmented reality interfaces in the context of current technologies
Sanches et al. Scalable, intuitive human to robot skill transfer with wearable human machine interfaces: On complex, dexterous tasks
CN110948467A (en) Handheld teaching device and method based on stereoscopic vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant