
CN109341532B - Automatic-assembly-oriented part coordinate calibration method based on structural features - Google Patents

Automatic-assembly-oriented part coordinate calibration method based on structural features

Info

Publication number
CN109341532B
CN109341532B (Application CN201811310133.3A)
Authority
CN
China
Prior art keywords
coordinate system
robot
coordinate
binocular
cabin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811310133.3A
Other languages
Chinese (zh)
Other versions
CN109341532A (en)
Inventor
王新
杨志波
宋彰桓
辛红
李兰柱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Academy of Launch Vehicle Technology CALT
Aerospace Research Institute of Materials and Processing Technology
Original Assignee
China Academy of Launch Vehicle Technology CALT
Aerospace Research Institute of Materials and Processing Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Academy of Launch Vehicle Technology CALT, Aerospace Research Institute of Materials and Processing Technology filed Critical China Academy of Launch Vehicle Technology CALT
Priority to CN201811310133.3A priority Critical patent/CN109341532B/en
Publication of CN109341532A publication Critical patent/CN109341532A/en
Application granted granted Critical
Publication of CN109341532B publication Critical patent/CN109341532B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a structural-feature-based part coordinate calibration method oriented to automatic assembly. A characteristic structure that allows the attitude and position of the part to be determined accurately is selected according to the structure of the part; this structure fully defines the position and attitude of the part in space. The coordinate values of the part in the current coordinate system are measured by non-contact measurement, and the coordinate position of the part is calibrated through a multi-coordinate-system conversion algorithm. The pose data of the part mating surface are obtained from the coordinate values, the part features, and the geometric relationships between the part mating surfaces; the data are transmitted to the robot control system, which drives the robot to position the part at its assembly location.

Description

A structural-feature-based part coordinate calibration method for automatic assembly

Technical Field

The invention relates to a structural-feature-based part coordinate calibration method for automatic assembly, and belongs to the field of assembly manufacturing.

Background Art

At present, automation, digitalization, and intelligence are among the main directions of the manufacturing industry. The low degree of automation in the assembly of composite-material cabin sections in the aerospace field leads to unstable product quality consistency and low production efficiency. In particular, deformation of composite structural parts affects the implementation of manufacturing processes such as automated machining and assembly.

Rocket-body cabin sections of launch vehicles and similar products are cylindrical or conical thin-walled bodies of revolution with large structural dimensions, and a large number of bracket parts are mounted on the inner or outer wall of the cabin section. These bracket parts are sheet-metal or machined structures with relatively strict requirements on installation position accuracy within the cabin section; the surface where the part mates with the cabin side wall is the assembly mating surface. At present, parts are positioned by traditional manual marking: using datums on the cabin section, the installation centerline of the part is drawn at the installation position with a steel rule or other tools, a centerline is also drawn on the part, and the part is positioned by aligning the two centerlines. This positioning process is labor-intensive, inefficient, and inconsistent in accuracy, and since it is entirely manual it carries significant quality risks.

To this end, on the basis of automatic assembly equipment, a database of the key structural features of each part is established. The attitude and position of the part are evaluated indirectly through the geometric relationship between the part's key features and the part itself, and through a series of coordinate transformations the geometric coordinates of the key features on the part are converted into coordinates in the frame of the automatic assembly equipment. The automatic equipment can thus be driven to position the part at its assembly location, ensuring the positional accuracy requirements of the part and meeting the production need for automatic part positioning during the assembly of composite-material cabin sections.

Summary of the Invention

The purpose of the present invention is to overcome the deficiencies of the prior art and to provide a structural-feature-based part coordinate calibration method for automatic assembly.

The object of the present invention is achieved by the following technical solution:

A structural-feature-based part coordinate calibration method for an automatic assembly device is provided, wherein the automatic assembly device comprises a part-grabbing robot with an end effector.

The method is characterized by comprising the following steps:

Step 1: Define the binocular-system coordinate system O_c1 for part recognition, the base coordinate system O_b and end-effector coordinate system O_e of the part-grabbing robot, the cabin assembly platform coordinate system O_p, the cabin coordinate system O_o, and the coordinate system O_c2 of the binocular system for measuring the grabbing robot pose and mounting-platform position.

Step 2: The binocular system for part recognition performs feature recognition on the part, establishes the part's starting coordinate system O_w1 in the part-recognition binocular system, and obtains the feature vector M_w1 of the part's starting position.

Step 3: Calibrate to obtain the transformation T_c1^e from the part-recognition binocular system coordinate system to the robot end-effector coordinate system, and the transformation T_c2^e from the binocular system for measuring the grabbing robot pose and mounting-platform position to the robot end-effector coordinate system.

Step 4: Read from the part-grabbing robot controller the transformation T_e^b from the robot end-effector coordinate system to the robot base coordinate system. The transformation from the part starting coordinate system to the robot base coordinate system is then computed as T_w1^b = T_e^b · T_c1^e · T_w1^c1, where T_w1^c1 is the part pose measured in O_c1 in step 2; the part starting coordinate system expressed in the robot base frame is O_w1^b, and the corresponding feature vector in the robot base frame is M_w1^b.

Step 5: Calibrate to obtain the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system.

Step 6: The binocular system for measuring the grabbing robot pose and mounting-platform position measures the marker points on the cabin assembly platform to obtain their coordinates in the binocular system coordinate system O_c2, from which the transformation matrix T_p^c2 from the cabin assembly platform coordinate system O_p to the binocular system coordinate system O_c2 is computed.

Step 7: The transformation from the cabin coordinate system to the robot base coordinate system is obtained as T_o^b = T_e^b · T_c2^e · T_p^c2 · T_o^p.

Step 8: Obtain the coordinates O_w2(x_w2, y_w2, z_w2) of the target position at which the part is installed, in the cabin coordinate system O_o, together with the target-position feature vector M_w2. The target position coordinates in the robot base frame are then computed as O_w2^b = T_o^b · O_w2, and the target feature vector M_w2 expressed in the robot base frame is M_w2^b.

Step 9: Input into the part-grabbing robot controller the part starting position coordinates O_w1^b(x_w1^b, y_w1^b, z_w1^b) and feature vector M_w1^b, and the target position coordinates O_w2^b(x_w2^b, y_w2^b, z_w2^b) and feature vector M_w2^b. The end effector is controlled to locate and grab the part from the material tray holding the parts by means of the part-recognition binocular system, to locate the specified installation position of the part inside the cabin by means of the binocular system for measuring the grabbing robot pose and mounting-platform position, and to place the part at the specified position.

Preferably, the part-recognition binocular system coordinate system O_c1 and the coordinate system of the binocular system for measuring the grabbing robot pose and mounting-platform position each take the optical center of the respective binocular system as the coordinate origin, with the Z axis parallel to the optical axis of the binocular vision system.

Preferably, the part-recognition binocular system and the binocular system for measuring the grabbing robot pose and mounting-platform position are each calibrated to the robot end-effector coordinate system by a hand-eye calibration method.

Preferably, the calibration method for the part-recognition binocular system is as follows:

4.1 Fix a planar calibration board, control the part-grabbing robot to move to a given pose, and have the binocular vision system acquire an image of the calibration board.

4.2 The binocular vision system acquires the image at the current position, detects the corner points of the calibration board with a corner-detection algorithm, establishes a world coordinate system W on the calibration-board plane, and uses the binocular vision system to solve for the transformation T_w^c1 from the calibration-board world coordinates to the binocular vision system coordinate system O_c1 at the current position.

4.3 Read the current robot end pose from the robot controller, i.e., the transformation T_b^e from the current robot base coordinate system O_b to the end coordinate system O_e.

4.4 Control the robot to move through n positions. For the motion from the i-th position to the (i+1)-th position, let A_i denote the transformation of the robot end pose and B_i the corresponding transformation of the binocular vision system. The transformation T_c1^e from the binocular vision system coordinate system to the robot end coordinate system is the hand-eye relationship to be calibrated, i.e., it satisfies A_i · T_c1^e = T_c1^e · B_i.
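
Step 4.4 is the classical AX = XB hand-eye problem. As a hedged illustration only (not the calibration routine of the patent), it could be solved with OpenCV's calibrateHandEye, assuming the robot end poses and the calibration-board extrinsics from step 4.2 have been collected at the n stations:

```python
import cv2
import numpy as np

# R_gripper2base[i], t_gripper2base[i]: end-effector rotation/translation in the robot
# base frame at station i (read from the robot controller).
# R_target2cam[i], t_target2cam[i]: calibration-board pose in the camera frame at
# station i (the binocular extrinsics from step 4.2).
def solve_hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
    # Returns the camera pose in the end-effector frame, i.e. the hand-eye transform X
    # that solves A_i X = X B_i (Tsai-Lenz method here).
    R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base, R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    X = np.eye(4)
    X[:3, :3] = R_cam2gripper
    X[:3, 3] = t_cam2gripper.ravel()
    return X
```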

Preferably, the base coordinate system O_b of the part-grabbing robot takes the center of the bottom surface of the robot base as the origin, and the three-dimensional coordinate system is established with reference to the plane of the base bottom surface.

Preferably, the end-effector coordinate system O_e is read from the part-grabbing robot controller in real time.

Preferably, the cabin assembly platform coordinate system O_p takes one end point of the cabin assembly platform as the coordinate origin O_p0, the horizontal line of the platform plane as the X_p axis, and the vertical line of the platform plane as the Y_p axis.

Preferably, the part starting-position coordinate system O_w1 is established with the center of the part feature as the origin, the plane of the part surface as the X_w1-O_w1-Y_w1 plane, and the normal of that plane as the Z_w1 axis; the feature vector M_w1 represents the direction of the Z_w1 axis.
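
Numerically, such a feature frame can be assembled from the feature center and the fitted surface normal; the sketch below is an illustration under that assumption (the helper name and the choice of in-plane reference axis are not taken from the patent):

```python
import numpy as np

def frame_from_feature(center, normal):
    """Build a 4x4 pose O_w1: origin at the feature center, Z axis along the surface normal."""
    z = np.asarray(normal, dtype=float)
    z /= np.linalg.norm(z)
    # pick any reference direction not parallel to z, then orthonormalize (Gram-Schmidt)
    ref = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = ref - np.dot(ref, z) * z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0] = x
    T[:3, 1] = y
    T[:3, 2] = z          # the feature vector M_w1 corresponds to this Z column
    T[:3, 3] = np.asarray(center, dtype=float)
    return T
```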

Preferably, in the cabin coordinate system O_o the center of the cabin bottom is the origin O, the X axis is the direction from the origin toward the 0-degree line of quadrant I of the cabin, and the Y axis is the direction from the origin toward the 0-degree line of quadrant II of the cabin.

Preferably, the calibration method for the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system is as follows: place the cabin on the assembly platform and position and fix it with the positioning device on the platform, so that the X_o axis of the cabin is parallel to the X_p axis of the platform and the Y_o axis of the cabin is parallel to the Y_p axis of the platform; measure the features of the cabin and of the assembly platform with a laser tracker, obtain the origin coordinates of the two coordinate systems, and compute the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system.
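
One common way to compute such a rigid transformation from a handful of tracker-measured corresponding points is an SVD-based (Kabsch) fit; the sketch below is an illustration with assumed function and variable names, not the patent's procedure:

```python
import numpy as np

def rigid_transform(P_cabin, P_platform):
    """Least-squares rigid transform T_o^p mapping points from the cabin frame to the
    platform frame. P_cabin, P_platform: (N, 3) arrays of corresponding measured points."""
    cc, cp = P_cabin.mean(axis=0), P_platform.mean(axis=0)
    H = (P_cabin - cc).T @ (P_platform - cp)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cp - R @ cc
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```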

Preferably, the feature recognition speed of the binocular system is 2 s per feature, and the recognition accuracy of part features is 0.2 mm.

Preferably, the method for judging that the part has been placed at the specified position is as follows: with the center of the part feature as the origin O_w2, the plane of the part surface as the X_w2-O_w2-Y_w2 plane, and the normal of that plane as the Z_w2 axis, establish the three-dimensional coordinate system O_w2; the direction of the feature vector M_w2 coincides with the Z_w2 axis, and the target position point coincides with the origin O_w2.
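
A simple numerical form of this placement check is sketched below; the tolerance values are assumptions chosen for illustration and are not taken from the patent:

```python
import numpy as np

def placed_ok(O_w2_meas, M_w2_meas, O_w2_target, M_w2_target,
              pos_tol=0.2, ang_tol_deg=0.5):
    """True if the measured feature center coincides with the target point and the measured
    feature vector is aligned with the target Z direction, within the given tolerances."""
    pos_err = np.linalg.norm(np.asarray(O_w2_meas, float) - np.asarray(O_w2_target, float))
    m1, m2 = np.asarray(M_w2_meas, float), np.asarray(M_w2_target, float)
    cosang = np.dot(m1, m2) / (np.linalg.norm(m1) * np.linalg.norm(m2))
    ang_err = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return pos_err <= pos_tol and ang_err <= ang_tol_deg
```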

Compared with the prior art, the present invention has the following advantages:

(1) The present invention proposes a method for obtaining the installation information of a part by recognizing its structural features. The method is suitable for bracket-type parts with key structural features. Part structural-feature data include the structural features themselves, the structural dimensions, and the installation information. Structural features include contours, hole positions, surface shapes, and other characteristics of the part itself; structural dimensions include the design dimensions of each feature and their distribution; installation information describes the position and attitude used for assembly and attitude adjustment. By processing the measurement data, an image description vector is built from the part's contour area, length, principal direction, contour moments, aspect ratio, and other information, and existing features on the part surface are used for recognition and localization. The spatial position accuracy of the part is thus obtained through structural-feature extraction and marking.

(2) The present invention proposes a method for obtaining the position and attitude accuracy of a part through non-contact measurement. The method identifies the key features of the part in the acquired images, solves the correspondence of the part's locating points in the image according to the actual topological relationship of the key features on the part, and computes the pose with a pose-detection algorithm, thereby obtaining the position of the feature structure with the required accuracy. The method features high measurement and recognition accuracy and fast recognition response.
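
For a single camera view, the pose-detection step described here reduces to a Perspective-n-Point problem; the following sketch using OpenCV's solvePnP is an illustration of that idea, not the patent's specific algorithm:

```python
import cv2
import numpy as np

def feature_pose(object_pts, image_pts, K, dist):
    """Estimate the part pose in the camera frame from matched key features.
    object_pts: (N, 3) feature coordinates in the part's own frame (e.g. from the CAD model);
    image_pts:  (N, 2) detected pixel coordinates; K, dist: camera intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_pts, dtype=np.float64),
        np.asarray(image_pts, dtype=np.float64),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T   # part pose expressed in the camera coordinate system
```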

(3) The present invention proposes a multi-coordinate-system conversion algorithm, comprising global coordinate-system calibration and solution of the coordinate transformation relationships. Calibration of the binocular vision system and of the robot dynamics is carried out, and hand-eye calibration is performed between the binocular vision system and the robot. The position of the part is calibrated through the vision-system and robot calibration together with the coordinate conversion algorithm.

(4) The present invention realizes automatic path planning for parts and determination of the positioning location. After a part is recognized, its current placement pose is detected from its three-dimensional information, providing position and attitude data for robot grasping. After grasping, the robot enters the cabin and moves to the specified position according to the computed positioning data. Vision-guided high-precision robot positioning is used to drive the robot to carry the workpiece being assembled to the installation position, achieving the path-planning and positioning objectives.

Brief Description of the Drawings

Figure 1 is a schematic diagram of the calibration. In the figure: 1 - binocular system for part recognition; 2 - material tray for storing parts; 3 - part-grabbing robot; 4 - cabin assembly platform; 5 - part to be installed; 6 - cabin; 7 - binocular system for measuring the grabbing robot pose and mounting-platform position.

Figure 2 shows a bracket-type part with key structural features.

Detailed Description of the Embodiments

Rocket-body cabin sections of launch vehicles and similar products are cylindrical or conical thin-walled bodies of revolution with large structural dimensions, and a large number of bracket parts are mounted on the inner or outer wall of the cabin section. These bracket parts are sheet-metal or machined structures with relatively strict requirements on installation position accuracy within the cabin section; the surface where the part mates with the cabin side wall is the assembly mating surface. At present, parts are positioned by traditional manual marking: using datums on the cabin section, the installation centerline of the part is drawn at the installation position with a steel rule or other tools, a centerline is also drawn on the part, and the part is positioned by aligning the two centerlines. This positioning process is labor-intensive, inefficient, and inconsistent in accuracy, and since it is entirely manual it carries significant quality risks.

The automatic assembly device comprises a part-grabbing robot with an end effector. The binocular system 1 for part recognition locates and grabs the part 5 from the material tray in which the parts are stored; the binocular system 7 for measuring the grabbing robot pose and mounting-platform position locates the specified installation position of the part inside the cabin 6, and the end effector installs the part at that position; the cabin 6 is set vertically on the cabin assembly platform 4.

A structural-feature-based part coordinate calibration method for automatic assembly comprises the following steps:

Step 1: Create the coordinate system O_c1 of the binocular system 1 for part recognition shown in Figure 1, the base coordinate system O_b and end-effector coordinate system O_e of the part-grabbing robot 3, the coordinate system O_p of the cabin assembly platform 4, the starting-position coordinate system O_w1 of the part 5 to be installed, the coordinate system O_o of the cabin 6, and the coordinate system O_c2 of the binocular system 7 for measuring the grabbing robot pose and mounting-platform position.

① Coordinate system O_c1 of the binocular system 1 for part recognition: obtained by establishing the transformation from the image-plane coordinates of the binocular cameras to world coordinates; after calibration, the world-coordinate origin of the binocular system is at the optical center of the binocular camera.

② Base coordinate system O_b of the part-grabbing robot 3: the robot's own coordinate system.

③ End-effector coordinate system O_e of the part-grabbing robot 3: read directly from the part-grabbing robot controller in real time.

④ Coordinate system O_p of the cabin assembly platform 4: one end point of the platform is the coordinate origin O_p0, the horizontal line of the platform plane is the X_p axis, and the vertical line of the platform plane is the Y_p axis.

⑤ Coordinate system O_o of the cabin 6: the center of the cabin bottom is the origin O, the X axis is the direction from the origin toward the 0-degree line of quadrant I of the cabin, and the Y axis is the direction from the origin toward the 0-degree line of quadrant II of the cabin.

⑥ Coordinate system O_c2 of the binocular system 7 for measuring the grabbing robot pose and mounting-platform position: created in the same way as O_c1.

Step 2: Use the part-recognition binocular system 1 to perform feature recognition on the part 5 to be installed, establish the initial-position coordinate system O_w1 and feature vector M_w1 of the part 5 in the part-recognition binocular system 1, and obtain the pose of the part 5 in the part-recognition binocular system 1; the feature vector M_w1 characterizes the starting position and orientation of the part.

Step 3: Perform hand-eye calibration between the end-effector coordinate system O_e of the part-grabbing robot 3 and the coordinate system O_c1 of the binocular system 1 for part recognition, obtaining the transformation T_c1^e from O_c1 to O_e.

The calibration method for the part-recognition binocular system is as follows:

3.1 Fix a planar calibration board, control the part-grabbing robot to move to a given pose, and have the binocular vision system acquire an image of the calibration board.

3.2 The binocular vision system acquires the image at the current position, detects the corner points of the calibration board with a corner-detection algorithm, and establishes a world coordinate system W on the calibration-board plane. Using the previously calibrated intrinsic parameters of the binocular vision system, the transformation T_w^c1 from the calibration-board world coordinates to the binocular vision system coordinate system O_c1 at the current position, i.e., the extrinsic parameters of the binocular vision system, is solved.

3.3 Read the current robot end pose from the robot controller, i.e., the transformation T_b^e from the current robot base coordinate system O_b to the end coordinate system O_e.

3.4 Control the robot to move through n positions. For the motion from the i-th position to the (i+1)-th position, let A_i denote the transformation of the robot end pose and B_i the corresponding transformation of the binocular vision system. The transformation T_c1^e from the binocular vision system coordinate system to the robot end coordinate system is the hand-eye relationship to be calibrated, i.e., it satisfies A_i · T_c1^e = T_c1^e · B_i.

The binocular system for measuring the grabbing robot pose and mounting-platform position is calibrated to the robot end effector in the same way.

Step 4: Read the current robot end pose from the controller of the part-grabbing robot 3, i.e., the transformation T_e^b from the current robot end coordinate system O_e to the robot base coordinate system O_b.

Under a unified datum, the transformation T_c1^b from the coordinate system O_c1 of the binocular system 1 for part recognition to the base coordinate system O_b of the part-grabbing robot 3 is obtained by calibration; the part starting coordinate system expressed in the robot base frame is then O_w1^b, and the feature vector of the starting position expressed in the robot base frame is M_w1^b.

Step 5: Use a laser tracker to calibrate the coordinate system O_p of the cabin assembly platform 4 and the coordinate system O_o of the cabin 6, obtaining the transformation T_o^p from the cabin coordinate system to the platform coordinate system.

Step 6: The binocular system 7 for measuring the grabbing robot pose and mounting-platform position measures the known marker points on the platform to obtain their coordinates in the binocular world coordinate system O_c2, from which the transformation matrix T_p^c2 from the coordinate system O_p of the cabin assembly platform 4 to the coordinate system O_c2 of the binocular system 7 is computed.

Step 7: Perform hand-eye calibration between the end-effector coordinate system O_e of the part-grabbing robot 3 and the coordinate system O_c2 of the binocular system 7 for measuring the grabbing robot pose and mounting-platform position, obtaining the transformation T_c2^e from O_c2 to O_e. Under a unified datum, the transformation T_o^b from the coordinate system O_o of the cabin 6 to the base coordinate system O_b of the part-grabbing robot 3 is then obtained by calibration.

Step 8: The part 5 to be installed lies in the material tray 2 for storing parts. The binocular system 1 for part recognition identifies it by its features and, following the coordinate-system construction of step 1, establishes the starting coordinate system O_w1 in the coordinate system O_c1 of the binocular system 1. Under a unified datum, the coordinate system O_w1 of the part 5 placed in the material tray 2, expressed in the base coordinate system O_b of the part-grabbing robot 3, is O_w1^b.

By extracting the theoretical points from the digital model of the cabin section, the target position coordinate point O_w2(x_w2, y_w2, z_w2) and the feature vector M_w2 of the part 5, installed in the coordinate system O_o of the cabin 6, are computed. Under a unified datum, the coordinates of the target position point O_w2(x_w2, y_w2, z_w2) of the part 5 in the base coordinate system O_b of the part-grabbing robot 3 are O_w2^b(x_w2^b, y_w2^b, z_w2^b), and the feature vector M_w2 of the target position expressed in the robot base frame is M_w2^b.

Step 9: Input the starting-point coordinates O_w1^b(x_w1^b, y_w1^b, z_w1^b), the feature vector M_w1^b, the target position coordinates O_w2^b(x_w2^b, y_w2^b, z_w2^b), and the feature vector M_w2^b into the controller of the part-grabbing robot 3, and edit a program that drives the robot to grab the part 5 and position it from the material tray 2 holding the parts to its theoretical installation position in the cabin 6.
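
The flow of step 9 can be pictured as a small pick-and-place routine; the sketch below uses a purely hypothetical RobotController interface (real robot controllers expose different APIs) and is an illustration only:

```python
import numpy as np

# Hypothetical controller interface used purely for illustration; not a real robot API.
class RobotController:
    def move_to(self, position, approach_dir): ...
    def grip(self): ...
    def release(self): ...

def pick_and_place(ctrl, O_w1_b, M_w1_b, O_w2_b, M_w2_b):
    """Drive the robot from the measured start pose to the theoretical installation pose,
    both already expressed in the robot base frame as in step 9."""
    ctrl.move_to(np.asarray(O_w1_b), np.asarray(M_w1_b))   # approach the tray along the feature normal
    ctrl.grip()                                            # grab the bracket part
    ctrl.move_to(np.asarray(O_w2_b), np.asarray(M_w2_b))   # move to the install position in the cabin
    ctrl.release()                                         # place the part
```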

The method for judging that the part has been placed at the specified position is as follows: with the center of the part feature as the origin O_w2, the plane of the part surface as the X_w2-O_w2-Y_w2 plane, and the normal of that plane as the Z_w2 axis, establish the three-dimensional coordinate system O_w2; the direction of the feature vector M_w2 coincides with the Z_w2 axis, and the target position point coincides with the origin O_w2.

The feature recognition speed of the binocular system is 2 s per feature, and the recognition accuracy of part features is 0.2 mm.

The above is only the best specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution readily conceivable by those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention.

Contents not described in detail in the specification of the present invention belong to technology well known to those skilled in the art.

Claims (12)

1. A part coordinate calibration method based on structural features and oriented to automatic assembly, wherein the automatic assembly device comprises a part-grabbing robot with an end effector;
the method is characterized by comprising the following steps:
Step 1: define the binocular-system coordinate system O_c1 for part recognition, the base coordinate system O_b and end-effector coordinate system O_e of the part-grabbing robot, the cabin assembly platform coordinate system O_p, the cabin coordinate system O_o, and the coordinate system O_c2 of the binocular system for measuring the grabbing robot pose and mounting-platform position;
Step 2: the binocular system (1) for part recognition performs feature recognition on the part (5), establishes the starting coordinate system O_w1 of the part (5) in the binocular system (1), and obtains the feature vector M_w1 of the starting position of the part;
Step 3: calibrate to obtain the transformation T_c1^e from the part-recognition binocular system coordinate system to the robot end-effector coordinate system, and the transformation T_c2^e from the binocular system for measuring the grabbing robot pose and mounting-platform position to the robot end-effector coordinate system;
Step 4: read from the part-grabbing robot controller the transformation T_e^b from the robot end-effector coordinate system to the robot base coordinate system; compute the transformation T_w1^b from the part starting coordinate system to the robot base coordinate system; the part starting coordinate system expressed in the robot base frame is O_w1^b, and the feature vector in the robot base frame is M_w1^b;
Step 5: calibrate to obtain the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system;
Step 6: the binocular system (7) for measuring the grabbing robot pose and mounting-platform position measures the marker points on the cabin assembly platform (4) to obtain their coordinates in the binocular system coordinate system O_c2, from which the transformation matrix T_p^c2 from the cabin assembly platform coordinate system O_p to the binocular system coordinate system O_c2 is computed;
Step 7: obtain the transformation T_o^b from the cabin coordinate system to the robot base coordinate system;
Step 8: obtain the coordinates O_w2(x_w2, y_w2, z_w2) of the target position at which the part (5) is installed in the cabin coordinate system O_o and the target-position feature vector M_w2; compute the target position coordinates in the robot base frame as O_w2^b, and the feature vector of M_w2 in the robot base frame as M_w2^b;
Step 9: input into the part-grabbing robot controller the part starting position coordinates O_w1^b(x_w1^b, y_w1^b, z_w1^b) and feature vector M_w1^b, and the target position coordinates O_w2^b(x_w2^b, y_w2^b, z_w2^b) and feature vector M_w2^b; control the end effector to locate and grab the part (5) from the material tray holding the parts by means of the part-recognition binocular system (1), to locate the specified installation position of the part inside the cabin (6) by means of the binocular system (7) for measuring the grabbing robot pose and mounting-platform position, and to place the part at the specified position.
2. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 1, wherein the part-recognition binocular system coordinate system O_c1 and the coordinate system of the binocular system for measuring the grabbing robot pose and mounting-platform position each take the optical center of the respective binocular system as the coordinate origin, with the Z axis parallel to the optical axis of the binocular vision system.
3. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 2, wherein the binocular system for part recognition and the binocular system for measuring the grabbing robot pose and mounting-platform position are each calibrated to the robot end effector by a hand-eye calibration method.
4. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 2, wherein the calibration method for the part-recognition binocular system is as follows:
4.1 fix a planar calibration board, control the part-grabbing robot to move to a given pose, and have the binocular vision system acquire an image of the calibration board;
4.2 the binocular vision system acquires the image at the current position, detects the corner points of the calibration board with a corner-detection algorithm, establishes a world coordinate system W on the calibration-board plane, and solves for the transformation T_w^c1 from the calibration-board world coordinates to the binocular vision system coordinate system O_c1 at the current position;
4.3 read the current robot end pose from the robot controller, i.e., the transformation T_b^e from the current robot base coordinate system O_b to the end coordinate system O_e;
4.4 control the robot to move through n positions; for the motion from the i-th position to the (i+1)-th position, let A_i denote the transformation of the robot end pose and B_i the corresponding transformation of the binocular vision system; the transformation T_c1^e from the binocular vision system coordinate system to the robot end coordinate system is the hand-eye relationship to be calibrated, i.e., it satisfies A_i · T_c1^e = T_c1^e · B_i.
5. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 2, wherein the base coordinate system O_b of the part-grabbing robot takes the center of the bottom surface of the robot base as the origin, and the three-dimensional coordinate system is established with reference to the plane of the base bottom surface.
6. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 5, wherein the end-effector coordinate system O_e is read from the part-grabbing robot controller in real time.
7. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 6, wherein the cabin assembly platform coordinate system O_p takes one end point of the cabin assembly platform as the coordinate origin O_p0, the horizontal line of the platform plane as the X_p axis, and the vertical line of the platform plane as the Y_p axis.
8. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 7, wherein the part starting-position coordinate system O_w1 takes the center of the part feature as the origin, the plane of the part surface as the X_w1-O_w1-Y_w1 plane, and the normal of that plane as the Z_w1 axis; the feature vector M_w1 represents the direction of the Z_w1 axis.
9. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 8, wherein in the cabin coordinate system O_o the center of the cabin bottom is the origin O, the X axis is the direction from the origin toward the 0-degree line of quadrant I of the cabin, and the Y axis is the direction from the origin toward the 0-degree line of quadrant II of the cabin.
10. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 9, wherein the calibration method for the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system is as follows: place the cabin on the assembly platform and position and fix it with the positioning device on the assembly platform, so that the X_o axis of the cabin is parallel to the X_p axis of the assembly platform and the Y_o axis of the cabin is parallel to the Y_p axis of the assembly platform; measure the features of the cabin and of the assembly platform with a laser tracker, obtain the origin coordinates of the two coordinate systems, and compute the transformation T_o^p from the cabin coordinate system to the assembly platform coordinate system.
11. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 1, wherein the feature recognition speed of the binocular system is 2 s per feature and the recognition accuracy of part features is 0.2 mm.
12. The part coordinate calibration method based on structural features and oriented to automatic assembly according to claim 1, wherein the method for judging that the part has been placed at the specified position is as follows: with the center of the part feature as the origin O_w2, the plane of the part surface as the X_w2-O_w2-Y_w2 plane, and the normal of that plane as the Z_w2 axis, establish the three-dimensional coordinate system O_w2; the direction of the feature vector M_w2 coincides with the Z_w2 axis, and the coordinates of the target position point coincide with the origin O_w2.
CN201811310133.3A 2018-11-05 2018-11-05 Automatic-assembly-oriented part coordinate calibration method based on structural features Active CN109341532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811310133.3A CN109341532B (en) 2018-11-05 2018-11-05 Automatic-assembly-oriented part coordinate calibration method based on structural features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811310133.3A CN109341532B (en) 2018-11-05 2018-11-05 Automatic-assembly-oriented part coordinate calibration method based on structural features

Publications (2)

Publication Number Publication Date
CN109341532A CN109341532A (en) 2019-02-15
CN109341532B true CN109341532B (en) 2020-11-10

Family

ID=65314135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811310133.3A Active CN109341532B (en) 2018-11-05 2018-11-05 Automatic-assembly-oriented part coordinate calibration method based on structural features

Country Status (1)

Country Link
CN (1) CN109341532B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110659652B (en) * 2019-09-10 2023-08-18 东华大学 Feature matching detection system of fan device Creo model
CN110906863B (en) * 2019-10-30 2022-01-28 成都绝影智能科技有限公司 Hand-eye calibration system and calibration method for line-structured light sensor
CN113551661A (en) * 2020-04-23 2021-10-26 曰轮法寺 Pose identification and track planning method, device and system, storage medium and equipment
CN111571596B (en) * 2020-05-26 2022-11-11 上海交通大学 Method and system for correcting robot errors in metallurgical patching and assembly operations using vision
CN111890366B (en) * 2020-08-05 2023-06-13 北京航空航天大学 Mechanical arm object grabbing planning principle and ROS-based implementation method
CN112453844B (en) * 2020-10-26 2022-04-22 航天材料及工艺研究所 An integrated assembly method for box-shaped parts positioning, measurement and compensation based on feature holes
CN113361033B (en) * 2021-06-07 2022-06-14 百斯图工具制造有限公司 Blade assembly surface positioning method, system, server and storage medium
CN113771032A (en) * 2021-09-13 2021-12-10 中国航空无线电电子研究所 Intelligent wire harness assembling auxiliary system based on man-machine cooperation robot

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5297238A (en) * 1991-08-30 1994-03-22 Cimetrix Incorporated Robot end-effector terminal control frame (TCF) calibration method and device
JPH08132373A (en) * 1994-11-08 1996-05-28 Fanuc Ltd Coordinate system coupling method in robot-sensor system
CN104165586A (en) * 2013-05-17 2014-11-26 上海三菱电梯有限公司 Non-contact high-precision calibration method and application of workpiece coordinate system of robot
CN106584093A (en) * 2015-10-20 2017-04-26 沈阳新松机器人自动化股份有限公司 Self-assembly system and method for industrial robots
CN105345453A (en) * 2015-11-30 2016-02-24 北京卫星制造厂 Position-posture determining method for automatically assembling and adjusting based on industrial robot
CN107538508A (en) * 2017-02-16 2018-01-05 北京卫星环境工程研究所 The robot automatic assembly method and system of view-based access control model positioning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Robot-based auxiliary assembly method for large-size cabin-section brackets; Chen Yujie et al.; Journal of Donghua University (Natural Science Edition); 2016-10-31; Vol. 42, No. 5; pp. 745-751 *

Also Published As

Publication number Publication date
CN109341532A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN109341532B (en) Automatic-assembly-oriented part coordinate calibration method based on structural features
CN110370286B (en) Recognition method of fixed-axis motion rigid body space position based on industrial robot and monocular camera
CN110116407B (en) Flexible robot pose measurement method and device
CN109623656B (en) Mobile double-robot cooperative polishing device and method based on thickness online detection
CN112325796A (en) Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing
CN104690551B (en) A kind of robot automation's assembly system
CN101666619B (en) Method for calculating absolute coordinates of work piece
CN110202573B (en) Full-automatic hand-eye calibration and working plane calibration method and device
WO2023193362A1 (en) Hybrid robot and three-dimensional vision based large-scale structural part automatic welding system and method
CN108182689A (en) The plate workpiece three-dimensional recognition positioning method in polishing field is carried applied to robot
CN112648934B (en) Automatic elbow geometric form detection method
CN111531407B (en) A Fast Measurement Method of Workpiece Pose Based on Image Processing
CN108942918B (en) Stereo positioning method based on line structured light
CN116079732B (en) In-cabin component assembly method based on laser tracker and binocular vision hybrid guidance
CN113681563B (en) Assembling method and system based on double cameras
CN112356073B (en) Three-dimensional camera posture online calibration device and method for industrial robots
CN111609847B (en) Automatic planning method of robot photographing measurement system for thin plate
CN105698678B (en) A kind of basis coordinates system scaling method of the horizontal automatic drill riveter of aircraft target ship
CN112958960A (en) Robot hand-eye calibration device based on optical target
CN113781558A (en) Robot vision locating method with decoupled posture and position
Jian et al. On-line precision calibration of mobile manipulators based on the multi-level measurement strategy
CN217530907U (en) A device for secondary precise positioning of AGV based on laser ranging
CN112828878B (en) A three-dimensional measurement and tracking method for large-scale equipment docking process
CN115183677B (en) Inspection and positioning system for automobile assembly
CN111687845A (en) Mechanical arm kinematics parameter calibration method based on inertia measurement unit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant