CN102794763B - Calibration method of welding robot system based on line structured light vision sensor guidance
Publication number: CN102794763B (granted 2014-09-24); published as CN102794763A on 2012-11-28
Application number: CN201210318783.9A, filed 2012-08-31
Authority: CN (China)
Legal status: Active (granted)
Description
Technical Field
The present invention relates to a robot vision sensor and its calibration method, in particular to a calibration method for a welding robot system guided by a line structured light vision sensor, and more specifically to a method for rapidly calibrating the robot's hand-eye relationship matrix and the parameters of the line structured light vision sensor.
Background Art
Welding is characterized by complex process factors, high labor intensity, long production cycles and a poor working environment. Weld quality depends on the operator's skill, technique and experience, and is also affected by the operator's mood and physical condition. Welding automation is therefore of great significance for improving joint quality and ensuring stability. The key problem in welding automation is automatic seam tracking; an intelligent welding robot guided by laser vision combines weld-seam image recognition with robot motion control and can effectively solve this problem.
Calibration (including camera parameter calibration, calibration of the line-laser light-plane equation and calibration of the hand-eye transformation matrix) is a critical step in a vision measurement system: the accuracy of the calibration results and the stability and real-time performance of the algorithm directly determine the accuracy of measurement and tracking in industrial production. The parameters of a line structured light vision sensor comprise the intrinsic camera parameters (focal length, principal point, distortion coefficients, etc.) and the structural parameters of the sensor (i.e. the equation of the laser light plane in the camera coordinate system). Thanks to the contributions of Tsai, Zhang Zhengyou, Hu Zhanyi and others to vision calibration, the intrinsic camera parameters can be obtained easily. Many methods exist for calibrating the structural parameters of a line structured light vision sensor, such as the wire-drawing method proposed by R. Dewar and the cross-ratio-invariance method proposed by D. Q. Huynh; the intersection-line-based calibration of the laser light-plane equation proposed by Bi Dexue is robust, simple to implement, offers easily extracted features and meets the requirements of on-site calibration. According to the measurement model, recovering the three-dimensional coordinates of the measured surface from the data returned by the sensor requires the hand-eye transformation matrix H from the coordinate system of the manipulator end-effector to the camera coordinate system; in machine vision this problem is called hand-eye calibration. The common robot hand-eye calibration approach uses a known calibration reference object (calibration block): the manipulator is controlled to observe the fixed reference object from different orientations, and the relationship between R, t and the multiple observations is derived, where R is the rotation part and t the translation part of the hand-eye transformation matrix H.
Summary of the Invention
The purpose of the present invention is to overcome the deficiencies of the prior art and to provide a calibration method for a welding robot system guided by a line structured light vision sensor that is flexible, accurate, fast, stable, real-time capable, simple in procedure, computationally light and widely applicable.
According to the technical solution provided by the present invention, the calibration method for the welding robot system guided by the line structured light vision sensor comprises the following steps:
In the first step, the manipulator is controlled to change pose so that the camera images an arbitrarily placed, stationary circular target from several viewpoints. The chosen poses must keep all dots of the circular target within the camera's field of view, and the part of the stripe produced on the target by the light plane of the line laser fixed to the camera must also lie within the field of view. After the circular-target images have been acquired, the centre coordinates of the circular spots are extracted and their row and column indices identified, so that the target image is matched to world coordinates; the intrinsic parameter matrix of the camera is then obtained with Zhang Zhengyou's calibration algorithm:

    | α  γ  u0 |
    | 0  β  v0 |
    | 0  0   1 |

where α = f/dx and β = f/dy, f is the camera focal length, dx and dy are the length and width of a single CCD photosensitive element in the camera, γ is a physical quantity reflecting the skew of the CCD element array, and (u0, v0) are the pixel coordinates of the intersection of the lens optical axis with the CCD sensor.
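A minimal sketch of how this intrinsic calibration could be run with OpenCV is given below, assuming the dot centres have already been extracted and matched to world coordinates as in sub-steps 1.1 to 1.6 below. The function and variable names are illustrative, and OpenCV's camera model assumes zero skew (γ = 0), so it returns fx, fy, u0 and v0 rather than the full five-parameter matrix.

```python
# Minimal sketch (not the patent's own code): Zhang-style intrinsic calibration
# from matched circle centres, using OpenCV.
import numpy as np
import cv2

def calibrate_intrinsics(object_points, image_points, image_size):
    """object_points: list of (N,3) float32 arrays of dot centres in target (world)
    coordinates, one per pose; image_points: list of matched (N,1,2) float32 pixel
    centres; image_size: (width, height)."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    # K = [[fx, 0, u0], [0, fy, v0], [0, 0, 1]] (OpenCV assumes zero skew);
    # rvecs/tvecs give the extrinsic matrix RT of the target plane for each pose.
    return K, dist, rvecs, tvecs
```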
In the second step, from the circular-target images with the laser stripe obtained in the first step, the line-laser stripe is extracted and thinned, and the line equation of the stripe is obtained by the Hough transform; using the extrinsic parameter matrix RT obtained in the first step, the plane equation of the laser light plane in the camera coordinate system is then obtained.
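A sketch of one way to realize this step is shown below: instead of the dual-relation computation of steps 3.1 to 3.3, stripe pixels are back-projected onto the known target plane (z_w = 0) of each pose using the intrinsics K and the extrinsics R, t, and a plane is fitted to the resulting 3D points by SVD. The line (a, b, c) is assumed to come from step 2.3; all names are illustrative, not the patent's own code.

```python
import numpy as np

def line_points_on_target(K, R, t, line_abc, u_samples):
    """Back-project pixels lying on the stripe line a*u + b*v + c = 0 onto the
    target plane z_w = 0 and return their 3D coordinates in the camera frame."""
    a, b, c = line_abc
    n = R[:, 2]                      # normal of the target plane in camera coords
    d = float(n @ np.ravel(t))       # X_c = R*X_w + t, plane passes through t
    K_inv = np.linalg.inv(K)
    pts = []
    for u in u_samples:
        v = -(a * u + c) / b         # assumes the stripe is not a vertical image line
        ray = K_inv @ np.array([u, v, 1.0])   # viewing-ray direction
        s = d / float(n @ ray)       # ray/plane intersection: n.(s*ray) = n.t
        pts.append(s * ray)
    return pts

def fit_plane(points):
    """Least-squares plane A*x + B*y + C*z + D = 0 through 3D points (SVD)."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    _, _, vh = np.linalg.svd(P - centroid)
    normal = vh[-1]
    return np.append(normal, -normal @ centroid)   # [A, B, C, D]
```

Collecting points from the stripe lines of two or more poses and calling fit_plane on the combined set gives the laser light plane in camera coordinates.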
In the third step, from the manipulator pose recorded for each camera pose in the first step, the transformation matrices between the manipulator end-effector coordinate system and the manipulator base coordinate system are used with the quaternion method to compute the hand-eye transformation matrix H.
In the fourth step, the welding workpiece is mounted at the end of the manipulator and, with the end-effector held in a fixed pose, is controlled to touch a point on the circular target precisely; the coordinates of the workpiece tip in manipulator coordinates are computed and, combined with the manipulator pose, the offset value of the workpiece in that pose is obtained.
The first step comprises the following sub-steps:
1.1. The camera acquires images of the circular target online in real time, and the image is binarized with an adaptive threshold obtained by Otsu's method so that the circular spots on the target image stand out.
1.2. A morphological closing operation is applied to the circular-target image to remove noise. All candidate dots in the binarized image are labelled; the area of a dot is the number of pixels it contains. The pixel counts of the dots are accumulated, the mean number of pixels per labelled dot is denoted NP_average, and labelled regions containing fewer than 0.5*NP_average or more than 1.5*NP_average pixels are discarded.
1.3. The perimeter C and area S of each remaining labelled dot are computed, and the circularity measure e = C²/(2*π*S) is used to distinguish target dots from interference.
1.4. The centre coordinates (x0, y0) of every target dot are obtained with the centre-of-gravity method:

    x0 = Σi Σj ( j · Hit[i][j] ) / Σi Σj Hit[i][j],   y0 = Σi Σj ( i · Hit[i][j] ) / Σi Σj Hit[i][j]

where m and n are the numbers of rows and columns of the target image, the sums run over i = 1…m and j = 1…n, and Hit[i][j] indicates whether pixel (i, j) lies on the target dot: Hit[i][j] = 1 if it does, otherwise Hit[i][j] = 0 (a code sketch of steps 1.1 to 1.4 follows step 1.6).
1.5. The circular target carries a number of dots arranged in rows and columns; the spacing between adjacent rows is 15 mm, the distance between the centres of two adjacent dots in the same row is 10 mm, and a reference circle is located 5 mm, along the row direction, from the first dot of the first row. Using the pixel coordinates of the dot centres obtained in step 1.4, the centre points in the target image are ordered with the help of the reference circle.
1.6. From the ordering of the dots in the target image and the given target information, the pixel coordinates of the dot centres in the target image are matched to their world coordinates.
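A compact sketch of sub-steps 1.1 to 1.4 (Otsu binarization, closing, area and circularity filtering, centre-of-gravity extraction) using OpenCV is given below; the circularity cutoff value and the use of contours for labelling are assumptions, since the text does not fix them.

```python
import numpy as np
import cv2

def extract_dot_centres(gray):
    """Return the centre-of-gravity coordinates of the target dots in a grayscale image."""
    # 1.1: Otsu binarization (use THRESH_BINARY_INV if the dots are dark on a light target)
    _, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 1.2: morphological closing to suppress noise
    bw = cv2.morphologyEx(bw, cv2.MORPH_CLOSE, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)  # OpenCV 4.x
    areas = [cv2.contourArea(c) for c in contours]
    np_average = np.mean(areas) if areas else 0.0
    centres = []
    for c, S in zip(contours, areas):
        if not (0.5 * np_average <= S <= 1.5 * np_average):       # 1.2: area filter
            continue
        C = cv2.arcLength(c, True)
        e = C * C / (2.0 * np.pi * S)                             # 1.3: circularity (2 for an ideal circle)
        if e > 2.5:       # cutoff is an assumed value, not stated in the patent
            continue
        mask = np.zeros(gray.shape, np.uint8)
        cv2.drawContours(mask, [c], -1, 255, -1)
        ys, xs = np.nonzero(mask)                                 # 1.4: Hit-weighted mean
        centres.append((xs.mean(), ys.mean()))
    return centres
```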
The second step comprises the following sub-steps:
2.1. The line laser is fixed to the camera; when the camera captures images it is only necessary to ensure that all target dots and the laser-stripe section are within the field of view. The captured laser-stripe image is binarized with a preset threshold, and a morphological closing is applied to the binary image to remove singular edge points.
2.2. The centre of the laser-stripe region is processed with 8-neighbourhood marking. Let p1 denote the current centre point of the stripe region and let p2, p3, …, p9 be the 8 points of its neighbourhood taken clockwise around p1, with p2 directly above p1. During the 8-neighbourhood marking of p1, a boundary point is marked if it simultaneously satisfies the following conditions:
(i) 2 ≤ N(p1) ≤ 6;
(ii) S(p1) = 1;
(iii) p2*p4*p6 = 0;
(iv) p4*p6*p8 = 0;
where N(p1) is the number of non-zero neighbours of p1 and S(p1) is the number of 0→1 transitions in the ordered sequence p2, p3, …, p9. After all boundary points have been checked, all marked points are removed, and the procedure is iterated until no point satisfies the marking conditions, which completes the thinning of the stripe (see the sketch after step 2.3).
2.3. The Hough transform is applied to the thinned points to obtain the line equation ak·x + bk·y + ck = 0 of the laser stripe on the circular-target plane, where ak, bk and ck are the parameters of the line equation.
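The sketch below implements the four marking conditions of step 2.2 literally and adds a Hough line fit for step 2.3; cv2.HoughLines is only one possible realization of the Hough step, and the names are illustrative.

```python
import numpy as np
import cv2

def thin_stripe(bw):
    """bw: binary image with stripe pixels non-zero. Returns the thinned image."""
    img = (np.asarray(bw) > 0).astype(np.uint8)
    changed = True
    while changed:
        changed = False
        to_delete = []
        ys, xs = np.nonzero(img)
        for y, x in zip(ys, xs):
            if y == 0 or x == 0 or y == img.shape[0] - 1 or x == img.shape[1] - 1:
                continue
            # p2..p9 clockwise, with p2 directly above p1 = (y, x)
            p = [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
                 img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]
            N = sum(p)                                                  # condition (i)
            S = sum((p[i] == 0 and p[(i + 1) % 8] == 1) for i in range(8))  # (ii)
            if (2 <= N <= 6 and S == 1 and
                    p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0):   # (iii), (iv)
                to_delete.append((y, x))
        for y, x in to_delete:       # remove all marked points, then iterate again
            img[y, x] = 0
            changed = True
    return img

def stripe_line(thinned):
    """Fit a*x + b*y + c = 0 to the thinned stripe via the Hough transform."""
    lines = cv2.HoughLines((thinned * 255).astype(np.uint8), 1, np.pi / 180, 50)
    rho, theta = lines[0][0]                     # strongest line
    return np.cos(theta), np.sin(theta), -rho    # a, b, c
```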
The third step comprises the following sub-steps:
3.1. The plane equation of the circular target in the world coordinate system can be written as π1^T·Xw = 0, where π1 = [0, 0, 1, 0]^T and Xw is a target point in homogeneous world coordinates. Using the extrinsic parameter matrix RT obtained in the first step, the plane equation of the target plane in the camera coordinate system is computed as ([R, t; 0, 1])^(-T)·π1; the plane normal coordinate vectors of the target plane in the world and camera coordinate systems are distinguished accordingly (a code sketch follows step 3.3).
3.2. In the camera coordinate system, define the normal coordinate vector of the laser plane, the normal coordinate vectors of the i-th and j-th circular-target planes, and the coordinate vectors of the images of the laser intersection lines within the i-th and j-th planar targets. Note that the equation λi of the laser intersection-line image on the i-th planar target image, together with the associated plane vector, can be computed directly.
3.3. The laser light-plane equation is then obtained from the dual relationship, in projective space, between the two laser intersection lines and their associated planes.
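A short sketch of step 3.1 is given below, expressing the target plane π1 = [0,0,1,0]^T in the camera frame through the extrinsics R, t of one pose; the function name is illustrative.

```python
import numpy as np

def target_plane_in_camera(R, t):
    """Return the 4-vector [A, B, C, D] of the target plane in camera coordinates,
    given the pose transformation X_c = R X_w + t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.ravel(t)
    pi_w = np.array([0.0, 0.0, 1.0, 0.0])     # plane z_w = 0 in world coordinates
    pi_c = np.linalg.inv(T).T @ pi_w          # dual (plane-coordinate) transformation
    return pi_c / np.linalg.norm(pi_c[:3])    # normalize the normal part
```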
The fourth step comprises the following sub-steps:
4.1. Take any two poses from the first step. Let Φ denote the transformation matrix of the manipulator end-effector coordinate system between the two poses and let Θ denote the transformation matrix of the camera coordinate system between the two poses; the constraint can then be written compactly as ΦH = HΘ.
4.2. ΦH = HΘ expands into
RΦ·RH = RH·RΘ    ①
RΦ·tH + tΦ = RH·tΘ + tH    ②
where RΦ, RΘ, RH are the rotation parts of Φ, Θ, H and tΦ, tΘ, tH are the corresponding translation parts;
4.3. Let qΦ, qH and qΘ be the unit quaternions corresponding to RΦ, RH and RΘ; equation ① is then equivalent to qΦ⊗qH = qH⊗qΘ, a linear constraint on qH from which the rotation RH is obtained;
4.4. Substituting the rotation matrices into ② then yields tH, giving the hand-eye transformation matrix H.
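A compact sketch of one standard quaternion formulation of steps 4.1 to 4.4 follows: the rotation quaternion qH is recovered as the null vector of a linear system built from quaternion left and right multiplication matrices, and tH follows from equation ② by linear least squares. SciPy is used only for the rotation/quaternion conversion; this illustrates the approach and is not necessarily the patent's exact derivation.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def _quat_wxyz(R):                       # quaternion (w, x, y, z) of a rotation matrix
    x, y, z, w = Rotation.from_matrix(R).as_quat()
    q = np.array([w, x, y, z])
    return q if w >= 0 else -q           # fixed sign so all pose-pair quaternions are consistent

def _left(q):                            # q (x) p = _left(q) @ p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def _right(q):                           # p (x) q = _right(q) @ p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_hand_eye(Phis, Thetas):
    """Phis, Thetas: lists of 4x4 motions of the end-effector frame and of the
    camera frame between pose pairs. Returns the 4x4 hand-eye matrix H."""
    # equation (1): (L(q_Phi) - R(q_Theta)) q_H = 0, stacked over all pose pairs
    A = np.vstack([_left(_quat_wxyz(P[:3, :3])) - _right(_quat_wxyz(Q[:3, :3]))
                   for P, Q in zip(Phis, Thetas)])
    _, _, vh = np.linalg.svd(A)
    qH = vh[-1] / np.linalg.norm(vh[-1])             # null-space vector = unit quaternion
    RH = Rotation.from_quat([qH[1], qH[2], qH[3], qH[0]]).as_matrix()
    # equation (2): (R_Phi - I) t_H = R_H t_Theta - t_Phi, solved in least squares
    M = np.vstack([P[:3, :3] - np.eye(3) for P in Phis])
    b = np.concatenate([RH @ Q[:3, 3] - P[:3, 3] for P, Q in zip(Phis, Thetas)])
    tH, *_ = np.linalg.lstsq(M, b, rcond=None)
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = RH, tH
    return H
```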
The computation of the welding-workpiece displacement value in the fourth step comprises the following sub-steps:
5.1. With the end-effector of the manipulator held in a fixed pose, the welding workpiece is controlled to touch a preset point on the circular target precisely; the current manipulator pose is read and the first coordinate value, i.e. the coordinates of the manipulator end-effector, is obtained.
5.2. The coordinates of the preset point in the world coordinate system are defined; using the extrinsic parameter matrix RT obtained in the first step, its coordinates in the camera coordinate system are obtained; from the hand-eye transformation matrix H and the manipulator pose at which the target image was captured, the second coordinate value of the preset point in the manipulator base coordinate system is computed. Comparing the second coordinate value with the first coordinate value gives the displacement value of the welding workpiece in the current pose.
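The chain of transforms in steps 5.1 and 5.2 could be evaluated as sketched below; the direction convention assumed for H (end-effector to camera, hence the inverse when going the other way) and all names are assumptions for illustration.

```python
import numpy as np

def workpiece_offset(p_world, R, t, H, T_base_end, p_touched_base):
    """p_world: preset point on the target in world coordinates (3-vector);
    R, t: extrinsics of the target for the calibration image; H: hand-eye matrix
    (assumed to map end-effector coordinates to camera coordinates);
    T_base_end: 4x4 manipulator pose (base -> end-effector) when that image was
    taken; p_touched_base: the first coordinate value from step 5.1."""
    p_cam = R @ np.asarray(p_world) + np.ravel(t)            # world -> camera
    p_end = np.linalg.inv(H) @ np.append(p_cam, 1.0)         # camera -> end-effector
    p_base = T_base_end @ p_end                              # end-effector -> base (second value)
    return np.asarray(p_touched_base) - p_base[:3]           # workpiece offset in this pose
```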
When the reference circle is used to order the centre points in the target image, the two valid dots on the target image whose centres are closest to each other are found; of these two, the one farther from the mean centre of all dot centres is defined as point 0 and the other as point 1. Points 0 and 1 are marked as matched; among the unmatched points, the point nearest to point 1 is defined as point 2 and marked as matched; among the remaining unmatched points, the point nearest to point 2 is defined as point 3 and marked as matched, and so on until every point has been assigned, which completes the ordering of the centre points.
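A sketch of the greedy ordering just described is given below; the Euclidean distance and the tie-breaking are the obvious choices and are assumptions where the text is silent.

```python
import numpy as np

def order_dot_centres(centres):
    """centres: (N,2) array of dot-centre pixel coordinates. Returns the indices
    of the dots in the order 0, 1, 2, ... defined in the text."""
    C = np.asarray(centres, dtype=float)
    mean_centre = C.mean(axis=0)
    # closest pair of centres: reference circle (point 0) and its neighbour (point 1)
    d = np.linalg.norm(C[:, None, :] - C[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    if np.linalg.norm(C[i] - mean_centre) < np.linalg.norm(C[j] - mean_centre):
        i, j = j, i                      # point 0 is the one farther from the mean centre
    order, unmatched = [i, j], set(range(len(C))) - {i, j}
    while unmatched:                     # repeatedly take the nearest unmatched point
        last = order[-1]
        nxt = min(unmatched, key=lambda k: np.linalg.norm(C[k] - C[last]))
        order.append(nxt)
        unmatched.remove(nxt)
    return order
```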
Advantages of the present invention: by analysing the imaging principle of the camera, the principle of structured-light measurement and the working principle of the hand-eye system, the present invention provides a simple and flexible calibration method for three-dimensional tracking with a structured-light-guided manipulator, covering camera intrinsic calibration, calibration of the line-laser light-plane equation, calibration of the hand-eye transformation matrix and calibration of the workpiece offset. The method overcomes the harsh conditions and cumbersome procedures of traditional laser light-plane and hand-eye matrix calibration. The overall calibration task is completed simply by moving the manipulator through three or more arbitrary poses while imaging a fixed target, and a structured-light-guided manipulator tracking system using this method achieves high tracking accuracy. The method makes the geometry of the line laser and camera adjustable and allows on-site calibration of the system's structural parameters, greatly increasing the flexibility of the structured-light guidance system; it is of practical significance for visual measurement and tracking.
Description of the Drawings
Fig. 1 is a flow chart of the calibration of the present invention.
Fig. 2 is the pinhole imaging model of the camera of the present invention.
Fig. 3 is a schematic diagram of the dots on the circular target of the present invention.
Fig. 4 is a flow chart of line-structured-light-guided manipulator tracking according to the present invention.
Fig. 5 is the dual representation, in projective space, of two laser intersection lines according to the present invention.
Fig. 6 shows the relationship of the camera fixed at the end of the manipulator, i.e. the hand-eye relationship, of the present invention.
Detailed Description of the Embodiments
The purpose of the present invention is to overcome the harsh conditions and cumbersome procedures of traditional line-laser light-plane and hand-eye calibration and to propose a calibration method that is flexible, accurate, fast, stable, real-time capable, simple, computationally light and widely applicable.
The present invention is further described below with reference to the drawings and embodiments.
Using the circular target shown in Fig. 3, the present invention achieves fast localization of the dot centres and automatic matching of the world and pixel coordinates of the centre feature points, which facilitates automated calibration of the camera's intrinsic parameters. The mathematical model of the line-structured-light-guided manipulator tracking system is analysed, the quaternion method is introduced to solve the hand-eye matrix, and the overall calibration procedure is simplified: camera calibration, calibration of the laser light-plane equation and calibration of the hand-eye matrix are all achieved simply by controlling the manipulator to image the fixed circular target from three or more arbitrary poses, providing sufficient conditions for accurate tracking. Precise extraction of the centre of the laser stripe allows image feature points to be detected online in real time, stably and accurately, ensuring the stability and reliability of the calibration and tracking system. With the calibrated structural parameters, real-time and accurate tracking of three-dimensional feature points by the line-structured-light-guided manipulator is realized.
The technique of extracting the sub-pixel centre coordinates of the circular spots, automatically identifying the row and column indices of the dots and automatically matching them to world coordinates works as follows: the image is preprocessed by binarization and morphological erosion; non-target regions are removed according to their area and whether they lie at the image border; interference points are removed with the circularity criterion; the circular spots are labelled and the labelled regions extracted; after Gaussian filtering, the centre-of-gravity method is used to extract the sub-pixel coordinates of the dot centres. The circular target carries centre points arranged in rows and columns, with a row spacing of 15 mm, a spacing of 10 mm between adjacent centres in the same row, and a reference circle located 5 mm, along the row direction, from the first point of the first row. First the two valid dots whose centres are closest to each other are found; from the layout of the target it follows that the reference circle is closest to the dot directly below it. The dot farther from the mean centre of all dot centres is defined as point 0 and the other as point 1; here the reference circle is point 0 and the dot directly below it is point 1. Points 0 and 1 are marked as matched; among the unmatched points the one nearest to point 1 is defined as point 2 and marked as matched; the one nearest to point 2 is defined as point 3 and marked as matched, and so on until all points are assigned, completing the ordering of the centre points. From the ordering of the dots and the given target information, the pixel coordinates of the dot centres in the target image are matched to their world coordinates.
The calibration method based on the dual relationship calibrates the laser plane equation using the duality, in projective space, between two laser intersection lines and their associated planes; the calibration yields the plane equation of the laser light plane in the camera coordinate system.
The technique of solving the hand-eye transformation matrix with the quaternion method uses the already calibrated transformation matrix between the world and camera coordinate systems together with the manipulator pose matrix: a constraint equation is constructed and solved through the correspondence between unit quaternions and rotation matrices, yielding the hand-eye transformation matrix H. Feature points are then determined quickly from images acquired in real time on site, and the laser plane equation and hand-eye transformation matrix obtained above enable accurate tracking of three-dimensional feature points.
As shown in Fig. 1, the calibration procedure of the present invention is as follows:
The calibration procedure consists of four parts: camera calibration, calibration of the line-laser light-plane equation, calibration of the hand-eye transformation matrix, and calibration of the workpiece offset. A band-pass filter and a CCD are used to acquire images of the laser stripe scanning the weld seam, and the stripe is thinned using the medial-axis-transform principle to obtain its thinned centre. The manipulator is controlled to image the fixed target from three or more arbitrary poses; the intrinsic and extrinsic camera parameters are computed from the extracted target circle centres and their matched data; the laser light-plane equation is calibrated from the obtained extrinsics and the laser line equations extracted from the target images; the change of the extrinsics and the change of the manipulator pose between every two manipulator motions are used to construct the equation ΦH = HΘ, which is solved for the hand-eye transformation matrix H by the quaternion method.
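To make the overall flow concrete, the helper sketches above could be strung together roughly as follows; all helper names refer to the earlier sketches or are hypothetical placeholders, and the frame conventions noted in the comments are assumptions rather than the patent's stated conventions.

```python
import numpy as np
import cv2

def to_hom(R, t):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.ravel(t)
    return T

def calibrate_system(images, robot_poses, target_points, image_size):
    """images: circular-target images taken from three or more arbitrary poses;
    robot_poses: matching 4x4 base->end-effector matrices read from the robot;
    target_points: (N,3) world coordinates of the target dots (10/15 mm grid)."""
    # first step: dot extraction, ordering, matching, intrinsic calibration
    # (matched_image_points is a hypothetical wrapper around the earlier sketches)
    img_pts = [np.float32(matched_image_points(im)).reshape(-1, 1, 2) for im in images]
    obj_pts = [np.float32(target_points)] * len(images)
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)

    # second step: laser light plane in the camera frame from the stripe lines of all poses
    pts3d = []
    for im, rv, tv in zip(images, rvecs, tvecs):
        R = cv2.Rodrigues(rv)[0]
        stripe = extract_stripe(im)               # hypothetical stripe segmentation (step 2.1)
        line = stripe_line(thin_stripe(stripe))   # steps 2.2-2.3
        pts3d += line_points_on_target(K, R, tv, line, range(0, image_size[0], 10))
    laser_plane = fit_plane(pts3d)

    # hand-eye matrix from pairs of poses; with these conventions PHI*H = H*THETA holds
    # for H mapping camera coordinates to end-effector coordinates (a convention
    # choice: invert H if the opposite sense is wanted)
    Tc = [to_hom(cv2.Rodrigues(rv)[0], tv) for rv, tv in zip(rvecs, tvecs)]
    Phis = [np.linalg.inv(robot_poses[k]) @ robot_poses[k + 1] for k in range(len(images) - 1)]
    Thetas = [Tc[k] @ np.linalg.inv(Tc[k + 1]) for k in range(len(images) - 1)]
    H = solve_hand_eye(Phis, Thetas)
    return K, dist, laser_plane, H
```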
In the first step, the manipulator is controlled to change pose so that the camera images the arbitrarily placed, stationary circular target from several viewpoints, with all target dots and the laser stripe within the field of view; the dot centres are extracted, ordered and matched to world coordinates, and the intrinsic parameter matrix of the camera is obtained with Zhang Zhengyou's calibration algorithm, exactly as described in steps 1.1 to 1.6 above.
In the second step, for the target images with the laser stripe obtained above, the stripe is extracted and thinned and its line equation is obtained by the Hough transform; combined with the extrinsic matrix from the first step, this gives the plane equation of the laser light plane in the camera coordinate system (steps 2.1 to 2.3 above).
In the third step, the transformation matrices between the manipulator end-effector coordinate system and the manipulator base coordinate system are computed from the six-axis manipulator pose recorded for each camera pose in the first step, and the equation ΦH = HΘ is solved by the quaternion method. Here Φ, H and Θ are 4x4 matrices: Φ and Θ denote the transformations of the end-effector coordinate system and of the camera coordinate system, respectively, from the first position to the second position, and H denotes the transformation from the end-effector coordinate system to the camera coordinate system, i.e. the hand-eye relationship matrix.
Sub-steps 3.1 to 3.3 are carried out exactly as described above.
In the fourth step, the welding workpiece, held in a fixed end-effector pose, is controlled to touch a point on the target; the coordinates of the workpiece tip in manipulator coordinates are computed and, combined with the manipulator pose, the offset value of the workpiece in that pose is obtained.
Sub-steps 4.1 to 4.4 are carried out exactly as described above, yielding the hand-eye transformation matrix H.
The displacement value of the welding workpiece is then computed as in steps 5.1 and 5.2 above.
In the embodiment of the present invention, the above steps realize, respectively, the calibration of the camera's intrinsic parameters, of the line-laser light-plane equation, of the hand-eye transformation matrix and of the workpiece offset, thereby completing the calibration of the welding robot system.