CN110490934A - Mixing machine vertical blade attitude detecting method based on monocular camera and robot
- Publication number: CN110490934A (application CN201910742194.5A)
- Authority: CN (China)
- Legal status: Granted
Classifications
- G06T7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10004 — Image acquisition modality; still image, photographic image
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a method for detecting the attitude of the vertical blade of a mixer based on a monocular camera and a robot. The method first obtains the hand-eye calibration relationship between the robot and the monocular camera and adjusts the robot to bring the camera to a suitable position. After the monocular camera captures an image of the mixer blade, image processing and feature extraction are performed on it; the angular relationship between the vertical mixer blade and the camera is then solved from the monocular imaging principle and the associated geometric relationships. Finally, the calibrated camera-robot hand-eye relationship is used to solve the rotation angle of the blade in the robot base coordinate system, that is, the blade attitude.
Description
Technical Field
The invention relates to the field of pose detection in high-risk, explosive environments, and in particular to a method for detecting the attitude of the vertical blade of a mixer based on a monocular camera and a robot.
Background
The background of the invention is an aerospace pyrotechnics production workshop, a flammable, explosive, high-risk environment. At present there is considerable research on spacecraft attitude recognition in the aerospace field, but little literature on recognizing the attitude of objects inside high-risk workshops. The vertical blades of the mixer are used for mixing rocket propellant and sit in a flammable, explosive workshop; before a robot can operate on them, their spatial attitude must be detected. Common vision-based approaches to solving a target's attitude use monocular or binocular cameras. Compared with a monocular camera, the stereo calibration, rectification, and matching algorithms of a binocular camera are complex and relatively inefficient, and a binocular system's larger physical size makes it unsuitable for installation at industrial sites with limited working space or on the end of an industrial robot. Considering that the vision sensor must be explosion-proofed, a monocular camera, which is smaller, easier to install, and more stable as a system, is selected to detect the attitude of the vertical mixer blade.
Invention patent CN201410016272.0 discloses a method for measuring the position and attitude of a dynamic target based on monocular vision at the end of a manipulator. Camera calibration and hand-eye calibration are performed first; two images are then captured with the monocular camera, target feature points are extracted and matched, the camera's rotation transformation matrix is solved, the feature points are three-dimensionally reconstructed and scale-corrected, and finally the reconstructed feature points yield the position and attitude of the target relative to the camera. The method exploits the mobility of the manipulator to capture the target from two different positions; the principle is similar to a binocular camera system but offers greater flexibility and stability. However, its feature-point matching between the two target images requires a fairly large number of feature points, which a target in a harsh industrial measurement environment can rarely provide. In addition, for targets with complex shapes, the 3D reconstruction algorithm is computationally heavy and slow, making the method unsuitable where algorithm speed matters.
Summary of the Invention
Technical Problem to Be Solved
The vertical blade of the mixer is a complex curved surface with a large amount of residual propellant attached to it, so it cannot provide enough feature points; blade attitude therefore cannot be detected through feature-point matching and 3D reconstruction. To cope with these harsh detection conditions and meet the detection requirements, a simple, highly feasible blade attitude detection method is needed that neither relies on a certain number of feature points nor requires 3D reconstruction.
Technical Solution
To solve the problems of the prior art, the present invention proposes a method for detecting the attitude of the vertical blade of a mixer based on a monocular camera and a robot. The method first obtains the hand-eye calibration relationship between the robot and the monocular camera and adjusts the robot to bring the camera to a suitable position. After the monocular camera captures an image of the mixer blade, image processing and feature extraction are performed on it; the angular relationship between the vertical mixer blade and the camera is solved from the monocular imaging principle and the associated geometric relationships, and the calibrated camera-robot hand-eye relationship is then used to solve the rotation angle of the blade in the robot base coordinate system, that is, the blade attitude.
A method for detecting the attitude of the vertical blade of a mixer based on a monocular camera and a robot, characterized by the following steps:
Step 1: Define the image pixel coordinate system of the monocular camera as OpixXpixYpix. The monocular camera is mounted on the robot end effector. Calibrate the camera's intrinsic parameters with a checkerboard calibration board and solve its intrinsic matrix, as follows:
1a. Place the checkerboard calibration board in the camera's field of view, repeatedly vary the board's pose, and capture images of the board;
1b. Process the images and compute the monocular camera's intrinsic matrix K by Zhang Zhengyou's calibration method, K = [f/dx, 0, u0; 0, f/dy, v0; 0, 0, 1], where f is the focal length of the monocular camera, dx and dy are the length and width of a single photosensitive cell of the camera sensor, and (u0, v0) are the pixel coordinates, in the image pixel coordinate system, of the intersection of the camera's optical axis with the imaging plane;
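The structure of the intrinsic matrix K and how it maps a camera-frame point to pixel coordinates can be illustrated with a minimal numeric sketch (all parameter values below are hypothetical, not taken from the patent):

```python
import numpy as np

# Hypothetical intrinsic parameters (illustrative values only):
f = 8.0e-3             # focal length [m]
dx = dy = 5.0e-6       # photosite width/height [m]
u0, v0 = 640.0, 512.0  # principal point [pixels]

# Intrinsic matrix K as defined in step 1b
K = np.array([[f / dx, 0.0,    u0],
              [0.0,    f / dy, v0],
              [0.0,    0.0,    1.0]])

def project(p_cam):
    """Pinhole projection of a camera-frame point [Xc, Yc, Zc] to pixel coordinates."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# A point on the optical axis must land on the principal point (u0, v0).
print(project(np.array([0.0, 0.0, 1.0])))  # -> [640. 512.]
```

In practice K would be estimated from the checkerboard images, e.g. with OpenCV's `cv2.calibrateCamera`, rather than written down directly.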
Step 2: Define the robot base coordinate system as ORXRYRZR, the robot flange coordinate system as OFXFYFZF, and the monocular camera coordinate system as OCXCYCZC. Using a hand-eye calibration method, calibrate the coordinate transformation matrix Tcf between the monocular camera coordinate system and the flange coordinate system;
Step 3: Define the suitable position for monocular detection as the pose in which the camera coordinate plane XCOCYC and the YC axis are parallel to the ZR axis of the robot base coordinate system. From the Tcf obtained in step 2, compute the attitude transformation the robot needs for the camera to reach this position, and adjust the robot to bring the monocular camera there;
Step 4: After step 3 the monocular camera is at the suitable position. Capture one blade image with the camera and apply filtering, denoising, edge detection, and feature extraction as follows:
4a. Apply the Canny edge detection algorithm to the image, extract the pixel coordinates of the two silhouette edges of the measured blade rotation shaft, and record them as the pixel coordinate point sets {xpix1, ypix1} and {xpix2, ypix2};
4b. Extract the pixel coordinates of the characteristic shoulder point on one side of the blade and record them as (xpix3, ypix3);
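Splitting an edge map into the two silhouette point sets of step 4a can be sketched as follows. The edge map here is synthetic (two perfectly vertical edges at assumed columns 40 and 60); in practice it would come from a real Canny pass, e.g. OpenCV's `cv2.Canny`:

```python
import numpy as np

# Synthetic stand-in for a Canny edge map of the rotation shaft:
# two silhouette edges at columns 40 and 60 (illustrative, not real data).
edges = np.zeros((100, 120), dtype=bool)
edges[:, 40] = True
edges[:, 60] = True

def bilateral_edge_points(edge_map):
    """Split the shaft silhouette into left/right edge point sets,
    one (x, y) pair per image row, as in step 4a."""
    left, right = [], []
    for y in range(edge_map.shape[0]):
        xs = np.flatnonzero(edge_map[y])
        if xs.size >= 2:
            left.append((int(xs.min()), y))   # {xpix1, ypix1}
            right.append((int(xs.max()), y))  # {xpix2, ypix2}
    return left, right

pts1, pts2 = bilateral_edge_points(edges)
print(pts1[0], pts2[0])  # -> (40, 0) (60, 0)
```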
Step 5: In the blade image, compute the pixel distance Δx1 between the two edges of the blade rotation shaft and the pixel distance Δx2 between the central axis of the rotation shaft and the single-side characteristic shoulder point, as follows:
5a. By the imaging principle of the monocular camera, when the blade rotation shaft is imaged, two generatrices of the cylinder form its edge contours; the angles corresponding to these two generatrices are θ1 and θ2, which can be solved from the position of the rotation shaft in space and its radius R;
5b. Fit the pixel coordinate point sets {xpix1, ypix1} and {xpix2, ypix2} obtained in step 4 by least squares to obtain the line equations l1 and l2 of the two rotation-shaft edges in the image pixel coordinate system, and from them compute the pixel distance Δx1 between the two edges, in pixels;
5c. From the line equations l1 and l2 obtained in step 5b and the shoulder-point pixel coordinates (xpix3, ypix3) obtained in step 4, obtain the pixel distance Δx2′ between the blade's characteristic shoulder point and the nearer rotation-shaft edge in the blade image, in pixels;
5d. Define d1 and d2 as the distances from the projection of the rotation-shaft central axis onto the imaging plane to the projections of the two generatrices onto the imaging plane, and define i as the ratio of d1 to d2. By the monocular imaging principle, i = d1/d2 = -cosθ2/cosθ1, so in the image the pixel distances from the central axis to the two shaft edges are Δx1/(i+1) and iΔx1/(i+1) respectively. With the Δx2′ obtained in step 5c, the pixel distance Δx2 between the central axis and the single-side shoulder point is Δx2 = Δx2′ + Δx1/(i+1), in pixels;
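The pixel-distance computations of steps 5b-5d can be sketched numerically. All inputs below (edge point sets, generatrix angles, shoulder-point column) are assumed illustrative values, not measurements:

```python
import numpy as np

# Assumed inputs (illustrative values only):
theta1, theta2 = np.radians(95.0), np.radians(85.0)  # generatrix angles, step 5a
xpix3 = 75.0                                         # shoulder-point column, step 4b

# Edge point sets from step 4a: two near-vertical lines, x as a function of y.
y = np.arange(100.0)
set1 = np.stack([40.0 + 0.01 * y, y], axis=1)   # left shaft edge
set2 = np.stack([60.0 - 0.01 * y, y], axis=1)   # right shaft edge

# Step 5b: least-squares line fits l1, l2 of the form x = a*y + b.
a1, b1 = np.polyfit(set1[:, 1], set1[:, 0], 1)
a2, b2 = np.polyfit(set2[:, 1], set2[:, 0], 1)

y_mid = 50.0
x1 = a1 * y_mid + b1
x2 = a2 * y_mid + b2
dx1 = abs(x2 - x1)                        # Δx1: edge-to-edge pixel distance

# Step 5c: Δx2' = distance from the shoulder point to the nearer edge.
dx2p = min(abs(xpix3 - x1), abs(xpix3 - x2))

# Step 5d: ratio i and the axis-to-shoulder distance Δx2.
i = -np.cos(theta2) / np.cos(theta1)      # i = d1/d2 = -cosθ2/cosθ1
dx2 = dx2p + dx1 / (i + 1.0)              # Δx2 = Δx2' + Δx1/(i+1)
print(round(dx1, 6), round(dx2, 6))
```

With θ1 and θ2 symmetric about 90°, i = 1 and the central axis projects midway between the two edges, which is the expected degenerate case.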
Step 6: Measure the distance L between the blade's two characteristic shoulder points, and compute the distance L′ from the rotation-shaft central axis to a characteristic shoulder point, L′ = L/2;
Step 7: Define the angle between the line connecting the blade's two characteristic shoulder points and the camera coordinate plane XCOCYC as the blade's rotation angle θ′ in the camera coordinate system, and compute θ′ as follows:
7a. Define m as the distance between the projections of the two rotation-shaft contour lines onto the imaging plane, corresponding to the pixel distance Δx1 of step 5, and n as the distance between the projections of the rotation-shaft central axis and the blade's characteristic shoulder point onto the imaging plane, corresponding to the pixel distance Δx2 of step 5. By the monocular imaging principle, m/n = Δx1/Δx2;
7b. Define γ as the angle between the line joining the camera optical center to the blade's characteristic shoulder point and the camera optical axis. Define D1 = R/cos(π-θ1), D2 = R/cosθ2, and T = L′cosθ′ - L′sinθ′tanγ. By the similar-triangle theorem, m/n = (D1+D2)/T;
7c. By the monocular imaging principle, tanγ = (u0-xpix3)dx/f, where f, dx, and u0 are obtained from step 1 and xpix3 from step 4;
7d. Combining steps 7a and 7b gives Δx1/Δx2 = (D1+D2)/T = (R/(-cosθ1) + R/cosθ2)/(L′cosθ′ - L′sinθ′tanγ), where Δx1 and Δx2 are obtained in step 5, the rotation-shaft radius R is known, and the distance L′ from the central axis to the shoulder point is obtained in step 6. Substituting the tanγ obtained in step 7c, simple trigonometric manipulation finally solves for the single unknown θ′;
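The "simple trigonometric manipulation" of step 7d can be made explicit: writing k = (D1+D2)Δx2/(Δx1·L′), the equation becomes cosθ′ - tanγ·sinθ′ = k, i.e. cos(θ′+γ)/cosγ = k, so θ′ = arccos(k·cosγ) - γ on the principal branch. The sketch below round-trips a chosen θ′ through the forward model to check this inversion; all geometry values are assumed, not from the patent:

```python
import numpy as np

# Assumed geometry (illustrative values only):
R = 0.05                     # rotation-shaft radius [m]
Lp = 0.30                    # L' = L/2, axis-to-shoulder distance [m]
theta1 = np.radians(100.0)   # generatrix angles from step 5a
theta2 = np.radians(80.0)
gamma = np.radians(5.0)      # shoulder-point ray angle from step 7c

D1 = R / np.cos(np.pi - theta1)
D2 = R / np.cos(theta2)

def ratio_from_theta(tp):
    """Forward model of step 7d: Δx1/Δx2 as a function of θ'."""
    T = Lp * np.cos(tp) - Lp * np.sin(tp) * np.tan(gamma)
    return (D1 + D2) / T

def solve_theta(ratio):
    """Invert step 7d: cosθ' - tanγ·sinθ' = k  ⇒  θ' = arccos(k·cosγ) - γ."""
    k = (D1 + D2) / (ratio * Lp)
    return np.arccos(k * np.cos(gamma)) - gamma

theta_true = np.radians(30.0)
theta_rec = solve_theta(ratio_from_theta(theta_true))
print(round(float(np.degrees(theta_rec)), 6))  # -> 30.0
```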
Step 8: Define the angle between the line connecting the blade's two characteristic shoulder points and the XRORZR plane of the robot base coordinate system as the blade's rotation angle θ in the robot base coordinate system, and compute θ as follows:
8a. Step 2 gives the coordinate transformation matrix Tcf between the monocular camera coordinate system and the robot flange coordinate system, and the robot teach pendant gives the transformation matrix Tfr between the current flange coordinate system and the robot base coordinate system; the coordinate transformation matrix between the current camera coordinate system and the robot base coordinate system is then Tcr = TcfTfr;
8b. Tcr can be written as [r1 r2 r3 t], where r1, r2, r3 are the rotation vectors and t is the translation vector. From the current camera-robot positional relationship and the rules of coordinate-frame rotation, r1 = [-cosα 0 sinα 0]T, where α is the angle between the XR axis of the robot base coordinate system and the XC axis of the camera coordinate system; α is then solved with an inverse trigonometric function;
8c. From the positional relationships of the coordinate systems, θ = α + θ′, where θ′ is obtained in step 7 and α in step 8b. This finally gives the rotation angle θ of the blade in the robot base coordinate system, that is, the blade attitude.
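Steps 8b-8c can be sketched directly: given the first column r1 = [-cosα, 0, sinα, 0]T of Tcr, a numerically robust inverse-trigonometric recovery is α = atan2(r1[2], -r1[0]). The α and θ′ values below are assumed for illustration:

```python
import numpy as np

def alpha_from_r1(r1):
    """Recover α from r1 = [-cosα, 0, sinα, 0]^T, the first column of Tcr (step 8b)."""
    return np.arctan2(r1[2], -r1[0])

# Build r1 for a known α and recover it (round-trip check), then compose θ = α + θ'.
alpha_true = np.radians(40.0)
r1 = np.array([-np.cos(alpha_true), 0.0, np.sin(alpha_true), 0.0])
alpha = alpha_from_r1(r1)

theta_prime = np.radians(30.0)   # blade rotation angle in the camera frame, step 7
theta = alpha + theta_prime      # step 8c: θ = α + θ'
print(round(float(np.degrees(theta)), 6))  # -> 70.0
```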
Beneficial Effects
The mixer vertical blade attitude detection method based on a monocular camera and a robot proposed by the invention does not depend heavily on feature points of the vertical blade and avoids complex 3D reconstruction work; the blade attitude is solved using only the geometric features of the blade's shape and the monocular imaging principle. The rotation-angle algorithm is computationally simple and accurate enough to meet the application requirements. The method is applicable to attitude detection of many kinds of rotating rigid bodies and can meet detection needs at low cost and with high accuracy.
Brief Description of the Drawings
Fig. 1 is a flow chart of mixer vertical blade attitude detection based on a monocular camera and a robot;
Fig. 2 is a schematic diagram of the vertical blade of the mixer;
Fig. 3 is a schematic diagram of the installation environment of the blade, the robot, and the monocular camera;
Fig. 4 is a schematic diagram of the imaging principle for the rotation shaft of the vertical blade;
Fig. 5 is a schematic top view of the monocular imaging of the vertical blade;
Fig. 6 is a schematic diagram of the coordinate systems and the rotation angle of the vertical blade;
In the figures: 1 - rotation shaft of the mixer vertical blade; 2 - central axis of the rotation shaft; 3 - characteristic shoulder point of the vertical blade; 4 - characteristic shoulder point of the vertical blade; 5 - robot; 6 - robot flange; 7 - camera mount; 8 - monocular camera; 9 - light source; 10 - end tool; 11 - generatrix of the blade rotation shaft; 12 - generatrix of the blade rotation shaft; 13 - camera optical center; 14 - camera virtual imaging plane (symmetric to the actual imaging plane about the optical center); 15 - rotation-shaft edge in the blade image; 16 - rotation-shaft edge in the blade image; 17 - camera optical axis; 18 - rotation angle of the vertical mixer blade; 19 - camera coordinate system; 20 - robot base coordinate system.
Detailed Description of the Embodiments
The invention is now further described with reference to an embodiment and the accompanying drawings:
Referring to Figs. 1-6, this embodiment applies the monocular-camera-and-robot attitude detection method to the vertical blade of a mixer. The rotation shaft 1 of the vertical blade is a cylinder located above the blade; the blade rotates about the central axis 2 of shaft 1, and the rotation angle at which it stops is indeterminate each time. The mixer blade is a centrally symmetric rigid body, and the line connecting characteristic shoulder points 3 and 4 passes through the central axis 2. The monocular camera 8 is mounted on the end flange 6 of robot 5 via mount 7 and captures the blade image; light source 9 improves image quality, and tool 10 performs the subsequent robot work tasks. After the camera intrinsic matrix and the hand-eye relationship are obtained by camera calibration, the robot is controlled to bring the camera to a suitable position and a blade image is captured. The blade is imaged on the camera imaging plane 14; the camera optical axis 17 is perpendicular to plane 14, the distance from optical center 13 to plane 14 is the focal length, and the two generatrices 11 and 12 of the rotation shaft become the corresponding shaft edges 15 and 16 in the blade image. Image processing algorithms preprocess the blade image and extract the pixel coordinates of the blade's characteristic shoulder point and of the two rotation-shaft edges; the blade's rotation angle 18 in the robot base coordinate system 20 is then obtained from the monocular imaging principle, the associated geometric relationships, and the hand-eye calibration relationship.
The specific steps of the method in this embodiment are given below:
Step 1. Mount the monocular camera on the robot end effector, define the pixel coordinate system of the camera image plane as OpixXpixYpix, calibrate the camera's intrinsic parameters with a checkerboard calibration board, and solve its intrinsic matrix, as follows:
a. Place the checkerboard calibration board in the camera's field of view, as close as possible to the blade rotation shaft. Adjust the camera focus so the checkerboard is clearly visible and entirely within the field of view, then repeatedly vary the board's pose and capture 20 images of the board in different poses;
b. Process the 20 captured images and compute the camera's intrinsic matrix K by Zhang Zhengyou's calibration method, K = [f/dx, 0, u0; 0, f/dy, v0; 0, 0, 1], where f is the focal length of the monocular camera, dx and dy are the length and width of a single photosensitive cell of the camera sensor, and (u0, v0) are the pixel coordinates of the intersection of the optical axis with the imaging plane;
Step 2. Define the robot base coordinate system as ORXRYRZR, the robot flange coordinate system as OFXFYFZF, and the monocular camera coordinate system as OCXCYCZC. The robot is floor-mounted; the origin of the base coordinate system is at the center of the robot base, and at the mechanical zero position the Z axis of the base coordinate system is parallel to, and points opposite to, the direction of gravity. Calibrate the coordinate transformation matrix Tcf between the camera coordinate system and the flange coordinate system by the following hand-eye calibration steps:
a. Set up a spatial calibration object of known pose and move the robot so the camera moves with it, ensuring the motion is not a pure translation. This yields the two pose transformation matrices T1 and T2 between the calibration object and the camera coordinate system, and the robot controller gives the pose transformation matrix T3 of the flange coordinate system before and after the motion. The spatial pose transformation relationship then gives the constraint equation T1TcfT3 = T2Tcf;
b. Move the robot again, ensuring the motion is not a pure translation, to obtain a new constraint equation; solving the two sets of equations simultaneously yields the unknown matrix Tcf, completing the hand-eye calibration of the monocular camera;
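The constraint of step 2a can be checked numerically. The sketch below is a consistency check only, not a solver: it constructs a hypothetical ground-truth Tcf, an arbitrary non-pure-translation motion, and poses that satisfy T1·Tcf·T3 = T2·Tcf by construction. In practice Tcf would be estimated from two or more such motions, e.g. with a standard hand-eye solver such as OpenCV's `cv2.calibrateHandEye`:

```python
import numpy as np

def se3(rot_deg, axis, t):
    """Homogeneous transform: rotation rot_deg about a principal axis plus translation t."""
    a = np.radians(rot_deg)
    c, s = np.cos(a), np.sin(a)
    R = {'x': np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
         'y': np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
         'z': np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])}[axis]
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical ground-truth hand-eye transform Tcf (illustrative values):
Tcf = se3(15, 'z', [0.02, 0.0, 0.10])

# A non-pure-translation flange motion T3 and a first camera-to-target pose T1:
T3 = se3(30, 'y', [0.10, 0.0, 0.05])
T1 = se3(10, 'x', [0.0, 0.20, 0.50])

# Second camera-to-target pose consistent with the step-2a constraint:
T2 = T1 @ Tcf @ T3 @ np.linalg.inv(Tcf)

print(np.allclose(T1 @ Tcf @ T3, T2 @ Tcf))  # -> True: T1·Tcf·T3 = T2·Tcf holds
```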
Step 3. When the monocular camera captures images, the camera imaging plane and its edges must be parallel to the blade rotation shaft. Since the blade is mounted vertically and the central axis of its rotation shaft is parallel to the base-frame ZR axis, this amounts to requiring that the camera coordinate plane XCOCYC and the YC axis be parallel to the ZR axis. Define this position as the suitable position of the monocular camera and adjust the camera to it as follows:
a. Step 2 gives the transformation Tcf = [R T] from the camera coordinate system to the flange coordinate system, where R is the rotation matrix. Assuming the rotation order from the camera frame to the flange frame is Z-Y-X, i.e. rotations βz, βy, βx about the ZC, YC, and XC axes respectively, the three Euler angles βz, βy, βx can be recovered from the rotation matrix R of Tcf, and from them the angles β1, β2, β3 between the axes of the robot flange coordinate system and the camera coordinate system when the camera is at the suitable position can be computed;
b. When the camera coordinate plane XCOCYC and the YC axis are parallel to the base-frame ZR axis, the transformation matrix between the robot base coordinate system and the camera coordinate system is Tcr = TcfTfr, where Tcf is obtained in step 2 and the coordinate transformation matrix Tfr between the flange coordinate system and the base coordinate system is read directly from the robot controller. As in step a, the angles between the axes of the base coordinate system and the camera coordinate system can be solved; combined with β1, β2, β3 this gives the angles between the flange and base coordinate axes when the camera is at the suitable position, from which the required robot attitude change is computed. Adjusting the robot attitude accordingly makes the camera plane XCOCYC and the YC axis parallel to the base-frame ZR axis; the camera is then at the suitable position;
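The patent's explicit Euler-angle formulas are not reproduced in this text; a standard Z-Y-X extraction for the non-degenerate case (cos βy ≠ 0) can be sketched as follows. The angle values are arbitrary test inputs:

```python
import numpy as np

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def euler_zyx_from_R(R):
    """Recover (βz, βy, βx) from R = Rz(βz)·Ry(βy)·Rx(βx),
    standard extraction for the non-degenerate case."""
    bz = np.arctan2(R[1, 0], R[0, 0])
    by = np.arcsin(-R[2, 0])
    bx = np.arctan2(R[2, 1], R[2, 2])
    return bz, by, bx

# Round-trip check with arbitrary angles (degrees): compose, then extract.
bz, by, bx = np.radians([20.0, -35.0, 50.0])
R = Rz(bz) @ Ry(by) @ Rx(bx)
print(np.degrees(euler_zyx_from_R(R)).round(6))  # recovers [20., -35., 50.]
```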
Step 4. After step 3 the monocular camera is at the suitable position. Capture one blade image with the camera and apply the following image-processing operations: filtering, denoising, edge detection, and feature extraction:
a. Apply the Canny edge detection algorithm to the image, extract the pixel coordinates of the two silhouette edges of the measured blade rotation shaft, and record them as the pixel coordinate point sets {xpix1, ypix1} and {xpix2, ypix2};
b. Extract the pixel coordinates of the characteristic shoulder point on one side of the blade and record them as (xpix3, ypix3);
Step 5. In the blade image, compute the pixel distance Δx1 between the two rotation-shaft edges and the pixel distance Δx2 between the rotation-shaft central axis and the single-side characteristic shoulder point, as follows:
a. The blade rotation shaft is a cylinder. By the monocular imaging principle, when the shaft is imaged, two generatrices form its edge contours; the angles corresponding to these two generatrices are θ1 and θ2, and S is the camera optical center. The position of the blade in space and the shaft diameter D are known, and after a series of coordinate transformations the coordinates of the rotation-shaft central axis in the camera coordinate system are obtained. From step 3 the central axis is parallel to the base-frame ZR axis, so xc0 and zc0 are the same for all points on the central axis and are determined; with the shaft radius R = D/2, θ1 and θ2 are computed from the following formulas:
当xc0<0时, When x c0 < 0,
当xc0≥0时, When x c0 ≥ 0,
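The two case formulas for θ1 and θ2 appear only as images in the source. The following is a reconstruction from standard tangent-line geometry (lines from the optical center tangent to a circle of radius R centered at (x_c0, z_c0)); it is a plausible sketch consistent with the later uses of θ1 and θ2, not necessarily the patent's exact expression:

```python
import math

def tangent_angles(x_c0, z_c0, R):
    """Angles (measured in the camera X_C-Z_C plane) of the two contour
    generatrices of a cylinder of radius R whose axis passes through
    (x_c0, z_c0), viewed from the optical center S at the origin.
    ASSUMPTION: reconstructed from tangent-line geometry; the patent's
    printed case-split formulas are images in the source."""
    d = math.hypot(x_c0, z_c0)       # distance from S to the shaft axis
    phi = math.atan2(z_c0, x_c0)     # direction from S to the axis
    beta = math.asin(R / d)          # half-angle subtended by the shaft
    return phi + beta, phi - beta    # theta1, theta2

# Shaft axis 2 cm left of the optical axis, 1 m in front, radius 5 cm:
t1, t2 = tangent_angles(-0.02, 1.0, 0.05)
```

With the shaft near the image center this yields cosθ1 < 0 < cosθ2, which is what the later quantities D1 = R/cos(π−θ1) and D2 = R/cosθ2 require to be positive.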
b. The two generatrices of the shaft in step a correspond to the two shaft edges in the blade image of step 4; each imaged edge is the projection of one generatrix onto the imaging plane. Fitting the pixel coordinate point sets {x_pix1, y_pix1} and {x_pix2, y_pix2} from step 4 by least squares yields the line equations l1 and l2 of the two shaft edges in the image pixel coordinate system, from which the pixel distance Δx1 between the two shaft edges is computed, in pixels;
c. From the line equations l1 and l2 obtained in step b and the shoulder-point pixel coordinates (x_pix3, y_pix3) obtained in step 4, the pixel distance Δx2′ between the blade's characteristic shoulder point and the nearer shaft edge in the blade image is obtained, in pixels;
d. Define d1 and d2 as the distances from the projection of the shaft's central axis on the imaging plane to the projections of the two generatrices, and define i as their ratio. By the imaging principle of the monocular camera, i = d1/d2 = −cosθ2/cosθ1. By the similar-triangle theorem, in the blade image the shaft's central axis lies at pixel distances Δx1/(i+1) and iΔx1/(i+1) from the two shaft edges. With Δx2′ from step c, the pixel distance between the shaft's central axis and the blade's one-sided characteristic shoulder point is Δx2 = Δx2′ + Δx1/(i+1), in pixels;
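Steps 5b–5d can be sketched as below. The function names are illustrative; the sketch assumes near-vertical shaft edges, so the horizontal separation of the fitted lines at a common row stands in for the edge-to-edge distance:

```python
import math
import numpy as np

def fit_vertical_line(pts):
    """Least-squares line x = a*y + b for a near-vertical edge (step 5b)."""
    a, b = np.polyfit(pts[:, 1].astype(float), pts[:, 0].astype(float), 1)
    return a, b

def pixel_distances(left_pts, right_pts, x_pix3, theta1, theta2):
    """Pixel width dx1 of the shaft and pixel distance dx2 from the
    shaft's central axis to the shoulder point (steps 5b-5d)."""
    a1, b1 = fit_vertical_line(left_pts)   # line l1
    a2, b2 = fit_vertical_line(right_pts)  # line l2
    y_mid = float(np.mean(np.concatenate([left_pts[:, 1], right_pts[:, 1]])))
    x1, x2 = a1 * y_mid + b1, a2 * y_mid + b2
    dx1 = abs(x2 - x1)                                   # step 5b
    dx2_prime = min(abs(x_pix3 - x1), abs(x_pix3 - x2))  # step 5c
    i = -math.cos(theta2) / math.cos(theta1)             # i = d1/d2, step 5d
    dx2 = dx2_prime + dx1 / (i + 1.0)                    # step 5d
    return dx1, dx2

# Vertical edges at x = 40 and x = 59, shoulder point at x = 80,
# generatrix angles symmetric about pi/2 so that i = 1:
ys = np.arange(100.0)
left_pts = np.column_stack([np.full(100, 40.0), ys])
right_pts = np.column_stack([np.full(100, 59.0), ys])
dx1, dx2 = pixel_distances(left_pts, right_pts, 80.0,
                           math.pi / 2 + 0.05, math.pi / 2 - 0.05)
```

With i = 1 the axis projects midway between the edges, so dx2 is the shoulder-to-edge distance plus half the shaft width, matching the step 5d formula.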
Step 6. Measure the distance L between the two characteristic shoulder points of the blade. Since the blade shape is centrally symmetric, the distance between the shaft's central axis and a characteristic shoulder point is L′ = L/2;
Step 7. Define the angle between the line connecting the blade's two characteristic shoulder points and the X_C O_C Y_C plane of the camera coordinate system as the blade's rotation angle θ′ in the camera coordinate system; θ′ is computed as follows:
a. Define m as the distance between the projections of the shaft's two contour generatrices on the imaging plane, corresponding to the pixel distance Δx1 of step 5, and n as the distance between the projection of the shaft's central axis and that of the blade's characteristic shoulder point on the imaging plane, corresponding to the pixel distance Δx2 of step 5. By the imaging principle of the monocular camera, m/n = Δx1/Δx2;
b. Define γ as the angle between the camera's optical axis and the line from the optical center to the blade's characteristic shoulder point. Define D1 = R/cos(π−θ1), D2 = R/cosθ2 and T = L′cosθ′ − L′sinθ′tanγ. By the similar-triangle theorem, m/n = (D1+D2)/T;
c. From the construction of the monocular camera, tanγ = (u0 − x_pix3)·dx/f, where f/dx and u0 are obtained in step 1 and x_pix3 in step 4;
d. Combining steps a and b gives Δx1/Δx2 = (D1+D2)/T = (R/(−cosθ1) + R/cosθ2)/(L′cosθ′ − L′sinθ′tanγ), in which Δx1 and Δx2 come from step 5, the shaft radius R is known, and L′ comes from step 6. Substituting tanγ from step c and simplifying the trigonometric expressions yields the single unknown θ′:
θ′ = arccos(((Δx2/Δx1)(−R/cosθ1 + R/cosθ2))/L′) − γ;
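Step 7 reduces to one closed-form expression; a direct sketch of the formula as printed in the patent (function and parameter names are illustrative):

```python
import math

def blade_rotation_angle(dx1, dx2, R, theta1, theta2, L_prime,
                         u0, x_pix3, f_over_dx):
    """Closed form of step 7d as printed in the patent:
        theta' = arccos(((dx2/dx1)*(-R/cos(theta1) + R/cos(theta2))) / L') - gamma
    with tan(gamma) = (u0 - x_pix3) / (f/dx) from step 7c."""
    gamma = math.atan((u0 - x_pix3) / f_over_dx)
    span = -R / math.cos(theta1) + R / math.cos(theta2)  # D1 + D2
    return math.acos((dx2 / dx1) * span / L_prime) - gamma

# Example: symmetric generatrices (theta1 = 3*pi/4, theta2 = pi/4) give
# D1 + D2 = 2*sqrt(2)*R; with L' = 2*sqrt(2)*R, dx2/dx1 = 0.5, and the
# shoulder point on the optical axis (gamma = 0), theta' = pi/3.
theta_p = blade_rotation_angle(2.0, 1.0, 1.0, 3 * math.pi / 4, math.pi / 4,
                               2 * math.sqrt(2), 320.0, 320.0, 800.0)
```

The arccos argument must lie in [−1, 1]; in practice that means Δx1, Δx2, and the edge fits of step 5 have to be consistent with the measured L′.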
Step 8. Define the angle between the line connecting the blade's two characteristic shoulder points and the X_R O_R Z_R plane of the robot base coordinate system as the blade's rotation angle θ in the robot base coordinate system; θ is computed as follows:
a. From step 2, the coordinate transformation matrix T_cf between the monocular camera coordinate system and the robot flange coordinate system is known; the transformation matrix T_fr between the current flange coordinate system and the robot base coordinate system is read from the robot teach pendant. The coordinate transformation matrix between the current monocular camera coordinate system and the robot base coordinate system is then T_cr = T_cf·T_fr;
b. T_cr can be written as [r1 r2 r3 t], where r1, r2, r3 are the rotation (column) vectors and t is the translation vector. From step 3, the Y_C axis of the monocular camera coordinate system is parallel to the Z_R axis of the robot base coordinate system, so by the rotation rules of Cartesian coordinate systems r1 = [−cosα 0 sinα 0]^T, where α is the angle between the X_R axis of the robot base coordinate system and the X_C axis of the camera coordinate system; α is obtained by inverse trigonometric functions;
c. From the positional relations of the coordinate systems and the rotation angles, θ = α + θ′, where θ′ comes from step 7 and α from step b. This gives the blade's rotation angle θ in the robot base coordinate system, i.e. the blade attitude.
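Step 8 can be sketched with NumPy as below. The composition order T_cr = T_cf·T_fr and the column form r1 = [−cosα, 0, sinα, 0]^T follow the patent text; a real system must verify that these match its own frame conventions:

```python
import numpy as np

def base_frame_rotation(T_cf, T_fr, theta_prime):
    """Step 8: compose the hand-eye matrix T_cf with the flange-to-base
    matrix T_fr (T_cr = T_cf @ T_fr, as written in the patent), read
    alpha off the first column r1 = [-cos(a), 0, sin(a), 0]^T, and
    return theta = alpha + theta'."""
    T_cr = T_cf @ T_fr
    r1 = T_cr[:, 0]
    alpha = np.arctan2(r1[2], -r1[0])  # from r1 = [-cos(a), 0, sin(a), 0]^T
    return alpha + theta_prime

# Demo: a pure rotation about Y by (pi + alpha) has first column
# [-cos(alpha), 0, sin(alpha)]; with T_fr = I, alpha is recovered exactly.
alpha_true = np.pi / 6
psi = np.pi + alpha_true
c, s = np.cos(psi), np.sin(psi)
T_cf = np.array([[c, 0.0, s, 0.0],
                 [0.0, 1.0, 0.0, 0.0],
                 [-s, 0.0, c, 0.0],
                 [0.0, 0.0, 0.0, 1.0]])
theta = base_frame_rotation(T_cf, np.eye(4), 0.2)
```

Using arctan2 on both components of r1 avoids the quadrant ambiguity of a single inverse cosine.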
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910742194.5A CN110490934B (en) | 2019-08-13 | 2019-08-13 | Attitude detection method of vertical blade of hybrid machine based on monocular camera and robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110490934A true CN110490934A (en) | 2019-11-22 |
CN110490934B CN110490934B (en) | 2022-04-19 |
Family
ID=68550743
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910742194.5A Active CN110490934B (en) | 2019-08-13 | 2019-08-13 | Attitude detection method of vertical blade of hybrid machine based on monocular camera and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110490934B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111325802A (en) * | 2020-02-11 | 2020-06-23 | 中国空气动力研究与发展中心低速空气动力研究所 | Circular mark point identification matching method in helicopter wind tunnel test |
CN112419375A (en) * | 2020-11-18 | 2021-02-26 | 青岛海尔科技有限公司 | Feature point matching method and device, storage medium, and electronic device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5499306A (en) * | 1993-03-08 | 1996-03-12 | Nippondenso Co., Ltd. | Position-and-attitude recognition method and apparatus by use of image pickup means |
CN102135776A (en) * | 2011-01-25 | 2011-07-27 | 解则晓 | Industrial robot control system based on visual positioning and control method thereof |
CN103759716A (en) * | 2014-01-14 | 2014-04-30 | 清华大学 | Dynamic target position and attitude measurement method based on monocular vision at tail end of mechanical arm |
CN104296725A (en) * | 2014-10-08 | 2015-01-21 | 南开大学 | Method applied to parameter calibration of deformable robot operation arm |
CN105160059A (en) * | 2015-07-11 | 2015-12-16 | 西安工业大学 | BP and GA based blade machining cutting quantity optimization selection method |
CN105716525A (en) * | 2016-03-30 | 2016-06-29 | 西北工业大学 | Robot end effector coordinate system calibration method based on laser tracker |
CN107966112A (en) * | 2017-12-03 | 2018-04-27 | 中国直升机设计研究所 | A kind of large scale rotor movement parameter measurement method |
CN109415119A (en) * | 2016-04-08 | 2019-03-01 | 列奥纳多股份公司 | Method of the rotor and detection blade for the aircraft that can be hovered relative to the posture of the hub of this rotor |
CN109794938A (en) * | 2019-02-01 | 2019-05-24 | 南京航空航天大学 | A robot hole making error compensation device suitable for curved surface structure and method thereof |
DE102018101162A1 (en) * | 2018-01-19 | 2019-07-25 | Hochschule Reutlingen | Measuring system and method for extrinsic calibration |
- 2019-08-13 CN CN201910742194.5A patent/CN110490934B/en active Active
Non-Patent Citations (4)
Title |
---|
YONG-LIN KUO et al.: "Pose Determination of a Robot Manipulator Based on Monocular Vision", 《IEEE ACCESS》 *
ZHANXI WANG et al.: "Base Detection Research of Drilling Robot System by Using Visual Inspection", 《HINDAWI JOURNAL OF ROBOTICS》 *
WANG JUN et al.: "Relative pose estimation method for monocular mobile robots", 《应用光学 (Journal of Applied Optics)》 *
XIN FENG et al.: "Design of a robot system for cleaning vertical mixers", 《航天制造技术 (Aerospace Manufacturing Technology)》 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111325802A (en) * | 2020-02-11 | 2020-06-23 | 中国空气动力研究与发展中心低速空气动力研究所 | Circular mark point identification matching method in helicopter wind tunnel test |
CN112419375A (en) * | 2020-11-18 | 2021-02-26 | 青岛海尔科技有限公司 | Feature point matching method and device, storage medium, and electronic device |
CN112419375B (en) * | 2020-11-18 | 2023-02-03 | 青岛海尔科技有限公司 | Feature point matching method and device, storage medium, electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN110490934B (en) | 2022-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108555908B (en) | A method for gesture recognition and picking of stacked workpieces based on RGBD cameras | |
CN102788559B (en) | Optical vision measuring system with wide-field structure and measuring method thereof | |
CN107160380B (en) | Camera calibration and coordinate transformation method based on SCARA manipulator | |
JP6180087B2 (en) | Information processing apparatus and information processing method | |
CN103759716B (en) | The dynamic target position of mechanically-based arm end monocular vision and attitude measurement method | |
JP5815761B2 (en) | Visual sensor data creation system and detection simulation system | |
JP6324025B2 (en) | Information processing apparatus and information processing method | |
CN111775152A (en) | A method and system for guiding a robotic arm to grasp scattered and stacked workpieces based on three-dimensional measurement | |
CN109035200A (en) | A kind of bolt positioning and position and posture detection method based on the collaboration of single binocular vision | |
CN107218930B (en) | Monocular-hand-eye-system-based active measurement method for six-dimensional position-posture of space circle | |
CN114474056B (en) | A monocular vision high-precision target positioning method for grasping operation | |
WO2012014545A1 (en) | Three-dimensional object recognizing device and three-dimensional object recognizing method | |
CN101377405B (en) | A Visual Measurement Method of Space Circular Attitude Parameters and Geometric Parameters | |
JP2019113895A (en) | Imaging apparatus with visual sensor for imaging work-piece | |
CN102927908A (en) | Robot eye-on-hand system structured light plane parameter calibration device and method | |
JP2010530086A (en) | Imaging model and image processing apparatus | |
CN111637851B (en) | Aruco code-based visual measurement method and device for plane rotation angle | |
CN101377404B (en) | An Ambiguity Elimination Method for Space Circle Pose Recognition Based on Angle Constraint | |
CN207231476U (en) | A kind of courier packages' grabbing device based on binocular vision | |
Hsu et al. | Development of a faster classification system for metal parts using machine vision under different lighting environments | |
CN111360820A (en) | Distance space and image feature space fused hybrid visual servo method | |
CN110490934B (en) | Attitude detection method of vertical blade of hybrid machine based on monocular camera and robot | |
JP2019049467A (en) | Distance measurement system and distance measurement method | |
WO2021019627A1 (en) | Calibration method for computer vision system and three-dimensional reference object for use in same | |
CN114549586A (en) | A target localization method based on visual perception robotic arm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||