CN102213581A - Object measuring method and system - Google Patents
Object measuring method and system
- Publication number
- CN102213581A (application CN201010163956)
- Authority
- CN
- China
- Legal status: Granted (the status is an assumption and is not a legal conclusion)
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Technical Field
The present application relates to an object measuring method and system, and more particularly to an object measuring method and system that uses two image capture devices arranged in a non-parallel manner to calculate the three-dimensional coordinates of an object according to collinearity functions based on the principle of intersecting light rays.
Background Art
With the rapid evolution of technology, more and more operating procedures in product design, industrial manufacturing, and high-precision fields rely on automated systems such as robots or robotic arms, so improving the operating efficiency of these systems has become an important issue. The key lies in enabling such automated systems to accurately identify the three-dimensional coordinates of objects in space; accordingly, various methods for measuring the three-dimensional coordinates of objects have emerged.
For example, in the object measuring method disclosed in U.S. Patent No. 6,795,200, a structured light source is first projected onto the plane to be measured, and two cameras arranged in parallel then separately capture images of the object on that plane. In practice, however, arranging and configuring the structured light source imposes an additional burden on the user. Moreover, when the three-dimensional coordinates are computed by simple triangulation, the observation error of the cameras themselves cannot be taken into account, so the computed three-dimensional coordinates of the object lack accuracy, and inaccurate coordinates in turn cause excessive error in the system's subsequent operations. The method of U.S. Patent No. 6,795,200 is therefore not only impractical but also unsuitable for high-precision applications.
Furthermore, in the object measuring method disclosed in U.S. Patent Application Publication No. 2006/0088203, multiple fixed cameras are first mounted above the work area to capture three-dimensional images of the objects in it, after which the three-dimensional coordinates of the objects are computed by simple triangulation. However, capturing three-dimensional images with multiple cameras fixed above the work area is not only costly and inflexible, it is also easily obstructed by visual blind spots, and is likewise unsuitable for high-precision applications.
In addition, International Patent Application Publication No. WO 2008/076942 discloses an object measuring method in which a single camera is mounted on a movable robotic arm so that images of the objects in the work area can be captured several times from different angles, after which the three-dimensional coordinates of the objects are computed by simple triangulation. However, capturing images several times from different angles with a single camera takes extra time, which raises cost and reduces practicality. Moreover, as with the two patents discussed above, three-dimensional coordinates computed by simple triangulation cause excessive error in the system's subsequent operations, so this method likewise cannot be applied in extremely fine operations.
In view of this, there is an urgent need for an object measuring method and system that can obtain the three-dimensional coordinates of an object conveniently, quickly, and accurately, and that is also applicable to high-precision operations.
Summary of the Invention
To achieve the above and other objectives, the present invention provides an object measuring method that measures an object using a first image capture device and a second image capture device, arranged side by side, rotated inward, and non-parallel to each other, together with a processing module connected to both devices. The method comprises the following steps: (1) causing the first and second image capture devices to capture a first image and a second image, respectively, of at least one lens-correction point of known three-dimensional coordinates, and causing the processing module to derive, through a lens-correction algorithm, a first lens distortion parameter of the first image capture device from the first image and a second lens distortion parameter of the second image capture device from the second image; (2) causing the first and second image capture devices to capture the image coordinates of the same plurality of attitude-correction points of known three-dimensional coordinates, and causing the processing module to substitute the three-dimensional coordinates of the attitude-correction points, the first lens distortion parameter, and the second lens distortion parameter into a geometric function based on the ray-intersection collinearity imaging principle, the geometric function containing the unknown first lens center and first attitude parameters of the first image capture device and the unknown second lens center and second attitude parameters of the second image capture device; and (3) causing the processing module to compute the geometric function with a preset algorithm so as to solve for the first lens center and first attitude parameters of the first image capture device and the second lens center and second attitude parameters of the second image capture device, and to substitute the solved first lens center, first attitude parameters, second lens center, and second attitude parameters back into the geometric function to produce a first collinearity function and a second collinearity function corresponding to the first and second image capture devices.
In a preferred aspect, the method further comprises step (4): causing the first and second image capture devices to simultaneously capture the feature-point coordinates of a target object, and substituting the feature-point coordinates captured by the first image capture device and those captured by the second image capture device into the first and second collinearity functions to calculate the three-dimensional spatial coordinates of the target object.
In another preferred aspect, the geometric function based on the ray collinearity imaging principle in step (2) above satisfies the collinearity condition, which expands to

xc + Δx = -f · [m11(XA - XL) + m12(YA - YL) + m13(ZA - ZL)] / [m31(XA - XL) + m32(YA - YL) + m33(ZA - ZL)]
yc + Δy = -f · [m21(XA - XL) + m22(YA - YL) + m23(ZA - ZL)] / [m31(XA - XL) + m32(YA - YL) + m33(ZA - ZL)]

with distortion corrections

Δx = xc(k0 + k1·r² + k2·r⁴) + p1(r² + 2xc²) + 2p2·xc·yc
Δy = yc(k0 + k1·r² + k2·r⁴) + 2p1·xc·yc + p2(r² + 2yc²), where r² = xc² + yc².

Here (XA, YA, ZA) are the known three-dimensional coordinates of the attitude-correction point, (xc, yc) are the image coordinates of the attitude-correction point captured by the first/second image capture device, f is the known focal length of the first/second image capture device, k0, k1, k2, p1, p2 are the first/second lens distortion parameters, and (XL, YL, ZL) is the first/second lens center, in which m11 = cosφcosκ, m12 = sinωsinφcosκ + cosωsinκ, m13 = -cosωsinφcosκ + sinωsinκ, m21 = -cosφsinκ, m22 = -sinωsinφsinκ + cosωcosκ, m23 = cosωsinφsinκ + sinωcosκ, m31 = sinφ, m32 = -sinωcosφ, and m33 = cosωcosφ, where ω, φ, κ are the first/second attitude parameters.
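The rotation-matrix elements above can be checked numerically. The following sketch is an illustrative aside, not part of the patent (NumPy is assumed, and the function and variable names are the editor's): it builds the matrix from ω, φ, κ using the element formulas given above and verifies that the result is a proper rotation.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Build the 3x3 matrix [m11..m33] from the attitude parameters
    omega, phi, kappa, using the element formulas stated in the text."""
    so, co = np.sin(omega), np.cos(omega)
    sp, cp = np.sin(phi), np.cos(phi)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [cp * ck,  so * sp * ck + co * sk, -co * sp * ck + so * sk],
        [-cp * sk, -so * sp * sk + co * ck, co * sp * sk + so * ck],
        [sp,       -so * cp,                co * cp],
    ])

M = rotation_matrix(0.1, -0.05, 0.2)
# A proper rotation matrix is orthonormal with determinant +1.
assert np.allclose(M @ M.T, np.eye(3))
assert np.isclose(np.linalg.det(M), 1.0)
```

The two assertions confirm that the nine trigonometric expressions are mutually consistent, i.e. that they form an orthonormal attitude matrix for any choice of ω, φ, κ.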
The present invention further provides an object measuring method comprising the following steps: (1) causing the first and second image capture devices to capture a first image and a second image, respectively, of at least one correction point of known three-dimensional coordinates; (2) causing the processing module to substitute the parameters of the correction point of known three-dimensional coordinates in the first and second images into a geometric function based on the ray-intersection collinearity imaging principle, and to compute the geometric function with a preset algorithm so as to solve for the first lens distortion parameter, first lens center, and first attitude parameters of the first image capture device and the second lens distortion parameter, second lens center, and second attitude parameters of the second image capture device; and (3) causing the processing module to substitute the solved first lens distortion parameter, first lens center, first attitude parameters, second lens distortion parameter, second lens center, and second attitude parameters back into the geometric function to produce a first collinearity function and a second collinearity function corresponding to the first and second image capture devices.
The present invention also provides an object measuring system comprising: a first image capture device and a second image capture device for capturing images of correction points and of a target object, the two devices being arranged side by side, rotated inward, and non-parallel to each other; and a processing module connected to the first and second image capture devices for performing lens correction and object measurement from the correction-point images captured by the two devices. The processing module substitutes the parameters of the correction-point images into a geometric function based on the ray-intersection collinearity imaging principle and computes it with a preset algorithm so as to solve for the first lens distortion parameter, first lens center, and first attitude parameters of the first image capture device and the second lens distortion parameter, second lens center, and second attitude parameters of the second image capture device. The two devices then simultaneously capture the feature-point coordinates of the target object, and the processing module substitutes the captured feature-point coordinates together with the first lens distortion parameter, first lens center, first attitude parameters, second lens distortion parameter, second lens center, and second attitude parameters into the geometric function to calculate the three-dimensional spatial coordinates of the target object.
In summary, the present invention captures images of an object with two non-parallel image capture devices, derives the collinearity functions of the devices, and then calculates the three-dimensional coordinates of the object from those collinearity functions. Because the devices first capture images of correction points of known three-dimensional coordinates, from which lens correction and attitude correction are performed, before capturing images of the object, the accuracy of the measured three-dimensional coordinates is further improved.
Brief Description of the Drawings
Fig. 1A is a flow chart of the object measuring method of the present invention;
Fig. 1B is a flow chart of another embodiment of the object measuring method of the present invention;
Fig. 2 is a block diagram of the object measuring system of the present invention;
Fig. 3 is a ray-intersection relationship diagram of the present invention;
Fig. 4A is a schematic diagram of the image capture devices of the present invention arranged in parallel; and
Fig. 4B is a schematic diagram of the image capture devices of the present invention arranged in a non-parallel manner.
Description of Reference Numerals
S1~S5, S1'~S4'  steps
1  object measuring system
10, 10'  image capture device
11, 11'  steering mechanism
12  fixed base
13  processing module
2  image frame
A1, A2  field-of-view intersection areas
Detailed Description of the Embodiments
Embodiments of the present invention are described below by way of specific examples; those skilled in the art can readily understand other advantages and effects of the present invention from the disclosure of this specification.
Please refer to Fig. 1A, Fig. 1B, and Fig. 2 together. Fig. 1A and Fig. 1B are flow charts of the object measuring method of the present invention, and Fig. 2 is a block diagram of the object measuring system of the present invention.
The process of Fig. 1A is applied to, for example, the object measuring system 1 shown in Fig. 2, which comprises at least one pair of image capture devices 10 and 10' arranged side by side, rotated inward, and non-parallel to each other; steering mechanisms 11 and 11'; a fixed base 12; and a processing module 13 connected to the image capture devices 10 and 10'.
In this embodiment, the image capture devices 10 and 10' may be, for example, video cameras or digital cameras containing a charge-coupled device (CCD), each fixed on a steering mechanism 11 or 11' that may be, for example, a rotatable turntable, and the steering mechanisms 11 and 11' may in turn be rotatably mounted on a fixed base 12 bearing a graduated scale. The processing module 13 may be a computer or a microprocessor chip with logic operation capability.
In step S1, the image capture devices 10 and 10' connected to the processing module 13 are first rotatably mounted on the fixed base 12 via the steering mechanisms 11 and 11', and the steering angles of the mechanisms 11 and 11' are then adjusted according to the three-dimensional coordinates of at least one correction point so that the devices 10 and 10' are simultaneously aimed at the correction point and mounted on the fixed base 12 in a non-parallel arrangement.
In practice, the spacing between the image capture devices 10 and 10' may be no more than 10 cm, for example 5 cm, and the fixed base 12 may be mounted on a robot or robotic arm (not shown), with the processing module 13 built into the robot or robotic arm. Of course, the number of image capture devices 10, 10' and steering mechanisms 11, 11' may be increased as required, and the processing module 13 may instead be a simple data conversion device that transmits the data acquired by the devices 10 and 10' through a transmission interface such as USB, IEEE 1394a, or IEEE 1394b to an external computing unit (not shown) for subsequent computation.
In step S2, the image capture devices 10 and 10' respectively capture a first image and a second image of at least one lens-correction point of known three-dimensional coordinates, and the processing module 13 then derives the lens distortion parameters of the devices 10 and 10' from the first and second images through a lens-correction algorithm. The process then proceeds to step S3.
In this embodiment, the processing module 13 first computes the image coordinates of the lens-correction point from the first and second images, and then uses a lens-correction algorithm, for example one based on the lens distortion difference, to derive the lens distortion parameters of the devices 10 and 10' from the image coordinates of the correction point in the first and second images respectively; with the derived lens distortion parameters, the curved distortion at the edge of the lens image can be straightened. The lens distortion parameters may also describe the radial distortion and barrel distortion of the lenses of the devices 10 and 10'.
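The patent names the five distortion parameters k0, k1, k2, p1, p2 but the correction formula itself is not spelled out here. A common radial-plus-tangential model consistent with five such parameters is sketched below as an illustrative assumption (the polynomial form is the editor's choice, not quoted from the patent):

```python
import numpy as np

def distortion(x, y, k0, k1, k2, p1, p2):
    """Return the corrections (dx, dy) for centered image coordinates
    (x, y): a radial polynomial in r^2 plus tangential terms.
    The exact form is an assumption; the patent only names k0..p2."""
    r2 = x * x + y * y
    radial = k0 + k1 * r2 + k2 * r2 * r2
    dx = x * radial + p1 * (r2 + 2 * x * x) + 2 * p2 * x * y
    dy = y * radial + 2 * p1 * x * y + p2 * (r2 + 2 * y * y)
    return dx, dy

# With all parameters zero the image is undistorted:
assert distortion(0.3, -0.2, 0, 0, 0, 0, 0) == (0.0, 0.0)
```

Applying (dx, dy) to the observed coordinates straightens the curved edges mentioned above; radial terms (k0, k1, k2) grow with distance from the image center, while the tangential terms (p1, p2) model decentering of the lens.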
In step S3, the image capture devices 10 and 10' simultaneously capture the image coordinates of the same plurality of attitude-correction points of known three-dimensional coordinates, and the processing module 13 substitutes the three-dimensional coordinates of the attitude-correction points and the lens distortion parameters of the devices 10 and 10' into the geometric function based on the ray-intersection collinearity imaging principle, the function containing the unknown lens centers and attitude parameters of the devices 10 and 10'. The process then proceeds to step S4.
In this embodiment, the geometric function based on the ray collinearity imaging principle satisfies the collinearity condition, which, after the lens distortion vector is substituted, expands to

xc + Δx = -f · [m11(XA - XL) + m12(YA - YL) + m13(ZA - ZL)] / [m31(XA - XL) + m32(YA - YL) + m33(ZA - ZL)]
yc + Δy = -f · [m21(XA - XL) + m22(YA - YL) + m23(ZA - ZL)] / [m31(XA - XL) + m32(YA - YL) + m33(ZA - ZL)]

where the distortion corrections are

Δx = xc(k0 + k1·r² + k2·r⁴) + p1(r² + 2xc²) + 2p2·xc·yc
Δy = yc(k0 + k1·r² + k2·r⁴) + 2p1·xc·yc + p2(r² + 2yc²), with r² = xc² + yc².

Here, (XA, YA, ZA) are the known three-dimensional coordinates of the attitude-correction point, (xc, yc) are the image coordinates of the attitude-correction point captured by the image capture device 10 or 10', f is the known focal length of the device 10 or 10', k0, k1, k2, p1, p2 are the lens distortion parameters of the device 10 or 10', and (XL, YL, ZL) is the lens center of the device 10 or 10'.
Furthermore, m11 = cosφcosκ, m12 = sinωsinφcosκ + cosωsinκ, m13 = -cosωsinφcosκ + sinωsinκ, m21 = -cosφsinκ, m22 = -sinωsinφsinκ + cosωcosκ, m23 = cosωsinφsinκ + sinωcosκ, m31 = sinφ, m32 = -sinωcosφ, and m33 = cosωcosφ, where ω, φ, κ are the attitude parameters of the device 10 or 10'.
In step S4, the processing module 13 computes the geometric function with a preset algorithm, for example numerical iteration or the least-squares method, to simultaneously solve for the lens centers and attitude parameters of the image capture devices 10 and 10', and substitutes the solved lens centers and attitude parameters into the aforesaid geometric function based on the ray-intersection collinearity imaging principle to produce the collinearity functions corresponding to the devices 10 and 10', respectively.
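The "numerical iteration" of step S4 can be illustrated on a reduced problem. The sketch below is an illustrative toy, not the patent's implementation: the rotation is fixed to the identity and only the lens center (XL, YL, ZL) is estimated, so the collinearity equations reduce to x = -f(X - XL)/(Z - ZL) and y = -f(Y - YL)/(Z - ZL). A Gauss-Newton loop linearizes the residuals and applies least-squares updates until the observed and reprojected image coordinates agree:

```python
import numpy as np

F = 35.0  # assumed focal length, arbitrary units

def project(center, pts):
    """Collinearity with identity rotation:
    x = -F*(X-XL)/(Z-ZL),  y = -F*(Y-YL)/(Z-ZL)."""
    XL, YL, ZL = center
    X, Y, Z = pts[:, 0], pts[:, 1], pts[:, 2]
    return np.column_stack([-F * (X - XL) / (Z - ZL),
                            -F * (Y - YL) / (Z - ZL)])

def resection(obs, pts, init, iters=15, eps=1e-6):
    """Gauss-Newton with a finite-difference Jacobian: repeatedly
    solve J * delta = r in the least-squares sense and step."""
    theta = np.asarray(init, float)
    for _ in range(iters):
        r = (project(theta, pts) - obs).ravel()
        J = np.empty((r.size, 3))
        for j in range(3):
            d = np.zeros(3)
            d[j] = eps
            J[:, j] = ((project(theta + d, pts) - obs).ravel() - r) / eps
        theta = theta - np.linalg.lstsq(J, r, rcond=None)[0]
    return theta

# Synthetic check: known control points, simulated observations.
pts = np.array([[0, 0, 5], [1, 0, 8], [0, 1, 12],
                [2, 2, 6], [-1, 1, 10], [1, -2, 7]], float)
true_center = np.array([1.0, 2.0, -10.0])
obs = project(true_center, pts)
est = resection(obs, pts, init=[0.0, 0.0, -8.0])
assert np.allclose(est, true_center, atol=1e-6)
```

The full method additionally treats ω, φ, κ (and, in the second embodiment, the distortion parameters) as unknowns, which enlarges the Jacobian but leaves the iteration scheme unchanged.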
To clarify steps S2 to S4, please refer to Fig. 3, which takes the image capture device 10 as an example and shows the positional relationship in three-dimensional space among the correction point A(XA, YA, ZA), its image coordinates Aa(xc, yc), and the lens center L(XL, YL, ZL) of the device 10.
First, A(XA, YA, ZA) serves as the lens-correction point. After the image capture device 10 captures an image of A(XA, YA, ZA), an image frame 2 is obtained containing the image coordinates Aa(xc, yc) of A(XA, YA, ZA). When the processing module 13 substitutes a number of values of A(XA, YA, ZA), the corresponding values of Aa(xc, yc), and the focal length f of the device 10 into the above geometric function based on the ray collinearity imaging principle, it can compute the lens distortion parameters k0, k1, k2, p1, p2 of the device 10 and thereby complete lens correction for it. Lens correction for the device 10' can of course be completed by the same method; notably, lens correction for the devices 10 and 10' may be carried out simultaneously or one after the other.
Next, A(XA, YA, ZA) serves as the attitude-correction point. After the processing module 13 computes the values of the attitude parameters ω, φ, and κ of the image capture device 10, and since ω, φ, and κ represent the deflection angles of the device 10 relative to the spatial direction axes, the processing module 13 can complete attitude correction for the device 10 from those values. Notably, attitude correction for the devices 10 and 10' must be carried out simultaneously.
Finally, the processing module 13 solves for the lens center of the image capture device 10, namely L(XL, YL, ZL) in the figure. When the processing module 13 substitutes the solved lens center L(XL, YL, ZL) and attitude parameters ω, φ, κ of the device 10 into the geometric function based on the ray-intersection collinearity imaging principle, the collinearity function corresponding to the device 10 is produced; the collinearity function of the device 10' can of course be produced by the same method.
In step S5, the image capture devices 10 and 10' simultaneously capture the feature-point coordinates of a target object, and the processing module 13 substitutes the feature-point coordinates captured by each of the devices 10 and 10' into that device's collinearity function to calculate the three-dimensional spatial coordinates of the target object.
In this embodiment, the processing module 13 may further match the feature-point coordinates captured by the device 10 against those captured by the device 10' and judge their similarity, thereby establishing a plane equation of the target object and, from the established plane equation, calculating the three-dimensional spatial coordinates and attitude of the target object.
To explain step S5 more clearly, please refer again to Fig. 3. Here, A(XA, YA, ZA) represents a feature point of the target object, Aa(xc, yc) represents the image coordinates of the feature point A(XA, YA, ZA), and L(XL, YL, ZL), as before, represents the lens center of the image capture device 10.
Thus, after the image capture devices 10 and 10' each capture an image of A(XA, YA, ZA), two sets of image coordinates Aa(xc, yc) of A(XA, YA, ZA) are obtained from the image frames 2. The processing module can then substitute the two sets of Aa(xc, yc) values back into the collinearity functions of the devices 10 and 10' respectively, solve for the values of A(XA, YA, ZA), and so obtain the spatial coordinates of the feature point of the target object.
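The space intersection just described can be sketched geometrically (an illustrative aside, assuming NumPy; the lens centers and target below are made-up numbers): each camera contributes a ray from its lens center through the observed image point, and the least-squares intersection of the two rays recovers the feature point.

```python
import numpy as np

def intersect_rays(L1, d1, L2, d2):
    """Least-squares intersection of rays p = L1 + t1*d1 and
    p = L2 + t2*d2: solve for (t1, t2) minimizing the gap between
    the two rays, then return the midpoint of the closest points."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    A = np.column_stack([d1, -d2])                  # 3x2 system
    t = np.linalg.lstsq(A, L2 - L1, rcond=None)[0]
    p1 = L1 + t[0] * d1
    p2 = L2 + t[1] * d2
    return (p1 + p2) / 2

# Two lens centers ~10 cm apart, both rays aimed at a common point:
target = np.array([0.0, 0.0, 10.0])
L1 = np.array([-0.05, 0.0, 0.0])
L2 = np.array([0.05, 0.0, 0.0])
p = intersect_rays(L1, target - L1, L2, target - L2)
assert np.allclose(p, target)
```

With real (noisy) observations the two rays do not meet exactly, which is why the midpoint of the closest points, rather than an exact intersection, is returned.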
It should be noted that the spatial coordinates can be computed accurately to produce stereoscopic vision only when the correction point or target object lies within the intersection of the fields of view of the image capture devices 10 and 10', so the size of that intersection area directly affects the accuracy of the result. Moreover, the closer the field-of-view intersection area can be brought to the cameras, the less likely an object or correction point is to go out of focus for being too close to the devices 10 and 10'. By arranging the devices 10 and 10' in a non-parallel manner, the present invention is therefore better suited to high-precision operations.
To illustrate the advantage of arranging the image capture devices 10 and 10' in a non-parallel manner, please refer to Figs. 4A and 4B, where Fig. 4A is a schematic view of the fields of view of the devices 10 and 10' of Fig. 2 arranged in parallel, and Fig. 4B is a schematic view of their fields of view when arranged non-parallel. As shown in Fig. 4A, the parallel arrangement produces a field-of-view intersection area A1 at a distance d1 from the devices 10 and 10'; as shown in Fig. 4B, the non-parallel arrangement produces a field-of-view intersection area A2 at a distance d2 from the devices. By comparison, since the area of A2 is larger than that of A1 and the distance d1 is longer than d2, the non-parallel arrangement of the devices 10 and 10' is better suited to extremely fine operations.
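The effect shown in Figs. 4A and 4B can be quantified with a small calculation (an illustrative aside; the baseline and field-of-view values are assumed, not taken from the patent). For two cameras with baseline b and horizontal half field of view α, the fields begin to overlap where the inner field edges cross the midline; toeing each camera in by an angle θ rotates those inner edges inward, so the crossing moves from (b/2)/tan(α) to (b/2)/tan(α + θ), i.e. closer to the cameras:

```python
import math

def overlap_start(baseline, half_fov, toe_in=0.0):
    """Distance from the camera plane at which the two fields of
    view begin to overlap, for a symmetric two-camera setup.
    Assumed idealized pinhole geometry, not the patent's figures."""
    return (baseline / 2) / math.tan(half_fov + toe_in)

b = 0.05                    # 5 cm spacing, as in the embodiment above
alpha = math.radians(20)    # assumed half field of view
d_parallel = overlap_start(b, alpha)                       # Fig. 4A's d1
d_toed_in = overlap_start(b, alpha, toe_in=math.radians(10))  # Fig. 4B's d2
# Toeing the cameras in moves the overlap region closer (d2 < d1):
assert d_toed_in < d_parallel
```

This matches the comparison in the text: the inward rotation shortens the dead zone in front of the cameras and enlarges the usable intersection area for close-range, high-precision work.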
In addition, please refer again to Fig. 1B for another embodiment of the object measuring method of the present invention.
In step S1' of this embodiment, the first image capture device 10 and the second image capture device 10' first capture a first image and a second image, respectively, of at least one correction point of known three-dimensional coordinates. Next, in step S2', the processing module 13 substitutes the parameters corresponding to the first and second images into the geometric function based on the ray-intersection collinearity imaging principle and computes it with a preset algorithm, thereby solving for the first lens distortion parameter, first lens center, and first attitude parameters of the first image capture device 10 and the second lens distortion parameter, second lens center, and second attitude parameters of the second image capture device 10'. In step S3', the processing module 13 substitutes the solved first lens distortion parameter, first lens center, first attitude parameters, second lens distortion parameter, second lens center, and second attitude parameters into the aforesaid geometric function to produce the first and second collinearity functions corresponding to the first and second image capture devices.
Of course, in step S4' of this embodiment, the first image capture device 10 and the second image capture device 10' may then simultaneously capture the feature-point coordinates of a target object, and the feature-point coordinates captured by the devices 10 and 10' are substituted into the aforesaid first and second collinearity functions to calculate the three-dimensional spatial coordinates of the target object.
It should be noted that this embodiment differs from the preceding one in that the correction-point image is captured only once (that is, the devices 10 and 10' each capture a single correction image, though that image may contain several correction points), and the processing module 13 of this embodiment solves simultaneously for the first lens distortion parameter, first attitude parameters, and first lens center of the first image capture device 10 and the second lens distortion parameter, second attitude parameters, and second lens center of the second image capture device 10'.
In other words, by, for example, adjusting the steering angles of the first and second image capture devices 10 and 10', this embodiment can replace the lens-correction point and attitude-correction point of the preceding embodiment with a single correction point, after which the processing module 13 completes lens correction and attitude correction for the devices 10 and 10' simultaneously. The computation method and the related parameters and functions of this embodiment are the same as in the preceding embodiment and are not repeated here.
In summary, the present invention captures images of a target object with two non-parallel image capture devices and calculates the three-dimensional coordinates of the target object from the collinearity functions of the devices, so those coordinates can be obtained conveniently, quickly, and accurately. Furthermore, because the devices first capture images of correction points of known three-dimensional coordinates, from which lens correction and attitude correction are performed, before capturing images of the object, the accuracy of the measured three-dimensional coordinates is further improved. The present invention thus not only obtains the three-dimensional coordinates and attitude of a target object quickly, but also raises the accuracy and convenience of the measurement, which benefits applications in a variety of working environments.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Those skilled in the art may modify and vary the above embodiments without departing from the spirit and scope of the present invention. The scope of protection of the present invention shall therefore be as set forth in the appended claims.
Claims (20)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201010163956.5A | 2010-04-08 | 2010-04-08 | Object measuring method and system |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN102213581A | 2011-10-12 |
| CN102213581B | 2016-06-08 |
Family

- ID=44744984
Cited By (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109506674A | 2017-09-15 | 2019-03-22 | 高德信息技术有限公司 | Acceleration correction method and device |
| CN109506674B | 2017-09-15 | 2021-05-25 | 阿里巴巴(中国)有限公司 | Acceleration correction method and device |
Citations (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06265326A | 1993-03-16 | 1994-09-20 | Kawasaki Steel Corp | Calibrating device for a plate width/zigzag movement measuring apparatus using a two-dimensional rangefinder |
| CN101334276A | 2007-06-27 | 2008-12-31 | 中国科学院自动化研究所 | A visual measurement method and device |
| WO2010011124A1 | 2008-07-21 | 2010-01-28 | Vitrox Corporation Bhd | A method and means for measuring positions of contact elements of electronic components |
Non-Patent Citations (1)

| Title |
|---|
| 张浩鹏 (Zhang Haopeng): "双目立体视觉及管口视觉测量系统研究" (Research on binocular stereo vision and a pipe-orifice vision measurement system), China Master's Theses Full-text Database, Information Science and Technology, No. 11, 15 November 2009 |
Also Published As

| Publication number | Publication date |
|---|---|
| CN102213581B | 2016-06-08 |
Legal Events

| Code | Title |
|---|---|
| C06 | Publication |
| PB01 | Publication |
| C10 | Entry into substantive examination |
| SE01 | Entry into force of request for substantive examination |
| C14 | Grant of patent or utility model |
| GR01 | Patent grant |