CN112070842B - Multi-camera global calibration method based on orthogonal coding stripes
- Publication number: CN112070842B
- Application number: CN202010737432.6A
- Authority: CN (China)
- Legal status: Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
The invention discloses a multi-camera global calibration method based on orthogonal coding stripes, which specifically comprises the following steps. Step S1: design a coding sequence. Step S2: establish a global coordinate system. Step S3: perform local calibration of the cameras pair by pair, using the two cameras to capture one set of orthogonally coded fringe images per target pose, with at least three poses captured. Step S4: for an orthogonally coded fringe image, compute its absolute unwrapped phases in the horizontal and vertical directions. Step S5: from the absolute unwrapped phases, establish a one-to-one mapping between the image coordinates of every pixel of the orthogonally coded fringe image and the world coordinates; set up the camera projection equations and solve for the intrinsic and extrinsic parameters of the two cameras. Step S6: repeat steps S3-S5 until every camera in the multi-camera system has been locally calibrated, then unify all cameras into the global coordinate system to achieve the global calibration of the multi-camera system.
Description
Technical Field
The invention belongs to the technical field of multi-camera imaging, and in particular relates to a multi-camera global calibration method based on orthogonal coding stripes.
Background Art
Multi-camera systems are widely used in security surveillance, target tracking, industrial measurement, autonomous navigation, and other fields. A key problem in such systems is multi-camera calibration, that is, determining each camera's intrinsic parameters (such as focal length, principal point, and distortion) and extrinsic parameters (such as rotation matrix and translation vector); the calibration accuracy directly affects the performance of the multi-camera system.
Traditional calibration targets can be divided by dimension into one-dimensional, two-dimensional, and three-dimensional targets. Two-dimensional targets are comparatively simple to manufacture and maintain, and are the most widely used. Common two-dimensional targets include checkerboards and dot arrays; the feature points of a checkerboard are its corner points, and those of a dot array are the circle centers. The extraction accuracy of both is strongly affected by image gray levels, which in turn degrades camera calibration accuracy.
Many researchers have designed various kinds of phase targets, including phase-shifted fringes, phase-shifted circular fringes, and orthogonal fringes. Their common idea is to extract feature points from the phase distribution of the target.
Phase-shifted fringes and phase-shifted circular fringes compute their phase with the phase-shifting method, which generally requires multiple fringe patterns. During calibration the cameras must therefore remain fixed, and multiple fringe images must be captured at each camera pose, which is time-consuming and labor-intensive.
Orthogonal fringes compute their phase with Fourier analysis, which improves calibration flexibility. However, for multi-camera calibration the camera arrangement varies between applications, and adjacent cameras may have no common field of view.
Whether a traditional two-dimensional target or a phase target is used, conventional multi-camera calibration methods require placing the target inside the common field of view for image capture, which limits their range of application. In summary, improving the accuracy and flexibility of multi-camera calibration remains of great significance.
Summary of the Invention
The present invention provides a multi-camera global calibration method based on orthogonal coding stripes, in order to solve the problems described in the background art above.
To achieve the above object, the technical solution adopted by the present invention is a multi-camera global calibration method based on orthogonal coding stripes, which specifically comprises the following steps:
Step S1: design a coding sequence, and embed a number of code points into orthogonal fringes according to the coding sequence to generate the orthogonally coded fringe pattern I(x,y); then display the pattern on a liquid crystal screen or print it on a white flat panel to serve as the phase target.
Step S2: select the camera of the multi-camera system whose field of view overlaps most with those of the other cameras as the reference camera, and take the coordinate system of this reference camera as the global coordinate system; according to the degree of field-of-view overlap, group the cameras with larger overlap into pairs.
Step S3: perform local calibration of the cameras pair by pair. Place the orthogonally coded fringe target within the fields of view of the two cameras, one of which is the reference camera or a camera that has already been locally calibrated; change the pose of the target, and at each pose capture one pair of orthogonally coded fringe images I_a(u,v), I_b(u,v) with the two cameras, capturing at least three poses.
Step S4: for an orthogonally coded fringe image I(u,v), compute its absolute unwrapped phases Φ_u(u,v), Φ_v(u,v) in the horizontal and vertical directions.
Step S5: from the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v), establish the one-to-one mapping between the image coordinates (u,v) of every pixel of the orthogonally coded fringe image I(u,v) and the world coordinates (X,Y,Z); set up the camera projection equations and solve for the intrinsic and extrinsic parameters of the two cameras.
Step S6: repeat steps S3-S5 until every camera in the multi-camera system has been locally calibrated; finally, use the extrinsic parameters between camera pairs to unify all cameras into the global coordinate system, achieving global calibration of the multi-camera system.
Preferably, in step S1, before the code points are embedded the target intensity of the orthogonal fringes can be expressed as:
I(x,y) = 0.5 + 0.25cos(2πx/p_x) + 0.25cos(2πy/p_y)
where (x,y) are the target coordinates, and p_x and p_y are the fringe periods in the horizontal and vertical directions, respectively. Before the code points are embedded, the intensity of the orthogonally coded fringe image can be expressed as:
I(u,v) = A(u,v) + B_u(u,v)cos[φ_u(u,v)] + B_v(u,v)cos[φ_v(u,v)]
where (u,v) are the image coordinates, A(u,v) is the background intensity, and B_u(u,v) and B_v(u,v) are the modulation intensities in the horizontal and vertical directions, respectively.
Preferably, in step S3, during the local calibration of a camera pair the fields of view of the two cameras may partially overlap or not overlap at all, as long as each of the two cameras can capture a partial or complete image of the orthogonally coded fringe target.
Preferably, step S4 specifically comprises the following steps:
Step S41: apply Fourier analysis to the orthogonally coded fringe image I(u,v) and compute the wrapped phases φ_u(u,v), φ_v(u,v) in the horizontal and vertical directions, respectively.
Step S42: perform local phase unwrapping on the wrapped phases φ_u(u,v), φ_v(u,v) to obtain the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v) in the horizontal and vertical directions.
Step S43: from the mathematical relationship between the wrapped phases φ_u(u,v), φ_v(u,v) and the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v), solve for the local fringe orders k′_u(u,v), k′_v(u,v) in the horizontal and vertical directions.
Step S44: apply median filtering to the orthogonally coded fringe image I(u,v) to obtain the filtered image I′(u,v), in which the code points of I(u,v) have been filtered out.
Step S45: subtract the filtered image I′(u,v) from the orthogonally coded fringe image I(u,v), then binarize the difference image to extract the code-point image H(u,v).
Step S46: correct the local fringe orders k′_u(u,v), k′_v(u,v) according to the code-point image H(u,v) to obtain the absolute fringe orders k_u(u,v), k_v(u,v) in the horizontal and vertical directions.
Step S47: combine the wrapped phases φ_u(u,v), φ_v(u,v) with the absolute fringe orders k_u(u,v), k_v(u,v) to compute the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v).
Preferably, in step S43, the mathematical relationship between the wrapped phases φ_u(u,v), φ_v(u,v) and the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v) is:
k′_r(u,v) = floor[Φ′_r(u,v)/(2π)], r = u, v.
Preferably, in step S46, the absolute fringe orders k_u(u,v), k_v(u,v) are computed as follows: the local fringe orders k′_u(u,v), k′_v(u,v) are used to distinguish the individual fringe periods in the horizontal and vertical directions; the code value of each fringe period is extracted from the code-point image, and the code values of m = 6 consecutive fringe periods are combined into a codeword; the codeword is then looked up in the coding sequence, and its position determines the absolute fringe orders k_u(u,v), k_v(u,v).
Preferably, in step S47, the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v) are computed as:
Φ_r(u,v) = φ_r(u,v) + 2πk_r(u,v), r = u, v.
Preferably, in step S5, the one-to-one mapping between the image coordinates (u,v) and the world coordinates (X,Y,Z), established from the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v), is:
where d_x and d_y denote the pixel widths of the orthogonally coded fringe pattern in the horizontal and vertical directions, respectively.
Preferably, in step S5, the intrinsic and extrinsic parameters of the two cameras are solved with Zhang's calibration algorithm (IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-4), with the pinhole imaging model as the camera model.
The beneficial effects of adopting the above technical solution are:
1. The technical solution of the present invention uses only a single orthogonally coded fringe pattern as the phase target, extracts its wrapped phase with Fourier analysis, and recovers its absolute unwrapped phase with the code points, achieving high phase extraction and unwrapping accuracy.
2. With the orthogonally coded fringes adopted by the technical solution of the present invention, every pixel carrying a valid phase can be used as a phase feature point, which effectively guarantees camera calibration accuracy.
3. The technical solution of the present invention can use phase feature points inside and outside the overlapping field of view simultaneously for pairwise camera calibration, offering high flexibility and wide applicability.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the orthogonally coded fringes;
Figure 2 is a schematic diagram of the pairwise camera calibration;
Figure 3 is a flowchart of the absolute unwrapped phase computation for an orthogonally coded fringe image.
Detailed Description of the Embodiments
The specific embodiments of the present invention are described in further detail below through the description of examples with reference to the accompanying drawings, with the aim of helping those skilled in the art to gain a more complete, accurate, and thorough understanding of the concept and technical solution of the present invention and to facilitate its implementation.
As shown in Figures 1 to 3, the present invention is a multi-camera global calibration method based on orthogonal coding stripes, which specifically comprises the following steps:
Step S1: design a coding sequence, and embed a number of code points into orthogonal fringes according to the coding sequence to generate the orthogonally coded fringe pattern I(x,y); then display the pattern on a liquid crystal screen or print it on a white flat panel to serve as the phase target, as shown in Figure 1.
Step S2: select the camera of the multi-camera system whose field of view overlaps most with those of the other cameras as the reference camera, and take the coordinate system of this reference camera as the global coordinate system; according to the degree of field-of-view overlap, group the cameras with larger overlap into pairs.
Step S3: perform local calibration of the cameras pair by pair. Place the orthogonally coded fringe target within the fields of view of the two cameras, one of which is the reference camera or a camera that has already been locally calibrated; as shown in Figure 2, change the pose of the target, and at each pose capture one pair of orthogonally coded fringe images I_a(u,v), I_b(u,v) with the two cameras, capturing at least three poses.
Step S4: for an orthogonally coded fringe image I(u,v), compute its absolute unwrapped phases Φ_u(u,v), Φ_v(u,v) in the horizontal and vertical directions, as shown in Figure 3.
Step S5: from the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v), establish the one-to-one mapping between the image coordinates (u,v) of every pixel of the orthogonally coded fringe image I(u,v) and the world coordinates (X,Y,Z); set up the camera projection equations and solve for the intrinsic and extrinsic parameters of the two cameras.
Step S6: repeat steps S3-S5 until every camera in the multi-camera system has been locally calibrated; finally, use the extrinsic parameters between camera pairs to unify all cameras into the global coordinate system, achieving global calibration of the multi-camera system.
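Step S6 amounts to chaining the pairwise extrinsic parameters so that every camera is expressed in the reference (global) coordinate system. A minimal sketch of that chaining is given below; it assumes each pairwise calibration yields a rotation R and translation t mapping points from the second camera's frame into the first camera's frame, and the function and variable names are illustrative only.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation matrix and a translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R)
    T[:3, 3] = np.asarray(t).ravel()
    return T

def chain_to_global(pairwise, path):
    """Compose pairwise transforms along a path of camera indices that starts at the
    reference camera. pairwise[(i, j)] = (R, t) is assumed to map points expressed in
    camera j's frame into camera i's frame."""
    T = np.eye(4)
    for i, j in zip(path[:-1], path[1:]):
        T = T @ to_homogeneous(*pairwise[(i, j)])
    return T  # maps the last camera of the path into the reference (global) frame

# Example: camera 2 -> camera 1 -> camera 0 (reference camera)
# pairwise = {(0, 1): (R01, t01), (1, 2): (R12, t12)}
# T_global_2 = chain_to_global(pairwise, [0, 1, 2])
```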
In step S1, before the code points are embedded the target intensity of the orthogonal fringes can be expressed as:
I(x,y) = 0.5 + 0.25cos(2πx/p_x) + 0.25cos(2πy/p_y)
where (x,y) are the target coordinates, and p_x and p_y are the fringe periods in the horizontal and vertical directions, respectively. Before the code points are embedded, the intensity of the orthogonally coded fringe image can be expressed as:
I(u,v) = A(u,v) + B_u(u,v)cos[φ_u(u,v)] + B_v(u,v)cos[φ_v(u,v)]
where (u,v) are the image coordinates, A(u,v) is the background intensity, and B_u(u,v) and B_v(u,v) are the modulation intensities in the horizontal and vertical directions, respectively.
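As a concrete illustration of the target intensity model above, the following sketch generates the orthogonal fringe pattern I(x,y) before any code points are embedded; the resolution and the fringe periods are example values of this sketch, not values prescribed by the invention.

```python
import numpy as np

def orthogonal_fringes(width=1920, height=1080, p_x=32, p_y=32):
    """Generate the orthogonal fringe target
    I(x,y) = 0.5 + 0.25*cos(2*pi*x/p_x) + 0.25*cos(2*pi*y/p_y), before code points are embedded."""
    x = np.arange(width)
    y = np.arange(height)
    X, Y = np.meshgrid(x, y)   # target pixel coordinates
    return 0.5 + 0.25 * np.cos(2 * np.pi * X / p_x) + 0.25 * np.cos(2 * np.pi * Y / p_y)

# target = orthogonal_fringes()
# (target * 255).astype("uint8") can then be displayed on the LCD screen or printed.
```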
In step S1, the coding sequence consists of the two code values '0' and '1'. The coding sequence used in Figure 1 is:
'0110111010101111001011001101001001110001010001100001000000111111' (note that Figure 1 shows only part of the coding sequence, namely the first 15 code values). Every m = 6 consecutive code values form a codeword, and each codeword can appear only once in the entire coding sequence. The fringe orders k_u(x,y), k_v(x,y) are used as index values: if the indexed code value in the coding sequence is '1', a code point is embedded in the corresponding fringe period; if the indexed code value is '0', no code point is embedded in the corresponding fringe period. For example, when the fringe order k_u(x,y) = 8, 10, 12, the indexed code values are all '1', so code points are embedded in the corresponding fringe periods; when the fringe order k_u(x,y) = 7, 9, 11, the indexed code values are all '0', so no code points are embedded in the corresponding fringe periods. The target coordinates (x_c, y_c) at which a code point is embedded can be expressed as:
where c_i denotes the i-th code value and c_j denotes the j-th code value.
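The exact expression for the embedding coordinates (x_c, y_c) is given in the original patent drawing and is not reproduced above. The sketch below only illustrates the sequence-level logic: it verifies that every 6-value codeword is unique and lists the fringe orders whose indexed code value is '1'; placing a code point at the centre of those periods is an assumption of this sketch, not necessarily the patented embedding rule.

```python
CODE_SEQ = "0110111010101111001011001101001001110001010001100001000000111111"

def check_codewords(seq, m=6):
    """Verify the design rule that every codeword of m consecutive code values
    occurs only once in the coding sequence."""
    words = [seq[k:k + m] for k in range(len(seq) - m + 1)]
    assert len(words) == len(set(words)), "a codeword repeats; the sequence is invalid"
    return words

def marked_fringe_orders(seq):
    """Fringe orders k whose indexed code value is '1' (0-based indexing, consistent
    with the k_u = 8, 10, 12 example above); a code point is embedded in those periods."""
    return [k for k, c in enumerate(seq) if c == "1"]

# check_codewords(CODE_SEQ)
# marked = marked_fringe_orders(CODE_SEQ)
# Assumed illustration: place a dot near the centre of each marked period, e.g. at
# ((k + 0.5) * p_x, (k + 0.5) * p_y) for k in marked.
```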
In step S3, during the local calibration of a camera pair the fields of view of the two cameras may partially overlap or not overlap at all, as long as each camera can capture a partial or complete image of the orthogonally coded fringe target; as shown in Figure 2, each of the two cameras captures a partial image of the orthogonally coded fringes.
Step S4 specifically comprises the following steps:
Step S41: apply Fourier analysis to the orthogonally coded fringe image I(u,v) and compute the wrapped phases φ_u(u,v), φ_v(u,v) in the horizontal and vertical directions, respectively; this process is prior art, and a specific implementation can be found in the literature (Optik, 2012, 123(2): 171-5).
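Step S41 is standard Fourier-transform fringe analysis: the spectrum is band-pass filtered around the horizontal carrier to obtain φ_u and around the vertical carrier to obtain φ_v. A minimal sketch is given below; the Gaussian band-pass filter, its bandwidth, and the assumption that the carrier period in the camera image is known are choices of this sketch rather than requirements of the method.

```python
import numpy as np

def wrapped_phase(img, carrier_period, axis=1, bandwidth=0.5):
    """Fourier analysis of an orthogonal fringe image along one direction.
    axis=1 selects the horizontal carrier (phase phi_u), axis=0 the vertical carrier (phi_v).
    carrier_period is the approximate fringe period in the camera image, in pixels."""
    img = np.asarray(img, dtype=float)
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(rows))[:, None]   # vertical frequencies (cycles/pixel)
    fx = np.fft.fftshift(np.fft.fftfreq(cols))[None, :]   # horizontal frequencies
    f0 = 1.0 / carrier_period
    f_along = fx if axis == 1 else fy
    f_across = fy if axis == 1 else fx
    # Gaussian band-pass centred on the positive carrier lobe; bandwidth is a fraction of f0
    H = np.exp(-((f_along - f0) ** 2 + f_across ** 2) / (2.0 * (bandwidth * f0) ** 2))
    analytic = np.fft.ifft2(np.fft.ifftshift(F * H))
    return np.angle(analytic)   # wrapped phase in (-pi, pi]

# phi_u = wrapped_phase(I_cam, carrier_period=32, axis=1)
# phi_v = wrapped_phase(I_cam, carrier_period=32, axis=0)
```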
Step S42: perform local phase unwrapping on the wrapped phases φ_u(u,v), φ_v(u,v) to obtain the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v) in the horizontal and vertical directions; spatial algorithms such as the quality-guided method or the branch-cut method can be used.
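As one possible realization of step S42 (not the specific unwrapper prescribed here), the following sketch uses the 2D phase unwrapper from scikit-image; any quality-guided or branch-cut implementation could be substituted.

```python
from skimage.restoration import unwrap_phase

def local_unwrap(phi):
    """Spatially unwrap a wrapped phase map; the overall offset of the result is arbitrary,
    so this is a *local* unwrapped phase in the sense of step S42."""
    Phi = unwrap_phase(phi)       # removes the 2*pi discontinuities
    return Phi - Phi.min()        # shift so the minimum is zero (zero-phase point is arbitrary)

# Phi_u_local = local_unwrap(phi_u)
# Phi_v_local = local_unwrap(phi_v)
```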
Step S43: from the mathematical relationship between the wrapped phases φ_u(u,v), φ_v(u,v) and the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v), solve for the local fringe orders k′_u(u,v), k′_v(u,v) in the horizontal and vertical directions.
Step S44: apply median filtering to the orthogonally coded fringe image I(u,v) to obtain the filtered image I′(u,v), in which the code points of I(u,v) have been filtered out.
Step S45: subtract the filtered image I′(u,v) from the orthogonally coded fringe image I(u,v), then binarize the difference image to extract the code-point image H(u,v).
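Steps S44 and S45 can be implemented directly with standard image-processing operations. The sketch below uses OpenCV; the kernel size and the binarization threshold are illustrative values only.

```python
import cv2
import numpy as np

def extract_code_points(I, ksize=9, thresh=30):
    """Steps S44-S45: median-filter the fringe image so the code points disappear,
    subtract the filtered image from the original, then binarize the difference
    to obtain the code-point image H(u,v)."""
    I = np.asarray(I, dtype=np.uint8)
    I_filtered = cv2.medianBlur(I, ksize)      # S44: code points removed by the median filter
    diff = cv2.subtract(I, I_filtered)         # S45: mostly the code points remain
    _, H = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return H

# H = extract_code_points(I_cam)   # I_cam: captured fringe image as an 8-bit grayscale array
```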
Step S46: correct the local fringe orders k′_u(u,v), k′_v(u,v) according to the code-point image H(u,v) to obtain the absolute fringe orders k_u(u,v), k_v(u,v) in the horizontal and vertical directions.
Step S47: combine the wrapped phases φ_u(u,v), φ_v(u,v) with the absolute fringe orders k_u(u,v), k_v(u,v) to compute the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v).
In step S43, the mathematical relationship between the wrapped phases φ_u(u,v), φ_v(u,v) and the local unwrapped phases Φ′_u(u,v), Φ′_v(u,v) is:
k′_r(u,v) = floor[Φ′_r(u,v)/(2π)], r = u, v.
In step S46, the absolute fringe orders k_u(u,v), k_v(u,v) are computed as follows: the local fringe orders k′_u(u,v), k′_v(u,v) are used to distinguish the individual fringe periods in the horizontal and vertical directions; the code value of each fringe period is extracted from the code-point image, and the code values of m = 6 consecutive fringe periods are combined into a codeword; the codeword is then looked up in the coding sequence, and its position determines the absolute fringe orders k_u(u,v), k_v(u,v).
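The codeword lookup of step S46 can be illustrated as follows, assuming the per-period code bits have already been read from the code-point image H(u,v); how the bits are read out and how occluded periods are handled are left open in this sketch.

```python
CODE_SEQ = "0110111010101111001011001101001001110001010001100001000000111111"

def absolute_order_offset(period_bits, first_local_order, m=6):
    """Given the code bits of m consecutive visible fringe periods (ordered by
    increasing local fringe order), locate the codeword in the coding sequence and
    return the offset that converts local fringe orders into absolute ones."""
    codeword = "".join(period_bits[:m])
    pos = CODE_SEQ.find(codeword)        # unique position by design of the sequence
    if pos < 0:
        raise ValueError("codeword not found; the code bits were misread")
    return pos - first_local_order       # k_absolute = k_local + offset

# offset = absolute_order_offset(["1", "0", "1", "0", "1", "1"], first_local_order=0)
# k_u = k_u_local + offset
```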
In step S47, the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v) are computed as:
Φ_r(u,v) = φ_r(u,v) + 2πk_r(u,v), r = u, v.
In steps S42 and S47, for the same pair of orthogonally coded fringe images I_a(u,v) and I_b(u,v), the zero-phase points of their local unwrapped phases may differ, but the zero-phase points of their absolute unwrapped phases are necessarily identical.
In step S5, the one-to-one mapping between the image coordinates (u,v) and the world coordinates (X,Y,Z), established from the absolute unwrapped phases Φ_u(u,v), Φ_v(u,v), is:
where d_x and d_y denote the pixel widths of the orthogonally coded fringe pattern in the horizontal and vertical directions, respectively.
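The exact mapping expression is given in the original patent drawing and is not reproduced above. Purely as an assumed illustration consistent with the quantities already defined (fringe periods p_x, p_y and pixel widths d_x, d_y), a planar-target mapping of the form X = d_x·p_x·Φ_u/(2π), Y = d_y·p_y·Φ_v/(2π), Z = 0 could be written as:

```python
import numpy as np

def phase_to_world(Phi_u, Phi_v, p_x, p_y, d_x, d_y):
    """Assumed per-pixel mapping from the absolute unwrapped phases to world
    coordinates on the planar target (Z = 0). Illustrative form only; the exact
    expression is the one given in the patent drawing."""
    X = d_x * p_x * np.asarray(Phi_u) / (2.0 * np.pi)
    Y = d_y * p_y * np.asarray(Phi_v) / (2.0 * np.pi)
    return X, Y, np.zeros_like(X)
```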
In step S5, the intrinsic and extrinsic parameters of the two cameras are preferably solved with Zhang's calibration algorithm (IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-4), and the pinhole imaging model is selected as the camera model.
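Zhang's algorithm takes, for each target pose, a set of world-point / image-point correspondences; with a phase target every valid pixel provides one, so the pixels are usually subsampled. A sketch using OpenCV's planar calibration, which implements Zhang's method for the pinhole model, is given below; the data layout and the subsampling step are choices of this sketch.

```python
import cv2
import numpy as np

def calibrate_from_phase(world_pts_per_pose, image_pts_per_pose, image_size, step=20):
    """world_pts_per_pose / image_pts_per_pose: one entry per target pose, each an
    Nx3 (X, Y, Z=0) and an Nx2 (u, v) float array built from the phase mapping.
    The points are subsampled with `step` to keep the optimization tractable."""
    obj_pts = [np.asarray(w, np.float32)[::step] for w in world_pts_per_pose]
    img_pts = [np.asarray(p, np.float32)[::step] for p in image_pts_per_pose]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
    return rms, K, dist, rvecs, tvecs   # intrinsics K, distortion, per-pose extrinsics

# rms, K, dist, rvecs, tvecs = calibrate_from_phase(world_pts, image_pts, (1920, 1080))
```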
The present invention has been described above by way of example with reference to the accompanying drawings. Obviously, the specific implementation of the present invention is not limited to the above manner; any non-substantive improvements made using the method concept and technical solution of the present invention, or direct applications of the above concept and technical solution to other situations without improvement, all fall within the protection scope of the present invention.
Claims (6)
Priority Applications (1)
- CN202010737432.6A (granted as CN112070842B) | Priority date: 2020-07-28 | Filing date: 2020-07-28 | Title: Multi-camera global calibration method based on orthogonal coding stripes
Publications (2)
- CN112070842A, published 2020-12-11
- CN112070842B, published 2023-03-21
Family ID: 73656774
Families Citing this family (2)
- CN113074667B (Suzhou Tianzhun Software Co., Ltd., priority date 2021-03-22, published 2022-08-23): Global absolute phase alignment method based on mark points, storage medium and system
- CN119478055B (Sichuan University, priority date 2024-10-18, published 2025-07-18): Camera calibration method and object three-dimensional shape information recovery method
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant