
CN102034238B - Multi-camera system calibrating method based on optical imaging probe and visual graph structure - Google Patents


Info

Publication number
CN102034238B
CN102034238B (application numbers CN2010105852616A, CN201010585261A)
Authority
CN
China
Prior art keywords
camera
calibration
point
cameras
optical imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2010105852616A
Other languages
Chinese (zh)
Other versions
CN102034238A (en)
Inventor
赵宏 (Zhao Hong)
李进军 (Li Jinjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Cartesan Testing Technology Co Ltd
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN2010105852616A priority Critical patent/CN102034238B/en
Publication of CN102034238A publication Critical patent/CN102034238A/en
Application granted granted Critical
Publication of CN102034238B publication Critical patent/CN102034238B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract



A calibration method for a multi-camera system based on an optical imaging probe and a visual graph structure. The optical imaging probe is used to calibrate each camera independently, yielding initial values of each camera's intrinsic and distortion parameters. The cameras are then calibrated pairwise: a linear estimation method recovers the fundamental matrix, epipolar constraint, rotation matrix, and translation vector between each two cameras with overlapping fields of view. The connection relationships among the cameras are established from graph theory as a visual graph, and a shortest-path method estimates the initial rotation and translation of each camera relative to a reference camera. Finally, a sparse bundle adjustment algorithm jointly optimizes the intrinsic and extrinsic parameters of all cameras together with the collected three-dimensional marker point set of the optical imaging probe, yielding a high-precision calibration result. The calibration procedure is simple, proceeds from local to global and from coarse to fine, achieves high accuracy and robustness, and is applicable to multi-camera systems of different measurement ranges and distribution structures.


Description

Multi-camera system calibration method based on optical imaging measuring head and visual graph structure
Technical Field
The invention belongs to the technical field of vision measurement, and relates to a multi-camera system calibration method based on an optical imaging measuring head and a vision graph structure.
Background
Three-dimensional measurement with a multi-camera system first requires calibrating the intrinsic and distortion parameters of each camera, the relative poses among the cameras, and the pose parameters with respect to a reference coordinate system. The robustness and precision of this calibration directly determine the measurement precision of the system, so robust, high-precision calibration is the prerequisite for high-precision measurement with a multi-camera system.
Camera calibration has a long history, and many effective methods have been developed over the decades. In recent years, owing to their unique advantages for large-scale, large-range measurement, multi-camera systems have attracted wide research attention at home and abroad; their efficient and high-precision calibration in particular has become a research hotspot. However, existing multi-camera calibration methods have drawbacks. Some use electronic theodolites to establish the world coordinate system, which increases system cost and complexity. Some adopt three-, two-, one- or zero-dimensional calibration objects and directly use the intrinsic parameters of a single camera and the pairwise pose relationships to estimate the calibration parameters of the whole system; this requires the calibration object to be visible in all camera fields of view, and the results are degraded by noise, computational errors, and errors in the initial parameters, so the precision is low. Others apply bundle adjustment optimization in the final stage of calibration, but the acquisition of the three-dimensional calibration point set is complex and extra points must be discarded, so the efficiency is low and the robustness is poor.
Disclosure of Invention
The invention aims to provide a multi-camera system calibration method based on an optical imaging probe and a visual graph structure, addressing the defects of the existing calibration techniques. The method completes the intrinsic and extrinsic calibration of each camera and the calibration of the whole multi-camera system with a single calibrated optical imaging probe. It is simple, does not require the probe to be visible in all camera fields of view at once (visibility in two or more overlapping fields suffices), and is applicable to various measurement fields. By combining a visual graph structure with bundle adjustment optimization, the calibration proceeds from coarse to fine, ensuring robust and highly accurate results.
In order to achieve the purpose, the invention adopts the technical scheme that:
step 1, establishing hardware:
a plurality of distributed cameras and an optical imaging probe are adopted; the cameras are not all required to share an overlapping field of view, but at least two cameras must have an overlapping field of view;
Step 2, determine the spatial position of each camera and select one or more spatial positioning points for it, such that the optical imaging probe placed at a positioning point lies within the field of view of the camera(s) being calibrated. For single-camera calibration: first, install the reference positioning block (1) on the corresponding positioning point, place the optical imaging probe within the camera's field of view and seat it on the reference positioning block (1), then rotate the probe about its measuring tip (2); the seven marker points (5) on the probe target body (4) rotate concentrically about the tip while the camera acquires images of the probe in multiple poses. Second, extract the sub-pixel center of each LED in each image using threshold segmentation and ellipse fitting, and compute the center coordinate of each marker point (5) by the center-of-gravity method. Finally, recover the intrinsic parameter matrix and distortion parameters of each camera from the concentric-rotation relation of the marker points (5) and the invariance of their calibrated spatial distances;
Step 3, reselect and position the spatial positioning points and adjust the pose of the optical imaging probe so that it lies within the overlapping field of view of every two or more cameras; in the overlapping field, rotate the probe while the cameras sharing the field simultaneously acquire LED images of the probe marker points; through image processing, center identification, and marker point matching, and from the geometric constraints between the marker points, recover the fundamental matrix, essential matrix, epipolar geometric relationship, rotation matrix, and translation vector between each two local cameras;
Step 4, taking each camera as a node and every two cameras with overlapping fields of view as an edge, construct the visual graph of the multi-camera system; the direction of each edge is determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed latter camera. In the visual graph, first determine a reference camera; starting from it, establish the connection relationships among all cameras, and compute the rotation matrix and translation vector of each camera relative to the reference camera by a shortest-path method, realizing the global initial calibration of the multi-camera system;
Step 5, according to the initial calibration result, back-project the marker points acquired and identified in steps 2 and 3 into the reference camera coordinate system, establishing a three-dimensional calibration point set in the world coordinate system; optimize all calibration parameters and the three-dimensional calibration point set with a sparse bundle adjustment algorithm to obtain a robust, high-precision calibration result;
Step 6, if the calibration result meets the required measurement precision, the calibration is finished; if not, increase the number of positioning points of the optical imaging probe, acquire images again, add the newly identified and reconstructed three-dimensional calibration points to the original point set, and re-optimize all calibration parameters and the three-dimensional calibration point set with the sparse bundle adjustment algorithm, recalibrating all parameters.
Each marker point (5) consists of six LEDs; the center of gravity of the six LED spots is taken as the center of the marker point.
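As a minimal sketch (the function name and the synthetic LED coordinates below are illustrative assumptions, not values from the patent), the center-of-gravity step reduces to averaging the six LED sub-pixel centers:

```python
import numpy as np

def marker_center(led_centers):
    """Center-of-gravity method: the marker point's center is the
    centroid of the sub-pixel centers of its six LED spots."""
    pts = np.asarray(led_centers, dtype=float)
    assert pts.shape[0] == 6, "each marker point consists of six LEDs"
    return pts.mean(axis=0)

# Six synthetic LED spot centers scattered around pixel (100, 50)
leds = [(102.0, 50.0), (98.0, 50.0), (100.0, 52.0),
        (100.0, 48.0), (101.5, 51.5), (98.5, 48.5)]
center = marker_center(leds)
```

In practice the six LED centers would come from the threshold-segmentation and ellipse-fitting stage described above.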
The step 2 comprises the following steps:
step 21, establishing a camera pinhole projection model:
First, establish the world coordinate system $X_W Y_W Z_W$, the probe coordinate system $X_t Y_t Z_t$, the image coordinate system $xy$, and the pixel coordinate system $uv$. There are $N = 7$ marker points on the optical imaging probe; $X_t^j$ denotes the coordinates of marker point $j$ in the probe coordinate system and $X_W^j$ its coordinates in the world coordinate system. $(R, T)$ are the rotation and translation of the camera relative to the world coordinate system. $K$ is the intrinsic parameter matrix of the camera and contains 5 parameters: the horizontal and vertical focal lengths $(f_x, f_y)$, the skew factor $s$, and the principal point $(u_0, v_0)$. $(k_1, k_2, d_1, d_2)$ are the radial and tangential distortion parameters, respectively. The 7 marker points (5) on the probe target body (4) satisfy the projection relation:

$$\lambda^j \tilde{x}^j = K R X_t^j + K T, \quad j = 1, \ldots, 7 \qquad (1)$$

where $\lambda^j$ is the projective depth of marker point $j$ and $\tilde{x}^j$ denotes its homogeneous image coordinates.
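The projection relation (1) can be checked numerically. The sketch below uses illustrative intrinsic values (focal lengths, principal point) that are assumptions for the example, not values from the patent:

```python
import numpy as np

def project_marker(K, R, T, X_t):
    """Equation (1): lambda^j * x~^j = K R X_t^j + K T.
    Returns the homogeneous image point x~ and the projective depth lambda."""
    p = K @ (R @ X_t + T)
    lam = p[2]                       # projective depth lambda^j
    return p / lam, lam

# Illustrative camera: fx = fy = 800 px, principal point (320, 240), no skew
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # probe frame aligned with camera frame
T = np.array([0.0, 0.0, 1000.0])     # probe 1000 units in front of the camera
X_t = np.array([10.0, -5.0, 0.0])    # one marker point in the probe frame
x_tilde, lam = project_marker(K, R, T, X_t)
```

The distortion parameters $(k_1, k_2, d_1, d_2)$ would be applied to the normalized coordinates before the intrinsic mapping in a full model; they are omitted here for brevity.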
Step 22, calibration image acquisition and marker point identification: select a spatial positioning point according to the selection principle, fix the reference positioning block (1), and rotate the optical imaging probe about the measuring tip (2) while the camera acquires $I_1$ images in different poses, distributed over the camera's field of view as widely as possible; extract the sub-pixel center of each LED spot using threshold segmentation and ellipse fitting, and estimate the center of each marker point by the center-of-gravity method;
Step 23, using the invariance of the concentric-circle radii, establish an error function for the distance between each marker point and the measuring tip as the probe rotates about the tip (2). The 7 marker points rotate concentrically about the tip with radii $r_j$; because the positions of all marker points are accurately calibrated, each radius $r_j$ is constant throughout the rotation. For the $I_1$ images acquired during rotation, each marker point therefore satisfies the radius error equation:

$$\sum_{i=1}^{I_1} \left( r_j^i - r_j \right) = 0 \qquad (2)$$

where $r_j^i$ denotes the estimated rotation radius of the $j$th marker point ($j = 1, \ldots, 7$) in the $i$th image, and $r_j$ the calibrated value of its rotation radius;
Step 24, estimate initial values of the camera's intrinsic and extrinsic parameters: because of errors, equation (2) does not hold exactly, so the initial values of the intrinsic matrix $K$, rotation matrix $R$, and translation vector $T$ are solved by minimizing the radius error function of equation (3), and the initial distortion parameters are estimated by minimizing the back-projection error:

$$\sum_{j=1}^{7} \sum_{i=1}^{I_1} \left( r_j^i - r_j \right) \qquad (3)$$
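A sketch of the radius-error cost of equations (2)-(3), evaluated on synthetic concentric rotations. The tip position, radii, and rotation angles are assumed for illustration, and the absolute deviation is used as a simple nonnegative cost over the residuals:

```python
import numpy as np

def radius_cost(tip, centers, r_cal):
    """Radius error of equations (2)-(3): for each image i and marker j,
    the deviation of the reconstructed distance ||X_j^i - tip|| from the
    calibrated rotation radius r_j (absolute deviation, summed)."""
    r_est = np.linalg.norm(np.asarray(centers) - np.asarray(tip), axis=2)
    return float(np.abs(r_est - np.asarray(r_cal)).sum())

# Seven markers rotating concentrically about a tip at the origin,
# observed at two rotation angles (synthetic, noise-free data)
tip = np.zeros(3)
r_cal = np.arange(1.0, 8.0)                    # calibrated radii r_1..r_7
centers = np.array([[[r * np.cos(t), r * np.sin(t), 0.0] for r in r_cal]
                    for t in (0.3, 1.1)])      # shape (images, markers, 3)
```

For noise-free concentric data the cost is zero; minimizing it over the camera parameters that produce the reconstructed centers yields the initial values of $K$, $R$, and $T$.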
Step 25, further improve the calibration precision with a sparse bundle adjustment algorithm.
The step 3 comprises the following steps:
Step 31, acquire probe marker point images in the overlapping fields of view: reselect and position spatial positioning points (L1, L2, ...), adjust the pose of the optical imaging probe so that it lies within the overlapping field of view of every two or more cameras, rotate the probe, and let the cameras sharing that overlapping field simultaneously acquire stereo images of the probe marker points;
step 32, identifying and matching the mark points of the stereo image: extracting the center of a sub-pixel of each LED light spot by adopting a threshold segmentation and ellipse fitting algorithm, and estimating the center of each mark point by adopting a gravity center method; fast matching the mark points of the stereo image pair according to the mark point position constraint and the geometric moment invariance rule;
step 33, estimating the fundamental matrix and epipolar geometry:
Assume the two cameras C1 and C2 have an overlapping field of view. Without loss of generality, place the world origin at the center of camera C1, so the pose of C1 is $(R_1, T_1) = (I, 0)$ and the pose of C2 is $(R_2, T_2)$; the relative transform between the two cameras is $(R, T)$. The image coordinates $\tilde{x}_1^j, \tilde{x}_2^j$ of the same marker point $j$ acquired by the two cameras satisfy:

$$\lambda_2^j \tilde{x}_2^j = \lambda_1^j K_2 R_2 K_1^{-1} \tilde{x}_1^j + K_2 T_2 \qquad (4)$$
From this equation the unknown scale parameters $\lambda_1^j$ and $\lambda_2^j$ are estimated, yielding the constraint $\tilde{x}_2^{jT} F \tilde{x}_1^j = 0$, where $F = K_2^{-T} [T_2]_\times R_2 K_1^{-1}$ is the required fundamental matrix; given enough marker point pairs, its 9 parameters are solved linearly. Meanwhile, since the intrinsic matrices $K_1$ and $K_2$ of cameras C1 and C2 were calibrated in step 2, the normalized coordinates $\hat{x}_i^j = K_i^{-1} \tilde{x}_i^j$ ($i = 1, 2$) are used to estimate the epipolar constraint of the stereo images:

$$\hat{x}_2^{jT} E \hat{x}_1^j = 0 \qquad (5)$$

where $E = [T_2]_\times R_2$ is the essential matrix, which can be estimated with a 7-point algorithm;
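The epipolar constraint (5) can be verified with a synthetic point pair. The relative pose and point coordinates below are illustrative assumptions, not data from the patent:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential(R, T):
    """Essential matrix E = [T]_x R appearing in constraint (5)."""
    return skew(T) @ R

# Illustrative relative pose: small rotation about y, baseline along x
c, s = np.cos(0.2), np.sin(0.2)
R2 = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
T2 = np.array([1.0, 0.0, 0.0])
X = np.array([1.0, 2.0, 10.0])       # marker point in camera-1 coordinates
x1 = X / X[2]                        # normalized coordinates in camera 1
X2 = R2 @ X + T2
x2 = X2 / X2[2]                      # normalized coordinates in camera 2
residual = x2 @ essential(R2, T2) @ x1   # x2^T E x1, ~0 for a true match
```

The residual vanishes for any true correspondence, which is what makes the constraint usable for linear estimation of $E$ from matched marker points.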
Step 34, estimate the relative pose and scale factor between every two cameras:
Because the positions of the 7 marker points on the probe are known, the rotation matrix $R$ and translation vector $T$ between every two cameras are uniquely determined. For the scale factor, the normalized (up-to-scale) rotation radius $\tilde{r}^j$ of marker point $j$ about the measuring tip and its actual calibrated radius $r^j$ give:

$$\lambda^j = r^j / \tilde{r}^j \qquad (6)$$

Because of errors, the mean of the 7 per-point estimates, or the mean over $N$ measurements, is taken as the final scale factor $\lambda$:

$$\lambda = \sum_{j=1}^{7} \lambda^j / 7 \qquad (7)$$

$$\lambda = \sum_{i=1}^{N} \left( \sum_{j=1}^{7} \lambda^j / 7 \right) / N \qquad (8)$$
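The scale recovery of equations (6)-(7) is a ratio average; the sketch below uses synthetic radii (the values and function name are assumptions for illustration):

```python
import numpy as np

def scale_factor(r_cal, r_norm):
    """Equations (6)-(7): lambda^j = r^j / r~^j for each of the 7 markers,
    averaged to give the metric scale of the up-to-scale reconstruction."""
    lam = np.asarray(r_cal, dtype=float) / np.asarray(r_norm, dtype=float)
    return float(lam.mean())

# Synthetic case: the reconstruction is off by a uniform factor of 25
r_cal = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])  # calibrated r^j
r_norm = r_cal / 25.0                                         # reconstructed r~^j
lam = scale_factor(r_cal, r_norm)
```

Averaging over all 7 markers (and over $N$ measurements, as in equation (8)) suppresses per-point reconstruction noise in the recovered scale.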
the step 4 comprises the following steps:
Step 41, construct the visual graph of the multi-camera system according to graph theory:
Construct the visual graph by taking each camera as a node and every two cameras with overlapping fields of view as an edge; in the visual graph, the direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed latter camera;
Step 42, determine the reference camera and re-establish the connection relationships between the cameras:
In the visual graph, determine a reference camera, taking its relations with adjacent cameras into account when selecting it, and re-establish the connection relations among the cameras starting from the reference camera;
step 43, determining the rotation and translation amount of each camera in the reference camera coordinate system by adopting a shortest path method;
Solve the absolute pose from the reference camera to each camera along the shortest path. Let $C_i$, $C_j$, $C_k$ be consecutively connected cameras on a path of the visual graph; the pairwise calibration provides the transforms $(R_{ij}, T_{ij})$ from $C_i$ to $C_j$ and $(R_{jk}, T_{jk})$ from $C_j$ to $C_k$, and the transform from $C_i$ to $C_k$ is computed as:

$$R_{ik} = R_{ij} R_{jk}, \qquad T_{ik} = T_{ij} + R_{ij} T_{jk} \qquad (9)$$
If a path from the reference camera contains more than two nodes, equation (9) is applied repeatedly, completing the absolute calibration of all cameras relative to the reference camera.
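The chaining rule (9) combined with a fewest-hop graph search can be sketched as follows. The function names and the synthetic camera graph are assumptions; a weighted shortest path could be substituted without changing the composition rule:

```python
import numpy as np
from collections import deque

def pose_to_reference(edges, ref, target):
    """Compose (R, T) along a fewest-hop path from the reference camera to
    `target`, applying R_ik = R_ij R_jk and T_ik = T_ij + R_ij T_jk (eq. (9)).
    edges maps a calibrated pair (i, j) to its transform (R_ij, T_ij); the
    reverse edge uses the inverse transform. Assumes a connected graph."""
    graph = {}
    for (i, j), (R, T) in edges.items():
        graph.setdefault(i, []).append((j, R, T))
        graph.setdefault(j, []).append((i, R.T, -R.T @ T))  # inverse edge
    seen = {ref: (np.eye(3), np.zeros(3))}
    q = deque([ref])
    while q:                      # breadth-first search = minimum-hop path
        i = q.popleft()
        Ri, Ti = seen[i]
        for j, Rij, Tij in graph.get(i, []):
            if j not in seen:
                seen[j] = (Ri @ Rij, Ti + Ri @ Tij)   # equation (9)
                q.append(j)
    return seen[target]

# Chain C1 -> C2 -> C3 with pure translations of 100 units along x
I = np.eye(3)
edges = {(1, 2): (I, np.array([100.0, 0.0, 0.0])),
         (2, 3): (I, np.array([100.0, 0.0, 0.0]))}
R13, T13 = pose_to_reference(edges, ref=1, target=3)
```

Fewer hops means fewer composed transforms, which limits the accumulation of pairwise calibration error before the final bundle adjustment.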
The step 5 comprises the following steps:
step 51, obtaining an initial three-dimensional calibration point set
In steps 22 and 32, the image coordinates of the marker points at the different calibration positions are acquired and identified in all cameras; a sparse three-dimensional calibration point set is obtained by back-projection using all calibration parameters obtained in the previous four steps. Because only a simple connection relation among the cameras is used in this calibration, both the reconstructed three-dimensional point set and the calibration parameters still contain errors;
step 52, constructing a back projection error function
Suppose there are $n$ three-dimensional points and $m$ cameras, and the projection of the $i$th point on the $j$th camera image is $x_{ij}$. The parameters of each camera are represented by a vector $a_j$ and each three-dimensional point by a vector $b_i$; the function $Q(a_j, b_i)$ is the projection of point $b_i$ onto the image plane of camera $j$, and $d(x, y)$ is the Euclidean distance between image points $x$ and $y$. From the projective geometric relationship, the following back-projection error function is established:

$$\min_{a_j, b_i} \sum_{i=1}^{n} \sum_{j=1}^{m} d\left( Q(a_j, b_i), x_{ij} \right)^2 \qquad (10)$$
Step 53, minimize the back-projection error function with bundle adjustment to obtain an accurate calibration result.
The back-projection error function defines a nonlinear minimization problem over the parameter vector $P \in \mathbb{R}^M$, which is formed by the pose parameters of all cameras and the three-dimensional measurement point set $X \in \mathbb{R}^N$.
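The objective of equation (10) can be sketched as below. The camera parametrization and names are illustrative assumptions; a real implementation would also apply the distortion model and exploit the sparse Jacobian structure that sparse bundle adjustment relies on:

```python
import numpy as np

def backprojection_error(cams, points, obs):
    """Equation (10): sum over points i and cameras j of
    d(Q(a_j, b_i), x_ij)^2, the squared pixel distance between the
    projected 3-D point and its observed image coordinates."""
    err = 0.0
    for j, (K, R, T) in enumerate(cams):
        for i, X in enumerate(points):
            x_obs = obs.get((i, j))
            if x_obs is None:
                continue                 # point i not seen by camera j
            p = K @ (R @ X + T)
            q = p[:2] / p[2]             # Q(a_j, b_i): projection on image j
            err += float(np.sum((q - x_obs) ** 2))
    return err

# One camera observing one point exactly on its projection (synthetic)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
cams = [(K, np.eye(3), np.zeros(3))]
points = np.array([[0.0, 0.0, 10.0]])
obs = {(0, 0): np.array([320.0, 240.0])}   # exact observation -> zero error
```

A nonlinear least-squares solver minimizes this error jointly over all camera parameters $a_j$ and point positions $b_i$; the many zero entries of the Jacobian (each observation depends on one camera and one point) are what make the sparse variant efficient.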
To calibrate the intrinsic and extrinsic parameters of each camera and the relative pose of every two cameras, the invention uses an optical imaging probe as the calibration object. The probe carries 7 marker points with calibrated positions (each marker point consists of 6 LEDs whose center of gravity is taken as the marker center), and the camera parameters are recovered from the concentric rotation of the marker points on the probe target surface about the measuring tip, enabling fast calibration. Because the measurement space is large and each camera's field of view is limited, the invention models the multi-camera system as a visual graph: each camera is a node, each edge represents an overlapping field of view between two nodes (cameras), and each edge is directed, representing the pose transform from the former camera to the latter. The initial pose parameters of the whole system are then obtained on the visual graph by a shortest-path method. Finally, since noise and coordinate-transformation errors degrade the overall accuracy of the system, a sparse bundle adjustment algorithm jointly optimizes the acquired three-dimensional marker points and all initial calibration parameters, improving the precision of the calibration parameters. With this method, the overall calibration error is greatly reduced and the robustness of all parameters is markedly improved.
Drawings
Fig. 1 is a schematic diagram of the projection relationship between the optical imaging probe and the camera.
Reference numerals: 1. reference positioning block; 2. measuring tip; 3. extension bar; 4. probe target body; 5. the 7 marker points (each consisting of 6 LED spots). $X_W Y_W Z_W$ denotes the world coordinate system, $X_t Y_t Z_t$ the probe coordinate system, $xy$ the image coordinate system, and $uv$ the image pixel coordinate system; P1-P7 denote the 7 marker points on the optical imaging probe, and r1-r7 the distances from the 7 marker points to the measuring tip.
FIG. 2 is a schematic diagram of relative position calibration of two cameras with overlapping fields of view.
Wherein the reference numbers: C1-C8 represent 8 cameras, L1 and L2 represent the positions of positioning points, and R and T represent relative pose transformation relations.
Fig. 3 is a visual diagram construction diagram of a multi-camera system.
Wherein the reference numbers: C1-C8 represent 8 cameras.
Fig. 4 is a schematic diagram illustrating a change in connection relation for determining the front and rear visual images of the reference camera.
Wherein the reference numbers: C1-C8 represent 8 cameras.
Fig. 5 is a schematic diagram of global calibration under a reference camera coordinate system.
Wherein the reference numbers: C1-C8 represent 8 cameras, L1 and L2 represent the positions of positioning points, and R and T represent relative pose transformation relations.
Detailed Description
The invention will be further explained with reference to the drawings.
The following description takes a vision system of 8 cameras as an example. The invention uses an optical imaging probe as shown in Fig. 1, consisting mainly of a measuring tip (2), an extension bar (3), a target body (4), and 7 marker points (5) formed by LED spots. The distribution of the 8 cameras is shown in Fig. 3: the cameras are not all required to share an overlapping field of view, but each camera must share an overlapping field of view with at least one other. Spatial positioning points are selected according to the spatial distribution of the 8 cameras, and the rotation-invariant relation of the optical imaging probe about its measuring tip is used to complete the independent calibration of each camera and the relative pose calibration of every two cameras. Then, the connection relation of the multiple cameras is constructed from graph theory as a visual graph, and the calibration of all cameras in the reference camera coordinate system is completed. Finally, to eliminate the initial calibration errors, the three-dimensional geometric structure and all calibration parameters are globally optimized with a sparse bundle adjustment algorithm.
The following describes the steps in a specific order of calibration.
The first stage is as follows: and (4) independently calibrating intrinsic parameters and distortion parameters of the single camera.
The numbers of the cameras are C1-C8, as shown in FIG. 2.
Step 11, establishing a world coordinate system X according to the projection relation between the optical imaging measuring head and the camera shown in figure 1WYWZWCoordinate system X of measuring headtYtZtAn image coordinate scale coordinate system xy and a pixel coordinate system uv.
Step 12, determine the spatial position of each camera and select one or more spatial positioning points for it such that the optical imaging probe lies within the field of view of the camera being calibrated. For single-camera calibration, fix the reference positioning block (1) on the corresponding positioning point, rotate the probe about the measuring tip (2), and acquire $I_1$ images in different poses, distributed over the camera's field of view as widely as possible. Extract the sub-pixel center of each LED spot using threshold segmentation and ellipse fitting, and estimate the center coordinate ($i = 1, \ldots, I_1$; $j = 1, \ldots, 7$) of each marker point (each consisting of 6 LED spots) by the center-of-gravity method. The coordinates of the $N = 7$ marker points in the probe coordinate system are $X_t^j$ and their world coordinates are $X_W^j$; the projection relation of the camera is established according to equation (1).
And step 13, constructing a radius error equation (2) of each mark point according to the concentric circle radius invariant geometric constraint.
Step 14, minimize the radius error function of equation (3) to recover initial values of the intrinsic parameter matrix $K$, rotation matrix $R$, and translation vector $T$ of each camera, and estimate each camera's distortion parameters $(k_1, k_2, d_1, d_2)$ by minimizing the back-projection error.
Step 15, optimize the intrinsic and extrinsic parameters of each camera with a sparse bundle adjustment algorithm to improve the calibration precision.
And a second stage: and calibrating the relative pose of every two cameras with overlapped view fields.
Pairs are formed from the 8 cameras, ordered with the smaller camera number first.
Step 21: as shown in fig. 2, select and position suitable spatial positioning points (L1, L2, …) and adjust the pose of the optical imaging probe so that it lies within the overlapping fields of view of every two or more cameras; rotate the probe while the cameras whose fields of view overlap in that region simultaneously capture stereo images of the probe marker points.
Step 22: extract the sub-pixel center of each LED spot by threshold segmentation and ellipse fitting, and estimate the center of each marker point by the center-of-gravity method. Match the marker points across each stereo image pair rapidly using the marker-point position constraints and geometric moment invariance.
Step 23: estimate the unknown scale parameters of each camera pair according to formula (4), recover the fundamental matrix of each camera pair from the corresponding relationship, and restore the epipolar constraint of each pair according to formula (5).
Step 24: recover the rotation matrix R and translation vector T of each camera pair, and compute the scale factor λ according to formulas (7) and (8).
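The pairwise quantities of this stage are tied together by the standard two-view relation: with X2 = R X1 + T, the fundamental matrix is F = K2^{-T} [T]× R K1^{-1} and corresponding pixels satisfy x2ᵀ F x1 = 0 (the epipolar constraint). A small numpy sketch verifying this on one synthetic point; the intrinsics and test geometry are made-up values, not data from the patent.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental(K1, K2, R, T):
    """F = K2^{-T} [T]_x R K1^{-1}; corresponding pixels obey x2^T F x1 = 0,
    under the convention X2 = R X1 + T between the two camera frames."""
    return np.linalg.inv(K2).T @ skew(T) @ R @ np.linalg.inv(K1)

# synthetic check: project one 3-D point into both cameras
K1 = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
K2 = np.array([[820.0, 0, 330], [0, 820, 250], [0, 0, 1]])
a = np.deg2rad(5.0)
R = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
T = np.array([100.0, 5.0, 2.0])
X = np.array([50.0, -30.0, 600.0])
x1 = K1 @ X; x1 = x1 / x1[2]              # homogeneous pixel in camera 1
x2 = K2 @ (R @ X + T); x2 = x2 / x2[2]    # homogeneous pixel in camera 2
F = fundamental(K1, K2, R, T)
err = abs(x2 @ F @ x1)                    # epipolar residual, ~0
```

In practice F is estimated from matched marker points and then decomposed (via the essential matrix) into R and T up to scale, which is why the patent recovers the scale factor λ separately.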
Third stage: global calibration of the multi-camera system based on the visual graph.
Step 31: as shown in fig. 3, construct the visual graph of the multi-camera system with the 8 cameras as nodes and each pair of cameras with overlapping fields of view as an edge. The direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, giving the directed visual graph shown in fig. 4(a).
Step 32: within the visual graph, take C8 as the reference camera, readjust and re-establish the connection relationships between the cameras, and determine a new directed visual graph by the shortest-path method, as shown in fig. 4(b).
Step 33: determine the rotation and translation of each camera in the reference camera coordinate system according to formula (9), as shown in fig. 5. This completes the initial absolute calibration of all cameras relative to the reference camera.
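Chaining the pairwise transforms along the shortest path can be sketched with a breadth-first search over the visual graph, which naturally follows fewest-hop paths from the reference camera. This is an illustrative sketch under the convention X_i = R_ij X_j + T_ij; the function names are assumptions, not the patent's code.

```python
import numpy as np
from collections import deque

def absolute_poses(pairwise, ref):
    """pairwise: {(i, j): (R_ij, T_ij)} with X_i = R_ij X_j + T_ij.
    BFS from the reference camera follows fewest-hop paths; formula (9)
    R_ik = R_ij R_jk,  T_ik = T_ij + R_ij T_jk
    composes the transforms along each path."""
    adj = {}
    for (i, j), (R, T) in pairwise.items():
        adj.setdefault(i, []).append((j, R, T))
        adj.setdefault(j, []).append((i, R.T, -R.T @ T))  # reversed edge
    pose = {ref: (np.eye(3), np.zeros(3))}
    q = deque([ref])
    while q:
        i = q.popleft()
        R_ri, T_ri = pose[i]
        for j, R_ij, T_ij in adj.get(i, ()):
            if j not in pose:
                pose[j] = (R_ri @ R_ij, T_ri + R_ri @ T_ij)  # formula (9)
                q.append(j)
    return pose

def rot_z(deg):
    a = np.deg2rad(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a), np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# chain C0 -> C1 -> C2, with C0 standing in for the reference camera
pairwise = {(0, 1): (rot_z(30), np.array([1.0, 0, 0])),
            (1, 2): (rot_z(-10), np.array([0, 2.0, 0]))}
pose = absolute_poses(pairwise, 0)
```

A true shortest-path variant (e.g. Dijkstra weighted by pairwise calibration uncertainty) would only change how the next edge is chosen; the composition rule (9) is identical.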
Fourth stage: global optimization calibration based on sparse bundle adjustment.
Step 41: reconstruct a sparse three-dimensional calibration point set X by back-projection from the initial calibration parameters.
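The back-projection that produces the sparse 3-D point set can be illustrated with standard linear (DLT) triangulation of a matched image pair. The patent does not specify its exact triangulation routine, so this numpy sketch is one common choice, with made-up camera matrices in the self-check.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: intersect the back-projected rays of a
    matched image pair (x1, x2) under 3x4 projection matrices P1, P2.
    Each observation contributes two rows x*p3 - p1 = 0, y*p3 - p2 = 0."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)      # null vector of A = homogeneous X
    X = Vt[-1]
    return X[:3] / X[3]

# synthetic check: project a known point, then recover it
K = np.array([[700.0, 0, 320], [0, 700, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-200.0], [0.0], [0.0]])])
Xgt = np.array([30.0, -40.0, 1000.0, 1.0])
x1 = P1 @ Xgt; x1 = x1[:2] / x1[2]
x2 = P2 @ Xgt; x2 = x2[:2] / x2[2]
Xr = triangulate(P1, P2, x1, x2)     # ~ Xgt[:3]
```

With noise-free correspondences the SVD recovers the point to numerical precision; with real detections the result serves as the initial 3-D point set that bundle adjustment then refines.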
Step 42: establish the back-projection error minimization function (10) from all calibration data and the three-dimensional point set.
Step 43: minimize the back-projection error function by bundle adjustment to obtain an accurate calibration result.
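Formula (10)'s objective can be written down directly as a stacked residual vector; in practice a sparse bundle adjustment solver exploits the block structure of its Jacobian (each residual depends on one camera and one point). A minimal numpy sketch of the residual function; the camera parametrization and names are illustrative simplifications, and a real implementation would also include the distortion parameters.

```python
import numpy as np

def project(K, R, T, X):
    """Q(a_j, b_i): pinhole projection of 3-D point X into camera (K, R, T)."""
    x = K @ (R @ X + T)
    return x[:2] / x[2]

def reprojection_residuals(cams, pts, obs):
    """cams: list of (K, R, T); pts: list of 3-D points;
    obs: {(i, j): observed pixel x_ij}.  Returns the stacked residuals of
    formula (10); bundle adjustment minimizes the squared norm of this
    vector jointly over all camera parameters a_j and points b_i."""
    return np.concatenate([project(*cams[j], pts[i]) - x
                           for (i, j), x in obs.items()])

# at the ground truth the residual vector vanishes
K = np.array([[600.0, 0, 320], [0, 600, 240], [0, 0, 1]])
cams = [(K, np.eye(3), np.zeros(3)),
        (K, np.eye(3), np.array([-150.0, 0, 0]))]
pts = [np.array([10.0, 20.0, 800.0]), np.array([-50.0, 5.0, 900.0])]
obs = {(i, j): project(*cams[j], pts[i]) for i in range(2) for j in range(2)}
res = reprojection_residuals(cams, pts, obs)   # all zeros
```

Feeding this residual function to a sparse nonlinear least-squares solver (with a sparsity pattern over the camera and point blocks) is the essence of the sparse bundle adjustment step.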
Fifth stage: further optimize and estimate the calibration parameters until the accuracy requirement of three-dimensional measurement calibration is met.
The invention is characterized in that:
The calibration method uses an imaging probe with fixed optical marker points to calibrate the intrinsic and extrinsic parameters of the cameras and the relative poses between cameras; no other equipment is needed, and the method is simple and low-cost.
The calibration method does not require the optical imaging probe to be visible in all camera fields of view, only in the overlapping field of view of each camera pair; it therefore suits multi-camera systems with different spatial layouts and works for both small-scale and large-scale calibration volumes.
The visual graph and the bundle adjustment optimization adopted by the method markedly improve calibration efficiency, robustness and accuracy.

Claims (4)

1. A multi-camera system calibration method based on an optical imaging probe and a visual graph structure, characterized by comprising the following steps:

Step 1, hardware setup: a distributed arrangement of multiple cameras and one pre-calibrated optical imaging probe is used; the cameras as a whole are not required to share an overlapping field-of-view space, but at least every two cameras must have an overlapping field of view;

Step 2, determine the spatial position of each camera and select one or more spatial positioning points for each camera, chosen according to the principle that the optical imaging probe remains within the field of view of the camera being calibrated throughout single-camera and multi-camera calibration; in single-camera calibration, first mount the reference positioning block (1) at the corresponding positioning point, place the optical imaging probe within each camera's field of view and align it with the reference positioning block (1); rotate the optical imaging probe about its measuring tip (2), so that the seven marker points (5) on the probe target body (4) rotate concentrically about the tip (2), while the camera captures multiple images of the probe in different poses; each marker point (5) consists of six LEDs, and its center of gravity is the marker-point center; next, extract the center coordinates of each LED sub-pixel in every image by threshold segmentation and ellipse fitting, and compute the center coordinates of each marker point (5) by the center-of-gravity method; finally, recover the intrinsic parameter matrix and distortion parameters of each camera from the concentric rotation relationship of the marker points (5) and the calibrated invariant spatial distances;

Step 3, reselect and position spatial positioning points and adjust the pose of the optical imaging probe so that it lies within the overlapping field of view of every two or more cameras; within the overlapping field of view, rotate the probe while the cameras with overlapping fields of view simultaneously capture LED images of the probe marker points; through image processing, center identification and marker-point matching, and using the geometric constraints between the marker points, recover the fundamental matrix, essential matrix, epipolar geometry, rotation matrix and translation vector between each local camera pair;

Step 4, taking each camera as a node and each pair of cameras with overlapping fields of view as an edge, construct the visual graph of the multi-camera system; the direction of each edge is determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed camera; within the visual graph, first select a reference camera and, starting from it, establish the connection relationships among all cameras; compute the rotation matrix and translation vector of each camera relative to the reference camera by the shortest-path method, completing the global initial calibration of the multi-camera system;

Step 5, according to the global initial calibration result, back-project the marker points collected and identified in steps 2 and 3 into the reference camera coordinate system to build a three-dimensional calibration point set in the world coordinate system; optimize all calibration parameters and the three-dimensional calibration point set with a sparse bundle adjustment algorithm to obtain robust, high-accuracy calibration results;

Step 6, if the calibration result meets the measurement accuracy requirement, the calibration process ends; otherwise, increase the number of probe positioning points, capture images again, add the newly identified and reconstructed three-dimensional calibration points to the original point set, re-optimize all calibration parameters and the three-dimensional point set with the sparse bundle adjustment algorithm, and recalibrate all parameters.

2. The multi-camera system calibration method based on an optical imaging probe and a visual graph structure according to claim 1, characterized in that step 2 comprises the following steps:

Step 21, establish the camera pinhole projection model: first define the world coordinate system XWYWZW, the probe coordinate system XtYtZt, the metric image coordinate system xy and the pixel coordinate system uv; let the optical imaging probe carry N = 7 marker points, with X_t^j denoting the coordinates of marker point j in the probe coordinate system and X_w^j its coordinates in the world coordinate system; (R, T) are the rotation and translation of the camera relative to the world coordinate system; K is the camera intrinsic parameter matrix, containing five parameters: the horizontal and vertical focal lengths (fx, fy), the image skew factor s and the optical center coordinates (u0, v0); (k1, k2, d1, d2) are the radial and tangential distortion parameters; the seven marker points (5) on the probe target surface (4) satisfy the projection relation

λ_j x̃_j = K R X_t^j + K T,  j = 1, 2, …, 7    (1)

where λ_j is the projective depth of marker point j and x̃_j is its normalized image coordinate;

Step 22, calibration image capture and marker-point identification: following the positioning-point selection principle, choose a spatial positioning point and fix the reference positioning block (1); rotate the optical imaging probe about the measuring tip (2) and capture I1 images of different poses with the camera, distributed as widely as possible over the camera's field-of-view space; extract the sub-pixel center of each LED spot by threshold segmentation and ellipse fitting, and estimate the center of each marker point by the center-of-gravity method;

Step 23, using the invariance of the concentric-circle radii, establish the error function of each marker point's distance to the measuring tip during rotation: when the optical imaging probe rotates about the tip (2), the 7 marker points rotate concentrically about the tip (2) with radii r_j; since the positions of all marker points have been precisely calibrated, the radii r_j of the 7 concentric circles are constant and the distance from each marker point to the tip does not change during rotation; for the I1 images captured during rotation, each marker point yields the radius error equation

Σ_{i=1}^{I1} (r_j^i − r_j) = 0    (2)

where r_j^i is the rotation radius of marker point j (j = 1, …, 7) estimated at the i-th rotation and r_j is the calibrated value of that radius;

Step 24, estimate initial values of the camera's intrinsic and extrinsic parameters: because of errors, solve for the initial values of the intrinsic parameter matrix K, rotation matrix R and translation vector T by minimizing the radius error function

Σ_{j=1}^{7} Σ_{i=1}^{I1} (r_j^i − r_j)    (3)

and estimate initial values of the distortion parameters by minimizing the back-projection error;

Step 25, further improve the calibration accuracy with the sparse bundle adjustment algorithm.

3. The multi-camera system calibration method based on an optical imaging probe and a visual graph structure according to claim 1, characterized in that step 4 comprises the following steps:

Step 41, construct the visual graph of the multi-camera system according to graph theory: taking each camera as a node and each pair of cameras with overlapping fields of view as an edge, construct the visual graph; in the visual graph, the direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed camera;

Step 42, select the reference camera and re-establish the connections between cameras: choose one reference camera in the visual graph, taking its relationships with neighboring cameras into account; starting from the reference camera, readjust and establish the connection relationships between the cameras;

Step 43, determine the rotation and translation of each camera in the reference camera coordinate system by the shortest-path method: solve the absolute position of each camera from the reference camera along the shortest path; let C_i, C_j, C_k be mutually connected cameras on a path of the visual graph; from the pairwise calibration, the transformations from C_i to C_j and from C_j to C_k are (R_ij, T_ij) and (R_jk, T_jk), and the transformation from C_i to C_k is computed as

R_ik = R_ij R_jk,  T_ik = T_ij + R_ij T_jk    (9)

if the path from the reference camera contains more than two nodes, the above formula is applied repeatedly to complete the absolute calibration of all cameras relative to the reference camera.

4. The multi-camera system calibration method based on an optical imaging probe and a visual graph structure according to claim 2, characterized in that step 5 comprises the following steps:

Step 51, obtain the initial three-dimensional calibration point set: as in step 22, collect and identify the image coordinates of the marker points at the different calibration positions in all cameras; from all the calibration parameters obtained in the first four steps, back-project to obtain a sparse three-dimensional calibration point set; because the calibration uses only the simple connection relations between the cameras, both this reconstructed sparse point set and the calibration parameters contain errors;

Step 52, construct the back-projection error function: suppose there are n three-dimensional points and m cameras, and the projection of the i-th point on the j-th camera image is x_ij; the parameters of each camera are represented by a vector a_j and each three-dimensional point by a vector b_i; the function Q(a_j, b_i) defines the projection of a three-dimensional point onto the camera image plane, and d(x, y) is the Euclidean distance between image points x and y; from the projective geometric relations, the back-projection error function is

min_{a_j, b_i} Σ_{i=1}^{n} Σ_{j=1}^{m} d(Q(a_j, b_i), x_ij)²    (10)

Step 53, minimize the back-projection error function by bundle adjustment to obtain accurate calibration results.
CN2010105852616A 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure Active CN102034238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105852616A CN102034238B (en) 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure


Publications (2)

Publication Number Publication Date
CN102034238A CN102034238A (en) 2011-04-27
CN102034238B true CN102034238B (en) 2012-07-18

Family

ID=43887091


Country Status (1)

Country Link
CN (1) CN102034238B (en)

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8965057B2 (en) * 2012-03-02 2015-02-24 Qualcomm Incorporated Scene structure-based self-pose estimation
CN103377471B (en) * 2012-04-16 2016-08-03 株式会社理光 Object positioning method and device, optimum video camera are to determining method and apparatus
CN103196370B (en) * 2013-04-01 2015-05-27 北京理工大学 Measuring method and measuring device of conduit connector space pose parameters
CN103617622A (en) * 2013-12-10 2014-03-05 云南大学 Pose estimation orthogonal iterative optimization algorithm
CN104766291B (en) * 2014-01-02 2018-04-10 株式会社理光 Multiple cameras scaling method and system
CN103792950B (en) * 2014-01-06 2016-05-18 中国航空无线电电子研究所 A kind of method that uses the stereoscopic shooting optical parallax deviation correcting device based on piezoelectric ceramics to carry out error correction
CN105072414B (en) * 2015-08-19 2019-03-12 浙江宇视科技有限公司 A target detection and tracking method and system
CN105894505A (en) * 2016-03-30 2016-08-24 南京邮电大学 Quick pedestrian positioning method based on multi-camera geometrical constraint
US10037626B2 (en) * 2016-06-30 2018-07-31 Microsoft Technology Licensing, Llc Interaction with virtual objects based on determined restrictions
CN106060524B (en) * 2016-06-30 2017-12-29 北京邮电大学 The method to set up and device of a kind of video camera
CN106408614B (en) * 2016-09-27 2019-03-15 中国船舶工业系统工程研究院 Camera intrinsic parameter Calibration Method and system suitable for field application
CN106799732A (en) * 2016-12-07 2017-06-06 中国科学院自动化研究所 For the control system and its localization method of the motion of binocular head eye coordination
CN106780630A (en) * 2017-01-09 2017-05-31 上海商泰汽车信息系统有限公司 Demarcate panel assembly, vehicle-mounted camera scaling method and device, system
CN106843224B (en) * 2017-03-15 2020-03-10 广东工业大学 Method and device for cooperatively guiding transport vehicle through multi-view visual positioning
CN109813335B (en) * 2017-11-21 2021-02-09 武汉四维图新科技有限公司 Calibration method, device and system of data acquisition system and storage medium
CN109099883A (en) * 2018-06-15 2018-12-28 哈尔滨工业大学 The big visual field machine vision metrology of high-precision and caliberating device and method
CN108827156B (en) * 2018-08-24 2021-08-10 合肥工业大学 Industrial photogrammetry reference scale
CN110969662B (en) * 2018-09-28 2023-09-26 杭州海康威视数字技术股份有限公司 Fisheye camera internal parameter calibration method, device, calibration device controller and system
CN111308448B (en) * 2018-12-10 2022-12-06 杭州海康威视数字技术股份有限公司 External parameter determining method and device for image acquisition equipment and radar
CN109785373B (en) * 2019-01-22 2022-12-23 东北大学 A six-degree-of-freedom pose estimation system and method based on speckle
CN111862225A (en) * 2019-04-30 2020-10-30 罗伯特·博世有限公司 Image calibration method, calibration system and vehicle with the same
CN110176035B (en) * 2019-05-08 2021-09-28 深圳市易尚展示股份有限公司 Method and device for positioning mark point, computer equipment and storage medium
CN110310337B (en) * 2019-06-24 2022-09-06 西北工业大学 Multi-view light field imaging system full-parameter estimation method based on light field fundamental matrix
CN110782498B (en) * 2019-09-26 2022-03-15 北京航空航天大学 A Fast and Universal Calibration Method for Visual Sensor Networks
CN111127560B (en) * 2019-11-11 2022-05-03 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN110889901B (en) * 2019-11-19 2023-08-08 北京航空航天大学青岛研究院 BA optimization method for sparse point cloud in large scene based on distributed system
CN111243021A (en) * 2020-01-06 2020-06-05 武汉理工大学 Vehicle-mounted visual positioning method and system based on multiple combined cameras and storage medium
CN111598954A (en) * 2020-04-21 2020-08-28 哈尔滨拓博科技有限公司 Rapid high-precision camera parameter calculation method
CN111890354B (en) * 2020-06-29 2022-01-11 北京大学 Robot hand-eye calibration method, device and system
CN114581515B (en) * 2020-12-02 2024-08-20 中国科学院沈阳自动化研究所 Multi-camera calibration parameter optimization method based on optimal path conversion
CN112781496B (en) * 2021-01-20 2022-03-08 湘潭大学 Measuring head pose calibration method of non-contact measuring system
CN113077519B (en) * 2021-03-18 2022-12-09 中国电子科技集团公司第五十四研究所 An automatic calibration method of multi-camera extrinsic parameters based on human skeleton extraction
CN114283201A (en) * 2021-04-26 2022-04-05 阿波罗智联(北京)科技有限公司 Camera calibration method, device and roadside equipment
CN113920201A (en) * 2021-07-01 2022-01-11 桂林理工大学 Fisheye camera calibration method with epipolar geometric constraints
CN113963058B (en) * 2021-09-07 2022-11-29 于留青 On-line calibration method and device for CT (computed tomography) of preset track, electronic equipment and storage medium
CN114049324B (en) * 2021-11-15 2024-08-23 天津大学 Rapid calibration method for correlated reference telecentric measurement under super-view field scale
CN114705122B (en) * 2022-04-13 2023-05-05 成都飞机工业(集团)有限责任公司 Large-view-field stereoscopic vision calibration method
CN115115945B (en) * 2022-07-08 2025-09-23 中国科学院上海技术物理研究所 A ground calibration method for multi-camera field of view stitching
CN117422770A (en) * 2022-07-11 2024-01-19 华为技术有限公司 A calibration method and device
CN115272486B (en) * 2022-07-29 2025-08-26 天津内燃机研究所(天津摩托车技术中心) CCD camera imaging center calibration method and system based on direct optical method
CN115601418A (en) * 2022-10-17 2023-01-13 杭州海康威视系统技术有限公司(Cn) Point position adjustment auxiliary method and device and electronic equipment
CN115578694A (en) * 2022-11-18 2023-01-06 合肥英特灵达信息技术有限公司 Video analysis computing power scheduling method, system, electronic equipment and storage medium
CN116188602A (en) * 2023-04-26 2023-05-30 西北工业大学青岛研究院 High-precision calibration method for underwater multi-vision three-dimensional imaging system
CN117650844B (en) * 2024-01-29 2024-04-26 江西开放大学 VLC relay system safety performance optimization method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101231749A (en) * 2007-12-20 2008-07-30 昆山华恒工程技术中心有限公司 Method for calibrating industry robot
CN101285680A (en) * 2007-12-12 2008-10-15 中国海洋大学 Calibration Method of Extrinsic Parameters of Line Structured Light Probe
CN101334267A (en) * 2008-07-25 2008-12-31 西安交通大学 Method and Device for Vector Coordinate Transformation Calibration and Error Correction of Digital Video Measuring Probe
US7554575B2 (en) * 2005-10-28 2009-06-30 Seiko Epson Corporation Fast imaging system calibration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7961936B2 (en) * 2007-03-30 2011-06-14 Intel Corporation Non-overlap region based automatic global alignment for ring camera image mosaic

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7554575B2 (en) * 2005-10-28 2009-06-30 Seiko Epson Corporation Fast imaging system calibration
CN101285680A (en) * 2007-12-12 2008-10-15 中国海洋大学 Calibration Method of Extrinsic Parameters of Line Structured Light Probe
CN101231749A (en) * 2007-12-20 2008-07-30 昆山华恒工程技术中心有限公司 Method for calibrating industry robot
CN101334267A (en) * 2008-07-25 2008-12-31 西安交通大学 Method and Device for Vector Coordinate Transformation Calibration and Error Correction of Digital Video Measuring Probe

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Bent David Olsen et al. Calibrating a camera network using a domino grid. Pattern Recognition. 2001. *
Sun Jiaxing et al. Research on a dual-camera light-pen 3D coordinate measurement system. Metrology & Measurement Technology. 2009, Vol. 29, No. 1. *

Also Published As

Publication number Publication date
CN102034238A (en) 2011-04-27

Similar Documents

Publication Publication Date Title
CN102034238B (en) Multi-camera system calibrating method based on optical imaging probe and visual graph structure
CN102376089B (en) Target correction method and system
JP7300550B2 (en) METHOD AND APPARATUS FOR CONSTRUCTING SIGNS MAP BASED ON VISUAL SIGNS
CN103198524B (en) A kind of three-dimensional reconstruction method for large-scale outdoor scene
CN102155923B (en) Splicing measuring method and system based on three-dimensional target
CN101943563B (en) Rapid calibration method of line-structured light vision sensor based on space plane restriction
US8208029B2 (en) Method and system for calibrating camera with rectification homography of imaged parallelogram
CN109685855B (en) A camera calibration optimization method under the road cloud monitoring platform
CN109341668B (en) A Multi-Camera Measurement Method Based on Refraction Projection Model and Beam Tracing Method
CN110992487B (en) Fast 3D map reconstruction device and reconstruction method for handheld aircraft fuel tank
CN106228538A (en) Binocular vision indoor orientation method based on logo
CN111709985A (en) A method of underwater target ranging based on binocular vision
CN104732518A (en) PTAM improvement method based on ground characteristics of intelligent robot
CN105096386A (en) Method for automatically generating geographic maps for large-range complex urban environment
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
CN104835158A (en) 3D point cloud acquisition method based on Gray code structure light and polar constraints
CN101419709B (en) Plane target drone characteristic point automatic matching method for demarcating video camera
CN113963068B (en) Global calibration method for mirror image type single-camera omnidirectional stereoscopic vision sensor
CN101865656B (en) Method for accurately positioning position of multi-camera system by using small number of coplanar points
CN116563377B (en) Mars rock measurement method based on hemispherical projection model
CN106157322B (en) A method of camera installation position calibration based on plane mirror
CN104036542A (en) Spatial light clustering-based image surface feature point matching method
CN101354796A (en) 3D Reconstruction Method of Omnidirectional Stereo Vision Based on Taylor Series Model
Jafarzadeh et al. Crowddriven: A new challenging dataset for outdoor visual localization
CN103093460A (en) Moving camera virtual array calibration method based on parallel parallax

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: SUZHOU DIKA TESTING TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: XI AN JIAOTONG UNIV.

Effective date: 20150528

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 710049 XI AN, SHAANXI PROVINCE TO: 215505 SUZHOU, JIANGSU PROVINCE

TR01 Transfer of patent right

Effective date of registration: 20150528

Address after: 215505 Jiangsu province Suzhou Changshou City Lianfeng Road No. 58

Patentee after: Suzhou cartesan Testing Technology Co. Ltd

Address before: 710049 Xianning West Road, Shaanxi, China, No. 28, No.

Patentee before: Xi'an Jiaotong University

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20110427

Assignee: Xi'an like Photoelectric Technology Co., Ltd.

Assignor: Suzhou cartesan Testing Technology Co. Ltd

Contract record no.: 2015610000089

Denomination of invention: Multi-camera system calibrating method based on optical imaging test head and visual graph structure

Granted publication date: 20120718

License type: Exclusive License

Record date: 20150902

LICC Enforcement, change and cancellation of record of contracts on the licence for exploitation of a patent or utility model