
CN103727930A - Edge-matching-based relative pose calibration method of laser range finder and camera - Google Patents

Edge-matching-based relative pose calibration method of laser range finder and camera

Info

Publication number
CN103727930A
CN103727930A (application CN201310742582.6A)
Authority
CN
China
Prior art keywords
edge
point set
range finder
laser range
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310742582.6A
Other languages
Chinese (zh)
Other versions
CN103727930B (en)
Inventor
熊蓉
李千山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Iplus Tech Co ltd
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201310742582.6A priority Critical patent/CN103727930B/en
Publication of CN103727930A publication Critical patent/CN103727930A/en
Application granted granted Critical
Publication of CN103727930B publication Critical patent/CN103727930B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an edge-matching-based method for calibrating the relative pose of a laser range finder and a camera. The method first extracts the edge contour of the laser range finder's point cloud data and the edge contour of the camera image, and builds a probability distribution for the point cloud edge and a probability distribution for the image edge; it then minimizes the KL distance between the two distributions to obtain the relative pose parameters of the laser range finder and the camera. The method depends neither on a particular environment structure nor on auxiliary objects such as calibration boards; it can run online, updating the relative pose of the laser range finder and the camera in real time; and the extracted point cloud edge contours and image edge contours can further serve other applications such as environment object recognition and localization.

Description

Edge-matching-based method for calibrating the relative pose of a laser range finder and a camera
Technical field
The present invention relates to the field of multi-sensor information fusion, and in particular to an edge-matching-based method for calibrating the relative pose of a laser range finder and a camera.
Background art
Traditional methods for calibrating the relative pose of a laser range finder and a camera generally rely on a calibration board: corner points of the board are identified visually, constraints are established requiring those corner points to lie on the calibration board plane in space, and an error function is minimized to obtain the rotation and translation matrices representing the relative pose.
Some other methods dispense with the calibration board but still require a particular environment structure to establish geometric constraints.
All of these methods depend on specific objects, require special preparation, and are ill-suited to online operation. Against this background, the present invention proposes an edge-matching-based method for calibrating the relative pose of a laser range finder and a camera. The method extracts, from the laser data and from the image respectively, edge lines representing the environment's edge contours, and computes the relative pose of the laser range finder and the camera by minimizing the symmetric KL distance between the edge line distributions.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by providing an edge-matching-based method for calibrating the relative pose of a laser range finder and a camera.
The steps of the edge-matching-based method for calibrating the relative pose of a laser range finder and a camera are as follows:
1) acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while capturing an image of the same environment with the camera;
2) extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set P that represents the three-dimensional edges;
3) from the performance parameters and error model of the laser range finder, determine the probability distribution f of the three-dimensional edge point set;
4) extract the edge contour of the camera image, obtaining a pixel set Q that represents the two-dimensional edges;
5) from the performance parameters and error model of the camera, determine the probability distribution g of the two-dimensional edge pixels;
6) represent the relative pose of the laser range finder and the camera by a coordinate transformation matrix comprising a rotation matrix R and a translation matrix t, and project the three-dimensional edge point set P into the camera coordinate system, obtaining a two-dimensional edge point set P′;
7) from the probability distribution f of the three-dimensional edge point set and the projection relation, determine the probability distribution f′ of the two-dimensional edge point set P′;
8) compute the symmetric KL distance D between the two distributions, namely the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels; taking R and t as the parameters and the minimization of D as the optimization target, solve for the optimal relative pose transformation matrix T* of the laser range finder and the camera.
Step 2) is carried out as follows: a) for an unordered point cloud, search around each point for all nearest neighbors within a radius smaller than r, keeping at most k of them, obtaining a neighborhood point set N; fit a plane S to N; taking the in-plane projected positions of the points of N as independent variables and their distances to the plane S as function values, fit a bivariate quadratic function F and obtain its Hessian matrix H; compute the eigenvalues λ1 and λ2 of H, assuming λ1 ≥ λ2; if λ1 and λ2 satisfy their respective threshold conditions, with thresholds τ1 and τ2, the point is considered an edge point; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm.
The plane S is fitted to the point set N as follows: compute the mean of the point set N, obtaining the plane center c; compute the eigenvectors of the covariance matrix of N, the eigenvector corresponding to the smallest eigenvalue being the plane normal vector n; the center c and the normal vector n together define the plane S passing through c with normal vector n.
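The plane-fitting step above can be sketched in a few lines of numpy; the function name is illustrative, not from the patent.

```python
import numpy as np

def fit_plane(points):
    """Fit the plane S to an (n, 3) neighborhood point set N.

    Returns (center, normal): the centroid c of the points and, as the
    normal vector n, the eigenvector of the covariance matrix that
    corresponds to the smallest eigenvalue.
    """
    center = points.mean(axis=0)
    centered = points - center
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-eigenvalue eigenvector
    return center, normal
```

For points sampled from a nearly flat patch, `normal` recovers the patch orientation up to sign.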
The bivariate quadratic function F is fitted and its Hessian matrix H obtained as follows: for each point p in the point set N, let v1 and v2 be the eigenvectors corresponding to the other two eigenvalues of the covariance matrix of N; form a key-value pair whose key is the coordinate pair (u, v) of p along v1 and v2 and whose value is the distance d of p to the plane S; this finally yields a mapping from (u, v) keys to d values, from which the Hessian matrix H is determined by least squares, with F expressed as a quadratic function of (u, v).
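The least-squares quadratic fit and its Hessian can be sketched as below. The full quadratic form with linear and constant terms is an assumption, since the exact polynomial used by the patent is not spelled out in the text.

```python
import numpy as np

def local_hessian(uv, d):
    """Least-squares fit of d ~ a*u^2 + b*u*v + c*v^2 + e*u + f*v + g to
    the (u, v) -> d key-value mapping, returning the 2x2 Hessian
    H = [[2a, b], [b, 2c]] of the fitted surface F."""
    u, v = uv[:, 0], uv[:, 1]
    A = np.column_stack([u * u, u * v, v * v, u, v, np.ones_like(u)])
    coef, *_ = np.linalg.lstsq(A, d, rcond=None)
    a, b, c = coef[:3]
    return np.array([[2 * a, b], [b, 2 * c]])
```

The eigenvalues λ1 ≥ λ2 of the returned matrix are then compared against the thresholds τ1 and τ2 to decide whether the neighborhood's center point is an edge point.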
The probability distribution f of the three-dimensional edge point set in step 3) is

f(x) = (1/|P|) Σ_{p_i ∈ P} N(x; p_i, Σ_i),

where N denotes the Gaussian distribution and Σ_i is the uncertainty covariance matrix of the point p_i, which depends on the sensor performance parameters and error model.
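Read as an equally weighted mixture of Gaussians, the edge density can be evaluated directly. The helper below is an illustrative sketch; the per-point covariances are supplied by whatever sensor error model is in use.

```python
import numpy as np

def edge_density(x, points, covs):
    """Evaluate f(x): an equally weighted Gaussian mixture with one
    component N(x; p_i, Sigma_i) per edge point p_i."""
    k = len(x)
    total = 0.0
    for p, S in zip(points, covs):
        diff = np.asarray(x, float) - p
        norm = np.sqrt((2 * np.pi) ** k * np.linalg.det(S))
        total += np.exp(-0.5 * diff @ np.linalg.solve(S, diff)) / norm
    return total / len(points)
```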
The method of extracting the camera image edge contour in step 4) is the Canny algorithm.
The probability distribution g of the two-dimensional edge pixels in step 5) is

g(y) = (1/|Q|) Σ_{q_j ∈ Q} N(y; q_j, Σ_j),

where Σ_j is the uncertainty covariance matrix of the pixel q_j, which depends on the sensor performance parameters and error model.
In step 6), the three-dimensional edge point set P is projected into the camera coordinate system as follows: for any point p_i in the three-dimensional edge point set P, its projected point p′_i in the camera imaging plane is

p′_i = π(R p_i + t),

where R and t are the rotation matrix and the translation matrix respectively and π denotes the camera's perspective projection.
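Under a standard pinhole model, the projection of step 6) looks like the sketch below. The intrinsic matrix K is an assumption here, since the text only names the extrinsics R and t.

```python
import numpy as np

def project(p, R, t, K):
    """Project a 3-D edge point p into the image plane: transform into the
    camera frame with (R, t), apply the intrinsics K, divide by depth."""
    pc = R @ p + t           # camera-frame coordinates
    uvw = K @ pc             # homogeneous pixel coordinates
    return uvw[:2] / uvw[2]  # perspective division
```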
In step 7), the probability distribution f′ of the two-dimensional edge point set P′ is determined from the probability distribution f of the three-dimensional edge point set P and the projection relation as

f′(y) = (1/|P′|) Σ_{p′_i ∈ P′} N(y; p′_i, Σ′_i),

where Σ′_i = J_i Σ_i J_iᵀ, J_i being the Jacobian of the projection at p_i.
The symmetric KL distance D between the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels in step 8) is computed as

D(f′, g) = KL(f′ ∥ g) + KL(g ∥ f′).

Minimizing this symmetric KL distance yields the relative pose transformation matrix T*.
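There is no closed form for the KL distance between two Gaussian mixtures, so implementations typically discretize or sample. The sketch below computes the symmetric KL distance for two histograms over a common set of bins, which is one simple way to approximate the continuous quantity.

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetric KL distance D = KL(p || q) + KL(q || p) between two
    discrete distributions given as non-negative weight vectors over the
    same bins. eps guards against empty bins."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
```

Minimizing this quantity over the six pose parameters of (R, t), for example with a generic nonlinear optimizer, yields the calibration T*.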
Compared with the prior art, the present invention has the following beneficial effects:
1. It depends neither on a specific environment structure nor on auxiliary objects such as calibration boards;
2. It can operate online, updating the relative pose of the laser range finder and the camera in real time;
3. The extracted point cloud edge contours and image edge contours can further serve other applications such as environment object recognition and localization.
Brief description of the drawings
Fig. 1 is a schematic diagram of the operating steps of the edge-matching-based method for calibrating the relative pose of a laser range finder and a camera;
Fig. 2 shows the results of applying the edge-matching-based method for calibrating the relative pose of a laser range finder and a camera.
Embodiments
With the edge-matching-based method of the present invention, once calibration is complete, the point cloud collected by the laser range finder can be put into accurate correspondence with the image captured by the camera. On the one hand, the image can colorize the point cloud to obtain a colored point cloud, or supply color texture for a surface mesh whose vertices are the cloud points, yielding a textured surface model; on the other hand, the point cloud can indicate the depth of parts of the image region, supporting applications such as image-based recognition and localization.
During calibration, the laser range finder and the camera collect data simultaneously, with their observation ranges guaranteed to mostly coincide; the point cloud collected by the laser range finder and the image captured by the camera are then processed to obtain the relative pose of the laser range finder and the camera online.
As shown in Fig. 1, the steps of the edge-matching-based method for calibrating the relative pose of a laser range finder and a camera are as follows:
1) acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while capturing an image of the same environment with the camera;
2) extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set P that represents the three-dimensional edges;
3) from the performance parameters and error model of the laser range finder, determine the probability distribution f of the three-dimensional edge point set;
4) extract the edge contour of the camera image, obtaining a pixel set Q that represents the two-dimensional edges;
5) from the performance parameters and error model of the camera, determine the probability distribution g of the two-dimensional edge pixels;
6) represent the relative pose of the laser range finder and the camera by a coordinate transformation matrix comprising a rotation matrix R and a translation matrix t, and project the three-dimensional edge point set P into the camera coordinate system, obtaining a two-dimensional edge point set P′;
7) from the probability distribution f of the three-dimensional edge point set and the projection relation, determine the probability distribution f′ of the two-dimensional edge point set P′;
8) compute the symmetric KL distance D between the two distributions, namely the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels; taking R and t as the parameters and the minimization of D as the optimization target, solve for the optimal relative pose transformation matrix T* of the laser range finder and the camera.
Step 2) is carried out as follows: a) for an unordered point cloud, search around each point for all nearest neighbors within a radius smaller than r, keeping at most k of them, obtaining a neighborhood point set N; fit a plane S to N; taking the in-plane projected positions of the points of N as independent variables and their distances to the plane S as function values, fit a bivariate quadratic function F and obtain its Hessian matrix H; compute the eigenvalues λ1 and λ2 of H, assuming λ1 ≥ λ2; if λ1 and λ2 satisfy their respective threshold conditions, with thresholds τ1 and τ2, the point is considered an edge point; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm (Canny J. A computational approach to edge detection[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986 (6): 679-698).
The plane S is fitted to the point set N as follows: compute the mean of the point set N, obtaining the plane center c; compute the eigenvectors of the covariance matrix of N, the eigenvector corresponding to the smallest eigenvalue being the plane normal vector n; the center c and the normal vector n together define the plane S passing through c with normal vector n.
The bivariate quadratic function F is fitted and its Hessian matrix H obtained as follows: for each point p in the point set N, let v1 and v2 be the eigenvectors corresponding to the other two eigenvalues of the covariance matrix of N; form a key-value pair whose key is the coordinate pair (u, v) of p along v1 and v2 and whose value is the distance d of p to the plane S; this finally yields a mapping from (u, v) keys to d values, from which the Hessian matrix H is determined by least squares, with F expressed as a quadratic function of (u, v).
The probability distribution f of the three-dimensional edge point set in step 3) is

f(x) = (1/|P|) Σ_{p_i ∈ P} N(x; p_i, Σ_i),

where N denotes the Gaussian distribution and Σ_i is the uncertainty covariance matrix of the point p_i, which depends on the sensor performance parameters and error model (Bae K H, Belton D, Lichti D. A framework for position uncertainty of unorganised three-dimensional point clouds from near-monostatic laser scanners using covariance analysis[C]//Proceedings of the ISPRS Workshop "Laser Scanning", 2005).
The method of extracting the camera image edge contour in step 4) is the Canny algorithm (Canny J. A computational approach to edge detection[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986 (6): 679-698).
The probability distribution g of the two-dimensional edge pixels in step 5) is

g(y) = (1/|Q|) Σ_{q_j ∈ Q} N(y; q_j, Σ_j),

where Σ_j is the uncertainty covariance matrix of the pixel q_j, which depends on the sensor performance parameters and error model (De Santo M, Liguori C, Pietrosanto A. Uncertainty characterization in image-based measurements: a preliminary discussion[J]. IEEE Transactions on Instrumentation and Measurement, 2000, 49 (5): 1101-1107).
In step 6), the three-dimensional edge point set P is projected into the camera coordinate system as follows: for any point p_i in the three-dimensional edge point set P, its projected point p′_i in the camera imaging plane is

p′_i = π(R p_i + t),

where R and t are the rotation matrix and the translation matrix respectively and π denotes the camera's perspective projection.
In step 7), the probability distribution f′ of the two-dimensional edge point set P′ is determined from the probability distribution f of the three-dimensional edge point set P and the projection relation as

f′(y) = (1/|P′|) Σ_{p′_i ∈ P′} N(y; p′_i, Σ′_i),

where Σ′_i = J_i Σ_i J_iᵀ, J_i being the Jacobian of the projection at p_i.
The symmetric KL distance D between the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels in step 8) is computed as

D(f′, g) = KL(f′ ∥ g) + KL(g ∥ f′).

Minimizing this symmetric KL distance yields the relative pose transformation matrix T*.

Claims (10)

1. An edge-matching-based method for calibrating the relative pose of a laser range finder and a camera, characterized in that its steps are as follows:
1) acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while capturing an image of the same environment with the camera;
2) extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set P that represents the three-dimensional edges;
3) from the performance parameters and error model of the laser range finder, determine the probability distribution f of the three-dimensional edge point set;
4) extract the edge contour of the camera image, obtaining a pixel set Q that represents the two-dimensional edges;
5) from the performance parameters and error model of the camera, determine the probability distribution g of the two-dimensional edge pixels;
6) represent the relative pose of the laser range finder and the camera by a coordinate transformation matrix comprising a rotation matrix R and a translation matrix t, and project the three-dimensional edge point set P into the camera coordinate system, obtaining a two-dimensional edge point set P′;
7) from the probability distribution f of the three-dimensional edge point set and the projection relation, determine the probability distribution f′ of the two-dimensional edge point set P′;
8) compute the symmetric KL distance D between the two distributions, namely the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels; taking R and t as the parameters and the minimization of D as the optimization target, solve for the optimal relative pose transformation matrix T* of the laser range finder and the camera.
2. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that step 2) is carried out as follows: a) for an unordered point cloud, search around each point for all nearest neighbors within a radius smaller than r, keeping at most k of them, obtaining a neighborhood point set N; fit a plane S to N; taking the in-plane projected positions of the points of N as independent variables and their distances to the plane S as function values, fit a bivariate quadratic function F and obtain its Hessian matrix H; compute the eigenvalues λ1 and λ2 of H, assuming λ1 ≥ λ2; if λ1 and λ2 satisfy their respective threshold conditions, with thresholds τ1 and τ2, the point is considered an edge point; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm.
3. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 2, characterized in that the plane S is fitted to the point set N as follows: compute the mean of the point set N, obtaining the plane center c; compute the eigenvectors of the covariance matrix of N, the eigenvector corresponding to the smallest eigenvalue being the plane normal vector n; the center c and the normal vector n together define the plane S passing through c with normal vector n.
4. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 2, characterized in that the bivariate quadratic function F is fitted and its Hessian matrix H obtained as follows: for each point p in the point set N, let v1 and v2 be the eigenvectors corresponding to the other two eigenvalues of the covariance matrix of N; form a key-value pair whose key is the coordinate pair (u, v) of p along v1 and v2 and whose value is the distance d of p to the plane S; this finally yields a mapping from (u, v) keys to d values, from which the Hessian matrix H is determined by least squares, with F expressed as a quadratic function of (u, v).
5. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that the probability distribution f of the three-dimensional edge point set in step 3) is

f(x) = (1/|P|) Σ_{p_i ∈ P} N(x; p_i, Σ_i),

where N denotes the Gaussian distribution and Σ_i is the uncertainty covariance matrix of the point p_i, which depends on the sensor performance parameters and error model.
6. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that the method of extracting the camera image edge contour in step 4) is the Canny algorithm.
7. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that the probability distribution g of the two-dimensional edge pixels in step 5) is

g(y) = (1/|Q|) Σ_{q_j ∈ Q} N(y; q_j, Σ_j),

where Σ_j is the uncertainty covariance matrix of the pixel q_j, which depends on the sensor performance parameters and error model.
8. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that, in step 6), the three-dimensional edge point set P is projected into the camera coordinate system as follows: for any point p_i in the three-dimensional edge point set P, its projected point p′_i in the camera imaging plane is

p′_i = π(R p_i + t),

where R and t are the rotation matrix and the translation matrix respectively and π denotes the camera's perspective projection.
9. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that, in step 7), the probability distribution f′ of the two-dimensional edge point set P′ is determined from the probability distribution f of the three-dimensional edge point set P and the projection relation as

f′(y) = (1/|P′|) Σ_{p′_i ∈ P′} N(y; p′_i, Σ′_i),

where Σ′_i = J_i Σ_i J_iᵀ, J_i being the Jacobian of the projection at p_i.
10. The edge-matching-based method for calibrating the relative pose of a laser range finder and a camera according to claim 1, characterized in that the symmetric KL distance D between the probability distribution f′ of the two-dimensional edge point set P′ and the probability distribution g of the two-dimensional edge pixels in step 8) is computed as

D(f′, g) = KL(f′ ∥ g) + KL(g ∥ f′).

Minimizing this symmetric KL distance yields the relative pose transformation matrix T*.
CN201310742582.6A 2013-12-30 2013-12-30 Edge-matching-based relative pose calibration method of laser range finder and camera Active CN103727930B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310742582.6A CN103727930B (en) 2013-12-30 2013-12-30 Edge-matching-based relative pose calibration method of laser range finder and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310742582.6A CN103727930B (en) 2013-12-30 2013-12-30 Edge-matching-based relative pose calibration method of laser range finder and camera

Publications (2)

Publication Number Publication Date
CN103727930A true CN103727930A (en) 2014-04-16
CN103727930B CN103727930B (en) 2016-03-23

Family

ID=50452110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310742582.6A Active CN103727930B (en) 2013-12-30 2013-12-30 Edge-matching-based relative pose calibration method of laser range finder and camera

Country Status (1)

Country Link
CN (1) CN103727930B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050237385A1 (en) * 2003-05-29 2005-10-27 Olympus Corporation Stereo camera supporting apparatus, stereo camera supporting method, calibration detection apparatus, calibration correction apparatus, and stereo camera system
EP2573584A1 (en) * 2011-09-26 2013-03-27 Honeywell International Inc. Generic surface feature extraction from a set of range data
CN103257342A (en) * 2013-01-11 2013-08-21 大连理工大学 Joint Calibration Method of 3D Laser Sensor and 2D Laser Sensor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MASTIN, A. ET AL.: "Automatic registration of LIDAR and optical images of urban scenes", 《COMPUTER VISION AND PATTERN RECOGNITION》 *
YUE WANG, RONG XIONG, ET AL.: "Kullback-Leibler Divergence based Graph Pruning in Robotic Feature Mapping", 《MOBILE ROBOTS (ECMR)》 *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111071B (en) * 2014-07-10 2017-01-18 上海宇航系统工程研究所 High-precision position and posture calculation method based on laser ranging and camera vision fusion
CN104111071A (en) * 2014-07-10 2014-10-22 上海宇航系统工程研究所 High-precision position and posture calculation method based on laser ranging and camera vision fusion
CN104484887B (en) * 2015-01-19 2017-07-07 河北工业大学 External parameter calibration method for combined use of a video camera and a scanning laser range finder
CN104484887A (en) * 2015-01-19 2015-04-01 河北工业大学 External parameter calibration method for combined use of a camera and a two-dimensional laser range finder
CN106023198A (en) * 2016-05-16 2016-10-12 天津工业大学 Hessian matrix-based method for extracting aortic dissection from human thoracoabdominal CT images
CN106530345A (en) * 2016-11-07 2017-03-22 江西理工大学 Building three-dimensional laser point cloud feature extraction method based on assistance of three-dimensional laser scanning system/digital camera images
CN106530345B (en) * 2016-11-07 2018-12-25 江西理工大学 A building three-dimensional laser point cloud feature extraction method aided by images from the same platform
CN106931879A (en) * 2017-01-23 2017-07-07 成都通甲优博科技有限责任公司 A binocular error measurement method, apparatus and system
CN106931879B (en) * 2017-01-23 2020-01-21 成都通甲优博科技有限责任公司 Binocular error measurement method, device and system
WO2018195999A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
US10884110B2 (en) 2017-04-28 2021-01-05 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
US10436884B2 (en) 2017-04-28 2019-10-08 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
CN107607095A (en) * 2017-09-22 2018-01-19 义乌敦仁智能科技有限公司 A house measurement method based on vision and laser
CN109059902A (en) * 2018-09-07 2018-12-21 百度在线网络技术(北京)有限公司 Relative pose determination method, apparatus, device and medium
US11372101B2 (en) 2018-09-07 2022-06-28 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method and apparatus for determining relative pose, device and medium
CN109059902B (en) * 2018-09-07 2021-05-28 百度在线网络技术(北京)有限公司 Relative pose determination method, device, equipment and medium
CN109782811B (en) * 2019-02-02 2021-10-08 绥化学院 An automatic following control system and method for an unmanned model car
CN109782811A (en) * 2019-02-02 2019-05-21 绥化学院 An automatic following control system and method for an unmanned model car
CN110148185A (en) * 2019-05-22 2019-08-20 北京百度网讯科技有限公司 Method, apparatus, electronic device and storage medium for determining coordinate system conversion parameters
CN110148185B (en) * 2019-05-22 2022-04-15 北京百度网讯科技有限公司 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN110414558A (en) * 2019-06-24 2019-11-05 武汉大学 Feature point matching method based on event camera
CN114820772A (en) * 2019-07-15 2022-07-29 牧今科技 System and method for object detection based on image data
CN111912346A (en) * 2020-06-30 2020-11-10 成都飞机工业(集团)有限责任公司 Online nest hole detection method for robotic drilling and riveting systems on aircraft surfaces
CN112325767A (en) * 2020-10-16 2021-02-05 华中科技大学鄂州工业技术研究院 Spatial plane size measurement method integrating machine vision and flight time measurement
CN112325767B (en) * 2020-10-16 2022-07-26 华中科技大学鄂州工业技术研究院 A spatial plane dimension measurement method integrating machine vision and time-of-flight measurement
CN113465536A (en) * 2021-06-30 2021-10-01 皖江工学院 Camera-guided laser pan-tilt platform and working method thereof
CN113587829A (en) * 2021-09-03 2021-11-02 凌云光技术股份有限公司 Edge thickness measuring method and device, edge thickness measuring equipment and medium
CN113587829B (en) * 2021-09-03 2023-08-01 凌云光技术股份有限公司 Edge thickness measuring method and device, edge thickness measuring equipment and medium
CN116027269A (en) * 2023-03-29 2023-04-28 成都量芯集成科技有限公司 Plane scene positioning method

Also Published As

Publication number Publication date
CN103727930B (en) 2016-03-23

Similar Documents

Publication Publication Date Title
CN103727930A (en) Edge-matching-based relative pose calibration method of laser range finder and camera
CN110443836B (en) A method and device for automatic registration of point cloud data based on plane features
CN108107444B (en) A method for identifying foreign objects in substations based on laser data
CN109903313B (en) A Real-time Pose Tracking Method Based on 3D Model of Target
CN107093205B (en) A three-dimensional building window detection and reconstruction method based on UAV images
CN107063228B (en) Target attitude calculation method based on binocular vision
CN103411553B (en) A fast calibration method for multi-line structured light vision sensors
CN104748683B (en) An online automatic measurement apparatus and method for CNC machine tool workpieces
CN110910350B (en) Nut loosening detection method for wind power tower cylinder
Chen et al. Robust affine-invariant line matching for high resolution remote sensing images
CN103606170B (en) Streetscape image feature detection and matching based on color scale invariance
Urban et al. Finding a good feature detector-descriptor combination for the 2D keypoint-based registration of TLS point clouds
Cheng et al. Building boundary extraction from high resolution imagery and lidar data
CN110021029B (en) Real-time dynamic registration method and storage medium suitable for RGBD-SLAM
CN104809738A (en) Airbag overall dimension detection method based on binocular vision
CN108416385A (en) A simultaneous localization and mapping method based on an improved image matching strategy
CN111126116A (en) Method and system for identifying river garbage by unmanned boat
CN103136525A (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
Yuan et al. Combining maps and street level images for building height and facade estimation
CN109035207A (en) Degree-adaptive laser point cloud feature detection method
CN108182705A (en) A three-dimensional coordinate localization method based on machine vision
CN106886988A (en) A linear target detection method and system based on UAV remote sensing
CN103854290A (en) Extended target tracking method combining skeleton characteristic points and distribution field descriptors
Wang Automatic extraction of building outline from high resolution aerial imagery
Zheng et al. LiDAR point cloud registration based on improved ICP method and SIFT feature

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20180724

Address after: Room 208, Building 6, 1197 Binan Road, Binjiang District, Hangzhou, Zhejiang 310052

Patentee after: HANGZHOU IPLUS TECH CO.,LTD.

Address before: No. 38 Zheda Road, Xihu District, Hangzhou, Zhejiang 310027

Patentee before: Zhejiang University

TR01 Transfer of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A method for calibrating the relative pose between a laser rangefinder and a camera based on edge matching

Granted publication date: 20160323

Pledgee: Guotou Taikang Trust Co.,Ltd.

Pledgor: HANGZHOU IPLUS TECH CO.,LTD.

Registration number: Y2024330002319

PE01 Entry into force of the registration of the contract for pledge of patent right