
CN114371472B - Automatic combined calibration device and method for laser radar and camera - Google Patents


Info

Publication number
CN114371472B
CN114371472B (application CN202111539056.0A)
Authority
CN
China
Prior art keywords
pca
calibration plate
calibration
point cloud
frame data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111539056.0A
Other languages
Chinese (zh)
Other versions
CN114371472A
Inventor
龚方徽
乔宝华
程坤
詹兴样
王方瑞
董帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETHIK Group Ltd
Original Assignee
CETHIK Group Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by CETHIK Group Ltd
Priority to CN202111539056.0A
Publication of CN114371472A
Application granted
Publication of CN114371472B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses an automatic joint calibration device and method for a laser radar and a camera. The method comprises: controlling the movable track to move so that the rotatable bracket is located at a preset point, controlling the rotatable bracket to change its rotational attitude so that the moving part changes orientation, and acquiring image frame data and point cloud frame data at each orientation of the moving part until a preset quantity has been acquired; calibrating the camera's intrinsic parameters based on the acquired image frame data; extracting image calibration points from the image frame data to obtain the pixel coordinates of the four outermost vertexes of the calibration plate; extracting point cloud calibration points to obtain the final point cloud coordinates of the four outermost vertexes of the calibration plate; and solving the transformation between the pixel coordinates and the final point cloud coordinates of the four outermost vertexes of the calibration plate to obtain a rotation matrix and a translation matrix, completing the joint calibration of the laser radar and the camera. The invention realizes convenient, rapid and high-precision linked calibration of the laser radar and the camera.

Description

Automatic combined calibration device and method for laser radar and camera
Technical Field
The application belongs to the technical field of equipment joint calibration, and particularly relates to an automatic joint calibration device and method for a laser radar and a camera.
Background
In fields such as mobile robots, automatic driving, assisted driving and environment sensing, a single sensor can hardly meet the sensing requirements of a complex environment, so multi-sensor fusion algorithms have become mainstream: information collected by multiple sensors is fused so that their strengths complement each other. The precondition for improving the accuracy of a multi-sensor fusion algorithm is solving time synchronization and space synchronization between sensors, where space synchronization is the joint calibration between sensors. Joint calibration is divided into two parts: intrinsic calibration and extrinsic calibration. Intrinsic calibration determines the mapping relation inside a sensor; extrinsic calibration determines the coordinate transformation between sensors.
To calibrate different types of sensors, in particular to jointly calibrate a camera and a laser radar, the traditional method is based on a calibration plate (a checkerboard, an L-shaped calibration plate or a three-dimensional calibration box): the extrinsic matrix is solved by manually matching 3D feature points of the laser point cloud with 2D feature points of the camera image. During calibration the plate must be moved manually and the feature points selected by hand, so the operation is complex, the degree of manual participation is high, the process is time-consuming, and the precision is difficult to guarantee. To improve flexibility, calibration methods that do not require a calibration plate are widely used: based on the observed data, the correlation of intensity or edge features between the point cloud and the image data is used to find the extrinsic parameters. Such methods can calibrate in a natural scene, but the scene must be chosen carefully and must contain objects such as trees, telegraph poles and street lamps.
In the prior art, patent document CN111127563A proposes a joint calibration method that needs no calibration plate, only objects with corner points: a calibration target is arranged in the target acquisition area, the coordinates of all corner points in the image data and the point cloud data are calculated, and the calibration parameters are computed by coordinate matching. However, this method requires selecting a calibration scene that contains objects satisfying the feature conditions. Patent document CN111735479B proposes a device and method that use a mechanical arm to assist calibration and realize intelligent calibration: a sensing fusion frame carrying a camera and a laser radar is mounted on the mechanical arm, the center point of each of four calibration plates is obtained by data processing, and these center points are used as feature points to match and compute the extrinsic matrix. In that method the mechanical arm moves the calibration equipment between sampling points; since the positions of the calibration plates are fixed, the calibration range is limited, and mechanical arms are expensive.
In summary, conventional calibration methods using a calibration plate involve a high degree of manual participation: the plate must be moved by hand and the feature points selected manually or detected by an algorithm, so the calibration process is tedious, the workload is large, and the degree of automation is low. Using a mechanical arm to change the positions of the laser radar and camera while collecting calibration samples improves automation, but the arm's limited range of motion limits the calibration range and increases the equipment cost and calibration cost. Calibration methods that fuse an inertial navigation system must be carried out while in motion and are easily affected by lighting and scene conditions.
Disclosure of Invention
The application aims to provide an automatic combined calibration device and method for a laser radar and a camera, which can realize the linkage calibration of the laser radar and the camera conveniently, rapidly and accurately.
In order to achieve the above purpose, the technical scheme adopted by the application is as follows:
An automatic joint calibration device for a laser radar and a camera, used to realize joint calibration of the laser radar and the camera, the laser radar and the camera being mounted on the equipment to be calibrated, the device comprising: a movable track, a rotatable bracket, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, one of the equipment to be calibrated and the calibration plate being the moving part and the other the stationary part, wherein:
The moving part is fixed on the rotatable bracket, and the static part and the moving part are arranged oppositely and do not move relative to the ground;
The rotatable support is provided with a plurality of preset rotating postures, the rotatable support changes the rotating postures to drive the moving piece to change the direction, and the rotatable support is arranged on the movable track;
The movable track is provided with a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point position;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete joint calibration of the laser radar and the camera, and the industrial personal computer specifically executes the following operations:
S1, controlling the movable track to move so that the rotatable bracket is located at a preset point, controlling the rotatable bracket to change its rotational attitude so that the moving part changes orientation, acquiring image frame data and point cloud frame data through the laser radar and the camera at each orientation of the moving part, and repeating step S1 until the preset number of image frames and point cloud frames have been acquired;
S2, calibrating internal parameters of the camera based on the acquired image frame data;
S3, extracting the image calibration points: calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate based on the image frame data;
S4, extracting a point cloud calibration point:
S4.1, preprocessing point cloud frame data;
s4.2, using the RANSAC algorithm to fit the largest plane in the point cloud frame data as the plane of the calibration plate, and extracting the point cloud frame data lying in that plane;
s4.3, calculating the minimum bounding box of the point cloud frame data in the calibration plate plane, and taking the coordinates of the four vertexes of the minimum bounding box as the estimated point cloud coordinates of the four outermost vertexes of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain point cloud final coordinates;
s5, solving the transformation between the pixel coordinates and the final point cloud coordinates of the four outermost vertexes of the calibration plate to obtain a rotation matrix and a translation matrix, completing the joint calibration of the laser radar and the camera.
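Step S4.2 can be sketched as follows. This is a minimal NumPy illustration of RANSAC plane extraction, not the patent's exact implementation; the function name, iteration count and inlier threshold are assumptions:

```python
import numpy as np

def ransac_plane(points, n_iter=200, dist_thresh=0.02, seed=0):
    """Fit the dominant plane in an (N, 3) point cloud with RANSAC and
    return the inlier points (the calibration plate plane, step S4.2)."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        # plane hypothesis from 3 random points
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        nrm = np.linalg.norm(normal)
        if nrm < 1e-9:
            continue  # degenerate (collinear) sample
        normal /= nrm
        # point-to-plane distances for all points
        dist = np.abs((points - p0) @ normal)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[best_inliers]
```

The minimum bounding box of step S4.3 can then be computed on the returned inlier set, for example along the PCA directions obtained in step S4.4.1.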
The following provides several alternatives, but not as additional limitations to the above-described overall scheme, and only further additions or preferences, each of which may be individually combined for the above-described overall scheme, or may be combined among multiple alternatives, without technical or logical contradictions.
Preferably, the movable track has two preset movement paths, and the two movement paths form a cross; the rotatable support has three preset rotation postures, and the three rotation postures are forward, left rotation and right rotation, wherein the rotation angle of the left rotation and the right rotation is smaller than 30 degrees.
Preferably, the calculating, based on the image frame data, the pixel coordinates of the four vertices at the outermost periphery of the calibration plate includes:
Corner position information is extracted from the image frame data using a growth algorithm; the pixel coordinates of the four outermost vertexes of the calibration plate are calculated from the plate's design parameters and the corner position information, and the pixel coordinates of the four vertexes are stored in clockwise order starting from the upper left corner of the calibration plate.
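Assuming the inner corners have already been detected as a rows × cols grid by the growth algorithm, the four outermost board vertexes can be extrapolated from the grid spacing. This is a hedged NumPy sketch; the function name and the `border_squares` margin parameter are illustrative assumptions, not part of the patent:

```python
import numpy as np

def outer_vertices(corner_grid, border_squares=1.0):
    """Estimate the four outermost board vertexes from the detected
    inner-corner grid (rows x cols x 2 pixel coordinates), extrapolating
    along the grid axes. `border_squares` is the board margin expressed
    in checker-square units (an assumed board design parameter)."""
    g = np.asarray(corner_grid, dtype=float)
    # average per-square displacement along each grid axis
    dx = (g[:, -1] - g[:, 0]).mean(axis=0) / (g.shape[1] - 1)
    dy = (g[-1] - g[0]).mean(axis=0) / (g.shape[0] - 1)
    m = border_squares
    tl = g[0, 0] - m * dx - m * dy
    tr = g[0, -1] + m * dx - m * dy
    br = g[-1, -1] + m * dx + m * dy
    bl = g[-1, 0] - m * dx + m * dy
    # clockwise order starting from the upper left corner, as required above
    return np.array([tl, tr, br, bl])
```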
Preferably, the correcting the estimated coordinates of the point cloud to obtain final coordinates of the point cloud includes:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, obtaining the three principal component directions of the point cloud by PCA, denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the real length and width directions of the calibration plate;
S4.4.3, taking the adjusted y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction, calculating the coordinates of the four vertexes of the adjusted minimum bounding box as the final point cloud coordinates of the four outermost vertexes of the calibration plate, and storing them in clockwise order starting from the upper left corner of the calibration plate.
Preferably, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the real length and width directions of the calibration plate comprises:
S4.4.2.1, taking the estimated point cloud coordinates of the four outermost vertexes of the calibration plate, denoted V_1, V_2, V_3 and V_4;
S4.4.2.2, moving the origin of the coordinate frame formed by the three principal component directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertexes in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4;
S4.4.2.3, performing the following operations under each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, around the origin of the coordinate system, over the range [-30°, 30°] with a step of 2°;
(2) denoting the new directions obtained after the i-th rotation as y_pca^(i) and z_pca^(i), computing the plane P_i spanned by y_pca^(i) and z_pca^(i), and recording the rotation angle θ_i, where i is the rotation index, i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, computing the extent of the projection along the y_pca^(i) and z_pca^(i) directions as the calibration plate length l_i and width w_i, and counting the number of points Num_i in the rectangular area enclosed by l_i and w_i;
(4) sorting Num_i and taking i = I at which Num_i is maximum, together with the corresponding θ_I, l_I, w_I, y_pca^(I) and z_pca^(I);
S4.4.2.4, taking the four groups of data (θ_I, l_I, w_I, y_pca^(I), z_pca^(I)) obtained under the four coordinate systems C_1, C_2, C_3 and C_4, selecting the group whose l_I and w_I are closest to the real length and width of the calibration plate, and taking its y_pca^(I) and z_pca^(I) as the adjusted principal component directions y_pca and z_pca.
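The rotation search above can be sketched as follows. This simplified NumPy variant keeps the rotation whose axis-aligned bounding rectangle of the projected plate points is tightest (for a rectangular plate the tightest box aligns with its true edges); the per-vertex coordinate systems and the Num_i counting of the patent are omitted for brevity, and all names are assumptions:

```python
import numpy as np

def rotate_about(v, axis, theta):
    """Rodrigues rotation of vector v about a unit axis by theta radians."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * axis.dot(v) * (1.0 - np.cos(theta)))

def refine_axes(points, y_pca, z_pca, x_pca, span_deg=30, step_deg=2):
    """Rotate (y_pca, z_pca) about x_pca over [-span, span] in fixed steps
    and return the pair giving the tightest bounding rectangle of the
    projected calibration-plate points."""
    best_area, best_y, best_z = np.inf, y_pca, z_pca
    for deg in np.arange(-span_deg, span_deg + step_deg, step_deg):
        th = np.radians(deg)
        y_i = rotate_about(y_pca, x_pca, th)
        z_i = rotate_about(z_pca, x_pca, th)
        u, v = points @ y_i, points @ z_i          # projection onto plane P_i
        area = (u.max() - u.min()) * (v.max() - v.min())  # l_i * w_i
        if area < best_area:
            best_area, best_y, best_z = area, y_i, z_i
    return best_y, best_z
```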
The application also provides an automatic joint calibration method for a laser radar and a camera, used to realize joint calibration of the laser radar and the camera, the laser radar and the camera being mounted on the equipment to be calibrated. The method is realized based on a movable track, a rotatable bracket, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, one of the equipment to be calibrated and the calibration plate being the moving part and the other the stationary part, wherein:
The moving part is fixed on the rotatable bracket, and the static part and the moving part are arranged oppositely and do not move relative to the ground;
The rotatable support is provided with a plurality of preset rotating postures, the rotatable support changes the rotating postures to drive the moving piece to change the direction, and the rotatable support is arranged on the movable track;
The movable track is provided with a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point position;
The industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the joint calibration of the laser radar and the camera, and the automatic joint calibration method of the laser radar and the camera comprises the following steps:
S1, controlling the movable track to move so that the rotatable bracket is located at a preset point, controlling the rotatable bracket to change its rotational attitude so that the moving part changes orientation, acquiring image frame data and point cloud frame data through the laser radar and the camera at each orientation of the moving part, and repeating step S1 until the preset number of image frames and point cloud frames have been acquired;
S2, calibrating internal parameters of the camera based on the acquired image frame data;
S3, extracting the image calibration points: calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate based on the image frame data;
S4, extracting a point cloud calibration point:
S4.1, preprocessing point cloud frame data;
s4.2, using the RANSAC algorithm to fit the largest plane in the point cloud frame data as the plane of the calibration plate, and extracting the point cloud frame data lying in that plane;
s4.3, calculating the minimum bounding box of the point cloud frame data in the calibration plate plane, and taking the coordinates of the four vertexes of the minimum bounding box as the estimated point cloud coordinates of the four outermost vertexes of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain point cloud final coordinates;
s5, solving the transformation between the pixel coordinates and the final point cloud coordinates of the four outermost vertexes of the calibration plate to obtain a rotation matrix and a translation matrix, completing the joint calibration of the laser radar and the camera.
Preferably, the movable track has two preset movement paths, and the two movement paths form a cross; the rotatable support has three preset rotation postures, and the three rotation postures are forward, left rotation and right rotation, wherein the rotation angle of the left rotation and the right rotation is smaller than 30 degrees.
Preferably, the calculating, based on the image frame data, the pixel coordinates of the four vertices at the outermost periphery of the calibration plate includes:
Corner position information is extracted from the image frame data using a growth algorithm; the pixel coordinates of the four outermost vertexes of the calibration plate are calculated from the plate's design parameters and the corner position information, and the pixel coordinates of the four vertexes are stored in clockwise order starting from the upper left corner of the calibration plate.
Preferably, the correcting the estimated coordinates of the point cloud to obtain final coordinates of the point cloud includes:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, obtaining the three principal component directions of the point cloud by PCA, denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the real length and width directions of the calibration plate;
S4.4.3, taking the adjusted y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction, calculating the coordinates of the four vertexes of the adjusted minimum bounding box as the final point cloud coordinates of the four outermost vertexes of the calibration plate, and storing them in clockwise order starting from the upper left corner of the calibration plate.
Preferably, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the real length and width directions of the calibration plate comprises:
S4.4.2.1, taking the estimated point cloud coordinates of the four outermost vertexes of the calibration plate, denoted V_1, V_2, V_3 and V_4;
S4.4.2.2, moving the origin of the coordinate frame formed by the three principal component directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertexes in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4;
S4.4.2.3, performing the following operations under each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, around the origin of the coordinate system, over the range [-30°, 30°] with a step of 2°;
(2) denoting the new directions obtained after the i-th rotation as y_pca^(i) and z_pca^(i), computing the plane P_i spanned by y_pca^(i) and z_pca^(i), and recording the rotation angle θ_i, where i is the rotation index, i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, computing the extent of the projection along the y_pca^(i) and z_pca^(i) directions as the calibration plate length l_i and width w_i, and counting the number of points Num_i in the rectangular area enclosed by l_i and w_i;
(4) sorting Num_i and taking i = I at which Num_i is maximum, together with the corresponding θ_I, l_I, w_I, y_pca^(I) and z_pca^(I);
S4.4.2.4, taking the four groups of data (θ_I, l_I, w_I, y_pca^(I), z_pca^(I)) obtained under the four coordinate systems C_1, C_2, C_3 and C_4, selecting the group whose l_I and w_I are closest to the real length and width of the calibration plate, and taking its y_pca^(I) and z_pca^(I) as the adjusted principal component directions y_pca and z_pca.
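Step S5 matches 2D pixel coordinates with 3D point cloud coordinates, which in practice is a perspective-n-point (PnP) solve using the intrinsics calibrated in S2. As an illustrative building block, once corresponding 3D vertex coordinates are available in both sensor frames, the rotation matrix R and translation t follow from a least-squares rigid alignment (the Kabsch algorithm); a NumPy sketch with assumed names:

```python
import numpy as np

def solve_rt(src, dst):
    """Least-squares rigid transform (Kabsch): find R, t with dst ≈ R @ src + t.
    src, dst: (N, 3) corresponding points in the two sensor frames."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # D corrects a possible reflection so that R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```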
Compared with the prior art, the automatic combined calibration device and method for the laser radar and the camera have the following beneficial effects:
1. The operation process is fully automatic, convenient and fast; the calibration process is not affected by environmental or human factors, and intelligent, automated, batch calibration can be realized.
2. Compared with a mechanical arm, the track-based calibration device is inexpensive.
3. Automatically moving the calibration plate, rather than the equipment to be calibrated, gives a larger calibration range, helps collect information at greater distances, and improves calibration precision.
4. The vertexes extracted from the laser point cloud of the calibration plate are fine-tuned, making the vertex coordinates more accurate and improving calibration precision.
Drawings
FIG. 1 is a schematic diagram of an automated joint calibration device for a lidar and a camera according to the present application;
FIG. 2 is a schematic rotation of a rotatable support of the present application;
FIG. 3 is a schematic view of a calibration plate of the present application;
FIG. 4 is a flow chart of an automated joint calibration method of the lidar and camera of the present application.
In the drawings: 1. an industrial personal computer; 2. equipment to be calibrated; 3. a laser radar; 4. a camera; 5. a movable rail; 6. a rotatable support; 7. and (5) calibrating the plate.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It will be understood that when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present; when an element is referred to as being "fixed" to another element, it can be directly fixed to the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In one embodiment, an automatic joint calibration device for a laser radar and a camera is provided, which realizes joint calibration of the laser radar and the camera and relates to multi-sensor fusion calibration technology. It addresses the tedious process, heavy workload and low degree of automation of conventional multi-sensor joint calibration, conveniently and rapidly realizing fully automatic joint calibration of the laser radar and the camera; the device is low-cost and suitable for mass production.
In this embodiment, the laser radar 3 and the camera 4 are mounted on the equipment 2 to be calibrated such that the fields of view of the camera 4 and the laser radar 3 overlap. The automatic joint calibration device for the laser radar 3 and the camera 4 of this embodiment comprises: a movable track 5, a rotatable bracket 6, a calibration plate 7 and an industrial personal computer 1, wherein the calibration plate 7 is a checkerboard calibration plate, and one of the equipment 2 to be calibrated and the calibration plate 7 is the moving part while the other is the stationary part.
In this embodiment, the moving member is fixed on the rotatable support, and the stationary member and the moving member are disposed opposite to each other and have no relative movement with respect to the ground. It is easy to understand that the arrangement of the stationary member and the moving member in the present embodiment is not limited to the face-to-face arrangement of the stationary member and the moving member, and a certain angle may exist between the stationary member and the moving member on the premise of ensuring that data can be effectively collected.
As shown in fig. 1, the purpose of providing a moving part and a stationary part in this embodiment is to change the relative pose of the equipment to be calibrated and the calibration plate. Changing the pose of the calibration plate via the movable track is one embodiment; changing the pose of the equipment to be calibrated via the movable track (without changing the rigid-body relationship between the camera and the laser radar on that equipment) is another embodiment of the application. For ease of description, this embodiment takes the calibration plate as the moving part and the equipment to be calibrated as the stationary part.
In this embodiment, the rotatable support 6 has a plurality of preset rotation postures, the rotatable support 6 changes the rotation postures to drive the moving member to change the orientation, and the rotatable support 6 is mounted on the movable rail 5.
In this embodiment, the movable track 5 has a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable bracket to change the point position.
In this embodiment, multiple points are provided by setting multiple motion paths, and multiple shooting angles are provided by setting multiple rotational attitudes; combining the two yields many different point-and-orientation combinations and thus many viewing angles, which makes it convenient to collect a sufficient amount of data.
In this embodiment, the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable bracket, and is used for issuing a calibration instruction to complete the joint calibration of the laser radar and the camera.
The industrial control machine specifically performs the following operations:
S1, controlling the movable track to move so that the rotatable support is located at a preset point, and controlling the rotatable support to change its rotation posture so that the moving member changes orientation; at each orientation of the moving member, acquiring image frame data and point cloud frame data through the laser radar and the camera, and repeating step S1 until a preset number of image frame data and point cloud frame data have been acquired.
The movable track of this embodiment has two preset movement paths that together form a cross, so the track supports movement in four directions: forward, backward, left and right. As shown in fig. 2, the rotatable support has three preset rotation postures: facing forward, rotated left and rotated right; that is, the support supports both clockwise and counterclockwise rotation. The left/right rotation angle is kept below 30°, because an excessive rotation angle may take the calibration plate out of the field of view (no complete calibration plate in the image, or no calibration plate data in the point cloud frame).
One way to acquire data based on this embodiment is as follows: the track drives the support, and thus the calibration plate, to each of five points, namely the track origin and one point forward, backward, left and right of it, with a spacing of 0.5-1.5 meters between points; at each point the support rotates through the three poses (forward, rotated left, rotated right), where the rotation angle may be fixed, for example at 20°. In the initial calibration condition, the calibration plate 7 faces forward at the origin, and the camera 4 and the laser radar 5 are in front of the calibration plate 7. Each time the calibration plate 7 changes pose, the laser radar 5 and the camera 4 acquire a point cloud frame and an image, so the data acquisition process in this example covers 15 poses in total.
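The 15-pose schedule described above (five track points, three bracket orientations each) can be sketched as follows. The 1.0 m point spacing and the fixed 20° yaw are illustrative values within the ranges given in the text, and all names are hypothetical:

```python
from itertools import product

# Hypothetical pose schedule matching the example in the text: five
# track points (origin plus one point forward, backward, left, right)
# spaced 1.0 m apart, and three bracket yaw angles (forward, rotated
# left, rotated right) with a fixed 20-degree rotation (< 30 degrees).
TRACK_POINTS = {
    "origin": (0.0, 0.0),
    "forward": (1.0, 0.0),
    "backward": (-1.0, 0.0),
    "left": (0.0, 1.0),
    "right": (0.0, -1.0),
}
YAW_DEG = {"forward": 0.0, "left": 20.0, "right": -20.0}

def pose_schedule():
    """Return the list of (point_name, xy, yaw_deg) capture poses."""
    return [(p, TRACK_POINTS[p], YAW_DEG[y])
            for p, y in product(TRACK_POINTS, YAW_DEG)]
```

The industrial personal computer would iterate over `pose_schedule()`, commanding the track and support and triggering one image plus one point cloud frame per pose.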
In this embodiment, the subsequent calibration calculation is performed automatically after data acquisition, so the automatic joint calibration device achieves fully automatic calibration without manual participation. The track shape, the positions, order and number of the movable points, and the rotation angle of the support are all flexible and variable; the calibration process is not affected by environmental or human factors, enabling intelligent, automated and batch calibration.
S2, calibrating internal parameters of the camera based on the acquired image frame data.
Camera internal parameter calibration is performed using the image frame data of all poses. Each image frame contains the planar calibration plate; as shown in fig. 3, the calibration plate carries a checkerboard pattern of alternating black and white cells. The corner information at the black-white intersections is extracted, and the camera internal parameters are calibrated by Zhang's calibration method.
S3, extracting the image calibration points: calculating the pixel coordinates of the four outermost vertices of the calibration plate based on the image frame data, which comprises:
extracting corner position information using a growth algorithm based on the image frame data, calculating the pixel coordinates of the four outermost vertices of the calibration plate from the calibration plate customization parameters and the corner position information, and storing the four vertex pixel coordinates sequentially in clockwise order starting from the upper-left corner of the calibration plate.
It is easy to understand that, to improve the accuracy of the pixel coordinate calculation, the image frame data may be de-distorted before processing: the camera internal parameters are used to de-distort the image frame data of each pose, and the four vertex coordinates of the calibration plate are then calculated on the de-distorted image frame data.
In this embodiment, the calibration plate customization parameters used when calculating the vertex pixel coordinates include: the number of cells in the length and width directions of the calibration plate (e.g., 7×5), the size of each cell (e.g., square cells with a side length of 10 cm), and the distance from the edge cells to the edge of the plate. On this basis, the pixel coordinates of the four outermost vertices of the calibration plate can be calculated from the extracted corner positions of the black-white intersection points.
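Given the grid of detected inner corners and these customization parameters, the outer board vertices can be extrapolated in pixel space. The sketch below is a minimal illustration, assuming the image has been de-distorted and the projection is locally affine near each corner; the function name and argument layout are assumptions, not from the patent:

```python
import numpy as np

def board_vertices_from_corners(corners, cell_m, margin_m):
    """
    Estimate pixel coordinates of the four outermost physical board
    vertices from the grid of detected inner (black/white) corners.

    corners : (rows, cols, 2) array of inner-corner pixel coordinates,
              ordered row-major from the top-left of the board.
    cell_m  : side length of one checker cell (metres).
    margin_m: distance from the edge cells to the physical board edge.
    """
    # Steps (in units of one cell) from an edge corner out to the edge.
    k = 1.0 + margin_m / cell_m

    def extrapolate(corner, along_row, along_col):
        # Walk outward along both local grid directions.
        return corner + k * along_row + k * along_col

    tl = extrapolate(corners[0, 0],  corners[0, 0] - corners[0, 1],
                                     corners[0, 0] - corners[1, 0])
    tr = extrapolate(corners[0, -1], corners[0, -1] - corners[0, -2],
                                     corners[0, -1] - corners[1, -1])
    br = extrapolate(corners[-1, -1], corners[-1, -1] - corners[-1, -2],
                                      corners[-1, -1] - corners[-2, -1])
    bl = extrapolate(corners[-1, 0], corners[-1, 0] - corners[-1, 1],
                                     corners[-1, 0] - corners[-2, 0])
    # Clockwise from the top-left, as the text stores them.
    return np.array([tl, tr, br, bl])
```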
S4, extracting a point cloud calibration point:
S4.1, preprocessing the point cloud frame data; in this embodiment, preprocessing uses algorithms such as point cloud filtering to remove outliers from the point cloud data.
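One common form of such point cloud filtering is a statistical outlier filter based on mean nearest-neighbour distance. The patent does not specify the exact filter, so the following is a minimal NumPy sketch under that assumption; a brute-force distance computation is acceptable for a single calibration-plate frame:

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """
    Keep a point if its mean distance to its k nearest neighbours is
    within std_ratio standard deviations of the average such distance
    over the whole cloud.  Brute-force O(n^2) pairwise distances.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance 0
    thresh = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= thresh]
```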
S4.2, analyzing and processing the preprocessed point cloud frame data of each pose to obtain the calibration plate vertex positions as calibration feature points. In this embodiment, the RANSAC algorithm is used to fit the largest plane in the point cloud frame data as the calibration plate plane, and the point cloud frame data lying in that plane are extracted; at the same time, points far from the plane are deleted using a point-to-plane distance threshold (for example, 2 cm).
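A minimal NumPy sketch of the RANSAC plane fit used here, with the 2 cm point-to-plane threshold from the text; the function name and iteration count are illustrative assumptions. It returns the plane as (normal n, offset d) with n·x + d = 0 plus the inlier mask:

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.02, iters=200, rng=None):
    """Fit the dominant plane by RANSAC; inliers lie within dist_thresh."""
    rng = np.random.default_rng(rng)
    best_mask, best_model = None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:            # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n.dot(p0)
        mask = np.abs(points @ n + d) < dist_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

In practice a library routine (e.g. a plane-segmentation call in a point cloud toolkit) would serve the same purpose; the sketch only shows the mechanism.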
S4.3, rough calculation of the calibration plate vertices. The extracted calibration plate plane point cloud is processed to obtain the plate vertex positions as calibration feature points. Specifically, the minimum bounding box of the point cloud frame data in the calibration plate plane is calculated (i.e., the coordinates of its four vertices), and these four vertex coordinates are taken as the point cloud estimated coordinates of the four outermost vertices of the calibration plate.
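The rough vertex estimate can be sketched by projecting the plate points onto two in-plane axes and taking the corners of the bounding rectangle. This is a minimal illustration under the assumption that any orthonormal in-plane basis is acceptable at this rough stage (the directions are refined in S4.4):

```python
import numpy as np

def plane_bbox_corners(points, u, v):
    """
    Project the plate points onto two orthonormal in-plane axes u, v,
    take the axis-aligned bounding rectangle in that frame, and return
    its four 3-D corners in the cyclic order
    (min,min), (max,min), (max,max), (min,max).
    """
    c = points.mean(axis=0)
    pu = (points - c) @ u
    pv = (points - c) @ v
    lo_u, hi_u, lo_v, hi_v = pu.min(), pu.max(), pv.min(), pv.max()
    return np.array([c + a * u + b * v
                     for a, b in ((lo_u, lo_v), (hi_u, lo_v),
                                  (hi_u, hi_v), (lo_u, hi_v))])
```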
S4.4, correcting the point cloud estimated coordinates to obtain point cloud final coordinates, wherein the method comprises the following steps:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, acquire the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca; these three directions form the axes of a coordinate frame for the point cloud and are respectively close to the length direction, width direction and normal direction of the calibration plate.
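Step S4.4.1 can be sketched with an eigen-decomposition of the point covariance. A minimal NumPy illustration, where the mapping of eigenvalue order to y_pca, z_pca, x_pca follows the description above (largest spread is the length direction, smallest is the plate normal):

```python
import numpy as np

def pca_directions(points):
    """
    Three principal component directions of the plate cloud.  For a
    flat rectangular plate, the eigenvector with the largest eigenvalue
    approximates the length direction (y_pca), the second the width
    direction (z_pca), and the smallest the plate normal (x_pca).
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    x_pca, z_pca, y_pca = eigvecs.T          # smallest .. largest
    return y_pca, z_pca, x_pca
```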
S4.4.2, because the point cloud is non-uniform, the y_pca and z_pca obtained in the step above do not exactly match the true length and width directions of the calibration plate and need fine adjustment. The adjustment strategy is as follows: fine-tune the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate, including:
S4.4.2.1, take the point cloud estimated coordinates of the four outermost vertices of the calibration plate, denoted V_1, V_2, V_3 and V_4.
S4.4.2.2, move the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the point cloud estimated coordinates of the four vertices in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4.
S4.4.2.3, perform the following operations under each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) The principal component directions y_pca and z_pca are rotated about the principal component direction x_pca, based at the origin of the coordinate system, over a span of [-30°, 30°] with a stride of 2°; the span and stride of this embodiment are preferred values of the present application and should not be construed as limiting its scope, and the stride may also be, for example, 3° or 4°.
(2) Denote the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, and compute the plane P_i spanned by y_pca^i and z_pca^i; at the same time record the rotation angle θ_i, where i is the rotation index, i = 1, 2, 3, …, 30.
(3) Project the point cloud frame data in the calibration plate plane onto the plane P_i, then compute the extents of the projection along the y_pca^i and z_pca^i directions as the calibration plate length L_i and width W_i; at the same time count the number of points Num_i inside the rectangular area enclosed by the length L_i and width W_i.
(4) Sort the Num_i, and take the L_I, W_I, y_pca^I and z_pca^I corresponding to the index i = I at which Num_i is maximum.
S4.4.2.4, take the four groups of data (L_I, W_I, y_pca^I and z_pca^I) obtained after performing the above operations under the four coordinate systems C_1, C_2, C_3 and C_4, and from these four groups take the y_pca^I and z_pca^I whose L_I and W_I are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
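The rotation search of S4.4.2 can be sketched as follows. This is one plausible reading under simplifying assumptions: the search is run once about the cloud centroid rather than repeated from each of the four estimated vertices, and the winning rotation is the one whose projected extents are closest to the true plate dimensions (the selection criterion of the final step); all names are hypothetical:

```python
import numpy as np

def refine_axes(points, y_pca, z_pca, x_pca, true_len, true_wid,
                span_deg=30.0, step_deg=2.0):
    """
    Grid search over rotations of the in-plane axes about the plate
    normal x_pca (span [-30, 30] degrees, stride 2 degrees, as in the
    text).  At each angle the extents of the cloud along the rotated
    axes give a candidate length L_i and width W_i; keep the rotation
    whose (L_i, W_i) best matches the true plate dimensions.
    """
    rel = points - points.mean(axis=0)
    best = None
    for theta in np.deg2rad(np.arange(-span_deg, span_deg + 1e-9, step_deg)):
        c, s = np.cos(theta), np.sin(theta)
        # Rodrigues rotation of vectors perpendicular to the unit axis x_pca
        y_i = c * y_pca + s * np.cross(x_pca, y_pca)
        z_i = c * z_pca + s * np.cross(x_pca, z_pca)
        p_l = rel @ y_i
        p_w = rel @ z_i
        l_i = p_l.max() - p_l.min()
        w_i = p_w.max() - p_w.min()
        err = abs(l_i - true_len) + abs(w_i - true_wid)
        if best is None or err < best[0]:
            best = (err, y_i, z_i, np.rad2deg(theta))
    _, y_adj, z_adj, theta_deg = best
    return y_adj, z_adj, theta_deg
```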
S4.4.3, take the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction; calculate the coordinates of the four vertices of the adjusted minimum bounding box as the point cloud final coordinates of the four outermost vertices of the calibration plate, and store the four vertex final coordinates sequentially in clockwise order starting from the upper-left corner of the calibration plate.
In this embodiment, the RANSAC algorithm is used to fit the calibration plate plane, the minimum bounding box of the point cloud is calculated, and the four vertex coordinates of the bounding box are taken as initial estimates of the four calibration plate vertex coordinates. The three principal component directions of the calibration plate point cloud are then calculated by the PCA method, and the calibration plate vertex coordinates are corrected to obtain accurate values.
S5, solve the transformation relation between the pixel coordinates of the four outermost vertices of the calibration plate and their point cloud final coordinates to obtain a rotation matrix and a translation matrix, completing the joint calibration of the laser radar and the camera.
The calibration external parameters of this embodiment comprise a rotation matrix R and a translation matrix T. The transformation between the laser radar coordinate system and the camera coordinate system, i.e. the rotation matrix and the translation matrix, is solved using a point-pair based spatial matching method, for example the solvePnP algorithm provided in OpenCV, and the calibration parameters are then stored.
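In practice one would call cv2.solvePnP as the text notes. Purely to illustrate the underlying planar-pose math, the NumPy sketch below recovers R and T from the four coplanar vertex correspondences via a DLT homography; it assumes noise-free, non-degenerate input and is not the patent's implementation:

```python
import numpy as np

def homography_dlt(src, dst):
    """DLT homography mapping 2-D src points to 2-D dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def extrinsics_from_vertices(obj_pts, img_pts, K):
    """
    Recover the lidar->camera rotation R and translation T from the
    four coplanar board vertices: lidar 3-D coordinates (obj_pts, 4x3)
    and their pixel coordinates (img_pts, 4x2), given intrinsics K.
    """
    # 1. Orthonormal frame on the board plane (plane points have z = 0).
    v0 = obj_pts[0]
    e1 = obj_pts[1] - v0
    e1 = e1 / np.linalg.norm(e1)
    d2 = obj_pts[3] - v0
    e2 = d2 - (d2 @ e1) * e1
    e2 = e2 / np.linalg.norm(e2)
    n = np.cross(e1, e2)
    B = np.stack([e1, e2, n])                  # lidar -> plane frame
    plane_2d = (obj_pts - v0) @ np.stack([e1, e2]).T

    # 2. Homography plane -> pixels, then pose of the plane frame.
    H = homography_dlt(plane_2d, img_pts)
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    if t[2] < 0:                               # board must be in front
        r1, r2, t = -r1, -r2, -t
    R_plane = np.stack([r1, r2, np.cross(r1, r2)], axis=1)
    u, _, vt = np.linalg.svd(R_plane)          # re-orthonormalise
    R_plane = u @ vt

    # 3. Compose with the lidar -> plane change of frame.
    R = R_plane @ B
    T = t - R @ v0
    return R, T
```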
In another embodiment, as shown in fig. 4, an automatic joint calibration method of a laser radar and a camera is provided for realizing joint calibration of the laser radar and the camera, where the laser radar and the camera are installed on equipment to be calibrated. The method is implemented based on a movable track, a rotatable support, a calibration plate and an industrial personal computer; the calibration plate is a checkerboard calibration plate, and one of the equipment to be calibrated and the calibration plate is a moving member while the other is a stationary member, wherein:
The moving part is fixed on the rotatable bracket, and the static part and the moving part are arranged oppositely and do not move relative to the ground;
The rotatable support is provided with a plurality of preset rotating postures, the rotatable support changes the rotating postures to drive the moving piece to change the direction, and the rotatable support is arranged on the movable track;
The movable track is provided with a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point position;
The industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the joint calibration of the laser radar and the camera, and the automatic joint calibration method of the laser radar and the camera comprises the following steps:
S1, controlling the movable track to move so that the rotatable support is located at a preset point, and controlling the rotatable support to change its rotation posture so that the moving member changes orientation; at each orientation of the moving member, acquiring image frame data and point cloud frame data through the laser radar and the camera, and repeating step S1 until a preset number of image frame data and point cloud frame data have been acquired. The movable track drives the calibration plate to each point position; at each point the support drives the calibration plate to rotate and dwell at three orientations, and images and point cloud data are acquired at every pose.
S2, calibrating internal parameters of the camera based on the acquired image frame data;
S3, extracting the image calibration points: calculating the pixel coordinates of the four outermost vertices of the calibration plate based on the image frame data; to improve the accuracy of the pixel coordinate calculation, image de-distortion may be performed before extracting the calibration plate vertices from the image data.
S4, extracting a point cloud calibration point:
S4.1, preprocessing point cloud frame data;
S4.2, extracting a calibration plate point cloud: the RANSAC algorithm is utilized to fit the maximum plane in the point cloud frame data as a calibration plate plane, and the point cloud frame data in the calibration plate plane are extracted;
S4.3, rough calculation of the calibration plate vertices: calculating the minimum bounding box of the point cloud frame data in the calibration plate plane, and taking the coordinates of the four vertices of the minimum bounding box as the point cloud estimated coordinates of the four outermost vertices of the calibration plate;
S4.4, correcting the vertex of the calibration plate: correcting the point cloud estimated coordinates to obtain point cloud final coordinates;
S5, solving the transformation relation between the pixel coordinates of the four outermost vertices of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
In another embodiment, the movable track has two preset movement paths, and the two movement paths form a cross; the rotatable support has three preset rotation postures, and the three rotation postures are forward, left rotation and right rotation, wherein the rotation angle of the left rotation and the right rotation is smaller than 30 degrees.
In another embodiment, the calculating, based on the image frame data, pixel coordinates of four vertices at the outermost periphery of the calibration plate includes:
and extracting angular point position information by using a growth algorithm based on the image frame data, calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate according to the calibration plate customization parameters and the angular point position information, and sequentially storing the pixel coordinates of the four vertexes in a clockwise order from the upper left corner of the calibration plate.
In another embodiment, the correcting of the point cloud estimated coordinates to obtain the point cloud final coordinates includes:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, and acquiring the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction; calculating the coordinates of the four vertices of the adjusted minimum bounding box as the point cloud final coordinates of the four outermost vertices of the calibration plate, and storing the four vertex final coordinates sequentially in clockwise order starting from the upper-left corner of the calibration plate.
In another embodiment, the fine-tuning of the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate includes:
S4.4.2.1, taking the point cloud estimated coordinates of the four outermost vertices of the calibration plate, denoted V_1, V_2, V_3 and V_4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the point cloud estimated coordinates of the four vertices in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4;
S4.4.2.3, performing the following operations under each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, based at the origin of the coordinate system, over a span of [-30°, 30°] with a stride of 2°;
(2) denoting the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, and computing the plane P_i spanned by y_pca^i and z_pca^i, while recording the rotation angle θ_i at that time, where i is the rotation index and i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then computing the extents of the projection along the y_pca^i and z_pca^i directions as the calibration plate length L_i and width W_i, while counting the number of points Num_i inside the rectangular area enclosed by the length L_i and width W_i;
(4) sorting the Num_i, and taking the L_I, W_I, y_pca^I and z_pca^I corresponding to the index i = I at which Num_i is maximum;
S4.4.2.4, taking the four groups of data (L_I, W_I, y_pca^I and z_pca^I) obtained after performing the above operations under the four coordinate systems C_1, C_2, C_3 and C_4, and taking, from among these four groups, the y_pca^I and z_pca^I whose L_I and W_I are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
For specific limitations regarding the method for automatic joint calibration of the lidar and the camera, reference may be made to the above-mentioned limitations for the device for automatic joint calibration of the lidar and the camera, and no further description will be given here.
The technical features of the above-described embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above-described embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. An automatic joint calibration device of a laser radar and a camera, for realizing joint calibration of the laser radar and the camera, the laser radar and the camera being installed on equipment to be calibrated, characterized in that the automatic joint calibration device of the laser radar and the camera comprises: a movable track, a rotatable support, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, one of the equipment to be calibrated and the calibration plate being a moving member and the other being a stationary member, wherein:
The moving part is fixed on the rotatable bracket, and the static part and the moving part are arranged oppositely and do not move relative to the ground;
The rotatable support is provided with a plurality of preset rotating postures, the rotatable support changes the rotating postures to drive the moving piece to change the direction, and the rotatable support is arranged on the movable track;
The movable track is provided with a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point position;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete joint calibration of the laser radar and the camera, and the industrial personal computer specifically executes the following operations:
S1, controlling the movable track to move so that the rotatable support is located at a preset point, and controlling the rotatable support to change its rotation posture so that the moving member changes orientation; at each orientation of the moving member, acquiring image frame data and point cloud frame data through the laser radar and the camera, and repeatedly executing step S1 until a preset number of image frame data and point cloud frame data are acquired;
S2, calibrating internal parameters of the camera based on the acquired image frame data;
S3, extracting the image calibration points: calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate based on the image frame data;
S4, extracting a point cloud calibration point:
S4.1, preprocessing point cloud frame data;
S4.2, fitting the maximum plane in the point cloud frame data as the calibration plate plane by using the RANSAC algorithm, and extracting the point cloud frame data in the calibration plate plane;
s4.3, calculating a minimum external frame of point cloud frame data in the plane of the calibration plate, and taking coordinates of four vertexes of the minimum external frame as point cloud estimated coordinates of four vertexes at the outermost periphery of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain point cloud final coordinates;
S5, solving the transformation relation between the pixel coordinates of the four outermost vertices of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, thereby completing the joint calibration of the laser radar and the camera.
2. The automatic joint calibration device for the laser radar and the camera according to claim 1, wherein the movable track has two preset movement paths, and the two movement paths form a cross; the rotatable support has three preset rotation postures, and the three rotation postures are forward, left rotation and right rotation, wherein the rotation angle of the left rotation and the right rotation is smaller than 30 degrees.
3. The automated joint calibration device for lidar and cameras according to claim 1, wherein the calculating the pixel coordinates of the four vertices of the outermost periphery of the calibration plate based on the image frame data comprises:
and extracting angular point position information by using a growth algorithm based on the image frame data, calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate according to the calibration plate customization parameters and the angular point position information, and sequentially storing the pixel coordinates of the four vertexes in a clockwise order from the upper left corner of the calibration plate.
4. The automated joint calibration device for lidar and cameras of claim 1, wherein the correction of the point cloud estimation coordinates to obtain point cloud final coordinates comprises:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, and acquiring the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction; calculating the coordinates of the four vertices of the adjusted minimum external frame as the point cloud final coordinates of the four outermost vertices of the calibration plate, and storing the four vertex final coordinates sequentially in clockwise order starting from the upper-left corner of the calibration plate.
5. The automatic joint calibration device for the laser radar and the camera according to claim 4, wherein the fine tuning of the acquired principal component directions y_pca and z_pca to approach the true length and width directions of the calibration plate comprises:
S4.4.2.1, taking the point cloud estimated coordinates of the four outermost vertices of the calibration plate, denoted V_1, V_2, V_3 and V_4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the point cloud estimated coordinates of the four vertices in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4;
S4.4.2.3, performing the following operations under each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, based at the origin of the coordinate system, over a span of [-30°, 30°] with a stride of 2°;
(2) denoting the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, and computing the plane P_i spanned by y_pca^i and z_pca^i, while recording the rotation angle θ_i at that time, where i is the rotation index and i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then computing the extents of the projection along the y_pca^i and z_pca^i directions as the calibration plate length L_i and width W_i, while counting the number of points Num_i inside the rectangular area enclosed by the length L_i and width W_i;
(4) sorting the Num_i, and taking the L_I, W_I, y_pca^I and z_pca^I corresponding to the index i = I at which Num_i is maximum;
S4.4.2.4, taking the four groups of data (L_I, W_I, y_pca^I and z_pca^I) obtained after performing the above operations under the four coordinate systems C_1, C_2, C_3 and C_4, and taking, from among these four groups, the y_pca^I and z_pca^I whose L_I and W_I are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
6. An automatic joint calibration method of a laser radar and a camera, for realizing joint calibration of the laser radar and the camera, the laser radar and the camera being installed on equipment to be calibrated, characterized in that the automatic joint calibration method is implemented based on a movable track, a rotatable support, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, one of the equipment to be calibrated and the calibration plate being a moving member and the other being a stationary member, wherein:
The moving part is fixed on the rotatable bracket, and the static part and the moving part are arranged oppositely and do not move relative to the ground;
The rotatable support is provided with a plurality of preset rotating postures, the rotatable support changes the rotating postures to drive the moving piece to change the direction, and the rotatable support is arranged on the movable track;
The movable track is provided with a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point position;
The industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the joint calibration of the laser radar and the camera, and the automatic joint calibration method of the laser radar and the camera comprises the following steps:
S1, controlling the movable track to move so that the rotatable support is located at a preset point, and controlling the rotatable support to change its rotation posture so that the moving member changes orientation; at each orientation of the moving member, acquiring image frame data and point cloud frame data through the laser radar and the camera, and repeatedly executing step S1 until a preset number of image frame data and point cloud frame data are acquired;
S2, calibrating internal parameters of the camera based on the acquired image frame data;
S3, extracting the image calibration points: calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate based on the image frame data;
S4, extracting a point cloud calibration point:
S4.1, preprocessing point cloud frame data;
S4.2, fitting the maximum plane in the point cloud frame data as the calibration plate plane by using the RANSAC algorithm, and extracting the point cloud frame data in the calibration plate plane;
s4.3, calculating a minimum external frame of point cloud frame data in the plane of the calibration plate, and taking coordinates of four vertexes of the minimum external frame as point cloud estimated coordinates of four vertexes at the outermost periphery of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain point cloud final coordinates;
S5, solving the transformation relation between the pixel coordinates of the four outermost vertices of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, thereby completing the joint calibration of the laser radar and the camera.
7. The automated joint calibration method of a lidar and a camera according to claim 6, wherein the movable rail has two preset movement paths, and the two movement paths form a cross; the rotatable support has three preset rotation postures, and the three rotation postures are forward, left rotation and right rotation, wherein the rotation angle of the left rotation and the right rotation is smaller than 30 degrees.
8. The automated joint calibration method of the lidar and the camera according to claim 6, wherein calculating the pixel coordinates of the four outermost vertices of the calibration plate based on the image frame data comprises:
extracting corner position information with a growth algorithm based on the image frame data, calculating the pixel coordinates of the four outermost vertices of the calibration plate from the calibration plate customization parameters and the corner position information, and storing the pixel coordinates of the four vertices sequentially in clockwise order starting from the upper-left corner of the calibration plate.
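The step from detected inner checkerboard corners to the four outermost plate vertices can be sketched as a linear extrapolation along the corner grid. The `border_ratio` parameter (border width in checker-square units) and the grid layout below are assumptions for illustration; the patent only says the vertices follow from the plate customization parameters and the corner positions:

```python
import numpy as np

def board_outer_vertices(grid, border_ratio=1.0):
    """Extrapolate the four outermost board vertices from a detected
    grid of inner checkerboard corners.

    grid: (rows, cols, 2) array of inner-corner pixel coordinates.
    border_ratio: board border width in checker-square units (an
    assumed plate customization parameter).

    Returns the four vertices clockwise from the board's upper-left.
    """
    k = border_ratio

    def extrapolate(p, along_row, along_col):
        # Step outward by k squares against both inward grid directions.
        return p - k * along_row - k * along_col

    tl = extrapolate(grid[0, 0],   grid[0, 1] - grid[0, 0],     grid[1, 0] - grid[0, 0])
    tr = extrapolate(grid[0, -1],  grid[0, -2] - grid[0, -1],   grid[1, -1] - grid[0, -1])
    br = extrapolate(grid[-1, -1], grid[-1, -2] - grid[-1, -1], grid[-2, -1] - grid[-1, -1])
    bl = extrapolate(grid[-1, 0],  grid[-1, 1] - grid[-1, 0],   grid[-2, 0] - grid[-1, 0])
    return np.array([tl, tr, br, bl])  # clockwise from upper-left
```

Linear extrapolation is only exact for a fronto-parallel view; under strong perspective, extrapolating through a fitted homography would be more accurate.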
9. The automated joint calibration method of the lidar and the camera according to claim 6, wherein correcting the estimated point cloud coordinates to obtain the final point cloud coordinates comprises:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, and acquiring the three principal component directions of the point cloud by the PCA method, the three principal component directions being denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted principal component direction z_pca as the width direction, and the principal component direction x_pca as the normal direction; calculating the coordinates of the four vertices of the adjusted minimum bounding box as the final point cloud coordinates of the four outermost vertices of the calibration plate, and storing them sequentially in clockwise order starting from the upper-left corner of the calibration plate.
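The PCA step of S4.4.1 can be sketched as an eigendecomposition of the point cloud covariance; mapping the largest-variance direction to y_pca (length), the next to z_pca (width) and the smallest to x_pca (plate normal) follows the roles assigned in S4.4.1-S4.4.3:

```python
import numpy as np

def pca_directions(points):
    """Principal component directions of an (N, 3) point cloud via
    eigendecomposition of its covariance matrix.

    Returns unit vectors (y_pca, z_pca, x_pca) sorted by decreasing
    variance: y_pca ~ plate length direction, z_pca ~ width direction,
    x_pca ~ plate normal (smallest variance for a planar cloud).
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # reorder to descending variance
    y_pca, z_pca, x_pca = eigvecs[:, order].T
    return y_pca, z_pca, x_pca
```

Note the eigenvector signs are arbitrary; downstream steps should fix an orientation convention (e.g. normal pointing toward the sensor) before ordering vertices clockwise.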
10. The automated joint calibration method of lidar and camera of claim 9, wherein fine-tuning the acquired principal component directions y_pca and z_pca to approach the true length and width directions of the calibration plate comprises:
S4.4.2.1, taking the estimated point cloud coordinates of the four outermost vertices of the calibration plate, denoted V_1, V_2, V_3 and V_4;
S4.4.2.2, moving the origin of the coordinate frame formed by the three principal component directions y_pca, z_pca and x_pca to each of the four estimated vertex coordinates in turn, obtaining coordinate systems C_1, C_2, C_3 and C_4;
S4.4.2.3, performing the following operations in each of the four coordinate systems C_1, C_2, C_3 and C_4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, based at the origin of the coordinate system, over the span [-30°, 30°] with a stride of 2°;
(2) denoting the new directions obtained after the i-th rotation of y_pca and z_pca as y_pca^(i) and z_pca^(i), computing the plane P_i spanned by y_pca^(i) and z_pca^(i), and recording the corresponding rotation angle θ_i, where i is the rotation index and i = 1, 2, 3, ..., 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then computing the extents of the projection along the y_pca^(i) and z_pca^(i) directions as the calibration plate length L_i and width W_i, and simultaneously counting the number of points Num_i inside the rectangular region enclosed by L_i and W_i;
(4) sorting the values Num_i and, for the index t = i at which Num_i is maximal, recording y_pca^(t), z_pca^(t), L_t and W_t;
S4.4.2.4, taking the four sets of data (L_t, W_t, y_pca^(t), z_pca^(t)) obtained by performing the above operations in the four coordinate systems C_1, C_2, C_3 and C_4, selecting the set whose L_t and W_t are closest to the true length and width of the calibration plate, and taking its y_pca^(t) and z_pca^(t) as the adjusted principal component directions y_pca and z_pca.
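A compact 2-D sketch of the rotation search in steps (1)-(4), applied after the cloud has already been projected onto the plate plane: the in-plane axes are rotated over [-30°, 30°] in 2° strides and the angle maximizing the inlier count is kept. Anchoring a rectangle of the true plate dimensions at the candidate vertex (the coordinate origin) is one plausible reading of the Num_i criterion, not the patent's literal wording:

```python
import numpy as np

def refine_axes(points_2d, true_len, true_wid, span_deg=30, step_deg=2):
    """Search in-plane axis rotations for the angle putting the most
    points inside a true_len x true_wid rectangle anchored at the
    origin (the candidate vertex).

    points_2d: (N, 2) cloud already projected onto the plate plane,
    expressed relative to the candidate vertex. Returns the best angle
    in degrees.
    """
    best_num, best_theta = -1, 0.0
    for theta in np.arange(-span_deg, span_deg + step_deg, step_deg):
        t = np.radians(theta)
        y_dir = np.array([np.cos(t), np.sin(t)])    # rotated length axis
        z_dir = np.array([-np.sin(t), np.cos(t)])   # rotated width axis
        u = points_2d @ y_dir                       # coords along length axis
        v = points_2d @ z_dir                       # coords along width axis
        # Count points inside the anchored true-size rectangle.
        num = np.sum((u >= 0) & (u <= true_len) & (v >= 0) & (v <= true_wid))
        if num > best_num:
            best_num, best_theta = num, theta
    return best_theta
```

Running this once per vertex coordinate system C_1..C_4 and keeping the result whose fitted extents best match the true plate size mirrors the selection in S4.4.2.4.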
CN202111539056.0A 2021-12-15 2021-12-15 Automatic combined calibration device and method for laser radar and camera Active CN114371472B (en)

Publications (2)

Publication Number Publication Date
CN114371472A CN114371472A (en) 2022-04-19
CN114371472B true CN114371472B (en) 2024-07-12

Family

ID=81140398


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114862899A (en) * 2022-04-22 2022-08-05 阿里巴巴(中国)有限公司 Equipment calibration method, system, device, electronic equipment and program product
CN115082564A (en) * 2022-04-28 2022-09-20 上海电机学院 Combined calibration method fusing binocular vision and laser radar
CN114782556B (en) * 2022-06-20 2022-09-09 季华实验室 Registration method, system and storage medium of camera and lidar
CN115719387A (en) * 2022-11-24 2023-02-28 梅卡曼德(北京)机器人科技有限公司 3D camera calibration method, point cloud image acquisition method and camera calibration system
CN116563391B (en) * 2023-05-16 2024-02-02 深圳市高素科技有限公司 Automatic laser structure calibration method based on machine vision

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112230204A (en) * 2020-10-27 2021-01-15 深兰人工智能(深圳)有限公司 Combined calibration method and device for laser radar and camera
CN112819903A (en) * 2021-03-02 2021-05-18 福州视驰科技有限公司 Camera and laser radar combined calibration method based on L-shaped calibration plate

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110221275B (en) * 2019-05-21 2023-06-23 菜鸟智能物流控股有限公司 Calibration method and device between laser radar and camera




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant