
CN119559259B - Calibration system-based vehicle sensor calibration method, system, device and medium - Google Patents


Info

Publication number
CN119559259B
Authority
CN
China
Prior art keywords
calibration
data
calibrated
point cloud
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311113474.2A
Other languages
Chinese (zh)
Other versions
CN119559259A (en)
Inventor
张恒
李茹杨
张腾飞
邓琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur Beijing Electronic Information Industry Co Ltd
Original Assignee
Inspur Beijing Electronic Information Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur Beijing Electronic Information Industry Co Ltd filed Critical Inspur Beijing Electronic Information Industry Co Ltd
Priority to CN202311113474.2A priority Critical patent/CN119559259B/en
Publication of CN119559259A publication Critical patent/CN119559259A/en
Application granted granted Critical
Publication of CN119559259B publication Critical patent/CN119559259B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/40 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/11 Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 19/00 Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K 19/06 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K 19/067 Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
    • G06K 19/07 Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips
    • G06K 19/077 Constructional details, e.g. mounting of circuits in the carrier
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1417 2D bar codes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Remote Sensing (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • Algebra (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Operations Research (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract


The present application discloses a vehicle sensor calibration method, system, device and medium based on a calibration system, relating to the field of autonomous driving and solving the problem of low efficiency in vehicle sensor calibration. In this solution, when the vehicle to be calibrated meets a first calibration condition, the total station is controlled to scan the calibration device to obtain first data; when a second calibration condition is met, the sensors on the vehicle to be calibrated are controlled to capture the calibration device to obtain second data; calibration prior information is then derived from the first data, and the sensors on the vehicle are calibrated using the calibration prior information and the second data. Using a total station and multiple calibration devices greatly reduces the personnel requirements and time consumption of the calibration process, which lowers calibration cost, facilitates mass production, and improves production efficiency.

Description

Calibration system-based vehicle sensor calibration method, system, device and medium
Technical Field
The application relates to the field of autonomous driving, and in particular to a vehicle sensor calibration method, system, device and medium based on a calibration system.
Background
In the field of autonomous driving, the perception system is often called the "eyes" of the vehicle; its main task is to acquire environmental information and identify environmental conditions. Because ambient light, weather and scenes are complex, a single sensor has inherent limitations, so a multi-sensor fusion approach is commonly adopted to improve the redundancy and complementarity of information and thus ensure the safety of autonomous driving. Common perception sensors include lidar, cameras and millimeter wave radar. Depending on the perception task and each sensor's characteristics, the sensors must be combined efficiently. To ensure accurate environmental perception, the sensors on a vehicle usually need to be calibrated precisely, and the intrinsic and extrinsic parameters between sensors determined. However, existing sensor calibration methods have drawbacks in engineering practice: they consume considerable time and manpower and are difficult to scale to mass production.
Disclosure of Invention
The application aims to provide a vehicle sensor calibration method, system, device and medium based on a calibration system, in which a total station and a plurality of calibration devices greatly reduce the personnel requirements and time consumption of the calibration process. This lowers calibration cost, facilitates mass production, and improves production efficiency.
In a first aspect, the present application provides a calibration method for a vehicle sensor based on a calibration system, where the calibration system includes a total station and a plurality of calibration devices, and the method includes:
when a vehicle to be calibrated meets a first calibration condition, controlling the total station to scan the calibration device to obtain first data;
when a second calibration condition is met, receiving second data acquired from the calibration device by each sensor on the vehicle to be calibrated;
and obtaining calibration prior information according to the first data, and calibrating each sensor on the vehicle to be calibrated according to the calibration prior information and the second data to obtain calibration parameters of each sensor; the calibration prior information is the data information of the calibration device, and the calibration parameters represent the coordinate transformation relation between any two sensors or between a sensor and the total station.
In one embodiment, the process of determining whether the vehicle to be calibrated satisfies the first calibration condition includes:
judging whether the vehicle to be calibrated has stopped in a preset area;
if the vehicle to be calibrated has stopped in the preset area, judging that the vehicle to be calibrated meets the first calibration condition; otherwise, judging that the vehicle to be calibrated does not meet the first calibration condition.
In one embodiment, the process of determining whether the vehicle to be calibrated meets the first calibration condition includes:
determining whether the vehicle to be calibrated is calibrated for the first time according to the vehicle information of the vehicle to be calibrated;
and if the first calibration is performed, judging that the vehicle to be calibrated meets the first calibration condition, otherwise, judging that the vehicle to be calibrated does not meet the first calibration condition.
In one embodiment, when it is determined that the vehicle to be calibrated is not being calibrated for the first time, the method further includes:
and calling first data which are stored in a database and correspond to the vehicle to be calibrated.
In one embodiment, the process of determining whether the second calibration condition is met includes:
judging whether a calibration starting instruction is received or not;
and if the calibration starting instruction is received, judging that the second calibration condition is met; otherwise, judging that the second calibration condition is not met.
In one embodiment, deriving calibration prior information from the first data includes:
removing data which do not meet preset requirements from the first data to obtain first calibration data;
and determining the calibration prior information according to the first calibration data.
In one embodiment, the step of removing the data that does not meet the preset requirement from the first data to obtain first calibration data includes:
and eliminating point cloud data which does not comprise the calibration device from the first data to obtain the first calibration data.
In one embodiment, the calibration device comprises a plurality of calibration members, the calibration prior information being determined from the first calibration data, comprising:
Performing spatial clustering segmentation on the first calibration data according to the positions of the calibration members in the calibration system to obtain first local point cloud data corresponding to each calibration member;
and determining first identity information and first coordinate information of each calibration component according to the first local point cloud data.
In one embodiment, each of the calibration members includes at least two calibration plates, and determining first identity information and first coordinate information of each of the calibration members based on the obtained first local point cloud data includes:
Determining second local point cloud data corresponding to each calibration plate from the first local point cloud data;
and determining second identity information and second coordinate information corresponding to each calibration plate according to the second local point cloud data.
In one embodiment, each calibration member includes at least two non-coplanar calibration plates, and determining second local point cloud data corresponding to each calibration plate according to the position of each calibration plate in each calibration member and the first local point cloud data includes:
And performing plane fitting on each calibration plate to extract second local point cloud data corresponding to each calibration plate.
In one embodiment, performing plane fitting on each calibration plate to extract second local point cloud data corresponding to each calibration plate includes:
acquiring a first height between the center of each calibration plate and the ground, randomly screening at least three second point clouds from the first local point clouds, and calculating a second three-dimensional space plane where the second point clouds are located;
Calculating a third distance from each point cloud in the second local point cloud data to the second three-dimensional space plane;
And taking the data corresponding to the point cloud, of which the difference value between the third distance and the first height is not more than a third preset distance, as second local point cloud data corresponding to the calibration plate.
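The sampling-and-thresholding steps above amount to a RANSAC-style plane fit: draw three points, fit the plane through them, and keep the candidate with the most points within a distance tolerance. A minimal numpy sketch under that reading (the function name, iteration count and tolerance are my own choices; the patent gives no implementation):

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.01, seed=0):
    """Return an inlier mask for the dominant plane, found by randomly
    sampling three points and scoring point-to-plane distances."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    best_mask, best_count = None, -1
    for _ in range(n_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p1) @ normal)  # point-to-plane distances
        mask = dist <= tol
        if mask.sum() > best_count:
            best_count, best_mask = int(mask.sum()), mask
    return best_mask
```

In the patent's formulation the distance threshold is relative to the first height of the plate center, rather than the fixed `tol` used here.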
In one embodiment, the calibration board includes a calibration carrier board and a plurality of marker codes; the marker codes are fixed on the calibration carrier board and carry two-dimensional code information for the sensors to recognize, and the identity information corresponding to the two-dimensional code information of each marker code is unique;
Determining second identity information corresponding to each calibration plate according to the second local point cloud data, including:
converting the second local point cloud data into a gray scale image;
dividing the gray level image according to the interval between the mark codes to obtain a local image of each mark code;
and carrying out two-dimensional code recognition on each local image to determine the first ID information of each marker code, and determining the second identity information of the calibration plate according to the first ID information of each marker code on the calibration plate.
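The split-then-recognise step can be sketched as below. The regular grid layout and the function name are assumptions on my part; in practice each resulting tile would be handed to an ArUco-style two-dimensional-code detector to read out the marker ID:

```python
import numpy as np

def split_marker_tiles(gray, grid):
    """Cut a grayscale image of a calibration plate into one tile per marker
    code, assuming the codes sit on a regular (rows x cols) grid with equal
    spacing; each tile is then recognised independently."""
    rows, cols = grid
    h, w = gray.shape
    th, tw = h // rows, w // cols
    return [gray[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(rows) for c in range(cols)]
```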
In one embodiment, determining second coordinate information corresponding to each calibration plate according to the second local point cloud data includes:
Extracting four edges of the circumscribed rectangle of the partial image of each marker code, and determining the intersection point of two adjacent edges as a key point;
Acquiring first image coordinates of each key point, wherein the first image coordinates represent the positions of the key points on the calibration plate;
And determining a first three-dimensional coordinate corresponding to each key point according to the position of the mark code on the calibration plate, the first image coordinate and the first reference three-dimensional coordinate of each point cloud in the second local point cloud data.
In one embodiment, when there is no three-dimensional coordinate corresponding to the first image coordinate of the key point in the first reference three-dimensional coordinate, the method further includes:
Acquiring three-dimensional coordinates corresponding to a plurality of reference points adjacent to the key points;
and determining a first three-dimensional coordinate corresponding to the key point according to the three-dimensional coordinates of the plurality of reference points.
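One simple way to realise the neighbour-based fill-in above is an inverse-distance-weighted average of the reference points' three-dimensional coordinates; the weighting scheme and the choice of `k` are mine, not the patent's:

```python
import numpy as np

def interpolate_keypoint(kp_uv, ref_uv, ref_xyz, k=4):
    """Estimate a keypoint's missing 3D coordinate as the inverse-distance
    weighted average of the 3D coordinates of its k nearest reference
    points, with distances measured in the image plane."""
    kp_uv = np.asarray(kp_uv, float)
    ref_uv = np.asarray(ref_uv, float)
    ref_xyz = np.asarray(ref_xyz, float)
    d = np.linalg.norm(ref_uv - kp_uv, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-9)                 # guard against zero distance
    return (w[:, None] * ref_xyz[idx]).sum(axis=0) / w.sum()
```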
In an embodiment, the calibration board is further provided with a metal block, and the second identity information corresponding to each calibration board is determined according to the second local point cloud data, and the method further includes:
and determining third identity information of the metal block according to the first ID information of each mark code.
In one embodiment, determining second coordinate information corresponding to each calibration plate according to the second local point cloud data further includes:
and determining the three-dimensional coordinates of the metal block according to the first three-dimensional coordinates of the key points corresponding to the marking codes and the positions of the metal block on the calibration plate.
In one embodiment, the metal block is fixed at the center of the calibration carrier plate, and the plurality of marker codes are fixed on the calibration carrier plate around the periphery of the metal block, with the vertex of each marker code that is nearest the metal block adjoining a corresponding vertex of the metal block.
In one embodiment, determining the three-dimensional coordinates of the metal block according to the position of the metal block on the calibration plate and the first three-dimensional coordinates of the key points corresponding to the marker codes includes:
determining three-dimensional coordinates of four corner points of the metal block according to first three-dimensional coordinates of key points corresponding to vertexes of the marker codes, which are close to the direction of the metal block;
and determining the central three-dimensional coordinates of the metal block according to the three-dimensional coordinates of the four corner points of the metal block.
In one embodiment, when the sensor includes a camera to be calibrated, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling the camera to be calibrated to acquire the calibration device to obtain a first calibration image;
identifying the mark codes in the first calibration image, and determining second ID information and partial images of the mark codes in the first calibration image;
determining second image coordinates of each key point included in the first calibration image according to the second ID information and the partial image of each marker code in the first calibration image;
determining second three-dimensional coordinates corresponding to each key point in the first calibration image according to the second ID information of each marker code in the first calibration image, the second image coordinates of each key point, and the first reference three-dimensional coordinates;
And determining the internal parameters of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates.
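Estimating camera parameters from paired image coordinates and three-dimensional coordinates can be done, for example, with a direct linear transform that recovers the full 3x4 projection matrix; the intrinsic matrix could then be separated out by RQ decomposition. The patent does not name a solver, so this is only one plausible reading:

```python
import numpy as np

def dlt_projection(xyz, uv):
    """Direct linear transform: estimate the 3x4 projection matrix P with
    uv ~ P @ [x, y, z, 1], from six or more non-coplanar 2D-3D pairs."""
    rows = []
    for (x, y, z), (u, v) in zip(xyz, uv):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v*x, -v*y, -v*z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 4)       # null-space vector, up to scale

def project(P, xyz):
    """Project 3D points with P and dehomogenise to pixel coordinates."""
    h = P @ np.c_[xyz, np.ones(len(xyz))].T
    return (h[:2] / h[2]).T
```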
In one embodiment, after determining the internal reference of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates, the method further includes:
Determining an external parameter of the camera to be calibrated according to the second image coordinates, the second three-dimensional coordinates and the internal parameter of the camera to be calibrated;
the external parameters of the camera to be calibrated at least comprise a first coordinate transformation relation between the camera coordinate system to be calibrated and the total station coordinate system.
In one embodiment, when the sensor includes a look-around camera to be calibrated, controlling each sensor on the vehicle to be calibrated to acquire the calibration device, to obtain second data, including:
controlling the to-be-calibrated looking-around camera to acquire the calibration device to obtain a second calibration image;
projecting the second calibration image by using the look-around spliced homography matrix to obtain a projection image;
And calculating a second coordinate transformation relation between the coordinate systems of the adjacent two to-be-calibrated looking-around cameras according to the same key points in the projection images corresponding to the adjacent two to-be-calibrated looking-around cameras.
In one embodiment, after the controlling the to-be-calibrated looking-around camera collects the calibration device to obtain the second calibration image, the method further includes:
extracting ID information of each mark code and third image coordinates of each key point in the second calibration image;
screening key points positioned on the ground in the second calibration image according to the ID information of each marking code and the third image coordinates of each key point in the second calibration image, and acquiring the third three-dimensional coordinates of each key point positioned on the ground;
determining a look-around spliced image according to third three-dimensional coordinates of each key point on the ground;
Calibrating a fourth three-dimensional coordinate of a key point in the all-around spliced image by using a first reference three-dimensional coordinate corresponding to the key point in the all-around spliced image in the calibration prior information so as to obtain a calibrated all-around spliced image;
And obtaining the looking-around spliced homography matrix according to the calibrated looking-around spliced image.
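The stitching homography can be estimated from matched ground keypoints by the standard DLT for homographies; again, the solver choice is an assumption, not something the patent specifies:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H (dst ~ H @ src in homogeneous
    coordinates) from four or more point pairs, normalised so H[2,2] == 1
    (which assumes H[2,2] is not vanishingly small)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u*x, -u*y, -u])
        rows.append([0, 0, 0, x, y, 1, -v*x, -v*y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_points(H, pts):
    """Apply H to 2D points and dehomogenise."""
    h = H @ np.c_[pts, np.ones(len(pts))].T
    return (h[:2] / h[2]).T
```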
In one embodiment, when the sensor includes a main laser radar, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
the main laser radar is controlled to acquire the calibration device, so that first test point cloud data are obtained;
and matching the first test point cloud data with the first data to obtain a third coordinate transformation relation between a main laser radar coordinate system and the total station coordinate system.
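Once the two point sets are put into correspondence (e.g. via marker keypoints seen by both the lidar and the total station), the coordinate transformation can be solved in closed form by the Kabsch/Umeyama method; a full scan-matching scheme such as ICP would iterate this step, which the sketch below omits:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t,
    computed from matched point pairs (Kabsch/Umeyama)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs
```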
In one embodiment, after the main lidar is controlled to collect the calibration device to obtain the cloud data of the first test point, the method further includes:
matching the first test point cloud data with the first data to obtain ID information of a marker code and a fifth three-dimensional coordinate of a key point in the field of view of the main laser radar;
determining the same key point in the first calibration image and the field of view of the main laser radar according to the ID information of each mark code in the first calibration image, the second three-dimensional coordinates corresponding to each key point, the ID information of the mark code in the field of view of the main laser radar and the fifth three-dimensional coordinates of the key point, and determining the three-dimensional coordinates of the same key point;
and calculating a fourth coordinate transformation relation between the camera coordinate system to be calibrated and the main laser radar coordinate system according to the three-dimensional coordinates of the same key points.
In one embodiment, when the sensor comprises a millimeter wave radar, the calibration plate is provided with a metal block, and the calibration system further comprises two parallel rails and a conveyor belt device arranged on the rails; the distance between the two rails is the same as the distance between the left and right wheels of the vehicle to be calibrated, and the vehicle to be calibrated is positioned on the conveyor belt device during sensor calibration;
controlling each sensor on the vehicle to be calibrated to acquire the calibration device, and before obtaining the second data, further comprising:
the conveyor belt device is controlled so that the vehicle to be calibrated moves at a preset speed;
controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, wherein the second data comprises:
Triggering the millimeter wave radar to acquire coordinates of the metal block in a visual field range in the moving process of the vehicle to be calibrated to obtain acquisition data, wherein the acquisition data at least comprises the coordinates of the metal block;
determining alignment data of the first test point cloud data and the acquired data according to the data time stamp;
projecting a sixth three-dimensional coordinate of the metal block in the first test point cloud data in the alignment data onto a horizontal plane to obtain horizontal plane projection data;
And calculating a fifth coordinate transformation relation between a millimeter wave radar coordinate system and the main laser radar coordinate system according to the horizontal plane projection data, the metal block coordinates in the acquired data in the alignment data and the positions of the metal blocks in the calibration system.
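The timestamp-alignment step above can be sketched as a nearest-neighbour association between the lidar and radar frame timestamps; the tolerance `max_dt` is an assumed parameter:

```python
import numpy as np

def align_by_timestamp(ts_a, ts_b, max_dt=0.05):
    """For each timestamp in ts_a, the index of the nearest timestamp in
    ts_b, or -1 when the gap exceeds max_dt; the index pairs identify the
    aligned frames."""
    ts_b = np.asarray(ts_b, float)
    pairs = []
    for t in np.asarray(ts_a, float):
        j = int(np.argmin(np.abs(ts_b - t)))
        pairs.append(j if abs(ts_b[j] - t) <= max_dt else -1)
    return pairs
```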
In one embodiment, when the sensor includes an auxiliary laser radar, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling an auxiliary laser radar to acquire the calibration device to obtain second test point cloud data;
and matching the second test point cloud data with the first test point cloud data to obtain a sixth coordinate transformation relation between an auxiliary laser radar coordinate system and the main laser radar coordinate system.
In one embodiment, the calibration system comprises a plurality of verification members, and determining the calibration prior information from the first calibration data includes:
performing spatial clustering segmentation on the first calibration data according to the positions of the verification members in the calibration system to obtain third local point cloud data corresponding to each verification member;
and determining fifth identity information and fifth coordinate information corresponding to each verification member according to the third local point cloud data.
In one embodiment, determining fifth identity information and fifth coordinate information corresponding to each verification member according to the third local point cloud data includes:
fitting the third local point cloud data to obtain the structural information and the three-dimensional coordinates of each verification member.
In one embodiment, the verification member is a verification ball, and the fitting is performed on the third local point cloud data to obtain structural information and three-dimensional coordinates of each verification member, including:
and performing spherical fitting on the third local point cloud data to obtain radius information and three-dimensional coordinates of the check ball.
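Sphere fitting to the verification-ball point cloud has a closed-form algebraic solution: expanding |p - c|^2 = r^2 gives a system linear in the center c and the constant r^2 - |c|^2. A sketch (the patent does not prescribe this particular formulation):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: solve 2 p·c + (r^2 - |c|^2) = |p|^2
    for the center and the combined constant, then recover the radius."""
    p = np.asarray(points, float)
    A = np.c_[2.0 * p, np.ones(len(p))]
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))
    return center, radius
```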
In one embodiment, after obtaining calibration prior information according to the first data and calibrating each sensor on the vehicle to be calibrated according to the calibration prior information and the second data, the method further includes:
And verifying the calibration parameters of each sensor according to the radius information and the three-dimensional coordinates of each verification ball.
In one embodiment, when the calibration parameters of the sensors include a third coordinate transformation relationship between the main lidar coordinate system and the total station coordinate system, verifying the calibration parameters of the sensors according to the radius information and the three-dimensional coordinates of each verification ball, including:
acquiring seventh three-dimensional coordinates of each check ball in the field of view of the main laser radar;
Carrying out coordinate transformation on the seventh three-dimensional coordinate according to the third coordinate transformation relation to obtain a seventh three-dimensional coordinate to be compared;
Comparing the seventh three-dimensional coordinate to be compared with a seventh reference three-dimensional coordinate to determine whether the third coordinate transformation relationship is accurate;
the seventh reference three-dimensional coordinate is a three-dimensional coordinate, acquired by the total station, corresponding to a verification member in the field of view of the main laser radar.
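The comparison step can be sketched as mapping the sensor-frame coordinates through the calibrated transform and thresholding the residuals against the reference coordinates; the tolerance value is an assumption of mine:

```python
import numpy as np

def check_extrinsics(R, t, pts_sensor, pts_ref, tol=0.02):
    """Map sensor-frame verification points through the calibrated (R, t)
    and report whether every mapped point lies within tol (e.g. metres)
    of its reference-frame counterpart."""
    mapped = np.asarray(pts_sensor, float) @ np.asarray(R, float).T + t
    err = np.linalg.norm(mapped - np.asarray(pts_ref, float), axis=1)
    return bool((err <= tol).all())
```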
In one embodiment, when the calibration parameters of the sensor include a fourth coordinate transformation relationship between a camera coordinate system to be calibrated and a main lidar coordinate system, verifying the calibration parameters of each sensor according to the radius information and the three-dimensional coordinates of each verification ball, including:
acquiring first check point cloud data of a check ball in a field of view of a main laser radar;
Back projecting the first check point cloud data onto the camera image according to the fourth coordinate transformation relation;
Performing preset processing on the back-projected camera image to obtain a circular edge in the camera image, and determining a corresponding first circular fitting equation according to the circular edge;
Comparing the first circular fitting equation with a reference circular fitting equation to determine whether the fourth coordinate transformation relationship is accurate.
In one embodiment, the verification balls are metal spheres, and when the calibration parameters of the sensors include a fifth coordinate transformation relationship between a millimeter wave radar coordinate system and a main laser radar coordinate system, verifying the calibration parameters of each sensor according to the radius information and three-dimensional coordinates of each verification ball includes:
acquiring millimeter wave radar coordinates and angles of each check ball in the visual field range of the millimeter wave radar;
acquiring the seventh three-dimensional coordinates of each check ball within the field of view of the main laser radar;
projecting the millimeter wave radar coordinates and angles to a two-dimensional plane of the main laser radar coordinate system according to the fifth coordinate transformation relation;
And calculating Euclidean distance according to the projected millimeter wave radar coordinates and angles and the seventh three-dimensional coordinate to determine whether the fifth coordinate transformation relation is accurate.
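The millimeter wave radar check can be sketched as below. This is an assumed minimal form: the radar detection is taken as a range and azimuth pair, the fifth coordinate transformation relationship is modeled as a planar extrinsic (yaw plus 2D translation), and the 0.1 m tolerance is illustrative.

```python
import math

def radar_to_lidar_xy(r, theta, yaw, tx, ty):
    """Project a radar detection (range r in meters, azimuth theta in
    radians) into the x-y plane of the main lidar coordinate system,
    using an assumed planar extrinsic: rotation `yaw` plus (tx, ty)."""
    x_r = r * math.cos(theta)            # radar-frame Cartesian coordinates
    y_r = r * math.sin(theta)
    x_l = math.cos(yaw) * x_r - math.sin(yaw) * y_r + tx
    y_l = math.sin(yaw) * x_r + math.cos(yaw) * y_r + ty
    return x_l, y_l

def check_detection(r, theta, extrinsic, lidar_xy, tol=0.1):
    """Euclidean distance between the projected radar detection and the
    lidar-measured sphere center; the fifth relation is deemed accurate
    if the distance stays within `tol` meters."""
    x, y = radar_to_lidar_xy(r, theta, *extrinsic)
    dist = math.hypot(x - lidar_xy[0], y - lidar_xy[1])
    return dist <= tol, dist

# With a zero extrinsic the two frames coincide, so a 5 m detection at
# azimuth 0 must land on the lidar point (5, 0).
ok, d = check_detection(5.0, 0.0, (0.0, 0.0, 0.0), (5.0, 0.0))
```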
In one embodiment, when the calibration parameters of the sensors include a sixth coordinate transformation relationship between the auxiliary lidar coordinate system and the primary lidar coordinate system, verifying the calibration parameters of the sensors according to the radius information and the three-dimensional coordinates of each verification ball, including:
Acquiring first check point cloud data of a check ball in a field of view of a main laser radar;
Acquiring second check point cloud data of a check ball in the range of the field of view of the auxiliary laser radar;
Transforming the second check point cloud data into the main laser radar coordinate system according to the sixth coordinate transformation relationship to obtain second check point cloud data to be verified;
And determining whether the sixth coordinate transformation relationship is accurate according to the first check point cloud data and the second check point cloud data to be verified.
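A simple way to compare the two check point clouds is a mean nearest-neighbor residual after applying the extrinsic. This is a sketch under assumptions: the sixth coordinate transformation relationship is represented as a 4x4 homogeneous matrix, and a brute-force nearest-neighbor search is used (acceptable for the small check clouds here; a KD-tree would be used for large clouds).

```python
import numpy as np

def cloud_alignment_error(T, cloud_aux, cloud_main):
    """Transform the auxiliary-lidar check cloud into the main-lidar
    frame with the 4x4 extrinsic T, then report the mean
    nearest-neighbor distance to the main-lidar check cloud."""
    aux = np.asarray(cloud_aux, dtype=float)
    main = np.asarray(cloud_main, dtype=float)
    homog = np.hstack([aux, np.ones((len(aux), 1))])
    moved = (T @ homog.T).T[:, :3]
    # pairwise distance matrix of shape (N_aux, N_main)
    d = np.linalg.norm(moved[:, None, :] - main[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

T = np.eye(4)
T[:3, 3] = [1.0, 0.0, 0.0]                       # pure 1 m shift along x
aux = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
main = aux + [1.0, 0.0, 0.0]
err = cloud_alignment_error(T, aux, main)        # → 0.0
```

A residual near zero indicates the sixth coordinate transformation relationship is accurate; a large residual indicates a calibration error.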
In a second aspect, the present application further provides a calibration system for a vehicle sensor, the calibration system comprising a total station, a plurality of calibration devices, and a computing device, the computing device comprising:
the first control unit is used for controlling the total station to scan the calibration device when the vehicle to be calibrated meets a first calibration condition to obtain first data;
The second control unit is used for receiving second data acquired by the sensors on the vehicle to be calibrated for the calibration device when a second calibration condition is met;
The calibration unit is used for obtaining calibration prior information according to the first data, calibrating each sensor on the vehicle to be calibrated according to the calibration prior information and the second data to obtain calibration parameters of each sensor, wherein the calibration prior information is data information of the calibration device, and the calibration parameters are used for representing coordinate transformation relations between any two sensors or between the sensors and the total station.
In a third aspect, the present application further provides a calibration system-based calibration device for a vehicle sensor, including:
a memory for storing a computer program;
A processor for implementing the steps of the calibration system based vehicle sensor calibration method as described above when executing the computer program.
In a fourth aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the steps of a calibration system based vehicle sensor calibration method as described above.
The application provides a vehicle sensor calibration method, a system, a device and a medium based on a calibration system, relates to the field of automatic driving, and solves the problem of low vehicle sensor calibration efficiency. According to the scheme, when a vehicle to be calibrated meets a first calibration condition, the total station is controlled to scan the calibration device to obtain first data, when the vehicle to be calibrated meets a second calibration condition, all sensors on the vehicle to be calibrated are controlled to acquire the calibration device to obtain second data, calibration prior information is obtained according to the first data, and all the sensors on the vehicle to be calibrated are calibrated according to the calibration prior information and the second data. It can be seen that the use of the total station and the plurality of calibration devices in the present application can greatly reduce personnel requirements and time consumption in the calibration process. Therefore, the calibration cost can be reduced, mass production is convenient, and the production efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the prior art and the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a calibration method of a vehicle sensor based on a calibration system provided by the application;
FIG. 2 is a perspective view of a forward splice calibration member provided by the present application;
FIG. 3 is a schematic side view of a forward splice calibration member provided by the present application;
FIG. 4 is a perspective view of a reverse splice calibration member provided by the present application;
FIG. 5 is a schematic side view of a reverse splice calibration member provided by the present application;
FIG. 6 is a perspective view of a hybrid splice calibration member provided by the present application;
FIG. 7 is a schematic side view of a hybrid splice calibration member provided by the present application;
FIG. 8 is a schematic diagram of a calibration plate structure provided by the present application;
FIG. 9 is a schematic view of partial images of upper, lower, left and right corners of a calibration code provided by the application;
FIG. 10 is a schematic diagram of the world coordinate system setup provided by the present application;
FIG. 11 is a diagram illustrating an example of a prior art calibration plate according to the present application;
FIG. 12 is an exemplary diagram of another prior art calibration plate provided by the present application;
FIG. 13 is a top view of a view-around splice calibration layout provided by the present application;
FIG. 14 is a schematic diagram illustrating selection of a projective transformation matrix according to the present application;
FIG. 15 is a schematic diagram of the relationship between the millimeter wave radar coordinate system and the primary lidar coordinate system provided by the present application;
FIG. 16 is a schematic diagram of a layout of a check ball according to the present application;
FIG. 17 is a schematic view of a point cloud projected onto a camera image according to the present application;
Fig. 18 is a schematic diagram of point cloud point extraction provided by the present application;
fig. 19 is a schematic view of a point cloud point fitting circle provided by the application;
FIG. 20 is a schematic view of a vehicle parking area layout provided by the present application;
FIG. 21 is a top plan view of a vehicle parking area layout provided by the present application;
FIG. 22 is a side view of a vehicle parking area layout provided by the present application;
FIG. 23 is a block diagram of a calibration system for a vehicle sensor according to the present application;
FIG. 24 is a block diagram of a calibration system-based calibration device for a vehicle sensor according to the present application;
fig. 25 is a block diagram of a computer readable storage medium according to the present application.
Detailed Description
The application provides a calibration system-based vehicle sensor calibration method, a system, a device and a medium, wherein the total station and a plurality of calibration devices can be used for greatly reducing personnel requirements and time consumption in the calibration process. Therefore, the calibration cost can be reduced, mass production is convenient, and the production efficiency is improved.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Automatic driving is developing rapidly and has become a focus of competition in the automotive and technology industries. The modular autopilot architecture has become the mainstream solution due to its interpretability and ease of maintenance. The perception system is a key link of the automatic driving system; serving as the eyes of the vehicle, it provides environment state information for path planning. Because the environment is complex, existing perception systems generally adopt multi-sensor fusion, improving information redundancy and complementarity to ensure safety. Commonly used sensors include lidar, cameras and millimeter wave radar, which need to be combined efficiently depending on the task and the sensor characteristics. To ensure accurate perception, the sensors must be accurately calibrated to determine their intrinsic and extrinsic parameters. However, existing calibration methods have problems in engineering application: they require manual participation, are time-consuming and labor-intensive, are difficult to apply in mass production, involve a long solving process whose accuracy may suffer from poor initial values, and produce calibration results that are difficult to understand, especially for non-professionals.
Before describing the embodiments of the application, note that, in order to obtain the best calibration precision and improve the calibration automation rate, the three-dimensional coordinates of the main laser radar center relative to the vehicle tire contact points can first be obtained from the structural design parameters of the vehicle, and from these the ground projection point of the main laser radar center is derived. The total station is then installed in the calibration field as follows: step 1, align the center of the total station with the ground projection point of the main laser radar center; step 2, make the height of the total station center above the ground consistent with the height of the main laser radar center above the ground; step 3, align the x-axis direction of the total station coordinate system with the x-axis direction of the main laser radar; and step 4, adjust the total station to be level.
Referring to fig. 1, fig. 1 is a schematic flow chart of a calibration method for a vehicle sensor based on a calibration system, where the calibration system includes a total station and a plurality of calibration devices, and the method includes:
S11, when a vehicle to be calibrated meets a first calibration condition, controlling a total station to scan a calibration device to obtain first data;
Specifically, before calibration of the vehicle sensor is performed, it is first necessary to confirm whether the vehicle to be calibrated satisfies the first calibration condition. Once the conditions are met, the system controls the total station to perform scanning operation on the calibration device to obtain first batch of data. The total station is an instrument for measuring and drawing a three-dimensional space, and can acquire point cloud data by scanning a calibration device. And repeatedly scanning the calibration system by using the total station to obtain the dense point cloud data of the calibration site. The point cloud data comprise specific information of the calibration device, and can be used for determining data information of the calibration device, namely calibration prior information. The calibration prior information is reference data for calibrating the sensor, and can help to determine the internal and external parameters of the sensor.
S12, when a second calibration condition is met, receiving second data acquired by each sensor on the vehicle to be calibrated for the calibration device;
in the step, after confirming that the vehicle to be calibrated meets the second calibration condition, the system receives second data obtained by collecting and operating the calibration device by each sensor on the vehicle to be calibrated. The data can reflect the performance of the sensor in the actual use scene for subsequent sensor calibration.
The specific mode of receiving the second data may include one of directly controlling and acquiring each sensor to acquire the calibration device so as to obtain the second data, and the other of passively receiving the second data sent by the sensor.
And S13, obtaining calibration priori information according to the first data, and calibrating each sensor on the vehicle to be calibrated according to the calibration priori information and the second data to obtain calibration parameters of each sensor, wherein the calibration priori information is data information of a calibration device, and the calibration parameters are used for representing coordinate transformation relations between any two sensors or between the sensors and the total station.
In this step, after the first batch of data is obtained, the system obtains calibrated prior information based on the data. And simultaneously, combining the second batch of data and calibration prior information to perform calibration operation on each sensor on the vehicle to be calibrated. The calibration prior information refers to data information obtained by scanning the calibration device through the total station and can be used as a reference for sensor calibration. By the calibration operation, the measurement error of the sensor can be calibrated, and the accuracy and reliability of the sensor are improved.
According to the embodiment, the calibration device data obtained by scanning the total station and the data acquired by the sensor are received, calculation and analysis are carried out to obtain the calibration parameters of the sensor, and finally, the accurate calibration of the sensor is realized, so that the perception capability and the safety of an automatic driving system can be improved, and effective support is provided for the development of an automatic driving technology. In addition, the calibration method in the embodiment is automatically executed by the processor, human participation is not needed, a large amount of time and human resources are saved, the working efficiency is improved, and the method is applicable to the requirement of mass production.
In one embodiment, the process of determining whether the vehicle to be calibrated satisfies the first calibration condition includes:
Judging whether the vehicle to be calibrated is stopped to a preset area or not;
if the vehicle stops to the preset area, judging that the vehicle to be calibrated meets the first calibration condition, otherwise, judging that the vehicle to be calibrated does not meet the first calibration condition.
This embodiment describes a method for judging whether a vehicle to be calibrated meets the first calibration condition. The method includes judging whether the vehicle to be calibrated has stopped in a preset area: if it has stopped in the preset area, it is determined that the vehicle to be calibrated meets the first calibration condition; if it has not, it is determined that the vehicle to be calibrated does not meet the first calibration condition.
The embodiment can ensure that the vehicle to be calibrated must meet the preset conditions before the sensor calibration is performed. Thus, the accuracy and the reliability of calibration can be ensured. By judging whether the vehicle to be calibrated is stopped to a preset area, the vehicle can be ensured to be in a stable state when being calibrated. This is because the vehicle is stationary in the preset area and is not disturbed by the outside, and errors in the sensor data can be avoided.
In addition, the risk and uncertainty in the calibration process can be reduced. If the vehicle to be calibrated is not stopped to the preset area, the vehicle to be calibrated does not meet the first calibration condition, and the vehicle to be calibrated is judged to be unsuitable for sensor calibration. This can avoid inaccuracy or inefficiency of the calibration result due to unstable vehicle conditions.
In one embodiment, the process of determining whether the vehicle to be calibrated meets the first calibration condition includes:
Determining whether the vehicle to be calibrated is first calibrated or not according to the vehicle information of the vehicle to be calibrated;
if the vehicle to be calibrated is calibrated for the first time, judging that the vehicle to be calibrated meets the first calibration condition, otherwise, judging that the vehicle to be calibrated does not meet the first calibration condition.
In one embodiment, when it is determined that the vehicle is not being calibrated for the first time, the method further comprises:
and calling first data corresponding to the vehicle to be calibrated, which is stored in the database.
The present embodiment describes a process of determining whether the vehicle to be calibrated satisfies the first calibration condition. Firstly, determining whether the vehicle to be calibrated is calibrated for the first time according to the vehicle information of the vehicle to be calibrated. And if the vehicle to be calibrated is calibrated for the first time, judging that the vehicle to be calibrated meets the first calibration condition. And if the vehicle to be calibrated is not calibrated for the first time, judging that the first calibration condition is not met.
The embodiment can reduce the need for multiple calibration of the vehicle. By judging whether the vehicle to be calibrated is calibrated for the first time, unnecessary repeated calibration operation can be avoided. Only when the calibration is performed for the first time, the scanning operation of the total station on the calibration device is needed to obtain first data. By the method, the calibration efficiency can be improved, and the waste of time and resources is reduced. In addition, for the vehicle which is calibrated, the calibration prior information obtained in advance can be directly used, so that the sensor calibration process is greatly accelerated.
In a word, the embodiment improves the calibration efficiency and accuracy and saves time and resources on the premise of not affecting the calibration quality.
In one embodiment, the process of determining whether the second calibration condition is met includes:
judging whether a calibration starting instruction is received or not;
if the calibration start instruction is received, it is determined that the second calibration condition is met; otherwise, it is determined that the second calibration condition is not met.
The present embodiment relates to a process for judging whether the second calibration condition is satisfied. The judging process includes determining, at a control unit or other related device on the vehicle to be calibrated, whether a calibration start instruction sent by the calibration system has been received. If the calibration start instruction is received, it is determined that the second calibration condition is met; this means that the sensors on the vehicle to be calibrated can start acquiring the calibration device and the second data can be obtained. Otherwise, it is determined that the second calibration condition is not met, i.e. if no calibration start instruction is received, the sensors on the vehicle to be calibrated cannot acquire the calibration device and the second data cannot be obtained.
It can be seen that the judgment process in the present embodiment is performed based on whether or not the calibration start instruction is received. Only when a calibration start instruction is received, it is determined that the second calibration condition is satisfied, thereby starting the acquisition operation of the sensor. And if the calibration start instruction is not received, the acquisition operation of the sensor cannot be performed, so that the second calibration condition is judged not to be met.
In one embodiment, deriving calibration prior information from the first data includes:
Removing data which do not meet preset requirements from the first data to obtain first calibration data;
And determining calibration prior information according to the first calibration data.
Specifically, the first data are point cloud data obtained by scanning all calibration devices in the calibration system through a total station. The calibration priori information refers to data information of the calibration device. In implementing this embodiment, first data needs to be received and processed. The processing step comprises the step of eliminating data which do not meet preset requirements from the first data, so that first calibration data are obtained. Through the step, the data which do not meet the calibration requirements can be eliminated, and the accuracy and the reliability of the first calibration data are ensured.
Next, calibration prior information is determined from the first calibration data. The data information of the calibration device can be obtained through analysis and processing of the first calibration data. This information can be used as a priori knowledge for the calculation of subsequent calibration parameters and calibration of the sensor.
In the embodiment, the data which do not meet the preset requirement are removed through processing the first data, so that negative influence of the data on the calibration result is avoided. Meanwhile, the calibration prior information is determined according to the first calibration data, so that the existing data information can be better utilized, the calculation accuracy of the calibration parameters is improved, and the accurate calibration of each sensor is realized.
In one embodiment, removing data that does not meet a preset requirement from the first data to obtain first calibration data includes:
And eliminating the point cloud data which does not comprise the calibration device from the first data to obtain first calibration data.
In one embodiment, removing point cloud data not including the calibration device from the first data to obtain first calibration data includes:
And identifying the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall planes in the calibration system, so as to remove the point cloud data corresponding to the ceiling plane and/or the ground plane and/or the surrounding wall planes in the first data.
The present embodiment describes a method of processing first data to obtain first calibration data. The object of the present embodiment is to reject point cloud data corresponding to a ceiling plane and/or a ground plane and/or a surrounding wall plane in the first data. By eliminating these data, more accurate first calibration data may be obtained for determining calibration prior information.
This embodiment may be implemented by identifying point cloud data for a ceiling plane, a ground plane, and/or surrounding wall planes in a calibration system. The planes can be obtained through point cloud data obtained by scanning all calibration devices in the calibration system through a total station. In the identification process, the point cloud data may be analyzed using image processing or point cloud processing algorithms to determine the location and shape of the ceiling plane, ground plane, and/or surrounding wall planes. The point cloud data corresponding to the planes are identified and removed, so that the purpose of removing the point cloud data related to the planes can be achieved.
In summary, by the method in this embodiment, more accurate first calibration data may be obtained, so as to improve the precision and accuracy of sensor calibration. This helps to ensure more reliable and accurate results in calculating the calibration parameters for each sensor, thereby effectively calibrating each sensor.
In one embodiment, a process for identifying point cloud data for a ceiling plane and/or a ground plane and/or surrounding wall planes in a calibration system includes:
Acquiring a first distance from the central position of the total station to a ceiling plane and/or a ground plane and/or surrounding wall planes;
randomly screening at least three first point clouds from the first data, and calculating a three-dimensional space plane where the first point clouds are located, wherein the difference value between the distance from the first point clouds to the central position of the total station and the first distance is not more than a first preset distance;
And calculating a second distance from each point cloud in the first calibration data to the three-dimensional space plane, and taking data corresponding to the point clouds with the difference value between the second distance and the first distance not larger than a second preset distance as the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall planes.
In this embodiment, since the total station installation position information is known (including x, y, z, whether level, etc.), the positional relationship between the peripheral walls, floor, ceiling and the center of the total station of the calibration system can be easily utilized as the prior information. Thus, in this embodiment, it is first necessary to obtain a first distance from the central position of the total station to the ceiling plane and/or the ground plane and/or the surrounding wall planes. And then randomly screening at least three first point clouds from the first data, and calculating a three-dimensional space plane in which the point clouds are located. The point cloud refers to data points obtained by scanning all calibration devices in the calibration system by the total station. Next, the distance of the first point cloud to the central position of the total station is calculated and compared with the first distance. If the difference is not greater than the first predetermined distance, the point cloud data may be used as data for the ceiling plane and/or the ground plane and/or the surrounding wall planes. And finally, calculating a second distance from each point cloud in the first calibration data to the three-dimensional space plane, and comparing the second distance with the first distance. If the difference is not greater than the second preset distance, the corresponding point cloud data can be used as the data of the ceiling plane and/or the ground plane and/or the surrounding wall plane.
The first preset distance and the second preset distance may be the same or different, and the present application is not particularly limited herein.
In this embodiment, by measuring the plane in which the total station and the computing point cloud are located, point cloud data corresponding to the ceiling plane and/or the ground plane and/or the surrounding wall plane are identified and excluded, thereby obtaining first calibration data. By processing and comparing the data, the point cloud data meeting the preset conditions can be used as calculation calibration prior information. These calibration prior information may be used to calculate calibration parameters for each sensor to calibrate each sensor.
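The geometric core of the plane identification step, fitting a plane through sampled points and measuring point-to-plane distances, can be sketched as below. The function names and example values are illustrative, not taken from the patent.

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Plane through three non-collinear points, returned as
    (unit normal n, offset d) with the plane defined by n . x + d = 0."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    n = n / np.linalg.norm(n)
    return n, -float(n @ p1)

def point_plane_distance(pts, n, d):
    """Perpendicular distance of each point to the plane (n, d)."""
    return np.abs(np.asarray(pts, dtype=float) @ n + d)

# A horizontal plane z = 3, e.g. a ceiling 3 m above the total station center.
n, d = plane_from_points([0, 0, 3], [1, 0, 3], [0, 1, 3])
dist = point_plane_distance([[5, 5, 3.05], [2, 2, 0.0]], n, d)
# the first point is 0.05 m from the plane, the second 3 m
```

Points whose distance falls within the second preset distance would be kept as the plane's point cloud data; the rest are rejected.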
In one embodiment, further comprising:
Repeating the step of identifying point cloud data of a ceiling plane and/or a ground plane and/or surrounding wall planes in a calibration system;
Determining whether the difference value accords with an iteration termination condition according to the difference value between the number of point clouds of the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall plane obtained in the current iteration process and the number of point clouds of the ceiling plane and/or the ground plane and/or the surrounding wall plane obtained in the previous iteration process;
If yes, ending the iteration, and taking the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall plane obtained in the current iteration process as the point cloud data of the finally obtained ceiling plane and/or the ground plane and/or the surrounding wall plane.
The present embodiment describes an iterative process for identifying point cloud data for a ceiling plane and/or a ground plane and/or surrounding wall planes in a calibration system. This process includes performing the identifying step in a plurality of loops, each iteration determining whether an iteration termination condition is satisfied by comparing a difference between the number of point cloud data obtained in a current iteration process and the number of point cloud data obtained in a previous iteration process.
In each iteration process, the point cloud data identification step is firstly executed according to the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall planes obtained before. The step calculates a three-dimensional space plane by using the distances from the central position to the planes, which are measured by the total station, and at least three point clouds selected from the first data, and calculates the distance from each point cloud to the plane. Finally, point cloud data with a small difference from the preset distance is selected from the distances and used as the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall planes in a new iteration.
After each iteration is finished, comparing the difference between the number of the point cloud data obtained in the current iteration process and the number of the point cloud data obtained in the last iteration process, and judging whether the difference meets a preset iteration termination condition. For example, if the difference is not greater than the preset number, the iterative process is considered to have converged, ending the iteration. Or if the ratio of the number of the point cloud data obtained in the current iteration process to the number of the point cloud data obtained in the last iteration process is not larger than the preset ratio, the iteration process is considered to be converged, and the iteration is ended. And meanwhile, the point cloud data obtained in the current iteration process are used as the point cloud data of the finally obtained ceiling plane and/or ground plane and/or surrounding wall planes.
By means of the iterative mode, the identification of the point cloud data of the ceiling plane and/or the ground plane and/or the surrounding wall planes in the calibration system can be gradually improved and optimized, and therefore more accurate and reliable calibration results are obtained.
For example, assuming the calibration site point cloud is Φ, the specific steps are as follows:
According to the spatial distribution characteristics of the point cloud, the planes are sorted by interference level and extracted in that order to improve plane fitting efficiency. The ceiling has the least interference, followed by the ground and then the surrounding walls, so the plane extraction order is: the ceiling point cloud, then the ground, then the surrounding walls;
Point cloud identification is first performed on the ceiling plane, with the total-station-center-to-ceiling distance z_Top taken as prior information.
Step 1. From the point cloud set Φ_Top^i, randomly select N (N ≥ 3) points satisfying |z − z_Top| < ε_Top, where ε_Top is the plane selection error threshold (i.e., the first preset distance, in meters) and z is the vertical coordinate of a point cloud; the pairwise distance between the N points is required to be greater than a threshold L_Top (in meters). Calculate the three-dimensional plane π_Top^i formed by the N points.
It should be noted that although the total station is leveled, it is difficult to ensure that the six planes in the calibration system are perfectly horizontal or vertical, i.e., a certain error exists. For example, if the ceiling is non-horizontal and has a large slope, identifying the plane with only a single condition may work poorly. Setting the threshold ε_Top therefore, on the one hand, greatly reduces the number of subsequent iterations and, on the other hand, provides a certain adaptability to non-horizontal or non-vertical planes. It should also be noted that the threshold L_Top ensures that the spatial distribution of the N points is not too close; if the randomly selected N points are too close together, a plane with large error is easily fitted.
Step 2. Calculate the distance from all points in the point cloud set Φ_Top^i to the three-dimensional plane π_Top^i, and according to the distance threshold ε (i.e., the second preset distance, in meters), obtain the point set Φ_Top^(i+1) consisting of the points close to the plane π_Top^i.
Step 3. Check the iteration termination condition J = N_Top^(i+1) / N_Top^i, where N_Top^(i+1) is the number of point clouds obtained in the (i+1)-th iteration and N_Top^i is the number obtained in the i-th iteration. If J > T_Top (T_Top is an iteration threshold, e.g., 0.9), then Φ_Top^(i+1) is the ceiling plane point cloud P_Top; otherwise, let i ← i + 1 (by stepwise shrinking the candidate set, a more accurate planar point cloud can be obtained) and restart the calculation from step 1.
When step 1 is executed for the first time, the N (N ≥ 3) points are randomly selected from the full point cloud set Φ subject to the same condition, i.e., Φ_Top^1 = {p ∈ Φ : |z_p − z_Top| < ε_Top}.
Further, the principle of identifying the ground plane point cloud P_Bottom and the surrounding wall plane point clouds (P_Left, P_Right, P_Front, P_Back) is similar to that of identifying the ceiling plane point cloud, and is not repeated here. Taking the front wall as an example, the distance x_Front from the total station center to the front wall is used as prior information, and the corresponding point cloud is obtained with the same method as for the ceiling.
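The iterative plane extraction described in steps 1 to 3 can be sketched as follows. This is a minimal illustration, assuming numpy is available; the function names, the least-squares plane fit, and all default threshold values are illustrative choices, not the patent's exact implementation.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 points.
    Returns a unit normal n and offset d such that n . x + d = 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                                # direction of smallest variance
    return n, -n.dot(centroid)

def extract_ceiling_plane(cloud, z_top, eps_top=0.2, eps=0.05, t_top=0.9,
                          l_top=1.0, n_pick=3, rng=None, max_iter=50):
    """Iteratively shrink the candidate set toward the ceiling plane.

    cloud   : (M, 3) site point cloud Phi
    z_top   : total-station-to-ceiling distance (prior information)
    eps_top : first preset distance, |z - z_top| < eps_top
    eps     : second preset distance (point-to-plane inlier threshold)
    t_top   : iteration-termination ratio threshold (e.g. 0.9)
    l_top   : minimum pairwise separation of the sampled points
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # First execution: candidates are points whose height is near the prior z_top
    cand = cloud[np.abs(cloud[:, 2] - z_top) < eps_top]
    for _ in range(max_iter):
        # Sample N well-separated points (pairwise distance > l_top)
        for _ in range(100):
            idx = rng.choice(len(cand), size=n_pick, replace=False)
            pts = cand[idx]
            d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
            if d[np.triu_indices(n_pick, 1)].min() > l_top:
                break
        n, d0 = fit_plane(pts)
        dist = np.abs(cand @ n + d0)          # point-to-plane distances
        inliers = cand[dist < eps]
        j = len(inliers) / len(cand)          # convergence ratio J
        if j > t_top:
            return inliers                    # converged: ceiling point cloud P_Top
        cand = inliers                        # shrink the candidate set and repeat
    return cand
```

In practice the same routine would be reused for the ground and wall planes, with the prior distance and the filtered coordinate axis changed accordingly.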
In one embodiment, the calibration device includes a plurality of calibration members, and determining the calibration prior information from the first calibration data includes:
performing spatial clustering segmentation on the first calibration data according to the positions of the calibration components in the calibration system to obtain first local point cloud data corresponding to each calibration component;
And determining first identity information and first coordinate information of each calibration component according to the first local point cloud data.
In this embodiment, when the calibration device comprises a plurality of calibration members, the members occupy different positions in the calibration system. First, spatial cluster segmentation can be performed on the point cloud data in the first calibration data: the points are grouped by position so that points in the same group are close to each other and far from points in other groups. This segmentation yields the first local point cloud data corresponding to each calibration member. Next, the first identity information and first coordinate information of each calibration member can be determined by analyzing its first local point cloud data: the characteristics or shape of each calibration member can be identified and compared with known calibration members to confirm its identity, while its coordinate information in the calibration system (such as three-dimensional coordinate information) can be determined from the position information in the point cloud data.
It should be noted that, when the calibration component includes a plurality of calibration boards and each calibration board has identity information corresponding to itself, the first identity information may be a combination of identity information of the plurality of calibration boards. The first coordinate information is used to characterize the spatial position of the calibration member in the calibration system, such as three-dimensional coordinates of the center position of the calibration member in the calibration system, and the like.
Through the above process, the first identity information and the first coordinate information of each calibration member can be determined, and these information will be used for subsequent calculation of calibration parameters of the respective sensor to calibrate the respective sensor. Therefore, the vehicle sensor calibration method can effectively and accurately calibrate each sensor according to the data information of the calibration device.
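The spatial cluster segmentation step can be illustrated with a minimal single-linkage clustering sketch. The patent does not specify a clustering algorithm; the function name, the radius parameter, and the greedy region-growing strategy are assumptions, chosen because the calibration members are stated to be spaced far apart.

```python
import numpy as np

def euclidean_clusters(points, radius=1.0):
    """Greedy single-linkage clustering: points closer than `radius`
    end up in the same cluster. Returns an integer label per point."""
    n = len(points)
    labels = -np.ones(n, dtype=int)
    cur = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = cur
        stack = [seed]
        while stack:
            i = stack.pop()
            # unlabeled neighbors of point i within the linkage radius
            near = np.where((labels == -1) &
                            (np.linalg.norm(points - points[i], axis=1) < radius))[0]
            labels[near] = cur
            stack.extend(near.tolist())
        cur += 1
    return labels
```

Each resulting label then corresponds to one calibration member's first local point cloud data.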
In one embodiment, each calibration member includes at least two calibration plates, and determining first identity information and first coordinate information of each calibration member according to the obtained first local point cloud data includes:
Determining second local point cloud data corresponding to each calibration plate from the first local point cloud data;
and determining second identity information and second coordinate information corresponding to each calibration plate according to the second local point cloud data.
The present embodiment describes a determination process of the composition of the calibration member and the corresponding identity information and coordinate information in the vehicle sensor calibration method. Each calibration member in the method comprises at least two calibration plates. First, first identity information and first coordinate information of each calibration member are determined according to first local point cloud data. This may be achieved by determining second local point cloud data corresponding to each calibration plate from the first local point cloud data. And then, according to the second local point cloud data, second identity information and second coordinate information corresponding to each calibration plate can be determined.
It should be noted that, the second identity information is used for representing the identity information of the calibration board, and if the calibration board includes a plurality of calibration codes, the identity information of the calibration board may be a combination of the identity information of the calibration codes. In this embodiment, the combination of the second identity information of the calibration plate included on the calibration member is the first identity information of the calibration member. The second coordinate information is used for representing the positions of the calibration plates in the calibration system, and if the three-dimensional coordinates of the calibration members and the relative positions of the calibration plates and the calibration members are known, the positions of the calibration plates can be known.
In this way, calibration parameters of the sensor may be determined from data collected by the calibration components and the sensor on the vehicle. This ensures that each sensor is able to accurately sense the surrounding environment and provide accurate data.
In one embodiment, each calibration member includes at least two non-coplanar calibration plates, and determining second local point cloud data corresponding to each calibration plate according to the position of each calibration plate in each calibration member and the first local point cloud data includes:
and performing plane fitting on each calibration plate to extract second local point cloud data corresponding to each calibration plate.
The present embodiment describes a step of processing the first local point cloud data to determine second local point cloud data corresponding to each calibration plate. The processing step comprises the step of carrying out plane fitting on each calibration plate so as to extract second local point cloud data corresponding to each calibration plate. In such an embodiment, for each calibration plate, it is first found in the first local point cloud data. The point cloud data on each calibration plate is then processed using a plane fitting technique and the plane closest to the surface of the calibration plate is found. Plane fitting may fit a plane by minimizing the distance between points and the plane. In this embodiment, a planar fitting technique is used to extract second local point cloud data corresponding to each calibration plate, i.e. point cloud data located near the surface of the calibration plate.
By fitting the plane of each calibration plate, the second local point cloud data corresponding to each calibration plate, namely, the point cloud data located at a position close to the calibration plate on the plane obtained by plane fitting, can be determined.
Through the processing step, second local point cloud data corresponding to each calibration plate can be extracted from the first local point cloud data for subsequent calibration parameter calculation and sensor calibration. Thus, the accuracy and reliability of calibration can be enhanced, and the performance and accuracy of the vehicle sensor can be improved.
In one embodiment, performing plane fitting on each calibration plate to extract second local point cloud data corresponding to each calibration plate includes:
Acquiring a first height between the center of each calibration plate and the ground, randomly screening at least three second point clouds from the first local point clouds, and calculating a second three-dimensional space plane where the second point clouds are located;
calculating a third distance from each point cloud in the first local point cloud data to the second three-dimensional space plane;
and taking the data corresponding to the point cloud with the difference value between the third distance and the first height not larger than the third preset distance as second local point cloud data corresponding to the calibration plate.
This embodiment describes a method of performing plane fitting on each calibration plate so as to extract its corresponding second local point cloud data. The specific steps are: obtain the first height between the center of each calibration plate and the ground; randomly select at least three point clouds from the first local point cloud data as the second point clouds; calculate a three-dimensional plane from the selected second point clouds; for each point cloud in the first local point cloud data, calculate its perpendicular distance to the second three-dimensional space plane, i.e., the third distance; and compare the third distance with the first height, screening out the point clouds for which the difference is not greater than the third preset distance. These point clouds are regarded as the second local point cloud data corresponding to the calibration plate.
For example, for the point cloud data set Φ_C containing only calibration member points, clustering is performed according to the spatial distance between the calibration members (the members are spaced far apart, so they are easy to separate spatially), first obtaining the point cloud data set Φ_C^k of each calibration member (i.e., the first local point cloud data; Φ_C^k denotes the point cloud data set corresponding to the k-th calibration member, k = 1, …, M_2, where M_2 is the total number of calibration members);
Each calibration member is made up of S (typically 3 or 4) non-coplanar calibration plates. For the k-th calibration member point cloud data set Φ_C^k, the point cloud data of each calibration plate are extracted from top to bottom (or from bottom to top) using the plane fitting principle, with the following specific steps:
step 1 for calibration component Point cloud data set According to the information of the center height z i (i.e. the first height) of the calibration plate (i.e. the first height), randomly selecting the calibration plate to meet the requirement(Selecting an error threshold value for a plane, wherein N (N is more than or equal to 3) points (second point cloud) under the condition that N is more than or equal to 3, and the distance between the N points is required to be larger than a threshold value L a (L a <0.7W is usually the calibrated width, and W is expressed in meters), and calculating a three-dimensional space plane formed by the N points
It should be noted that the calibration member is formed by splicing a plurality of calibration plates, and the installation height of each calibration plate (i.e., the center height of the plate) is known; this prior information speeds up plane fitting and improves its accuracy. If the selected second point clouds were distributed across different calibration plates, invalid calculations would easily occur and the algorithm would fall into many invalid loops; the selection in step 1 therefore both fits the calibration plate plane accurately and improves plane-fitting efficiency.
Step 2. Calculate the distance (the third distance) from all points in Φ_C^k to the three-dimensional plane, and according to the distance threshold ε_b (in meters), obtain the point set close to the plane.
Step 3. Calculate the iteration termination condition J (the ratio of the number of point clouds obtained in the current iteration to the number obtained in the previous iteration). If J > T (T is a threshold, e.g., 0.9), the current point set is taken as the point cloud of the i-th calibration plate on the k-th calibration member; otherwise, replace the candidate set with the current point set and restart the calculation from step 1.
The point cloud P_(k,i) of the i-th calibration plate can be obtained through the above steps; P_(k,i) is then eliminated from Φ_C^k, and the steps are repeated to obtain, one by one, the point cloud of each calibration plate on the k-th calibration member.
Through the steps, the second local point cloud data corresponding to each calibration plate meeting the requirements can be extracted from the first local point cloud data, so that the goal of carrying out plane fitting on each calibration plate is realized. The method can accurately extract the shape and position information of each calibration plate, and provides valuable data reference for subsequent sensor calibration.
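The peel-and-repeat procedure (fit one plate's plane using its known center height, remove its points from the member's point cloud, then continue with the next plate) might look like the following sketch. For brevity, a direct least-squares fit on the height-band candidates replaces the iterative random sampling of steps 1 to 3; all names and threshold values are illustrative assumptions.

```python
import numpy as np

def peel_plates(member_cloud, plate_heights, eps_a=0.15, eps=0.03):
    """Extract each plate's points top-down using its known center height
    z_i as a prior, removing extracted points before fitting the next plate.

    member_cloud  : (M, 3) point cloud of one calibration member
    plate_heights : known center height z_i of each plate (meters)
    eps_a         : height band around z_i used to seed the plane fit
    eps           : point-to-plane inlier threshold
    """
    remaining = member_cloud.copy()
    plates = []
    for z_i in sorted(plate_heights, reverse=True):    # top to bottom
        cand = remaining[np.abs(remaining[:, 2] - z_i) < eps_a]
        centroid = cand.mean(axis=0)
        _, _, vt = np.linalg.svd(cand - centroid)      # least-squares plane
        n = vt[-1]
        dist = np.abs((remaining - centroid) @ n)
        inliers = dist < eps
        plates.append(remaining[inliers])              # plate point cloud
        remaining = remaining[~inliers]                # eliminate it, repeat
    return plates
```

Because the plates are non-coplanar, each fitted plane captures only its own plate's points even though distances are evaluated over the whole remaining cloud.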
In one embodiment, the calibration plate comprises a calibration carrier plate and a plurality of mark codes, wherein the mark codes are fixed on the calibration carrier plate and carry two-dimensional code information for the sensor to recognize, and identity information corresponding to the two-dimensional code information of all the mark codes is different;
Determining second identity information corresponding to each calibration plate according to the second local point cloud data, including:
converting the second local point cloud data into a gray scale image;
Dividing the gray level image according to the interval between the mark codes to obtain a local image of each mark code;
And carrying out two-dimensional code recognition on each local image to determine the first ID information of each marker code, and determining the second identity information of the calibration plate according to the first ID information of each marker code on the calibration plate.
The present embodiment describes a calibration plate design in a vehicle sensor calibration system. The calibration plate comprises a calibration carrier plate and a plurality of mark codes. The marking codes are fixed on the calibration carrier plate, and each marking code carries two-dimensional code information for the sensor to recognize. It should be noted that the identity information corresponding to the two-dimensional code information of each flag code is different.
On this basis, the second identity information corresponding to each calibration plate is determined from the second local point cloud data as follows. The second local point cloud data is first converted into a grayscale image, which facilitates subsequent image processing and analysis. The grayscale image is segmented according to the spacing between the marker codes (a known condition) to obtain a local image of each marker code, so that each marker code can be processed separately. Two-dimensional code recognition is then performed on each local image; by decoding and analyzing the two-dimensional code image, the first ID information of each marker code is obtained and serves as its unique identification. Finally, the second identity information of the calibration plate is determined from the first ID information of the marker codes on it (for example, the plate's identity information is the combination, in a preset manner, of the first ID information of the plurality of marker codes on the plate). By integrating and comparing the first ID information of the marker codes, the identity of the entire calibration plate can be determined; since the identity information corresponding to each marker code's two-dimensional code is different, every calibration plate can be uniquely identified.
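The conversion of a plate's point cloud into a grayscale image and the grid segmentation by known marker spacing can be sketched as follows. The rasterization resolution, the choice of y/z as image axes, and the function names are assumptions; real marker decoding would then run on each tile, for example with an ArUco detector.

```python
import numpy as np

def cloud_to_gray(points, intensity, res=0.01):
    """Rasterize plate points (using y/z as image axes) into a grayscale
    image, normalizing lidar intensity to the 0..255 range."""
    y, z = points[:, 1], points[:, 2]
    u = np.rint((y - y.min()) / res).astype(int)
    v = np.rint((z.max() - z) / res).astype(int)      # image rows grow downward
    img = np.zeros((v.max() + 1, u.max() + 1), dtype=np.uint8)
    g = (255 * (intensity - intensity.min()) /
         max(np.ptp(intensity), 1e-9)).astype(np.uint8)
    img[v, u] = g
    return img

def split_markers(img, marker_px, gap_px):
    """Cut the plate image into per-marker tiles on a grid whose
    marker size and spacing (in pixels) are known in advance."""
    tiles, step = [], marker_px + gap_px
    for r0 in range(0, img.shape[0] - marker_px + 1, step):
        for c0 in range(0, img.shape[1] - marker_px + 1, step):
            tiles.append(img[r0:r0 + marker_px, c0:c0 + marker_px])
    return tiles
```

Each tile returned by `split_markers` would then be passed to a two-dimensional code decoder to recover that marker's first ID information.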
In one embodiment, after converting the second local point cloud data into the grayscale image, filtering the grayscale image is further included.
The mode in the embodiment can calibrate the sensor more accurately, and improves the stability and the precision of a calibration system.
The calibration member referred to in the embodiments is described here. Specifically, the calibration member may include at least two calibration plates and a connecting piece fixed between two adjacent calibration plates for connecting them. As shown in fig. 2, a calibration member is obtained from a plurality of calibration plates, for example by connecting the lower edge of calibration plate 1 to the upper edge of calibration plate 2 through a connecting piece. Because the calibration member of the application combines a plurality of calibration plates in one place, it can increase the number of sensors on the vehicle, such as cameras, lidars and millimeter-wave radars with different horizontal and/or vertical Fields of View (FOV), that can be calibrated at the same calibration position, thereby improving calibration precision.
Furthermore, in the calibration member provided by the application, the connecting piece is also used to control the included angle between either of the two calibration plates and the vertical plane of the calibration member. Specifically, the connecting piece can adjust the included angles between the connected calibration plates. As shown in fig. 3, the angles between calibration plates 1 and 2 and the vertical (plumb) plane of the calibration member can be adjusted according to the user's requirements; for example, on the basis that the marker codes on calibration plates 1 and 2 remain easy for the sensor to recognize, the connecting pieces are adjusted so that the angle between each plate and the plumb plane is 30 degrees.
Furthermore, the calibration member provided by the application can adjust the included angles among a plurality of connected calibration plates according to the use requirement of a user, so that a plurality of different sensors on a vehicle can effectively identify the calibration plates. And moreover, the plurality of calibration plates on the calibration component have included angles, so that non-coplanar points are increased, and the calibration precision is effectively improved.
Furthermore, in the calibration member provided by the application, the front surfaces of the two calibration plates may be positioned on the same side of the calibration member. Specifically, as shown in fig. 3, the calibration member can be obtained by splicing a plurality of calibration plates in the forward direction. At this time, the front surfaces of calibration plate 1 and calibration plate 2 are on the same side of the calibration member, and the included angle between each plate and the vertical plane, for example 30 degrees, is adjusted according to the sensor calibration requirements of the vehicle. When the sensors of the vehicle need to be calibrated, the two-dimensional code information on the same face of calibration plate 1 and calibration plate 2 can be obtained and used to calibrate different sensors on the vehicle. The calibration member provided by the application increases the non-coplanar points recognized on the calibration plates, thereby improving the calibration precision of the sensors.
Furthermore, in the calibration member provided by the application, the front surfaces of the two calibration plates may also be positioned on two sides of the calibration member. Specifically, as shown in figs. 4-5, the calibration member can also be obtained by splicing a plurality of calibration plates in the reverse direction; figs. 4 and 5 are respectively a perspective view and a side view of the calibration member under reverse splicing. At this time, the front surfaces of calibration plate 1 and calibration plate 2 are on two sides of the calibration member, and the included angle between calibration plate 1 and the vertical plane, for example 30 degrees, is adjusted according to the sensor calibration requirements of the vehicle. When the sensors of the vehicle need to be calibrated, the two-dimensional code information on the front surface of calibration plate 1 and on the back surface of calibration plate 2 can be obtained and used to calibrate different sensors on the vehicle.
In addition, as shown in figs. 6 and 7, the calibration member provided by the application can also be obtained by combining forward-spliced and reverse-spliced calibration plates; figs. 6 and 7 are respectively a perspective view and a side view of a calibration member combining several different splicing modes. Specifically, the calibration member can splice a plurality of calibration plates at certain included angles according to the calibration requirements of sensors such as cameras, lidars and millimeter-wave radars. For example, combining the spliced calibration plates of figs. 2 and 4 yields the calibration member shown in fig. 6. In this case, when the sensors of the vehicle are calibrated, they need to identify information such as the included angles and marker codes of the combined front-facing and back-facing calibration plates, which further improves the calibration precision of the vehicle sensors.
It is emphasized that the application does not limit the specific number of calibration plates in the calibration member, and a user can increase or decrease the number of calibration plates based on the difference of the number of sensors and the angle of view on different vehicles according to the use requirement of the user, so as to meet the sensor calibration requirements of different vehicles.
On the basis of the above embodiment, since the plurality of calibration plates in the calibration member provided by the application face the vehicle sensors differently with their front and back surfaces, the non-coplanar differences of the reference objects used for sensor calibration are further increased, improving the precision of vehicle sensor calibration.
In one embodiment, determining second coordinate information corresponding to each calibration plate according to the second local point cloud data includes:
Extracting four sides of the circumscribed rectangle of the partial image of each marker code, and determining the intersection point of two adjacent sides as a key point;
acquiring first image coordinates of each key point, wherein the first image coordinates represent the positions of the key points on the calibration plate;
and determining a first three-dimensional coordinate corresponding to each key point according to the position of the mark code on the calibration plate, the first image coordinate and the first reference three-dimensional coordinate of each point cloud in the second local point cloud data.
The second coordinate information corresponding to each calibration plate is determined as follows: the four sides of the circumscribed rectangle of each marker code's local image are extracted, and the intersection points of adjacent sides are determined as key points; the first image coordinates of each key point, which represent the position of the key point on the calibration plate, are obtained; and the first three-dimensional coordinate corresponding to each key point is calculated according to the position of the marker code on the calibration plate, the first image coordinates, and the first reference three-dimensional coordinates of each point cloud in the second local point cloud data.
In this embodiment, the key points are determined by extracting four edges of the circumscribed rectangle of the partial image of the marker code, so that errors caused by image noise can be effectively reduced, and calibration accuracy is improved. By using the mark code as the characteristic point for identification, the robustness of the system to the change of the posture of the calibration plate and the change of the observation angle of the sensor can be improved. By using the two-dimensional code technology, the first ID information of the marker code can be quickly and accurately acquired, so that the time and cost of manual processing are reduced.
The first image coordinates represent positions of the key points on the calibration plate, the first reference three-dimensional coordinates are known, and then the first three-dimensional coordinates corresponding to the key points can be determined under the condition that the positions of the calibration plate are known and the first image coordinates of the key points on the calibration plate are known. The set of all the first three-dimensional coordinates is the three-dimensional coordinate set of all the key points.
In summary, the method described in this embodiment may achieve calibration of the vehicle sensor by identifying the key points on the calibration board and the corresponding first three-dimensional coordinates thereof, and has the beneficial effects of high accuracy, high robustness and quick calibration.
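The key-point step, intersecting adjacent edges of a marker's circumscribed rectangle, reduces to solving small 2x2 linear systems. A sketch with illustrative names, representing each edge as line coefficients (a, b, c) with a*x + b*y + c = 0:

```python
import numpy as np

def line_intersect(l1, l2):
    """Intersection of two 2-D lines given as coefficient triples (a, b, c)."""
    a = np.array([[l1[0], l1[1]], [l2[0], l2[1]]], dtype=float)
    b = -np.array([l1[2], l2[2]], dtype=float)
    return np.linalg.solve(a, b)              # [x, y] of the intersection

def rect_keypoints(edges):
    """Key points of a marker: intersections of each pair of adjacent
    circumscribed-rectangle edges (edges ordered top, right, bottom, left)."""
    return [line_intersect(edges[i], edges[(i + 1) % 4]) for i in range(4)]
```

Each returned point is a corner of the rectangle, i.e., a key point whose first image coordinates feed the subsequent three-dimensional coordinate lookup.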
In one embodiment, when there is no three-dimensional coordinate corresponding to the first image coordinate of the key point in the first reference three-dimensional coordinate, the method further includes:
acquiring three-dimensional coordinates corresponding to a plurality of reference points adjacent to the key points;
and determining a first three-dimensional coordinate corresponding to the key point according to the three-dimensional coordinates of the plurality of reference points.
When a key point has no directly corresponding point cloud, a plane can be fitted from the pixels nearest to the key point in the up, down, left and right directions that do have corresponding point clouds; using the pixel distances in these four directions on the image and the fitted plane, bilinear interpolation and plane-angle offset calibration are performed to obtain the three-dimensional coordinates of the key point.
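The bilinear-interpolation fallback can be sketched for the simple case of four surrounding pixels with known three-dimensional coordinates. The plane-angle offset calibration mentioned above is omitted here; the function name and the corner-dictionary interface are assumptions.

```python
import numpy as np

def bilinear_3d(u, v, corners):
    """Interpolate the 3-D coordinate of an image key point (u, v) that has
    no direct point-cloud correspondence, from the four nearest pixels with
    known 3-D points. `corners` maps pixel (u_i, v_i) -> 3-D coordinate."""
    (u0, v0), (u1, v1) = min(corners), max(corners)
    tu = (u - u0) / (u1 - u0)                 # weight from horizontal pixel distance
    tv = (v - v0) / (v1 - v0)                 # weight from vertical pixel distance
    p00 = np.asarray(corners[(u0, v0)], float)
    p10 = np.asarray(corners[(u1, v0)], float)
    p01 = np.asarray(corners[(u0, v1)], float)
    p11 = np.asarray(corners[(u1, v1)], float)
    top = (1 - tu) * p00 + tu * p10           # interpolate along u at v0
    bot = (1 - tu) * p01 + tu * p11           # interpolate along u at v1
    return (1 - tv) * top + tv * bot          # then interpolate along v
```

Since the four neighbors lie on the fitted calibration-plate plane, the interpolated point also lies on that plane.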
In one embodiment, the calibration plate is further provided with a metal block, and the second identity information corresponding to each calibration plate is determined according to the second local point cloud data, and the method further includes:
And determining third identity information of the metal block according to the first ID information of each mark code.
In an automatic calibration system for automatic driving sensors, when the sensors include a millimeter-wave radar, a metal block needs to be arranged on the calibration plate, because the metal block can simulate a target object on a real road, such as a vehicle, a pedestrian or a static obstacle, and provide a reflection of the radar signal. During calibration, the millimeter-wave radar emits radar waves; the metal block reflects these waves and provides parameters similar to those of an actual target object. By arranging the metal block on the calibration plate, the millimeter-wave radar can recognize it at different angles and positions, which helps determine the accurate position and angle of the sensor.
In one embodiment, determining second coordinate information corresponding to each calibration plate according to the second local point cloud data further includes:
and determining the three-dimensional coordinates of the metal block according to the first three-dimensional coordinates of the key points corresponding to the marking codes and the positions of the metal block on the calibration plate.
In this embodiment, the first step is to acquire second local point cloud data. The second local point cloud data refers to extracting a part related to the calibration plate from the whole point cloud image. Next, second coordinate information of the metal block on each calibration plate is determined according to the second local point cloud data. This means that by analysing the second local point cloud data, the position of each metal block on the calibration plate can be determined. And finally, determining the three-dimensional coordinates of the metal block according to the position of the metal block on the calibration plate and the first three-dimensional coordinates of the key point corresponding to each mark code. This means that by matching the position of the metal block with the key points of the logo code, the position of the metal block in three-dimensional space can be determined.
In one embodiment, the metal block is fixed at the center of the calibration carrier plate, and the plurality of marker codes are fixed on the calibration carrier plate around the periphery of the metal block, with the corner of each marker code nearest the metal block meeting a vertex of the metal block.
In one embodiment, before determining the three-dimensional coordinates of the metal block according to the position of the metal block on the calibration plate and the first three-dimensional coordinates of the key points corresponding to the marker codes, the third identity information of the metal block is determined according to the first ID information of each marker code.
If a millimeter-wave radar sensor needs to be calibrated, a metal block can further be arranged in the calibration plate. When the calibration plate is manufactured, the ID information of the four marker codes diagonally adjacent to the metal block (one in each of its four corner neighborhoods) is known prior information, so the third identity information of the metal block can be encoded clockwise from the four first ID information of these marker codes, e.g., ID_lt ID_rt ID_rd ID_ld.
The third identity information of each metal block is different. When calibration codes are arranged on each calibration plate, the third identity information of the metal block may be a combination, in a first manner, of the first ID information of those calibration codes. The second identity information of the calibration plate in the above embodiment may be a combination of the first ID information in a second manner, or a combination of each piece of first ID information with the third identity information of the metal block, etc., which is not limited herein.
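As a minimal sketch of this clockwise encoding (the helper name and the "-" separator are illustrative assumptions, not prescribed by the application), the third identity of a metal block can be derived from the first IDs of its four corner-adjacent marker codes:

```python
def metal_block_id(id_lt: int, id_rt: int, id_rd: int, id_ld: int) -> str:
    """Encode a metal block's third identity from the first IDs of the four
    marker codes adjacent to its corners, taken in clockwise order
    (upper-left, upper-right, lower-right, lower-left)."""
    return f"{id_lt}-{id_rt}-{id_rd}-{id_ld}"
```

For example, neighbouring marker IDs 3, 7, 12, 5 yield "3-7-12-5"; since every marker-code ID on the site is unique, the concatenated identity is unique per metal block.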
In one embodiment, determining second coordinate information corresponding to each calibration plate according to the second local point cloud data further includes:
determining three-dimensional coordinates of four corner points of the metal block according to the first three-dimensional coordinates of key points corresponding to the marking codes and the position relations of the metal block and the marking codes;
and determining the central three-dimensional coordinates of the metal block according to the three-dimensional coordinates of the four corner points of the metal block.
In this embodiment, after the three-dimensional coordinates of the key points of the calibration code are obtained, the three-dimensional coordinates of the four corner points of the metal block can be obtained according to the opposite corner relationship between the four corner angles of the metal block and the key points of the calibration code. And then calculating the central three-dimensional coordinates of the metal block according to the three-dimensional coordinates of the four corner points.
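The center computation above is simply the mean of the four corner coordinates; a minimal sketch (the function name and tuple representation are assumptions):

```python
def block_center(corners):
    """Center of the metal block as the mean of its corner points.

    corners: iterable of four (x, y, z) tuples in the same coordinate system.
    """
    n = len(corners)
    return tuple(sum(c[i] for c in corners) / n for i in range(3))
```

For a square block with corners at (0,0,0), (2,0,0), (2,2,0), (0,2,0), the center is (1.0, 1.0, 0.0).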
In the calibration plate structure described in the above embodiments, the calibration carrier plate is a plate body for carrying the marker codes and the metal block; its material may be a non-metallic material that does not deform easily, such as an acrylic plate. The metal block located at the center of the calibration carrier plate is used for calibrating the millimeter wave radar and other sensors installed on the autonomous vehicle. The marker codes are two-dimensional codes that can encode identity information, such as the binary square fiducial markers ArUco (Augmented Reality University of Córdoba) used for camera pose estimation, which makes it easy for sensors such as the lidar mounted on the autonomous vehicle to identify each marker code on the calibration plate. The size and number of the marker codes can be adjusted according to user requirements, based on factors such as the focal length of the camera and the size of the lidar field of view. To help the sensors distinguish different marker codes, none of the marker codes on the calibration plate may repeat, and the identity information of each marker code is unique.
It should be emphasized that, in the technical solution provided by the present application, the metal block is used to obtain the spatial homonymous points (the same key points described in the following embodiments) when the millimeter wave radar and the main lidar on the vehicle are subsequently calibrated, so as to achieve the calibration of the millimeter wave radar against the main lidar. The ID information carried in the two-dimensional code of each marker code is used to perform the subsequent automatic calibration of the cameras' intrinsic and extrinsic parameters, and can also be used to perform the subsequent automatic calibration between the cameras and the lidar.
The homonymous (same-name) points are used to unify the data of different sensors in the same three-dimensional space: only after obtaining the several different coordinates that different sensors assign to the same point can each sensor be calibrated against the coordinate differences. For example, the center of one spatial sphere has coordinates (u, v) in the image captured by the vehicle's camera, (x, y, z) in the point cloud obtained by the vehicle's lidar, and (x, y, θ) in the point cloud obtained by the vehicle's millimeter wave radar. These coordinates are unified in one three-dimensional coordinate system, and the parameters of each sensor are adaptively adjusted according to the coordinate differences, achieving the calibration of the vehicle sensors.
The calibration plate comprises a metal block located at the center for calibrating the sensors and a plurality of marker codes surrounding the metal block. The two-dimensional code information on the marker codes corresponds to different identity information, from which the identity ID information of the metal block can be rapidly determined, and the vehicle's sensors can automatically recognize the marker codes in image and point cloud data according to the different IDs, giving good automatic key-point matching. In addition, the ID information assists in acquiring the ID information and spatial position information of the metal block located in the middle of the marker codes, which avoids confusion caused by the high similarity between different metal blocks and solves the problem of automatic correspondence between lidar and millimeter wave radar spatial points.
According to the technical scheme provided by the application, the identity of the metal block surrounded by the marker codes can be identified through the unique two-dimensional codes of the plurality of marker codes, without manual assistance, which improves both the calibration accuracy and the calibration efficiency.
In the calibration plate provided by the application, in order to help the vehicle sensors distinguish the marker codes and the metal block on the calibration carrier plate, the marker codes can be obtained by spraying dark paint with good laser point cloud absorptivity onto the calibration carrier plate, for example black marker codes or marker codes with a large color difference from the carrier plate. When a sensor of the vehicle, for example a lidar, is calibrated, the position of the calibration carrier plate can be obtained through point cloud plane fitting of the calibration plate, and the ID information of each marker code and the three-dimensional coordinates of the key points on the calibration plate are further identified through filtering and segmentation of the carrier plate point cloud. For ease of distinction, the surface color of the non-metallic material selected for the calibration carrier plate needs to differ from the color of the marker code coating.
Because the surface of the dark areas of the calibration carrier plate and the surface of the marker codes differ markedly in color and the like, the lidar point clouds at different positions within the calibration plate differ correspondingly, which improves the marker code identification efficiency on the lidar point cloud.
It should be further noted that, in the calibration plate provided by the application, the metal block is of a polygonal structure, the marker codes are polygonal structures, the vertex of each marker code nearest the metal block is connected with a vertex of the metal block, and the number of vertices of the metal block is greater than or equal to 4.
Specifically, in the calibration plate provided by the application, the metal block and the marker codes on the calibration plate are polygonal, and the vertices of the metal block are each directly connected with a vertex angle of one of the marker codes surrounding its periphery. The vehicle sensor can then fully locate the metal block simply by acquiring the identity information and position information of the marker codes adjacent to it, which establishes the correlation between the identity information and the positioning information of the metal block. The number of marker codes is at least 4, and to guarantee positioning accuracy the metal block has at least four sides.
Because the metal block in the calibration plate is connected with the vertex of the marker code, the position and the identity information of the metal block can be obtained by identifying the position and the identity information of the marker code, thereby being convenient for the calibration between the millimeter wave radar and other sensors installed on the vehicle and improving the universality of the application.
It should be further noted that, in the calibration plate provided by the application, the number of the marking codes is 4, and the polygonal structures of the metal block and the marking codes are rectangular.
Specifically, in the technical scheme provided by the application, the mark code and the metal block on the calibration plate are rectangular, and 4 vertexes of the metal block are respectively connected or overlapped with the vertex of one mark code close to the metal block.
On the basis of the above embodiments, the calibration plate provided by the application can achieve intrinsic and extrinsic camera calibration: the ArUco codes on one calibration plate together contribute at least 4 key points. Using the key points of the plurality of calibration plates within the camera's field of view, together with the point cloud data acquired for each key point with the total station, the three-dimensional spatial coordinates of each key point are further determined and the camera's intrinsic and extrinsic parameters are calibrated; this realizes automatic calibration on the one hand and improves the accuracy of the camera intrinsic solution on the other.
Based on the above embodiments, the two-dimensional code information carried by the marker codes in the calibration plate provided by the application is a two-dimensional-code-type marker, such as an ArUco code, which is convenient for the camera to resolve.
In one embodiment, when the sensor includes a camera to be calibrated, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling a camera to be calibrated to acquire a calibration device to obtain a first calibration image;
Identifying the mark codes in the first calibration image, and determining second ID information and partial images of the mark codes in the first calibration image;
determining second image coordinates of each key point included in the first calibration image according to the ID information of each mark code in the first calibration image and the local image;
Determining second three-dimensional coordinates corresponding to each key point in the first calibration image according to the ID information of each mark code in the first calibration image, the second image coordinates of each key point and the first reference three-dimensional coordinates;
and determining an internal reference of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates.
The embodiment provides a method for calibrating an internal reference of a camera to be calibrated on a vehicle, which comprises the steps of firstly obtaining a first calibration image sent by the camera to be calibrated, and determining identity information of a calibration plate and image coordinates of key points contained in the first calibration image through the first calibration image. And then, determining a second three-dimensional coordinate corresponding to the key point in the first calibration image by using the first reference three-dimensional coordinate of each point cloud in the second local point cloud data. And finally, determining the internal reference of the camera to be calibrated through the second image coordinates and the second three-dimensional coordinates of the key points.
The second ID information is ID information of calibration codes included in the first calibration image, the set of the calibration codes included in the first calibration image is a subset of the set of the calibration codes used in the calibration system, and then the second ID information is part of the first ID information of all the calibration codes. Then the second three-dimensional coordinates corresponding to each key point can be determined on the basis that the second ID information of the calibration code is known, the second image coordinates of each key point on the calibration plate are known, and the first reference three-dimensional coordinates are known. That is, the second three-dimensional coordinates are a part of the first three-dimensional coordinates of all the key points.
For example, the internal parameters of the camera include the focal lengths f_x, f_y, the principal point coordinates u_0, v_0, the lens distortion coefficients k_1, k_2 (only radial distortion is considered here), and the like. Taking the vehicle's forward camera as an example, assume that n calibration plates (denoted C_i, i ∈ n) on the plurality of calibration components are captured within the camera's field of view, and that each calibration plate image contains m calibration code key points (denoted M_j, j ∈ m).
For different calibration plates, the internal parameters of the same camera are consistent. Based on this, the internal reference calibration initial value acquisition steps are as follows:
(1) For a camera to be calibrated, acquiring a frame of calibration image (first calibration image);
(2) Acquire the ID and the partial image of each calibration code by a two-dimensional code identification method. For each marker code partial image, extract the four edges of the marker code's circumscribed rectangle using the Hough transform, and determine the second image coordinates of each key point as the intersection point of two adjacent edges (k = 0, 1, 2, 3, representing the upper-left, upper-right, lower-right and lower-left corners respectively). On the image, take an 11×11 partial image centered on each such corner point, as shown in fig. 9. For each partial image, a platform kernel fit is performed to obtain the sub-pixel coordinates (x_ij, y_ij), i ∈ n, j ∈ m, of all marker key points (sub-pixel extraction is performed here to further increase the resolving accuracy).
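The corner extraction described above reduces to intersecting two edge lines of the circumscribed rectangle; a sketch using homogeneous line coordinates (a, b, c) for the line a·x + b·y + c = 0 (converting the Hough output to this form is an assumption):

```python
def line_intersection(l1, l2):
    """Intersect two image lines given in homogeneous form (a, b, c);
    returns the (x, y) corner, or None when the edges are (numerically)
    parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    # the cross product of the homogeneous line vectors is the intersection
    x, y, w = b1 * c2 - b2 * c1, a2 * c1 - a1 * c2, a1 * b2 - a2 * b1
    if abs(w) < 1e-12:
        return None
    return (x / w, y / w)
```

For example, the vertical edge x = 2 is (1, 0, -2) and the horizontal edge y = 3 is (0, 1, -3); their intersection, i.e. the corner, is (2.0, 3.0).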
The second image coordinates (x_ij, y_ij), i ∈ n, j ∈ m, and the ID information I_ij, i ∈ n, j ∈ m, of the key points of all calibration codes in the image are identified (the calibration code being the j-th code on the i-th calibration plate). The second three-dimensional coordinates (X_ij, Y_ij, Z_ij), i ∈ n, j ∈ m, corresponding to the key points can then be obtained from the prior information on the calibration plate IDs (the ID information of all calibration codes on the same calibration plate); this prior information makes the attribution relationship known, that is, which marker codes each calibration plate contains.
For the calibration plate visible in each camera field of view, a world coordinate system O_w X_w Y_w Z_w is established (in this embodiment the coordinate system of the total station is taken as the world coordinate system): the upper-left key point of the upper-left calibration code of the calibration plate is taken as the origin O_w, the horizontal axis of the calibration plate is the X_w axis, the vertical axis is the Y_w axis, and the Z_w axis is established according to the right-hand rule, as shown in fig. 10. For each calibration plate visible in the camera field of view, the camera intrinsics f_x, f_y, the principal point coordinates u_0, v_0 and the lens distortion k_1, k_2 are solved from the key points' second image coordinates (x_ij, y_ij), i ∈ n, j ∈ m, and second three-dimensional coordinates (X_ij, Y_ij, Z_ij), i ∈ n, j ∈ m, according to the Zhang Zhengyou calibration method.
It should be noted that the Zhang Zhengyou calibration method needs to collect a plurality of (non-coplanar) calibration images at different view angles and distances to obtain the camera intrinsics, because the camera intrinsics are consistent across different images. The method can only solve the extrinsics of the image corresponding to each calibration plate, but cannot obtain the true extrinsics of the camera.
Conventional camera intrinsic calibration requires acquiring images of multiple calibration plates (e.g., checkerboards, as shown in figs. 11 and 12), and the calibration process requires the calibration plate to be placed at different positions and in different (non-coplanar) planes. In this embodiment, only one frame of image needs to be acquired, which improves the efficiency of calibrating the intrinsics of the camera to be calibrated.
In summary, in this embodiment, calibration parameters of each sensor are calculated by using identity information and image coordinates of the calibration board, and a correspondence between local point cloud data collected by the sensor and reference three-dimensional coordinates. Through the calibration parameters, the sensor can be accurately calibrated, and the measurement precision and the positioning capability of the sensor are improved.
In one embodiment, after determining the internal reference of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates, the method further comprises:
determining external parameters of the camera to be calibrated according to the second image coordinates, the second three-dimensional coordinates and the internal parameters of the camera to be calibrated;
the external parameters of the camera to be calibrated at least comprise a first coordinate transformation relation between the coordinate system of the camera to be calibrated and the coordinate system of the total station.
In this embodiment, after determining the internal parameters of the camera to be calibrated, determining the external parameters of the camera to be calibrated is further included. The camera internal parameters, i.e. the internal parameters of the camera described in the above embodiments, including focal length, principal point coordinates, distortion parameters, etc., are used to describe the geometrical features inside the camera. The camera external parameters describe the geometric relationship between the camera coordinate system and the total station coordinate system, namely the position and the posture of the camera under the total station coordinate system.
In this embodiment, the internal parameters of the camera to be calibrated are determined according to the second image coordinates and the second three-dimensional coordinates corresponding to each key point in the first calibration image, and the next step is to determine the external parameters of the camera to be calibrated. Specifically, the external parameters of the camera to be calibrated are determined by using the second image coordinates, the second three-dimensional coordinates and the internal parameters of the camera to be calibrated.
Before determining the external parameters of the camera to be calibrated, it is first necessary to know the relationship between the coordinate system of the camera and the coordinate system of the total station. The total station has its own independent coordinate system, and the camera to be calibrated also has its own coordinate system. A certain coordinate transformation relation exists between the camera coordinate system to be calibrated and the total station coordinate system, and the relation comprises translation, rotation, scale parameters and the like. According to the internal parameters of the camera to be calibrated and the second image coordinates and the second three-dimensional coordinates corresponding to each key point in the first calibration image, the external parameters of the camera to be calibrated can be determined by utilizing the coordinate transformation relation between the camera and the total station coordinate system. Specifically, the external parameters of the camera to be calibrated can be determined by calculating translation, rotation, scale parameters and the like between the camera coordinate system to be calibrated and the total station coordinate system.
The position and the posture of the camera to be calibrated under the total station coordinate system can be accurately described by determining the external parameters of the camera to be calibrated, so that the camera to be calibrated is calibrated in the vehicle sensor calibration method.
Specifically, the steps of calibrating the external parameters of the camera to be calibrated are as follows:
For all the calibration code key points within the camera field of view, the three-dimensional space points (X_ij, Y_ij, Z_ij), i ∈ n, j ∈ m (the second three-dimensional coordinates; there are k = n×m of them, where n is the number of calibration plates and m is the number of marker codes per plate), correspond to the two-dimensional pixels (x_ij, y_ij), i ∈ n, j ∈ m, on the picture (the second image coordinates). Theoretically, a three-dimensional space point transformed through the camera intrinsics f_x, f_y, u_0, v_0, the lens distortion k_1, k_2 and the extrinsics R, T yields the two-dimensional pixel (x'_ij, y'_ij), i ∈ n, j ∈ m.
Thus, a function F can be constructed whose parameters are the intrinsic and extrinsic parameters. Passing the second three-dimensional coordinates through F outputs the two-dimensional projected image coordinates (x'_ij, y'_ij), i ∈ n, j ∈ m.
That is to say, (x'_ij, y'_ij) = F(X_ij, Y_ij, Z_ij; f_x, f_y, u_0, v_0, k_1, k_2, R, T), where R is the rotation matrix in the first coordinate transformation relation and T is the translation vector in the first coordinate transformation relation.
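A pure-Python sketch of such a projection function F, under the stated pinhole model with two-term radial distortion (the exact parameterisation used by the application is not spelled out, so the details below are assumptions):

```python
def project(point, fx, fy, u0, v0, k1, k2, R, T):
    """Project a 3D world point to pixel coordinates: world -> camera frame
    via rotation R (3x3 nested lists) and translation T, then pinhole
    projection with radial distortion d = 1 + k1*r^2 + k2*r^4."""
    X, Y, Z = point
    # camera-frame coordinates
    xc = [R[i][0] * X + R[i][1] * Y + R[i][2] * Z + T[i] for i in range(3)]
    x, y = xc[0] / xc[2], xc[1] / xc[2]   # normalised image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    return (fx * d * x + u0, fy * d * y + v0)
```

As a sanity check, with R the identity and T = [0, 0, 1], the world origin projects exactly to the principal point (u_0, v_0).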
In practice, the identified (x_ij, y_ij), i ∈ n, j ∈ m, deviates somewhat from the theoretical (x'_ij, y'_ij), i ∈ n, j ∈ m. Therefore, the function to be optimized can be constructed as:
argmin_{f_x, f_y, u_0, v_0, k_1, k_2, R, T} Σ_{i ∈ n, j ∈ m} ‖(x_ij, y_ij) − F(X_ij, Y_ij, Z_ij)‖²
Iterative solution is performed using the Levenberg-Marquardt (LM) algorithm, taking the focal lengths f_x, f_y, principal point coordinates u_0, v_0 and lens distortion k_1, k_2 obtained in the first step as initial values. Given good initial values, the subsequent iterations converge faster.
In one embodiment, when the sensor includes a look-around camera to be calibrated, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling the looking-around camera to be calibrated to acquire the calibration device to obtain a second calibration image;
projecting the second calibration image by using the look-around spliced homography matrix to obtain a projection image;
and calculating a second coordinate transformation relation between the coordinate systems of the adjacent two looking-around cameras to be calibrated according to the same key points in the projection images corresponding to the adjacent two looking-around cameras to be calibrated.
In this embodiment, when the sensors include looking-around cameras to be calibrated, the second calibration images sent by each looking-around camera to be calibrated are first obtained; this step collects the calibration images captured by each looking-around camera on the vehicle to be calibrated. Next, the second calibration images undergo projection transformation using the looking-around stitched homography matrix to generate the corresponding projection images. Finally, the second coordinate transformation relation between the coordinate systems of two adjacent looking-around cameras to be calibrated is calculated by comparing the same key points in their corresponding projection images.
In the embodiment, the image data of the annular camera to be calibrated is collected, projection conversion is performed by utilizing an annular spliced homography matrix technology, and finally the coordinate system transformation relation between two adjacent annular cameras to be calibrated is obtained through calculation. This result may be used as part of calculating sensor calibration parameters for calibrating the various sensors on the vehicle to be calibrated.
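Projecting a pixel through the looking-around stitching homography amounts to a homogeneous 3×3 transform followed by dehomogenisation; a minimal sketch (the row-major nested-list representation of H is an assumption):

```python
def warp_point(H, u, v):
    """Map pixel (u, v) through a 3x3 homography H (row-major nested lists)
    and dehomogenise the result."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)
```

The identity homography leaves a pixel unchanged; a scale-and-shift homography such as [[2,0,1],[0,2,2],[0,0,1]] maps (3, 4) to (7.0, 10.0).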
In one embodiment, after the second calibration images sent by the all-around cameras to be calibrated are obtained, distortion calibration is performed on the second calibration images according to internal parameters of the cameras to be calibrated, so as to obtain calibrated second calibration images.
In one embodiment, after the controlling the to-be-calibrated looking-around camera collects the calibration device to obtain the second calibration image, the method further includes:
extracting ID information of each mark code in the second calibration image and third image coordinates of each key point;
Screening key points positioned on the ground in the second calibration image according to the ID information of each mark code in the second calibration image and the third image coordinates of each key point, and acquiring the third three-dimensional coordinates of each key point positioned on the ground;
determining a look-around mosaic image according to third three-dimensional coordinates of each key point on the ground;
Calibrating a fourth three-dimensional coordinate of a key point in the all-around spliced image by using a first reference three-dimensional coordinate corresponding to the key point in the all-around spliced image in the calibration prior information so as to obtain a calibrated all-around spliced image;
And obtaining the looking-around spliced homography matrix according to the calibrated looking-around spliced image.
It should be noted that the third three-dimensional coordinates are the three-dimensional coordinates of all the key points located on the ground within the field of view corresponding to the second calibration image; since the set of these ground key points is a subset of the set of all key points in the calibration system, the third three-dimensional coordinates are part of the first three-dimensional coordinates. Further, the fourth three-dimensional coordinates are the three-dimensional coordinates of the key points in the looking-around spliced image determined from the third three-dimensional coordinates of each ground key point; since the set of key points in the looking-around spliced image is a subset of the set of all ground key points within that field of view, the fourth three-dimensional coordinates are part of the third three-dimensional coordinates. Specifically, the task of looking-around splicing calibration is mainly to calibrate the ground projections of the looking-around cameras around the vehicle body and the transformation relations between adjacent cameras. The specific steps are as follows:
For each looking-around camera c_i, a second calibration image I_i is acquired, and distortion calibration is performed on I_i according to each camera intrinsic acquired in the previous step, to obtain a calibrated second calibration image.
For the calibrated second calibration image, the third image coordinates and the ID information (fourth identity information) of the key points within the field of view are extracted; only the key points on the ground are retained according to the ID information, and the third three-dimensional coordinate information of those ground key points is obtained (as shown in fig. 13, a top view of the looking-around splicing calibration layout provided by the application). Compared with manual point picking, this embodiment suits mass production and solves the automation problem.
For each camera, the key points corresponding to the largest marker code within the field of view in fig. 13 are automatically acquired according to the ID information. The left and right largest cooperation-mark key points are selected, and a dotted rectangle ABCD is selected in the manner of fig. 14; fig. 14 is a schematic diagram of projection transformation matrix selection provided by the application.
Since it is difficult to ensure that all the calibration codes in fig. 13 are horizontal and coplanar when constructing the calibration site, and existing methods ignore this, the accuracy of the looking-around splice is insufficient. Here, the four points ABCD in fig. 14 are calibrated using the three-dimensional coordinates (X_i, Y_i, Z_i) of the calibration code key points of fig. 13 and the camera parameters acquired in the first step; that is, (X_i, Y_i, Z_i) is transformed through the camera parameters (without the distortion parameters here) into the image, obtaining the calibrated points (as at the locations of the black dots in fig. 14);
By means of the calibrated points, for the key points in the overlapping fields of view of adjacent cameras, the transformation scale parameters are solved according to the key point ID information and the actual size of the cooperation mark.
In one embodiment, when the sensor includes a main laser radar, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling the main laser radar to acquire the calibration device to obtain first test point cloud data;
And matching the first test point cloud data with the first data to obtain a third coordinate transformation relation between the main laser radar coordinate system and the total station coordinate system.
The embodiment describes a process of calibrating a main laser radar, and specifically, first test point cloud data sent by the main laser radar is acquired. And then, matching the first test point cloud data with the first data to obtain a third coordinate transformation relation between the main laser radar coordinate system and the total station coordinate system.
In this embodiment, the second data collected by the sensors refers to the data collected by the respective sensors for the calibration device. The first test point cloud data sent by the main laser radar is one of the first test point cloud data. The sensors and calibration means may be data-transmitted via a wireless or wired connection.
By matching the first test point cloud data with the first data, a third coordinate transformation relationship between the primary lidar coordinate system and the total station coordinate system may be determined. This transformation relationship can be used in a subsequent sensor calibration process. And calculating the calibration parameters of the main laser radar by obtaining a third coordinate transformation relation between the main laser radar coordinate system and the total station coordinate system. These parameters may include rotation matrices and/or translation vectors, etc.
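One standard way to recover such a rotation-plus-translation from matched homonymous points is the SVD-based Kabsch alignment; the application does not prescribe a solver, so the following NumPy sketch is an assumption about one workable approach:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst_i ~= R @ src_i + t,
    e.g. src = lidar-frame points, dst = total-station-frame points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # force a proper rotation (det = +1), guarding against reflections
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

With noise-free, non-degenerate correspondences this recovers the exact rotation matrix and translation vector; with noisy homonymous points it returns the least-squares optimum.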
In one embodiment, after the main laser radar is controlled to collect the calibration device to obtain the first test point cloud data, the method further includes:
matching the cloud data of the first test point with the first data to obtain ID information of a marker code in the field of view of the main laser radar and a fifth three-dimensional coordinate of the key point;
determining the same key point in the first calibration image and the field of view of the main laser radar according to the ID information of each mark code in the first calibration image, the second three-dimensional coordinates corresponding to each key point, the ID information of the mark code in the field of view of the main laser radar and the fifth three-dimensional coordinates of the key point, and determining the three-dimensional coordinates of the same key point;
And calculating a fourth coordinate transformation relation between the camera coordinate system to be calibrated and the main laser radar coordinate system according to the three-dimensional coordinates of the same key points.
It should be noted that, the fifth three-dimensional coordinate is a three-dimensional coordinate corresponding to a key point in a field of view of the primary laser radar, and a set of key points in the field of view of the primary laser radar is a subset of a set of all key points in the calibration system, so that the fifth three-dimensional coordinate is a part of all the first three-dimensional coordinates.
Specifically, assume that the coordinates of a key point in the camera ci coordinate system are Pci = (xci, yci, zci) (the second three-dimensional coordinates described above), and that the coordinates of the same key point in the main laser radar l1 coordinate system (i.e. the fifth three-dimensional coordinates) are Pli = (xli, yli, zli). Theoretically, homonymous points in the two coordinate systems satisfy:

Pli = R·Pci + T

Considering various errors, then

Pli = R·Pci + T + εi

The function to be optimized is:

min over R, T of Σi ‖Pli − (R·Pci + T)‖²
The calibration steps are as follows:
(1) Extracting the image coordinates and IDs of the key points in the field of view of camera c1, and acquiring the corresponding spatial three-dimensional coordinates (namely the second three-dimensional coordinates) according to the ID information;
(2) Extracting the three-dimensional coordinates (namely the fifth three-dimensional coordinates) and IDs of the key points in the field of view of the main laser radar l1;
(3) According to the ID information, solving the external parameters (the fourth coordinate transformation relation) from the homonymous-point relation using the Levenberg-Marquardt (LM) algorithm.
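The homonymous-point solve in step (3) can be illustrated with a minimal sketch. Assuming exact 3D-3D correspondences between the camera and main laser radar frames, the extrinsics R, T also have a closed-form least-squares solution via the Kabsch/SVD method, which the LM iteration converges to in the noise-free case; the function name `fit_rigid_transform` is illustrative and not from the application:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, T) with dst ≈ R @ src + T (Kabsch/SVD)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # 3x3 cross-covariance of the centered correspondences
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # correct an improper (reflected) solution if necessary
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    T = c_dst - R @ c_src
    return R, T
```

With noisy correspondences the same function returns the least-squares optimum, which can also serve as the initial value for an LM refinement.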
In one embodiment, when the sensor includes a millimeter wave radar, the calibration plate is provided with a metal block, and the calibration system further includes two rails and a conveyor belt device arranged on the rails; the two rails are parallel, the distance between them is the same as the distance between the left and right wheels of the vehicle to be calibrated, and the vehicle to be calibrated is positioned on the conveyor belt device during sensor calibration;
before controlling each sensor on the vehicle to be calibrated to acquire the calibration device and obtaining the second data, the method further comprises the following steps:
controlling the conveyor belt device so that the vehicle to be calibrated moves at a preset speed;
Controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
Triggering the millimeter wave radar to acquire the coordinates of the metal blocks within its field of view during the movement of the vehicle to be calibrated, obtaining acquisition data, wherein the acquisition data at least include the metal block coordinates;
Determining alignment data of the first test point cloud data and the acquisition data according to the data time stamps;
projecting a sixth three-dimensional coordinate of the metal block in the cloud data of the first test point in the alignment data onto a horizontal plane to obtain horizontal plane projection data;
And calculating a fifth coordinate transformation relation between the millimeter wave radar coordinate system and the main laser radar coordinate system according to the metal block coordinates in the acquisition data in the horizontal plane projection data and the alignment data and the positions of the metal blocks in the calibration system.
The sixth three-dimensional coordinate is a three-dimensional coordinate corresponding to the metal block in the field of view of the millimeter wave radar, and the set of the metal blocks in the field of view of the millimeter wave radar is a subset of the set of all the metal blocks in the calibration system.
Specifically, the millimeter wave radar can obtain the X and Y coordinate information of a target but no Z coordinate information, so the conversion from the millimeter wave coordinate system Ow-XwYw to the main laser radar coordinate system Ol-XlYl can be regarded as a conversion between two-dimensional X-Y coordinate systems (as shown in fig. 15, which is a schematic diagram of the relationship between the millimeter wave radar coordinate system and the main laser radar coordinate system provided by the application). The parameters to be calibrated include a translation vector T and a rotation angle θ.
The conversion from millimeter wave coordinates to the main laser radar coordinate system is:

xl = xw·cosθ − yw·sinθ + Tx
yl = xw·sinθ + yw·cosθ + Ty

A function F can therefore be constructed with parameters Tx, Ty, θ, mapping millimeter wave coordinates (xw, yw) to main laser radar coordinates (xl, yl):

(xl, yl) = F(Tx, Ty, θ, xw, yw);

Taking error factors into account, then

(xl, yl) = F(Tx, Ty, θ, xw, yw) + ε

The function to be optimized is:

min over Tx, Ty, θ of Σi ‖(xli, yli) − F(Tx, Ty, θ, xwi, ywi)‖²
the calibration steps are as follows:
(1) Because the vehicle is constrained fore and aft on the track, its short straight-line motion in the front-rear direction can be conveniently controlled through the conveyor belt on the track; during calibration the vehicle only needs to move a very short distance (for example, 1 meter);
(2) Storing main laser radar data (first test point cloud data) and millimeter wave radar data (acquisition data of a metal block sent by a millimeter wave radar) in the vehicle movement process;
(3) And according to the data time stamp, finding out a time alignment data frame of the main laser radar and the millimeter wave radar. For the main laser radar l 1, three-dimensional coordinates (sixth three-dimensional coordinates of the metal block) of the central metal block of each calibration plate in the field of view are projected onto a horizontal plane to obtain horizontal plane projection data;
(4) And obtaining the optimal matching relation according to the horizontal plane projection data and the known space topological relation of the metal blocks. And according to the optimal matching relation, carrying out iterative computation on the optimization function by using a Levenberg-Marquardt (LM) algorithm, and solving a fifth coordinate transformation relation.
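Steps (1)–(4) reduce to estimating a 2D rigid transform (Tx, Ty, θ) from matched metal-block coordinates in the millimeter wave and main laser radar XY planes. The application solves this by LM iteration; the sketch below (function name illustrative, assuming already-matched correspondences) uses the equivalent closed-form least-squares solution:

```python
import numpy as np

def fit_2d_rigid(radar_xy, lidar_xy):
    """Estimate (Tx, Ty, theta) with lidar ≈ R(theta) @ radar + T, least squares."""
    radar_xy = np.asarray(radar_xy, dtype=float)
    lidar_xy = np.asarray(lidar_xy, dtype=float)
    cr, cl = radar_xy.mean(axis=0), lidar_xy.mean(axis=0)
    a, b = radar_xy - cr, lidar_xy - cl
    # closed-form optimal angle: atan2(sum of cross products, sum of dot products)
    theta = np.arctan2(np.sum(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0]),
                       np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1]))
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    T = cl - R @ cr
    return T[0], T[1], theta
```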
In one embodiment, when the sensor includes an auxiliary laser radar, controlling each sensor on the vehicle to be calibrated to acquire the calibration device to obtain second data, including:
controlling the auxiliary laser radar to acquire the calibration device to obtain second test point cloud data;
and matching the second test point cloud data with the first test point cloud data to obtain a sixth coordinate transformation relation between the auxiliary laser radar coordinate system and the main laser radar coordinate system.
Specifically, the calibration parameters (the sixth coordinate transformation relation) between the auxiliary laser radar and the main laser radar are the rotation matrix R and the translation vector T.
Assume the main laser radar l1 point cloud data is Q = {q1, q2, ..., qm} (the first test point cloud data) and the auxiliary laser radar l2 point cloud data is P = {p1, p2, ..., pm} (the second test point cloud data). The spatial topological structure of the calibration members is known from the calibration field, and according to the spatial matching property of the point clouds, spatially homonymous points in the two point clouds satisfy the coordinate transformation:

qi = R·pi + T

Construct the optimization function

min over R, T of Σi ‖qi − (R·pi + T)‖²

The R and T corresponding to the sixth coordinate transformation relation can then be obtained through the Gauss-Newton or LM algorithm.
The main/auxiliary laser radar calibration steps are as follows:
(1) Acquiring the point cloud data Q and P of the main and auxiliary laser radars respectively;
(2) For the point cloud data Q and P respectively, using the preprocessing method based on prior information of the calibration site to remove the ceilings, floors and surrounding walls, obtaining the point cloud data Q′ and P′; eliminating this interference greatly improves the calibration precision;
(3) For the point cloud data Q′ and P′ respectively, using the calibration plate plane fitting method based on prior information of the calibration site to acquire the calibration plate planes, calibration sphere surfaces and other information in each point cloud, including the ID information. The ID of a calibration plate can be formed by combining the IDs of its internal calibration codes; for example, if a calibration plate carries 6 calibration codes with IDs IDi (i ≤ 6), the calibration plate ID is the concatenation, from small to large, ID1ID2ID3ID4ID5ID6;
(4) According to the calibration plate ID information and the calibration sphere IDs, calculating an initial value of the calibration parameters R, T corresponding to the sixth coordinate transformation relation from the fitted calibration plate planes and calibration sphere surfaces; based on this initial value, the final calibration parameters R, T are then refined through NDT or ICP.
Traditional methods match point data directly, such as NDT and various ICP variants. Because the hardware parameters of the main and auxiliary laser radars can differ greatly, true spatial homonymous points may be very few due to point cloud resolution, projection angle and similar factors: points in two frames of laser point cloud data generally do not represent exactly the same position in space, so using point-to-point distances as the error equation tends to introduce random errors. By fitting each calibration plate plane, the application effectively reduces this influence and thus improves the calibration precision.
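The calibration plate plane fitting mentioned above can be sketched as a least-squares fit via SVD: after centering the segmented point cloud of one plate, the right singular vector with the smallest singular value is the plane normal. This is an illustrative sketch (the function name is not from the application), assuming the plate points have already been segmented out:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns unit normal n and point c with n·(p−c)=0."""
    points = np.asarray(points, dtype=float)
    c = points.mean(axis=0)
    # singular vector of the smallest singular value is orthogonal to the plane
    _, _, Vt = np.linalg.svd(points - c)
    n = Vt[-1]
    return n, c
```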
In one embodiment, the calibration device includes a plurality of calibration members, and in the following embodiment, the calibration member is taken as an example of a calibration sphere (as shown in fig. 16, fig. 16 is a schematic layout diagram of the calibration sphere provided by the present application), and the calibration priori information is determined according to the first calibration data, including:
performing spatial clustering segmentation on the first calibration data according to the positions of the verification components in the calibration system to obtain third local point cloud data corresponding to each verification component;
and determining fifth identity information and fifth coordinate information corresponding to each verification component according to the third local point cloud data.
First, spatial clustering segmentation needs to be performed on the first calibration data according to the positions of the verification members in the calibration system, to obtain the third local point cloud data corresponding to each verification member. Spatial clustering segmentation collects adjacent or nearby points in the first calibration data into individual point cloud clusters; the purpose is to separate the point cloud corresponding to each verification member in the first calibration data, facilitating subsequent processing. Cluster segmentation may be based on factors such as spatial distance and density.

After cluster segmentation, each point cloud cluster can be regarded as the local point cloud data of one verification member. By processing each point cloud cluster, the third local point cloud data of each verification member can be obtained. These local point cloud data may be used for subsequent verification parameter calculation, error analysis and the like.

Next, from the third local point cloud data, the fifth identity information and fifth coordinate information corresponding to each verification member can be determined. The fifth identity information generally refers to a unique identification of the verification member, used to distinguish different verification members; it can be determined by extracting and matching features such as the shape, size and texture of the verification member. The fifth coordinate information generally refers to the position and attitude information of the verification member in the calibration system; by registering and matching the point cloud data, the position and attitude of the verification member relative to the calibration system can be determined.
Thus, fifth coordinate information corresponding to the verification means can be obtained.
In summary, by performing spatial clustering segmentation on the first calibration data, third local point cloud data corresponding to each verification member can be obtained. Then, by processing and analyzing the third local point cloud data, fifth identity information and fifth coordinate information corresponding to each verification member can be determined. In this way, the parameters of the sensor can be calibrated and related work of the automatic driving system can be performed.
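The spatial clustering segmentation described above can be sketched with a naive distance-threshold (single-linkage) flood fill; a production system would typically use a KD-tree-accelerated Euclidean clustering, but the logic is the same. Function name and radius below are illustrative assumptions:

```python
import numpy as np

def euclidean_cluster(points, radius=0.5):
    """Naive single-linkage clustering: points closer than `radius` share a label."""
    points = np.asarray(points, dtype=float)
    labels = -np.ones(len(points), dtype=int)  # -1 means "not yet visited"
    next_label = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        stack = [i]
        labels[i] = next_label
        while stack:  # flood-fill all points reachable within `radius` hops
            j = stack.pop()
            dists = np.linalg.norm(points - points[j], axis=1)
            near = np.where((dists < radius) & (labels == -1))[0]
            labels[near] = next_label
            stack.extend(near.tolist())
        next_label += 1
    return labels
```

Each resulting label then corresponds to the local point cloud of one verification member.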
In one embodiment, determining fifth identity information and fifth coordinate information corresponding to each verification member according to the third local point cloud data includes:
fitting the third local point cloud data to obtain the structural information and the three-dimensional coordinates of each verification member.
This embodiment determines the fifth identity information and fifth coordinate information corresponding to each verification member according to the third local point cloud data. To acquire the identity information and coordinate information of a verification member, the third local point cloud data needs to be processed. First, fit the third local point cloud data, that is, fit an appropriate mathematical model to the point cloud data, to obtain the structural information of the verification member; for a spherical member this can be done with a surface fitting algorithm such as least squares. The fitted structural information reveals characteristics such as the shape and size of the verification member. Second, obtain the three-dimensional coordinates of the verification member from the fitted result. Since the fitting is based on the point cloud data, the center point coordinates of the verification member, namely the fifth coordinate information, can be obtained; the center point coordinates determine the position of the verification member. Finally, according to the structural information and three-dimensional coordinates of the verification member, obtain its identity information, namely the fifth identity information. The identity information may include the type, number and so on of the verification member and can be used to uniquely identify each verification member.
In summary, by fitting the third local point cloud data, the structure information and the three-dimensional coordinates of the verification member can be obtained, and then the identity information and the coordinate information of the verification member are determined. Therefore, the calibration component in the automatic calibration system of the automatic driving sensor can be accurately positioned and identified, and the accuracy and stability of calibration parameters are improved.
In one embodiment, the verification means is a verification ball, and determining fifth identity information and fifth coordinate information corresponding to each verification means according to the third local point cloud data includes:
And performing spherical fitting on the third partial point cloud data to obtain radius information and three-dimensional coordinates of the check ball.
In this embodiment, the check member is a check ball. The step of determining fifth identity information and fifth coordinate information corresponding to each check member (i.e. check ball) according to the third local point cloud data comprises the steps of performing spherical fitting on the third local point cloud data, and obtaining point cloud data of the surface of the check ball according to the third local point cloud data. By performing spherical fitting on the point cloud data, fitted spherical information can be obtained, and the fitted spherical information mainly comprises radius information of the check ball and three-dimensional coordinates of the center of the ball. The spherical fitting is to process point cloud data through a mathematical method, so that the fitting result can best meet the spherical model. And determining the identity information and the coordinate information of the check ball, namely determining the identity information and the coordinate information of the check ball through the radius information of the check ball and the three-dimensional coordinates of the sphere center obtained after spherical fitting. The identity information of the check balls is typically used to distinguish between the different check balls, while the coordinate information is used to determine the position of the check balls in three dimensions.
In this way, by performing spherical fitting on the third local point cloud data, the radius information and the position information of each check ball can be obtained, so as to determine the fifth identity information and the fifth coordinate information corresponding to each check member (i.e. the check ball). This information is used in an automatic calibration system for the automatic driving sensor to verify the calibration parameters of the individual sensors.
It should be noted that, the fifth identity information is identity information of the check member, for example, when the check member is a check ball, the fifth identity information may be radius information of the check ball. The fifth coordinate information characterizes a position of the verification member in the calibration system, such as a spatial position.
It should be noted that the calibration parameters of each sensor are verified by using a check ball. This is because the check ball has a uniform spherical shape and presents a perfectly circular outline regardless of the angle from which it is observed. This spherical characteristic makes the data generated by the check ball relatively easy to process and analyze; that is, sensor calibration verification using a check ball can provide accurate and reliable results. The uniform spherical shape ensures that the sensors acquire consistent data at different angles and in different directions, reducing the possibility of verification errors. In addition, the spherical characteristic means the check ball reflects or scatters the laser, radar or camera signals of the sensors in the same way, ensuring consistency and stability of the data. Data processing for the check ball is also relatively simple: since the check ball presents a circular outline, the image or point cloud data it forms in a sensor can be easily processed and analyzed. Through processing of the check ball data, the calibration parameters of the sensors can be verified, optimized and adjusted, improving the accuracy and precision of the sensors.
In addition to check balls, other implementations of check members are possible, such as check plates, square or cross-shaped check members, etc.
In summary, the check ball is used for verifying the calibration parameters of the automatic driving sensors. The uniform spherical characteristic and data processing advantages of the check ball ensure the accuracy and reliability of sensor calibration and provide effective support for the performance of an automatic driving system.
Specifically, let PΦ be the first calibration data after the point clouds of the ceiling, the ground and the surrounding walls have been eliminated, containing only the point clouds of each calibration member and the point clouds of the ground check balls.
For the check balls, the specific steps are as follows:
(1) Because the check balls are installed lower than the calibration members, the point cloud PΦ can be divided, according to the height (z coordinate) of each point, into a point cloud Pball containing only the check balls and a point cloud Pcalib containing only the calibration members;
(2) Clustering according to the spatial spacing of the check balls: performing spatial clustering segmentation on the point cloud Pball to obtain the local point cloud Pj of each check ball (j ≤ m1, where m1 is the total number of check balls). For each Pj, the three-dimensional spatial coordinates (xj, yj, zj) of the check ball center and the ball radius Rj can be obtained through point cloud spherical fitting.
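The point cloud spherical fitting in step (2) has a simple linear least-squares formulation: rewriting the sphere equation as |p|² = 2c·p + (R² − |c|²) makes the center c and radius R solvable from one linear system. A sketch under the assumption of a pre-segmented single-ball cluster (function name illustrative):

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit: returns center (x0, y0, z0) and radius R."""
    p = np.asarray(points, dtype=float)
    # |p|^2 = 2*c·p + (R^2 - |c|^2): linear in (c, d) with d = R^2 - |c|^2
    A = np.column_stack([2 * p, np.ones(len(p))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```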
In one embodiment, after obtaining calibration priori information according to the first data and calibrating each sensor on the vehicle to be calibrated according to the calibration priori information and the second data, the method further includes:
And verifying the calibration parameters of each sensor according to the radius information and the three-dimensional coordinates of each check ball.
Because a multi-level sensing system requires a plurality of sensors, the accuracy of the calibration results directly influences the performance of the sensing system. The application provides a method for automatically verifying calibration parameters, specifically for verifying the results of binocular calibration, camera and main laser radar calibration, main/auxiliary laser radar calibration, and millimeter wave radar and main laser radar calibration. The basic idea is as follows:
For spheres of known spatial position and size in the calibration system, three-dimensional coordinates of the sphere center in different coordinate systems are identified and compared with true values. The method comprises the following specific steps:
in one embodiment, when the calibration parameters of the sensors include a third coordinate transformation relationship between the main lidar coordinate system and the total station coordinate system, verifying the calibration parameters of the sensors according to the radius information and the three-dimensional coordinates of each verification ball, including:
Acquiring seventh three-dimensional coordinates of each check ball in the field of view of the main laser radar;
Carrying out coordinate transformation on the seventh three-dimensional coordinate according to the third coordinate transformation relation to obtain a seventh three-dimensional coordinate to be compared;
comparing the seventh three-dimensional coordinate to be compared with the seventh reference three-dimensional coordinate to determine whether the third coordinate transformation relationship is accurate;
the seventh reference three-dimensional coordinates are the three-dimensional coordinates, acquired by the total station, corresponding to the check members within the field of view of the main laser radar.
This embodiment aims at verifying the calibration precision between the main laser radar coordinate system and the total station coordinate system. For the laser radar, the three-dimensional coordinates of each sphere center Cl (i.e. the seventh three-dimensional coordinates) are obtained within the sphere ROI (region of interest), which is known in advance, according to reflectivity differences and spatial clustering. According to the external parameters R, T between the main laser radar and the total station (i.e. the third coordinate transformation relation), coordinate transformation gives C′ = R·Cl + T (i.e. the seventh three-dimensional coordinates to be compared). C′ is then compared with the true values (the coordinates acquired by the total station, i.e. the seventh reference three-dimensional coordinates) to judge the error of R, T.
The seventh three-dimensional coordinate is a three-dimensional coordinate of the verification member in the field of view of the primary lidar, and the set of verification members in the field of view of the primary lidar is a subset of the set of all verification members, so that the seventh three-dimensional coordinate is a part of all fifth coordinate information.
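The comparison against the total station true values can be sketched as follows, assuming the seventh three-dimensional coordinates and the reference coordinates are given as matched arrays (the function name is illustrative):

```python
import numpy as np

def extrinsic_error(centers_lidar, centers_total_station, R, T):
    """Per-sphere Euclidean error after mapping lidar sphere centers into the total station frame."""
    mapped = np.asarray(centers_lidar, dtype=float) @ np.asarray(R, dtype=float).T + np.asarray(T, dtype=float)
    return np.linalg.norm(mapped - np.asarray(centers_total_station, dtype=float), axis=1)
```

If the per-sphere errors stay below a chosen threshold, the third coordinate transformation relation is judged accurate.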
In one embodiment, when the calibration parameters of the sensor include a fourth coordinate transformation relationship between the camera coordinate system to be calibrated and the main laser radar coordinate system, verifying the calibration parameters of each sensor according to the radius information and the three-dimensional coordinates of each verification ball, including:
acquiring first check point cloud data of a check ball in a field of view of a main laser radar;
Back-projecting the first check point cloud data onto a camera image according to a fourth coordinate transformation relation;
performing preset processing on the back-projected camera image to obtain a circular edge in the camera image, and determining a corresponding first circular fitting equation according to the circular edge;
the first circular fit equation is compared with a reference circular fit equation to determine if the fourth coordinate transformation relationship is accurate.
The embodiment aims at verifying the calibration parameter errors of the camera and the main laser radar. In an embodiment, first checkpoint cloud data of a verification member within a primary lidar field of view is acquired. And then, back projecting the first check point cloud data onto the camera image according to the fourth coordinate transformation relation. The three-dimensional coordinates of the verification point cloud are mapped onto the two-dimensional image of the camera through a back projection operation. And finally, determining whether the fourth coordinate transformation relation is accurate according to the camera image after the back projection. By comparing the positions of the check points on the camera image with the positions of the actual check members, the accuracy of the fourth coordinate transformation relationship can be evaluated.
When the check member is a check ball, the first check point cloud data of the check member within the field of view of the main laser radar is back-projected onto each camera image according to the external parameters Rc-l, tc-l corresponding to the fourth coordinate transformation relation. The back-projection of the sphere-region point cloud in each camera is shown in fig. 17, where the sphere is a gray circle on the image and the dotted lines represent the back-projected laser radar point cloud. All circular edges in the image are extracted through gray-scale conversion, binarization and similar processing, giving a first circular fitting equation Oi (i ∈ n, where n is the number of circular areas in the image); the point cloud data on the sphere is obtained through clustering according to spatial distances, and the endpoints of each scan line are extracted, as shown in fig. 18, giving a circle Qi fitted through the scan-line endpoints (as shown in fig. 19). The distance between Oi and Qi is calculated to judge the accuracy of the camera intrinsic parameters and the camera-main laser radar extrinsic parameters Rc-l, tc-l (the fourth coordinate transformation relation).
The purpose of the calibration method is to ensure that the coordinate transformation relationship between the camera to be calibrated and the main laser radar is accurate. Through verification, the accuracy of the calibration parameters can be verified, and the reliability and the accuracy of sensor calibration are improved.
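The back-projection step can be sketched with a standard pinhole model, assuming R, t map points from the main laser radar frame into the camera frame and K is the calibrated intrinsic matrix (a simplified sketch that ignores lens distortion; names are illustrative):

```python
import numpy as np

def project_points(P_lidar, K, R, t):
    """Project lidar-frame 3D points into the image with pinhole intrinsics K."""
    P_cam = np.asarray(P_lidar, dtype=float) @ np.asarray(R, dtype=float).T + np.asarray(t, dtype=float)
    uvw = P_cam @ np.asarray(K, dtype=float).T  # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]             # perspective divide -> pixel coords
```

The projected sphere points can then be compared against the circle extracted from the image, as described above.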
In one embodiment, the check balls are metal spheres. When the calibration parameters of the sensors include a fifth coordinate transformation relation between the millimeter wave radar coordinate system and the main laser radar coordinate system, verifying the calibration parameters of each sensor according to the radius information and the three-dimensional coordinates of each check ball includes:
Acquiring millimeter wave radar coordinates and angles of each check ball in the field of view of the millimeter wave radar;
acquiring a seventh three-dimensional coordinate in the view field range of the main laser radar;
projecting millimeter wave radar coordinates and angles to a two-dimensional plane of a main laser radar coordinate system according to a fifth coordinate transformation relation;
and calculating Euclidean distance according to the projected millimeter wave radar coordinates and angles and the seventh three-dimensional coordinate to determine whether the fifth coordinate transformation relationship is accurate.
This embodiment aims at verifying the fifth coordinate transformation relation between the millimeter wave radar coordinate system and the main laser radar coordinate system. The verification process includes the following steps: obtaining the millimeter wave radar coordinates and angle of each check ball within the field of view of the millimeter wave radar; obtaining the seventh three-dimensional coordinates within the field of view of the main laser radar; projecting the millimeter wave radar coordinates and angles onto the two-dimensional plane of the main laser radar coordinate system according to the fifth coordinate transformation relation; calculating the Euclidean distances between the projected millimeter wave radar coordinates and the seventh three-dimensional coordinates; and determining whether the fifth coordinate transformation relation is accurate according to the calculated Euclidean distances.
That is, the millimeter wave radar coordinates and yaw angle, and the sphere center three-dimensional coordinates in the main laser radar coordinate system, are extracted respectively; the sphere center coordinates in the millimeter wave data are projected onto the XY plane of the main laser radar coordinate system through the fifth coordinate transformation relation, the Euclidean distances are calculated, and the calibration accuracy is judged.
Through the verification process, the accuracy of the fifth coordinate transformation relation between the millimeter wave radar coordinate system and the main laser radar coordinate system can be judged. Therefore, the accuracy and the reliability of the calibration parameters of the sensor can be ensured, and the application performance and the accuracy of the sensor in a vehicle are improved.
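The projection of a millimeter wave detection into the main laser radar XY plane uses the same Tx, Ty, θ as the calibration above. A sketch assuming the radar reports range and azimuth (names are illustrative):

```python
import numpy as np

def mmwave_to_lidar_xy(r, azimuth, Tx, Ty, theta):
    """Map a millimeter wave (range, azimuth) detection into the main lidar XY plane."""
    xw, yw = r * np.cos(azimuth), r * np.sin(azimuth)   # radar polar -> cartesian
    xl = np.cos(theta) * xw - np.sin(theta) * yw + Tx   # apply fifth transformation
    yl = np.sin(theta) * xw + np.cos(theta) * yw + Ty
    return xl, yl
```

The Euclidean distance between (xl, yl) and the lidar sphere center projected onto the XY plane then serves as the verification error.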
In one embodiment, when the calibration parameters of the sensors include a sixth coordinate transformation relationship between the auxiliary laser radar coordinate system and the main laser radar coordinate system, verifying the calibration parameters of the sensors according to the radius information and the three-dimensional coordinates of each verification ball includes:
Acquiring first check point cloud data of a check ball in a field of view of a main laser radar;
Acquiring second check point cloud data of a check ball in the range of the field of view of the auxiliary laser radar;
Transforming the second check point cloud data into a main laser radar coordinate system according to the sixth coordinate transformation relation to obtain second check point cloud data to be checked;
And determining whether the sixth coordinate transformation relation is accurate according to the first check point cloud data and the second check point cloud data to be checked.
The present embodiment aims to verify the sixth coordinate transformation relationship between the auxiliary lidar coordinate system and the main lidar coordinate system. Specifically, first check point cloud data of the check ball within the field of view of the main lidar is acquired. Next, second check point cloud data of the check ball within the field of view of the auxiliary lidar is acquired. Then, according to the sixth coordinate transformation relationship, coordinate transformation is performed on the acquired second check point cloud data to transform it into the main lidar coordinate system. This transformation process typically involves rotation, translation, and possibly scale changes. Finally, the accuracy of the sixth coordinate transformation relationship can be evaluated by comparing the first check point cloud data with the second check point cloud data to be checked. If the difference between the two sets of point cloud data is small, the sixth coordinate transformation relationship is accurate; if the difference is large, the sixth coordinate transformation relationship needs to be adjusted or recalculated.
Specifically, from the two lidar point clouds, the sphere point clouds within the respective fields of view are extracted according to spatial distance and reflectivity differences; the sphere point cloud in the auxiliary lidar coordinate system is then transformed into the main lidar coordinate system through the sixth coordinate transformation relationship, the Euclidean distance is calculated, and the accuracy of the sixth coordinate transformation relationship is judged.
In summary, the present embodiment is used to verify the accuracy of the sixth coordinate transformation relationship between the auxiliary lidar and the primary lidar in the sensor calibration parameters. The method can help ensure that the calibration parameters of the sensors can accurately reflect the spatial relative position relationship between the sensors in the actual scene.
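The sixth-relation check described above can be sketched as follows; this is a minimal illustration, not the patent's implementation, and the function name, the 4×4 homogeneous transform input, the brute-force nearest-neighbour search, and the tolerance value are assumptions for this example:

```python
import numpy as np

def check_sixth_transform(cloud_aux, cloud_main, T_aux_to_main, tol=0.03):
    """Transform the auxiliary-lidar sphere point cloud into the main-lidar
    frame and measure how well it overlaps the main-lidar sphere cloud.

    cloud_aux     : (N, 3) sphere points seen by the auxiliary lidar
    cloud_main    : (M, 3) sphere points seen by the main lidar
    T_aux_to_main : (4, 4) homogeneous transform (the 'sixth relation')
    tol           : mean nearest-neighbour distance threshold in metres
    """
    n = cloud_aux.shape[0]
    homo = np.hstack([cloud_aux, np.ones((n, 1))])     # (N, 4)
    moved = (T_aux_to_main @ homo.T).T[:, :3]          # into main-lidar frame
    # Brute-force nearest-neighbour distance from each transformed point to
    # the main-lidar cloud (a KD-tree would be preferable for large clouds).
    diff = moved[:, None, :] - cloud_main[None, :, :]  # (N, M, 3)
    nn = np.linalg.norm(diff, axis=2).min(axis=1)      # (N,)
    return nn.mean(), bool(nn.mean() <= tol)
```

If the mean residual is small the two sphere clouds overlap well and the sixth relation can be judged accurate; otherwise it should be adjusted or recalculated.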
In addition, in order to improve calibration efficiency and accuracy, the vehicle to be calibrated needs to be positioned quickly, that is, driven quickly into a designated position in the calibration system. The application uses the four ground-contact points of the vehicle for limiting, in both the front-rear and left-right directions. As shown in figs. 20-22, two rails are installed in the calibration system; the width of each rail is equal to the width of a vehicle tire, so the tires run flat on the rails. A limiter is arranged at the front end of each rail: after a tire contacts the limiter, the vehicle stops, and after stopping, the vehicle must be fixed to limit the rear wheels. Fixing the rear wheels serves safety on the one hand, and on the other hand supports calibration of the millimeter-wave radar and lidar coordinate systems.
Through the above operations, the initial positions of all vehicles to be calibrated can be kept relatively consistent, so that the deviation between the main lidar coordinate system and the total station coordinate system is as small as possible. In the point cloud matching process, the smaller the deviation between the two point cloud coordinate systems, the faster the iterative algorithm converges and the easier it is to guarantee accuracy.
In addition, in order to assist calibration between the millimeter-wave radar and the main lidar, a conveyor belt device is arranged on the track, so that short-distance uniform linear motion of the vehicle can be realized.
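During this uniform linear motion, radar acquisitions must be paired with lidar frames by timestamp before the fifth relation can be computed (as the claims later describe). A hedged sketch of such timestamp alignment follows; the function name, the nearest-neighbour pairing strategy, and the 20 ms window are assumptions for illustration:

```python
import numpy as np

def align_by_timestamp(lidar_ts, radar_ts, max_dt=0.02):
    """Pair each radar frame with the nearest-in-time lidar frame.

    lidar_ts, radar_ts : 1-D arrays of frame timestamps in seconds
    max_dt             : maximum allowed time offset for a valid pair
    Returns a list of (radar_index, lidar_index) pairs.
    """
    pairs = []
    for i, t in enumerate(radar_ts):
        # Index of the lidar frame closest in time to this radar frame.
        j = int(np.argmin(np.abs(lidar_ts - t)))
        if abs(lidar_ts[j] - t) <= max_dt:
            pairs.append((i, j))
    return pairs
```

Only the paired frames would then be used when projecting the lidar metal-block coordinates onto the horizontal plane and matching them against the radar detections.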
In a second aspect, the present application further provides a calibration system for a vehicle sensor, as shown in fig. 23, fig. 23 is a structural block diagram of the calibration system for a vehicle sensor provided by the present application, where the calibration system includes a total station, a plurality of calibration devices, and a computing device, and the computing device includes:
A first control unit 91, configured to control the total station to scan the calibration device when the vehicle to be calibrated meets a first calibration condition, so as to obtain first data;
the second control unit 92 is configured to control each sensor on the vehicle to be calibrated to acquire data of the calibration device when the second calibration condition is satisfied, so as to obtain second data;
the calibration unit 93 is configured to obtain calibration priori information according to the first data, and calibrate each sensor on the vehicle to be calibrated according to the calibration priori information and the second data, so as to obtain calibration parameters of each sensor, where the calibration priori information is data information of the calibration device, and the calibration parameters are used to characterize coordinate transformation relationships between any two sensors or between a sensor and the total station.
For the description of the calibration system for a vehicle sensor, reference may be made to the above embodiments, which are not repeated herein.
In a third aspect, the present application further provides a calibration system-based calibration device for a vehicle sensor, as shown in fig. 24, and fig. 24 is a block diagram of a calibration system-based calibration device for a vehicle sensor, where the device includes:
A memory 201 for storing a computer program;
the processor 202 is configured to implement the steps of the calibration system-based vehicle sensor calibration method described above when executing the computer program.
For the description of the calibration device for the vehicle sensor based on the calibration system, refer to the above embodiment, and the description of the present application is omitted herein.
In a fourth aspect, the present application further provides a computer readable storage medium 210, as shown in fig. 25, fig. 25 is a block diagram of a structure of the computer readable storage medium provided in the present application, where a computer program 211 is stored on the computer readable storage medium 210, and the computer program 211 implements the steps of the vehicle sensor calibration method as described above when executed by the processor 202. For the description of the computer-readable storage medium 210, refer to the above embodiments, and the disclosure is not repeated here.
It should also be noted that in this specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (32)

1.一种基于标定系统的车辆传感器标定方法,其特征在于,所述标定系统包括全站仪、多个标定装置,包括:1. A vehicle sensor calibration method based on a calibration system, characterized in that the calibration system includes a total station and multiple calibration devices, including: 在待标定车辆满足第一标定条件时,控制所述全站仪对所述标定装置进行扫描,得到第一数据;判断所述待标定车辆是否满足所述第一标定条件的过程包括:判断所述待标定车辆是否停至预设区域;若停至预设区域,则判定所述待标定车辆满足所述第一标定条件,否则,判定不满足所述第一标定条件;或,根据所述待标定车辆的车辆信息确定所述待标定车辆是否为首次标定;若为所述首次标定,则判定所述待标定车辆满足所述第一标定条件,否则,判定不满足所述第一标定条件;在判定所述待标定车辆不是首次标定时,还包括:调用数据库中存储的与所述待标定车辆对应的第一数据;When the vehicle to be calibrated meets the first calibration condition, the total station is controlled to scan the calibration device to obtain first data; the process of determining whether the vehicle to be calibrated meets the first calibration condition includes: determining whether the vehicle to be calibrated is parked in a preset area; if so, determining that the vehicle to be calibrated meets the first calibration condition, otherwise, determining that the first calibration condition is not met; or, determining whether the vehicle to be calibrated is being calibrated for the first time based on the vehicle information of the vehicle to be calibrated; if so, determining that the vehicle to be calibrated meets the first calibration condition, otherwise, determining that the first calibration condition is not met; when it is determined that the vehicle to be calibrated is not being calibrated for the first time, the process also includes: calling the first data corresponding to the vehicle to be calibrated stored in a database; 在满足第二标定条件时,接收待标定车辆上的各个所述传感器对所述标定装置进行采集,得到第二数据;判断是否满足所述第二标定条件的过程包括:判断是否接收到标定开始指令;若接收到,则判定满足所述第二标定条件,否则,判定不满足所述第二标定条件;When the second calibration condition is met, receiving data collected by the calibration device from each sensor on the vehicle to be calibrated to obtain second data; the process of determining whether the second calibration condition is met includes: determining whether a calibration start instruction is received; if so, determining 
that the second calibration condition is met; otherwise, determining that the second calibration condition is not met; 根据所述第一数据得到标定先验信息,并根据所述标定先验信息及所述第二数据对所述待标定车辆上的各个所述传感器进行标定,以得到各所述传感器的标定参数;所述标定先验信息为所述标定装置的数据信息,所述标定参数用于表征任意两个所述传感器之间或所述传感器与所述全站仪之间的坐标变换关系;Obtaining calibration priori information based on the first data, and calibrating each of the sensors on the vehicle to be calibrated based on the calibration priori information and the second data to obtain calibration parameters of each of the sensors; the calibration priori information is data information of the calibration device, and the calibration parameters are used to characterize the coordinate transformation relationship between any two of the sensors or between the sensor and the total station; 根据所述第一数据得到标定先验信息,包括:Obtaining calibration prior information according to the first data includes: 从所述第一数据中剔除不满足预设要求的数据,得到第一标定数据;Eliminating data that does not meet preset requirements from the first data to obtain first calibration data; 根据所述第一标定数据确定所述标定先验信息。The calibration prior information is determined according to the first calibration data. 2.如权利要求1所述的基于标定系统的车辆传感器标定方法,其特征在于,从所述第一数据中剔除不满足预设要求的数据,得到第一标定数据,包括:2. The vehicle sensor calibration method based on the calibration system according to claim 1, wherein removing data that does not meet preset requirements from the first data to obtain the first calibration data comprises: 从所述第一数据中剔除不包括所述标定装置的点云数据,得到所述第一标定数据。The point cloud data not including the calibration device is eliminated from the first data to obtain the first calibration data. 3.如权利要求2所述的基于标定系统的车辆传感器标定方法,其特征在于,所述标定装置包括多个标定构件,根据所述第一标定数据确定所述标定先验信息,包括:3. 
The vehicle sensor calibration method based on the calibration system according to claim 2, wherein the calibration device comprises a plurality of calibration components, and determining the calibration prior information according to the first calibration data comprises: 根据各个所述标定构件在所述标定系统中的位置对所述第一标定数据进行空间聚类分割,得到每个所述标定构件对应的第一局部点云数据;Performing spatial clustering and segmentation on the first calibration data according to the position of each calibration component in the calibration system to obtain first local point cloud data corresponding to each calibration component; 根据所述第一局部点云数据确定各个所述标定构件的第一身份信息和第一坐标信息。The first identity information and the first coordinate information of each of the calibration components are determined according to the first local point cloud data. 4.如权利要求3所述的基于标定系统的车辆传感器标定方法,其特征在于,每个所述标定构件包括至少两个标定板,根据得到第一局部点云数据确定各个所述标定构件的第一身份信息和第一坐标信息,包括:4. The vehicle sensor calibration method based on the calibration system according to claim 3, wherein each calibration component includes at least two calibration plates, and determining the first identity information and first coordinate information of each calibration component based on the obtained first local point cloud data comprises: 从所述第一局部点云数据中确定每个所述标定板对应的第二局部点云数据;Determine second local point cloud data corresponding to each calibration plate from the first local point cloud data; 根据所述第二局部点云数据确定各个所述标定板对应的第二身份信息和第二坐标信息。The second identity information and the second coordinate information corresponding to each of the calibration plates are determined according to the second local point cloud data. 5.如权利要求4所述的基于标定系统的车辆传感器标定方法,其特征在于,每个所述标定构件包括至少两个不共面的标定板,根据每个所述标定构件中各个标定板的位置及第一局部点云数据确定每个所述标定板对应的第二局部点云数据,包括:5. 
The vehicle sensor calibration method based on the calibration system according to claim 4, wherein each calibration component includes at least two non-coplanar calibration plates, and determining the second local point cloud data corresponding to each calibration plate based on the position of each calibration plate in each calibration component and the first local point cloud data comprises: 对各个所述标定板进行平面拟合,以提取各个所述标定板对应的第二局部点云数据。Plane fitting is performed on each of the calibration plates to extract second local point cloud data corresponding to each of the calibration plates. 6.如权利要求5所述的基于标定系统的车辆传感器标定方法,其特征在于,对各个所述标定板进行平面拟合,以提取各个所述标定板对应的第二局部点云数据,包括:6. The vehicle sensor calibration method based on the calibration system according to claim 5, wherein performing plane fitting on each calibration plate to extract the second local point cloud data corresponding to each calibration plate comprises: 获取各个所述标定板中心与地面之间的第一高度,随机从所述第一局部点云中筛选至少三个第二点云,计算所述第二点云所在的第二三维空间平面;Obtaining a first height between the center of each calibration plate and the ground, randomly selecting at least three second point clouds from the first local point cloud, and calculating a second three-dimensional space plane where the second point clouds are located; 计算所述第二局部点云数据中各个点云到所述第二三维空间平面的第三距离;Calculating a third distance between each point cloud in the second local point cloud data and the second three-dimensional space plane; 将所述第三距离与所述第一高度的差值不大于第三预设距离的点云对应的数据作为所述标定板对应的第二局部点云数据。The data corresponding to the point cloud in which the difference between the third distance and the first height is not greater than the third preset distance is used as the second local point cloud data corresponding to the calibration plate. 7.如权利要求4所述的基于标定系统的车辆传感器标定方法,其特征在于,所述标定板包括标定载板、多个标志码,多个所述标志码固定在所述标定载板上所述标志码携带供所述传感器识别的二维码信息,其中,所有所述标志码的二维码信息对应的身份信息均不相同;7. 
The vehicle sensor calibration method based on the calibration system according to claim 4, wherein the calibration plate comprises a calibration carrier plate and a plurality of identification codes, wherein the plurality of identification codes are fixed to the calibration carrier plate, the identification codes carrying QR code information for recognition by the sensor, wherein the identity information corresponding to the QR code information of all the identification codes is different; 根据所述第二局部点云数据确定各个所述标定板对应的第二身份信息,包括:Determining second identity information corresponding to each of the calibration plates according to the second local point cloud data includes: 将所述第二局部点云数据转化为灰度图像;Converting the second local point cloud data into a grayscale image; 根据所述标志码之间的间隔对所述灰度图像进行切分,得到每个所述标志码的局部图像;Segmenting the grayscale image according to the intervals between the marker codes to obtain a partial image of each marker code; 对每个所述局部图像进行二维码识别,以确定每个所述标志码的第一ID信息,根据每个所述标志板上的各个所述标志码的第一ID信息确定所述标定板的第二身份信息。Perform two-dimensional code recognition on each of the partial images to determine the first ID information of each of the marking codes, and determine the second identity information of the calibration plate according to the first ID information of each of the marking codes on each of the marking plates. 8.如权利要求7所述的基于标定系统的车辆传感器标定方法,其特征在于,根据所述第二局部点云数据确定各个所述标定板对应的第二坐标信息,包括:8. 
The vehicle sensor calibration method based on the calibration system according to claim 7, wherein determining the second coordinate information corresponding to each calibration plate according to the second local point cloud data comprises: 提取每个所述标志码的局部图像的外接矩形的四条边,将相邻两条边的交点确定为关键点;Extracting four sides of the circumscribed rectangle of each partial image of the logo code, and determining the intersection of two adjacent sides as a key point; 获取各个所述关键点的第一图像坐标,所述第一图像坐标表征所述关键点在所述标定板上的位置;Acquire a first image coordinate of each of the key points, where the first image coordinate represents a position of the key point on the calibration plate; 根据所述标志码在所述标定板上的位置、所述第一图像坐标及所述第二局部点云数据中各个点云的第一参考三维坐标确定各个所述关键点对应的第一三维坐标。The first three-dimensional coordinates corresponding to each key point are determined according to the position of the marker code on the calibration plate, the first image coordinates and the first reference three-dimensional coordinates of each point cloud in the second local point cloud data. 9.如权利要求8所述的基于标定系统的车辆传感器标定方法,其特征在于,在所述第一参考三维坐标中没有与所述关键点的第一图像坐标对应的三维坐标时,还包括:9. The vehicle sensor calibration method based on the calibration system according to claim 8, characterized in that when there is no three-dimensional coordinate corresponding to the first image coordinate of the key point in the first reference three-dimensional coordinates, the method further comprises: 获取与所述关键点相邻的若干个参考点对应的三维坐标;Obtaining three-dimensional coordinates corresponding to several reference points adjacent to the key point; 根据若干个所述参考点的三维坐标确定所述关键点对应的第一三维坐标。The first three-dimensional coordinate corresponding to the key point is determined according to the three-dimensional coordinates of the plurality of reference points. 10.如权利要求8所述的基于标定系统的车辆传感器标定方法,其特征在于,所述标定板上还设置有金属块,根据所述第二局部点云数据确定各个所述标定板对应的第二身份信息,还包括:10. 
The vehicle sensor calibration method based on the calibration system according to claim 8, wherein a metal block is further provided on the calibration plate, and determining the second identity information corresponding to each calibration plate according to the second local point cloud data further comprises: 根据各个所述标志码的第一ID信息确定所述金属块的第三身份信息。The third identity information of the metal block is determined according to the first ID information of each identification code. 11.如权利要求10所述的基于标定系统的车辆传感器标定方法,其特征在于,根据所述第二局部点云数据确定各个所述标定板对应的第二坐标信息,还包括:11. The vehicle sensor calibration method based on the calibration system according to claim 10, wherein determining the second coordinate information corresponding to each calibration plate according to the second local point cloud data further comprises: 根据所述金属块在所述标定板上的位置及各所述标志码对应的关键点的第一三维坐标确定所述金属块的三维坐标。The three-dimensional coordinates of the metal block are determined according to the position of the metal block on the calibration plate and the first three-dimensional coordinates of the key points corresponding to the marking codes. 12.如权利要求11所述的基于标定系统的车辆传感器标定方法,其特征在于,所述金属块固定在所述标定载板的中心,多个所述标志码固定在所述标定载板上且环绕在所述金属块的四周,多个所述标志码靠近所述金属块方向的顶点与所述金属块的顶点相连。12. The vehicle sensor calibration method based on the calibration system as described in claim 11 is characterized in that the metal block is fixed at the center of the calibration carrier, multiple marking codes are fixed on the calibration carrier and surround the metal block, and multiple vertices of the marking codes close to the direction of the metal block are connected to the vertices of the metal block. 13.如权利要求12所述的基于标定系统的车辆传感器标定方法,其特征在于,根据所述金属块在所述标定板上的位置及各所述标志码对应的关键点的第一三维坐标确定所述金属块的三维坐标,包括:13. 
The vehicle sensor calibration method based on the calibration system according to claim 12 , wherein determining the three-dimensional coordinates of the metal block based on the position of the metal block on the calibration plate and the first three-dimensional coordinates of the key points corresponding to the marker codes comprises: 根据各个所述标志码靠近所述金属块方向的顶点对应的关键点的第一三维坐标确定所述金属块的四个角点的三维坐标;Determine the three-dimensional coordinates of the four corner points of the metal block according to the first three-dimensional coordinates of the key points corresponding to the vertices of each of the marking codes close to the metal block; 根据所述金属块的四个角点的三维坐标确定所述金属块的中心三维坐标。The three-dimensional coordinates of the center of the metal block are determined according to the three-dimensional coordinates of the four corner points of the metal block. 14.如权利要求11所述的基于标定系统的车辆传感器标定方法,其特征在于,所述传感器包括待标定相机时,控制待标定车辆上的各个所述传感器对所述标定装置进行采集,得到第二数据,包括:14. The vehicle sensor calibration method based on the calibration system according to claim 11, wherein when the sensor includes a camera to be calibrated, controlling each sensor on the vehicle to be calibrated to collect data from the calibration device to obtain the second data comprises: 控制所述待标定相机对所述标定装置进行采集,得到第一标定图像;Controlling the camera to be calibrated to collect data from the calibration device to obtain a first calibration image; 对所述第一标定图像中的标志码进行识别,确定所述第一标定图像中各个所述标志码的第二ID信息和局部图像;Identify the marker codes in the first calibration image and determine the second ID information and the partial image of each marker code in the first calibration image; 根据所述第一标定图像中各个所述标志码的ID信息和局部图像确定所述第一标定图像中包括的各个关键点的第二图像坐标;determining second image coordinates of each key point included in the first calibration image according to the ID information of each marker code and the partial image in the first calibration image; 根据所述第一标定图像中各个所述标志码的ID信息、各个所述关键点的第二图像坐标及所述第一参考三维坐标确定所述第一标定图像中各个所述关键点对应的第二三维坐标;Determining the second three-dimensional coordinates corresponding 
to each key point in the first calibration image according to the ID information of each marker code in the first calibration image, the second image coordinates of each key point, and the first reference three-dimensional coordinates; 根据所述第二图像坐标和所述第二三维坐标确定所述待标定相机的内参。Determine the intrinsic parameters of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates. 15.如权利要求14所述的基于标定系统的车辆传感器标定方法,其特征在于,根据所述第二图像坐标和所述第二三维坐标确定所述待标定相机的内参之后,还包括:15. The vehicle sensor calibration method based on the calibration system according to claim 14, characterized in that after determining the intrinsic parameters of the camera to be calibrated according to the second image coordinates and the second three-dimensional coordinates, the method further comprises: 根据所述第二图像坐标、所述第二三维坐标及所述待标定相机的内参确定所述待标定相机的外参;Determine the extrinsic parameters of the camera to be calibrated according to the second image coordinates, the second three-dimensional coordinates, and the intrinsic parameters of the camera to be calibrated; 所述待标定相机的外参至少包括待标定相机坐标系与全站仪坐标系之间的第一坐标变换关系。The external parameters of the camera to be calibrated include at least a first coordinate transformation relationship between the coordinate system of the camera to be calibrated and the coordinate system of the total station. 16.如权利要求15所述的基于标定系统的车辆传感器标定方法,其特征在于,所述传感器包括待标定环视相机时,控制待标定车辆上的各个所述传感器对所述标定装置进行采集,得到第二数据,包括:16. 
The vehicle sensor calibration method based on the calibration system according to claim 15, wherein when the sensor includes a surround view camera to be calibrated, controlling each sensor on the vehicle to be calibrated to collect data from the calibration device to obtain the second data comprises: 控制所述待标定环视相机对所述标定装置进行采集,得到第二标定图像;Controlling the surround-view camera to be calibrated to collect data from the calibration device to obtain a second calibration image; 利用环视拼接单应矩阵对所述第二标定图像进行投影,得到投影图像;Projecting the second calibration image using the surround stitching homography matrix to obtain a projected image; 根据相邻两个所述待标定环视相机对应的投影图像中相同的关键点计算相邻两个所述待标定环视相机的坐标系之间的第二坐标变换关系。A second coordinate transformation relationship between the coordinate systems of two adjacent surround-view cameras to be calibrated is calculated according to the same key points in the projection images corresponding to the two adjacent surround-view cameras to be calibrated. 17.如权利要求16所述的基于标定系统的车辆传感器标定方法,其特征在于,控制所述待标定环视相机对所述标定装置进行采集,得到第二标定图像之后,还包括:17. 
The vehicle sensor calibration method based on the calibration system according to claim 16, characterized in that after controlling the surround view camera to be calibrated to collect data from the calibration device to obtain the second calibration image, the method further comprises: 提取所述第二标定图像中各个标志码的ID信息以及各个关键点的第三图像坐标;Extracting the ID information of each marker code and the third image coordinates of each key point in the second calibration image; 根据所述第二标定图像中各个标志码的ID信息以及各个关键点的第三图像坐标筛选所述第二标定图像中位于地面上的关键点,并获取位于地面上的各个所述关键点的第三三维坐标;Filtering key points on the ground in the second calibration image according to the ID information of each marker code in the second calibration image and the third image coordinates of each key point, and obtaining third three-dimensional coordinates of each key point on the ground; 根据位于地面上的各个所述关键点的第三三维坐标确定环视拼接图像;determining a surround stitching image according to the third three-dimensional coordinates of each of the key points located on the ground; 利用所述标定先验信息中与所述环视拼接图像中关键点对应的第一参考三维坐标对所述环视拼接图像中关键点的第四三维坐标进行校准,以得到校准后的环视拼接图像;calibrating the fourth three-dimensional coordinates of the key points in the surround view stitched image using the first reference three-dimensional coordinates corresponding to the key points in the surround view stitched image in the calibration prior information to obtain a calibrated surround view stitched image; 根据校准后的所述环视拼接图像得到所述环视拼接单应矩阵。The surround view stitching homography matrix is obtained according to the calibrated surround view stitching image. 18.如权利要求15所述的基于标定系统的车辆传感器标定方法,其特征在于,所述传感器包括主激光雷达时,控制待标定车辆上的各个所述传感器对所述标定装置进行采集,得到第二数据,包括:18. 
The vehicle sensor calibration method based on the calibration system according to claim 15, wherein when the sensor includes a primary laser radar, controlling each sensor on the vehicle to be calibrated to collect data from the calibration device to obtain the second data comprises: 控制所述主激光雷达对所述标定装置进行采集,得到第一测试点云数据;Controlling the main laser radar to collect data from the calibration device to obtain first test point cloud data; 将所述第一测试点云数据与所述第一数据进行匹配,以得到主激光雷达坐标系与所述全站仪坐标系之间的第三坐标变换关系。The first test point cloud data is matched with the first data to obtain a third coordinate transformation relationship between the main lidar coordinate system and the total station coordinate system. 19.如权利要求18所述的基于标定系统的车辆传感器标定方法,其特征在于,控制所述主激光雷达对所述标定装置进行采集,得到第一测试点云数据之后,还包括:19. The vehicle sensor calibration method based on the calibration system according to claim 18, characterized in that after controlling the main laser radar to collect data from the calibration device to obtain the first test point cloud data, the method further comprises: 将所述第一测试点云数据与所述第一数据进行匹配,得到所述主激光雷达的视场范围内的标志码的ID信息及关键点的第五三维坐标;Matching the first test point cloud data with the first data to obtain ID information of the marker code and the fifth three-dimensional coordinates of the key points within the field of view of the main laser radar; 根据所述第一标定图像中各个所述标志码的ID信息、各个所述关键点对应的第二三维坐标、所述主激光雷达的视场范围内的标志码的ID信息及关键点的第五三维坐标确定所述第一标定图像中与所述主激光雷达的视场范围内的相同关键点,并确定所述相同关键点的三维坐标;Determine, based on the ID information of each marker code in the first calibration image, the second three-dimensional coordinates corresponding to each key point, the ID information of the marker code within the field of view of the main laser radar, and the fifth three-dimensional coordinates of the key point, the key point in the first calibration image that is identical to the key point within the field of view of the main laser radar, and determine the three-dimensional coordinates of the identical key point; 
calculating a fourth coordinate transformation relationship between the coordinate system of the camera to be calibrated and the main lidar coordinate system according to the three-dimensional coordinates of the same key points.

20. The vehicle sensor calibration method based on a calibration system according to claim 19, characterized in that, when the sensors comprise a millimeter-wave radar, metal blocks are provided on the calibration plates, and the calibration system further comprises two rails and a conveyor belt device arranged on the rails, the two rails being parallel to each other, the distance between the two rails being equal to the spacing between the left and right wheels of the vehicle to be calibrated, and the vehicle to be calibrated being positioned on the conveyor belt device during sensor calibration;

before controlling each of the sensors on the vehicle to be calibrated to acquire data of the calibration devices to obtain the second data, the method further comprises:

controlling the conveyor belt device so that the vehicle to be calibrated moves at a preset speed;

controlling each of the sensors on the vehicle to be calibrated to acquire data of the calibration devices to obtain the second data comprises:

during the movement of the vehicle to be calibrated, triggering the millimeter-wave radar to acquire the coordinates of the metal blocks within its field of view to obtain acquisition data, the acquisition data comprising at least the metal block coordinates;

determining alignment data between the first test point cloud data and the acquisition data according to data timestamps;

projecting the sixth three-dimensional coordinates of the metal blocks in the first test point cloud data of the alignment data onto a horizontal plane to obtain horizontal-plane projection data;

calculating a fifth coordinate transformation relationship between the millimeter-wave radar coordinate system and the main lidar coordinate system according to the horizontal-plane projection data, the metal block coordinates in the acquisition data of the alignment data, and the position of each metal block in the calibration system.

21. The vehicle sensor calibration method based on a calibration system according to claim 20, characterized in that, when the sensors comprise an auxiliary lidar, controlling each of the sensors on the vehicle to be calibrated to acquire data of the calibration devices to obtain the second data comprises:

controlling the auxiliary lidar to acquire data of the calibration devices to obtain second test point cloud data;

matching the second test point cloud data with the first test point cloud data to obtain a sixth coordinate transformation relationship between the auxiliary lidar coordinate system and the main lidar coordinate system.

22. The vehicle sensor calibration method based on a calibration system according to any one of claims 1-21, characterized in that the calibration devices comprise a plurality of verification components, and determining the calibration prior information according to the first calibration data comprises:

performing spatial clustering segmentation on the first calibration data according to the position of each verification component in the calibration system to obtain third local point cloud data corresponding to each verification component;

determining fifth identity information and fifth coordinate information corresponding to each verification component according to the third local point cloud data.

23. The vehicle sensor calibration method based on a calibration system according to claim 22, characterized in that determining the fifth identity information and the fifth coordinate information corresponding to each verification component according to the third local point cloud data comprises:

fitting the third local point cloud data to obtain structural information and three-dimensional coordinates of each verification component.

24. The vehicle sensor calibration method based on a calibration system according to claim 23, characterized in that the verification components are verification spheres, and fitting the third local point cloud data to obtain the structural information and three-dimensional coordinates of each verification component comprises:

performing spherical fitting on the third local point cloud data to obtain the radius information and three-dimensional coordinates of the verification spheres.
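Claims 22-24 above reduce the prior-information step to clustering the scan around each verification sphere and fitting a sphere to each cluster. The fit is a linear least-squares problem, because x² + y² + z² = 2c·(x, y, z) + (r² − |c|²) is linear in the center c and the combined term r² − |c|². The sketch below illustrates only this fitting step; the function name `fit_sphere` and the synthetic point cloud are illustrative assumptions, not part of the patent.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit.

    Solves x^2 + y^2 + z^2 = 2*c.(x, y, z) + (r^2 - |c|^2), which is linear
    in the unknowns (c, r^2 - |c|^2). Returns (center, radius).
    """
    pts = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])  # unknowns: cx, cy, cz, k
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))   # r = sqrt(k + |c|^2)
    return center, radius

# Synthetic "third local point cloud": 200 points on a sphere of radius
# 0.3 m centred at (1, 2, 0.5), standing in for one clustered verification sphere.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
c, r = fit_sphere(np.array([1.0, 2.0, 0.5]) + 0.3 * dirs)
```

In practice the lidar sees only one side of each sphere, so real fits use a partial cap of points and often a robust (RANSAC-style) variant; the linear formulation above still applies to each candidate set.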
25. The vehicle sensor calibration method based on a calibration system according to claim 24, characterized in that, after obtaining the calibration prior information according to the first data and calibrating each of the sensors on the vehicle to be calibrated according to the calibration prior information and the second data, the method further comprises:

verifying the calibration parameters of each of the sensors according to the radius information and three-dimensional coordinates of each verification sphere.

26. The vehicle sensor calibration method based on a calibration system according to claim 25, characterized in that, when the calibration parameters of the sensors comprise the third coordinate transformation relationship between the main lidar coordinate system and the total station coordinate system, verifying the calibration parameters of each of the sensors according to the radius information and three-dimensional coordinates of each verification sphere comprises:

obtaining the seventh three-dimensional coordinates of each verification sphere within the field of view of the main lidar;

performing coordinate transformation on the seventh three-dimensional coordinates according to the third coordinate transformation relationship to obtain seventh three-dimensional coordinates to be compared;

comparing the seventh three-dimensional coordinates to be compared with seventh reference three-dimensional coordinates to determine whether the third coordinate transformation relationship is accurate;

wherein the seventh reference three-dimensional coordinates are the three-dimensional coordinates, obtained by the total station, that correspond to the verification components within the field of view of the main lidar.

27. The vehicle sensor calibration method based on a calibration system according to claim 25, characterized in that, when the calibration parameters of the sensors comprise the fourth coordinate transformation relationship between the coordinate system of the camera to be calibrated and the main lidar coordinate system, verifying the calibration parameters of each of the sensors according to the radius information and three-dimensional coordinates of each verification sphere comprises:

obtaining first verification point cloud data of the verification spheres within the field of view of the main lidar;

back-projecting the first verification point cloud data onto the camera image according to the fourth coordinate transformation relationship;

performing preset processing on the back-projected camera image to obtain circular edges in the camera image, and determining corresponding first circle-fitting equations according to the circular edges;

comparing the first circle-fitting equations with reference circle-fitting equations to determine whether the fourth coordinate transformation relationship is accurate.

28. The vehicle sensor calibration method based on a calibration system according to claim 25, characterized in that the verification spheres are metal spheres, and when the calibration parameters of the sensors comprise the fifth coordinate transformation relationship between the millimeter-wave radar coordinate system and the main lidar coordinate system, verifying the calibration parameters of each of the sensors according to the radius information and three-dimensional coordinates of each verification sphere comprises:

obtaining the millimeter-wave radar coordinates and angles of each verification sphere within the field of view of the millimeter-wave radar;

obtaining the seventh three-dimensional coordinates within the field of view of the main lidar;

projecting the millimeter-wave radar coordinates and angles onto a two-dimensional plane of the main lidar coordinate system according to the fifth coordinate transformation relationship;

calculating Euclidean distances from the projected millimeter-wave radar coordinates and angles and the seventh three-dimensional coordinates to determine whether the fifth coordinate transformation relationship is accurate.

29. The vehicle sensor calibration method based on a calibration system according to claim 25, characterized in that, when the calibration parameters of the sensors comprise the sixth coordinate transformation relationship between the auxiliary lidar coordinate system and the main lidar coordinate system, verifying the calibration parameters of each of the sensors according to the radius information and three-dimensional coordinates of each verification sphere comprises:

obtaining first verification point cloud data of the verification spheres within the field of view of the main lidar;

obtaining second verification point cloud data of the verification spheres within the field of view of the auxiliary lidar;

transforming the second verification point cloud data into the main lidar coordinate system according to the sixth coordinate transformation relationship to obtain second verification point cloud data to be verified;

determining whether the sixth coordinate transformation relationship is accurate according to the first verification point cloud data and the second verification point cloud data to be verified.

30. A vehicle sensor calibration system based on a calibration system, characterized in that the calibration system comprises a total station and a plurality of calibration devices, and further comprises a computing device, the computing device comprising:

a first control unit configured to control the total station to scan the calibration devices to obtain first data when the vehicle to be calibrated satisfies a first calibration condition; the process of determining whether the vehicle to be calibrated satisfies the first calibration condition comprises: determining whether the vehicle to be calibrated has stopped in a preset area; if it has stopped in the preset area, determining that the vehicle to be calibrated satisfies the first calibration condition, and otherwise determining that the first calibration condition is not satisfied; or determining, according to vehicle information of the vehicle to be calibrated, whether the vehicle to be calibrated is being calibrated for the first time; if it is the first calibration, determining that the vehicle to be calibrated satisfies the first calibration condition, and otherwise determining that the first calibration condition is not satisfied; and, when it is determined that the vehicle to be calibrated is not being calibrated for the first time, further retrieving the first data corresponding to the vehicle to be calibrated stored in a database;

a second control unit configured to receive, when a second calibration condition is satisfied, second data obtained by each of the sensors on the vehicle to be calibrated acquiring data of the calibration devices; the process of determining whether the second calibration condition is satisfied comprises: determining whether a calibration start instruction has been received; if it has been received, determining that the second calibration condition is satisfied, and otherwise determining that the second calibration condition is not satisfied;

a calibration unit configured to obtain calibration prior information according to the first data, and to calibrate each of the sensors on the vehicle to be calibrated according to the calibration prior information and the second data to obtain calibration parameters of each of the sensors; the calibration prior information is data information of the calibration devices, and the calibration parameters are used to characterize the coordinate transformation relationship between any two of the sensors or between a sensor and the total station; obtaining the calibration prior information according to the first data comprises: removing data that does not satisfy preset requirements from the first data to obtain first calibration data, and determining the calibration prior information according to the first calibration data.

31. A vehicle sensor calibration apparatus based on a calibration system, characterized by comprising:

a memory for storing a computer program;

a processor for implementing the steps of the vehicle sensor calibration method based on a calibration system according to any one of claims 1-29 when executing the computer program.

32. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the vehicle sensor calibration method based on a calibration system according to any one of claims 1-29 are implemented.
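The verification claims (25, 26 and 29) share one pattern: map a sphere centre measured in one sensor's frame through the estimated coordinate transformation and compare it with a reference centre by Euclidean distance. The sketch below illustrates that pattern only; the function name `check_extrinsic`, the 4×4 homogeneous-matrix convention and the 5 cm default tolerance are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def check_extrinsic(T, sensor_pts, reference_pts, tol=0.05):
    """Verify an estimated extrinsic transform.

    Applies the 4x4 homogeneous transform T to sphere centres measured in the
    sensor frame and compares them with reference centres in the target frame.
    Returns (ok, per-point Euclidean errors); ok is True when all errors < tol.
    """
    pts = np.asarray(sensor_pts, dtype=float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])     # to homogeneous coords
    mapped = (T @ homo.T).T[:, :3]                      # transform, drop w = 1
    errs = np.linalg.norm(mapped - np.asarray(reference_pts, dtype=float), axis=1)
    return bool(np.all(errs < tol)), errs

# Toy example: T is a 90-degree yaw plus a translation; the reference centres
# are generated with the same transform, so the check should pass exactly.
T = np.eye(4)
T[:3, :3] = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
T[:3, 3] = [1.0, 0.0, 0.2]
sensor = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 1.0]])
reference = (T[:3, :3] @ sensor.T).T + T[:3, 3]
ok, errs = check_extrinsic(T, sensor, reference)
```

For the lidar-to-lidar case (claim 29) the same comparison is usually run on fitted sphere centres rather than raw points, which keeps the check independent of point-cloud density.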
CN202311113474.2A 2023-08-31 2023-08-31 Calibration system-based vehicle sensor calibration method, system, device and medium Active CN119559259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311113474.2A CN119559259B (en) 2023-08-31 2023-08-31 Calibration system-based vehicle sensor calibration method, system, device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311113474.2A CN119559259B (en) 2023-08-31 2023-08-31 Calibration system-based vehicle sensor calibration method, system, device and medium

Publications (2)

Publication Number Publication Date
CN119559259A CN119559259A (en) 2025-03-04
CN119559259B true CN119559259B (en) 2025-09-26

Family

ID=94745275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311113474.2A Active CN119559259B (en) 2023-08-31 2023-08-31 Calibration system-based vehicle sensor calibration method, system, device and medium

Country Status (1)

Country Link
CN (1) CN119559259B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112241007A (en) * 2020-07-01 2021-01-19 北京新能源汽车技术创新中心有限公司 Calibration method and arrangement structure of automatic driving environment perception sensor and vehicle
CN112750165A (en) * 2019-10-29 2021-05-04 商汤集团有限公司 Parameter calibration method, intelligent driving method and device, equipment and storage medium thereof

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109211298B (en) * 2017-07-04 2021-08-17 百度在线网络技术(北京)有限公司 Sensor calibration method and device
CN113767264A (en) * 2020-03-05 2021-12-07 深圳市大疆创新科技有限公司 Parameter calibration method, device, system and storage medium
CN111735479B (en) * 2020-08-28 2021-03-23 中国计量大学 Multi-sensor combined calibration device and method
CN113205563B (en) * 2021-06-03 2022-11-18 河南科技大学 Automatic driving sensor combined calibration target and calibration method
CN114578329A (en) * 2022-03-01 2022-06-03 亿咖通(湖北)技术有限公司 Multi-sensor joint calibration method, device, storage medium and program product
CN114706060B (en) * 2022-04-19 2025-08-01 中国铁建重工集团股份有限公司 Vehicle-mounted multi-laser radar calibration method, device, equipment and storage medium
CN116106870A (en) * 2023-01-31 2023-05-12 新石器慧通(北京)科技有限公司 Calibration method and device for external parameters of vehicle laser radar

Also Published As

Publication number Publication date
CN119559259A (en) 2025-03-04

Similar Documents

Publication Publication Date Title
Yan et al. Joint camera intrinsic and LiDAR-camera extrinsic calibration
Kang et al. Automatic targetless camera–lidar calibration by aligning edge with gaussian mixture model
Daftry et al. Building with drones: Accurate 3D facade reconstruction using MAVs
US6906620B2 (en) Obstacle detection device and method therefor
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN104574406A (en) Joint calibration method between 360-degree panorama laser and multiple visual systems
CN111932565A (en) A Multi-target Recognition Tracking Solution Method
CN110763204B (en) Planar coding target and pose measurement method thereof
Taylor et al. Automatic calibration of multi-modal sensor systems using a gradient orientation measure
CN110851978B (en) A visibility-based camera position optimization method
Xie et al. A4LiDARTag: Depth-based fiducial marker for extrinsic calibration of solid-state LiDAR and camera
CN113724333A (en) Space calibration method and system of radar equipment
CN118411507A (en) A method and system for constructing a semantic map of a scene with dynamic targets
Wang et al. Autonomous landing of multi-rotors UAV with monocular gimbaled camera on moving vehicle
Jeong et al. O³ LiDAR–camera calibration: One-shot, one-target and overcoming LiDAR limitations
JPH07103715A (en) Method and apparatus for recognizing three-dimensional position and attitude based on visual sense
CN117115242A (en) Identification method of mark point, computer storage medium and terminal equipment
Goronzy et al. QRPos: Indoor positioning system for self-balancing robots based on QR codes
Kim et al. Extrinsic calibration of a camera and a 2D LiDAR using a dummy camera with IR cut filter removed
CN119559259B (en) Calibration system-based vehicle sensor calibration method, system, device and medium
Kim et al. An automatic robust point cloud registration on construction sites
CN119533543A (en) Vehicle sensor calibration method, system, device and storage medium
CN119183553A (en) Method for determining a group of points that are visible or invisible from a given observation point
Bazin et al. Dynamic programming and skyline extraction in catadioptric infrared images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant