CN116468804B - Laser radar and camera external parameter calibration precision evaluation method and device
Publication number: CN116468804B · Application: CN202310437131.5A · Authority: CN (China) · Legal status: Active
Classifications
- G06T7/85 — Stereo camera calibration (under G06T7/80: analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration)
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/30244 — Camera pose
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention provides a laser radar and camera external parameter calibration precision assessment method, device, equipment and storage medium. The method comprises the following steps: acquiring image data of a calibration plate recorded by a camera, and point cloud data of the calibration plate recorded by a laser radar; obtaining a calibration plate image from the image data, and obtaining first pixel coordinate information of a target position on the calibration plate according to the marking information of the calibration plate in the image; extracting point cloud coordinate information of the target position from the point cloud data, and converting it into second pixel coordinate information of the target position according to the calibrated external parameters of the radar and the camera and the camera internal parameters; and calculating error information between the first and second pixel coordinate information, and determining an external parameter calibration precision assessment result from the error information. The invention can quantitatively describe the radar-camera external parameter calibration precision and effectively improves the efficiency of verifying it.
Description
Technical Field
The invention relates to the technical field of parameter calibration, in particular to a method, a device, equipment and a storage medium for evaluating the external parameter calibration precision of a laser radar and a camera.
Background
In the field of autonomous driving, multi-line lidar and camera sensors are commonly used together. Fusing their data requires an external parameter calibration between the radar and the camera, and the precision of this multi-line lidar-camera external parameter calibration needs to be assessed.
Currently, the quality of the external parameter calibration is judged by projecting the laser point cloud onto the camera picture and manually observing how well the projected points coincide with the image content. This verification approach is time-consuming and labor-intensive and yields no quantitative description of the calibration precision.
Disclosure of Invention
The invention aims to provide a method, a device, equipment and a storage medium for evaluating the external parameter calibration precision of a laser radar and a camera, so as to solve the above technical problems: to improve the efficiency of verifying the radar-camera external parameter calibration precision and to describe that precision quantitatively.
In order to solve the technical problems, the invention provides a laser radar and camera external parameter calibration precision assessment method, which comprises the following steps:
acquiring image data recorded by a camera aiming at a calibration plate, and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
acquiring a calibration plate image based on the image data, and acquiring first pixel coordinate information of a target position in a calibration plate according to marking information of the calibration plate in the calibration plate image;
extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the calibrated external parameters of the radar and the camera and the camera internal parameters;
and calculating error information of the first pixel coordinate information and the second pixel coordinate information, and determining an external parameter calibration precision assessment result according to the error information.
Further, the obtaining the first pixel coordinate information of the target position in the calibration plate according to the marking information of the calibration plate in the calibration plate image specifically includes:
acquiring the marking information from the calibration plate image, and obtaining the pose transformation information of the target position from the world coordinate system to the camera coordinate system according to the predetermined positional relation between the marking information and the target position;
and transforming the pose transformation information into the pixel coordinate system according to the camera internal parameters, obtaining the first pixel coordinate information of the target position in the calibration plate in the pixel coordinate system.
Further, the extracting the point cloud coordinate information of the target position in the calibration plate based on the point cloud data specifically includes:
screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, and merging the multi-frame point cloud data corresponding to the same recording position into single-frame point cloud data;
transforming the single-frame point cloud data from three-dimensional space to a two-dimensional plane according to the acquired RT matrix, sliding a rectangular frame circumscribing the hollow circle across the two-dimensional laser point cloud plane of the calibration plate at a certain step length based on the scale information of the calibration plate, and taking the position where the point cloud area within the rectangular frame is minimal as the target position in the calibration plate;
and converting the target position in the calibration plate according to the inverse matrix of the RT matrix to obtain the point cloud coordinate information of the target position in the calibration plate.
Further, the calculating the error information of the first pixel coordinate information and the second pixel coordinate information specifically includes:
and moving the radar and the camera to different recording positions, acquiring multiple groups of first and second pixel coordinate information, and calculating the average error across the groups as the error information of the first pixel coordinate information and the second pixel coordinate information.
Further, the step of determining the evaluation result of the external parameter calibration precision according to the error information specifically includes:
and comparing the error information with preset error standard information to obtain an external parameter calibration precision assessment result of the radar and the camera.
Further, the marking information of the calibration plate is a plurality of ar codes preset on the calibration plate.
Further, the target position in the calibration plate is a hollow circle positioned at the center of the calibration plate.
The invention also provides a laser radar and camera external parameter calibration precision assessment device, which comprises:
the data acquisition module is used for acquiring image data recorded by a camera aiming at the calibration plate and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
the first coordinate acquisition module is used for acquiring a calibration plate image based on the image data and acquiring first pixel coordinate information of a target position in the calibration plate according to the mark information of the calibration plate in the calibration plate image;
the second coordinate acquisition module is used for extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the external calibration parameters of the radar and the camera and the internal camera parameters;
and the calibration precision evaluation module is used for calculating error information of the first pixel coordinate information and the second pixel coordinate information and determining an external parameter calibration precision evaluation result according to the error information.
The invention also provides a terminal device which comprises a processor and a memory storing a computer program, wherein the processor realizes any laser radar and camera external parameter calibration precision evaluation method when executing the computer program.
The invention also provides a non-transitory computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the method for evaluating the calibration accuracy of the lidar and the camera external parameters.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a laser radar and camera external parameter calibration precision assessment method, a device, equipment and a storage medium, wherein the method comprises the following steps: acquiring image data recorded by a camera aiming at a calibration plate, and acquiring point cloud data recorded by a laser radar aiming at the calibration plate; acquiring a calibration plate image based on the image data, and acquiring first pixel coordinate information of a target position in a calibration plate according to marking information of the calibration plate in the calibration plate image; extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to a calibration external reference of the radar and the camera and a camera internal reference; and calculating error information of the first pixel coordinate information and the second pixel coordinate information, and determining an external parameter calibration precision assessment result according to the error information. According to the method, the error calculation is carried out on the point cloud data of the radar and the image data of the camera in a coordinate transformation mode, so that quantitative description can be carried out on the calibration precision of the radar and the camera external parameters, and the efficiency of verification of the calibration precision of the radar and the camera external parameters is effectively improved.
Drawings
FIG. 1 is a schematic flow chart of a method for evaluating the calibration accuracy of a laser radar and a camera external parameter provided by the invention;
FIG. 2 is a second flow chart of the method for evaluating the calibration accuracy of the external parameters of the laser radar and the camera provided by the invention;
FIG. 3 is a schematic view of a calibration plate structure provided by the present invention;
FIG. 4 is a schematic view, provided by the present invention, of the laser scan ring tangent to the calibration plate surface;
FIG. 5 is a schematic diagram of a coordinate transformation process provided by the present invention;
fig. 6 is a schematic structural diagram of the laser radar and camera external parameter calibration accuracy assessment device provided by the invention.
Detailed Description
The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. The described embodiments are apparently only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, the embodiment of the invention provides a method for evaluating calibration accuracy of a laser radar and a camera external parameter, which comprises the following steps:
s1, acquiring image data recorded by a camera aiming at a calibration plate, and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
s2, acquiring a calibration plate image based on the image data, and acquiring first pixel coordinate information of a target position in a calibration plate according to marking information of the calibration plate in the calibration plate image;
s3, extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the external calibration parameters of the radar and the camera and the internal camera parameters;
s4, calculating error information of the first pixel coordinate information and the second pixel coordinate information, and determining an external parameter calibration precision assessment result according to the error information.
In the embodiment of the present invention, further, the obtaining, according to the marking information of the calibration plate in the calibration plate image, the first pixel coordinate information of the target position in the calibration plate specifically includes:
acquiring the marking information from the calibration plate image, and obtaining the pose transformation information of the target position from the world coordinate system to the camera coordinate system according to the predetermined positional relation between the marking information and the target position;
and transforming the pose transformation information into the pixel coordinate system according to the camera internal parameters, obtaining the first pixel coordinate information of the target position in the calibration plate in the pixel coordinate system.
In the embodiment of the present invention, further, the extracting, based on the point cloud data, point cloud coordinate information of the target position in the calibration plate specifically includes:
screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, and merging the multi-frame point cloud data corresponding to the same recording position into single-frame point cloud data;
transforming the single-frame point cloud data from three-dimensional space to a two-dimensional plane according to the acquired RT matrix, sliding a rectangular frame circumscribing the hollow circle across the two-dimensional laser point cloud plane of the calibration plate at a certain step length based on the scale information of the calibration plate, and taking the position where the point cloud area within the rectangular frame is minimal as the target position in the calibration plate;
and converting the target position in the calibration plate according to the inverse matrix of the RT matrix to obtain the point cloud coordinate information of the target position in the calibration plate.
In the embodiment of the present invention, further, the calculating the error information of the first pixel coordinate information and the second pixel coordinate information specifically includes:
and moving the radar and the camera to different recording positions, acquiring multiple groups of first and second pixel coordinate information, and calculating the average error across the groups as the error information of the first pixel coordinate information and the second pixel coordinate information.
In the embodiment of the present invention, further, the determining the external parameter calibration accuracy assessment result according to the error information specifically includes:
and comparing the error information with preset error standard information to obtain an external parameter calibration precision assessment result of the radar and the camera.
In the embodiment of the invention, further, the marking information of the calibration plate is a plurality of ar codes preset on the calibration plate.
In the embodiment of the invention, further, the target position in the calibration plate is a hollow circle positioned at the center of the calibration plate.
Referring to fig. 2, based on the above scheme, in order to better understand the method for evaluating the calibration accuracy of the external parameters of the lidar and the camera provided by the embodiment of the invention, the following details are described:
the embodiment of the invention can be realized by the following steps:
1) Selecting a special calibration plate:
a rectangular hard plate can be used, with hollowed-out concentric circles at its center position and an ar code attached at each of its four corners; the sizes and relative positions of these components are known, as shown in fig. 3.
2) The recording mode is as follows:
2-1) determining the distance between the calibration plate and the position of the multi-line laser radar according to actual use requirements;
2-2) the calibration plate surface should be as tangent as possible to the multi-line laser scan rings, so that as many laser scan lines as possible fall on the calibration plate surface;
as shown in fig. 4, the left side shows the laser scan ring from a top view (only one scan line is drawn for illustration), and the right side shows the calibration plate;
2-3) the calibration plate needs to be within the fov of the target camera;
2-4) during data recording, the setup must remain stationary at each position for 5 seconds before moving to the next acquisition position.
3) Extracting the pixel coordinates of the circle center according to the picture ar code information:
3-1) selecting a frame of picture, identifying the four corner ar codes in the picture, and obtaining the position information corresponding to each ar code;
Principle:
3-1-1) an ar code provides the pose transformation from its own position to the camera;
3-1-2) opencv provides an ar code recognition interface, through which the positions of the ar codes around the calibration plate, and hence their pose transformations to the camera, can be obtained;
3-2) calculating the position information of the center point (circle center) from the position information corresponding to the ar codes;
3-2-1) since the positions of the four ar codes on the calibration plate are known (the ar code patches are placed manually at known positions) and the position of the circle on the calibration plate is known, the positional relation between the four ar codes and the circle center can be determined; from it, the pose transformation from the circle center to the camera (i.e., from the world coordinate system to the camera coordinate system) is calculated;
3-3) converting the circle center position information into pixel coordinate system information;
3-3-1) starting from the pose transformation from the calibration plate circle center to the camera (referred to here as the camera coordinate system) obtained in the previous step; the relation between this position information and the pixel coordinate system can be understood from the camera-coordinate-to-pixel-coordinate relationship in fig. 5, and the mathematical relationship is given in 5-2-2);
3-3-2) according to the camera calibration procedure, the internal parameters of the camera can be obtained;
3-3-3) the camera coordinate system can be transformed to the image coordinate system according to the internal parameters of the camera, and then transformed to the pixel coordinate system;
3-3-4) so as to obtain the coordinate information of the circle center of the calibration plate in the pixel coordinate system Ouv.
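To make step 3 concrete, here is a minimal Python sketch of the circle-center pixel extraction — a sketch under stated assumptions, not the patent's implementation. It assumes the classic cv2.aruco.detectMarkers API (OpenCV before 4.7), a hypothetical DICT_4X4_50 marker dictionary, and illustrative marker-corner coordinates on the board; the patent fixes none of these choices.

```python
import cv2
import numpy as np

# Illustrative board geometry (not from the patent): world origin at the
# hollow-circle center, marker corner coordinates in meters, board plane z=0.
MARKER_CORNERS_3D = {
    0: np.array([[-0.35, 0.35, 0], [-0.25, 0.35, 0],
                 [-0.25, 0.25, 0], [-0.35, 0.25, 0]], dtype=np.float32),
    # ids 1-3 would hold the corner coordinates of the other three markers
}

def circle_center_pixel(img, K, dist):
    """Return (u1, v1): the board circle center projected into the image."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)   # 3-1)
    obj_pts, img_pts = [], []
    for quad, marker_id in zip(corners, ids.flatten()):
        if marker_id in MARKER_CORNERS_3D:
            obj_pts.append(MARKER_CORNERS_3D[marker_id])
            img_pts.append(quad.reshape(4, 2))
    obj_pts = np.concatenate(obj_pts).astype(np.float32)
    img_pts = np.concatenate(img_pts).astype(np.float32)
    # 3-2) pose of the board (world frame) in the camera frame, from the
    # known marker geometry
    _, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    # 3-3) project the circle center (the world origin) into pixel coordinates
    uv, _ = cv2.projectPoints(np.zeros((1, 3), np.float32), rvec, tvec, K, dist)
    return uv.reshape(2)
```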
4) Extracting the center coordinates of the point cloud according to the multi-line laser point cloud information:
4-1) selecting all point cloud data corresponding to the picture recording position;
4-1-1) the camera and laser radar data are recorded synchronously, subject to the calibration plate lying within the camera fov and to the laser point distance constraint; the calibration plate remains stationary at each placement position for 5 seconds, and the calibration plate point cloud data are then selected according to the recording time;
4-2) merging the selected point cloud data into a single frame of point cloud before extraction, so that the point cloud is sufficiently dense and the circle center can be extracted more accurately;
4-2-1) the multi-frame calibration plate laser point clouds obtained in the previous step are written into a single-frame calibration plate laser point cloud using an interface provided by the pcl library;
4-3) extracting the circle center coordinates from the merged point cloud;
4-3-1) fitting a plane to the calibration plate laser point cloud obtained in the previous step using the plane equation fitting interface of pcl, obtaining a plane equation;
4-3-2) projecting the laser points onto the plane of the plane equation, obtaining the corresponding RT matrix and thereby transforming the calibration plate laser point cloud from 3d space points to 2d plane points;
4-3-3) in the 2d plane, sliding a rectangular frame circumscribing the hollow circle across the laser point cloud plane of the calibration plate at a certain step length, using the known scale information of the calibration plate; since no laser points fall on the hollow circle, the position where the point cloud area within the rectangular frame is minimal is the position of the hollow circle; the circle center position is then obtained via the circle fitting interface of pcl;
4-3-4) taking the inverse matrix of the RT from 4-3-2) and computing the coordinates of the circle center position in 3d space.
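The whole of step 4 can be sketched in plain numpy as follows; numpy stands in for the pcl plane-fitting and circle-fitting interfaces the patent names, and the circle diameter and sliding step are illustrative assumptions:

```python
import numpy as np

def circle_center_3d(frames, circle_diameter=0.30, step=0.005):
    """Estimate the hollow-circle center from calibration plate lidar returns.

    frames: list of (Ni, 3) arrays recorded at one position (merged below).
    """
    pts = np.vstack(frames)                       # 4-2) merge into one frame
    centroid = pts.mean(axis=0)                   # 4-3-1) plane fit via SVD
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    # rows of vt: two in-plane axes, then the plane normal
    local = (pts - centroid) @ vt.T               # 4-3-2) the "RT": 3d -> plane frame
    xy = local[:, :2]                             # drop the near-zero normal coord

    # 4-3-3) slide a square circumscribing the hollow circle; the window
    # covering the hole contains the fewest laser points.
    half = circle_diameter / 2.0
    best, best_count = None, np.inf
    for cx in np.arange(xy[:, 0].min() + half, xy[:, 0].max() - half, step):
        for cy in np.arange(xy[:, 1].min() + half, xy[:, 1].max() - half, step):
            count = ((np.abs(xy[:, 0] - cx) < half) &
                     (np.abs(xy[:, 1] - cy) < half)).sum()
            if count < best_count:
                best, best_count = (cx, cy), count

    # 4-3-4) inverse transform back to 3d (vt is orthonormal: inverse = transpose)
    center_local = np.array([best[0], best[1], 0.0])
    return center_local @ vt + centroid
```

The exhaustive grid search mirrors the patent's fixed-step sliding; a production version would coarsen the grid or add the pcl circle-fit refinement mentioned in 4-3-3).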
5) Transforming the point cloud circle center into the pixel coordinate system according to the calibrated external parameters and the camera internal parameters, to obtain its pixel coordinates:
5-1) acquiring the calibrated lidar-camera external parameters and the camera internal parameters from the calibration tool;
5-2) transforming the point cloud circle center coordinates into the pixel coordinate system according to the calibrated external parameters and the camera internal parameters;
5-2-1) the lidar-camera external parameters give the transformation matrix from the world coordinate system to the camera coordinate system; the basic transformation chain from the world coordinate system to the pixel coordinate system is shown in fig. 5.
5-2-2) the four-coordinate system transformation relationship is described mathematically as follows:
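The equation did not survive reproduction here; a standard pinhole-model statement of the world → camera → image → pixel chain, consistent with fig. 5 (notation is assumed, not taken from the original), is:

```latex
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K \, [\, R \mid t \,]
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix},
\qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}
```

where (R, t) are the external parameters mapping the world (lidar) coordinate system into the camera coordinate system, K is the camera internal parameter matrix, and s is the projective depth scale.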
5-2-3) so far, the pixel coordinates of the circle center of the hollow circle of the laser point cloud of the calibration plate in the pixel coordinate system can be obtained.
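A minimal sketch of this projection in Python/OpenCV, assuming the calibration tool yields a rotation matrix R_lc and translation t_lc from the lidar frame to the camera frame plus intrinsics K and distortion dist (these names are hypothetical):

```python
import cv2
import numpy as np

def project_cloud_center(center_lidar, R_lc, t_lc, K, dist):
    """(u2, v2): the step-4 point cloud circle center projected into the
    pixel coordinate system via the calibrated external/internal parameters."""
    rvec, _ = cv2.Rodrigues(R_lc)             # rotation matrix -> rotation vector
    uv, _ = cv2.projectPoints(
        np.asarray(center_lidar, np.float32).reshape(1, 3),
        rvec, np.asarray(t_lc, np.float32), K, dist)
    return uv.reshape(2)
```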
6) Calculating the pixel average deviation of the two:
6-1) for each of the several recording positions (the acquisition positions described in step 2), calculating the pixel error (|u1-u2|, |v1-v2|) between the circle center pixel coordinate (u1, v1) extracted from the picture via opencv and the pixel coordinate (u2, v2) obtained by projecting the laser point cloud circle center extracted via pcl into the pixel coordinate system Ouv;
6-2) calculating the average pixel error of the hollow circle center in the pixel coordinate system, and using it as the index measuring the accuracy of the lidar-camera calibrated external parameters.
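Step 6 then reduces to averaging per-position deviations and comparing against a preset standard; a sketch with placeholder coordinates (the 2-pixel threshold is an assumption for illustration, not a value from the patent):

```python
import numpy as np

# (u1, v1) from step 3 and (u2, v2) from step 5, one row per recording
# position; the numbers below are placeholders, not measured data.
img_centers = np.array([[640.2, 512.7], [633.9, 518.4]])
cloud_centers = np.array([[641.0, 511.9], [635.2, 517.6]])

err = np.abs(img_centers - cloud_centers)   # (|u1-u2|, |v1-v2|) per position
mean_du, mean_dv = err.mean(axis=0)         # 6-2) average pixel error
passed = mean_du < 2.0 and mean_dv < 2.0    # compare with preset error standard
print(f"mean |du|={mean_du:.2f}px, mean |dv|={mean_dv:.2f}px, pass={passed}")
```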
Compared with the prior art, the embodiment of the invention quantitatively describes the precision of the lidar-camera calibrated external parameters, so that this precision can be evaluated efficiently.
It should be noted that, for simplicity of description, the above method or flow embodiments are all described as a series of combinations of acts, but it should be understood by those skilled in the art that the embodiments of the present invention are not limited by the order of acts described, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are all alternative embodiments and that the actions involved are not necessarily required for the embodiments of the present invention.
Referring to fig. 6, the embodiment of the invention further provides a laser radar and camera external parameter calibration precision assessment device, which includes:
the data acquisition module 1 is used for acquiring image data recorded by a camera aiming at a calibration plate and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
the first coordinate acquisition module 2 is used for acquiring a calibration plate image based on the image data and acquiring first pixel coordinate information of a target position in a calibration plate according to the mark information of the calibration plate in the calibration plate image;
the second coordinate acquisition module 3 is used for extracting point cloud coordinate information of the target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the external calibration parameters of the radar and the camera and the internal camera parameters;
and the calibration precision evaluation module 4 is used for calculating error information of the first pixel coordinate information and the second pixel coordinate information and determining an external parameter calibration precision evaluation result according to the error information.
It can be understood that the embodiment of the device item corresponds to the embodiment of the method item of the invention, and the device for evaluating the calibration accuracy of the laser radar and the camera external parameters provided by the embodiment of the invention can realize the method for evaluating the calibration accuracy of the laser radar and the camera external parameters provided by any one of the embodiment of the method item of the invention.
The invention also provides a non-transitory computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the method for evaluating the calibration accuracy of the lidar and the camera external parameters.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiment of the device provided by the invention, the connection relation between the modules represents that the modules have communication connection, and can be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
It will be clear to those skilled in the art that, for convenience and brevity, the specific working process of the apparatus described above may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The terminal device may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor, a memory.
The processor may be a central processing unit (Central Processing Unit, CPU), another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general purpose processor may be a microprocessor, or the processor may be any conventional processor; it is the control center of the terminal device and connects the various parts of the entire terminal device using various interfaces and lines.
The memory may be used to store the computer program, and the processor implements various functions of the terminal device by running or executing the computer program stored in the memory and invoking the data stored in the memory. The memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function, and the like, and the data storage area may store data created according to the use of the device. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The storage medium is a computer readable storage medium in which the computer program is stored; when executed by a processor, the computer program can implement the steps of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be adjusted according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, under legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunication signals.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention, such changes and modifications are also intended to be within the scope of the invention.
Claims (7)
1. A laser radar and camera external parameter calibration precision assessment method is characterized by comprising the following steps:
acquiring image data recorded by a camera aiming at a calibration plate, and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
acquiring a calibration plate image based on the image data, and acquiring first pixel coordinate information of a target position in a calibration plate according to marking information of the calibration plate in the calibration plate image;
extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the calibrated external parameters of the radar and the camera and the camera internal parameters;
calculating error information of the first pixel coordinate information and the second pixel coordinate information, and determining an external parameter calibration precision assessment result according to the error information;
wherein the converting the point cloud coordinate information into the second pixel coordinate information of the target position in the calibration plate according to the calibrated external parameters of the radar and the camera and the camera internal parameters comprises the following steps:
obtaining a calibration external parameter and a camera internal parameter of the radar and the camera according to a calibration tool, and obtaining a transformation matrix from a world coordinate system to a camera coordinate system according to the calibration external parameter of the radar and the camera;
according to the transformation matrix, transforming the point cloud coordinate information into second pixel coordinate information of a target position in a calibration plate;
when screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, the method comprises: making the calibration plate surface as tangent as possible to the multi-line laser scan rings, so that as many laser scan lines as possible fall on the calibration plate surface;
the extracting the point cloud coordinate information of the target position in the calibration plate based on the point cloud data specifically comprises the following steps:
screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, and merging the multi-frame point cloud data corresponding to the same recording position into single-frame point cloud data;
transforming the single-frame point cloud data from three-dimensional space to a two-dimensional plane according to the acquired RT matrix, sliding a rectangular frame circumscribing the hollow circle across the two-dimensional laser point cloud plane of the calibration plate at a certain step length based on the scale information of the calibration plate, and taking the position where the point cloud area within the rectangular frame is minimal as the target position in the calibration plate;
converting the target position in the calibration plate according to the inverse matrix of the RT matrix to obtain the point cloud coordinate information of the target position in the calibration plate;
the mark information of the calibration plate is a plurality of ar codes preset on the calibration plate; the target position in the calibration plate is a hollow circle positioned at the center of the calibration plate; the calibration plate is a rectangular hard plate.
2. The method for evaluating the calibration accuracy of the external parameters of the laser radar and the camera according to claim 1, wherein the step of obtaining the first pixel coordinate information of the target position in the calibration plate according to the mark information of the calibration plate in the calibration plate image specifically comprises the following steps:
acquiring the marking information from the calibration plate image, and obtaining the pose transformation information of the target position from the world coordinate system to the camera coordinate system according to the predetermined positional relation between the marking information and the target position;
and transforming the pose transformation information into the pixel coordinate system according to the camera internal parameters, obtaining the first pixel coordinate information of the target position in the calibration plate in the pixel coordinate system.
3. The method for evaluating the calibration accuracy of the external parameters of the laser radar and the camera according to claim 1, wherein the calculating the error information of the first pixel coordinate information and the second pixel coordinate information specifically comprises:
and moving the radar and the camera to different recording positions, acquiring multiple groups of first and second pixel coordinate information, and calculating the average error across the groups as the error information of the first pixel coordinate information and the second pixel coordinate information.
4. The method for evaluating the calibration accuracy of the external parameters of the laser radar and the camera according to claim 1, wherein the determining the evaluation result of the calibration accuracy of the external parameters according to the error information specifically comprises:
and comparing the error information with preset error standard information to obtain an external parameter calibration precision assessment result of the radar and the camera.
5. The utility model provides a laser radar and camera extrinsic parameter calibration precision evaluation device which characterized in that includes:
the data acquisition module is used for acquiring image data recorded by a camera aiming at the calibration plate and acquiring point cloud data recorded by a laser radar aiming at the calibration plate;
the first coordinate acquisition module is used for acquiring a calibration plate image based on the image data and acquiring first pixel coordinate information of a target position in the calibration plate according to the mark information of the calibration plate in the calibration plate image;
the second coordinate acquisition module is used for extracting point cloud coordinate information of a target position in the calibration plate based on the point cloud data, and converting the point cloud coordinate information into second pixel coordinate information of the target position in the calibration plate according to the external calibration parameters of the radar and the camera and the internal camera parameters;
the calibration precision evaluation module is used for calculating error information of the first pixel coordinate information and the second pixel coordinate information and determining an external parameter calibration precision evaluation result according to the error information;
wherein the converting the point cloud coordinate information into the second pixel coordinate information of the target position in the calibration plate according to the calibrated external parameters of the radar and the camera and the camera internal parameters comprises the following steps:
obtaining a calibration external parameter and a camera internal parameter of the radar and the camera according to a calibration tool, and obtaining a transformation matrix from a world coordinate system to a camera coordinate system according to the calibration external parameter of the radar and the camera;
according to the transformation matrix, transforming the point cloud coordinate information into second pixel coordinate information of a target position in a calibration plate;
when screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, the method comprises: making the calibration plate surface as tangent as possible to the multi-line laser scan rings, so that as many laser scan lines as possible fall on the calibration plate surface;
the extracting the point cloud coordinate information of the target position in the calibration plate based on the point cloud data specifically comprises the following steps:
screening, based on the recording dwell time of the laser radar, all point cloud data of the calibration plate target position recorded at a preset recording position, and merging the multi-frame point cloud data corresponding to the same recording position into single-frame point cloud data;
transforming the single-frame point cloud data from three-dimensional space to a two-dimensional plane according to the acquired RT matrix, sliding a rectangular frame circumscribing the hollow circle across the two-dimensional laser point cloud plane of the calibration plate at a certain step length based on the scale information of the calibration plate, and taking the position where the point cloud area within the rectangular frame is minimal as the target position in the calibration plate;
converting the target position in the calibration plate according to the inverse matrix of the RT matrix to obtain the point cloud coordinate information of the target position in the calibration plate;
the mark information of the calibration plate is a plurality of ar codes preset on the calibration plate; the target position in the calibration plate is a hollow circle positioned at the center of the calibration plate; the calibration plate is a rectangular hard plate.
6. A terminal device comprising a processor and a memory storing a computer program, characterized in that the processor, when executing the computer program, implements the method for assessing the calibration accuracy of the lidar and camera external parameters according to any of claims 1 to 4.
7. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements the method for assessing laser radar and camera exogenous calibration accuracy according to any one of claims 1 to 4.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310437131.5A | 2023-04-21 | 2023-04-21 | Laser radar and camera external parameter calibration precision evaluation method and device |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN116468804A | 2023-07-21 |
| CN116468804B | 2024-04-02 |
Family: ID=87180372

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202310437131.5A | Laser radar and camera external parameter calibration precision evaluation method and device | 2023-04-21 | 2023-04-21 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN116468804B |
Families Citing this family (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118037857A * | 2024-02-20 | 2024-05-14 | 北京集度科技有限公司 | Sensor external parameter calibration method and related device |
| CN118469941B * | 2024-05-11 | 2024-12-13 | 南京英麒智能科技有限公司 | Radar point cloud preprocessing method, system, equipment and storage medium |
| CN118864615B * | 2024-09-23 | 2025-03-07 | 湖州丽天智能科技有限公司 | Camera based on simulation, radar calibration method, computer equipment and storage medium |
| CN119087413B * | 2024-11-08 | 2025-01-14 | 广东电网有限责任公司佛山供电局 | External parameter calibration matrix evaluation device and method for camera and radar and electronic equipment |
Family Cites Families (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112017205B * | 2020-07-27 | 2021-06-25 | 清华大学 | A method and system for automatic calibration of spatial position of lidar and camera sensor |
Patent Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021184218A1 * | 2020-03-17 | 2021-09-23 | 华为技术有限公司 | Relative pose calibration method and related apparatus |
| CN114879153A * | 2022-06-08 | 2022-08-09 | 中国第一汽车股份有限公司 | Radar parameter calibration method and device and vehicle |
| CN115631246A * | 2022-10-26 | 2023-01-20 | 浙江智马达智能科技有限公司 | Method and device for jointly calibrating camera internal reference and camera relative laser radar external reference |
| CN115792865A * | 2022-10-31 | 2023-03-14 | 东风悦享科技有限公司 | Camera and mechanical laser radar-based external parameter calibration method, system, medium and vehicle |
Non-Patent Citations (2)

- Gang Peng, "Calibration of the internal and external parameters of wheeled robot mobile chasses and inertial measurement units based on nonlinear optimization", IEEE, pp. 1-14. *
- Wu Yong (吴勇), "Design of a fast tracking and positioning system for distant moving vehicles" (远距离运动车辆快速跟踪与定位系统设计), Bulletin of Surveying and Mapping (测绘通报), pp. 112-115. *
Also Published As

| Publication number | Publication date |
|---|---|
| CN116468804A | 2023-07-21 |
Legal Events

- PB01 — Publication
- SE01 — Entry into force of request for substantive examination
- GR01 — Patent grant