CN114814758B - Camera-millimeter wave radar-laser radar combined calibration method and device - Google Patents
- Publication number: CN114814758B (application CN202210720349.7A)
- Authority: CN (China)
- Legal status: Active
Classifications
- G01S7/40 — Means for monitoring or calibrating (details of radar systems according to G01S13/00)
- G01S7/497 — Means for monitoring or calibrating (details of lidar systems according to G01S17/00)
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
The present disclosure provides a camera-millimeter wave radar-laser radar joint calibration method and device, relating to multi-sensor joint calibration technology. The method comprises: acquiring the parameters of a camera, of a millimeter wave radar and of a laser radar; acquiring the coordinates of the camera, the millimeter wave radar, the laser radar and the target corner points in a world coordinate system; generating a virtual camera, a virtual millimeter wave radar, a virtual laser radar and virtual target corner points; adjusting the angle of the virtual camera, the angle of the virtual millimeter wave radar and the angle of the virtual laser radar, and acquiring an image, first point cloud information and second point cloud information at the adjusted angles; and determining a calibration result from the image, the first point cloud information, the second point cloud information and the target corner coordinates. The installation angles of the sensors are obtained by simulation, the calibration result is obtained from those angles, and the angles can also be used to guide installation. The calibration process is simple and calibration efficiency is improved to a certain extent.
Description
Technical Field
The present disclosure relates to multi-sensor joint calibration technology, and in particular to a camera-millimeter wave radar-laser radar joint calibration method and device.
Background
At present, in intelligent transportation and vehicle-road cooperation, roadside equipment generally uses a camera and a radar to detect perception targets and fuse their outputs. The camera and the radar therefore need to be calibrated jointly, and the joint calibration yields the rotation-translation matrix of the radar relative to the camera.
In the prior art, the radar and the camera are jointly calibrated using a calibration board.
However, with this approach the calibration process is complicated, and its efficiency needs to be improved.
Disclosure of Invention
The present disclosure provides a camera-millimeter wave radar-laser radar joint calibration method and device whose calibration process is simple and which improve calibration efficiency to a certain extent.
According to a first aspect of the present disclosure, a camera-millimeter wave radar-laser radar joint calibration method is provided, including:
acquiring a first parameter of a camera, a second parameter of a millimeter wave radar and a third parameter of a laser radar; acquiring a first coordinate of the camera in a world coordinate system, a second coordinate of the millimeter wave radar in the world coordinate system, a third coordinate of the laser radar in the world coordinate system, a fourth coordinate of a first target corner point of the camera in the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar in the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar in the world coordinate system;
generating a virtual camera in a simulation space according to the first coordinate and the first parameter, generating a virtual millimeter-wave radar in the simulation space according to the second coordinate and the second parameter, generating a virtual laser radar in the simulation space according to the third coordinate and the third parameter, generating a first virtual target corner point in the simulation space according to the fourth coordinate, generating a second virtual target corner point in the simulation space according to the fifth coordinate, and generating a third virtual target corner point in the simulation space according to the sixth coordinate; wherein the simulation space is a space corresponding to a world coordinate system;
adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar, and acquiring second point cloud information by using the adjusted virtual laser radar;
and determining a calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate and the sixth coordinate.
According to a second aspect of the present disclosure, there is provided a camera-millimeter wave radar-laser radar combined calibration apparatus, including:
the acquisition unit is used for acquiring a first parameter of the camera, a second parameter of the millimeter wave radar and a third parameter of the laser radar; acquiring a first coordinate of the camera in a world coordinate system, a second coordinate of the millimeter wave radar in the world coordinate system, a third coordinate of the laser radar in the world coordinate system, a fourth coordinate of a first target corner point of the camera in the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar in the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar in the world coordinate system;
a simulation unit, configured to generate a virtual camera in a simulation space according to the first coordinate and the first parameter, generate a virtual millimeter-wave radar in the simulation space according to the second coordinate and the second parameter, generate a virtual laser radar in the simulation space according to the third coordinate and the third parameter, generate a first virtual target corner point in the simulation space according to the fourth coordinate, generate a second virtual target corner point in the simulation space according to the fifth coordinate, and generate a third virtual target corner point in the simulation space according to the sixth coordinate; wherein the simulation space is a space corresponding to a world coordinate system;
the simulation unit is further used for adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar, and acquiring second point cloud information by using the adjusted virtual laser radar;
and the determining unit is used for determining a calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate and the sixth coordinate.
According to a third aspect of the present disclosure, there is provided an intrinsic parameter calibration apparatus for a camera, including:
an acquisition unit configured to acquire a frame rate parameter of a camera;
a simulation unit configured to generate a virtual camera in a simulation space according to the frame rate parameter;
the simulation unit being further configured to acquire a plurality of checkerboard images in the simulation space by using the virtual camera;
and an identification unit configured to determine an intrinsic parameter calibration result of the camera from the plurality of checkerboard images.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising a memory and a processor; wherein,
the memory for storing a computer program;
the processor is configured to read the computer program stored in the memory, and execute the method according to the computer program in the memory.
According to a fifth aspect of the present disclosure, there is provided a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the method according to the first aspect.
According to a sixth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, performs the method according to the first aspect.
The camera-millimeter wave radar-laser radar joint calibration method and device provided by the present disclosure comprise: acquiring a first parameter of a camera, a second parameter of a millimeter wave radar and a third parameter of a laser radar; acquiring the first, second and third coordinates of the camera, the millimeter wave radar and the laser radar in a world coordinate system, together with the fourth, fifth and sixth coordinates of their respective target corner points (the first, second and third target corner points) in the world coordinate system; generating, in a simulation space corresponding to the world coordinate system, a virtual camera, a virtual millimeter wave radar, a virtual laser radar and first, second and third virtual target corner points from those coordinates and parameters; adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, then acquiring an image with the adjusted virtual camera, first point cloud information with the adjusted virtual millimeter wave radar and second point cloud information with the adjusted virtual laser radar; and determining a calibration result for any pairwise combination of the camera, the millimeter wave radar and the laser radar from the image, the first point cloud information, the second point cloud information and the fourth, fifth and sixth coordinates. With this method and device, the installation angles of the sensors are obtained by simulation, and the calibration result is obtained from those angles; the angles can also be used to guide installation. The calibration process of the scheme is simple, and calibration efficiency is improved to a certain extent.
Drawings
To illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating a camera-millimeter wave radar-lidar joint calibration method according to an exemplary embodiment of the present disclosure;
fig. 2 is a schematic flowchart illustrating a camera-millimeter wave radar-lidar joint calibration method according to another exemplary embodiment of the disclosure;
FIG. 3 is a schematic flowchart illustrating a method for calibrating the intrinsic parameters of a camera according to an exemplary embodiment of the present disclosure;
fig. 4 is a structural diagram of a combined calibration apparatus of a camera-millimeter wave radar-laser radar according to an exemplary embodiment of the present disclosure;
FIG. 5 is a block diagram of an intrinsic parameter calibration apparatus of a camera according to an exemplary embodiment of the present disclosure;
fig. 6 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
Detailed Description
At present, in intelligent transportation and vehicle-road cooperation, roadside equipment generally uses a camera and a radar to detect perception targets and fuse their outputs. The calibration method commonly used for the camera is the Zhang Zhengyou (checkerboard) calibration method; the radar is mostly calibrated with highly reflective corner reflectors; and the radar and the camera are mostly calibrated jointly using a calibration board, the joint calibration yielding the rotation and translation parameters of the radar relative to the camera.
However, in the prior art, external parameter calibration of the camera depends on parallel lane lines that must not be curved, which places strict requirements on the road surface; external parameter calibration of the radar depends on corner reflectors and places strict requirements on the test setup; and the joint calibration of the camera and the radar uses a calibration board, which is inconvenient for roadside equipment installed at typical heights of 5-7 m and may even require closing the road. The existing calibration process is therefore complex, and its efficiency needs to be improved.
To solve this technical problem, in the scheme provided by the present disclosure, the camera, the millimeter wave radar and the laser radar to be calibrated are simulated in a simulation space, and target corner points corresponding respectively to the camera, the millimeter wave radar and the laser radar are arranged in that space. In this way, joint calibration can be achieved without installing the camera, the millimeter wave radar, the laser radar or the target corner points in the actual environment. The method provided by the scheme therefore has a simple calibration process, and calibration efficiency is improved to a certain extent.
Fig. 1 is a schematic flowchart of a camera-millimeter wave radar-laser radar joint calibration method according to an exemplary embodiment of the present disclosure.
As shown in fig. 1, the camera-millimeter wave radar-laser radar combined calibration method provided in this embodiment includes:
101, acquiring a first parameter of a camera, a second parameter of a millimeter wave radar and a third parameter of a laser radar; and acquiring a first coordinate of the camera in a world coordinate system, a second coordinate of the millimeter wave radar in the world coordinate system, a third coordinate of the laser radar in the world coordinate system, a fourth coordinate of a first target corner point of the camera in the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar in the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar in the world coordinate system.
The method provided by the present disclosure may be executed by an electronic device with computing capability, such as a computer.
The first parameter of the camera can be obtained from the configuration file of the camera and may include the field angle, aperture, resolution, working distance, field of view, optical magnification, numerical aperture, exposure time, ghosting, photosensitive-element area, pixel size, lens focal length, optical center, radial distortion coefficients and similar information.
Further, intrinsic parameter calibration can be performed on the camera to obtain an intrinsic calibration result, and the joint calibration of the camera, the millimeter wave radar and the laser radar can then be carried out according to that result.
The second parameter of the millimeter wave radar can be obtained from a configuration file of the millimeter wave radar. The second parameter may include information such as a maximum working distance, a distance resolution, a ranging accuracy, a maximum detection speed, a speed resolution, a speed measurement accuracy, a detection view angle range, an angle resolution, and an angle measurement accuracy.
Wherein the third parameter of the lidar may be obtained from a profile of the lidar. The third parameter may include information such as wavelength, beam, laser grade, accuracy, range, number of points, vertical angle resolution, horizontal angle resolution, rotational speed, input voltage, product power, protection safety level, operating temperature, specification, weight, collected data, and the like.
Since a sensor can be placed at any position in the environment, a reference coordinate system, called the world coordinate system, is selected in the environment to describe the position of the sensor and of any object in the environment. The relationship between a sensor coordinate system and the world coordinate system can be described by a rotation-translation matrix.
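As a minimal illustration of this relation (not part of the patent; the numeric values are chosen arbitrarily), the rotation-translation matrix can be written as a 4x4 homogeneous transform that maps sensor-frame coordinates to world coordinates, and its inverse maps back:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative pose: a sensor at (2, 0, 5) in the world, rotated 90 deg about Z.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T_world_sensor = make_pose(R, np.array([2.0, 0.0, 5.0]))  # sensor frame -> world frame

# A point expressed in the sensor frame maps to world coordinates, and back again:
p_sensor = np.array([1.0, 0.0, 0.0, 1.0])             # homogeneous point
p_world = T_world_sensor @ p_sensor
p_back = np.linalg.inv(T_world_sensor) @ p_world       # world frame -> sensor frame
```

The same 4x4 form is used below when pairwise calibration results are chained through the world coordinate system.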
And the first target corner point is used for calibrating the camera. The first target corner point may select a point on a lane line in the environment.
The second target corner point is used for calibrating the millimeter wave radar and may be a corner reflector. A corner reflector, also called a radar reflector, is a reflector made of metal plates. When the radar's electromagnetic wave sweeps the corner reflector, the reflections between its metal faces reinforce one another and produce a strong echo signal that the radar can receive. Corner reflectors can be classified by material (metal or coated), by shape (quadrangular, hexagonal, octagonal or polygonal) and by placement (fixed or suspended).
And the third target corner point is used for calibrating the laser radar. The third target corner point may be a checkerboard calibration board.
Specifically, the first coordinate of the camera in the world coordinate system, the second coordinate of the millimeter wave radar, the third coordinate of the laser radar, the fourth coordinate of the first target corner point, the fifth coordinate of the second target corner point and the sixth coordinate of the third target corner point may all be acquired from a high-precision map, since the high-precision map uses the world coordinate system.
102, generating a virtual camera in a simulation space according to a first coordinate and a first parameter, generating a virtual millimeter wave radar in the simulation space according to a second coordinate and a second parameter, generating a virtual laser radar in the simulation space according to a third coordinate and a third parameter, generating a first virtual target corner point in the simulation space according to a fourth coordinate, generating a second virtual target corner point in the simulation space according to a fifth coordinate, and generating a third virtual target corner point in the simulation space according to a sixth coordinate; the simulation space is a space corresponding to a world coordinate system.
Specifically, the simulation can be performed using the CARLA simulation software. CARLA is an open-source simulator for autonomous driving research and supports flexible specification of sensor conditions and environmental conditions.
Specifically, the information of a high-precision map (whose range includes the positions of the camera, the millimeter wave radar, the laser radar and the first, second and third target corner points) can be imported into CARLA to generate a simulation space corresponding to the world coordinate system.
Specifically, a virtual camera can be generated in the simulation space according to a first coordinate of the camera in the world coordinate system and a first parameter of the camera; the position of the virtual camera in the simulation space corresponds to a first coordinate of the camera in the world coordinate system.
Similarly, a virtual millimeter-wave radar can be generated in the simulation space according to a second coordinate of the millimeter-wave radar in the world coordinate system and a second parameter of the millimeter-wave radar; the position of the virtual millimeter-wave radar in the simulation space corresponds to a second coordinate of the millimeter-wave radar in the world coordinate system.
Similarly, a virtual lidar may be generated in the simulation space according to a third coordinate of the lidar in the world coordinate system and a third parameter of the lidar; and the position of the virtual laser radar in the simulation space corresponds to a third coordinate of the laser radar in a world coordinate system.
And 103, adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar, and acquiring second point cloud information by using the adjusted virtual laser radar.
Specifically, the first angle of the virtual camera, the second angle of the virtual millimeter wave radar and the third angle of the virtual laser radar may be adjusted in the simulation software while recording the coverage of each virtual sensor, stopping the adjustment once the coverage of the virtual camera, the coverage of the virtual millimeter wave radar and the coverage of the virtual laser radar each satisfy a preset condition.
Specifically, the first angle of the virtual camera, the second angle of the virtual millimeter wave radar and the third angle of the virtual laser radar can be adjusted in response to a user's operation instructions; alternatively, the three angles can be adjusted automatically according to preset instructions.
Specifically, the coverage of the adjusted virtual camera includes the position of the first virtual target corner point, the coverage of the adjusted virtual millimeter wave radar includes the position of the second virtual target corner point, and the coverage of the adjusted virtual laser radar includes the position of the third virtual target corner point. Therefore, the image obtained by the adjusted virtual camera includes information of the first virtual target corner point, the first point cloud information obtained by the adjusted virtual millimeter wave radar includes information of the second virtual target corner point, and the second point cloud information obtained by the adjusted virtual laser radar includes information of the third virtual target corner point.
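A simplified sketch of this adjust-until-covered logic follows. All function names, the ground-plane (x, y) simplification and the yaw-only sweep are assumptions for illustration, not taken from the patent; a real check would also consider the pitch angle:

```python
import numpy as np

def target_in_fov(sensor_pos, yaw_deg, target_pos, h_fov_deg, max_range):
    """True if the target lies within the sensor's horizontal FOV and range.
    Simplified to the ground plane (x, y)."""
    d = np.asarray(target_pos[:2], float) - np.asarray(sensor_pos[:2], float)
    if np.linalg.norm(d) > max_range:
        return False
    bearing = np.degrees(np.arctan2(d[1], d[0]))
    off = (bearing - yaw_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(off) <= h_fov_deg / 2.0

def sweep_yaw(sensor_pos, target_pos, h_fov_deg, max_range, step_deg=1.0):
    """Sweep the yaw angle until the virtual target corner point falls inside
    the sensor's coverage; return the first satisfying angle, or None."""
    for yaw in np.arange(-180.0, 180.0, step_deg):
        if target_in_fov(sensor_pos, yaw, target_pos, h_fov_deg, max_range):
            return float(yaw)
    return None
```

A sensor at the origin with a 30-degree horizontal FOV and 50 m range, for example, first covers a target at (10, 0) once the sweep reaches the edge of the FOV.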
And step 104, determining a calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate and the sixth coordinate.
Specifically, the position of the first virtual target corner point in the image may be acquired, and a first rotation-translation matrix between the coordinate system of the camera and the world coordinate system may be determined from that position and the fourth coordinate. The position of the first virtual target corner point in the image is the position of the first target corner point in the camera coordinate system, while the fourth coordinate is its position in the world coordinate system; since both describe the same physical corner point, the first rotation-translation matrix between the camera coordinate system and the world coordinate system can be obtained from this correspondence.
Similarly, the position of the second virtual target corner point in the first point cloud information may be acquired, and a second rotation-translation matrix between the coordinate system of the millimeter wave radar and the world coordinate system may be determined according to the position and the fifth coordinate.
Similarly, the position of a third virtual target corner point in the second point cloud information may be acquired, and a third rotation-translation matrix between the coordinate system of the laser radar and the world coordinate system may be determined according to the position and the sixth coordinate.
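For the two point cloud cases, where the corner points are observed as 3-D coordinates in the sensor frame, the rotation-translation matrix to the world frame can be estimated from matched point pairs with the standard Kabsch/Umeyama procedure. The following sketch is illustrative and not taken from the patent; at least three non-collinear correspondences are required:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ src @ R.T + t,
    from matched 3-D corner points (Kabsch/Umeyama, no scale).
    src, dst: (N, 3) arrays of corresponding points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

Applied with `src` as corner points in the radar (or lidar) frame and `dst` as the fifth (or sixth) coordinates in the world frame, this yields the second (or third) rotation-translation matrix.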
Specifically, a rotation-translation matrix between the coordinate system of the camera and the coordinate system of the millimeter wave radar may be determined from the first rotation-translation matrix and the second rotation-translation matrix; this matrix is the calibration result of the camera and the millimeter wave radar. The first rotation-translation matrix is the conversion between the camera coordinate system and the world coordinate system, and the second rotation-translation matrix is the conversion between the millimeter wave radar coordinate system and the world coordinate system, so the conversion between the camera coordinate system and the millimeter wave radar coordinate system is obtained by chaining the two through the world coordinate system.
Similarly, a rotation and translation matrix between the coordinate system of the camera and the coordinate system of the lidar can be determined according to the first rotation-translation matrix and the third rotation-translation matrix, and the rotation and translation matrix is a calibration result of the camera and the lidar.
Similarly, a rotation and translation matrix between the coordinate system of the millimeter wave radar and the coordinate system of the laser radar can be determined according to the second rotation-translation matrix and the third rotation-translation matrix, and the rotation and translation matrix is a calibration result of the millimeter wave radar and the laser radar.
The combined calibration result obtained in this way is obtained based on the virtual camera, the virtual millimeter wave radar, and the virtual laser radar, the angles of which are adjusted, and therefore the camera, the millimeter wave radar, and the laser radar can be installed in the actual environment according to the adjusted angles. The camera, the millimeter wave radar and the laser radar which are installed in the actual environment can be applied based on the determined combined calibration result.
The camera-millimeter wave radar-laser radar joint calibration method provided by the present disclosure includes: acquiring a first parameter of a camera, a second parameter of a millimeter wave radar and a third parameter of a laser radar; acquiring the first, second and third coordinates of the camera, the millimeter wave radar and the laser radar in a world coordinate system, together with the fourth, fifth and sixth coordinates of their respective target corner points in the world coordinate system; generating, in a simulation space corresponding to the world coordinate system, a virtual camera, a virtual millimeter wave radar, a virtual laser radar and the first, second and third virtual target corner points from those coordinates and parameters; adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, then acquiring an image with the adjusted virtual camera, first point cloud information with the adjusted virtual millimeter wave radar and second point cloud information with the adjusted virtual laser radar; and determining a calibration result for any pairwise combination of the camera, the millimeter wave radar and the laser radar from the image, the first point cloud information, the second point cloud information and the fourth, fifth and sixth coordinates. In this scheme, the camera, millimeter wave radar and laser radar to be calibrated are simulated in the simulation space together with their respective target corner points, so the effect of joint calibration is achieved without installing the camera, the millimeter wave radar, the laser radar or the target corner points in the actual environment. The method provided by the scheme therefore has a simple calibration process, and calibration efficiency is improved to a certain extent.
Fig. 2 is a schematic flowchart of a camera-millimeter wave radar-lidar joint calibration method according to another exemplary embodiment of the disclosure.
As shown in fig. 2, the camera-millimeter wave radar-laser radar combined calibration method provided in this embodiment includes:
Specifically, the principle and implementation of step 201 are similar to those of step 101, and are not described again.
202, generating a virtual camera in a simulation space according to a first coordinate and a first parameter, generating a virtual millimeter wave radar in the simulation space according to a second coordinate and a second parameter, generating a virtual laser radar in the simulation space according to a third coordinate and a third parameter, generating a first virtual target corner point in the simulation space according to a fourth coordinate, generating a second virtual target corner point in the simulation space according to a fifth coordinate, and generating a third virtual target corner point in the simulation space according to a sixth coordinate; the simulation space is a space corresponding to a world coordinate system.
Specifically, the principle and implementation of step 202 are similar to those of step 102, and are not described again.
Specifically, the yaw angle, the pitch angle, and the roll angle are rotation angles around different coordinate axes of the same coordinate system. The coordinate system may be the world coordinate system; assuming that the world coordinate system includes an X axis, a Y axis, and a Z axis, the yaw angle refers to the rotation angle around the Z axis, the pitch angle refers to the rotation angle around the Y axis, and the roll angle refers to the rotation angle around the X axis.
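Under the common Z-Y-X (yaw-pitch-roll) convention — axis assignments vary between conventions — a rotation matrix can be composed from the three angles. A minimal numpy sketch:

```python
import numpy as np

def rotation_from_angles(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from yaw/pitch/roll (radians),
    using the Z-Y-X convention: yaw about Z, pitch about Y, roll
    about X. Axis assignments differ between conventions."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    return Rz @ Ry @ Rx
```

Any matrix produced this way is a proper rotation (orthonormal, determinant 1), which is the rotational part of the rotation-translation matrices used later.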
The coverage may be determined using the field of view (FOV) and the detection distance included in the intrinsic device parameters of the sensing device. The FOV includes a horizontal scan angle and a vertical scan angle. The sensing device includes the virtual camera, the virtual millimeter wave radar, and the virtual laser radar.
The first range threshold, the second range threshold, and the third range threshold are range thresholds preset according to actual conditions.
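As an illustration of the coverage test described above, the following sketch (a simplification that assumes the sensor is aimed along the +X axis of its own frame) checks whether a point falls within a sensor's FOV and detection distance:

```python
import numpy as np

def in_coverage(point, sensor_pos, h_fov_deg, v_fov_deg, max_range):
    """Return True if `point` lies inside the sensor's coverage, modelled
    as a horizontal/vertical FOV cone aimed along +X plus a maximum
    detection distance (a simplified coverage model)."""
    d = np.asarray(point, dtype=float) - np.asarray(sensor_pos, dtype=float)
    r = np.linalg.norm(d)
    if r == 0.0 or r > max_range:
        return False
    azimuth = np.degrees(np.arctan2(d[1], d[0]))    # horizontal angle
    elevation = np.degrees(np.arcsin(d[2] / r))     # vertical angle
    return abs(azimuth) <= h_fov_deg / 2 and abs(elevation) <= v_fov_deg / 2
```

In the method, an angle adjustment is accepted once such a check reports that the relevant virtual target corner point lies inside the sensor's coverage.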
In one implementation, an initial value of the first angle of the virtual camera may be determined according to the model of the camera and the first parameter of the camera; the initial values of the first angle include an initial camera yaw angle, an initial camera pitch angle, and an initial camera roll angle. Then, with CARLA, the first angle of the virtual camera may be adjusted so that the first coverage of the virtual camera in the simulation space meets the first range threshold.
Specifically, the fact that the first coverage area of the virtual camera in the simulation space meets the first range threshold means that the first coverage area coincides with the coverage area represented by the first range threshold, or the first coverage area includes the coverage area represented by the first range threshold.
Similarly, the initial value of the second angle of the virtual millimeter-wave radar may be determined according to the model of the millimeter-wave radar and the second parameter of the millimeter-wave radar; the initial values of the second angle include an initial millimeter wave radar yaw angle, an initial millimeter wave radar pitch angle, and an initial millimeter wave radar roll angle. Then, the second angle of the virtual millimeter wave radar may be adjusted using CARLA so that the second coverage area of the virtual millimeter wave radar in the simulation space meets the second range threshold.
Specifically, the fact that a second coverage area of the virtual millimeter wave radar in the simulation space meets a second range threshold means that the second coverage area coincides with a coverage area represented by the second range threshold, or the second coverage area includes the coverage area represented by the second range threshold.
Similarly, the initial value of the third angle of the virtual laser radar may be determined according to the model of the laser radar and the third parameter of the laser radar; the initial values of the third angle include an initial laser radar yaw angle, an initial laser radar pitch angle, and an initial laser radar roll angle. Then, the third angle of the virtual laser radar may be adjusted using CARLA so that the third coverage of the virtual laser radar in the simulation space meets the third range threshold.
Specifically, the fact that the third coverage area of the virtual laser radar in the simulation space meets the third range threshold means that the third coverage area coincides with the coverage area represented by the third range threshold, or the third coverage area includes the coverage area represented by the third range threshold.
Specifically, the adjusted first coverage range of the virtual camera includes a position of a first virtual target corner point, the second coverage range of the virtual millimeter wave radar includes a position of a second virtual target corner point, and the third coverage range of the virtual laser radar includes a position of a third virtual target corner point.
And 204, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar, and acquiring second point cloud information by using the adjusted virtual laser radar.
Specifically, in the simulation software, the adjusted virtual camera may be used to acquire an image, the adjusted virtual millimeter wave radar may be used to acquire the first point cloud information, and the adjusted virtual laser radar may be used to acquire the second point cloud information.
Specifically, the image, the first point cloud information, and the second point cloud information all include virtual target corner point information.
Specifically, the user may identify a first position of the first virtual target corner point in the image and generate a first position instruction, and the simulation software may determine the first position of the first virtual target corner point in the image according to the first position instruction.
Similarly, the user may identify a second position of the second virtual target corner point in the first point cloud information and generate a second position instruction, and the simulation software may determine the second position of the second virtual target corner point in the first point cloud information according to the second position instruction.
Similarly, the user may identify a third position of the third virtual target corner point in the second point cloud information and generate a third position instruction, and the simulation software may determine the third position of the third virtual target corner point in the second point cloud information according to the third position instruction.
Specifically, the solvePnP function in OpenCV-Python can be called to determine the rotation-translation matrix.
Specifically, a fourth coordinate of the first target corner point in the world coordinate system and a first position of the first virtual target corner point in the image may be used as an input of the solvePnP function, and an output of the solvePnP function is a first rotation-translation matrix between the coordinate system of the camera and the world coordinate system.
The solvePnP function is a function in OpenCV-Python, the Python binding of the OpenCV library for solving computer vision problems.
Similarly, a fifth coordinate of the second target corner point in the world coordinate system and a second position of the second virtual target corner point in the first point cloud information may be used as inputs of the solvePnP function, and an output of the solvePnP function is a second rotation-translation matrix between the coordinate system of the millimeter wave radar and the world coordinate system.
Similarly, a sixth coordinate of the third target corner point in the world coordinate system and a third position of the third virtual target corner point in the second point cloud information may be used as inputs of the solvePnP function, and an output of the solvePnP function is a third rotation-translation matrix between the coordinate system of the laser radar and the world coordinate system.
Specifically, the calibration result of any two combinations of the camera, the millimeter-wave radar, and the laser radar may be determined from the first rotation-translation matrix, the second rotation-translation matrix, and the third rotation-translation matrix through conversion among the three matrices.
In one implementation, a fourth rotation-translation matrix between the camera and the millimeter wave radar is determined according to the first rotation-translation matrix and the second rotation-translation matrix; determining a fifth rotation-translation matrix between the camera and the laser radar according to the first rotation-translation matrix and the third rotation-translation matrix; and determining a sixth rotation-translation matrix between the millimeter wave radar and the laser radar according to the second rotation-translation matrix and the third rotation-translation matrix.
Specifically, a fourth rotation-translation matrix between the camera and the millimeter wave radar may be determined according to the first rotation-translation matrix and the second rotation-translation matrix, where the fourth rotation-translation matrix is a calibration result between the camera and the millimeter wave radar.
Similarly, a fifth rotation-translation matrix between the camera and the lidar may be determined according to the first rotation-translation matrix and the third rotation-translation matrix, where the fifth rotation-translation matrix is a calibration result between the camera and the lidar.
Similarly, a sixth rotation-translation matrix between the millimeter-wave radar and the laser radar may be determined according to the second rotation-translation matrix and the third rotation-translation matrix, where the sixth rotation-translation matrix is a calibration result between the millimeter-wave radar and the laser radar.
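The pairwise results above amount to composing the per-sensor world-to-sensor matrices. A minimal numpy sketch (the two example poses are assumed values for illustration):

```python
import numpy as np

def pairwise_extrinsic(T_world_to_a, T_world_to_b):
    """Given rotation-translation matrices taking world coordinates into
    sensor A's and sensor B's coordinate systems, return the matrix
    taking A's coordinates into B's (the A-B calibration result)."""
    return T_world_to_b @ np.linalg.inv(T_world_to_a)

def make_T(rz_deg, t):
    """Helper: 4x4 rigid transform with a rotation about Z and translation t."""
    a = np.radians(rz_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = t
    return T

T_cam = make_T(10.0, [0.0, 0.5, 1.2])     # world -> camera (assumed pose)
T_radar = make_T(-5.0, [0.1, 0.0, 0.8])   # world -> millimeter wave radar (assumed)
T_cam_to_radar = pairwise_extrinsic(T_cam, T_radar)  # fourth matrix analogue
```

The fifth and sixth rotation-translation matrices follow the same pattern, substituting the laser radar's matrix for one of the operands.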
Further, the joint calibration results of multiple sensors may be determined in the manner described above. The present embodiment limits neither the types of the multiple sensors nor the number of sensors.
Fig. 3 is a flowchart illustrating a camera internal reference calibration method according to an exemplary embodiment of the present disclosure.
The internal reference of the camera can be calibrated before the combined calibration, and then the calibrated internal reference of the camera is utilized to carry out the combined calibration between the sensors.
As shown in fig. 3, the internal reference calibration method for a camera provided in this embodiment includes:
in step 301, parameters of a camera are acquired.
The parameters of the camera include information such as field angle, aperture, resolution, working distance, field range, optical magnification, numerical aperture, exposure time, ghosting, photosensitive element area, pixel count, focal length of the camera lens, optical center, and radial distortion coefficient.
Specifically, the parameters of the camera may be obtained from a configuration file of the camera.
Specifically, the CARLA simulation software can be utilized to generate a virtual camera in the simulation space according to the parameters of the camera, such as its frame rate.
Specifically, a checkerboard may be preset in the simulation space. The angle of the virtual camera may be adjusted in the simulation space so that the coverage of the virtual camera includes the checkerboard. Then, an image containing the checkerboard may be acquired in the simulation space using the virtual camera; such an image may be referred to as a checkerboard image. The angle of the virtual camera may be fixed while the position of the checkerboard is changed multiple times to obtain multiple checkerboard images; alternatively, the position of the checkerboard may be fixed while the angle of the virtual camera is changed multiple times to obtain multiple checkerboard images.
The internal parameters of the camera include the focal length, the optical center and the radial distortion coefficient of the camera lens.
Specifically, the camera internal reference calibration result can be determined by utilizing OpenCV-Python.
Specifically, the acquired multiple checkerboard images may be preprocessed first. For example, the checkerboard image is compressed.
The findChessboardCorners function in OpenCV-Python can be called to identify the coordinates of the checkerboard corner points in the preprocessed checkerboard images;
then, the cornerSubPix function in OpenCV-Python can be called to further extract sub-pixel corner information;
then, the calibrateCamera function in OpenCV-Python can be called to calculate the camera intrinsic parameter matrix and the radial distortion coefficients; the camera intrinsic parameter matrix and the radial distortion coefficients are the camera internal reference calibration result. The intrinsic parameter matrix includes the focal length and the optical center of the camera lens.
Further, the undistort function in OpenCV-Python can be called to correct distorted images.
Further, the first parameter of the camera includes an internal reference calibration result. The first parameter of the camera can be used for joint calibration of the camera, the millimeter wave radar and the laser radar.
Fig. 4 is a structural diagram of a camera-millimeter wave radar-lidar combined calibration apparatus according to an exemplary embodiment of the disclosure.
As shown in fig. 4, the present disclosure provides a camera-millimeter wave radar-lidar combined calibration apparatus 400, including:
an obtaining unit 410, configured to obtain a first parameter of a camera, a second parameter of a millimeter wave radar, and a third parameter of a laser radar; acquiring a first coordinate of a camera in a world coordinate system, a second coordinate of a millimeter wave radar in the world coordinate system, a third coordinate of a laser radar in the world coordinate system, a fourth coordinate of a first target corner point of the camera in the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar in the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar in the world coordinate system;
a simulation unit 420, configured to generate a virtual camera in a simulation space according to the first coordinate and the first parameter, generate a virtual millimeter-wave radar in the simulation space according to the second coordinate and the second parameter, generate a virtual laser radar in the simulation space according to the third coordinate and the third parameter, generate a first virtual target corner point in the simulation space according to the fourth coordinate, generate a second virtual target corner point in the simulation space according to the fifth coordinate, and generate a third virtual target corner point in the simulation space according to the sixth coordinate; wherein, the simulation space is a space corresponding to a world coordinate system;
the simulation unit 420 is further configured to adjust a first angle of the virtual camera, a second angle of the virtual millimeter wave radar, and a third angle of the virtual laser radar, acquire an image by using the adjusted virtual camera, acquire first point cloud information by using the adjusted virtual millimeter wave radar, and acquire second point cloud information by using the adjusted virtual laser radar;
the determining unit 430 is configured to determine a calibration result of any two combinations of the camera, the millimeter wave radar, and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate, and the sixth coordinate.
The simulation unit 420 is specifically configured to adjust a first angle of the virtual camera, a second angle of the virtual millimeter wave radar, and a third angle of the virtual laser radar, so that a first coverage area of the virtual camera in the simulation space meets a first range threshold, a second coverage area of the millimeter wave radar in the simulation space meets a second range threshold, and a third coverage area of the laser radar in the simulation space meets a third range threshold; wherein the angles include a yaw angle, a pitch angle, and a roll angle.
A determining unit 430, specifically configured to determine a first position of a first virtual target corner in the image, a second position of a second virtual target corner in the first point cloud information, and a third position of a third virtual target corner in the second point cloud information;
determining a first rotation-translation matrix between the coordinate system of the camera and the world coordinate system according to the fourth coordinate and the first position;
determining a second rotation-translation matrix between the coordinate system of the millimeter wave radar and the world coordinate system according to the fifth coordinate and the second position;
determining a third rotation-translation matrix between the coordinate system of the laser radar and the world coordinate system according to the sixth coordinate and the third position;
and determining the calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the first rotation-translation matrix, the second rotation-translation matrix and the third rotation-translation matrix.
A determining unit 430, specifically configured to determine a fourth rotation-translation matrix between the camera and the millimeter wave radar according to the first rotation-translation matrix and the second rotation-translation matrix;
determining a fifth rotation-translation matrix between the camera and the laser radar according to the first rotation-translation matrix and the third rotation-translation matrix;
and determining a sixth rotation-translation matrix between the millimeter wave radar and the laser radar according to the second rotation-translation matrix and the third rotation-translation matrix.
Fig. 5 is a structural diagram of an internal reference calibration apparatus of a camera according to an exemplary embodiment of the present disclosure.
As shown in fig. 5, the present disclosure provides an internal reference calibration apparatus 500 of a camera, including:
an acquisition unit 510 for acquiring parameters of the camera;
a simulation unit 520 for generating a virtual camera in a simulation space according to the parameters;
the simulation unit 520 is further configured to acquire a plurality of checkerboard images in the simulation space by using the virtual camera;
the identifying unit 530 is configured to determine an internal reference calibration result of the camera according to the multiple checkerboard images; the first parameter comprises an internal reference calibration result.
Fig. 6 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
As shown in fig. 6, the electronic device provided in this embodiment includes:
a memory 601;
a processor 602; and
a computer program;
wherein the computer program is stored in the memory 601 and configured to be executed by the processor 602 to implement any of the methods as above.
The present embodiments also provide a computer-readable storage medium having a computer program stored thereon, the computer program being executable by a processor to implement any of the methods as above.
The present embodiment also provides a computer program product comprising a computer program which, when executed by a processor, performs any of the methods described above.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the spirit of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A camera-millimeter wave radar-laser radar combined calibration method is characterized by comprising the following steps:
acquiring a first parameter of a camera, a second parameter of a millimeter wave radar and a third parameter of a laser radar; acquiring a first coordinate of the camera in a world coordinate system, a second coordinate of the millimeter wave radar in the world coordinate system, a third coordinate of the laser radar in the world coordinate system, a fourth coordinate of a first target corner point of the camera in the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar in the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar in the world coordinate system;
generating a virtual camera in a simulation space according to the first coordinate and the first parameter, generating a virtual millimeter-wave radar in the simulation space according to the second coordinate and the second parameter, generating a virtual laser radar in the simulation space according to the third coordinate and the third parameter, generating a first virtual target corner point in the simulation space according to the fourth coordinate, generating a second virtual target corner point in the simulation space according to the fifth coordinate, and generating a third virtual target corner point in the simulation space according to the sixth coordinate; wherein the simulation space is a space corresponding to a world coordinate system;
adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar, and acquiring second point cloud information by using the adjusted virtual laser radar;
and determining a calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate and the sixth coordinate.
2. The method of claim 1, wherein the adjusting the first angle of the virtual camera, the second angle of the virtual millimeter wave radar, and the third angle of the virtual lidar comprises:
adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, so that a first coverage area of the virtual camera in the simulation space meets a first range threshold, a second coverage area of the millimeter wave radar in the simulation space meets a second range threshold, and a third coverage area of the laser radar in the simulation space meets a third range threshold; wherein the angles include a yaw angle, a pitch angle, and a roll angle.
3. The method of claim 1, wherein determining calibration results for any two combinations of the camera, the millimeter wave radar, and the lidar based on the image, the first point cloud information, the second point cloud information, and the fourth coordinate, the fifth coordinate, and the sixth coordinate comprises:
determining a first position of the first virtual target corner in the image, a second position of the second virtual target corner in the first point cloud information, and a third position of the third virtual target corner in the second point cloud information;
determining a first rotation-translation matrix between the coordinate system of the camera and the world coordinate system according to the fourth coordinate and the first position;
determining a second rotation-translation matrix between the coordinate system of the millimeter wave radar and the world coordinate system according to the fifth coordinate and the second position;
determining a third rotation-translation matrix between the coordinate system of the laser radar and the world coordinate system according to the sixth coordinate and the third position;
and determining the calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the first rotation-translation matrix, the second rotation-translation matrix and the third rotation-translation matrix.
4. The method of claim 3, wherein determining the calibration results of any two combinations of the camera, the millimeter wave radar, and the lidar comprises at least one of:
determining a fourth rotation-translation matrix between the camera and the millimeter wave radar according to the first rotation-translation matrix and the second rotation-translation matrix;
determining a fifth rotation-translation matrix between the camera and the lidar from the first rotation-translation matrix and the third rotation-translation matrix;
and determining a sixth rotation-translation matrix between the millimeter wave radar and the laser radar according to the second rotation-translation matrix and the third rotation-translation matrix.
5. The method of claim 1, further comprising:
acquiring parameters of the camera;
generating a virtual camera in the simulation space according to the parameters;
acquiring a plurality of checkerboard images in the simulation space by using the virtual camera;
determining an internal reference calibration result of the camera according to the checkerboard images; the first parameter comprises the internal reference calibration result.
6. A camera-millimeter wave radar-laser radar combined calibration device is characterized by comprising:
the acquisition unit is used for acquiring a first parameter of the camera, a second parameter of the millimeter wave radar and a third parameter of the laser radar; acquiring a first coordinate of the camera under a world coordinate system, a second coordinate of the millimeter wave radar under the world coordinate system, a third coordinate of the laser radar under the world coordinate system, a fourth coordinate of a first target corner point of the camera under the world coordinate system, a fifth coordinate of a second target corner point of the millimeter wave radar under the world coordinate system, and a sixth coordinate of a third target corner point of the laser radar under the world coordinate system;
a simulation unit, configured to generate a virtual camera in a simulation space according to the first coordinate and the first parameter, generate a virtual millimeter-wave radar in the simulation space according to the second coordinate and the second parameter, generate a virtual laser radar in the simulation space according to the third coordinate and the third parameter, generate a first virtual target corner point in the simulation space according to the fourth coordinate, generate a second virtual target corner point in the simulation space according to the fifth coordinate, and generate a third virtual target corner point in the simulation space according to the sixth coordinate; wherein the simulation space is a space corresponding to a world coordinate system;
the simulation unit is further used for adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar, acquiring an image by using the adjusted virtual camera, acquiring first point cloud information by using the adjusted virtual millimeter wave radar and acquiring second point cloud information by using the adjusted virtual laser radar;
and the determining unit is used for determining a calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the image, the first point cloud information, the second point cloud information, the fourth coordinate, the fifth coordinate and the sixth coordinate.
7. The apparatus according to claim 6, wherein the simulation unit is specifically configured to:
adjusting a first angle of the virtual camera, a second angle of the virtual millimeter wave radar and a third angle of the virtual laser radar so that a first coverage area of the virtual camera in the simulation space meets a first range threshold, a second coverage area of the millimeter wave radar in the simulation space meets a second range threshold, and a third coverage area of the laser radar in the simulation space meets a third range threshold; wherein the angles include a yaw angle, a pitch angle, and a roll angle.
8. The apparatus according to claim 6, wherein the determining unit is specifically configured to:
determining a first position of the first virtual target corner in the image, a second position of the second virtual target corner in the first point cloud information, and a third position of the third virtual target corner in the second point cloud information;
determining a first rotation-translation matrix between the coordinate system of the camera and the world coordinate system according to the fourth coordinate and the first position;
determining a second rotation-translation matrix between the coordinate system of the millimeter wave radar and the world coordinate system according to the fifth coordinate and the second position;
determining a third rotation-translation matrix between the coordinate system of the laser radar and the world coordinate system according to the sixth coordinate and the third position;
and determining the calibration result of any two combinations of the camera, the millimeter wave radar and the laser radar according to the first rotation-translation matrix, the second rotation-translation matrix and the third rotation-translation matrix.
9. An electronic device comprising a memory and a processor; wherein,
the memory for storing a computer program;
the processor is used for reading the computer program stored in the memory and executing the method of any one of the preceding claims 1-5 according to the computer program in the memory.
10. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of any one of claims 1-5.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210720349.7A CN114814758B (en) | 2022-06-24 | 2022-06-24 | Camera-millimeter wave radar-laser radar combined calibration method and device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210720349.7A CN114814758B (en) | 2022-06-24 | 2022-06-24 | Camera-millimeter wave radar-laser radar combined calibration method and device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN114814758A CN114814758A (en) | 2022-07-29 |
| CN114814758B true CN114814758B (en) | 2022-09-06 |
Family
ID=82521831
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210720349.7A Active CN114814758B (en) | 2022-06-24 | 2022-06-24 | Camera-millimeter wave radar-laser radar combined calibration method and device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114814758B (en) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114187365A (en) * | 2021-12-09 | 2022-03-15 | 联陆智能交通科技(上海)有限公司 | Camera and millimeter wave radar combined calibration method and system for roadside sensing system |
| CN115494467A (en) * | 2022-09-28 | 2022-12-20 | 上汽通用五菱汽车股份有限公司 | Simulation test method and system for multi-angle millimeter wave radar, electronic device and medium |
| CN115574841A (en) * | 2022-10-11 | 2023-01-06 | 潍柴动力股份有限公司 | Joint calibration method, joint calibration device and joint calibration system |
| CN116416319B (en) * | 2022-11-17 | 2023-11-24 | 南京理工大学 | A one-time joint calibration method for multi-type sensor calibration for intelligent driving |
| CN116243324B (en) * | 2022-12-02 | 2025-01-07 | 深圳市普渡科技有限公司 | Positioning method, device, robot and storage medium |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104715486A (en) * | 2015-03-25 | 2015-06-17 | 北京经纬恒润科技有限公司 | Simulated rack camera calibration method and real-time machine |
| CN104778718A (en) * | 2015-05-07 | 2015-07-15 | 西安电子科技大学 | Single-image truck volume measurement method based on three-dimensional model |
| CN112419385A (en) * | 2021-01-25 | 2021-02-26 | 国汽智控(北京)科技有限公司 | 3D depth information estimation method and device and computer equipment |
| CN112596039A (en) * | 2020-12-31 | 2021-04-02 | 安徽江淮汽车集团股份有限公司 | Radar calibration method and system, simulation calibration system and control method thereof |
| CN113485392A (en) * | 2021-06-17 | 2021-10-08 | 广东工业大学 | Virtual reality interaction method based on digital twins |
| CN113514803A (en) * | 2021-03-25 | 2021-10-19 | 武汉光庭信息技术股份有限公司 | Combined calibration method for monocular camera and millimeter wave radar |
| CN114076918A (en) * | 2020-08-20 | 2022-02-22 | 北京万集科技股份有限公司 | Method and device for joint calibration of millimeter-wave radar, lidar and camera |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11960276B2 (en) * | 2020-11-19 | 2024-04-16 | Tusimple, Inc. | Multi-sensor collaborative calibration system |
- 2022-06-24: Application CN202210720349.7A filed in China (CN); granted as patent CN114814758B, status Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104715486A (en) * | 2015-03-25 | 2015-06-17 | 北京经纬恒润科技有限公司 | Simulated rack camera calibration method and real-time machine |
| CN104778718A (en) * | 2015-05-07 | 2015-07-15 | 西安电子科技大学 | Single-image truck volume measurement method based on three-dimensional model |
| CN114076918A (en) * | 2020-08-20 | 2022-02-22 | 北京万集科技股份有限公司 | Method and device for joint calibration of millimeter-wave radar, lidar and camera |
| CN112596039A (en) * | 2020-12-31 | 2021-04-02 | 安徽江淮汽车集团股份有限公司 | Radar calibration method and system, simulation calibration system and control method thereof |
| CN112419385A (en) * | 2021-01-25 | 2021-02-26 | 国汽智控(北京)科技有限公司 | 3D depth information estimation method and device and computer equipment |
| CN113514803A (en) * | 2021-03-25 | 2021-10-19 | 武汉光庭信息技术股份有限公司 | Combined calibration method for monocular camera and millimeter wave radar |
| CN113485392A (en) * | 2021-06-17 | 2021-10-08 | 广东工业大学 | Virtual reality interaction method based on digital twins |
Non-Patent Citations (2)
| Title |
|---|
| Model-Based Estimation of Road Direction in Urban Scenes Using Virtual LiDAR Signals; Miho Adachi et al.; 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC); 2020-10-14; full text * |
| Application of LiDAR in Environment Perception for Driverless Vehicles; Huang Wuling; Microcontroller & Embedded System Applications; 2016-10-01 (Issue 10); full text * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN114814758A (en) | 2022-07-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN114814758B (en) | Camera-millimeter wave radar-laser radar combined calibration method and device | |
| CN111179358B (en) | Calibration method, device, equipment and storage medium | |
| US20220276339A1 (en) | Calibration method and apparatus for sensor, and calibration system | |
| US20220276360A1 (en) | Calibration method and apparatus for sensor, and calibration system | |
| US10999519B2 (en) | Target tracking method and device, movable platform, and storage medium | |
| CN113256740A (en) | Calibration method of radar and camera, electronic device and storage medium | |
| CN111862208B (en) | Vehicle positioning method, device and server based on screen optical communication | |
| CN109901140B (en) | Laser radar light path deviation detection method and device and terminal equipment | |
| CN112927306B (en) | Calibration method and device of shooting device and terminal equipment | |
| CN113895482B (en) | Train speed measuring method and device based on trackside equipment | |
| CN109828250B (en) | Radar calibration method, calibration device and terminal equipment | |
| CN114519845A (en) | Multi-sensing data fusion method and device, computer equipment and storage medium | |
| CN113748693B (en) | Position and orientation correction method and device of roadbed sensor and roadbed sensor | |
| CN105526906A (en) | Wide-angle dynamic high-precision laser angle measurement method | |
| WO2020133230A1 (en) | Radar simulation method, apparatus and system | |
| CN116125488A (en) | Target tracking method, signal fusion method, device, terminal and storage medium | |
| CN114690157A (en) | Automatic calibration method of reflectivity of laser radar, target detection method and device | |
| CN115824170A (en) | A Method of Fusion of Photogrammetry and LiDAR to Measure Ocean Waves | |
| CN113655494B (en) | Road side camera and 4D millimeter wave fused target detection method, device and medium | |
| CN107678012A (en) | Laser radar closed-loop control system, laser radar and laser radar control method | |
| CN114529789A (en) | Target detection method, target detection device, computer equipment and storage medium | |
| CN118762157B (en) | A method and device for generating a detection sample loading path | |
| CN115327512B (en) | Laser radar and camera calibration method, device, server and storage medium | |
| CN118314216A (en) | Parameter calibration method and device, storage medium and electronic equipment | |
| CN119165452A (en) | A joint calibration method and device for camera and 4D millimeter wave radar |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |