CN110686704A - Pose calibration method, system and medium for laser radar and combined inertial navigation - Google Patents
- Publication number: CN110686704A (application CN201910995815.0A)
- Authority: CN (China)
- Legal status: Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
Abstract
The embodiments of the invention disclose a method, a system and a medium for calibrating the pose of a laser radar and a combined inertial navigation system. The method comprises the following steps: acquiring two point cloud data sets and two inertial navigation data sets collected for the same stationary object by the laser radar and the combined inertial navigation at two acquisition positions, the laser radar being rigidly connected with the combined inertial navigation system; determining multiple groups of reference points according to the visualized images of the two point cloud data sets, where each group of reference points corresponds to the same position on the stationary object in the two point cloud data sets; and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points. The technical scheme of the invention is not constrained by the pose relationship between the laser radar and the combined inertial navigation, requires no dedicated calibration equipment or manual adjustment, and can quickly and accurately calibrate the relative pose between the laser radar and the combined inertial navigation.
Description
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a method, a system and a medium for calibrating the pose of a laser radar and a combined inertial navigation system.
Background
With the development of sensor technology, the combined use of a multi-line laser radar and a combined inertial navigation system has become an indispensable scheme for multi-sensor fusion in outdoor 3D mapping and positioning. Calibrating the relative pose between the two is crucial to using the sensors and directly determines the accuracy of their measurements.
At present, pose calibration between the laser radar and the combined inertial navigation is typically performed as follows. The height of the multi-line laser radar above the horizontal ground is obtained with a dedicated device, and the inclination angle between the central axis of the multi-line laser radar and the vertical direction is obtained from the combined inertial navigation. From this height and inclination angle, the point cloud that the multi-line laser radar should obtain when scanning the ground is calculated as reference point data. The reference point data is then compared with the point cloud actually obtained by scanning the ground, the relative pose parameters between the multi-line laser radar and the combined inertial navigation are adjusted manually, the reference point data is recalculated and compared again, and this cycle repeats until the reference point data coincides with the actual scanning data.
However, the existing pose calibration method places requirements on the pose relationship between the laser radar and the combined inertial navigation, can only calibrate the two in a specific pose, and additionally requires a dedicated height-measurement device that is difficult to manufacture. Moreover, because the pose parameters must be adjusted manually and repeatedly, pose calibration is time-consuming and of low accuracy.
Disclosure of Invention
The embodiments of the invention provide a method, a system and a medium for calibrating the pose of a laser radar and a combined inertial navigation system that are not constrained by the pose relationship between the laser radar and the combined inertial navigation, require no dedicated calibration equipment or manual adjustment, and can quickly and accurately calibrate the relative pose between the laser radar and the combined inertial navigation system.
In a first aspect, an embodiment of the present invention provides a pose calibration method for a laser radar and a combined inertial navigation, including:
acquiring two point cloud data sets and two inertial navigation data sets of a same static object acquired by a laser radar and a combined inertial navigation at two acquisition positions; the laser radar is rigidly connected with the combined inertial navigation system;
determining a plurality of groups of reference points according to the visual images of the two point cloud data sets; each group of reference points corresponds to the same position on the static object on the two point cloud data sets;
and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points.
In a second aspect, an embodiment of the present invention further provides a pose calibration apparatus for a laser radar and a combined inertial navigation, where the apparatus includes:
the data set acquisition module is used for acquiring two point cloud data sets and two inertial navigation data sets acquired by the laser radar and the combined inertial navigation on the same static object at two acquisition positions; the laser radar is rigidly connected with the combined inertial navigation system;
a reference point determining module, configured to determine multiple groups of reference points according to the visualized images of the two point cloud data sets; wherein each group of reference points corresponds to the same position on the two point cloud data sets for the same stationary object;
and the calibration parameter determining module is used for determining the pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points.
In a third aspect, an embodiment of the present invention further provides a mapping system, where the mapping system includes a laser radar, a combined inertial navigation system, and a control device; the control device is connected with the laser radar and the combined inertial navigation system respectively, and includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the pose calibration method for the laser radar and the combined inertial navigation according to any implementation of the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the pose calibration method for the laser radar and the combined inertial navigation according to any implementation of the first aspect.
According to the method, system and medium for calibrating the pose of the laser radar and the combined inertial navigation provided by the embodiments of the invention, two point cloud data sets and two inertial navigation data sets collected for the same stationary object at two different acquisition positions are obtained, multiple groups of reference points required for calibration are determined from the visualized images of the two point cloud data sets, and the pose calibration parameters of the laser radar and the combined inertial navigation are then determined from the determined groups of reference points, the two point cloud data sets and the two inertial navigation data sets. The technical scheme of the embodiments requires no calibration equipment beyond the laser radar and the combined inertial navigation themselves, which lowers the equipment requirements; it places no restriction on the relative pose between the laser radar and the combined inertial navigation, which widens the applicable range of the pose calibration method; and it needs no repeated manual adjustment of the pose parameters, which improves the efficiency and accuracy of pose calibration. It thus provides a new idea for calibrating the relative pose between a laser radar and a navigation positioning system.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
Fig. 1 is a flowchart of a pose calibration method for a laser radar and a combined inertial navigation system according to a first embodiment of the present invention;
FIG. 2 is a flowchart of a pose calibration method for a laser radar and a combined inertial navigation system according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a pose calibration apparatus for a laser radar and a combined inertial navigation in a third embodiment of the present invention;
fig. 4A is a schematic structural diagram of a mapping system according to a fourth embodiment of the present invention;
fig. 4B is a schematic structural diagram of a control device of a mapping system according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Example one
Fig. 1 is a flowchart of a pose calibration method for a laser radar and a combined inertial navigation system according to the first embodiment of the present invention. The method is applicable to situations in which the relative pose between a laser radar and a combined inertial navigation system needs to be calibrated accurately. It may be performed by the control device in the mapping system according to an embodiment of the present invention, and the control device may be implemented in software and/or hardware. As shown in fig. 1, the method specifically includes the following steps:
s101, acquiring two point cloud data sets and two inertial navigation data sets of the same static object acquired by the laser radar and the combined inertial navigation at two acquisition positions.
The laser radar is a radar system that detects the position of an object by emitting a laser beam; optionally, the laser radar in this embodiment may be a multi-line laser radar. The combined inertial navigation may be a combination of at least two units or systems having a positioning function, for example a combination of at least one of an Inertial Measurement Unit (IMU) and an Inertial Navigation System (INS) with at least one of a Global Positioning System (GPS) and a BeiDou Navigation Satellite System (BDS) receiver. It should be noted that, in the embodiments of the present invention, the positioning accuracy of a combined inertial navigation system formed from several positioning units or systems is higher than that of any single positioning unit or system. Optionally, in this embodiment the laser radar and the combined inertial navigation are rigidly connected. The point cloud data set is a set of points with three-dimensional coordinates collected by the laser radar, and can represent the shape of the outer surface of an object; the three-dimensional geometric position of each point is represented by (x, y, z), and the point cloud data may also record the reflected light intensity of each point. The inertial navigation data set, acquired by the combined inertial navigation, describes the position and attitude of the rigid structure formed by the rigidly connected laser radar and combined inertial navigation, and includes: longitude, latitude, altitude, and the rotation angles about the x, y and z axes of the combined inertial navigation. The stationary object in this embodiment may be any stationary object within the current capture scene (that is, an object that can be captured from both acquisition positions); to ensure the accuracy of pose calibration, an object with a convex contour is preferably used as the stationary object.
Optionally, a rigid body structure formed by the laser radar and the combined inertial navigation system may be set at a first acquisition position, a first point cloud data set acquired by the laser radar at the first acquisition position for a stationary object and a first inertial navigation data set acquired by the combined inertial navigation system at the first acquisition position for the stationary object are acquired; and then, keeping the static object still, moving the rigid body structure formed by the laser radar and the combined inertial navigation to a second acquisition position, and acquiring a second point cloud data set acquired by the laser radar at the second acquisition position aiming at the static object and a second inertial navigation data set acquired by the combined inertial navigation at the second acquisition position aiming at the static object. In this step, when the laser radar and the combined inertial navigation which are rigidly connected acquire data for the stationary object at the first acquisition position and the second acquisition position, the position of the stationary object is fixed, and only the position of the laser radar and the position of the combined inertial navigation which are rigidly connected change.
It should be noted that in the embodiment of the present invention, the laser radar and the combined inertial navigation are in rigid connection, and when the laser radar acquires a point cloud data set of a certain scene, the combined inertial navigation system also locates the same scene to acquire an inertial navigation data set corresponding to the scene, that is, the first point cloud data set and the first inertial navigation data set are acquired synchronously; the second point cloud data set and the second inertial navigation data set are acquired synchronously. Therefore, the point cloud data sets at the same acquisition position are associated with the inertial navigation data set, and each point cloud data in the point cloud data sets at the acquisition positions corresponds to the inertial navigation data set associated with the point cloud data set.
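For illustration, the following minimal Python sketch shows one way the synchronously acquired data might be organized so that each point cloud data set stays associated with its inertial navigation data set; all type and field names here are hypothetical and not part of the disclosed method.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class InsRecord:
    """One combined-inertial-navigation reading (field names are illustrative)."""
    longitude: float  # degrees
    latitude: float   # degrees
    altitude: float   # meters
    roll: float       # rotation about the INS y axis
    pitch: float      # rotation about the INS x axis
    heading: float    # rotation about the INS z axis

@dataclass
class Capture:
    """A point cloud and the INS data recorded synchronously at one position."""
    points: List[Tuple[float, float, float]]  # (x, y, z) in lidar coordinates
    ins: InsRecord                            # the INS reading associated with this cloud

# The method works with two such captures of the same stationary object:
# capture_1 at the first acquisition position, capture_2 at the second.
```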
S102, determining a plurality of groups of reference points according to the visual images of the two point cloud data sets.
The visualized image of a point cloud data set is the image formed by feeding the collected point cloud data set into a visualization program, which displays the coordinates of every point in the set. A reference point is a point selected from such a visualized image. Optionally, the reference points in this embodiment are divided into groups, and each group of reference points corresponds to the same position on the same stationary object in the two point cloud data sets. For example, if the stationary object is a rectangular table and the four corner points of the tabletop in each visualized image are chosen as reference points, then the two points marking the top-left corner of the tabletop in the two visualized images form one group of reference points.
Optionally, in this step, there are many methods for determining multiple sets of reference points according to the visualized images of the two point cloud data sets, and this embodiment is not limited thereto. For example, one possible implementation may be: generating a visual image of each point cloud data set according to the two point cloud data sets; determining common features of the two visual images; and extracting feature points corresponding to the common features from the two visual images as a group of reference points aiming at each common feature. Specifically, a first point cloud data set acquired by a laser radar at a first acquisition position is input into a visualization program to obtain a first visualization image of the first point cloud data set, a second point cloud data set acquired by the laser radar at a second acquisition position is input into the visualization program to obtain a second visualization image of the second point cloud data set, then image feature recognition is performed on the two visualization images to determine common features of the two visualization images, and feature points corresponding to each common feature in the two visualization images are used as a group of reference points, namely, two feature points corresponding to the same position on a static object in the two visualization images are used as a group of reference points. For example, if the stationary object is a rectangular table, four corner points of the tabletop in the two visual images may be identified, and the upper left corner of the tabletop in the two visual images may be used as a first set of reference points, the upper right corner of the tabletop may be used as a second set of reference points, the lower left corner of the tabletop may be used as a third set of reference points, and the lower right corner of the tabletop may be used as a fourth set of reference points.
Another possible implementation may be: generating a visualized image of each point cloud data set from the two point cloud data sets and displaying the images to a user; acquiring a plurality of reference points selected by the user in the two visualized images; and taking two selected points that belong to different visualized images but correspond to the same position on the stationary object as one group of reference points. Specifically, the visualized images are generated in the same way as in the implementation above; the difference is that the generated images are shown to the user on a display device (such as a display screen), and the user manually selects reference points in the two displayed images as needed. The pose calibration process then determines each reference point in the visualized images of the two point cloud data sets from the user's selection: for example, each click position on the display device may be detected and converted into image coordinates to locate the corresponding reference point. After all reference points are determined, they are grouped: the points are first divided into two classes according to which of the two visualized images they belong to, and then two points that mark the same position on the stationary object, one from each class, are taken as one group of reference points. Optionally, when selecting reference points in the visualized images, the user must ensure that the selected positions in the two images correspond one to one; to keep the selection accurate, the user should prefer positions with salient features, for example corner points in the image.
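As a sketch of this manual-selection path, assuming the picked points are available as index lists into the two clouds (the helper name and calling convention are illustrative only):

```python
def group_reference_points(cloud_1, cloud_2, picks_1, picks_2):
    """Turn user-picked indices into groups of reference points.

    cloud_1, cloud_2 : sequences of (x, y, z) point cloud coordinates.
    picks_1, picks_2 : picked indices in the SAME order, i.e. picks_1[i] and
                       picks_2[i] mark the same physical spot on the object.
    Returns a list of (P, P_prime) coordinate pairs, one per group.
    """
    if len(picks_1) != len(picks_2):
        raise ValueError("picked points must correspond one to one")
    return [(cloud_1[i], cloud_2[j]) for i, j in zip(picks_1, picks_2)]
```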
And S103, determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points.
Optionally, in this embodiment, when determining the pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets, and the multiple groups of reference points, the point cloud coordinates and the coordinate system transformation matrix of each reference point in each group may be determined according to the two point cloud data sets and the two inertial navigation data sets; and then determining pose calibration parameters of the laser radar and the combined inertial navigation according to the point cloud coordinates of the multiple groups of reference points and the coordinate system transformation matrix.
Specifically, the point cloud coordinates of a reference point are the three-dimensional geometric position coordinates (x, y, z) of the corresponding point in the point cloud data set underlying the visualized image; this embodiment can read the point cloud coordinates of every reference point directly from that data set. The coordinate system transformation matrix of a reference point is the matrix that transforms the inertial navigation data set corresponding to the reference point from the combined inertial navigation coordinate system to the northeast coordinate system (the local east-north-up frame). In this embodiment, the inertial navigation data set of each reference point is found from the two inertial navigation data sets through the association between inertial navigation data sets and point cloud data sets, and each inertial navigation data set is then transformed from the combined inertial navigation coordinate system to the northeast coordinate system according to a preset conversion formula to obtain the coordinate system transformation matrix of each reference point. A system of pose parameter equations is generated from the point cloud coordinates and the coordinate system transformation matrices of the groups of reference points together with a preset pose parameter calculation formula, and solving this system yields the pose calibration parameters of the laser radar and the combined inertial navigation. How the pose calibration parameters are determined from the preset conversion formula and the pose parameter calculation formula is described in detail in the second embodiment.
Optionally, in order to improve the accuracy of the pose calibration parameters of the laser radar and the combined inertial navigation, in this embodiment, after the operations of S101 to S102 are performed on one data acquisition environment, the current data acquisition environment is adjusted, and then based on the adjusted data acquisition environment, two point cloud data sets and two inertial navigation data sets in the adjusted data acquisition environment are obtained again according to the operations of S101 to S102, and a plurality of groups of reference points are determined; and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points in each data acquisition environment. Specifically, in this embodiment, the operations of S101-S102 are executed for the data acquisition environment 1, two point cloud data sets and two inertial navigation data sets corresponding to the data acquisition environment 1 are obtained, and a plurality of groups of reference points are determined; then adjusting a data acquisition environment, executing the operations of S101-S102 aiming at the adjusted data acquisition environment 2, acquiring two point cloud data sets and two inertial navigation data sets corresponding to the data acquisition environment 2 and determining a plurality of groups of reference points; further generating a plurality of pose parameter equations corresponding to the data acquisition environment 1 according to the two point cloud data sets corresponding to the acquisition environment 1, the two inertial navigation data sets and the determined reference points; generating a plurality of pose parameter equations corresponding to the data acquisition environment 2 by the same method; and finally, solving an equation set formed by a plurality of pose parameter equations corresponding to the data acquisition environment 1 and the data acquisition environment 2 by adopting a least square method, and taking a calculation result as an optimal pose calibration parameter of the laser radar and the combined inertial navigation. Among other things, adjusting the data acquisition environment may include, but is not limited to: adjusting at least one of the position of the stationary object, adjusting the position of the rigidly connected lidar and the combined inertial navigation, and reselecting the stationary object.
According to the pose calibration method for the laser radar and the combined inertial navigation provided by this embodiment of the invention, two point cloud data sets and two inertial navigation data sets collected for the same stationary object at two different acquisition positions are obtained, multiple groups of reference points required for calibration are determined from the visualized images of the two point cloud data sets, and the pose calibration parameters of the laser radar and the combined inertial navigation are then determined from the determined groups of reference points, the two point cloud data sets and the two inertial navigation data sets. The technical scheme of this embodiment requires no calibration equipment beyond the laser radar and the combined inertial navigation themselves, which lowers the equipment requirements; it places no restriction on the relative pose between the laser radar and the combined inertial navigation, which widens the applicable range of the pose calibration method; and it needs no repeated manual adjustment of the pose parameters, which improves the efficiency and accuracy of pose calibration. It thus provides a new idea for calibrating the relative pose between a laser radar and a navigation positioning system.
Example two
Fig. 2 is a flowchart of a pose calibration method for a laser radar and a combined inertial navigation system in the second embodiment of the present invention. This embodiment builds on the embodiment above and further refines the method, specifically describing how to determine the pose calibration parameters of the laser radar and the combined inertial navigation from the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points. As shown in fig. 2, the method of this embodiment specifically includes the following steps:
s201, two point cloud data sets and two inertial navigation data sets acquired by the laser radar and the combined inertial navigation on the same static object at two acquisition positions are acquired.
Wherein the laser radar is rigidly connected with the combined inertial navigation system.
S202, determining a plurality of groups of reference points according to the visual images of the two point cloud data sets.
Wherein each set of reference points corresponds to the same position on the two point cloud data sets for the same stationary object.
And S203, acquiring the position coordinates of each reference point in each group in the point cloud data set corresponding to the visual image to which the reference point belongs, and taking the position coordinates as the point cloud coordinates of each reference point.
Optionally, the visualized image is composed of data of each point cloud in the point cloud data set, and in this embodiment, when the point cloud coordinates of each reference point in each group of reference points are determined, the point cloud data set of the visualized image to which each reference point belongs may be obtained first, and then the point cloud coordinates (x, y, z) corresponding to the reference point may be directly obtained from the point cloud data set. For example, if the plurality of sets of reference points in S202 are manually selected by the user, the three-dimensional spatial geometric position coordinates of the point cloud data at the position selected by the user on the visualized image may be used as the point cloud coordinates P of the reference point. If the multiple sets of reference points in S202 are automatically determined by the system according to the feature points of the visualized image, the three-dimensional space geometric position coordinates of the point cloud data corresponding to the reference points in the visualized image may be used as the point cloud coordinates P of the reference points.
And S204, calculating a coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to the two inertial navigation data sets, and taking the coordinate system transformation matrix as a coordinate system transformation matrix of each reference point contained in the visual image of the point cloud data set acquired at the same acquisition position with each inertial navigation data set.
The inertial navigation data is used for calculating a coordinate system transformation matrix. Optionally, in this step, when a coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to the two inertial navigation data sets is calculated, the coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to each inertial navigation data set may be calculated according to the initial data, each inertial navigation data set, and the coordinate transformation formula.
The initial data may be obtained by processing an initial inertial navigation data set acquired by the combined inertial navigation at an initial point. Optionally, the initial data may include: the world coordinates of the initial point (i.e., GNDXYZ0) and the transformation matrix from the world coordinate system at the initial point to the northeast coordinate system (i.e., GND2ENS). The initial point may be predetermined and may be any point other than the acquisition positions in S201. A group of inertial navigation data acquired by the combined inertial navigation at the initial point is taken as the initial inertial navigation data set, which may include: the initial-point longitude L_0, the initial-point latitude B_0 and the initial-point altitude H_0. The initial data (i.e., GNDXYZ0 and GND2ENS) are then calculated from the acquired initial inertial navigation data set according to equations (1)-(6) below, where N_0 in equation (1) is the prime vertical radius of curvature of the reference ellipsoid at latitude B_0:

N_0 = a / √(1 − e² × sin²(B_0)) (1)

X_0 = (N_0 + H_0) × cos(B_0) × cos(L_0) (2)

Y_0 = (N_0 + H_0) × cos(B_0) × sin(L_0) (3)

Z_0 = (N_0 × (1 − e²) + H_0) × sin(B_0) (4)

GNDXYZ0 = [X_0, Y_0, Z_0] (5)

GND2ENS = [[−sin(L_0), cos(L_0), 0], [−sin(B_0) × cos(L_0), −sin(B_0) × sin(L_0), cos(B_0)], [cos(B_0) × cos(L_0), cos(B_0) × sin(L_0), sin(B_0)]] (6)

where a is the semi-major axis of the earth, which may take the value 6378137 meters, and e is the first eccentricity of the ellipsoid in the WGS-84 geocentric coordinate system, which may take the value 0.08181919084255234.
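A minimal numpy sketch of equations (1)-(6) follows; it assumes the longitude and latitude in the inertial navigation data set are given in degrees, which the embodiment does not state explicitly:

```python
import numpy as np

A = 6378137.0                # WGS-84 semi-major axis a, meters
E2 = 0.08181919084255234**2  # square of the first eccentricity e

def initial_data(L0_deg, B0_deg, H0):
    """Compute GNDXYZ0 and GND2ENS from the initial INS reading, eqs. (1)-(6)."""
    L0, B0 = np.radians(L0_deg), np.radians(B0_deg)
    N0 = A / np.sqrt(1.0 - E2 * np.sin(B0) ** 2)             # (1)
    X0 = (N0 + H0) * np.cos(B0) * np.cos(L0)                 # (2)
    Y0 = (N0 + H0) * np.cos(B0) * np.sin(L0)                 # (3)
    Z0 = (N0 * (1.0 - E2) + H0) * np.sin(B0)                 # (4)
    gndxyz0 = np.array([X0, Y0, Z0])                         # (5)
    gnd2ens = np.array([                                     # (6)
        [-np.sin(L0),               np.cos(L0),              0.0],
        [-np.sin(B0) * np.cos(L0), -np.sin(B0) * np.sin(L0), np.cos(B0)],
        [ np.cos(B0) * np.cos(L0),  np.cos(B0) * np.sin(L0), np.sin(B0)],
    ])
    return gndxyz0, gnd2ens
```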
Optionally, when the coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to an inertial navigation data set is calculated from the initial data obtained above, the following coordinate transformation equations (7) to (14) may be used:

X_1 = (N_1 + H_1) × cos(B_1) × cos(L_1) (9)

Y_1 = (N_1 + H_1) × cos(B_1) × sin(L_1) (10)

Z_1 = (N_1 × (1 − e²) + H_1) × sin(B_1) (11)

GND_translation = [X_1 − GNDXYZ0[0], Y_1 − GNDXYZ0[1], Z_1 − GNDXYZ0[2]] (12)

T_3×1 = GND2ENS · GND_translation (13)

where a is the semi-major axis of the earth, which may take the value 6378137 meters; e is the first eccentricity of the ellipsoid in the WGS-84 geocentric coordinate system, which may take the value 0.08181919084255234; roll is the rotation angle of the combined inertial navigation data set around the combined inertial navigation y axis; pitch is the rotation angle around the combined inertial navigation x axis; heading is the rotation angle around the combined inertial navigation z axis; L_1 is the longitude value in the combined inertial navigation data set; B_1 is the latitude value in the combined inertial navigation data set; H_1 is the altitude value in the combined inertial navigation data set; S is the coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to the inertial navigation data set; and GND_translation is the offset of the position coordinates in the combined inertial navigation data set from the position coordinates of the initial point in the world coordinate system.
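Continuing the previous sketch, equations (9)-(13) can be implemented as below. The rotation part of S, built from roll, pitch and heading by equations (7), (8) and (14), is not reproduced in the text above and is therefore not sketched here:

```python
import numpy as np

A = 6378137.0                # WGS-84 semi-major axis a, meters (as above)
E2 = 0.08181919084255234**2  # square of the first eccentricity e (as above)

def enu_translation(L1_deg, B1_deg, H1, gndxyz0, gnd2ens):
    """ENU offset T of one INS reading relative to the initial point, eqs. (9)-(13)."""
    L1, B1 = np.radians(L1_deg), np.radians(B1_deg)
    N1 = A / np.sqrt(1.0 - E2 * np.sin(B1) ** 2)        # prime vertical radius, as in (1)
    X1 = (N1 + H1) * np.cos(B1) * np.cos(L1)            # (9)
    Y1 = (N1 + H1) * np.cos(B1) * np.sin(L1)            # (10)
    Z1 = (N1 * (1.0 - E2) + H1) * np.sin(B1)            # (11)
    gnd_translation = np.array([X1, Y1, Z1]) - gndxyz0  # (12)
    return gnd2ens @ gnd_translation                    # (13): T_3x1 in the northeast frame
```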
Optionally, because the point cloud data set and the inertial navigation data set collected at the same acquisition position in S201 are associated with each other, and every point cloud in the point cloud data set at an acquisition position corresponds to the inertial navigation data set associated with that point cloud data set, the following holds: if the visualized image to which a reference point belongs was generated from the point cloud data set acquired at the first acquisition position, the coordinate system transformation matrix of that reference point is the one calculated from the inertial navigation data set acquired at the first acquisition position; if the visualized image to which the reference point belongs was generated from the point cloud data set acquired at the second acquisition position, its coordinate system transformation matrix is the one calculated from the inertial navigation data set acquired at the second acquisition position.
And S205, taking the point cloud coordinates and the coordinate system transformation matrix of each group of reference points as a group of parameter variables, and generating a pose parameter equation corresponding to each group of reference points based on a pose parameter calculation formula.
Optionally, in this embodiment, the number of the reference points included in the set of reference points is two, and the point cloud coordinates and the coordinate system transformation matrix of the two reference points are taken as a set of parameter variables and are substituted into the following pose calculation formula (15), so as to obtain a pose parameter equation related to the set of reference points:
S × R_L × P = S′ × R_L × P′ (15)

where S is the transformation matrix from the inertial navigation coordinates to the northeast coordinates for the first reference point in a group, i.e., the coordinate system transformation matrix of the first reference point; S′ is the transformation matrix from the inertial navigation coordinates to the northeast coordinates for the second reference point in the group, i.e., the coordinate system transformation matrix of the second reference point; P is the point cloud coordinates of the first reference point in the group; P′ is the point cloud coordinates of the second reference point in the group; and R_L is a pose calibration matrix containing the pose calibration parameters of the laser radar and the combined inertial navigation, i.e., the transformation matrix from radar coordinates to inertial navigation coordinates.
optionally, RLAnd calibrating the pose calibration parameters of the laser radar and the combined inertial navigation to be finally determined. The concrete representation form is as follows:wherein M is3×3Is a rotation variation matrix containing 9 orientation calibration parameters, N3×1Is a translation transformation matrix, which contains 3 position calibration parameters.
And S206, determining pose calibration parameters of the laser radar and the combined inertial navigation according to pose parameter equations corresponding to the multiple groups of reference points.
Optionally, since the point cloud data sets and inertial navigation data sets of this embodiment are acquired on the premise that the object is stationary, the two reference points in each group should have the same coordinates in the northeast coordinate system. Solving the equations containing the 12 pose calibration parameters therefore requires at least 4 groups of reference points. To reduce the amount of calculation, this embodiment may select the pose parameter equations corresponding to 4 groups of reference points to form a system of pose parameter equations, and solve this system to obtain R_L, which contains the 12 pose calibration parameters, i.e., the pose calibration parameters of the laser radar and the combined inertial navigation to be calibrated in this embodiment.
Optionally, to improve the accuracy of the calibration result, when the number of pose parameter equations corresponding to the groups of reference points is greater than the number of pose calibration parameters, a least squares method is used to calculate the pose calibration parameters of the laser radar and the combined inertial navigation from the equations. For example, the number of pose calibration parameters in this embodiment is 12; if the number of pose parameter equations generated in S205 is greater than 12 (4 groups of reference points generate 12 pose parameter equations), this embodiment may solve the system by least squares to obtain the final pose calibration parameters of the laser radar and the combined inertial navigation.
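Continuing the sketch above, the per-group equation blocks can be stacked and solved in one shot; with exactly 4 groups the system is square, and with more it is solved in the least-squares sense, as described above:

```python
import numpy as np

def solve_pose_calibration(groups):
    """Stack the equation blocks from pair_rows() and solve for M and N.

    groups : iterable of (S_R, T, S_Rp, Tp, P, Pp) tuples, possibly gathered
             across several acquisition environments. With exactly 4 groups
             the 12 x 12 system is square; with more, lstsq returns the
             least-squares solution, matching the embodiment's description.
    """
    blocks = [pair_rows(*g) for g in groups]
    A_mat = np.vstack([a for a, _ in blocks])  # (3k, 12)
    b_vec = np.hstack([b for _, b in blocks])  # (3k,)
    x, *_ = np.linalg.lstsq(A_mat, b_vec, rcond=None)
    M = x[:9].reshape(3, 3, order='F')         # undo the column-wise vec(M)
    N = x[9:]                                  # translation calibration parameters
    return M, N
```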
According to the pose calibration method for the laser radar and the combined inertial navigation provided by this embodiment of the invention, two point cloud data sets and two inertial navigation data sets collected for the same stationary object at two different acquisition positions are obtained, and the groups of reference points required for calibration are determined from the visualized images of the two point cloud data sets; the point cloud coordinates and the coordinate system transformation matrix of each group of reference points are then determined, and the pose calibration parameters of the laser radar and the combined inertial navigation are solved accurately from the preset pose parameter calculation formula. With this scheme, for a rigid body formed by a laser radar and a combined inertial navigation in any pose relationship, the 12 pose calibration parameters between them can be solved accurately from 4 groups of reference points without any dedicated calibration device and without repeated manual adjustment of the pose parameters, which improves the efficiency and accuracy of pose calibration.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a pose calibration device for a laser radar and a combined inertial navigation system according to a third embodiment of the present invention. The device can execute the pose calibration method of the laser radar and the combined inertial navigation provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method. As shown in fig. 3, the apparatus specifically includes:
a data set obtaining module 301, configured to obtain two point cloud data sets and two inertial navigation data sets, where the two point cloud data sets and the two inertial navigation data sets are obtained by using the laser radar and the combined inertial navigation system at two collecting positions; the laser radar is rigidly connected with the combined inertial navigation system;
a reference point determining module 302, configured to determine multiple sets of reference points according to the visualized images of the two point cloud data sets; wherein each set of reference points corresponds to the same position on the same stationary object in the two point cloud data sets;
and a calibration parameter determining module 303, configured to determine pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets, and the multiple groups of reference points.
According to the pose calibration device for the laser radar and the combined inertial navigation provided by this embodiment of the invention, two point cloud data sets and two inertial navigation data sets collected for the same stationary object at two different acquisition positions are obtained, multiple groups of reference points required for calibration are determined from the visualized images of the two point cloud data sets, and the pose calibration parameters of the laser radar and the combined inertial navigation are then determined from the determined groups of reference points, the two point cloud data sets and the two inertial navigation data sets. The technical scheme of this embodiment requires no calibration equipment beyond the laser radar and the combined inertial navigation themselves, which lowers the equipment requirements; it places no restriction on the relative pose between the laser radar and the combined inertial navigation, which widens the applicable range of the pose calibration method; and it needs no repeated manual adjustment of the pose parameters, which improves the efficiency and accuracy of pose calibration. It thus provides a new idea for calibrating the relative pose between a laser radar and a navigation positioning system.
Further, the reference point determining module 302 is specifically configured to:
generating a visual image of each point cloud data set according to the two point cloud data sets;
acquiring a plurality of reference points selected by a user in two visual images, and taking the two reference points which belong to different visual images and correspond to the same position on the static object as a group of reference points; or,
determining common features of the two visual images; and extracting feature points corresponding to the common features from the two visual images as a group of reference points aiming at each common feature.
Further, the calibration parameter determining module 303 includes:
the coordinate matrix determination unit is used for determining point cloud coordinates and coordinate system transformation matrixes of reference points in each group according to the two point cloud data sets and the two inertial navigation data sets;
and the calibration parameter determining unit is used for determining the pose calibration parameters of the laser radar and the combined inertial navigation according to the point cloud coordinates of the multiple groups of reference points and the coordinate system transformation matrix.
Further, the coordinate matrix determination unit specifically includes:
the point cloud coordinate determining subunit is used for acquiring the position coordinates of each reference point in each group in a point cloud data set corresponding to the visual image to which the reference point belongs, and taking the position coordinates as the point cloud coordinates of each reference point;
and the coordinate transformation matrix determining subunit is used for calculating a coordinate transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to the two inertial navigation data sets, and the coordinate transformation matrix is used as a coordinate transformation matrix of each reference point contained in a visual image of the point cloud data set acquired at the same acquisition position as each inertial navigation data set.
Further, the coordinate transformation matrix determining subunit is specifically configured to:
and calculating a coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to each inertial navigation data set according to the initial data, each inertial navigation data set and the coordinate conversion formula.
Further, the calibration parameter determining unit specifically includes:
the parameter equation generating subunit is used for generating a pose parameter equation corresponding to each group of reference points by taking the point cloud coordinates and the coordinate system transformation matrix of each group of reference points as a group of parameter variables based on a pose parameter calculation formula;
and the pose parameter solving subunit is used for determining pose calibration parameters of the laser radar and the combined inertial navigation according to the pose parameter equations corresponding to the multiple groups of reference points.
Further, the pose calculation formula is as follows:
S × R_L × P = S′ × R_L × P′;

where S is the coordinate system transformation matrix of the first reference point in a group of reference points; S′ is the coordinate system transformation matrix of the second reference point in the group; R_L is a pose calibration matrix containing the pose calibration parameters of the laser radar and the combined inertial navigation; P is the point cloud coordinates of the first reference point in the group; and P′ is the point cloud coordinates of the second reference point in the group.
Further, the pose parameter solving subunit is specifically configured to:
and when the number of the pose parameter equations corresponding to the multiple groups of reference points is greater than the number of the pose calibration parameters, calculating the pose calibration parameters of the laser radar and the combined inertial navigation according to the multiple groups of pose parameter equations by adopting a least square method.
Further, the data set obtaining module 301 and the reference point determining module 302 cooperate to perform the adjustment based on the adjusted data acquisition environment, to obtain two point cloud data sets and two inertial navigation data sets again in the data acquisition environment, and to determine a plurality of groups of reference points;
correspondingly, the calibration parameter determining module 303 is specifically configured to: and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points in each data acquisition environment.
Example four
Fig. 4A is a schematic structural diagram of a surveying and mapping system according to a fourth embodiment of the present invention, and fig. 4B is a schematic structural diagram of a control device of the surveying and mapping system according to the fourth embodiment of the present invention. The mapping system 4 shown in fig. 4A comprises a lidar 41, a combined inertial navigation system 42 and a control device 40. FIG. 4B illustrates a block diagram of an exemplary control device 40 suitable for use in implementing embodiments of the present invention. The control device 40 shown in fig. 4B is only an example, and should not bring any limitation to the function and the range of use of the embodiment of the present invention. As shown in fig. 4B, the control device 40 is in the form of a general purpose computing device. The components of the control device 40 may include, but are not limited to: one or more processors or processing units 401, a system memory 402, and a bus 403 that couples the various system components (including the system memory 402 and the processing unit 401).
Bus 403 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The system memory 402 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)404 and/or cache memory 405. The control device 40 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 406 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4B and commonly referred to as a "hard drive"). Although not shown in FIG. 4B, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 403 by one or more data media interfaces. System memory 402 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 408 having a set (at least one) of program modules 407 may be stored, for example, in system memory 402, such program modules 407 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 407 generally perform the functions and/or methods of the described embodiments of the invention.
The control apparatus 40 may also communicate with one or more external devices 409 (e.g., keyboard, pointing device, display 410, etc.), with one or more devices that enable a user to interact with the device, and/or with any devices (e.g., network card, modem, etc.) that enable the control apparatus 40 to communicate with one or more other computing devices. Such communication may be through input/output (I/O) interface 411. Also, the control device 40 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 412. As shown in fig. 4B, the network adapter 412 communicates with the other modules of the control device 40 via the bus 403. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the control device 40, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 401 executes various functional applications and performs data processing by running programs stored in the system memory 402, for example implementing the pose calibration method for a laser radar and a combined inertial navigation provided by the embodiments of the present invention.
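For concreteness, the sketch below shows, in Python, the top-level glue such a program might contain. The data layout, the helper `pick_reference_groups` and all function names are hypothetical illustrations, not the patent's prescribed implementation; the computations behind `ins_to_enu` and `solve_pose_calibration` are sketched after claims 5 and 7 below.

```python
# Hypothetical top-level flow of the calibration program (sketch only).
from dataclasses import dataclass
import numpy as np

@dataclass
class Acquisition:
    """One stop of the rigidly coupled lidar + combined-INS rig."""
    cloud: np.ndarray    # (N, 3) lidar points of the static object
    ins_rpy: np.ndarray  # (3,) roll/pitch/yaw reported by the combined INS
    ins_pos: np.ndarray  # (3,) position reported by the combined INS

def calibrate(acq_a: Acquisition, acq_b: Acquisition) -> np.ndarray:
    """Return the 4x4 pose calibration matrix R_L (illustrative glue only)."""
    # Claim 2: pair up reference points marking the same physical spots in
    # the two visual images of the clouds; each group yields homogeneous
    # point cloud coordinates (P, P') per claim 4. Hypothetical helper.
    groups = pick_reference_groups(acq_a.cloud, acq_b.cloud)
    # Claim 5: per-acquisition transform from the INS frame to ENU.
    S = ins_to_enu(acq_a.ins_rpy, acq_a.ins_pos)
    S_prime = ins_to_enu(acq_b.ins_rpy, acq_b.ins_pos)
    # Claim 7: solve S @ R_L @ P == S' @ R_L @ P' over all groups.
    return solve_pose_calibration(groups, S, S_prime)
```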
EXAMPLE five
The fifth embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the pose calibration method for a laser radar and a combined inertial navigation described in the foregoing embodiments.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk or C++, as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The above embodiment numbers are for description only and do not indicate the relative merits of the embodiments.
It will be appreciated by those of ordinary skill in the art that the modules or steps of the embodiments described above may be implemented with a general-purpose computing device: they may be centralized on a single computing device or distributed across a network of computing devices. Optionally, they may be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by a computing device; alternatively, they may be fabricated separately as individual integrated circuit modules, or several of their modules or steps may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its protection scope.
Claims (10)
1. A pose calibration method for a laser radar and a combined inertial navigation, characterized by comprising the following steps:
acquiring two point cloud data sets and two inertial navigation data sets of the same static object, acquired by a laser radar and a combined inertial navigation at two acquisition positions, the laser radar being rigidly connected with the combined inertial navigation;
determining a plurality of groups of reference points according to the visual images of the two point cloud data sets, each group of reference points corresponding to the same position on the static object in the two point cloud data sets;
and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points.
2. The method of claim 1, wherein determining a plurality of groups of reference points according to the visual images of the two point cloud data sets comprises:
generating a visual image of each point cloud data set according to the two point cloud data sets;
acquiring a plurality of reference points selected by a user in the two visual images, and taking two reference points which belong to different visual images and correspond to the same position on the static object as a group of reference points; or
determining common features of the two visual images and, for each common feature, extracting the feature points corresponding to that common feature from the two visual images to serve as a group of reference points.
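As one possible realization of the second branch (automatic common-feature extraction), the sketch below matches ORB features between the two visual images with OpenCV; the use of OpenCV and ORB is an assumption of this illustration, not something the claim specifies.

```python
import cv2          # assumption: OpenCV, not mandated by the claim
import numpy as np  # images are expected as uint8 grayscale arrays

def match_reference_points(img_a: np.ndarray, img_b: np.ndarray,
                           max_groups: int = 50):
    """Extract groups of reference points from two visual images.

    Each returned group is a pair of pixel coordinates, one per image,
    that the matcher takes to be the same spot on the static object.
    """
    orb = cv2.ORB_create()
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    # Hamming distance suits ORB's binary descriptors; cross-checking
    # keeps only mutually best matches, which prunes false pairs.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt)
            for m in matches[:max_groups]]
```

The matched pixel positions would then be mapped back to point cloud coordinates, as claim 4 describes, before entering the pose parameter equations of claim 7.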
3. The method of claim 1, wherein determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points comprises:
determining the point cloud coordinates and coordinate system transformation matrices of the reference points in each group according to the two point cloud data sets and the two inertial navigation data sets;
and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the point cloud coordinates and the coordinate system transformation matrix of the multiple groups of reference points.
4. The method of claim 3, wherein determining point cloud coordinates and coordinate system transformation matrices for reference points in each group from the two point cloud data sets and the two inertial navigation data sets comprises:
acquiring, for each reference point in each group, its position coordinates in the point cloud data set corresponding to the visual image to which the reference point belongs, and taking these position coordinates as the point cloud coordinates of that reference point;
and calculating, for each of the two inertial navigation data sets, a coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system, this matrix serving as the coordinate system transformation matrix of every reference point contained in the visual image of the point cloud data set acquired at the same acquisition position as that inertial navigation data set.
5. The method of claim 4, wherein calculating, for the two inertial navigation data sets, a coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system comprises:
calculating, according to the initial data, each inertial navigation data set and the coordinate conversion formula, the coordinate system transformation matrix from the combined inertial navigation coordinate system to the northeast coordinate system corresponding to that inertial navigation data set.
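A minimal sketch of such a conversion follows, assuming the combined inertial navigation reports roll/pitch/yaw, that the "northeast coordinate system" denotes a local East-North-Up (ENU) frame, and that a Z-Y-X Euler sequence applies; the actual initial data and conversion formula of a given INS are not specified by the claim.

```python
import numpy as np

def rpy_to_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Body-to-ENU rotation, assuming a Z-Y-X (yaw-pitch-roll) sequence."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def ins_to_enu(rpy: np.ndarray, position: np.ndarray) -> np.ndarray:
    """4x4 transform S (or S') from the combined-INS frame to ENU,
    as consumed by the pose parameter equation of claim 7."""
    S = np.eye(4)
    S[:3, :3] = rpy_to_matrix(*rpy)
    S[:3, 3] = position
    return S
```

The rotation order and sign conventions vary between INS products, so a real implementation would follow the device's own documentation rather than this assumed sequence.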
6. The method of claim 3, wherein determining pose calibration parameters of the laser radar and the combined inertial navigation according to the point cloud coordinates and the coordinate system transformation matrices of the multiple groups of reference points comprises:
taking the point cloud coordinates and the coordinate system transformation matrix of each group of reference points as a group of parameter variables, and generating a pose parameter equation corresponding to each group of reference points based on a pose parameter calculation formula;
and determining pose calibration parameters of the laser radar and the combined inertial navigation according to pose parameter equations corresponding to the multiple groups of reference points.
7. The method according to claim 6, characterized in that the pose parameter calculation formula is:
S·R_L·P = S′·R_L·P′;
wherein S is the coordinate system transformation matrix of the first reference point in a group of reference points; S′ is the coordinate system transformation matrix of the second reference point in the group; R_L is the pose calibration matrix containing the pose calibration parameters of the laser radar and the combined inertial navigation; P is the point cloud coordinates of the first reference point in the group; and P′ is the point cloud coordinates of the second reference point in the group.
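One way to solve this system, assuming S, S′ and R_L are 4×4 homogeneous transforms and P, P′ homogeneous 4-vectors, is to linearize it with the identity vec(S·X·P) = (Pᵀ ⊗ S)·vec(X) (column-major vec, Kronecker product ⊗): each group of reference points then contributes four linear equations in the 16 entries of R_L, and the stacked system is solved up to scale via the SVD. The claim does not prescribe this particular solver; the sketch is illustrative only.

```python
import numpy as np

def solve_pose_calibration(groups, S, S_prime):
    """Solve S @ R_L @ P == S_prime @ R_L @ P_prime for the 4x4 matrix R_L.

    groups: iterable of (P, P_prime) homogeneous 4-vectors, one pair per
    group of reference points; S, S_prime: the claim-5 transforms of the
    two acquisition positions.
    """
    rows = []
    for P, P_prime in groups:
        # vec(S X P) - vec(S' X P') = (P^T kron S - P'^T kron S') vec(X)
        rows.append(np.kron(P.reshape(1, 4), S)
                    - np.kron(P_prime.reshape(1, 4), S_prime))
    M = np.vstack(rows)                  # (4 * n_groups, 16)
    _, _, Vt = np.linalg.svd(M)
    X = Vt[-1].reshape(4, 4, order="F")  # un-vec, column-major
    return X / X[3, 3]                   # fix the free overall scale
```

Since R_L is recovered only up to scale and from noisy correspondences, several well-spread groups of reference points are needed, and in practice the upper-left 3×3 block of the result would be re-projected onto a valid rotation before use.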
8. The method of claim 1, further comprising, after determining a plurality of groups of reference points according to the visual images of the two point cloud data sets:
adjusting the data acquisition environment, re-acquiring two point cloud data sets and two inertial navigation data sets under the adjusted environment, and determining a plurality of groups of reference points again;
correspondingly, determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points comprises:
and determining pose calibration parameters of the laser radar and the combined inertial navigation according to the two point cloud data sets, the two inertial navigation data sets and the multiple groups of reference points in each data acquisition environment.
9. A surveying and mapping system, characterized by comprising a laser radar, a combined inertial navigation and a control device, the control device being connected with both the laser radar and the combined inertial navigation and comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, causing the one or more processors to implement the pose calibration method for a laser radar and a combined inertial navigation according to any one of claims 1-8.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the pose calibration method for a laser radar and a combined inertial navigation according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910995815.0A CN110686704A (en) | 2019-10-18 | 2019-10-18 | Pose calibration method, system and medium for laser radar and combined inertial navigation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110686704A true CN110686704A (en) | 2020-01-14 |
Family
ID=69113465
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910995815.0A Pending CN110686704A (en) | 2019-10-18 | 2019-10-18 | Pose calibration method, system and medium for laser radar and combined inertial navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110686704A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107796370A (en) * | 2016-08-30 | 2018-03-13 | 北京四维图新科技股份有限公司 | For obtaining the method, apparatus and mobile mapping system of conversion parameter |
KR20190086960A (en) * | 2018-01-15 | 2019-07-24 | 주식회사 스트리스 | System and Method for Calibration and Integration of Multi-Sensor using Feature Geometry |
CN109345596A (en) * | 2018-09-19 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Multisensor scaling method, device, computer equipment, medium and vehicle |
CN109297510A (en) * | 2018-09-27 | 2019-02-01 | 百度在线网络技术(北京)有限公司 | Relative pose scaling method, device, equipment and medium |
CN109901139A (en) * | 2018-12-28 | 2019-06-18 | 文远知行有限公司 | Lidar calibration method, device, equipment and storage medium |
CN109901138A (en) * | 2018-12-28 | 2019-06-18 | 文远知行有限公司 | Lidar calibration method, device, equipment and storage medium |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113767264A (en) * | 2020-03-05 | 2021-12-07 | 深圳市大疆创新科技有限公司 | Parameter calibration method, device, system and storage medium |
WO2021174507A1 (en) * | 2020-03-05 | 2021-09-10 | 深圳市大疆创新科技有限公司 | Parameter calibration method, device, and system, and storage medium |
CN111289957A (en) * | 2020-03-10 | 2020-06-16 | 上海高仙自动化科技发展有限公司 | External parameter calibration method and device, intelligent robot and computer readable storage medium |
CN111289957B (en) * | 2020-03-10 | 2022-08-16 | 上海高仙自动化科技发展有限公司 | External parameter calibration method and device, intelligent robot and computer readable storage medium |
CN111443337A (en) * | 2020-03-27 | 2020-07-24 | 北京航空航天大学 | Radar-IMU calibration method based on hand-eye calibration |
CN111443337B (en) * | 2020-03-27 | 2022-03-08 | 北京航空航天大学 | Radar-IMU calibration method based on hand-eye calibration |
CN111735479A (en) * | 2020-08-28 | 2020-10-02 | 中国计量大学 | A multi-sensor joint calibration device and method |
CN111735479B (en) * | 2020-08-28 | 2021-03-23 | 中国计量大学 | Multi-sensor combined calibration device and method |
CN112051590B (en) * | 2020-08-31 | 2021-06-15 | 广州文远知行科技有限公司 | Detection method and related device for laser radar and inertial measurement unit |
CN112051590A (en) * | 2020-08-31 | 2020-12-08 | 广州文远知行科技有限公司 | Detection method and related device for laser radar and inertial measurement unit |
CN112051591A (en) * | 2020-08-31 | 2020-12-08 | 广州文远知行科技有限公司 | Detection method and related device for laser radar and inertial measurement unit |
CN112051591B (en) * | 2020-08-31 | 2022-11-29 | 广州文远知行科技有限公司 | Detection method and related device for laser radar and inertial measurement unit |
CN112146682A (en) * | 2020-09-22 | 2020-12-29 | 福建牧月科技有限公司 | Sensor calibration method and device for intelligent automobile, electronic equipment and medium |
CN112362054A (en) * | 2020-11-30 | 2021-02-12 | 上海商汤临港智能科技有限公司 | Calibration method, calibration device, electronic equipment and storage medium |
CN113759349A (en) * | 2021-09-22 | 2021-12-07 | 阿波罗智能技术(北京)有限公司 | Calibration method and device for laser radar and positioning device and automatic driving vehicle |
CN113848541A (en) * | 2021-09-22 | 2021-12-28 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN113848541B (en) * | 2021-09-22 | 2022-08-26 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN113759349B (en) * | 2021-09-22 | 2022-10-04 | 阿波罗智能技术(北京)有限公司 | Calibration method and device for laser radar and positioning device and automatic driving vehicle |
WO2023226155A1 (en) * | 2022-05-24 | 2023-11-30 | 芯跳科技(广州)有限公司 | Multi-source data fusion positioning method and apparatus, device, and computer storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110686704A (en) | Pose calibration method, system and medium for laser radar and combined inertial navigation | |
JP6918885B2 (en) | Relative position / orientation orientation method, relative position / orientation orientation device, equipment and medium | |
CN110780285B (en) | Pose calibration method, system and medium for laser radar and combined inertial navigation | |
CN110764111B (en) | Conversion method, device, system and medium of radar coordinates and geodetic coordinates | |
CN109285220B (en) | Three-dimensional scene map generation method, device, equipment and storage medium | |
CN109387186B (en) | Surveying and mapping information acquisition method and device, electronic equipment and storage medium | |
US10482659B2 (en) | System and method for superimposing spatially correlated data over live real-world images | |
CN110849363B (en) | Pose calibration method, system and medium for laser radar and combined inertial navigation | |
JP2020034559A (en) | Method, apparatus, device, and storage medium for calibrating posture of dynamic obstacle | |
CN103578141A (en) | Method and device for achieving augmented reality based on three-dimensional map system | |
US9805058B2 (en) | Visibility of a point of interest based on environmental conditions | |
US20130127852A1 (en) | Methods for providing 3d building information | |
CN112652062A (en) | Point cloud map construction method, device, equipment and storage medium | |
CN106599119A (en) | Image data storage method and apparatus | |
CN113077548A (en) | Collision detection method, device, equipment and storage medium for object | |
CN110517209A (en) | Data processing method, device, system and computer readable storage medium | |
CN108090212B (en) | Method, device and equipment for showing interest points and storage medium | |
CN114187357A (en) | High-precision map production method and device, electronic equipment and storage medium | |
CN112785708A (en) | Method, equipment and storage medium for building model singleization | |
CN113496503A (en) | Point cloud data generation and real-time display method, device, equipment and medium | |
WO2024213029A1 (en) | Stakeout method and apparatus, device, and storage medium | |
US20180039715A1 (en) | System and method for facilitating an inspection process | |
CN114266876B (en) | Positioning method, visual map generation method and device | |
CN112023400A (en) | Height map generation method, device, equipment and storage medium | |
US20230281942A1 (en) | Measurement processing device, method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |