Disclosure of Invention
The present disclosure aims to provide a method and an apparatus for correcting laser point cloud data, a storage medium, and an electronic device, so as to solve the problem that laser point cloud data acquired by a lidar is distorted due to movement of the device under test.
In order to achieve the above object, according to a first aspect of embodiments of the present disclosure, the present disclosure provides a method for correcting laser point cloud data, the method including:
acquiring a plurality of pose information of equipment to be detected in a first time range, wherein the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by a laser radar in the second time range;
fitting the plurality of pose information to obtain a pose curve in the first time range;
acquiring pose information corresponding to each point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data;
determining a target conversion matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, wherein the first point cloud data are point cloud data acquired by the laser radar in the target frame at a first acquisition moment, and the target point cloud data are any point cloud data except the first point cloud data in the plurality of point cloud data;
and correcting the target point cloud data according to the target conversion matrix.
Optionally, each of the plurality of pose information comprises: position information and angle information; the pose curve includes: a position curve and an angle curve;
the fitting the plurality of pose information to obtain a pose curve in the first time range includes:
fitting position information of each of the plurality of pose information to obtain the position curve within the first time range;
fitting the angle information of each of the plurality of pose information to obtain the angle curve within the first time range.
Optionally, the fitting the plurality of pose information to obtain a pose curve in the first time range includes:
taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B spline curve;
the B-spline curve includes:
wherein u ∈ [0,1]Representing the normalized value of the first time range, C (u) representing the pose curve, PiRepresenting ith position information in N position information, r represents the times of the B spline curve, Ni,r(u) represents a basis function of the B-spline curve.
Optionally, the acquiring, according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data, the pose information corresponding to each point cloud data on the pose curve includes:
determining the position of the acquisition time corresponding to each point cloud data within the first time range by using a preset first formula;
determining pose information corresponding to each point cloud data by using a preset second formula according to the position of the acquisition time corresponding to each point cloud data in the first time range and the pose curve;
the first formula includes:
the second formula includes:
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in the first time range, t1Representing a start time, t, of said first time range2Represents the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) A basis function corresponding u representing the B-spline curvejThe value of (a).
Optionally, the determining a target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data includes:
using the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and using the pose information corresponding to the target point cloud data as an origin of a second coordinate system;
acquiring a transformation matrix of the first coordinate system and the second coordinate system;
and taking the transformation matrix of the first coordinate system and the second coordinate system as the target transformation matrix.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information;
the correcting the target point cloud data according to the target conversion matrix comprises the following steps:
correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix;
the third formula includes:
x'_m = T · x_m

wherein T represents the target transformation matrix, x_m represents the mth position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
According to a second aspect of the embodiments of the present disclosure, the present disclosure provides an apparatus for correcting laser point cloud data, the apparatus comprising:
a first acquisition module, configured to acquire a plurality of pose information of the device under test in a first time range, wherein the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by the lidar in the second time range;
the fitting module is used for fitting the plurality of pose information to acquire a pose curve in the first time range;
the second acquisition module is used for acquiring the pose information corresponding to each point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data;
the determining module is used for determining a target conversion matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, wherein the first point cloud data are point cloud data acquired by the laser radar in the target frame at a first acquisition moment, and the target point cloud data are any point cloud data except the first point cloud data in the plurality of point cloud data;
and the correction module is used for correcting the target point cloud data according to the target conversion matrix.
Optionally, each of the plurality of pose information comprises: position information and angle information; the pose curve includes: a position curve and an angle curve;
the fitting module includes:
a first fitting submodule configured to fit position information of each of the plurality of pose information to obtain the position curve within the first time range;
and the second fitting submodule is used for fitting the angle information of each pose information in the plurality of pose information to acquire the angle curve in the first time range.
Optionally, the fitting module is configured to:
taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B spline curve;
the B-spline curve includes:
wherein u ∈ [0,1]Representing the normalized value of the first time range, C (u) representing the pose curve, PiRepresenting ith position information in N position information, r represents the times of the B spline curve, Ni,r(u) represents a basis function of the B-spline curve.
Optionally, the second obtaining module includes:
the first acquisition submodule is used for determining the position of the acquisition time corresponding to each point cloud data within the first time range by using a preset first formula;
the second acquisition sub-module is used for determining pose information corresponding to each point cloud data by using a preset second formula according to the position of the acquisition time corresponding to each point cloud data in the first time range and the pose curve;
the first formula includes:
the second formula includes:
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in the first time range, t1Representing a start time, t, of said first time range2Represents the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) A basis function corresponding u representing the B-spline curvejThe value of (a).
Optionally, the determining module includes:
the first determining submodule is used for taking the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and taking the pose information corresponding to the target point cloud data as an origin of a second coordinate system;
the conversion submodule is used for acquiring a conversion matrix of the first coordinate system and the second coordinate system;
and the second determining submodule is used for taking a conversion matrix of the first coordinate system and the second coordinate system as the target conversion matrix.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information;
the correction module is used for:
correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix;
the third formula includes:
x'_m = T · x_m

wherein T represents the target transformation matrix, x_m represents the mth position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the steps of the method for correcting laser point cloud data provided by the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to realize the steps of the method for correcting the laser point cloud data provided by the first aspect.
With the above technical solution, to correct the plurality of point cloud data in a target frame, a plurality of pose information of the device under test in a first time range is first acquired, the first time range including a second time range corresponding to the plurality of point cloud data. The plurality of pose information is then fitted to obtain a pose curve in the first time range, and the pose information corresponding to each point cloud data is obtained on the pose curve according to the acquisition time corresponding to that point cloud data. Next, a target transformation matrix between the pose information corresponding to first point cloud data and the pose information corresponding to target point cloud data is determined, the first point cloud data being the point cloud data acquired by the lidar in the target frame at a first acquisition time, and the target point cloud data being any point cloud data in the plurality of point cloud data other than the first point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own pose information, the accuracy of the point cloud data is improved, which in turn guarantees the accuracy of lidar mapping and positioning.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before introducing the method, apparatus, storage medium, and electronic device for correcting laser point cloud data provided by the present disclosure, an application scenario involved in the various embodiments of the present disclosure is first introduced. The application scenario is a device under test equipped with a lidar: the lidar emits laser beams at a certain acquisition frequency to obtain point cloud data, and the device under test may be a robot, a vehicle, or similar equipment.
Fig. 1 is a flowchart illustrating a method for correcting laser point cloud data according to an exemplary embodiment. As shown in fig. 1, the method comprises the steps of:
in step 101, a plurality of pose information of the device to be measured in a first time range is obtained, the first time range includes a second time range, and the target frame includes a plurality of point cloud data acquired by the laser radar in the second time range.
For example, the lidar acquires a plurality of frames during measurement, and each frame includes a plurality of point cloud data acquired by the lidar within the second time range. Take correcting the plurality of point cloud data in a target frame as an example, the target frame being any one of the plurality of frames. First, a plurality of pose information of the device under test in a first time range is acquired. The pose information may be collected by sensors provided on the device under test, such as an odometer or an IMU (Inertial Measurement Unit). The acquisition frequency of the pose information (for example, 100 Hz) is usually much higher than that of the lidar (for example, 10 Hz), so the amount of pose information is much larger than the amount of point cloud data in the target frame. The pose information may include position information and attitude information: the position information may be three coordinate values in a three-dimensional coordinate system, and the attitude information may include the angles formed with the three coordinate axes of that coordinate system. The device under test may be a device that uses a lidar to sense and position itself in the environment, such as a robot or an autonomous vehicle. The first time range is the time range over which the device under test acquires the plurality of pose information, and it includes the second time range. The target frame is one frame among the point cloud data collected by the device under test, and the second time range is the time range over which the target frame's plurality of point cloud data is collected. For example, if the start time of the point cloud data of the target frame is t_s and the end time is t_e, the second time range is the range from t_s to t_e, and the target frame includes the plurality of point cloud data collected by the lidar between t_s and t_e. The first time range may then be any time range that contains t_s to t_e, for example t_o to t_p, where t_o is a time before t_s and t_p is a time after t_e.
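The windowing described above can be sketched in a few lines. This is an illustrative helper (the name `select_pose_window` and the fixed margin are assumptions, not part of the disclosure): given the dense pose timestamps and the frame's second time range [t_s, t_e], it keeps the pose samples inside an enclosing first time range [t_o, t_p].

```python
# Hypothetical sketch: select the pose samples used for curve fitting.
# The pose stream (e.g. 100 Hz odometry/IMU) is much denser than the lidar
# frames (e.g. 10 Hz), so the first time range [t_o, t_p] is chosen to
# enclose the frame's second time range [t_s, t_e].
def select_pose_window(pose_times, t_s, t_e, margin=0.05):
    """Return indices of pose samples whose timestamps fall in
    [t_s - margin, t_e + margin] (an assumed enclosing first time range)."""
    t_o, t_p = t_s - margin, t_e + margin
    return [i for i, t in enumerate(pose_times) if t_o <= t <= t_p]

# 100 Hz pose timestamps over one second; the frame spans 0.30 s .. 0.40 s
times = [i / 100.0 for i in range(100)]
idx = select_pose_window(times, 0.30, 0.40)
```

The margin simply guarantees that pose samples exist on both sides of the frame, so the fitted curve is well supported at t_s and t_e.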
In step 102, the pose information is fitted to obtain a pose curve in a first time range.
In step 103, pose information corresponding to each point cloud data on the pose curve is obtained according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data.
For example, after obtaining the pose information of the device under test in the first time range, the pose information may be stored in a queue in chronological order. If the pose information at an arbitrary time is needed, it can be obtained by interpolating the stored pose information (that is, the pose information at a given time is estimated from the pose information around that time). Therefore, by fitting the plurality of pose information acquired by the device under test, a pose curve over the first time range can be obtained, and the pose information at any time on the pose curve can be read off. After the pose curve is obtained, the pose information corresponding to each point cloud data is obtained on the pose curve according to the acquisition time corresponding to that point cloud data. For example, within the first time range, the device under test stores the acquired pose information in chronological order in a queue Q = {P_1, P_2, …, P_i, …, P_n}, where P_i is the ith pose information. The pose information in the queue Q is fitted to obtain the pose curve, and the acquisition time corresponding to each point cloud data is looked up on the pose curve to obtain the pose information corresponding to that point cloud data.
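The queue-plus-interpolation idea above can be sketched as follows; linear interpolation is used here as a minimal stand-in (the disclosure's actual fit is a B-spline), and the function name is illustrative only.

```python
import bisect

def interpolate_pose(times, poses, t):
    """Linearly interpolate a pose sample (here only the position part) at
    time t from a chronologically ordered queue of timestamps and poses.
    Linear interpolation is a simplification of the B-spline fit."""
    k = bisect.bisect_right(times, t)
    if k == 0:                      # before the first sample: clamp
        return list(poses[0])
    if k == len(times):             # after the last sample: clamp
        return list(poses[-1])
    t0, t1 = times[k - 1], times[k]
    w = (t - t0) / (t1 - t0)        # fraction of the way from t0 to t1
    return [a + w * (b - a) for a, b in zip(poses[k - 1], poses[k])]

times = [0.0, 0.1, 0.2]
poses = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
p = interpolate_pose(times, poses, 0.05)  # halfway between the first two samples
```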
In step 104, a target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data is determined, the first point cloud data is point cloud data acquired by the laser radar in the target frame at a first acquisition time, and the target point cloud data is any point cloud data except the first point cloud data in the plurality of point cloud data.
In step 105, the target point cloud data is modified according to the target transformation matrix.
In an example, after the pose curve is obtained, the pose information corresponding to first point cloud data and the pose information corresponding to target point cloud data are obtained on the pose curve. The first point cloud data is the point cloud data acquired by the lidar in the target frame at the first acquisition time, and the target point cloud data is any one of the plurality of point cloud data other than the first point cloud data; correcting the target point cloud data thus corrects the plurality of point cloud data in the target frame. According to the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, a target transformation matrix between the two is determined; that is, for each point cloud data other than the first point cloud data, a target transformation matrix of its pose information relative to the pose information corresponding to the first point cloud data is determined. If the target frame includes M point cloud data, the number of target transformation matrices is M − 1. Finally, the target point cloud data is corrected using the target transformation matrix: the target transformation matrix converts the pose information corresponding to the target point cloud data, expressed in its own coordinate system, into the coordinate system of the pose information corresponding to the first point cloud data, so that the pose information corresponding to all the point cloud data is expressed in the same coordinate system.
In summary, in the present disclosure, to correct the plurality of point cloud data in a target frame, a plurality of pose information of the device under test in a first time range is acquired, the first time range including a second time range corresponding to the plurality of point cloud data. The plurality of pose information is fitted to obtain a pose curve in the first time range, and the pose information corresponding to each point cloud data is obtained on the pose curve according to the acquisition time corresponding to that point cloud data. A target transformation matrix between the pose information corresponding to first point cloud data and the pose information corresponding to target point cloud data is then determined, the first point cloud data being the point cloud data acquired by the lidar in the target frame at the first acquisition time, and the target point cloud data being any point cloud data other than the first point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own pose information, the accuracy of the point cloud data is improved, which in turn guarantees the accuracy of lidar mapping and positioning.
Fig. 2 is a flow chart illustrating one step 102 of the embodiment shown in fig. 1. As shown in fig. 2, each of the plurality of pose information includes: position information and angle information. The pose curve includes: a position curve and an angle curve.
Step 102 comprises the steps of:
in step 1021, the position information of each of the plurality of pose information is fitted to obtain a position curve within the first time range.
In step 1022, the angle information of each pose information of the plurality of pose information is fitted to obtain an angle curve in the first time range.
Illustratively, each of the plurality of pose information includes position information and angle information. The position information may be the coordinate values of the device under test in a map (for example, the three coordinate values x, y, and z in a three-dimensional coordinate system), and the angle information may be the orientation of the device under test (for example, the angles between the device under test and the three coordinate axes of the three-dimensional coordinate system). Fitting the plurality of pose information may be divided into fitting the position information of each pose information and fitting the angle information of each pose information, so as to obtain a position curve and an angle curve within the first time range, respectively. For example, the pose information includes position information (x, y, z) and angle information (α, β, γ), where x, y, and z are the three coordinate values corresponding to the X, Y, and Z coordinate axes of the three-dimensional coordinate system, and α, β, and γ are the angles between the device under test and the X, Y, and Z coordinate axes. Fitting the position information may be further divided into fitting the three coordinate values x, y, and z separately to obtain a position curve for each, and likewise fitting the angle information may be divided into fitting the three angles α, β, and γ separately to obtain an angle curve for each.
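The per-channel fitting described above can be sketched as follows. A low-order polynomial fit stands in here for the B-spline fit (an assumption made only to keep the sketch short); the point is that the six pose channels x, y, z, α, β, γ are each fitted independently over the normalized first time range.

```python
import numpy as np

def fit_pose_channels(u, poses, deg=3):
    """Fit each pose channel [x, y, z, alpha, beta, gamma] independently.
    poses: (n, 6) array of samples at normalized times u; returns one
    polynomial coefficient vector per channel (stand-in for a B-spline)."""
    poses = np.asarray(poses, dtype=float)
    return [np.polyfit(u, poses[:, c], deg) for c in range(poses.shape[1])]

def eval_pose(coeffs, u):
    """Evaluate all six fitted channels at normalized time u."""
    return [float(np.polyval(c, u)) for c in coeffs]

u = np.linspace(0.0, 1.0, 11)
poses = np.stack([u, 2 * u, np.zeros_like(u),          # x, y, z
                  0.1 * u, np.zeros_like(u), np.zeros_like(u)],  # alpha, beta, gamma
                 axis=1)
p_mid = eval_pose(fit_pose_channels(u, poses), 0.5)    # pose at mid-range
```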
Optionally, the implementation manner of step 102 may be:
and taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B spline curve.
The B-spline curve includes:

C(u) = Σ_{i=0}^{N-1} N_{i,r}(u) · P_i

wherein u ∈ [0,1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the ith pose information among the N pieces of pose information, r represents the degree of the B-spline curve, and N_{i,r}(u) represents a basis function of the B-spline curve.
For example, the plurality of pose information may be fitted by taking them as control points and obtaining the pose curve with a preset B-spline curve, which may be a quadratic B-spline curve or a cubic B-spline curve. The basis function N_{i,r}(u) of the B-spline curve is given by the Cox-de Boor recursion:

N_{i,0}(u) = 1 if u_i ≤ u < u_{i+1}, and 0 otherwise;

N_{i,r}(u) = ((u − u_i) / (u_{i+r} − u_i)) · N_{i,r−1}(u) + ((u_{i+r+1} − u) / (u_{i+r+1} − u_{i+1})) · N_{i+1,r−1}(u)

wherein u_i represents the position corresponding to the ith control point (pose information) in the first time range, u_{i+1} represents the position corresponding to the (i+1)th control point, u_{i+r} represents the position corresponding to the (i+r)th control point, and u_{i+r+1} represents the position corresponding to the (i+r+1)th control point.
Taking fitting the plurality of pose information with a cubic B-spline curve as an example, the basis functions are the N_{i,3}(u) obtained from the above recursion with r = 3, and the cubic B-spline curve is C(u) = Σ_{i=0}^{N-1} N_{i,3}(u) · P_i.
fig. 3 is a flow chart illustrating one step 103 of the embodiment shown in fig. 1. As shown in fig. 3, step 103 includes the following steps:
in step 1031, the position of the acquisition time corresponding to each point cloud data within the first time range is determined by using a preset first formula.
In step 1032, the pose information corresponding to each point cloud data is determined by using a preset second formula according to the position and pose curve of the acquisition time corresponding to each point cloud data within the first time range.
The first formula includes:
the second formula includes:
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in a first time range, t1Representing the start time, t, of a first time range2Denotes the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) Corresponding u of base function representing B spline curvejThe value of (a).
In an example, after the device under test obtains the pose curve within the first time range, it determines the position, within the first time range, of the acquisition time corresponding to each point cloud data by using the preset first formula; from that position, the corresponding point on the pose curve, and hence the corresponding pose information, can be determined. The first formula includes:

u_j = (t_j − t_1) / (t_2 − t_1)

so that u_j = 0 when t_j = t_1, and u_j = 1 when t_j = t_2. The device under test then determines the pose information corresponding to each point cloud data by using the preset second formula, according to the position of the acquisition time within the first time range and the pose curve. Taking fitting the plurality of pose information with a cubic B-spline curve as an example, if the start time t_1 of the first time range is 0, the end time t_2 of the first time range is 10 s, and the acquisition time t_j corresponding to the jth point cloud data is 1 s, then the position of t_j in the first time range is u_j = (1 − 0) / (10 − 0) = 0.1, and the pose information corresponding to the jth point cloud data is C(0.1).
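The first formula and the worked example above can be expressed directly; the function name is illustrative.

```python
def normalize_time(t_j, t_1, t_2):
    """First formula: u_j = (t_j - t_1) / (t_2 - t_1), so u_j = 0 at the
    start of the first time range and u_j = 1 at its end."""
    return (t_j - t_1) / (t_2 - t_1)

# worked example: t_1 = 0, t_2 = 10 s, t_j = 1 s  ->  u_j = 0.1
u = normalize_time(1.0, 0.0, 10.0)
```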
Fig. 4 is a flow chart illustrating one step 104 of the embodiment shown in fig. 1. As shown in fig. 4, step 104 includes the steps of:
in step 1041, the pose information corresponding to the first point cloud data is used as the origin of the first coordinate system, and the pose information corresponding to the target point cloud data is used as the origin of the second coordinate system.
In step 1042, a transformation matrix of the first coordinate system and the second coordinate system is obtained.
In step 1043, the transformation matrix of the first coordinate system and the second coordinate system is used as the target transformation matrix.
For example, to determine the target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, the pose information corresponding to the first point cloud data may be used as the origin of a first coordinate system, and the pose information corresponding to the target point cloud data as the origin of a second coordinate system. That is, a coordinate system with the pose information corresponding to the first point cloud data as its origin and a coordinate system with the pose information corresponding to the target point cloud data as its origin are established, and the transformation matrix between these two coordinate systems is the target transformation matrix. The target transformation matrix may be a 4 × 4 matrix: the 3 × 3 matrix in its upper-left corner may represent the angular relationship between the first coordinate system and the second coordinate system, the first three entries of its last column may represent the coordinate relationship between the two coordinate systems, and its last row is {0, 0, 0, 1}. Taking the pose information corresponding to the first point cloud data as {x_o, y_o, z_o, α_o, β_o, γ_o} = {0, 0, 0, 0, 0, 0} and the pose information corresponding to the target point cloud data as {x_j, y_j, z_j, α_j, β_j, γ_j} = {0.2, 0.1, 0, 0, 0, 0.1} as an example, where {x_o, y_o, z_o} and {x_j, y_j, z_j} are the position information in the pose information and {α_o, β_o, γ_o} and {α_j, β_j, γ_j} are the angle information in the pose information, the target transformation matrix can be obtained accordingly.
and the 3 x 3 matrix at the upper left corner of the target conversion matrix is an angle conversion value between coordinate axes of the two coordinate systems, and the 3 x 1 matrix at the upper right corner is a coordinate conversion value between the two coordinate systems.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information.
The implementation of step 105 may be:
and correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix.
The third formula includes:

x'_m = T · x_m

where T represents the target transformation matrix, x_m represents the mth position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
For example, each point cloud data of the plurality of point cloud data may include a plurality of position information. The position information may include, for example, the distance and orientation between the lidar and the reflection point of an emitted laser beam; the distance between the lidar and the reflection point may be represented by the three coordinate values x, y, and z in a three-dimensional coordinate system, and the orientation between the lidar and the reflection point may be expressed in degrees. The device under test corrects each position information in the target point cloud data by using the preset third formula according to the target transformation matrix. For example, the plurality of point cloud data is {X_1, X_2, X_3, …, X_j, …}, where X_j is the target point cloud data and includes {x_1, x_2, x_3, …, x_m, …}, x_m being the mth position information in X_j. With x_m = {0.5, 0.2, 0}^T and the target transformation matrix T obtained above, the corrected position information x'_m = T · x_m can be obtained as {0.498, 0.799, 0}^T.
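Applying the third formula to a 4 × 4 target transformation matrix requires expressing each 3-vector position in homogeneous coordinates. A minimal sketch follows; the matrix values below (yaw 0.1 rad, translation (0.2, 0.1, 0)) are assumed for illustration and are not the matrix from the example above, so the numeric result is checked only by a round-trip rather than against the example's figures.

```python
import math
import numpy as np

def correct_points(T, points):
    """Third formula x'_m = T * x_m: apply the 4x4 target transformation
    matrix to each position in homogeneous coordinates (x, y, z, 1)."""
    pts = np.asarray(points, dtype=float)
    hom = np.hstack([pts, np.ones((len(pts), 1))])  # append the homogeneous 1
    return (hom @ T.T)[:, :3]                       # drop it again after mapping

# assumed example transform: yaw 0.1 rad, translation (0.2, 0.1, 0)
c, s = math.cos(0.1), math.sin(0.1)
T = np.array([[c, -s, 0.0, 0.2],
              [s,  c, 0.0, 0.1],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
corrected = correct_points(T, [[0.5, 0.2, 0.0]])
```

Mapping back with the inverse transform recovers the original point, which is a convenient sanity check on the matrix construction.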
Fig. 5 is a block diagram illustrating a correction apparatus for laser point cloud data according to an exemplary embodiment. As shown in fig. 5, the apparatus 200 includes:
the first obtaining module 201 is configured to obtain multiple pose information of the device to be measured in a first time range, where the first time range includes a second time range, and the target frame includes multiple point cloud data acquired by the laser radar in the second time range.
The fitting module 202 is configured to fit the pose information to obtain a pose curve in a first time range.
The second obtaining module 203 is configured to obtain pose information corresponding to each point cloud data on the pose curve according to a collection time corresponding to each point cloud data in the plurality of point cloud data.
The determining module 204 is configured to determine a target transformation matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, where the first point cloud data is point cloud data acquired by a laser radar in a target frame at a first acquisition time, and the target point cloud data is any one of the plurality of point cloud data except the first point cloud data.
The correction module 205 is configured to correct the target point cloud data according to the target conversion matrix.
FIG. 6 is a block diagram of one type of fitting module shown in the embodiment of FIG. 5. As shown in fig. 6, each of the plurality of pose information includes: position information and angle information. The pose curve includes: a position curve and an angle curve.
The fitting module 202 includes:
the first fitting submodule 2021 is configured to fit the position information of each of the plurality of pose information to obtain a position curve in the first time range.
The second fitting submodule 2022 is configured to fit the angle information of each pose information of the plurality of pose information to obtain an angle curve in the first time range.
Optionally, the fitting module 202 is configured to:
take the plurality of pose information as a plurality of control points, and obtain the pose curve by using a preset B-spline curve.
The B-spline curve includes:
C(u) = Σ (from i = 0 to N − 1) N_i,r(u) · P_i
wherein u ∈ [0, 1] represents the value obtained after normalizing the first time range, C(u) represents the pose curve, P_i represents the i-th pose information in the N pose information, r represents the degree of the B-spline curve, and N_i,r(u) represents the basis function of the B-spline curve.
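A minimal sketch of evaluating such a curve C(u) = Σ N_i,r(u) · P_i with the Cox–de Boor recursion; the control values P below are hypothetical 1-D positions rather than full pose information, and a clamped knot vector is assumed:

```python
def basis(i, r, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,r}(u)."""
    if r == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + r] != knots[i]:
        left = (u - knots[i]) / (knots[i + r] - knots[i]) * basis(i, r - 1, u, knots)
    if knots[i + r + 1] != knots[i + 1]:
        right = (knots[i + r + 1] - u) / (knots[i + r + 1] - knots[i + 1]) * basis(i + 1, r - 1, u, knots)
    return left + right

def curve(u, P, r, knots):
    """Evaluate C(u) = sum_i N_{i,r}(u) * P_i over all control points."""
    return sum(basis(i, r, u, knots) * P[i] for i in range(len(P)))

r = 3                                        # degree of the B-spline curve
P = [0.0, 1.0, 2.0, 3.0]                     # hypothetical 1-D control values (N = 4)
knots = [0.0] * (r + 1) + [1.0] * (r + 1)    # clamped knot vector, length N + r + 1

print(curve(0.0, P, r, knots))  # curve starts at P_0
print(curve(0.5, P, r, knots))  # value at mid-curve
```

With four control points and degree three, this clamped curve coincides with a cubic Bezier curve, which makes the mid-curve value easy to check by hand.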
Fig. 7 is a block diagram of a second acquisition module shown in the embodiment of fig. 5. As shown in fig. 7, the second obtaining module 203 includes:
the first obtaining sub-module 2031 is configured to determine, by using a preset first formula, a position of an acquisition time corresponding to each point cloud data within a first time range.
The second obtaining sub-module 2032 is configured to determine, according to the position and the pose curve of the acquisition time corresponding to each point cloud data within the first time range, pose information corresponding to each point cloud data by using a preset second formula.
The first formula includes:
u_j = (t_j − t1) / (t2 − t1)
The second formula includes:
C(u_j) = Σ (from i = 0 to N − 1) N_i,r(u_j) · P_i
wherein t_j represents the acquisition time corresponding to the j-th point cloud data, u_j represents the position of t_j in the first time range, t1 represents the start time of the first time range, t2 represents the end time of the first time range, C(u_j) represents the pose information corresponding to the j-th point cloud data, and N_i,r(u_j) represents the value of the basis function of the B-spline curve at u_j.
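The normalization in the first formula can be sketched directly; the start time, end time, and acquisition times below are hypothetical:

```python
def normalize_time(t_j, t1, t2):
    """First formula: map acquisition time t_j into [0, 1] over the first time range."""
    return (t_j - t1) / (t2 - t1)

# Hypothetical first time range and acquisition times (seconds).
t1, t2 = 100.00, 100.10
for t_j in (100.00, 100.05, 100.10):
    print(normalize_time(t_j, t1, t2))
```

The resulting u_j is then fed into the B-spline basis functions to read the pose for that point off the fitted curve.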
FIG. 8 is a block diagram of one type of determination module shown in the embodiment shown in FIG. 5. As shown in fig. 8, the determining module 204 includes:
the first determining submodule 2041 is configured to use the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and use the pose information corresponding to the target point cloud data as an origin of a second coordinate system.
The converting submodule 2042 is configured to obtain a conversion matrix of the first coordinate system and the second coordinate system.
The second determining submodule 2043 is configured to use a transformation matrix of the first coordinate system and the second coordinate system as a target transformation matrix.
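The two submodules above can be sketched with homogeneous 4x4 pose matrices. The poses below are hypothetical, and the convention used (the target transformation maps points expressed in the second coordinate system into the first coordinate system) is one common choice, not necessarily the only one intended by the disclosure:

```python
import numpy as np

def pose_matrix(x, y, yaw):
    """Homogeneous 4x4 pose (planar motion for brevity): rotation about z plus translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([
        [c,   -s,   0.0, x],
        [s,    c,   0.0, y],
        [0.0,  0.0, 1.0, 0.0],
        [0.0,  0.0, 0.0, 1.0],
    ])

# Hypothetical poses: origin of the first coordinate system (first point cloud
# data) and origin of the second coordinate system (target point cloud data).
T_first = pose_matrix(1.0, 0.0, 0.0)
T_target = pose_matrix(1.5, 0.0, 0.0)

# Target transformation matrix: maps points from the second coordinate system
# into the first coordinate system.
T = np.linalg.inv(T_first) @ T_target

p = np.array([0.0, 0.0, 0.0, 1.0])  # sensor origin at the target acquisition time
print(T @ p)
```

For rigid poses the inverse can also be formed in closed form from the transposed rotation and negated translation; `np.linalg.inv` is used here only to keep the sketch short.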
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information.
The correction module 205 is configured to:
and correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix.
The third formula includes:
x'_m = T · x_m
wherein T represents the target transformation matrix, x_m represents the m-th position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 9 is a block diagram illustrating an electronic device 300 in accordance with an example embodiment. As shown in fig. 9, the electronic device 300 may include: a processor 301 and a memory 302. The electronic device 300 may also include one or more of a multimedia component 303, an input/output (I/O) interface 304, and a communication component 305.
The processor 301 is configured to control the overall operation of the electronic device 300, so as to complete all or part of the steps in the above method for correcting the laser point cloud data. The memory 302 is used to store various types of data to support operation at the electronic device 300, such as instructions for any application or method operating on the electronic device 300 and application-related data, such as contact data, transmitted and received messages, pictures, audio, video, and the like. The memory 302 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia components 303 may include a screen and an audio component, where the screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signal may further be stored in the memory 302 or transmitted through the communication component 305. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 304 provides an interface between the processor 301 and other interface modules, such as a keyboard, a mouse, or buttons, which may be virtual buttons or physical buttons. The communication component 305 is used for wired or wireless communication between the electronic device 300 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them; accordingly, the communication component 305 may include a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the electronic Device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, and is used for executing the above-mentioned method for correcting the laser point cloud data.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions, which when executed by a processor, implement the steps of the above-described method for correcting laser point cloud data. For example, the computer readable storage medium may be the memory 302 including the program instructions, which are executable by the processor 301 of the electronic device 300 to perform the method for correcting the laser point cloud data.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that, in the foregoing embodiments, various features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various combinations that are possible in the present disclosure are not described again.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.