
CN109613543B - Method and device for correcting laser point cloud data, storage medium and electronic equipment


Info

Publication number: CN109613543B
Application number: CN201811489855.XA
Authority: CN (China)
Prior art keywords: point cloud, cloud data, pose, curve, target
Other languages: Chinese (zh)
Other versions: CN109613543A
Inventor: 李连中
Current assignee: Cloudminds Shanghai Robotics Co Ltd
Original assignee: Cloudminds Shenzhen Robotics Systems Co Ltd
Application filed by Cloudminds Shenzhen Robotics Systems Co Ltd; priority to CN201811489855.XA
Publication of application CN109613543A; application granted; publication of grant CN109613543B
Legal status: Active


Classifications

    • G: Physics
    • G01: Measuring; Testing
    • G01S: Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure relates to a method, an apparatus, a storage medium, and an electronic device for correcting laser point cloud data. The method includes: acquiring a plurality of pose information of a device under test within a first time range; fitting the plurality of pose information to obtain a pose curve over the first time range; acquiring, on the pose curve, the pose information corresponding to each point cloud data according to the acquisition time of each of a plurality of point cloud data; determining a target transformation matrix between the pose information corresponding to first point cloud data and the pose information corresponding to target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at a first acquisition time in a target frame and the target point cloud data is any point cloud data other than the first point cloud data; and correcting the target point cloud data according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved.

Description

Method and device for correcting laser point cloud data, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of laser technologies, and in particular, to a method and an apparatus for correcting laser point cloud data, a storage medium, and an electronic device.
Background
With the development of electronic technology, lidar has come into wide use. For example, lidar can serve as a sensing device in the field of robot measurement, offering high measurement precision and strong resistance to environmental interference. Through laser scanning, a lidar can quickly build a three-dimensional model of the robot's surroundings, providing basic data for operations such as high-precision map building, obstacle recognition, and accurate robot positioning. Because the acquisition frequency of a lidar is relatively low, a frame of laser point cloud data is not acquired instantaneously; that is, one frame comprises a plurality of point cloud data acquired by the lidar over a period of time. Since the robot is usually in motion, the point cloud data within one frame are not expressed in the same coordinate system, i.e., the laser point cloud data is distorted, and building a map or positioning directly from the acquired laser point cloud data introduces measurement errors and reduces accuracy.
Disclosure of Invention
The invention aims to provide a method and a device for correcting laser point cloud data, a storage medium and electronic equipment, which are used for solving the problem that the laser point cloud data acquired by a laser radar is distorted due to the movement of equipment to be detected.
In order to achieve the above object, according to a first aspect of embodiments of the present disclosure, the present disclosure provides a method for correcting laser point cloud data, the method including:
acquiring a plurality of pose information of equipment to be detected in a first time range, wherein the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by a laser radar in the second time range;
fitting the plurality of pose information to obtain a pose curve in the first time range;
acquiring pose information corresponding to each point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data;
determining a target conversion matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, wherein the first point cloud data are point cloud data acquired by the laser radar in the target frame at a first acquisition moment, and the target point cloud data are any point cloud data except the first point cloud data in the plurality of point cloud data;
and correcting the target point cloud data according to the target conversion matrix.
Optionally, each of the plurality of pose information comprises: position information and angle information; the pose curve includes: a position curve and an angle curve;
the fitting the plurality of pose information to obtain a pose curve in the first time range includes:
fitting position information of each of the plurality of pose information to obtain the position curve within the first time range;
fitting the angle information of each of the plurality of pose information to obtain the angle curve within the first time range.
Optionally, the fitting the plurality of pose information to obtain a pose curve in the first time range includes:
taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B-spline curve;
the B-spline curve includes:
C(u) = \sum_{i=0}^{N-1} N_{i,r}(u) P_i, \quad u \in [0, 1]

wherein u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th pose information among the N pieces of pose information serving as control points, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
Optionally, the acquiring, according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data, the pose information corresponding to each point cloud data on the pose curve includes:
determining the position of the acquisition time corresponding to each point cloud data within the first time range by using a preset first formula;
determining pose information corresponding to each point cloud data by using a preset second formula according to the position of the acquisition time corresponding to each point cloud data in the first time range and the pose curve;
the first formula includes:
Figure BDA0001895423220000031
the second formula includes:
Figure BDA0001895423220000032
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in the first time range, t1Representing a start time, t, of said first time range2Represents the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) A basis function corresponding u representing the B-spline curvejThe value of (a).
Optionally, the determining a target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data includes:
using the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and using the pose information corresponding to the target point cloud data as an origin of a second coordinate system;
acquiring a transformation matrix of the first coordinate system and the second coordinate system;
and taking the transformation matrix of the first coordinate system and the second coordinate system as the target transformation matrix.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information;
the correcting the target point cloud data according to the target conversion matrix comprises the following steps:
correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix;
the third formula includes:
x'_m = T \cdot x_m

wherein T represents the target transformation matrix, x_m represents the m-th position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
According to a second aspect of the embodiments of the present disclosure, the present disclosure provides an apparatus for correcting laser point cloud data, the apparatus comprising:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a plurality of pose information of the device to be detected in a first time range, the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by the laser radar in the second time range;
the fitting module is used for fitting the plurality of pose information to acquire a pose curve in the first time range;
the second acquisition module is used for acquiring the pose information corresponding to each point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data;
the determining module is used for determining a target conversion matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, wherein the first point cloud data are point cloud data acquired by the laser radar in the target frame at a first acquisition moment, and the target point cloud data are any point cloud data except the first point cloud data in the plurality of point cloud data;
and the correction module is used for correcting the target point cloud data according to the target conversion matrix.
Optionally, each of the plurality of pose information comprises: position information and angle information; the pose curve includes: a position curve and an angle curve;
the fitting module includes:
a first fitting submodule configured to fit position information of each of the plurality of pose information to obtain the position curve within the first time range;
and the second fitting submodule is used for fitting the angle information of each pose information in the plurality of pose information to acquire the angle curve in the first time range.
Optionally, the fitting module is configured to:
taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B-spline curve;
the B-spline curve includes:
C(u) = \sum_{i=0}^{N-1} N_{i,r}(u) P_i, \quad u \in [0, 1]

wherein u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th pose information among the N pieces of pose information serving as control points, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
Optionally, the second obtaining module includes:
the first acquisition submodule is used for determining the position of the acquisition time corresponding to each point cloud data within the first time range by using a preset first formula;
the second acquisition sub-module is used for determining pose information corresponding to each point cloud data by using a preset second formula according to the position of the acquisition time corresponding to each point cloud data in the first time range and the pose curve;
the first formula includes:
Figure BDA0001895423220000052
the second formula includes:
Figure BDA0001895423220000053
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in the first time range, t1Representing a start time, t, of said first time range2Represents the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) A basis function corresponding u representing the B-spline curvejThe value of (a).
Optionally, the determining module includes:
the first determining submodule is used for taking the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and taking the pose information corresponding to the target point cloud data as an origin of a second coordinate system;
the conversion submodule is used for acquiring a conversion matrix of the first coordinate system and the second coordinate system;
and the second determining submodule is used for taking a conversion matrix of the first coordinate system and the second coordinate system as the target conversion matrix.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information;
the correction module is used for:
correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix;
the third formula includes:
x'_m = T \cdot x_m

wherein T represents the target transformation matrix, x_m represents the m-th position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the steps of the method for correcting laser point cloud data provided by the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to realize the steps of the method for correcting the laser point cloud data provided by the first aspect.
By the above technical scheme, to correct the plurality of point cloud data in the target frame, a plurality of pose information of the device under test within a first time range is first acquired, where the first time range contains the second time range corresponding to the plurality of point cloud data. The plurality of pose information is then fitted to obtain a pose curve over the first time range, and the pose information corresponding to each point cloud data is acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. A target transformation matrix is then determined between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any point cloud data other than the first point cloud data among the plurality of point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved, and the accuracy of lidar mapping and positioning is guaranteed.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a flowchart illustrating a method for correcting laser point cloud data according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating one step 102 of the embodiment shown in fig. 1.
Fig. 3 is a flow chart illustrating one step 103 of the embodiment shown in fig. 1.
Fig. 4 is a flow chart illustrating one step 104 of the embodiment shown in fig. 1.
Fig. 5 is a block diagram illustrating a correction apparatus for laser point cloud data according to an exemplary embodiment.
FIG. 6 is a block diagram of one type of fitting module shown in the embodiment of FIG. 5.
Fig. 7 is a block diagram of a second acquisition module shown in the embodiment of fig. 5.
FIG. 8 is a block diagram of one type of determination module shown in the embodiment shown in FIG. 5.
FIG. 9 is a block diagram of an electronic device provided in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before introducing the method, the apparatus, the storage medium, and the electronic device for correcting laser point cloud data provided by the present disclosure, an application scenario involved in the various embodiments of the present disclosure is first introduced. The application scenario is a device under test equipped with a lidar; the lidar emits laser beams at a certain acquisition frequency to obtain point cloud data, and the device under test may be a robot, a vehicle, or other equipment.
Fig. 1 is a flowchart illustrating a method for correcting laser point cloud data according to an exemplary embodiment. As shown in fig. 1, the method comprises the steps of:
in step 101, a plurality of pose information of the device to be measured in a first time range is obtained, the first time range includes a second time range, and the target frame includes a plurality of point cloud data acquired by the laser radar in the second time range.
For example, the lidar acquires a plurality of frames during the measurement, and each frame includes a plurality of point cloud data acquired by the lidar over a period of time. Taking the correction of the plurality of point cloud data in the target frame as an example, the target frame is any one of the plurality of frames. First, a plurality of pose information of the device under test within a first time range is acquired; the pose information may be collected by sensors provided on the device under test, such as an odometer and an IMU (Inertial Measurement Unit). The acquisition frequency of the pose information (for example, 100 Hz) is usually much greater than that of the lidar (for example, 10 Hz), so the amount of pose information is much greater than the amount of point cloud data in the target frame. The pose information may include position information and attitude information: the position information may be the three coordinate values in a three-dimensional coordinate system, and the attitude information may include the angles with the three coordinate axes of the three-dimensional coordinate system. The device under test may be a device that uses a lidar to sense the environment and position itself, such as a robot or an autonomous vehicle. The first time range is the time range over which the device under test acquires the plurality of pose information, and the first time range includes the second time range. The target frame is one frame among the point cloud data collected by the device under test, and the second time range is the time range over which the plurality of point cloud data of the target frame is collected. For example, if the start time of the point cloud data of the target frame is t_s and the end time is t_e, the second time range is the time range from t_s to t_e, and the target frame includes the plurality of point cloud data collected by the lidar between t_s and t_e. The first time range may then be any time range containing t_s to t_e, for example t_o to t_p, where t_o is a time before t_s and t_p is a time after t_e.
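As an illustration of this step, the following minimal sketch (Python assumed; names such as PoseSample and PoseBuffer are hypothetical, not from the patent) buffers timestamped odometer/IMU pose samples so that the window covering one lidar frame can later be extracted for curve fitting:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float      # timestamp in seconds
    x: float      # position
    y: float
    z: float
    roll: float   # angles in radians
    pitch: float
    yaw: float

class PoseBuffer:
    """Keeps pose samples in time order so that the window [t1, t2] covering
    one lidar frame can be extracted for curve fitting."""

    def __init__(self, maxlen: int = 1000):
        self.samples = deque(maxlen=maxlen)

    def push(self, sample: PoseSample) -> None:
        self.samples.append(sample)  # sensor callbacks arrive in time order

    def window(self, t1: float, t2: float) -> list:
        """All samples within the first time range [t1, t2], which encloses
        the frame's second time range [ts, te]."""
        return [s for s in self.samples if t1 <= s.t <= t2]
```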
In step 102, the pose information is fitted to obtain a pose curve in a first time range.
In step 103, pose information corresponding to each point cloud data on the pose curve is obtained according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data.
For example, after the pose information of the device under test within the first time range is obtained, the pose information may be stored in a queue in time order. The pose information at an arbitrary time can then be obtained by interpolating the stored pose information (which may be understood as using the pose information near a certain time to calculate the pose information at that time). Therefore, the pose curve over the first time range can be obtained by fitting the plurality of pose information acquired by the device under test, and the pose information at any time on the pose curve can be obtained. After the pose curve is obtained, the pose information corresponding to each point cloud data is acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. For example, the device under test stores the acquired pose information in time order in a queue Q = {P_1, P_2, …, P_i, …, P_n}, where P_i is the i-th pose information. The pose information in the queue Q is fitted to obtain the pose curve, and the acquisition time corresponding to each of the plurality of point cloud data is looked up on the pose curve to obtain the pose information corresponding to each point cloud data.
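To make the interpolation idea concrete, here is a minimal sketch that linearly interpolates between the two queue entries bracketing a query time. Linear interpolation only illustrates the parenthetical above; the method of the present disclosure fits a B-spline curve, as detailed later. The function name and the (n, 6) array layout are assumptions:

```python
import numpy as np

def interpolate_pose(times: np.ndarray, poses: np.ndarray, t: float) -> np.ndarray:
    """times: (n,) sorted timestamps of queue Q; poses: (n, 6) rows of
    (x, y, z, alpha, beta, gamma); returns the pose estimated at time t."""
    k = np.searchsorted(times, t)           # first queue entry with time >= t
    k = int(np.clip(k, 1, len(times) - 1))
    w = (t - times[k - 1]) / (times[k] - times[k - 1])  # blending weight
    return (1 - w) * poses[k - 1] + w * poses[k]
```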
In step 104, a target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data is determined, the first point cloud data is point cloud data acquired by the laser radar in the target frame at a first acquisition time, and the target point cloud data is any point cloud data except the first point cloud data in the plurality of point cloud data.
In step 105, the target point cloud data is modified according to the target transformation matrix.
In an example, after the pose curve is obtained, the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data are acquired on the pose curve, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any one of the plurality of point cloud data other than the first point cloud data, so that the plurality of point cloud data in the target frame can be corrected. According to the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, a target transformation matrix between the two is determined; that is, a target transformation matrix is determined for the pose information corresponding to each point cloud data other than the first point cloud data, relative to the pose information corresponding to the first point cloud data. If the target frame comprises M point cloud data, the number of target transformation matrices is M-1. Finally, the target point cloud data is corrected using the target transformation matrix: the target transformation matrix converts the target point cloud data, expressed in its own coordinate system, into the coordinate system corresponding to the first point cloud data, so that all point cloud data of the frame are expressed in the same coordinate system.
In summary, in the present disclosure, to correct the plurality of point cloud data in the target frame, a plurality of pose information of the device under test within a first time range is acquired, where the first time range contains the second time range corresponding to the plurality of point cloud data. The plurality of pose information is fitted to obtain a pose curve over the first time range, and the pose information corresponding to each point cloud data is then acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. A target transformation matrix is determined between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any point cloud data other than the first point cloud data among the plurality of point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved, and the accuracy of lidar mapping and positioning is guaranteed.
Fig. 2 is a flow chart illustrating one step 102 of the embodiment shown in fig. 1. As shown in fig. 2, each of the plurality of pose information includes: position information and angle information. The pose curve includes: a position curve and an angle curve.
Step 102 comprises the steps of:
in step 1021, the position information of each of the plurality of pose information is fitted to obtain a position curve within the first time range.
In step 1022, the angle information of each pose information of the plurality of pose information is fitted to obtain an angle curve in the first time range.
Illustratively, each of the plurality of pose information includes position information and angle information. The position information may be the coordinate values of the device under test in a map (for example, the three coordinate values x, y, and z in a three-dimensional coordinate system), and the angle information may be the angles at which the device under test is oriented (for example, the included angles between the device under test and the three coordinate axes of the three-dimensional coordinate system). The fitting of the plurality of pose information may be divided into fitting the position information of each pose information and fitting the angle information of each pose information, so as to acquire a position curve and an angle curve within the first time range, respectively. For example, the pose information includes position information (x, y, z) and angle information (α, β, γ), where x, y, and z are the three coordinate values corresponding to the X, Y, and Z axes of the three-dimensional coordinate system, and α, β, and γ are the included angles between the device under test and the X, Y, and Z axes. The fitting of the position information may be further divided into fitting the three coordinate values x, y, and z separately to obtain position curves for x, y, and z; similarly, the fitting of the angle information may be divided into fitting the three angles α, β, and γ separately to obtain angle curves for α, β, and γ.
Optionally, the implementation manner of step 102 may be:
and taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B-spline curve.
The B-spline curve includes:
C(u) = \sum_{i=0}^{N-1} N_{i,r}(u) P_i, \quad u \in [0, 1]

wherein u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th pose information among the N pieces of pose information serving as control points, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
For example, the plurality of pose information may be fitted by taking the plurality of pose information as control points and acquiring the pose curve using a preset B-spline curve. The preset B-spline curve may be a quadratic B-spline curve or a cubic B-spline curve. The basis functions N_{i,r}(u) of the B-spline curve are defined by the recursion:

N_{i,0}(u) = \begin{cases} 1, & u_i \le u < u_{i+1} \\ 0, & \text{otherwise} \end{cases}

N_{i,r}(u) = \frac{u - u_i}{u_{i+r} - u_i} N_{i,r-1}(u) + \frac{u_{i+r+1} - u}{u_{i+r+1} - u_{i+1}} N_{i+1,r-1}(u)

wherein u_i denotes the position corresponding to the i-th control point (pose information) within the first time range, u_{i+1} denotes the position corresponding to the (i+1)-th control point, u_{i+r} denotes the position corresponding to the (i+r)-th control point, and u_{i+r+1} denotes the position corresponding to the (i+r+1)-th control point.
Taking the fitting of a plurality of pose information with a cubic B-spline curve as an example, the basis functions of the spline curve on one segment are:

N_{0,3}(u) = \frac{(1-u)^3}{6}, \quad N_{1,3}(u) = \frac{3u^3 - 6u^2 + 4}{6}, \quad N_{2,3}(u) = \frac{-3u^3 + 3u^2 + 3u + 1}{6}, \quad N_{3,3}(u) = \frac{u^3}{6}

The cubic B-spline curve is:

C(u) = \sum_{i=0}^{3} N_{i,3}(u) P_i
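As a concrete illustration of these formulas, the sketch below evaluates the uniform cubic B-spline channel by channel. It assumes Python with NumPy, treats exactly four pose samples as the control points of one curve segment, and uses hypothetical function names; it is a sketch of the technique, not the patent's implementation:

```python
import numpy as np

def cubic_bspline_basis(u: float) -> np.ndarray:
    """Uniform cubic B-spline basis functions N_{0..3,3}(u) for u in [0, 1],
    matching the four basis functions written out above."""
    return np.array([
        (1 - u) ** 3 / 6.0,
        (3 * u**3 - 6 * u**2 + 4) / 6.0,
        (-3 * u**3 + 3 * u**2 + 3 * u + 1) / 6.0,
        u**3 / 6.0,
    ])

def pose_on_curve(control_poses: np.ndarray, u: float) -> np.ndarray:
    """C(u) = sum_i N_{i,3}(u) * P_i. control_poses has shape (4, 6): four
    pose samples used as control points, each (x, y, z, alpha, beta, gamma)."""
    return cubic_bspline_basis(u) @ control_poses
```

Each of the six pose channels (x, y, z, α, β, γ) is blended with the same basis weights, which is what fitting the position curve and the angle curve separately amounts to in practice.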
fig. 3 is a flow chart illustrating one step 103 of the embodiment shown in fig. 1. As shown in fig. 3, step 103 includes the following steps:
in step 1031, the position of the acquisition time corresponding to each point cloud data within the first time range is determined by using a preset first formula.
In step 1032, the pose information corresponding to each point cloud data is determined by using a preset second formula, according to the position of the acquisition time corresponding to each point cloud data within the first time range and the pose curve.
The first formula includes:

u_j = \frac{t_j - t_1}{t_2 - t_1}

The second formula includes:

C(u_j) = \sum_{i=0}^{N-1} N_{i,r}(u_j) P_i

wherein t_j represents the acquisition time corresponding to the j-th point cloud data, u_j represents the position of t_j within the first time range, t_1 represents the start time of the first time range, t_2 represents the end time of the first time range, C(u_j) represents the pose information corresponding to the j-th point cloud data, and N_{i,r}(u_j) represents the value of the basis functions of the B-spline curve at u_j.
In an example, after the device under test acquires the pose curve within the first time range, the position of the acquisition time corresponding to each point cloud data within the first time range is determined using the preset first formula; from this position, the position of each point cloud data on the pose curve, and hence its corresponding pose information, can be determined. The first formula includes:

u_j = \frac{t_j - t_1}{t_2 - t_1}

When t_j = t_1, u_j = 0; when t_j = t_2, u_j = 1. The device under test then determines the pose information corresponding to each point cloud data using the preset second formula, according to the position of the acquisition time within the first time range and the pose curve. Taking the fitting of the pose information with a cubic B-spline curve as an example, let the start time of the first time range be t_1 = 0, the end time be t_2 = 10 s, and the acquisition time corresponding to the j-th point cloud data be t_j = 1 s. The position of t_j within the first time range is

u_j = \frac{1 - 0}{10 - 0} = 0.1

and the pose information corresponding to the j-th point cloud data is

C(0.1) = \sum_{i=0}^{3} N_{i,3}(0.1) P_i
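Continuing the sketch given after the basis-function formulas (and reusing pose_on_curve from it), the fragment below reproduces this worked example: the acquisition time is normalized with the first formula and the pose is read off the curve with the second formula. The control-pose values are illustrative assumptions, not numbers from the patent:

```python
import numpy as np

t1, t2, tj = 0.0, 10.0, 1.0
uj = (tj - t1) / (t2 - t1)                 # first formula: u_j = 0.1

control_poses = np.array([                 # four pose samples as control points,
    [0.00, 0.00, 0.0, 0.0, 0.0, 0.00],     # each (x, y, z, alpha, beta, gamma)
    [0.10, 0.05, 0.0, 0.0, 0.0, 0.05],
    [0.20, 0.10, 0.0, 0.0, 0.0, 0.10],
    [0.30, 0.15, 0.0, 0.0, 0.0, 0.15],
])
pose_j = pose_on_curve(control_poses, uj)  # second formula: C(u_j)
```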
Fig. 4 is a flow chart illustrating one step 104 of the embodiment shown in fig. 1. As shown in fig. 4, step 104 includes the steps of:
in step 1041, the pose information corresponding to the first point cloud data is used as the origin of the first coordinate system, and the pose information corresponding to the target point cloud data is used as the origin of the second coordinate system.
In step 1042, a transformation matrix of the first coordinate system and the second coordinate system is obtained.
In step 1043, the transformation matrix of the first coordinate system and the second coordinate system is used as the target transformation matrix.
For example, to determine the target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, the pose information corresponding to the first point cloud data may be taken as the origin of a first coordinate system and the pose information corresponding to the target point cloud data as the origin of a second coordinate system; that is, a coordinate system with the pose information corresponding to the first point cloud data as its origin and a coordinate system with the pose information corresponding to the target point cloud data as its origin are established, and the transformation matrix between the two coordinate systems is the target transformation matrix. The target transformation matrix may be a 4 × 4 matrix: the 3 × 3 matrix in its upper-left corner represents the angular relationship between the first coordinate system and the second coordinate system, the first three entries of its last column represent the coordinate relationship between the two coordinate systems, and its last row is {0, 0, 0, 1}. Take the pose information corresponding to the first point cloud data as {x_o, y_o, z_o, α_o, β_o, γ_o} = {0, 0, 0, 0, 0, 0} and the pose information corresponding to the target point cloud data as {x_j, y_j, z_j, α_j, β_j, γ_j} = {0.2, 0.1, 0, 0, 0, 0.1}, where {x_o, y_o, z_o} and {x_j, y_j, z_j} are the position information and {α_o, β_o, γ_o} and {α_j, β_j, γ_j} are the angle information in the pose information. The target transformation matrix is then obtained as:
T = \begin{bmatrix} \cos 0.1 & -\sin 0.1 & 0 & 0.2 \\ \sin 0.1 & \cos 0.1 & 0 & 0.1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \approx \begin{bmatrix} 0.995 & -0.0998 & 0 & 0.2 \\ 0.0998 & 0.995 & 0 & 0.1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
The 3 × 3 matrix in the upper-left corner of the target transformation matrix is the angular transformation between the coordinate axes of the two coordinate systems, and the 3 × 1 matrix in the upper-right corner is the coordinate (translation) transformation between the two coordinate systems.
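The following minimal sketch builds such a matrix from two poses in the way just described: rotation in the upper-left 3 × 3 block, translation in the last column, and {0, 0, 0, 1} as the last row. The Z-Y-X Euler-angle convention and the function names are assumptions, not taken from the patent:

```python
import numpy as np

def pose_to_matrix(pose) -> np.ndarray:
    """pose = (x, y, z, alpha, beta, gamma), angles in radians about the
    X, Y, and Z axes; returns the 4x4 homogeneous transformation matrix."""
    x, y, z, a, b, g = pose
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(a), -np.sin(a)],
                   [0, np.sin(a),  np.cos(a)]])
    Ry = np.array([[ np.cos(b), 0, np.sin(b)],
                   [0, 1, 0],
                   [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(g), -np.sin(g), 0],
                   [np.sin(g),  np.cos(g), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation in the upper-left 3x3 block
    T[:3, 3] = [x, y, z]       # translation in the last column
    return T

def target_transformation(first_pose, target_pose) -> np.ndarray:
    """Transformation of the target pose frame relative to the first pose frame."""
    return np.linalg.inv(pose_to_matrix(first_pose)) @ pose_to_matrix(target_pose)

# Reproduces the example above: with the first pose at the origin, T reduces
# to a rotation about Z by 0.1 rad plus a translation of (0.2, 0.1, 0).
T = target_transformation((0, 0, 0, 0, 0, 0), (0.2, 0.1, 0, 0, 0, 0.1))
```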
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information.
The implementation of step 105 may be:
and correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix.
The third formula includes:
x'_m = T \cdot x_m

wherein T represents the target transformation matrix, x_m represents the m-th position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
For example, each point cloud data of the plurality of point cloud data may include a plurality of position information. The position information may include, for example, the distance and the orientation between the lidar and the reflection point of the emitted laser beam; the distance may be represented by the three coordinate values x, y, and z in a three-dimensional coordinate system, and the orientation may be expressed in degrees. The device under test corrects each position information in the target point cloud data using the preset third formula according to the target transformation matrix. For example, let the plurality of point cloud data be {X_1, X_2, X_3, …, X_j, …}, where X_j is the target point cloud data and includes {x_1, x_2, x_3, …, x_m, …}, x_m being the m-th position information in X_j, with x_m = {0.5, 0.2, 0}^T, and let the target transformation matrix T be:

[the specific 4 × 4 matrix T of this example is given as a figure in the original document]
for example, corrected location information
Figure BDA0001895423220000152
Corrected position information x 'can be obtained'mIs {0.498,0.799,0}T
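As a minimal sketch of this correction step (Python with NumPy assumed; the matrix and point values are illustrative, not the figures from the patent), each position of the target point cloud data is lifted to homogeneous coordinates and multiplied by the target transformation matrix, exactly as in the third formula:

```python
import numpy as np

def correct_points(T: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply the third formula x'_m = T * x_m to every row of points ((M, 3))."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])  # (M, 4)
    return (homogeneous @ T.T)[:, :3]

# A 4x4 target transformation matrix of the form described above (rotation
# about Z by 0.1 rad, translation (0.2, 0.1, 0)); illustrative only.
c, s = np.cos(0.1), np.sin(0.1)
T = np.array([[c, -s, 0, 0.2],
              [s,  c, 0, 0.1],
              [0,  0, 1, 0.0],
              [0,  0, 0, 1.0]])

points = np.array([[0.5, 0.2, 0.0]])   # positions x_m of the target point cloud
corrected = correct_points(T, points)  # corrected positions x'_m
```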
In summary, in the present disclosure, to correct the plurality of point cloud data in the target frame, a plurality of pose information of the device under test within a first time range is acquired, where the first time range contains the second time range corresponding to the plurality of point cloud data. The plurality of pose information is fitted to obtain a pose curve over the first time range, and the pose information corresponding to each point cloud data is then acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. A target transformation matrix is determined between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any point cloud data other than the first point cloud data among the plurality of point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved, and the accuracy of lidar mapping and positioning is guaranteed.
Fig. 5 is a block diagram illustrating a correction apparatus for laser point cloud data according to an exemplary embodiment. As shown in fig. 5, the apparatus 200 includes:
the first obtaining module 201 is configured to obtain multiple pose information of the device to be measured in a first time range, where the first time range includes a second time range, and the target frame includes multiple point cloud data acquired by the laser radar in the second time range.
The fitting module 202 is configured to fit the pose information to obtain a pose curve in a first time range.
The second obtaining module 203 is configured to obtain pose information corresponding to each point cloud data on the pose curve according to a collection time corresponding to each point cloud data in the plurality of point cloud data.
The determining module 204 is configured to determine a target transformation matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, where the first point cloud data is point cloud data acquired by a laser radar in a target frame at a first acquisition time, and the target point cloud data is any one of the plurality of point cloud data except the first point cloud data.
And the correcting module 205 is used for correcting the target point cloud data according to the target conversion matrix.
FIG. 6 is a block diagram of one type of fitting module shown in the embodiment of FIG. 5. As shown in fig. 6, each of the plurality of pose information includes: position information and angle information. The pose curve includes: a position curve and an angle curve.
The fitting module 202 includes:
the first fitting submodule 2021 is configured to fit the position information of each of the plurality of pose information to obtain a position curve in the first time range.
The second fitting submodule 2022 is configured to fit the angle information of each pose information of the plurality of pose information to obtain an angle curve in the first time range.
Optionally, the fitting module 202 is configured to:
and taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B-spline curve.
The B-spline curve includes:
C(u) = \sum_{i=0}^{N-1} N_{i,r}(u) P_i, \quad u \in [0, 1]

wherein u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th pose information among the N pieces of pose information serving as control points, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
Fig. 7 is a block diagram of a second acquisition module shown in the embodiment of fig. 5. As shown in fig. 7, the second obtaining module 203 includes:
the first obtaining sub-module 2031 is configured to determine, by using a preset first formula, a position of an acquisition time corresponding to each point cloud data within a first time range.
The second obtaining sub-module 2032 is configured to determine the pose information corresponding to each point cloud data by using a preset second formula, according to the position of the acquisition time corresponding to each point cloud data within the first time range and the pose curve.
The first formula includes:

u_j = \frac{t_j - t_1}{t_2 - t_1}

The second formula includes:

C(u_j) = \sum_{i=0}^{N-1} N_{i,r}(u_j) P_i

wherein t_j represents the acquisition time corresponding to the j-th point cloud data, u_j represents the position of t_j within the first time range, t_1 represents the start time of the first time range, t_2 represents the end time of the first time range, C(u_j) represents the pose information corresponding to the j-th point cloud data, and N_{i,r}(u_j) represents the value of the basis functions of the B-spline curve at u_j.
FIG. 8 is a block diagram of one type of determination module shown in the embodiment shown in FIG. 5. As shown in fig. 8, the determining module 204 includes:
the first determining submodule 2041 is configured to use the pose information corresponding to the first point cloud data as an origin of a first coordinate system, and use the pose information corresponding to the target point cloud data as an origin of a second coordinate system.
The converting submodule 2042 is configured to obtain a conversion matrix of the first coordinate system and the second coordinate system.
The second determining submodule 2043 is configured to use a transformation matrix of the first coordinate system and the second coordinate system as a target transformation matrix.
Optionally, each point cloud data of the plurality of point cloud data comprises: a plurality of location information.
The correction module 205 is configured to:
and correcting each position information in the target point cloud data by using a preset third formula according to the target conversion matrix.
The third formula includes:
x'_m = T \cdot x_m

wherein T represents the target transformation matrix, x_m represents the m-th position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In summary, in the present disclosure, to correct the plurality of point cloud data in the target frame, a plurality of pose information of the device under test within a first time range is acquired, where the first time range contains the second time range corresponding to the plurality of point cloud data. The plurality of pose information is fitted to obtain a pose curve over the first time range, and the pose information corresponding to each point cloud data is then acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. A target transformation matrix is determined between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any point cloud data other than the first point cloud data among the plurality of point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved, and the accuracy of lidar mapping and positioning is guaranteed.
Fig. 9 is a block diagram illustrating an electronic device 300 in accordance with an example embodiment. As shown in fig. 9, the electronic device 300 may include: a processor 301 and a memory 302. The electronic device 300 may also include one or more of a multimedia component 303, an input/output (I/O) interface 304, and a communication component 305.
The processor 301 is configured to control the overall operation of the electronic device 300 to complete all or part of the steps of the above method for correcting laser point cloud data. The memory 302 is configured to store various types of data to support operation on the electronic device 300, such as instructions for any application or method operating on the electronic device 300 and application-related data, for example contact data, transmitted and received messages, pictures, audio, and video. The memory 302 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 303 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; a received audio signal may further be stored in the memory 302 or transmitted through the communication component 305. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 304 provides an interface between the processor 301 and other interface modules, such as a keyboard, a mouse, or buttons, which may be virtual buttons or physical buttons. The communication component 305 is used for wired or wireless communication between the electronic device 300 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them, so the corresponding communication component 305 may include a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the electronic Device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, and is used for executing the above-mentioned method for correcting the laser point cloud data.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions, which when executed by a processor, implement the steps of the above-described method for correcting laser point cloud data. For example, the computer readable storage medium may be the memory 302 including the program instructions, which are executable by the processor 301 of the electronic device 300 to perform the method for correcting the laser point cloud data.
In summary, in the present disclosure, to correct the plurality of point cloud data in the target frame, a plurality of pose information of the device under test within a first time range is acquired, where the first time range contains the second time range corresponding to the plurality of point cloud data. The plurality of pose information is fitted to obtain a pose curve over the first time range, and the pose information corresponding to each point cloud data is then acquired on the pose curve according to the acquisition time corresponding to each of the plurality of point cloud data. A target transformation matrix is determined between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data, where the first point cloud data is the point cloud data acquired by the lidar at the first acquisition time in the target frame and the target point cloud data is any point cloud data other than the first point cloud data among the plurality of point cloud data. Finally, the target point cloud data is corrected according to the target transformation matrix. Because each point cloud data is corrected according to its own corresponding pose information, the accuracy of the point cloud data is improved, and the accuracy of lidar mapping and positioning is guaranteed.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings; however, the present disclosure is not limited to the specific details of the above embodiments. Various simple modifications may be made to the technical solution of the present disclosure within the scope of its technical idea, and these simple modifications all fall within the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner; to avoid unnecessary repetition, the possible combinations are not described separately in the present disclosure.
In addition, the various embodiments of the present disclosure may be combined arbitrarily, and such combinations should likewise be regarded as disclosed herein, as long as they do not depart from the spirit of the present disclosure.

Claims (14)

1. A method for correcting laser point cloud data is characterized by comprising the following steps:
acquiring a plurality of pose information of equipment to be detected in a first time range, wherein the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by a laser radar in the second time range;
fitting the plurality of pose information to obtain a pose curve in the first time range;
acquiring pose information corresponding to each point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data in the plurality of point cloud data;
determining a target conversion matrix between pose information corresponding to first point cloud data and pose information corresponding to target point cloud data, wherein the first point cloud data are point cloud data acquired by the laser radar in the target frame at a first acquisition moment, and the target point cloud data are any point cloud data except the first point cloud data in the plurality of point cloud data;
and correcting the target point cloud data according to the target conversion matrix.
2. The method of claim 1, wherein each of the plurality of pose information comprises: position information and angle information; the pose curve includes: a position curve and an angle curve;
the fitting the plurality of pose information to obtain a pose curve in the first time range includes:
fitting position information of each of the plurality of pose information to obtain the position curve within the first time range;
fitting the angle information of each of the plurality of pose information to obtain the angle curve within the first time range.
3. The method of claim 1, wherein fitting the plurality of pose information to obtain a pose curve over the first time range comprises:
taking the plurality of pose information as a plurality of control points, and acquiring a pose curve by using a preset B-spline curve;
the B-spline curve includes:
C(u) = \sum_{i=0}^{N-1} N_{i,r}(u) P_i, \quad u \in [0, 1]

wherein u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th pose information among the N pieces of pose information serving as control points, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
4. The method of claim 3, wherein the obtaining the pose information corresponding to each point cloud data of the plurality of point cloud data on the pose curve according to the acquisition time corresponding to each point cloud data comprises:
determining the position of the acquisition time corresponding to each point cloud data within the first time range by using a preset first formula;
determining pose information corresponding to each point cloud data by using a preset second formula according to the position of the acquisition time corresponding to each point cloud data in the first time range and the pose curve;
the first formula includes:
Figure FDA0002599497300000022
the second formula includes:
Figure FDA0002599497300000023
wherein, tjRepresenting the acquisition time u corresponding to the j point cloud datajRepresents tjPosition in the first time range, toRepresenting a start time, t, of said first time rangepRepresents the end time of the first time range, C (u)j) Representing the pose information corresponding to the j point cloud data, Ni,r(uj) A basis function corresponding u representing the B-spline curvejThe value of (a).
5. The method of claim 1, wherein determining the target transformation matrix between the pose information corresponding to the first point cloud data and the pose information corresponding to the target point cloud data comprises:
taking the pose information corresponding to the first point cloud data as the origin of a first coordinate system, and taking the pose information corresponding to the target point cloud data as the origin of a second coordinate system;
obtaining a transformation matrix between the first coordinate system and the second coordinate system; and
taking the transformation matrix between the first coordinate system and the second coordinate system as the target transformation matrix.
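In matrix terms, this claim amounts to composing the two poses' homogeneous transforms. The sketch below reuses pose_matrix from the sketch after claim 1; the direction convention (mapping coordinates expressed at the target pose's origin into the first coordinate system) is an assumption, since the claim itself does not fix one.

import numpy as np

def target_transformation_matrix(first_pose, target_pose):
    # first_pose, target_pose : (x, y, theta); each pose is treated as the
    # origin of its own coordinate system via pose_matrix().
    T_first = pose_matrix(*first_pose)
    T_target = pose_matrix(*target_pose)
    # Re-express points measured in the second coordinate system in the first one.
    return np.linalg.inv(T_first) @ T_target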
6. The method of claim 1, wherein each point cloud data among the plurality of point cloud data comprises a plurality of pieces of position information;
wherein correcting the target point cloud data according to the target transformation matrix comprises:
correcting, by using a preset third formula, each piece of position information in the target point cloud data according to the target transformation matrix;
wherein the third formula is:
x'_m = T · x_m
where T represents the target transformation matrix, x_m represents the m-th piece of position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
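Applying the third formula to an entire target point cloud is then a single matrix product in homogeneous coordinates. Again a hypothetical sketch, assuming 2-D points and a 3x3 matrix T such as the one produced by the claim-5 sketch.

import numpy as np

def correct_points(T, points):
    # T      : 3x3 target transformation matrix
    # points : (M, 2) positions x_m from the target point cloud data
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coordinates
    return (pts_h @ T.T)[:, :2]                             # corrected positions x'_m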
7. An apparatus for correcting laser point cloud data, characterized by comprising:
a first acquisition module, configured to acquire a plurality of pieces of pose information of a device under test within a first time range, wherein the first time range comprises a second time range, and a target frame comprises a plurality of point cloud data acquired by a laser radar within the second time range;
a fitting module, configured to fit the plurality of pose information to obtain a pose curve within the first time range;
a second acquisition module, configured to obtain, on the pose curve, the pose information corresponding to each point cloud data according to the acquisition time corresponding to each point cloud data among the plurality of point cloud data;
a determining module, configured to determine a target transformation matrix between the pose information corresponding to first point cloud data and the pose information corresponding to target point cloud data, wherein the first point cloud data is the point cloud data acquired by the laser radar at the first acquisition moment of the target frame, and the target point cloud data is any point cloud data among the plurality of point cloud data other than the first point cloud data; and
a correction module, configured to correct the target point cloud data according to the target transformation matrix.
8. The apparatus of claim 7, wherein each piece of pose information among the plurality of pose information comprises position information and angle information, and the pose curve comprises a position curve and an angle curve;
wherein the fitting module comprises:
a first fitting submodule, configured to fit the position information of each piece of pose information to obtain the position curve within the first time range; and
a second fitting submodule, configured to fit the angle information of each piece of pose information to obtain the angle curve within the first time range.
9. The apparatus of claim 7, wherein the fitting module is configured to:
take the plurality of pose information as a plurality of control points, and obtain the pose curve by using a preset B-spline curve;
wherein the B-spline curve is defined as:
C(u) = Σ_{i=0}^{N-1} N_{i,r}(u) · P_i
where u ∈ [0, 1] represents the normalized value of the first time range, C(u) represents the pose curve, P_i represents the i-th piece of pose information among the N pieces of pose information, r represents the degree of the B-spline curve, and N_{i,r}(u) represents the basis functions of the B-spline curve.
10. The apparatus of claim 9, wherein the second acquisition module comprises:
a first acquisition submodule, configured to determine, by using a preset first formula, the position of the acquisition time corresponding to each point cloud data within the first time range; and
a second acquisition submodule, configured to determine, by using a preset second formula, the pose information corresponding to each point cloud data according to the position of that acquisition time within the first time range and the pose curve;
wherein the first formula is:
u_j = (t_j - t_o) / (t_p - t_o)
and the second formula is:
C(u_j) = Σ_{i=0}^{N-1} N_{i,r}(u_j) · P_i
where t_j represents the acquisition time corresponding to the j-th point cloud data, u_j represents the position of t_j within the first time range, t_o represents the start time of the first time range, t_p represents the end time of the first time range, C(u_j) represents the pose information corresponding to the j-th point cloud data, and N_{i,r}(u_j) represents the value of the basis function of the B-spline curve at u_j.
11. The apparatus of claim 7, wherein the determining module comprises:
a first determining submodule, configured to take the pose information corresponding to the first point cloud data as the origin of a first coordinate system, and take the pose information corresponding to the target point cloud data as the origin of a second coordinate system;
a conversion submodule, configured to obtain a transformation matrix between the first coordinate system and the second coordinate system; and
a second determining submodule, configured to take the transformation matrix between the first coordinate system and the second coordinate system as the target transformation matrix.
12. The apparatus of claim 7, wherein each point cloud data among the plurality of point cloud data comprises a plurality of pieces of position information;
wherein the correction module is configured to:
correct, by using a preset third formula, each piece of position information in the target point cloud data according to the target transformation matrix;
wherein the third formula is:
x'_m = T · x_m
where T represents the target transformation matrix, x_m represents the m-th piece of position information in the target point cloud data, and x'_m represents the corrected position information of x_m.
13. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
14. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 6.
CN201811489855.XA 2018-12-06 2018-12-06 Method and device for correcting laser point cloud data, storage medium and electronic equipment Active CN109613543B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811489855.XA CN109613543B (en) 2018-12-06 2018-12-06 Method and device for correcting laser point cloud data, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109613543A (en) 2019-04-12
CN109613543B (en) 2020-09-25

Family

ID=66007758

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210302
Address after: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
Patentee after: Dalu Robot Co.,Ltd.
Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)
Patentee before: Shenzhen Qianhaida Yunyun Intelligent Technology Co.,Ltd.

CP03 Change of name, title or address

Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai
Patentee after: Dayu robot Co.,Ltd.
Address before: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
Patentee before: Dalu Robot Co.,Ltd.