
CN111337941B - A dynamic obstacle tracking method based on sparse lidar data - Google Patents

A dynamic obstacle tracking method based on sparse lidar data

Info

Publication number
CN111337941B
CN111337941B (application CN202010192403.6A; publication CN111337941A)
Authority
CN
China
Prior art keywords
point
point cloud
trajectory
points
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010192403.6A
Other languages
Chinese (zh)
Other versions
CN111337941A (en)
Inventor
周增祥
孙翔峻
段仕鹏
左家乐
黎梦涛
柴源
王应富
刘志刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202010192403.6A
Publication of CN111337941A
Application granted
Publication of CN111337941B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66: Tracking systems using electromagnetic waves other than radio waves
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G06F18/24: Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract



The invention discloses a dynamic obstacle tracking method and system based on sparse lidar data, belonging to the technical field of dynamic obstacle tracking and comprising the following steps: S1: eliminate the static background; S2: cluster the dynamic point cloud; S3: extract convex hulls; S4: extract features; S5: associate data; S6: predict trajectories. The invention uses a grid map to filter the lidar point cloud of static obstacles, and the small amount of remaining static points, being extremely sparse, is removed by the DBSCAN algorithm, which enhances the final filtering effect. By extracting the angle information of a point cloud's convex hull, the cloud is fuzzily classified as an ellipse, rectangle, or straight line; the dimensions of the fitted shape then help obtain the correct position point of the cloud, which ensures accurate data association. The method also combines the characteristics of the nearest-neighbor and multi-hypothesis tracking algorithms into an improved multi-target association algorithm that completes data association efficiently while guaranteeing accuracy.


Description

Dynamic obstacle tracking method based on sparse laser radar data
Technical Field
The invention relates to the technical field of dynamic obstacle tracking, in particular to a dynamic obstacle tracking method and system based on sparse laser radar data.
Background
Object detection and tracking technologies based on visual sensors have become mature, but visual sensors often fail to provide distance information and must be used in well-lit conditions, limiting their use in robotic navigation. Lidar, by contrast, is unaffected by illumination and provides distance data, making it suitable for robot navigation, tracking, and similar tasks. In recent years, detecting dynamic obstacles and predicting their trajectories with lidar has become an important topic in the field of mobile robots.
With the development of autonomous-driving technology, many methods for detecting and tracking dynamic obstacles have emerged. For vehicle detection and tracking, Konrad M. et al. use a grid map to detect and track automobiles. Zan et al. propose a new global cylindrical-coordinate histogram feature for vehicle identification in urban environments and track vehicles with a dynamic vehicle detection and tracking algorithm based on a likelihood-field model. Anna Petrovskaya et al. of Stanford University extract the geometric and dynamic characteristics of tracked vehicles, build corresponding feature models, and update the tracked target with a Bayes filter, but occluded vehicles still cannot be detected. Template matching has been used to match the contour shape features of obstacles, but the number of template models is small and the adaptability of obstacle matching is poor. Another approach clusters single-frame lidar point cloud data, extracts the contour features of each obstacle's bounding rectangle, associates the obstacle information of two consecutive frames with a multiple hypothesis tracking (MHT) algorithm, and continuously predicts and tracks dynamic obstacles with a Kalman filter.
In the field of autonomous driving, pedestrian movement must be considered in addition to the avoidance of large obstacles such as vehicles, and pedestrian detection and tracking techniques have also been studied in recent years. Traditional pedestrian detection relies mainly on images acquired by sensors such as visible-light cameras and applies machine-vision methods. Premebida et al. propose a 15-dimensional lidar-based feature for pedestrian detection in urban environments, including, for example, the minimum reflection distance and the number of points in a cluster. Wan et al. propose a pedestrian detection method based on lidar and image fusion that detects pedestrians from the fused features. Kocelanfeng et al. cluster non-ground lidar point clouds with the DBSCAN algorithm and propose a distribution feature based on the fast point feature histogram to train a support-vector-machine classifier for pedestrian detection.
Three-dimensional lidar is generally used for detection and tracking in autonomous driving, but in the closed environment of a mine the volume of data it produces is large, which is a heavy burden for an omnidirectional mobile inspection robot with strict real-time requirements. A two-dimensional lidar collects the point cloud of a single horizontal plane, so the data volume is small and processing is faster. However, the small data volume makes the point cloud sparse, and the accuracy of results computed from sparse data degrades, so accuracy is the first concern when a two-dimensional radar is used as the sensor. In an underground mine, poor illumination rules out assistance from a vision sensor, and a three-dimensional radar in the closed mine environment produces too much data to process quickly, so a two-dimensional radar is finally selected as the sensor to help the mobile robot complete its detection and tracking tasks.
Disclosure of Invention
The technical problem to be solved by the invention is how to obtain a correct tracking result from sparse two-dimensional radar data while reducing data errors during processing. To this end, the invention provides a dynamic obstacle tracking method based on sparse lidar data.
The invention solves this technical problem through the following technical scheme, comprising the following steps:
S1: eliminating the static background
Determine the travelable area, delete the ground point cloud of the travelable area and the points exceeding a specified height; the remaining points form the obstacle point cloud;
S2: clustering the dynamic point cloud
Cluster the obstacle point cloud of step S1, dividing the points belonging to different dynamic obstacles into different clusters;
S3: extracting convex hulls
After the clustering of step S2, extract the convex hull of each clustered point cloud;
S4: extracting features
Perform fuzzy classification with the angle information of the convex hull, then fit the point clouds to obtain the position information of the obstacles;
S5: associating data
According to the position information of the obstacles in different frames, associate the positions belonging to the same obstacle at different moments and fit them into a trajectory; associate the tracked trajectory with the clustered point cloud detected in the current data frame and update the motion trajectory of the dynamic obstacle;
S6: predicting trajectories
Predict the position of the dynamic obstacle with a trajectory function: increase the x value of the last trajectory point by a suitable increment to obtain x_p according to the trajectory function, and substitute it into the formula to obtain y_p; the coordinate (x_p, y_p) is the predicted trajectory point. The formula used for straight-line prediction is:

R² = Σ (y_i - (a1·x_i + a0))²

where (x_i, y_i) are the trajectory points of the fitted line, a1 is the slope of the fitted line, a0 is its intercept, and R² is the sum of squared deviations between the trajectory points and the fitted line;
the formula used for polynomial-curve prediction is:

R² = Σ (y_i - (a0 + a1·x_i + a2·x_i² + a3·x_i³))²

where (x_i, y_i) are the trajectory points of the fitted cubic polynomial, a0, a1, a2, a3 are the coefficients of the fitted cubic polynomial, and R² is the sum of squared deviations between the trajectory points and the fitted curve;
the trajectory functions include a straight-line prediction function and a polynomial-curve prediction function; the straight-line prediction function is:

x_p = x[n] + len/(t*n)

where x[n] is the current trajectory point, len is the sum of the absolute x-coordinate increments of the existing trajectory points, t is an adjustable parameter, and n is the number of existing trajectory points; in straight-line prediction, the likely x increment at the current moment is estimated from the average x increment of the existing trajectory;
the polynomial-curve prediction function is:

x_p = x[n] + len/(t*n)*k

where x[n] is the current trajectory point, t is an adjustable parameter, n is the number of existing trajectory points, len is the sum of the Euclidean distances between the existing trajectory points, and k is the slope of the current trajectory; in polynomial-curve prediction, the x increment at the current moment is estimated from the average increment between the existing trajectory points.
Further, in step S1 the point cloud contains only coordinate information, in the form of two-dimensional coordinates (x, y).
Further, the specific processing of step S1 comprises the following steps:
S11: determining the acquisition plane
Set the height of the acquisition plane to 0.8-1.0 m according to the height of the main obstacles;
S12: generating a template
Build a grid map or convert an existing mine map into a grid map; the grid map is divided into an obstacle layer, an expansion layer, and a travelable layer;
S13: preprocessing
Load the grid map built in step S12, set the expansion-layer thickness to 0.05 m, receive the raw lidar data, then process the point cloud and delete the illegal values in it;
S14: filtering the point cloud
Compare the coordinates of each point with the grid map to judge whether the point lies on the travelable layer; if so, the point is kept, otherwise it is deleted.
Further, in step S2 the clustering process divides the point cloud into core points and boundary points, the core points forming the cluster skeleton. The clustering process involves two core parameters, Eps and MinPts. Eps bounds the neighborhood between points, as follows:

Ne = {x_i ∈ D | distance(x_i, x_j) < Eps}

where x_i denotes a point of the cloud, distance(x_i, x_j) is the Euclidean distance between points, Eps is the distance threshold under which the clustering algorithm groups points into the same cluster, and D is the data domain;
when the distance between two points is smaller than Eps, the two points are judged to be in the same neighborhood. MinPts is the parameter for judging core points: if the number of points within distance Eps in a point's neighborhood is greater than MinPts, the point is judged to be a core point.
Further, the clustering process comprises the following steps:
S21: count the number of points in each point's neighborhood and store it in a pts attribute; judge whether each point's pts value is greater than MinPts; if so, the point is a core point and is stored in the core-point array;
S22: traverse the core-point array, chaining the core points within the neighborhood of the current core point into one class;
S23: traverse all boundary points and merge the boundary points within a core point's neighborhood into that core point's class.
Further, in step S4, the moving objects in the mine environment are pedestrians and mine cars; a pedestrian is classified as an elliptical point cloud, while a mine car, observed from different angles, can be classified as a rectangular or straight-line point cloud. Based on these characteristics of the moving objects and the angle information of the convex hull, the clouds are fuzzily classified as ellipses, rectangles, or straight lines.
Further, in step S4, the included-angle information of a point cloud is obtained through its convex hull: after a group of convex hull points of the cloud is obtained, the point closest to the lidar is selected as the vertex of the angle, and the angle information with this closest point as vertex is computed and used as the angle feature of the cloud, i.e. the included-angle information.
Further, the specific processing of the point-cloud angle feature comprises the following steps:
S41: take the point of the obtained convex-hull array closest to the lidar as the vertex;
S42: starting from the lower-left point of the convex-hull array, take hull points clockwise as the endpoint of one side of the angle and counterclockwise as the endpoint of the other side; compute each angle, and record the maximum angle and the angle sum;
S43: obtain the angle classification criteria through angle statistics after extracting the point-cloud angle features of the dynamic obstacles, and determine the angle type according to the criteria;
S44: judge the source of the point-cloud data according to the angle type, fit the different types of shape point clouds with different fitting formulas according to the different motion characteristics of the different sources, and compute their aggregate features.
Furthermore, an ellipse-fitting formula is used to fit the ellipse and obtain its center, major axis, minor axis, and deflection angle; the ellipse-fitting objective is:

f = Σ (x_i² + A·x_i·y_i + B·y_i² + C·x_i + D·y_i + E)²

where (x_i, y_i) are the points of the fitted ellipse and A, B, C, D, E are the coefficients of the non-standard equation of the fitted ellipse;
a straight line is fitted with a line-fitting formula to obtain its length, start-point coordinates, end-point coordinates, and center-point coordinates; the line-fitting objective is:

R² = Σ (y_i - (a·x_i + b))²

where (x_i, y_i) are the points of the fitted line, a is the slope of the fitted line, b is its intercept, and R² is the sum of squared deviations between the points and the fitted line;
the center point, length, width, and four corner points of a rectangle are obtained with an existing rectangle-fitting function. A point representing the position is then chosen according to the difference between pedestrians and mine cars: for a vehicle, the position is represented by the right-angle point formed by the front end of the car and the side near the radar; for a pedestrian, represented by an ellipse, the center point of the ellipse is taken as the pedestrian's position point.
The invention also provides a dynamic obstacle tracking system based on sparse lidar data, which comprises:
the elimination module is used for determining a travelable area and deleting ground point clouds and point clouds exceeding a certain height in the travelable area;
the clustering module is used for clustering the obstacle point clouds and dividing the point clouds belonging to different dynamic obstacles into different clusters;
the convex hull extracting module is used for extracting a convex hull of the clustered point cloud after clustering;
the characteristic extraction module is used for carrying out fuzzy classification by utilizing the angle information of the convex hull and then fitting the point clouds to obtain the position information of the obstacle;
the data association module is used for associating positions at different moments belonging to the same barrier according to the position information of the barrier in different frames, fitting the positions into a track, associating the tracked track with the clustered point cloud detected by the current data frame, and updating the motion track of the dynamic barrier;
the track prediction module is used for predicting the position of the dynamic obstacle by utilizing a track function;
the control module is used for controlling the other modules to execute instructions;
the elimination module, the clustering module, the convex hull extraction module, the feature extraction module, the data association module and the track prediction module are all electrically connected with the control module.
Compared with the prior art, the invention has the following advantages: the dynamic obstacle tracking method and system based on sparse lidar data filter the static-obstacle lidar point cloud with a grid map, and the few remaining static points, being extremely sparse, are removed by the DBSCAN algorithm, which enhances the final filtering effect; by extracting the angle information of a point cloud's convex hull, the cloud is fuzzily classified as an ellipse, rectangle, or straight line, and the dimensions of the fitted shape then help obtain the correct position point, which ensures accurate data association; the characteristics of the nearest-neighbor and multi-hypothesis tracking algorithms are combined into an improved multi-target association algorithm that completes data association efficiently while guaranteeing accuracy.
Drawings
FIG. 1 is a general flow chart of a method for dynamic obstacle tracking based on sparse lidar data in an embodiment of the present invention;
FIG. 2(a) is a grid map in an embodiment of the present invention; FIG. 2(b) is an original point cloud in an embodiment of the present invention; FIG. 2(c) is a diagram of the effect of the embodiment of the present invention after actual data acquisition and background removal;
FIG. 3 is a diagram illustrating the clustering effect of the point clouds according to the embodiment of the present invention;
FIG. 4(a) is a cloud of points with fuzzy classification as ellipses in an embodiment of the present invention; FIG. 4(b) is a cloud point diagram with fuzzy classification as straight lines in an embodiment of the present invention; FIG. 4(c) is a cloud point diagram with fuzzy classification as rectangles in an embodiment of the present invention;
FIG. 5 is a diagram illustrating the effect of convex hull extraction according to an embodiment of the present invention;
FIG. 6(a) is a rectangular boundary diagram formed by the vehicle head and the point cloud on the side close to the radar in the embodiment of the invention; FIG. 6(b) is a straight line boundary diagram formed by point clouds of a vehicle near a radar side in the embodiment of the invention; FIG. 6(c) is a rectangular boundary diagram of the vehicle tail and the point cloud on the side near the radar in the embodiment of the present invention;
FIG. 7 is a schematic diagram of a process of calculating a front end right-angle point of a vehicle according to an embodiment of the present invention;
FIG. 8 is a flow chart illustrating track update matching according to an embodiment of the present invention;
FIG. 9(a) is a plot of a straight line fit trajectory in an embodiment of the present invention; FIG. 9(b) is a plot of a polynomial fit trajectory in an embodiment of the present invention.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
The embodiment provides a technical scheme: a dynamic obstacle tracking method based on sparse lidar data, as shown in fig. 1, includes the following steps:
the first step is as follows: eliminating static background
The data collected by the lidar contain both static and dynamic obstacles; for tracking dynamic obstacles, the static-obstacle point cloud is noise and must be removed. The general background-elimination process for a three-dimensional radar is: first determine the travelable area, then delete the ground point cloud of that area and the points exceeding a certain height; the remaining points form the obstacle point cloud. This removes the ground points and overly high environment points, but part of the static point cloud still remains. A two-dimensional radar produces little data, so noise should be reduced as much as possible to achieve the desired effect.
The specific processing procedure adopted in this embodiment is as follows:
1) Determine the acquisition plane. A two-dimensional radar can only acquire information in one horizontal plane, and the height of the main obstacles must be considered when choosing it. According to related data, mine mobile equipment is generally 0.8-2.4 meters tall and the leg height of a pedestrian is generally 0.8-1.1 meters, so the acquisition plane is set at 0.8-1.0 meters.
2) Generate the template. Build a grid map with a mapping function package, or convert an existing mine map into a grid map; the grid map is divided into an obstacle layer, an expansion layer, and a travelable layer.
3) Preprocess. Load the built grid map. When the distance between a dynamic obstacle and a static obstacle is more than 0.1 m, static points are filtered accurately; when the distance is 0-0.1 m, part of the dynamic point cloud may be filtered because of the expansion layer, so mistaken deletion is reduced by shrinking the expansion layer. In this embodiment the expansion-layer thickness is set to 0.05 m, and the raw lidar data are received.
Since the coordinate frame of the data collected by the lidar is called "laser" and the static map frame is called "map", the lidar data are first transformed into the map frame. The point-cloud data are then processed to remove illegal values such as inf and invalid values such as (0, 0).
4) Filter the point cloud. Each point is compared with the grid map to judge whether it lies on the travelable layer; if so, it is kept, otherwise it is deleted. Because lidar data are unstable, the obstacle point cloud jitters slightly in size, so points falling on the expansion layer are also treated as static obstacles.
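The following Python sketch illustrates this grid-map lookup; it is an illustrative reading of step 4), not code from the patent, and the FREE encoding, function name, and map layout are assumptions:

```python
import numpy as np

FREE = 0  # assumed encoding of the travelable layer; other values mark obstacle/expansion cells

def grid_filter(points, grid, origin, resolution):
    """Keep only the points that fall on the travelable layer of the grid map.

    points: (N, 2) array of (x, y) in map coordinates
    grid: (H, W) occupancy array; origin: (x0, y0) of cell (0, 0); resolution: meters per cell
    """
    cols = ((points[:, 0] - origin[0]) / resolution).astype(int)
    rows = ((points[:, 1] - origin[1]) / resolution).astype(int)
    inside = (rows >= 0) & (rows < grid.shape[0]) & (cols >= 0) & (cols < grid.shape[1])
    keep = np.zeros(len(points), dtype=bool)
    keep[inside] = grid[rows[inside], cols[inside]] == FREE
    return points[keep]
```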
The created grid map is shown in Fig. 2(a) and the original point cloud in Fig. 2(b); after actual data acquisition and background elimination, the effect is shown in Fig. 2(c): most of the static point cloud is filtered out. Even though a small part of the static cloud remains, it is sparsely distributed after processing and is completely removed by the subsequent clustering algorithm. Eliminating the static interference further reduces the data volume, so subsequent processing is faster, and obstacles can be tracked more accurately without static point-cloud interference.
The second step: dynamic point cloud clustering
The dynamic point cloud contains data from several dynamic obstacles, and each point carries only coordinate information. To track dynamic obstacles, the points belonging to the same obstacle must be grouped into one class, so that aggregate features of that cloud can be extracted as features of the obstacle. Two common clustering algorithms, DBSCAN and k-means, are compared. k-means requires the number of clusters k to be specified, and the initial cluster centers strongly influence the result; it assigns every point to some class and is sensitive to outliers. DBSCAN can reject noise, requires a neighborhood distance threshold Eps and a sample-count threshold MinPts, and determines the number of clusters automatically. Because the number of dynamic obstacles is uncertain and there is a significant difference between points of the same obstacle and points of different obstacles, the DBSCAN algorithm is selected for clustering in this embodiment.
DBSCAN is a density-based clustering algorithm that assumes classes are determined by how closely the samples are packed. Samples of the same class lie close together: around any sample of a class there exist other samples of that class at short distance. The algorithm divides the points into core points and boundary points, the core points forming the cluster skeleton, and has two core parameters, Eps and MinPts. Eps bounds the neighborhood between points, as follows:

Ne = {x_i ∈ D | distance(x_i, x_j) < Eps}    (1)

If the distance between two points is less than Eps, they are considered to be in the same neighborhood. MinPts is the parameter for determining core points. Here x_i denotes a point of the cloud, distance(x_i, x_j) is the Euclidean distance between points, Eps is the distance threshold under which the clustering algorithm groups points into the same cluster, and D is the data domain. In mathematics, a neighborhood is defined as follows: the region within a given radius e around an object is called the e-neighborhood of that object.
The clustering process is as follows: first, count the number of points in each point's neighborhood and store it in a pts attribute, then check whether each point's pts value exceeds MinPts; if so, the point is a core point and is stored in the core-point array. Second, depth-traverse the core-point array, chaining the core points within the neighborhood of the current core point into one class. Finally, traverse all boundary points and merge the boundary points lying in a core point's neighborhood into that core point's class. The clustering result is shown in Fig. 3: there are three obstacles in the field of view, two pedestrian obstacles and one vehicle-like obstacle.
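A compact sketch of this clustering step using scikit-learn's DBSCAN is shown below; the Eps and MinPts values are illustrative, since the embodiment does not state the values it uses:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_dynamic_points(points, eps=0.3, min_pts=4):
    """Cluster an (N, 2) dynamic point cloud; label -1 marks noise,
    i.e. the sparse leftover static points, which are simply dropped."""
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(points)
    return [points[labels == k] for k in set(labels) if k != -1]
```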
The result shows that after clustering with the DBSCAN algorithm, points belonging to different dynamic obstacles fall into different clusters. The algorithm also filters the static obstacle points that were not removed earlier: those points are few and not densely distributed, so they receive no class, and deleting the unclassified points after clustering removes them.
The third step: convex hull extraction
After point-cloud clustering, each class is treated as a detected moving object, and moving-object data belonging to different frames are associated to determine the object's trajectory. Before that, feature information is needed for each class. The convex hull of a point set Q is the smallest convex polygon such that every point of Q lies on its sides or inside it. The polygon drawn with line segments in Fig. 5 is the convex hull of the point set Q = {p0, p1, ..., p12}. The contour shape of a clustered point cloud is its main classification feature, and the convex hull is computed with the Graham scan algorithm. The algorithm relies on the properties of the cross product: the magnitude |a × b| can be interpreted as the area of the parallelogram spanned by two vectors a, b that share a starting point.
The cross product of two two-dimensional vectors is:

p1 × p2 = x1·y2 - x2·y1    (2)

If p1 × p2 is greater than zero, the rotation from p1 to p2 is counterclockwise; if it is less than zero, the rotation is clockwise.
The convex hull extracted by the Graham algorithm envelops all points of the same group in a polygon. The hull reflects the outer-contour information of the dynamic obstacle, from which its geometric features are then extracted.
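A minimal sketch of the cross-product orientation test and of hull extraction follows; SciPy's ConvexHull is used here as a stand-in for the Graham scan described above:

```python
import numpy as np
from scipy.spatial import ConvexHull

def cross(o, a, b):
    """2-D cross product of vectors OA and OB; a positive value means a counterclockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def hull_points(cluster):
    """Return the convex-hull vertices of a clustered (N, 2) point cloud, in traversal order."""
    cluster = np.asarray(cluster)
    return cluster[ConvexHull(cluster).vertices]
```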
The fourth step: feature extraction
The most critical part of tracking dynamic obstacles is associating the positions of an obstacle at different times into one group. This embodiment correlates data based on observations of the two-dimensional point cloud, the main basis being the geometry of the dynamic obstacle. The data processing has two steps: first, fuzzily classify the obstacles into ellipses, rectangles, and straight lines according to the extracted angle information of the convex hull; then fit the different types of point clouds and extract the position points and geometric features of interest.
The main moving objects in a mine environment are pedestrians and mine cars; a pedestrian resembles an ellipse, while a mine car, observed from different angles, yields rectangular or straight-line point clouds. An arbitrary point cloud can be assigned to one of these three geometries, so the method divides the clustered point clouds into three types: 'e' (ellipse), 'r' (rectangle), and 'l' (straight line).
The radar can only observe the point cloud of one side of a moving object; as shown in Fig. 4, a single-direction point cloud reflects only part of the obstacle's geometry, and its most prominent geometric feature is the included-angle information. For pedestrians, the point cloud in every direction resembles a circular arc, while vehicles can be summarized as rectangles. The fuzzy classification of point clouds by included angle is shown in Fig. 4: clouds like 4(a) are classified as ellipses, clouds like 4(b) as straight lines, and clouds like 4(c) as rectangles.
The angle information of the point cloud is obtained from the convex hull, as shown in Fig. 5.
A group of convex hull points of the cloud is obtained after the Graham-scan computation. Observation of radar data from actual vehicles and pedestrians shows that the point closest to the lidar is typically the vertex of the angle. The angle information with this closest point as vertex is therefore computed and used as the angle feature of the point cloud. The specific procedure is as follows:
1) Take the point of the obtained convex-hull array closest to the lidar as the vertex.
2) Starting from the lower-left point of the convex-hull array, take hull points clockwise as the endpoint of one side of the angle and counterclockwise as the endpoint of the other side; compute each angle, and record the maximum angle and the angle sum. While traversing the hull points, an angle is considered invalid when its endpoint is within 0.05 m of the vertex. Summarizing a large number of angle computations yields the classification ranges for clustered point clouds shown in Table 1:
Table 1. Angle classification criteria for clustered point clouds (table image not reproduced in the text)
3) The classification criteria of Table 1 are obtained from angle statistics after extracting the radar point-cloud angle features of pedestrians and vehicle-like obstacles. The average angle is used first as the basis for judgment; if the average falls within the interval 0.7-3.0, the maximum angle is used instead. Experiments show the accuracy of this method is about 90%.
4) Once the type is determined, it is known whether the point cloud originates from a vehicle or a pedestrian, and the cloud is processed further according to their different motion characteristics.
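The vertex-angle statistic of steps 1)-2) can be sketched as follows; this is a simplified reading (all endpoint pairs are enumerated rather than the clockwise/counterclockwise sweep), and the helper names are illustrative:

```python
import numpy as np

def angle_features(hull, radar_pos=(0.0, 0.0), min_side=0.05):
    """Maximum and mean angle (radians) at the hull point nearest the radar."""
    hull = np.asarray(hull, dtype=float)
    vertex = hull[np.argmin(np.linalg.norm(hull - radar_pos, axis=1))]
    angles = []
    for p in hull:            # endpoint of one side of the angle
        for q in hull:        # endpoint of the other side
            u, v = p - vertex, q - vertex
            if np.linalg.norm(u) < min_side or np.linalg.norm(v) < min_side:
                continue      # endpoint within 0.05 m of the vertex: angle invalid
            c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            angles.append(np.arccos(np.clip(c, -1.0, 1.0)))
    if not angles:
        return 0.0, 0.0
    return max(angles), float(np.mean(angles))  # compared against the Table 1 ranges
```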
Different fitting formulas are used for the different types of shape point clouds in order to compute their aggregate features. Formula (3) fits the ellipse and yields its center, major axis, minor axis, and deflection angle. Formula (4) fits a straight line and yields its length, start point, end point, and center point. The rectangle-fitting function provided by OpenCV yields the center point, length, width, and four corner points of the rectangle; the point representing the position is then chosen according to the difference between pedestrians and vehicles.
f = Σ (x_i² + A·x_i·y_i + B·y_i² + C·x_i + D·y_i + E)²    (3)

where (x_i, y_i) are the points of the fitted ellipse and A, B, C, D, E are the coefficients of the non-standard equation of the fitted ellipse;

R² = Σ (y_i - (a·x_i + b))²    (4)

where (x_i, y_i) are the points of the fitted line, a is the slope of the fitted line, b is its intercept, and R² is the sum of squared deviations between the points and the fitted line.
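A sketch of the shape-fitting step is given below; the patent only names OpenCV's rectangle fit, so cv2.fitEllipse and numpy's polyfit are used here as stand-ins for formulas (3) and (4):

```python
import numpy as np
import cv2

def fit_shape(cluster, shape_type):
    """Fit a fuzzily classified cluster ('e', 'r' or 'l') and return its geometry."""
    pts = np.asarray(cluster, dtype=np.float32)
    if shape_type == "e":                        # ellipse: center, axes, deflection angle
        (cx, cy), (major, minor), angle = cv2.fitEllipse(pts)
        return {"center": (cx, cy), "axes": (major, minor), "angle": angle}
    if shape_type == "r":                        # rectangle: center, size, four corners
        rect = cv2.minAreaRect(pts)
        return {"center": rect[0], "size": rect[1], "corners": cv2.boxPoints(rect)}
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)   # line: slope a and intercept b, as in (4)
    return {"slope": a, "intercept": b, "start": tuple(pts[0]), "end": tuple(pts[-1])}
```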
and (3) processing the vehicle: as shown in FIG. 6, assuming a mine car is traveling, the corresponding lidar data will undergo three state changes, the first case is shown in FIG. 6(a), where the car is near the rectangular boundary formed by the point cloud on the head and near the radar side, the second case is shown in FIG. 6(b), where the car is near the straight boundary formed by the point cloud on the radar side, and the third case is shown in FIG. 6(c), where the car is near the rectangular boundary formed by the point cloud on the tail and near the radar side.
This embodiment selects the right-angle point formed by the vehicle front end and the side near the radar to represent the vehicle position, because the points that best reflect the vehicle contour during this process are the right-angle point a at the front end and the right-angle point b at the rear end. These two points reflect the foremost and rearmost ends of the vehicle respectively, and together reflect its side. The vehicle length len can be obtained once the straight-line state has been collected, and with len the position of point a can be computed from the position of point b, so the front-end right-angle point is obtainable in all three cases. The estimation process is shown in Fig. 7 and comprises the following steps:
1) Determine the driving direction of the vehicle: take the vector difference between the rectangle center of the vehicle's current position and the rectangle center at the previous moment, giving a vector A along the driving direction. If the current state is the straight-line state, the difference between the start and end points of the line is used as vector A instead;
2) Determine the driving state of the vehicle: the difference between the center of the currently fitted rectangle and the collected right-angle point gives a vector B;
3) Compute the angle between vectors A and B. If it is greater than 90 degrees, the rectangle center lies ahead of the right-angle point and the vehicle is in the third state; if it is less than 90 degrees, the rectangle center lies behind the right-angle point and the vehicle is in the first state;
4) In the third state, the right-angle point a is computed with formulas (5) and (6):

Xa = Xb + len·cos(arctan(angle))    (5)

Ya = Yb + len·sin(arctan(angle))    (6)
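As an illustrative sketch of formulas (5)-(6) (the state test of steps 1)-3) is assumed to have selected the third state; function and argument names are not from the patent):

```python
import numpy as np

def front_corner(b, length, slope):
    """Estimate the front right-angle point a from the rear point b,
    the vehicle length len, and the heading slope, per formulas (5)-(6)."""
    heading = np.arctan(slope)   # arctan(angle) in the patent's notation
    return np.array([b[0] + length * np.cos(heading),
                     b[1] + length * np.sin(heading)])
```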
Processing of pedestrians: the center of the ellipse represents the pedestrian position. Since a pedestrian's geometry resembles an ellipse, a single pedestrian occupies less than one square meter, the fitted ellipse is small, and the distance from the ellipse center to its contour is short, representing the pedestrian obstacle's position by the ellipse center is feasible.
Feature extraction thus yields dynamic-obstacle type and position information within the allowable error range, and data association is then performed on this reliable information.
The fifth step: data association
After the above operations, the original point-cloud data are converted into the characteristic shapes shown in Table 2. Data association links the tracked trajectory to the clustered point cloud detected in the current data frame in order to update the motion trajectory of the dynamic obstacle. The invention combines the nearest-neighbor (NN) and multiple hypothesis tracking (MHT) algorithms and proposes a multi-target association method based on a weight distance.
Table 2. Point-cloud characteristic information (table image not reproduced in the text)
Common data-association algorithms include the nearest-neighbor algorithm and the multiple hypothesis tracking algorithm. The multi-target association scheme that combines the characteristics of both and uses a weight distance as the association basis is called the weight-distance algorithm. It assigns different weights according to quantitative features of the point cloud and the importance of each feature to data association, so the features of the obstacle cloud are considered comprehensively while the probability computation time is reduced.
The candidate features are: the distance between obstacle tracking points; the width and length of the obstacle; the angle between the obstacle's rectangular boundary and the x axis; and the angle between that boundary and the y axis. For the same obstacle, the fitted geometric size may change with distance and direction, i.e. the size features introduce error; likewise, the angle information of a sparse two-dimensional point cloud is not stable enough. The final weighting therefore uses the fuzzy classification type of the point cloud and the corrected position as the association basis.
The weight formula is:

weight = coef[0]·W[0] + coef[1]·W[1]    (7)

The coef[] array in formula (7) holds the coefficient of each weight. W[0] is the type weight: 1 if the types are the same, -1 if they differ. W[1] is the distance weight, the distance from the trajectory's predicted point to the current-frame point cloud; since a shorter distance means stronger association and a higher weight, the distance weight is inversely related to the final weight, and this embodiment uses (coefficient - distance) as the weighted distance.
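A sketch of formula (7) and a greedy use of it for matching follows; the threshold and coefficient values are illustrative, not taken from the patent:

```python
import numpy as np

WEI_LIMIT = 0.5      # association-success threshold wei_limit (illustrative value)
COEF = (1.0, 1.0)    # coefficients coef[0], coef[1] (illustrative values)

def weight(track, det):
    """Association weight of formula (7): type weight plus inverted distance weight."""
    w_type = 1.0 if track["type"] == det["type"] else -1.0
    dist = np.linalg.norm(np.asarray(track["pred"]) - np.asarray(det["pos"]))
    return COEF[0] * w_type + (COEF[1] - dist)   # closer detection, larger weight

def associate(tracks, detections):
    """Each detection joins its best track if the weight clears WEI_LIMIT,
    otherwise it starts a new track (the first case discussed below)."""
    matches, new_tracks = [], []
    for det in detections:
        scores = [weight(t, det) for t in tracks]
        if scores and max(scores) > WEI_LIMIT:
            matches.append((int(np.argmax(scores)), det))
        else:
            new_tracks.append(det)
    return matches, new_tracks
```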
Following the association strategy of the multi-target association algorithm, the trajectory-update matching process is discussed case by case, as shown in Fig. 8.
The matching process mainly considers the following three cases:
in the first case, a newly emerging obstacle is considered to be a newly emerging obstacle if its association weights with all trajectories are less than the successfully associated limit value wei _ limit. And establishing a new track. Since only one location point cannot predict the trajectory, the initial radius pre _ radius is defined to draw a circle as shown in fig. 8, and the distance between the point cloud location and the circle trajectory is taken as the basis for the next time.
In the second case, the obstacle matches a trajectory: if the association weight between the obstacle and an existing trajectory exceeds wei_limit and is the maximum, the trajectory is updated with the information provided by the obstacle. Because fuzzy-classification errors occur, a trajectory may contain different point-cloud types; when updating, the number of each type in the trajectory is counted and the trajectory type is set to the most numerous one. The type count, size, and position of the trajectory are updated, the position and size being a weighted sum of the current obstacle information and the predicted trajectory-point information, in the spirit of Kalman filtering.
In the third case, the obstacle disappears: trajectories that match no obstacle are deleted.
The sixth step: trajectory prediction
After clustering, the center coordinates of the characteristic shapes are stored as trajectory points. For a more accurate description and prediction of the motion trajectory, a least-squares polynomial curve is preferentially used to fit each trajectory, with the following formulas:
A·X = Y    (8)

R² = Σ (y_i - (a0 + a1·x_i + a2·x_i² + a3·x_i³))²    (9)

where (x_i, y_i) are the trajectory points of the fitted cubic polynomial, a0, a1, a2, a3 are the coefficients of the fitted cubic polynomial, and R² is the sum of squared deviations between the trajectory points and the fitted curve.

A set of trajectory points is substituted to solve for the parameter vector A that minimizes formula (8). The polynomial-curve least squares of formula (9) requires a sufficient number of trajectory points (at least 4); when fewer than 4 trajectory points have been collected, the least-squares straight-line fit of formula (10) is used for prediction instead:

R² = Σ (y_i - (a1·x_i + a0))²    (10)

where (x_i, y_i) are the trajectory points of the fitted line, a1 is the slope of the fitted line, a0 is its intercept, and R² is the sum of squared deviations between the trajectory points and the fitted line.
the implementation of the track prediction is divided into straight line prediction and polynomial curve prediction, and the basic flow is that firstly, according to the derived fitting function, the x of the x value of the last track point is properly increasedpSubstituting into the formula to find ypCoordinate (x)p,yp) Namely the predicted track points.
x_p = x[n] + len/(t*n)    (11)

x_p = x[n] + len/(t*n)*k    (12)

Formula (11) computes x_p for straight-line prediction: x[n] is the current trajectory point, len is the sum of the absolute x-coordinate increments of the existing trajectory points, t is an adjustable parameter, and n is the number of existing trajectory points. This embodiment estimates the likely x increment at the current moment from the average x increment of the existing trajectory. Formula (12) computes x_p for polynomial-curve prediction; x[n], n, and t have the same meanings as in straight-line prediction, len is the sum of the Euclidean distances between existing trajectory points, and k is the slope of the current trajectory. Curve prediction estimates the likely x increment at the current moment from the average Euclidean distance of the existing trajectory, the direction of change being tangential to the curve. The trajectory-prediction results are shown in Fig. 9, where Fig. 9(a) is a straight-line-fitted trajectory and Fig. 9(b) a polynomial-fitted trajectory.
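A sketch combining formulas (9)-(12) is given below; the fallback to a line fit below 4 points follows the text, while the parameter default and function name are assumptions:

```python
import numpy as np

def predict_point(xs, ys, t=1.0):
    """Predict the next trajectory point (x_p, y_p) from the stored trajectory."""
    xs, ys = np.asarray(xs, dtype=float), np.asarray(ys, dtype=float)
    n = len(xs)
    if n < 4:                                        # line fit, formulas (10) and (11)
        a1, a0 = np.polyfit(xs, ys, 1)               # needs at least 2 points
        xp = xs[-1] + np.sum(np.abs(np.diff(xs))) / (t * n)
        yp = a1 * xp + a0
    else:                                            # cubic fit, formulas (9) and (12)
        coeffs = np.polyfit(xs, ys, 3)
        length = np.sum(np.hypot(np.diff(xs), np.diff(ys)))
        k = np.polyval(np.polyder(coeffs), xs[-1])   # slope of the current trajectory
        xp = xs[-1] + length / (t * n) * k
        yp = np.polyval(coeffs, xp)
    return xp, yp
```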
To verify the feasibility of the algorithm, a grid map of the laboratory is built with a SLAM mapping algorithm; the algorithm then intercepts the lidar data that would otherwise go directly to the navigation module, processes them, and generates the obstacle trajectory-prediction point cloud, which is passed to the navigation module through the ROS framework. Navigation is then performed with the assistance of the generated trajectory point cloud, with the following result: compared with running without this processing, the trajectory point cloud generated by the algorithm effectively assists the robot's obstacle-avoidance navigation.
In summary, the dynamic obstacle tracking method and system based on sparse lidar data of this embodiment filter the static-obstacle lidar point cloud with a grid map, and the few remaining static points, being extremely sparse, are removed by the DBSCAN algorithm, enhancing the final filtering effect; by extracting the angle information of the point cloud's convex hull, the cloud is fuzzily classified as an ellipse, rectangle, or straight line, and the dimensions of the fitted shape then give the correct position point, ensuring accurate data association; the characteristics of the nearest-neighbor and multi-hypothesis algorithms are combined into an improved multi-target association algorithm that completes data association efficiently while guaranteeing accuracy, making the method worth popularizing.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1.一种基于稀疏激光雷达数据的动态障碍物追踪方法,其特征在于,包括以下步骤:1. a dynamic obstacle tracking method based on sparse lidar data, is characterized in that, comprises the following steps: S1:消除静态背景S1: Eliminate static background 确定可行驶区域,删除可行驶区域的地面点云和超过指定高度的点云,剩余点云即为障碍物点云;Determine the drivable area, delete the ground point cloud in the drivable area and the point cloud exceeding the specified height, and the remaining point cloud is the obstacle point cloud; S2:动态点云聚类S2: Dynamic point cloud clustering 对步骤S1中的障碍物点云进行聚类,将属于不同动态障碍物的点云分为不同的簇;Clustering the obstacle point clouds in step S1, and dividing the point clouds belonging to different dynamic obstacles into different clusters; S3:凸包提取S3: Convex Hull Extraction 经过步骤S2的聚类后,提取聚类点云的凸包;After the clustering in step S2, the convex hull of the clustered point cloud is extracted; S4:特征提取S4: Feature Extraction 利用凸包的角度信息进行模糊分类,再对这些点云拟合获得障碍物位置信息;Use the angle information of the convex hull to perform fuzzy classification, and then fit these point clouds to obtain the obstacle position information; S5:数据关联S5: Data Association 根据不同帧中障碍物的位置信息关联属于同一障碍物的不同时刻的位置,并拟合为轨迹,将已经追踪到的轨迹与当前数据帧检测到的聚类点云关联,更新动态障碍物的运动轨迹;According to the position information of obstacles in different frames, the positions belonging to the same obstacle at different times are associated and fitted as trajectories. The tracked trajectories are associated with the clustered point clouds detected in the current data frame, and the dynamic obstacles are updated. movement track; S6:预测轨迹S6: Predict the trajectory 利用轨迹函数预测动态障碍物的位置:使用最小二乘法拟合轨迹点得到轨迹函数,根据轨迹函数在最后一个轨迹点的横坐标基础增加适当增量x得到xp,带入公式求得yp坐标,(xp,yp)即为预测轨迹点,直线预测时带入的最小二乘法公式如下:Use the trajectory function to predict the position of the dynamic obstacle: use the least squares method to fit the trajectory points to obtain the trajectory function, add an appropriate increment x to the abscissa base of the last trajectory point according to the trajectory function to obtain x p , and bring it into the formula to obtain y p The coordinates, (x p , y p ) are the predicted trajectory points. The least squares formula introduced in the straight line prediction is as follows:
Figure FDA0003333278180000011
Figure FDA0003333278180000011
其中,(xi,yi)是拟合直线的轨迹点,a1是拟合直线的斜率,a0是拟合直线的截距,R2为轨迹点与拟合直线的距离差的和,多项式曲线预测时带入的公式如下:Among them, (x i , y i ) is the trajectory point of the fitted straight line, a 1 is the slope of the fitted straight line, a 0 is the intercept of the fitted straight line, and R 2 is the sum of the distance difference between the trajectory point and the fitted straight line , the formula introduced in the polynomial curve prediction is as follows:
Figure FDA0003333278180000012
Figure FDA0003333278180000012
其中,(xi,yi)是拟合三次多项式的轨迹点,a0、a1、a2、a3为拟合三次多项式的系数,R2为轨迹点与拟合曲线的距离差的和;Among them, (x i , y i ) is the trajectory point of the fitting cubic polynomial, a 0 , a 1 , a 2 , and a 3 are the coefficients of the fitting cubic polynomial, and R 2 is the distance difference between the trajectory point and the fitting curve and; 轨迹函数包括直线预测和多项式曲线预测函数,直线预测函数如下:The trajectory function includes straight line prediction and polynomial curve prediction function. The straight line prediction function is as follows: xp=x[n]+(len)/(t*n)x p =x[n]+(len)/(t*n) 其中,X[n]是当前轨迹点,len为已有轨迹点的x坐标的增量绝对值的和,t为可调参数,n为已有轨迹点个数,在直线预测中根据已有轨迹点之间的x增量的平均值估计当前时刻的x增量;其多项式曲线预测函数如下:Among them, X[n] is the current trajectory point, len is the sum of the incremental absolute value of the x-coordinate of the existing trajectory point, t is an adjustable parameter, and n is the number of existing trajectory points. The average of the x-deltas between the trajectory points estimates the x-delta at the current moment; its polynomial curve prediction function is as follows: xp=x[n]+len/(t*n)*kxp=x[n]+len/(t*n)*k 其中,X[n]是当前轨迹点,t为可调参数,n为已有轨迹点个数,len为已有轨迹点的欧式距离的和,k为当前轨迹的斜率,在多项式曲线预测中根据已有轨迹点之间的x增量的平均值估计当前时刻的x增量。Among them, X[n] is the current trajectory point, t is an adjustable parameter, n is the number of existing trajectory points, len is the sum of the Euclidean distances of the existing trajectory points, and k is the slope of the current trajectory. In the polynomial curve prediction Estimate the x-delta at the current moment based on the average of the x-deltas between the existing trajectory points.
2. The dynamic obstacle tracking method based on sparse lidar data according to claim 1, wherein the point cloud contains only coordinate information, in the form of two-dimensional coordinates (x, y).

3. The dynamic obstacle tracking method based on sparse lidar data according to claim 1, wherein step S1 specifically comprises the following steps:

S11: Determine the acquisition plane. Set the height of the acquisition plane to 0.8–1.0 m according to the height of the main obstacles.

S12: Generate the template. Build a grid map, or convert an existing mine map into one; the grid map is divided into an obstacle layer, an inflation layer, and a drivable layer.

S13: Preprocess. Load the grid map built in step S12, set the inflation-layer thickness to 0.05 m, receive the raw lidar data, then process the point cloud and remove invalid values from it.

S14: Filter the point cloud. Compare each point's coordinates against the grid map to determine whether it lies in the drivable layer; keep the point if it does, otherwise delete it, as in the sketch below.
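A minimal sketch of the S14 filter, assuming the grid map is a 2-D NumPy array in which a cell value of 0 marks the drivable layer; the cell encoding, resolution, and origin below are illustrative assumptions, not prescribed by the claim:

    import numpy as np

    DRIVABLE = 0  # assumed cell label for the drivable layer

    def filter_points(points, grid, resolution=0.05, origin=(0.0, 0.0)):
        """S14: keep only points whose grid cell lies in the drivable layer."""
        kept = []
        for x, y in points:
            col = int((x - origin[0]) / resolution)
            row = int((y - origin[1]) / resolution)
            # Discard points that fall outside the map or outside the drivable layer
            if (0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]
                    and grid[row, col] == DRIVABLE):
                kept.append((x, y))
        return kept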
4. The dynamic obstacle tracking method based on sparse lidar data according to claim 1, wherein in step S2 the clustering divides the point cloud into core points and boundary points, the core points forming the skeleton of the clusters. Two core parameters are involved, Eps and MinPts. Eps bounds the neighborhood between points:

Ne = {xᵢ ∈ D | distance(xᵢ, xⱼ) < Eps}

where xᵢ denotes a point of the point cloud, distance(xᵢ, xⱼ) is the Euclidean distance between points, Eps is the distance threshold within which the clustering algorithm treats points as belonging to the same cluster, and D is the point set. When the distance between two points is less than Eps, the two points are considered to belong to the same cluster. MinPts is the parameter for identifying core points: a point is a core point if the number of points within Eps of it exceeds MinPts.

5. The dynamic obstacle tracking method based on sparse lidar data according to claim 4, wherein the clustering process comprises the following steps:

S21: Count the number of points in each point's neighborhood and store it in the point's pts attribute; if pts exceeds MinPts, the point is a core point and is stored in the core-point array.

S22: Traverse the core-point array and chain together, into one cluster, the core points lying within the current core point's neighborhood.

S23: Traverse all boundary points and merge each boundary point lying within a core point's neighborhood into that core point's cluster (see the sketch below).
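The S21–S23 pass is a DBSCAN-style scheme; a compact sketch under the claim's definitions of Eps and MinPts (the brute-force O(n²) neighborhood search is for clarity only; a k-d tree would be used in practice):

    import math

    def cluster(points, eps, min_pts):
        """DBSCAN-style clustering per claims 4-5."""
        n = len(points)
        def dist(i, j):
            return math.hypot(points[i][0] - points[j][0],
                              points[i][1] - points[j][1])
        # S21: pts attribute = neighborhood size; core points exceed MinPts
        neigh = [[j for j in range(n) if j != i and dist(i, j) < eps]
                 for i in range(n)]
        core = {i for i in range(n) if len(neigh[i]) > min_pts}
        # S22: chain core points lying in each other's Eps-neighborhood
        labels, cid = {}, 0
        for c in core:
            if c in labels:
                continue
            stack = [c]
            while stack:
                i = stack.pop()
                if i in labels:
                    continue
                labels[i] = cid
                stack.extend(j for j in neigh[i] if j in core and j not in labels)
            cid += 1
        # S23: merge each boundary point into the cluster of a core point
        # whose neighborhood contains it
        for i in range(n):
            if i not in labels:
                for c in core:
                    if dist(i, c) < eps:
                        labels[i] = labels[c]
                        break
        return labels  # point index -> cluster id; unassigned points are noise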
6. The dynamic obstacle tracking method based on sparse lidar data according to claim 1, wherein in step S4 the moving objects in the mine environment are pedestrians and mine cars: pedestrians are classified as elliptical point clouds, while mine cars, observed from different angles, are classified as rectangular or straight-line point clouds. Based on these characteristics, the angle information of the convex hull is used to fuzzily classify the point cloud as an ellipse, a rectangle, or a straight line.

7. The dynamic obstacle tracking method based on sparse lidar data according to claim 6, wherein in step S4 the included-angle information of a point cloud is obtained from its convex hull: given a set of convex-hull points, the hull point nearest the lidar is chosen as the vertex of the angle, and the angle statistics taken at that vertex serve as the angle feature of the point cloud, i.e. the included-angle information.

8. The dynamic obstacle tracking method based on sparse lidar data according to claim 7, wherein the angle feature of the point cloud is computed as follows:

S41: Take the point of the convex-hull array nearest the lidar as the vertex.

S42: Starting clockwise from the bottom-left point of the convex-hull array as the endpoint of one side of the angle, and proceeding counterclockwise with each convex-hull point as the endpoint of the other side, compute the angles, keeping track of the maximum angle and the angle sum.

S43: Derive the angle classification criteria from the angle statistics collected by extracting this feature over dynamic obstacles, and determine the angle type according to these criteria.

S44: Determine the source of the point-cloud data from its angle type, fit each type of shape point cloud with the fitting formula matching the driving characteristics of its source, and compute its geometric features from the fit. A sketch of the angle feature follows.
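One possible reading of S41–S42, with the lidar assumed at the origin and one side of the angle fixed at the bottom-left hull point; the sweep order in the claim leaves room for interpretation, so this is a sketch, not the definitive statistic:

    import math

    def angle_feature(hull):
        """Maximum angle and angle sum at the hull vertex nearest the lidar."""
        # S41: apex = convex-hull point closest to the lidar (origin)
        apex = min(hull, key=lambda p: math.hypot(p[0], p[1]))
        # S42: fix one side at the bottom-left hull point, sweep the other side
        base = min(hull, key=lambda p: (p[0], p[1]))
        def angle_at_apex(p):
            v1 = (base[0] - apex[0], base[1] - apex[1])
            v2 = (p[0] - apex[0], p[1] - apex[1])
            norm = math.hypot(*v1) * math.hypot(*v2)
            if norm == 0.0:
                return 0.0
            cosang = max(-1.0, min(1.0, (v1[0] * v2[0] + v1[1] * v2[1]) / norm))
            return math.acos(cosang)
        angles = [angle_at_apex(p) for p in hull if p != apex and p != base]
        return max(angles, default=0.0), sum(angles)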
9. The dynamic obstacle tracking method based on sparse lidar data according to claim 8, wherein in step S44 an ellipse fitting formula is used to fit an ellipse and obtain its center, major axis, minor axis, and deflection angle; the ellipse fitting formula is:

f = Σ(xᵢ² + A·xᵢyᵢ + B·yᵢ² + C·xᵢ + D·yᵢ + E)

where (xᵢ, yᵢ) are the points used to fit the ellipse and A, B, C, D, E are the coefficients of the non-standard equation of the fitted ellipse.

A line fitting formula is used to fit a straight line and obtain its length and the coordinates of its start point, end point, and center point; the line fitting formula is:

R² = Σᵢ [yᵢ − (a·xᵢ + b)]²
where (xᵢ, yᵢ) are the points used to fit the line, a is the slope of the fitted line, b is its intercept, and R² is the sum of squared distances between the points and the fitted line.

An existing rectangle fitting function is used to obtain the rectangle's center point, length, width, and four corner coordinates. The point representing the position is then chosen according to whether the object is a pedestrian or a mine car: for a vehicle, the right-angle corner formed by the front end of the car and the side nearest the lidar represents the vehicle's position, while a pedestrian is represented by the ellipse, with the ellipse's center taken as the pedestrian's position point. A sketch of the ellipse fit follows.
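The ellipse fit of S44 reduces to linear least squares on the non-standard conic above; a sketch that also recovers the center from the vanishing-gradient conditions (axis lengths and deflection angle follow from the coefficients in the usual way and are omitted here for brevity):

    import numpy as np

    def fit_ellipse(points):
        """Least-squares fit of x^2 + A*xy + B*y^2 + C*x + D*y + E = 0."""
        pts = np.asarray(points, dtype=float)
        x, y = pts[:, 0], pts[:, 1]
        # Each point contributes one row: A*xy + B*y^2 + C*x + D*y + E ~ -x^2
        M = np.column_stack([x * y, y * y, x, y, np.ones_like(x)])
        A, B, C, D, E = np.linalg.lstsq(M, -x * x, rcond=None)[0]
        # Center (x0, y0): the gradient of the conic vanishes there,
        # i.e. 2*x0 + A*y0 + C = 0 and A*x0 + 2*B*y0 + D = 0
        x0, y0 = np.linalg.solve([[2.0, A], [A, 2.0 * B]], [-C, -D])
        return (A, B, C, D, E), (x0, y0)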
10. A dynamic obstacle tracking system based on sparse lidar data, characterized in that it tracks dynamic obstacles using the tracking method of any one of claims 1 to 9, and comprises:

an elimination module for determining the drivable area and deleting the ground point cloud within it and any points above a certain height;

a clustering module for clustering the obstacle point cloud and dividing points belonging to different dynamic obstacles into different clusters;

a convex hull extraction module for extracting the convex hull of each clustered point cloud after clustering;

a feature extraction module for performing fuzzy classification with the angle information of the convex hull and then fitting the point clouds to obtain obstacle position information;

a data association module for associating, from the obstacle positions in different frames, the positions of the same obstacle at different times, fitting them into a trajectory, associating the tracked trajectories with the clustered point clouds detected in the current data frame, and updating the motion trajectories of the dynamic obstacles;

a trajectory prediction module for predicting the positions of dynamic obstacles with the trajectory function;

and a control module for directing the remaining modules to execute instructions;

the elimination, clustering, convex hull extraction, feature extraction, data association, and trajectory prediction modules are all electrically connected to the control module.
CN202010192403.6A 2020-03-18 2020-03-18 A dynamic obstacle tracking method based on sparse lidar data Active CN111337941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010192403.6A CN111337941B (en) 2020-03-18 2020-03-18 A dynamic obstacle tracking method based on sparse lidar data

Publications (2)

Publication Number Publication Date
CN111337941A CN111337941A (en) 2020-06-26
CN111337941B (en) 2022-03-04

Family

ID=71184332

Country Status (1)

Country Link
CN (1) CN111337941B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6788692B2 (en) * 2017-07-13 2020-11-25 ベイジン・ボイジャー・テクノロジー・カンパニー・リミテッド Systems and methods for trajectory determination
CN109509210B (en) * 2017-09-15 2020-11-24 百度在线网络技术(北京)有限公司 Obstacle tracking method and device
US11104334B2 (en) * 2018-05-31 2021-08-31 Tusimple, Inc. System and method for proximate vehicle intention prediction for autonomous vehicles

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676744A (en) * 2007-10-31 2010-03-24 北京航空航天大学 High-precision tracking method for small and weak targets under complex background and low signal-to-noise ratio
CN104417562A (en) * 2013-08-29 2015-03-18 株式会社电装 Method and apparatus for recognizing road shape, program, and recording medium
KR20170074542A (en) * 2015-12-22 2017-06-30 부산대학교 산학협력단 Apparatus and Method for Building Indoor Map using Geometric Features and Probability Method
JP2018063236A (en) * 2016-10-13 2018-04-19 バイドゥ ネットコム サイエンス アンド テクノロジー(ペキン) カンパニー リミテッド Method and apparatus for annotating point cloud data
WO2019136479A1 (en) * 2018-01-08 2019-07-11 The Regents On The University Of California Surround vehicle tracking and motion prediction
CN108983248A (en) * 2018-06-26 2018-12-11 长安大学 Networked-vehicle positioning method based on 3D lidar and V2X
CN109682388A (en) * 2018-12-21 2019-04-26 北京智行者科技有限公司 Method for determining a following path
CN110018489A (en) * 2019-04-25 2019-07-16 上海蔚来汽车有限公司 Lidar-based target tracking method, apparatus, controller and storage medium
CN110210389A (en) * 2019-05-31 2019-09-06 东南大学 Multi-target recognition and tracking method for road traffic scenes
CN110658531A (en) * 2019-08-23 2020-01-07 畅加风行(苏州)智能科技有限公司 Dynamic target tracking method for autonomous port vehicles
CN110487183A (en) * 2019-08-27 2019-11-22 中国科学技术大学 Multi-target fiber position precision detection system and method of use
CN110567324A (en) * 2019-09-04 2019-12-13 深圳市唯特视科技有限公司 Multi-target group threat-degree prediction apparatus and method based on DS evidence theory

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
A micro-optical fiber positioner; Hongzhuan Hu et al.; Advances in Optical and Mechanical Technologies for Telescopes and Instrumentation III; 2018-07-10; pp. 1-7 *
Dynamic Multi-LiDAR Based Multiple Object Detection and Tracking; Muhammad Sualeh et al.; Sensors 2019; 2019-03-26; pp. 1-20 *
Analysis of spherical measurement errors of the LAMOST focal plate; Zhou Zengxiang et al.; Journal of University of Science and Technology of China; Vol. 37, No. 6; 2007-06; pp. 631-635 *
The obstacle detection and obstacle avoidance algorithm based on 2-D lidar; Yan Peng et al.; 2015 IEEE International Conference on Information and Automation; 2015-10-01; pp. 1648-1653 *
Dynamic obstacle detection and tracking method based on 3D lidar; Zou Bin et al.; Automobile Technology; 2017-08; pp. 19-25 *
Maneuvering target trajectory acquisition based on polynomial predictive filtering; Sun Shuang et al.; Journal of Nanjing Normal University (Engineering and Technology Edition); Vol. 9, No. 4; 2009-12; pp. 12-15 *
Moving target trajectory detection model based on airborne lidar imaging; Kou Tian et al.; Laser & Optoelectronics Progress; 2015-10; pp. 1-5 *
Trajectory prediction of dynamic vehicles on structured roads; Xie Hui et al.; Journal of Automotive Safety and Energy; 2019-06; pp. 413-422 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant