
CN109470255B - High-precision map automatic generation method based on high-precision positioning and lane line identification - Google Patents


Info

Publication number
CN109470255B
CN109470255B
Authority
CN
China
Prior art keywords
frame
map
lane line
point set
precision
Prior art date
Legal status
Active
Application number
CN201811468535.6A
Other languages
Chinese (zh)
Other versions
CN109470255A (en)
Inventor
胡禹超
戴震
Current Assignee
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN201811468535.6A priority Critical patent/CN109470255B/en
Publication of CN109470255A publication Critical patent/CN109470255A/en
Application granted granted Critical
Publication of CN109470255B publication Critical patent/CN109470255B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G01C21/32 - Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Instructional Devices (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method for automatically generating a high-precision map based on high-precision positioning and lane line recognition, comprising: synchronizing the high-precision positioning data acquired each time with the lane line data; building map frames from the synchronized data and storing them in a map frame database; matching newly acquired lane line data against all established map frames, creating a new map frame if the match fails and updating the matched frame's information if it succeeds; applying inter-frame smoothing to all existing frames to obtain smoothed cubic curves; and splicing the cubic curves to generate the high-precision map. The method produces a high-precision map that is accurate to the lane line on the basis of high-precision positioning and splices lane lines automatically, which reduces the complexity of generating a high-precision map and avoids the heavy manual labeling and high error rate of traditional high-precision map production.

Description

High-precision map automatic generation method based on high-precision positioning and lane line identification
Technical Field
The invention relates to the technical field of unmanned driving, in particular to a high-precision map automatic generation method based on high-precision positioning and lane line identification.
Background
A high-precision map is a scarce yet essential resource in the field of unmanned driving and plays a core role in it: it allows an unmanned vehicle to perceive complex road information such as gradient, curvature, and heading in advance and, combined with intelligent path planning, to make correct decisions, making it an indispensable data source for autonomous driving. The information collected by the vehicle's sensors must be compared against a stored high-precision map to determine position and heading so that the vehicle can drive safely to its destination, so the accuracy of high-precision map data acquisition is critical for unmanned driving. Traditional high-precision map production requires a large amount of manual annotation, which is time-consuming and labor-intensive, and the error rate introduced by manual labeling is high, which hinders the development of unmanned driving. Because current high-precision map production demands high accuracy, involves complex computation, and takes considerable time, it is highly desirable to provide a method that, on the basis of high-precision positioning and lane line recognition, splices lane lines automatically and generates a high-precision map in a short time.
Disclosure of Invention
An object of the present invention is to solve at least the above problems and to provide at least the advantages described later.
It is still another object of the present invention to provide a high-precision map automatic generation method based on high-precision positioning and lane line identification, so as to generate a high-precision map that is accurate to a lane line based on high-precision positioning and can automatically splice the lane lines, reduce the complexity of generating the high-precision map, and avoid the problems of large amount of labor consumption and high error rate required by the conventional high-precision map production.
To achieve these objects and other advantages in accordance with the present invention, there is provided a high-precision map automatic generation method based on high-precision positioning and lane line recognition, comprising:
Step 1, synchronizing the high-precision positioning data acquired each time with the lane line data through time alignment processing, so as to obtain the position and attitude of the synchronized high-precision positioning data.
Step 2, establishing a map frame from the synchronized lane line data and high-precision positioning data obtained in step 1, and storing it in a map frame database.
Step 3, matching the newly acquired lane line data against all the map frames established from previously acquired lane line data; if the matching fails, establishing a new map frame from the newly acquired lane line data; if the matching succeeds, updating the information of the map frame established in step 2 until the update is complete.
Step 4, performing inter-frame smoothing on all existing frames among the map frames established in steps 2 and 3, and recalculating, in combination with the point sets of the map frames updated in step 3, the cubic curves representing the lane line information of the smoothed frames.
Step 5, splicing the cubic curves to generate the high-precision map.
Preferably, step 1 further comprises:
taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line timestamp are, respectively:
p′ = p + v(t_m - t_l);    (1)
r′ = r·ω(t_m - t_l);    (2)
where t_m denotes the timestamp at which the lane line data are acquired;
t_l denotes the timestamp at which the high-precision positioning data are acquired;
p, r, v, and ω denote, respectively, the position, attitude, linear velocity, and angular velocity of the high-precision positioning data before alignment.
Preferably, step 2 further comprises:
the map frame mainly comprises the following elements:
P_F: the position of the frame's spatial information; R_F: the attitude of the frame's spatial information;
C_F: the cubic curve of the lane line information; S_F: the set of lane line sample points;
L_F: the association to the preceding and following frames (inter-frame topology information).
All elements are expressed in the frame coordinate system, whose x-axis is the lateral direction, y-axis the longitudinal direction, and z-axis the direction perpendicular to the x- and y-axes.
Preferably, step 2 further comprises:
the map frame is established on the premise that the type and/or the color of the lane line are changed and/or the lane line is disconnected and/or an unmanned vehicle changes lanes in driving and/or the y-axis direction length of the map frame exceeds a threshold value.
Preferably, the conditions for successful matching in step 3 are as follows:
the newly acquired lane line data overlaps with one or more of the map frames established from previously acquired lane line data, and the overlap reaches a threshold; and/or
the newly acquired lane line data is directly connected, front to back, with any one of the established map frames.
Preferably, the information updating of the map frame in step 3 further includes:
and step C, sampling the cubic curve representing the lane line at certain intervals to obtain a sampling point set.
And D, calculating the position and the posture of the sampling point set relative to the non-updated map frame, and projecting the position of the sampling point set into a frame coordinate system of the non-updated map frame to obtain the sampling point set in the frame coordinate system.
And E, if the position of the sampling point set in the frame coordinate system exceeds the length limit of the frame coordinate system, cutting out an excess part to serve as new lane line data, and entering the step C to update the map frame.
And F, merging the sampling point set of the non-updated map frame with the sampling point set in the frame coordinate system in the step D to obtain a sampling point set which is re-sampled at a certain interval after merging, and updating the sampling point set of the non-updated map frame into the re-sampled sampling point set.
And G, carrying out cubic curve fitting on the sampling point set re-sampled in the step F to obtain a fitted sampling point set, and updating the result of cubic curve fitting of the sampling point set of the non-updated map frame into the fitted sampling point set.
And H, finishing the incidence relation between the front frame and the rear frame of the connected frame by calculating the connection relation between the new map frame and the established map frame so as to finish the information updating of the established map frame.
Preferably, the inter-frame smoothing process in step 4 is performed on the premise that no new lane line data is input.
Preferably, the inter-frame smoothing process in step 4 further includes:
and establishing a cubic curve, wherein the point set for fitting the cubic curve is obtained by mixing a second half point set in the point sets of the previous frame and a first half point set in the point sets of the subsequent frame in the previous frame and the subsequent frame which have a correlation.
And projecting the second half point set onto the cubic curve along a direction perpendicular to a y axis of the frame coordinate system to obtain a projected point set.
Smoothing the projection points corresponding to the points under the former frame coordinate system and the latter half point set on the y axis of the frame coordinate system to obtain smoothed points, thereby obtaining smoothed frames; the smoothing treatment of the point set under the frame coordinate system is similar; the formula of the point-to-point smoothing process is as follows:
P″=((1-a)x+ax′,y,(1-a)z+az′) (3)
wherein, the smoothing coefficient a of the previous frame point set is y/L; l represents the length of the frame.
And the smoothing coefficient a of the subsequent frame point set is 1-y/L.
And P ═ x, y, z denotes any point in the latter half set of points in the former frame coordinate system.
P ' ═ x ', y, z ' denotes any point in the first half set of points in the frame-after coordinate system.
The invention at least comprises the following beneficial effects:
According to the method, a time interval exists between the high-precision positioning data and the lane line data because the positioning data lead or lag the lane line data; time alignment shifts the positioning data to the acquisition timestamp of the lane line data, which guarantees temporal consistency during processing and is the premise for generating the high-precision map, since the coordinate transformation between the high-precision positioning and the lane line data must be carried out first. When new lane line data are acquired and a map frame is established, the high-precision positioning data and the lane line data serve as prior conditions, and the position and attitude of the spatial information in the frame are those of the time-aligned high-precision positioning data. When the lane line data change, they are first matched against the established map frames for overlapping or front-to-back associated parts, so that the acquired lane line data are used to update and replace the frame; if the matching is unsuccessful, a new map frame is established from the newly acquired lane line data, so lane line splicing is completed automatically through matching, updating, and re-creation. Inter-frame smoothing then provides smooth transitions between frames and joins the overlapping parts of the point sets of adjacent frames, producing a high-precision map that is accurate to the lane line on the basis of high-precision positioning and splices lane lines automatically. Generating the map from the prior conditions of high-precision positioning and lane line acquisition reduces the complexity of high-precision map generation and avoids the heavy labor consumption and high error rate of traditional high-precision map production, which is of great significance for reliable and safe unmanned driving.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a flow chart of a high-precision map automatic generation method based on high-precision positioning and lane line identification according to the present invention;
FIG. 2 is a diagram illustrating elements included in a map frame in a frame coordinate system according to the present invention;
FIG. 3 is a schematic diagram of the lane line data of the present invention when there is a blind area;
FIG. 4 is a diagram illustrating a process for updating map frame information according to the present invention;
FIG. 5 is a schematic diagram of the updated map frame information according to the present invention.
Detailed Description
The present invention is further described in detail below with reference to the attached drawings so that those skilled in the art can implement the invention by referring to the description text.
It will be understood that terms such as "having," "including," and "comprising," as used herein, do not preclude the presence or addition of one or more other elements or groups thereof.
As shown in fig. 1, the present invention provides a high-precision map automatic generation method based on high-precision positioning and lane line identification, including:
Step 1, synchronizing the high-precision positioning data acquired each time with the lane line data through time alignment processing, so as to obtain the position and attitude of the synchronized high-precision positioning data.
Step 2, establishing a map frame from the synchronized lane line data and high-precision positioning data obtained in step 1, and storing it in a map frame database.
Step 3, matching the newly acquired lane line data against all the map frames established from previously acquired lane line data; if the matching fails, establishing a new map frame from the newly acquired lane line data; if the matching succeeds, updating the information of the map frame established in step 2 until the update is complete.
Step 4, performing inter-frame smoothing on all existing frames among the map frames established in steps 2 and 3, and recalculating, in combination with the point sets of the map frames updated in step 3, the cubic curves representing the lane line information of the smoothed frames.
Step 5, splicing the cubic curves to generate the high-precision map.
In this scheme, a time interval exists between the high-precision positioning data and the lane line data because the positioning data lead or lag the lane line data; time alignment shifts the positioning data to the acquisition timestamp of the lane line data, which guarantees temporal consistency during processing and is the premise for generating the high-precision map, since the coordinate transformation between the high-precision positioning and the lane line data must be carried out first. When new lane line data are acquired and a map frame is established, the high-precision positioning data and the lane line data serve as prior conditions, and the position and attitude of the spatial information in the frame are those of the time-aligned high-precision positioning data. When the lane line data change, they are first matched against the established map frames for overlapping or front-to-back associated parts, so that the acquired lane line data are used to update and replace the frame; if the matching is unsuccessful, a new map frame is established from the newly acquired lane line data, so lane line splicing is completed automatically through matching, updating, and re-creation. Inter-frame smoothing then provides smooth transitions between frames and joins the overlapping parts of the point sets of adjacent frames, producing a high-precision map that is accurate to the lane line on the basis of high-precision positioning and splices lane lines automatically. Generating the map from the prior conditions of high-precision positioning and lane line acquisition reduces the complexity of high-precision map generation and avoids the heavy labor consumption and high error rate of traditional high-precision map production, which is of great significance for reliable and safe unmanned driving.
In a preferred embodiment, step 1 further comprises:
taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line timestamp are, respectively:
p′ = p + v(t_m - t_l);    (1)
r′ = r·ω(t_m - t_l);    (2)
where t_m denotes the timestamp at which the lane line data are acquired;
t_l denotes the timestamp at which the high-precision positioning data are acquired;
p, r, v, and ω denote, respectively, the position, attitude, linear velocity, and angular velocity of the high-precision positioning data before alignment.
In the above scheme, a timestamp is recorded when the lane line data are acquired and is used as the reference. Because the high-precision positioning data are sampled at a different moment, they lead or lag the lane line data; the generation method therefore synchronizes the two by propagating the positioning data with formulas (1) and (2), yielding the synchronized high-precision positioning data.
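As an illustration of the time alignment in formulas (1) and (2), the following sketch propagates a positioning sample to the lane line timestamp. It is a minimal example under assumptions not stated in the patent: the attitude is reduced to a single yaw angle so the angular-velocity update becomes an additive increment, and the function and variable names are purely illustrative.

```python
import numpy as np

def align_positioning_to_lane_timestamp(p, yaw, v, omega, t_l, t_m):
    """Propagate a positioning sample taken at t_l to the lane-line timestamp t_m,
    following equations (1) and (2).

    p     : (3,) position at the positioning timestamp t_l
    yaw   : heading angle in radians at t_l (attitude reduced to yaw here)
    v     : (3,) linear velocity
    omega : yaw rate in rad/s
    """
    dt = t_m - t_l                                                  # lead/lag between the two sensors
    p_aligned = np.asarray(p, dtype=float) + np.asarray(v, dtype=float) * dt   # eq. (1)
    yaw_aligned = yaw + omega * dt                                  # eq. (2), yaw-only form
    return p_aligned, yaw_aligned

# Example: the positioning sample is 40 ms older than the lane-line frame.
p_new, yaw_new = align_positioning_to_lane_timestamp(
    p=[10.0, 5.0, 0.0], yaw=0.10, v=[15.0, 0.0, 0.0], omega=0.02,
    t_l=100.00, t_m=100.04)
print(p_new, yaw_new)   # [10.6  5.   0. ] 0.1008
```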
In a preferred embodiment, step 2 further comprises:
the map frame mainly comprises the following elements:
P_F: the position of the frame's spatial information; R_F: the attitude of the frame's spatial information;
C_F: the cubic curve of the lane line information; S_F: the set of lane line sample points;
L_F: the association to the preceding and following frames (inter-frame topology information).
All elements are expressed in the frame coordinate system, whose x-axis is the lateral direction, y-axis the longitudinal direction, and z-axis the direction perpendicular to the x- and y-axes.
In the above scheme, the map frame contains these elements in the frame coordinate system, as shown in FIG. 2.
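For concreteness, a minimal sketch of a record holding the five frame elements is given below; the Python types and the curve parameterization (a cubic x = f(y) stored as polynomial coefficients) are assumptions for illustration, not specified by the patent.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class MapFrame:
    """Illustrative container for one map frame (field names follow the text above)."""
    P_F: np.ndarray                                 # position of the frame's spatial information, shape (3,)
    R_F: np.ndarray                                 # attitude of the frame's spatial information, e.g. a 3x3 rotation
    C_F: np.ndarray                                 # cubic-curve coefficients of the lane line, x = f(y)
    S_F: np.ndarray                                 # lane-line sample points in frame coordinates, shape (N, 3)
    L_F: List[int] = field(default_factory=list)    # indices of associated preceding/following frames
```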
In a preferred embodiment, step 2 further comprises:
a map frame is established when the type and/or color of the lane line changes, and/or the lane line is broken, and/or the unmanned vehicle changes lanes while driving, and/or the length of the map frame along the y-axis exceeds a threshold.
In this scheme, each map frame corresponds to the position and attitude at which its lane line data were acquired; the different positions and attitudes, i.e. the different map frames, together form the high-precision map based on lane line recognition. When a lane line is recognized and the lane line data change, a new map frame is established provided the matching is unsuccessful. When a map frame is established because the lane line's extent along the y-axis exceeds the threshold, the frame's position P_F and attitude R_F are assigned the values p′ and r′, respectively. As shown in FIG. 3, the lane line data in the area indicated by W are extracted into the map frame database; because the acquired lane line data have a blind area [0, y_b] along the y-axis, i.e. the range indicated by G, the existing map frames are searched for data corresponding to this area, and if such data are found they are filled into the blind area.
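The frame-creation triggers listed above can be expressed as a simple predicate. The sketch below assumes boolean flags supplied by the lane line recognition module, and the 50 m default for the y-length threshold is an assumption; the patent does not give a numeric value.

```python
def should_create_new_frame(type_changed: bool,
                            color_changed: bool,
                            line_broken: bool,
                            lane_change: bool,
                            frame_length_y: float,
                            y_length_threshold: float = 50.0) -> bool:
    """Return True when any of the frame-creation conditions described above is met."""
    return (type_changed or color_changed or line_broken or lane_change
            or frame_length_y > y_length_threshold)
```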
In a preferred embodiment, the conditions for successful matching in step 3 are as follows:
the newly acquired lane line data overlaps with one or more of the map frames established from previously acquired lane line data, and the overlap reaches a threshold; and/or
the newly acquired lane line data is directly connected, front to back, with any one of the established map frames.
In the above solution, the newly acquired lane line data first needs to be matched with an already established map frame to acquire a map frame which overlaps or is connected with the newly acquired lane line data, so as to determine the position of the newly acquired lane line data, and update the map frame.
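A rough sketch of the overlap test is given below. It assumes the new lane line samples have already been projected into the candidate frame's coordinate system and measures overlap as the fraction of samples falling within the frame's y-extent; both this measure and the 0.3 threshold are illustrative assumptions, since the patent only requires that the overlapping part reach a threshold.

```python
import numpy as np

def overlaps_frame(y_projected, frame_length, overlap_threshold=0.3):
    """Check the overlap condition between new lane-line data and one candidate frame.

    y_projected  : y-coordinates of the new samples in the frame's coordinate system
    frame_length : length L of the frame along its y-axis
    """
    y = np.asarray(y_projected, dtype=float)
    if y.size == 0:
        return False
    ratio = np.mean((y >= 0.0) & (y <= frame_length))   # fraction of samples inside the frame
    return ratio >= overlap_threshold
```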
In a preferred embodiment, the information updating of the map frame in step 3 further includes:
and step C, sampling the cubic curve representing the lane line at certain intervals to obtain a sampling point set.
And D, calculating the position and the posture of the sampling point set relative to the non-updated map frame, and projecting the position of the sampling point set into a frame coordinate system of the non-updated map frame to obtain the sampling point set in the frame coordinate system.
And E, if the position of the sampling point set in the frame coordinate system exceeds the length limit of the frame coordinate system, cutting out an excess part to serve as new lane line data, and entering the step C to update the map frame.
And F, merging the sampling point set of the non-updated map frame with the sampling point set in the frame coordinate system in the step D to obtain a sampling point set which is re-sampled at a certain interval after merging, and updating the sampling point set of the non-updated map frame into the re-sampled sampling point set.
And G, carrying out cubic curve fitting on the sampling point set re-sampled in the step F to obtain a fitted sampling point set, and updating the result of cubic curve fitting of the sampling point set of the non-updated map frame into the fitted sampling point set.
And H, finishing the incidence relation between the front frame and the rear frame of the connected frame by calculating the connection relation between the new map frame and the established map frame so as to finish the information updating of the established map frame.
In the above solution, as shown in FIG. 4 and FIG. 5, the basic flow of map frame information update is as follows (a code sketch of the merge, resampling, and curve refit follows this list):
sample the cubic curve representing lane line A at a fixed interval to obtain a sampling point set S_A;
compute the position and attitude of A relative to the current map frame F and project the sampling point set S_A onto F, the result being denoted S_A^F;
if S_A^F exceeds the length limit of the current map frame F, cut out the excess part and use it as new lane line data to update the map frames again;
merge the sampling point set S_F of the current map frame F with S_A^F, resample at a fixed interval to obtain S_F′, and update S_F to S_F′;
fit a cubic curve to S_F′ to obtain C_F′, and update C_F to C_F′;
compute the connection relation between the new map frame and the existing map frames, and update the front-to-back frame association L_F accordingly.
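The merge-resample-refit part of this flow can be sketched as follows. The fixed 1 m resampling interval and the parameterization of the lane line as x = f(y) (with z handled by interpolation only) are assumptions made for the example; the patent only speaks of sampling "at certain intervals" and fitting a cubic curve.

```python
import numpy as np

def resample_by_y(points, interval=1.0):
    """Resample (x, y, z) lane points at a fixed interval along y by interpolation."""
    pts = np.asarray(points, dtype=float)
    pts = pts[np.argsort(pts[:, 1])]                         # order by y
    y_new = np.arange(pts[0, 1], pts[-1, 1] + 1e-9, interval)
    x_new = np.interp(y_new, pts[:, 1], pts[:, 0])
    z_new = np.interp(y_new, pts[:, 1], pts[:, 2])
    return np.column_stack([x_new, y_new, z_new])

def update_frame_samples(S_F, S_A_in_F, interval=1.0):
    """Merge the frame's samples S_F with the projected new samples S_A^F,
    resample to obtain S_F', and refit the cubic curve C_F' (x as a cubic in y)."""
    merged = np.vstack([np.asarray(S_F, float), np.asarray(S_A_in_F, float)])
    S_F_new = resample_by_y(merged, interval)
    C_F_new = np.polyfit(S_F_new[:, 1], S_F_new[:, 0], deg=3)   # cubic fit x(y)
    return S_F_new, C_F_new
```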
In a preferred embodiment, the inter-frame smoothing in step 4 is performed on the premise that no new lane line data is input.
In the above scheme, the update of the established map frame information is considered complete only when no new lane line data are acquired; otherwise, as soon as new lane line data arrive they are matched against the established map frames, either creating a new map frame or triggering another update of an established one. This guarantees that inter-frame smoothing is performed only after the map frames have been updated and no new lane line data are being input.
In a preferred embodiment, the inter-frame smoothing process in step 4 further includes:
and establishing a cubic curve, wherein the point set for fitting the cubic curve is obtained by mixing a second half point set in the point sets of the previous frame and a first half point set in the point sets of the subsequent frame in the previous frame and the subsequent frame which have a correlation.
And projecting the second half point set onto the cubic curve along a direction perpendicular to a y axis of the frame coordinate system to obtain a projected point set.
Smoothing the projection points corresponding to the points under the former frame coordinate system and the latter half point set on the y axis of the frame coordinate system to obtain smoothed points, thereby obtaining smoothed frames; the smoothing treatment of the point set under the frame coordinate system is similar; the formula of the point-to-point smoothing process is as follows:
P″=((1-a)x+ax′,y,(1-a)z+az′) (3)
wherein, the smoothing coefficient a of the previous frame point set is y/L; l represents the length of the frame.
And the smoothing coefficient a of the subsequent frame point set is 1-y/L.
And P ═ x, y, z denotes any point in the latter half set of points in the former frame coordinate system.
P ' ═ x ', y, z ' denotes any point in the first half set of points in the frame-after coordinate system.
In the above scheme, let the two frames having a front-to-back relationship be F_1 and F_2. Take the second half of the point set of the previous frame F_1 and the first half of the point set of the following frame F_2, mix them, and fit a cubic curve C_m.
For the second-half point set of F_1, project it onto C_m along the direction perpendicular to the y-axis to obtain the projected point set. For each point P = (x, y, z) of this point set in the F_1 coordinate system and its corresponding projected point P′ = (x′, y, z′), the smoothed point is calculated as P″ = ((1-a)x + a·x′, y, (1-a)z + a·z′), where the smoothing coefficient is a = y/L and L is the length of frame F_1.
The point-set smoothing of frame F_2 is similar, except that the smoothing coefficient becomes a = 1 - y/L. Specifically, the first-half point set of F_2 is projected onto C_m along the direction perpendicular to the y-axis to obtain its projected point set; for each point P = (x, y, z) of this point set in the F_2 coordinate system and its corresponding projected point P′ = (x′, y, z′), the smoothed point P″ = ((1-a)x + a·x′, y, (1-a)z + a·z′) is calculated with a = 1 - y/L, where L is the length of frame F_2.
For each frame whose point set has been smoothed, the lane line cubic curve C_F of that frame is recalculated from the updated point set.
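A minimal sketch of the point-wise blending of equation (3) is given below. It assumes the mixed curve C_m is parameterized as x = f(y) with polynomial coefficients and lies in the x-y plane, so projecting perpendicular to the y-axis keeps y and z and only replaces x; under that planar assumption the z component of equation (3) is unchanged. Function names are illustrative, not taken from the patent.

```python
import numpy as np

def fit_mixed_curve(second_half_prev, first_half_next):
    """Fit the mixed cubic C_m (x as a cubic in y) from the two half point sets."""
    pts = np.vstack([np.asarray(second_half_prev, float), np.asarray(first_half_next, float)])
    return np.polyfit(pts[:, 1], pts[:, 0], deg=3)

def smooth_half(points, C_m, frame_length, previous_frame=True):
    """Blend points toward C_m using equation (3).

    points         : (N, 3) points in the frame's own coordinate system
    C_m            : coefficients of the mixed cubic curve, x = f(y)
    frame_length   : L, the frame length along y
    previous_frame : True for the previous frame (a = y/L),
                     False for the following frame (a = 1 - y/L)
    """
    pts = np.asarray(points, dtype=float).copy()
    x, y = pts[:, 0], pts[:, 1]
    x_proj = np.polyval(C_m, y)                          # projection perpendicular to the y-axis
    a = y / frame_length if previous_frame else 1.0 - y / frame_length
    pts[:, 0] = (1.0 - a) * x + a * x_proj               # eq. (3); y kept, z unchanged under the planar assumption
    return pts
```

After both halves have been blended, each frame's cubic curve C_F can be refitted from its updated point set, for example with np.polyfit as in the update sketch above.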
While embodiments of the invention have been described above, it is not limited to the applications set forth in the description and the embodiments, which are fully applicable in various fields of endeavor to which the invention pertains, and further modifications may readily be made by those skilled in the art, it being understood that the invention is not limited to the details shown and described herein without departing from the general concept defined by the appended claims and their equivalents.

Claims (6)

1. A high-precision map automatic generation method based on high-precision positioning and lane line recognition, comprising:

Step 1, synchronizing the high-precision positioning data acquired each time with the lane line data through time alignment processing, so as to obtain the position and attitude of the synchronized high-precision positioning data;

Step 2, establishing a map frame from the synchronized lane line data and high-precision positioning data obtained in step 1, and storing it in a map frame database;

Step 3, matching the newly acquired lane line data against all the map frames established from previously acquired lane line data; if the matching fails, establishing a new map frame from the newly acquired lane line data; if the matching succeeds, updating the information of the map frame established in step 2 until the update is complete;

Step 4, performing inter-frame smoothing on all existing frames among the map frames established in steps 2 and 3, and recalculating, in combination with the point sets of the map frames updated in step 3, the cubic curves representing the lane line information of the smoothed frames;

Step 5, splicing the cubic curves to generate the high-precision map;

wherein the conditions for successful matching in step 3 are:

the newly acquired lane line data overlaps with one or more of the map frames established from previously acquired lane line data, and the overlap reaches a threshold; and/or

the newly acquired lane line data is directly connected, front to back, with any one of the established map frames;

and wherein the information update of the map frame in step 3 further comprises:

Step C, sampling the cubic curve representing the lane line at a fixed interval to obtain a sampling point set;

Step D, computing the position and attitude of the sampling point set relative to the not-yet-updated map frame, and projecting the positions of the sampling points into that frame's coordinate system to obtain the sampling point set in the frame coordinate system;

Step E, if the sampling point set in the frame coordinate system extends beyond the length limit of the frame coordinate system, cutting out the excess part and treating it as new lane line data, which re-enters step C to update the map frame;

Step F, merging the sampling point set of the not-yet-updated map frame with the sampling point set in the frame coordinate system obtained in step D, resampling the merged set at a fixed interval, and updating the frame's sampling point set to the resampled set;

Step G, fitting a cubic curve to the sampling point set resampled in step F, and updating the cubic-curve fit of the frame's sampling point set to the fitted result;

Step H, computing the connection relation between the new map frame and the established map frames to complete the association between the preceding and following connected frames, thereby completing the information update of the established map frames.

2. The high-precision map automatic generation method based on high-precision positioning and lane line recognition according to claim 1, wherein step 1 further comprises:

taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line timestamp are, respectively:

p′ = p + v(t_m - t_l);    (1)

r′ = r·ω(t_m - t_l);    (2)

where t_m denotes the timestamp at which the lane line data are acquired;

t_l denotes the timestamp at which the high-precision positioning data are acquired;

p, r, v, and ω denote, respectively, the position, attitude, linear velocity, and angular velocity of the high-precision positioning data before alignment.

3. The high-precision map automatic generation method based on high-precision positioning and lane line recognition according to claim 1, wherein step 2 further comprises:

the map frame mainly comprises the following elements:

P_F: the position of the frame's spatial information; R_F: the attitude of the frame's spatial information;

C_F: the cubic curve of the lane line information; S_F: the set of lane line sample points;

L_F: the association to the preceding and following frames (inter-frame topology information);

all elements are expressed in the frame coordinate system, whose x-axis is the lateral direction, y-axis the longitudinal direction, and z-axis the direction perpendicular to the x- and y-axes.

4. The high-precision map automatic generation method based on high-precision positioning and lane line recognition according to claim 1, wherein step 2 further comprises:

a map frame is established when the type and/or color of the lane line changes, and/or the lane line is broken, and/or the unmanned vehicle changes lanes while driving, and/or the length of the map frame along the y-axis exceeds a threshold.

5. The high-precision map automatic generation method based on high-precision positioning and lane line recognition according to claim 1, wherein the inter-frame smoothing in step 4 is performed on the premise that no new lane line data are input.

6. The high-precision map automatic generation method based on high-precision positioning and lane line recognition according to claim 1, wherein the inter-frame smoothing in step 4 further comprises:

Step a, establishing a cubic curve, where the point set used to fit the curve is obtained by mixing the second half of the point set of the previous frame with the first half of the point set of the following frame, the two frames being associated with each other;

Step b, projecting the second-half point set onto the cubic curve along the direction perpendicular to the y-axis of the frame coordinate system to obtain a projected point set;

Step c, smoothing each point of the second-half point set in the previous frame's coordinate system with the projection point corresponding to it along the y-axis of the frame coordinate system to obtain the smoothed points and hence the smoothed frame; the smoothing of the point set in the following frame's coordinate system is similar; the point-to-point smoothing formula is:

P″ = ((1-a)x + a·x′, y, (1-a)z + a·z′)    (3)

where the smoothing coefficient of the previous frame's point set is a = y/L, and L denotes the length of the frame;

the smoothing coefficient of the following frame's point set is a = 1 - y/L;

P = (x, y, z) denotes any point of the second-half point set in the previous frame's coordinate system;

P′ = (x′, y, z′) denotes any point of the first-half point set in the following frame's coordinate system.
CN201811468535.6A 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification Active CN109470255B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811468535.6A CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811468535.6A CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Publications (2)

Publication Number Publication Date
CN109470255A (en) 2019-03-15
CN109470255B (en) 2022-03-29

Family

ID=65675001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811468535.6A Active CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Country Status (1)

Country Link
CN (1) CN109470255B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110160540B (en) * 2019-06-12 2020-12-18 禾多科技(北京)有限公司 Lane line data fusion method based on high-precision map
CN110954128B (en) * 2019-12-03 2021-11-16 阿波罗智能技术(北京)有限公司 Method, device, electronic equipment and storage medium for detecting lane line position change
CN111559373B (en) * 2020-04-26 2021-08-13 东风汽车集团有限公司 A vehicle active steering control method
CN111626206A (en) * 2020-05-27 2020-09-04 北京百度网讯科技有限公司 High-precision map construction method and device, electronic equipment and computer storage medium
CN114238354A (en) * 2021-12-21 2022-03-25 高德软件有限公司 Map data updating method, device and computer storage medium
CN114610831B (en) * 2022-03-25 2025-05-16 智道网联科技(北京)有限公司 Real-time update method and device for high-precision map lanes
CN114577225B (en) * 2022-04-28 2022-07-22 北京百度网讯科技有限公司 Map drawing method and device, electronic equipment and storage medium
CN114719872B (en) * 2022-05-13 2022-09-23 高德软件有限公司 Lane line processing method and device and electronic equipment
CN114719873B (en) * 2022-06-02 2022-09-02 四川省公路规划勘察设计研究院有限公司 A low-cost fine map automatic generation method, device and readable medium
CN116793369B (en) * 2023-02-10 2024-03-08 北京斯年智驾科技有限公司 Path planning method, device, equipment and computer readable storage medium


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018126228A1 (en) * 2016-12-30 2018-07-05 DeepMap Inc. Sign and lane creation for high definition maps used for autonomous vehicles
CN108955670B (en) * 2017-05-25 2021-02-09 百度在线网络技术(北京)有限公司 Information acquisition method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103954275A (en) * 2014-04-01 2014-07-30 西安交通大学 Lane line detection and GIS map information development-based vision navigation method
CN104535070A (en) * 2014-12-26 2015-04-22 上海交通大学 High-precision map data structure, high-precision map data acquiringand processing system and high-precision map data acquiringand processingmethod
CN104573733A (en) * 2014-12-26 2015-04-29 上海交通大学 High-precision map generation system and method based on high-definition ortho-photo map
JP2017533482A (en) * 2015-09-10 2017-11-09 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド Lane data processing method, apparatus, storage medium and equipment
CN106441319A (en) * 2016-09-23 2017-02-22 中国科学院合肥物质科学研究院 A system and method for generating a lane-level navigation map of an unmanned vehicle
CN106525057A (en) * 2016-10-26 2017-03-22 陈曦 Generation system for high-precision road map
CN108036794A (en) * 2017-11-24 2018-05-15 华域汽车系统股份有限公司 A kind of high accuracy map generation system and generation method
CN107976182A (en) * 2017-11-30 2018-05-01 深圳市隐湖科技有限公司 A kind of Multi-sensor Fusion builds drawing system and its method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
High-precision lane-level road map building for vehicle navigation; Anning Chen et al.; IEEE/ION Position, Location and Navigation Symposium; 2010-05-06; pp. 1035-1042 *
A lane-level high-precision map production method based on multiple sensors; He Yong et al.; Journal of Chang'an University (Natural Science Edition); 2015-01-15 *
Research on key technologies of high-precision navigation maps based on laser point cloud scanning; Yang Yurong et al.; Modern Computer (Professional Edition); 2018-03-25; No. 09 *

Also Published As

Publication number Publication date
CN109470255A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109470255B (en) High-precision map automatic generation method based on high-precision positioning and lane line identification
CN106441286B (en) UAV tunnel inspection system based on BIM technology
CN110163968A (en) RGBD camera large-scale three dimensional scenario building method and system
CN112022382B (en) Automatic cutting method and device for tooth socket
CN105225269A (en) Based on the object modelling system of motion
CN103824049A (en) Cascaded neural network-based face key point detection method
CN112418288A (en) A Dynamic Vision SLAM Method Based on GMS and Motion Detection
CN101320485A (en) A 3D Face Model Acquisition Method Based on Stereo Matching
CN109059930A (en) A kind of method for positioning mobile robot of view-based access control model odometer
CN110702101A (en) Positioning method and system for power inspection scene
KR101227613B1 (en) Updating system for numerical map reflecting the change of geographic information
CN114279434A (en) Picture construction method and device, electronic equipment and storage medium
CN112462385A (en) Map splicing and positioning method based on laser radar under outdoor large environment
CN113628307A (en) Skeleton driving method and device of three-dimensional model
CN116242374A (en) A SLAM localization method based on multi-sensor fusion by direct method
CN113160391A (en) Double-stage three-dimensional scene modeling method
CN108731686B (en) A UAV navigation control method and system based on big data analysis
CN116429116A (en) Robot positioning method and equipment
CN110906941A (en) Construction method and system of automatic driving map for long-distance tunnel
CN114757990A (en) Indoor environment map updating method and device, storage medium and indoor robot
CN109189871A (en) A kind of method and apparatus of Indoor environment path planning
CN117078873B (en) Three-dimensional high-precision map generation method, system and cloud platform
CN118069724A (en) Intelligent building site data acquisition method based on BIM and Internet of things
CN112489176B (en) Tightly-coupled graph building method fusing ESKF, g2o and point cloud matching
CN114693880B (en) Building mesh model elevation trimming method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100089 21-14, 1st floor, building 21, Enji West Industrial Park, No.1, liangjiadian, Fuwai, Haidian District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.

CP03 Change of name, title or address
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A high-precision map automatic generation method based on high-precision positioning and lane recognition

Granted publication date: 20220329

Pledgee: Bank of Shanghai Co.,Ltd. Beijing Branch

Pledgor: Heduo Technology (Guangzhou) Co.,Ltd.

Registration number: Y2024980009891

PE01 Entry into force of the registration of the contract for pledge of patent right
PP01 Preservation of patent right

Effective date of registration: 20250121

Granted publication date: 20220329

PP01 Preservation of patent right