
CN116295496A - Automatic driving vehicle path planning method, device, equipment and medium - Google Patents


Info

Publication number
CN116295496A
CN116295496A
Authority
CN
China
Prior art keywords
vehicle
obstacle
determining
information
input branch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310288212.3A
Other languages
Chinese (zh)
Inventor
姚志鹏
李勇强
吕强
付一豪
Current Assignee
Neolix Technologies Co Ltd
Original Assignee
Neolix Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Neolix Technologies Co Ltd filed Critical Neolix Technologies Co Ltd
Priority to CN202310288212.3A
Publication of CN116295496A

Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
                    • G01C21/005 with correlation of navigation data from several sources, e.g. map or contour matching
                    • G01C21/20 Instruments for performing navigational calculations
                    • G01C21/26 specially adapted for navigation in a road network
                        • G01C21/28 with correlation of data from several navigational instruments
                            • G01C21/30 Map- or contour-matching
                                • G01C21/32 Structuring or formatting of map data
                        • G01C21/34 Route searching; Route guidance
                            • G01C21/3407 specially adapted for specific applications
                                • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
                                • G01C21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
                            • G01C21/36 Input/output arrangements for on-board computers
                • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
                    • G01C21/3804 Creation or updating of map data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
                • Y02T10/00 Road transport of goods or passengers
                    • Y02T10/10 Internal combustion engine [ICE] based vehicles
                        • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a path planning method, apparatus, device, and storage medium for an autonomous vehicle, and relates to the field of unmanned driving. The method comprises the following steps: determining the motion features, the navigation map features, and at least one obstacle feature of the unmanned vehicle according to the perception information of the unmanned vehicle, where the obstacle features include a priori features of the obstacle; inputting the motion features, the navigation map features, and the at least one obstacle feature into a pre-trained prediction model to determine the prediction information of the unmanned vehicle, where the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure, and a long short-term memory (LSTM) structure; and determining a planned path of the autonomous vehicle according to the prediction information of the unmanned vehicle. This technical scheme solves the problems of low accuracy and high time cost caused by having the prediction model directly extract abstract features from the perception information, and enhances the interpretability of the prediction model while improving its prediction accuracy.

Description

Automatic driving vehicle path planning method, device, equipment and medium
Technical Field
The present invention relates to the field of unmanned driving technologies, and in particular to a path planning method and apparatus for an autonomous vehicle, an electronic device, and a storage medium.
Background
With the continuous progress of technology, the demand for navigation capability of unmanned vehicles under various road conditions has become increasingly urgent. The navigation system of an unmanned vehicle needs to acquire perception information such as radar and vision data, and make reasonable decisions based on the recognition results of this perception information, so as to arrive at the destination promptly and accurately while ensuring driving safety.
At present, navigation systems of unmanned vehicles mainly rely on prediction models such as Lyft and Waymo, directly represent the perception information as features through a deep learning network, and plan the path of the autonomous vehicle according to the abstract features extracted from the perception information.
However, in the prior art, directly characterizing the perception information through a deep learning network requires a great deal of time to train the network, and the stability of the training effect is difficult to guarantee. Meanwhile, the interpretability of the prediction model is poor and the perception information in the driving scene is difficult to fully exploit, so driving decisions are unreliable and inaccurate.
Disclosure of Invention
The invention provides a path planning method and apparatus for an autonomous vehicle, an electronic device, and a storage medium, which solve the problems of low accuracy and high time cost caused by having the prediction model directly extract abstract features from the perception information, and which enhance the interpretability of the prediction model while improving its prediction accuracy and the efficiency of model generation.
According to an aspect of the present invention, there is provided a path planning method for an autonomous vehicle, the method comprising:
determining the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle according to the perception information of the vehicle; wherein the obstacle features comprise a priori features of the obstacle;
inputting the motion features, the navigation map features, and the at least one obstacle feature of the vehicle into a pre-trained prediction model to determine the prediction information of the vehicle, wherein the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure, and a long short-term memory (LSTM) structure;
and determining a planned path of the automatic driving vehicle according to the prediction information of the vehicle.
According to another aspect of the present invention, there is provided an autonomous vehicle path planning apparatus comprising:
The feature determining module is used for determining the motion feature, the navigation map feature and at least one obstacle feature of the vehicle according to the perception information of the vehicle; wherein the obstacle features comprise a priori features of the obstacle;
the first information prediction module is used for inputting the motion features, the navigation map features, and the at least one obstacle feature of the vehicle into a pre-trained prediction model to determine the prediction information of the vehicle, wherein the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure, and a long short-term memory (LSTM) structure;
and the first planning path determining module is used for determining the planning path of the automatic driving vehicle according to the prediction information of the vehicle.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the autonomous vehicle path planning method according to any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the method for autonomous vehicle path planning according to any of the embodiments of the present invention.
According to the technical scheme, the motion features, the navigation map features, and at least one obstacle feature of the vehicle are determined from the perception information of the vehicle; these features are then input into a pre-trained prediction model to determine the prediction information of the vehicle, and a planned path of the autonomous vehicle is determined according to that prediction information. This solves the problems of low accuracy and high time cost caused by having the prediction model directly extract abstract features from the perception information, and enhances the interpretability of the prediction model while improving its prediction accuracy and the efficiency of model generation.
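To make the described model structure more concrete, here is a minimal NumPy sketch of the fusion stage only: per-input embedding branches followed by multi-head self-attention over the resulting tokens. All dimensions, the random weights, the function names, and the omission of the multi-layer perceptron head and the LSTM decoder are illustrative assumptions, not details from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
D, HEADS = 16, 4              # embedding width and head count (assumed)

def branch(x, d=D):
    """Input branch: a per-feature-group linear embedding (assumed form)."""
    w = rng.standard_normal((x.shape[-1], d)) / np.sqrt(x.shape[-1])
    return x @ w

def multi_head_self_attention(tokens, heads=HEADS):
    """Standard multi-head self-attention over the fused token set."""
    n, d = tokens.shape
    dh = d // heads
    wq, wk, wv, wo = (rng.standard_normal((d, d)) / np.sqrt(d)
                      for _ in range(4))
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    out = np.empty_like(tokens)
    for h in range(heads):
        s = slice(h * dh, (h + 1) * dh)
        scores = q[:, s] @ k[:, s].T / np.sqrt(dh)
        attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
        attn /= attn.sum(axis=-1, keepdims=True)   # softmax per row
        out[:, s] = attn @ v[:, s]
    return out @ wo

# One token per input: ego motion, navigation map, two obstacle features.
ego = branch(rng.standard_normal(6))        # position, heading, speed, ...
nav = branch(rng.standard_normal(8))        # road boundaries, reference line
obs = branch(rng.standard_normal((2, 5)))   # incl. a priori safety features
tokens = np.vstack([ego, nav, obs])         # shape (4, 16)
fused = multi_head_self_attention(tokens)   # same shape, context-mixed
```

In the patent's pipeline the fused representation would then feed a multi-layer perceptron and an LSTM to produce the prediction information; those stages are omitted here.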
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for automatically driving a vehicle path planning in accordance with a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a prediction model provided according to an embodiment of the present invention;
fig. 3 is a flowchart of an automatic driving vehicle path planning method according to a second embodiment of the present invention;
fig. 4 is a schematic view of candidate region division in a simple driving scenario provided according to an embodiment of the present invention;
fig. 5 is a schematic diagram of candidate region division in a complex driving scenario according to an embodiment of the present invention;
fig. 6 is a schematic structural view of an automatic driving vehicle path planning apparatus according to a third embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device implementing a method for path planning of an autonomous vehicle according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus. The data acquisition, storage, use, processing and the like in the technical scheme meet the relevant regulations of national laws and regulations.
Example 1
Fig. 1 is a flowchart of an automatic driving vehicle path planning method according to an embodiment of the present invention, where the method may be implemented by an automatic driving vehicle path planning apparatus, and the apparatus may be implemented in hardware and/or software, and the apparatus may be configured in an electronic device.
As shown in fig. 1, the method includes:
s110, determining the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle according to the perception information of the vehicle.
The solution may be performed by a vehicle control device of an autonomous vehicle, which may be deployed inside the autonomous vehicle, such as a vehicle controller. The vehicle control device may also be deployed at a remote vehicle control center, with vehicle control being achieved by communication with the autonomous vehicle. The vehicle control apparatus may acquire the perception information of the autonomous vehicle. It is readily understood that the sensing information may be collected based on various types of sensors configured for the autonomous vehicle, such as a radar, a camera, a gyroscope, and the like. The perception information may include radar data, laser point cloud data, driving scene images, inertial measurement data, and the like.
The vehicle control device may determine characteristics of each target in the own vehicle and the driving environment directly from the sensing information. For example, the vehicle control apparatus may calculate the own vehicle position and each target position from the radar data. The vehicle control apparatus may also process the perceived information to determine characteristics of the own vehicle and various obstacles in the driving environment. For example, the vehicle control apparatus may recognize the driving scene image, and determine the type of each obstacle in the driving scene based on the recognition result.
The motion characteristics of the vehicle can comprise the position, the running direction, the motion speed and the like of the vehicle. The navigation map feature may include road features of the host vehicle driving environment, such as road boundary lines, reference lines, etc.
The vehicle control device can determine the navigation map features through the sensing information acquired by the camera, the radar and other sensors, for example, the vehicle control device can identify the lane line features according to the lane line images acquired by the camera. The vehicle control device can also directly acquire navigation map information through a navigation platform and the like, and extract navigation map features from the navigation map information.
It will be appreciated that the obstacles may include dynamic obstacles, such as moving vehicles and pedestrians, and static obstacles, such as traffic barriers and road signs. The obstacle features may include the obstacle's type, position, direction of movement, movement speed, and the like.
It should be noted that in this embodiment, the obstacle feature may include an a priori feature of the obstacle. The prior feature may be a feature obtained by evaluating the risk of the obstacle in advance based on information such as the position and speed of the obstacle.
In one possible approach, the a priori features may include security loss and rate of change of security loss for obstacle matching.
Wherein the safety loss is determined based on a safety factor of the obstacle; the safety coefficient is the ratio of the current distance between the vehicle and the obstacle to the safety distance;
the rate of change of the safety loss may be determined based on the distance of the host vehicle from the obstacle and the relative speed.
It will be appreciated that the vehicle control apparatus may determine the safe distance of the host vehicle from each obstacle based on the speed information of the host vehicle, the speed information of the obstacle, and the response time. According to the safety distance and the distance information of the vehicle and the obstacles, the vehicle control device can calculate the safety coefficient of the vehicle and each obstacle. The safety factor may be used to evaluate the degree of risk of the obstacle relative to the host vehicle. After the safety coefficient is obtained, in order to improve the prediction accuracy of the prediction model, the vehicle control apparatus may normalize the safety coefficient to obtain a safety loss.
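As an illustrative sketch of the relationship described above, the following Python fragment computes a safety coefficient as the ratio of the current ego-to-obstacle distance to the safety distance, then normalizes it into a bounded safety loss. The sigmoid-style normalization and the function names are assumptions for illustration; the patent states only that the coefficient is normalized, not which mapping is used.

```python
import math

def safety_coefficient(current_dist: float, safe_dist: float) -> float:
    """Ratio of the current ego-obstacle distance to the safety distance.
    Values below 1.0 mean the obstacle is inside the safe envelope."""
    return current_dist / max(safe_dist, 1e-6)

def safety_loss(coefficient: float) -> float:
    """Squash the unbounded safety coefficient into (0, 1).
    A sigmoid centered at coefficient = 1 is one plausible choice
    (assumption, not the patent's formula): loss grows as the
    obstacle moves inside the safe envelope."""
    return 1.0 / (1.0 + math.exp(coefficient - 1.0))
```

With this mapping, an obstacle twice as far away as the safety distance yields a small loss, while one at zero distance yields a loss above 0.5, giving the prediction model a smooth, bounded risk feature.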
Specifically, for a dynamic obstacle in the obstacles, the safety coefficient can be represented by a ratio of a current distance between the vehicle and the dynamic obstacle to a safety distance, and the safety distance can be calculated based on the speed of the vehicle, the speed of the dynamic obstacle and the response time of the vehicle to driving decisions. The speed information of the host vehicle can indicate the movement direction of the host vehicle, and the speed information of the dynamic obstacle can indicate the movement direction of the dynamic obstacle.
Since the dynamic obstacles in a driving scene travel in different directions, the vehicle control device can analyze the host vehicle and each dynamic obstacle from both the lateral and longitudinal perspectives. From information such as the speed and acceleration of the host vehicle, the speed and acceleration of the dynamic obstacle, and the response time, the vehicle control device can calculate the lateral safety distance between the host vehicle and each dynamic obstacle, and from that lateral safety distance compute the lateral safety coefficient matching each dynamic obstacle. Likewise, from the same information it can calculate the longitudinal safety distance between the host vehicle and each dynamic obstacle, and from that longitudinal safety distance compute the longitudinal safety coefficient matching each dynamic obstacle.
1. The determining of the longitudinal safety distance may include:
(1) And determining the relative movement direction of the vehicle and the dynamic obstacle according to the speed information of the vehicle and the speed information of the dynamic obstacle.
It is easy to understand that the speed information of the host vehicle indicates its movement direction, and the speed information of the dynamic obstacle indicates the obstacle's movement direction. For the longitudinal safety distance, the vehicle control device can determine the relative movement direction of the host vehicle and the dynamic obstacle from these two movement directions: the speed of the dynamic obstacle is decomposed along the lateral and longitudinal axes of the host vehicle, and the relative movement direction is judged from the component of the obstacle's speed along the host vehicle's longitudinal axis. The vehicle control device can then apply a different longitudinal safety distance calculation for each relative movement direction.
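The decomposition step described above can be sketched as follows; the function name and the convention that headings are given in radians in a shared frame are illustrative assumptions:

```python
import math

def decompose_velocity(speed: float, heading: float, ego_heading: float):
    """Project an obstacle's velocity onto the ego vehicle's axes.

    Returns (v_lon, v_lat): the components along the ego driving
    direction and across it. The sign of v_lon indicates the relative
    movement direction (positive: same direction as the ego vehicle;
    negative: opposite direction)."""
    rel = heading - ego_heading
    v_lon = speed * math.cos(rel)   # along the ego longitudinal axis
    v_lat = speed * math.sin(rel)   # along the ego lateral axis
    return v_lon, v_lat
```

For example, an obstacle moving parallel to the ego vehicle yields a purely longitudinal component, while an oncoming obstacle yields a negative longitudinal component, selecting the opposite-direction safety distance formula.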
According to the scheme, the relative movement direction of the vehicle and the dynamic obstacles is determined according to the movement direction indicated by the speed information, and then the longitudinal safety distance between the vehicle and each dynamic obstacle is calculated according to different relative movement directions by utilizing the speed information of the vehicle, the speed information of the dynamic obstacles and the response time. The scheme realizes accurate longitudinal safety distance calculation, and is beneficial to ensuring the accuracy of safety coefficient calculation.
(2) And determining the longitudinal safety distance between the vehicle and each dynamic obstacle according to the relative movement direction, the speed information of the vehicle, the speed information of the dynamic obstacle and the response time.
It will be appreciated that the relative movement direction may be either the same direction or opposite directions; the speed information may include the speed, lateral speed information, and longitudinal speed information, where the longitudinal speed information may include the longitudinal speed, longitudinal deceleration, and longitudinal acceleration, and the lateral speed information may include the lateral speed.
<1> If the relative movement direction of the host vehicle and the dynamic obstacle is the same direction, the longitudinal safety distance between them is determined according to the longitudinal speed of the preceding traffic target, the longitudinal speed of the following traffic target, the maximum longitudinal deceleration, the maximum longitudinal acceleration, the minimum longitudinal deceleration, and the response time, where the maximum longitudinal deceleration, maximum longitudinal acceleration, and minimum longitudinal deceleration are determined based on statistics of traffic target attribute parameters.
If the relative movement direction of the host vehicle and the dynamic obstacle is the same direction, the vehicle control device can calculate the longitudinal safety distance between them from the longitudinal speed of the preceding traffic target, the longitudinal speed of the following traffic target, the maximum longitudinal deceleration, the maximum longitudinal acceleration, the minimum longitudinal deceleration, and the response time. The preceding and following traffic targets are distinct; either one may be the host vehicle and the other the dynamic obstacle. The maximum longitudinal deceleration, maximum longitudinal acceleration, and minimum longitudinal deceleration are determined based on statistics of traffic target attribute parameters, which may include the category, model, size, load range, acceleration range, deceleration range, and the like of the traffic target. The vehicle control device can collect the attribute parameters of traffic targets in the traffic environment and compute statistics for each category of traffic target, obtaining the maximum longitudinal deceleration, maximum longitudinal acceleration, and minimum longitudinal deceleration for that category; the values for a given dynamic obstacle are then taken from the category to which it belongs.
In one specific approach, the vehicle control device may determine the maximum displacement of the following traffic target within the response time from its longitudinal speed, the maximum longitudinal acceleration, and the response time. It may determine the maximum braking distance between the preceding and following traffic targets within the response time from the longitudinal speed of the following target, the longitudinal speed of the preceding target, the maximum longitudinal acceleration, the minimum longitudinal deceleration, the maximum longitudinal deceleration, and the response time. The sum of the following target's maximum displacement and the maximum braking distance is taken as the safety distance. Specifically, when the relative movement direction of the host vehicle and the dynamic obstacle is the same direction, the longitudinal safety distance can be expressed as:
$$d_{lon,safe} = \max\!\left(0,\; v_r t + \frac{1}{2} a_{lon,max,accel}\, t^2 + \frac{\left(v_r + t\, a_{lon,max,accel}\right)^2}{2\, a_{lon,min,brake}} - \frac{v_f^2}{2\, a_{lon,max,brake}}\right)$$

where $v_r$ is the longitudinal speed of the following traffic target, $v_f$ is the longitudinal speed of the preceding traffic target, $t$ is the response time, $a_{lon,max,accel}$ is the maximum longitudinal acceleration, $a_{lon,min,brake}$ is the minimum longitudinal deceleration, and $a_{lon,max,brake}$ is the maximum longitudinal deceleration.
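A direct transcription of the same-direction formula above into Python might look like this; the function name and argument order are illustrative, and the clamp at zero reflects that the safety distance cannot be negative:

```python
def lon_safe_distance_same_dir(v_r: float, v_f: float, t: float,
                               a_max_accel: float, a_min_brake: float,
                               a_max_brake: float) -> float:
    """Same-direction longitudinal safety distance: worst-case drift of
    the following target during the response time, plus its gentle
    (minimum-deceleration) braking distance, minus the preceding
    target's hardest (maximum-deceleration) braking distance."""
    v_r_t = v_r + t * a_max_accel            # rear speed after response time
    d = (v_r * t
         + 0.5 * a_max_accel * t ** 2        # drift during response time
         + v_r_t ** 2 / (2 * a_min_brake)    # rear target braking gently
         - v_f ** 2 / (2 * a_max_brake))     # front target braking hard
    return max(d, 0.0)
```

Note that a fast preceding target can drive the raw expression negative, in which case no extra gap is required and the clamp returns zero.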
<2> if the relative movement direction of the host vehicle and the dynamic obstacle is in the opposite direction, determining the longitudinal safety distance between the host vehicle and the dynamic obstacle according to the longitudinal speed of the host vehicle, the longitudinal speed of the dynamic obstacle, the minimum longitudinal deceleration, the maximum longitudinal acceleration, the minimum longitudinal deceleration of the host vehicle and the response time.
If the relative movement direction of the host vehicle and the dynamic obstacle is opposite, the vehicle control device can determine the longitudinal safety distance between them from the longitudinal speed of the host vehicle, the longitudinal speed of the dynamic obstacle, the minimum longitudinal deceleration, the maximum longitudinal acceleration, the host vehicle's own minimum longitudinal deceleration, and the response time.
Taking the case where the host vehicle travels forward and the dynamic obstacle travels in reverse as an example, the vehicle control device can determine the maximum speed the host vehicle reaches within the response time from its longitudinal speed, the maximum longitudinal acceleration, and the response time; similarly, it can determine the maximum speed the dynamic obstacle reaches within the response time. From the host vehicle's longitudinal speed, its maximum speed within the response time, its minimum longitudinal deceleration, and the response time, the device can calculate the host vehicle's maximum displacement; likewise, it can calculate the dynamic obstacle's maximum displacement. The sum of the two maximum displacements is taken as the safety distance. The case where the host vehicle travels in reverse and the dynamic obstacle travels forward is handled in the same way. When the relative movement direction is opposite, the longitudinal safety distance can be expressed as:
d_min = (v_1lon + v_1lon,t)/2 · t + v_1lon,t² / (2·a_lon,min,brake,correct) + (|v_2lon| + v_2lon,t)/2 · t + v_2lon,t² / (2·a_lon,min,brake)
v_1lon,t = v_1lon + t·a_lon,max,accel
v_2lon,t = |v_2lon| + t·a_lon,max,accel
where v_1lon represents the speed of the traffic target traveling forward, v_2lon represents the speed of the traffic target traveling in reverse, t represents the response time, a_lon,max,accel represents the maximum longitudinal acceleration, a_lon,min,brake represents the minimum longitudinal deceleration, and a_lon,min,brake,correct represents the minimum longitudinal deceleration of the host vehicle.
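As an illustration of the opposite-direction case above, the longitudinal safety distance can be sketched in Python as follows (the function and argument names are assumptions for illustration, not part of the embodiment):

```python
def longitudinal_safe_distance_opposite(
    v1_lon: float,               # host longitudinal speed (m/s, forward > 0)
    v2_lon: float,               # obstacle longitudinal speed (m/s, reverse < 0)
    t: float,                    # response time (s)
    a_max_accel: float,          # maximum longitudinal acceleration (m/s^2)
    a_min_brake: float,          # minimum longitudinal deceleration (m/s^2)
    a_min_brake_correct: float,  # host's minimum longitudinal deceleration (m/s^2)
) -> float:
    """Sum of the worst-case displacements of two opposing traffic targets:
    each accelerates at a_max_accel during the response time, then brakes."""
    v1_t = v1_lon + t * a_max_accel       # host speed after the response time
    v2_t = abs(v2_lon) + t * a_max_accel  # obstacle speed magnitude after the response time
    d1 = (v1_lon + v1_t) / 2 * t + v1_t ** 2 / (2 * a_min_brake_correct)
    d2 = (abs(v2_lon) + v2_t) / 2 * t + v2_t ** 2 / (2 * a_min_brake)
    return d1 + d2
```

For example, a host at 10 m/s meeting an oncoming obstacle at 8 m/s with t = 1 s, a maximum acceleration of 2 m/s², and minimum decelerations of 4 m/s² (obstacle) and 3 m/s² (host) yields a required gap of 56.5 m.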
2. The determination of the lateral safety distance may include:
(1) And determining the lateral safety distance between the vehicle and each dynamic obstacle according to the lateral speed of the vehicle, the lateral speed of the dynamic obstacle, the maximum lateral acceleration, the minimum lateral deceleration and the response time.
It will be appreciated that the lateral velocity information also includes lateral acceleration and lateral deceleration. For the lateral safety distance, the vehicle control apparatus may determine the lateral safety distance between the host vehicle and each dynamic obstacle according to their relative positions. The lateral safety distance may be taken as the total lateral displacement accumulated when the two traffic targets first approach each other at the maximum lateral acceleration and then brake at the minimum lateral deceleration until lateral motion stops.
In a specific scheme, the calculation formula of the lateral safety distance can be expressed as follows:
d_min,lat = μ + (v_1lat + v_1lat,t)/2 · t + v_1lat,t² / (2·a_lat,min,brake) − ((v_2lat + v_2lat,t)/2 · t − v_2lat,t² / (2·a_lat,min,brake))
v_1lat,t = v_1lat + t·a_lat,max,accel
v_2lat,t = v_2lat − t·a_lat,max,accel
where μ represents the fault-tolerance margin, v_1lat represents the lateral speed of the traffic target on the left, v_2lat represents the lateral speed of the traffic target on the right, t represents the response time, a_lat,max,accel represents the maximum lateral acceleration, and a_lat,min,brake represents the minimum lateral deceleration.
In this scheme, the safety distances are calculated separately from the lateral and longitudinal perspectives, so that the safety coefficient can be calculated accurately and the driving safety of the vehicle is guaranteed to the greatest extent. In some embodiments, μ is 0.2 m, which adds fault tolerance on top of the calculated safety distance to ensure running safety.
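A corresponding sketch of the lateral safety distance, with illustrative names; clamping the bracketed displacement term at zero when the two targets are already diverging is an added assumption rather than something the embodiment states:

```python
def lateral_safe_distance(
    v1_lat: float,        # lateral speed of the left target (m/s, rightward > 0)
    v2_lat: float,        # lateral speed of the right target (m/s, rightward > 0)
    t: float,             # response time (s)
    a_max_accel: float,   # maximum lateral acceleration (m/s^2)
    a_min_brake: float,   # minimum lateral deceleration (m/s^2)
    mu: float = 0.2,      # fault-tolerance margin (m)
) -> float:
    """Lateral displacement until both targets stop moving toward each other,
    plus the fault-tolerance margin mu."""
    v1_t = v1_lat + t * a_max_accel   # left target drifts right during the response time
    v2_t = v2_lat - t * a_max_accel   # right target drifts left during the response time
    d1 = (v1_lat + v1_t) / 2 * t + v1_t ** 2 / (2 * a_min_brake)
    d2 = (v2_lat + v2_t) / 2 * t - v2_t ** 2 / (2 * a_min_brake)
    return mu + max(d1 - d2, 0.0)
```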
After the transverse safety distance and the longitudinal safety distance between the vehicle and the dynamic obstacle are obtained, the transverse safety coefficient and the longitudinal safety coefficient can be calculated through the following calculation formulas:
lon_safe_coeff = lon_dis / d_min
lat_safe_coeff = lat_dis / d_min,lat
where lon_dis represents the current longitudinal distance between the host vehicle and the dynamic obstacle, d_min represents the longitudinal safety distance, and lon_safe_coeff represents the longitudinal safety coefficient; lat_dis represents the current lateral distance between the host vehicle and the dynamic obstacle, d_min,lat represents the lateral safety distance, and lat_safe_coeff represents the lateral safety coefficient.
After obtaining the lateral safety factor and the longitudinal safety factor, the vehicle control apparatus may determine a safety loss matching each dynamic obstacle based on the inverse gaussian model according to each safety factor.
While calculating the safety coefficients, the vehicle control device can also calculate the safety loss change rate from the distance and the relative speed between the host vehicle and the dynamic obstacle, so as to describe how the degree of danger the dynamic obstacle poses to the host vehicle changes.
The security loss rate of change may include a lateral security loss rate of change and a longitudinal security loss rate of change. The transverse safety loss change rate can be determined according to the ratio of the current transverse relative speed of the vehicle to the dynamic obstacle to the current transverse distance, and the longitudinal safety loss change rate can be determined according to the ratio of the current longitudinal relative speed of the vehicle to the dynamic obstacle to the current longitudinal distance.
Specifically, if the safety coefficient of a dynamic obstacle is smaller than a preset coefficient threshold, the safety loss matched with that dynamic obstacle is determined based on the inverse Gaussian model. According to the way the safety coefficient is calculated, the greater the current distance relative to the safety distance, the greater the safety coefficient and the lower the degree of risk posed by the dynamic obstacle. If the safety coefficient is greater than or equal to the preset coefficient threshold, the corresponding dynamic obstacle poses no danger or only a low degree of danger. The controller can directly set the safety loss of such obstacles to 0, with no need to perform the safety loss calculation.
If the safety coefficient is smaller than the preset coefficient threshold, the controller can use the inverse Gaussian model to constrain the magnitude of the safety loss matched with each dynamic obstacle to the interval (0, 1), which accelerates the convergence of the prediction model during training and improves the accuracy of its driving decisions.
The controller may determine a longitudinal safety loss from the longitudinal safety factor and a lateral safety loss from the lateral safety factor based on the inverse gaussian model; determining the safety loss matched with the dynamic obstacle according to the longitudinal safety loss, the transverse safety loss and a predetermined weight coefficient; wherein the weight coefficient may be determined based on a security loss rate of change.
Specifically, the calculation formulas of the longitudinal safety loss and the transverse safety loss can be expressed as:
lon_safe_cost=-1×[1-gaussian(K1-lon_safe_coeff)];
lat_safe_cost=-1×[1-gaussian(K2-lat_safe_coeff)];
the method comprises the steps of determining a longitudinal safety loss of a vehicle, wherein the lon_safe_cost represents the longitudinal safety loss, the lon_safe_coeff represents the longitudinal safety coefficient, the lat_safe_cost represents the transverse safety loss, the lat_safe_coeff represents the transverse safety coefficient, the gaussian represents the gaussian distribution, K1 and K2 represent preset coefficient thresholds, and K1 and K2 can be the same or different.
The controller can determine the overall safety loss of the moving target according to the lateral safety loss, the longitudinal safety loss, and the weight coefficients. Specifically, the calculation formula of the safety loss can be expressed as:
safe_cost=w_lon_safe×lon_safe_cost+w_lat_safe×lat_safe_cost;
where w_lon_safe represents the weight coefficient of the longitudinal safety loss, and w_lat_safe represents the weight coefficient of the transverse safety loss.
The weight coefficient may be statistically derived from historical security loss data, for example, the weight coefficient of lateral security loss may be set to 0.54, and the weight coefficient of longitudinal security loss may be set to 0.46. The weight coefficient may also be determined based on a security loss rate of change. The controller can determine the change rate of the safety loss according to the current distance between the vehicle and the dynamic obstacle and the current speed. The security loss rate of change may include a lateral security loss rate of change and a longitudinal security loss rate of change. The transverse safety loss change rate can be determined according to the ratio of the current transverse relative speed of the vehicle and the dynamic obstacle to the current transverse distance, and the longitudinal safety loss change rate can be determined according to the ratio of the current longitudinal relative speed of the vehicle and the dynamic obstacle to the current longitudinal distance.
The calculation formulas of the longitudinal and lateral loss change rates can be expressed as:
longitudinal safety loss change rate = lon_speed / lon_dis
lateral safety loss change rate = lat_speed / lat_dis
wherein lon_speed represents the relative speed of the vehicle and the dynamic obstacle in the longitudinal direction, and lat_speed represents the relative speed of the vehicle and the dynamic obstacle in the transverse direction.
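Combining the two losses with change-rate-derived weights could look like the following sketch; normalizing the two change rates so the weights sum to 1 is an assumption, since the embodiment also permits fixed statistical weights (e.g. 0.46/0.54):

```python
def total_safety_loss(lon_cost: float, lat_cost: float,
                      lon_speed: float, lon_dis: float,
                      lat_speed: float, lat_dis: float) -> float:
    """Weight the longitudinal and lateral losses by their loss change rates.

    The change rate is the relative speed divided by the current separation;
    a direction that is closing faster relative to its remaining distance
    receives a larger weight.
    """
    lon_rate = lon_speed / lon_dis   # longitudinal safety loss change rate
    lat_rate = lat_speed / lat_dis   # lateral safety loss change rate
    total = lon_rate + lat_rate
    w_lon = lon_rate / total if total else 0.5
    w_lat = 1.0 - w_lon
    return w_lon * lon_cost + w_lat * lat_cost
```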
According to the scheme, the safety loss and the safety loss change rate are used as priori features of the dynamic obstacle, and the convergence speed of the prediction model can be accelerated and the accuracy of the prediction information can be improved by extracting the features of the priori features.
S120, inputting the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle into a pre-trained prediction model, and determining the prediction information of the vehicle.
The vehicle control device may determine prediction information of the host vehicle by using the motion feature, the navigation map feature, and each obstacle feature of the host vehicle obtained from the sensing information as inputs of the prediction model. The prediction model can be determined based on a deep learning algorithm, and is used for predicting the motion of the vehicle according to the motion characteristics, the navigation map characteristics and the obstacle characteristics of the vehicle. The prediction model can be obtained by training sample data in advance, and the training sample data can comprise a plurality of groups so as to enable the prediction model to achieve good training effect. Each set of training sample data may include feature data and tag data. Similar to the application of the predictive model, the feature data may include a host vehicle motion feature, a navigational map feature, and at least one obstacle feature. The tag data may be own vehicle motion information matched with the feature data.
Fig. 2 is a schematic structural diagram of a prediction model provided according to an embodiment of the present invention. As shown in fig. 2, the prediction model may include an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure, and a long short-term memory structure. The input branch structure can be used to extract features from the host vehicle motion features, the obstacle features, and the navigation map features. The multi-head self-attention structure can be used to extract the relationship features among the input features, so as to improve the accuracy and prediction efficiency of the prediction model. The multi-layer perceptron structure may compress the features. The long short-term memory structure can extract temporal features across multiple frames of input data. In a preferred scheme, the long short-term memory structure can take the current frame and the previous four frames as input data and extract the temporal features among them, so as to achieve a good prediction effect.
In one possible implementation, the input branch structure includes a first input branch, a second input branch, and a third input branch;
the first input branch, the second input branch and the third input branch are respectively used for extracting the characteristics of the motion characteristic, the obstacle characteristic and the navigation map characteristic of the vehicle;
The input branch structure, the multi-head self-attention structure, the multi-layer perceptron structure and the long-period memory structure are sequentially connected.
It is easy to understand that each input branch in the input branch structure is used for extracting the characteristics of the motion characteristics, the obstacle characteristics and the navigation map characteristics of the vehicle respectively. Each input branch may include at least one full connection layer, and the structures of the input branches may be the same or different. The first input branch, the second input branch and the third input branch can respectively perform feature extraction on the motion feature, the obstacle feature and the navigation map feature of the vehicle to obtain a first feature extraction result, a second feature extraction result and a third feature extraction result.
The vehicle control apparatus may input the first feature extraction result, the second feature extraction result, and the third feature extraction result to the multi-head self-attention structure in a preset input manner to extract a relationship feature between the input features. After obtaining the output characteristics with the relation characteristics output by the multi-head self-attention structure, the vehicle control device can further extract deep characteristics of the output characteristics by utilizing the multi-layer perceptron structure, and input the deep characteristics obtained by extraction into the long-short-period memory structure so as to extract time sequence characteristics among multi-frame deep characteristics.
The prediction model structure of the scheme can realize multi-dimensional feature extraction of the motion features, the obstacle features and the navigation map features of the vehicle, and is beneficial to improving the robustness and the accuracy of the model.
Based on the above scheme, optionally, the multi-head self-attention structure may include three inputs; wherein the first input and the second input are each determined based on a first feature extraction result of the first input branch, a second feature extraction result of the second input branch, and a third feature extraction result of the third input branch; the third input is determined based on the first feature extraction result of the first input branch.
Specifically, the vehicle control device may fuse the first, second, and third feature extraction results according to a preset fusion manner and use the fused feature as the first input or the second input, where the first input and the second input may be the same or different. The vehicle control device may directly concatenate the first, second, and third feature extraction results in a certain order to obtain the fused feature. For example, if the first, second, and third feature extraction results are denoted A, B, and C respectively, the fused feature may be represented as [A, B, C].
The vehicle control device may also select partial feature extraction results from the first, second, and third feature extraction results in turn and interleave them to generate the fused feature. For example, the fused feature may be represented as [A1, B1, C1, A2, B2, C2], where A1 represents the first part of the first feature extraction result, A2 represents its second part, and B1, C1, B2, and C2 are defined analogously.
It is to be understood that the vehicle control apparatus may further use the first feature extraction result as the third input, and determine, based on the first, second, and third inputs, output features that capture the relationships between the host vehicle motion features and the navigation map and obstacle features.
The calculation formula of the output characteristic can be expressed as:
Attention(Q, K, V) = softmax(Q·K^T / √d_k)·V
where K represents the first input, V represents the second input, Q represents the third input, d_k represents the dimension of the first input, softmax represents the normalized exponential function, and Attention(Q, K, V) represents the output features of the multi-head attention mechanism.
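The output-feature computation above is standard scaled dot-product attention; a dependency-free Python sketch (list-based, illustrative only, with Q, K, V given as lists of row vectors):

```python
import math

def softmax(row):
    """Numerically stable softmax over one score row."""
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = len(K[0])
    # Q K^T / sqrt(d_k): one score row per query vector.
    scores = [[sum(q[i] * k[i] for i in range(d_k)) / math.sqrt(d_k) for k in K]
              for q in Q]
    weights = [softmax(row) for row in scores]
    # Weighted sum of the value rows for each query.
    return [[sum(w[j] * V[j][i] for j in range(len(V))) for i in range(len(V[0]))]
            for w in weights]
```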
The prediction model with the multi-head self-attention structure can improve the prediction efficiency and accuracy.
S130, determining a planned path of the automatic driving vehicle according to the prediction information of the vehicle.
The prediction information may include information such as a position, a traveling direction, a turning curvature, a speed, and an acceleration of the host vehicle. Based on the prediction information, the vehicle control apparatus may determine a planned path of the host vehicle. Meanwhile, according to the prediction information and the current motion information output by the prediction model, the vehicle control equipment can generate a driving decision of the vehicle in the current decision period. For example, the speed of the vehicle output by the prediction model is 10m/s, the current speed of the vehicle is 8m/s, and the vehicle control device can generate an acceleration instruction to control the power device of the vehicle to increase the speed. According to the driving decision, the vehicle control equipment can control the vehicle to drive according to the predicted planning path, and the driving safety and reliability of the vehicle are ensured.
According to the technical scheme, the motion features, the navigation map features, and at least one obstacle feature of the host vehicle are determined from the perception information of the host vehicle; these features are then input into a pre-trained prediction model to determine the prediction information of the host vehicle; and the planned path of the autonomous vehicle is determined according to that prediction information. The scheme solves the problems of low accuracy and high time cost caused by having the prediction model extract abstract features directly from the perception information, and can enhance the interpretability of the prediction model while improving its prediction accuracy and generation efficiency.
Example two
Fig. 3 is a flowchart of a method for planning a path of an autonomous vehicle according to a second embodiment of the present invention, which is based on the above embodiment. As shown in fig. 3, the method includes:
s210, determining the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle according to the perception information of the vehicle.
In this aspect, optionally, the vehicle movement feature includes vehicle position data, driving direction, curvature, speed data, acceleration data, and vehicle parameter data; the navigation map feature may include reference line position data, reference line direction, and reference line curvature; the obstacle characteristics further include distance data, obstacle parameter data, and direction of movement.
The vehicle position data may be position coordinates of the vehicle, the speed data may include information such as a lateral speed and a longitudinal speed, the acceleration data may include information such as a lateral acceleration and a longitudinal acceleration, and the vehicle parameter data may include information such as a length and a width of the vehicle.
It will be appreciated that the reference line may provide a road reference for the driving of the host vehicle, so the navigation map feature may comprise a reference line feature, which may include information such as reference line position data, reference line direction, and reference line curvature. In a driving scenario on a structured road, the reference line may be determined based on the lane line recognition result or acquired from the navigation platform. In a driving scenario on an unstructured road, the reference line may be generated by fitting the boundary recognition result of the unstructured road.
The difficulty of driving decisions mainly lies in the influence of obstacles in the driving environment on the host vehicle, so the vehicle control device can identify as many obstacles in the driving environment as possible and determine the features of each obstacle. In addition to the prior features of the obstacle, the obstacle features may also include information such as distance data between the obstacle and the host vehicle, obstacle parameter data, and movement direction. The distance data may include the lateral distance, the longitudinal distance, the straight-line distance, and the like. Similar to the vehicle parameter data in the host vehicle motion features, the obstacle parameter data may include information such as the length and width of the obstacle.
The motion characteristics, the navigation map characteristics and the obstacle characteristics of the vehicle are used as the input of the prediction model, so that the reliability and the stability of the prediction model can be improved.
S220, according to the motion state of the host vehicle, determining the weights matched with at least two candidate areas divided based on the host vehicle position, and determining the ranking result of the obstacles according to the weight of each candidate area, the prior feature of each obstacle, and the position of each obstacle.
In complex driving scenes such as traffic jams, a large number of obstacles may exist within the perception range of the host vehicle. Feeding the obstacle features of all of them into the prediction model greatly reduces prediction efficiency and requires substantial hardware resources. Moreover, some obstacles within the perception range may have no influence on the driving decision of the host vehicle, such as obstacles that pose no risk of collision. Therefore, the vehicle control device can screen the obstacles within the perception range and select, as inputs to the prediction model, those obstacles that are meaningful for the driving decision of the host vehicle.
To this end, in a preferred embodiment, the vehicle control device may determine, according to the motion state of the host vehicle, the weights matched with at least two candidate areas divided based on the host vehicle position, and determine the ranking result of the obstacles according to the weight of each candidate area, the prior feature of each obstacle, and the position of each obstacle. According to the ranking result, the vehicle control device can determine a preset number of dangerous obstacles among the obstacles, input the host vehicle motion features, the navigation map features, and the features of the preset number of dangerous obstacles into the pre-trained prediction model, and determine the prediction information of the host vehicle. The vehicle control apparatus may then generate the planned path of the autonomous vehicle based on the prediction information.
Specifically, the vehicle control apparatus may divide a plurality of candidate areas according to the host vehicle position, set the area size of each candidate region in the driving scene, and set a plurality of sets of weights for the candidate regions according to the motion state of the host vehicle. The motion state may include going straight, turning right, turning left, making a U-turn, and the like. Each motion state may correspond to one or more sets of candidate region weights; for example, the straight-driving state may correspond to weight set A.
Based on the weights of the candidate regions, the prior characteristics of the obstacles, and the positions of the obstacles, the vehicle control apparatus may determine a risk ranking result of the obstacles. The vehicle control apparatus may determine the candidate region to which each obstacle belongs, based on the position of each obstacle. And calculating risk assessment values of the obstacles according to the prior characteristics of the obstacles and the weights of the candidate areas. The risk evaluation values of the respective obstacles are ranked, and the vehicle control apparatus can obtain the ranking result of the obstacles.
S230, determining a preset number of dangerous barriers in each barrier according to the sorting result.
The vehicle control apparatus may select, according to the ranking result, a preset number of obstacles with high risk assessment values as dangerous obstacles. For example, the obstacles ranked in the top 64 by risk assessment value are determined as dangerous obstacles.
The vehicle control apparatus may also select, as the dangerous obstacle, an obstacle whose risk evaluation value is greater than a preset evaluation threshold value, based on the ranking result. For example, an obstacle having a risk assessment value of greater than 0.8 in the ranked result is taken as a risk obstacle.
In a specific example, the dangerous obstacle determination process may be as follows:
(1) At least one candidate region is determined based on the host vehicle location.
After obtaining the prior characteristics of each obstacle, the vehicle control device may divide the candidate region for the vehicle sensing range according to the vehicle position. The region division may be different according to different driving scenarios. Fig. 4 is a schematic view of candidate region division in a simple driving scenario according to an embodiment of the present invention. As shown in fig. 4, in a narrow one-way driving scenario, the vehicle control apparatus may divide 3 candidate areas, namely candidate area (1), candidate area (2) and candidate area (3), according to the current position of the host vehicle. The vehicle control device may also divide the sensing range of the vehicle according to the current position of the vehicle in a gradient manner according to the distance, for example, into 5 candidate regions, which are respectively a candidate region (1), a candidate region (2), a candidate region (3), a candidate region (4) and a candidate region (5).
Fig. 5 is a schematic diagram of candidate region division in a complex driving scenario according to an embodiment of the present invention. In a complex driving scenario such as traffic jam, the vehicle control apparatus may divide 9 candidate areas shown in fig. 5 with the candidate area (5) where the host vehicle is located as a center area, and the vehicle control apparatus may set the area size of each candidate area according to the driving scenario.
It should be noted that, the above-mentioned candidate region division according to the complexity of the driving scene is only one of the candidate region division modes, and the vehicle control device may also perform the candidate region division according to factors such as the lane where the host vehicle is located, the driving road section of the host vehicle, and the like. The present embodiment does not limit the division manner of the candidate region. The areas of the candidate regions may be the same or different, and the shapes of the candidate regions may be regular or irregular.
(2) And determining the weight of each candidate area according to the motion state of the vehicle and a preset weight distribution principle.
The movement state of the autonomous vehicle may include going straight, making a U-turn, turning left, turning right, reversing, and the like. The vehicle control apparatus may set the weight of each candidate region matched with a motion state according to the degree to which that motion state affects each candidate region. Taking the candidate region division shown in fig. 5 as an example, in the straight-driving state the affected candidate regions may include (3), (5), (6) and (9); if their degree of influence is ordered as (5), (6), (3), (9), the weights matched with candidate regions (1)-(9) in the straight-driving state may be, respectively: 1, 1, 1.5, 1, 2, 1.8, 1 and 1.5. The vehicle control apparatus may further increase the weight of a target candidate region in a gradient manner according to the order in which the candidate regions are affected by the motion state of the host vehicle. Again taking fig. 5 as an example, when the host vehicle sequentially affects candidate regions (5), (6), (3), (2) and (1) in the left-turn state, the weights matched with candidate regions (1)-(9) in the left-turn state may be, respectively: 1.2, 1.4, 1.6, 1, 2, 1.8, 1 and 1.
(3) And determining the candidate area of each obstacle according to the position of each obstacle.
The vehicle control apparatus may compare each obstacle position with each candidate region range, and determine a candidate region to which each obstacle belongs.
(4) And determining the sequencing result of the obstacles according to the weight of each candidate region, the candidate region of each obstacle and the safety loss of each obstacle.
In one possible implementation, the determining the ranking result of the obstacles according to the weight of each candidate area, the candidate area of each obstacle, and the safety loss of each obstacle includes:
determining the weight of the candidate area of each obstacle, and determining the weighted safety loss according to the weight of the candidate area of each obstacle and the safety loss of each obstacle;
and determining the sequencing result of the barriers according to the weighted safety loss of each barrier.
The vehicle control apparatus may determine the weight of each obstacle safety loss match based on the weight of each candidate region and the candidate region to which each obstacle belongs. By multiplying the safety loss of each obstacle by the matched weight, the vehicle control apparatus can obtain the weighted safety loss of each obstacle. The weighted safety losses of the respective obstacles are ranked, and the vehicle control apparatus may output the ranking result of the obstacles.
The scheme can determine the weighted safety loss of each obstacle, and is beneficial to realizing the reliability of obstacle risk degree evaluation.
(5) According to the ranking result, selecting a preset number of obstacles as dangerous obstacles in descending order of weighted safety loss.
The vehicle control apparatus may determine the degree of risk of each obstacle according to the weighted security loss in the result of the ranking of the obstacles, and select a preset number of obstacles having a relatively high degree of risk among the obstacles as the dangerous obstacle.
According to the scheme, a certain number of barriers are selected from the barriers to serve as dangerous barriers, so that the driving decision timeliness is guaranteed, and meanwhile, the safety and reliability of the driving decision are improved.
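The region-weighted ranking and top-k selection of dangerous obstacles might be sketched as follows; the dict layout and the convention that losses are negative (more negative meaning more dangerous, matching the loss formulas above) are assumptions:

```python
def select_dangerous_obstacles(obstacles, region_weights, top_k=64):
    """Rank obstacles by region-weighted safety loss and keep the top_k.

    Losses are negative, so ascending sort order places the most
    dangerous obstacles first.
    """
    def weighted_loss(obs):
        # Multiply each obstacle's safety loss by the weight of its candidate region.
        return region_weights[obs["region"]] * obs["safe_cost"]
    return sorted(obstacles, key=weighted_loss)[:top_k]
```

For instance, an obstacle with loss -0.9 in a region of weight 2.0 outranks one with loss -0.5 in a region of weight 1.0.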
S240, inputting the motion characteristics, the navigation map characteristics and the preset number of dangerous obstacle characteristics of the vehicle into a pre-trained prediction model, and determining the prediction information of the vehicle.
The vehicle control device can take the motion characteristics, the navigation map characteristics and the dangerous obstacle characteristics of the vehicle as the input of a prediction model to predict the running information of the vehicle so as to improve the prediction efficiency of the prediction model.
S250, determining a planned path of the automatic driving vehicle according to the prediction information of the vehicle.
In this scenario, optionally, the prediction information may include position data, driving direction, curvature, speed, acceleration, and interval time.
It is easy to understand that the position data may be a target position of the host vehicle, which may be represented by coordinates. The interval time may be the update period of the prediction information.
According to this scheme, the vehicle control apparatus can generate the driving trajectory of the host vehicle from the position data, driving direction, curvature, speed, acceleration, and interval time, and control the host vehicle to move to the target position along that trajectory, so as to ensure driving safety and reliability.
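Generating trajectory points from the prediction information can be sketched with a simple kinematic step over the interval time. This is a simplified constant-heading sketch under stated assumptions (curvature handling omitted, trapezoidal speed integration); the function name and signature are illustrative, not the disclosed method.

```python
import math

def next_trajectory_point(x, y, heading, speed, acceleration, dt):
    """Advance the pose one interval: integrate speed and acceleration over
    dt along the current heading (radians). Curvature is ignored here."""
    new_speed = speed + acceleration * dt
    avg_speed = (speed + new_speed) / 2.0  # trapezoidal average over dt
    nx = x + avg_speed * dt * math.cos(heading)
    ny = y + avg_speed * dt * math.sin(heading)
    return nx, ny, new_speed

# straight-line example: heading 0 rad, 10 m/s, no acceleration, 0.1 s interval
x, y, v = next_trajectory_point(0.0, 0.0, 0.0, 10.0, 0.0, 0.1)
```

Repeating this step with each new prediction yields a discrete driving trajectory toward the target position.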
According to the technical scheme, the motion feature, the navigation map feature, and at least one obstacle feature of the host vehicle are determined from the perception information of the host vehicle; a preset number of dangerous obstacles are screened out of the obstacles; the motion feature, the navigation map feature, and the preset number of dangerous obstacle features are input into a pre-trained prediction model to determine the prediction information of the host vehicle; and the planned path of the autonomous vehicle is determined from that prediction information. This solves the problems of low accuracy and high time cost caused by having the prediction model extract abstract features directly from the perception information, improves the prediction accuracy of the model and the efficiency of generating it, and enhances the interpretability of the prediction model.
Example III
Fig. 6 is a schematic structural diagram of an automatic driving vehicle path planning device according to a third embodiment of the present invention. As shown in fig. 6, the apparatus includes:
the feature determining module 310 is configured to determine a motion feature, a navigation map feature, and at least one obstacle feature of the host vehicle according to the perception information of the host vehicle; wherein the obstacle features comprise a priori features of the obstacle;
the first information prediction module 320 is configured to input the motion feature, the navigation map feature, and the at least one obstacle feature of the host vehicle into a pre-trained prediction model to determine the prediction information of the host vehicle; the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure, and a long short-term memory structure;
the first planned path determining module 330 is configured to determine a planned path of the autonomous vehicle according to the prediction information of the vehicle.
In this scenario, optionally, the a priori features include a security loss and a security loss rate of change;
wherein the safety loss is determined based on a safety factor of the obstacle; the safety coefficient is the ratio of the current distance between the vehicle and the obstacle to the safety distance;
The safety loss change rate is determined based on the distance between the vehicle and the obstacle and the relative speed.
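The safety coefficient and the two a priori features can be sketched as follows. The source only states that the coefficient is the ratio of the current distance to the safety distance; the specific loss mapping and change-rate formula below are illustrative assumptions.

```python
def safety_coefficient(current_distance, safe_distance):
    # ratio of the current host-obstacle distance to the safety distance;
    # values below 1.0 mean the obstacle is inside the safety envelope
    return current_distance / safe_distance

def safety_loss(coefficient):
    # illustrative mapping (assumption): loss grows as the coefficient drops
    # below 1, and is zero once the obstacle is beyond the safe distance
    return max(0.0, 1.0 - coefficient)

def safety_loss_rate(relative_speed, safe_distance):
    # illustrative change rate (assumption): a positive closing speed shrinks
    # the distance, so the loss rises at roughly relative_speed/safe_distance
    return relative_speed / safe_distance

coeff = safety_coefficient(current_distance=6.0, safe_distance=10.0)
loss = safety_loss(coeff)
```

Here an obstacle at 6 m with a 10 m safety distance yields a coefficient of 0.6 and, under the assumed mapping, a safety loss of 0.4.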
In one possible implementation, the input branch structure includes a first input branch, a second input branch, and a third input branch;
the first input branch, the second input branch and the third input branch are respectively used for extracting the characteristics of the motion characteristic, the obstacle characteristic and the navigation map characteristic of the vehicle;
the input branch structure, the multi-head self-attention structure, the multi-layer perceptron structure, and the long short-term memory structure are connected in sequence.
Based on the above scheme, optionally, the multi-head self-attention structure comprises three inputs; wherein the first input and the second input are each determined based on a first feature extraction result of the first input branch, a second feature extraction result of the second input branch, and a third feature extraction result of the third input branch; the third input is determined based on the first feature extraction result of the first input branch.
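The wiring of the three branch outputs into the attention structure can be sketched as follows. The source says two attention inputs are built from all three feature extraction results and the third from the first (host-vehicle) branch alone; treating them as key, value, and query respectively, and using a single-head scaled dot-product attention in place of the multi-head structure, are assumptions made for this sketch.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # single-head stand-in for the multi-head self-attention structure
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

d = 8  # illustrative feature dimension
ego_feat = np.random.rand(1, d)       # first input branch: host-vehicle motion feature
obstacle_feat = np.random.rand(5, d)  # second input branch: dangerous obstacle features
map_feat = np.random.rand(3, d)       # third input branch: navigation map feature

# Two attention inputs built from all three extraction results (assumed key/value);
# the third input comes from the first branch alone (assumed query).
kv = np.concatenate([ego_feat, obstacle_feat, map_feat], axis=0)
out = scaled_dot_product_attention(ego_feat, kv, kv)
```

Under this wiring, the host-vehicle feature attends over the ego, obstacle, and map features jointly, producing one fused vector that would then feed the multi-layer perceptron and LSTM stages.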
In a preferred embodiment, the apparatus further comprises:
the ranking result determining module is configured to determine, according to the motion state of the host vehicle, the weights matched to at least two candidate regions divided based on the position of the host vehicle, and to determine the ranking result of the obstacles according to the weight of each candidate region, the a priori feature of each obstacle, and the position of each obstacle;
The dangerous obstacle determining module is used for determining a preset number of dangerous obstacles in each obstacle according to the sorting result;
the second information prediction module is configured to input the motion feature, the navigation map feature, and the preset number of dangerous obstacle features of the host vehicle into the pre-trained prediction model to determine the prediction information of the host vehicle;
and the second planning path determining module is used for determining the planning path of the automatic driving vehicle according to the prediction information of the vehicle.
In this embodiment, optionally, the vehicle movement feature includes vehicle position data, driving direction, curvature, speed data, acceleration data, and vehicle parameter data; the navigation map features include reference line position data, reference line direction, and reference line curvature; the obstacle characteristics further include distance data, obstacle parameter data, and direction of movement.
On the basis of the scheme, optionally, the prediction information comprises position data, running direction, curvature, speed, acceleration and interval time.
The automatic driving vehicle path planning device provided by the embodiment of the invention can execute the automatic driving vehicle path planning method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 7 shows a schematic diagram of an electronic device 410 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 7, the electronic device 410 includes at least one processor 411, and a memory, such as a Read Only Memory (ROM) 412, a Random Access Memory (RAM) 413, etc., communicatively connected to the at least one processor 411, wherein the memory stores a computer program executable by the at least one processor, and the processor 411 may perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 412 or the computer program loaded from the storage unit 418 into the Random Access Memory (RAM) 413. In the RAM 413, various programs and data required for the operation of the electronic device 410 may also be stored. The processor 411, the ROM 412, and the RAM 413 are connected to each other through a bus 414. An input/output (I/O) interface 415 is also connected to bus 414.
Various components in the electronic device 410 are connected to the I/O interface 415, including: an input unit 416 such as a keyboard, a mouse, etc.; an output unit 417 such as various types of displays, speakers, and the like; a storage unit 418, such as a magnetic disk, optical disk, or the like; and a communication unit 419 such as a network card, modem, wireless communication transceiver, etc. The communication unit 419 allows the electronic device 410 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The processor 411 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 411 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 411 performs the various methods and processes described above, such as an autonomous vehicle path planning method.
In some embodiments, the autonomous vehicle path planning method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 418. In some embodiments, some or all of the computer program may be loaded and/or installed onto the electronic device 410 via the ROM 412 and/or the communication unit 419. When the computer program is loaded into RAM 413 and executed by processor 411, one or more steps of the autonomous vehicle path planning method described above may be performed. Alternatively, in other embodiments, the processor 411 may be configured to perform the autonomous vehicle path planning method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system that overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of automatically driving a vehicle path planning, the method comprising:
determining the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle according to the perception information of the vehicle; wherein the obstacle features comprise a priori features of the obstacle;
inputting the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle into a pre-trained prediction model to determine prediction information of the vehicle; the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure and a long short-term memory structure;
And determining a planned path of the automatic driving vehicle according to the prediction information of the vehicle.
2. The method of claim 1, wherein the a priori features include a security loss and a security loss rate of change;
wherein the safety loss is determined based on a safety factor of the obstacle; the safety coefficient is the ratio of the current distance between the vehicle and the obstacle to the safety distance;
the safety loss change rate is determined based on the distance between the vehicle and the obstacle and the relative speed.
3. The method of claim 1, wherein the input branch structure comprises a first input branch, a second input branch, and a third input branch;
the first input branch, the second input branch and the third input branch are respectively used for extracting the characteristics of the motion characteristic, the obstacle characteristic and the navigation map characteristic of the vehicle;
the input branch structure, the multi-head self-attention structure, the multi-layer perceptron structure and the long short-term memory structure are connected in sequence.
4. A method according to claim 3, wherein the multi-headed self-attention structure comprises three inputs; wherein the first input and the second input are each determined based on a first feature extraction result of the first input branch, a second feature extraction result of the second input branch, and a third feature extraction result of the third input branch; the third input is determined based on the first feature extraction result of the first input branch.
5. The method of claim 1, wherein after determining the host vehicle motion feature, the navigation map feature, and the at least one obstacle feature, the method further comprises:
according to the motion state of the vehicle, determining weights matched with at least two candidate areas based on vehicle position division, and determining the sequencing result of the obstacles according to the weights of the candidate areas, the prior characteristics of the obstacles and the positions of the obstacles;
determining a preset number of dangerous barriers in each barrier according to the sequencing result;
inputting the motion characteristics, the navigation map characteristics and the preset number of dangerous obstacle characteristics of the vehicle into a pre-trained prediction model, and determining prediction information of the vehicle;
and determining a planned path of the automatic driving vehicle according to the prediction information of the vehicle.
6. The method of claim 1, wherein the host vehicle movement characteristics include host vehicle position data, direction of travel, curvature, speed data, acceleration data, and vehicle parameter data; the navigation map features include reference line position data, reference line direction, and reference line curvature; the obstacle characteristics further include distance data, obstacle parameter data, and direction of movement.
7. The method of any one of claims 1-6, wherein the prediction information includes position data, direction of travel, curvature, speed, acceleration, and interval time.
8. An autonomous vehicle path planning apparatus, the apparatus comprising:
the feature determining module is used for determining the motion feature, the navigation map feature and at least one obstacle feature of the vehicle according to the perception information of the vehicle; wherein the obstacle features comprise a priori features of the obstacle;
the first information prediction module is used for inputting the motion characteristics, the navigation map characteristics and at least one obstacle characteristic of the vehicle into a pre-trained prediction model to determine the prediction information of the vehicle; the prediction model comprises an input branch structure, a multi-head self-attention structure, a multi-layer perceptron structure and a long short-term memory structure;
and the first planning path determining module is used for determining the planning path of the automatic driving vehicle according to the prediction information of the vehicle.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the autonomous vehicle path planning method of any of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the autonomous vehicle path planning method of any of claims 1-7.
CN202310288212.3A 2023-03-22 2023-03-22 Automatic driving vehicle path planning method, device, equipment and medium Pending CN116295496A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310288212.3A CN116295496A (en) 2023-03-22 2023-03-22 Automatic driving vehicle path planning method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310288212.3A CN116295496A (en) 2023-03-22 2023-03-22 Automatic driving vehicle path planning method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN116295496A true CN116295496A (en) 2023-06-23

Family

ID=86783142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310288212.3A Pending CN116295496A (en) 2023-03-22 2023-03-22 Automatic driving vehicle path planning method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN116295496A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116499487A (en) * 2023-06-28 2023-07-28 新石器慧通(北京)科技有限公司 Vehicle path planning method, device, equipment and medium
CN116499487B (en) * 2023-06-28 2023-09-05 新石器慧通(北京)科技有限公司 Vehicle path planning method, device, equipment and medium
CN117870688A (en) * 2024-01-12 2024-04-12 哈尔滨工业大学(威海) Unmanned vehicle navigation obstacle modeling method and system based on Gaussian probability model
CN117870688B (en) * 2024-01-12 2024-08-06 哈尔滨工业大学(威海) Unmanned vehicle navigation obstacle modeling method and system based on Gaussian probability model

Similar Documents

Publication Publication Date Title
CN110796856B (en) Vehicle lane change intention prediction method and training method of lane change intention prediction network
CN111739344B (en) Early warning method and device and electronic equipment
CN116358584A (en) Automatic driving vehicle path planning method, device, equipment and medium
CN116295496A (en) Automatic driving vehicle path planning method, device, equipment and medium
CN114212110A (en) Obstacle trajectory prediction method, obstacle trajectory prediction device, electronic device, and storage medium
CN112526999A (en) Speed planning method, device, electronic equipment and storage medium
CN115909749B (en) Vehicle running road risk early warning method, device, equipment and storage medium
CN116499487B (en) Vehicle path planning method, device, equipment and medium
CN117168488A (en) Vehicle path planning method, device, equipment and medium
CN114030483A (en) Vehicle control method, device, electronic apparatus, and medium
CN114475656A (en) Travel track prediction method, travel track prediction device, electronic device, and storage medium
CN115221722A (en) Simulation test method, model training method and device for automatic driving vehicle
CN116572946A (en) Vehicle detouring method, device, electronic equipment, vehicle and storage medium
CN115675534A (en) Vehicle track prediction method and device, electronic equipment and storage medium
CN115782919A (en) Information sensing method and device and electronic equipment
CN115837919A (en) Interactive behavior decision method and device for automatic driving vehicle and automatic driving vehicle
CN115959154A (en) Method and device for generating lane change track and storage medium
CN114333416A (en) Vehicle risk early warning method and device based on neural network and automatic driving vehicle
CN115909813B (en) Vehicle collision early warning method, device, equipment and storage medium
CN115798261B (en) Vehicle obstacle avoidance control method, device and equipment
CN117912295A (en) Vehicle data processing method and device, electronic equipment and storage medium
CN117707164A (en) Avoidance method, device, equipment and medium of unmanned sweeper
CN114582125B (en) Method, device, equipment and storage medium for identifying road traffic direction
CN115583258A (en) Automatic vehicle meeting control method and device, vehicle control equipment and medium
CN115214722A (en) Automatic driving method, device, electronic device, storage medium and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination