Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the technical scheme of the disclosure, the processes of collecting, storing, using, processing, transmitting, providing, disclosing, and the like of the personal information of the user comply with the provisions of relevant laws and regulations, and do not violate public order and good customs.
Fig. 1 is a schematic plan view of a traffic road provided by an embodiment of the present disclosure, and fig. 2 is a schematic plan view of another traffic road provided by an embodiment of the present disclosure. A plurality of vehicles travel on a traffic road, each traveling based on its own destination. The plurality of vehicles on the road may include a first vehicle 100 and a second vehicle 200. The first vehicle may be an autonomous vehicle or a manually driven vehicle, and the second vehicle may likewise be a manually driven vehicle or an autonomous vehicle.
The travel of the first vehicle 100 on the road may be affected by the second vehicle 200. It will be appreciated that the driving decisions of the first vehicle 100 may vary as a result of the different driving patterns adopted by the second vehicle 200.
In some examples, as shown in fig. 1, the second vehicle 200 may be a vehicle located around the first vehicle 100 at a distance less than a threshold, i.e., a vehicle within a preset range of the first vehicle 100. For example, the preset range is a circular range centered on the first vehicle with a preset distance as its radius, and the radius may be 1 m, 2 m, 3 m, 5 m, 8 m, 10 m, 12 m, 15 m, 17 m, 20 m, and the like. For example, when the second vehicle 200 is the vehicle 201 traveling in front of the first vehicle 100 on the same road, a deceleration of the vehicle 201 causes the first vehicle 100 to make a driving decision to apply the brakes, whereas constant-speed travel of the vehicle 201 causes the first vehicle 100 to make a driving decision to maintain its speed. For another example, when the second vehicle 200 is the vehicle 202 traveling in the lane adjacent to the right of the lane of the first vehicle 100, and the vehicle 202 changes lanes to the left, the first vehicle 100 may make a driving decision to apply the brakes.
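The preset-range criterion above can be sketched as a simple distance test. The function name and the use of planar coordinates are illustrative assumptions, not part of the disclosure:

```python
import math

def within_preset_range(first_pos, second_pos, radius_m=10.0):
    """Return True if the second vehicle lies inside the circular preset
    range centered on the first vehicle (radius in meters, e.g. 10 m).
    Positions are illustrative (x, y) road-plane coordinates."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return math.hypot(dx, dy) <= radius_m
```

A vehicle 5 m away would thus count as a second vehicle for a 10 m preset range, while a vehicle 50 m away would not.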
In other examples, the second vehicle 200 may also be a vehicle that is predicted, after analysis of the relative positions of the first vehicle 100 and the second vehicle 200 and of their driving states (e.g., driving direction and/or driving speed), to interact with the first vehicle in the future. The interaction between the first vehicle and the second vehicle may include, but is not limited to, the second vehicle traveling side by side with the first vehicle in a direction perpendicular to the direction of travel of the road, the second vehicle being located in front of or behind the first vehicle on the road, and other suitable interactions.
In theory, any other vehicle on the road may have an impact on the driving decisions of the first vehicle. Thus, in other examples, as shown in fig. 2, the second vehicle may also be any vehicle on the road other than the first vehicle.
The first vehicle may be provided with a plurality of detection devices. The first vehicle may sense objects (e.g., cars, bicycles, pedestrians, road blocks, etc.) in the surrounding road environment with the aid of the detection device, so that the first vehicle can obtain richer actual road information to facilitate the first vehicle to make a correct driving decision.
For example, the detection device on the first vehicle may comprise a camera. The camera may be a fixed camera; a plurality of fixed cameras may be arranged on the outer wall of the body of the first vehicle, facing outward, to collect environmental image information of the road around the first vehicle. The camera may also be a rotary camera, which may be arranged at the top of the vehicle body and collects environmental image information of the road around the first vehicle as it rotates.
The detection device on the first vehicle may also comprise an ultrasonic probe, for example. During travel of the first vehicle, the ultrasonic probe transmits an ultrasonic signal and receives the ultrasonic signal reflected back from objects in the surrounding environment of the first vehicle. By analyzing the positions from which the ultrasonic waves are reflected, objects in the road environment around the first vehicle can be identified.
The detection device on the first vehicle may also comprise other sensors with detection functions, such as laser sensors etc., without limitation.
As described above, the second vehicle has an influence on the driving decisions of the first vehicle, and the traffic state of the road in front of the second vehicle is therefore particularly important. Accurately identifying whether the road in front of the second vehicle is clear or congested makes it possible to predict the future driving behavior of the second vehicle, and thereby assist the first vehicle in making a correct driving decision based on that predicted driving behavior.
Based on this, embodiments of the present disclosure provide a method, apparatus, device, and storage medium for recognizing the traffic state of a lane. The method helps the first vehicle accurately identify the traffic state of the road in front of the second vehicle, improves the accuracy of predicting the future driving behavior of the second vehicle, and thereby helps the first vehicle make a correct driving decision. These are described separately below.
The method for identifying the traffic state of a lane provided by the embodiments of the disclosure can be applied to the first vehicle, for example, to a device for identifying the traffic state of a lane in the first vehicle. It should be understood that the apparatus for identifying the traffic state of a lane that performs the method of the present disclosure is not limited to being disposed in the first vehicle; it may also be disposed in another device, such as a roadside unit or another server, which transmits the identification result to the first vehicle after identifying the traffic state of the road in front of the second vehicle, so that the first vehicle makes a driving decision according to the identification result. The following description takes application to the first vehicle as an example.
In the case where there are a plurality of second vehicles, the first vehicle may identify the traffic states of the roads in front of the plurality of second vehicles simultaneously. The method shown in fig. 3 below is described taking the traffic state of the road in front of one second vehicle as an example; the process of identifying the traffic state of the road in front of any other second vehicle is similar and is not limited by the present disclosure.
Fig. 3 is a flow chart of a method of identifying the traffic state of a lane provided by some embodiments of the present disclosure. The method is executed by the first vehicle, specifically by the device for identifying the traffic state of a lane in the first vehicle. As shown in fig. 3, the method for identifying the traffic state of a lane includes steps S301 to S304.
Step S301, acquiring driving data of the second vehicle.
In some examples, the driving data of the second vehicle may include pose information of the second vehicle. For example, the direction in which the body (vehicle head) of the second vehicle is inclined. As another example, the direction in which the tires of the second vehicle are deflected.
In still other examples, the driving data of the second vehicle may also include prompt information sent by the second vehicle. For example, a prompt from a turn signal of the second vehicle. As another example, a prompt from a brake light of the second vehicle.
In still other examples, the driving data of the second vehicle may further include a driving state of the second vehicle, such as the current driving road of the second vehicle, the driving speed of the second vehicle, and movement information of the second vehicle. The movement information of the second vehicle includes, for example, the movement direction of the second vehicle in the lateral direction (the direction perpendicular to the lane extending direction). As another example, the lateral speed of the second vehicle when it moves to the left.
Of course, the driving data of the second vehicle may also be other suitable information. For example, the driving data of the second vehicle in step S301 may be historical driving data of the second vehicle (i.e., driving data at a historical time, a time before the current time), driving data of the second vehicle at the current time, or driving data of the second vehicle over a preset period that includes the current time and a plurality of consecutive times before it; this is not limited by the embodiments of the present disclosure.
For example, the first vehicle may acquire driving data of the second vehicle through the above-described detection device. For example, the first vehicle may obtain pose information of the second vehicle through the ultrasonic probe, and/or the first vehicle may obtain prompt information sent by the second vehicle through the camera.
Step S302, based on the driving data of the second vehicle, predicting a target lane corresponding to the second vehicle.
In a high-precision map, a large amount of road information is generally included, such as map data of sidewalks, lanes, intersections, signs, traffic lights, and the like, each composed of a plurality of pieces of data. Taking a lane as an example, the lane generation device of the high-precision map divides the lane into a plurality of continuous lane segments. In some examples, lane segments may be divided according to the extending direction of the lane and its changing trend. Taking a straight lane as an example, as shown in fig. 4, one straight lane includes at least two road segments 40 with different extending directions, where the road segment in one extending direction is one lane segment 41 and the road segment in the other extending direction is another lane segment 42. That is, the road generation device of the high-precision map may divide the lane into different lane segments according to changes in the extending direction of the lane. In other examples, the road generation device of the high-precision map may divide the lane into different lane segments according to changes in the lateral width of the lane, or according to other suitable factors, which is not limited herein.
It will be appreciated that the longitudinal (parallel to the direction of extension of the lane) lengths of the different lane segments may be different. The longitudinal length of a lane segment may be several meters or hundreds of meters, which is not limited herein.
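A lane segment as described above can be sketched as a small record carrying its identification information, extending direction, and longitudinal length. The field names are illustrative assumptions, not the disclosure's actual map data format:

```python
from dataclasses import dataclass

@dataclass
class LaneSegment:
    segment_id: str     # identification information of the lane segment
    heading_deg: float  # extending direction of the segment
    length_m: float     # longitudinal length (a few meters to hundreds of meters)

# A straight lane whose extending direction changes is split into two
# consecutive lane segments, as in fig. 4 (segments 41 and 42).
lane = [
    LaneSegment("41", heading_deg=0.0, length_m=120.0),
    LaneSegment("42", heading_deg=8.0, length_m=85.0),
]
```

Different segments of the same lane may thus have different longitudinal lengths, matching the note above.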
The target lane corresponding to the second vehicle refers to a lane segment predicted by the first vehicle into which the second vehicle will drive in the future. The target lane may be a lane segment of the current driving lane of the second vehicle or may be a lane segment of a lane adjacent to the current driving lane of the second vehicle.
After the first vehicle acquires the driving data of the second vehicle, it predicts the target lane of the second vehicle based on that data. It should be noted that the driving data of the second vehicle is not fixed; when the driving data changes, the target lane of the second vehicle predicted by the first vehicle may change accordingly. For example, if the driving data of the second vehicle at a first time differs from its driving data at a second time, the target lane predicted by the first vehicle at the first time may differ from the target lane predicted at the second time.
Step S303, determining the traffic state of the target lane based on the obstacle condition of the target lane.
The first vehicle can detect the presence of objects other than the second vehicle on the determined target lane by means of its own detection device. If an object other than the second vehicle is present on the target lane, it may be considered that an obstacle is present on the target lane. The obstacle may be an automobile, a bicycle, an electric vehicle, a pedestrian, an animal, a roadblock, etc., without limitation.
The obstacle condition output by the first vehicle at least indicates whether an obstacle is present on the target lane. In some examples, the obstacle condition output by the first vehicle includes only the presence or absence of an obstacle on the target lane. In other examples, the obstacle condition output by the first vehicle may further include other suitable obstacle information, such as the moving speed of the obstacle in the longitudinal direction and the width of the obstacle in the lateral direction, which is not limited herein.
The traffic state of the target lane represents the traffic state of the road the second vehicle will travel in the future. The traffic state may simply be classified as congested or clear, and, in the case where the target lane is congested, may further be classified into different degrees of congestion.
The traffic state of the target lane may be determined by detecting the target lane with the detection device of the first vehicle, may be determined based on real-time monitoring data of actual road conditions in the high-precision map, or may be determined in other suitable ways, which is not limited herein.
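One simple way to map the obstacle condition of step S303 to a traffic state is a density-based rule. The function, the dictionary-shaped obstacle records, and the density threshold below are all illustrative assumptions, not the disclosure's actual criterion:

```python
def classify_traffic_state(obstacles, lane_length_m):
    """Classify the target lane as clear or congested from the detected
    obstacle condition. 'obstacles' is a list of obstacle records; the
    5-per-100m density threshold is purely illustrative."""
    if not obstacles:
        return "clear"
    # obstacles per 100 m of the target lane as a crude congestion measure
    density = len(obstacles) / max(lane_length_m, 1.0) * 100.0
    return "heavily congested" if density >= 5.0 else "lightly congested"
```

An empty obstacle list yields a clear state; a single obstacle on a 100 m target lane yields light congestion, and ten obstacles yield heavy congestion, matching the graded classification described above.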
Step S304, determining a driving strategy of the first vehicle based on the traffic state of the target lane.
It should be noted that, in the case that the first vehicle is a manually driven vehicle, the recognition result of the traffic state on the target lane may be transmitted to the driver of the first vehicle by means of a screen display or a voice prompt, so as to facilitate the driver to make a driving decision. In the case where the first vehicle is an autonomous vehicle, the driving control unit of the autonomous vehicle may receive and automatically make a driving decision based on the traffic state of the target lane.
In an embodiment of the present disclosure, the method of recognizing the traffic state of a lane predicts the target lane of the second vehicle from the driving data of the second vehicle, and recognizes the traffic state of the target lane by detecting obstacles on the target lane. The method enables the first vehicle to acquire richer and more comprehensive actual road information, helping the first vehicle make driving decisions so as to improve its safety and driving efficiency.
In some embodiments, the method of identifying a lane traffic state may further comprise, prior to step S302, acquiring a lane set of the second vehicle.
The lane set comprises a plurality of lane segment sequences, each lane segment sequence corresponds to one driving behavior of the second vehicle, and each lane segment sequence is a sequence formed by the identification information of a plurality of continuously spliced lane segments.
The lane set of the second vehicle may be created by another device and then sent to the first vehicle, or may be created by the first vehicle itself, which is not limited herein.
Illustratively, the lane segment sequence is a sequence in which the identification information of a plurality of lane segments is arranged and combined according to the driving order of the second vehicle. The lane segment sequence represents the driving route that the second vehicle on the real road would follow after performing a driving behavior; the driving route can be divided into a plurality of lane segments, each lane segment corresponds to one piece of identification information, and the identification information of the lane segments on the driving route is combined in the driving order of the second vehicle to form the lane segment sequence.
The driving behavior may include braking, turning around, turning left, turning right, etc., without limitation. It is to be understood that any operation that adjusts the running manner of the vehicle may be regarded as a driving behavior.
In some examples, one lane segment sequence includes (p1, p2, p3, p4). As shown in fig. 5, lane segment p1 represents the current driving lane of the second vehicle on the real road, lane segment p2 represents the lane to the right of the current driving lane of the second vehicle, p3 is the right-turn lane segment at the intersection, and p4 is the lane segment after the intersection. The lane segments p1 to p4, spliced together in sequence, form the driving route the second vehicle would follow after turning right.
In other examples, one lane segment sequence includes (p5, p6, p7). As shown in fig. 6, lane segment p5 represents the current driving lane of the second vehicle on the real road, p6 is the turn-around lane segment, and p7 is the opposite-direction lane segment after the turn-around. The lane segments p5 to p7, spliced together in sequence, form the driving route the second vehicle would follow after turning around.
The lane set of the second vehicle may comprise all possible lane segment sequences of the second vehicle. For example, in the case where the second vehicle may turn left, go straight, turn right, or turn around, the lane set of the second vehicle may include a lane segment sequence corresponding to each of these driving behaviors. It will be appreciated that the lane set of the second vehicle includes the lane segments into which the second vehicle will drive in the future.
Taking the first vehicle creating the lane set of the second vehicle as an example, the first vehicle first determines all the optional driving behaviors of the second vehicle, and then determines the driving route that the second vehicle would follow under each optional driving behavior. It then determines the lane segments contained in each driving route and combines the identification information of the lane segments on each route, in driving order, into one lane segment sequence of the second vehicle. Once all lane segment sequences of the second vehicle are obtained, the lane set of the second vehicle has been created.
For example, as shown in fig. 5 and fig. 6, the first vehicle determines that all optional driving behaviors of the second vehicle are turning around and turning right, determines driving route 1 formed by the second vehicle when turning right and driving route 2 formed by the second vehicle when turning around, determines that lane segments p1 to p4 of driving route 1 form one lane segment sequence and that lane segments p5 to p7 of driving route 2 form another lane segment sequence, and thus successfully creates the lane set of the second vehicle. The lane set comprises lane segments p1 to p7.
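The creation process above can be sketched as building a mapping from each optional driving behavior to its lane segment sequence. The function name and the behavior keys are illustrative assumptions:

```python
def build_lane_set(route_segments_by_behavior):
    """Build a lane set: one lane segment sequence per optional driving
    behavior, each sequence being the ordered identification information
    of the lane segments along the route formed by that behavior."""
    lane_set = {}
    for behavior, segments in route_segments_by_behavior.items():
        lane_set[behavior] = tuple(segments)  # combined in driving order
    return lane_set

# The fig. 5 / fig. 6 example: right turn -> p1..p4, turn around -> p5..p7.
lane_set = build_lane_set({
    "right_turn": ["p1", "p2", "p3", "p4"],
    "turn_around": ["p5", "p6", "p7"],
})
```

The resulting lane set thus covers lane segments p1 to p7, one sequence per driving behavior.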
On this basis, step S302 may include predicting a target driving behavior of the second vehicle based on driving data of the second vehicle, determining a target lane segment sequence corresponding to the target driving behavior from the lane set, and selecting at least one lane segment in the target lane segment sequence as a target lane.
In some examples, the driving data of the second vehicle includes data representing the current driving condition of the second vehicle. For example, the tires of the second vehicle being deflected to the left can indicate that the second vehicle intends to move to the left. As another example, the left turn signal of the second vehicle blinking can also indicate that the second vehicle intends to move to the left.
In other examples, the driving data of the second vehicle includes data that can reflect the driving habits of the driver of the second vehicle. As another example, the driving data of the second vehicle includes historical driving data of the second vehicle; based on this historical data, the proportions of the different driving behaviors the second vehicle has performed on the current lane in the past can be obtained.
The target driving behavior of the second vehicle may be predicted based on data of one or more dimensions of the driving data of the second vehicle (the tire deflection direction of the second vehicle, indicator light prompt information of the second vehicle, historical driving data of the second vehicle, etc.).
It has been mentioned above that each lane segment sequence in the lane set corresponds to one driving behavior of the second vehicle. After the target driving behavior of the second vehicle is predicted, the target lane segment sequence corresponding to that behavior can be found in the lane set.
The target lane may include at least a lane segment of the target lane segment sequence in which the second vehicle is currently located, and may further include other lane segments of the target lane segment sequence that are connected to the lane segment. Embodiments of the present disclosure do not limit the number of lane segments in the target lane.
In this embodiment, the lane set of the second vehicle includes the lane segments into which the second vehicle will drive in the future, and at least one lane segment is selected from the lane set as the target lane according to the most likely driving behavior of the second vehicle. The lane segment most likely to be correct can thus be selected from a limited plurality of lane segments as the target lane, improving the accuracy of target lane prediction.
In some embodiments, predicting the target driving behavior of the second vehicle based on the driving data of the second vehicle may specifically include taking the driving data of the second vehicle as input data of a trajectory prediction algorithm, acquiring the predicted trajectory output by the algorithm, and determining the target driving behavior of the second vehicle based on the predicted trajectory.
The trajectory prediction algorithm can achieve better prediction accuracy after deep learning. The trajectory prediction algorithm may include the VectorNet algorithm, MultiPath++, or other suitable algorithms, without limitation here.
In this embodiment, the driving data of the second vehicle includes at least the input data required by the trajectory prediction algorithm. In this way, after the driving data of the second vehicle is taken as input, the trajectory prediction algorithm can normally output the prediction data.
The prediction data output by the trajectory prediction algorithm is a plurality of trajectory points, and the predicted trajectory output by the algorithm can be obtained by connecting adjacent trajectory points.
From the predicted trajectory output by the trajectory prediction algorithm, the future driving route of the vehicle, as predicted from the current position of the second vehicle, can be known, so that the target driving behavior corresponding to that driving route can be determined.
Illustratively, as shown in fig. 7, the second vehicle is in the second lane from the left, and the predicted trajectory output by the trajectory prediction algorithm (shown in fig. 7 as a black bold dashed line) passes in order through the second lane from the left and the third lane from the left, then travels straight through the intersection ahead. It may be determined that the target driving behavior corresponding to this driving route is a lane change to the right.
It is to be understood that, in the scheme in which the method of recognizing the traffic state of a lane uses the trajectory prediction algorithm to determine the target lane segment sequence, the steps of determining the target driving behavior of the second vehicle based on the predicted trajectory and determining the target lane segment sequence corresponding to that behavior from the lane set may be regarded as a process of matching the predicted trajectory output by the trajectory prediction algorithm against the plurality of lane segment sequences in the lane set.
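This matching process can be sketched as scoring each lane segment sequence by how many predicted trajectory points fall inside the footprints of its segments. The function, the axis-aligned segment footprints, and all names below are illustrative assumptions, not the disclosure's actual matching method:

```python
def match_trajectory_to_sequence(trajectory, lane_set, segment_boxes):
    """Match a predicted trajectory (list of (x, y) trajectory points)
    against the lane segment sequences in the lane set; the sequence
    whose segments cover the most trajectory points is taken as the
    target lane segment sequence. 'segment_boxes' maps a segment id to
    an (xmin, ymin, xmax, ymax) footprint (a simplification)."""
    def covered(pt, box):
        x, y = pt
        xmin, ymin, xmax, ymax = box
        return xmin <= x <= xmax and ymin <= y <= ymax

    best_behavior, best_score = None, -1
    for behavior, sequence in lane_set.items():
        score = sum(
            1 for pt in trajectory
            if any(covered(pt, segment_boxes[s]) for s in sequence)
        )
        if score > best_score:
            best_behavior, best_score = behavior, score
    return best_behavior
```

The driving behavior of the best-matching sequence is then the predicted target driving behavior.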
In this embodiment, the target driving behavior of the second vehicle is predicted using a trajectory prediction algorithm whose deep learning has been completed. Because such an algorithm has higher prediction accuracy, the accuracy with which the first vehicle predicts the target driving behavior of the second vehicle can be improved.
The actual travel trajectory of the second vehicle may not coincide with the predicted trajectory, since an obstacle present in front of the second vehicle may affect its actual travel trajectory.
Thus, in some embodiments, determining the target driving behavior of the second vehicle based on the predicted trajectory may include detecting the outline and size of an obstacle located at least in front of the second vehicle and representing the obstacle with a convex hull, correcting the predicted trajectory if the predicted trajectory overlaps with the position data of the convex hull, and determining the target driving behavior of the second vehicle based on the corrected predicted trajectory.
A convex hull is a geometric figure that can represent shape and area in the high-precision map. The convex hull may be a closed polygon formed by connecting a plurality of peripheral points. The convex hull may represent, in the high-precision map, the vertical projection range (i.e., the lateral-longitudinal projection range) of the obstacle on the road surface of the actual lane, and the position data of the convex hull in the high-precision map represents the position of the obstacle on the road surface.
After the first vehicle detects the shape and size of the obstacle located at least in front of the second vehicle by means of its own detection device, a corresponding convex hull can be constructed in the high-precision map based on the shape and size of the obstacle to represent the obstacle on the lane.
Compared with representing the obstacle by a center point, representing the obstacle by a convex hull enables the first vehicle to understand more fully and specifically the obstruction of the lane and its impact on vehicle traffic in the actual environment.
If the predicted trajectory output by the trajectory prediction algorithm overlaps with the position data of the convex hull, the predicted travel trajectory of the second vehicle passes through the obstacle, which obviously does not conform to the actual situation.
Thus, in this case, the predicted trajectory output by the trajectory prediction algorithm may be corrected by a correction algorithm so that the corrected predicted trajectory no longer overlaps with the position data of the convex hull, indicating that the second vehicle on the real road may continue to travel around the obstacle in front of it.
As shown in fig. 8, the black-filled regions in fig. 8 are convex hulls, indicating that two vehicles have collided and temporarily cannot move, forming an obstacle in the lane and occupying the first and second lanes on the left. To have the second vehicle turn left, the trajectory prediction algorithm predicts that the second vehicle needs to drive into the first lane on the left, and thus outputs predicted trajectory I. However, predicted trajectory I overlaps with the position data of the convex hulls, which would mean the second vehicle passes through the obstacle, obviously contrary to reality.
The correction algorithm can correct predicted trajectory I based on the position data of the convex hulls to obtain predicted trajectory II, which bypasses the obstacle ahead, enters the first lane on the left, and continues to travel. Provided the trajectory prediction algorithm has predicted the driving behavior of the second vehicle correctly, predicted trajectory II is clearly closer to reality than predicted trajectory I and is the trajectory the second vehicle will actually travel.
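The overlap test that triggers the correction can be sketched as a point-in-convex-polygon check over the trajectory points. The function names are illustrative; a production system would also test the line segments between points:

```python
def point_in_convex_hull(pt, hull):
    """True if pt lies inside a convex hull given as counter-clockwise
    vertices (a closed polygon of peripheral points)."""
    x, y = pt
    n = len(hull)
    for i in range(n):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % n]
        # negative cross product: pt is right of this edge, so outside a CCW hull
        if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0:
            return False
    return True

def trajectory_overlaps_hull(trajectory, hull):
    """The predicted trajectory overlaps the convex hull's position data
    if any trajectory point falls inside the hull; in that case the
    trajectory must be corrected to bypass the obstacle."""
    return any(point_in_convex_hull(pt, hull) for pt in trajectory)
```

In the fig. 8 scenario, predicted trajectory I would return True here and be handed to the correction algorithm, while corrected predicted trajectory II would return False.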
In this embodiment, the obstacle in front of the second vehicle is represented by a convex hull, and the predicted trajectory is corrected when it overlaps with the position data of the convex hull, so as to obtain a predicted trajectory more in line with the actual situation. The target driving behavior of the second vehicle in the actual environment can thereby be predicted more comprehensively and specifically, improving the accuracy of predicting the target driving behavior.
In some embodiments, the method of identifying a lane traffic state may further include acquiring road traffic information prior to step S302.
The first vehicle may acquire road traffic information using its own detection device. For example, the first vehicle may acquire traffic light information from the traffic light images captured by the camera.
The road traffic information includes at least traffic light information, and the road traffic information may also include road obstacle information, road surface water information, and the like, which is not limited herein.
Predicting the target driving behavior of the second vehicle based on the driving data of the second vehicle may specifically include predicting the target driving behavior of the second vehicle based on the driving data of the second vehicle and the road traffic information.
For example, when the head of the second vehicle is offset to the left and there is a lateral speed to the left, the first vehicle may predict that the target driving behavior of the second vehicle is lane change to the left from the lane in which the second vehicle is located.
For example, when the head of the second vehicle is facing forward, the vehicle speed is decreasing, and the forward traffic signal is red, the first vehicle may predict that the target driving behavior of the second vehicle is braking.
In this embodiment, road traffic information is combined with the driving data of the second vehicle, so that the first vehicle obtains richer and more comprehensive actual road information, which improves the accuracy with which the first vehicle predicts the target driving behavior of the second vehicle.
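The two worked examples above can be captured, as a purely illustrative rule-based sketch, in a short Python function. The sign conventions, threshold of zero, and behavior labels are all assumptions made for illustration, not parameters from the disclosure.

```python
# Illustrative sketch only: rule-based behavior prediction combining
# driving data (heading offset, lateral speed, acceleration) with road
# traffic information (the traffic light ahead). All conventions assumed.
def predict_behavior(heading_offset: float, lateral_speed: float,
                     accel: float, light_ahead: str) -> str:
    """heading_offset and lateral_speed: positive = leftward; accel in m/s^2."""
    if heading_offset > 0 and lateral_speed > 0:
        return "lane_change_left"   # head offset left + moving left
    if heading_offset < 0 and lateral_speed < 0:
        return "lane_change_right"
    if accel < 0 and light_ahead == "red":
        return "braking"            # decelerating toward a red light
    return "keep_lane"
```

A real predictor would of course use a learned trajectory model rather than hand-written rules; this merely shows how road traffic information enriches the decision inputs.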
It should be noted that the present embodiment and the embodiment using the track prediction algorithm may each predict the target driving behavior of the second vehicle on its own, or the two may be combined to predict the target driving behavior of the second vehicle.
In some embodiments, the number of lane segments in the target lane may be determined based on the speed of the second vehicle. The obtained driving data of the second vehicle includes a vehicle speed of the second vehicle, which may refer to the speed of the second vehicle in the longitudinal direction or its speed in the actual traveling direction. Selecting at least one lane segment in the target lane segment sequence as the target lane may specifically include determining a distance threshold based on the speed of the second vehicle, the distance threshold being positively correlated with the speed of the second vehicle, and selecting, from the plurality of lane segments in the target lane segment sequence, at least one lane segment whose distance from the second vehicle in the lane extension direction is less than or equal to the distance threshold as the target lane.
The distance threshold is a reference for selecting the lane segments of the target lane. The distance threshold may be close or equal to the distance traveled by the second vehicle within a preset duration. For example, the distance threshold may be equal to the distance traveled by the second vehicle within 20 seconds, or within 30 seconds. The embodiments of the present disclosure do not limit the preset duration.
The faster the speed of the second vehicle, the farther the second vehicle travels within the preset duration, and thus the greater the distance threshold may be. It will be appreciated that selecting the lane segments of the target lane using the distance threshold causes the target lane to include the lane segment in which the second vehicle is currently located and the lane segments it will pass through within the preset duration thereafter.
As shown in fig. 5, the target lane segment sequence is illustrated as including lane segments p1 to p4. For a second vehicle traveling at a lower speed, the distance threshold may be d1, and the first vehicle may select lane segments p1 and p2 as the target lane; for a second vehicle traveling at a higher speed, the distance threshold may be d2, and the first vehicle may select lane segments p1 to p3 as the target lane.
When the target lane includes a plurality of lane segments, the target lane is a plurality of lane segments that are continuously spliced to each other.
In this embodiment, the lane segments of the target lane are selected using a distance threshold positively correlated with the vehicle speed, so that the target lane includes the lane segment in which the second vehicle is currently located and the lane segments it will pass through within the preset duration thereafter. This facilitates subsequent identification of the lane traffic state encountered by the second vehicle within that period.
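The speed-dependent selection above can be sketched as follows, purely for illustration. The 20-second horizon is one of the example durations mentioned earlier; the `(segment_id, distance)` representation of lane segments is an assumption made for this sketch.

```python
# Illustrative sketch only: select target-lane segments whose distance
# from the second vehicle, along the lane extension direction, is within
# a threshold positively correlated with the vehicle's speed.
PRESET_DURATION_S = 20.0  # one of the example preset durations above

def select_target_lane(segments, vehicle_speed_mps):
    """segments: list of (segment_id, distance_from_vehicle_m) tuples,
    ordered along the lane extension direction."""
    threshold = vehicle_speed_mps * PRESET_DURATION_S  # positive correlation
    return [sid for sid, dist in segments if dist <= threshold]
```

With the fig. 5 example, a slower vehicle's threshold keeps only p1 and p2, while a faster vehicle's larger threshold also takes in p3.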
In some examples, the traffic state of the target lane may be determined directly from the obstacle condition. For example, if the obstacle condition indicates that an obstacle exists on the target lane, the traffic state indicates that the target lane is in a congestion state; if the obstacle condition indicates that no obstacle exists on the target lane, the traffic state indicates that the target lane is in a clear state. The congestion state indicates that there is an obstacle in the target lane in front of the second vehicle, and the clear state indicates that there is no such obstacle.
In this example, determining the traffic state of the target lane is simplified, and the efficiency with which the first vehicle identifies the traffic state of the target lane is improved, thereby improving the efficiency with which the first vehicle predicts the future driving behavior of the second vehicle based on that traffic state.
In some embodiments, the obstacle condition characterizes the presence of an obstacle on the target lane and attribute characteristics of that obstacle, including its moving speed, and the traffic state of the target lane may also characterize the congestion level of the target lane. Determining the traffic state of the target lane according to the obstacle condition may include determining the moving speed of the obstacle on the target lane in the extension direction of the target lane based on the obstacle condition, and obtaining the congestion level of the target lane according to that moving speed. The moving speed of the obstacle in the extension direction of the target lane may be inversely correlated with the congestion level of the target lane.
The moving speed of the obstacle in the extension direction of the target lane can represent the degree of congestion on the target lane. For example, if the obstacle moves slowly in the longitudinal direction and vehicles cannot pass it, the slowing effect on all vehicles (including the second vehicle) on the target lane is severe, so the target lane is severely congested. For another example, if the obstacle moves quickly in the longitudinal direction, the vehicles behind it are only slightly slowed, and the moving speeds of all vehicles (including the second vehicle) on the target lane may remain relatively high, so the target lane is only slightly congested.
In some examples, the congestion state of the target lane is classified into three levels: a first-level congestion state, a second-level congestion state, and a third-level congestion state, where the congestion degree corresponding to the first-level congestion state is slight and that corresponding to the third-level congestion state is severe.
In the case where the obstacle condition includes an obstacle whose moving speed in the lane extension direction is greater than or equal to the speed of the second vehicle in the lane extension direction in the driving data of the second vehicle, the second vehicle may be considered to be, with high probability, in a car-following state; its driving behavior is unlikely to change suddenly, and the obstacle only slightly slows the second vehicle, so the target lane is determined to be in the first-level congestion state. Thus, the recognition result output by the first vehicle that the target lane is in a congestion state may include that the target lane is in the first-level congestion state.
In the case where the obstacle condition includes an obstacle whose moving speed in the lane extension direction is greater than zero and less than the speed of the second vehicle in the lane extension direction in the driving data of the second vehicle, the obstacle may be considered to slow the second vehicle moderately, so the target lane is determined to be in the second-level congestion state. Thus, the recognition result output by the first vehicle that the target lane is in a congestion state may include that the target lane is in the second-level congestion state.
In the case where the obstacle condition includes an obstacle whose moving speed in the lane extension direction is equal to zero, the obstacle may be considered a stationary object other than a traffic participant, such as a cone or a fence; the obstacle severely slows the second vehicle, so the target lane is determined to be in the third-level congestion state. Thus, the recognition result output by the first vehicle that the target lane is in a congestion state may include that the target lane is in the third-level congestion state.
In other examples, the first vehicle may divide the congestion state of the target lane into two levels, four levels, and so on, which is not limited herein. In addition, the criterion for judging how strongly the moving speed of the obstacle in the lane extension direction slows the second vehicle may also differ. For example, if the moving speed of the obstacle in the lane extension direction is greater than or equal to 50 km/h, the obstacle may be considered to slow the second vehicle only slightly, and the target lane is determined to be in the first-level congestion state; if that moving speed is less than 10 km/h, the obstacle may be considered to slow the second vehicle severely, and the target lane is determined to be in the third-level congestion state.
In this embodiment, the congestion state of the target lane is classified according to the moving speed of the obstacle in the longitudinal direction, so that the first vehicle obtains richer and more comprehensive actual road information, which helps the first vehicle make driving decisions.
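The three-level classification above can be sketched directly, as a non-limiting illustration, from the three cases just described. The function name and the integer encoding of the levels are assumptions for this sketch only.

```python
# Illustrative sketch only: classify the congestion level of the target
# lane from the obstacle's moving speed along the lane extension
# direction, relative to the second vehicle's speed in that direction.
def congestion_level(obstacle_speed: float, second_vehicle_speed: float) -> int:
    """Returns 1 (slight), 2 (moderate), or 3 (severe)."""
    if obstacle_speed == 0:
        return 3  # stationary obstacle, e.g. a cone or fence
    if obstacle_speed >= second_vehicle_speed:
        return 1  # second vehicle is likely just car-following
    return 2      # obstacle slower than the second vehicle: moderate slowing
```

The ordering of the checks matters: a speed of zero must be tested first, and the remaining cases implement the inverse correlation between obstacle speed and congestion level.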
In other embodiments, the obstacle condition characterizes the presence of an obstacle on the target lane and attribute characteristics of that obstacle, including its size in the lateral direction, and the traffic state of the target lane may also characterize the congestion level of the target lane. Determining the traffic state of the target lane according to the obstacle condition may include determining the size of the obstacle on the target lane in the lateral direction based on the obstacle condition, and obtaining the congestion level of the target lane according to that size. The size of the obstacle in the lateral direction may be positively correlated with the congestion level of the target lane.
In this embodiment, the convex hull may represent the orthographic projection range of the obstacle on the actual lane on the road surface in the high-precision map, and the position data of the convex hull in the high-precision map represents the position of the obstacle on the road surface.
The size of the obstacle in the lateral direction directly affects the time and detour distance required for a vehicle to go around the obstacle, and thus the degree of congestion on the target lane. For example, if the obstacle is large in the lateral direction, the time and detour distance required to go around it are long, so all vehicles (including the second vehicle) on the target lane are severely slowed and the target lane is severely congested. Conversely, if the obstacle is small in the lateral direction, the time and detour distance required are short, so all vehicles (including the second vehicle) on the target lane are only slightly slowed and the target lane is only slightly congested.
In this embodiment, the congestion state of the target lane is classified according to the size of the obstacle in the lateral direction, so that the first vehicle obtains richer and more comprehensive actual road information, which helps the first vehicle make driving decisions.
In some road scenarios, the first vehicle may have a detection blind area with respect to the target lane. For example, as shown in fig. 9, a large vehicle (e.g., a heavy truck, a bus, etc.) between the first vehicle and the target lane may block the detection devices of the first vehicle from detecting the target lane, resulting in a detection blind area for the target lane.
In the case where the first vehicle determines that a detection blind area exists for the target lane yet recognizes that the target lane is in a clear state, the determination that the target lane is in the clear state may be inaccurate.
In some embodiments, in a case where the first vehicle determines, at the first time, that the traffic state of the target lane is used to characterize that the target lane is in the clear state, as shown in fig. 10, the method for identifying the traffic state of the lane further includes steps S305 to S307.
Step S305 is to determine whether or not the first vehicle has a detection blind area for the target lane.
It should be noted that, after receiving the feedback detection signal, the detection device of the first vehicle may determine, based on that signal, whether a detection blind area exists in the actual road environment.
By way of example, it is determined whether the detection blind area overlaps with the target lane, and where it does overlap, the target lane is determined to have a detection blind area.
And step S306, monitoring whether the speed of the second vehicle at the second moment is lower than the speed of the second vehicle at the first moment based on the driving data of the second vehicle under the condition that the first vehicle has a detection blind area on the target lane.
In the case where the first vehicle determines that a detection blind area exists for the target lane and recognizes that the target lane is in a clear state, the first vehicle continuously monitors whether the speed of the second vehicle has decreased compared with its speed at the moment the target lane was recognized as clear.
Step S307, if the speed of the second vehicle at the second moment is lower than the speed of the second vehicle at the first moment, the traffic state of the target lane is used for representing that the target lane is in a congestion state.
If the speed of the second vehicle at the second moment is lower than its speed at the first moment, this indicates that an obstacle possibly present in the detection blind area of the target lane may be slowing the second vehicle. Thus, the traffic state of the target lane characterizes the target lane as being in a congestion state.
In this embodiment, when the first vehicle determines that a detection blind area exists for the target lane and identifies the target lane as being in a clear state, the identification result is corrected to the congestion state once a decrease in the speed of the second vehicle is detected, so as to avoid the inaccuracy caused by the detection blind area. This improves the accuracy of the identification result and facilitates the driving decisions the first vehicle makes based on the identified traffic state of the target lane.
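Steps S305 to S307 amount to a small correction rule, which can be sketched as follows for illustration only; the string state labels and function signature are assumptions of this sketch.

```python
# Illustrative sketch only of steps S305-S307: when a detection blind
# area overlaps the target lane and the lane was judged clear, downgrade
# the result to "congested" if the second vehicle is observed to slow
# down between the first and second moments.
def corrected_state(initial_state: str, has_blind_area: bool,
                    speed_t1: float, speed_t2: float) -> str:
    if initial_state == "clear" and has_blind_area and speed_t2 < speed_t1:
        # A hidden obstacle in the blind area is probably slowing it down.
        return "congested"
    return initial_state
```

Note that without a blind area, or without an observed slowdown, the original recognition result is kept unchanged.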
Fig. 11 is a schematic structural diagram of an apparatus for identifying a lane traffic state according to some embodiments of the present disclosure. The apparatus is applied to a first vehicle. As shown in fig. 11, the apparatus includes:
An obtaining unit 401, configured to obtain driving data of at least one second vehicle within a preset range of the first vehicle. A prediction unit 402, configured to predict a target lane corresponding to the second vehicle based on the driving data of the second vehicle. A first determining unit 403, configured to determine a traffic state of the target lane based on an obstacle condition of the target lane. A second determining unit 404, configured to determine a driving strategy of the first vehicle based on the traffic state of the target lane.
Optionally, the prediction unit 402 is specifically configured to predict a target driving behavior of the second vehicle based on driving data of the second vehicle, determine a target lane segment sequence corresponding to the target driving behavior from a lane set, where the lane set includes a plurality of lane segment sequences, each lane segment sequence corresponds to one driving behavior of the second vehicle, each lane segment sequence includes a plurality of continuously spliced lane segments, and select at least one lane segment in the target lane segment sequence as a target lane.
Optionally, the driving data of the second vehicle includes a speed of the second vehicle. The prediction unit 402 is further configured to determine a distance threshold based on a speed of the second vehicle, wherein the distance threshold is positively related to the speed of the second vehicle, and select at least one lane segment with a distance between the second vehicle and the lane segment in the lane extending direction less than or equal to the distance threshold as the target lane from the plurality of lane segments in the target lane segment sequence.
Optionally, the prediction unit 402 is further configured to use driving data of the second vehicle as input data of a track prediction algorithm, obtain a predicted track output by the track prediction algorithm, and determine a target driving behavior of the second vehicle based on the predicted track.
Optionally, the obtaining unit 401 is further configured to obtain road traffic information, where the road traffic information includes at least traffic light information. The prediction unit 402 is further configured to predict a target driving behavior of the second vehicle based on the driving data of the second vehicle and the road traffic information.
Optionally, the prediction unit 402 is further configured to detect an outline and a size of a blocking object located at least in front of the second vehicle, and indicate the blocking object by using a convex hull, correct the predicted trajectory if the predicted trajectory overlaps with the position data of the convex hull, determine the target driving behavior of the second vehicle based on the corrected predicted trajectory, and the corrected predicted trajectory does not overlap with the position data of the convex hull.
Optionally, the obstacle condition characterizes whether an obstacle is present. The first determining unit 403 is specifically configured to, if the obstacle condition indicates that an obstacle exists on the target lane, identify that the target lane is in a congestion state, and if the obstacle condition indicates that no obstacle exists on the target lane, identify that the target lane is in a clear state.
Optionally, the obstacle condition characterizes the presence of an obstacle on the target lane and the attribute characteristics of the obstacle present, including the speed of movement. The first determining unit 403 is further configured to determine a moving speed of the obstacle on the target lane in the extending direction of the target lane based on the obstacle situation, and obtain a congestion level of the target lane according to the moving speed of the obstacle in the extending direction of the target lane, where the moving speed of the obstacle in the extending direction of the target lane is inversely related to the congestion level of the target lane.
Optionally, the traffic state of the target lane at the first moment characterizes that the target lane is in a clear state. The first determining unit 403 is further configured to determine whether the first vehicle has a detection blind area for the target lane; to monitor, based on the driving data of the second vehicle, whether the speed of the second vehicle at a second moment is lower than its speed at the first moment in the case where the first vehicle has a detection blind area for the target lane; and, if the speed of the second vehicle at the second moment is lower than its speed at the first moment, the traffic state of the target lane is used to indicate that the target lane is in a congestion state.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 12 shows a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 12, the electronic device 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in the electronic device 800 are connected to the I/O interface 805, including an input unit 806 such as a keyboard, a mouse, etc., an output unit 807 such as various types of displays, speakers, etc., a storage unit 808 such as a magnetic disk, an optical disk, etc., and a communication unit 809 such as a network card, a modem, a wireless communication transceiver, etc. The communication unit 809 allows the electronic device 800 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be any of a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 801 performs the various methods and processes described above, such as the method of identifying a lane traffic state. For example, in some embodiments, the method of identifying a lane traffic state may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 800 via the ROM 802 and/or the communication unit 809. When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the method of identifying a lane traffic state described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the method of identifying a lane traffic state by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special or general purpose programmable processor, operable to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user, for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a Local Area Network (LAN), a Wide Area Network (WAN), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions provided by the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.