CN111295629B - Detecting and responding to traffic redirection of autonomous vehicles

Publication number: CN111295629B
Application number: CN201880070898.6A
Authority: CN (China)
Other versions: CN111295629A
Inventors: D. H. Silver, P. Chaudhari
Assignee (current and original): Waymo LLC
Priority claimed from: PCT/US2018/057971 (WO2019089444A1)
Related application: CN202111496739.2A (CN114518749A)
Legal status: Expired - Fee Related
Prior art keywords: vehicle, traffic, lanes, lane, traffic flow

Classifications

    • G08G 1/056 — Detecting movement of traffic to be counted or controlled, with provision for distinguishing direction of travel
    • G08G 1/0145 — Measuring and analyzing of parameters relative to traffic conditions for active traffic flow control
    • G08G 1/09623 — Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G08G 1/09626 — Arrangements for giving variable traffic instructions where the origin of the information is within the own vehicle, e.g. a local storage device or digital map
    • G08G 1/096811 — Systems involving transmission of navigation instructions to the vehicle, where the route is computed offboard
    • G08G 1/096827 — Systems involving transmission of navigation instructions to the vehicle, where the route is computed onboard
    • G05D 1/0088 — Control characterized by the autonomous decision making process, e.g. artificial intelligence or predefined behaviours
    • G05D 1/0212 — Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory
    • G05D 1/0214 — Defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/0246 — Optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0274 — Internal positioning means using mapping information stored in a memory device
    • G05D 1/228 — Command input arrangements located on-board unmanned vehicles
    • G05D 1/617 — Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
    • G01C 21/3415 — Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents
    • G01C 21/3658 — Lane guidance
    • B60W 30/14 — Adaptive cruise control
    • B60W 60/0027 — Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W 2552/35 — Road bumpiness, e.g. potholes
    • B60W 2554/20 — Static objects
    • B60W 2554/4044 — Direction of movement, e.g. backwards
    • B60W 2554/4045 — Intention, e.g. lane change or imminent movement
    • B60W 2555/60 — Traffic rules, e.g. speed limits or right of way
    • G06V 20/58 — Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/588 — Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • G06T 2207/30236 — Traffic on road, railway or crossing
    • G06T 2207/30256 — Lane; road marking


Abstract

The present technology relates to a method of controlling a vehicle in an autonomous driving mode. For example, the vehicle 100 may be maneuvered in an autonomous driving mode using pre-stored map information that identifies traffic flow directions. Data may be received from a perception system of the vehicle identifying objects in the vehicle's external environment that are associated with a traffic redirection not identified by the map information. The received data may be used to identify one or more corridors 910, 920 of the traffic redirection. One of the one or more corridors may be selected based on a direction of traffic flow through the selected corridor. The vehicle may then be controlled in the autonomous driving mode to enter and follow the selected corridor based on the determined direction of traffic flow through each of the one or more corridors.

Description

Detecting and responding to traffic redirection of autonomous vehicles
Cross Reference to Related Applications
This application is a continuation of application No. 15/798,881, filed October 31, 2017, the entire disclosure of which is incorporated herein by reference.
Technical Field
The present application relates to detecting and responding to traffic redirection of an autonomous vehicle.
Background
Autonomous vehicles, such as vehicles that do not require a human driver, may be used to assist in transporting passengers or items from one location to another. Such vehicles may operate in a fully autonomous mode, where the passenger provides some initial input, such as a pickup location or a destination location, and the vehicle maneuvers itself to that location.
Robust operation of an autonomous vehicle, or a vehicle operating in an autonomous driving mode, requires an appropriate response to unexpected situations, such as construction that changes the normal flow of traffic. In other words, traffic flow may be temporarily redirected due to construction or a traffic accident. For example, a lane may be closed by blocking it with objects such as emergency vehicles, construction signs, cones, barrels, or other items. At the same time, other lanes may be left open, and/or cones or other markers may be used to create corridors that delineate new "lanes" or separate opposing traffic. In many cases, the features that mark the redirection, such as cones or emergency vehicles, are not pre-recorded in the map used by the vehicle's control computing devices to navigate the vehicle. Identifying and responding to such situations is therefore a function vital to the safe and effective control of these vehicles.
Disclosure of Invention
One aspect of the present disclosure provides a method of controlling a vehicle in an autonomous driving mode. The method includes maneuvering, by one or more processors, a vehicle in the autonomous driving mode using pre-stored map information identifying traffic flow directions; receiving, by the one or more processors, data from a perception system of the vehicle, the data identifying an object in an environment external to the vehicle that is associated with a traffic redirection not identified by the map information; identifying, by the one or more processors, one or more corridors of the traffic redirection using the received data; selecting, by the one or more processors, one of the one or more corridors based on a direction of traffic flow through the selected corridor; and controlling, by the one or more processors, the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more corridors.
In one example, the method also includes determining a direction of traffic flow through the selected corridor by analyzing how traffic traveling opposite the vehicle would enter and pass through the one or more corridors. In another example, the method also includes determining a direction of traffic flow through the selected corridor by analyzing signs proximate to any of the one or more corridors. In another example, the method also includes determining a direction of traffic flow through the selected corridor by observing traffic passing through any of the one or more corridors. In another example, the method also includes receiving, from one or more computing devices of a second vehicle, information identifying the one or more corridors, and determining a direction of traffic flow through the selected corridor based on the received information. In another example, the method also includes, after identifying the one or more corridors using the received data, sending a request for instructions on how to proceed to a computing device remote from the vehicle, and receiving the instructions, and wherein selecting the selected one of the one or more corridors is further based on the received instructions. In another example, the method also includes determining a direction of traffic flow through each of the one or more corridors, and wherein selecting the selected corridor is further based on any determined directions of traffic flow. In another example, the one or more corridors are not defined by two or more lane lines.
Another aspect of the present disclosure provides a system for controlling a vehicle in an autonomous driving mode. The system includes one or more processors configured to maneuver a vehicle in the autonomous driving mode using pre-stored map information identifying traffic flow directions; receive data from a perception system of the vehicle, the data identifying an object in an environment external to the vehicle that is associated with a traffic redirection not identified by the map information; identify one or more corridors of the traffic redirection using the received data; select one of the one or more corridors based on a direction of traffic flow through the selected corridor; and control the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more corridors.
In one example, the one or more processors are further configured to determine a direction of traffic flow through the selected corridor by analyzing how traffic traveling opposite the vehicle would enter and pass through the one or more corridors. In another example, the one or more processors are further configured to determine a direction of traffic flow through the selected corridor by analyzing signs proximate to any of the one or more corridors. In another example, the one or more processors are further configured to determine a direction of traffic flow through the selected corridor by observing traffic passing through any of the one or more corridors. In another example, the one or more processors are further configured to receive, from one or more computing devices of a second vehicle, information identifying the one or more corridors, and to determine a direction of traffic flow through the selected corridor based on the received information. In another example, the one or more processors are further configured to, after identifying the one or more corridors using the received data, send a request for instructions on how to proceed to a computing device remote from the vehicle, and to receive the instructions, and wherein selecting the selected one of the one or more corridors is further based on the received instructions. In another example, the one or more processors are further configured to determine a direction of traffic flow through each of the one or more corridors, and wherein selecting the selected corridor is further based on any determined directions of traffic flow. In another example, the one or more corridors are not defined by two or more lane lines. In another example, the system also includes the vehicle.
Another aspect of the disclosure provides a non-transitory computer readable medium on which instructions are stored. The instructions, when executed by one or more processors, cause the one or more processors to perform a method of controlling a vehicle in an autonomous driving mode. The method includes maneuvering the vehicle in the autonomous driving mode using pre-stored map information identifying traffic flow directions; receiving data from a perception system of the vehicle, the data identifying an object in an environment external to the vehicle that is associated with a traffic redirection not identified by the map information; identifying one or more corridors of the traffic redirection using the received data; selecting one of the one or more corridors based on a direction of traffic flow through the selected corridor; and controlling the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more corridors.
In one example, the method also includes determining a direction of traffic flow through the selected corridor by analyzing how traffic traveling opposite the vehicle would enter and pass through the one or more corridors. In another example, the method also includes determining a direction of traffic flow through the selected corridor by analyzing signs proximate to any of the one or more corridors. In another example, the method also includes determining a direction of traffic flow through the selected corridor by observing traffic passing through any of the one or more corridors. In another example, the method also includes receiving, from one or more computing devices of a second vehicle, information identifying the one or more corridors, and determining a direction of traffic flow through the selected corridor based on the received information. In another example, the method also includes, after identifying the one or more corridors using the received data, sending a request for instructions on how to proceed to a computing device remote from the vehicle, and receiving the instructions, and wherein selecting the selected one of the one or more corridors is further based on the received instructions. In another example, the method also includes determining a direction of traffic flow through each of the one or more corridors, and wherein selecting the selected corridor is further based on any determined directions of traffic flow. In another example, the one or more corridors are not defined by two or more lane lines.
Drawings
FIG. 1 is a functional diagram of an example vehicle, according to aspects of the present disclosure.
FIG. 2 is an example representation of detailed map information according to aspects of the present disclosure.
Fig. 3A-3D are example exterior views of a vehicle according to aspects of the present disclosure.
FIG. 4 is an example pictorial diagram of a system in accordance with aspects of the present disclosure.
Fig. 5 is an example functional diagram of a system according to aspects of the present disclosure.
Fig. 6 is a view of a road segment according to aspects of the present disclosure.
Fig. 7 is an example of sensor data and other information for road segments in accordance with aspects of the present disclosure.
Fig. 8 is another example of sensor data and other information for road segments in accordance with aspects of the present disclosure.
Fig. 9 is another example of sensor data and other information for road segments in accordance with aspects of the present disclosure.
Fig. 10 is yet another example of sensor data and other information for road segments in accordance with aspects of the present disclosure.
Fig. 11 is a flow diagram according to aspects of the present disclosure.
Detailed Description
Overview
In many cases, a redirection of traffic flow is well defined. However, in some cases, the redirection may involve newly created corridors that do not clearly or completely separate opposing traffic. That is, traffic from either direction could plausibly enter one or more of the corridors. In such ambiguous situations, the autonomous vehicle's computing devices must select the correct corridor. Otherwise, the vehicle may become stuck or travel the wrong way, which may present additional safety issues.
Furthermore, these corridors may be easily understood by human drivers yet ambiguous to the vehicle's computing systems. This may be due to important signals that the vehicle's computing devices cannot detect or identify, such as non-standard signage the vehicle cannot read (e.g., a hand-drawn arrow or a keep left/right sign), or cues that are outside the vehicle's sensing range but within a human's. In other cases, the vehicle's computing devices may receive all of the signals they need but must perform the appropriate analysis to determine how to proceed. To fully understand what is happening, the computing devices must first detect that a possible ambiguity exists and then look for signals that can resolve it.
To determine which corridor the vehicle should enter, the vehicle's computing devices must first recognize that an ambiguity exists. This may be accomplished by processing data from the vehicle's perception system to identify one or more corridors. If the computing devices identify more than one possible corridor, this may create ambiguity as to which corridor (left, right, center, etc.) the vehicle should enter.
The computing devices may then attempt to resolve the ambiguity by analyzing the corridors using one or more methods and determining the appropriate flow of traffic (the same direction as the vehicle, or the opposite) through each corridor. In one example method, the computing devices may analyze the corridors in reverse, that is, from the perspective of oncoming traffic. As another approach, the computing devices may attempt to resolve the ambiguity by analyzing any signs. As yet another approach, the computing devices may attempt to determine the direction of traffic through each corridor by observing the behavior of other vehicles. As another approach, the computing devices may use information provided by other vehicles that have recently passed through the area.
If the ambiguity cannot be resolved using one or more of the above approaches, the computing devices may send a request to a human operator asking for instructions on how to proceed. This may include sending information identifying the corridors that the computing devices have identified for review, and receiving instructions on how to proceed. In some cases, the human operator may simply reroute the vehicle, or the computing devices may control the vehicle to avoid the corridors altogether by turning around and/or rerouting the vehicle.
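To make this escalation order concrete, the following Python sketch shows one way such a decision cascade could be organized. It is purely illustrative and not part of the patent; every function, type, and field name in it is a hypothetical assumption.
```python
from enum import Enum, auto
from typing import Callable, Iterable

class Flow(Enum):
    SAME_DIRECTION = auto()  # corridor carries traffic in the vehicle's direction
    OPPOSING = auto()        # corridor appears reserved for oncoming traffic
    UNKNOWN = auto()         # the evidence considered so far is inconclusive

Method = Callable[[dict], Flow]

def resolve_corridor_flow(corridor: dict, methods: Iterable[Method]) -> Flow:
    """Apply the disambiguation methods in priority order, stopping at the
    first definite answer; a final UNKNOWN would trigger a request to a
    remote human operator."""
    for method in methods:
        flow = method(corridor)
        if flow is not Flow.UNKNOWN:
            return flow
    return Flow.UNKNOWN

# Toy stand-ins for the four methods named in the text above.
def analyze_in_reverse(c):      # would oncoming traffic clearly take this corridor?
    return Flow.OPPOSING if c.get("clear_for_oncoming") else Flow.UNKNOWN

def analyze_signs(c):           # no usable signage detected in this toy case
    return Flow.UNKNOWN

def observe_other_vehicles(c):  # no traffic observed using the corridor yet
    return Flow.UNKNOWN

def check_fleet_reports(c):     # no recent reports from other vehicles
    return Flow.UNKNOWN

print(resolve_corridor_flow(
    {"clear_for_oncoming": True},
    [analyze_in_reverse, analyze_signs, observe_other_vehicles, check_fleet_reports],
))  # Flow.OPPOSING
```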
The features described herein may allow a vehicle operating in an autonomous driving mode to recognize the ambiguity caused by a traffic redirection that includes one or more corridors, to "reason" about the situation and identify how traffic should flow through those corridors, and to respond appropriately. For vehicles that also have a manual driving mode, this may reduce the incidence of disengagements from the autonomous driving mode.
Example System
As shown in fig. 1, a vehicle 100 according to one aspect of the present disclosure includes various components. Although certain aspects of the present disclosure are particularly useful with specific types of vehicles, the vehicle may be any type of vehicle, including, but not limited to, cars, trucks, motorcycles, buses, recreational vehicles, and the like. The vehicle may have one or more computing devices, such as computing device 110, containing one or more processors 120, memory 130, and other components typically found in a general purpose computing device.
Memory 130 stores information accessible to one or more processors 120, including instructions 132 and data 134 that may be executed or otherwise used by processors 120. The memory 130 may be of any type capable of storing information accessible by the processor, including a computing device readable medium, or other medium that stores data readable by means of an electronic device, such as a hard disk drive, memory card, ROM, RAM, DVD or other optical disk, as well as other writable and read-only memories. The systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The instructions 132 may be any set of instructions executed directly (e.g., machine code) or indirectly (e.g., scripts) by a processor. For example, the instructions may be stored as computing device code on a computing device readable medium. In this regard, the terms "instructions" and "programs" may be used interchangeably herein. The instructions may be stored in an object code format for direct processing by a processor, or in any other computing device language, including a collection of independent source code modules or scripts that are interpreted or pre-compiled as needed. The routines, methods, and functions of the instructions are explained in more detail below.
The processor 120 may retrieve, store, or modify data 134 in accordance with the instructions 132. As an example, the data 134 of the memory 130 may store predefined scenarios. A given scenario may identify a set of scenario requirements, including a type of object, a range of locations of the object relative to the vehicle, as well as other factors such as whether the autonomous vehicle is able to maneuver around the object, whether the object is using a turn signal, the condition of a traffic light relevant to the current location of the object, whether the object is approaching a stop sign, etc. These requirements may include discrete values, such as "right turn signal is on" or "in a right turn only lane", or ranges of values, such as "having a heading that is offset from the current path of vehicle 100 by 30 to 60 degrees". In some examples, the predetermined scenarios may include similar information for multiple objects.
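As an illustration only, scenario requirements mixing discrete values and numeric ranges, as described above, might be represented along the following lines; the class and attribute names here are hypothetical, not taken from the patent.
```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ScenarioRequirement:
    """One requirement of a predefined scenario: either a discrete value
    (e.g. a turn signal that is on) or an inclusive numeric range (e.g. a
    heading offset between 30 and 60 degrees)."""
    attribute: str
    discrete: Optional[object] = None
    value_range: Optional[Tuple[float, float]] = None

    def matches(self, observed) -> bool:
        if self.value_range is not None:
            lo, hi = self.value_range
            return lo <= float(observed) <= hi
        return observed == self.discrete

# The two example requirements from the text above.
requirements = [
    ScenarioRequirement("right_turn_signal_on", discrete=True),
    ScenarioRequirement("heading_offset_deg", value_range=(30.0, 60.0)),
]
observed = {"right_turn_signal_on": True, "heading_offset_deg": 45.0}
print(all(r.matches(observed[r.attribute]) for r in requirements))  # True
```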
The one or more processors 120 may be any conventional processor, such as a commercially available CPU. Alternatively, one or more processors may be a dedicated device, such as an ASIC or other hardware-based processor. Although fig. 1 functionally shows the processors, memory, and other elements of the computing device 110 as being within the same block, those of ordinary skill in the art will appreciate that a processor, computing device, or memory may actually comprise multiple processors, computing devices, or memories that are or are not housed within the same physical housing. By way of example, the internal electronic display 152 may be controlled by a special purpose computing device with its own processor or Central Processing Unit (CPU), memory, etc., which may interface with the computing device 110 via a high bandwidth or other network connection. In some examples, the computing device may be a user interface computing device that may communicate with a client device of a user. Similarly, the memory may be a hard drive or other storage medium located in a different enclosure than the enclosure of the computing device 110. Thus, references to a processor or computing device are to be understood as including references to a collection of processors or computing devices or memories that may or may not operate in parallel.
Computing device 110 may have all of the components typically used in connection with computing devices, such as the processors and memory described above, as well as user input 150 (e.g., a mouse, keyboard, touch screen, and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device operable to display information). In this example, the vehicle includes an internal electronic display 152 and one or more speakers 154 to provide an informational or audiovisual experience. In this regard, the internal electronic display 152 may be located within a cabin of the vehicle 100 and may be used by the computing device 110 to provide information to passengers within the vehicle 100. In addition to the internal speakers, the one or more speakers 154 may include external speakers disposed at various locations on the vehicle to provide audible notifications to objects external to the vehicle 100.
In one example, the computing device 110 may be an autonomous driving computing system incorporated into the vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle. For example, returning to fig. 1, the computing device 110 may communicate with various systems of the vehicle 100, such as a deceleration system 160 (for controlling braking of the vehicle), an acceleration system 162 (for controlling acceleration of the vehicle), a steering system 164 (for controlling the orientation of the wheels and the direction of the vehicle), a signaling system 166 (for controlling turn signals), a navigation system 168 (for navigating the vehicle to a location or around objects), a positioning system 170 (for determining the position of the vehicle), a perception system 172 (for detecting objects in the vehicle's external environment), and a power system 174 (for example, a battery and/or a gasoline or diesel powered engine), in order to control the movement, speed, etc. of the vehicle 100 in accordance with the instructions 132 of the memory 130 in an autonomous driving mode that does not require continuous or periodic input from a passenger of the vehicle. Again, although these systems are shown as external to the computing device 110, in actuality, these systems may also be incorporated into the computing device 110, again as an autonomous driving computing system for controlling the vehicle 100.
The computing device 110 may control the direction and speed of the vehicle by controlling various components. By way of example, the computing device 110 may navigate the vehicle to a destination location completely autonomously using data from the map information and the navigation system 168. The computing device 110 may use the positioning system 170 to determine the vehicle's location and the perception system 172 to detect and respond to objects when needed to reach the location safely. In order to do so, the computing device 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by the acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or applying brakes by the deceleration system 160), change direction (e.g., by turning the front or rear wheels of the vehicle 100 by the steering system 164), and signal such changes (e.g., by lighting turn signals of the signaling system 166). Thus, the acceleration system 162 and deceleration system 160 may be part of a drivetrain that includes various components between the engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, the computing device 110 may also control the drivetrain of the vehicle in order to maneuver the vehicle autonomously.
As an example, the computing device 110 may interact with a deceleration system 160 and an acceleration system 162 to control the speed of the vehicle. Similarly, the computing device 110 may use the steering system 164 to control the direction of the vehicle 100. For example, if the vehicle 100 is configured for use on a roadway, such as a car or truck, the steering system may include components that control wheel angle to steer the vehicle. The signaling system 166 may be used by the computing device 110 to signal the intent of the vehicle to other drivers or other vehicles, for example, by illuminating turn signals or brake lights when needed.
The navigation system 168 may be used by the computing device 110 in order to determine and follow a route to a location. In this regard, the navigation system 168 and/or the data 134 may store map information, e.g., highly detailed maps that the computing device 110 can use to navigate or control the vehicle. As an example, these maps may identify the shape and elevation of roadways, lane markings, intersections, crosswalks, speed limits, traffic lights, buildings, signs, real-time traffic information, vegetation, or other such objects and information. The lane markings may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc. A given lane may be associated with left and right lane lines or other lane markings that define the boundaries of the lane. Thus, most lanes may be bounded by the left edge of one lane line and the right edge of another lane line.
The sensing system 172 also includes one or more components for detecting objects external to the vehicle, such as other vehicles, obstacles in the road, traffic signals, signs, trees, and the like. For example, the perception system 172 may include one or more LIDAR sensors, sonar devices, radar units, cameras, and/or any other detection devices that record data that may be processed by the computing device 110. The sensors of the perception system may detect objects and their characteristics such as position, orientation, size, shape, type (e.g., vehicle, pedestrian, rider, etc.), speed of movement, and the like. The raw data from the sensors and/or the aforementioned features, when generated by the perception system 172, may be quantified or set into descriptive functions, vectors, and/or bounding boxes and periodically and continuously sent to the computing device 110 for further processing. As discussed in further detail below, the computing device 110 may use the positioning system 170 to determine the location of the vehicle and the perception system 172 to detect and respond to objects as needed to safely reach the location.
Fig. 2 is an example of map information 200 for a section of roadway. The map information 200 includes information identifying the shape, location, and other characteristics of various road features. In this example, the map information includes three lanes 212, 214, 216 bounded by curb 220, lane lines 222, 224, 226, and curb 228. Lanes 212 and 214 have the same direction of traffic flow (eastbound), while lane 216 has a different direction of traffic flow (westbound). In addition, lane 212 is significantly wider than lane 214, for instance to allow vehicles to park adjacent to curb 220. While the example map information includes only a few road features, such as curbs, lane lines, and lanes, the map information 200 may also identify various other road features, such as traffic lights, crosswalks, sidewalks, stop signs, yield signs, speed limit signs, road signs, and so on, depending on the nature of the roadway. Although not shown, the detailed map information may also include information identifying speed limits and other legal traffic requirements, as well as historical information identifying typical and historical traffic conditions at various dates and times.
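A minimal sketch of how the mapped lanes of map information 200 might be represented follows. The flow directions come from the text above; the class, field names, and boundary assignments are illustrative assumptions, not the patent's data model.
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MappedLane:
    """A lane from the pre-stored map, bounded on either side by a lane
    line or curb and tagged with its mapped direction of traffic flow."""
    lane_id: int
    flow_heading: str                   # "east" / "west" in this example
    left_boundary: Optional[str] = None
    right_boundary: Optional[str] = None

# The three lanes of map information 200. The flow directions come from
# the text; the boundary assignments here are only illustrative.
MAP_200 = [
    MappedLane(212, "east", "curb 220", "lane line 222"),
    MappedLane(214, "east", "lane line 222", "lane line 224"),
    MappedLane(216, "west", "lane line 226", "curb 228"),
]
print([lane.lane_id for lane in MAP_200 if lane.flow_heading == "east"])  # [212, 214]
```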
Although the detailed map information is depicted herein as an image-based map, the map information need not be entirely image-based (e.g., raster). For example, the detailed map information may include one or more roadmaps (roadmaps) or graphical networks of information such as roads, lanes, intersections, and connections between these features. Each feature may be stored as graphical data and may be associated with information such as the geographic location and whether it is linked to other related features (e.g., a stop sign may be linked to a road, intersection, etc.). In some examples, the associated data may include a grid-based index of road maps to allow efficient lookup of certain road map features.
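The grid-based index mentioned above can be sketched as follows. This is a generic illustration of the idea of bucketing road-map features by grid cell for efficient lookup, not the patent's implementation; the cell size and all names are hypothetical.
```python
from collections import defaultdict

CELL = 50.0  # grid cell size in meters (illustrative)

def cell_of(x: float, y: float) -> tuple:
    return (int(x // CELL), int(y // CELL))

class RoadGraphIndex:
    """Grid-based index over road-map features, so features near a query
    point can be found without scanning the whole road graph."""
    def __init__(self):
        self.cells = defaultdict(list)

    def insert(self, feature_id: str, x: float, y: float):
        self.cells[cell_of(x, y)].append(feature_id)

    def nearby(self, x: float, y: float):
        cx, cy = cell_of(x, y)          # yield features in the 3x3 block
        for dx in (-1, 0, 1):           # of cells around the query point
            for dy in (-1, 0, 1):
                yield from self.cells[(cx + dx, cy + dy)]

idx = RoadGraphIndex()
idx.insert("stop sign 31", 120.0, 40.0)
print(list(idx.nearby(110.0, 45.0)))  # ['stop sign 31']
```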
Fig. 3A to 3D are examples of external views of the vehicle 100. As can be seen, the vehicle 100 includes many features of a typical vehicle, such as headlights 302, a windshield 303, tail/turn signals 304, a rear windshield 305, doors 306, side-view mirrors 308, tires and wheels 310, and turn signal/stop lights 312. Headlights 302, tail/turn signals 304, and turn signal/stop lights 312 may be associated with signaling system 166. Light bar 307 may also be associated with signal system 166. The housing 314 may house one or more sensors, such as LIDAR sensors, sonar devices, radar units, cameras, etc., of the sensing system 172, although such sensors may also be incorporated in other areas of the vehicle.
The one or more computing devices 110 of the vehicle 100 may also receive information from, or transfer information to, other computing devices, for instance using wireless network connections 156. The wireless network connections may include, for instance, Bluetooth, Bluetooth LE, LTE, cellular, near field communication, etc., and various combinations of the foregoing. Fig. 4 and 5 are pictorial and functional diagrams, respectively, of an example system 400 that includes a plurality of computing devices 410, 420, 430, 440 and a storage system 450 connected via a network 460. The system 400 also includes the vehicle 100, and a vehicle 100A, which may be configured similarly to the vehicle 100. Although only a few vehicles and computing devices are depicted for simplicity, a typical system may include significantly more.
As shown in fig. 4, each of the computing devices 410, 420, 430, 440 may include one or more processors, memory, data, and instructions. Such processors, memories, data, and instructions may be configured similarly to the one or more processors 120, memories 130, data 134, and instructions 132 of the computing device 110.
The network 460, and intervening nodes, may include various configurations and protocols, including short range communication protocols such as Bluetooth, Bluetooth LE, the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi, and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces.
In one example, the one or more computing devices 410 may include a server having a plurality of computing devices, e.g., a load balanced server farm, that exchange information with different nodes of a network for the purpose of receiving data from, processing data for, and transmitting data to other computing devices. For instance, the one or more computing devices 410 may include one or more server computing devices that are capable of communicating with the one or more computing devices 110 of the vehicle 100, or a similar computing device of the vehicle 100A, as well as the client computing devices 420, 430, 440, via the network 460. For example, the vehicles 100 and 100A may be part of a fleet of vehicles that can be dispatched by a server computing device to various locations. In this regard, the vehicles of the fleet may periodically send the server computing device location information provided by the vehicles' respective positioning systems, and the one or more server computing devices may track the locations of the vehicles.
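As an illustration of the periodic location reporting described above, a server-side tracker might look roughly like the following; the class and method names are hypothetical, not from the patent.
```python
import time

class FleetTracker:
    """Server-side record of the last reported location of each vehicle,
    updated as vehicles periodically send positioning-system fixes."""
    def __init__(self):
        self.last_fix = {}

    def report(self, vehicle_id: str, lat: float, lon: float):
        self.last_fix[vehicle_id] = (lat, lon, time.time())

tracker = FleetTracker()
tracker.report("vehicle 100", 37.42, -122.09)    # illustrative coordinates
tracker.report("vehicle 100A", 37.44, -122.14)
print(sorted(tracker.last_fix))  # ['vehicle 100', 'vehicle 100A']
```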
Additionally, the server computing device 410 may send and present information to users (e.g., users 422, 432, 442) on displays (e.g., displays 424, 434, 444 of computing devices 420, 430, 440) using the network 460. In this regard, the computing devices 420, 430, 440 may be considered client computing devices.
As shown in fig. 5, each client computing device 420, 430, 440 may be a personal computing device intended for use by a user 422, 432, 442 and have all of the components typically used in connection with a personal computing device that includes one or more processors (e.g., a Central Processing Unit (CPU)), memory (e.g., RAM and internal hard drives) to store data and instructions, a display such as display 424, 434, 444 (e.g., a monitor with a screen, a touch screen, a projector, a television, or other device operable to display information), and a user input device 426, 436, 446 (e.g., a mouse, keyboard, touch screen, or microphone). The client computing device may also include a camera for recording video streams, speakers, a network interface device, and all components for connecting these elements to each other.
While each of the client computing devices 420, 430, and 440 may comprise full-size personal computing devices, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the internet. By way of example only, the client computing device 420 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook capable of obtaining information via the internet or other network. In another example, the client computing device 430 may be a wearable computing device, such as the wristwatch shown in fig. 4. As an example, a user may input information using a keypad, a microphone, a visual signal with a camera, or a touch screen, among others.
In some examples, the client computing device 440 may be a concierge workstation used by an administrator to provide concierge services to users such as the users 422 and 432. For example, a remote operator or concierge 442 may use the concierge workstation 440 to communicate via a telephone call or audio connection with users through their respective client computing devices or the vehicle 100 or 100A, in order to ensure the safe operation of the vehicles 100 and 100A and the safety of the users, as described in further detail below. Although only a single concierge workstation 440 is shown in figs. 4 and 5, any number of such workstations may be included in a typical system.
The storage system 450 may store various types of information, as described in more detail below. This information may be retrieved or otherwise accessed by a server computing device, such as the one or more server computing devices 410, in order to perform some or all of the features described herein. For example, the information may include user account information such as credentials that can be used to identify a user to the one or more server computing devices (e.g., a user name and password as in the case of traditional single-factor authentication, as well as other types of credentials typically used in multi-factor authentication, such as random identifiers, biometrics, etc.). The user account information may also include personal information such as the user's name, contact information, identifying information of the user's client computing device (or devices, if multiple devices are used with the same user account), as well as one or more unique signals for the user.
The storage system 450 may also store routing data for generating and evaluating routes between locations. For example, the routing information may be used to estimate how long it would take a vehicle at a first location to reach a second location. In this regard, the routing information may include map information, not necessarily as particular as the detailed map information described above, but including roads, as well as information about those roads such as direction (one way, two way, etc.), orientation (north, south, etc.), speed limits, and traffic information identifying expected traffic conditions, etc.
The storage system 450 may also store information that may be provided to the client computing device for display to the user. For example, the storage system 450 may store predetermined distance information that is used to determine areas where the vehicle may stop for a given pickup location or destination location. The storage system 450 may also store graphics, icons, and other items that may be displayed to a user, as discussed below.
Like the memory 130, the storage system 450 may be any type of computerized storage device capable of storing information accessible to the server computing device 410, such as a hard drive, memory card, ROM, RAM, DVD, CD-ROM, writable and read-only memory. Additionally, storage system 450 may comprise a distributed storage system in which data is stored on a number of different storage devices, which may be physically located in the same or different geographic locations. Storage system 450 may be connected to computing devices via network 460, as shown in fig. 4, and/or may be directly connected to or incorporated into any computing device 110, 410, 420, 430, 440, etc.
Example Methods
In addition to the operations described above and illustrated in the accompanying figures, various operations will now be described. It should be understood that the following operations need not be performed in the exact order described below. Rather, various steps may be processed in a different order or concurrently, and steps may also be added or omitted.
Fig. 6 is an example view of the vehicle 100 driving along a roadway 610 corresponding to the roadway of fig. 2. In this regard, the lanes 612, 614, 616 correspond to the shape and location of the lanes 212, 214, 216, the curbs 620, 628 correspond to the shape and location of the curbs 220, 228, and the lane lines 622, 624, 626 correspond to the shape and location of the lane lines 222, 224, 226. In this example, the vehicle 100 is traveling in lane 612.
As the vehicle moves along lane 612, the perception system 172 provides the computing devices with sensor data regarding the shapes and locations of objects, such as the curbs 620, 628, lane lines 622, 624, 626, a sign 650, as well as traffic cones A-R. Fig. 7 depicts sensor data sensed by the various sensors of the perception system 172 when the vehicle 100 is in the situation of fig. 6, as well as other information available to the computing device 110. In this example, vehicles 640, 642, 644 are represented by bounding boxes 740, 742, 744 provided by the perception system 172 to the computing device 110. Traffic cones A-R are represented by bounding boxes 7A-7R, and sign 650 is represented by bounding box 750. Of course, these bounding boxes represent merely the volume of space within which the data points corresponding to an object are at least approximately bounded. In addition, the actual heading of the vehicle 100 and the estimated headings of bounding boxes 740 and 742 are represented by arrows 770, 760, and 762, respectively. Because bounding box 744 appears to be moving very slowly or not at all, the computing device 110 may determine that the object represented by this bounding box is parked adjacent to curb 628.
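The parked-vehicle inference at the end of the paragraph reduces to a simple threshold test, sketched below. The thresholds and field names are illustrative assumptions, not values from the patent.
```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """A perceived object as reported by the perception system: a bounding
    box plus estimated motion (field names are hypothetical)."""
    object_id: str
    speed_mps: float        # estimated speed of the bounding box
    dist_to_curb_m: float   # lateral distance to the nearest curb

PARKED_SPEED_MPS = 0.2   # thresholds are illustrative only
PARKED_CURB_DIST_M = 1.0

def appears_parked(obj: TrackedObject) -> bool:
    """Mirror the inference in the text: a box that moves very slowly or
    not at all while adjacent to a curb is treated as parked."""
    return (obj.speed_mps < PARKED_SPEED_MPS
            and obj.dist_to_curb_m < PARKED_CURB_DIST_M)

print(appears_parked(TrackedObject("bounding box 744", 0.0, 0.4)))  # True
```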
In order to determine which corridor the vehicle should enter, the vehicle's computing devices must first recognize that an ambiguity exists. This may be accomplished by processing the data from the vehicle's perception system in order to identify one or more corridors. Each of the one or more corridors corresponds to a path along the roadway that is not already identified in the vehicle's map information. In other words, the path generally does not correspond to a lane of traffic, and the characteristics of that lane, as defined in the map information. For example, the characteristics or rules of a traffic lane may have changed, such as where a center turn lane has been arranged with traffic cones to provide for turning and travel through an intersection, where an eastbound lane of traffic becomes a westbound lane of traffic, or where a path does not correspond to a lane of traffic or an area between two lane lines (or other lane markings) in the map information.
For example, in addition to lane lines, certain types of objects, such as cones or barrels, may be grouped together in order to determine the "boundaries" of a corridor. For instance, if a vehicle cannot fit between two cones, those objects may be clustered together and considered part of a single boundary. As shown in the image of fig. 7, the computing device 110 may cluster cones A-N (or bounding boxes 7A-7N) together based on their proximity to one another, because the vehicle 100 cannot fit between the cones, or rather, the cones are positioned in a way that forms a barrier. Similarly, the computing device 110 may cluster cones O-Q (or bounding boxes 7O-7Q) together based on their proximity to one another, because the vehicle 100 cannot fit between those cones. Fig. 8 depicts a cluster 810 corresponding to cones A-N and a cluster 820 corresponding to cones O-Q. Cone R (or bounding box 7R) is not included in cluster 810 or 820. For clarity and ease of understanding, fig. 8 does not include bounding boxes 740, 742, 744, or 750.
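A minimal sketch of this proximity-based clustering follows, assuming the cones can be ordered along the barrier and that a gap is impassable when it is narrower than the vehicle; the names and thresholds are hypothetical.
```python
import math

VEHICLE_WIDTH_M = 2.1  # illustrative width, including a safety margin

def cluster_by_gap(cone_positions, max_gap=VEHICLE_WIDTH_M):
    """Group cones into clusters wherever the vehicle could not fit
    between neighbours, so each cluster is treated as a single boundary
    (as with cones A-N and cones O-Q in the text)."""
    clusters = []
    for p in sorted(cone_positions):  # sweep along the row of cones
        if clusters and math.dist(p, clusters[-1][-1]) <= max_gap:
            clusters[-1].append(p)    # too close to pass: extend the cluster
        else:
            clusters.append([p])      # drivable gap: start a new cluster
    return clusters

# Two tight rows of cones separated by a drivable gap yield two clusters.
row_a = [(float(x), 0.0) for x in range(0, 10)]   # cones 1 m apart
row_b = [(float(x), 0.0) for x in range(14, 20)]
print(len(cluster_by_gap(row_a + row_b)))  # 2
```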
Once these objects have been grouped, the computing device 110 may use the clusters and any ungrouped objects to identify one or more possible channels for the vehicle to follow in order to avoid the grouped objects. In this regard, turning to FIG. 9, the computing device may identify two channels, channel 910 and channel 920, as possible options for the vehicle 100 to follow given the locations of the clusters 810 and 820 and the cone R (or bounding box 7R). Again, for clarity and ease of understanding, FIG. 9 does not include the bounding boxes 740, 742, 744, or 750.
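Given such clusters, one simplified way to derive candidate channels is to treat the road as a one-dimensional lateral cross-section and look for gaps wide enough for the vehicle; the geometry below is an assumption made purely for illustration:

```python
def find_channels(cluster_spans, road_left, road_right, vehicle_width=2.1):
    """Return lateral intervals wide enough for the vehicle to pass through.

    `cluster_spans` is a list of (left, right) lateral extents, in meters,
    of each cluster across the road; `road_left` and `road_right` are the
    road boundaries. Each returned interval is one candidate channel,
    analogous to channels 910 and 920."""
    channels = []
    cursor = road_left
    for left, right in sorted(cluster_spans):
        if left - cursor >= vehicle_width:
            channels.append((cursor, left))
        cursor = max(cursor, right)
    if road_right - cursor >= vehicle_width:
        channels.append((cursor, road_right))
    return channels

# More than one returned interval is exactly the ambiguity described below.
```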
Additionally or alternatively, the physical geometry of these objects may create ambiguity. For example, at cone A (or bounding box 7A), the vehicle could pass to the left to enter channel 920, or to the right to enter channel 910. Furthermore, once in channel 920, the vehicle could pass to either the right or the left of cone R (or bounding box 7R), which again presents two possibilities, sub-channels 920A and 920B of channel 920. Thus, there is more than one possibility for travel. This creates ambiguity as to which channel the vehicle should enter. In other words, whenever there are two or more channels, the vehicle has more than one option. In another similar example, where two cones demarcate three separate channels, the vehicle could travel to the right of both cones (or other objects), between the two cones (or other objects), or to the left of both cones (or other objects). Thus, in such an example, there are three possible channels, which may create an even more complex ambiguity.
The computing device may then attempt to resolve the ambiguity by analyzing the channels using one or more methods and determining the appropriate flow of traffic through each channel. In brief, the computing device 110 may determine whether the traffic flow of each channel continues in the direction in which the vehicle is currently traveling, or is actually opposite to the direction in which the vehicle is currently traveling, in which case the channel is configured to allow reverse (oncoming) traffic. Referring again to FIG. 9, for a human it may be simple to determine the appropriate channel of travel, but this is not always clear to the computing devices of a vehicle such as vehicle 100.
In one example analysis, the computing device may analyze the channels in reverse. For example, if the situation is unambiguous for reverse traffic, the computing device may determine that such a channel is for reverse traffic. In other words, if it is apparent which channel or channels should be used by reverse traffic, that channel or those channels may be excluded from the possible options for the vehicle. Again as shown in FIG. 9, based on the relative position of cone R (or bounding box 7R), it may be relatively easy to determine that a vehicle traveling in the reverse traffic lane (here, the lane 616) would travel along the road 610 by following channel 920. For example, another vehicle already following channel 920 would travel past cone R (or bounding box 7R) while remaining in lane 616. In this regard, the computing device 110 may determine that channel 920, including sub-channels 920A and 920B, is configured for reverse traffic.
Through this process of elimination, the computing device 110 may determine that any remaining channel is suitable for the vehicle 100 to pass through. In this regard, one of the channels may be selected based on the determined traffic flow through the channels. For example, because there are only two identified channels, and channel 920 is determined to be configured for reverse traffic, the computing device 110 may then determine that the vehicle should travel along channel 910. At this point, the vehicle may be controlled to enter and follow the selected channel. Of course, as noted above, additional methods may be used if more than one possible channel remains after using this technique.
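A minimal sketch of this process of elimination, assuming the flow-direction labels come from the analyses described in this section:

```python
def select_channel(channels, flow_direction):
    """Exclude channels determined to carry reverse traffic and return the
    remaining channel if it is unique; otherwise signal that further
    analysis is needed. `flow_direction` maps a channel id to "reverse",
    "forward", or "unknown"."""
    candidates = [c for c in channels if flow_direction.get(c) != "reverse"]
    if len(candidates) == 1:
        return candidates[0]
    return None  # ambiguity remains; fall back to the additional methods below

# Example: with channels 910 and 920, and 920 deemed reverse traffic,
# the vehicle should travel along 910.
assert select_channel(["910", "920"], {"920": "reverse"}) == "910"
```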
In this regard, additionally or alternatively, the computing device may attempt to resolve the ambiguity by analyzing any signs. For example, in a redirected area, there may be signs indicating which channels should or should not be used from certain directions. Such signs may include left or right arrows, wrong-way signs, and so on. In some cases, these signs may be held by construction workers who direct traffic in both directions through the same channel. These signs can be detected using various image recognition and optical character recognition techniques. Again, these signs may indicate which, if any, of the channels are suitable for the vehicle to pass through. For example, the computing device 110 may use optical character recognition techniques to identify the text of the sign 650 in an image captured by a camera of the vehicle's perception system 172. The sign may indicate that the vehicle should "keep right" or "do not enter". This may indicate that the vehicle 100 is more likely meant to follow channel 910 than channel 920.
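By way of illustration, recognized sign text might be mapped to a coarse hint as sketched below; the mapping and the example phrases are assumptions, and the OCR step itself is left to whatever recognizer the perception pipeline provides:

```python
def channel_hint_from_sign(sign_text: str) -> str:
    """Map recognized sign text, e.g. from OCR on a camera image of sign 650,
    to a coarse steering hint."""
    text = sign_text.upper()
    if "DO NOT ENTER" in text or "WRONG WAY" in text:
        return "avoid"
    if "RIGHT" in text:
        return "keep_right"
    if "LEFT" in text:
        return "keep_left"
    return "no_hint"

# e.g. channel_hint_from_sign("Keep Right") == "keep_right"
```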
In addition to the content of the sign, the location and orientation of the sign may provide clues to the computing device 110 as to the sign's "meaning". For example: whether the sign is located in a position clearly associated with one channel or another, whether the sign gives a command (e.g., a right or left arrow) with respect to one channel or another, whether the content of the sign is visible from one direction of traffic or another (as this may indicate which direction of traffic the sign is expected to affect), and so on. For example, given the location of the sign 650 relative to the channels 910 and 920, and the orientation of the sign toward eastbound traffic, the computing device 110 may determine that the vehicle 100 is more likely meant to follow channel 910 than channel 920.
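One hedged sketch of the orientation cue, assuming headings are expressed in radians and that a sign addresses the traffic whose direction of travel its face is turned against:

```python
import math

def sign_addresses_ego(sign_facing, ego_heading, max_angle=math.pi / 3):
    """Return True if the sign's face is turned toward oncoming ego traffic,
    i.e. roughly opposite the ego vehicle's heading (within max_angle)."""
    diff = (sign_facing - (ego_heading + math.pi)) % (2 * math.pi)
    diff = min(diff, 2 * math.pi - diff)
    return diff <= max_angle

# A sign facing west (pi) addresses eastbound traffic (heading 0).
assert sign_addresses_ego(math.pi, 0.0)
```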
Thus, using signs may also provide the computing device 110 with information regarding the direction of traffic flow through one or more of the channels, and thereby indicate which channel the computing device 110 should select to enter and follow, as described above. However, there may not always be enough signage to identify which channels the vehicle may enter.
As yet another approach, the computing device may attempt to determine the direction of traffic through each channel by observing the behavior of other vehicles. For example, if vehicles from either direction (the same as or opposite to the vehicle) are observed passing through a particular channel in a particular direction, the computing device may use this information to determine which, if any, of the channels are suitable for the vehicle to enter. Turning to FIG. 10, given the position and heading (arrow 760) of vehicle 640 (or bounding box 740), that vehicle appears most likely to be following channel 920, and here sub-channel 920A. Because vehicle 640 is actually approaching vehicle 100 (as reverse traffic), the computing device 110 may determine, for this reason alone or as an additional reason, that channel 920 and sub-channel 920A are configured for reverse traffic. Similarly, given the position and lack of movement of vehicle 644 (or bounding box 744), which appears most likely to be blocking sub-channel 920B, the computing device 110 may determine that sub-channel 920B may not be a suitable channel for vehicle 100 or for reverse traffic. Thus, the observed behavior of other vehicles may also provide the computing device 110 with information regarding the direction of traffic flow through one or more of the channels, indicating which channel the computing device 110 should select to enter and follow, as described above.
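A simple way to classify an observed vehicle as same-direction or reverse traffic is to compare its estimated heading with the ego vehicle's heading, as sketched below under the assumption that headings are in radians:

```python
import math

def classify_observed_flow(vehicle_heading, ego_heading, threshold=math.pi / 2):
    """Label an observed vehicle, such as the one represented by bounding box
    740 with heading arrow 760, as "forward" or "reverse" traffic."""
    diff = abs(vehicle_heading - ego_heading) % (2 * math.pi)
    diff = min(diff, 2 * math.pi - diff)
    return "reverse" if diff > threshold else "forward"

# A vehicle heading opposite the ego vehicle is reverse traffic.
assert classify_observed_flow(math.pi, 0.0) == "reverse"
```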
As another method, used in addition to or in lieu of any of the above, the computing device may detect a road surface condition and use that condition to determine whether the vehicle should avoid a certain channel. For example, using sensor data provided by the perception system 172, the computing device may determine whether a channel includes an open trench or a drop of a certain height, e.g., more than a few inches, or whether the channel includes an unpaved surface. In such a case, the computing device may determine that the vehicle should not use that channel.
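A minimal sketch of such a check, assuming the perception system can supply road-surface height samples along a channel; the 0.1 meter threshold (about four inches) is an assumption standing in for "more than a few inches":

```python
def channel_is_passable(surface_heights, max_step=0.1):
    """Reject a channel whose sensed surface steps up or down more than
    `max_step` meters between successive samples, e.g. an open trench,
    a drop-off, or a badly broken unpaved surface."""
    return all(
        abs(b - a) <= max_step
        for a, b in zip(surface_heights, surface_heights[1:])
    )

assert channel_is_passable([0.0, 0.02, 0.01])
assert not channel_is_passable([0.0, -0.3])  # a drop the vehicle should avoid
```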
As another method, used in addition to or in place of any of the methods described above, the computing device may use information provided by other vehicles that have recently passed through the area. For example, if a vehicle operating in an autonomous driving mode (or in a manual driving mode where the autonomous driving software runs in the background but does not control the vehicle) passes through the area, that vehicle's computing devices may share information about the ambiguity, and how those computing devices responded, with other vehicles in the area. Additionally or alternatively, if a vehicle's computing devices identify such channels and a possible ambiguity, the computing devices may send this information along with any relevant sensor information, such as camera images or lidar data. This is particularly useful for vehicles that may approach the ambiguity from different directions or vantage points. For example, a vehicle may pass through an intersection that is not itself redirected, but detect a left-arrow, right-arrow, or wrong-way sign along one of the cross streets of the intersection. This information can then be used by any vehicles that subsequently travel along that cross street.
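By way of illustration only, the shared information might be bundled as in the sketch below; every field name is illustrative rather than part of any actual vehicle-to-vehicle message format:

```python
import json
import time

def make_redirection_report(location, channels, sensor_refs):
    """Bundle what this vehicle concluded about a traffic redirection so it
    can be shared with other vehicles approaching the same area."""
    return json.dumps({
        "timestamp": time.time(),
        "location": location,        # e.g. (latitude, longitude) of the area
        "channels": channels,        # each with geometry and flow direction
        "sensor_refs": sensor_refs,  # ids of camera images or lidar sweeps
    })
```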
If the ambiguity still cannot be resolved using one or more of the methods above, the computing device may send a request to a human operator asking for instructions on how to proceed. For example, the computing device 110 may request assistance from the concierge 442 via the concierge workstation 440 using the network 460. This may include sending information identifying the channels the computing device has identified, for review, and receiving instructions on how to proceed (i.e., which channel or channels are suitable for the vehicle to enter). In some cases, the concierge 442 may simply reroute the vehicle, for example, if the ambiguity is such that even a human operator cannot be certain. If the concierge 442 is not available or cannot confidently determine a correct answer, e.g., because the relevant sign is farther back, has been knocked over, or is unclear, the computing device 110 may determine that it is not acceptable to continue through any of the channels. As a result, the computing device 110 may control the vehicle to avoid the channels altogether by steering away from them and/or rerouting the vehicle.
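A hedged sketch of this last-resort flow, where `request_assistance` and `reroute` stand in for the vehicle's actual remote-assistance and routing interfaces:

```python
def resolve_with_assistance(channels, request_assistance, reroute):
    """Ask a remote operator which channel to take; if no confident answer
    comes back, avoid the channels and reroute instead."""
    answer = request_assistance({"candidate_channels": channels})
    if answer and answer.get("channel") in channels:
        return answer["channel"]
    reroute()  # no usable answer: avoid the channels altogether
    return None
```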
FIG. 11 is a flow diagram 1100 of a method that may be performed by one or more processors, such as the one or more processors 120 of the computing device 110, in order to control a vehicle in an autonomous driving mode. At block 1102, the vehicle is maneuvered in the autonomous driving mode using pre-stored map information that identifies traffic flow directions. At block 1104, data is received from the vehicle's perception system identifying objects in the environment external to the vehicle that are related to a traffic redirection not identified in the map information. At block 1106, the received data is used to identify one or more channels of the traffic redirection. At block 1108, one of the one or more channels is selected based on the direction of traffic flow through the selected channel. At block 1110, the vehicle is controlled in the autonomous driving mode to enter and follow the selected one of the one or more channels.
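For illustration, the blocks of flow diagram 1100 might be tied together as below, with each block supplied as a callable so the sketch stays self-contained; all four callables are placeholders for the vehicle's real subsystems:

```python
def traffic_redirection_step(detect, identify, choose, follow):
    """One pass of the logic in flow diagram 1100."""
    objects = detect()            # block 1104: redirection objects not in the map
    if not objects:
        return None               # block 1102: keep following mapped lanes
    channels = identify(objects)  # block 1106: one or more channels
    selected = choose(channels)   # block 1108: pick by traffic flow direction
    if selected is not None:
        follow(selected)          # block 1110: enter and follow the channel
    return selected
```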
Unless otherwise specified, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of features may be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of examples described herein, as well as clauses phrased as "such as," "including," and the like, should not be interpreted as limiting the claimed subject matter to the specific examples; rather, these examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings may identify the same or similar elements.

Claims (20)

1. A method of controlling a vehicle in an autonomous driving mode, the method comprising:
maneuvering, by one or more processors, the vehicle in the autonomous driving mode using pre-stored map information identifying one or more traffic lanes, wherein each traffic lane has a path and a traffic flow direction;
receiving, by the one or more processors, data from a perception system of the vehicle, wherein the data identifies an object in an environment external to the vehicle that is related to a traffic redirection at a given location, wherein the traffic redirection includes at least one channel that does not correspond to one or both of the path and the traffic flow direction of a traffic lane identified in the pre-stored map information for the given location;
identifying, by the one or more processors, one or more channels of the traffic redirection using the received data, wherein the one or more channels are newly created based on the object related to the traffic redirection at the given location;
selecting, by the one or more processors, one of the one or more channels based on a direction of traffic flow through the selected channel; and
controlling, by the one or more processors, the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more channels.
2. The method of claim 1, further comprising determining the direction of traffic flow through the selected channel by analyzing how traffic oncoming relative to the vehicle enters and passes through the one or more channels.
3. The method of claim 1, further comprising determining the direction of traffic flow through the selected channel by analyzing a sign proximate to any of the one or more channels.
4. The method of claim 1, further comprising determining the direction of traffic flow through the selected channel by observing traffic passing through any of the one or more channels.
5. The method of claim 1, further comprising:
receiving, from one or more computing devices of a second vehicle, information identifying the one or more channels; and
determining the direction of traffic flow through the selected channel based on the received information.
6. The method of claim 1, further comprising:
after identifying the one or more channels using the received data, sending, to a computing device remote from the vehicle, a request for instructions on how to proceed; and
receiving the instructions, wherein selecting the selected one of the one or more channels is further based on the received instructions.
7. The method of claim 1, further comprising determining a direction of traffic flow through each of the one or more channels, and wherein selecting the selected channel is further based on any determined directions of traffic flow.
8. The method of claim 1, wherein the one or more channels are not defined by two or more lane lines.
9. A system for controlling a vehicle in an autonomous driving mode, the system comprising one or more processors configured to:
maneuver the vehicle in the autonomous driving mode using pre-stored map information identifying one or more traffic lanes, each traffic lane having a path and a traffic flow direction;
receive data from a perception system of the vehicle, wherein the data identifies an object in an environment external to the vehicle that is related to a traffic redirection at a given location, wherein the traffic redirection includes at least one channel that does not correspond to one or both of the path and the traffic flow direction of a traffic lane identified in the pre-stored map information for the given location;
identify one or more channels of the traffic redirection using the received data, wherein the one or more channels are newly created based on the object related to the traffic redirection at the given location;
select one of the one or more channels based on a direction of traffic flow through the selected channel; and
control the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more channels.
10. The system of claim 9, wherein the one or more processors are further configured to determine the direction of traffic flow through the selected channel by analyzing how traffic oncoming relative to the vehicle enters and passes through the one or more channels.
11. The system of claim 9, wherein the one or more processors are further configured to determine the direction of traffic flow through the selected channel by analyzing a sign proximate to any of the one or more channels.
12. The system of claim 9, wherein the one or more processors are further configured to determine the direction of traffic flow through the selected channel by observing traffic passing through any of the one or more channels.
13. The system of claim 9, wherein the one or more processors are further configured to:
receive, from one or more computing devices of a second vehicle, information identifying the one or more channels; and
determine the direction of traffic flow through the selected channel based on the received information.
14. The system of claim 9, wherein the one or more processors are further configured to:
after identifying the one or more channels using the received data, send, to a computing device remote from the vehicle, a request for instructions on how to proceed; and
receive the instructions, wherein selecting the selected one of the one or more channels is further based on the received instructions.
15. The system of claim 9, wherein the one or more processors are further configured to determine a direction of traffic flow through each of the one or more channels, and wherein selecting the selected channel is further based on any determined directions of traffic flow.
16. The system of claim 9, wherein the one or more channels are not defined by two or more lane lines.
17. The system of claim 9, further comprising the vehicle.
18. A non-transitory computer-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to perform a method for controlling a vehicle in an autonomous driving mode, the method comprising:
maneuvering the vehicle in the autonomous driving mode using pre-stored map information identifying one or more traffic lanes, each traffic lane having a path and a traffic flow direction;
receiving data from a perception system of the vehicle, wherein the data identifies an object in an environment external to the vehicle that is related to a traffic redirection at a given location, wherein the traffic redirection includes at least one channel that does not correspond to one or both of the path and the traffic flow direction of a traffic lane identified in the pre-stored map information for the given location;
identifying one or more channels of the traffic redirection using the received data, wherein the one or more channels are newly created based on the object related to the traffic redirection at the given location;
selecting one of the one or more channels based on a direction of traffic flow through the selected channel; and
controlling the vehicle in the autonomous driving mode to enter and follow the selected one of the one or more channels.
19. The medium of claim 18, wherein the method further comprises determining the direction of traffic flow through the selected channel by analyzing how traffic oncoming relative to the vehicle enters and passes through the one or more channels.
20. The medium of claim 18, wherein the method further comprises determining the direction of traffic flow through the selected channel by observing traffic passing through any of the one or more channels.
CN201880070898.6A 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles Expired - Fee Related CN111295629B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111496739.2A CN114518749A (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15/798,881 2017-10-31
US15/798,881 US10713940B2 (en) 2017-10-31 2017-10-31 Detecting and responding to traffic redirection for autonomous vehicles
PCT/US2018/057971 WO2019089444A1 (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection for autonomous vehicles

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202111496739.2A Division CN114518749A (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles

Publications (2)

Publication Number Publication Date
CN111295629A CN111295629A (en) 2020-06-16
CN111295629B true CN111295629B (en) 2021-11-30

Family

ID=66244168

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111496739.2A Pending CN114518749A (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles
CN201880070898.6A Expired - Fee Related CN111295629B (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111496739.2A Pending CN114518749A (en) 2017-10-31 2018-10-29 Detecting and responding to traffic redirection of autonomous vehicles

Country Status (6)

Country Link
US (2) US10713940B2 (en)
JP (1) JP7093408B2 (en)
KR (1) KR102488743B1 (en)
CN (2) CN114518749A (en)
IL (1) IL274061B2 (en)
SG (1) SG11202003335YA (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7095968B2 (en) 2017-10-02 2022-07-05 トヨタ自動車株式会社 Management device
JP6986234B2 (en) * 2018-04-06 2021-12-22 トヨタ自動車株式会社 Vehicles, vehicle control devices, and vehicle control systems
JP6708249B1 (en) * 2018-12-26 2020-06-10 株式会社Jvcケンウッド Vehicle recording control device, vehicle recording device, vehicle recording control method, and program
US10665109B1 (en) * 2019-05-17 2020-05-26 sibrtech inc. Construction zone apparatus and method
CN110702135A (en) * 2019-10-14 2020-01-17 广州小鹏汽车科技有限公司 Navigation method and device for vehicle, automobile and storage medium
US11603094B2 (en) 2020-02-20 2023-03-14 Toyota Motor North America, Inc. Poor driving countermeasures
US11527154B2 (en) * 2020-02-20 2022-12-13 Toyota Motor North America, Inc. Wrong way driving prevention
CN113639760A (en) * 2020-04-27 2021-11-12 福特全球技术公司 Navigation system and display method of navigation map
US11695851B2 (en) * 2020-06-03 2023-07-04 Micron Technology, Inc. Gateway for vehicle with caching buffer for distributed storage system
US11420656B2 (en) * 2020-07-13 2022-08-23 GM Global Technology Operations LLC Security system and method for disabling vehicle functions
JP7614774B2 (en) * 2020-10-02 2025-01-16 株式会社Subaru Vehicle driving support device
US20230419678A1 (en) * 2022-06-22 2023-12-28 Waymo Llc Joint Detection and Grouping of Road Objects Using Machine Learning
US20240286635A1 (en) * 2023-02-28 2024-08-29 Gm Cruise Holdings Llc Systems and techniques for classification of signs and gestures of traffic controllers

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000321081A (en) * 1999-04-15 2000-11-24 Daimlerchrysler Ag Update method for traffic route network map and map- supported method for generating automobile guide information
CN102227612A (en) * 2008-10-24 2011-10-26 格瑞股份公司 Control and systems for autonomously driven vehicles
US20170045885A1 (en) * 2014-11-13 2017-02-16 Toyota Motor Engineering & Manufacturing North America, Inc. Remote operation of autonomous vehicle in unexpected environment
US20170043768A1 (en) * 2015-08-14 2017-02-16 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle operation relative to unexpected dynamic objects
CN106458218A (en) * 2014-03-04 2017-02-22 谷歌公司 Reporting road event data and sharing with other vehicles
CN206300797U (en) * 2016-11-18 2017-07-04 特路(北京)科技有限公司 The static-obstacle thing response performance checkout area of automatic driving vehicle
CN106940933A (en) * 2017-03-08 2017-07-11 北京理工大学 A kind of intelligent vehicle decision-making lane-change method based on intelligent transportation system
CN107272687A (en) * 2017-06-29 2017-10-20 深圳市海梁科技有限公司 A kind of driving behavior decision system of automatic Pilot public transit vehicle

Family Cites Families (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10573093B2 (en) * 1995-06-07 2020-02-25 Automotive Technologies International, Inc. Vehicle computer design and use techniques for receiving navigation software
US5764283A (en) 1995-12-29 1998-06-09 Lucent Technologies Inc. Method and apparatus for tracking moving objects in real time using contours of the objects and feature paths
CN100588909C (en) 2003-07-17 2010-02-10 哈曼贝克自动系统股份有限公司 Route calculation around traffic obstacles using marked diversions
US7689348B2 (en) 2006-04-18 2010-03-30 International Business Machines Corporation Intelligent redirection of vehicular traffic due to congestion and real-time performance metrics
JP4793094B2 (en) 2006-05-17 2011-10-12 株式会社デンソー Driving environment recognition device
WO2008029802A1 (en) 2006-09-04 2008-03-13 Panasonic Corporation Travel information providing device
US7880621B2 (en) 2006-12-22 2011-02-01 Toyota Motor Engineering & Manufacturing North America, Inc. Distraction estimator
JP5236631B2 (en) 2007-05-09 2013-07-17 パナソニック株式会社 Display device, display method, display program
JP2010003157A (en) * 2008-06-20 2010-01-07 Toyota Motor Corp Travel support device
US8346426B1 (en) * 2010-04-28 2013-01-01 Google Inc. User interface for displaying internal state of autonomous driving system
US8818702B2 (en) 2010-11-09 2014-08-26 GM Global Technology Operations LLC System and method for tracking objects
WO2012150591A2 (en) 2011-05-03 2012-11-08 Alon Atsmon Automatic content analysis method and system
US20120294540A1 (en) 2011-05-17 2012-11-22 Microsoft Corporation Rank order-based image clustering
US8498448B2 (en) 2011-07-15 2013-07-30 International Business Machines Corporation Multi-view object detection using appearance model transfer from similar scenes
US8880272B1 (en) * 2012-03-16 2014-11-04 Google Inc. Approach for estimating the geometry of roads and lanes by using vehicle trajectories
US9002060B2 (en) 2012-06-28 2015-04-07 International Business Machines Corporation Object retrieval in video data using complementary detectors
US20140003706A1 (en) 2012-07-02 2014-01-02 Sony Pictures Technologies Inc. Method and system for ensuring stereo alignment during pipeline processing
KR101949294B1 (en) 2012-07-24 2019-02-18 삼성전자주식회사 apparatus and method of calculating histogram accumulation of image
US9633436B2 (en) 2012-07-26 2017-04-25 Infosys Limited Systems and methods for multi-dimensional object detection
US8996286B1 (en) * 2012-08-03 2015-03-31 Google Inc. Method for analyzing traffic patterns to provide solutions for alleviating traffic problems
US9355562B1 (en) * 2012-08-14 2016-05-31 Google Inc. Using other vehicle trajectories to aid autonomous vehicles driving through partially known areas
US9221461B2 (en) * 2012-09-05 2015-12-29 Google Inc. Construction zone detection using a plurality of information sources
US8996228B1 (en) * 2012-09-05 2015-03-31 Google Inc. Construction zone object detection using light detection and ranging
US9195914B2 (en) 2012-09-05 2015-11-24 Google Inc. Construction zone sign detection
US9600768B1 (en) 2013-04-16 2017-03-21 Google Inc. Using behavior of objects to infer changes in a driving environment
US9110163B2 (en) 2013-06-14 2015-08-18 Microsoft Technology Licensing, Llc Lidar-based classification of object movement
US8825259B1 (en) 2013-06-21 2014-09-02 Google Inc. Detecting lane closures and lane shifts by an autonomous vehicle
US9145139B2 (en) * 2013-06-24 2015-09-29 Google Inc. Use of environmental information to aid image processing for autonomous vehicles
US9315192B1 (en) 2013-09-30 2016-04-19 Google Inc. Methods and systems for pedestrian avoidance using LIDAR
US9335178B2 (en) 2014-01-28 2016-05-10 GM Global Technology Operations LLC Method for using street level images to enhance automated driving mode for vehicle
JP6537780B2 (en) 2014-04-09 2019-07-03 日立オートモティブシステムズ株式会社 Traveling control device, in-vehicle display device, and traveling control system
US9734583B2 (en) 2014-06-30 2017-08-15 Collin Walker Systems and methods for controlling vehicle position and orientation
JP6340957B2 (en) 2014-07-02 2018-06-13 株式会社デンソー Object detection apparatus and object detection program
WO2016013095A1 (en) 2014-07-25 2016-01-28 株式会社日立製作所 Autonomous moving device
CN107111742B (en) 2014-08-18 2021-04-02 无比视视觉技术有限公司 Identification and prediction of lane restrictions and construction areas in navigation
JP6417191B2 (en) 2014-11-06 2018-10-31 キヤノン株式会社 Image processing apparatus and image processing method
US9892296B2 (en) 2014-11-12 2018-02-13 Joseph E. Kovarik Method and system for autonomous vehicles
CN107004363B (en) 2014-12-10 2020-02-18 三菱电机株式会社 Image processing device, on-vehicle display system, display device, and image processing method
KR101621649B1 (en) * 2015-01-28 2016-05-16 한양대학교 산학협력단 Method for Generating Position Distribution Information of Moving Obstacle, Method and System for Controlling Vehicle Using the Position Distribution Information
US9676386B2 (en) 2015-06-03 2017-06-13 Ford Global Technologies, Llc System and method for controlling vehicle components based on camera-obtained image information
JP6606369B2 (en) 2015-07-21 2019-11-13 株式会社Soken Object detection apparatus and object detection method
JP6444835B2 (en) 2015-09-07 2018-12-26 株式会社東芝 Image processing apparatus, image processing program, and image processing system
US9503860B1 (en) 2015-09-10 2016-11-22 Ca, Inc. Intelligent pursuit detection
US9576185B1 (en) 2015-09-14 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Classifying objects detected by 3D sensors for autonomous vehicle operation
US9566986B1 (en) 2015-09-25 2017-02-14 International Business Machines Corporation Controlling driving modes of self-driving vehicles
JP6798779B2 (en) 2015-11-04 2020-12-09 トヨタ自動車株式会社 Map update judgment system
WO2017106846A2 (en) 2015-12-18 2017-06-22 Iris Automation, Inc. Real-time visual situational awareness system
DE102016000199A1 (en) * 2016-01-11 2017-07-13 Trw Automotive Gmbh Control system and method for determining a safe lane change by motor vehicles
CN105699095B (en) * 2016-01-27 2018-11-13 常州加美科技有限公司 A kind of test method of automatic driving vehicle
DE112016003567T5 (en) * 2016-02-25 2018-04-19 Hitachi, Ltd. Control method for a moving body, moving body, and a moving body control system
CN108698598A (en) * 2016-03-15 2018-10-23 本田技研工业株式会社 Vehicle control system, control method for vehicle and vehicle control program
JP6583185B2 (en) * 2016-08-10 2019-10-02 トヨタ自動車株式会社 Automatic driving system and automatic driving vehicle
US10515543B2 (en) * 2016-08-29 2019-12-24 Allstate Insurance Company Electrical data processing system for determining status of traffic device and vehicle movement
JP6940612B2 (en) 2016-09-14 2021-09-29 ナウト, インコーポレイテッドNauto, Inc. Near crash judgment system and method
DE102016218876B4 (en) 2016-09-29 2022-05-19 Audi Ag Method for operating a motor vehicle, motor vehicle and method for determining setpoint trajectory information
US10699305B2 (en) 2016-11-21 2020-06-30 Nio Usa, Inc. Smart refill assistant for electric vehicles
JP6895634B2 (en) * 2016-12-16 2021-06-30 パナソニックIpマネジメント株式会社 Information processing systems, information processing methods, and programs
US10296812B2 (en) 2017-01-04 2019-05-21 Qualcomm Incorporated Systems and methods for mapping based on multi-journey data
US11009875B2 (en) 2017-03-09 2021-05-18 Waymo Llc Preparing autonomous vehicles for turns
US10389432B2 (en) * 2017-06-22 2019-08-20 At&T Intellectual Property I, L.P. Maintaining network connectivity of aerial devices during unmanned flight
US10558217B2 (en) 2017-08-28 2020-02-11 GM Global Technology Operations LLC Method and apparatus for monitoring of an autonomous vehicle
US10867510B2 (en) 2018-04-05 2020-12-15 Toyota Jidosha Kabushiki Kaisha Real-time traffic monitoring with connected cars

Also Published As

Publication number Publication date
US10713940B2 (en) 2020-07-14
CN111295629A (en) 2020-06-16
SG11202003335YA (en) 2020-05-28
IL274061A (en) 2020-06-30
CN114518749A (en) 2022-05-20
IL274061B2 (en) 2024-09-01
JP7093408B2 (en) 2022-06-29
US11887474B2 (en) 2024-01-30
IL274061B1 (en) 2024-05-01
US20190130736A1 (en) 2019-05-02
US20200365016A1 (en) 2020-11-19
KR20200047766A (en) 2020-05-07
JP2021501395A (en) 2021-01-14
KR102488743B1 (en) 2023-01-17

Similar Documents

Publication Publication Date Title
CN111295629B (en) Detecting and responding to traffic redirection of autonomous vehicles
US11537133B2 (en) Dynamic routing for autonomous vehicles
US11056003B1 (en) Occupant facing vehicle display
CN111132884B (en) Method and system for stopping vehicle
CA3072744C (en) Recognizing assigned passengers for autonomous vehicles
US9740202B2 (en) Fall back trajectory systems for autonomous vehicles
US9551992B1 (en) Fall back trajectory systems for autonomous vehicles
US10220776B1 (en) Scenario based audible warnings for autonomous vehicles
AU2021201402B2 (en) Detecting and responding to traffic redirection for autonomous vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211130