US20150316386A1 - Detailed map format for autonomous driving - Google Patents
Info
- Publication number
- US20150316386A1 (application US14/265,370)
- Authority
- US
- United States
- Prior art keywords
- border
- lane
- map format
- segment
- traffic signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18154—Approaching an intersection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3658—Lane guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
- G01C21/3819—Road shape data, e.g. outline of a route
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/40—High definition maps
Abstract
A computer-readable detailed map format is disclosed. The detailed map format includes a lane segment and one or more border segments. A distance to an edge of the lane segment can be determined at a location along the lane segment by measuring a distance between the location and a portion of the border segment closest to the location. A lane width of the lane segment can be determined at a location along the lane segment by measuring a distance between two of the border segments positioned proximate to and on opposite sides of the lane segment.
Description
- Fully or highly automated, e.g. autonomous or self-driven, driving systems are designed to operate a vehicle on the road either without or with low levels of driver interaction or other external controls. Autonomous driving systems require certainty in the position of and distance to geographic features surrounding the vehicle with a sufficient degree of accuracy to adequately control the vehicle. Details about the road or other geographic features surrounding the vehicle can be recorded on a detailed virtual map. The more accurate the detailed virtual map, the better the performance of the autonomous driving system. Existing virtual maps do not include sufficient or sufficiently accurate geographic feature details for optimized autonomous operation.
- The detailed map format described here can be used to represent the drivable area of a road, including the boundary locations of each lane, the exact width of each lane, and the location of the impassable borders of a given lane, such as curbs, medians, and islands. The detailed map format can also include information to support driving rules associated with a given lane of the road, to calculate the distance from any object within the map format to the boundary of a lane, and to identify other map features intersecting a lane, such as crosswalks and driveways. The highly detailed nature of this map format allows for improved control of a highly-automated or autonomous vehicle as well as for improved localization (exact positioning) of the autonomous vehicle in respect to the detailed map format.
- Each lane within the detailed map format can include lane segments formed of waypoints. The detailed map format disclosed can also include border segments formed of borderpoints. Information associated with these border segments and borderpoints includes border type and border color. An autonomous vehicle can be controlled to operate according to driving rules based on a given border type and border color associated with the detailed map format. Border segments can also be used to determine the distance to an edge of a lane for a given lane segment or the width of the lane at any point along the lane segment, providing for more accurate control of the autonomous vehicle than is possible using lane segments formed of waypoints alone.
- In one implementation, a computer-readable map format is disclosed. The map format includes at least a lane segment and a border segment. The distance to an edge of the lane segment can be determined at any location along the lane segment by measuring a distance between the chosen location and a portion of the border segment closest to the location.
- In another implementation, another computer-readable map format is disclosed. The map format includes at least a lane segment and a plurality of border segments. A lane width of the lane segment can be determined at any location along the lane segment by measuring a distance between at least two of the plurality of border segments positioned proximate to and on opposite sides of the lane segment.
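Though the disclosure describes the format abstractly, the relationships among its elements can be sketched as data structures. The Python data classes below are a minimal, hypothetical model of the entities named above; every class and field name is an assumption made for illustration, not a serialization defined by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical, minimal data model; names and types are illustrative only.

@dataclass
class Waypoint:
    location: Tuple[float, float]  # geographical location, e.g. (x, y) in a local frame
    lane_speed: float              # speed limit along the lane at this point
    lane_direction: float          # travel direction, e.g. heading in radians

@dataclass
class Borderpoint:
    location: Tuple[float, float]
    border_type: str               # e.g. "curb", "single solid line", "double solid line"
    border_color: str              # e.g. "yellow", "white", "unknown"

@dataclass
class LaneSegment:
    waypoints: List[Waypoint]      # a lane segment extends between at least two waypoints
    driving_rules: List[str] = field(default_factory=list)

@dataclass
class BorderSegment:
    borderpoints: List[Borderpoint]  # a border segment extends between at least two borderpoints

@dataclass
class StopLine:
    location: Tuple[float, float]  # where the vehicle must stop before an intersection

@dataclass
class Lane:
    segment: LaneSegment
    borders: List[BorderSegment]   # e.g. the left and right borders of the lane
    stop_line: Optional[StopLine] = None
```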
- The description herein makes reference to the accompanying drawings wherein like reference numerals refer to like parts throughout the several views, and wherein:
- FIG. 1 is a block diagram of a computing device;
- FIG. 2 is a schematic illustration of an autonomous vehicle including the computing device of FIG. 1;
- FIG. 3 shows an example two-dimensional representation of a portion of a two-lane road as represented within a detailed map format for use with the autonomous vehicle of FIG. 2;
- FIG. 4 shows an example two-dimensional representation of a portion of a four-lane road transitioning to a five-lane road at an intersection as represented within a detailed map format for use with the autonomous vehicle of FIG. 2; and
- FIG. 5 shows an example two-dimensional representation of a roundabout as represented within a detailed map format for use with the autonomous vehicle of FIG. 2.
- A computer-readable, highly detailed map format for an autonomous vehicle is disclosed. The detailed map format includes information representing the geographical location, travel direction, and speed limit of lanes on a given road using lane segments formed of waypoints. Beyond this basic information, the detailed map format also includes the geographical location for the borders of each lane in the form of border segments formed of borderpoints. Information associated with the border segments and borderpoints within the detailed map format can include the border type and border color, such that driving rules can be associated with the lane segments based on the closest border segments. The detailed map format can also include stop lines linked to the end of lanes at traffic intersections to better position the autonomous vehicle for entry into a traffic intersection and to indicate where the autonomous vehicle should stop at the traffic intersection. Crosswalks can also be included in the detailed map format and associated with safety rules to be followed when the autonomous vehicle approaches the crosswalk.
- FIG. 1 is a block diagram of a computing device 100, for example, for use with an autonomous driving system. The computing device 100 can be any type of vehicle-installed, handheld, desktop, or other form of single computing device, or can be composed of multiple computing devices. The processing unit in the computing device can be a conventional central processing unit (CPU) 102 or any other type of device, or multiple devices, capable of manipulating or processing information. A memory 104 in the computing device can be a random access memory device (RAM) or any other suitable type of storage device. The memory 104 can include data 106 that is accessed by the CPU 102 using a bus 108.
- The memory 104 can also include an operating system 110 and installed applications 112, the installed applications 112 including programs that permit the CPU 102 to perform automated driving methods using the detailed map format described below. The computing device 100 can also include secondary, additional, or external storage 114, for example, a memory card, flash drive, or any other form of computer readable medium. The installed applications 112 can be stored in whole or in part in the external storage 114 and loaded into the memory 104 as needed for processing.
- The computing device 100 can also be in communication with one or more sensors 116. The sensors 116 can capture data and/or signals for processing by an inertial measurement unit (IMU), a dead-reckoning system, a global navigation satellite system (GNSS), a light detection and ranging (LIDAR) system, a radar system, a sonar system, an image-based sensor system, or any other type of system capable of capturing information specific to the environment surrounding a vehicle for use in creating a detailed map format as described below, including information specific to objects such as features of the route being travelled by the vehicle or other localized position data and/or signals, and outputting corresponding data and/or signals to the CPU 102.
- In the examples described below, the sensors 116 can capture, at least, signals for a GNSS or other system that determines vehicle position and velocity and data for a LIDAR system or other system that measures vehicle distance from lane lines (e.g., route surface markings or route boundaries), obstacles, objects, or other environmental features including traffic lights and road signs. The computing device 100 can also be in communication with one or more vehicle systems 118, such as vehicle braking systems, vehicle propulsion systems, etc. The vehicle systems 118 can also be in communication with the sensors 116, the sensors 116 being configured to capture data indicative of performance of the vehicle systems 118.
- FIG. 2 is a schematic illustration of an autonomous vehicle 200 including the computing device 100 of FIG. 1. The computing device 100 can be located within the vehicle 200 as shown in FIG. 2 or can be located remotely from the vehicle 200 in an alternate location (not shown). If the computing device 100 is located remotely from the vehicle 200, the vehicle 200 can include the capability of communicating with the computing device 100.
- The vehicle 200 can also include a plurality of sensors, such as the sensors 116 described in reference to FIG. 1. One or more of the sensors 116 shown can be configured to capture the distance to objects within the surrounding environment for use by the computing device 100 to estimate the position and orientation of the vehicle 200, images for processing by an image sensor, vehicle position in global coordinates based on signals from a plurality of satellites, or any other data and/or signals that could be used to determine the current state of the vehicle 200 or determine the position of the vehicle 200 in respect to its environment for use in either creating a detailed map format or comparing the position of the vehicle 200 to the detailed map format. Recognized geographic features such as those described below can be used to build a detailed map format, and objects such as other vehicles can be recognized and excluded from the detailed map format.
- Map formats can be constructed using geographic features captured by the vehicle 200, such as lane lines and curbs proximate the vehicle 200 as it travels a route. These geographic features can be captured using the above-described LIDAR system and/or cameras in combination with an algorithm such as random sample consensus (RANSAC) to find lines, record the position of the vehicle 200, and collect data on position from a GNSS and/or an IMU. The captured geographic features can then be manipulated using a simultaneous localization and mapping (SLAM) technique to position all of the geographic features in relation to the position of the vehicle 200. Some of the geographic features can be categorized as lane borders, and lane centers can be determined based on the lane borders. Alternatively, map formats can be constructed using overhead images (e.g., satellite images) of geographic features traced by a map editor that allows selection of different categories for each geographic feature.
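As one illustration of the line-finding step, the sketch below fits a single lane-line candidate to 2D points with a basic RANSAC loop. The `ransac_line` helper, the iteration count, and the inlier threshold are assumptions for illustration; a production pipeline would add sensor I/O and the SLAM step described above.

```python
import numpy as np

def ransac_line(points: np.ndarray, n_iters: int = 200,
                threshold: float = 0.1, seed: int = 0):
    """Fit a 2D line to `points` (shape (N, 2)) with RANSAC.

    Returns (anchor, unit_direction, inlier_mask) for the best candidate.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best = (points[0], np.array([1.0, 0.0]))
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p0, d = points[i], points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        normal = np.array([-d[1], d[0]])       # unit normal to the candidate line
        dist = np.abs((points - p0) @ normal)  # perpendicular distance of every point
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best = inliers, (p0, d)
    return best[0], best[1], best_inliers

# Example: noisy points along y = 0.5x; most should come back as inliers.
xs = np.linspace(0.0, 10.0, 60)
pts = np.column_stack([xs, 0.5 * xs])
pts += np.random.default_rng(1).normal(0.0, 0.02, pts.shape)
anchor, direction, mask = ransac_line(pts)
```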
- FIG. 3 shows an example two-dimensional representation of a portion of a two-lane road 300 as represented within a detailed map format for use with the autonomous vehicle 200 of FIG. 2. The two-lane road 300 in this example map format includes lanes 302, 304. Each of the lanes 302, 304 can include a lane segment 306, 308. Each of the lane segments 306, 308 can extend between at least two waypoints; for example, the lane segment 306 extends between the waypoints 310, 312 and the lane segment 308 extends between the waypoints 314, 316. Information can be associated with the waypoints 310, 312, 314, 316; for example, each waypoint can include information such as geographical location, lane speed, and lane direction.
- In the example map format shown in FIG. 3, the lane 302 is shown as having a left-to-right direction by arrows touching the waypoints 310, 312, and the lane 304 is shown as having a right-to-left direction by arrows touching the waypoints 314, 316. The overall computer-readable map format can be stored in plain text, binary, or XML, for example. The basic map information can be gathered from a route network definition file (RNDF) or any other available source. However, this basic map information is not sufficient for control of the autonomous vehicle 200.
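The disclosure names plain text, binary, and XML as possible encodings without fixing a schema. A hypothetical XML rendering of the FIG. 3 lane segment, built with Python's standard library, might look like the following; every tag and attribute name is invented for illustration, and the coordinate and speed values are placeholders.

```python
import xml.etree.ElementTree as ET

# Invented tags and attributes; numeric values are placeholders, not map data.
lane = ET.Element("lane", id="302", direction="left-to-right")
seg = ET.SubElement(lane, "laneSegment", id="306")
for wp_id, x, y in [("310", "0.0", "1.8"), ("312", "25.0", "1.8")]:
    ET.SubElement(seg, "waypoint", id=wp_id, x=x, y=y,
                  laneSpeed="11.2", laneDirection="east")

print(ET.tostring(lane, encoding="unicode"))
```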
- Additional detail can be added to the map format in order to improve the map format for use with the autonomous vehicle 200. As shown in FIG. 3, each of the lanes 302, 304 can be further associated with borders. Each of the borders can be formed of one or more border segments 318, 320, 322, and each of the border segments 318, 320, 322 can extend between at least two borderpoints 324, 326, 328, 330, 332, 334; for example, the border segment 318 extends between the borderpoints 324, 326 and the border segment 322 extends between the borderpoints 332, 334. Information can be associated with the borderpoints; for example, each borderpoint 324, 326, 328, 330, 332, 334 can include information such as geographical location, border type, and border color.
- The information associated with each borderpoint 324, 326, 328, 330, 332, 334 can be used by the autonomous vehicle 200 in order to determine navigation routes, make decisions regarding passing other vehicles, position or localize the autonomous vehicle 200 in respect to the border segments 318, 320, 322, and determine the drivable area along a given navigation route in order to support safety maneuvers or obstacle tracking. The information associated with each borderpoint can, for example, be built from data collected using a LIDAR sensor and manipulated using a SLAM technique when building the detailed map as described above. The map information associated with the borders and lanes 302, 304 can be stored, for example, in the form of spline points or as curves with knot vectors in the memory 104 of the computing device 100 or can be available from a remote location.
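As one reading of the "curves with knot vectors" storage option, the sketch below evaluates a clamped cubic B-spline whose control points stand in for borderpoint coordinates, using SciPy. The control points and knot construction are assumptions for illustration, not values from the disclosure.

```python
import numpy as np
from scipy.interpolate import BSpline

# Hypothetical borderpoint coordinates used as control points (x, y).
ctrl = np.array([[0.0, 0.0], [5.0, 0.2], [10.0, 0.1], [15.0, -0.3], [20.0, 0.0]])
k = 3                      # cubic spline
n = len(ctrl)
# Clamped knot vector: k+1 repeated knots at each end, with len(t) == n + k + 1.
t = np.concatenate([np.zeros(k), np.linspace(0.0, 1.0, n - k + 1), np.ones(k)])
border = BSpline(t, ctrl, k)
samples = border(np.linspace(0.0, 1.0, 50))  # densified border polyline, shape (50, 2)
```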
borderpoints borderpoints border segment 318 extending between them can be associated with a “single solid line” border type. To represent the “single solid line” border type within the map format,borderpoints border segment 318 is shown using a thin, solid line representation. Similarly, theborder segment 320 extending between theborderpoints border segments borderpoints - Border types and border colors can be used to associate a driving rule with each of the
various lane segments 306, 308 (and/or with thewaypoints lane segments 306, 308). A driving rule can be based at least in part on the border type and the border color associated with theborderpoints border segments lane segment border segment 320 extending between theborderpoints border segment 320 can be associated with a border type of “double solid line” and a border color of “yellow;” second, a driving rule of “drivable lane border” based on theborder segment 318 extending between theborderpoints border segment 318 can be associated with a border type of “single solid line” and a border color of “white.” Though the border types, border colors, and driving rules described in reference toFIG. 3 reflect commonly understood traffic rules in the United States, other traffic rules, border types, border colors, and driving rules are also possible. - Another benefit of storing information for both
lane segments border segments lane segment lane segment border segment autonomous vehicle 200 to be positioned within, for example, either of thelanes border segment waypoints lane segment autonomous vehicle 200. Further, the ability to localize theautonomous vehicle 200 is improved because theborder segments vehicle 200. - In some examples,
border segments lane segment lane segment lane segment border segments lane segment lane 302 can be calculated anywhere alonglane segment 306 by measuring the distance betweenborder segments lane autonomous vehicle 200. The positioning benefit is further described in reference toFIG. 4 . -
- FIG. 4 shows an example two-dimensional representation of a portion of a four-lane road transitioning to a five-lane road 400 at an intersection as represented within a detailed map format for use with the autonomous vehicle 200 of FIG. 2. The five-lane road 400 in this example map format includes lanes 402, 404, 406, 408, 410. Each of the lanes 402, 404, 406, 408, 410 can include a lane segment 412, 414, 416, 418, 420. Each of the lane segments can extend between at least two waypoints 422, 424, 426, 428, 430, 432, 434, 436, 438, 440; for example, the lane segment 416 extends between the waypoints 430, 432 and the lane segment 420 extends between the waypoints 438, 440. In this example, the lanes 402, 404, 406 proceed in a left-to-right direction as indicated by the arrows associated with the waypoints 422, 424, 426, 428, 430, 432, and the lanes 408, 410 proceed in a right-to-left direction as indicated by the arrows associated with the waypoints 434, 436, 438, 440.
- Similar information as described above in reference to FIG. 3 is associated with the waypoints 422, 424, 426, 428, 430, 432, 434, 436, 438, 440 of FIG. 4. Each of the lanes 402, 404, 406, 408, 410 can be further associated with borders formed of one or more border segments extending between at least two borderpoints. For simplicity, only a few of the border segments and borderpoints are numbered on the example map format of the road 400 of FIG. 4. For example, border segment 442 extends between borderpoints 444, 446 and border segment 448 extends between borderpoints 450, 452. In this example, the border segments 442, 448 are associated with the lane segments 412, 414, respectively. As described in respect to FIG. 3, the border segments 442, 448 and borderpoints 444, 446, 450, 452 can be associated with various border types and border colors for use in establishing driving rules.
- For example, the border segments 442, 448 can be associated with the “curb” border type, which is represented within the detailed map format using a dotted, hashed line type. The borderpoints 444, 446, 452 are represented using filled circles, and together, the borderpoints 444, 446, 452 and border segments 442, 448 indicate an “impassable” driving rule. When the driving rule indicated is “impassable,” the autonomous vehicle 200 is controlled in a manner such that the vehicle 200 will not attempt to navigate beyond the border. The borderpoint 450 is represented using a half-filled circle, indicating a transition in the border type from a “solid line” type to a “curb” type at the location of the borderpoint 450.
- Understanding the location of a transition between border types is important for autonomous control, as the vehicle 200 is physically able, if necessary, to navigate along or across a “line” type border, but is not able to safely navigate along or across a “curb” type border. The benefit of using border segments is clear in this example. A median 464 is shown as present between the lane 404 and the lane 406. The left-most part of the median 464 is bordered by border segments of solid lines associated with a “drivable lane border” driving rule and a “line” type border, while the right-most part of the median 464 is bordered by border segments of dotted, hashed lines associated with an “impassable” driving rule and a “curb” type border. If necessary, the vehicle 200 could navigate across only the left-most part of the median 464.
- FIG. 4 shows additional features added to the map format in order to improve the map format for use with the autonomous vehicle 200 of FIG. 2. First, three of the lane segments 412, 414, 416 are associated with stop lines 454, 456 near the traffic intersection (the traffic intersection being located on the right-most side of FIG. 4). The stop lines 454, 456 can be linked to the end of one or more lanes 402, 404, 406, and information associated with the stop lines 454, 456 can include a geographical location of a position where the vehicle 200 must stop before the traffic intersection. In the example of FIG. 4, the stop line 456 extends between the border segments 442, 448, denoting the geographical location at which the autonomous vehicle 200 should be positioned if stopping in front of the traffic intersection within either of the lanes 402, 404.
- The additional information provided by the stop lines 454, 456 is useful in operation of the autonomous vehicle 200 because the stop lines 454, 456 allow the autonomous vehicle 200 to be positioned at the traffic intersection in a manner consistent with manual operation of a vehicle. For example, if the autonomous vehicle 200 approaches the traffic intersection within the lane 402, instead of stopping at the waypoint 424 denoting the end of the lane segment 412, the autonomous vehicle 200 can be controlled to move forward to the stop line 456 and slightly around the corner of the lane 402 as denoted by the border segment 442. This maneuver is more consistent with how a driver would manually operate a vehicle on the road 400 when making a right turn at a traffic intersection. Though not shown, crosswalks can also be included in the detailed map format in a manner similar to that used for the stop lines 454, 456. Information associated with the crosswalks can include a geographical location of a position of the crosswalk and a driving rule associated with the crosswalk that directs the automated vehicle system to implement additional safety protocols.
- Traffic signals are another feature present within the map format shown in FIG. 4. Each traffic signal can include information such as geographical location, traffic signal type, and traffic signal state. Traffic signal type can include information on the structure and orientation of a traffic light or traffic sign. Traffic signal structure and orientation for a traffic light can include “vertical three,” “vertical three left arrow,” “horizontal three,” “right arrow,” etc. Traffic signal state for a traffic light can include, for example, “green,” “green arrow,” “yellow,” “blinking yellow,” or “red.” In the map format shown in FIG. 4, three traffic lights 458, 460, 462 are shown within the traffic intersection. In this example, the traffic light 458 is associated with the lane 402, and given the structure of the intersection and the shape and type of the border segments proximate the lane 402, the lane 402 is understood to be a right turn lane. Similarly, the traffic light 462 is associated with the lane 406, and given the structure of the intersection and the shape and type of the border segments proximate the lane 406, the lane 406 is understood to be a left turn lane.
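The traffic signal attributes enumerate naturally. The sketch below mirrors the example type and state values from the text as Python enums; the class layout and the sample instance (including its placeholder coordinates) are assumptions for illustration.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Tuple

class SignalType(Enum):
    VERTICAL_THREE = "vertical three"
    VERTICAL_THREE_LEFT_ARROW = "vertical three left arrow"
    HORIZONTAL_THREE = "horizontal three"
    RIGHT_ARROW = "right arrow"

class SignalState(Enum):
    GREEN = "green"
    GREEN_ARROW = "green arrow"
    YELLOW = "yellow"
    BLINKING_YELLOW = "blinking yellow"
    RED = "red"

@dataclass
class TrafficSignal:
    location: Tuple[float, float]  # geographical location; placeholder values below
    signal_type: SignalType
    state: SignalState

# Hypothetical instance standing in for traffic light 458 of FIG. 4.
light_458 = TrafficSignal((0.0, 0.0), SignalType.VERTICAL_THREE, SignalState.RED)
```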
- FIG. 5 shows an example two-dimensional representation of a roundabout 500 as represented within a detailed map format for use with the autonomous vehicle 200 of FIG. 2. Maneuvering the autonomous vehicle 200 through the roundabout 500 is greatly aided by the use of lane segments, border segments, and stop lines. For example, medians 502, 503, 504 and center circle 505 can be identified using borderpoints and border segments and can be associated with driving rules as “impassable” areas of the roundabout 500. In another example, stop lines 506, 508, 510 can be used to indicate to the autonomous vehicle 200 the exact location where the autonomous vehicle 200 should stop before entering the roundabout 500. As shown in FIG. 5, the stop line 510 indicates a position to the right of and below the nearest waypoint within the lane 512. Using the stop line 510 to position the autonomous vehicle 200 at the entrance to the roundabout 500 is much closer to how a driver would operate a vehicle when compared to stopping the autonomous vehicle 200 at the final waypoint within the lane 512.
- The foregoing description relates to what are presently considered to be the most practical embodiments. It is to be understood, however, that the disclosure is not to be limited to these embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The scope of the claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
Claims (20)
1. A computer-readable map format, comprising:
a lane segment; and
a border segment;
wherein a distance to an edge of the lane segment can be determined at a location along the lane segment by measuring a distance between the location and a portion of the border segment closest to the location.
2. The map format of claim 1, wherein the lane segment is formed from a plurality of waypoints and wherein information associated with each waypoint includes at least one of a geographical location and a lane speed and a lane direction.
3. The map format of claim 1, wherein the border segment is formed from a plurality of borderpoints and wherein information associated with each borderpoint includes at least a geographical location and a border type and a border color.
4. The map format of claim 3, wherein information associated with the lane segment includes a driving rule and wherein the driving rule is based at least in part on the border type and the border color associated with a proximate one of the plurality of borderpoints.
5. The map format of claim 3, wherein the border type includes at least one of a curb, a single solid line, a double solid line, a single dashed line, a combined dashed line and solid line, and no line.
6. The map format of claim 3, wherein the border color includes at least one of yellow, white, and unknown.
7. The map format of claim 1, further comprising:
a stop line associated with an end of the lane segment, wherein information associated with the stop line includes a geographical location, the geographical location representing a position where a vehicle must stop before a traffic intersection.
8. The map format of claim 1, further comprising:
a traffic signal, wherein information associated with the traffic signal includes a geographical location and a traffic signal type and a traffic signal state.
9. The map format of claim 8, wherein the traffic signal type includes information regarding structure and orientation for at least one of a traffic light and a traffic sign.
10. The map format of claim 9, wherein the traffic signal type is a traffic light and the traffic signal state includes at least one of green, green arrow, yellow, blinking yellow, and red.
11. A computer-readable map format, comprising:
a lane segment; and
a plurality of border segments;
wherein a lane width of the lane segment can be determined at a location along the lane segment by measuring a distance between at least two of the plurality of border segments positioned proximate to and on opposite sides of the lane segment.
12. The map format of claim 11, wherein the lane segment is formed from a plurality of waypoints and wherein information associated with each waypoint includes at least one of a geographical location and a lane speed and a lane direction.
13. The map format of claim 11, wherein the border segment is formed from a plurality of borderpoints and wherein information associated with each borderpoint includes at least a geographical location and a border type and a border color.
14. The map format of claim 13, wherein information associated with the lane segment includes a driving rule and wherein the driving rule is based at least in part on the border type and the border color associated with a proximate one of the plurality of borderpoints.
15. The map format of claim 13, wherein the border type includes at least one of a curb, a single solid line, a double solid line, a single dashed line, a combined dashed line and solid line, and no line.
16. The map format of claim 13, wherein the border color includes at least one of yellow, white, and unknown.
17. The map format of claim 11, further comprising:
a stop line associated with an end of the lane segment, wherein information associated with the stop line includes a geographical location, the geographical location representing a position where a vehicle must stop before a traffic intersection.
18. The map format of claim 11, further comprising:
a traffic signal, wherein information associated with the traffic signal includes a geographical location and a traffic signal type and a traffic signal state.
19. The map format of claim 18, wherein the traffic signal type includes information regarding structure and orientation for at least one of a traffic light and a traffic sign.
20. The map format of claim 19, wherein the traffic signal type is a traffic light and the traffic signal state includes at least one of green, green arrow, yellow, blinking yellow, and red.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/265,370 US20150316386A1 (en) | 2014-04-30 | 2014-04-30 | Detailed map format for autonomous driving |
US14/301,079 US20150316387A1 (en) | 2014-04-30 | 2014-06-10 | Detailed map format for autonomous driving |
JP2016564950A JP2017516135A (en) | 2014-04-30 | 2015-04-23 | Detailed map format for autonomous driving |
DE112015002037.3T DE112015002037T5 (en) | 2014-04-30 | 2015-04-23 | Detailed map format for autonomous driving |
PCT/US2015/027347 WO2015167931A1 (en) | 2014-04-30 | 2015-04-23 | Detailed map format for autonomous driving |
EP15166003.2A EP2940427A1 (en) | 2014-04-30 | 2015-04-30 | Detailed map format for autonomous driving |
US15/155,313 US10118614B2 (en) | 2014-04-30 | 2016-05-16 | Detailed map format for autonomous driving |
US15/176,903 US9921585B2 (en) | 2014-04-30 | 2016-06-08 | Detailed map format for autonomous driving |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/265,370 US20150316386A1 (en) | 2014-04-30 | 2014-04-30 | Detailed map format for autonomous driving |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/301,079 Continuation-In-Part US20150316387A1 (en) | 2014-04-30 | 2014-06-10 | Detailed map format for autonomous driving |
US15/155,313 Continuation US10118614B2 (en) | 2014-04-30 | 2016-05-16 | Detailed map format for autonomous driving |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150316386A1 true US20150316386A1 (en) | 2015-11-05 |
Family
ID=53200295
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/265,370 Abandoned US20150316386A1 (en) | 2014-04-30 | 2014-04-30 | Detailed map format for autonomous driving |
US15/155,313 Active US10118614B2 (en) | 2014-04-30 | 2016-05-16 | Detailed map format for autonomous driving |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/155,313 Active US10118614B2 (en) | 2014-04-30 | 2016-05-16 | Detailed map format for autonomous driving |
Country Status (4)
Country | Link |
---|---|
US (2) | US20150316386A1 (en) |
JP (1) | JP2017516135A (en) |
DE (1) | DE112015002037T5 (en) |
WO (1) | WO2015167931A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107438754A (en) | 2015-02-10 | 2017-12-05 | 御眼视觉技术有限公司 | Sparse map for autonomous vehicle navigation |
JP6835499B2 (en) * | 2016-07-29 | 2021-02-24 | 株式会社ゼンリン | Data structure of control system and map data used in control system |
US10712746B2 (en) * | 2016-08-29 | 2020-07-14 | Baidu Usa Llc | Method and system to construct surrounding environment for autonomous vehicles to make driving decisions |
JP6469066B2 (en) * | 2016-11-17 | 2019-02-13 | 株式会社ゼンリン | Driving support system |
KR102585219B1 (en) * | 2016-11-29 | 2023-10-05 | 삼성전자주식회사 | Device and method to control speed of vehicle |
US10380890B2 (en) * | 2017-02-08 | 2019-08-13 | Baidu Usa Llc | Autonomous vehicle localization based on walsh kernel projection technique |
US20180347993A1 (en) * | 2017-05-31 | 2018-12-06 | GM Global Technology Operations LLC | Systems and methods for verifying road curvature map data |
JP7081117B2 (en) * | 2017-11-06 | 2022-06-07 | いすゞ自動車株式会社 | Steering control device and steering control method |
KR20190072834A (en) * | 2017-12-18 | 2019-06-26 | 삼성전자주식회사 | Method and device to control longitudinal velocity of vehicle |
CN112033420B (en) * | 2019-06-03 | 2024-06-18 | 北京京东叁佰陆拾度电子商务有限公司 | Lane map construction method and device |
JP7414683B2 (en) * | 2020-09-29 | 2024-01-16 | 日立Astemo株式会社 | Own vehicle position estimation device and own vehicle position estimation method |
US11620836B2 (en) | 2020-12-04 | 2023-04-04 | Argo AI, LLC | Light emitting diode flicker mitigation |
Family Cites Families (132)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3771096A (en) * | 1969-04-03 | 1973-11-06 | V Walter | Video and audio passing and lane changing signaling system for vehicles |
US3613073A (en) | 1969-05-14 | 1971-10-12 | Eugene Emerson Clift | Traffic control system |
DE3532527A1 (en) | 1985-09-12 | 1987-03-19 | Robot Foto Electr Kg | DEVICE FOR PHOTOGRAPHIC MONITORING OF CROSSINGS |
US4775865A (en) | 1985-12-16 | 1988-10-04 | E-Lited Limited, A California Limited Partnership | Emergency vehicle warning and traffic control system |
US4704610A (en) | 1985-12-16 | 1987-11-03 | Smith Michel R | Emergency vehicle warning and traffic control system |
DE3727562C2 (en) | 1987-08-19 | 1993-12-09 | Robot Foto Electr Kg | Traffic monitoring device |
US5278554A (en) | 1991-04-05 | 1994-01-11 | Marton Louis L | Road traffic control system with alternating nonstop traffic flow |
US6553130B1 (en) * | 1993-08-11 | 2003-04-22 | Jerome H. Lemelson | Motor vehicle warning and control system and method |
US5798949A (en) | 1995-01-13 | 1998-08-25 | Kaub; Alan Richard | Traffic safety prediction model |
US6405132B1 (en) * | 1997-10-22 | 2002-06-11 | Intelligent Technologies International, Inc. | Accident avoidance system |
US7085637B2 (en) * | 1997-10-22 | 2006-08-01 | Intelligent Technologies International, Inc. | Method and system for controlling a vehicle |
US7418346B2 (en) * | 1997-10-22 | 2008-08-26 | Intelligent Technologies International, Inc. | Collision avoidance methods and systems |
US7629899B2 (en) * | 1997-10-22 | 2009-12-08 | Intelligent Technologies International, Inc. | Vehicular communication arrangement and method |
US7202776B2 (en) * | 1997-10-22 | 2007-04-10 | Intelligent Technologies International, Inc. | Method and system for detecting objects external to a vehicle |
US7610146B2 (en) * | 1997-10-22 | 2009-10-27 | Intelligent Technologies International, Inc. | Vehicle position determining system and method |
US7426437B2 (en) * | 1997-10-22 | 2008-09-16 | Intelligent Technologies International, Inc. | Accident avoidance systems and methods |
US7295925B2 (en) * | 1997-10-22 | 2007-11-13 | Intelligent Technologies International, Inc. | Accident avoidance systems and methods |
US6720920B2 (en) * | 1997-10-22 | 2004-04-13 | Intelligent Technologies International Inc. | Method and arrangement for communicating between vehicles |
US7912645B2 (en) * | 1997-10-22 | 2011-03-22 | Intelligent Technologies International, Inc. | Information transfer arrangement and method for vehicles |
US7110880B2 (en) * | 1997-10-22 | 2006-09-19 | Intelligent Technologies International, Inc. | Communication method and arrangement |
US6526352B1 (en) * | 2001-07-19 | 2003-02-25 | Intelligent Technologies International, Inc. | Method and arrangement for mapping a road |
US5712618A (en) * | 1995-08-29 | 1998-01-27 | Mckenna; Michael R. | Method and apparatus for an automatic signaling device |
JPH09325913A (en) | 1996-06-05 | 1997-12-16 | Toshiba Corp | Semiconductor memory |
US5873674A (en) | 1996-12-05 | 1999-02-23 | Hohl; Barney K. | Roadway safety warning system and method of making same |
US5801646A (en) | 1997-08-22 | 1998-09-01 | Pena; Martin R. | Traffic alert system and method for its use |
US5926126A (en) * | 1997-09-08 | 1999-07-20 | Ford Global Technologies, Inc. | Method and system for detecting an in-path target obstacle in front of a vehicle |
JP3500928B2 (en) | 1997-09-17 | 2004-02-23 | トヨタ自動車株式会社 | Map data processing device, map data processing method, and map data processing system |
US8060308B2 (en) * | 1997-10-22 | 2011-11-15 | Intelligent Technologies International, Inc. | Weather monitoring techniques |
US8965677B2 (en) * | 1998-10-22 | 2015-02-24 | Intelligent Technologies International, Inc. | Intra-vehicle information conveyance system and method |
US8255144B2 (en) * | 1997-10-22 | 2012-08-28 | Intelligent Technologies International, Inc. | Intra-vehicle information conveyance system and method |
US20090043506A1 (en) * | 1997-10-22 | 2009-02-12 | Intelligent Technologies International, Inc. | Method and System for Controlling Timing of Vehicle Transmissions |
US8000897B2 (en) * | 1997-10-22 | 2011-08-16 | Intelligent Technologies International, Inc. | Intersection collision avoidance techniques |
US8209120B2 (en) * | 1997-10-22 | 2012-06-26 | American Vehicular Sciences Llc | Vehicular map database management techniques |
US20080147253A1 (en) * | 1997-10-22 | 2008-06-19 | Intelligent Technologies International, Inc. | Vehicular Anticipatory Sensor System |
US7796081B2 (en) * | 1997-10-22 | 2010-09-14 | Intelligent Technologies International, Inc. | Combined imaging and distance monitoring for vehicular applications |
US10358057B2 (en) * | 1997-10-22 | 2019-07-23 | American Vehicular Sciences Llc | In-vehicle signage techniques |
US20080154629A1 (en) * | 1997-10-22 | 2008-06-26 | Intelligent Technologies International, Inc. | Vehicle Speed Control Method and Arrangement |
US7791503B2 (en) * | 1997-10-22 | 2010-09-07 | Intelligent Technologies International, Inc. | Vehicle to infrastructure information conveyance system and method |
JP3869108B2 (en) * | 1998-02-23 | 2007-01-17 | 株式会社小松製作所 | Unmanned vehicle interference prediction apparatus and unmanned vehicle guided traveling method |
DE19882276D2 (en) | 1998-02-27 | 2000-05-18 | Mitsubishi Int Gmbh | Traffic management system |
US6202482B1 (en) | 1998-03-23 | 2001-03-20 | Lehighton Electronics, Inc. | Method and apparatus for testing of sheet material |
US8630795B2 (en) * | 1999-03-11 | 2014-01-14 | American Vehicular Sciences Llc | Vehicle speed control method and arrangement |
JP2000302055A (en) * | 1999-04-20 | 2000-10-31 | Honda Motor Co Ltd | Traffic lane followup control device |
US6232889B1 (en) | 1999-08-05 | 2001-05-15 | Peter Apitz | System and method for signal light preemption and vehicle tracking |
AUPQ281299A0 (en) | 1999-09-14 | 1999-10-07 | Locktronic Systems Pty. Ltd. | Improvements in image recording apparatus |
US6317058B1 (en) | 1999-09-15 | 2001-11-13 | Jerome H. Lemelson | Intelligent traffic control and warning system and method |
EP1092949B1 (en) | 1999-09-29 | 2007-01-17 | Matsushita Electric Industrial Co., Ltd | Route selection method and system |
DE10036042B4 (en) * | 2000-07-25 | 2004-12-16 | Daimlerchrysler Ag | Multisensorial lane assignment |
JP4791649B2 (en) * | 2001-05-07 | 2011-10-12 | 株式会社ゼンリン | Electronic map data, display control device and computer program |
US20030016143A1 (en) | 2001-07-23 | 2003-01-23 | Ohanes Ghazarian | Intersection vehicle collision avoidance system |
JP4023201B2 (en) | 2002-04-25 | 2007-12-19 | アイシン・エィ・ダブリュ株式会社 | Navigation device |
US9007197B2 (en) * | 2002-05-20 | 2015-04-14 | Intelligent Technologies International, Inc. | Vehicular anticipatory sensor system |
US8068036B2 (en) | 2002-07-22 | 2011-11-29 | Ohanes Ghazarian | Intersection vehicle collision avoidance system |
US7433889B1 (en) | 2002-08-07 | 2008-10-07 | Navteq North America, Llc | Method and system for obtaining traffic sign data using navigation systems |
US9341485B1 (en) * | 2003-06-19 | 2016-05-17 | Here Global B.V. | Method and apparatus for representing road intersections |
ATE540289T1 (en) | 2003-07-16 | 2012-01-15 | Navteq North America Llc | DRIVER ASSISTANCE SYSTEM OF A MOTOR VEHICLE |
US7663505B2 (en) | 2003-12-24 | 2010-02-16 | Publicover Mark W | Traffic management device and system |
US7482916B2 (en) * | 2004-03-15 | 2009-01-27 | Anita Au | Automatic signaling systems for vehicles |
JP4291741B2 (en) * | 2004-06-02 | 2009-07-08 | トヨタ自動車株式会社 | Lane departure warning device |
WO2006012696A2 (en) | 2004-08-04 | 2006-02-09 | Speedalert Pty Ltd | An information apparatus for an operator of a land or water based motor driven conveyance |
JP2006189325A (en) * | 2005-01-06 | 2006-07-20 | Aisin Aw Co Ltd | Present location information management device of vehicle |
US7899617B2 (en) | 2005-02-17 | 2011-03-01 | Denso Corporation | Navigation system providing route guidance in multi-lane road according to vehicle lane position |
JP4534838B2 (en) | 2005-03-30 | 2010-09-01 | 株式会社デンソー | Navigation device and program for navigation device |
JP4742285B2 (en) * | 2005-09-20 | 2011-08-10 | 株式会社ゼンリン | MAP INFORMATION CREATION DEVICE AND METHOD, AND PROGRAM |
JP5075331B2 (en) * | 2005-09-30 | 2012-11-21 | アイシン・エィ・ダブリュ株式会社 | Map database generation system |
US7400236B2 (en) * | 2005-10-21 | 2008-07-15 | Gm Global Technology Operations, Inc. | Vehicular lane monitoring system utilizing front and rear cameras |
MXNL05000085A (en) | 2005-10-26 | 2007-04-25 | Azael Flores Rendon | Quick traffic turning system. |
RU2008138562A (en) | 2006-02-27 | 2010-04-10 | ВОО ДЗЕОН ГРИН Ко., Лтд. (KR) | INTEGRATED LIGHT, SIGN AND INFORMATION DISPLAY |
US20090135024A1 (en) | 2006-03-17 | 2009-05-28 | Park Jin-Gu | Display control system of traffic light and display method |
JP4702149B2 (en) * | 2006-04-06 | 2011-06-15 | 株式会社日立製作所 | Vehicle positioning device |
US7477988B2 (en) | 2006-05-16 | 2009-01-13 | Navteq North America, Llc | Dual road geometry representation for position and curvature-heading |
JP4687563B2 (en) * | 2006-05-23 | 2011-05-25 | 株式会社デンソー | Lane mark recognition device for vehicles |
US7468680B2 (en) | 2006-06-24 | 2008-12-23 | John Heffernan | Traffic light safety zone |
US9302678B2 (en) | 2006-12-29 | 2016-04-05 | Robotic Research, Llc | Robotic driving system |
JP5194452B2 (en) * | 2007-01-10 | 2013-05-08 | 朝日航洋株式会社 | Road data generation method, apparatus and program |
US9460619B2 (en) | 2007-01-17 | 2016-10-04 | The Boeing Company | Methods and systems for controlling traffic flow |
TWI326859B (en) | 2007-03-30 | 2010-07-01 | Ind Tech Res Inst | System and method for intelligent traffic control using wireless sensor and actuator networks |
JP4561769B2 (en) | 2007-04-27 | 2010-10-13 | アイシン・エィ・ダブリュ株式会社 | Route guidance system and route guidance method |
US7772996B2 (en) | 2007-05-25 | 2010-08-10 | Spot Devices, Inc. | Alert and warning system and method |
JP2009015504A (en) | 2007-07-03 | 2009-01-22 | Aisin Aw Co Ltd | Traffic restriction position detection device, traffic restriction position detection method and computer program |
JP2009053231A (en) * | 2007-08-23 | 2009-03-12 | Aisin Aw Co Ltd | Road information generating device, road information generating method, and program |
JP5227065B2 (en) | 2008-01-25 | 2013-07-03 | 株式会社岩根研究所 | 3D machine map, 3D machine map generation device, navigation device and automatic driving device |
DE102008010968A1 (en) | 2008-02-25 | 2009-09-17 | Robert Bosch Gmbh | Display of a relevant traffic sign or a relevant traffic facility |
JP5359085B2 (en) * | 2008-03-04 | 2013-12-04 | 日産自動車株式会社 | Lane maintenance support device and lane maintenance support method |
JP4623145B2 (en) | 2008-06-16 | 2011-02-02 | トヨタ自動車株式会社 | Driving assistance device |
US8311283B2 (en) * | 2008-07-06 | 2012-11-13 | Automotive Research&Testing Center | Method for detecting lane departure and apparatus thereof |
JP2010019759A (en) * | 2008-07-11 | 2010-01-28 | Mazda Motor Corp | Traveling lane detector for vehicle |
US8099213B2 (en) * | 2008-07-18 | 2012-01-17 | GM Global Technology Operations LLC | Road-edge detection |
JP5353097B2 (en) | 2008-07-22 | 2013-11-27 | 朝日航洋株式会社 | Road network data generation device, intersection lane generation device, and method and program thereof |
US20100020170A1 (en) * | 2008-07-24 | 2010-01-28 | Higgins-Luthman Michael J | Vehicle Imaging System |
US8121749B1 (en) | 2008-09-25 | 2012-02-21 | Honeywell International Inc. | System for integrating dynamically observed and static information for route planning in a graph based planner |
DE102009005920A1 (en) | 2009-01-23 | 2010-07-29 | Hella Kgaa Hueck & Co. | Method and device for controlling at least one traffic light system of a pedestrian crossing |
US8150620B2 (en) | 2009-04-14 | 2012-04-03 | Alpine Electronics, Inc. | Route search method and apparatus for navigation system utilizing map data of XML format |
US20110006915A1 (en) | 2009-07-13 | 2011-01-13 | Sower Charles D | Turn/no turn on red traffic light signal |
US20110080303A1 (en) | 2009-09-01 | 2011-04-07 | Goldberg Allen | Computerized traffic signal system |
US8773281B2 (en) | 2009-09-15 | 2014-07-08 | Ohanes D. Ghazarian | Intersection vehicle collision avoidance system |
JP5135321B2 (en) | 2009-11-13 | 2013-02-06 | 株式会社日立製作所 | Autonomous traveling device |
US20110182473A1 (en) | 2010-01-28 | 2011-07-28 | American Traffic Solutions, Inc. of Kansas | System and method for video signal sensing using traffic enforcement cameras |
US9254781B2 (en) | 2010-02-02 | 2016-02-09 | Craig David Applebaum | Emergency vehicle warning device and system |
US8525700B2 (en) | 2010-03-02 | 2013-09-03 | Mohammadreza Rejali | Control system and a method for information display systems for vehicles on cross roads |
DE112010005501T5 (en) | 2010-04-19 | 2013-03-14 | Toyota Jidosha Kabushiki Kaisha | Vehicle control unit |
DE102010049086A1 (en) * | 2010-10-21 | 2012-04-26 | Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) | Method for assessing driver attention |
DE102010049087A1 (en) * | 2010-10-21 | 2012-04-26 | Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) | Method for assessing driver attention |
US9685080B2 (en) | 2010-11-05 | 2017-06-20 | International Business Machines Corporation | Traffic light preemption management system |
DE102011010846B4 (en) | 2011-02-10 | 2014-02-06 | Audi Ag | Method and system for visual connection-independent data transmission |
WO2012114382A1 (en) | 2011-02-24 | 2012-08-30 | 三菱電機株式会社 | Navigation device, advisory speed arithmetic device and advisory speed presentation device |
RU2454726C1 (en) | 2011-03-03 | 2012-06-27 | Игорь Юрьевич Мацур | Method of controlling movement of vehicles and apparatus for realising said method |
DE102011076763A1 (en) | 2011-05-31 | 2012-12-06 | Robert Bosch Gmbh | Driver assistance system and method for operating a driver assistance system |
KR20130007754A (en) | 2011-07-11 | 2013-01-21 | 한국전자통신연구원 | Apparatus and method for controlling vehicle at autonomous intersection |
JP5652364B2 (en) * | 2011-09-24 | 2015-01-14 | 株式会社デンソー | Vehicle behavior control device |
WO2013060925A1 (en) | 2011-10-28 | 2013-05-02 | Nokia Corporation | Method and apparatus for constructing a road network based on point-of-interest (poi) information |
US8712624B1 (en) | 2012-04-06 | 2014-04-29 | Google Inc. | Positioning vehicles to improve quality of observations at intersections |
US8761991B1 (en) | 2012-04-09 | 2014-06-24 | Google Inc. | Use of uncertainty regarding observations of traffic intersections to modify behavior of a vehicle |
JP5505453B2 (en) * | 2012-04-26 | 2014-05-28 | 株式会社デンソー | Vehicle behavior control device |
US8527199B1 (en) * | 2012-05-17 | 2013-09-03 | Google Inc. | Automatic collection of quality control statistics for maps used in autonomous driving |
US8489316B1 (en) * | 2012-06-28 | 2013-07-16 | Delphi Technologies, Inc. | Map matching method for vehicle safety warning system |
US8855904B1 (en) | 2012-10-10 | 2014-10-07 | Google Inc. | Use of position logs of vehicles to determine presence and behaviors of traffic controls |
JPWO2014064805A1 (en) * | 2012-10-25 | 2016-09-05 | 日産自動車株式会社 | Vehicle travel support device |
DE102012111740A1 (en) | 2012-12-03 | 2014-06-05 | Continental Teves Ag & Co. Ohg | Method for supporting a traffic light phase assistant detecting a traffic light of a vehicle |
US8917190B1 (en) | 2013-01-23 | 2014-12-23 | Stephen Waller Melvin | Method of restricting turns at vehicle intersections |
US20140257659A1 (en) * | 2013-03-11 | 2014-09-11 | Honda Motor Co., Ltd. | Real time risk assessments using risk functions |
AT514754B1 (en) * | 2013-09-05 | 2018-06-15 | Avl List Gmbh | Method and device for optimizing driver assistance systems |
US9881220B2 (en) | 2013-10-25 | 2018-01-30 | Magna Electronics Inc. | Vehicle vision system utilizing communication system |
EP3514032B1 (en) * | 2013-12-04 | 2024-02-07 | Mobileye Vision Technologies Ltd. | Adjusting velocity of a vehicle for a curve |
US9091558B2 (en) * | 2013-12-23 | 2015-07-28 | Automotive Research & Testing Center | Autonomous driver assistance system and autonomous driving method thereof |
DE102014205953A1 (en) | 2014-03-31 | 2015-10-01 | Robert Bosch Gmbh | Method for analyzing a traffic environment situation of a vehicle |
WO2015156818A1 (en) * | 2014-04-11 | 2015-10-15 | Nissan North America, Inc. | Autonomous vehicle control system |
CN104036275B (en) * | 2014-05-22 | 2017-11-28 | 东软集团股份有限公司 | Method and device for detecting a target object in a vehicle blind zone |
US9830517B2 (en) * | 2014-06-19 | 2017-11-28 | Toyota Motor Engineering & Manufacturing North America, Inc. | Road branch detection and path selection for lane centering |
US10507807B2 (en) | 2015-04-28 | 2019-12-17 | Mobileye Vision Technologies Ltd. | Systems and methods for causing a vehicle response based on traffic light detection |
JP6572847B2 (en) * | 2016-08-10 | 2019-09-11 | トヨタ自動車株式会社 | Automated driving system |
KR102057532B1 (en) * | 2016-10-12 | 2019-12-20 | 한국전자통신연구원 | Device for sharing and learning driving environment data for improving the intelligence judgments of autonomous vehicle and method thereof |
2014
- 2014-04-30 US US14/265,370 patent/US20150316386A1/en not_active Abandoned
2015
- 2015-04-23 JP JP2016564950A patent/JP2017516135A/en active Pending
- 2015-04-23 WO PCT/US2015/027347 patent/WO2015167931A1/en active Application Filing
- 2015-04-23 DE DE112015002037.3T patent/DE112015002037T5/en not_active Withdrawn
2016
- 2016-05-16 US US15/155,313 patent/US10118614B2/en active Active
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9405293B2 (en) * | 2014-05-30 | 2016-08-02 | Nissan North America, Inc | Vehicle trajectory optimization for autonomous vehicles |
US20150345959A1 (en) * | 2014-05-30 | 2015-12-03 | Nissan North America, Inc. | Vehicle trajectory optimization for autonomous vehicles |
US10262213B2 (en) * | 2014-12-16 | 2019-04-16 | Here Global B.V. | Learning lanes from vehicle probes |
US20160167582A1 (en) * | 2014-12-16 | 2016-06-16 | Here Global B.V. | Learning Lanes From Vehicle Probes |
US20180038701A1 (en) * | 2015-03-03 | 2018-02-08 | Pioneer Corporation | Route search device, control method, program and storage medium |
US10520324B2 (en) | 2015-03-03 | 2019-12-31 | Pioneer Corporation | Route search device, control method, program and storage medium |
US9699289B1 (en) * | 2015-12-09 | 2017-07-04 | Toyota Motor Engineering & Manufacturing North America, Inc. | Dynamic vehicle automation level availability indication system and method |
US20170171375A1 (en) * | 2015-12-09 | 2017-06-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Dynamic vehicle automation level availability indication system and method |
US20180345963A1 (en) * | 2015-12-22 | 2018-12-06 | Aisin Aw Co., Ltd. | Autonomous driving assistance system, autonomous driving assistance method, and computer program |
US10703362B2 (en) * | 2015-12-22 | 2020-07-07 | Aisin Aw Co., Ltd. | Autonomous driving assistance system, automated driving assistance method, and computer program |
US20180024565A1 (en) * | 2016-07-21 | 2018-01-25 | Mobileye Vision Technologies Ltd. | Navigating a vehicle using a crowdsourced sparse map |
US10558222B2 (en) * | 2016-07-21 | 2020-02-11 | Mobileye Vision Technologies Ltd. | Navigating a vehicle using a crowdsourced sparse map |
US12013250B2 (en) | 2016-11-26 | 2024-06-18 | Thinkware Corporation | Apparatus, method, computer program, and computer readable recording medium for route guidance |
US11156468B2 (en) * | 2016-11-26 | 2021-10-26 | Thinkware Corporation | Device, method, computer program, and computer-readable recording medium for route guidance |
US11920942B2 (en) | 2016-11-26 | 2024-03-05 | Thinkware Corporation | Device, method, computer program, and computer readable-recording medium for route guidance |
US20180364720A1 (en) * | 2017-06-14 | 2018-12-20 | Robert Bosch Gmbh | Method for creating a digital map for an automated vehicle |
US11163308B2 (en) * | 2017-06-14 | 2021-11-02 | Robert Bosch Gmbh | Method for creating a digital map for an automated vehicle |
CN110869702A (en) * | 2017-07-12 | 2020-03-06 | 维宁尔瑞典公司 | Driver assistance system and method |
EP3428577A1 (en) * | 2017-07-12 | 2019-01-16 | Veoneer Sweden AB | A driver assistance system and method |
US12123722B2 (en) | 2017-07-12 | 2024-10-22 | Arriver Software Ab | Driver assistance system and method |
WO2019025112A1 (en) * | 2017-07-12 | 2019-02-07 | Veoneer Sweden Ab | A driver assistance system and method |
US11550330B2 (en) | 2017-07-12 | 2023-01-10 | Arriver Software Ab | Driver assistance system and method |
US10809728B2 (en) * | 2017-09-15 | 2020-10-20 | Here Global B.V. | Lane-centric road network model for navigation |
US20190086928A1 (en) * | 2017-09-15 | 2019-03-21 | Here Global B.V. | Lane-centric road network model for navigation |
US20200174115A1 (en) * | 2018-05-18 | 2020-06-04 | Zendar Inc. | Systems and methods for detecting objects |
CN112437885A (en) * | 2018-05-18 | 2021-03-02 | 甄达公司 | System and method for detecting an object |
US11668815B2 (en) * | 2018-05-18 | 2023-06-06 | Zendar Inc. | Systems and methods for detecting objects |
US11830365B1 (en) * | 2018-07-02 | 2023-11-28 | Smartdrive Systems, Inc. | Systems and methods for generating data describing physical surroundings of a vehicle |
US11727730B2 (en) | 2018-07-02 | 2023-08-15 | Smartdrive Systems, Inc. | Systems and methods for generating and providing timely vehicle event information |
US20240021081A1 (en) * | 2018-07-02 | 2024-01-18 | Smartdrive Systems, Inc. | Systems and methods for generating data describing physical surroundings of a vehicle |
US12008922B2 (en) | 2018-07-02 | 2024-06-11 | Smartdrive Systems, Inc. | Systems and methods for comparing driving performance for simulated driving |
US12170023B2 (en) * | 2018-07-02 | 2024-12-17 | Smartdrive Systems, Inc. | Systems and methods for generating data describing physical surroundings of a vehicle |
US11131552B2 (en) * | 2018-12-10 | 2021-09-28 | Toyota Jidosha Kabushiki Kaisha | Map generation system |
CN109934862A (en) * | 2019-02-22 | 2019-06-25 | 上海大学 | A binocular vision SLAM method combining point and line features |
US20230410536A1 (en) * | 2019-02-26 | 2023-12-21 | Tusimple, Inc. | Method and system for map construction |
US12190609B2 (en) * | 2019-02-26 | 2025-01-07 | Tusimple, Inc. | Method and system for map construction |
WO2020215254A1 (en) * | 2019-04-24 | 2020-10-29 | 深圳市大疆创新科技有限公司 | Lane line map maintenance method, electronic device and storage medium |
CN113405558A (en) * | 2020-02-29 | 2021-09-17 | 华为技术有限公司 | Construction method of automatic driving map and related device |
US20230019719A1 (en) * | 2021-06-29 | 2023-01-19 | Beijing Baidu Netcom Science Technology Co., Ltd. | Method and apparatus for constructing lane-level navigation map, device and storage medium |
US12097875B1 (en) | 2021-07-23 | 2024-09-24 | Apple Inc. | Map generation using locally captured sensor data |
Also Published As
Publication number | Publication date |
---|---|
WO2015167931A1 (en) | 2015-11-05 |
US20160257307A1 (en) | 2016-09-08 |
US10118614B2 (en) | 2018-11-06 |
JP2017516135A (en) | 2017-06-15 |
DE112015002037T5 (en) | 2017-03-09 |
Similar Documents
Publication | Title |
---|---|
US10118614B2 (en) | Detailed map format for autonomous driving |
US9921585B2 (en) | Detailed map format for autonomous driving |
US11113544B2 (en) | Method and apparatus providing information for driving vehicle |
US11029697B2 (en) | Systems and methods for vehicular navigation |
US10753750B2 (en) | System and method for mapping through inferences of observed objects |
CN110866433B (en) | Method and device for determining road markings |
US11131550B2 (en) | Method for generating road map for vehicle navigation and navigation device |
US9495602B2 (en) | Image and map-based detection of vehicles at intersections |
US10809076B2 (en) | Active driving map for self-driving road vehicle |
US20200286387A1 (en) | System and method for vehicle platooning |
US9576200B2 (en) | Background map format for autonomous driving |
US11378970B2 (en) | Visual localization support system |
US10942519B2 (en) | System and method for navigating an autonomous driving vehicle |
US10699571B2 (en) | High definition 3D mapping |
US20210110715A1 (en) | System and method for navigation with external display |
US12055410B2 (en) | Method for generating road map for autonomous vehicle navigation |
US20240037961A1 (en) | Systems and methods for detecting lanes using a segmented image and semantic context |
US12087063B2 (en) | Systems and methods for detecting traffic lights corresponding to a driving lane |
US11410356B2 (en) | Systems and methods for representing objects using a six-point bounding box |
CN109144052B (en) | Navigation system for autonomous vehicle and method thereof |
US12073633B2 (en) | Systems and methods for detecting traffic lights of driving lanes using a camera and multiple models |
WO2022215325A1 (en) | Vehicle control device and vehicle control method |
KR20240068861A (en) | Method and device with autonomous driving plan |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: DELP, MICHAEL J.; REEL/FRAME: 032834/0761; Effective date: 20140506 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |