US11530931B2 - System for creating a vehicle surroundings model - Google Patents
System for creating a vehicle surroundings model
- Publication number
- US11530931B2 (application US16/441,247)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- information
- road
- merged
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/367—Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/51—Relative positioning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3841—Data obtained from two or more sources, e.g. probe vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/14—Receivers specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/11—Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
- G06F17/13—Differential equations
Definitions
- V2X vehicle-to-everything communication
- Infrastructure objects may include traffic lights, traffic signs, mobile and stationary road edge markings, buildings, signs and billboards, for example.
- information from a digital map also plays an increasingly important role as a data source.
- information about road topology, speed limits, traffic signs and the gradient and curvature of roads can be stored in digital maps.
- So-called HD maps contain information about the course of a road and additional data with very high precision.
- information that cannot be detected by conventional vehicle sensors can be stored in digital maps.
- the gradient and curvature of a road can be read out from the map in order to be able to automatically adjust the driving dynamics.
- DE 10 2014 111 126 A1 discloses a method for creating an environment map of the area surrounding a motor vehicle.
- An object in the surrounding area can be detected by means of a sensor device in/on the motor vehicle, wherein a position value describing a position of the object is determined by a control device of the motor vehicle on the basis of data from the sensor.
- the position value thereby ascertained is transferred to maps of the surroundings, wherein a vector between the object and a predetermined reference point of the motor vehicle is ascertained, forming a point of origin of a vehicle coordinate system.
- the vector determined in the vehicle coordinate system is transformed into a global coordinate system of the vehicle surroundings map and the position value in the vehicle surroundings map is determined on the basis of the transformed vector.
- US 2017/371349 A1 discloses a vehicle control device comprising a communication unit.
- the communication unit detects location information on the vehicle and can communicate with an external server and another vehicle.
- a processor controls the communication unit to receive map information from the external server and receive from the other vehicle location information about the other vehicle.
- the processor combines the detected location information on the vehicle and the received location information on the other vehicle with the received map information in order to control the vehicle on the basis of the combined information.
- The disclosure addresses the object of providing an improved system and an improved method for creating a model of the surroundings of a vehicle, with which position information in different formats can be merged into a model of the surroundings of the vehicle regardless of the density of the available information.
- a system defined in patent claim 1 for creating a model of the surroundings of a vehicle is proposed as the solution.
- This system for creating a model of the surroundings of a vehicle comprises at least one navigation unit, at least one interface and/or at least one sensor unit.
- the at least one navigation unit is equipped to provide information about the instantaneous position of the vehicle and information about at least one segment of road in front of the vehicle in space and time, wherein the navigation unit provides the information in a digital map format and/or in absolute position information.
- the at least one interface is equipped to communicate with at least one object to be merged in the surroundings of the vehicle, wherein the information received by the interface comprises absolute position information on the at least one object to be merged.
- the at least one sensor unit is equipped to detect at least one object to be merged in the surroundings of the vehicle, wherein the at least one sensor unit is additionally equipped to provide relative position information on the at least one object to be merged relative to the vehicle.
- The system is equipped to ascertain the road geometry of the segment of road in front of the vehicle by using the information about the segment of road in front of the vehicle made available by the at least one navigation unit. Based on the road geometry thereby ascertained, the system is equipped to merge the absolute position information and/or the relative position information on the at least one object to be merged with the information provided by the at least one navigation unit in the digital map format, to create a model of the surroundings of the vehicle.
- the information from a digital map such as, for example, the information with respect to the instantaneous position of the vehicle and information about the segment of road in front of the vehicle is referred to as information in a digital map format.
- Information from the various information sources (for example, the digital map in the vehicle, data from the cloud, V2X, and information from sensor units such as camera and radar) can be merged.
- With this merged information, it is possible to create a model of the surroundings that is filled from all these sources.
- Driver assistance systems and systems for driverless vehicles can access this merged model of the surroundings.
- the information from the various information sources can be merged even if the data density is relatively low, i.e., there are relatively few objects or points with known position information on the segment of road in front of the vehicle. Since the proposed system can ascertain the geometry of the segment of road in front of the vehicle, the information itself can be merged on the basis of the road geometry thereby ascertained even if the data density for that segment of road in front of the vehicle is relatively low.
- Relative position information can be provided by sensor units such as the camera unit, the radar unit and the ultrasonic sensor units, for example. Examples of such relative position information from the individual units include the distance and optionally the angle to an object in front of the vehicle (radar unit), the distance from traffic signs, traffic lights and pedestrians (camera unit), and the distance from vehicles in a parallel lane (ultrasonic sensor units).
- Absolute position information can be sent by objects such as traffic lights, traffic signs or the like, for example.
- One example of an application for merging information to create a model of the surroundings of a vehicle could be a traffic light as an object to be merged.
- the traffic light is detected by the sensor unit, for example, by a camera unit.
- the traffic light sends its position in absolute position information (e.g., in world coordinates) to the system via the interface.
- The information can be merged, for example, in two cases to create a uniform model of the surroundings: either the traffic light is already listed in the surroundings model and the detected traffic light is correctly associated with the existing object, or the detected object is not yet present in the surroundings model and is inserted into it by merging the information from the individual information sources.
- The system can be equipped to transform the relative or absolute position information of the object to be merged into a digital map format. Accordingly, a surroundings model can be created which is based on information in a digital map format. Additionally or alternatively, the system may be equipped to transform the relative or absolute position information on the object to be merged and the information in a digital map format into a predefined coordinate format. It is therefore also possible to create a surroundings model that conforms to another coordinate format.
- the system may be equipped to ascertain absolute position information from the relative position information on the at least one object to be merged.
- the system may be equipped to ascertain the absolute position information based on the distance and additional information with respect to the segment of road and/or the object to be merged if the relative position information is limited to the distance from the object to be merged.
- This additional information can be ascertained or detected, for example, by the steps listed further below in the description.
- the information in a digital map format may be information in a path/offset format.
- the digital map format may be, for example, a path/offset format according to the ADASIS protocol.
- Driver assistance systems in motor vehicles often do not have enough memory to store a large digital map. For this reason, one unit and/or system in the vehicle will usually have a relatively large memory unit.
- Such a unit may be the navigation unit of the vehicle, for example.
- a relevant detail can be read out of the map from the memory unit and transferred in a predefined format to one of the driver assistance systems. This transfer may take place, for example, over a vehicle bus system or by means of other technologies such as shared memory, for example.
- ADASIS is a known protocol for such information transfer and display. The information is not transmitted as raw data (such as complete geographic coordinates, for example) but instead in a special representation.
- the system may be equipped to determine an offset of the object to be merged by merging the information about the instantaneous position of the vehicle in a path/offset format and the absolute or relative position information of the object to be merged.
- the proposed system can merge relative or absolute position information with information in the path/offset format.
- the absolute position information of the at least one interface and/or the relative position information of the at least one sensor unit can be made available to the system.
- the navigation unit can provide information about the instantaneous position of the vehicle and about the segment of road in front of the vehicle in the path/offset format.
- The offset between the instantaneous vehicle position in the path/offset format and the relative or absolute position information of the object to be merged can be ascertained from the information provided.
- the system may be equipped to ascertain one or more geometry points, whose absolute position information and/or whose position in the digital map format is/are known, by using the information provided by the at least one navigation unit for ascertaining the geometry of the road.
- The system may be equipped to ascertain an offset of the object to be merged by using the geometry point(s) thereby ascertained. If there are numerous geometry points (high data density), a geometry point may in the simplest case be associated directly with the object to be merged; the geometry point at the shortest distance from the object to be merged may be used. In this case, the offset of the geometry point thereby ascertained may be used as the offset for the object to be merged. If the distance of the object to be merged from the nearest geometry point is greater than a predetermined threshold, there can be no direct association.
- In that case, a new geometry point that is closer to the object to be merged can be determined by interpolation (or extrapolation) between two or more geometry points.
- a linear method, a higher order polynomial or some other suitable method may be used for interpolation and/or extrapolation.
- the absolute position information of the geometry point thereby ascertained may be known in world coordinates, and the course of the offset may also be known. For this reason, the offset of the object to be merged may correspond to the offset of the geometry point.
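The following is a minimal sketch of this association step. It is illustrative only: the function name, the 5 m threshold and the flat (x, y) coordinates are assumptions, and the interpolation is the simple linear case named above.

```python
import math

def offset_for_object(obj, geometry_points, threshold=5.0):
    """Ascertain the offset of an object to be merged from geometry points.

    obj: (x, y) position of the object in a local Cartesian frame.
    geometry_points: list of (x, y, offset) tuples along the road.
    If the nearest geometry point lies within `threshold` meters, its
    offset is associated directly; otherwise a new point is formed by
    linear interpolation (t in [0, 1]) or extrapolation (t outside [0, 1])
    between the two geometry points nearest to the object.
    """
    def dist(p):
        return math.hypot(p[0] - obj[0], p[1] - obj[1])

    nearest = min(geometry_points, key=dist)
    if dist(nearest) <= threshold:
        return nearest[2]  # direct association (high data density)

    (x1, y1, o1), (x2, y2, o2) = sorted(geometry_points, key=dist)[:2]
    dx, dy = x2 - x1, y2 - y1
    # parameter of the projection of the object onto the line through both points
    t = ((obj[0] - x1) * dx + (obj[1] - y1) * dy) / (dx * dx + dy * dy)
    return o1 + t * (o2 - o1)  # interpolated/extrapolated offset

# Two sparse geometry points 20 m apart; object halfway between them:
print(offset_for_object((10.0, 2.0), [(0.0, 0.0, 100.0), (20.0, 0.0, 120.0)]))  # -> 110.0
```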
- the system may be equipped to ascertain a node point by using the information made available by the at least one navigation unit for ascertaining the geometry of the road such that the absolute position information on this node point and/or its position in the path/offset format are known.
- the system may be equipped to estimate the geometry of the segment of road between the object to be merged and the node point nearest to the object to be merged, wherein the system is further equipped to estimate the distance between the object to be merged and the node point based on the estimated geometry of the segment of road.
- the system may be equipped to ascertain an offset between the node point and the object to be merged based on the estimated distance.
- The system may be equipped to estimate the course of the segment of road based on information detected by the at least one sensor unit and/or based on information provided by the at least one interface.
- The information listed further below in the description may be included in the estimate, for example.
- the system may be equipped to ascertain whether the object to be merged is on the same path or in the same lane as the vehicle. In this way, there can be a correct association of path and lane of the object to be merged. If the object to be merged is on a different path or in a different lane, the offset must be corrected accordingly because each path has its own offset origin.
- the data from the digital map in the path/offset format and the data detected by the at least one sensor unit may be used for this purpose.
- the system may be equipped to ascertain the information in the digital map format of an object to be merged, whose absolute position information is known, by means of a relative displacement vector starting from the instantaneous absolute position information on the vehicle.
- A method for creating a surroundings model of a motor vehicle comprises the steps of: providing information about the instantaneous position of the vehicle and information about at least one segment of road in front of the vehicle in both time and space, wherein the information is supplied in a digital map format and/or in absolute position information; communicating with at least one object to be merged in the surroundings of the vehicle, wherein the received information comprises absolute position information on the at least one object to be merged, and/or detecting at least one object to be merged in the surroundings of the vehicle, wherein relative position information is supplied about the at least one object to be merged relative to the vehicle; ascertaining the geometry of a segment of road in front of the vehicle using the information about the segment of road in front of the vehicle, said information being supplied by the at least one navigation unit; and merging the absolute position information and/or the relative position information about the at least one object to be merged with the information supplied by the at least one navigation unit in the digital map format to create a vehicle surroundings model based on the road geometry thereby ascertained.
- the method may comprise the steps of: transforming the relative or absolute position information on the object to be merged into information in the digital map format and/or transforming the relative or absolute position information on the object to be merged and the information in the digital map format to a predefined coordinate format.
- Absolute position information can be ascertained from the relative position information on the at least one object to be merged.
- the absolute position information can be ascertained on the basis of the distance and additional information with respect to the segment of road and/or the object to be merged.
- One or more geometry points whose absolute position information and/or whose position is/are known in the digital map format can be ascertained using the information supplied by the at least one navigation unit for ascertaining the geometry of the road.
- the information in a digital map format may be information in a path/offset format.
- An offset of the object to be merged can be ascertained by merger of the information about the instantaneous position of the vehicle in the digital map format and the absolute or relative position information on the object to be merged.
- An offset of the object to be merged can be ascertained with the geometry point(s) that is ascertained.
- At least one node point whose absolute or relative position information and/or whose position in the digital map format is/are known can be ascertained by using the information supplied by the at least one navigation unit for ascertaining the geometry of the road.
- the distance between the object to be merged and the node point can be estimated based on the estimated geometry of the segment of road.
- an offset can be ascertained between the node point and the object to be merged.
- The course of the segment of road can be estimated based on information detected by the at least one sensor unit and/or based on information provided by the at least one interface.
- the information in the digital map format of an object to be merged whose absolute position information is known can be ascertained by means of a relative displacement vector starting from the instantaneous absolute position information on the vehicle.
- FIG. 1 shows a schematic diagram of a system for creating a surroundings model for a vehicle
- FIG. 2 shows an example of a view to illustrate a logic map format in path/offset format
- FIG. 3 shows an example of a diagram to illustrate the merger of information in the path/offset format with relative position information
- FIG. 4 shows another example of a diagram to illustrate the merger of information in the path/offset format with relative position information
- FIG. 5 shows an example of a diagram to illustrate the procedure in ascertaining absolute position information from the distance between the object to be merged and the vehicle;
- FIG. 6 shows an example of a diagram for ascertaining the offset of the object to be merged for the case when the geometry of the road is ascertained by using a plurality of geometry points;
- FIG. 7 shows an example of a diagram in which the geometry of the road is ascertained by using node points with known absolute position information
- FIG. 8 shows an example of a diagram to illustrate the determination of the offset of an object to be merged by means of node points
- FIG. 9 shows an example of a diagram in which the road runs with a curve between the vehicle, the node point and the object to be merged;
- FIG. 10 shows an example of an estimated road geometry and/or an estimated course of the road
- FIG. 11 shows an example of a diagram to illustrate how the position information is ascertained in the path/offset format on the example of a traffic light system with known absolute position information
- FIG. 12 shows an example of a diagram to illustrate the problems in merger of position information in the path/offset format with absolute position information in the case of a curved road course
- FIGS. 13 and 14 each show a diagram of a different possibility for merging data to create a vehicle surroundings model
- FIG. 15 shows a diagram to illustrate the advantages of a vehicle surroundings model with merged position information.
- FIG. 1 shows schematically a sensor unit 110 , a system 120 , a navigation unit 130 and an interface 140 , wherein the sensor unit 110 , the system 120 , the navigation unit 130 and the interface may be incorporated into a motor vehicle (not shown).
- the sensor unit 110 , the navigation unit 130 and the interface 140 may be connected to the system 120 , which is in turn connected to the navigation unit 130 .
- the sensor unit 110 may be, for example, a camera unit, a radar unit, a lidar unit or the like. However, the system 120 may also be connected to a plurality of sensor units 110 , i.e., the system 120 may be connected to a camera unit, a radar unit and a lidar unit.
- the sensor unit 110 supplies relative position information on an object to be merged (not shown) in the surroundings of the vehicle to the system 120 . If the sensor unit 110 is a camera unit, it may be a time-of-flight (TOF) camera unit. A time-of-flight camera can detect the surroundings of the vehicle in 3D based on the distance measurement method it carries out.
- a time-of-flight camera illuminates the surroundings of the vehicle with pulses of light, with the camera unit measuring the time needed by the light to travel to the object and back for each pixel. The required time is then used to determine the distance from the object detected.
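As a worked example of this distance measurement: the measured round-trip time corresponds to twice the distance, since the pulse travels to the object and back. A tiny sketch (function name and example value are illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object per pixel: the light pulse covers
    the distance twice (out and back), hence the factor 1/2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

print(round(tof_distance(100e-9), 2))  # a 100 ns round trip -> ~14.99 m
```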
- the sensor unit 110 can be additionally equipped to detect the course of a road border and/or a lane marking. Furthermore, the sensor unit 110 may be equipped to detect the width of the road.
- the navigation unit 130 is equipped to supply information about the instantaneous position of the vehicle and at least one segment of road in front of the vehicle in time and space based on position information on the vehicle and/or map information. This information can be supplied in a digital map format.
- the navigation unit 130 may be equipped accordingly to ascertain the instantaneous position of the motor vehicle based on a signal, in particular a GPS signal.
- The navigation unit 130 may access map data in a digital map format stored in a memory in the navigation unit 130, supplied in the form of an external data medium and/or from a cloud system.
- the map data may also contain information about the course of the road border and/or the course of the lane marking and/or the width of the road.
- The current vehicle position can be supplied by the navigation unit 130 in a digital map format.
- the map data may also include information about the geometry of the road and the topology of the segment of road in front of the vehicle.
- the interface 140 is equipped to communicate with at least one object to be merged in the surroundings of the vehicle.
- the information received by the interface 140 includes absolute position information on the at least one object to be merged.
- the interface 140 may also be an interface for the so-called “V2X” communication.
- V2X refers to the communication of a vehicle with objects. This expression thus includes communication of the vehicle with other vehicles, with infrastructure objects, but also with humans (pedestrians).
- Infrastructure objects may be, for example, traffic lights, traffic signs, mobile and stationary road surface borders, buildings, signs, billboards or the like.
- the system 120 is equipped to ascertain geometry points and/or node points with known absolute position information and/or with known position information in the path/offset format from the information supplied by the navigation unit 130 . With the geometry points and/or node points thereby ascertained, the system 120 can ascertain the geometry of the segment of road in front of the vehicle.
- the system is additionally equipped to merge the absolute position information and/or the relative position information on the at least one object to be merged with the information supplied by the at least one navigation unit 130 in a path/offset format based on the geometry of the road thereby ascertained in order to create a vehicle surroundings model.
- FIG. 2 shows a schematic diagram to illustrate the path/offset format for the position information about objects such as vehicles and traffic signs, nameplates and intersections in a digital map according to the ADASIS protocol.
- Roads are represented by paths and are labeled unambiguously by a path ID.
- the current vehicle position is defined by means of path/offset information.
- Position information originating from a sensor unit such as a camera unit, a radar unit or a lidar unit in the vehicle, for example, is usually given in relative position information.
- the camera or radar may thus indicate that a relevant object and/or an object to be merged (e.g., traffic sign or another vehicle) is located at a certain distance from the vehicle.
- data received via V2X technology is available as absolute position information (e.g., WGS84 world coordinates).
- Traffic lights, for example, can transmit their instantaneous switch status together with absolute position information in WGS84 format to the vehicle.
- An embodiment of a method for creating a surroundings model of a vehicle, which can be carried out by the system 120, for example, is described below with reference to FIGS. 3 through 11.
- merger of relative or absolute position information from various information sources with information in the path/offset format according to the ADASIS protocol will be explained.
- the position information from a digital map is referred to as information in the path/offset format or also as information in a digital map format.
- the path/offset format is an example of a digital map format.
- FIG. 3 shows an example of a diagram of merger of information in the path/offset format with relative position information.
- the ego vehicle 10 is driving on a road 12 .
- the path ID 8 was assigned to the road 12 in the digital map.
- the offset for path 8 has a linear increase in the direction of the dashed arrow P.
- a construction site 14 and a vehicle 16 in a lane 18 are detected by the camera unit and/or the radar unit (not shown) on the ego vehicle 10 .
- the distance from the construction site 14 and the vehicle 16 can be determined from the camera and radar information.
- it is known from the camera and radar information which lanes 18 and 20 the objects 14 and 16 are in.
- Offset_object = Offset_ego vehicle + ΔOffset
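A minimal sketch of this merge step under the simplest assumption, namely that the road is straight so the sensed longitudinal distance can be used directly as ΔOffset; the class and function names are illustrative, not part of the ADASIS protocol:

```python
from dataclasses import dataclass

@dataclass
class PathOffsetPosition:
    path_id: int   # unambiguous path ID of the road in the digital map
    offset: float  # offset along the path in meters

def merge_relative_detection(ego: PathOffsetPosition,
                             delta_offset: float, lane: int) -> dict:
    """Offset_object = Offset_ego_vehicle + ΔOffset: place a sensed object
    into the path/offset model, assuming object and vehicle are on the
    same path and the road between them is straight."""
    return {"path_id": ego.path_id, "offset": ego.offset + delta_offset, "lane": lane}

# Ego vehicle on path 8 at offset 1520 m; radar sees a vehicle 45 m ahead in lane 2:
ego = PathOffsetPosition(path_id=8, offset=1520.0)
print(merge_relative_detection(ego, delta_offset=45.0, lane=2))
```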
- FIG. 4 shows another example of a diagram of a situation of the vehicle in which the merger of information in the path/offset format with relative position information is preferred.
- FIG. 4 shows the ego vehicle 10 on a road 12 which in this exemplary diagram has only a single lane. Unlike FIG. 2, the road 12 here is not straight but curved. Intersections are a special case: at intersections, the direction of travel may change by 90° from the previous direction of travel. In this special case too, the merger of information in the path/offset format with relative position information may take place according to the following description.
- the ego vehicle 10 has a sensor unit (not shown) such as a camera unit or a radar unit, for example, that serves to detect objects to be merged such as the traffic sign 22 (speed limit 60 km/h).
- the traffic sign represents the object 22 to be merged.
- an object to be merged may also be another traffic participant, a street light, a traffic sign or pedestrians.
- the sensor unit (not shown) can supply relative position information with only one coordinate with respect to the object 22 to be merged.
- This position information may be, for example, the distance of the object 22 relative to the ego vehicle 10 (e.g., object is 10 meters away).
- It is also possible for the sensor unit to supply more accurate position information with at least two coordinates. Such coordinates may be given, for example, in polar coordinates or Cartesian coordinates.
- the position information may then include a distance and an angle such as, for example, the object is 50 meters away at an angle of 5° to the direction of travel.
- The following notation is used below:

| Symbol | Meaning |
| --- | --- |
| x_w, y_w | World coordinates, e.g., WGS84 coordinates |
| x_o | Offset coordinate on a path in a path/offset format |
| x_wO, y_wO or x_oO | World coordinates or offset coordinate of the object to be merged |
| O | Object point of the object to be merged (e.g., traffic sign), either in world coordinates O(x_wO, y_wO) or with offset coordinate O(x_oO) |
| G | Geometry point (e.g., lane marking), either in world coordinates G(x_w, y_w) or with offset coordinate G(x_o) |
| S | Node point for the merger (e.g., landmark), either with world coordinates S(x_w, y_w) or with offset coordinate S(x_o) |
- the digital map may provide information with respect to the geometry of the road usually in absolute position information such as in world coordinates, for example.
- the geometry of the road includes, among other things, the geometry of the road markings and the geometry of lane markings.
- the geometry points G of a lane marking are shown in FIG. 4 .
- The world coordinates G(x_w, y_w) and the offset value on the respective path G(x_o) are known. In the example according to FIG. 4, the information is available with a sufficiently high density, i.e., the distance between the individual points G(x_w, y_w) is so small that a plurality of geometry points G(x_w, y_w) is located in the immediate vicinity of the object 22 to be merged.
- the position information in world coordinates E(x w , y w ) from the ego vehicle 10 and the offset value E(x o ) are also known.
- the sensor unit detects the object 22 to be merged and represents its relative position information in relation to the ego vehicle 10 either one-dimensionally, specifying only the distance from the ego vehicle 10 , or by more accurate position information, for example, the angle and distance in relation to the ego vehicle 10 and with a displacement vector.
- absolute position information is ascertained from the relative position information on the object 22 supplied by the sensor unit.
- the absolute position information can be given in world coordinates.
- the world coordinates for the object 22 (O(x wO , y wO )) are ascertained from the relative position information on the object 22 .
- World coordinates are not usually given as Cartesian coordinates but instead are given, in a first approximation, as spherical coordinates.
- the WGS84 model uses an oblate spheroid to describe the earth's surface. For a simpler illustration of merging, a spherical representation of the earth is assumed. This approximation is accurate enough for the short distances between the vehicle and the object for merger.
- If the sensor unit supplies the angle α and the distance d from the object 22 to be merged, the world coordinates of the object 22 (O(x_wO, y_wO)) are obtained from the world coordinates of the ego vehicle E(x_wE, y_wE) as follows:
- x_wO = arcsin( sin(x_wE) · cos(d/R) + cos(x_wE) · sin(d/R) · cos(α) )
- y_wO = y_wE + arctan2( sin(α) · sin(d/R) · cos(x_wE), cos(d/R) − sin(x_wE) · sin(x_wO) )
- d is the distance from the ego vehicle 10 to the object 22
- α is the angle in the direction of the object 22, measured from the connecting line between the ego vehicle and the north pole, and R is the radius of the earth.
- The angle is composed as α = α_F + α_O, where α_F is the orientation of the ego vehicle (measured from the connecting line between the ego vehicle and the north pole) and α_O is the angle between the longitudinal axis of the ego vehicle 10 and the object 22; this angle is derived from the vector v⃗ thereby ascertained.
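A sketch of these two formulas in code, using the spherical earth model assumed in the text; the function name and the example coordinates are illustrative. Latitudes x and longitudes y are in radians:

```python
import math

R_EARTH = 6_371_000.0  # mean earth radius in m (spherical model, as assumed above)

def object_world_coords(x_we, y_we, d, alpha):
    """World coordinates O(x_wO, y_wO) of the object from the ego position
    E(x_wE, y_wE), the distance d and the angle alpha = alpha_F + alpha_O
    measured from the direction to the north pole."""
    x_wo = math.asin(math.sin(x_we) * math.cos(d / R_EARTH)
                     + math.cos(x_we) * math.sin(d / R_EARTH) * math.cos(alpha))
    y_wo = y_we + math.atan2(math.sin(alpha) * math.sin(d / R_EARTH) * math.cos(x_we),
                             math.cos(d / R_EARTH) - math.sin(x_we) * math.sin(x_wo))
    return x_wo, y_wo

# Object 50 m away at alpha = 5 degrees, ego vehicle at 48 N, 11 E:
x, y = object_world_coords(math.radians(48.0), math.radians(11.0),
                           d=50.0, alpha=math.radians(5.0))
print(math.degrees(x), math.degrees(y))
```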
- FIG. 5 shows an example of a diagram to illustrate the procedure for ascertaining the absolute position information of the object 22 to be merged from the distance d between the ego vehicle 10 and the object 22 to be merged.
- the sensor unit (not shown) supplies the distance d between the ego vehicle 10 and the object 22 to be merged, which is a traffic sign according to the example of a diagram in FIG. 5 .
- Possible points O_possible(x_w, y_w) for the absolute position information of the object 22 lie on a circle around the ego vehicle 10 having the radius d, i.e., the radius corresponds to the distance d. From the information ascertained by the sensor unit, or from the combination of information from a plurality of sensor units (for example, the information supplied by the radar unit and the camera unit), it is possible to ascertain that the object 22 to be merged is a traffic sign.
- the geometry of the road can be ascertained with the help of the geometry of the road border as a reference geometry.
- the geometry points G(x w , y w ) and G(x o ) are known from the geometry of the road border.
- The absolute position information O(x_wO, y_wO) in world coordinates for the object 22 can be determined as the point of intersection of the circle of radius d of all possible object points O_possible(x_w, y_w) with the road geometry G(x_w, y_w) thereby ascertained.
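A sketch of this intersection step, assuming a flat local frame, a polyline of border geometry points G as the reference geometry, and a simple sampling search; the function name and the 0.5 m tolerance are illustrative:

```python
import math

def intersect_circle_with_border(ego, d, border, tol=0.5):
    """Find candidate object positions: points on the road-border polyline
    whose distance from the ego vehicle is approximately d (the circle of
    all possible object points O_possible intersected with the reference
    geometry). border: list of (x, y) geometry points G."""
    candidates = []
    for (x1, y1), (x2, y2) in zip(border, border[1:]):
        # walk along each segment in small steps and keep near-intersections
        seg_len = math.hypot(x2 - x1, y2 - y1)
        steps = max(1, int(seg_len / 0.1))
        for i in range(steps + 1):
            t = i / steps
            x, y = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
            if abs(math.hypot(x - ego[0], y - ego[1]) - d) < tol:
                candidates.append((x, y))
    return candidates

# Straight border 3 m to the right of the ego vehicle; sign sensed at d = 20 m:
border = [(0.0, -3.0), (50.0, -3.0)]
print(intersect_circle_with_border((0.0, 0.0), 20.0, border)[:1])
```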
- If the absolute position information O(x_wO, y_wO) on the object 22 to be merged cannot be determined directly in world coordinates because of inadequate sensor information, then various types of additional information may be used to determine the best possible alternative point for the absolute position information on the object 22 to be merged.
- This additional information can be ascertained or detected, for example, by the steps listed further below in the description.
- FIG. 6 shows an example of a diagram for ascertaining the offset of the object 22 to be merged for the case when the geometry of the road is ascertained using a plurality of geometry points G whose absolute position information is known.
- the geometry points G having known absolute position information in this case are a short distance away from one another.
- After calculating the absolute position information in world coordinates of the object to be merged O(x_wO, y_wO), a suitable geometry point G_search must be selected that corresponds to O(x_wO, y_wO) as well as possible. In the simplest case, this may be the geometry point G at the shortest distance from O(x_wO, y_wO).
- The geometry point G_105(x_w105, y_w105) corresponds well with the object 22 to be merged (O(x_wO, y_wO)); the object itself is not shown in FIG. 6 in order to simplify the diagram, but it is at the position shown in FIG. 5.
- the position of the geometry point G 105 (x w105 , y w105 ) corresponds essentially to the position of the object 22 to be merged (O(x wO , y wO )) (see FIG. 5 ).
- FIG. 7 shows an example of a diagram, in which the road geometry is ascertained with the help of node points with known absolute position information.
- Here, the road geometry cannot be ascertained with geometry points alone.
- the data density is so low that a simple interpolation or extrapolation method with the known geometry points cannot yield a sufficiently accurate result.
- FIG. 8 shows an example of the starting situation in a merger with a low data density, i.e., with a few known geometry points and/or geometry points that are too far apart from one another.
- the node points S 1 , S 2 and S n may be geometry points on the road. The great distance between the node points S 1 , S 2 and S n does not allow a reasonable interpolation.
- The node point S_3 may be a landmark, a street light or a V2X-capable traffic sign, which can transmit its absolute position information.
- In the present case, too, the absolute position information on the object to be merged O(x_wO, y_wO) can first be determined if it is not yet known. This was already described in detail above.
- the node point S search (x w , y w ) representing the smallest distance from the object O(x wO , y wO ) can be found.
- the node point S 3 is selected for this purpose.
- the distance d from the object O(x wO , y wO ) to the available node points is calculated by using the known world coordinates of the node points S(x w , y w ) and of the object O(x wO , y wO ):
- a spherical model of the earth with the radius R is assumed as the basis. Such a model of the earth is expected to meet the requirements for precision for most applications. If a greater precision is nevertheless required, other models of the earth (e.g., rotational ellipsoid) may be used.
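On this spherical model, the distance d can be computed, for example, with the haversine formula; a short sketch (function name and example coordinates are illustrative):

```python
import math

R_EARTH = 6_371_000.0  # spherical earth model as assumed in the text

def spherical_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance d between an object O(x_wO, y_wO) and a node
    point S(x_w, y_w), both given in world coordinates (radians)."""
    # haversine formula on a sphere of radius R_EARTH
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * R_EARTH * math.asin(math.sqrt(a))

# Distance between an object and a node point roughly 100 m apart:
d = spherical_distance(math.radians(48.0000), math.radians(11.0000),
                       math.radians(48.0009), math.radians(11.0000))
print(round(d, 1))  # ~100 m
```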
- Node points and objects in the surroundings of the ego vehicle 10 can be referenced in a Cartesian coordinate system. This is true in particular of node points and objects within a close radius around the ego vehicle 10.
- The distance d between the object to be merged O(x_wO, y_wO) and the node point S_n is an important criterion for the selection of a suitable node point S_n.
- the node point S n at the smallest distance from the object to be merged O(x wO , y wO ) may be selected but other parameters can also have an influence on the choice of a suitable node point S n .
- the node points S n may thus have one (or more) confidence indicators.
- This indicator may indicate, for example, how high the confidence is that the node point is actually located at the stored position.
- A high confidence is obtained, for example, when the position of the node point S_n has been reported by an official authority (e.g., a highway authority reports the position of a speed limit sign in world coordinates) or when the position of the node point S_n has been confirmed by many different participants. If the confidence level is low, a node point can be ruled out for further processing, and a node point S_n at a greater distance d but with a higher confidence may be selected (a selection sketch follows the list below). Either one or more confidence parameters supplied by a data provider may be used as the confidence indicator, or the confidence parameter can be calculated before the node point is used. The following can enter into this calculation, for example:
- time stamps (e.g., the last confirmation of the position of the node point S_n)
- control parameters (e.g., variance of the measured node point position)
- type of data source (e.g., other traffic participants or public authority)
- type of node point S_n (e.g., traffic sign erected temporarily or permanently)
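A minimal selection sketch combining distance and such a confidence indicator; the data structure, the 0.7 threshold and the example values are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class NodePoint:
    name: str
    distance_m: float  # distance d from the object to be merged
    confidence: float  # 0..1 confidence that the stored position is correct

def select_node_point(nodes, min_confidence=0.7):
    """Select a suitable node point S_n: prefer the smallest distance d,
    but rule out node points whose confidence indicator is too low, so a
    more distant but better-confirmed node point can win."""
    trusted = [n for n in nodes if n.confidence >= min_confidence]
    if not trusted:
        return None
    return min(trusted, key=lambda n: n.distance_m)

nodes = [
    NodePoint("S1 (crowd-sourced sign)", distance_m=12.0, confidence=0.4),
    NodePoint("S3 (highway-authority sign)", distance_m=35.0, confidence=0.95),
]
print(select_node_point(nodes).name)  # -> the more distant but trusted S3
```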
- FIG. 8 shows an example of a diagram for determining the offset of an object to be merged with node points S, where the road is straight in the diagram according to FIG. 8 .
- OS_search,road = √(OS_search² − b_road²)
- this may involve other geometric variables in addition to the width of the road, such as the lane width or other distances that can be derived from the sensor information detected by the at least one sensor unit or from digital maps.
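A worked example of this right-triangle relationship (function name and values are illustrative):

```python
import math

def offset_distance_straight(os_search: float, b_road: float) -> float:
    """Along-road distance between object and node point on a straight road:
    the direct distance OS_search is the hypotenuse and the lateral offset
    b_road (e.g., road or lane width) the opposite side, so
    OS_search,road = sqrt(OS_search^2 - b_road^2)."""
    return math.sqrt(os_search ** 2 - b_road ** 2)

# Node point 50 m away in a direct line, object on the far side of a 7 m road:
print(round(offset_distance_straight(50.0, 7.0), 2))  # ~49.51 m
```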
- The sensor units on the vehicle (for example, camera unit, radar unit) may be used to verify whether the prerequisites of a straight road course are met. If the road has a tight curve, the determination of OS_search,road will lead to an inaccurate result, as shown in FIG. 8.
- it is necessary to determine the geometry of the road and thus the course of the offset along the road by means of an estimate based on the node points with known absolute position information and/or known information in the path/offset format. Mainly information from the vehicle sensors is therefore used as input.
- the road geometry between S search and the object to be merged can therefore be estimated, for example, with information from the at least one sensor unit.
- the at least one sensor unit may be a camera unit, for example.
- the course of the lane markings may be evaluated in the pictures taken by the camera unit.
- The steering angle and the vehicle trajectory that has been driven and recorded may be used, for example.
- the remaining course of the road can thus be extrapolated from the course of the road that has been driven for a certain distance.
- FIG. 9 shows an example of a diagram, in which the road has a curvature between the ego vehicle 10 , the node point S search and the object 22 to be merged.
- the road geometry between the ego vehicle 10 and the object 22 to be merged can be estimated with one or more sensor units of the ego vehicle 10 (camera unit, odometer, GPS, etc.).
- the most precise possible method of determining OS search,road,estimated is particularly important to achieve the required precision.
- The information listed further below in the description can enter into the estimate, for example.
- a denotes the starting point of the path integration (e.g., node point, vehicle) and b denotes the location of the object 22 to be merged.
- Different coordinate systems (e.g., Cartesian or polar coordinates) may be used.
- The choice of the coordinate system depends on the respective conditions, i.e., in particular on which input parameters can be used to estimate the course of the geometry of the road.
- the camera unit (not shown) of the ego vehicle 10 detects the surroundings of the vehicle, in particular the surroundings in front of the vehicle.
- the geometry of the road and/or the course of the road are to be estimated with the help of the information detected by the camera unit.
- the estimate can be obtained, for example, by evaluation of the lane markings in the camera images.
- It is advisable to give the estimated path of the road 12, s_g, as a function of the longitudinal coordinate x_F in the Cartesian vehicle coordinate system: s_g = f(x_F)
- ΔOffset ≈ ∫_0^b √(1 + (ds_g/dx)²) dx
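A sketch evaluating this path integral numerically for an estimated road course s_g = f(x_F); the trapezoidal rule, the step count and the example parabola are illustrative choices:

```python
def delta_offset(road_model, b, steps=1000):
    """Numerically evaluate ΔOffset = ∫_0^b sqrt(1 + (ds_g/dx)^2) dx for an
    estimated road course s_g = f(x_F) in the vehicle frame, using the
    trapezoidal rule and a central-difference derivative."""
    h = b / steps
    def integrand(x):
        ds = (road_model(x + 1e-4) - road_model(x - 1e-4)) / 2e-4
        return (1.0 + ds * ds) ** 0.5
    total = 0.5 * (integrand(0.0) + integrand(b))
    total += sum(integrand(i * h) for i in range(1, steps))
    return total * h

# Road course estimated from lane markings as a gentle parabola:
curve = lambda x: 0.002 * x * x  # s_g = f(x_F)
print(round(delta_offset(curve, b=100.0), 2))  # ~102.6 m arc length over 100 m
```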
- FIG. 11 shows how the position information is ascertained in the path/offset format on the example of a traffic light system having known absolute position information.
- The traffic light system 24 represents the object to be merged. Position information transmitted to the vehicle from a V2X base station, for example, is mostly absolute position information.
- The traffic light system 24 transmits its instantaneous switch status together with its absolute position information to the vehicle in WGS84 coordinates, i.e., in world coordinates.
- the instantaneous absolute vehicle position (or some other suitable reference) is used.
- the absolute position information on the vehicle 10 is known from the GPS receiver in the vehicle 10 , for example.
- The position of the vehicle in the path/offset format (e.g., according to ADASIS) is likewise known.
- the path/offset display of the relevant object 24 can be found in different ways.
- One way is to use a relative displacement vector from the vehicle 10 to the object 24 to be merged.
- The relative displacement between the vehicle 10 and the object 24 is thus known. It was explained above with reference to FIGS. 3 through 10 how the position information in the path/offset format is determined from known absolute position information.
- Both the position information in the path/offset format and the absolute position information in world coordinates (WGS84 coordinates) are known by the ego vehicle 10 .
- the relative displacement vector between the ego vehicle 10 and the traffic light system 24 can be calculated from these world coordinates of the ego vehicle 10 and the traffic light system 24 which represents the object to be merged. This vector can then be used to calculate the position information of the traffic light system in the path/offset format from the position information in the path/offset format of the ego vehicle 10 .
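A sketch of this displacement-vector approach, using a flat-earth approximation for the short vehicle-to-object distances and projecting the vector onto the driving direction; all names and the straight-road assumption are illustrative (for curved roads, the path integral above applies instead):

```python
import math

R_EARTH = 6_371_000.0

def displacement_vector(lat_e, lon_e, lat_o, lon_o):
    """Relative displacement (east, north) in meters from the ego vehicle to
    the object, using a local flat-earth approximation that is accurate for
    the short vehicle-to-object distances considered here."""
    north = (lat_o - lat_e) * R_EARTH
    east = (lon_o - lon_e) * R_EARTH * math.cos(lat_e)
    return east, north

def object_offset(ego_offset, heading, lat_e, lon_e, lat_o, lon_o):
    """Project the displacement vector onto the driving direction to obtain
    ΔOffset and add it to the ego offset (straight-road assumption)."""
    east, north = displacement_vector(lat_e, lon_e, lat_o, lon_o)
    d_offset = east * math.sin(heading) + north * math.cos(heading)
    return ego_offset + d_offset

# Traffic light ~60 m north of the ego vehicle, which is heading north:
print(round(object_offset(1520.0, 0.0,
                          math.radians(48.0), math.radians(11.0),
                          math.radians(48.00054), math.radians(11.0)), 1))  # ~1580
```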
- the traffic light system can be inserted into a surroundings model of the vehicle.
- This vehicle surroundings model may be an electronic horizon (e.g., according to the ADASIS protocol), for example.
- FIG. 12 shows an example of a diagram to illustrate the problems in merger of position information in the path/offset format and absolute position information with a curved course of the road.
- a displacement vector between the ego vehicle 10 and the relevant object or the object 26 to be merged can be calculated.
- The result is a false value of the offset (and possibly of the path) of the object 26 to be merged because the offset value of the road does not follow the displacement vector (see FIG. 10).
- the actual offset course corresponds to the path integral along the arrow P, which is a dashed line in FIG. 11 , as described above.
- the additional procedure then corresponds to the merger of the absolute position information with position information in the path/offset format, as described above. To avoid repetition, reference is made to the detailed description of FIGS. 3 through 10 given above.
- FIGS. 13 and 14 show which merger possibilities are created by the present disclosure in order to be able to create a vehicle surroundings model.
- the examples described above relate to merging relative position information and absolute position information (in relative and world coordinate formats) with position information in a logic map display (path/offset format). This is shown in FIG. 13 .
- the position information detected by a sensor unit of the vehicle may be information from a camera unit and/or a radar unit, for example.
- the logic map display may be an electronic horizon according to the ADASIS protocol, for example.
- The embodiments described above thus relate, among other things, to the integration of relative or absolute position information into a digital horizon. For ADAS/AD applications, this yields the advantage that a uniform and merged surroundings model is created regardless of the source of the position information.
- The same steps as in the embodiments described above must be carried out first. After these steps, the merged position information can be transferred to another coordinate system (for example, a Cartesian coordinate system and/or a polar coordinate system).
- either relative position information relative to the ego vehicle or absolute position information is merged with information in the path/offset format. Due to this merger of information, for example, the following information can be integrated into an electronic horizon and/or into a surroundings model of the vehicle:
- FIG. 15 shows the advantages of merging information from different information sources on the basis of the example of a traffic light system. Without the integration and correlation of the data from different sources (for example, the digital map, the vehicle sensors and V2X data sources), a uniform surroundings model is not obtained: it is impossible to correlate the same objects from different data sources and to combine their information.
Description
-
- Radar unit: distance and optionally angle to an object in front of the vehicle
- Camera unit: distance from traffic signs, traffic lights and pedestrians
- Ultrasonic sensor units: distance from vehicles in a parallel lane
-
- 1. The traffic light is listed in the model of the surroundings of the vehicle, and the object to be merged, i.e., the traffic light, which is detected by the sensor unit, should be correctly associated with the existing object in the model of the surroundings.
- 2. The object detected is not present in the model of the surroundings and should be inserted by merger of the information from the individual information sources into the model of the surroundings.
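A minimal sketch of these two cases; the dictionary layout and the 3 m match radius are assumptions for illustration:

```python
def merge_into_model(model, detection, match_radius=3.0):
    """Case 1: the detected object already exists in the surroundings model
    (e.g., a traffic light from the digital map) -> associate and enrich it.
    Case 2: no matching object nearby -> insert the detection as a new object.
    model: list of dicts with 'type', 'offset' and 'attributes'; detection likewise.
    """
    for obj in model:
        if (obj["type"] == detection["type"]
                and abs(obj["offset"] - detection["offset"]) <= match_radius):
            obj["attributes"].update(detection["attributes"])  # case 1
            return obj
    model.append(detection)  # case 2
    return detection

model = [{"type": "traffic_light", "offset": 1580.0, "attributes": {}}]
seen = {"type": "traffic_light", "offset": 1581.2,
        "attributes": {"switch_status": "red"}}
merge_into_model(model, seen)
print(model[0]["attributes"])  # -> {'switch_status': 'red'}
```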
-
- Determining the type of object to be merged (e.g., traffic signs, other traffic participants), for example, by evaluating the sensor information (e.g., the camera unit) and/or the position information.
- Calculating several possible alternative points for the position of the object to be merged, wherein, when the distance between the vehicle and the object to be merged is known, the possible alternative points lie on a circle around the vehicle.
- Using information known for the different types of objects to be merged. Street signs, for example, are located at the edges of roads, so the edge of the road can be used as reference geometry for determining the geometry of the road. For vehicles as objects to be merged, the lane dividing line may be a suitable reference geometry for determining the geometry of the road.
-
- Evaluating the information from sensor units, such as the images of a camera unit, for example, with respect to the course of road markings, the position of detected objects relative to one another, etc.,
- Using distance and angle information from sensor units, such as the radar unit and/or lidar unit,
- Storing the vehicle trajectory driven recently, for example, by storing the GPS position(s),
- Using the steering angle and the vehicle speed, and
- Using node points from the digital map, from which the world coordinates are known.
$d = \sqrt{a^2 + b^2}$
$\alpha = \alpha_F + \alpha_O$
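A minimal sketch of the transformation these equations imply, assuming the vehicle heading $\alpha_F$ and the sensor angle $\alpha_O$ are measured counter-clockwise from the world x-axis (the exact axis convention is an assumption, not stated by the patent):

```python
import math

def object_world_position(x_w, y_w, alpha_f, d, alpha_o):
    """Ego vehicle at (x_w, y_w) with heading alpha_f; the sensor reports
    distance d and angle alpha_o relative to the heading.
    Returns the object point O(x_wO, y_wO) in world coordinates."""
    alpha = alpha_f + alpha_o  # alpha = alpha_F + alpha_O
    return x_w + d * math.cos(alpha), y_w + d * math.sin(alpha)
```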
-
- Determining the type of object to be merged (for example, traffic signs, other traffic participants), for example, by evaluating the camera information and/or position information.
- Calculating a plurality of possible alternative points $O_{possible}(x_w, y_w)$. This was described above with reference to the diagram of FIG. 5: the possible alternative points $O_{possible}(x_w, y_w)$ lie on a circle around the ego vehicle 10 (see FIG. 5) if the distance d from the object 22 to be merged is known.
- Using known information for the different object types of objects to be merged. This includes, for example, information such as: street signs are located at the edges of roads, so the edge of the road is best used as a reference geometry; when the object to be merged is another vehicle, the lane marking can be used as a suitable reference geometry for ascertaining the geometry of the road.
$O(x_{wO}, y_{wO}) = G_{search}(x_w, y_w)$
$O(x_{wO}, y_{wO}) = G_{interpol}(x_w, y_w)$
$G_{search}(x_o) = O(x_{oo})$
$G_{interpol}(x_o) = O(x_{oo})$
$G_{105}(x_{o,105}) = O(x_{oo})$
$d = \sqrt{(x_{wO} - x_w)^2 + (y_{wO} - y_w)^2}$
$S_{search}(x_o) \neq O(x_{o,o})$
$x_{o,o} = x_{o,S}$
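One possible reading of $G_{search}$ and $G_{interpol}$, sketched in Python: the object's offset coordinate $x_{oo}$ is obtained by searching the geometry points for the closest world-coordinate match and then interpolating between the two bracketing points. The data layout, a list of (offset, x, y) triples with at least two entries and distinct consecutive points, is an assumption:

```python
import math

def object_offset(geometry, x_wo, y_wo):
    """geometry: list of (x_o, x_w, y_w) triples sorted by offset x_o.
    Returns the interpolated offset coordinate x_oo of the object."""
    # G_search: index of the geometry point closest to O(x_wO, y_wO)
    i = min(range(len(geometry)),
            key=lambda k: math.hypot(geometry[k][1] - x_wo,
                                     geometry[k][2] - y_wo))
    # G_interpol: project the object onto the segment adjoining point i
    j = i + 1 if i + 1 < len(geometry) else i - 1
    (o1, x1, y1), (o2, x2, y2) = geometry[min(i, j)], geometry[max(i, j)]
    dx, dy = x2 - x1, y2 - y1
    t = max(0.0, min(1.0, ((x_wo - x1) * dx + (y_wo - y1) * dy)
                    / (dx * dx + dy * dy)))
    return o1 + t * (o2 - o1)  # x_oo
```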
-
- Evaluating the camera images, e.g., with respect to the course of the road markings, position of detected objects relative to one another, etc.,
- Using distance information and angle information from sensor units such as the radar unit and/or the lidar unit,
- Storing the vehicle trajectory driven recently, for example, by storing the GPS position data,
- Using the steering angle and the vehicle speed, and
- Using node points from the digital map, for which the world coordinates are known.
$\Delta\text{Offset} = \int_a^b ds$
$s_g = f(x_F)$
$b = d \cos \alpha$
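A hedged sketch of this offset computation: $\Delta\text{Offset}$ is the arc length along the road geometry between the projections of the ego vehicle and of the object, and for a straight road it degenerates to $b = d \cos \alpha$. The polyline representation of the geometry is an assumption:

```python
import math

def arc_length(polyline, a_idx, b_idx):
    """Delta_Offset = integral from a to b of ds, approximated as the sum of
    segment lengths of the road geometry between the two projection indices."""
    lo, hi = sorted((a_idx, b_idx))
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(polyline[lo:hi],
                                             polyline[lo + 1:hi + 1]))

def object_offset_from_ego(offset_ego, delta_offset):
    """Offset_object = Offset_ego_vehicle + Delta_Offset."""
    return offset_ego + delta_offset

def delta_offset_straight(d, alpha):
    """Straight-road special case: b = d * cos(alpha)."""
    return d * math.cos(alpha)
```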
-
- Information about the instantaneous condition of traffic light systems. With the merger of information described here, real-time updates on the switching status of a traffic light system, originating from a V2X data source, can be merged with the position information for that traffic light system, which originates from a digital map. Only through such a merger can real-time information be combined with the static data of a digital map; without it, there is no way to determine how the real-time information correlates with objects in the electronic horizon. With this correlation, a (self-driving) vehicle can respond appropriately to the respective traffic light (a sketch of such a merger follows this list).
- Information about the instantaneous condition of variable speed displays (e.g., LED speed displays on highways). What was said above also applies here with respect to the real-time data of traffic light systems. Without merger of information, the real-time data on the speed limits received by V2X, for example, cannot be correlated with the static objects from the digital map and/or the electronic horizon.
- Information about the location and speed of other traffic participants (e.g., other vehicles, pedestrians, etc.). Merger of information makes it possible to integrate the positions of the other traffic participants into the digital horizon, so that information from the digital map as well as information from the vehicle sensors is available in a shared surroundings model. This results in a better evaluation of the situation with regard to many ADAS/AD functions.
- Information about recent changes in road courses (e.g., at construction sites). For example, the course of the altered lane guidance at a construction site can be transmitted by V2X to the vehicle. By merger of information, this information can be integrated into the digital horizon.
- Information about traffic obstacles (e.g., accidents, road blocks). In this context it may be helpful to integrate the position information from different sources into a shared electronic horizon.
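The traffic-light case in this list can be sketched as follows: a real-time V2X status message is attached to the nearest map object of matching type within a gating distance, after which the shared surroundings model carries both the map position and the live status. The message layout, field names and threshold are illustrative assumptions:

```python
import math
from dataclasses import dataclass, field

@dataclass
class MapObject:
    obj_type: str
    x: float
    y: float
    realtime: dict = field(default_factory=dict)  # merged V2X attributes

def merge_v2x_status(horizon, obj_type, x, y, status, max_dist_m=10.0):
    """Attach a V2X status dict to the nearest horizon object of matching
    type; returns False if no object lies within the gating distance."""
    candidates = [o for o in horizon if o.obj_type == obj_type]
    if not candidates:
        return False
    nearest = min(candidates, key=lambda o: math.hypot(o.x - x, o.y - y))
    if math.hypot(nearest.x - x, nearest.y - y) > max_dist_m:
        return False
    nearest.realtime.update(status)  # e.g. {"phase": "red", "time_to_green_s": 12}
    return True
```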
-
- Warning the driver of hazardous sites or traffic obstacles along the path traveled. Due to the merger of information it is possible to ascertain whether the hazardous locations are on the path currently being driven and whether they are relevant for the vehicle.
- Taking into account information about the instantaneous condition of traffic lights and information about variable speed limits in the choice of speed for an intelligent cruise control (e.g., green ACC). Through merger of information, the real-time data from sources such as V2X can be merged and correlated with the other objects (traffic lights, traffic signs) from a digital map to form a shared surroundings model and/or horizon. Without this correlation, the information received from a V2X data source cannot be interpreted.
- Displaying information about the instantaneous traffic light condition and information about variable speed limits on a human-machine interface (HMI) in the vehicle. By merging information, it is possible to correlate and interpret data from different sources in a shared surroundings model.
- Choosing an alternative route due to traffic obstacles. Through merger of information it is possible to determine whether the traffic obstacle is relevant for the vehicle.
Claims (27)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018005869.8A DE102018005869A1 (en) | 2018-07-25 | 2018-07-25 | System for creating an environmental model of a vehicle |
DE102018005869.8 | 2018-07-25 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200033153A1 US20200033153A1 (en) | 2020-01-30 |
US11530931B2 true US11530931B2 (en) | 2022-12-20 |
Family
ID=69148588
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/441,247 Active 2039-10-14 US11530931B2 (en) | 2018-07-25 | 2019-06-14 | System for creating a vehicle surroundings model |
Country Status (3)
Country | Link |
---|---|
US (1) | US11530931B2 (en) |
CN (1) | CN110779534B (en) |
DE (1) | DE102018005869A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018216562A1 (en) * | 2018-09-27 | 2020-04-02 | Conti Temic Microelectronic Gmbh | Method for detecting light conditions in a vehicle |
WO2020091111A1 (en) * | 2018-11-01 | 2020-05-07 | 엘지전자 주식회사 | Electronic device for vehicle, and operation method and system of electronic device for vehicle |
WO2020141493A1 (en) * | 2019-01-04 | 2020-07-09 | Visteon Global Technologies, Inc. | Ehorizon upgrader module, moving objects as ehorizon extension, sensor detected map data as ehorizon extension, and occupancy grid as ehorizon extension |
DE102019211174A1 (en) * | 2019-07-26 | 2021-01-28 | Robert Bosch Gmbh | Method for determining a model for describing at least one environment-specific GNSS profile |
DE102019133613B3 (en) * | 2019-12-10 | 2020-12-31 | Audi Ag | Method for providing a three-dimensional map in a motor vehicle |
CN113447031A (en) * | 2020-03-24 | 2021-09-28 | 厦门雅迅网络股份有限公司 | Gradient point screening method, terminal equipment, medium and gradient calculation method and system |
CN112053592A (en) * | 2020-04-28 | 2020-12-08 | 上海波若智能科技有限公司 | Road network dynamic data acquisition method and road network dynamic data acquisition system |
CN113587941A (en) * | 2020-05-01 | 2021-11-02 | 华为技术有限公司 | High-precision map generation method, positioning method and device |
EP3907468A1 (en) * | 2020-05-08 | 2021-11-10 | Volkswagen Aktiengesellschaft | Vehicle, apparatus, method, and computer program for determining a merged environmental map |
DE102020207065B3 (en) * | 2020-06-05 | 2021-02-11 | Volkswagen Aktiengesellschaft | Vehicle, method, computer program and device for merging object information about one or more objects in the surroundings of a vehicle |
CN113806380B (en) * | 2020-06-16 | 2024-01-26 | 财团法人车辆研究测试中心 | Intersection dynamic image resource updating and sharing system and method |
DE102021207515A1 (en) | 2021-07-14 | 2023-01-19 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for providing an electronic horizon for an autonomous motor vehicle |
US20230032998A1 (en) * | 2021-07-30 | 2023-02-02 | Magna Electronics Inc. | Vehicular object detection and door opening warning system |
JP2023078605A (en) * | 2021-11-26 | 2023-06-07 | 本田技研工業株式会社 | Vehicle control device |
FR3132588B1 (en) * | 2022-02-07 | 2024-07-19 | Renault Sas | Process for developing an artificial horizon |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5041638B2 (en) * | 2000-12-08 | 2012-10-03 | パナソニック株式会社 | Method for transmitting location information of digital map and device used therefor |
KR20060119862A (en) * | 2003-08-04 | 2006-11-24 | 마쯔시다덴기산교 가부시키가이샤 | Method of transmitting location information of digital map and program, program product, system and apparatus for implementing same |
JP2006208223A (en) * | 2005-01-28 | 2006-08-10 | Aisin Aw Co Ltd | Vehicle position recognition device and vehicle position recognition method |
JP4645516B2 (en) * | 2005-08-24 | 2011-03-09 | 株式会社デンソー | Navigation device and program |
CN101641610A (en) * | 2007-02-21 | 2010-02-03 | 电子地图北美公司 | System and method for vehicle navigation and piloting including absolute and relative coordinates |
DE102013011827A1 (en) * | 2013-07-15 | 2015-01-15 | Audi Ag | Method for operating a navigation device, navigation device and motor vehicle |
JP6298772B2 (en) * | 2015-01-14 | 2018-03-20 | 日立オートモティブシステムズ株式会社 | In-vehicle control device, own vehicle position and orientation identification device, in-vehicle display device |
-
2018
- 2018-07-25 DE DE102018005869.8A patent/DE102018005869A1/en active Pending
-
2019
- 2019-06-14 US US16/441,247 patent/US11530931B2/en active Active
- 2019-07-24 CN CN201910669159.5A patent/CN110779534B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120323474A1 (en) * | 1998-10-22 | 2012-12-20 | Intelligent Technologies International, Inc. | Intra-Vehicle Information Conveyance System and Method |
US20110047338A1 (en) | 2008-04-30 | 2011-02-24 | Continental Teves Ag & Co. Ohg | Self-learning map on basis on environment sensors |
US20100164789A1 (en) * | 2008-12-30 | 2010-07-01 | Gm Global Technology Operations, Inc. | Measurement Level Integration of GPS and Other Range and Bearing Measurement-Capable Sensors for Ubiquitous Positioning Capability |
DE102010049215A1 (en) | 2010-10-21 | 2011-07-28 | Daimler AG, 70327 | Method for determining vehicle environment, particularly for determining traffic lane course, involves determining objects in environment of vehicle from current local environment data |
US20170261995A1 (en) * | 2014-08-05 | 2017-09-14 | Valeo Schalter Und Sensoren Gmbh | Method for generating a surroundings map of a surrounding area of a motor vehicle, driver assistance system and motor vehicle |
US20170277716A1 (en) | 2016-03-23 | 2017-09-28 | Here Global B.V. | Map Updates from a Connected Vehicle Fleet |
US20190137286A1 (en) * | 2016-06-14 | 2019-05-09 | Robert Bosch Gmbh | Method and apparatus for creating an optimized localization map and method for creating a localization map for a vehicle |
US20170371349A1 (en) | 2016-06-23 | 2017-12-28 | Lg Electronics Inc. | Vehicle control device mounted on vehicle and method for controlling the vehicle |
US20180149487A1 (en) | 2016-11-26 | 2018-05-31 | Thinkware Corporation | Image processing apparatus, image processing method, computer program and computer readable recording medium |
US20190304097A1 (en) * | 2018-03-29 | 2019-10-03 | Aurora Innovation, Inc. | Relative atlas for autonomous vehicle and generation thereof |
Also Published As
Publication number | Publication date |
---|---|
CN110779534B (en) | 2025-01-03 |
US20200033153A1 (en) | 2020-01-30 |
CN110779534A (en) | 2020-02-11 |
DE102018005869A1 (en) | 2020-01-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11530931B2 (en) | System for creating a vehicle surroundings model | |
CN106352867B (en) | Method and device for determining the position of a vehicle | |
JP7167876B2 (en) | Map generation system, server and method | |
JP7156206B2 (en) | Map system, vehicle side device, and program | |
US20200255027A1 (en) | Method for planning trajectory of vehicle | |
US10240934B2 (en) | Method and system for determining a position relative to a digital map | |
US9052207B2 (en) | System and method for vehicle navigation using lateral offsets | |
CN108351218B (en) | Method and system for generating digital maps | |
US20210199437A1 (en) | Vehicular component control using maps | |
US20080243378A1 (en) | System and method for vehicle navigation and piloting including absolute and relative coordinates | |
US20100121518A1 (en) | Map enhanced positioning sensor system | |
CN111508276B (en) | High-precision map-based V2X reverse overtaking early warning method, system and medium | |
CN110530377B (en) | Method and device for implementing at least one safety-improving measure for a vehicle | |
US20220042804A1 (en) | Localization device for visually determining the location of a vehicle | |
JP2019513996A (en) | Method of determining the attitude of at least a partially autonomous driving vehicle in the vicinity by a ground sign | |
KR100976964B1 (en) | Navigation system and road lane recognition method thereof | |
US20220221298A1 (en) | Vehicle control system and vehicle control method | |
CN111123334A (en) | A multi-vehicle cooperative positioning platform and positioning method under extreme working conditions | |
CN112673232A (en) | Lane map making device | |
JP2020140602A (en) | Map data update system, travel probe information collection device, travel probe information providing device, and travel probe information collection method | |
KR102273506B1 (en) | Method, device and computer-readable storage medium with instructions for determinig the position of data detected by a motor vehicle | |
US20240221499A1 (en) | Method and Apparatus for Obtaining Traffic Information, and Storage Medium | |
CN112534209B (en) | Self-position estimation method and self-position estimation device | |
US20240210207A1 (en) | Method and device for determining and providing environmental feature information and for creating a map | |
CN114212088B (en) | Vehicle control method, device, electronic equipment, vehicle and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: ZF ACTIVE SAFETY GMBH, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHNEIDER, GEORG;MULLER, THOMAS;SIGNING DATES FROM 20190727 TO 20190730;REEL/FRAME:049968/0668 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |