WO2022130718A1 - Information generation device - Google Patents
Information generation device
- Publication number
- WO2022130718A1 (PCT/JP2021/034453, JP2021034453W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- event
- information
- formation
- unit
- Prior art date
- 2020-12-18
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
Definitions
- The present invention relates to an information generation device suitable for a scenario generation method capable of generating a test scenario for each event by analyzing the formation of the own vehicle and surrounding objects, or for a vehicle control method capable of performing vehicle travel control in real time using scenarios generated by that scenario generation method.
- Technology related to ADAS (advanced driver assistance systems) and automated driving is progressing rapidly, and functions that automate part of the driving operation, such as adaptive cruise control, lane keep assist systems, and automatic emergency braking, have already been put into practical use.
- Test scenarios should cover everything from ordinary, safe scenes to diverse and complex dangerous scenes. It is not realistic to create these test scenarios manually one by one, so a mechanism for generating them automatically is required.
- As a conventional technique, the information generation device described in Patent Document 1, which generates test cases of incident scenes, is known. Patent Document 1 discloses an information generation device that determines the presence or absence of an abnormal approach by comparing the distance between moving objects with an inter-vehicle distance threshold, and generates test cases using the distance information to the objects.
- However, since the information generation device described in Patent Document 1 extracts incident scenes based on TTC (Time To Collision), some scenes may not be determined to be incident scenes depending on the TTC threshold setting, and test cases may be missed. Patent Document 1 also has the problem that, because it generates test scenarios only for incident scenes identified using TTC, it cannot generate test scenarios for scenes that are not actually incidents.
- An object of the present invention is to provide an information generation device capable of generating test scenarios for various scenes, not limited to incident scenes, by performing analysis using a formation format that abstracts the positional relationship of surrounding objects.
- To achieve the above object, the information generation device of the present invention is an information generation device that generates a scenario of a scene in which the own vehicle travels based on the travel information of the own vehicle, and comprises: a formation forming unit that, from the travel information of the own vehicle, forms a formation abstracting the layout of the own vehicle and surrounding objects according to a formation format in which each surrounding object is assigned to one of previously and arbitrarily divided areas around the own vehicle; and a formation transition determination unit that determines whether or not an event in which the formation of the surrounding objects changes has occurred in the travel information of the own vehicle, in order to generate a scenario corresponding to that event.
- According to the present invention, test scenarios can be generated that include not only scenes in the dangerous domain but also scenes in the normal domain.
- In addition, most of the performance evaluation of the automated driving system (80% or more), which has conventionally been carried out mainly through performance evaluation tests and development using actual vehicles, can be performed efficiently by simulation using the large number of test scenarios automatically generated by the information generation device of the present invention, so the development speed of the automated driving system can be increased and a significant reduction in development cost can be expected.
- Test scenarios (event scenarios) are generated for each event.
- Various scenes are covered, including not only dangerous scenes but also ordinary scenes.
- FIG. 10 is a block diagram showing an overall configuration example of a travel control device according to a second embodiment of the information generation device to which the formation analysis according to the present invention is applied.
- the present invention generates a test scenario (event scenario) for each event by analyzing the formation of the own vehicle and surrounding objects.
- The formation in the present invention is not determined simply by the relative distance or relative position between the own vehicle and surrounding vehicles; it refers to an abstraction of the layout of the surrounding vehicles with respect to the own vehicle according to a formation format specified by the user.
- FIG. 1 is a block diagram showing an overall configuration example of a scenario generation device to which a scenario generation method based on the formation analysis of the present invention is applied.
- The scenario generation device 3 exemplified here takes as input a travel log 2 acquired by a vehicle (hereinafter sometimes referred to as the own vehicle) 1 or by a driving simulator 1A, and outputs an event scenario 19.
- the scenario generation device 3 is configured as a computer including a processor such as a CPU (Central Processing Unit), a memory such as a ROM (Read Only Memory), a RAM (Random Access Memory), and an HDD (Hard Disk Drive). Each function of the scenario generator 3 is realized by the processor executing the program stored in the ROM.
- the RAM stores data including intermediate data of operations performed by a program executed by the processor.
- The scenario generation device 3 comprises: a log analysis unit 4 that analyzes, from the travel log 2, the information required for the formation analysis; a formation analysis unit 8 that analyzes the formation of the own vehicle and surrounding objects based on the information analyzed by the log analysis unit 4, and outputs extraction data 12 obtained by dividing the travel log 2 based on the analysis result; an event information addition unit 13 that adds scenario event information (cut-in, cut-out, rapid acceleration/deceleration, wobbling, etc.) to the extraction data 12; and a format conversion unit 18 that outputs an event scenario 19 obtained by converting the extraction data 12 to which event information has been added into a format suitable for the simulator environment.
- The travel log 2 records travel information, that is, data related to the travel scene of the vehicle 1: log data recording vehicle information (vehicle speed, position, etc.) that can be acquired from the in-vehicle network of the vehicle 1 (CAN (Controller Area Network), Ethernet (registered trademark), etc.) and detection information on surrounding objects that can be acquired by external recognition sensors (camera, radar, LiDAR, etc.). Simulation results obtained by driving in a virtual space, as with the driving simulator 1A, may also be used.
- Data obtained by acquiring information on surrounding objects with external recognition sensors installed in infrastructure, such as security cameras and the N-system, may also be used as the travel log 2.
- the log analysis unit 4 analyzes the traveling lane of the own vehicle, the traveling lane of the surrounding object, and the relative position between the own vehicle and the surrounding object, which are mainly required for the formation analysis, from the traveling log 2.
- the log analysis unit 4 is composed of a vehicle lane analysis unit 5, a surrounding object lane analysis unit 6, and a relative position analysis unit 7.
- the own vehicle lane analysis unit 5 determines the travel lane of the own vehicle by analyzing the data (GNSS and map data) related to the travel position of the own vehicle recorded in the travel log 2.
- The surrounding object lane analysis unit 6 determines the travel lane of a surrounding object by analyzing the data on the travel position of the surrounding object recorded in the travel log 2 (the detection position of the surrounding object detected by the external recognition sensors, its relative position to the own vehicle, the position information of other vehicles obtained by vehicle-to-vehicle communication, etc.).
- the relative position analysis unit 7 obtains relative position information by analyzing data related to the relative positions of the own vehicle and surrounding objects recorded in the travel log 2 (detection positions of surrounding objects detected by the outside world recognition sensor, etc.). Relative position information is required to determine the front-back relationship between the vehicle and surrounding objects, which cannot be determined from the travel lane information alone in the formation analysis.
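- As a concrete (though hypothetical) illustration of this log analysis step, the following Python sketch shows one possible per-frame log structure, a lane assignment based on lateral offset, and the front/rear check; the field names, the lane-width value, and the sign conventions are assumptions made for this example and are not specified by the present embodiment.

```python
from dataclasses import dataclass
from typing import List

LANE_WIDTH_M = 3.5  # assumed lane width; a real implementation would use map data

@dataclass
class DetectedObject:
    obj_id: int
    rel_x: float  # longitudinal offset from the ego vehicle [m], + = ahead
    rel_y: float  # lateral offset from the ego vehicle [m], + = left

@dataclass
class LogFrame:
    time_s: float
    ego_lane: int                  # ego travel lane, e.g. from GNSS + map matching
    objects: List[DetectedObject]  # detections from camera / radar / LiDAR

def object_lane(frame: LogFrame, obj: DetectedObject) -> int:
    """Estimate a surrounding object's lane from its lateral offset.

    Positive rel_y (to the left) decreases the lane index purely by the convention chosen here.
    """
    return frame.ego_lane - round(obj.rel_y / LANE_WIDTH_M)

def is_ahead(obj: DetectedObject) -> bool:
    """Front/rear relationship, which lane information alone cannot provide."""
    return obj.rel_x > 0.0
```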
- the formation analysis unit 8 is composed of a formation formation unit 9, a formation transition determination unit 10, and a data extraction unit 11.
- the formation forming unit 9 assigns which area around the own vehicle the surrounding object belongs to based on the information analyzed by the log analysis unit 4 according to the formation format specified by the user.
- The formation format is a format that the user can specify in advance (for example, one of the designations shown in FIGS. 2A and 2B) and that is used to analyze the formation of the surrounding objects.
- In the formation format, the area around the own vehicle is arbitrarily divided in advance: for example, with the lane in which the own vehicle is present as the center ("C"), the area to its right is "R", the area to its left is "L", the road shoulder area is "S", the area ahead is "F", the area further ahead of "F" is "FF", and the rear area is "R".
- In FIG. 2A, two regions "F" and "FF" are set in front of the vehicle 20, "R" is set as the rear region, and "R", "L", and "S" are set as the left and right regions.
- By dividing the front area in two, it becomes possible to generate not only scenarios involving the preceding vehicle close to the own vehicle, but also scenarios in which a vehicle further ahead may indirectly affect the behavior of the own vehicle (for example, when another vehicle cuts in in front of the preceding vehicle and the preceding vehicle brakes suddenly, the own vehicle also needs to decelerate suddenly).
- FIG. 2B is a setting example in which the front region of the vehicle 20 is "F", the rear region is “R”, and the left and right regions are “R", “L”, and “S”.
- In this case, scenarios can be generated that take into account the two objects on the road shoulder "S" in addition to the six objects in the front, rear, left, and right areas directly facing the own vehicle. When it is not necessary to consider scenes in which a vehicle further ahead of the preceding vehicle indirectly affects the behavior of the own vehicle, the formation format shown in FIG. 2B can be used to narrow generation down to the minimum necessary scenarios.
- Examples of the formation format have been described with reference to FIGS. 2A and 2B, but the formation format can be set freely (arbitrarily) according to the scenarios that the user wants to generate, and is not limited to the settings of FIGS. 2A and 2B.
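- As an illustration only, the following sketch assigns surrounding objects to abstract areas in the spirit of the FIG. 2A format; the distance thresholds, label names, and function interfaces are assumptions, since the concrete division of areas is left to the user.

```python
from typing import Dict, Tuple

# Assumed longitudinal split for a FIG. 2A-style format: a near front band "F" and a
# far front band "FF"; anything behind the ego vehicle falls into a rear band.
NEAR_FRONT_MAX_M = 30.0
FAR_FRONT_MAX_M = 80.0

def assign_area(rel_x: float, lane_delta: int, on_shoulder: bool = False) -> str:
    """Map one surrounding object to an abstract area label.

    rel_x      : longitudinal offset from the ego vehicle [m] (+ = ahead)
    lane_delta : object lane minus ego lane (-1 = left, 0 = same lane, +1 = right)
    Returns labels such as "FC" (own-lane front), "FR", "FL", or "RC" (own-lane rear);
    "X" means the object lies outside the areas covered by this format.
    """
    if on_shoulder:
        return "S"
    lateral = {0: "C", 1: "R", -1: "L"}.get(lane_delta)
    if lateral is None or rel_x > FAR_FRONT_MAX_M:
        return "X"
    if rel_x > NEAR_FRONT_MAX_M and lane_delta == 0:
        return "FF"  # vehicle ahead of the preceding vehicle
    return ("F" if rel_x > 0 else "R") + lateral

def form_formation(objects: Dict[int, Tuple[float, int]]) -> Dict[int, str]:
    """objects: {obj_id: (rel_x, lane_delta)} -> {obj_id: area label}."""
    return {oid: assign_area(rx, dl) for oid, (rx, dl) in objects.items()}

# Example corresponding to formation 401 in FIG. 4: vehicle 41 front-right, vehicle 42 front-left.
print(form_formation({41: (20.0, 1), 42: (25.0, -1)}))  # -> {41: 'FR', 42: 'FL'}
```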
- the formation transition determination unit 10 determines whether or not the formation (abstract layout of the surrounding object) of the surrounding object assigned to the area has changed according to the formation format specified by the formation forming unit 9. For example, if the preceding vehicle in the adjacent lane changes lanes to the driving lane of the own vehicle, it is determined that the formation has changed.
- Each time the formation transition determination unit 10 determines that the formation (abstracted layout) has changed, the data extraction unit 11 generates extraction data 12 by dividing the travel log 2 before and after the point at which the formation switches.
- the event information addition unit 13 is composed of a lane change extraction unit 14, an acceleration extraction unit 15, and an event determination unit 16.
- The lane change extraction unit 14 analyzes whether a surrounding object has changed lanes based on the data about the surrounding object recorded in the extraction data 12 (lane recognition information, lateral position, lateral speed, etc.). Depending on the specified formation format, a change in the area to which a surrounding object is assigned may be regarded as a lane change. Taking the formation format of FIG. 2A as an example, one such case is a surrounding object whose assignment changes from "FR" to "FC".
- the acceleration extraction unit 15 determines the magnitude / change of the acceleration of the surrounding object based on the data (velocity, acceleration, etc.) recorded in the extraction data 12 regarding the surrounding object.
- the event determination unit 16 determines whether or not the behavior information of the surrounding objects analyzed by the lane change extraction unit 14 and the acceleration extraction unit 15 corresponds to the event conditions predefined in the scenario definition file 17. Then, event information (cut-in, cut-out, sudden deceleration, sudden acceleration, etc.) is added to the extracted data 12.
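- The scenario definition file 17 is described only functionally; a minimal sketch of how its event conditions might be declared as data is shown below, with all keys and values being illustrative assumptions rather than a format defined by this embodiment.

```python
# Hypothetical layout of a scenario definition file: each entry names an event and the
# condition that the event determination unit checks against the extracted data.
SCENARIO_DEFINITIONS = {
    "cut_in": {
        "area_transition": {"from": ["FR", "FL"], "to": ["FC"]},  # adjacent-lane front -> own-lane front
        "requires_ahead": True,                                   # relative position must be ahead
    },
    "cut_out": {
        "area_transition": {"from": ["FC"], "to": ["FR", "FL"]},
    },
    "sudden_deceleration": {
        "max_acceleration_g": -0.4,   # example threshold also used later in the description
    },
    "sudden_acceleration": {
        "min_acceleration_g": 0.4,    # assumed symmetric threshold
    },
    "wobbling": {
        "lateral_speed_threshold_mps": 0.5,  # assumed value
        "min_threshold_crossings": 3,        # assumed value
    },
}
```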
- the format conversion unit 18 generates an event scenario 19 by converting the extraction data 12 to which the event information is added by the event information addition unit 13 into a format that can be used in various simulation environments.
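- Because the target simulator format differs per environment, the following sketch simply serializes tagged extraction data into a neutral JSON structure; the schema is an assumption for illustration and does not correspond to any particular simulation environment.

```python
import json
from typing import Any, Dict, List

def to_event_scenario(extraction_data: List[Dict[str, Any]], event_tags: List[str]) -> str:
    """Serialize one piece of tagged extraction data into a neutral JSON scenario.

    extraction_data: per-frame records (time, ego state, surrounding objects), time-ordered
    event_tags     : labels added by the event information addition unit, e.g. ["cut_in"]
    """
    scenario = {
        "events": event_tags,
        "duration_s": extraction_data[-1]["time_s"] - extraction_data[0]["time_s"],
        "frames": extraction_data,
    }
    return json.dumps(scenario, indent=2)

# A downstream adapter would translate this neutral form into the input format of
# whichever simulation environment is actually used.
```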
- FIG. 3 shows a processing flow of the formation analysis unit 8
- FIG. 4 shows an example in which formation analysis is performed using a certain travel log 2.
- the formation analysis unit 8 first reads the log analysis data (information analyzed by the log analysis unit 4) (S300), and then allocates surrounding objects to the area defined in the formation format based on the read log analysis data. Then, a formation is formed (S301).
- In step S302, it is determined whether or not there is a formation registration history.
- If there is no formation registration history in step S302, the formation formed in that step is registered as the initial formation (S303).
- Then, an analysis start point (t1) for starting the formation analysis used to divide the travel log 2 for each formation is set (S304).
- This analysis start point (t1) can be arbitrarily set by the user, but basically it may be the same as the first step of reading the travel log.
- In the example of FIG. 4, the vehicle 41 is assigned to the adjacent-right-lane front area "FR" and the vehicle 42 to the adjacent-left-lane front area "FL" (defined areas) around the own vehicle 40, and this formation 401 is registered and set as the analysis start point (t1).
- In step S305, it is determined whether the formation of the surrounding objects formed in each step has changed from the last registered formation, that is, whether an event in which the formation of the surrounding objects changes has occurred in the travel log 2.
- If it is determined in step S305 that there is no change in the formation, the process returns to the log analysis data reading process (S300).
- The case where there is a change in the formation of the surrounding objects in step S305 is described below.
- FIG. 5 is a diagram summarizing the state transitions of the formation when there is one or no surrounding object
- FIG. 6 is a table summarizing the conditions when the states in FIG. 5 are switched.
- FIG. 7 shows a case where two surrounding objects exist in the front direction of the own vehicle
- FIG. 8 is a table summarizing the state transition conditions in FIG. 7.
- In the example of FIG. 4, the initial formation 401 corresponds to the state "object 1 in front in the right lane and object 2 in front in the left lane" in FIG. 7, and the next formation 402 transitions to the state "object 1 in front in the own lane and object 2 in front in the left lane" in FIG. 7.
- This state transition follows condition "1" in FIG. 7; it is determined that condition No. 1 in FIG. 8 is satisfied, and therefore that the formation has switched.
- Similarly, when condition "13" is satisfied, the state changes from "object 1 in front in the own lane and object 2 in front in the left lane" in FIG. 7 to "object 1 in front in the own lane and object 2 in front of object 1".
- The definitions of the front object 1 and the front object 2 shown in FIGS. 7 and 8 are merely examples, and indexes such as object 1 and object 2 should be read as independent definitions within each state.
- The state transitions in FIGS. 7 and 8 are only part of the full formation transition judgment rule database, and combinations can be flexibly defined by the front-rear and left-right relationships of not only two objects but also a larger number of surrounding objects.
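- The following sketch only illustrates the idea of such a rule database as a lookup keyed by the previous and new formations; the state encodings are hypothetical simplifications, and the rule names merely echo the conditions 1 and 13 mentioned above rather than reproducing FIGS. 7 and 8.

```python
from typing import Dict, FrozenSet, Optional, Tuple

# A formation is encoded here as a frozenset of (object index, area label) pairs so that
# it can serve as a dictionary key.
Formation = FrozenSet[Tuple[int, str]]

TRANSITION_RULES: Dict[Tuple[Formation, Formation], str] = {
    (frozenset({(1, "FR"), (2, "FL")}),
     frozenset({(1, "FC"), (2, "FL")})): "condition 1: object 1 moves from the right-front area into the own lane",
    (frozenset({(1, "FC"), (2, "FL")}),
     frozenset({(1, "FC"), (2, "FF")})): "condition 13: object 2 moves in front of object 1",
}

def lookup_transition(prev: Formation, new: Formation) -> Optional[str]:
    """Return the matching transition rule, or None if no registered rule applies."""
    if prev == new:
        return None
    return TRANSITION_RULES.get((prev, new))
```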
- The processing after it is determined in step S305 of FIG. 3 that there is a formation change is described below.
- If it is determined in step S305 of FIG. 3 that there is a formation change (that is, an event that changes the formation of a surrounding object has occurred), the formation at the time of that determination is registered (S306).
- the timing at which the formation is switched (that is, the timing at which the formation of the surrounding object changes occurs) is set as the formation change point (t1_fc, t2_fc, ...) (S307).
- a log data division point (t2, t3) is set after an arbitrary time from the formation change point (S308).
- the user can arbitrarily decide how many seconds before and after the formation change point is set as the log data division point.
- the extraction range of the travel log 2 used to generate the scenario corresponding to the event in which the formation of the surrounding object changes is determined with the formation change point as the base point.
- The log data division points are set arbitrarily with the formation change point as the base point; however, they are not determined only by the first formation change point, and can also be set so as to include the subsequent second formation change point. Specifically, the start point of the log data division is set based on the first formation change point, and the end point of the log data division is set based on the second formation change point. That is, the extraction range of the travel log 2 includes at least one formation change point (corresponding to the occurrence timing of an event in which the formation of the surrounding objects changes).
- In step S309, the travel log data is divided and extracted based on the log data division points set in step S308, and extraction data 12 divided for each formation change (i.e., within the extraction range described above) is generated.
- S300 and S301 in FIG. 3 are executed by the formation forming unit 9
- S302 to S305 are executed by the formation transition determination unit 10
- S306 and thereafter are executed by the data extraction unit 11.
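- Read as pseudocode for steps S300 to S309, one possible implementation of this loop is sketched below; the frame format, the helper passed in as form_formation, and the division margin are assumptions for illustration, not the embodiment's actual implementation.

```python
from typing import Callable, List, Tuple

MARGIN_S = 5.0  # assumed number of seconds kept before and after a formation change point

def analyze_formations(
    frames: List[dict],
    form_formation: Callable[[dict], frozenset],
) -> List[Tuple[float, List[dict]]]:
    """Return (formation change point, extracted frames) pairs.

    frames        : travel log as a time-ordered list of per-frame records with a "time_s" key
    form_formation: maps one frame to an abstract formation (steps S300-S301)
    """
    extracted: List[Tuple[float, List[dict]]] = []
    registered = form_formation(frames[0])               # S302-S303: register the initial formation
    for frame in frames[1:]:
        current = form_formation(frame)                  # S300-S301: form the formation per frame
        if current == registered:                        # S305: no change -> keep reading the log
            continue
        registered = current                             # S306: register the changed formation
        t_fc = frame["time_s"]                           # S307: formation change point
        start, end = t_fc - MARGIN_S, t_fc + MARGIN_S    # S308: log data division points
        segment = [f for f in frames if start <= f["time_s"] <= end]  # S309: extract
        extracted.append((t_fc, segment))
    return extracted
```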
- Addition of event information by the event information addition unit
- The event information addition unit 13 of FIG. 1 adds event information such as cut-in, cut-out, overtaking, and sudden deceleration to each piece of extraction data 12 generated by the formation analysis unit 8. To decide which event to add, the behavior of the surrounding objects must be analyzed.
- the lane change extraction unit 14 extracts whether or not the surrounding object has changed lanes based on the lateral position and speed of the surrounding object, or information indicating whether or not the lane has been changed directly.
- the acceleration extraction unit 15 extracts information such as sudden deceleration or rapid acceleration (that is, change in speed or acceleration) of the surrounding object based on the moving speed or acceleration information of the surrounding object.
- The thresholds for sudden deceleration and sudden acceleration are not uniquely determined and can be set arbitrarily by the user.
- It is then determined whether the own-vehicle lane, surrounding-object lane, and relative position information analyzed by the log analysis unit 4, together with the behavior information of the surrounding objects obtained by the lane change extraction unit 14 and the acceleration extraction unit 15, satisfy the event conditions defined in advance in the scenario definition file 17, and the determined event information is added to the extraction data 12.
- FIG. 9 shows how, according to the formations (401, 402, 403) formed based on the travel log shown in FIG. 4, the extraction data 90 for the section t1 to t2 around formation 402 and the extraction data 91 for the section t2 to t3 around formation 403 are generated (by the formation analysis unit 8), and event information is then added to each piece of extraction data.
- FIG. 9 describes the details when event information is given to the extracted data 90.
- First, the scenario definition file 92, in which the definitions of events (cut-in, cut-out, following, deceleration, sudden deceleration, wobbling, etc.) are described, is read.
- In the travel log of FIG. 4, the surrounding vehicle 41A decelerates after changing lanes from the lane to the right of the own vehicle 40A into the own vehicle's travel lane (at this time, the surrounding vehicle 42A is traveling in the left lane); then the surrounding vehicle 42B traveling in the left lane cuts in ahead of the own vehicle 40B and the surrounding vehicle 41B, and the surrounding vehicle (preceding vehicle) 41B travels with a wobble.
- For the extraction data 90 generated based on formation 402, it is determined that cut-in and sudden-deceleration events have occurred, and cut-in and sudden-deceleration tag information is added. Likewise, for the extraction data 91 generated based on formation 403, it is determined that cut-in and wobble events have occurred, and cut-in and wobble tag information is added.
- the scenario definition file 92 describes the definitions of cut-in, sudden deceleration, and wobbling.
- A cut-in is defined as satisfying both the condition that a surrounding object has transitioned from an adjacent lane "R" or "L" to the own vehicle's travel lane "C" and the condition that the relative position of the surrounding object is ahead of the own vehicle.
- the definition of the cut-in is only an example defined in this embodiment, and it is also possible to define the cut-in according to the lateral position and lateral speed of the surrounding object.
- Sudden deceleration is defined as the case where the acceleration of the surrounding object is less than or equal to the threshold value preset in the scenario definition file (for example, -0.4 [G]).
- Wobbling may be defined by setting a threshold on information such as the steering angle, lateral speed, lateral acceleration, or lane departure warnings of the surrounding object and counting how many times the threshold is exceeded.
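- Putting the above definitions together, the event determination could be sketched as follows; the -0.4 G value is the example given above, while the wobble threshold, the number of crossings, and the record layout are assumptions.

```python
from typing import List

G = 9.81  # m/s^2

SUDDEN_DECEL_G = -0.4        # example threshold given above
WOBBLE_LAT_SPEED_MPS = 0.5   # assumed lateral-speed threshold
WOBBLE_MIN_CROSSINGS = 3     # assumed number of threshold crossings

def determine_events(track: List[dict]) -> List[str]:
    """Tag one surrounding-object track (a time-ordered list of records) with event labels.

    Each record is assumed to contain: "area" (formation label), "rel_x" [m],
    "accel" [m/s^2], and "lat_speed" [m/s].
    """
    tags: List[str] = []

    # Cut-in: transition from an adjacent front area into the own-lane front area
    # while the object is ahead of the ego vehicle.
    areas = [r["area"] for r in track]
    if any(a in ("FR", "FL") for a in areas) and areas[-1] == "FC" and track[-1]["rel_x"] > 0:
        tags.append("cut_in")

    # Sudden deceleration: acceleration at or below the threshold at any point in the track.
    if any(r["accel"] <= SUDDEN_DECEL_G * G for r in track):
        tags.append("sudden_deceleration")

    # Wobbling: the lateral speed exceeds its threshold repeatedly.
    crossings = sum(1 for r in track if abs(r["lat_speed"]) > WOBBLE_LAT_SPEED_MPS)
    if crossings >= WOBBLE_MIN_CROSSINGS:
        tags.append("wobbling")

    return tags
```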
- By having the format conversion unit 18 convert the extraction data 12 (90, 91) to which event information has been added as described above into a format usable in various simulation environments, the event scenario 19 is generated. In this way, event scenarios 19 can be generated that include not only scenes in the dangerous domain (incident scenes) but also scenes in the normal domain.
- As described above, the scenario generation device (information generation device) 3 of the first embodiment is an information generation device that generates a scenario of a scene in which the own vehicle travels based on the travel information of the own vehicle (in other words, a scenario generation device 3 that generates scenarios used for simulation based on travel logs acquired by actual vehicle driving or by a driving simulator), and comprises: a formation forming unit 9 that, from the travel information (travel log) of the own vehicle, forms a formation abstracting the layout of the own vehicle and surrounding objects according to a formation format in which each surrounding object is assigned to one of previously and arbitrarily divided areas around the own vehicle; and a formation transition determination unit 10 that determines whether an event in which the formation of the surrounding objects changes has occurred in the travel information (travel log) of the own vehicle, in order to generate a scenario corresponding to that event.
- Further, a data extraction unit 11 is provided that, when the event occurs, determines the extraction range of the travel information (travel log) of the own vehicle used to generate the scenario corresponding to the event based on the occurrence timing of the event (the formation change point), and extracts the data in that extraction range.
- When the event occurs, the data extraction unit 11 divides the travel information (travel log) of the own vehicle before and after the occurrence timing of the event (at the log data division points) and extracts the data (data divided for each occurrence of the event).
- the extraction range of the travel information (travel log) of the own vehicle includes one or more of the occurrence timings of the event.
- An event information addition unit 13 is also provided that adds the event information of the scenario to the extracted data.
- In other words, the scenario generation device (information generation device) 3 of the first embodiment includes: a log analysis unit 4 that analyzes, from the travel log data, the lanes in which the own vehicle and the surrounding objects are present and the relative positions of the own vehicle and the surrounding objects; a formation forming unit 9 that, based on the analyzed lane and relative position data, sets the formation of the own vehicle and the surrounding objects according to a formation format in which each surrounding object is assigned to one of previously and arbitrarily divided areas around the own vehicle; a formation transition determination unit 10 that determines whether an event that changes the formation of the surrounding objects has occurred; a data extraction unit 11 that, when the event occurs, determines the extraction range of the actual-vehicle travel log used to generate the scenario corresponding to the event based on the occurrence timing of the event and extracts the data in that range; and an event information addition unit 13 that analyzes lane changes and acceleration/deceleration of the surrounding objects in the extracted data and adds scenario information (cut-in, cut-out, sudden deceleration, sudden acceleration, etc.).
- With the above configuration, test scenarios can be generated that include not only scenes in the dangerous domain but also scenes in the normal domain.
- In addition, most of the performance evaluation of the automated driving system (80% or more), which has conventionally been carried out mainly through performance evaluation tests and development using actual vehicles, can be performed efficiently by simulation using the large number of test scenarios automatically generated by the scenario generation device 3 of this embodiment, so the development speed of the automated driving system can be increased and a significant reduction in development cost can be expected.
- [Example 2 (travel control device)] <Overall configuration>
- In the first embodiment described above, the scenario generation device 3 including the formation analysis unit 8 was described; however, the formation analysis can also be applied in real time based on the surrounding-object information obtained by the external recognition sensors mounted on the vehicle during actual driving, and the formation analysis unit 8 may therefore be provided in the travel control device of the vehicle.
- FIG. 10 is a block diagram showing an overall configuration example of a vehicle travel control device to which the formation analysis according to the present invention is applied.
- The travel control device 101 exemplified here takes as input the vehicle information, external-world information, infrastructure information, and the like acquired by the vehicle 1B during actual driving, predicts events by collating the current scene against event scenarios 119 stored on the basis of formation analysis and actual driving, and provides travel control suited to the actual driving scene.
- The travel control device 101 comprises: an information receiving unit 102 that receives the information required for formation analysis and travel control from the vehicle 1B; an information analysis unit 104 that analyzes the information used in the formation analysis; a formation analysis unit 108 that analyzes the transition of the formation of the own vehicle and surrounding objects based on the information analyzed by the information analysis unit 104 and, based on the analysis result, outputs extraction data 112 including at least one formation change point (the occurrence timing of an event that changes the formation); an event information addition unit 113 that determines scenario events (cut-in, cut-out, rapid acceleration/deceleration, wobble, etc.) for the extraction data 112 and outputs the event determination result; an event prediction unit 121 that predicts the event that the vehicle 1B will encounter in the near future by collating the event determination result against stored event scenarios 119; and a travel control unit 122 that instructs the vehicle 1B to perform appropriate travel control based on the scene predicted by the event prediction unit 121 and the vehicle and surrounding information obtained by the information receiving unit 102.
- the information receiving unit 102 receives from the vehicle 1B the traveling information related to the traveling scene of the vehicle 1B including the own vehicle information, the outside world recognition information, the GNSS, the map information, the infrastructure information, and the like.
- the information analysis unit 104 analyzes the traveling lane of the own vehicle, the traveling lane of the surrounding object, and the relative position between the own vehicle and the surrounding object, which are mainly required for the formation analysis, from the traveling information received by the information receiving unit 102.
- The information analysis unit 104 has an own-vehicle lane analysis unit 105 that analyzes the travel lane of the own vehicle, a surrounding-object lane analysis unit 106 that analyzes the movement lane (travel lane) of the surrounding objects, and a relative position analysis unit 107 that analyzes the relative position between the own vehicle and the surrounding objects.
- The formation analysis unit 108 has: a formation forming unit 109 that forms a formation by assigning each surrounding object to an area around the own vehicle according to the formation format specified by the user; a formation transition determination unit 110 that determines whether the formation has switched (that is, whether an event that changes the formation of the surrounding objects has occurred); and a data extraction unit 111 that accumulates data over an extraction range covering an arbitrary past X [s], in a form that includes at least one formation change point (that is, the occurrence timing of an event that changes the formation) indicating the timing at which the formation switched, and extracts it as the extraction data 112. X can be set arbitrarily in consideration of the design intent.
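- For this real-time case, the data extraction can be pictured as a rolling buffer of the last X seconds that is emitted only when it contains at least one formation change point; the following sketch assumes a simple deque-based buffer and hypothetical record fields, so it is an illustration rather than the embodiment's implementation.

```python
from collections import deque
from typing import Deque, List, Optional

class RollingExtractor:
    """Keep the last X seconds of frames and emit them once a formation change is inside the window."""

    def __init__(self, window_s: float = 10.0):
        self.window_s = window_s               # X, chosen according to the design intent
        self.frames: Deque[dict] = deque()
        self.change_points: Deque[float] = deque()

    def push(self, frame: dict, formation_changed: bool) -> Optional[List[dict]]:
        now = frame["time_s"]
        self.frames.append(frame)
        if formation_changed:
            self.change_points.append(now)     # formation change point = event occurrence timing
        # Drop anything older than the window of X seconds.
        while self.frames and now - self.frames[0]["time_s"] > self.window_s:
            self.frames.popleft()
        while self.change_points and now - self.change_points[0] > self.window_s:
            self.change_points.popleft()
        # Emit extraction data only while at least one change point lies inside the window.
        if self.change_points:
            return list(self.frames)
        return None
```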
- The event information addition unit 113 has: a lane change extraction unit 114 that extracts whether the own vehicle or a surrounding vehicle has changed lanes, based on the data about the surrounding objects recorded in the extraction data 112 (lane recognition information, lateral position, lateral speed, etc.); an acceleration extraction unit 115 that extracts the degree of acceleration or deceleration (and its change) of the surrounding objects, based on the data about the surrounding objects recorded in the extraction data 112 (speed, acceleration, etc.); and an event determination unit 116 that determines the event of the extraction data 112 (cut-in, cut-out, sudden deceleration, sudden acceleration, etc.) according to the event conditions defined in advance in the scenario definition file 117, and outputs, as an event signal, what kind of event the own vehicle 1B is currently under.
- Further, the travel control device 101 accumulates, in the storage unit 120, event scenarios 119 to which event information has been added, based on the event signal output from the event determination unit 116 of the event information addition unit 113.
- The event prediction unit 121 collates the event signal output from the event determination unit 116 of the event information addition unit 113 (corresponding to the actual driving scene in which the own vehicle 1B is currently placed) against the event scenarios 119 holding data on past similar events, and thereby predicts what kind of event is likely to occur in the near future with respect to the event occurring in the running own vehicle 1B. The predicted event and its occurrence probability are input to the travel control unit 122.
- Based on the predicted event and occurrence probability input from the event prediction unit 121, the travel control unit 122 can appropriately set deceleration commands, target travel speed commands, lateral movement amount commands, and other travel control commands according to the scene in which the own vehicle 1B is traveling.
- Event prediction by the event prediction unit and command setting by the travel control unit: for example, assume the situation of FIG. 4, and suppose that an event scenario in which the vehicle 42B cuts in ahead of the overtaking vehicles 41A and 41B, which had themselves cut in in front of the own vehicles 40A and 40B (in the own vehicle's travel lane), causing the vehicles 41A and 41B to brake suddenly, has already been stored by the storage unit 120. Then, during actual driving, suppose the own vehicle, traveling in the second lane of a three-lane road, encounters a scene in which an overtaking vehicle in the third lane is approaching and another vehicle is also traveling in the first lane.
- When the scene in which the own vehicle is currently placed is collated against past event scenarios, it can be determined that there is a high possibility that an event will occur in which the preceding vehicle cuts in and suddenly decelerates (event prediction unit 121), and it becomes possible to adjust the travel control in real time, for example by advancing the deceleration timing of the ACC (Adaptive Cruise Control) (travel control unit 122).
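- As a rough sketch of this prediction step, stored event scenarios can be matched against the current event signal and the most frequent follow-up event returned with an empirical probability; the matching key, the data layout, and the ACC adjustment rule below are assumptions for illustration, not the embodiment's actual control logic.

```python
from collections import Counter
from typing import Dict, List, Optional, Tuple

def predict_next_event(
    current_events: List[str],
    stored_scenarios: List[Dict[str, List[str]]],
) -> Optional[Tuple[str, float]]:
    """Return the most likely follow-up event and its empirical probability.

    stored_scenarios: e.g. [{"observed": ["cut_in"], "followed_by": ["sudden_deceleration"]}, ...]
    """
    matches = [s for s in stored_scenarios if set(s["observed"]) <= set(current_events)]
    if not matches:
        return None
    follow_ups = Counter(e for s in matches for e in s["followed_by"])
    if not follow_ups:
        return None
    event, count = follow_ups.most_common(1)[0]
    return event, count / len(matches)

def acc_time_gap(prediction: Optional[Tuple[str, float]], base_gap_s: float = 1.8) -> float:
    """Illustrative adjustment only: lengthen the ACC time gap (decelerate earlier) when a
    sudden deceleration of the preceding vehicle is predicted with high probability."""
    if prediction and prediction[0] == "sudden_deceleration" and prediction[1] > 0.5:
        return base_gap_s + 0.7
    return base_gap_s
```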
- As described above, the travel control device (information generation device) 101 of the second embodiment comprises: a storage unit 120 that stores event scenarios to which the event information has been added; an event prediction unit 121 that predicts an event that is likely to occur in the current own vehicle in the future by collating, based on the current travel information of the own vehicle, the scene in which the current own vehicle is placed against past event scenarios (storage unit 120); and a travel control unit 122 that controls the travel of the current own vehicle based on at least one of the event predicted by the event prediction unit 121 (predicted event) or the occurrence probability of the predicted event.
- With the above configuration, the scene in which the moving own vehicle is placed is collated against the generated test scenarios (event scenarios for each event), so that the travel of the own vehicle can be controlled appropriately and in real time in various scenes, including not only dangerous scenes but also scenes in the normal domain.
- each of the above configurations, functions, processing units, processing means, etc. may be realized by hardware by designing a part or all of them by, for example, an integrated circuit. Further, each of the above configurations, functions, and the like may be realized by software by the processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files that realize each function can be stored in a memory, a hard disk, a storage device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
- control lines and information lines indicate those that are considered necessary for explanation, and do not necessarily indicate all control lines and information lines in the product. In practice, it can be considered that almost all configurations are interconnected.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Claims (8)
1. An information generation device that generates a scenario of a scene in which the own vehicle travels based on the travel information of the own vehicle, comprising: a formation forming unit that, from the travel information of the own vehicle, forms a formation abstracting the layout of the own vehicle and surrounding objects according to a formation format in which each surrounding object is assigned to one of previously and arbitrarily divided areas around the own vehicle; and a formation transition determination unit that determines whether or not an event in which the formation of the surrounding objects changes has occurred in the travel information of the own vehicle, in order to generate a scenario corresponding to the event.
2. The information generation device according to claim 1, further comprising a data extraction unit that, when the event occurs, determines an extraction range of the travel information of the own vehicle used to generate the scenario corresponding to the event based on the occurrence timing of the event, and extracts the data in the extraction range.
3. The information generation device according to claim 2, wherein, when the event occurs, the data extraction unit extracts the data by dividing the travel information of the own vehicle before and after the occurrence timing of the event.
4. The information generation device according to claim 2, wherein the extraction range of the travel information of the own vehicle includes one or more occurrence timings of the event.
5. The information generation device according to claim 2, further comprising an event information addition unit that adds event information of the scenario to the extracted data by determining whether or not the behavior information of the surrounding objects in the data extracted by the data extraction unit satisfies predefined event conditions.
6. The information generation device according to claim 5, wherein the behavior information of the surrounding objects includes at least one of the presence or absence of a lane change, a change in speed, or a change in acceleration of the surrounding objects.
7. The information generation device according to claim 5, further comprising: a storage unit that stores event scenarios to which the event information has been added; an event prediction unit that predicts an event that will occur in the current own vehicle in the future by collating, based on the current travel information of the own vehicle, the scene in which the current own vehicle is placed against past event scenarios; and a travel control unit that controls the travel of the current own vehicle based on at least one of the event predicted by the event prediction unit or the occurrence probability of the predicted event.
8. The information generation device according to claim 1, wherein the travel information of the own vehicle includes a travel log acquired by actual driving of the own vehicle or by a driving simulator, or travel information acquired by the own vehicle during actual driving.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112021005100.8T DE112021005100T5 (en) | 2020-12-18 | 2021-09-21 | INFORMATION GENERATING DEVICE |
JP2022569714A JP7470213B2 (en) | 2020-12-18 | 2021-09-21 | Information Generator |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-210425 | 2020-12-18 | ||
JP2020210425 | 2020-12-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022130718A1 true WO2022130718A1 (en) | 2022-06-23 |
Family
ID=82059698
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/034453 WO2022130718A1 (en) | 2020-12-18 | 2021-09-21 | Information generation device |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7470213B2 (en) |
DE (1) | DE112021005100T5 (en) |
WO (1) | WO2022130718A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7476412B2 (en) | 2022-08-17 | 2024-04-30 | ティーユーヴィー シュード コリア リミテッド | Vehicle crash test method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016215790A * | 2015-05-19 | 2016-12-22 | Denso Corporation | Lane change plan generating device, lane change plan generating method |
JP2017058761A * | 2015-09-14 | 2017-03-23 | Denso Corporation | Driving assistance device and driving assistance program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7043785B2 (en) | 2022-03-30 | 2017-10-25 | IHI Corporation | Information generator |
-
2021
- 2021-09-21 JP JP2022569714A patent/JP7470213B2/en active Active
- 2021-09-21 DE DE112021005100.8T patent/DE112021005100T5/en active Pending
- 2021-09-21 WO PCT/JP2021/034453 patent/WO2022130718A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP7470213B2 (en) | 2024-04-17 |
DE112021005100T5 (en) | 2023-10-12 |
JPWO2022130718A1 (en) | 2022-06-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21906083; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2022569714; Country of ref document: JP; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: 112021005100; Country of ref document: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 21906083; Country of ref document: EP; Kind code of ref document: A1 |