EP3342661B1 - Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method - Google Patents
Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method
- Publication number
- EP3342661B1 (granted from application EP17205392.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pedestrian
- vehicle
- information
- pdcms
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T7/00—Brake-action initiating means
- B60T7/12—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
- B60T7/22—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle, or by means of contactless obstacle detectors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/44—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating braking action or preparation for braking, e.g. by detection of the foot approaching the brake pedal
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/34—Protecting non-occupants of a vehicle, e.g. pedestrians
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T8/00—Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force
- B60T8/17—Using electrical or electronic regulation means to control braking
- B60T8/171—Detecting parameters used in the regulation; Measuring values used in the regulation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T8/00—Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force
- B60T8/17—Using electrical or electronic regulation means to control braking
- B60T8/172—Determining control parameters used in the regulation, e.g. by calculations involving measured or detected parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T8/00—Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force
- B60T8/32—Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force responsive to a speed condition, e.g. acceleration or deceleration
- B60T8/58—Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force responsive to a speed condition, e.g. acceleration or deceleration responsive to speed and another condition or to plural speed conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/085—Taking automatic action to adjust vehicle attitude in preparation for collision, e.g. braking for nose dropping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0953—Predicting travel path or likelihood of collision the prediction being responsive to vehicle dynamic parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/23—Updating
- G06F16/2365—Ensuring data consistency and integrity
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T2201/00—Particular use of vehicle brake systems; Special systems using also the brakes; Special software modules within the brake system controller
- B60T2201/02—Active or adaptive cruise control system; Distance control
- B60T2201/024—Collision mitigation systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T2210/00—Detection or estimation of road or environment conditions; Detection or estimation of road shapes
- B60T2210/30—Environment conditions or position therewithin
- B60T2210/32—Vehicle surroundings
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T2220/00—Monitoring, detecting driver behaviour; Signalling thereof; Counteracting thereof
- B60T2220/03—Driver counter-steering; Avoidance of conflicts with ESP control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0001—Details of the control system
- B60W2050/0043—Signal treatments, identification of variables or parameters, parameter estimation or state estimation
- B60W2050/0052—Filtering, filters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2510/00—Input parameters relating to a particular sub-units
- B60W2510/18—Braking system
- B60W2510/182—Brake pressure, e.g. of fluid or between pad and disc
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
- B60W2520/105—Longitudinal acceleration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/18—Steering angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
- B60W2554/4029—Pedestrians
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/802—Longitudinal distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/804—Relative longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/20—Ambient conditions, e.g. wind or rain
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/18—Braking system
- B60W2710/182—Brake pressure, e.g. of fluid or between pad and disc
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2720/00—Output or target parameters relating to overall vehicle dynamics
- B60W2720/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2400/00—Special features of vehicle units
- B60Y2400/30—Sensors
Description
- Exemplary embodiments of the present invention relate to a sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method, and more particularly, to an apparatus and a method for activating a Pedestrian Detection and Collision Mitigation Systems (PDCMS) of a vehicle capable of recognizing a pedestrian by allowing a front detection sensor to integrate sensing results from different types of sensors and protecting a pedestrian by activating a PDCMS function when an accident is highly likely to occur by analyzing gaze information of the pedestrian.
- PDCMS Pedestrian Detection and Collision Mitigation Systems
- An advanced driver assistance system (ADAS) includes the PDCMS.
- the PDCMS is a technology that warns a driver of a vehicle of a pedestrian collision when a collision of a pedestrian with the vehicle is expected and automatically activates an emergency brake of the vehicle.
- The PDCMS may help reduce the speed of the vehicle in unavoidable pedestrian collisions, thereby mitigating the impact on the pedestrian and reducing fatality and injury rates.
- DE 10 2011 112985 A1 discloses a method involving stepwise activation of safety measures depending on a determined criticality of detected driving conditions.
- JP 2014-059841 A discloses a driving support device including detection means, face orientation recognition means, estimation means, a warning device, determination means and control means.
- An object of the present invention is to provide an apparatus for activating a PDCMS including a front detection sensor capable of more accurately detecting presence of a pedestrian and measuring a distance and a relative speed between a vehicle and the pedestrian even in a low illuminance condition or at night by integrating sensing results from different types of sensors.
- Another object of the present invention is to provide a system for more safely protecting a pedestrian by accurately activating a PDCMS function.
- an apparatus for activating a pedestrian detection and collision mitigation system (PDCMS) of a vehicle according to independent claim 1 is provided.
- PDCMS pedestrian detection and collision mitigation system
- the information integration unit may include: a consistency determination unit calculating a correlation between the radar detection information and the far-infrared recognition information to form a binary matrix; a similarity calculation unit generating a measurement value using the binary matrix formed by the consistency determination unit and calculating similarity in a state variable area between the measurement value and track tracking information of a previous time; a state variable updating unit updating each tracking information by performing Kalman filtering using the measurement value and the state variable; and a tracking information management unit performing merging, generation, and deletion of the track.
- the similarity calculation unit may represent longitudinal / lateral relative positions of the radar detection information and the far-infrared recognition information using a polar coordinate system to generate the measurement value.
- the state variable updating unit may apply an extended Kalman filter to perform the Kalman filtering.
- the front detection sensor may further include a radar detection information post-processing unit removing a result other than the pedestrian among the radar detection information before the integration of the result with the far-infrared recognition information.
- The radar detection information may be generated for up to 64 results.
- The far-infrared recognition information may be generated for up to 8 results.
- the gaze information of the pedestrian may correspond to a front or a diagonal if
- The electronic control unit may delay the activation of the operation of the warning unit and the activation of the operation of the brake by a predetermined time when the gaze information of the pedestrian indicates the front or the diagonal, compared to when the gaze information of the pedestrian indicates the rear or the side.
- the vehicle sensor may further include at least any one of a rain sensor, a temperature sensor, and an illumination sensor.
- The electronic control unit may perform the activation of the operation of the brake so that the speed of the vehicle is reduced by at least a predetermined speed from the time when the operation of the brake is activated to the time when the collision of the pedestrian with the vehicle occurs.
- the electronic control unit may permit the driver to operate the brake for a maximum possible deceleration even after the activation of the operation of the brake starts.
- the electronic control unit may control the warning unit to inform the driver that the PDCMS function is in an available state.
- The warning unit may include a display unit that visually warns of the collision of the pedestrian with the vehicle, or a speaker unit that audibly warns of the collision of the pedestrian with the vehicle.
- the PDCMS function may further include an operation of a rear brake lamp.
- The PDCMS function may further include an operation of an electronic stability control (ESC).
- ESC electronic stability control
- a method for activating a pedestrian detection and collision mitigation system (PDCMS) of a vehicle according to independent claim 15 is provided.
- PDCMS pedestrian detection and collision mitigation system
- FIG. 1 is a diagram illustrating a schematic concept of a PDCMS.
- the PDCMS is a technology that warns a driver of a vehicle of a pedestrian collision when a collision of a pedestrian with the vehicle is expected and automatically activates an emergency brake of the vehicle.
- The PDCMS function is performed by issuing a warning to the driver and activating vehicle control.
- a system designer may design the PDCMS function to operate solely in the risk of collision of a pedestrian with a vehicle or may design the PDCMS function to operate in combination with other driving assistance systems.
- FIG. 2 is a block diagram illustrating changes in the PDCMS state according to the vehicle state.
- In the PDCMS off state, no action is taken on the operation of the vehicle.
- the PDCMS off state is produced when an engine of a vehicle stalls.
- The apparatus for activating a PDCMS monitors the speed of the vehicle and determines whether the PDCMS is in an appropriate state to be activated.
- The PDCMS deactivation state is produced by turning on the engine from the PDCMS off state. Further, the PDCMS deactivation state is also produced when, in the PDCMS activation state, the vehicle no longer satisfies the activation conditions; for example, when the speed of the vehicle falls below a predetermined value Vmin, the PDCMS deactivation state is produced.
- the PDCMS activation state is produced when the speed of the vehicle is equal to or greater than the predetermined value Vmin and equal to or less than a predetermined value Vmax.
- Vmin the predetermined value
- Vmax a predetermined value
- In the PDCMS activation state, the motion of the pedestrian and the operation of the vehicle are monitored.
- When a collision of the pedestrian with the vehicle is expected, the PDCMS function starts.
- The PDCMS function includes a collision warning to the driver and an operation of the emergency brake, and may optionally accommodate braking actions by the driver.
- FIG. 3 is a block diagram schematically illustrating an apparatus for activating a PDCMS of a vehicle according to an embodiment of the present invention.
- a PDCMS operating apparatus 100 of a vehicle includes a front detection sensor 200, a vehicle sensor 300, an electronic control unit 400, and a warning unit 500.
- The front detection sensor 200 includes a radar sensor and a far-infrared sensor (which may be implemented in the form of a camera) and integrates and uses the output results of the two sensors to more accurately detect the distance and the relative speed between the vehicle and a pedestrian in a low illuminance condition or at night.
- the front detection sensor 200 may extract characteristics of obstacles detected in front of the vehicle to identify objects and detect various objects such as vehicles on a roadside as well as pedestrians.
- The front detection sensor 200 may detect not only the overall appearance of the pedestrian but also the body parts that make up the pedestrian, so that the pedestrian can be detected even when only a part of the pedestrian is visible because of occlusion by objects such as vehicles on the roadside. Further, the front detection sensor 200 may detect gaze information of the pedestrian when an object in front of the vehicle is determined to be a pedestrian.
- the front detection sensor 200 transmits the detected information on the pedestrian to the electronic control unit 400.
- the vehicle sensor 300 measures revolutions per minute (RPM) of a vehicle wheel from a vehicle engine and calculates a driving speed of a vehicle based on the known circumference of the wheel and the measured RPM and time. Further, the vehicle sensor 300 may detect information on driving conditions of a vehicle such as acceleration, a steering angle, a steering angular velocity, and a pressure of a master cylinder. Further, the vehicle sensor 300 may also detect information on driving environment of a vehicle by including a rain sensor, a temperature sensor, an illuminance sensor, etc. The vehicle sensor 300 may transmit the information on the detected driving conditions and driving environment of the vehicle to the electronic control unit 400.
- RPM revolutions per minute
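- The speed calculation described above is a simple product of wheel RPM and wheel circumference; a minimal sketch follows, in which the circumference value is only an assumed example and not a figure taken from this patent.

```python
def vehicle_speed_kmh(wheel_rpm: float, wheel_circumference_m: float = 1.95) -> float:
    """Estimate vehicle speed from wheel RPM and wheel circumference.

    The default circumference (1.95 m) is an assumed example value,
    not a figure taken from the patent.
    """
    metres_per_minute = wheel_rpm * wheel_circumference_m
    return metres_per_minute * 60.0 / 1000.0  # m/min -> km/h


if __name__ == "__main__":
    # e.g. 500 RPM on a 1.95 m circumference wheel ~= 58.5 km/h
    print(round(vehicle_speed_kmh(500.0), 1))
```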
- the electronic control unit 400 determines whether to operate the PDCMS function of the vehicle based on the information received from the front detection sensor 200 and the vehicle sensor 300. Specifically, the electronic control unit 400 determines whether the conditions that the PDCMS function may be operated are satisfied by combining the pedestrian state and the vehicle state. That is, the electronic control unit 400 determines the risk of collision between a vehicle and a pedestrian using a current position of the pedestrian, a current position of the vehicle, and speed information on the vehicle if it is determined that an obstacle is the pedestrian.
- For example, if the distance between the pedestrian and the vehicle is below a predetermined distance and the motion direction of the pedestrian crosses the movement direction of the vehicle, it is determined that the conditions under which the PDCMS function may be operated are satisfied because a collision is highly likely to occur; if the distance between the pedestrian and the vehicle is below the predetermined distance but the motion direction of the pedestrian differs from the movement direction of the vehicle, it is determined that the conditions under which the PDCMS function may be operated are not satisfied because a collision is less likely to occur.
- the electronic control unit 400 determines whether the conditions that the PDCMS function may be operated are satisfied based on the mapping table.
- the mapping table will be described below with reference to FIG. 6 .
- the PDCMS function includes operating the warning unit 500 to warn the driver of the collision of the pedestrian with the vehicle or operating the brake without the operation of the driver.
- Warning the driver of the collision of the pedestrian with the vehicle is performed by operating the warning unit 500.
- the warning unit 500 is operated by the control of the electronic control unit 400.
- the warning unit 500 may include a display unit or a speaker unit.
- the display unit included in the warning unit 500 may provide a driver with a visual warning through a head-up display, a navigation display, etc.
- the speaker unit included in the warning unit 500 may provide a driver with an audible warning through an audio.
- the content of the warning that the warning unit 500 performs is that there is a potential risk of collision of the pedestrian with the vehicle since obstacles exist in the front of a driving lane of the vehicle.
- the activation of the operation of the brake regardless of whether the driver operates the brake is performed only by the control of the electronic control unit 400 without the operation of the driver.
- The activation of the operation of the brake automatically reduces the relative speed between the vehicle and the pedestrian when it is found that the pedestrian collision is imminent.
- The activation of the operation of the brake is performed so that the speed of the vehicle is reduced by at least a predetermined speed from the time when the operation of the brake is activated to the time when the collision of the pedestrian with the vehicle occurs.
- the predetermined speed may be 20 km/h.
- Even after the activation of the brake starts, the driver may manually operate the brake to achieve the maximum possible deceleration. That is, the driver may manually operate the brake so that the speed of the vehicle is reduced by more than the predetermined speed, for example by more than 20 km/h.
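- As a rough, illustrative check of what the 20 km/h reduction implies, the sketch below computes the constant deceleration needed to shed a given speed within the time remaining to collision; the TTC used in the example is an assumption, not a value from the patent.

```python
def required_deceleration(speed_reduction_kmh: float, time_to_collision_s: float) -> float:
    """Constant deceleration (m/s^2) needed to reduce the vehicle speed by
    speed_reduction_kmh within time_to_collision_s seconds."""
    dv = speed_reduction_kmh / 3.6  # km/h -> m/s
    return dv / time_to_collision_s


# Shedding 20 km/h (about 5.6 m/s) within an assumed 1.0 s of remaining TTC
# requires roughly 5.6 m/s^2 of braking.
print(round(required_deceleration(20.0, 1.0), 2))
```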
- the electronic control unit 400 may inform a driver that the PDCMS function is in an available state.
- the electronic control unit 400 may control the warning unit 500 to inform the driver that the PDCMS function is in the available state through the display unit or the speaker unit of the warning unit 500.
- the PDCMS function may control an operation of a brake lamp to prevent the potential risk of collision with the following vehicles.
- The PDCMS function may further include an operation of an electronic stability control (ESC).
- ESC is an apparatus that allows a vehicle itself to intervene in an operation of a brake of the vehicle in an emergency situation such as an oversteer (when a vehicle enters inwardly beyond a turning radius of a road) or an understeer (when a vehicle deviates outwardly beyond the turning radius of the road) of a vehicle to thereby help a driver to escape from an emergency situation.
- FIG. 4 is an exemplified view illustrating an AEB VRU scenario.
- The Euro NCAP defines the pedestrian AEB as a test item under the name AEB VRU (vulnerable road user) and describes the test procedure and scoring method in detail in its test protocol.
- an appearance of a pedestrian is defined as vehicle to VRU far-side adult (CVFA), vehicle to VRU near-side adult (CVNA), and vehicle to VRU near-side child (CVNC) as illustrated in FIG. 6 .
- CVFA vehicle to VRU far-side adult
- CVNA vehicle to VRU near-side adult
- CVNC vehicle to VRU near-side child
- the CVFA is assumed to be a situation in which an adult moves at a speed of 8 km / h to collide with a center of a vehicle
- the CVNA is assumed to be a situation in which an adult moves at a speed of 5 km / h to collide with points corresponding to 25% and 75% of a vehicle width
- the CVNC is assumed to be a situation in which a child moves at a speed of 5 km / h between stopped obstacle vehicles to collide with a center of a vehicle.
- The moving speed of the vehicle is set to 20 to 60 km/h, and the movement is set to start at a time to collision (TTC) of about 4.0 seconds.
- TTC time to collision
- The scoring criterion after AEB braking depends on how much the speed of the vehicle is reduced from its initial speed. To obtain a perfect score, no collision must occur at initial speeds up to 40 km/h, and sufficient deceleration must be achieved at speeds above that.
- The present invention relates to the pedestrian recognition, and the braking-control-related algorithm is beyond the scope of the present invention.
- the first recognition needs to be performed before the warning time.
- the time corresponds to the TTC of about 2 seconds.
- FIG. 5 is an exemplified view illustrating initial position setting in a CVFA scenario.
- the initial position as illustrated in FIG. 5 is set according to the scenario.
- The overall motion of the pedestrian follows the order of stop, uniformly accelerated motion, and uniform velocity motion.
- FIG. 6 is an exemplified view illustrating a pedestrian speed profile used in a simulation.
- the time intervals for each motion are a function of a final speed, an initial position, and an acceleration distance.
- the CVFA has a time interval of 1.075 seconds at t1, 0.90 seconds at t2, and 2.025 seconds at t3
- the CVNA has a time interval of 1.10 seconds at t1, 0.72 seconds at t2, and 2.16 seconds at t3
- the CVNC has a time interval of 1.10 seconds at t1, 0.72 seconds at t2, and 2.16 seconds at t3.
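- A minimal sketch of such a stop / uniformly accelerated / uniform velocity profile is given below; the mapping of t1 and t2 to the stop and acceleration phases is an assumption made for illustration.

```python
def pedestrian_speed(t: float, t_stop: float, t_accel: float, v_final_kmh: float) -> float:
    """Speed (m/s) at time t for a stop -> uniform acceleration -> uniform
    velocity profile. The mapping of t1/t2 in the text to the stop and
    acceleration phases is an assumption made for illustration."""
    v_final = v_final_kmh / 3.6
    if t < t_stop:                       # initial stop phase
        return 0.0
    if t < t_stop + t_accel:             # uniformly accelerated phase
        return v_final * (t - t_stop) / t_accel
    return v_final                       # uniform velocity phase


# CVFA example: 1.075 s stop, 0.90 s acceleration, 8 km/h final speed
for t in (0.5, 1.5, 3.0):
    print(t, round(pedestrian_speed(t, 1.075, 0.90, 8.0), 2))
```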
- FIGS. 7A to 7F are exemplified views illustrating simulation results of a relative distance and a relative angle of a pedestrian.
- FIG. 7A illustrates the result of calculating the relative distance with time of the CVFA, in which the relative distance is uniformly increased depending on the speed of the vehicle
- FIG. 7B illustrates the relative angle with time of the CVFA, in which a pedestrian exists at the left of the vehicle and therefore an angle value appears as a negative number.
- an absolute value of the angle increases as the vehicle approaches and then has a constant relative angle when the vehicle enters a uniform velocity section.
- FIGS. 7C and 7D illustrate the relative angle results for the CVNA 25/75 cases, in which the relative angle at a TTC of 0.0 is -90° or 90°.
- FIGS. 7E and 7F illustrate the relative distance and the relative angle of the CVNC, a head-on collision situation; the behavior of the CVNC appears the same as that of the CVFA, but the pedestrian is not observed at the beginning because of the obstacle vehicles, and the corresponding information appears from a TTC of about 2.0 seconds.
- When the recognition target is calculated based on a TTC of about 2 seconds, which is the warning time, it is required to satisfy a maximum recognition distance of 40 m at a vehicle speed of 60 km/h and a maximum recognition angle of 44° at a vehicle speed of 20 km/h.
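- As a back-of-the-envelope consistency check of these requirements (not a derivation from the patent), the sketch below relates the 2-second warning TTC to the distance covered by the vehicle and to the lateral offset implied by a 44° bearing.

```python
import math


def distance_covered_m(vehicle_speed_kmh: float, ttc_s: float) -> float:
    """Longitudinal distance the vehicle covers within the warning TTC."""
    return vehicle_speed_kmh / 3.6 * ttc_s


def bearing_deg(lateral_offset_m: float, longitudinal_dist_m: float) -> float:
    """Bearing of the pedestrian as seen from the vehicle centreline."""
    return math.degrees(math.atan2(lateral_offset_m, longitudinal_dist_m))


# At 60 km/h and a 2 s warning TTC the vehicle alone covers ~33.3 m, which
# sits inside the 40 m recognition-distance requirement once some margin is
# allowed for the pedestrian offset and sensor latency.
print(round(distance_covered_m(60.0, 2.0), 1))          # -> 33.3

# At 20 km/h the vehicle is ~11.1 m from the collision point at TTC 2 s; a
# 44 degree bearing then corresponds to a lateral offset of roughly 10.7 m.
lat = math.tan(math.radians(44.0)) * distance_covered_m(20.0, 2.0)
print(round(lat, 1), round(bearing_deg(lat, distance_covered_m(20.0, 2.0)), 1))
```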
- the AEB pedestrian recognition system needs to include a sensor for pedestrian recognition and may have recognition characteristics according to the applied sensor.
- FIG. 8 is an exemplified view illustrating a distance and speed measurement principle of a radar.
- The radar compares the received radio wave with the transmitted radio wave to output a distance through the phase difference and to output a speed through the frequency change caused by the Doppler effect. Since the two physical quantities of distance and speed are both obtained directly from the sensor, the radar is classified as a sensor that measures distance and speed.
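- The phase-based ranging depends on the particular radar waveform, but the Doppler relation between relative speed and frequency shift is standard; a short sketch of that relation follows, with a 77 GHz carrier assumed purely as a typical automotive example.

```python
C = 299_792_458.0  # speed of light, m/s


def doppler_speed_ms(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative radial speed from the measured Doppler frequency shift.

    Uses the standard relation v = f_d * c / (2 * f_c); the 77 GHz carrier
    is an assumed example typical of automotive radar, not a value taken
    from the patent.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)


# A 1 kHz Doppler shift at 77 GHz corresponds to roughly 1.95 m/s (~7 km/h).
print(round(doppler_speed_ms(1000.0), 2))
```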
- FIG. 9 is an exemplified view illustrating a distance calculation principle in an image using a vanishing point.
- In contrast to the radar, a camera needs to reconstruct three-dimensional information from the obtained pixel information and to calculate a distance using a camera model or using the correlation between the pixel size and the distance of a recognized object, and it is therefore classified as a sensor that estimates the distance.
- FIG. 9 is one example illustrating the distance calculation principle in an image using the vanishing point; the distance estimation results may vary depending on the intrinsic and extrinsic parameters of the camera.
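- A minimal flat-road pinhole-camera sketch of the vanishing-point distance estimate is given below; the focal length and camera mounting height are assumed example values, consistent with the remark that the result depends on the camera parameters.

```python
def ground_distance_m(y_contact_px: float, y_vanish_px: float,
                      focal_px: float = 1200.0, cam_height_m: float = 1.3) -> float:
    """Estimate the longitudinal distance to the point where an object touches
    the road, using the flat-road pinhole model
        Z = f * H / (y_contact - y_vanish).
    The focal length (pixels) and camera height are assumed example values.
    """
    dy = y_contact_px - y_vanish_px
    if dy <= 0:
        raise ValueError("contact point must lie below the vanishing point")
    return focal_px * cam_height_m / dy


# A contact point 40 px below the vanishing point -> ~39 m ahead.
print(round(ground_distance_m(y_contact_px=440.0, y_vanish_px=400.0), 1))
```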
- LEDDAR light emitting diode detection and ranging
- Lidar is a sensor that measures distance similarly to the radar but cannot sense the Doppler effect, and it therefore performs Bayesian filtering on the measured point data to calculate the speed. That is, it may be classified as a sensor that measures distance and estimates speed.
- a monocular (multi-functional front) camera which is practically currently used has a viewing angle (or horizontal viewing angle) of 52°.
- The recognition range is known to be about 30 m, within the irradiation range of the headlamps.
- A sensor that estimates the distance and the speed therefore has accuracy slightly lower than that of a sensor that directly measures them.
- FIG. 10 is an exemplified view illustrating pedestrian recognition of a far-infrared image.
- the far-infrared camera may image heat radiated from a target object to recognize a pedestrian even in a low illuminance condition or an extremely low illuminance condition and recognize a pedestrian up to 50 m that is a relatively long distance.
- The far-infrared camera has the disadvantage that, since its viewing angle of 38° is somewhat insufficient and its resolution is only about 1/10 of that of the monocular camera, the estimation accuracy of the distance and the speed is relatively lower than that of the monocular camera.
- FIG. 11 is an exemplified view illustrating a pedestrian detection using the LEDDAR.
- The LEDDAR sensor is limited in that, since only 16 detectors are arranged horizontally and the output is restrictive, it cannot recognize (classify) a pedestrian, and detection is possible only up to about 20 m.
- FIG. 12 is an exemplified view illustrating a pedestrian signal using 3D Lidar.
- The 3D Lidar sensor has a plurality of layers in the vertical direction and can therefore recognize a pedestrian by applying vertical/horizontal direction information and pattern recognition technology in restrictive situations. Specifically, pedestrians within 15 m may be recognized, and pedestrians within 40 m may be detected. Referring to FIG. 12, contour lines are revealed when a large amount of data is collected for the two pedestrians, but the amount of data decreases as the distance increases, and only the data corresponding to one layer appears at distances of 45 m or more.
- the LEDDAR and the 3D Lidar sensor commonly have a disadvantage in that they are greatly affected by a material and a color of the target object because they use a near-infrared wavelength signal.
- FIG. 13 is an exemplified view illustrating a pedestrian detection using a radar.
- The radar sensor uses radio waves reflected from obstacles in front of the vehicle. In the case of detecting a pedestrian, the power of the radio wave reflected from the pedestrian is low, so a pedestrian may be detected only within about 40 m, a shorter distance than for a vehicle.
- The detection probability is also somewhat lower than that for a vehicle, depending on the surrounding environment.
- In FIG. 13, the unit of the horizontal axis and the vertical axis is meters, and the portions where a discontinuous signal appears correspond to periods during which the pedestrian is not detected.
- FIG. 14 is a configuration diagram of the front detection sensor according to an embodiment of the present invention.
- the front detection sensor 200 may solve the above problem and may more accurately recognize a pedestrian even in the low illumination condition or at night by satisfying a recognition distance of 40m and a horizontal viewing angle of 44°.
- the front detection sensor 200 includes a radar sensor 210 detecting a pedestrian using a radio wave reflected from the pedestrian and generating radar detection information, a far-infrared sensor 220 imaging heat radiated from an object to generate far-infrared recognition information, and an information integration unit 230 integrating the radar detection information and the far-infrared recognition information.
- The radar sensor 210 has a horizontal viewing angle of 90° and a detection distance of about 40 m, and it measures distance and speed but cannot recognize (classify) the pedestrian, while the far-infrared sensor 220 has a horizontal viewing angle of 38°, a detection/recognition distance of 50 m, and estimates the distance with low accuracy.
- When the two sensors are combined, the maximum recognition distance is the shorter of the detection distance and the recognition distance; however, since the pedestrian information may be maintained even when, after the first recognition, a result is output from only one of the two sensors, the effective recognition angle becomes the wider of the recognition angle and the detection angle. That is, the radar sensor 210 and the far-infrared sensor 220 of the present invention are combined to meet the recognition distance of 40 m and the horizontal viewing angle of 44°.
- the information integration unit 230 integrates the radar detection information generated from the radar sensor 210 and the far-infrared recognition information generated from the far-infrared sensor 220.
- Prior to integrating the information, the present invention may include a radar detection information post-processing unit 211 that removes results other than the pedestrian from the radar detection information before it is integrated with the far-infrared recognition information.
- For a pedestrian, radar detections appear only restrictively because the reflection of the radio wave is relatively low. The pedestrian detection experiments confirm that detection occurs only within a certain distance and angle and that the numerical values of the reflected power, the width, and the like among the radar output information are low.
- The radar detection information post-processing unit 211 therefore removes in advance the detection results that are not the pedestrian, based on this experimental data, using conditions on the detection distance, the angle, the width, the reflected power, and the like, as sketched below.
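- A minimal sketch of such a post-processing filter follows; the threshold values are placeholders, since the text states only that the conditions are derived from experimental data on distance, angle, width, and reflected power.

```python
from dataclasses import dataclass


@dataclass
class RadarTrack:
    state: str            # e.g. "valid", "invalid"
    range_m: float
    angle_deg: float
    width_m: float
    reflected_power: float


def keep_pedestrian_candidates(tracks, max_range_m=40.0, max_abs_angle_deg=45.0,
                               max_width_m=1.0, max_power=20.0):
    """Discard radar outputs that are unlikely to be a pedestrian.

    The threshold values are placeholders; the patent states only that the
    conditions come from experimental data on distance, angle, width and
    reflected power.
    """
    kept = []
    for t in tracks:
        if t.state != "valid":
            continue                      # keep valid states only (cf. FIG. 15B)
        if t.range_m > max_range_m or abs(t.angle_deg) > max_abs_angle_deg:
            continue                      # distance / angle condition (cf. FIG. 15C)
        if t.width_m > max_width_m or t.reflected_power > max_power:
            continue                      # width / reflected-power condition (cf. FIG. 15D)
        kept.append(t)
    return kept
```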
- FIGS. 15A to 15D are exemplified views illustrating the radar detection information post-processing process of the radar detection information post-processing unit according to an embodiment of the present invention. Specifically, they show the result of post-processing detections of a pedestrian moving in the lateral direction while the vehicle equipped with the radar sensor 210 is stopped.
- Values represented on the graph are angle values of each track with time
- FIG. 15A illustrates the initial input, which includes the information of all 64 tracks, and it is very difficult to discern any particular value.
- In FIG. 15B, in which only valid states are left, it may be seen that distinct values appear.
- the graph represents an angle value with time
- Tracks corresponding to straight lines parallel to the horizontal direction may be regarded as objects that remain stationary as time passes, and a straight line appearing in the diagonal direction may be regarded as the signal of a pedestrian moving in the lateral direction, whose angle changes with time.
- FIG. 15C illustrates the results after removing the signals of FIG. 15B on the condition of the distance and the angle, which confirms that a large amount of data is removed, and FIG. 15D illustrates the results after removing the signals of FIG. 15C on the condition of the reflected power and the detection width.
- Data about a pedestrian appearing diagonally and information about some other objects that are not filtered remain.
- more than 90% of the total output from the radar sensor 210 is removed, such that errors of the information integration (combination) performed in the subsequent steps may be reduced.
- the information integration unit 230 after the post-processing of the radar detection information post-processing unit 211 (or including the case where there is no post-processing) integrates the radar detection information with the far-infrared recognition information.
- The information integration unit is configured to include: a consistency determination unit 231 that calculates a correlation between the radar detection information and the far-infrared recognition information to form a binary matrix; a similarity calculation unit 232 that uses the binary matrix formed by the consistency determination unit to generate measurement values and calculates the similarity in the state variable domain between the measurement values and the track tracking information; a state variable updating unit 233 that performs Kalman filtering using the measurement values and the state variables to update each piece of tracking information; and a tracking information management unit 234 that performs merging, generation, and deletion of tracks.
- the description of each configuration will be as follows.
- The consistency determination unit 231 calculates the correlation between a maximum of 64 radar detection results and a maximum of 8 far-infrared recognition results to form an 8 x 64 binary matrix. If the angles and distances output by the two sensors differ by no more than thresholds formed in three steps according to the conditions, it is determined that the two results are highly likely to originate from the same object and the matrix value is set to 1; otherwise, the matrix value is set to 0. The determined result serves as a set of candidate pairs transmitted to the step of calculating the similarity between the tracking information and the measurement values.
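- A minimal sketch of this gating step follows; the gate sizes stand in for the three-step thresholds, whose actual values are not given in this text.

```python
import numpy as np


def consistency_matrix(fir_objs, radar_objs, dist_gate_m=2.0, angle_gate_deg=3.0):
    """Build the binary association matrix between far-infrared recognitions
    (rows, up to 8) and radar detections (columns, up to 64).

    Each object is a (range_m, angle_deg) pair; the gate sizes stand in for
    the patent's three-step thresholds, whose values are not given here.
    """
    m = np.zeros((len(fir_objs), len(radar_objs)), dtype=np.uint8)
    for i, (r_f, a_f) in enumerate(fir_objs):
        for j, (r_r, a_r) in enumerate(radar_objs):
            if abs(r_f - r_r) <= dist_gate_m and abs(a_f - a_r) <= angle_gate_deg:
                m[i, j] = 1   # likely the same physical object
    return m


# One FIR pedestrian at (12 m, 5 deg) and two radar tracks: only the first matches.
print(consistency_matrix([(12.0, 5.0)], [(11.5, 4.0), (30.0, -10.0)]))
```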
- The similarity calculation unit 232 generates the measurement values to be used in the Kalman filter by integrating the sensor outputs paired by a value of 1 in the matrix, and adds one row and one column so that the output of a single sensor can also produce a filtering result; measurement values generated from a single-sensor result are thus calculated and used as well.
- FIG. 16 is an exemplified diagram illustrating a generation of measurement values by a similarity calculation unit according to an embodiment of the present invention.
- The similarity is calculated in the state variable domain between the generated measurement values and all track tracking information of the previous time in order to associate each track with the measurement value having the maximum similarity, and matching using a global nearest neighbor (GNN) method is performed to ensure the consistency of the entire set of track-measurement pairs.
- The measurement value to be generated needs to be output as longitudinal/lateral relative positions, which are the variables that can be compared with the state variables; since the relative positions are values contained in both the radar detection information and the far-infrared recognition information, the two results need to be integrated.
- FIGS. 17A and 17B are exemplified views comparing the integration of the longitudinal/lateral relative positions of the radar detection information and the far-infrared recognition information in a rectangular coordinate system with their integration in a polar coordinate system according to an embodiment of the present invention.
- the integration of the longitudinal / lateral relative positions is performed by using the polar coordinate system rather than the rectangular coordinate system.
- FIG. 17A illustrates the case where the longitudinal/lateral relative positions are integrated using the rectangular coordinate system; viewed from the perspective of the reflected errors, errors occurring in the longitudinal direction with respect to the error region of the radar sensor 210, represented by a red ellipse, and errors occurring in the lateral direction with respect to the error region of the far-infrared sensor 220, represented by a blue ellipse, are reflected in the integration result.
- When the target object is in front of the vehicle, relatively small errors occur, but when the target object is to the side, errors in the diagonal direction of the ellipse occur.
- FIG. 17B illustrates the case where the longitudinal / lateral relative positions are integrated using the polar coordinate system, and it may be confirmed that the small errors occur in both cases where the target object is on the front of the vehicle or on the side thereof.
- The angles and distances integrated in the polar coordinate system may be converted into the rectangular coordinate system through a simple trigonometric operation, so performing the conversion and the integration as described above at the step of outputting the final measurement value is a method that produces smaller errors.
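- A minimal sketch of the polar-coordinate integration and the subsequent trigonometric conversion follows; the weighting of the two sensors is an illustrative assumption reflecting the stated error characteristics, not the patent's actual combination rule.

```python
import math


def fuse_polar(radar_range_m, radar_angle_deg, fir_range_m, fir_angle_deg,
               w_range_radar=0.9, w_angle_fir=0.9):
    """Fuse one radar detection and one far-infrared recognition in polar
    coordinates, then convert to longitudinal/lateral position.

    The weights are illustrative stand-ins for the sensor error models:
    the radar is trusted more for range and the far-infrared camera more
    for angle, which is the rationale given for the polar integration.
    """
    rng = w_range_radar * radar_range_m + (1.0 - w_range_radar) * fir_range_m
    ang = w_angle_fir * fir_angle_deg + (1.0 - w_angle_fir) * radar_angle_deg
    a = math.radians(ang)
    longitudinal = rng * math.cos(a)
    lateral = rng * math.sin(a)
    return longitudinal, lateral


# Radar: 20.0 m at 6 deg; FIR: 22.0 m at 4 deg  ->  fused ~20.2 m at ~4.2 deg
print(tuple(round(v, 2) for v in fuse_polar(20.0, 6.0, 22.0, 4.0)))
```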
- the state variable updating unit 233 may update each track tracking information by performing the Kalman filtering using the integrated measurement value and the state variable.
- The present invention may also apply an extended Kalman filter configured to change the element values of the state transition matrix at every step, in consideration of the fact that the time at which the output of the far-infrared sensor 220 is generated may vary depending on the number of candidate groups at each step.
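- The state variables are not spelled out in this text, so the sketch below assumes a simple position/velocity state, for which the prediction with a time-step-dependent transition matrix is effectively a linear Kalman filter; it is meant only to illustrate the idea of rebuilding the transition matrix from the variable measurement timing.

```python
import numpy as np


def predict(x, P, dt, q=1.0):
    """Constant-velocity prediction with a transition matrix rebuilt from the
    actual time step, mirroring the variable far-infrared output timing."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)   # state: [x, y, vx, vy]
    Q = q * np.eye(4) * dt                        # simple process noise
    return F @ x, F @ P @ F.T + Q


def update(x, P, z, r=0.5):
    """Position-only measurement update (the fused longitudinal/lateral
    position serves as the measurement)."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P


x, P = np.zeros(4), np.eye(4) * 10.0
x, P = predict(x, P, dt=0.07)            # dt varies cycle to cycle
x, P = update(x, P, np.array([20.1, 1.5]))
print(np.round(x, 2))
```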
- the tracking information management unit 234 performs the merging, generation, and deletion of the track.
- the later generated track is configured to be merged into the first generated track.
- For track generation, the outputs of the far-infrared sensor 220 that have not been applied as measurement values of an existing track are examined; when corresponding radar detection information of the radar sensor 210 is found, a new track is generated in the same way by integrating the two in the polar coordinate system, and when the reliability of the far-infrared recognition information is high while the corresponding radar detection information cannot be found, the track may be generated from the far-infrared output alone.
- When no measurement values are allocated to a track for a predetermined period of time, so that the track is maintained only by predicted values, the track is deleted.
- However, coast tracking is set to be maintained for a longer period of time in consideration of the fact that sensor outputs are difficult to generate in close-approach situations within 5 m.
- Even when measurement values are allocated at every step, a track is deleted if the allocation is continuously made only from the result of a single sensor rather than the integrated result, which is judged abnormal; likewise, when the longitudinal/lateral speed component of the state variable exceeds 15 km/h, the corresponding object is regarded as not being a pedestrian and is deleted.
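- A minimal sketch of these deletion rules follows; the cycle-count limits are assumptions, since only the 15 km/h speed limit and the 5 m close-approach exception are stated above.

```python
MAX_PED_SPEED_KMH = 15.0
MAX_COAST_CYCLES = 10          # assumed value; the patent gives no number
MAX_SINGLE_SENSOR_CYCLES = 20  # assumed value


def should_delete_track(coast_cycles, single_sensor_cycles,
                        vx_ms, vy_ms, range_m):
    """Apply the deletion rules described above to one track.

    coast_cycles: consecutive cycles with no measurement allocated.
    single_sensor_cycles: consecutive cycles updated by only one sensor.
    vx_ms, vy_ms: longitudinal / lateral speed components of the state.
    """
    # Coast tracking is kept longer when the object is closer than 5 m.
    coast_limit = MAX_COAST_CYCLES * (3 if range_m < 5.0 else 1)
    if coast_cycles > coast_limit:
        return True
    if single_sensor_cycles > MAX_SINGLE_SENSOR_CYCLES:
        return True                       # single-sensor-only updates judged abnormal
    if max(abs(vx_ms), abs(vy_ms)) * 3.6 > MAX_PED_SPEED_KMH:
        return True                       # moving too fast to be a pedestrian
    return False
```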
- FIGS. 18A to 18F are exemplified views illustrating an output result of the front detection sensor according to an embodiment of the present invention.
- In FIGS. 18A, 18C, and 18E, white dots represent the radar detection information, red dots represent the far-infrared recognition information, and a sky-blue circle represents the information integration result. Further, FIGS. 18B, 18D, and 18F illustrate the images captured by the far-infrared sensor 220 at the time of the integration.
- the integration result of the relative positions of the pedestrian is outputted normally from the front detection sensor 200.
- the front detection sensor 200 includes the radar sensor 210 and the far-infrared sensor 220 in an automatic emergency braking system applied to a vehicle and integrates and uses the results using the polar coordinate system, such that the distance and the relative speed between the vehicle and the pedestrian may be more accurately calculated even in the low illuminance condition or at nighttime.
- FIG. 19 is an exemplified view for identifying gaze information of the front detection sensor according to the embodiment of the present invention.
- the front detection sensor 200 may detect the gaze information of the pedestrian and reflect the detected gaze information to the operation of the PDCMS of the electronic control unit 400.
- the front detection sensor 200 sets a horizontal length of the entire face of the pedestrian to be x, a horizontal length from a left face contour to a left eye to be x1, a horizontal length from a right face contour to a right eye to be x2, and a horizontal length between the left eye and the right eye to be x3 and then detects the gaze information of the pedestrian.
- the gaze information of the pedestrian corresponds to the front or the diagonal if
- the front detection sensor 200 transmits the detected gaze information of the pedestrian to the electronic control unit 400.
- the method for reflecting the gaze information of the pedestrian received by the electronic control unit 400 to the operation of the PDCMS will be described with reference to FIGS. 23 to 26 .
- FIG. 20 is a diagram illustrating a concept of a pedestrian moving speed.
- the front detection sensor 200 may detect a distance between a pedestrian 600 and a vehicle 700 that are moving within a driving lane and a moving speed of the pedestrian 600.
- the pedestrian 600 moves from the left to the right with respect to a front view of the vehicle 700, the pedestrian 600 has a negative (-) moving speed and if the pedestrian 600 moves from the right to left with respect to the front view of the vehicle 700, the pedestrian 600 has a positive (+) moving speed.
- the front detection sensor 200 may detect the distance between the vehicle 700 and the pedestrian 600 moving on the driving lane of the vehicle.
- FIG. 21 is a diagram illustrating an example of a mapping table for activating a PDCMS function according to the embodiment of the present invention.
- the electronic control unit 400 uses the mapping table to determine the risk of collision of the pedestrian with the vehicle, and furthermore, whether the PDCMS function is operated.
- the electronic control unit 400 determines the operation of the PDCMS function based on an initial speed at a boundary of a driving lane on which the pedestrian moves and an initial speed of the vehicle.
- the electronic control unit 400 determines that the PDCMS function is operated.
- the operation possible area means the area in which the Vmin or the Vmax may be adjusted according to the selection of the manufacturer.
- the electronic control unit 400 may determine that the PDCMS is in the deactivation state and thus the PDCMS function is not operated.
- the electronic control unit 400 may determine that the PDCMS function is operated.
- FIG. 22 is a diagram illustrating an example of the operation of the PDCMS function according to the embodiment of the present invention.
- a vertical axis represents the TTC derived from the distance and the relative speed between the vehicle and the pedestrian and a horizontal axis represents the operation of the PDCMS function of the vehicle.
- the electronic control unit 400 performs the PDCMS operation by steps according to the distance between the vehicle and the pedestrian.
- the warning of the warning unit 500 may include the visual warning through the display unit or the audible warning through the speaker unit.
- the partial braking means reducing the speed of the vehicle to at least a predetermined speed or more and the full braking means maximally reducing the speed of the vehicle.
- the driver may manually operate the brake to perform the maximum possible deceleration. That is, the driver may manually operate the brake to reduce the speed of the vehicle more than the sequential deceleration according to the PDCMS function.
- FIGS. 23 to 26 illustrate the method for applying the gaze information of the pedestrian received from the front-side sensor 200 to the operation of the PDCMS function for performing the sequential deceleration of the electronic control unit 400.
- the pedestrian When the pedestrian recognizes the driving direction of the vehicle, the pedestrian is less likely to move to the inside of the route of the vehicle is low and the possibility of collision is reduced accordingly.
- the gaze direction of the pedestrian when the gaze direction of the pedestrian is not taken into consideration, the gaze of the pedestrian is directed toward the front of the vehicle, and therefore it is likely to activate the warning and the operation of the brake due to the unnecessary operation of the PDCMS function even in the situation that the pedestrian recognizes the driving direction of the vehicle and the possibility of collision is low.
- the essential purpose of the PDCMS for protecting the pedestrian may be achieved and the reduction in the ride comfort of the driver may be prevented.
- FIG. 23 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the front according to the embodiment of the present invention.
- the horizontal length of the entire face of the pedestrian is x
- the horizontal length from the left face contour to the left eye is defined as x 1
- the horizontal length from the right face contour to the right eye is defined as x 2
- the horizontal length between the left eye and the right eye is defined as x 3 .
- the gaze of the pedestrian is directed toward the front of the vehicle, it corresponds to the case where
- the electronic control unit 400 may delay the warning of the driver and the partial braking by a predetermined time from the initially set times t 1 and t 2 as illustrated in FIG. 22 during the operating time of the PDCMS function.
- the electronic control unit 400 may delay all of the warning of the driver, the partial braking, and the full braking by a predetermined time from all of the initially set times t 1 , t 2 , and t 3 as illustrated in FIG. 22 during the operating time of the PDCMS function.
- FIG. 24 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the diagonal according to the embodiment of the present invention.
- the gaze of the pedestrian is directed toward the diagonal of the vehicle, it corresponds to the case where
- the electronic control unit 400 may delay the warning of the driver and the partial braking by a predetermined time from the initially set times t 1 and t 2 as illustrated in FIG. 22 during the operating time of the PDCMS function.
- the electronic control unit 400 may delay all of the warning of the driver, the partial braking, and the full braking by a predetermined time from all of the initially set times t 1 , t 2 , and t 3 as illustrated in FIG. 22 during the operating time of the PDCMS function.
- FIG. 25 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the rear according to the embodiment of the present invention.
- the electronic control unit 400 may not delay the operating time of the PDCMS function from the initially set times t 1 , t 2 , and t 3 as illustrated in FIG. 22 .
- FIG. 26 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the side according to the embodiment of the present invention.
- the gaze direction of the pedestrian forms about 90° with respect to the driving direction of the vehicle, that is, if the vehicle is driving toward the side of the pedestrian, it corresponds to the case where
- the electronic control unit 400 may not delay the operating time of the PDCMS function from the initially set times t 1 , t 2 , and t 3 as illustrated in FIG. 22 .
- FIG. 27 is a flow chart illustrating a flow of a method for activating a PDCMS function according to an embodiment of the present invention.
- a method for activating a pedestrian detection and collision mitigation system (PDCMS) function of a vehicle includes: generating radar detection information by detecting a pedestrian on a driving lane of a vehicle using a radio wave reflected from a pedestrian (S100); generating far-infrared recognition information by imaging heat radiated from the pedestrian (S200); detecting pedestrian information including presence of the pedestrian on a driving lane of the vehicle, gaze information of the pedestrian, and a distance and a relative speed between the pedestrian and the vehicle by integrating the radar detection information with the far-infrared recognition information (S300); detecting vehicle information including at least any one of a speed, an acceleration, a steering angle, a steering angular velocity, and a pressure of a master cylinder of the vehicle (S400); and activating a PDCMS function based on the pedestrian information and the vehicle information (S500).
- PDCMS pedestrian detection and collision mitigation system
- the heat radiated by the target object may be imaged and therefore the pedestrian may be recognized even in the low illumination or extremely low illumination condition.
- the heat radiated by the target object is imaged and therefore the pedestrian may be recognized even in the low illumination or extremely low illuminance condition.
- the detecting of the pedestrian information including the presence of the pedestrian on the driving lane of the vehicle and the distance and the relative speed between the pedestrian and the vehicle by integrating the radar detection information with the far-infrared recognition information (S300), it is possible to more accurately detect the distance and the relative speed between the vehicle and the pedestrian even in the low illuminance condition or at night.
- the vehicle sensor 300 may detect the information on the driving conditions of the vehicle such as the acceleration, the steering angle, the steering angular velocity, the pressure of the master cylinder, and the like.
- the PDCMS function based on the pedestrian information and the vehicle information it is determined whether to operate the PDCMS function of the vehicle based on the mapping table using the acquired pedestrian information and pedestrian information. Specifically, it is determined whether the conditions that the PDCMS function on the mapping table may be operated are satisfied based on the integration of the pedestrian information and the vehicle information. That is, the risk of collision of the pedestrian with the vehicle is determined on the mapping table using the current position of the pedestrian, the current position of the vehicle, and the speed information of the vehicle.
- the PDCMS function of the vehicle is operated if it is determined that the pedestrian state and the vehicle state satisfy the conditions that the PDCMS function may start on the mapping table.
- the PDCMS function includes the activation of the operation of the warning unit that is operated to inform the driver of the collision of the pedestrian with the vehicle and the operation of the brake regardless of whether the driver operates the brake. Further, the activation of the operation of the warning unit and the activation of the operation of the brake are performed in order of the activation of the operation of the warning unit, the partial braking of the vehicle, and the full braking of the vehicle.
- the apparatus for activating a PDCMS of a vehicle may include different types of sensors and may integrate and use the respective sensing results to more accurately detect the presence of the pedestrian and the distance and the relative speed between the vehicle and the pedestrian even in the low illuminance condition and at night.
- the embodiment of the present invention may classify the gaze of the pedestrian into the front, the diagonal, the rear, and the side to identify whether the pedestrian may recognize the vehicle or not and then apply the identified situation to the operation of the PDCMS, such that the vehicle braking control may be delayed by the predetermined time only when the pedestrian may recognize the vehicle.
- the apparatus for activating a PDCMS may more accurately detect the pedestrian to effectively protect the pedestrian and optimize the vehicle braking control unnecessarily frequently performed to resolve the sense of difference and discomfort of the driver, thereby improving the commerciality of the vehicle.
Description
- Exemplary embodiments of the present invention relate to a sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method, and more particularly, to an apparatus and a method for activating a Pedestrian Detection and Collision Mitigation Systems (PDCMS) of a vehicle capable of recognizing a pedestrian by allowing a front detection sensor to integrate sensing results from different types of sensors and protecting a pedestrian by activating a PDCMS function when an accident is highly likely to occur by analyzing gaze information of the pedestrian.
- Recently, advanced driver assistance systems (ADAS) have been developed to assist the driver. ADAS comprises multiple sub-technology categories, one of which is the PDCMS.
- The PDCMS is a technology that warns a driver of a vehicle of a pedestrian collision when a collision of a pedestrian with the vehicle is expected and automatically activates an emergency brake of the vehicle.
- Lethality and injury rates of pedestrian-related traffic accidents are high and lead to considerable loss of life. The PDCMS may help reduce the speed of the vehicle before an unavoidable pedestrian collision, thereby alleviating the pedestrian impact and reducing the lethality and injury rates.
- Therefore, technology for the practical application of the PDCMS has been required.
- DE 10 2011 112985 A1 and JP 2014-059841 A are acknowledged as prior art.
- An object of the present invention is to provide an apparatus for activating a PDCMS including a front detection sensor capable of more accurately detecting presence of a pedestrian and measuring a distance and a relative speed between a vehicle and the pedestrian even in a low illuminance condition or at night by integrating sensing results from different types of sensors.
- Another object of the present invention is to provide a system for more safely protecting a pedestrian by accurately activating a PDCMS function.
- Other objects and advantages of the present invention can be understood by the following description, and become apparent with reference to the embodiments of the present invention. Also, it is obvious to those skilled in the art to which the present invention pertains that the objects and advantages of the present invention can be realized by the means as claimed and combinations thereof.
- In accordance with one aspect of the present invention, an apparatus for activating a pedestrian detection and collision mitigation system (PDCMS) of a vehicle according to independent claim 1 is provided.
- The information integration unit may include: a consistency determination unit calculating a correlation between the radar detection information and the far-infrared recognition information to form a binary matrix; a similarity calculation unit generating a measurement value using the binary matrix formed by the consistency determination unit and calculating similarity in a state variable area between the measurement value and track tracking information of a previous time; a state variable updating unit updating each tracking information by performing Kalman filtering using the measurement value and the state variable; and a tracking information management unit performing merging, generation, and deletion of the track.
- The similarity calculation unit may represent longitudinal / lateral relative positions of the radar detection information and the far-infrared recognition information using a polar coordinate system to generate the measurement value.
- The state variable updating unit may apply an extended Kalman filter to perform the Kalman filtering.
- The front detection sensor may further include a radar detection information post-processing unit removing a result other than the pedestrian among the radar detection information before the integration of the result with the far-infrared recognition information.
- The radar detection information may be generated in a number equal to or less than 64.
- The far-infrared recognition information may be generated in a number equal to or less than 8.
- When a horizontal length of the entire face of the pedestrian is defined as x, a horizontal length from a left face contour to a left eye of the pedestrian is defined as x1, a horizontal length from a right face contour to a right eye of the pedestrian is defined as x2, and a horizontal length between the left eye and the right eye of the pedestrian is defined as x3, the gaze information of the pedestrian may correspond to a front or a diagonal if |(x1 - x2) / x| < a, a rear if x1 = x2 = x3 = 0, and a side if |(x1 - x2) / x| ≥ a or x1 = 0 or x2 = 0, and the a may be selected in a range between 0.65 and 0.95.
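- For illustration only, the gaze classification rule above may be sketched as follows; the function name, the example measurements, and the default threshold a = 0.8 are assumptions of this sketch, not part of the claimed apparatus.

```python
def classify_gaze(x: float, x1: float, x2: float, x3: float, a: float = 0.8) -> str:
    """Classify pedestrian gaze from horizontal face measurements.

    x  : horizontal length of the entire face
    x1 : horizontal length from the left face contour to the left eye
    x2 : horizontal length from the right face contour to the right eye
    x3 : horizontal length between the left eye and the right eye
    a  : threshold, selected between 0.65 and 0.95 in the description
    """
    if x1 == 0 and x2 == 0 and x3 == 0:
        return "rear"               # no facial features are visible
    if abs((x1 - x2) / x) >= a or x1 == 0 or x2 == 0:
        return "side"               # one eye hidden or a strongly asymmetric face
    return "front_or_diagonal"      # |(x1 - x2) / x| < a


# Example: a nearly symmetric face is classified as front or diagonal.
print(classify_gaze(x=20.0, x1=4.0, x2=5.0, x3=8.0))  # front_or_diagonal
```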
- The electronic control unit may perform the activation of the operation of the warning unit and the activation of the operation of the brake by delaying the activation of the operation of the warning unit and the activation of the operation of the brake by a predetermined time when the gaze information of the pedestrian is the front or the diagonal, compared to when the gaze information of the pedestrian is the rear or the side.
- The vehicle sensor may further include at least any one of a rain sensor, a temperature sensor, and an illumination sensor.
- The electronic control unit may perform the activation of the operation of the brake so that the speed of the vehicle is reduced to at least a predetermined speed or more from time when the operation of the brake is activated to time when the collision of the pedestrian with the vehicle occurs.
- The electronic control unit may permit the driver to operate the brake for a maximum possible deceleration even after the activation of the operation of the brake starts.
- The electronic control unit may control the warning unit to inform the driver that the PDCMS function is in an available state.
- The warning unit may include a display unit visually informing the driver of the collision of the pedestrian with the vehicle or a speaker unit audibly informing the driver of the collision of the pedestrian with the vehicle.
- The PDCMS function may further include an operation of a rear brake lamp.
- The PDCMS function may further include an operation of an electrical stability control (ESC).
- In accordance with another aspect of the present invention, a method for activating a pedestrian detection and collision mitigation system (PDCMS) of a vehicle according to independent claim 15 is provided.
- The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
- FIG. 1 is a diagram illustrating a schematic concept of a PDCMS;
- FIG. 2 is a block diagram illustrating a change in a PDCMS state according to a vehicle;
- FIG. 3 is a block diagram schematically illustrating an apparatus for activating a PDCMS of a vehicle according to an embodiment of the present invention;
- FIG. 4 is an exemplified view illustrating an AEB VRU scenario;
- FIG. 5 is an exemplified view illustrating initial position setting in a CVFA scenario;
- FIG. 6 is an exemplified view illustrating a pedestrian speed profile used in a simulation;
- FIGS. 7A to 7F are exemplified views illustrating simulation results of a relative distance and a relative angle of a pedestrian;
- FIG. 8 is an exemplified view illustrating a distance and speed measurement principle of a radar;
- FIG. 9 is an exemplified view illustrating a distance calculation principle in an image using a vanishing point;
- FIG. 10 is an exemplified view illustrating pedestrian recognition of a far-infrared image;
- FIG. 11 is an exemplified view illustrating a pedestrian detection using LEDDAR;
- FIG. 12 is an exemplified view illustrating a pedestrian signal using 3D Lidar;
- FIG. 13 is an exemplified view illustrating a pedestrian detection using a radar;
- FIG. 14 is a configuration diagram of a front detection sensor according to an embodiment of the present invention;
- FIGS. 15A to 15D are exemplified views illustrating a radar detection information post-processing process of a radar detection information post-processing unit according to an embodiment of the present invention;
- FIG. 16 is an exemplified diagram illustrating a generation of measurement values by a similarity calculation unit according to an embodiment of the present invention;
- FIGS. 17A and 17B are exemplified views illustrating a comparison between an integration of longitudinal / lateral relative positions of radar detection information and far-infrared recognition information by a rectangular coordinate system and an integration of the longitudinal / lateral relative positions by a polar coordinate system according to an embodiment of the present invention;
- FIGS. 18A to 18F are exemplified views illustrating an output result of the front detection sensor according to an embodiment of the present invention;
- FIG. 19 is an exemplified view for identifying gaze information of a front detection sensor according to an embodiment of the present invention;
- FIG. 20 is a diagram illustrating a concept of a pedestrian moving speed;
- FIG. 21 is a diagram illustrating an example of a mapping table for activating a PDCMS function according to the embodiment of the present invention;
- FIG. 22 is a diagram illustrating an example of the operation of the PDCMS function according to the embodiment of the present invention;
- FIG. 23 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the front according to the embodiment of the present invention;
- FIG. 24 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the diagonal according to the embodiment of the present invention;
- FIG. 25 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the rear according to the embodiment of the present invention;
- FIG. 26 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the side according to the embodiment of the present invention; and
- FIG. 27 is a flow chart illustrating a flow of a method for activating a PDCMS function according to an embodiment of the present invention.
- Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily practice the present invention.
- A part irrelevant to the description will be omitted to clearly describe the present invention, and the same elements will be designated by the same reference numerals throughout the specification.
-
FIG. 1 is a diagram illustrating a schematic concept of a PDCMS. - The PDCMS is a technology that warns a driver of a vehicle of a pedestrian collision when a collision of a pedestrian with the vehicle is expected and automatically activates an emergency brake of the vehicle.
- Referring to
FIG. 1 , it is determined whether the PDCMS is operated based on an operation determination of a pedestrian and an operation determination of a vehicle. When the operation of the PDCMS is determined, the PDCMS function is performed by issuing a warning to driver and activating a vehicle control. - A system designer may design the PDCMS function to operate solely in the risk of collision of a pedestrian with a vehicle or may design the PDCMS function to operate in combination with other driving assistance systems.
-
FIG. 2 is a block diagram illustrating a change in a PDCMS state according to a vehicle. - In a PDCMS off state, no action is taken on the operation of the vehicle. The PDCMS off state is produced when an engine of a vehicle stalls.
- In the PDCMS deactivation state, the apparatus for activating a PDCMS monitors a speed of a vehicle and determine whether the PDCMS is in an appropriate state to activate. The PDCMS deactivation state is produced by turning on the engine in the PDCMS off state. Further, the PDCMS deactivation state is produced even when the vehicle is in a state other than the conditions that the vehicle is activated from the PDCMS activation state. For example, when the speed of the vehicle falls below a predetermined value Vmin, the PDCMS deactivation state is produced.
- The PDCMS activation state is produced when the speed of the vehicle is equal to or greater than the predetermined value Vmin and equal to or less than a predetermined value Vmax. To determine whether to operate the PDCMS function in the PDCMS activation state, an operation of a pedestrian and an operation of a vehicle are monitored. When the apparatus for activating a PDCMS determines that the PDCMS function needs to be operated, the PDCMS function starts. The PDCMS function includes a collision warning to a driver and an operation of an emergency brake or optionally includes braking actions by a driver.
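- As a minimal illustration of the state logic described above, the following sketch derives the off, deactivation, and activation states from the engine state and the vehicle speed; the function name is an assumption, and the default Vmin / Vmax values are only the example values mentioned later in connection with FIG. 21 (30 km/h and 60 km/h), which a manufacturer may adjust.

```python
def pdcms_state(engine_on: bool, vehicle_speed_kmh: float,
                v_min: float = 30.0, v_max: float = 60.0) -> str:
    """Return the PDCMS state for the current vehicle condition (illustrative)."""
    if not engine_on:
        return "PDCMS_OFF"            # engine stalled or switched off
    if v_min <= vehicle_speed_kmh <= v_max:
        return "PDCMS_ACTIVATION"     # pedestrian and vehicle operation are monitored
    return "PDCMS_DEACTIVATION"       # speed outside [Vmin, Vmax]


assert pdcms_state(False, 50.0) == "PDCMS_OFF"
assert pdcms_state(True, 20.0) == "PDCMS_DEACTIVATION"
assert pdcms_state(True, 45.0) == "PDCMS_ACTIVATION"
```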
-
FIG. 3 is a block diagram schematically illustrating an apparatus for activating a PDCMS of a vehicle according to an embodiment of the present invention. - Referring to
FIG. 3 , aPDCMS operating apparatus 100 of a vehicle according to an embodiment of the present invention includes afront detection sensor 200, avehicle sensor 300, anelectronic control unit 400, and awarning unit 500. - The
front detection sensor 200 includes a radar sensor and a far-infrared sensor (which can be in a form of camera) and integrates and uses output results of each of the sensors to more accurately detect a distance and a relative speed between a vehicle and a pedestrian in a low illuminance condition or at night. Thefront detection sensor 200 may extract characteristics of obstacles detected in front of the vehicle to identify objects and detect various objects such as vehicles on a roadside as well as pedestrians. Thefront detection sensor 200 may detect even parts configuring a pedestrian as well as the overall appearance of the pedestrian to detect the pedestrian even when only a part of the pedestrian covered by various objects such as vehicles on a roadside is detected. Further, thefront detection sensor 200 may detect gaze information of a pedestrian when an object in front of the vehicle is determined as the pedestrian. Thefront detection sensor 200 transmits the detected information on the pedestrian to theelectronic control unit 400. - The
vehicle sensor 300 measures revolutions per minute (RPM) of a vehicle wheel from a vehicle engine and calculates a driving speed of a vehicle based on the known circumference of the wheel and the measured RPM and time. Further, thevehicle sensor 300 may detect information on driving conditions of a vehicle such as acceleration, a steering angle, a steering angular velocity, and a pressure of a master cylinder. Further, thevehicle sensor 300 may also detect information on driving environment of a vehicle by including a rain sensor, a temperature sensor, an illuminance sensor, etc. Thevehicle sensor 300 may transmit the information on the detected driving conditions and driving environment of the vehicle to theelectronic control unit 400. - The
electronic control unit 400 determines whether to operate the PDCMS function of the vehicle based on the information received from thefront detection sensor 200 and thevehicle sensor 300. Specifically, theelectronic control unit 400 determines whether the conditions that the PDCMS function may be operated are satisfied by combining the pedestrian state and the vehicle state. That is, theelectronic control unit 400 determines the risk of collision between a vehicle and a pedestrian using a current position of the pedestrian, a current position of the vehicle, and speed information on the vehicle if it is determined that an obstacle is the pedestrian. For example, if the distance between the pedestrian and the vehicle is below a predetermined distance and the motion direction of the pedestrian is the same as the movement direction of the vehicle, it is determined that the conditions that the PDCMS function may be operated are satisfied because the collision is highly likely to occur and if the distance between the pedestrian and the vehicle is below a predetermined distance but the motion direction of the pedestrian differs from the movement direction of the vehicle, it is determined that the conditions that the PDCMS function may be operated are not satisfied because the collision is less likely to occur. - Preferably, the
electronic control unit 400 determines whether the conditions that the PDCMS function may be operated are satisfied based on the mapping table. The mapping table will be described below with reference toFIG. 6 . - If the
electronic control unit 400 determines that the pedestrian state and the vehicle state satisfy the conditions that the PDCMS function may start, the PDCMS function of the vehicle is operated. The PDCMS function includes operating thewarning unit 500 to warn the driver of the collision of the pedestrian with the vehicle or operating the brake without the operation of the driver. - Warning the driver of the collision of the pedestrian with the vehicle is performed by operating the
warning unit 500. Thewarning unit 500 is operated by the control of theelectronic control unit 400. Thewarning unit 500 may include a display unit or a speaker unit. The display unit included in thewarning unit 500 may provide a driver with a visual warning through a head-up display, a navigation display, etc. The speaker unit included in thewarning unit 500 may provide a driver with an audible warning through an audio. The content of the warning that thewarning unit 500 performs is that there is a potential risk of collision of the pedestrian with the vehicle since obstacles exist in the front of a driving lane of the vehicle. - The activation of the operation of the brake regardless of whether the driver operates the brake is performed only by the control of the
electronic control unit 400 without the operation of the driver. The activation of the operation of the brake is to automatically reduce the relative speed between the vehicle and the pedestrian if it is found that the pedestrian collision is just around the corner. - The activation of the operation of the brake is performed so that the speed of the vehicle may be reduced to at least a predetermined speed or more from the time when the operation of the brake is activated to the time when the collision of the pedestrian with the vehicle occurs. Preferably, the predetermined speed may be 20 km/h.
- Further, even after the activation of the operation of the brake starts, the driver manually operates the brake, thereby performing the maximum possible deceleration. That is, the driver may manually operate the brake so that the speed of the vehicle is reduced more than the predetermined speed. For example, the driver may manually operate the brake so that the speed of the vehicle is maximally decelerated to 20 km / h or more that is the predetermined speed.
- In addition, the
electronic control unit 400 may inform a driver that the PDCMS function is in an available state. Specifically, theelectronic control unit 400 may control thewarning unit 500 to inform the driver that the PDCMS function is in the available state through the display unit or the speaker unit of thewarning unit 500. - In addition, the PDCMS function may control an operation of a brake lamp to prevent the potential risk of collision with the following vehicles.
- In addition, the PDCMS function may further include an operation of an electrical stability control (ESC). The ESC is an apparatus that allows a vehicle itself to intervene in an operation of a brake of the vehicle in an emergency situation such as an oversteer (when a vehicle enters inwardly beyond a turning radius of a road) or an understeer (when a vehicle deviates outwardly beyond the turning radius of the road) of a vehicle to thereby help a driver to escape from an emergency situation.
-
FIG. 4 is an exemplified view illustrating an AEB VRU scenario. - Prior to describing the detailed configuration of the
front detection sensor 200 according to the embodiment of the present invention, there is a need to analyze a pedestrian AEB scenario of a new vehicle assessment program (Euro NCAP), which may be the most important application field for understanding the configuration of thefront detection sensor 200. - The Euro NCAP sets the pedestrian AEB as a test item by defining the pedestrian AEB as a test item under the name of vulnerable road use (AEB VRU) and describes in detail a test procedure and a scoring method under the name of test protocol
- In the Euro NCAP test protocol, an appearance of a pedestrian is defined as vehicle to VRU far-side adult (CVFA), vehicle to VRU near-side adult (CVNA), and vehicle to VRU near-side child (CVNC) as illustrated in
FIG. 6 . - The CVFA is assumed to be a situation in which an adult moves at a speed of 8 km / h to collide with a center of a vehicle, the CVNA is assumed to be a situation in which an adult moves at a speed of 5 km / h to collide with points corresponding to 25% and 75% of a vehicle width, and the CVNC is assumed to be a situation in which a child moves at a speed of 5 km / h between stopped obstacle vehicles to collide with a center of a vehicle.
- In each situation, the moving speed of the vehicle is set to be 20 to 60 km / h, and the movement of the vehicle is set to start from time to collision (TTC) that is about 4.0 seconds.
- The scoring criterion after AEB braking is changed depends on how much the speed of the vehicle is decelerated from the first moving speed. In order to get a perfect score, the collision need not occur up to 40 km / h and the deceleration needs to be made at a speed higher than that.
- For the purpose, setting of a first braking time and a deceleration at the time of the braking time is considered as an important factor, but the present invetnion relates to the pedestrian recognition, and a braking control related algorithm corresponds to one deviating from the scope of the present invention. However, for the warning, the first recognition needs to be performed before the warning time. Here, the time corresponds to the TTC of about 2 seconds.
-
FIG. 5 is an exemplified view illustrating initial position setting in a CVFA scenario. - In the test protocol, the initial position as illustrated in
FIG. 5 is set according to the scenario. According to the defined content, since the pedestrian has a constant acceleration section and a first lateral position, the overall motion of the pedestrian is the order of stop, uniformly accelerated motion, and uniform velocity motion. -
FIG. 6 is an exemplified view illustrating a pedestrian speed profile used in a simulation. - The time intervals for each motion are a function of a final speed, an initial position, and an acceleration distance. Assuming the pedestrian speed profile as illustrated in
FIG. 8 , the CVFA has a time interval of 1.075 seconds at t1, 0.90 seconds at t2, and 2.025 seconds at t3, the CVNA has a time interval of 1.10 seconds at t1, 0.72 seconds at t2, and 2.16 seconds at t3, and the CVNC has a time interval of 1.10 seconds at t1, 0.72 seconds at t2, and 2.16 seconds at t3. -
FIGS. 7A to 7F are exemplified views illustrating simulation results of a relative distance and a relative angle of a pedestrian. - Calculating the relative position and the relative angle by calculating the position of the vehicle and the position of the pedestrian based on the speed profile obtained by the calculation, the results as illustrated in
FIG. 9 are obtained. The horizontal axes of each graph are time in msec, and the calculation is performed for a total of 4 seconds from the TTC of 4.0 seconds to the TTC of 0.0 seconds. - That is,
FIG. 7A illustrates the result of calculating the relative distance with time of the CVFA, in which the relative distance is uniformly increased depending on the speed of the vehicle,FIG. 7B illustrates the relative angle with time of the CVFA, in which a pedestrian exists at the left of the vehicle and therefore an angle value appears as a negative number. In the beginning, it may be confirmed that an absolute value of the angle increases as the vehicle approaches and then has a constant relative angle when the vehicle enters a uniform velocity section. -
FIGS. 7C and 7D may illustrate that the relative angle at the TTC of 0.0 is -90° or 90° as a result of the relative angle in the case ofCVNA 25/75, andFIGS. 7E and 7F illustrates the relative distance and the relative angle of the CVNC and the head-on collision situation, and therefore it may be confirmed that the behavior of the CVNC appears as the same as that of the CVFA but is not observed at the beginning due to obstacles and the information thereon appears from the TTC of about 2.0 seconds. - As described above, when a recognition target is calculated based on the TTC of about 2 seconds that is a warning time, it is required to satisfy a maximum recognition distance of 40 m at a vehicle speed of 60 km / h and a maximum recognition angle of 44° at a vehicle speed of 20 km / h.
- In order to achieve the pedestrian recognition by satisfying the above-mentioned recognition distance and recognition angle, the AEB pedestrian recognition system needs to include a sensor for pedestrian recognition and may have recognition characteristics according to the applied sensor.
-
FIG. 8 is an exemplified view illustrating a distance and speed measurement principle of a radar. - For example, the radar uses a method for comparing a received radio wave and a transmitted radio wave to output a distance through a phase difference and output a speed through a frequency change by a Doppler effect. Since two physical quantities of the distance and the speed measured by the radar are all directly obtained from the sensor, a radar sensor is classified into sensors that measure a distance and a speed.
-
FIG. 9 is an exemplified view illustrating a distance calculation principle in an image using a vanishing point. - In contrast to the radar, a camera needs to calculate a three-dimensional space using the obtained pixel information and calculate a distance using on a camera model or using a correlation between a size and a distance of a pixel unit of a recognized object, and therefore is classified as a sensor of estimating the distance. However,
FIG. 9 is one example illustrating the distance calculation principle in the image using the vanishing point, and the distance estimation results may vary depending on internal and external variables of the camera. - In addition, light emitting diode detection and ranging (LEDDAR) or Lidar is a sensor that measures the distance similar to the radar but may not sense the Doppler effect, and therefore performs Bayesian filtering on the data of the measured points to calculate the speed. That is, it may be classified as a sensor that measures a distance and estimates a speed.
- A monocular (multi-functional front) camera which is practically currently used has a viewing angle (or horizontal viewing angle) of 52°. When a pedestrian is recognized by using the monocular camera at night, the recognition range is known as 30 m within an irradiation range of a head lamp. On the other hand, it measures the distance and the speed, and therefore has the accuracy slightly lower than that of the distance measurement sensor.
-
FIG. 10 is an exemplified view illustrating pedestrian recognition of a far-infrared image. - There is an advantage in that since the far-infrared camera may image heat radiated from a target object to recognize a pedestrian even in a low illuminance condition or an extremely low illuminance condition and recognize a pedestrian up to 50 m that is a relatively long distance. However, the far-infrared camera has a disadvantage in that since the viewing angle is somewhat insufficient as 38° and the resolution is only 1/10 of that of the monocular camera, the estimation accuracy of the distance and the speed is relatively lower than that of the monocular camera.
-
FIG. 11 is an exemplified view illustrating a pedestrian detection using the LEDDAR. - The LEDDAR sensor has a limit in that since 16 detectors are arranged horizontally and an output is restrictive, it is impossible to recognize a pedestrian and the detection may also be made only in about 20 m.
-
FIG. 12 is an exemplified view illustrating a pedestrian signal using 3D Lidar. - The 3D Lidar sensor has a plurality of layers in a vertical direction, and therefore applies vertical / horizontal direction information and a pattern recognition technology in a restrictive situation, thereby recognizing a pedestrian. Specifically, pedestrians within 15m may be recognized, and pedestrians within 40m may be detected. Referring to
FIG. 12 , it may be seen that contour lines are revealed when a large amount of data is collected for two pedestrians, but the amount of data is reduced as the distance is increased and only the data corresponding to one layer appears at a distance of 45 m or more. In this case, since the amount of data decreases as the distance increases, and only data corresponding to one layer is displayed for more than 45 meters. In this case, since the resolutions in both the vertical direction and the horizontal direction are insufficient, the detection is difficult to perform the detection. In addition, the LEDDAR and the 3D Lidar sensor commonly have a disadvantage in that they are greatly affected by a material and a color of the target object because they use a near-infrared wavelength signal. -
FIG. 13 is an exemplified view illustrating a pedestrian detection using a radar. - The radar sensor is a sensor that uses a radio wave reflected from an obstacle in front of a vehicle and it may be confirmed that in the case of detecting a pedestrian, a power of a radio wave reflected from a pedestrian is low and therefore a pedestrian may be detected at a distance within 40 m that is a distance shorter than a vehicle. Referring to
FIG. 13 , it may be seen that the detection probability is somewhat lower than the vehicle according to the surrounding environment. InFIG. 13 , the unit of the horizontal axis and the vertical axis is m, and a portion where a discontinuous signal appears corresponds to a phenomenon occurring when a pedestrian is not detected for a predetermined time. - As described with reference to
FIGS. 8 to 13 , it is impossible for only a single sensor to recognize pedestrians at a level suitable for AEB application in a low illuminance condition or at night. Since the monocular camera is somewhat insufficient in the recognition distance, the far-infrared camera is insufficient in the viewing angle and the distance accuracy, the LEEDAR and the radar sensor may not recognize a pedestrian and therefore is not easy to apply, and the 3D Lidar sensor has a restrictive recognition distance, it is difficult for the single sensor to meet a sensing solution for driving the AEB system. -
FIG. 14 is a configuration diagram of the front detection sensor according to an embodiment of the present invention. - The
front detection sensor 200 according to the embodiment of the present invention may solve the above problem and may more accurately recognize a pedestrian even in the low illumination condition or at night by satisfying a recognition distance of 40m and a horizontal viewing angle of 44°. Specifically, thefront detection sensor 200 includes aradar sensor 210 detecting a pedestrian using a radio wave reflected from the pedestrian and generating radar detection information, a far-infrared sensor 220 imaging heat radiated from an object to generate far-infrared recognition information, and aninformation integration unit 230 integrating the radar detection information and the far-infrared recognition information. - As described above, the
radar sensor 210 has features such as a horizontal viewing angle of 90°, a detection distance of about 40 meters, and an estimation of a distance and a speed that may not be recognized and the far-infrared sensor 220 has features such as a horizontal viewing angle of 38°, a detection / recognition distance of 50 m, and an estimation of a distance with low accuracy. - The maximum recognition distance is the shorter distance of the detection distance and the recognition distance, but the pedestrian information may be maintained even if the result is output from only one of the two sensors after the first recognition, and therefore the recognition angle becomes the wider horizontal viewing angle of the recognition angle and the detection angle. That is, the
radar sensor 210 and the far-infrared sensor 220 of the present invention are combined to meet the recognition distance of 40 m and the horizontal viewing angle of 44°. - In order to integrate the output results of two sensors having different characteristics, it is necessary to determine that the two results are generated from one object. According to the invention, the
information integration unit 230 integrates the radar detection information generated from theradar sensor 210 and the far-infrared recognition information generated from the far-infrared sensor 220. - Although it is possible to integrate the radar detection information with the far-infrared recognition information, in order to achieve more efficient integration, the present invention may include a radar detection information
post-processing unit 211 removing the result other than the pedestrian among the radar detection information before the integration of the result with the far-infrared recognition information, prior to integrating each information. - In the case of the pedestrian, the detection by radar restrictively appears because the reflection of the radio wave is relatively low. It may be confirmed that the detection is made only for a predetermined distance and angle or less by the pedestrian detection experiment, and numerical values of the reflected power, the width, etc., among the radar output information appears to be low.
- Therefore, the radar detection information
post-processing unit 211 previously removes the detection result other than not the pedestrian, based on the experimental data on the condition of the detection distance, the angle, the width, the reflected power, and the like. -
FIGS. 15A to 15D are exemplified views illustrating a radar detection information post-processing process of the radar detection information post-processing unit according to an embodiment of the present invention. . Specifically, it is a result obtained by post-processing a result of detecting a pedestrian moving in a lateral direction in a state where a vehicle equipped with theradar sensor 210 stops. - Values represented on the graph are angle values of each track with time, and
FIG. 15A illustrates an initial input value and may confirm that it includes all information of 64 tracks as an initial input value and it is very difficult to observe a certain value. However, in the case ofFIG. 15B in which only valid states are left, it may be seen that certain values appear. Since the graph represents an angle value with time, tracks corresponding to straight lines parallel in a horizontal direction may be regarded as an object that is in a stop state even after time lapses and a straight line appearing in a diagonal direction may be regarded as a signal for a pedestrian moving in a lateral direction in which an angle is changed with timeFIG. 15C illustrate results after the removal ofFIG. 15B on the condition of the distance and the angle and may confirm that a lot of data are reduced andFIG. 15D illustrates results after the removal of the signals ofFIG. 15C on the condition of the reflected power and the detection width. Data about a pedestrian appearing diagonally and information about some other objects that are not filtered remain. By the process, more than 90% of the total output from theradar sensor 210 is removed, such that errors of the information integration (combination) performed in the subsequent steps may be reduced. - The
information integration unit 230 after the post-processing of the radar detection information post-processing unit 211 (or including the case where there is no post-processing) integrates the radar detection information with the far-infrared recognition information. The information integration unit is configured to include aconsistency determination unit 231 calculating a correlation between the radar detection information and the far-infrared recognition information to form a binary matrix, asimilarity calculation unit 232 using the binary matrix formed by the consistency determination unit to generate measurement values and calculating similarity in a state variable area between the measurement value and track tracking information, a statevariable updating unit 233 performing Kalman filtering using the measurement values and the state variables to update each tracking information, and a trackinginformation management unit 234 performing merging, generation, and deletion of the track. The description of each configuration will be as follows. - The
consistency determination unit 231 calculates a correlation between a maximum of 64 radar detection information and a maximum of 8 far-infrared recognition information to form an 8 x 64 binary matrix. If the angles and distances at the outputs of each sensor have a difference equal to or less than thresholds of 3 steps that are formed according to the conditions, it is determined that two results are highly likely to result from the same object and thus a matrix value is set to be 1, and otherwise, the matrix value is set to be 0. The determined result serves as a kind of candidate groups transmitted to a step of calculating the similarity between the tracking information and the measurement value. - The
similarity calculation unit 232 generates a measurement value to be used in a Kalman filter by integrating the output of the sensor having a value of 1 in the matrix and adds one row and one column so that even the output of the single sensor may generate a filtering result to thereby calculate and use the measurement value generated as the result of the single sensor. -
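- A highly simplified sketch of the binary consistency matrix that feeds this step is given below; the angle and distance thresholds are single assumed values (the three-step thresholds of the description are omitted), as is the extra row / column handling for single-sensor outputs.

```python
import numpy as np

def consistency_matrix(fir_objs, radar_objs, angle_thresh_deg=2.0, dist_thresh_m=1.5):
    """Build the binary association matrix (up to 8 x 64).

    fir_objs and radar_objs are sequences of (distance_m, angle_deg) tuples from
    the far-infrared sensor and the radar sensor.  A cell is set to 1 when the
    two outputs are close enough to be regarded as stemming from the same object.
    """
    m = np.zeros((len(fir_objs), len(radar_objs)), dtype=np.uint8)
    for i, (fir_dist, fir_ang) in enumerate(fir_objs):
        for j, (rad_dist, rad_ang) in enumerate(radar_objs):
            if (abs(fir_dist - rad_dist) <= dist_thresh_m
                    and abs(fir_ang - rad_ang) <= angle_thresh_deg):
                m[i, j] = 1
    return m


fir = [(12.0, -3.0)]                  # one far-infrared recognition
radar = [(11.6, -2.5), (30.0, 10.0)]  # two radar detections
print(consistency_matrix(fir, radar))  # [[1 0]]
```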
FIG. 16 is an exemplified diagram illustrating a generation of measurement values by a similarity calculation unit according to an embodiment of the present invention. - Referring to
FIG. 16 , the similarity is calculated in the state variable area for the generated measurement value and all the track tracking information at the previous time to connect between a measurement value having maximum similarity and the previous track tracking degree, and matching using GNN is performed to ensure the similarity of the pair of the entire track-measurement value. - In this case, in order to calculate the similarity of the track information, the measurement value to be generated needs to be output to longitudinal / lateral relative positions that are variables that may be compared with the state variable, and the relative positions are values included in the radar detection information and the far-infrared recognition information and therefore the two results need to be integrated.
-
FIGS. 17A and 17B are exemplified views illustrating a comparison between an integration of longitudinal / lateral relative positions of radar detection information and far-infrared recognition information by a rectangular coordinate system and an integration of longitudinal / lateral relative positions integration of radar detection information and far-infrared recognition information by a polar coordinate system according to an embodiment of the present invention. Specifically, according to the embodiment of the present invention, the integration of the longitudinal / lateral relative positions is performed by using the polar coordinate system rather than the rectangular coordinate system. - First,
FIG. 17A illustrates the case where the longitudinal / lateral relative positions are integrated by using the rectangular coordinate system, and when viewed from the perspective of the reflected errors, errors occurring in the longitudinal direction with respect to the error region of theradar sensor 210 represented by a red ellipse and errors occurring in the lateral direction with respect to the error region of the far-infrared sensor 220 represented a blue ellipse are reflected to the integration result. In this situation, when the target object exists on the front of the vehicle, relatively smaller errors occur, but when the target object exists on the side, the errors in the diagonal direction of the ellipse occur. - In contrast,
FIG. 17B illustrates the case where the longitudinal / lateral relative positions are integrated using the polar coordinate system, and it may be confirmed that the small errors occur in both cases where the target object is on the front of the vehicle or on the side thereof. The angles and the distances integrated by the polar coordinate system may be converted into the rectangular coordinate system through a simple trigonometric operation, such that the conversion and the integration as described above in the step of outputting the final measurement value may be a method for generating the smaller errors. - In addition, the state
variable updating unit 233 may update each track tracking information by performing the Kalman filtering using the integrated measurement value and the state variable. The present invention may also apply an extended Kalman filter is configured to change element values of a state variable transition matrix every time in consideration of the fact that the time at which the time when the output of the far-infrared sensor 220 is generated may be changed depending on the number of candidate groups every time. - Further, the tracking
information management unit 234 performs the merging, generation, and deletion of the track. - In the case of the track merging, if the position and speed of the two tracks have the certain similarity, then the later generated track is configured to be merged into the first generated track.
- In the case of the track generation, the corresponding track and the radar detection information of the
radar sensor 210 are identically generated by being integrated on the polar coordinate system by observing whether there are the results not applied as the measurement values of the track among the outputs of the far-infrared sensor 220 and when the reliability of the far-infrared recognition information is high while the corresponding radar detection information may not be found, the track may be configured to be generated solely. - In the case of the track deletion, the measurement values are not allocated for a predetermined period of time, so that the track is deleted only by the predicted value. Exceptionally, coast tracking is set to be maintained for a longer period time in consideration of that fact that the output of the sensor is difficult to be generated in the approach situation within 5 m. In case of other exceptions, even if the measurement values are allocated each time, when it is determined that the case in which the allocation is continuously made only by the result of the single sensor rather than the integration result is abnormal and deleted, even when the case in which the longitudinal / lateral speed component on the state variable exceeds 15 km / h, the corresponding object is regarded as not being a pedestrian and thus is deleted.
- It goes without saying that the above specific conditions may be changed according to the characteristics and purposes of the system to be implemented.
-
FIGS. 18A to 18F are exemplified views illustrating an output result of the front detection sensor according to an embodiment of the present invention. -
FIGS. 18A to 18C illustrate that white dots represent the radar detection information, red dots represent the far-infrared ray recognition information, and a sky blue circle represents the information integration result. Further, FIGS. 20B, 20D, and 20F illustrate the information imaged by the far-infrared sensor 220 at the time of the integrating. - As a result, it may be confirmed that the integration result of the relative positions of the pedestrian is outputted normally from the
front detection sensor 200. - As a result, the
front detection sensor 200 according to the embodiment of the present invention includes theradar sensor 210 and the far-infrared sensor 220 in an automatic emergency braking system applied to a vehicle and integrates and uses the results using the polar coordinate system, such that the distance and the relative speed between the vehicle and the pedestrian may be more accurately calculated even in the low illuminance condition or at nighttime. -
FIG. 19 is an exemplified view for identifying gaze information of the front detection sensor according to the embodiment of the present invention. - The
front detection sensor 200 according to the embodiment of the present invention may detect the gaze information of the pedestrian and reflect the detected gaze information to the operation of the PDCMS of the electronic control unit 400. - In order to detect the gaze information of the pedestrian, the
front detection sensor 200 sets a horizontal length of the entire face of the pedestrian to be x, a horizontal length from a left face contour to a left eye to be x1, a horizontal length from a right face contour to a right eye to be x2, and a horizontal length between the left eye and the right eye to be x3 and then detects the gaze information of the pedestrian. - Specifically, the gaze information of the pedestrian corresponds to the front or the diagonal if |(x1 - x2) / x | < a, the rear if x1 = x2 = x3 = 0, and the side if |(x1 - x2) / x| ≥ a or x1 = 0 or x2 = 0, in which the a may be selected in the range between 0.65 and 0.95.
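A minimal sketch of this classification rule, assuming the four face lengths are already provided by the sensor's face detection (the function name, the return labels, and the default a = 0.8 are illustrative only):

```python
def classify_gaze(x, x1, x2, x3, a=0.8):
    """Classify the pedestrian's gaze from horizontal face lengths.

    x  : width of the whole face (assumed > 0 whenever a face is detected),
    x1, x2 : contour-to-eye widths, x3 : distance between the eyes.
    'a' may be selected in the range between 0.65 and 0.95.
    """
    if x1 == 0 and x2 == 0 and x3 == 0:
        return "rear"                      # no facial features visible
    if x1 == 0 or x2 == 0 or abs((x1 - x2) / x) >= a:
        return "side"                      # one eye hidden or strongly offset
    return "front_or_diagonal"             # |(x1 - x2) / x| < a
```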
- The
front detection sensor 200 transmits the detected gaze information of the pedestrian to the electronic control unit 400. The method for reflecting the gaze information of the pedestrian received by the electronic control unit 400 to the operation of the PDCMS will be described with reference to FIGS. 23 to 26. -
FIG. 20 is a diagram illustrating a concept of a pedestrian moving speed. - Referring to
FIG. 20, the front detection sensor 200 may detect a distance between a pedestrian 600 and a vehicle 700 that are moving within a driving lane and a moving speed of the pedestrian 600. - For example, if the
pedestrian 600 moves from the left to the right with respect to a front view of the vehicle 700, the pedestrian 600 has a negative (-) moving speed, and if the pedestrian 600 moves from the right to the left with respect to the front view of the vehicle 700, the pedestrian 600 has a positive (+) moving speed.
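As an illustrative helper only (the names and the sampling scheme are assumptions), the signed moving speed could be derived from two successive lateral positions seen from the vehicle's front view:

```python
def pedestrian_moving_speed(lateral_prev_m, lateral_now_m, dt_s):
    """Signed lateral speed of the pedestrian as seen from the vehicle.

    With the lateral position measured positive toward the vehicle's left,
    left-to-right motion yields a negative speed and right-to-left motion
    yields a positive speed, matching the convention above.
    """
    return (lateral_now_m - lateral_prev_m) / dt_s
```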
- In addition, the front detection sensor 200 may detect the distance between the vehicle 700 and the pedestrian 600 moving on the driving lane of the vehicle. -
FIG. 21 is a diagram illustrating an example of a mapping table for activating a PDCMS function according to the embodiment of the present invention. - The
electronic control unit 400 uses the mapping table to determine the risk of collision of the pedestrian with the vehicle, and furthermore, whether the PDCMS function is operated. - Referring to
FIG. 21, the electronic control unit 400 determines the operation of the PDCMS function based on an initial speed of the pedestrian at a boundary of the driving lane on which the pedestrian moves and an initial speed of the vehicle. - Specifically, if, at the time of determining whether the PDCMS function is to be operated, the absolute value of the initial speed of the pedestrian at the boundary of the driving lane on which the pedestrian is moving and the initial speed of the vehicle are within an area in which the PDCMS function is essentially operated, the
electronic control unit 400 determines that the PDCMS function is operated. The possible operation area means an area whose Vmin or Vmax may be adjusted according to the selection of the manufacturer. - For example, if the speed of the vehicle falls below 8.4 m / s (30 km / h) as the Vmin or rises above 16.6 m / s (60 km / h) as the Vmax, then the
electronic control unit 400 may determine that the PDCMS is in the deactivation state and thus the PDCMS function is not operated. - Further, when the initial speed of the vehicle is between the Vmin and the Vmax and the absolute value of the initial speed of the pedestrian at the boundary of the driving lane on which the pedestrian moves is between 0.83 m / s and 1.5 m / s, the
electronic control unit 400 may determine that the PDCMS function is operated. -
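A hedged sketch of such a mapping-table check, using the speed thresholds quoted above; the function name and the inclusive boundary handling are assumptions, and a production system would read the thresholds from a manufacturer-specific table:

```python
def pdcms_should_operate(vehicle_speed_mps, ped_boundary_speed_mps,
                         v_min=8.4, v_max=16.6,
                         ped_min=0.83, ped_max=1.5):
    """Decide PDCMS operation from the mapping-table region.

    vehicle_speed_mps      : initial speed of the vehicle.
    ped_boundary_speed_mps : signed initial speed of the pedestrian at the
                             boundary of the driving lane.
    """
    if not (v_min <= vehicle_speed_mps <= v_max):
        return False                 # PDCMS stays in the deactivation state
    return ped_min <= abs(ped_boundary_speed_mps) <= ped_max
```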
FIG. 22 is a diagram illustrating an example of the operation of the PDCMS function according to the embodiment of the present invention. A vertical axis represents the TTC derived from the distance and the relative speed between the vehicle and the pedestrian and a horizontal axis represents the operation of the PDCMS function of the vehicle. - The
electronic control unit 400 performs the PDCMS operation in steps according to the distance between the vehicle and the pedestrian. - That is, for t1, t2, and t3, which are different TTCs with t1 > t2 > t3, if the TTC of the vehicle and the pedestrian is t1, the warning is issued to the driver through the
warning unit 500, if the TTC of the vehicle and the pedestrian is t2, the vehicle is partially braked, and if the TTC of the vehicle and the pedestrian is t3, the vehicle is fully braked. - The warning of the
warning unit 500 may include the visual warning through the display unit or the audible warning through the speaker unit. - The partial braking means reducing the speed of the vehicle by at least a predetermined amount, and the full braking means maximally reducing the speed of the vehicle.
- However, even after the PDCMS function is operated, the driver may manually operate the brake to perform the maximum possible deceleration. That is, the driver may manually operate the brake to reduce the speed of the vehicle by more than the sequential deceleration performed according to the PDCMS function.
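The staged response can be illustrated as follows; the TTC computation is the usual distance over closing speed, while the threshold values t1, t2, and t3 and the returned action labels are assumptions for demonstration (the description only requires t1 > t2 > t3):

```python
def time_to_collision(distance_m, closing_speed_mps):
    """TTC from the distance and the relative (closing) speed; None if opening."""
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps

def pdcms_action(ttc_s, t1=2.4, t2=1.6, t3=0.8):
    """Map TTC to the staged PDCMS response (warning -> partial -> full braking)."""
    if ttc_s is None or ttc_s > t1:
        return "none"
    if ttc_s > t2:
        return "warn_driver"      # visual / audible warning via the warning unit
    if ttc_s > t3:
        return "partial_braking"  # reduce speed by at least a predetermined amount
    return "full_braking"         # maximum deceleration
```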
-
FIGS. 23 to 26 illustrate the method for applying the gaze information of the pedestrian received from the front detection sensor 200 to the operation of the PDCMS function by which the electronic control unit 400 performs the sequential deceleration. - When the pedestrian recognizes the driving direction of the vehicle, the possibility that the pedestrian moves into the route of the vehicle is low, and the possibility of collision is reduced accordingly. However, when the gaze direction of the pedestrian is not taken into consideration, the warning and the operation of the brake are likely to be activated by an unnecessary operation of the PDCMS function even in a situation in which the gaze of the pedestrian is directed toward the front of the vehicle, that is, in which the pedestrian recognizes the driving direction of the vehicle and the possibility of collision is low.
- The unnecessary operation of the PDCMS function causes the driver to experience a sense of incongruity and discomfort. This is directly connected to the marketability of the vehicle, which is a significant problem for car makers.
- Therefore, if the possibility of collision is determined according to the gaze direction of the pedestrian and the operating time of the PDCMS function is controlled accordingly, the essential purpose of the PDCMS of protecting the pedestrian may be achieved and the reduction in the ride comfort of the driver may be prevented.
-
FIG. 23 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the front according to the embodiment of the present invention. - As described above with reference to
FIG. 19, the horizontal length of the entire face of the pedestrian is defined as x, the horizontal length from the left face contour to the left eye is defined as x1, the horizontal length from the right face contour to the right eye is defined as x2, and the horizontal length between the left eye and the right eye is defined as x3. - If the gaze of the pedestrian is directed toward the front of the vehicle, it corresponds to the case where |(x1 - x2) / x| < a. Therefore, if it corresponds to |(x1 - x2) / x| < a based on the lengths of each part of the face of the pedestrian derived from the
front detection sensor 200, it may be determined that the gaze of the pedestrian is directed toward the front of the vehicle. - In this case, since the vehicle exists within the view of the pedestrian, it is expected that the possibility of collision is low because the pedestrian recognizes the driving direction of the vehicle. Therefore, the
electronic control unit 400 may delay the warning of the driver and the partial braking by a predetermined time from the initially set times t1 and t2 as illustrated in FIG. 22 during the operating time of the PDCMS function. Alternatively, the electronic control unit 400 may delay all of the warning of the driver, the partial braking, and the full braking by a predetermined time from all of the initially set times t1, t2, and t3 as illustrated in FIG. 22 during the operating time of the PDCMS function. -
FIG. 24 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the diagonal according to the embodiment of the present invention. - If the gaze of the pedestrian is directed toward the diagonal of the vehicle, it corresponds to the case where |(x1 - x2) / x| < a like the case where the gaze of the pedestrian is the front of the vehicle. Therefore, if the gaze of the pedestrian corresponds to |(x1 - x2) / x| < a based on the lengths of each part of the face of the pedestrian derived from the
front detection sensor 200, it may be determined that the gaze of the pedestrian is directed toward the diagonal of the vehicle. - In this case, since the vehicle exists within the view of the pedestrian, it is expected that the possibility of collision is low because the pedestrian recognizes the driving direction of the vehicle. Therefore, the
electronic control unit 400 may delay the warning of the driver and the partial braking by a predetermined time from the initially set times t1 and t2 as illustrated in FIG. 22 during the operating time of the PDCMS function. Alternatively, the electronic control unit 400 may delay all of the warning of the driver, the partial braking, and the full braking by a predetermined time from all of the initially set times t1, t2, and t3 as illustrated in FIG. 22 during the operating time of the PDCMS function. -
FIG. 25 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the rear according to the embodiment of the present invention. - If the gaze direction of the pedestrian is in the same direction as the driving direction of the vehicle, that is, if the vehicle is driving toward the rear of the pedestrian, it corresponds to x1 = x2 = x3 = 0. Therefore, if it corresponds to x1 = x2 = x3 = 0 based on the lengths of each part of the face of the pedestrian derived from the front detection sensor 200, it may be determined that the gaze direction of the pedestrian is the same as the driving direction of the vehicle, that is, that the vehicle is driving toward the rear of the pedestrian. - In this case, since the vehicle does not exist within the view of the pedestrian, it is expected that the possibility of collision is high because the pedestrian does not recognize the driving direction of the vehicle. Therefore, the
electronic control unit 400 may not delay the operating time of the PDCMS function from the initially set times t1, t2, and t3 as illustrated in FIG. 22. -
FIG. 26 is an exemplified view illustrating the operation of the PDCMS function when the gaze information is the side according to the embodiment of the present invention. - If the gaze direction of the pedestrian forms about 90° with respect to the driving direction of the vehicle, that is, if the vehicle is driving toward the side of the pedestrian, it corresponds to the case where |(x1 - x2) / x| ≥ a or x1 = 0 or x2 = 0. Therefore, if it corresponds to |(x1 - x2) / x| ≥ a or x1 = 0 or x2 = 0 based on the lengths of each part of the face of the pedestrian derived from the front detection sensor 200, it may be determined that the gaze direction of the pedestrian forms about 90° with respect to the driving direction of the vehicle, that is, that the vehicle is driving toward the side of the pedestrian. - In this case, since the possibility that the vehicle does not exist within the view of the pedestrian is significant, it is expected that the possibility of collision is high because the pedestrian does not recognize the driving direction of the vehicle. Therefore, the
electronic control unit 400 may not delay the operating time of the PDCMS function from the initially set times t1, t2, and t3 as illustrated in FIG. 22. -
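Tying the gaze classification to the staged response, a hedged sketch of the delay logic of FIGS. 23 to 26 might look like the following; the 0.3 s delay and the option of also delaying the full braking are assumptions, since the description only requires delaying by a predetermined time in the front and diagonal cases:

```python
def adjust_pdcms_thresholds(gaze, t1, t2, t3, delay_s=0.3,
                            delay_full_braking=False):
    """Delay the staged PDCMS timings when the pedestrian can see the vehicle.

    gaze: one of 'front_or_diagonal', 'rear', 'side' (see classify_gaze above).
    Lowering a TTC threshold means the corresponding action is triggered
    later, i.e. the warning / braking is delayed by roughly delay_s seconds.
    """
    if gaze != "front_or_diagonal":
        return t1, t2, t3                         # rear / side: do not delay
    if delay_full_braking:
        return t1 - delay_s, t2 - delay_s, t3 - delay_s
    return t1 - delay_s, t2 - delay_s, t3         # keep full braking unchanged
```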
FIG. 27 is a flow chart illustrating a flow of a method for activating a PDCMS function according to an embodiment of the present invention. - Referring to
FIG. 27, a method for activating a pedestrian detection and collision mitigation system (PDCMS) function of a vehicle according to an embodiment of the present invention includes: generating radar detection information by detecting a pedestrian on a driving lane of a vehicle using a radio wave reflected from the pedestrian (S100); generating far-infrared recognition information by imaging heat radiated from the pedestrian (S200); detecting pedestrian information including presence of the pedestrian on a driving lane of the vehicle, gaze information of the pedestrian, and a distance and a relative speed between the pedestrian and the vehicle by integrating the radar detection information with the far-infrared recognition information (S300); detecting vehicle information including at least any one of a speed, an acceleration, a steering angle, a steering angular velocity, and a pressure of a master cylinder of the vehicle (S400); and activating a PDCMS function based on the pedestrian information and the vehicle information (S500).
- In the generating of the radar detection information by detecting the pedestrian on the driving lane of the vehicle using the radio wave reflected from the pedestrian (S100), the radio wave reflected from the target object is used and therefore the pedestrian may be detected even in the low illuminance or extremely low illuminance condition.
- In the generating of the far-infrared recognition information by imaging the heat radiated by the pedestrian (S200), the heat radiated by the target object is imaged and therefore the pedestrian may be recognized even in the low illuminance or extremely low illuminance condition.
- In the detecting of the pedestrian information including the presence of the pedestrian on the driving lane of the vehicle and the distance and the relative speed between the pedestrian and the vehicle by integrating the radar detection information with the far-infrared recognition information (S300), it is possible to more accurately detect the distance and the relative speed between the vehicle and the pedestrian even in the low illuminance condition or at night.
- In the detecting of the vehicle information including at least any one of the speed, the acceleration, the steering angle, the steering angular velocity, and the pressure of the master cylinder of the vehicle (S400), the RPM of the vehicle wheel is measured from the vehicle engine, and the driving speed of the vehicle is calculated based on the known circumference of the wheel and the measured RPM and time. Further, the
vehicle sensor 300 may detect the information on the driving conditions of the vehicle such as the acceleration, the steering angle, the steering angular velocity, the pressure of the master cylinder, and the like. - In the operating of the PDCMS function based on the pedestrian information and the vehicle information (S500), it is determined whether to operate the PDCMS function of the vehicle based on the mapping table using the acquired pedestrian information and pedestrian information. Specifically, it is determined whether the conditions that the PDCMS function on the mapping table may be operated are satisfied based on the integration of the pedestrian information and the vehicle information. That is, the risk of collision of the pedestrian with the vehicle is determined on the mapping table using the current position of the pedestrian, the current position of the vehicle, and the speed information of the vehicle.
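Purely as an illustration of the wheel-speed calculation mentioned above (the wheel diameter and function name are example assumptions, not values from the patent):

```python
import math

def vehicle_speed_from_rpm(wheel_rpm, wheel_diameter_m=0.65):
    """Driving speed from the wheel RPM and the known wheel circumference.

    speed [m/s] = circumference [m] * revolutions per second.
    The 0.65 m wheel diameter is only an example value.
    """
    circumference_m = math.pi * wheel_diameter_m
    return circumference_m * (wheel_rpm / 60.0)

# Example: 400 RPM on a 0.65 m wheel is roughly 13.6 m/s (about 49 km/h).
speed_mps = vehicle_speed_from_rpm(400)
```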
- Further, in the operating of the PDCMS function based on the pedestrian information and the vehicle information (S500), the PDCMS function of the vehicle is operated if it is determined that the pedestrian state and the vehicle state satisfy the conditions that the PDCMS function may start on the mapping table. The PDCMS function includes the activation of the operation of the warning unit that is operated to inform the driver of the collision of the pedestrian with the vehicle and the operation of the brake regardless of whether the driver operates the brake. Further, the activation of the operation of the warning unit and the activation of the operation of the brake are performed in order of the activation of the operation of the warning unit, the partial braking of the vehicle, and the full braking of the vehicle.
- The apparatus for activating a PDCMS of a vehicle according to the embodiment of the present invention may include different types of sensors and may integrate and use the respective sensing results to more accurately detect the presence of the pedestrian and the distance and the relative speed between the vehicle and the pedestrian even in the low illuminance condition and at night.
- In addition, the embodiment of the present invention may classify the gaze of the pedestrian into the front, the diagonal, the rear, and the side to identify whether the pedestrian may recognize the vehicle or not and then apply the identified situation to the operation of the PDCMS, such that the vehicle braking control may be delayed by the predetermined time only when the pedestrian may recognize the vehicle.
- Therefore, the apparatus for activating a PDCMS according to the embodiment of the present invention may more accurately detect the pedestrian to effectively protect the pedestrian and optimize the vehicle braking control unnecessarily frequently performed to resolve the sense of difference and discomfort of the driver, thereby improving the commerciality of the vehicle.
- The foregoing includes examples of one or more embodiments. Of course, all possible combinations of components or methods for the purpose of describing the embodiments described above are not described, but those skilled in the art may recognize that many combinations and substitutions of various embodiments are possible.
Claims (15)
- An apparatus for activating a pedestrian detection and collision mitigation system PDCMS of a vehicle, comprising:
a front detection sensor (200) detecting presence of a pedestrian on a driving lane of the vehicle, gaze information of the pedestrian, and a distance and a relative speed between the pedestrian and the vehicle;
a vehicle sensor (300) detecting at least one of a speed, an acceleration, a steering angle, a steering angular velocity, or a pressure of a master cylinder of the vehicle;
an electronic control unit (400) determining whether to operate a function of the PDCMS based on information detected by the front detection sensor (200) and the vehicle sensor (300); and
a warning unit (500) operating to inform the vehicle's driver of a collision of the pedestrian with the vehicle by controlling the electronic control unit (400),
wherein the front detection sensor (200) includes:
a radar sensor (210) detecting the pedestrian on the driving lane of the vehicle using a radio wave reflected from the pedestrian to generate radar detection information;
a far-infrared sensor (220) imaging heat radiated from the pedestrian to generate far-infrared recognition information; and
an information integration unit (230) integrating the radar detection information and the far-infrared recognition information,
the function of the PDCMS includes activating an operation of the warning unit (500) and activating an operation of a brake of the vehicle regardless of whether the driver operates the brake, and
the activation of the operation of the warning unit (500) and the activation of the operation of the brake are performed in order of the activation of the operation of the warning unit (500), a partial braking of the vehicle, and a full braking of the vehicle,
wherein a state of the PDCMS is produced as an off state when an engine of the vehicle stalls,
wherein the state of the PDCMS is produced as a deactivation state by turning on the engine in the off state, and
wherein the state of the PDCMS is produced as an activation state when the speed of the vehicle is equal to or greater than a first predetermined minimum value and equal to or less than a first predetermined maximum value,
wherein the electronic control unit (400) is configured to determine the operation of the PDCMS function when an initial speed of the vehicle is in an area in which the PDCMS function is essentially operated and an initial speed of the pedestrian at the boundary of a driving lane is equal to or greater than a second predetermined minimum value and equal to or less than a second predetermined maximum value.
- The apparatus of claim 1,
wherein the information integration unit (230) includes:
a consistency determination unit (231) calculating a correlation between the radar detection information and the far-infrared recognition information to form a binary matrix;
a similarity calculation unit (232) generating a measurement value using the binary matrix formed by the consistency determination unit (231) and calculating similarity in a state variable area between the measurement value and track tracking information of a previous time;
a state variable updating unit (233) updating each tracking information by performing Kalman filtering using the measurement value and the state variable; and
a tracking information management unit (234) performing merging, generation, and deletion of the track. - The apparatus of claim 2, wherein the similarity calculation unit (232) represents longitudinal and lateral relative positions of the radar detection information and the far-infrared recognition information using a polar coordinate system to generate the measurement value.
- The apparatus of claim 2 or 3, wherein the state variable updating unit (233) applies an extended Kalman filter to perform the Kalman filtering.
- The apparatus of any one of the preceding claims, wherein the front detection sensor (200) further includes a radar detection information post-processing unit (211) removing a result other than the pedestrian among the radar detection information before the integration of the result with the far-infrared recognition information.
- The apparatus of any one of the preceding claims, wherein the number of generated radar detection information is equal to or less than 64.
- The apparatus of any one of the preceding claims, wherein the number of generated far-infrared recognition information is equal to or less than 8.
- The apparatus of any one of the preceding claims, wherein when a horizontal length of the entire face of the pedestrian is defined as x, a horizontal length from a left face contour to a left eye of the pedestrian is defined as x1, a horizontal length from a right face contour to a right eye of the pedestrian is defined as x2, and a horizontal length between the left eye and the right eye of the pedestrian is defined as x3, the gaze information of the pedestrian corresponds to a front or a diagonal if |(x1 - x2) / x| < a, a rear if x1 = x2 = x3 = 0, a side if |(x1 - x2) / x| ≥ a or x1 = 0 or x2 = 0, and the a is selected in a range between 0.65 and 0.95.
- The apparatus of claim 8, wherein the electronic control unit (400) performs the activation of the operation of the warning unit (500) and the activation of the operation of the brake by delaying the activation of the operation of the warning unit (500) and the activation of the operation of the brake by a predetermined time when the gaze information of the pedestrian is the front or the diagonal, compared to when the gaze information of the pedestrian is the rear or the side.
- The apparatus of any one of the preceding claims, wherein the electronic control unit (400) performs the activation of the operation of the brake so that the speed of the vehicle is reduced to at least a predetermined speed or more from time when the operation of the brake is activated to time when the collision of the pedestrian with the vehicle occurs.
- The apparatus of any one of the preceding claims, wherein the electronic control unit (400) permits the driver to operate the brake for a maximum possible deceleration even after the activation of the operation of the brake starts.
- The apparatus of any one of the preceding claims, wherein the electronic control unit (400) controls the warning unit (500) to inform the driver that the PDCMS function is in an available state.
- The apparatus of any one of the preceding claims, wherein the warning unit (500) includes a display unit visually informing of the collision of the pedestrian with the vehicle or a speaker unit audibly informing of the collision of the pedestrian with the vehicle.
- The apparatus of any one of the preceding claims, wherein the PDCMS function further includes an operation of a rear brake lamp.
- A method for activating a pedestrian detection and collision mitigation system PDCMS of a vehicle, comprising:
detecting the pedestrian on a driving lane of the vehicle using a radio wave reflected from the pedestrian to generate radar detection information;
imaging heat radiated from the pedestrian to generate far-infrared recognition information;
integrating the radar detection information with the far-infrared recognition information to detect pedestrian information including presence of the pedestrian on the driving lane of the vehicle, gaze information of the pedestrian, and a distance and a relative speed between the pedestrian and the vehicle;
detecting vehicle information including at least any one of a speed, an acceleration, a steering angle, a steering angular velocity, or a pressure of a master cylinder of the vehicle; and
activating a PDCMS function based on the pedestrian information and the vehicle information,
wherein the PDCMS function includes activating an operation of a warning unit (500) operated to inform a driver of the vehicle of a collision of the pedestrian with the vehicle and activating an operation of a brake of the vehicle regardless of whether the driver operates the brake, and
the activation of the operation of the warning unit (500) and the activation of the operation of the brake are performed in order of the activation of the operation of the warning unit (500), a partial braking of the vehicle, and a full braking of the vehicle,
wherein a state of the PDCMS is produced as an off state when an engine of the vehicle stalls,
wherein the state of the PDCMS is produced as a deactivation state by turning on the engine in the off state, and
wherein the state of the PDCMS is produced as an activation state when the speed of the vehicle is equal to or greater than a first predetermined minimum value and equal to or less than a first predetermined maximum value,
wherein the electronic control unit (400) is configured to determine the operation of the PDCMS function when an initial speed of the vehicle is in an area in which the PDCMS function is essentially operated and an initial speed of the pedestrian at the boundary of a driving lane is equal to or greater than a second predetermined minimum value and equal to or less than a second predetermined maximum value.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160184299A KR101996418B1 (en) | 2016-12-30 | 2016-12-30 | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3342661A1 EP3342661A1 (en) | 2018-07-04 |
EP3342661B1 true EP3342661B1 (en) | 2024-03-20 |
Family
ID=60654678
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17205392.8A Active EP3342661B1 (en) | 2016-12-30 | 2017-12-05 | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
Country Status (4)
Country | Link |
---|---|
US (2) | US10814840B2 (en) |
EP (1) | EP3342661B1 (en) |
KR (1) | KR101996418B1 (en) |
CN (1) | CN108263279B (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6592266B2 (en) * | 2015-03-31 | 2019-10-16 | 株式会社デンソー | Object detection apparatus and object detection method |
KR101996418B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
KR101996414B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Pedestrian collision prevention apparatus and method considering pedestrian gaze |
US20230356728A1 (en) * | 2018-03-26 | 2023-11-09 | Nvidia Corporation | Using gestures to control machines for autonomous systems and applications |
US10467903B1 (en) | 2018-05-11 | 2019-11-05 | Arnold Chase | Passive infra-red pedestrian detection and avoidance system |
US10750953B1 (en) | 2018-05-11 | 2020-08-25 | Arnold Chase | Automatic fever detection system and method |
US11294380B2 (en) | 2018-05-11 | 2022-04-05 | Arnold Chase | Passive infra-red guidance system |
US11062608B2 (en) | 2018-05-11 | 2021-07-13 | Arnold Chase | Passive infra-red pedestrian and animal detection and avoidance system |
CN109094458A (en) * | 2018-08-20 | 2018-12-28 | 特治(深圳)智能科技实业有限公司 | The control method of safe driving of vehicle and control device for safe driving of vehicle |
CN109334566B (en) * | 2018-08-31 | 2022-01-25 | 阿波罗智联(北京)科技有限公司 | Method, device, equipment and storage medium for providing feedback outside vehicle |
KR102572784B1 (en) | 2018-10-25 | 2023-09-01 | 주식회사 에이치엘클레무브 | Driver assistance system and control method for the same |
WO2020100585A1 (en) * | 2018-11-13 | 2020-05-22 | ソニー株式会社 | Information processing device, information processing method, and program |
CN109490890B (en) * | 2018-11-29 | 2023-06-02 | 重庆邮电大学 | A smart car-oriented millimeter wave radar and monocular camera information fusion method |
CN109448439B (en) * | 2018-12-25 | 2021-03-23 | 科大讯飞股份有限公司 | Vehicle safe driving method and device |
CN110239529A (en) * | 2019-06-28 | 2019-09-17 | 北京海益同展信息科技有限公司 | Control method for vehicle, device and computer readable storage medium |
IT201900012414A1 (en) * | 2019-07-19 | 2021-01-19 | Ubiquicom S R L | Anti-collision system and method of land vehicles |
US11320830B2 (en) | 2019-10-28 | 2022-05-03 | Deere & Company | Probabilistic decision support for obstacle detection and classification in a working area |
JP7256812B2 (en) * | 2019-12-20 | 2023-04-12 | バイドゥドットコム タイムズ テクノロジー (ベイジン) カンパニー リミテッド | How to Realize a Dynamic Cost Function for Autonomous Vehicles |
US11403948B2 (en) * | 2020-02-26 | 2022-08-02 | Compal Electronics, Inc. | Warning device of vehicle and warning method thereof |
KR20210149547A (en) * | 2020-06-02 | 2021-12-09 | 현대모비스 주식회사 | Forward collision avoidance system and method of vehicle |
CN111845554A (en) * | 2020-06-03 | 2020-10-30 | 北京中科慧眼科技有限公司 | Pedestrian collision early warning method and device based on binocular stereo camera |
CN112505708A (en) * | 2020-11-19 | 2021-03-16 | 武汉中海庭数据技术有限公司 | Automatic human range finding identification circuit of driving, device and car |
US11488479B2 (en) * | 2020-12-10 | 2022-11-01 | Toyota Research Institute, Inc. | Methods and systems for generating targeted warnings |
KR20220086155A (en) * | 2020-12-16 | 2022-06-23 | 현대자동차주식회사 | A module and method for track mergence |
CN113022539B (en) * | 2021-03-26 | 2022-10-04 | 浙江吉利控股集团有限公司 | Animal driving-away method, system, storage medium and equipment |
US11932167B2 (en) * | 2021-10-06 | 2024-03-19 | Leo Polosajian | System and method of alerting pedestrians to vehicles |
JP7442948B2 (en) * | 2021-10-18 | 2024-03-05 | 矢崎総業株式会社 | External display device |
KR20230168859A (en) * | 2022-06-08 | 2023-12-15 | 현대모비스 주식회사 | Vehicle lighting device and method of operating thereof |
CN115273539B (en) * | 2022-06-16 | 2024-01-30 | 中国第一汽车股份有限公司 | Vehicle danger early warning method and device based on V2X communication and computer readable storage medium |
CN115056773B (en) * | 2022-06-30 | 2024-09-20 | 重庆长安汽车股份有限公司 | Pedestrian micro-collision recognition method and system in low-speed scene |
CN116985787A (en) * | 2023-06-26 | 2023-11-03 | 重庆长安汽车股份有限公司 | Vehicle running control method, device, equipment and storage medium |
CN117901822B (en) * | 2024-03-20 | 2024-05-28 | 衢州海易科技有限公司 | Anti-collision braking method and system for inner wheel difference area of engineering vehicle |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140044310A1 (en) * | 2004-07-26 | 2014-02-13 | Automotive Systems Laboratory, Inc. | Method of identifying an object in a visual scene |
Family Cites Families (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06107141A (en) * | 1992-09-30 | 1994-04-19 | Mazda Motor Corp | Automatic braking device for vehicle |
US7426437B2 (en) * | 1997-10-22 | 2008-09-16 | Intelligent Technologies International, Inc. | Accident avoidance systems and methods |
JP3843502B2 (en) * | 1996-09-30 | 2006-11-08 | マツダ株式会社 | Vehicle motion recognition device |
JP2000357299A (en) * | 1999-06-16 | 2000-12-26 | Honda Motor Co Ltd | Safety device for vehicle running |
JP4319535B2 (en) * | 2003-12-19 | 2009-08-26 | 株式会社東海理化電機製作所 | Face orientation detection device |
JP4255906B2 (en) * | 2004-12-03 | 2009-04-22 | 富士通テン株式会社 | Driving assistance device |
US7380633B2 (en) * | 2005-06-09 | 2008-06-03 | Delphi Technologies, Inc. | Vehicle sensing method for detecting a pedestrian impact |
US20100007728A1 (en) * | 2006-10-13 | 2010-01-14 | Continental Teves Ag & Co., Ohg | System for Determining Objects |
JP5077639B2 (en) * | 2006-12-11 | 2012-11-21 | 株式会社デンソー | Pedestrian collision detection device and pedestrian protection system |
JP2008197720A (en) * | 2007-02-08 | 2008-08-28 | Mitsubishi Electric Corp | Pedestrian warning device |
US8725309B2 (en) * | 2007-04-02 | 2014-05-13 | Panasonic Corporation | Safety driving support apparatus |
JP2009169776A (en) * | 2008-01-18 | 2009-07-30 | Hitachi Ltd | Detector |
JP4561863B2 (en) * | 2008-04-07 | 2010-10-13 | トヨタ自動車株式会社 | Mobile body path estimation device |
US8605947B2 (en) * | 2008-04-24 | 2013-12-10 | GM Global Technology Operations LLC | Method for detecting a clear path of travel for a vehicle enhanced by object detection |
JP2010127717A (en) * | 2008-11-26 | 2010-06-10 | Sumitomo Electric Ind Ltd | Object detection device and object detection system |
US8384531B2 (en) * | 2009-04-02 | 2013-02-26 | GM Global Technology Operations LLC | Recommended following distance on full-windshield head-up display |
JP2011063187A (en) * | 2009-09-18 | 2011-03-31 | Autonetworks Technologies Ltd | Vehicle lamp control device |
JP4873068B2 (en) * | 2009-11-20 | 2012-02-08 | 株式会社デンソー | Collision damage reduction device |
JP5401344B2 (en) * | 2010-01-28 | 2014-01-29 | 日立オートモティブシステムズ株式会社 | Vehicle external recognition device |
JP5402813B2 (en) * | 2010-04-22 | 2014-01-29 | 株式会社豊田中央研究所 | Vehicle motion control device and program |
US9507998B2 (en) * | 2011-06-13 | 2016-11-29 | Toyota Jidosha Kabushiki Kaisha | Pedestrian motion predicting device |
CN103703422B (en) * | 2011-07-20 | 2016-03-23 | 飞思卡尔半导体公司 | Safety-critical equipment and for controlling the method that safety-critical equipment operation person laxes |
DE102011111899A1 (en) * | 2011-08-30 | 2013-02-28 | Gm Global Technology Operations, Llc | Detection device and method for detecting a carrier of a transceiver, motor vehicle |
DE102011112985A1 (en) * | 2011-09-10 | 2013-03-14 | Daimler Ag | Method for operating safety device i.e. safety belt of vehicle i.e. car, involves outputting warning to driver of vehicle during existing of collision probability, and automatically reducing motor torque of drive unit of vehicle |
JP5696701B2 (en) * | 2012-08-31 | 2015-04-08 | 株式会社デンソー | Anti-pedestrian notification device |
JP2014059841A (en) * | 2012-09-19 | 2014-04-03 | Daimler Ag | Driving support device |
EP2916307B1 (en) * | 2012-10-30 | 2021-05-19 | Toyota Jidosha Kabushiki Kaisha | Vehicle safety apparatus |
GB2511748B (en) * | 2013-03-11 | 2015-08-12 | Jaguar Land Rover Ltd | Emergency braking system for a vehicle |
DE202013006676U1 (en) * | 2013-07-25 | 2014-10-28 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | System for warning of a possible collision of a motor vehicle with an object |
JP2015031607A (en) * | 2013-08-02 | 2015-02-16 | トヨタ自動車株式会社 | Object recognition device |
KR101489836B1 (en) * | 2013-09-13 | 2015-02-04 | 자동차부품연구원 | Pedestrian detecting and collision avoiding apparatus and method thereof |
US9315192B1 (en) * | 2013-09-30 | 2016-04-19 | Google Inc. | Methods and systems for pedestrian avoidance using LIDAR |
US9336436B1 (en) * | 2013-09-30 | 2016-05-10 | Google Inc. | Methods and systems for pedestrian avoidance |
KR101402206B1 (en) * | 2014-04-10 | 2014-05-30 | 국방과학연구소 | Multiple target tracking method with kinematics and feature information of targets |
JP2016009251A (en) * | 2014-06-23 | 2016-01-18 | エイディシーテクノロジー株式会社 | Control device for vehicle |
US9925980B2 (en) * | 2014-09-17 | 2018-03-27 | Magna Electronics Inc. | Vehicle collision avoidance system with enhanced pedestrian avoidance |
US10634778B2 (en) * | 2014-10-21 | 2020-04-28 | Texas Instruments Incorporated | Camera assisted tracking of objects in a radar system |
WO2016126317A1 (en) * | 2015-02-06 | 2016-08-11 | Delphi Technologies, Inc. | Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure of other vehicles |
US9718405B1 (en) * | 2015-03-23 | 2017-08-01 | Rosco, Inc. | Collision avoidance and/or pedestrian detection system |
JP6561584B2 (en) * | 2015-05-27 | 2019-08-21 | 株式会社デンソー | Vehicle control apparatus and vehicle control method |
JP6432447B2 (en) * | 2015-05-27 | 2018-12-05 | 株式会社デンソー | Vehicle control apparatus and vehicle control method |
US9637120B2 (en) * | 2015-06-24 | 2017-05-02 | Delphi Technologies, Inc. | Cognitive driver assist with variable assistance for automated vehicles |
US9493118B1 (en) * | 2015-06-24 | 2016-11-15 | Delphi Technologies, Inc. | Cognitive driver assist with variable warning for automated vehicles |
KR102356656B1 (en) * | 2015-07-29 | 2022-01-28 | 주식회사 만도모빌리티솔루션즈 | Driving assistant device and driving assistant method |
JP6358409B2 (en) * | 2016-01-22 | 2018-07-25 | 日産自動車株式会社 | Vehicle driving support control method and control device |
JP6418407B2 (en) * | 2016-05-06 | 2018-11-07 | トヨタ自動車株式会社 | Brake control device for vehicle |
US9449506B1 (en) * | 2016-05-09 | 2016-09-20 | Iteris, Inc. | Pedestrian counting and detection at a traffic intersection based on location of vehicle zones |
US9981602B2 (en) * | 2016-08-19 | 2018-05-29 | 2236008 Ontario Inc. | System and method for pedestrian alert |
US20180105107A1 (en) * | 2016-10-19 | 2018-04-19 | Novateur Research Solutions LLC | Pedestrian collision warning system for vehicles |
KR101996415B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Posture information based pedestrian detection and pedestrian collision prevention apparatus and method |
KR101996418B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
KR101996416B1 (en) * | 2016-12-30 | 2019-10-01 | 현대자동차주식회사 | Method and apparatus for pedestrian collision mitigation |
KR101996419B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
KR101996414B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Pedestrian collision prevention apparatus and method considering pedestrian gaze |
KR101996417B1 (en) * | 2016-12-30 | 2019-07-04 | 현대자동차주식회사 | Posture information based pedestrian detection and pedestrian collision prevention apparatus and method |
US11062608B2 (en) * | 2018-05-11 | 2021-07-13 | Arnold Chase | Passive infra-red pedestrian and animal detection and avoidance system |
US10347132B1 (en) * | 2018-10-30 | 2019-07-09 | GM Global Technology Operations LLC | Adjacent pedestrian collision mitigation |
US11682272B2 (en) * | 2020-07-07 | 2023-06-20 | Nvidia Corporation | Systems and methods for pedestrian crossing risk assessment and directional warning |
-
2016
- 2016-12-30 KR KR1020160184299A patent/KR101996418B1/en active Active
-
2017
- 2017-12-05 EP EP17205392.8A patent/EP3342661B1/en active Active
- 2017-12-06 US US15/833,799 patent/US10814840B2/en active Active
- 2017-12-08 CN CN201711299202.0A patent/CN108263279B/en active Active
-
2020
- 2020-09-23 US US17/029,252 patent/US11584340B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10814840B2 (en) | 2020-10-27 |
KR101996418B1 (en) | 2019-07-04 |
CN108263279B (en) | 2022-11-01 |
KR20180078983A (en) | 2018-07-10 |
US20180236985A1 (en) | 2018-08-23 |
US11584340B2 (en) | 2023-02-21 |
US20210031737A1 (en) | 2021-02-04 |
CN108263279A (en) | 2018-07-10 |
EP3342661A1 (en) | 2018-07-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: MIN, SUK KI Inventor name: KIM, EUNG SEO Inventor name: LEE, TAE YOUNG Inventor name: KWON, YONG SEOK Inventor name: LEE, SANG MIN Inventor name: LEE, KANG HOON Inventor name: LEE, MIN BYEONG Inventor name: JUNG, IN YONG Inventor name: PARK, SEUNG WOOK Inventor name: KIM, YOON SOO Inventor name: LEE, WAN JAE Inventor name: SUNG, DONG HYUN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190102 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: HYUNDAI MOTOR COMPANY Owner name: KIA CORPORATION |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20211020 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230420 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20231018 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602017080162 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240621 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240620 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240620 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240620 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240621 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1667576 Country of ref document: AT Kind code of ref document: T Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240720 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240722 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240720 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602017080162 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20241121 Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20241121 Year of fee payment: 8 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20241121 Year of fee payment: 8 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20241223 |