
US10981507B1 - Interactive safety system for vehicles - Google Patents

Interactive safety system for vehicles

Info

Publication number
US10981507B1
Authority
US
United States
Prior art keywords
vehicle
safety system
display
processor
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/930,974
Other versions
US20210138960A1 (en)
Inventor
Timothy J. Benjamin
Peter M. Bartek
Jon A. Preston
Jeff Chastine
Joseph L. Calabrese
Corey Dash
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Railserve Inc
Original Assignee
Focused Technology Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US16/930,974 (Critical)
Application filed by Focused Technology Solutions Inc filed Critical Focused Technology Solutions Inc
Assigned to FOCUSED TECHNOLOGY SOLUTIONS, INC. reassignment FOCUSED TECHNOLOGY SOLUTIONS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RAILSERVE INC.
Assigned to FOCUSED TECHNOLOGY SOLUTIONS, INC. reassignment FOCUSED TECHNOLOGY SOLUTIONS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BARTEK, PETER M., CALABRESE, JOSEPH L.
Assigned to RAILSERVE INC. reassignment RAILSERVE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BENJAMIN, TIMOTHY J, CHASTINE, JEFF, PRESTON, JON A
Priority to US17/232,068 (US11318886B2)
Publication of US10981507B1
Application granted granted Critical
Publication of US20210138960A1
Assigned to FOCUSED TECHNOLOGY SOLUTIONS, INC. reassignment FOCUSED TECHNOLOGY SOLUTIONS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DASH, COREY
Priority to US17/734,699 (US12077046B2)
Assigned to RAILSERVE, INC. reassignment RAILSERVE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FOCUSED TECHNOLOGY SOLUTIONS, INC.
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • B60K35/223Flexible displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/26Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/55Instruments with parts that can change their shape or position to configure an active screen, e.g. by folding or by rolling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/60Instruments characterised by their location or relative disposition in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/65Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60K35/654Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive the user being the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00Dashboards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/525Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking automatically indicating risk of collision between vehicles in traffic or with pedestrians, e.g. after risk assessment using the vehicle sensor data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00Arrangement or adaptation of acoustic signal devices
    • B60Q5/005Arrangement or adaptation of acoustic signal devices automatically actuated
    • B60Q5/006Arrangement or adaptation of acoustic signal devices automatically actuated indicating risk of collision between vehicles or with pedestrians
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/008Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/002Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles specially adapted for covering the peripheral part of the vehicle, e.g. for viewing tyres, bumpers or the like
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231Circuits relating to the driving or the functioning of the vehicle
    • B60R16/0232Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N5/247
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/176Camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/177Augmented reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/178Warnings
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20Optical features of instruments
    • B60K2360/21Optical features of instruments using cameras
    • B60K2370/1529
    • B60K2370/157
    • B60K2370/178
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23Head-up displays [HUD]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/205Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Definitions

  • the present invention relates to vehicles, and more particularly, to an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles.
  • a system is needed to improve safety for vehicles by improving peripheral vision and situational awareness to reduce vehicle collisions and accidents.
  • a system is also needed with predictive capabilities for vehicles, people, and other objects that will help the driver to make initial decisions and have the machine or system take over the decision-making and actions when the driver is making a mistake (i.e. an imminent accident or collision). This is especially needed with bigger vehicles, for example trucks, trains, etc.
  • the present invention generally provides an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles. There are a number of vehicle collisions and accidents related to reduced peripheral vision.
  • an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor.
  • the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor.
  • the display system may comprise a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices.
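  • As a hedged illustration only (the field names, units, and transport below are assumptions, not taken from the patent), the following sketch shows one way the location, speed, and direction data from the object detecting sensors could be represented before being transmitted to the processor:

```python
from dataclasses import dataclass

@dataclass
class ObjectReading:
    """One detected object as an object detecting sensor might report it.

    All field names and units are illustrative assumptions; the description
    only specifies that location, speed, and direction data are transmitted.
    """
    x_m: float          # lateral offset from the vehicle, meters
    y_m: float          # longitudinal offset from the vehicle, meters
    speed_mps: float    # object speed, meters per second
    heading_deg: float  # direction of travel, degrees from the vehicle's forward axis
    timestamp_s: float  # time of measurement, seconds

def transmit_to_processor(readings, processor_queue):
    """Stand-in for the sensor-to-processor link; a real system would use a
    vehicle bus or network transport rather than an in-memory list."""
    processor_queue.extend(readings)

# Usage: two hypothetical readings queued for the processor.
queue = []
transmit_to_processor([ObjectReading(2.0, 15.0, 1.4, 90.0, 0.0),
                       ObjectReading(-3.5, 40.0, 13.0, 180.0, 0.0)], queue)
```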
  • the processor and the memory unit storing computer-executable instructions may cause the processor to further provide a visual warning to the danger object, wherein the visual warning is provided external to the vehicle.
  • the processor and the memory unit storing computer-executable instructions may cause the processor to further provide an audible warning to the danger object, wherein the audible warning is provided external to the vehicle, wherein the audible warning is a 3D sound directed to the danger object.
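  • A minimal sketch, assuming a simple planar coordinate convention that the patent does not specify, of how an aimed (3D) audible warning might be steered toward the danger object by computing its bearing from the vehicle:

```python
import math

def warning_azimuth_deg(obj_x_m, obj_y_m):
    """Azimuth of the danger object relative to the vehicle's forward axis.

    Assumed convention: +y is forward and +x is to the right, so 0 degrees is
    straight ahead and positive angles are to the right. A directional
    speaker or audio spotlight could be steered to this bearing to aim the
    audible warning at the danger object.
    """
    return math.degrees(math.atan2(obj_x_m, obj_y_m))

# Example: an object 2 m to the right and 10 m ahead lies at roughly 11 degrees.
print(round(warning_azimuth_deg(2.0, 10.0), 1))
```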
  • the one or more object detecting sensors may be ultrasonic sensors, LIDAR radar sensors, or photoelectric sensors.
  • the processor and the memory unit storing computer-executable instructions may cause the processor to further light a path of the vehicle with high-intensity lights based on the predictive algorithm and trajectory analysis, wherein the path is externally outside the vehicle on a pavement of a street.
  • the display system may further include a heads-up display on a windshield of the vehicle to display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on a heads-up display.
  • the display system may further include a dashboard display on a dashboard of the vehicle to display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on the dashboard display.
  • an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor.
  • the display system may include a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices; and a rear-view display located in the vehicle configured to portray an image that would be seen in a rear-view mirror from the one or more image capturing devices.
  • the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor.
  • an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; one or more telematics devices configured to transmit vehicle telematics data from the vehicle to the processor; and a display system connected to the processor.
  • the display system may include: a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices; a rear-view display located in the vehicle configured to portray an image that would be seen in a rear-view mirror from the one or more image capturing devices; and a heads-up display on a windshield of the vehicle to display the one or more objects with the location, speed, and direction data. Additionally, the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor.
  • the danger object may be highlighted yellow if the danger object is potentially in a path of an accident or a collision and the danger object may be highlighted red if the danger object is imminently in a path of an accident or a collision with the vehicle.
  • the highlighting of the danger object may include one or more of the following: blinking colors on the danger object, circles around the danger object, or blinking circles around the danger object.
  • an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor.
  • the display system may include a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices.
  • the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor.
  • FIG. 1 illustrates a prior art version of a vehicle with a side rearview mirror and an A-pillar
  • FIG. 2 illustrates a vehicle with an interactive vehicle safety system with side rearview mirror elimination in accordance with an embodiment of the present invention
  • FIG. 3 illustrates a vehicle with an interactive vehicle safety system having pillar obstruction elimination in accordance with an embodiment of the present invention
  • FIG. 4 illustrates a vehicle with an interactive vehicle safety system having both the side rearview mirror elimination and pillar obstruction elimination in accordance with an embodiment of the present invention
  • FIGS. 5A-5D illustrate top views of various vehicles with an interactive vehicle safety system in accordance with an embodiment of the present invention
  • FIG. 6 illustrates an illustrative system depiction of an interactive vehicle safety system in accordance with an embodiment of the present invention
  • FIGS. 7A1 and 7A2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a second vehicle turning left in front of a main vehicle in accordance with an embodiment of the present invention
  • FIGS. 7B1 and 7B2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a main vehicle turning left in front of a second vehicle in accordance with an embodiment of the present invention
  • FIG. 7C illustrates an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a pedestrian walking in a crosswalk in front of a main vehicle in accordance with an embodiment of the present invention
  • FIGS. 7D1 and 7D2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a main vehicle turning left in front of a pedestrian in a crosswalk in accordance with an embodiment of the present invention
  • FIG. 8A illustrates the interactive vehicle safety system from FIGS. 7A1 and 7A2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention
  • FIG. 8B illustrates the interactive vehicle safety system from FIGS. 7B1 and 7B2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention
  • FIG. 8C illustrates the interactive vehicle safety system from FIG. 7C with a heads-up-display (HUD) in accordance with an embodiment of the present invention
  • FIG. 8D illustrates the interactive vehicle safety system from FIGS. 7D1 and 7D2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention
  • FIG. 9 illustrates an interactive vehicle safety system with a display in the dashboard in accordance with an embodiment of the present invention.
  • FIG. 10 illustrates an interactive vehicle safety system using hologram technology in accordance with an embodiment of the present invention
  • FIG. 11 illustrates an interactive vehicle safety system using audio spotlight technology in accordance with an embodiment of the present invention.
  • FIG. 12 illustrates an interactive vehicle safety system using an external highlighting technology in accordance with an embodiment of the present invention.
  • the present invention relates to vehicles, and more particularly, to an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles.
  • the interactive vehicle safety system may have capabilities for portraying objects, which are being blocked by any of the structural pillars and/or mirrors of a vehicle (such as a truck, van, train, etc.).
  • the interactive vehicle safety system disclosed may comprise one or more image capturing devices (such as camera, sensor, laser), distance and object sensors (such as ultrasonic sensor, LIDAR radar sensor, photoelectric sensor, and infrared sensor), a real-time image processing of an object, and one or more display systems (such as LCD or LED displays).
  • the interactive vehicle safety system may give a seamless 360-degree front panoramic view to a driver.
  • the invention relates to an interactive vehicle safety system that includes real-time image processing for a vehicle with clear metal technology.
  • Clear metal technology refers to the capability for portraying objects, which are being blocked by any of the structural pillars and/or mirrors of a vehicle.
  • the clear metal technology may use one or multiple cameras located on the other side of a structural pillar or obstruction blocking the vision of the operator to portray an image blocked by the structural pillar/obstruction on the structural pillar/obstruction of the vehicle to make the structural pillar/obstruction functionally disappear.
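  • The following is a simplified geometric sketch of that idea: select the slice of the exterior camera image that subtends the same horizontal angle the pillar hides from the driver's eye point. The pinhole-camera model, the flat-strip pillar approximation, and every parameter value are assumptions for illustration; the patent does not prescribe this math.

```python
import math

def pillar_blocked_columns(eye_to_pillar_m, pillar_width_m,
                           camera_hfov_deg, image_width_px,
                           pillar_center_angle_deg):
    """Return the (left, right) pixel columns of the exterior camera image
    that roughly correspond to the region hidden behind the A-pillar.

    Assumptions: the camera looks in the same direction as the driver, a
    simple pinhole projection with uniform degrees-per-pixel, and a pillar
    approximated as a flat strip at a known distance and bearing from the
    driver's eye point.
    """
    # Half of the horizontal angle the pillar subtends at the driver's eye.
    half_angle_deg = math.degrees(math.atan2(pillar_width_m / 2.0, eye_to_pillar_m))
    px_per_deg = image_width_px / camera_hfov_deg
    center_px = image_width_px / 2.0 + pillar_center_angle_deg * px_per_deg
    half_px = half_angle_deg * px_per_deg
    return int(center_px - half_px), int(center_px + half_px)

# Example: a 0.10 m wide pillar 0.7 m from the eye, 35 degrees to the left,
# seen through a 120-degree, 1920-pixel-wide exterior camera.
print(pillar_blocked_columns(0.7, 0.10, 120.0, 1920, -35.0))
```
Tracking the driver's head position, as described below, would simply update the distance and bearing inputs so that the displayed slice shifts with the driver's viewpoint.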
  • the interactive vehicle safety system and clear metal technology may be used with vehicles such as automobiles, trucks, trains, bicycles, motorcycles, forklifts, etc.
  • the interactive vehicle safety system may include one or more features such as: changing the image on the screen as the vehicle driver's head moves (using one camera or multiple cameras in “stereo” to track the driver's head position); detecting and tracking in real-time object movement outside the vehicle and highlighting stationary or moving objects around the vehicle that may be a “danger” by using trajectory analysis and dead reckoning of the vehicle (speed, direction, turn radius, etc.) and/or trajectory analysis and dead reckoning of the moving objects (speed, direction, etc.); and a dead reckoning heads-up display that adds object detecting sensors (for example, ultrasonic sensor, LIDAR radar sensor, photoelectric sensor, and infrared sensor) to the front bumper, rear bumper, or other locations around the vehicle and provides a dead reckoning time fully across the windshield.
  • the interactive vehicle safety system may utilize sensors on the vehicle to sense approaching objects and turn on the clear metal technology, and to turn the clear metal technology off if nothing is sensed by the sensors.
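  • A minimal sketch of that on/off behavior, with an assumed hold time added so the pillar display does not flicker when detections are intermittent; the hold-time value and the class structure are illustrative, not specified by the patent:

```python
class ClearMetalActivation:
    """Turn the clear metal display on when any object is sensed, and back
    off only after nothing has been sensed for hold_s seconds (an assumed
    debounce, not a value from the patent)."""

    def __init__(self, hold_s=3.0):
        self.hold_s = hold_s
        self.last_detection_s = None

    def update(self, object_sensed, now_s):
        """Return True while the clear metal display should be active."""
        if object_sensed:
            self.last_detection_s = now_s
        if self.last_detection_s is None:
            return False
        return (now_s - self.last_detection_s) <= self.hold_s

# Usage: the display stays on for the assumed 3-second hold after the last detection.
activation = ClearMetalActivation()
print(activation.update(True, 0.0))   # True  -- object sensed
print(activation.update(False, 2.0))  # True  -- within the hold time
print(activation.update(False, 6.0))  # False -- nothing sensed for more than 3 s
```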
  • FIG. 1 illustrates a prior art version of the inside of a vehicle 10 .
  • the vehicle 10 includes a large rear-view mirror 12 and a structural A-pillar 14 on the vehicle 10 .
  • FIG. 1 demonstrates the need for a system to improve the peripheral vision of the driver from the blocked view of the rear-view mirror 12 and/or one or more of the structural pillars 14 of the vehicle 10 .
  • FIG. 2 illustrates a vehicle 10 with an interactive vehicle safety system 100 with a side rearview mirror elimination system 200 .
  • the side mirror elimination system 200 of the interactive vehicle safety system 100 of FIG. 2 may include one or more image capturing devices 210 (such as a camera, sensor, or laser) to take the place of the rearview mirror 12 , real-time image processing, and one or more display systems 220 (such as an LCD or LED display) to portray an image 222 that would be seen on the rear-view mirror 12 .
  • the display system 220 may be a retina scanner display.
  • the display 220 may be flat, flexible, bendable, or curved without departing from this invention.
  • the display system 220 may be located on the A-pillar 14 .
  • the one or more displays 220 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16 , or as part of a heads-up-display or other locations.
  • FIG. 3 illustrates a vehicle 10 with an interactive vehicle safety system 100 having a pillar obstruction elimination system 250 and capabilities for portraying objects that are being blocked by an A-pillar 14 (or other structural pillars) of the vehicle 10 .
  • the pillar obstruction elimination system 250 of the interactive vehicle safety system 100 of FIG. 3 may include one or more image capturing devices (such as a camera, sensor, or laser) to portray an image 262 blocked by the obstruction of the A-pillar 14 on the vehicle 10 to make the structural pillar/obstruction 14 functionally disappear.
  • the interactive vehicle safety system 100 may include real-time image processing and one or more display systems 260 (such as an LCD or LED display) to portray the image 262 that is blocked by the A-pillar 14.
  • the display system 260 may be a retina scanner display.
  • the display 260 may be flat, flexible, bendable, or curved without departing from this invention.
  • the display system 260 may be located on the A-pillar 14 .
  • the display 260 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16 , or as part of a heads-up-display or other locations.
  • FIG. 4 illustrates a vehicle 10 with an interactive vehicle safety system 100 having both a side rearview mirror elimination system 200 and a pillar obstruction elimination system 250 for portraying objects that are being blocked by an A-pillar 14 or other structural pillar of the vehicle 10.
  • the interactive vehicle safety system 100 of FIG. 4 may include one or more image capturing devices 210 (such as a camera, sensor, or laser) to take the place of the rearview mirror. Additionally, the interactive vehicle safety system 100 of FIG. 4 may include one or more image capturing devices (such as a camera, sensor, or laser) to portray an image 262 blocked by the obstruction of the A-pillar 14 on the vehicle 10 to make the structural pillar/obstruction 14 functionally disappear.
  • the interactive vehicle safety system 100 may include real-time image processing and a display system with one or more displays 220 , 260 (such as an LCD or LED display) to portray the image 222 that would be seen on the rear-view mirror 12 and to portray the image 262 that is blocked by the A-pillar 14 or other structural pillar.
  • the displays 220 , 260 may be retina scanner displays.
  • the displays 220 , 260 may be flat, flexible, bendable, or curved without departing from this invention.
  • the display system 220 , 260 may be located on the A-pillar 14 .
  • the display 220 , 260 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16 , or as part of a heads-up-display or other locations.
  • the interactive vehicle safety system 100 may help an operator detect a collision and provide further information to eliminate or reduce the risks in many different areas, such as: providing vision to the operator, predicting when or how long of a reaction time before an accident occurs, providing audible warnings to the public and operator, providing visual warnings to the public and operator, determining and providing a suggested escape route or action, and determining and providing any automatic vehicle operations in response to the imminent collision (i.e. automatic braking and/or steering).
  • FIGS. 5A-5D illustrate top views of various vehicles with an interactive vehicle safety system 100 and various input systems.
  • FIG. 5A illustrates a top view of an automobile 10 A with the interactive vehicle safety system 100 .
  • FIG. 5B illustrates a top view of a pick-up truck 10 B with the interactive vehicle safety system 100 .
  • FIG. 5C illustrates a top view of a delivery truck 10 C with the interactive vehicle safety system 100 .
  • FIG. 5D illustrates a top view of a semi-truck 10 D with the interactive vehicle safety system 100 .
  • the interactive vehicle safety system 100 may be utilized with any vehicle without departing from this invention.
  • the interactive vehicle safety system 100 may provide an operator with vision and situational awareness to what is happening external to the vehicle 10 by providing peripheral visual awareness.
  • As illustrated in FIGS. 5A-5D , the interactive vehicle safety system 100 and vehicles 10 A, 10 B, 10 C, 10 D may include one or more of the following: cameras 124 , object detecting sensors 120 , ultrasonic sensors 122 , and vehicle telematic sensors 130 , etc.
  • Other inputs may be included with the interactive vehicle safety system 100 and vehicles 10 A, 10 B, 10 C, 10 D as described and illustrated in FIG. 6 .
  • the one or more cameras 124 , object detecting sensors 120 , etc. and other input data devices as listed and described with FIG. 6 may be located at various locations throughout the vehicle 10 , such as on the front bumper, rear bumper or rear area, side of the vehicle 10 , on top of the vehicle 10 , under the vehicle, within the inside of the vehicle 10 , or any other locations that can provide meaningful inputs to the interactive vehicle safety system 100 .
  • FIG. 6 illustrates an illustrative system depiction of the interactive vehicle safety system 100 .
  • the interactive vehicle safety system 100 may include a processor 105 that includes a processing unit and a system memory to store and execute software instructions.
  • the interactive vehicle safety system 100 may receive various data inputs 110 and provide various outputs 150 to predict a potential incident, determine danger, and inform the vehicle 10 and operator to slow down, turn, or stop.
  • As illustrated in FIG. 6 , the various inputs 110 to the processor 105 and the interactive vehicle safety system 100 may include one or more of the following: depth camera 112 , lasers 114 , accelerometer device 116 , aimed audio device 118 , object detecting sensors 120 , ultrasonic sensor 122 , cameras 124 , LIDAR radar sensor 126 , photoelectric sensor 128 , telematics device 130 , infrared sensor 132 , internet of things (IoT) 132 , or GPS device 134 .
  • the interactive vehicle safety system 100 may include various image capturing devices, such as cameras 124 , to capture what is happening in the real world external to the vehicle 10 and to bring in and highlight what may happen in the near future.
  • the interactive vehicle safety system 100 may include LIDAR radar 126 and other object detecting sensors 120 (such as ultrasonic sensor 122 , photoelectric sensor 128 , and infrared sensor 132 ).
  • Other sensors and inputs 110 may be utilized for the interactive vehicle safety system 100 , such as GPS 136 , vehicle telematic sensors 140 , and Internet of Things (IoT) 132 information.
  • Vehicle telematics sensors 140 may monitor the vehicle 10 by using GPS 136 and onboard diagnostics to record movements on a computerized map, such as with a GPS receiver, an engine interface, an input/output interface (expander port) in the vehicle 10 , a SIM card, or an accelerometer 116 .
  • the interactive vehicle safety system 100 will gather data from these inputs and the real world to help the operator see obstacles and provide the operator information to be able to react to obstacles.
  • the processor 105 may include a processing unit and a system memory to store and execute software instructions.
  • the various inputs 110 and outputs 150 may be connected to the processor 105 . Additionally, the processor 105 may be in communication with and connected to other various computer systems.
  • the processor 105 of the interactive vehicle safety system 100 may have various outputs 150 after processing the various inputs 110 .
  • the outputs 150 of the interactive vehicle safety system 100 may include one or more of the following: audio spotlight 152 , hologram display 154 , heads-up display 156 , LED display 158 , display system 160 , LCD display 162 , dashboard display 164 , haptic warning 166 , audible warnings 168 , image analysis 170 , augmented reality display 172 , external visual warnings 174 , or accident prediction 176 .
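  • As a hedged sketch of how the processor 105 might tie the listed inputs 110 to the listed outputs 150 (the function names and interfaces are assumptions for illustration only): gather readings, run the trajectory and dead-reckoning prediction, then fan the result out to the display and warning outputs.

```python
from typing import Callable, Dict, Iterable, List

def processing_cycle(read_inputs: Callable[[], Dict],
                     predict_danger: Callable[[Dict], List[Dict]],
                     output_sinks: Iterable[Callable[[List[Dict]], None]]) -> None:
    """One pass of an assumed input-process-output loop for processor 105.

    read_inputs    -- gathers camera frames, object sensor readings, telematics, etc.
    predict_danger -- trajectory / dead-reckoning analysis producing danger objects
    output_sinks   -- heads-up display, dashboard display, audible/visual warnings, ...
    """
    inputs = read_inputs()
    danger_objects = predict_danger(inputs)
    for sink in output_sinks:
        sink(danger_objects)

# Usage with stand-in callables; a real system would wire actual devices here.
processing_cycle(lambda: {"objects": []},
                 lambda inputs: [],
                 [lambda danger: None])
```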
  • the processor 105 of the interactive vehicle safety system 100 may control and process various actions for the interactive vehicle safety system 100 as will be described further below.
  • the processor 105 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • a general-purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • the one or more implementations described throughout this disclosure may utilize logical blocks, modules, and circuits that may be implemented or performed with the processor 105 .
  • the processor 105 may be used to implement various aspects and features described herein. As such, the processor 105 may be configured to execute multiple calculations, in parallel or in serial, and may execute coordinate transformations, curve smoothing, noise filtering, outlier removal, amplification, summation processes, and the like.
  • the processor 105 may include a processing unit and system memory to store and execute software instructions.
  • the processor 105 may be in communication with and/or connected to the interactive vehicle safety system 100 that may provide a central analysis and display.
  • FIGS. 7A1-7D2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270 A along the full windshield 18 .
  • FIGS. 8A-8D illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270 B along a portion of the windshield 18 , namely the lower left-hand portion of the windshield 18 .
  • the interactive vehicle safety system 100 may include trajectory and dead reckoning analysis.
  • dead reckoning is the process of calculating a vehicle's current position and/or future position by using a previously determined position, or current position, and by using estimation of speed and course over elapsed time.
  • the interactive vehicle safety system 100 may calculate a vehicle's future position by using a current position and by using estimations of speed, course, and other inputs over elapsed time. For collision avoidance, the reaction time may be approximately 2 seconds, with a possible stopping time of 3-5 seconds.
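  • A minimal dead-reckoning sketch consistent with that definition: project a position forward from the current position using an assumed-constant speed and course over the elapsed time. The coordinate convention and constant-course assumption are illustrative, not the patent's.

```python
import math

def dead_reckon(x_m, y_m, speed_mps, course_rad, elapsed_s):
    """Estimate a future position from a known position, speed, and course,
    assuming both stay constant over the elapsed time. Course is measured
    from the +y "forward" axis, positive toward +x (an assumed convention)."""
    dx = speed_mps * elapsed_s * math.sin(course_rad)
    dy = speed_mps * elapsed_s * math.cos(course_rad)
    return x_m + dx, y_m + dy

# Example: a vehicle at the origin traveling straight ahead at 15 m/s is
# projected 30 m forward over a 2-second reaction time.
print(dead_reckon(0.0, 0.0, 15.0, 0.0, 2.0))
```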
  • the interactive vehicle safety system 100 may include real-time image processing for a vehicle to detect and track in real-time object movement outside the vehicle.
  • the interactive vehicle safety system 100 may also highlight stationary or moving objects around the vehicle that are predicted to be a danger utilizing trajectory analysis and dead reckoning of the vehicle and the stationary or moving object.
  • the trajectory analysis and dead reckoning analysis may utilize speed, direction, acceleration, turn radius, etc. from the vehicle and speed, direction, acceleration, etc. from the moving object.
  • the prediction of the route and path of the vehicle and the location of the moving object may be calculated by the interactive vehicle safety system 100 by various methods, such as algorithms using speed, direction, turn radius, acceleration, GPS, vehicle telematic data and sensors, cameras, external sensors, mapping information, etc. Additionally, the interactive vehicle safety system 100 may include a predictive algorithm utilizing data and information to watch what pedestrians are doing, such as looking at a phone or wearing headphones, or tracking the pedestrian's acceleration to determine if they are walking, about to run, or about to stop. The interactive vehicle safety system 100 may also utilize machine learning with the image processing to provide better predictive algorithms for the trajectory analysis.
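  • One common way to turn such trajectory and dead-reckoning data into a danger decision is a closest-point-of-approach check between the vehicle's and the object's projected straight-line paths; the constant-velocity assumption and the example numbers below are illustrative and are not the patent's specific algorithm.

```python
def closest_approach(veh_pos, veh_vel, obj_pos, obj_vel):
    """Constant-velocity closest point of approach between vehicle and object.

    Positions are (x, y) in meters and velocities are (vx, vy) in m/s.
    Returns (time_s, distance_m); time is clamped to 0 if the paths are
    already diverging.
    """
    rx, ry = obj_pos[0] - veh_pos[0], obj_pos[1] - veh_pos[1]
    wx, wy = obj_vel[0] - veh_vel[0], obj_vel[1] - veh_vel[1]
    speed_sq = wx * wx + wy * wy
    t = 0.0 if speed_sq == 0 else max(0.0, -(rx * wx + ry * wy) / speed_sq)
    cx, cy = rx + wx * t, ry + wy * t
    return t, (cx * cx + cy * cy) ** 0.5

# Example: a vehicle driving straight ahead at 15 m/s and a pedestrian 30 m
# ahead crossing from the right at 1.5 m/s come within about 2 m in about
# 2 seconds, which the system could flag as a danger object.
print(closest_approach((0.0, 0.0), (0.0, 15.0), (5.0, 30.0), (-1.5, 0.0)))
```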
  • the interactive vehicle safety system 100 may include a dead-reckoning heads-up display 270 that incorporates ultrasound sensors or other sensors at the front bumper, rear bumper, or other locations around the vehicle.
  • the interactive vehicle safety system 100 provides a dead reckoning display 270 A with information and warnings fully across the windshield and a real-time picture across the windshield that can be displayed in various colors (e.g., gray for dead reckoning and another color for real time).
  • the heads-up display 270 A may include both the vehicle/obstacles 272 and text 274 , such as “WARNING” as shown in the heads-up display 270 A.
  • the interactive vehicle safety system 100 functionally removes the A-pillar and other structural pillars from the operator's view to provide a clear, complete, open view to the operator of the vehicle 10 .
  • the interactive vehicle safety system 100 displays the object and the object's speed and direction both on the clear metal screen 260 and across the windshield 18 by way of the heads-up-display 270 A on the windshield 18 or dash 16 .
  • the interactive vehicle safety system 100 may then utilize real-time trajectory and object movement analysis and bring the dead reckoning into the real time space on the heads-up-display.
  • the interactive vehicle safety system 100 may include a dead reckoning strip of LED pictures across the front of the dashboard 16 of the vehicle 10 and reflecting onto the windshield 18 .
  • FIGS. 7A1 and 7A2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a second vehicle 20 turning left in front of a main vehicle 10 .
  • FIG. 7A1 shows the second vehicle 20 preparing to turn left in front of the main vehicle 10 .
  • FIG. 7A2 shows the heads-up-display 270 B with the second vehicle 20 A turning in front of the main vehicle 10 and “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100 .
  • FIGS. 7B1 and 7B2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a main vehicle 10 turning left in front of a second vehicle 20 .
  • FIG. 7B1 shows the main vehicle 10 preparing to turn left in front of the second vehicle 20 .
  • FIG. 7B2 shows the heads-up-display 270 B with the main vehicle 10 A turning in front of the second vehicle 20 and “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100 .
  • FIG. 7C illustrates the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a pedestrian 22 walking in a crosswalk in front of a main vehicle 10 .
  • the pedestrian 22 is behind and blocked by the A-pillar 14 .
  • the operator is able to view the pedestrian 22 in the crosswalk because of the display 260 from the pillar obstruction elimination system 250 .
  • the heads-up-display 270 B may display “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100 .
  • FIGS. 7 D 1 and 7 D 2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a main vehicle 10 turning left in front of a pedestrian 20 in a crosswalk.
  • FIG. 7 D 1 shows the main vehicle 10 preparing to turn left in the direction of the pedestrian 22 in the crosswalk.
  • FIG. 7 D 2 shows the heads-up-display 270 B with the main vehicle 10 A turning towards the pedestrian 22 in the crosswalk along with “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100 .
  • FIGS. 8A-8D illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270 B along a portion of the windshield 18 , and the lower left-hand portion of the windshield 18 .
  • the heads-up display 270 B (along with the heads-up display 270 A) may include both the vehicle/obstacles 272 or text 274 , such as “WARNING” as shown in the heads-up display 270 B.
  • FIG. 8A illustrates the interactive vehicle safety system 100 with the heads-up-display (HUD) 270 B in the lower left-hand portion of the windshield 18 showing a second vehicle 20 turning left in front of a main vehicle 10 .
  • FIG. 8B illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270 B in the lower left-hand portion of the windshield 18 showing a main vehicle 10 turning left in front of a second vehicle 20 .
  • FIG. 8C illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270 B in the lower left-hand portion of the windshield 18 showing a pedestrian 22 walking in a crosswalk in front of a main vehicle 10 .
  • FIG. 8D illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270 B in the lower left-hand portion of the windshield 18 showing a main vehicle 10 turning left in front of a pedestrian 22 in a crosswalk.
  • HUD heads-up-display
  • the highlighting of the moving object in danger may be highlighted in various stages.
  • the moving object may be highlighted yellow if the object is potentially in the path of accident or collision with the vehicle.
  • the moving object may be highlighted red if the object is moving and imminently in the path of accident or collision with the vehicle. Highlighting may be in the form of different colors, blinking colors, circles around the object, blinking circles, etc, without departing from this invention.
  • the interactive vehicle safety system 100 may utilize the heads-up display 270 as described and illustrated previously or the interactive vehicle safety system 100 may utilize a dashboard display 280 .
  • FIG. 9 illustrates an exemplary dashboard display 280 located on the dashboard 16 of the vehicle 10 .
  • the dashboard display 280 may include both the vehicle/obstacles 282 or text 284 , such as “WARNING” as shown in the dashboard display 280 .
  • the interactive vehicle safety system 100 may utilize any one of or any combination of the heads-up display 270 A, the heads-up display 270 B, or the dashboard display 280 without departing from this invention to display what is happening outside the vehicle or what might happen outside the vehicle.
  • the interactive vehicle safety system 100 may provide a full vision of what is happening outside of the vehicle 10 .
  • the interactive vehicle safety system 100 will provide and improve an operator's peripheral visual awareness to provide situational awareness to the operator.
  • the interactive vehicle safety system 100 may utilize one or more of the following systems and information to provide and improve the operator's peripheral visual awareness.
  • the interactive vehicle safety system 100 may provide an augmented reality system.
  • the interactive vehicle safety system 100 may also provide depth cameras or other system that provide depth imagery allowing the ability to change the perspective of the operator's view.
  • the augmented reality system and/or the depth cameras may provide a display with the driver's perspective “in front” of the vehicle or with the driver's perspective “on top” of the vehicle.
  • Depth cameras may also not return colors and may return a grey-scale image to help determine depth and therefore the distance of a pedestrian, object, or other danger object in the path of the vehicle.
  • the operator may also utilize glasses, contacts, or a circular plastic cover that drops over the face and eyes to provide an augmented reality vision of the full vision of what is happening outside of the vehicle.
  • the cover may drop over the face and eyes of the operator when required or when initiated by the interactive vehicle safety system 100 .
  • the augmented reality system may initiate the movement of the cover based on a projected trajectory of a collision and/or accident.
  • the augmented reality system may initiate the movement of the cover upon movement of the vehicle.
  • the interactive vehicle safety system 100 may utilize various sensors to provide the operator and system with additional information and situational awareness.
  • the various sensors may include one or more of the following sensors: density, vibration, audio, humidity, air pressure, color, synthetic sensors, etc.
  • the various sensors may be located on the front bumper, rear bumper, or other locations around the vehicle.
  • the one or more sensors may include trajectory sensors to help determine and provide data and analysis of the trajectory and dead reckoning of the vehicle and any moving objects external to the vehicle.
  • the one or more sensors may also include synthetic sensors wherein one sensor senses an action and the other sensors as part of the synthetic sensor confirms that action.
  • the interactive vehicle safety system 100 may utilize stereoscopy with the plurality of cameras.
  • the plurality of cameras may be located throughout the exterior of the vehicle, such as in front, sides, back, top, or bottom of the vehicle.
  • the stereoscopy may utilize two or more cameras to accurately determine depth, location, and trajectory of moving objects or pedestrians external to the vehicle.
  • the interactive vehicle safety system 100 may determine and track the location of the operator's head to change the view for the operator's view based on the operator's head location when moving, rotating, or at different heights. Tracking the operator's head location will help maintain aspect ratio location of the operator's vision of the display and external from the interactive vehicle safety system 100 .
  • the interactive vehicle safety system 100 may integrate data from various other information sources.
  • the plurality of information sources may be one or more information sources on the Internet of Things (IoT), such as from camera information from intersections, buildings, autonomous vehicles, or other camera sources, sensors, or measuring devices throughout the area.
  • IoT Internet of Things
  • the interactive vehicle safety system 100 may include image analysis with cameras that can detection what a pedestrian is doing. For example, image analysis may detect earbuds and/or headphones or a pedestrian talking on a cell phone. The interactive vehicle safety system 100 may utilize this image analysis information to potentially take a different action.
  • the interactive vehicle safety system 100 may utilize hologram technology with motion parallax.
  • the hologram system may provide a hologram 30 on the pillar 14 of the pedestrian 22 located behind the pillar 14 .
  • the hologram technology may utilize a laser, one or more beam splitter mirrors (such as two mirrors), one or more lens (such as three lens), and a holograph film located on the pillar 14 or other location within the vehicle 10 .
  • the light may be in the same direction and same wavelength to provide coherent beams with all light waves in phase to project the hologram 30 or 3D version of what is happening outside and blocked by the driver's vision.
  • the interactive vehicle safety system 100 may provide audible and/or visual warnings to pedestrians and/or other danger objects.
  • the interactive vehicle safety system 100 may provide audible, visual, and other warnings (such as sounds, visual, and/or motion—such as a vibrating seat) to both the pedestrians and the vehicle operator when collision, an accident, or danger is possible.
  • the interactive vehicle safety system 100 may utilize a transducer or other systems to send directed audible warnings to the danger pedestrian, such as 3D sounds.


Abstract

An interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles. For example, the interactive vehicle safety system may have capabilities for portraying objects, which are being blocked by any of the structural pillars and/or mirrors of a vehicle (such as a truck, van, train, etc.). The interactive vehicle safety system disclosed may comprise one or more image capturing devices (such as camera, sensor, laser), distance and object sensors (such as ultrasonic sensor, LIDAR radar sensor, photoelectric sensor, and infrared sensor), a real-time image processing of an object, and one or more display systems (such as LCD or LED displays). The interactive vehicle safety system may give a seamless 360-degree front panoramic view to a driver.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Patent No. 62/932,188, filed Nov. 7, 2019, entitled Interactive Safety System for Vehicles, which is incorporated herein by reference in its entirety and made a part hereof.
FIELD OF THE INVENTION
The present invention relates to vehicles and, more particularly, to an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles.
BACKGROUND
Currently, there are a number of vehicle collisions and accidents related to reduced peripheral vision. A system is needed to improve safety for vehicles by improving peripheral vision and situational awareness to reduce vehicle collisions and accidents. A system is also needed with predictive capabilities for vehicles, people, and other objects that will help the driver to make initial decisions and have the machine or system take over the decision-making and actions when the driver is making a mistake (i.e. an imminent accident or collision). This is especially needed with bigger vehicles, for example trucks, trains, etc.
SUMMARY
The present invention generally provides an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles, thereby addressing the number of vehicle collisions and accidents related to reduced peripheral vision.
According to one embodiment, an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor. The one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor. The display system may comprise a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices. The processor and a memory unit storing computer-executable instructions, which when executed by the processor, may cause the processor to: receive images from the one or more image capturing devices; receive the location, speed, and direction data from the one or more object detecting sensors; process in real-time the images and the location, speed, and direction data from the one or more objects external to the vehicle; track in real-time the one or more objects external to the vehicle using the images and the location, speed, and direction data; predict a future location and a route of the one or more objects external to the vehicle using a predictive algorithm and a trajectory analysis of the one or more objects external to the vehicle; determine in real-time a danger object from the one or more objects external to the vehicle, wherein the danger object is a potential collision based on the predictive algorithm and trajectory analysis; and display the image blocked by the obstruction of the A-pillar on the pillar display.
Additionally, the processor and the memory unit storing computer-executable instructions may cause the processor to further provide a visual warning to the danger object, wherein the visual warning is provided external to the vehicle. The processor and the memory unit storing computer-executable instructions may cause the processor to further provide an audible warning to the danger object, wherein the audible warning is provided external to the vehicle, wherein the audible warning is a 3D sound directed to the danger object. Additionally, the one or more object detecting sensors may be ultrasonic sensors, LIDAR radar sensors, or photoelectric sensors. The processor and the memory unit storing computer-executable instructions may cause the processor to further light a path of the vehicle with high-intensity lights based on the predictive algorithm and trajectory analysis, wherein the path is externally outside the vehicle on a pavement of a street. Additionally, the display system may further include a heads-up display on a windshield of the vehicle to display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on the heads-up display. The display system may further include a dashboard display on a dashboard of the vehicle to display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on the dashboard display.
According to another embodiment, an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor. The display system may include a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices; and a rear-view display located in the vehicle configured to portray an image that would be seen in a rear-view mirror from the one or more image capturing devices. Further, the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor. The processor and a memory unit storing computer-executable instructions, which when executed by the processor, may cause the processor to: receive images from the one or more image capturing devices; receive the location, speed, and direction data from the one or more object detecting sensors; process in real-time the images and the location, speed, and direction data from the one or more objects external to the vehicle; track in real-time the one or more objects external to the vehicle using the images and the location, speed, and direction data; predict a future location and a route of the one or more objects external to the vehicle using a predictive algorithm and a trajectory analysis of the one or more objects external to the vehicle; determine in real-time a danger object from the one or more objects external to the vehicle, wherein the danger object is a potential collision based on the predictive algorithm and trajectory analysis; display the image blocked by the obstruction of the A-pillar on the pillar display; and display the image that would be seen in the rear-view mirror on the rear-view display.
In yet a further embodiment, an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; one or more telematics devices configured to transmit vehicle telematics data from the vehicle to the processor; and a display system connected to the processor. The display system may include: a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices; a rear-view display located in the vehicle configured to portray an image that would be seen in a rear-view mirror from the one or more image capturing devices; and a heads-up display on a windshield of the vehicle to display the one or more objects with the location, speed, and direction data. Additionally, the one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor. The processor and a memory unit storing computer-executable instructions, which when executed by the processor, may cause the processor to: receive images from the one or more image capturing devices; receive the location, speed, and direction data from the one or more object detecting sensors; receive the telematics data from the telematics device; process in real-time the images, the telematics data, and the location, speed, and direction data from the one or more objects external to the vehicle; track in real-time the one or more objects external to the vehicle using the images and the location, speed, and direction data; predict a future location and a route of the one or more objects external to the vehicle using a predictive algorithm and a trajectory analysis of the one or more objects external to the vehicle; determine in real-time a danger object from the one or more objects external to the vehicle, wherein the danger object is a potential collision based on the predictive algorithm and trajectory analysis; highlight the danger object on the display system; display the image blocked by the obstruction of the A-pillar on the pillar display; display the image that would be seen in the rear-view mirror on the rear-view display; and display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on the heads-up display. Further, the danger object may be highlighted yellow if the danger object is potentially in a path of an accident or a collision and the danger object may be highlighted red if the danger object is imminently in a path of an accident or a collision with the vehicle. The highlighting of the danger object may include one or more of the following: blinking colors on the danger object, circles around the danger object, or blinking circles around the danger object.
In an additional embodiment, an interactive safety system for vehicles may comprise: one or more image capturing devices located on the vehicle configured to transmit one or more images from the vehicle to a processor; one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle; and a display system connected to the processor. The display system may include a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices. The one or more object detecting sensors may be configured to transmit the location, speed, and direction data from the one or more objects to the processor. The processor and a memory unit storing computer-executable instructions, which when executed by the processor, may cause the processor to: receive images from the one or more image capturing devices; receive the location, speed, and direction data from the one or more object detecting sensors; process in real-time the images and the location, speed, and direction data from the one or more objects external to the vehicle; track in real-time the one or more objects external to the vehicle using the images and the location, speed, and direction data; predict a future location and a route of the one or more objects external to the vehicle using a predictive algorithm and a trajectory analysis of the one or more objects external to the vehicle; determine in real-time a danger object from the one or more objects external to the vehicle, wherein the danger object is a potential collision based on the predictive algorithm and trajectory analysis; display the image blocked by the obstruction of the A-pillar on the pillar display; and provide an audible warning to an operator coming from a location of the danger object, wherein the audible warning is an audio spotlight from an aimed audio device connected to the processor that casts a sound to a surface redirecting the sound to come from the location of the danger object to the vehicle.
Other features and advantages of the invention will be apparent from the following specification taken in conjunction with the following drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
To understand the present invention, it will now be described by way of example, with reference to the accompanying drawings in which:
FIG. 1 illustrates a prior art version of a vehicle with a side rearview mirror and an A-pillar;
FIG. 2 illustrates a vehicle with an interactive vehicle safety system with side rearview mirror elimination in accordance with an embodiment of the present invention;
FIG. 3 illustrates a vehicle with an interactive vehicle safety system having pillar obstruction elimination in accordance with an embodiment of the present invention;
FIG. 4 illustrates a vehicle with an interactive vehicle safety system having both the side rearview mirror elimination and pillar obstruction elimination in accordance with an embodiment of the present invention;
FIGS. 5A-5D illustrate top views of various vehicles with an interactive vehicle safety system in accordance with an embodiment of the present invention;
FIG. 6 illustrates an illustrative system depiction of an interactive vehicle safety system in accordance with an embodiment of the present invention;
FIGS. 7A1 and 7A2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a second vehicle turning left in front of a main vehicle in accordance with an embodiment of the present invention;
FIGS. 7B1 and 7B2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a main vehicle turning left in front of a second vehicle in accordance with an embodiment of the present invention;
FIG. 7C illustrates an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a pedestrian walking in a crosswalk in front of a main vehicle in accordance with an embodiment of the present invention;
FIGS. 7D1 and 7D2 illustrate an interactive vehicle safety system with trajectory and dead-reckoning analysis showing a main vehicle turning left in front of a pedestrian in a crosswalk in accordance with an embodiment of the present invention;
FIG. 8A illustrates the interactive vehicle safety system from FIGS. 7A1 and 7A2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention;
FIG. 8B illustrates the interactive vehicle safety system from FIGS. 7B1 and 7B2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention;
FIG. 8C illustrates the interactive vehicle safety system from FIG. 7C with a heads-up-display (HUD) in accordance with an embodiment of the present invention;
FIG. 8D illustrates the interactive vehicle safety system from FIGS. 7D1 and 7D2 with a heads-up-display (HUD) in accordance with an embodiment of the present invention;
FIG. 9 illustrates an interactive vehicle safety system with a display in the dashboard in accordance with an embodiment of the present invention;
FIG. 10 illustrates an interactive vehicle safety system using hologram technology in accordance with an embodiment of the present invention;
FIG. 11 illustrates an interactive vehicle safety system using audio spotlight technology in accordance with an embodiment of the present invention; and
FIG. 12 illustrates an interactive vehicle safety system using an external highlighting technology in accordance with an embodiment of the present invention.
The reader is advised that the attached drawings are not necessarily drawn to scale.
DETAILED DESCRIPTION OF EMBODIMENTS
In the following description of various examples of the invention, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration various example structures, systems, and steps in which aspects of the invention may be practiced. It is to be understood that other specific arrangements of parts, structures, example devices, systems, and steps may be utilized and structural and functional modifications may be made without departing from the scope of the present invention. Also, while the terms “top,” “bottom,” “front,” “back,” “side,” and the like may be used in this specification to describe various example features and elements of the invention, these terms are used herein as a matter of convenience, e.g., based on the example orientations shown in the figures. Nothing in this specification should be construed as requiring a specific three-dimensional orientation of structures in order to fall within the scope of this invention.
The present invention relates to vehicles and, more particularly, to an interactive vehicle safety system having capabilities to improve peripheral vision, provide warning, and improve reaction time for operators of vehicles. For example, the interactive vehicle safety system may have capabilities for portraying objects, which are being blocked by any of the structural pillars and/or mirrors of a vehicle (such as a truck, van, train, etc.). The interactive vehicle safety system disclosed may comprise one or more image capturing devices (such as camera, sensor, laser), distance and object sensors (such as ultrasonic sensor, LIDAR radar sensor, photoelectric sensor, and infrared sensor), real-time image processing of an object, and one or more display systems (such as LCD or LED displays). The interactive vehicle safety system may give a seamless 360-degree front panoramic view to a driver.
The invention relates to an interactive vehicle safety system that includes real-time image processing for a vehicle with clear metal technology. Clear metal technology refers to the capability for portraying objects, which are being blocked by any of the structural pillars and/or mirrors of a vehicle. The clear metal technology may use one or multiple cameras located on the other side of a structural pillar or obstruction blocking the vision of the operator to portray an image blocked by the structural pillar/obstruction on the structural pillar/obstruction of the vehicle to make the structural pillar/obstruction functionally disappear. The interactive vehicle safety system and clear metal technology may be used with vehicles such as automobiles, trucks, trains, bicycles, motorcycles, forklifts, etc.
The interactive vehicle safety system may include one or more features such as: changing the image on the screen as the vehicle driver's head moves (using one camera or multiple cameras in "stereo" tracking the driver's head position); detecting and tracking in real-time object movement outside the vehicle and highlighting stationary or moving objects around the vehicle that may be "danger" by using trajectory analysis and dead reckoning of the vehicle (speed, direction, turn radius, etc.) and/or using trajectory analysis and dead reckoning of the moving objects (speed, direction, etc.); and a dead reckoning heads-up display that includes object detecting sensors (for example, ultrasonic sensor, LIDAR radar sensor, photoelectric sensor, and infrared sensor) on the front bumper, rear bumper, or other locations around the vehicle and that provides a dead reckoning display fully across the windshield. The interactive vehicle safety system may utilize sensors on the vehicle to sense objects approaching to turn on the clear metal technology and turn off the clear metal technology if nothing is sensed by the sensors.
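As one illustration of the sensor-gated behavior just described, the following is a minimal sketch, not the patented implementation, of logic that could switch the clear metal display on when any object detecting sensor reports a nearby object and off again after nothing has been sensed for a short hold time. The hold time, detection range, and sensor-reading format are assumptions made only for the example.

    import time

    HOLD_SECONDS = 3.0       # assumed hold time before the display turns back off
    DETECT_RANGE_M = 15.0    # assumed range within which an object counts as approaching

    class ClearMetalGate:
        """Keeps the pillar display on while objects are sensed nearby."""

        def __init__(self):
            self.display_on = False
            self.last_detection = 0.0

        def update(self, sensor_ranges_m, now=None):
            # sensor_ranges_m: distances (meters) reported by the object detecting sensors.
            now = time.monotonic() if now is None else now
            if any(r <= DETECT_RANGE_M for r in sensor_ranges_m):
                self.last_detection = now
                self.display_on = True
            elif now - self.last_detection > HOLD_SECONDS:
                self.display_on = False
            return self.display_on

    # Example: an object at 8 m turns the display on; once nothing has been
    # sensed for longer than the hold time, the display turns back off.
    gate = ClearMetalGate()
    print(gate.update([8.0, 40.0], now=0.0))   # True
    print(gate.update([40.0], now=2.0))        # True (still within the hold time)
    print(gate.update([40.0], now=6.0))        # False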
FIG. 1 illustrates a prior art version of the inside of a vehicle 10. As illustrated in the prior art, the vehicle 10 includes a large rear-view mirror 12 and a structural A-pillar 14 on the vehicle 10. FIG. 1 demonstrates the need for a system to improve the peripheral vision of the driver from the blocked view of the rear-view mirror 12 and/or one or more of the structural pillars 14 of the vehicle 10.
FIG. 2 illustrates a vehicle 10 with an interactive vehicle safety system 100 with a side rearview mirror elimination system 200. The side mirror elimination system 200 of the interactive vehicle safety system 100 of FIG. 2 may include one or more image capturing devices 210 (such as a camera, sensor, or laser) to take the place of the rearview mirror 12, real-time image processing, and one or more display systems 220 (such as an LCD or LED display) to portray an image 222 that would be seen on the rear-view mirror 12. In one embodiment the display system 220 may be a retina scanner display. In another example, the display 220 may be flat, flexible, bendable, or curved without departing from this invention. As illustrated in FIG. 2, the display system 220 may be located on the A-pillar 14. The one or more displays 220 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16, or as part of a heads-up-display or other locations.
FIG. 3 illustrates a vehicle 10 with an interactive vehicle safety system 100 having a pillar obstruction elimination system 250 and capabilities for portraying objects that are being blocked by an A-pillar 14 (or other structural pillars) of the vehicle 10. The pillar obstruction elimination system 250 of the interactive vehicle safety system 100 of FIG. 3 may include one or more image capturing devices (such as a camera, sensor, or laser) to portray an image 262 blocked by the obstruction of the A-pillar 14 on the vehicle 10 to make the structural pillar/obstruction 14 functionally disappear. The interactive vehicle safety system 100 may include real-time image processing and one or more display systems 260 (such as an LCD or LED display) to portray the image 262 that is blocked by the A-pillar 14. In one embodiment, the display system 260 may be a retina scanner display. In another example, the display 260 may be flat, flexible, bendable, or curved without departing from this invention. As illustrated in FIG. 3, the display system 260 may be located on the A-pillar 14. The display 260 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16, or as part of a heads-up-display or other locations.
FIG. 4 illustrates a vehicle 10 with an interactive vehicle safety system 100 having both a side rearview mirror elimination system 200 and a pillar obstruction elimination system 250 for portraying objects that are being blocked by an A-pillar 14 or other structural pillar of the vehicle 10. The interactive vehicle safety system 100 of FIG. 4 may include one or more image capturing devices 210 (such as a camera, sensor, or laser) to take the place of the rearview mirror. Additionally, the interactive vehicle safety system 100 of FIG. 4 may include one or more image capturing devices (such as a camera, sensor, or laser) to portray an image 262 blocked by the obstruction of the A-pillar 14 on the vehicle 10 to make the structural pillar/obstruction 14 functionally disappear. The interactive vehicle safety system 100 may include real-time image processing and a display system with one or more displays 220, 260 (such as an LCD or LED display) to portray the image 222 that would be seen on the rear-view mirror 12 and to portray the image 262 that is blocked by the A-pillar 14 or other structural pillar. In one embodiment, the displays 220, 260 may be retina scanner displays. In another example, the displays 220, 260 may be flat, flexible, bendable, or curved without departing from this invention. As illustrated in FIG. 4, the display system 220, 260 may be located on the A-pillar 14. The display 220, 260 may be located in other locations without departing from this invention, such as located on another structural pillar, the dashboard 16, or as part of a heads-up-display or other locations.
In general, the interactive vehicle safety system 100 may help an operator detect a collision and provide further information to eliminate or reduce the risks in many different areas, such as: providing vision to the operator, predicting when or how long of a reaction time before an accident occurs, providing audible warnings to the public and operator, providing visual warnings to the public and operator, determining and providing a suggested escape route or action, and determining and providing any automatic vehicle operations in response to the imminent collision (i.e. automatic braking and/or steering).
FIGS. 5A-5D illustrate top views of various vehicles with an interactive vehicle safety system 100 and various input systems. FIG. 5A illustrates a top view of an automobile 10A with the interactive vehicle safety system 100. FIG. 5B illustrates a top view of a pick-up truck 10B with the interactive vehicle safety system 100. FIG. 5C illustrates a top view of a delivery truck 10C with the interactive vehicle safety system 100. FIG. 5D illustrates a top view of a semi-truck 10D with the interactive vehicle safety system 100. The interactive vehicle safety system 100 may be utilized with any vehicle without departing from this invention. The interactive vehicle safety system 100 may provide an operator with vision and situational awareness to what is happening external to the vehicle 10 by providing peripheral visual awareness. As illustrated in FIGS. 5A-5D, the interactive vehicle safety system 100 and vehicles 10A, 10B, 10C, 10D may include one or more of the following: cameras 124, object detecting sensors 120, ultrasonic sensors 122, and vehicle telematic sensors 130, etc. Other inputs may be included with the interactive vehicle safety system 100 and vehicles 10A, 10B, 10C, 10D as described and illustrated in FIG. 6. The one or more cameras 124, object detecting sensors 120, etc. and other input data devices as listed and described with FIG. 6 may be located at various locations throughout the vehicle 10, such as on the front bumper, rear bumper or rear area, side of the vehicle 10, on top of the vehicle 10, under the vehicle, within the inside of the vehicle 10, or any other locations that can provide meaningful inputs to the interactive vehicle safety system 100.
FIG. 6 provides an illustrative system depiction of the interactive vehicle safety system 100. The interactive vehicle safety system 100 may include a processor 105 that includes a processing unit and a system memory to store and execute software instructions. The interactive vehicle safety system 100 may provide various data inputs 110 and provide various outputs 150 to predict a potential incident, determine danger, and inform the vehicle 10 and operator to slow down, turn, or stop. As illustrated in FIG. 6, the various inputs 110 to the processor 105 and the interactive vehicle safety system 100 may include one or more of the following: depth camera 112, lasers 114, accelerometer device 116, aimed audio device 118, object detecting sensors 120, ultrasonic sensor 122, cameras 124, LIDAR radar sensor 126, photoelectric sensor 128, telematics device 130, infrared sensor 132, internet of things (IoT) 132, or GPS device 134. The interactive vehicle safety system 100 may include various image capturing devices, such as cameras 124, to capture what is happening in the real world external to the vehicle 10 and to bring in and highlight what may happen in the near future. Additionally, the interactive vehicle safety system 100 may include LIDAR radar 126 and other object detecting sensors 120 (such as ultrasonic sensor 122, photoelectric sensor 128, and infrared sensor 132). Other sensors and inputs 110 may be utilized for the interactive vehicle safety system 100, such as GPS 136, vehicle telematic sensors 140, and Internet of Things (IoT) 132 information. Vehicle telematics sensors 140 may monitor the vehicle 10 by using GPS 136 and onboard diagnostics to record movements on a computerized map, such as with a GPS receiver, an engine interface, an input/output interface (expander port) in the vehicle 10, a SIM card, or an accelerometer 116. The interactive vehicle safety system 100 will gather data from these inputs and the real world to help the operator see obstacles and provide the operator information to be able to react to obstacles.
The processor 105 may include a processing unit and a system memory to store and execute software instructions. The various inputs 110 and outputs 150 may be connected to the processor 105. Additionally, the processor 105 may be in communication with and connected to other various computer systems. The processor 105 of the interactive vehicle safety system 100 may have various outputs 150 after processing the various inputs 110. The outputs 150 of the interactive vehicle safety system 100 may include one or more of the following: audio spotlight 152, hologram display 154, heads-up display 156, LED display 158, display system 160, LCD display 162, dashboard display 164, haptic warning 166, audible warnings 168, image analysis 170, augmented reality display 172, external visual warnings 174, or accident prediction 176.
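The flow from the inputs 110 through the processor 105 to the outputs 150 can be pictured as a simple sense-process-act loop. The sketch below is a structural illustration only, with hypothetical field names and placeholder tracking and prediction steps; it is not the FIG. 6 implementation.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SensorFrame:
        # One snapshot of the inputs 110, with camera detections already reduced
        # to object positions and velocities by an upstream image processor.
        objects: List[dict] = field(default_factory=list)   # {"x","y","vx","vy"} in m, m/s
        vehicle_speed_mps: float = 0.0

    @dataclass
    class Outputs:
        # A few of the outputs 150, reduced to simple flags for illustration.
        hud_objects: List[dict] = field(default_factory=list)
        audible_warning: bool = False
        haptic_warning: bool = False

    def process_frame(frame: SensorFrame, horizon_s: float = 2.0) -> Outputs:
        """Track objects, project them ahead, and raise warnings for close approaches."""
        out = Outputs()
        for obj in frame.objects:
            # Straight-line projection of where the object will be at the horizon.
            future = {"x": obj["x"] + obj["vx"] * horizon_s,
                      "y": obj["y"] + obj["vy"] * horizon_s}
            out.hud_objects.append({"now": obj, "predicted": future})
            # Crude danger test: predicted position inside an assumed corridor ahead
            # of the vehicle whose length grows with vehicle speed.
            corridor_m = max(10.0, frame.vehicle_speed_mps * horizon_s + 5.0)
            if abs(future["y"]) < 1.5 and 0.0 < future["x"] < corridor_m:
                out.audible_warning = True
                out.haptic_warning = True
        return out

    frame = SensorFrame(objects=[{"x": 12.0, "y": 4.0, "vx": -2.0, "vy": -2.0}],
                        vehicle_speed_mps=5.0)
    print(process_frame(frame))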
The processor 105 of the interactive vehicle safety system 100 may control and process various actions for the interactive vehicle safety system 100 as will be described further below. The processor 105 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The one or more implementations described throughout this disclosure may utilize logical blocks, modules, and circuits that may be implemented or performed with the processor 105.
The processor 105 may be used to implement various aspects and features described herein. As such, the processor 105 may be configured to execute multiple calculations, in parallel or serial and may execute coordinate transformations, curve smoothing, noise filtering, outlier removal, amplification, and summation processes, and the like. The processor 105 may include a processing unit and system memory to store and execute software instructions. The processor 105 may be in communication with and/or connected to the interactive vehicle safety system 100 that may provide a central analysis and display.
FIGS. 7A1-7D2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270A along the full windshield 18. FIGS. 8A-8D illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270B along a portion of the windshield 18, specifically the lower left-hand portion of the windshield 18. In another embodiment of this invention, the interactive vehicle safety system 100 may include trajectory and dead reckoning analysis. For example, dead reckoning is the process of calculating a vehicle's current position and/or future position by using a previously determined position, or current position, and by using estimation of speed and course over elapsed time. The interactive vehicle safety system 100 may calculate a vehicle's future position by using a current position and by using estimation of speed, course, and other inputs over elapsed time. For collision avoidance, the reaction time may be approximately 2 seconds with a possible stopping time of 3-5 seconds. The interactive vehicle safety system 100 may include real-time image processing for a vehicle to detect and track in real-time object movement outside the vehicle. The interactive vehicle safety system 100 may also highlight stationary or moving objects around the vehicle that are predicted to be a danger utilizing trajectory analysis and dead reckoning of the vehicle and the stationary or moving object. The trajectory analysis and dead reckoning analysis may utilize speed, direction, acceleration, turn radius, etc. from the vehicle and speed, direction, acceleration, etc. from the moving object. The prediction of the route and path of the vehicle and the location of the moving object may be calculated by the interactive vehicle safety system 100 by various methods, such as algorithms using speed, direction, turn radius, acceleration, GPS, vehicle telematic data and sensors, cameras, external sensors, mapping information, etc. Additionally, the interactive vehicle safety system 100 may include a predictive algorithm utilizing data and information to watch what pedestrians are doing, such as looking at a phone or wearing headphones, or tracking the pedestrian's acceleration to determine if they are walking, about to run, or about to stop. The interactive vehicle safety system 100 may also utilize machine learning with the image processing to provide better predictive algorithms for the trajectory analysis.
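To make the dead reckoning step concrete, the short sketch below projects the vehicle and a moving object forward from their current positions, speeds, and headings, and reports the closest approach within the reaction window mentioned above. It is a minimal constant-velocity illustration under assumed units, not the patent's predictive algorithm.

    import math

    def dead_reckon(x, y, speed_mps, heading_rad, t_s):
        """Constant-speed, constant-heading position after t_s seconds."""
        return (x + speed_mps * math.cos(heading_rad) * t_s,
                y + speed_mps * math.sin(heading_rad) * t_s)

    def closest_approach(vehicle, obj, horizon_s=5.0, step_s=0.1):
        """Minimum separation (meters) and when it occurs within the horizon."""
        best_d, best_t = float("inf"), 0.0
        t = 0.0
        while t <= horizon_s:
            vx, vy = dead_reckon(*vehicle, t)
            ox, oy = dead_reckon(*obj, t)
            d = math.hypot(vx - ox, vy - oy)
            if d < best_d:
                best_d, best_t = d, t
            t += step_s
        return best_d, best_t

    # Vehicle heading east at 10 m/s; a pedestrian 20 m ahead and 5 m to the
    # side, crossing toward the vehicle's path at 1.5 m/s.
    vehicle = (0.0, 0.0, 10.0, 0.0)
    pedestrian = (20.0, 5.0, 1.5, -math.pi / 2)
    d, t = closest_approach(vehicle, pedestrian)
    print(f"closest approach {d:.1f} m at t={t:.1f} s")  # flags a conflict near t=2 s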
In another embodiment of this invention, the interactive vehicle safety system 100 may include a dead-reckoning heads-up display 270 that incorporates ultrasound sensors or other sensors on the front bumper, rear bumper, or other locations around the vehicle. As illustrated in FIGS. 7A1-7D2, the interactive vehicle safety system 100 provides a dead reckoning display 270A with information and warnings fully across the windshield and a real time picture across the windshield that can be displayed in various different colors (e.g., gray for dead reckoning and another color for real time). As illustrated in FIGS. 7A1-7D2, the heads-up display 270A may include the vehicle/obstacles and/or text 274, such as "WARNING" as shown in the heads-up display 270A.
First, the interactive vehicle safety system 100 functionally removes the A-pillar and other structural pillars to provide a clear, complete, open view to the operator of the vehicle 10. As an external, moving object's (such as a pedestrian 22) direction and speed are detected, the interactive vehicle safety system 100 displays the object and the object's speed and direction not only on the clear metal screen 260 but also across the windshield 18 by way of the heads-up-display 270A on the windshield 18 or dash 16. By including ultrasonic sensors on the front bumper, the rear bumper, or other locations around the vehicle, the interactive vehicle safety system 100 may then utilize real-time trajectory and object movement analysis and bring the dead reckoning into the real time space on the heads-up-display. The interactive vehicle safety system 100 may include a dead reckoning strip of LED pictures across the front of the dashboard 16 of the vehicle 10, reflecting onto the windshield 18.
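As a simple illustration of keeping the real-time picture and the dead-reckoned picture visually distinct on the heads-up display, the sketch below assigns each tracked object two drawing commands, one for its current position and one for its projected position, in different colors (gray for dead reckoning, as suggested above). The drawing-command format is an assumption for illustration and is not tied to any particular HUD hardware.

    REALTIME_COLOR = "amber"
    DEAD_RECKON_COLOR = "gray"

    def hud_draw_commands(tracked_objects, horizon_s=2.0):
        """Return simple draw commands for current and projected positions."""
        commands = []
        for obj in tracked_objects:
            cx, cy = obj["x"], obj["y"]
            px = cx + obj["vx"] * horizon_s   # straight-line projection
            py = cy + obj["vy"] * horizon_s
            commands.append({"shape": "icon", "x": cx, "y": cy,
                             "color": REALTIME_COLOR, "label": obj["label"]})
            commands.append({"shape": "ghost", "x": px, "y": py,
                             "color": DEAD_RECKON_COLOR, "label": obj["label"]})
            if obj.get("danger"):
                commands.append({"shape": "text", "x": px, "y": py,
                                 "color": "red", "label": "WARNING"})
        return commands

    objects = [{"label": "pedestrian", "x": 8.0, "y": 2.0,
                "vx": -1.2, "vy": -0.8, "danger": True}]
    for cmd in hud_draw_commands(objects):
        print(cmd)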
FIGS. 7A1 and 7A2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a second vehicle 20 turning left in front of a main vehicle 10. Specifically, FIG. 7A1 shows the second vehicle 20 preparing to turn left in front of the main vehicle 10. FIG. 7A2 shows the heads-up-display 270B with the second vehicle 20A turning in front of the main vehicle 10 and “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100.
FIGS. 7B1 and 7B2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a main vehicle 10 turning left in front of a second vehicle 20. Specifically, FIG. 7B1 shows the main vehicle 10 preparing to turn left in front of the second vehicle 20. FIG. 7B2 shows the heads-up-display 270B with the main vehicle 10A turning in front of the second vehicle 20 and “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100.
FIG. 7C illustrates the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a pedestrian 22 walking in a crosswalk in front of a main vehicle 10. As illustrated in FIG. 7C, the pedestrian 22 is behind and blocked by the A-pillar 14. The operator is able to view the pedestrian 22 in the crosswalk because of the display 260 from the pillar obstruction elimination system 250. The heads-up-display 270B may display “WARNING” 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100.
FIGS. 7D1 and 7D2 illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis showing a main vehicle 10 turning left in front of a pedestrian 22 in a crosswalk. Specifically, FIG. 7D1 shows the main vehicle 10 preparing to turn left in the direction of the pedestrian 22 in the crosswalk. FIG. 7D2 shows the heads-up-display 270B with the main vehicle 10A turning towards the pedestrian 22 in the crosswalk along with "WARNING" 274 by utilizing the trajectory and dead-reckoning analysis of the interactive vehicle safety system 100.
FIGS. 8A-8D illustrate the interactive vehicle safety system 100 with trajectory and dead-reckoning analysis and a heads-up display 270B along a portion of the windshield 18, specifically the lower left-hand portion of the windshield 18. As illustrated in FIGS. 8A-8D, the heads-up display 270B (along with the heads-up display 270A) may include the vehicle/obstacles 272 and/or text 274, such as "WARNING" as shown in the heads-up display 270B. Specifically, FIG. 8A illustrates the interactive vehicle safety system 100 with the heads-up-display (HUD) 270B in the lower left-hand portion of the windshield 18 showing a second vehicle 20 turning left in front of a main vehicle 10. FIG. 8B illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270B in the lower left-hand portion of the windshield 18 showing a main vehicle 10 turning left in front of a second vehicle 20. FIG. 8C illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270B in the lower left-hand portion of the windshield 18 showing a pedestrian 22 walking in a crosswalk in front of a main vehicle 10. FIG. 8D illustrates the interactive vehicle safety system 100 with a heads-up-display (HUD) 270B in the lower left-hand portion of the windshield 18 showing a main vehicle 10 turning left in front of a pedestrian 22 in a crosswalk.
A moving object in danger may be highlighted in various stages.
For example, the moving object may be highlighted yellow if the object is potentially in the path of an accident or collision with the vehicle. In addition, the moving object may be highlighted red if the object is moving and imminently in the path of an accident or collision with the vehicle. Highlighting may be in the form of different colors, blinking colors, circles around the object, blinking circles, etc., without departing from this invention.
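One way to realize the staged highlighting is to map an estimated time to collision onto the display treatment, for example yellow for a potential conflict and red, possibly blinking, for an imminent one. The thresholds below are illustrative assumptions, not values taken from the patent.

    def highlight_style(time_to_collision_s):
        """Pick a highlight for a tracked object from its estimated time to collision."""
        if time_to_collision_s is None:
            return None                                # no projected conflict
        if time_to_collision_s <= 2.0:                 # assumed "imminent" threshold
            return {"color": "red", "blink": True, "circle": True}
        if time_to_collision_s <= 5.0:                 # assumed "potential" threshold
            return {"color": "yellow", "blink": False, "circle": True}
        return None

    print(highlight_style(1.4))   # red, blinking circle
    print(highlight_style(3.8))   # steady yellow circle
    print(highlight_style(None))  # no highlight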
The interactive vehicle safety system 100 may utilize the heads-up display 270 as described and illustrated previously or the interactive vehicle safety system 100 may utilize a dashboard display 280. FIG. 9 illustrates an exemplary dashboard display 280 located on the dashboard 16 of the vehicle 10. The dashboard display 280 may include the vehicle/obstacles 282 and/or text 284, such as "WARNING" as shown in the dashboard display 280. The interactive vehicle safety system 100 may utilize any one of or any combination of the heads-up display 270A, the heads-up display 270B, or the dashboard display 280 without departing from this invention to display what is happening outside the vehicle or what might happen outside the vehicle.
In another embodiment of this invention, the interactive vehicle safety system 100 may provide a full vision of what is happening outside of the vehicle 10. For example, the interactive vehicle safety system 100 will provide and improve an operator's peripheral visual awareness to give the operator situational awareness. The interactive vehicle safety system 100 may utilize one or more of the following systems and information to provide and improve the operator's peripheral visual awareness. For example, the interactive vehicle safety system 100 may provide an augmented reality system. The interactive vehicle safety system 100 may also provide depth cameras or other systems that provide depth imagery, allowing the perspective of the operator's view to be changed. For example, the augmented reality system and/or the depth cameras may provide a display with the driver's perspective "in front" of the vehicle or with the driver's perspective "on top" of the vehicle. Depth cameras may return a grey-scale image rather than color to help determine depth and therefore the distance of a pedestrian, object, or other danger object in the path of the vehicle.
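As an illustration of how a grey-scale depth image can be reduced to a distance estimate, the sketch below scans the image region corresponding to the vehicle's projected path and reports the nearest reading. The 8-bit-to-meters scaling and the path region are assumptions made for the example only.

    def nearest_in_path(depth_image, path_columns, max_range_m=50.0):
        """depth_image: rows of 8-bit values where 255 is ~0 m and 0 is ~max_range_m.
        path_columns: (start, end) column indices covering the projected path."""
        start, end = path_columns
        nearest = max_range_m
        for row in depth_image:
            for value in row[start:end]:
                distance = (255 - value) / 255.0 * max_range_m
                if distance < nearest:
                    nearest = distance
        return nearest

    # Tiny 3x6 depth image; brighter (higher) values are closer.
    image = [
        [10, 10, 10, 10, 10, 10],
        [10, 10, 200, 10, 10, 10],   # something close inside the path region
        [10, 10, 10, 10, 10, 10],
    ]
    print(f"nearest obstacle in path: {nearest_in_path(image, (1, 4)):.1f} m")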
In another embodiment of the invention, the operator may also utilize glasses, contacts, or a circular plastic cover that drops over the face and eyes to provide an augmented reality view of what is happening outside of the vehicle. The cover may drop over the face and eyes of the operator when required or when initiated by the interactive vehicle safety system 100. The augmented reality system may initiate the movement of the cover based on a projected trajectory of a collision and/or accident. The augmented reality system may initiate the movement of the cover upon movement of the vehicle.
In another embodiment of this invention, the interactive vehicle safety system 100 may utilize various sensors to provide the operator and the system with additional information and situational awareness. The various sensors may include one or more of the following: density, vibration, audio, humidity, air pressure, color, or synthetic sensors, etc. The various sensors may be located on the front bumper, rear bumper, or other locations around the vehicle. The one or more sensors may include trajectory sensors to help determine and provide data and analysis of the trajectory and dead reckoning of the vehicle and any moving objects external to the vehicle. The one or more sensors may also include synthetic sensors, wherein one sensor senses an action and the other sensors that make up the synthetic sensor confirm that action.
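A minimal sketch of the "one senses, the others confirm" behavior of a synthetic sensor, assuming a simple boolean interface for each constituent sensor; the function name and the confirmation count are illustrative assumptions.

```python
from typing import Callable, Sequence


def synthetic_sensor_event(primary_fired: bool,
                           confirmers: Sequence[Callable[[], bool]],
                           required_confirmations: int = 1) -> bool:
    """Report an event only if the primary sensor fires and enough of the
    other sensors making up the synthetic sensor confirm it.

    required_confirmations is an illustrative tuning parameter; the text
    above only states that the other sensors confirm the action.
    """
    if not primary_fired:
        return False
    confirmed = sum(1 for confirm in confirmers if confirm())
    return confirmed >= required_confirmations
```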
In another embodiment of this invention, the interactive vehicle safety system 100 may utilize stereoscopy with the plurality of cameras. The plurality of cameras may be located throughout the exterior of the vehicle, such as in front, sides, back, top, or bottom of the vehicle. The stereoscopy may utilize two or more cameras to accurately determine depth, location, and trajectory of moving objects or pedestrians external to the vehicle.
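The depth computation behind such a stereoscopic arrangement is classic two-camera triangulation. The following sketch assumes a rectified camera pair with a known baseline and a focal length expressed in pixels; the parameter names are illustrative.

```python
def stereo_depth_m(disparity_px: float, focal_length_px: float,
                   baseline_m: float) -> float:
    """Two-camera triangulation: depth = focal_length * baseline / disparity.

    disparity_px    : horizontal pixel shift of the same scene point between
                      the left and right camera images.
    focal_length_px : camera focal length expressed in pixels.
    baseline_m      : distance between the two camera centers in meters.
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable disparity: effectively at infinity
    return focal_length_px * baseline_m / disparity_px
```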
In another embodiment of this invention, the interactive vehicle safety system 100 may determine and track the location of the operator's head and adjust the displayed view as the operator's head moves, rotates, or sits at different heights. Tracking the operator's head location helps keep the displayed image aligned, in aspect ratio and position, with the operator's view of the display and of the scene external to the vehicle.
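One simplified way to realize such head-tracked alignment for a pillar display is a similar-triangles parallax correction. The sketch below assumes lateral-only head motion, a single representative scene depth, and illustrative parameter names; it is not the method specified by this disclosure.

```python
def pillar_image_shift_px(head_offset_m: float, pillar_depth_m: float,
                          scene_depth_m: float, px_per_m: float) -> float:
    """Lateral shift, in display pixels, of the image sampled for a fixed
    point on the A-pillar display when the operator's head moves sideways
    by head_offset_m.

    Similar triangles: the ray from the head through a fixed pillar point
    sweeps across the scene by
    head_offset_m * (scene_depth - pillar_depth) / pillar_depth,
    in the direction opposite to the head movement.
    """
    scene_shift_m = head_offset_m * (scene_depth_m - pillar_depth_m) / pillar_depth_m
    return -scene_shift_m * px_per_m  # negative: shift opposite to head motion
```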
In another embodiment of this invention, the interactive vehicle safety system 100 may integrate data from various other information sources. The plurality of information sources may include one or more information sources on the Internet of Things (IoT), such as camera information from intersections, buildings, or autonomous vehicles, or other camera sources, sensors, or measuring devices throughout the area.
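A crude sketch of how such external detections might be merged with the onboard ones; the `Detection` structure, coordinate frame, and de-duplication radius are assumptions made for illustration, and a real system would fuse tracks over time rather than single detections.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    x_m: float    # meters to the right of the vehicle
    y_m: float    # meters ahead of the vehicle
    source: str   # "onboard", "intersection-camera", "building-camera", ...


def merge_detections(onboard: list[Detection], external: list[Detection],
                     match_radius_m: float = 2.0) -> list[Detection]:
    """Add external (IoT) detections that do not duplicate an onboard one,
    using a crude nearest-neighbor check."""
    merged = list(onboard)
    for det in external:
        duplicate = any((det.x_m - o.x_m) ** 2 + (det.y_m - o.y_m) ** 2
                        <= match_radius_m ** 2 for o in onboard)
        if not duplicate:
            merged.append(det)
    return merged
```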
In another embodiment of this invention, the interactive vehicle safety system 100 may include image analysis with cameras that can detect what a pedestrian is doing. For example, the image analysis may detect earbuds and/or headphones or a pedestrian talking on a cell phone. The interactive vehicle safety system 100 may utilize this image analysis information to potentially take a different action.
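By way of illustration only, one way to turn such detections into a "different action" is to weight how aggressively the system warns for that pedestrian. The label strings and weights below are hypothetical outputs of an unspecified image classifier, not part of this disclosure.

```python
def pedestrian_distraction_weight(detected_labels: set[str]) -> float:
    """Illustrative 'distraction' weight for a detected pedestrian, which
    could make the system warn earlier or choose a different warning."""
    weight = 0.0
    if {"earbuds", "headphones"} & detected_labels:
        weight += 0.5   # pedestrian may not hear an audible warning
    if {"phone_to_ear", "looking_at_phone"} & detected_labels:
        weight += 0.5   # pedestrian's attention is likely away from traffic
    return weight
```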
In another embodiment of this invention illustrated in FIG. 10, the interactive vehicle safety system 100 may utilize hologram technology with motion parallax. As illustrated in FIG. 10, the hologram system may provide a hologram 30 on the pillar 14 of the pedestrian 22 located behind the pillar 14. For example, as the operator's eye direction moves, the foreground may shift faster than the background and create a stereoscopic view. The hologram technology may utilize a laser, one or more beam-splitter mirrors (such as two mirrors), one or more lenses (such as three lenses), and a holographic film located on the pillar 14 or at another location within the vehicle 10. By using laser light, the light travels in the same direction and at the same wavelength, providing coherent beams with all light waves in phase to project the hologram 30, a 3D version of what is happening outside the vehicle but blocked from the driver's vision.
In another embodiment of this invention, the interactive vehicle safety system 100 may provide audible and/or visual warnings to pedestrians and/or other danger objects. For example, the interactive vehicle safety system 100 may provide audible, visual, and other warnings (such as sounds, visual cues, and/or motion, for example a vibrating seat) to both the pedestrians and the vehicle operator when a collision, an accident, or danger is possible. The interactive vehicle safety system 100 may utilize a transducer or other systems to send directed audible warnings, such as 3D sounds, to the danger pedestrian.
In another embodiment of this invention as illustrated in FIG. 11, the interactive vehicle safety system 100 may provide an audio "spotlight" 290 for the operator. In this embodiment, an aimed audio device 292 located on the vehicle 10 casts the audio "spotlight" 290 onto a surface, redirecting the sound so that it appears to come from that specific spot. This allows the interactive vehicle safety system 100 to provide an audible warning 294 to the operator that comes from the location of a potential accident or collision, based on the trajectory analysis, sensors, and cameras of the interactive vehicle safety system 100.
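A minimal sketch of aiming such a device at the predicted collision point, assuming a vehicle-centered coordinate frame (x to the right, y forward); the function and the frame are illustrative assumptions, not part of the disclosure.

```python
import math


def spotlight_aim_deg(collision_x_m: float, collision_y_m: float) -> float:
    """Azimuth, in degrees clockwise from straight ahead, at which to aim the
    directed audio device so the warning appears to come from the predicted
    collision location (vehicle frame: x to the right, y forward)."""
    return math.degrees(math.atan2(collision_x_m, collision_y_m))
```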
In another embodiment of this invention, the interactive vehicle safety system 100 may provide other warnings to the operator and/or pedestrians. For example, the interactive vehicle safety system 100 may provide a vibrating seat or other haptic warning to the operator when a potential collision or accident is detected. In another example, the interactive vehicle safety system 100 may provide external, automatic, audible warnings outside of the vehicle, such as "Warning—Vehicle approaching" or "Warning—Vehicle turning." The interactive vehicle safety system 100 may provide internal, automatic, audible warnings inside the vehicle to the operator, such as "Warning—Pedestrian in crosswalk." The interactive vehicle safety system 100 may also include an audio system with "white" noise or nuisance noise, an ambient noise sensor to adjust external sounds, or GPS geofencing.
In another embodiment of this invention, the interactive vehicle safety system 100 may include a highlighting feature 240 on the display or on the external area 40 when the vehicle is making a turn. For example, as described above, the interactive vehicle safety system 100 may display the path of the turn on the display for the operator based on the trajectory analysis, sensors, and cameras of the interactive vehicle safety system 100. Additionally, as illustrated in FIG. 12, the interactive vehicle safety system 100 may, externally outside of the vehicle 10, light up the external path 240 on the pavement or external area 40 showing where the vehicle 10 is going, such as the trajectory of going straight or of a turn while turning. For example, the interactive vehicle safety system 100 may project the path 240 and the direction of the vehicle 10 onto the street or pavement 40 with a laser, painting a turn or a straight path with high-intensity LED lights 242 or similar lighting elements. To assist with optically displaying the path 240 externally on the street or pavement 40, the interactive vehicle safety system 100 may also provide infrared and heat-detection sensors to help "block out" the road and display the externally projected path of the vehicle on the road.
The interactive vehicle safety system 100 may include optical sensors on a steering wheel for determining a turning radius of a turn and providing predictive modeling of the turning path of the vehicle. Additionally, the interactive vehicle safety system 100 may include sensors on the wheels of the vehicle, such as on an electromagnetic arm or pitman arm, for a predictive analysis of the turning radius of the vehicle. Additionally, back-up cameras or front cameras may provide additional turning trajectory analysis and path projection of the vehicle.
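For illustration, the predictive modeling of the turning path could use a standard kinematic bicycle-model approximation; this model is a common choice but is not stated in the disclosure, and the wheelbase, angle convention, and prediction horizon below are assumed parameters.

```python
import math


def turning_radius_m(wheelbase_m: float, road_wheel_angle_rad: float) -> float:
    """Kinematic bicycle-model approximation of the turning radius from the
    front road-wheel angle (derived from the steering-wheel sensor and the
    steering ratio)."""
    if abs(road_wheel_angle_rad) < 1e-6:
        return float("inf")  # essentially driving straight
    return wheelbase_m / math.tan(abs(road_wheel_angle_rad))


def predicted_turn_path(wheelbase_m: float, road_wheel_angle_rad: float,
                        speed_mps: float, horizon_s: float = 3.0,
                        step_s: float = 0.1) -> list[tuple[float, float]]:
    """Sample (x, y) points of the predicted path in the vehicle frame
    (x to the right, y forward), e.g. for display or external projection.
    A positive road-wheel angle is taken here to mean a turn to the right."""
    radius = turning_radius_m(wheelbase_m, road_wheel_angle_rad)
    turn_sign = math.copysign(1.0, road_wheel_angle_rad) if road_wheel_angle_rad else 0.0
    points: list[tuple[float, float]] = []
    heading = 0.0  # radians away from straight ahead
    x = y = 0.0
    for _ in range(int(horizon_s / step_s)):
        x += speed_mps * step_s * math.sin(heading)
        y += speed_mps * step_s * math.cos(heading)
        if radius != float("inf"):
            heading += turn_sign * speed_mps * step_s / radius
        points.append((x, y))
    return points
```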
In another embodiment of this invention, the interactive vehicle safety system 100 may provide visual information and visual warnings. The visual warning may include a strobe or laser directed specifically at a "danger" pedestrian to alert the pedestrian of an oncoming, turning vehicle. Additionally, the interactive vehicle safety system 100 may include a heads-up-display (HUD) on the windshield or another location based on the various cameras and sensors associated with the interactive vehicle safety system 100. The heads-up-display may include any of the information described above, including information about the various moving and stationary objects determined by the trajectory analysis to be a potential collision or accident. The heads-up-display may show the distance, such as how many feet or inches an object is away from the vehicle and/or a collision. The heads-up-display may also include a time to collision based on the trajectory analysis of the vehicle and the moving object.
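The distance and time-to-collision values shown on the heads-up-display follow directly from the separation and closing speed produced by the trajectory analysis. A minimal sketch, with an optional feet conversion for display, follows; the function names are illustrative.

```python
def time_to_collision_s(separation_m: float, closing_speed_mps: float) -> float:
    """Time to collision from the current separation and the closing speed
    produced by the trajectory analysis."""
    if closing_speed_mps <= 0:
        return float("inf")  # the object and the vehicle are not closing
    return separation_m / closing_speed_mps


def meters_to_feet(distance_m: float) -> float:
    """Convert a separation to feet if the display is configured for feet."""
    return distance_m * 3.28084
```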
In another embodiment of this invention, the interactive vehicle safety system 100 may include a processor and database for recording and storing information and images from turns and actions when danger is present or imminent, that is, during critical moments. The critical moments may be determined by geofencing, accelerometer analysis, and impact detection. The interactive vehicle safety system 100 may create a 3D model from the cameras and sensors to recreate an accident. This information can be very helpful in a number of situations, providing evidence of the actual actions that occurred during these critical moments.
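A minimal in-memory sketch of recording such critical moments: a rolling buffer of recent frames that is frozen when a geofence, accelerometer, or impact trigger fires. The class and parameter names are illustrative; a real system would persist the data and the inputs needed to rebuild a 3D model of the event.

```python
import time
from collections import deque


class CriticalMomentRecorder:
    """Rolling window of recent camera/sensor frames that can be frozen when
    a critical moment (geofence entry, hard acceleration, impact) is flagged."""

    def __init__(self, window_s: float = 30.0):
        self.window_s = window_s
        self.buffer = deque()  # (timestamp, frame) pairs, oldest first

    def add_frame(self, frame) -> None:
        """Append the latest frame and drop anything older than the window."""
        now = time.monotonic()
        self.buffer.append((now, frame))
        while self.buffer and now - self.buffer[0][0] > self.window_s:
            self.buffer.popleft()

    def capture_critical_moment(self) -> list:
        """Snapshot the buffered frames at the moment danger is detected."""
        return list(self.buffer)
```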
In another embodiment of this invention, the interactive vehicle safety system 100 may include a number of automatic actions in response to an imminent or present danger situation. For example, the interactive vehicle safety system 100 may automatically stop acceleration of the vehicle at a "yellow" condition for a collision or accident. In another example, the interactive vehicle safety system 100 may automatically apply the brakes on the vehicle at a "red" condition for a collision or accident. The "yellow" and "red" conditions may be set by parameters within the interactive vehicle safety system 100. The interactive vehicle safety system 100 may require the vehicle to stay within the speed limit, utilizing GPS location services or even a camera that recognizes the speed limit through image analysis. In another example, the interactive vehicle safety system 100 may utilize other automatic actions, such as pre-emptive braking, changing steering direction, horn honking, flashing lights, or vibration in the seats, to help maintain vehicle safety.
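As a sketch only, the mapping from the "yellow" and "red" conditions to the automatic actions listed above could look like the following; the condition strings and action names are illustrative, and the thresholds that set each condition remain parameters within the system.

```python
def automatic_actions(condition: str) -> list[str]:
    """Map a collision condition to the automatic actions described above.
    The action names are illustrative placeholders for vehicle interfaces."""
    if condition == "red":
        return ["apply_brakes", "honk_horn", "flash_lights", "vibrate_seat"]
    if condition == "yellow":
        return ["stop_acceleration", "vibrate_seat"]
    return []
```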
It is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth herein. The invention is capable of other embodiments and of being practiced or being carried out in various ways. Variations and modifications of the foregoing are within the scope of the present invention. It should be understood that the invention disclosed and defined herein extends to all alternative combinations of two or more of the individual features mentioned or evident from the text and/or drawings. All of these different combinations constitute various alternative aspects of the present invention. The embodiments described herein explain the best modes known for practicing the invention and will enable others skilled in the art to utilize the invention.
While the preferred embodiments of the invention have been shown and described, it will be apparent to those skilled in the art that changes and modifications may be made therein without departing from the spirit of the invention, the scope of which is defined by this description.

Claims (15)

We claim:
1. An interactive safety system for a vehicle comprising:
one or more image capturing devices located on the vehicle and configured to transmit one or more images from the vehicle to a processor;
one or more object detecting sensors located on the vehicle that detect location, speed, and direction data of one or more objects external to the vehicle, the one or more object detecting sensors configured to transmit the location, speed, and direction data of the one or more objects to the processor;
a display system connected to the processor, the display system comprising: a pillar display located on an A-pillar of the vehicle configured to portray an image blocked by an obstruction of the A-pillar from the one or more image capturing devices;
the processor and a memory unit storing computer-executable instructions, which when executed by the processor, cause the processor to:
receive images from the one or more image capturing devices;
receive the location, speed, and direction data from the one or more object detecting sensors;
process in real-time the images and the location, speed, and direction data of the one or more objects external to the vehicle;
track in real-time the one or more objects external to the vehicle using the images and the location, speed, and direction data;
predict a future location and a route of the one or more objects external to the vehicle using a predictive algorithm and a trajectory analysis of the one or more objects external to the vehicle;
determine in real-time a danger object from the one or more objects external to the vehicle, wherein the danger object is a potential collision based on the predictive algorithm and the trajectory analysis;
display the image blocked by the obstruction of the A-pillar on the pillar display; and
provide an audible warning to an operator coming from a location of the danger object, wherein the audible warning is an audio spotlight from an aimed audio device connected to the processor that casts a sound to a surface redirecting the sound to come from the location of the danger object relative to the vehicle.
2. The interactive safety system of claim 1, wherein the computer-executable instructions further cause the processor to:
provide a visual warning to the danger object, wherein the visual warning is provided external to the vehicle.
3. The interactive safety system of claim 1, wherein the computer-executable instructions further cause the processor to:
light a path of the vehicle with high-intensity lights based on the predictive algorithm and trajectory analysis, wherein the path is externally outside the vehicle on a pavement of a street.
4. The interactive safety system of claim 1, the display system further including a heads-up display on a windshield of the vehicle to display the one or more objects with the location, speed, and direction data and the danger object with the future location and the route on the heads-up display.
5. The interactive safety system of claim 1, wherein the computer-executable instructions further cause the processor to:
provide an audible warning to the danger object, wherein the audible warning is provided external to the vehicle.
6. The interactive safety system of claim 5, wherein the audible warning is a 3D sound directed to the danger object.
7. The interactive safety system of claim 1, wherein the one or more object detecting sensors are ultrasonic sensors.
8. The interactive safety system of claim 1, wherein the one or more object detecting sensors are LIDAR sensors.
9. The interactive safety system of claim 1, wherein the one or more object detecting sensors are photoelectric sensors.
10. The interactive safety system of claim 1, the display system further including a rear-view display located in the vehicle and configured to portray an image from the one or more image capturing devices that would be seen in a rear-view mirror, wherein the computer-executable instructions further cause the processor to:
display the image that would be seen in the rear-view mirror on the rear-view display.
11. The interactive safety system of claim 1, further comprising one or more telematics devices configured to transmit vehicle telematics data from the vehicle to the processor, wherein the computer-executable instructions further cause the processor to:
receive the telematics data from the telematics device; and
process in real-time the telematics data, wherein the one or more objects external to the vehicle are tracked further using the telematics data.
12. The interactive safety system of claim 1, wherein the computer-executable instructions further cause the processor to:
highlight the danger object on the display system.
13. The interactive safety system of claim 12, wherein the danger object is highlighted yellow if the danger object is potentially in a path of an accident or a collision.
14. The interactive safety system of claim 12, wherein the danger object is highlighted red if the danger object is imminently in a path of an accident or a collision with the vehicle.
15. The interactive safety system of claim 12, wherein highlighting the danger object includes one or more of the following: blinking colors on the danger object, circling the danger object, or blinking circles around the danger object.
US16/930,974 2019-11-07 2020-07-16 Interactive safety system for vehicles Active US10981507B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/930,974 US10981507B1 (en) 2019-11-07 2020-07-16 Interactive safety system for vehicles
US17/232,068 US11318886B2 (en) 2019-11-07 2021-04-15 Interactive safety system for vehicles
US17/734,699 US12077046B2 (en) 2019-11-07 2022-05-02 Interactive safety system for vehicles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962932188P 2019-11-07 2019-11-07
US16/930,974 US10981507B1 (en) 2019-11-07 2020-07-16 Interactive safety system for vehicles

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/232,068 Continuation US11318886B2 (en) 2019-11-07 2021-04-15 Interactive safety system for vehicles

Publications (2)

Publication Number Publication Date
US10981507B1 true US10981507B1 (en) 2021-04-20
US20210138960A1 US20210138960A1 (en) 2021-05-13

Family

ID=75494969

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/930,974 Active US10981507B1 (en) 2019-11-07 2020-07-16 Interactive safety system for vehicles
US17/232,068 Active US11318886B2 (en) 2019-11-07 2021-04-15 Interactive safety system for vehicles
US17/734,699 Active US12077046B2 (en) 2019-11-07 2022-05-02 Interactive safety system for vehicles

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/232,068 Active US11318886B2 (en) 2019-11-07 2021-04-15 Interactive safety system for vehicles
US17/734,699 Active US12077046B2 (en) 2019-11-07 2022-05-02 Interactive safety system for vehicles

Country Status (1)

Country Link
US (3) US10981507B1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114454832A (en) * 2022-03-14 2022-05-10 陈潇潇 Independent evidence and complete fact recording method for accidents of intelligent driving
CN114872542A (en) * 2022-04-20 2022-08-09 中国第一汽车股份有限公司 Automobile external signal interaction method and system, electronic equipment and automobile
US20220319316A1 (en) * 2021-03-30 2022-10-06 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and vehicle
US11648962B1 (en) * 2021-01-19 2023-05-16 Zoox, Inc. Safety metric prediction
EP4299378A1 (en) * 2022-06-29 2024-01-03 Faurecia Clarion Electronics Co., Ltd. Display control device
US20240001843A1 (en) * 2022-07-04 2024-01-04 Nippon Seiki Co., Ltd. Display device
US11948451B2 (en) 2021-02-22 2024-04-02 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and display device
EP4480754A1 (en) * 2023-06-12 2024-12-25 Honeywell International Inc. Inertial camera scene motion compensation
US12220984B2 (en) * 2020-12-31 2025-02-11 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America User perspective alignment for vehicle see-through applications

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230003872A1 (en) * 2021-06-30 2023-01-05 Zoox, Inc. Tracking objects with radar data
US11794766B2 (en) * 2021-10-14 2023-10-24 Huawei Technologies Co., Ltd. Systems and methods for prediction-based driver assistance
JP7460674B2 (en) * 2022-03-16 2024-04-02 本田技研工業株式会社 Control device
DE102022117068A1 (en) * 2022-07-08 2024-01-11 Bayerische Motoren Werke Aktiengesellschaft Device, means of transport and method for the comprehensive display of moving image content
GB2623996A (en) * 2022-11-03 2024-05-08 Lynley Ashley Adrian Vehicle panoramic aspect
CN120187608A (en) * 2022-11-29 2025-06-20 哈曼贝克自动系统股份有限公司 Surround view system
WO2024123589A1 (en) * 2022-12-07 2024-06-13 Stoneridge Electronics Ab Trailer camera display system

Citations (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6593960B1 (en) 1999-08-18 2003-07-15 Matsushita Electric Industrial Co., Ltd. Multi-functional on-vehicle camera system and image display method for the same
US6642840B2 (en) 2000-07-28 2003-11-04 Lang-Mekra North Amicica, Llc Rearview mirror assembly with monitor
US6970184B2 (en) 2001-03-29 2005-11-29 Matsushita Electric Industrial Co., Ltd. Image display method and apparatus for rearview system
US7110021B2 (en) 2002-05-31 2006-09-19 Matsushita Electric Industrial Co., Ltd. Vehicle surroundings monitoring device, and image production method/program
US7266220B2 (en) 2002-05-09 2007-09-04 Matsushita Electric Industrial Co., Ltd. Monitoring device, monitoring method and program for monitoring
US7353086B2 (en) 2002-11-19 2008-04-01 Timothy James Ennis Methods and systems for providing a rearward field of view for use with motorcycles
US7432799B2 (en) 2004-11-09 2008-10-07 Alpine Electronics, Inc. Driving support apparatus and driving support method
US20080309764A1 (en) 2007-06-13 2008-12-18 Aisin Aw Co., Ltd. Driving assist apparatuses and methods
US7486175B2 (en) 2005-06-15 2009-02-03 Denso Corporation Vehicle drive assist system
US7511734B2 (en) 2004-08-05 2009-03-31 Kabushiki Kaisha Toshiba Monitoring apparatus and method of displaying bird's-eye view image
US7564479B2 (en) 2005-04-06 2009-07-21 Audiovox Corporation Rearview camera display mounted on a vehicle overhead console
US7592928B2 (en) 2005-06-07 2009-09-22 Nissan Motor Co., Ltd. Image display device and method
US20090299857A1 (en) * 2005-10-25 2009-12-03 Brubaker Curtis M System and method for obtaining revenue through the display of hyper-relevant advertising on moving objects
US8058980B2 (en) 2007-09-26 2011-11-15 Nissan Motor Co., Ltd. Vehicle periphery monitoring apparatus and image displaying method
US8130270B2 (en) 2007-10-23 2012-03-06 Alpine Electronics, Inc. Vehicle-mounted image capturing apparatus
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
US8345095B2 (en) 2005-10-07 2013-01-01 Nissan Motor Co., Ltd. Blind spot image display apparatus and method thereof for vehicle
US8547298B2 (en) 2009-04-02 2013-10-01 GM Global Technology Operations LLC Continuation of exterior view on interior pillars and surfaces
US20140019005A1 (en) * 2012-07-10 2014-01-16 Samsung Electronics Co., Ltd. Transparent display apparatus for displaying information of danger element, and method thereof
US8655019B2 (en) 2009-09-24 2014-02-18 Panasonic Corporation Driving support display device
US8733938B2 (en) 2012-03-07 2014-05-27 GM Global Technology Operations LLC Virtual convertible tops, sunroofs, and back windows, and systems and methods for providing same
US8749404B2 (en) 2007-05-14 2014-06-10 Bayerische Motoren Werke Aktiengesellschaft Motor vehicle
US20140285666A1 (en) 2011-11-01 2014-09-25 Magna Mirrors Of America, Inc. Vision system with door mounted exterior mirror and display
US20140293267A1 (en) * 2013-03-27 2014-10-02 Omron Automotive Electronics Co., Ltd. Laser radar device
US8854197B2 (en) 2009-06-03 2014-10-07 Aisin Seiki Kabushiki Kaisha Method of monitoring vehicle surroundings, and apparatus for monitoring vehicle surroundings
US20140336876A1 (en) * 2013-05-10 2014-11-13 Magna Electronics Inc. Vehicle vision system
US8976247B1 (en) 2004-09-14 2015-03-10 Magna Electronics Inc. Rear vision system for a vehicle
US8977008B2 (en) 2004-09-30 2015-03-10 Donnelly Corporation Driver assistance system for vehicle
US20150092042A1 (en) 2013-09-19 2015-04-02 Magna Electronics Inc. Vehicle vision system with virtual retinal display
US9041806B2 (en) 2009-09-01 2015-05-26 Magna Electronics Inc. Imaging and display system for vehicle
US9102269B2 (en) 2011-08-09 2015-08-11 Continental Automotive Systems, Inc. Field of view matching video display system
US9126533B2 (en) 2007-02-23 2015-09-08 Aisin Aw Co., Ltd. Driving support method and driving support device
US9238434B2 (en) 2010-04-19 2016-01-19 Smr Patents S.A.R.L. Rear view mirror simulation
US9290128B2 (en) 2011-10-31 2016-03-22 Lei Pan Multi-functional vehicle rearview mirror
US20160137126A1 (en) 2013-06-21 2016-05-19 Magna Electronics Inc. Vehicle vision system
US9349300B2 (en) 2011-10-31 2016-05-24 Lifelong Driver Llc Senior driver training
US9463741B2 (en) 2014-04-25 2016-10-11 Hitachi Construction Machinery Co., Ltd. Vehicle peripheral obstacle notification system
US20160311375A1 (en) 2015-04-21 2016-10-27 Magna Electronics Inc. Vehicle vision system with exchangeable cameras
US20170015248A1 (en) 2015-07-17 2017-01-19 Magna Mirrors Of America, Inc. Rearview vision system for vehicle
US20170036599A1 (en) 2015-08-06 2017-02-09 Ford Global Technologies, Llc Vehicle display and mirror
US9580017B2 (en) 2007-08-09 2017-02-28 Donnelly Corporation Vehicle mirror assembly with wide angle element
US9598016B2 (en) 2010-10-15 2017-03-21 Magna Mirrors Of America, Inc. Interior rearview mirror assembly
US9604573B2 (en) 2013-07-05 2017-03-28 Mitsubishi Electric Corporation Transmissive information display apparatus for obstacle detection outside a vehicle
US20170129405A1 (en) 2014-08-12 2017-05-11 Sony Corporation Vehicle display device, display control method, and rearview monitoring system
US9654687B2 (en) 2014-12-24 2017-05-16 Agamemnon Varonos Panoramic windshield viewer system
US9661280B2 (en) 2014-10-23 2017-05-23 Honda Motor Co., Ltd. Rearview obstruction camera system and associated method
US9674490B2 (en) 2013-04-18 2017-06-06 Magna Electronics Inc. Vision system for vehicle with adjustable cameras
US20170218678A1 (en) * 2015-11-18 2017-08-03 Be Topnotch, Llc Apparatus, system, and method for preventing vehicle door related accidents
US20170282796A1 (en) 2016-04-04 2017-10-05 Toshiba Alpine Automotive Technology Corporation Vehicle periphery monitoring apparatus
US20180050636A1 (en) 2015-04-13 2018-02-22 SEs Solutions GmbH Virtual panoramic roof or sunroof assembly
US20180096605A1 (en) * 2013-08-02 2018-04-05 Honda Motor Co., Ltd. Vehicle pedestrian safety system and methods of use and manufacture thereof
US9959767B1 (en) 2016-11-03 2018-05-01 GM Global Technology Operations LLC Method and apparatus for warning of objects
US9969330B2 (en) 2013-06-26 2018-05-15 Conti Temic Microelectronic Gmbh Mirror substitute device and vehicle
US20180134217A1 (en) 2015-05-06 2018-05-17 Magna Mirrors Of America, Inc. Vehicle vision system with blind zone display and alert system
US20180136652A1 (en) * 2016-11-14 2018-05-17 Baidu Usa Llc Planning feedback based decision improvement system for autonomous driving vehicle
US20180158255A1 (en) * 2016-07-29 2018-06-07 Faraday&Future Inc. Informational visual display for vehicles
US20180170261A1 (en) 2016-12-19 2018-06-21 Honda Motor Co., Ltd. System and method for activating a rear-facing camera
US10007854B2 (en) 2016-07-07 2018-06-26 Ants Technology (Hk) Limited Computer vision based driver assistance devices, systems, methods and associated computer executable code
US20180229649A1 (en) * 2017-02-14 2018-08-16 Ford Global Technologies, Llc Vehicle hazard notification system
US20180272948A1 (en) 2017-03-24 2018-09-27 Toyota Jidosha Kabushiki Kaisha Viewing device for vehicle
US20180272936A1 (en) 2017-03-24 2018-09-27 Ford Global Technologies, Llc Detection and presentation of obstructed vehicle views
US20180290593A1 (en) 2017-04-10 2018-10-11 Hyundai Motor Company Pillar display system for blind spot of vehicle
US10109200B1 (en) * 2017-06-29 2018-10-23 GM Global Technology Operations LLC Graphical multi-layer light alert display and control method thereof
US20180334100A1 (en) 2017-05-22 2018-11-22 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Rear view display image view positioning and zoom effect system and method
US10144289B2 (en) 2015-04-27 2018-12-04 Lg Electronics Inc. Display apparatus and method for controlling the same
US10161720B2 (en) 2016-12-12 2018-12-25 Toyota Jidosha Kabushiki Kaisha Apparatuses and methods for making an object appear transparent
US20190005337A1 (en) 2017-06-29 2019-01-03 Grammer Ag Device and method for displaying regions
US10183621B2 (en) 2015-11-12 2019-01-22 Mitsubishi Electric Corporation Vehicular image processing apparatus and vehicular image processing system
US10210761B2 (en) 2013-09-30 2019-02-19 Sackett Solutions & Innovations, LLC Driving assistance systems and methods
US10222613B2 (en) 2016-12-07 2019-03-05 Toyota Jidosha Kabushiki Kaisha Display apparatus for a vehicle
US10232848B2 (en) 2016-01-29 2019-03-19 Toyota Motor Engineering & Manufacturing North America, Inc. Detection of left turn across path/opposite direction oncoming objects
US10247941B2 (en) 2015-01-19 2019-04-02 Magna Electronics Inc. Vehicle vision system with light field monitor
US20190122037A1 (en) * 2017-10-24 2019-04-25 Waymo Llc Pedestrian behavior predictions for autonomous vehicles
US20190161274A1 (en) * 2017-11-27 2019-05-30 Amazon Technologies, Inc. Collision prevention for autonomous vehicles
US10315573B2 (en) 2014-02-19 2019-06-11 Magna Electronics Inc. Method for displaying information to vehicle driver
US10331963B2 (en) 2015-06-30 2019-06-25 Denso Corporation Camera apparatus and in-vehicle system capturing images for vehicle tasks
US10345605B2 (en) 2017-11-15 2019-07-09 Toyota Motor Engineering & Manufacturing North America Cloaking devices constructed from polyhedrons and vehicles comprising the same
US10343607B2 (en) 2017-03-24 2019-07-09 Toyota Jidosha Kabushiki Kaisha Viewing device for vehicle
WO2019134845A1 (en) 2018-01-05 2019-07-11 Bcs Automotive Interface Solutions Gmbh Mirror replacement system and method for displaying image and/or video data of the surroundings of a motor vehicle
US20190241126A1 (en) 2018-02-06 2019-08-08 GM Global Technology Operations LLC Vehicle-trailer rearview vision system and method
US20190248288A1 (en) 2016-07-13 2019-08-15 Sony Corporation Image generating device, image generating method, and program
US20190287282A1 (en) * 2018-03-14 2019-09-19 Ford Global Technologies, Llc Vehicle display with augmented realty
US20190367021A1 (en) * 2018-05-31 2019-12-05 Nissan North America, Inc. Predicting Behaviors of Oncoming Vehicles
US20200160537A1 (en) * 2018-11-16 2020-05-21 Uber Technologies, Inc. Deep Structured Scene Flow for Autonomous Devices
US10744938B1 (en) * 2015-03-23 2020-08-18 Rosco, Inc. Collision avoidance and/or pedestrian detection system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6697184B1 (en) 2002-01-29 2004-02-24 Texas Instruments Incorporated Method and system for steering a collimated light beam with a pivotable mirror
JP2008062666A (en) * 2006-09-04 2008-03-21 Toyota Motor Corp Vehicle alarm device
JP6363558B2 (en) * 2015-06-02 2018-07-25 株式会社デンソー Vehicle control apparatus and vehicle control method
KR101632156B1 (en) 2015-10-27 2016-06-21 김진태 Calibration lens can be seen ultra short distance
US10303181B1 (en) * 2018-11-29 2019-05-28 Eric John Wengreen Self-driving vehicle systems and methods
CA3044820C (en) * 2018-05-30 2021-10-12 Rick Giampietro Collision avoidance apparatus
US11507857B2 (en) * 2019-10-01 2022-11-22 TeleLingo Systems and methods for using artificial intelligence to present geographically relevant user-specific recommendations based on user attentiveness


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12220984B2 (en) * 2020-12-31 2025-02-11 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America User perspective alignment for vehicle see-through applications
US11648962B1 (en) * 2021-01-19 2023-05-16 Zoox, Inc. Safety metric prediction
US11948451B2 (en) 2021-02-22 2024-04-02 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and display device
US12183188B2 (en) 2021-02-22 2024-12-31 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and display device
US20220319316A1 (en) * 2021-03-30 2022-10-06 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and vehicle
US11948457B2 (en) * 2021-03-30 2024-04-02 Toyota Jidosha Kabushiki Kaisha Information processing device, information processing system, information processing method, and vehicle
CN114454832A (en) * 2022-03-14 2022-05-10 陈潇潇 Independent evidence and complete fact recording method for accidents of intelligent driving
CN114872542A (en) * 2022-04-20 2022-08-09 中国第一汽车股份有限公司 Automobile external signal interaction method and system, electronic equipment and automobile
EP4299378A1 (en) * 2022-06-29 2024-01-03 Faurecia Clarion Electronics Co., Ltd. Display control device
US20240001850A1 (en) * 2022-06-29 2024-01-04 Faurecia Clarion Electronics Co., Ltd. Display control device
US20240001843A1 (en) * 2022-07-04 2024-01-04 Nippon Seiki Co., Ltd. Display device
EP4480754A1 (en) * 2023-06-12 2024-12-25 Honeywell International Inc. Inertial camera scene motion compensation

Also Published As

Publication number Publication date
US20230052890A1 (en) 2023-02-16
US20210138960A1 (en) 2021-05-13
US20210339679A1 (en) 2021-11-04
US11318886B2 (en) 2022-05-03
US12077046B2 (en) 2024-09-03

Similar Documents

Publication Publication Date Title
US11318886B2 (en) Interactive safety system for vehicles
US11681299B2 (en) Vehicle sensor system and method of use
CN113998034B (en) Rider assistance system and method
US8350686B2 (en) Vehicle information display system
CN111273765B (en) Vehicle display control device, vehicle display control method, and storage medium
US20150293534A1 (en) Vehicle control system and method
JP2019049774A (en) Vehicle control device, vehicle control method, and program
JP2019053574A (en) Vehicle control device, vehicle control method, and program
US20190135169A1 (en) Vehicle communication system using projected light
JP4517393B2 (en) Driving assistance device
EP3690859B1 (en) Method for monitoring blind spot of cycle using smart helmet for cycle rider and blind spot monitoring device using them
CN112052716B (en) Identification device, identification method, and storage medium
EP4224456A1 (en) Information processing device, information processing method, program, and projection device
JP7652329B2 (en) Vehicle control device and vehicle control method
CN114852010A (en) Vehicle seat belt device
CN114765974A (en) Vehicle control method and device
US20250236174A1 (en) Interactive Safety System for Vehicles
KR102094405B1 (en) Method and apparatus for determining an accident using an image
US20240140463A1 (en) Method and apparatus for providing driving data of an autonomous vehicle
JP2022121370A (en) Display control device and display control program
JP2022100852A (en) Attention evocation device and attention evocation method
JP7537391B2 (en) Object detection device
TWI876714B (en) Method for vehicle detects stationary objects at high speeds
US20250218065A1 (en) Information processing apparatus and information processing method
WO2024189766A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: RAILSERVE INC., GEORGIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PRESTON, JON A;BENJAMIN, TIMOTHY J;CHASTINE, JEFF;SIGNING DATES FROM 20191108 TO 20191111;REEL/FRAME:054009/0785

Owner name: FOCUSED TECHNOLOGY SOLUTIONS, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CALABRESE, JOSEPH L.;BARTEK, PETER M.;SIGNING DATES FROM 20191104 TO 20200511;REEL/FRAME:054009/0782

Owner name: FOCUSED TECHNOLOGY SOLUTIONS, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RAILSERVE INC.;REEL/FRAME:054009/0790

Effective date: 20191111

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: FOCUSED TECHNOLOGY SOLUTIONS, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DASH, COREY;REEL/FRAME:056270/0557

Effective date: 20210517

CC Certificate of correction
AS Assignment

Owner name: RAILSERVE, INC., GEORGIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FOCUSED TECHNOLOGY SOLUTIONS, INC.;REEL/FRAME:060524/0972

Effective date: 20220714

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4