
GB2632707A - Vehicle warning system - Google Patents

Vehicle warning system

Info

Publication number
GB2632707A
GB2632707A
Authority
GB
United Kingdom
Prior art keywords
vehicle
audio
audio signal
control system
dependence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2312665.9A
Other versions
GB202312665D0 (en)
Inventor
Cook Luke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB2312665.9A
Publication of GB202312665D0
Publication of GB2632707A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00 Arrangement or adaptation of acoustic signal devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Aspects of the present invention relate to a control system (100) for controlling a warning system of a vehicle (200), the control system (100) comprising one or more processors (120) collectively configured to receive (310) object data (160) from one or more sensors (220A-G) of the vehicle (200), the object data (160) comprising data indicative of one or more objects in a vicinity of the vehicle (200), wherein the object data (160) at least comprises location and audio characteristics of the one or more objects; generate (320), in dependence on the object data (160), one or more audio signals (170), wherein an audio signal (170) is generated for each of the one or more objects in the vicinity of the vehicle (200), the audio signal (170) being generated in dependence on the audio characteristic of the respective object; and output (330), to one or more speakers (240) of the vehicle (200), the one or more audio signals (170), wherein each generated audio signal (170) is output to the one or more speakers (240) in dependence on the location characteristic of the respective object. Aspects of the invention are also related to a system incorporating a control system (100), one or more sensors (220A-G) of the vehicle (200) and one or more speakers (240) of the vehicle (200), a vehicle (200) incorporating a control system (100), and a method (300) of controlling a warning system of the vehicle (200).

Description

VEHICLE WARNING SYSTEM
TECHNICAL FIELD
The present disclosure relates to a vehicle warning system. Aspects of the invention relate to a control system, a system, a vehicle, a method and computer readable instructions.
BACKGROUND
It is known to provide vehicles with a warning system whereby an audio signal is output to the speakers of the vehicle if a potential hazard is detected within the vicinity of the vehicle, to thereby alert a user of the vehicle.
Typically, the same audio signal will be output for all potential hazards, providing the user only with an indication that a potential hazard exists. If any other information about the potential hazard is provided, it is provided through a human-machine interface of the vehicle dashboard.
It is an aim of the present invention to address one or more of the disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a system, a vehicle, a method and computer readable instructions as claimed in the appended claims.
This disclosure provides a warning system of a vehicle that provides different audio signals to a user depending on the audio and location characteristics of the potential hazards in the vicinity of the vehicle. The warning system detects objects in the vicinity of the vehicle, and then generates and outputs audio signals to alert the user based on the sound generated by the objects and their position relative to the vehicle.
According to an aspect of the present invention there is provided a control system for controlling a warning system of a vehicle, the control system comprising one or more processors collectively configured to receive object data from one or more sensors of the vehicle, the object data comprising data indicative of one or more objects in a vicinity of the vehicle, wherein the object data at least comprises location and audio characteristics of the one or more objects. The one or more processors are collectively configured to generate, in dependence on the object data, one or more audio signals, wherein an audio signal is generated for each of the one or more objects in the vicinity of the vehicle, the audio signal being generated in dependence on the audio characteristic of the respective object, and output, to one or more speakers of the vehicle, the one or more audio signals, wherein each generated audio signal is output to the one or more speakers in dependence on the location characteristic of the respective object.
In this way, the sound generated by each object that is a potential hazard to the vehicle is detected and output to the speakers of the vehicle depending on the position of the object relative to the vehicle. In doing so, the driver may be alerted to multiple objects and their position relative to the vehicle solely through the audio signals without needing to look around the vehicle or at the vehicle dashboard.
The control system comprises one or more controllers collectively comprising at least one electronic processor having an electrical input for receiving an input signal; and at least one memory device electrically coupled to the at least one electronic processor and having instructions stored therein; and wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to receive object data from one or more sensors of the vehicle, the object data comprising data indicative of one or more objects in a vicinity of the vehicle, wherein the object data at least comprises location and audio characteristics of the one or more objects; generate, in dependence on the object data, one or more audio signals, wherein an audio signal is generated for each of the one or more objects in the vicinity of the vehicle, the audio signal being generated in dependence on the audio characteristic of the respective object; and output, to one or more speakers of the vehicle, the one or more audio signals, wherein each generated audio signal is output to the one or more speakers in dependence on the location characteristic of the respective object.
Optionally, the audio characteristic of the one or more objects may comprise at least one of a frequency characteristic and an amplitude characteristic.
In this way, the audio signal output to the driver provides an indication of the type of object by virtue of the sound it is generating, and its distance to the vehicle by virtue of the amplitude of the sound.
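By way of illustration, a frequency characteristic and an amplitude characteristic could be estimated from a microphone buffer with a spectral peak and an RMS calculation. This is a minimal sketch, not the patent's implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def audio_characteristics(samples, sample_rate):
    """Estimate the dominant frequency (Hz) and RMS amplitude of a
    microphone buffer. Illustrative sketch: frequency from the largest
    FFT bin, amplitude as the root-mean-square of the samples."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    dominant_freq = freqs[np.argmax(spectrum)]
    rms_amplitude = np.sqrt(np.mean(np.square(samples)))
    return dominant_freq, rms_amplitude
```

For a pure 440 Hz tone this would return a dominant frequency near 440 Hz and an RMS of about 0.707 of the peak amplitude.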
Optionally, the one or more processors may be collectively configured to determine the location characteristic of the one or more objects in dependence on a position of the one or more sensors on the vehicle.
In this way, the location of the objects relative to the vehicle is derived based on the position of the sensor that detects the audio associated with the object.
Optionally, the one or more sensors may comprise one or more microphones arranged at one or more positions on the vehicle.
Optionally, the one or more processors may be collectively configured to receive background audio data from at least one of the one or more microphones, the background audio data comprising data indicative of background noise in the vicinity of the vehicle, and generate, in dependence on the background audio data, one or more anti-noise signals, wherein the one or more audio signals are further generated in dependence on the one or more anti-noise signals.
For example, the one or more processors may be collectively configured to generate the one or more audio signals by combining the one or more anti-noise signals with the audio characteristics of the respective object.
In this way, the other microphones around the vehicle are used to cancel out any background noise not related to the object to thereby improve the ease with which the driver can identify the object from the audio signals being output to the speakers.
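In the simplest reading of this step, the anti-noise signal is a phase-inverted copy of the background noise that is summed with the object's audio. The sketch below shows that idea only; the patent does not specify the cancellation algorithm, so treat this as an assumption.

```python
import numpy as np

def apply_anti_noise(object_audio, background_audio):
    """Combine the object's audio with a phase-inverted copy of the
    background noise, so background components present in the object
    channel are suppressed. Hypothetical sketch of the anti-noise step."""
    anti_noise = -background_audio  # 180-degree phase inversion
    return object_audio + anti_noise
```

If the object channel is modelled as clean signal plus background, the combination recovers the clean signal exactly; in practice, microphone placement and delays would make the cancellation only approximate.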
Optionally, the one or more processors may be collectively configured to determine whether the audio characteristic of an object is above a predetermined threshold, and generate an audio signal for the object in dependence on the determination.
In such cases, the audio characteristic may be an amplitude characteristic.
For example, the one or more processors may be collectively configured to generate the audio signal in dependence on the audio characteristic of the respective object if the audio characteristic is above the predetermined threshold.
As such, an audio signal is only generated if the sound generated by the object is above a certain threshold of amplitude. In this way, if the audio signal is below the threshold (i.e., it is too quiet), for example, because the object is not close enough to be considered a hazard, then no audio signal will be generated.
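The amplitude gating could be sketched as a decibel comparison. The conversion to dB SPL relative to 20 micropascals and the 60 dB figure (an example threshold given later in the description) are assumptions of this sketch, not claimed specifics.

```python
import math

def should_alert(rms_amplitude, reference=2e-5, threshold_db=60.0):
    """Gate audio-signal generation on loudness: convert an RMS sound
    pressure to dB SPL (re 20 micropascals) and compare against the
    threshold. Returns False for silence or sub-threshold sounds."""
    if rms_amplitude <= 0:
        return False
    level_db = 20.0 * math.log10(rms_amplitude / reference)
    return level_db >= threshold_db
```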
Optionally, the one or more processors may be collectively configured to generate an audio signal comprising a plurality of frequencies if the audio characteristic is below the predetermined threshold, wherein at least one frequency of the audio signal is generated in dependence on one or more further characteristics of the respective object.
In this way, if the audio signal is below the threshold because it is an object that does not generate any or a significant amount of sound, a different audio signal will be generated based on other characteristics of the object.
Optionally, the one or more further characteristics of the respective object may comprise at least one of a size of the object, a position of the object relative to the vehicle, an object type, and a velocity of the object relative to the vehicle.
According to another aspect of the invention, there is provided a system comprising the control system as mentioned above, one or more sensors of the vehicle and one or more speakers of the vehicle.
According to yet another aspect of the invention, there is provided a vehicle comprising the system as mentioned above or the control system as mentioned above.
According to a further aspect of the invention, there is provided a method for controlling a warning system of a vehicle. The method comprises receiving object data from one or more sensors of the vehicle, the object data comprising data indicative of one or more objects in a vicinity of the vehicle, wherein the object data comprises location and audio characteristics of the one or more objects. The method further comprises generating, in dependence on the object data, one or more audio signals, wherein an audio signal is generated for each of the one or more objects in the vicinity of the vehicle, the audio signal being generated in dependence on the audio characteristic of the respective object, and outputting, to one or more speakers of the vehicle, the one or more audio signals, wherein each generated audio signal is output to the one or more speakers in dependence on the location characteristic of the respective object.
According to a still further aspect of the invention, there are provided computer readable instructions which, when executed by a computer, are arranged to perform the method as mentioned above.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination.
That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a block diagram illustrating a control system according to an embodiment of the present invention;
Figure 2A shows a schematic illustration of a vehicle according to an embodiment of the present invention;
Figure 2B shows a schematic illustration of a rear-view of the vehicle of Figure 2A;
Figure 3 shows a method according to an embodiment of the invention;
Figure 4 shows a schematic illustration of the vehicle of Figure 2A during the operations performed by the control system of Figure 1; and
Figure 5 shows a second flow chart showing operations performed by the control system of Figure 1 according to an embodiment of the present invention.
DETAILED DESCRIPTION
A control system 100 for controlling a warning system of a vehicle in accordance with an embodiment of the present invention is described herein with reference to the accompanying Figure 1. As shown in Figure 2A, the control system 100 is installed in a vehicle 200. The control system 100 comprises one or more controllers 110.
With reference to Figure 1, there is illustrated a control system 100 for a vehicle 200. The control system 100 comprises one or more controllers 110.
The control system 100 is configured to receive object data 160 from one or more sensors 220A-G of the vehicle 200 and generate one or more audio signals 170 associated with the object data 160. The control system 100 may then output the audio signals 170 to one or more speakers 240 of the vehicle 200 to alert the driver of the vehicle 200 to any potential hazards within the vicinity of the vehicle 200. As described in detail below, the control system 100 generates an audio signal 170 that is tailored to each object that is a potential hazard to the vehicle 200 as detected by one or more sensors 220A-G. The audio signal generated depends on the characteristics of the object which is a potential hazard, such as the location of the object and the sound generated by the object, and is then output to the driver. In doing so, the driver may be alerted to multiple objects and their position relative to the vehicle 200 solely through the audio signals without needing to look around the vehicle 200 or at the vehicle dashboard or HMI.
The control system 100 as illustrated in Figure 1 comprises one controller 110, although it will be appreciated that this is merely illustrative. The controller 110 comprises processing means 120 and memory means 130. The processing means 120 may be one or more electronic processing devices 120 which operably execute computer-readable instructions. The memory means 130 may be one or more memory devices 130. The memory means 130 is electrically coupled to the processing means 120. The memory means 130 is configured to store instructions, and the processing means 120 is configured to access the memory means 130 and execute the instructions stored thereon.
The controller 110 comprises an input means 140 and an output means 150. The input means 140 may comprise an electrical input 140 of the controller 110. The output means 150 may comprise an electrical output 150 of the controller 110. The input 140 is arranged to receive object data 160 from one or more sensors of the vehicle 200. The object data 160 is an electrical signal which is indicative of one or more characteristics of objects in the vicinity of the vehicle 200. For example, the object data 160 may comprise data indicative of one or more audio characteristics of an object, such as the frequency and amplitude of a sound generated by an object, and data indicative of the position of an object relative to the vehicle 200. The object data 160 may also comprise one or more of the size of an object, the type of object and a velocity of the object relative to the vehicle 200. The one or more sensors 220A-G may be any sensor configured to detect objects within the vicinity of the vehicle 200, from which the characteristics of the objects can be derived. For example, the one or more sensors may comprise one or more audio sensors, such as microphones, or any other sensor configured to detect sound. Optionally, the one or more sensors may comprise one or more of a radar system, such as a frequency modulated continuous wave (FMCW) radar system, a lidar system, one or more image sensors, and/or an ultrasonic sensor system.
Optionally, the input 140 may be configured to receive an operating conditions signal 165 from one or more further sensors of the vehicle 200. The operating conditions signal 165 is an electrical signal which is indicative of one or more operational conditions of the vehicle 200. For example, the operating conditions signal 165 may comprise one or more of a signal indicative of the speed of the vehicle 200, a signal indicative of an operating mode of the vehicle 200, a signal indicative of the activation of a direction indicator by a user, a signal indicative of the activation of one or more lights by a user (e.g., hazard lights, fog lights, headlights etc.), a signal indicative of the activation of the windscreen wipers by a user, a steering angle signal from an angle sensor indicative of an angle of the steering wheel, and a temperature signal from a temperature sensor indicative of the surrounding temperature.
The output 150 is arranged to output an audio signal 170 to one or more speakers 240 of the vehicle, the audio signal 170 being indicative of the characteristics of the objects in the vicinity of the vehicle 200. The audio signal 170 thereby indicates to the driver of the vehicle the presence of one or more objects in the vicinity of the vehicle 200.
Figure 2A illustrates a vehicle 200 according to an embodiment of the present invention. Figure 2B illustrates a rear-view of the vehicle 200 of Figure 2A. The vehicle 200 comprises controller 110 as illustrated in Figure 1. The controller 110 is shown as mounted within the vehicle 200 and is in communication with one or more sensors 220A-G distributed about the vehicle 200, such that the object data 160 can be received from the one or more sensors 220A-G. It will of course be appreciated that the sensors 220A-G shown in Figures 2A-B are merely illustrative and there may be any number of sensors provided at any suitable location about the vehicle 200. The control system 110 is in further communication with a speaker system 240 located within the vehicle 200 such that the audio signals 170 can be transmitted to the one or more speakers of the speaker system 240. Again, it will be appreciated that the speaker system 240 may comprise any number of speakers distributed about the vehicle 200 at any suitable position. For example, the speaker system 240 may comprise a plurality of speakers, with at least one speaker located next to the driver's seat, one speaker located next to the front passenger seat, one speaker located at the left side in the back of the vehicle 200 and another speaker located at the right side in the back of the vehicle 200.
Vehicle 200 may be a known human controlled vehicle or an EGO vehicle, i.e., a vehicle that is equipped with autonomous or semi-autonomous driving technology and is capable of sensing and navigating its environment without direct input from a human driver.
Figure 3 illustrates a method 300 according to an embodiment of the invention. The method 300 is a method of controlling a warning system of a vehicle 200, such as the vehicle 200 illustrated in Figures 2A and 2B and with further reference to Figure 4. The method 300 may be performed by the system 100 illustrated in Figure 1. In particular, the memory 130 may comprise computer-readable instructions which, when executed by the processor 120, perform the method 300 according to an embodiment of the invention. In the example shown in Figure 4, the vehicle 200 is travelling along a road and approaching an object, which in this case is a further vehicle 400, in its vicinity. It will of course be appreciated that the "objects" described herein may be any object within the vicinity of the vehicle 200 that could be a potential hazard to the vehicle 200, including but not limited to other vehicles, cyclists, pedestrians, walls, barriers, bollards, lampposts or any other physical object in the surrounding area.
At step 310, the controller 110 is configured to receive object data from one or more sensors 220A-G of the vehicle 200. The object data is received as an input signal 160 at the input means 140 of the controller 110 and comprises data indicative of one or more objects in the vicinity of the vehicle 200 and the characteristics thereof.
In the example shown in Figure 4, one or more sensors 220A, 220C, 220D detect a further vehicle 400 in front of the vehicle 200 and determine one or more characteristics thereof. In this example, the sensors 220A, 220C, 220D comprise audio sensors, such as an array of microphones, configured to detect sounds generated by objects in the vicinity of the vehicle 200, from which one or more audio characteristics such as frequency and amplitude can be determined. Additionally, a location characteristic of an object (i.e., its position relative to the vehicle 200) can also be determined, for example, based on the position of the sensors 220A-G that detect the object and the amplitude of the sound (i.e., the loudness) detected at each respective sensor 220A-G. For example, in Figure 4, the sound generated by the further vehicle 400 is detected at sensors 220A, 220C, 220D. The sound detected at sensor 220C will be louder relative to the sound detected at sensors 220A and 220D, with the sound being the quietest at sensor 220D since it is the furthest away from the further vehicle 400, and thus it can be determined that the further vehicle 400 is in front and to the left of the vehicle 200. Other characteristics such as the velocity of the further vehicle 400 relative to the vehicle 200 may also be determined, for example, by tracking any changes in the sensors 220A-G at which the further vehicle 400 is detected and changes in the audio characteristics detected at each sensor 220A-G as the further vehicle 400 moves relative to the vehicle 200. That is to say, as the further vehicle 400 moves, the sound of the further vehicle 400 will be detected at different sets of sensors 220A-G with the amplitude changing as the further vehicle 400 moves closer and/or further away from each sensor 220A-G. This can then be tracked to determine the velocity at which the further vehicle 400 is moving.
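A coarse version of this amplitude-based localisation could weight each microphone's position by the loudness it measures. The sensor coordinates below are hypothetical (the figures do not give positions); only the labels 220A, 220C, 220D follow the description.

```python
import math

# Hypothetical sensor layout: (x, y) positions in metres relative to
# the vehicle centre (x forward, y left). Labels follow the figures.
MIC_POSITIONS = {
    "220A": (1.5, 0.0),   # front centre
    "220C": (1.2, 0.8),   # front left
    "220D": (1.2, -0.8),  # front right
}

def estimate_bearing(amplitudes):
    """Amplitude-weighted estimate of an object's bearing (degrees,
    0 = straight ahead, positive = left): sensors that hear the object
    loudest pull the estimate towards their side of the vehicle."""
    wx = sum(a * MIC_POSITIONS[m][0] for m, a in amplitudes.items())
    wy = sum(a * MIC_POSITIONS[m][1] for m, a in amplitudes.items())
    return math.degrees(math.atan2(wy, wx))
```

With sensor 220C loudest, as in the Figure 4 scenario, the estimated bearing falls ahead and to the left of the vehicle.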
Whilst Figure 4 shows the use of three sensors 220A, 220C, 220D to detect the object 400, it will of course be appreciated that objects may be detected by any number of sensors of the vehicle 200. It will also be appreciated that multiple types of sensors may be used in combination. For example, in the scenario shown in Figure 4, object data 160 may also be received from a radar system, from which characteristics such as the size of an object, the position of an object relative to the vehicle 200, the type of object and a velocity of the object relative to the vehicle 200 may be determined.
At step 320, the controller 110 is configured to generate one or more audio signals based on the received object data 160. In this respect, the processing means 120 receives the input signal 160 from the input means 140 and, upon executing the instructions stored in the memory means 130, generates one or more audio signals. An audio signal is generated for each of the detected objects within the vicinity of the vehicle 200 based on the audio characteristics of the respective object. For example, the audio signal may comprise the same frequency and audio characteristics as the sound detected by the sensors 220A-G of the vehicle 200, such that the audio signal replicates the sound emitted by the objects within the vicinity of the vehicle 200. In doing so, the frequency and amplitude of the audio signal provides an indication to the driver of the type of object and its proximity to the vehicle 200. Referring to the example shown in Figure 4, the sound made by the further vehicle 400 will be generated as an audio signal, thereby indicating to the driver that another vehicle is in the vicinity of the vehicle 200, the amplitude of said audio signal providing an indication of how close the further vehicle 400 is to the vehicle 200. Likewise, if two or more objects are detected in the vicinity of the vehicle 200, different audio signals will be generated, each having the audio characteristics of the respective object to thereby allow the driver to distinguish between each object.
Optionally, the one or more audio signals may be further generated based on background audio data received from one or more further sensors 220A-G of the vehicle 200. For example, if the object data is received from one or more sensors in the front of the vehicle 200 (e.g., sensors 220A, 220C and 220D as shown in Figure 4), background audio data may be received from one or more sensors 220A-G in the rear of the vehicle (e.g., sensors 220E-G shown in Figure 2). In this respect, it will be appreciated that the background audio data may be received as the input signal 160 from the input means 140. The background audio data comprises data indicative of background noise in the vicinity of the vehicle 200, and thus corresponds to input data 160 received from the one or more sensors 220A-G in locations where an object is not detected. The processing means 120 may then be configured to generate one or more anti-noise signals from the background audio data, and generate the one or more audio signals in further dependence on the one or more anti-noise signals, for example, by combining the one or more anti-noise signals with the audio characteristics of the respective object.
Optionally, before the one or more audio signals are generated, the processing means 120 may be configured to determine whether at least one audio characteristic of an object is above a predetermined threshold, and generate an audio signal for the object in dependence on the determination. For example, if the amplitude characteristic for an object is at or above a predetermined threshold (i.e., it is relatively loud), the processing means 120 will generate an audio signal based on the audio characteristics of the respective object, as described above. For example, the predetermined threshold may be an amplitude of approximately 60 dB, although it will be appreciated that any suitable threshold may be used.
If the amplitude characteristic is below the predetermined threshold, the processing means 120 may be configured to generate the one or more audio signals based on object data 160 received from one or more further sensors 220A-G of the vehicle 200 and one or more further characteristics of the object derived from said object data 160. For example, the processing means 120 may receive further object data 160 from a radar system, and process the further object data 160 to determine one or more of the size of the object, a position of the object relative to the vehicle 200, an object type and a velocity of the object relative to the vehicle 200. The processing means 120 may then be configured to generate an audio signal for the detected object, the audio signal comprising a plurality of frequencies. For example, the audio signal may be a random noise signal comprising a plurality of different frequencies, such as white noise or grey noise. In this respect, it will be appreciated that the audio signals may comprise any frequency within the audible sound spectrum, that is, frequencies within the range of 20 to 20,000 Hz. At least one frequency of the audio signal is generated in dependence on at least one characteristic of the respective object such that an audio signal is generated that is tailored to the characteristics of the object to which it relates.
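One way to realise such a plural-frequency signal is broadband noise mixed with a tone at an object-dependent frequency. This is a sketch under assumptions: the mix ratio, duration, and normalisation are illustrative choices, not from the patent.

```python
import numpy as np

def synthesize_alert(center_freq, sample_rate=16000, duration=0.5, seed=0):
    """Generate a noise-like alert comprising a plurality of frequencies,
    with its dominant component at `center_freq` (chosen from the
    object's characteristics): white noise mixed with a tone."""
    rng = np.random.default_rng(seed)
    n = int(sample_rate * duration)
    t = np.arange(n) / sample_rate
    noise = 0.3 * rng.standard_normal(n)              # broadband component
    tone = 0.7 * np.sin(2 * np.pi * center_freq * t)  # object-dependent
    signal = noise + tone
    return signal / np.max(np.abs(signal))            # normalise to [-1, 1]
```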
For example, at least one frequency of the audio signal may be generated based on the size of the object. For example, an audio signal having a low average frequency may be generated for larger objects (e.g., a lorry or a truck), whilst an audio signal with a relatively higher average frequency may be generated for smaller objects (e.g., pedestrians, cyclists or smaller motor vehicles). In doing so, separate audio signals may be generated for each object in the vicinity of the vehicle 200 that enable the driver to distinguish between different smaller or larger objects in the vicinity of the vehicle 200.
As another example, at least one frequency of the audio signal may be generated based on the type of object. For example, a first range of frequencies may be used for cars, a second range of frequencies may be used for lorries or trucks, a third range of frequencies may be used for pedestrians and so on. In doing so, the driver will be able to associate a particular sound with a particular type of object.
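The size- and type-dependent frequency selection of the two preceding paragraphs could be captured as a lookup table. The specific bands below are invented for illustration; the description only says that distinct ranges are used, with lower frequencies for larger objects.

```python
# Hypothetical frequency bands per object type (Hz); the description
# only requires that each type gets a distinct range, with larger
# objects assigned lower frequencies.
TYPE_FREQUENCY_BANDS = {
    "lorry": (100, 300),         # large objects: low frequencies
    "car": (300, 800),
    "cyclist": (800, 1500),
    "pedestrian": (1500, 3000),  # small objects: higher frequencies
}

def base_frequency(object_type):
    """Pick the centre of the band assigned to the object type, so each
    type of hazard gets a recognisably different alert pitch."""
    low, high = TYPE_FREQUENCY_BANDS[object_type]
    return (low + high) / 2.0
```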
As a further example, at least one frequency of the audio signal may be generated based on the velocity of the object relative to the vehicle 200. For example, as described above, an initial range of frequencies of the audio signal may be first selected based on another characteristic of the object, such as size and/or object type. A Doppler effect may then be applied to the audio signal based on the velocity of the object relative to the vehicle 200, such that the frequencies of the audio signal are shifted according to the velocity at which the object is travelling, thereby providing an indication of the speed at which the object is travelling and any changes in speed. For example, if the object is travelling at a high speed relative to the vehicle 200 (e.g., above 80 km/h), the range of frequencies or the average frequency may be increased, and then decreased if the object slows down to a lower speed.
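The Doppler shift applied here follows the standard moving-source formula; treating the closing speed as the source velocity is an assumption of this sketch.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def doppler_shift(base_freq, relative_speed):
    """Shift the alert frequency as if emitted by a source approaching
    at `relative_speed` (m/s, positive = closing): f' = f * c / (c - v).
    Approaching objects raise the pitch; receding objects lower it."""
    return base_freq * SPEED_OF_SOUND / (SPEED_OF_SOUND - relative_speed)
```

At 80 km/h (about 22.3 m/s) closing speed, a 440 Hz alert shifts up to roughly 470 Hz, an audibly higher pitch.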
The audio signals may be further generated in dependence on the position of the detected objects relative to the vehicle 200. In this respect, the amplitude of the audio signal generated for each respective object may be adjusted based on the position of the object relative to the vehicle 200. For example, the amplitude of the audio signal may be increased as the distance between the object and the vehicle 200 decreases, and then decreased as the distance between the object and the vehicle 200 increases. In doing so, the driver is provided with an indication as to how close an object is to the vehicle 200. In cases where the distance between the object and the vehicle 200 is below a predetermined distance threshold, for example, less than 5 metres, a pulsed audio signal may be generated.
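The distance-dependent amplitude behaviour described above might be sketched as follows. The linear gain law and the 30 m maximum range are illustrative assumptions; only the 5 m pulse threshold echoes the example given in the preceding paragraph:

```python
def amplitude_and_mode(distance_m, max_distance_m=30.0, pulse_threshold_m=5.0):
    """Map object distance to an output gain in [0, 1] and a pulsed flag.

    Gain rises linearly as the object closes from max_distance_m to 0
    (a hypothetical linear law); within pulse_threshold_m the signal
    is additionally marked to be pulsed.
    """
    clamped = min(max(distance_m, 0.0), max_distance_m)
    gain = 1.0 - clamped / max_distance_m
    pulsed = distance_m < pulse_threshold_m
    return gain, pulsed
```

For example, an object 3 m away would be rendered at 90% gain and pulsed, while one 30 m away or further would be silent under this sketch.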
Once the processing means 120 has generated the audio signals for each respective object, the controller 110 outputs, at step 330, the audio signals 170 to one or more speakers 240 of the vehicle 200. The audio signals 170 may be output to one or more of the speakers 240 based on the location characteristic of the respective object. In this respect, the processing means 120 is configured to process the object data 160 to determine a location characteristic of the object, as described above with respect to step 310, the location characteristic comprising a position of the object relative to the vehicle 200. The audio signals 170 may then be output to one or more of the speakers 240 based on the position of the object relative to the vehicle 200 such that the position of the speakers 240 corresponds to the relative position of the object. For example, in Figure 4, the audio signal 170 will be output to the speakers 240 located at the front left side of the vehicle 200 to correspond to the relative position of the further vehicle 400. In doing so, the driver will be able to determine that the further vehicle 400 is in front of the vehicle 200 and to the left side based on the position of the speakers 240 outputting the audio signal 170.
In order to output the audio signals 170 to the speakers 240 based on the location characteristics of the detected objects, the processing means 120 may be configured to map the one or more sensors 220A-G to at least one of the speakers 240 of the vehicle 200 to determine which speakers 240 the audio signal 170 will be output to and the amplitude of the audio signal 170 at those respective speakers 240. For example, referring again to Figure 4, the sensors 220A, 220C, 220D may each be mapped to a particular speaker 240 in the vehicle 200 (e.g., one or more of the front speakers). The audio signal 170 may then be output to the relevant speakers 240, with the audio signal 170 being rendered most loudly by the speaker 240 that is mapped to the sensor that detected the loudest sound (i.e., the sensor 220C closest to the further vehicle 400) so that the driver can infer where the object is relative to the vehicle 200. As noted above, the amplitude of the audio signal 170 output to the speakers 240 may also increase or decrease as objects move closer or further away, so that the driver may infer how close the object is to the vehicle 200 from the volume of the audio signal 170.
Likewise, in the case of a moving object, the audio signal 170 may be output to different speakers 240 as the object moves relative to the vehicle 200 so that the driver may infer the direction in which the object is moving, as well as the speed at which the object is moving relative to the vehicle 200.
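The sensor-to-speaker mapping described above could be sketched as follows. The sensor and speaker identifiers are placeholders loosely following Figure 4, and the max-then-normalise gain rule is one possible choice rather than the specified behaviour:

```python
# Hypothetical mapping from sensor identifiers to speaker positions
# (names are illustrative only, not taken from the description).
SENSOR_TO_SPEAKER = {
    "220A": "front_left",
    "220C": "front_left",
    "220D": "front_right",
    "220F": "rear_left",
    "220G": "rear_right",
}

def speaker_gains(sensor_levels):
    """Distribute an audio signal across speakers from per-sensor levels.

    sensor_levels maps sensor id -> detected amplitude. Each speaker's
    gain is the largest level among the sensors mapped to it, normalised
    so the speaker mapped to the loudest detection renders at full gain.
    """
    gains = {}
    for sensor, level in sensor_levels.items():
        speaker = SENSOR_TO_SPEAKER.get(sensor)
        if speaker is not None:
            gains[speaker] = max(gains.get(speaker, 0.0), level)
    peak = max(gains.values(), default=0.0)
    if peak > 0.0:
        gains = {speaker: g / peak for speaker, g in gains.items()}
    return gains
```

As the object moves, recomputing these gains each frame naturally pans the signal between speakers, which is the effect the preceding paragraphs describe.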
As such, based on the sound, amplitude and direction of the audio signals 170 generated and output to the speakers 240, the driver will be alerted to the position, size, velocity and/or type of objects in the vicinity of the vehicle 200.
Figure 5 is a flowchart 500 according to an embodiment of the present invention. The flowchart 500 illustrates further steps performed by the control system 100 of Figure 1 in controlling a warning system of a vehicle 200, such as the vehicle 200 illustrated in Figures 2A and 2B. In particular, the memory 130 may comprise computer-readable instructions which, when executed by the processor 120, perform the method 500 according to an embodiment of the invention. The method 500 shown in Figure 5 provides a method of determining whether one or more objects in the vicinity of the vehicle 200 are a potential hazard to the vehicle 200. It will be appreciated that the method 500 may be performed between steps 310 and 320 of the method 300, such that once object data 160 has been received at step 310, the method 500 will be performed to determine whether each detected object is a potential hazard based on the received object data 160, with an audio signal then being generated at step 320 if the object is identified as a potential hazard.
At step 510, the controller 110 is configured to determine whether an object is a potential hazard based on the type of object. In this respect, the object data 160 is processed to determine an object type characteristic of a detected object, and to determine whether the object corresponds to a known object type based on one or more known object types, which may be stored in the memory means 130. In this respect, the memory means 130 may store a first list of known object types that are associated with a particular location and categorised as potential hazards (e.g., lane markings, traffic cones, pedestrians, cyclists) and a second list of known object types that are categorised as not potential hazards (e.g., plastic bags or other trash), in which case there is no need to generate an audio signal for the object. For the objects in the first list of known object types, there may be a pre-defined audio signal stored in the memory means 130, in which case the method will proceed to step 320 to generate the pre-defined audio signal. For example, if the object is identified as a set of lane markings, a pre-defined audio signal may be used to alert the driver if the vehicle 200 gets too close to the edge of the lane markings.
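The step 510 lookup against the first and second lists of known object types might be sketched as below; the list contents are the examples given in the description, not an exhaustive specification:

```python
# Illustrative stand-ins for the first and second lists of known object
# types held in the memory means 130.
HAZARD_TYPES = {"lane_markings", "traffic_cone", "pedestrian", "cyclist"}
NON_HAZARD_TYPES = {"plastic_bag", "trash"}

def classify_known_object(object_type):
    """Return 'hazard', 'not_hazard', or 'unknown' for step 510.

    'hazard' proceeds to audio signal generation (step 320),
    'not_hazard' suppresses any signal, and 'unknown' means
    steps 520-540 must run to decide.
    """
    if object_type in HAZARD_TYPES:
        return "hazard"
    if object_type in NON_HAZARD_TYPES:
        return "not_hazard"
    return "unknown"
```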
If the object is not determined to correspond to a known object type at step 510, steps 520-540 will be performed, either in sequence or in parallel, to determine whether the unknown object is a potential hazard.
At step 520, the controller 110 is configured to determine a predicted trajectory of the vehicle 200 based on one or more operating conditions of the vehicle 200. In this respect, the operating conditions signal 165 is processed to determine one or more operating conditions of the vehicle 200, such as whether a direction indicator is activated. The controller 110 is then configured to predict the trajectory of the vehicle 200 based on one or more of the operating conditions. For example, if the operating conditions signal 165 comprises a signal indicating that the direction indicator has been activated and indicating left, the controller 110 will determine a predicted trajectory in a left direction for a range of possible steering wheel angles, for example, 0° to 180° in an anticlockwise direction. If there is no indication that a direction indicator is activated, the predicted trajectory will be determined to be along a generally straight line.
At step 530, the controller 110 is configured to determine a trajectory of a moving object relative to the vehicle 200 based on the object data 160 and one or more operating conditions of the vehicle 200. In this respect, the object data 160 will be processed to determine the position and velocity of the object relative to the vehicle 200, and the operating conditions signal 165 will be processed to determine one or more of a speed of the vehicle 200, whether one or more lights of the vehicle 200 are activated, whether the windscreen wipers are activated, a steering wheel angle and a temperature of the surrounding area.
The processing means 120 will then determine a predicted time to collision based on the position and velocity of the object relative to the vehicle 200. If the predicted time to collision is below a threshold time, such that the object is within a threshold distance of the vehicle 200, the object will be determined to be a hazard.
The threshold distance may be determined based on the operating conditions of the vehicle 200, the threshold distance being the minimum distance the object can be from the vehicle 200 without a collision occurring. As such, as the threshold distance changes, the threshold time may change correspondingly. The threshold distance may be determined based on one or more of the trajectory or the predicted trajectory (as determined at step 520) of the vehicle 200, the speed of the vehicle 200, a predicted reaction time of the driver, and a predicted braking distance. In this respect, the predicted reaction time and braking distance may be determined based on whether one or more lights of the vehicle 200 are activated, whether the windscreen wipers are activated, and/or a temperature of the surrounding area. For example, if the operating conditions signal 165 comprises a signal indicating that the fog lights are on, this will indicate that the visibility of the driver may be reduced and thus their predicted reaction time may be greater than in clearer conditions. If the operating conditions signal 165 comprises a signal indicating that the windscreen wipers are on, or that the surrounding area has a temperature below 0 °C, this may indicate wet or icy conditions, and thus the predicted braking distance may be increased. The parameters of the threshold distance calculation may be weighted differently around the perimeter of the vehicle 200, such that the threshold distance may be smaller or larger in some directions. For example, the braking distance may have a zero or near-negligible weighting for calculating the threshold distance at the side of the vehicle 200 (since any braking will have little or no effect on side impacts), such that the threshold distance at the sides of the vehicle may be smaller than at the front or rear of the vehicle 200.
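The threshold-distance calculation and the step 530 time-to-collision comparison described above could be sketched as follows. The decomposition into reaction distance plus a weighted braking distance is an illustrative assumption consistent with the examples given, not a formula stated in the description:

```python
def threshold_distance(speed_mps, reaction_time_s, braking_distance_m,
                       braking_weight=1.0):
    """Minimum clearance before a collision is deemed unavoidable.

    Distance covered during the driver's predicted reaction time plus
    the (weighted) predicted braking distance. braking_weight lets the
    calculation be weighted differently around the vehicle's perimeter,
    e.g. near zero at the sides, where braking has little effect.
    """
    return speed_mps * reaction_time_s + braking_weight * braking_distance_m

def is_hazard(closing_speed_mps, distance_m, threshold_m):
    """Step 530 check: hazard if the predicted time to collision falls
    below the threshold time implied by the threshold distance."""
    if closing_speed_mps <= 0.0:   # not closing -> no predicted collision
        return False
    time_to_collision = distance_m / closing_speed_mps
    threshold_time = threshold_m / closing_speed_mps
    return time_to_collision <= threshold_time
```

Note that the threshold time here scales with the threshold distance at a given closing speed, matching the statement above that the two change correspondingly.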
At step 540, the controller 110 is configured to determine whether an object is a potential hazard based on the position of the object relative to the vehicle 200. In this respect, the object data 160 is processed to determine a position of the object relative to the vehicle 200. If the object is within a predetermined minimum distance of the vehicle 200, it will be determined to be a hazard. For example, the predetermined minimum distance may be 2 metres from the vehicle 200. It will be appreciated that the predetermined minimum distance may be adapted based on one or more operating conditions of the vehicle 200, such as the temperature of the surrounding area, which may be indicative of icy conditions.
Once an object has been determined to be a hazard, the controller 110 will then generate an audio signal for the respective object, as described with reference to step 320 above.
It will be appreciated that steps 520, 530 and 540 may be performed in any order, in sequence or in parallel.
Additionally, once an object has been identified as a potential hazard at one of the steps 510, 520, 530 and 540, the other steps may be dispensed with, and the method will proceed to step 320 shown in Figure 3 to generate an audio signal for the object.
It will be appreciated that various changes and modifications can be made to the present invention without departing from the scope of the present application.

Claims (14)

1. A control system for controlling a warning system of a vehicle, the control system comprising one or more processors collectively configured to: receive object data from one or more sensors of the vehicle, the object data comprising data indicative of one or more objects in a vicinity of the vehicle, wherein the object data at least comprises location and audio characteristics of the one or more objects; generate, in dependence on the object data, one or more audio signals, wherein an audio signal is generated for each of the one or more objects in the vicinity of the vehicle, the audio signal being generated in dependence on the audio characteristic of the respective object; and output, to one or more speakers of the vehicle, the one or more audio signals, wherein each generated audio signal is output to the one or more speakers in dependence on the location characteristic of the respective object.
2. A control system according to claim 1, wherein the audio characteristic of the one or more objects comprises at least one of a frequency characteristic and an amplitude characteristic.
3. A control system according to claim 1 or 2, wherein the one or more processors are collectively configured to determine the location characteristic of the one or more objects in dependence on a position of the one or more sensors on the vehicle.
4. A control system according to any preceding claim, wherein the one or more sensors comprise one or more microphones arranged at one or more positions on the vehicle.
5. A control system according to claim 4, wherein the one or more processors are collectively configured to: receive background audio data from at least one of the one or more microphones, the background audio data comprising data indicative of background noise in the vicinity of the vehicle; and generate, in dependence on the background audio data, one or more anti-noise signals, wherein the one or more audio signals are further generated in dependence on the one or more anti-noise signals.
6. A control system according to claim 5, wherein the one or more processors are collectively configured to generate the one or more audio signals by combining the one or more anti-noise signals with the audio characteristics of the respective object.
7. A control system according to any preceding claim, wherein the one or more processors are collectively configured to: determine whether the audio characteristic of an object is above a predetermined threshold; and generate an audio signal for the object in dependence on the determination.
8. A control system according to claim 7, wherein the one or more processors are collectively configured to generate the audio signal in dependence on the audio characteristic of the respective object if the audio characteristic is above the predetermined threshold.
9. A control system according to claim 7 or 8, wherein the one or more processors are collectively configured to generate an audio signal comprising a plurality of frequencies if the audio characteristic is below the predetermined threshold, wherein at least one frequency of the audio signal is generated in dependence on one or more further characteristics of the respective object.
10. A control system according to claim 9, wherein the one or more further characteristics of the respective object comprise at least one of a size of the object, a position of the object relative to the vehicle, an object type, and a velocity of the object relative to the vehicle.
11. A system comprising the control system of any preceding claim, one or more sensors of the vehicle and one or more speakers of the vehicle.
12. A vehicle comprising the system of claim 11 or the control system of any of claims 1 to 10.
13. A method for controlling a warning system of a vehicle, the method comprising: receiving object data from one or more sensors of the vehicle, the object data comprising data indicative of one or more objects in a vicinity of the vehicle, wherein the object data comprises location and audio characteristics of the one or more objects; generating, in dependence on the object data, one or more audio signals, wherein an audio signal is generated for each of the one or more objects in the vicinity of the vehicle, the audio signal being generated in dependence on the audio characteristic of the respective object; and outputting, to one or more speakers of the vehicle, the one or more audio signals, wherein each generated audio signal is output to the one or more speakers in dependence on the location characteristic of the respective object.
14. Computer readable instructions which, when executed by a computer, are arranged to perform a method according to claim 13.
GB2312665.9A 2023-08-18 2023-08-18 Vehicle warning system Pending GB2632707A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2312665.9A GB2632707A (en) 2023-08-18 2023-08-18 Vehicle warning system


Publications (2)

Publication Number Publication Date
GB202312665D0 GB202312665D0 (en) 2023-10-04
GB2632707A true GB2632707A (en) 2025-02-19

Family

ID=88189652


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6097285A (en) * 1999-03-26 2000-08-01 Lucent Technologies Inc. Automotive auditory feedback of changing conditions outside the vehicle cabin
US20030156019A1 (en) * 2002-02-19 2003-08-21 Continental Teves, Inc. Object detection system providing driver information through sound
WO2007049995A1 (en) * 2005-10-24 2007-05-03 Volvo Lastvagnar Ab Object detection system and method
US20090316939A1 (en) * 2008-06-20 2009-12-24 Denso Corporation Apparatus for stereophonic sound positioning
GB2534163A (en) * 2015-01-14 2016-07-20 Jaguar Land Rover Ltd Vehicle interface device
US20170096104A1 (en) * 2015-10-02 2017-04-06 Ford Global Technologies, Llc Potential hazard indicating system and method
GB2560230A (en) * 2017-01-03 2018-09-05 Ford Global Tech Llc Spatial auditory alerts for a vehicle
US10166999B1 (en) * 2018-02-12 2019-01-01 Ambarella, Inc. Surround sound based warning system
WO2021022227A1 (en) * 2019-07-31 2021-02-04 Karma Automotive Llc System and method for a combined visual and audible spatial warning system
WO2022073676A1 (en) * 2020-10-08 2022-04-14 Valeo Telematik Und Akustik Gmbh Method, apparatus, and computer-readable storage medium for providing three-dimensional stereo sound
US20220118997A1 (en) * 2020-10-20 2022-04-21 Aptiv Technologies Limited Notifying Hazards to a Driver of a Motor Vehicle


