
CN114067583A - Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle - Google Patents


Info

Publication number
CN114067583A
CN114067583A (application CN202010750836.9A)
Authority
CN
China
Prior art keywords
visibility
alarm
alert
distance
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010750836.9A
Other languages
Chinese (zh)
Inventor
唐帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG
Priority: CN202010750836.9A
Publication: CN114067583A
Legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/091 Traffic information broadcasting
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733 Systems involving transmission of highway information, e.g. weather, speed limits, where a selection of the information might take place
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits, where the system is characterised by the origin of the information transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H20/00 Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/53 Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers
    • H04H20/61 Arrangements specially adapted for local area broadcast, e.g. instore broadcast
    • H04H20/62 Arrangements specially adapted for local area broadcast for transportation systems, e.g. in vehicles
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract



Provided are a method and device for broadcasting an alarm, a method and device for prompting an alarm, a medium, and a vehicle. A method of broadcasting an alarm based on visibility in an environment includes: obtaining alarm information to be broadcast; determining the visibility in the environment; determining an alarm broadcast distance based on the visibility, wherein the alarm broadcast distance defines the distance range to which the alarm information is broadcast; and broadcasting the alarm information according to the alarm broadcast distance. The warning distance can thus be adaptively adjusted according to the visibility, increasing the effectiveness of the warning and improving driving safety.


Description

Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle
Technical Field
The present disclosure relates to the field of driving assistance for vehicles, and more particularly, to a method of broadcasting an alert based on visibility in an environment, an apparatus for broadcasting an alert based on visibility in an environment, a method of prompting a user for an alert based on visibility in an environment, an apparatus for prompting a user for an alert based on visibility in an environment, a vehicle, and a non-transitory computer-readable storage medium storing a program.
Background
At present, the Internet of Vehicles typically treats the vehicle as an information-sensing object and, with the help of information and communication technology, connects the vehicle to cloud platforms, to other vehicles, to the road infrastructure, to pedestrians, and to in-vehicle networks. This can raise the overall level of intelligent driving, provide users with a safe and intelligent driving experience and traffic services, and improve traffic efficiency. In this field, there are alert prompt functions for traffic information, vehicle information, road condition information, and the like; for example, a vehicle may be presented with warnings about a traffic jam ahead, an accident, a faulty vehicle, a vehicle traveling in the wrong direction, an illegally parked vehicle, a pedestrian on the road, an animal crossing the road, a slippery road surface, a collapsed road surface, road construction, an ambulance on the road, or cargo dropped on the road. The alert prompt functions include broadcasting an alert and receiving an alert. When broadcasting an alert, the vehicle's driving assistance system broadcasts the alert to other vehicles in the surrounding environment. When receiving an alert, the vehicle's driving assistance system receives the alert broadcast by other vehicles, traffic infrastructure, or base stations and prompts the driver with it. In both cases, a fixed and constant alert distance is typically set, so that the alert sender broadcasts the alert over a fixed coverage area and the alert receiver immediately prompts the user with the alert upon entering that fixed coverage area.
The apparatus or methods described in this section are not necessarily ones that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the devices or methods described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided a method of broadcasting an alert based on visibility in an environment, comprising: acquiring alarm information to be broadcasted; determining visibility in the environment; determining an alarm broadcast distance based on the visibility, wherein the alarm broadcast distance defines a distance range to which the alarm information is broadcast; and broadcasting the alarm information according to the alarm broadcasting distance.
According to another aspect of the present disclosure, there is provided an apparatus for broadcasting an alert based on visibility in an environment, comprising: a first acquisition unit configured to acquire alarm information to be broadcast; a first determination unit configured to determine visibility in the environment and determine an alarm broadcast distance based on the visibility, wherein the alarm broadcast distance defines a distance range to which the alarm information is broadcast; and a broadcasting unit configured to broadcast the alarm information according to the alarm broadcasting distance.
According to another aspect of the present disclosure, there is provided a method of prompting a user for an alert based on visibility in an environment, comprising: acquiring alarm information, wherein the alarm information comprises the alert and position data of an alert transmitter; determining visibility in the environment; determining an alert prompt distance based on the visibility, wherein the alert prompt distance defines a distance range of the user from the alert transmitter within which the alert is to be prompted to the user; and prompting the user for the alert depending on the position data and the alert prompt distance.
According to another aspect of the present disclosure, there is provided an apparatus for prompting a user for an alert based on visibility in an environment, comprising: a second acquisition unit configured to acquire alarm information, wherein the alarm information includes the alert and position data of an alert transmitter; a second determination unit configured to determine visibility in the environment and to determine an alert prompt distance based on the visibility, wherein the alert prompt distance defines a distance range of the user from the alert transmitter within which the user is to be prompted with the alert; and a prompting unit configured to prompt the user for the alert depending on the position data and the alert prompt distance.
According to another aspect of the present disclosure, there is provided a vehicle comprising the above-described means for broadcasting an alert based on visibility in an environment and/or means for prompting a user for an alert based on visibility in an environment.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing a program, the program comprising instructions which, when executed by one or more processors, cause the one or more processors to perform the above-described method of broadcasting an alert based on visibility in an environment and/or perform the above-described method of alerting a user based on visibility in an environment.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments and, together with the description, serve to explain the exemplary implementations of the embodiments. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements. The drawings are as follows:
FIG. 1 is a flow diagram of a method of broadcasting alerts based on visibility in an environment, according to an example embodiment;
FIG. 2 is an exemplary block diagram of the steps in the method of FIG. 1 to obtain alert information to be broadcast;
FIGS. 3A-3D are exemplary pre-defined functional graphs of alert broadcast distance with respect to visibility;
FIG. 4 is an exemplary flowchart of the step of determining an alert broadcast distance based on the visibility in the method of FIG. 1;
FIG. 5 is a block diagram of an apparatus for broadcasting alerts based on visibility in an environment, according to an example embodiment;
FIG. 6 is a flow diagram of a method of alerting a user based on visibility in an environment, according to an example embodiment;
FIG. 7 is an exemplary block diagram of the step of obtaining alert information in the method of FIG. 6;
FIG. 8 is an exemplary flowchart of the step of prompting the user for the alert in the method of FIG. 6;
FIG. 9 is another exemplary flow chart of the step of prompting the user for the alert in the method of FIG. 6;
FIG. 10 is a block diagram of an apparatus for alerting a user based on visibility in an environment, according to an example embodiment;
FIG. 11 is a schematic diagram of an application scenario in which the methods and functions described herein may be implemented.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items.
In the related art, a fixed and unchanging alert distance is generally set, so that an alert sender broadcasts an alert with fixed coverage and an alert receiver immediately prompts the user with the alert upon entering the fixed coverage area. This fixed alert distance does not take ambient visibility into account. Environmental visibility is critical to driving safety. Visibility can differ greatly under different lighting or weather conditions, and the driver's sensitivity to, and need for, warning information differs accordingly. For example, when visibility is high (e.g., on a sunny day during daytime hours), the driver's need for alerts is relatively small, and prompting an alert prematurely may be annoying to the driver. When visibility is low (e.g., in fog or during evening hours), the warning is more important for safe driving, and the driver should be presented with warning information as early as possible for safety reasons. Thus, it may be advantageous to take environmental visibility into account when adjusting the alert distance.
Embodiments of the present disclosure provide an alert method that takes environmental visibility into consideration. When broadcasting an alarm, an alarm broadcast distance is determined according to visibility in the current environment, and the alarm is broadcast only within the alarm broadcast distance. When receiving an alarm, an alert prompt distance is determined from visibility in the current environment, and the user is prompted with the alarm only when the user is closer to the alert transmitter's position than the alert prompt distance. Thus, the alarm range can be adaptively adjusted according to visibility in the current environment. This not only increases the effectiveness of the warning, but also reduces the possibility that the driver is distracted by receiving unnecessary warnings, thereby effectively increasing driving safety.
In the context of the present disclosure, the term "alarm" may be understood to include a notification, an alert, a warning, or other type of signal or information having a reminder or alarm function.
FIG. 1 shows a flow diagram of a method 100 of broadcasting an alert based on visibility in an environment, according to an example embodiment. As shown in fig. 1, the method 100 for broadcasting an alert based on visibility in an environment includes:
step S110, acquiring alarm information to be broadcasted;
step S120, determining visibility in the environment;
step S130, determining an alarm broadcasting distance based on the visibility;
step S140, broadcasting the alarm information according to the alarm broadcasting distance.
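The four steps above can be sketched as a simple pipeline. The following is a minimal illustration, not the patent's implementation: the four callables are hypothetical stand-ins for the acquisition, determination, and broadcasting units described below with reference to Fig. 5.

```python
def broadcast_alert_pipeline(get_alert, measure_visibility, distance_for, broadcast):
    """Hypothetical sketch of method 100 (steps S110-S140).

    get_alert, measure_visibility, distance_for, and broadcast are
    illustrative stand-ins for the units of the apparatus in Fig. 5.
    """
    alert = get_alert()                  # S110: acquire alarm information to broadcast
    visibility = measure_visibility()    # S120: determine visibility in the environment
    distance = distance_for(visibility)  # S130: map visibility to a broadcast distance
    broadcast(alert, distance)           # S140: broadcast within that distance range
    return distance
```

A caller would plug in, for example, a sensor-driven visibility measurement and a look-up-table-based distance function.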
To perform the method 100, a means for broadcasting an alert based on visibility in the environment (not shown) may be used, which may for example comprise: a processor, and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method 100. In some embodiments, the processor and memory may be embodied by, for example, controller 2130 and its associated memory in fig. 11.
Fig. 2 is an exemplary block diagram of step S110. Fig. 3A-3D are exemplary pre-defined functional graphs of alert broadcast distance versus visibility. Fig. 4 is an exemplary flowchart of step S130. Fig. 5 is a block diagram of an apparatus 500 that may implement the method 100 according to an example embodiment. As shown in fig. 5, the apparatus 500 includes a first obtaining unit 510, a first determining unit 520, and a broadcasting unit 530.
For illustrative purposes, the method 100 will be further explained below with the aid of fig. 1 to 4.
In step S110, the first acquisition unit 510 acquires alarm information to be broadcast. As shown in fig. 2, acquiring the alarm information to be broadcast may include, for example, acquiring (S112), in response to an input by a user, an alarm input by the user as the alarm information to be broadcast. The acquiring of the alarm information to be broadcast may further include generating (S114) an alarm corresponding to an emergency event as the alarm information to be broadcast in response to a sensor sensing signal indicating the emergency event. In the latter case, said generating respective alert information comprises at least one selected from the group consisting of: generating (S116) an accident alert as the alert information to be broadcast in response to the analysis of the image captured by the image sensor indicating the occurrence of a traffic accident; generating (S117) a malfunction alarm as the alarm information to be broadcast in response to the sensor sensing signal indicating occurrence of a vehicle malfunction; and generating (S118) a road condition alert as the alert information to be broadcast in response to the analysis of the image captured by the image sensor indicating the occurrence of a road surface anomaly.
According to some embodiments, the method 100 may be initiated by a user, for example, a user entering alert information to be broadcast on a broadcast alert device. Here, the alarm information may be any alarm that the user considers to be required to be received by other vehicles or devices in traffic, such as unstable vehicle conditions, sudden accidents in traffic, abnormal road conditions, and the like.
As an example, an "unstable vehicle condition" is a situation in which the user notices an abnormal vehicle sound (e.g., from a tire, the brakes, or the vehicle body) or an abnormal vehicle function (e.g., braking or steering) while the vehicle is running. A "sudden accident" is, for example, a situation in which the user learns that another vehicle in the current traffic is unstable, a multi-vehicle collision, congestion ahead, a vehicle traveling in the wrong direction, an illegally parked vehicle, a pedestrian on a highway, an animal crossing the road, road construction, an ambulance or fire truck passing, or a foreign object affecting driving on the lane (e.g., an animal, lost cargo, or a vehicle component). A "road surface abnormality" is, for example, a case in which the user finds that the road surface is depressed, collapsed, raised, or abnormally slippery due to contaminants (e.g., oil, water, or snow).
According to some embodiments, the method 100 may be triggered by the sensing of a signal by a sensor indicating an emergency event. The "emergency" may be any abnormal traffic condition, such as occurrence of a traffic accident, occurrence of a vehicle fault, abnormal driving of the vehicle, occurrence of an abnormal road condition, etc., which is obtained by analyzing the image acquired by the image sensor by the processor. Here, the image sensor may be, for example, a stand-alone device (e.g., a camera, a video camera, and any other type of camera), or may be, for example, an image capture module included in various types of electronic apparatuses (e.g., a car navigation apparatus, a base station apparatus, a roadside unit apparatus).
It can be seen that the alarm information to be broadcast may be not only an alarm input by the user based on his own judgment but also an alarm automatically generated according to various emergency events sensed by the sensors.
In step S120, the first determination unit 520 determines visibility in the environment. In some embodiments, this may be achieved by at least one selected from the group consisting of: acquiring image data of the environment collected by an image sensor and determining visibility in the environment based on the acquired image data; acquiring visibility data measured by a laser measurer as visibility in the environment; obtaining visibility data from a weather content provider as visibility in the environment; and obtaining visibility data input by a user as visibility in the environment.
In the present disclosure, the "visibility" or "visibility in the environment" refers to the visible distance of an object in the environment, i.e., the maximum distance that an observer can recognize the object from the background when observing the object. Factors that influence visibility in the environment are, for example, target optical properties, background optical properties, natural lighting, atmospheric transparency, etc.
According to some embodiments, the visibility in the environment can be determined by means of laser measurement, thereby enabling the accuracy and reliability of the measured visibility to be ensured.
According to some embodiments, visibility in the environment may be provided by a weather content provider or determined by user input. In the case of user input, the user may estimate visibility, for example by directly visually observing and estimating the distance of himself from a target object in his surroundings, and input the estimated visibility as visibility in the surroundings. Alternatively, the user may also obtain the current environmental visibility in advance and input it as the visibility in the environment, for example, the user may first obtain the current visibility information by listening to a weather report, by querying through a mobile device (e.g., a mobile phone, a tablet computer, a smart watch, a bracelet, or the like) or a vehicle-mounted device (e.g., a vehicle-mounted entertainment device), and then input it as the visibility in the environment.
According to some embodiments, visibility in the environment may be determined from analysis of environmental image data collected by an image sensor. The image sensor may, for example, be carried on a vehicle or on infrastructure (e.g., a roadside unit or a base station), or on a separate alarm transmitter that is itself mounted on a vehicle or on infrastructure. In this case, acquiring the image data of the environment includes: acquiring image data of the environment once; or acquiring image data of the environment multiple times at predetermined time intervals. As an example, after step S110 is executed, the image sensor acquires image data of the current environment once. Here, at least one image (e.g., multiple images from different perspectives) near the alarm point may be collected for visibility analysis. As another example, the image sensor acquires environment images repeatedly at predetermined time intervals, such as every 10 seconds, 30 seconds, 2 minutes, 10 minutes, or 20 minutes. The time interval may be predefined and fixed, or it may be set variably, for example according to the current environmental data. As an example, when the alarm information in step S110 is generated based on a sensor signal indicating a road surface abnormality, a larger time interval may be set (e.g., image data may be acquired every 20-30 minutes), since the road surface condition is unlikely to change in a short time. In contrast, when the alarm information in step S110 is generated based on a sensor signal indicating a traffic accident, a smaller time interval may be set (e.g., image data may be acquired every 1-5 minutes), since traffic accidents are generally cleared as soon as possible to restore traffic flow.
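The variable-interval policy described above can be captured in a small mapping. The alarm-type names and interval values below are illustrative assumptions based on the ranges suggested in the text, not values prescribed by the patent:

```python
# Hypothetical mapping from alarm type to image re-capture interval (seconds),
# reflecting the ranges suggested above; names and values are illustrative only.
RECAPTURE_INTERVAL_S = {
    "road_surface_anomaly": 20 * 60,  # road damage persists: re-image every ~20 min
    "traffic_accident": 2 * 60,       # accidents are cleared quickly: every ~2 min
}

def image_capture_interval(alarm_type: str, default_s: int = 5 * 60) -> int:
    """Return the variable time interval for re-acquiring environment images."""
    return RECAPTURE_INTERVAL_S.get(alarm_type, default_s)
```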
Further, as some embodiments, the determining visibility in the environment based on the acquired image data comprises: determining a target object in the image; extracting edge features of the target object; and determining the visibility based on the edge features. In an example, the edge features may include one or more of: gradient at the edge, contrast at the edge, and brightness at the edge.
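As an illustration of the edge-feature idea, the sketch below estimates visibility from edge contrast. The contrast measure and the linear contrast-to-visibility scaling are assumptions for illustration; the patent does not fix a specific formula:

```python
def edge_contrast(image):
    """Mean absolute horizontal intensity difference across a grayscale patch.

    A crude stand-in for the 'gradient/contrast at the edge' features named
    above (illustrative only; not the patent's formula).
    """
    diffs = [abs(row[x + 1] - row[x]) for row in image for x in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def estimate_visibility(image, clear_contrast, clear_visibility_m):
    """Assume edge contrast degrades roughly in proportion to atmospheric
    extinction: scale a known clear-day visibility by the contrast ratio."""
    ratio = min(edge_contrast(image) / clear_contrast, 1.0)
    return ratio * clear_visibility_m
```

In practice the target object's edge region would first be located in the image, and a calibrated model would replace the linear scaling.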
In step S130, the first determination unit 520 determines an alarm broadcast distance based on the visibility. In some embodiments, this may be achieved by one selected from the group consisting of: obtaining the alarm broadcast distance according to a predefined function of the alarm broadcast distance with respect to the visibility; and obtaining the alarm broadcast distance according to a predefined look-up table describing the correspondence between the visibility and the alarm broadcast distance.
Here, the "alarm broadcast distance" is defined as a distance range to which the alarm information is broadcast, i.e., the alarm information acquired in step S110 is broadcast only within the alarm broadcast distance. That is, the corresponding alarm receiving side (e.g., an alarm receiver mounted on another vehicle) can receive the alarm information only when the corresponding vehicle comes within the alarm broadcast distance.
In a first case of performing step S130, an alarm broadcast distance is obtained according to a predefined function of the alarm broadcast distance with respect to the visibility. An exemplary predefined functional relationship of alarm broadcast distance with respect to visibility is shown in fig. 3A-3D, but it should be understood that any predefined function capable of representing a relationship between alarm broadcast distance and visibility is contemplated within the scope of the present disclosure.
In some embodiments, the predefined function is, for example, a negative-correlation function, so that the alarm is broadcast over a shorter distance when visibility is greater. As an example, in clear weather (e.g., on sunny days), visibility in the environment is greater, so traffic participants only need to receive the warning information within a short distance from the warning point (i.e., a short alarm broadcast distance); otherwise, too many warning prompts may disturb driving and create a safety hazard. In contrast, in poor weather (e.g., fog), visibility in the environment is lower, so traffic participants need to obtain the warning information earlier, i.e., while the vehicle is still farther from the warning point, thereby increasing the effectiveness of the warning and improving driving safety.
Further, the negative correlation function is, for example, a linear function, as shown, for example, in fig. 3A. This can simplify the calculation of the alarm broadcast distance, thereby increasing the calculation speed. As an example, the predefined function may be:
R_B = k * L_v + b    (Formula 1)

In Formula 1, R_B denotes the alarm broadcast distance, L_v denotes the visibility in the environment, k is a negative number, and b is a natural number. It should be understood that Formula 1 is only a simple example of the predefined function. Other linear negative-correlation functions, non-linear negative-correlation functions (such as those shown in Figs. 3B, 3C, and 3D), positive-correlation functions, and any combination thereof are also conceivable, as long as they represent a functional relationship between the alarm broadcast distance and the visibility.
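A minimal sketch of Formula 1 follows. The values of k and b, and the clamping to a minimum and maximum broadcast distance, are illustrative choices, not values prescribed by the patent:

```python
def alarm_broadcast_distance(visibility_m, k=-1.5, b=350.0, r_min=50.0, r_max=300.0):
    """Formula 1, R_B = k * L_v + b, with k < 0 so the broadcast distance
    shrinks as visibility grows. k, b, r_min, and r_max are illustrative."""
    return max(r_min, min(r_max, k * visibility_m + b))
```

With these sample parameters, a visibility of 100 m yields a 200 m broadcast distance, and the distance is clamped so it never degenerates to zero in very clear conditions.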
In the second case of performing step S130, the alarm broadcast distance is obtained according to a predefined look-up table describing the correspondence between the visibility and the alarm broadcast distance. The predefined look-up table may be pre-stored in a corresponding storage module and retrieved when performing step S130. Fig. 4 shows an exemplary flowchart of step S130 using a predefined look-up table; step S130 may further comprise the following steps:
step S131: obtaining a first numerical value closest to the visibility from a first array of the predefined look-up table;
step S132: obtaining, based on the first numerical value, a second numerical value corresponding to the first numerical value in the second array of the predefined look-up table; and
step S133: the resulting second value is output as the alarm broadcast distance.
In steps S131 to S133, the first array is a set representing the reference visibilities (e.g., the second column in the table below), and the second array is a set representing the reference alarm broadcast distances (e.g., the third column in the table below).
Table 1. Exemplary predefined look-up table

No.    First array (visibility) / m    Second array (alarm broadcast distance) / m
1      10                              200
2      20                              170
3      30                              140
4      40                              110
5      50                              80
6      60                              50
...    ...                             ...
As an example, Table 1 represents a predefined look-up table of the correspondence between the visibility and the alarm broadcast distance. In Table 1, as the first array (representing visibility) increases, the second array (representing the alarm broadcast distance) decreases, since it is advantageous for the alarm broadcast distance to vary inversely with visibility; however, such a trend is not required, and other relationships between the alarm broadcast distance and the visibility are also conceivable. In addition, in Table 1 the values in the first array increase in steps of 10 m, and the values in the second array decrease in steps of 30 m. It should be understood that these step sizes are merely exemplary and may be adjusted depending on the application (e.g., the processor's computing power, the available memory, or the required accuracy).
Next, steps S131 to S133 are explained with reference to table 1 and fig. 4. Exemplarily, if it is determined in step S120 that the visibility in the environment is about 31 m, step S130 may be performed as follows: first, in step S131, the first value closest to the visibility, here 30 m (the value in row 3), is obtained from the first array (i.e., the visibility column) of table 1. Next, in step S132, based on the obtained first value of 30 m, the corresponding second value of 140 m (i.e., the value in the same row) is looked up in table 1. Finally, in step S133, the second value of 140 m is output as the alarm broadcast distance.
According to an embodiment, step S131 may be performed by an operation selected from the group consisting of: comparing values in the first array of the predefined look-up table with the visibility, and outputting a value as the first value when the difference between that value and the visibility is less than a threshold; and computing the differences between all values in the first array of the predefined look-up table and the visibility, obtaining the minimum difference, and outputting the value in the first array corresponding to the minimum difference as the first value.
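As an illustrative sketch (not the patent's implementation), the table lookup of steps S131 to S133 with the minimum-difference variant of step S131 may be coded as follows; the table values mirror Table 1, while the function and variable names are assumptions:

```python
# Reference arrays from Table 1 (visibility in metres -> broadcast distance in metres).
VISIBILITY = [10, 20, 30, 40, 50, 60]           # first array (reference visibility)
BROADCAST_DIST = [200, 170, 140, 110, 80, 50]   # second array (alarm broadcast distance)

def alarm_broadcast_distance(visibility_m: float) -> int:
    """Step S131: pick the first-array value with the minimum difference to the
    measured visibility; steps S132/S133: return the same-row second value."""
    diffs = [abs(v - visibility_m) for v in VISIBILITY]
    i = diffs.index(min(diffs))                 # index of the closest reference visibility
    return BROADCAST_DIST[i]

print(alarm_broadcast_distance(31))  # visibility ~31 m -> closest entry 30 m -> 140
```

The threshold variant of step S131 would differ only in returning the first entry whose difference falls below a fixed threshold rather than scanning for the minimum.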
In step S140, the broadcasting unit 530 broadcasts the alarm information according to the alarm broadcasting distance. In some embodiments, step S140 may include: the alert information is broadcast for a predetermined period of time. Alternatively or additionally, step S140 may comprise: and broadcasting alarm information until the interrupt signal is acquired. The interrupt signal may, for example, comprise one selected from the group consisting of: a user input indicating an interrupt; and a sensor sensing signal of the on-vehicle sensor indicating the elimination of the emergency event.
As an example of broadcasting the warning information for a predetermined period of time: after the alarm information (e.g., a vehicle failure alarm) is acquired in step S110 and the alarm broadcast distance (e.g., the 140 m determined above by means of table 1) is determined in step S130, a warning that the own vehicle has failed is issued in step S140, for the predetermined period of time, to other traffic participants (e.g., other vehicles) within the alarm broadcast distance (i.e., within a circle of radius 140 m centered on the alarm transmitter). Here, the predetermined period of time may be of any length, such as 5 minutes, 10 minutes, 30 minutes, or 2 hours. Advantageously, the predetermined period may be set or selected according to the alarm type: for an event that is easier to eliminate (e.g., a vehicle malfunction alarm or a vehicle crash accident alarm), a shorter period may be selected, and the broadcast of the alarm ends automatically once that period expires; conversely, for an event that is less easily eliminated (e.g., a sudden road-surface depression alarm), a longer period may be selected, with the broadcast likewise ending when that period expires.
As an example of broadcasting the alarm information until an interrupt signal is acquired, the alarm information is broadcast continuously in step S140 until either an interrupt signal input by the user or a sensor sensing signal from an in-vehicle sensor indicating that the emergency event has been eliminated is acquired. Advantageously, if the alarm information acquired in step S110 was input by the user, step S140 may wait for the user's interrupt signal to end the broadcast; if the alarm information was generated based on a sensor sensing an emergency event, step S140 may wait for the sensor to further sense the elimination of that event as the signal to end the broadcast.
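The two stopping strategies of step S140 (a predetermined period, or an interrupt signal) could be combined in a single loop along the following lines; all names, the polling interval, and the callback style are assumptions, not the patent's implementation:

```python
import time

def broadcast_alert(send, alert, period_s=None, interrupted=lambda: False,
                    interval_s=1.0):
    """Repeatedly transmit `alert` via the `send` callable until the
    predetermined period elapses (if `period_s` is given) or `interrupted()`
    returns True (e.g., user input or a sensor signal that the event ended)."""
    start = time.monotonic()
    while True:
        if interrupted():                       # interrupt signal acquired
            break
        if period_s is not None and time.monotonic() - start >= period_s:
            break                               # predetermined period ended
        send(alert)
        time.sleep(interval_s)
```

With no `period_s` and the default `interrupted`, the loop runs indefinitely, so a real system would always supply at least one stopping condition.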
It should be understood that in the present disclosure, "broadcasting" includes, but is not limited to, transmitting a signal (here, the alarm information) by means of the communication device 2140 (fig. 11) via, for example, a wireless local area network or Bluetooth, so that a corresponding signal receiving side (for example, an alarm receiving device) can receive the broadcast signal within the broadcasting range. It will also be understood that in some embodiments, the distance range to which the alarm information is broadcast (i.e., the "alarm broadcast distance") may be set by adjusting the transmit power of the transmitter of the communication device 2140, although the disclosure is not limited thereto.
FIG. 6 shows a flowchart of a method 600 of prompting a user for an alert based on visibility in an environment, according to an example embodiment. As seen in fig. 6, method 600 may, for example, comprise:
step S610: acquiring alarm information, wherein the alarm information comprises position data of an alarm transmitter and the alarm;
step S620: determining visibility in the environment;
step S630: determining an alert cue distance based on the visibility;
step S640: prompting a user for the alert depending on the location data and the alert prompt distance.
Method 600 may be implemented by means of a device (not shown) for alerting a user of an alert based on visibility in an environment, the device may include: a processor, and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method 600. In some embodiments, the processor and memory may be embodied by, for example, controller 2130 and its associated memory in fig. 11.
Fig. 7 shows an exemplary block diagram of step S610. Fig. 8 is an exemplary flowchart of step S640. Fig. 9 is another exemplary flowchart of step S640. Fig. 10 is a block diagram of an apparatus 1000 that may implement the method 600 according to an example embodiment. As shown in fig. 10, the apparatus 1000 includes a second acquiring unit 1010, a second determining unit 1020, and a prompting unit 1030.
For illustrative purposes, the method 600 will be further explained below with the aid of fig. 6 to 10.
In step S610, the second acquisition unit 1010 acquires alarm information, which may include position data of an alarm transmitter and an alarm. In some embodiments, this may be achieved by at least one selected from the group consisting of: acquiring (S612) position data of where the alarm transmitter is located, and acquiring (S614) an alarm issued by the alarm transmitter. Here, acquiring the position data of the alarm transmitter (e.g., coordinate values in a map) is advantageous because the distance between the alarm point and the user on the alarm receiving side (i.e., the host vehicle) can be further derived from it. In the present disclosure, an "alarm point" may be understood as the location (or vicinity) of the alarm transmitter, of the emergency event sensed by a sensor, or of the user who inputs alarm information on the alarm transmitting side. As shown in fig. 7, acquiring (S614) an alarm issued by an alarm transmitter may further include acquiring (S616) an alarm entered by a user on the alarm transmitter at the alarm transmitting side, and acquiring (S617) a corresponding alarm generated by the alarm transmitter in response to a sensor sensing signal indicating an emergency event.
In step S620, the second determination unit 1020 determines visibility in the environment.
According to an embodiment, step S620 may comprise an operation selected from the group consisting of: acquiring image data of the environment acquired by an image sensor, determining visibility in the environment based on the acquired image data; acquiring visibility data measured by a laser measurer as visibility in the environment; obtaining visibility data from a weather content provider as visibility in the environment; and acquiring visibility data input by a user, for example, on the alarm receiving side, as visibility in the environment.
According to an embodiment, the acquiring image data of the environment acquired by an image sensor comprises: acquiring image data of a surrounding environment at one time; or image data of the environment is acquired a plurality of times at predetermined time intervals.
According to an embodiment, said determining visibility in said environment based on said acquired image data comprises: determining a target object in the image; extracting edge features of the target object, wherein the edge features include at least one of the group consisting of: gradient at edge, contrast at edge and brightness at edge; and determining the visibility based on the edge features.
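As a toy illustration of the edge-feature idea (not the patent's algorithm): low atmospheric visibility blurs object edges, so the peak gradient magnitude of a grayscale image can serve as a crude edge-sharpness proxy; the linear mapping from edge strength to a visibility value below is a made-up assumption purely for demonstration:

```python
import numpy as np

def edge_gradient_strength(gray: np.ndarray) -> float:
    """Peak gradient magnitude of a grayscale image (edge-sharpness proxy)."""
    gy, gx = np.gradient(gray.astype(float))    # derivatives along rows and columns
    return float(np.max(np.hypot(gx, gy)))

def visibility_estimate(gray: np.ndarray, scale: float = 1.0) -> float:
    """Illustrative (assumed) linear mapping from edge strength to visibility."""
    return scale * edge_gradient_strength(gray)

# A hard edge (clear air) versus the same edge smeared out (fog):
sharp = np.zeros((8, 8)); sharp[:, 4:] = 255.0
hazy = np.tile(np.linspace(0.0, 255.0, 8), (8, 1))
assert edge_gradient_strength(sharp) > edge_gradient_strength(hazy)
```

A production estimator would instead be calibrated against reference visibility measurements and would combine gradient, contrast, and brightness features as the embodiment describes.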
In step S630, the second determination unit 1020 determines an alert presentation distance based on the visibility.
According to an embodiment, step S630 comprises one selected from the group consisting of: obtaining an alarm prompt distance according to a predefined function of the alarm prompt distance on the visibility; and obtaining the alarm prompt distance according to a predefined comparison table describing the corresponding relation between the visibility and the alarm prompt distance.
According to an embodiment, the predefined function comprises a negative correlation function. The negative correlation function may include a linear function and a non-linear function. As an example, the predefined function is, for example:
R_w = k * L_v + b    (formula 2)

In formula 2, R_w represents the alert cue distance, L_v represents the visibility in the environment, k is a negative number, and b is any natural number. It should be understood that formula 2 is only a simple example of the predefined function. Other linear negative correlation functions, non-linear negative correlation functions, positive correlation functions, and any combination of the above functions are also conceivable as the predefined function, as long as they represent a functional relationship between the alert cue distance and the visibility.
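A minimal sketch of formula 2, R_w = k * L_v + b, follows; the coefficients k and b and the non-negative floor are illustrative assumptions, not values given by the disclosure:

```python
def alert_cue_distance(visibility_m: float, k: float = -1.5, b: float = 320.0,
                       floor_m: float = 0.0) -> float:
    """Linear negatively-correlated mapping from visibility to cue distance.
    k is negative so the cue distance shrinks as visibility grows; the floor
    keeps the result non-negative for very high visibility (added assumption)."""
    return max(floor_m, k * visibility_m + b)

print(alert_cue_distance(100))  # -1.5 * 100 + 320 = 170.0
```

Any other monotone mapping mentioned in the text (non-linear, piecewise, etc.) could be swapped in behind the same interface.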
According to an embodiment, said obtaining an alert cue distance according to a predefined look-up table describing the correspondence between the visibility and the alert cue distance comprises: obtaining a first numerical value closest to the visibility from a first array of the predefined look-up table; obtaining, based on the first numerical value, a second numerical value corresponding to the first numerical value in a second array of the predefined look-up table; and outputting the resulting second value as the alert cue distance, wherein the first array is a set of reference visibility values and the second array is a set of reference alert cue distances.
Table 2: exemplary predefined look-up table

 Row | First array (visibility)/m | Second array (alert cue distance)/m
  1  |            10              |                 300
  2  |            25              |                 280
  3  |            40              |                 260
  4  |            55              |                 240
  5  |            70              |                 220
  6  |            85              |                 200
 ... |            ...             |                 ...
As an example, table 2 represents a predefined look-up table of the correspondence between the visibility and the alert cue distance. In table 2, as the first array (visibility) increases, the second array (alert cue distance) decreases, since it is advantageous for the alert cue distance to vary inversely with visibility; however, such a trend is not required, and other variations of the alert cue distance with respect to visibility are also contemplated. In addition, in table 2 the values in the first array increase in increments of 15 meters and the values in the second array decrease in decrements of 20 meters. It should be understood that these increments are merely exemplary and may be adjusted depending on the actual application.
According to an embodiment, said obtaining a first value from a first array of said predefined look-up table that is closest to said visibility comprises one selected from the group consisting of: selecting a value from the first array, wherein the difference value between the first value and the visibility is smaller than a threshold value, as the first value; and selecting a value with the minimum difference value with the visibility from the first array as the first value.
It should be understood that the definition of the steps (e.g., steps S610, S620, S630) or terms (e.g., alarm, emergency event, visibility, predefined function, predefined lookup table, etc.) involved in the method 600 may refer to the corresponding description in the method 100, and will not be described herein again.
In step S640, the prompting unit 1030 prompts the user for the alert depending on the position data and the alert prompt distance.
Fig. 8 and 9 show flowcharts of different examples of step S640 in method 600. As seen in fig. 8, step S640 may include: determining whether a user is approaching the location of the alert transmitter indicated by the location data; in response to determining that a user is approaching the location of the alert transmitter, prompting the alert to the user; and in response to determining that the user is not approaching the location of the alert transmitter, not prompting the user for the alert.
Here, it is advantageous to consider whether the user is approaching the alarm point. Typically, an alarm transmitter broadcasts an associated alarm within an alarm broadcast range (e.g., a circle centered at the alarm point with the alarm broadcast distance as its radius), i.e., all alarm receivers within that range are able to receive the alarm emitted by the alarm transmitter. However, the alarm information is of little or no use to the host vehicle when, for example, the vehicle has already driven past the alarm point (i.e., the alarm point lies behind on the vehicle's driving route), or when the alarm point is not on the vehicle's currently planned route. In these cases, it is particularly advantageous not to present the alarm information to the user, so as not to disturb the driver.
As seen in fig. 9, step S640 may further include: determining whether a distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance; in response to determining that the distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance, prompting the alert to the user; in response to determining that the distance between the location of the user and the location of the alert transmitter is not less than the alert prompt distance, not prompting the alert to the user.
Here, determining whether the distance between the position of the user and the position of the alert transmitter is less than the alert prompt distance may be understood as determining whether the user is within an alert prompt range centered on the alarm point with the alert prompt distance as its radius. Once it has been determined that the user is approaching the alarm point, it is advantageous to further determine whether the user is within the alert prompt range. Generally, an alert receiver mounted on a vehicle can receive an alert as soon as the vehicle enters the alarm broadcast range, but immediately notifying the driver at that moment is likely unnecessary and may interfere with driving. It is therefore advantageous and safer to prompt the driver with the alert information only once the vehicle enters the alert prompt range determined based on the visibility of the current environment.
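The two checks of figs. 8 and 9 can be sketched as a single predicate; the straight-line planar distance model, the coordinate representation, and all names are assumptions for illustration:

```python
import math

def should_prompt(user_pos, alarm_pos, approaching: bool,
                  cue_distance_m: float) -> bool:
    """Decide whether to prompt the alert. `user_pos` and `alarm_pos` are
    (x, y) coordinates in metres; `approaching` is the fig.-8 route check."""
    if not approaching:                     # fig. 8: alarm point behind the route
        return False
    d = math.dist(user_pos, alarm_pos)      # fig. 9: distance to the alarm point
    return d < cue_distance_m               # prompt only inside the cue range

assert should_prompt((0, 0), (0, 100), approaching=True, cue_distance_m=140)
assert not should_prompt((0, 0), (0, 100), approaching=False, cue_distance_m=140)
assert not should_prompt((0, 0), (0, 200), approaching=True, cue_distance_m=140)
```

In practice the `approaching` flag would come from comparing the alarm point against the vehicle's planned route rather than being passed in directly.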
Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results.
Additionally, while particular functionality is discussed above with reference to particular units, it should be noted that the functionality of the various units discussed herein can be divided into multiple units, and/or at least some of the functionality of multiple units can be combined into a single unit. Performing an action by a particular element discussed herein includes the particular element itself performing the action, or alternatively the particular element invoking or otherwise accessing another component or element that performs the action (or performs the action in conjunction with the particular element). Thus, a particular element that performs an action can include the particular element that performs the action itself and/or another element that performs the action that the particular element invokes or otherwise accesses.
More generally, various techniques may be described herein in the general context of software and hardware elements or program modules. The various elements described above with respect to fig. 5 and 10 may be implemented in hardware or in hardware in combination with software and/or firmware. For example, the units may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, these units may be implemented as hardware logic/circuits. For example, in some embodiments, one or more of the first acquiring unit 510, the first determining unit 520, the broadcasting unit 530, the second acquiring unit 1010, the second determining unit 1020, and the prompting unit 1030 may be implemented together in a system on chip (SoC). The SoC may include an integrated circuit chip including one or more components of a processor (e.g., a Central Processing Unit (CPU), microcontroller, microprocessor, Digital Signal Processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry, and may optionally execute received program code and/or include embedded firmware to perform functions.
Yet another aspect of the present disclosure provides a vehicle comprising the apparatus 500 for broadcasting an alert based on visibility in an environment and/or the apparatus 1000 for prompting a user for an alert based on visibility in an environment. An example of such a vehicle will be described below in conjunction with fig. 11.
Yet another aspect of the present disclosure provides a non-transitory computer-readable storage medium storing a program, the program comprising instructions that, when executed by one or more processors, cause the one or more processors to perform the method 100 of broadcasting an alert based on visibility in an environment and/or the method 600 of prompting a user for an alert based on visibility in an environment. Examples of such non-transitory computer readable storage media are described below in conjunction with fig. 11.
Fig. 11 shows a schematic diagram of one application scenario 2100 in which the methods and functions described herein may be implemented.
Motor vehicle 2010 may include sensor 2110 for sensing the surrounding environment. The sensors 2110 may include one or more of the following sensors: ultrasonic sensor, millimeter wave radar, laser radar, vision camera and infrared camera. Different sensors may provide different detection accuracies and ranges. The ultrasonic sensors can be arranged around the vehicle and used for measuring the distance between an object outside the vehicle and the vehicle by utilizing the characteristics of strong ultrasonic directionality and the like. The millimeter wave radar may be installed in front of, behind, or other positions of the vehicle for measuring the distance of an object outside the vehicle from the vehicle using the characteristics of electromagnetic waves. The lidar may be mounted in front of, behind, or otherwise of the vehicle for detecting object edges, shape information, and thus object identification and tracking. The radar apparatus can also measure a speed variation of the vehicle and the moving object due to the doppler effect. The camera may be mounted in front of, behind, or otherwise on the vehicle. The visual camera may capture conditions inside and outside the vehicle in real time and present to the driver and/or passengers. In addition, by analyzing the picture captured by the visual camera, information such as traffic light indication, intersection situation, other vehicle running state, and the like can be acquired. The infrared camera can capture objects under night vision conditions.
Motor vehicle 2010 may also include an output device 2120. The output device 2120 includes, for example, a display, a speaker, and the like to present various outputs or instructions. Furthermore, the display may be implemented as a touch screen, so that input may also be detected in different ways. A user graphical interface may be presented on the touch screen to enable a user to access and control the corresponding controls.
Motor vehicle 2010 may also include one or more controllers 2130. The controller 2130 may include a processor, such as a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or another special-purpose processor, that communicates with various types of computer-readable storage devices or media. A computer-readable storage device or medium may include any non-transitory device that stores data, and may include, but is not limited to, a magnetic disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, an optical disk or any other optical medium, a Read Only Memory (ROM), a Random Access Memory (RAM), a cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions, and/or code. Some of the data in the computer-readable storage device or medium represents executable instructions used by the controller 2130 to control the vehicle. The controller 2130 may include an autopilot system for automatically controlling various actuators in the vehicle. The autopilot system is configured to control the powertrain, steering system, and braking system of the motor vehicle 2010 to control acceleration, steering, and braking, respectively, via a plurality of actuators in response to inputs from the plurality of sensors 2110 or other input devices, without human intervention or with limited human intervention. Part of the processing functions of the controller 2130 may be implemented by cloud computing; for example, some processing may be performed by an onboard processor while other processing is performed using computing resources in the cloud.
Motor vehicle 2010 also includes communication device 2140. The communication device 2140 includes a satellite positioning module capable of receiving satellite positioning signals from the satellites 2012 and generating coordinates based on these signals. The communication device 2140 also includes modules to communicate with the mobile communication network 2013, which may implement any suitable communication technology, such as current or evolving wireless communication technologies (e.g., 5G technologies) like GSM/GPRS, CDMA, LTE, etc. The communication device 2140 may also have a Vehicle-to-Everything (V2X) module configured to enable, for example, Vehicle-to-Vehicle (V2V) communication with other vehicles 2011 and Vehicle-to-Infrastructure (V2I) communication with the outside world. In addition, the communication device 2140 may also have a module configured to communicate with the user terminal 2014 (including but not limited to a smartphone, a tablet computer, or a wearable device such as a watch), for example via a wireless local area network using IEEE 802.11 standards or via Bluetooth. With the communication device 2140, the motor vehicle 2010 can access, via a wireless communication system, an online server 2015 or a cloud server 2016 configured to provide respective data processing, data storage, and data transmission services for the motor vehicle.
In addition, the motor vehicle 2010 includes a power train, a steering system, a brake system, and the like, which are not shown in fig. 11, for implementing a motor vehicle driving function.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative and exemplary and not restrictive; the present disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps not listed, the indefinite article "a" or "an" does not exclude a plurality, and the term "a plurality" means two or more. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (15)

1. A method of broadcasting alerts based on visibility in an environment, comprising:
acquiring alarm information to be broadcasted;
determining visibility in the environment;
determining an alarm broadcast distance based on the visibility, wherein the alarm broadcast distance defines a distance range to which the alarm information is broadcast; and
broadcasting the alarm information according to the alarm broadcast distance.
2. The method of claim 1, wherein the obtaining alert information to be broadcast comprises at least one selected from the group consisting of:
acquiring an alarm input by a user as the alarm information to be broadcast in response to an input by the user; and
generating an alarm corresponding to an emergency event as the alarm information to be broadcast in response to a sensor sensing signal indicating the emergency event.
3. The method of claim 1, wherein the determining visibility in the environment comprises at least one selected from the group consisting of:
acquiring image data of the environment acquired by an image sensor, determining visibility in the environment based on the acquired image data;
acquiring visibility data measured by a laser measurer as visibility in the environment;
obtaining visibility data from a weather content provider as visibility in the environment; and
acquiring visibility data input by a user as visibility in the environment.
4. The method of claim 3, wherein said determining visibility in the environment based on the acquired image data comprises:
determining a target object in the image;
extracting edge features of the target object, wherein the edge features include at least one of the group consisting of: gradient at edge, contrast at edge and brightness at edge; and
determining the visibility based on the edge feature.
5. The method of any of claims 1-4, wherein the determining an alert broadcast distance comprises one selected from the group consisting of:
obtaining an alarm broadcasting distance according to a predefined function of the alarm broadcasting distance on the visibility; and
obtaining the alarm broadcasting distance according to a predefined look-up table describing the correspondence between the visibility and the alarm broadcasting distance.
6. An apparatus for broadcasting alerts based on visibility in an environment, comprising:
a first acquisition unit configured to acquire alarm information to be broadcast;
a first determination unit configured to determine visibility in the environment and determine an alarm broadcast distance based on the visibility, wherein the alarm broadcast distance defines a distance range to which the alarm information is broadcast; and
a broadcasting unit configured to broadcast the alarm information according to the alarm broadcasting distance.
7. A method of alerting a user of an alert based on visibility in an environment, comprising:
acquiring alarm information, wherein the alarm information comprises position data of an alarm transmitter and the alarm;
determining visibility in the environment;
determining an alert cue distance based on the visibility, wherein the alert cue distance defines a distance range of the user from the alert transmitter within which the alert is to be cued to the user; and
prompting the user for the alert depending on the location data and the alert prompt distance.
8. The method of claim 7, wherein the determining visibility in the environment comprises at least one selected from the group consisting of:
acquiring image data of the environment acquired by an image sensor, determining visibility in the environment based on the acquired image data;
acquiring visibility data measured by a laser measurer as visibility in the environment;
obtaining visibility data from a weather content provider as visibility in the environment; and
acquiring visibility data input by a user as visibility in the environment.
9. The method of claim 8, wherein said determining visibility in the environment based on the acquired image data comprises:
determining a target object in the image;
extracting edge features of the target object, wherein the edge features include at least one of the group consisting of: gradient at edge, contrast at edge and brightness at edge; and
determining the visibility based on the edge feature.
10. The method of claim 7, wherein said determining an alert cue distance based on said visibility comprises one selected from the group consisting of:
obtaining an alarm prompt distance according to a predefined function of the alarm prompt distance on the visibility; and
obtaining the alarm prompt distance according to a predefined look-up table describing the correspondence between the visibility and the alarm prompt distance.
11. The method of any of claims 7 to 10, wherein said prompting the user for the alert depending on the location data and the alert prompt distance comprises:
determining whether a user is approaching the location of the alert transmitter indicated by the location data; and
in response to determining that a user is not approaching the location of the alert transmitter indicated by the location data, not prompting the alert to the user.
12. The method of claim 11, further comprising:
in response to determining that a user is approaching the location of the alert transmitter indicated by the location data, determining whether a distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance;
in response to determining that the distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance, prompting the alert to the user; and
in response to determining that the distance between the location of the user and the location of the alert transmitter is not less than the alert prompt distance, not prompting the alert to the user.
13. An apparatus for alerting a user of an alert based on visibility in an environment, comprising:
a second acquisition unit configured to acquire alarm information, wherein the alarm information includes position data of an alarm transmitter and the alarm;
a second determination unit configured to determine visibility in the environment and to determine an alert cue distance based on the visibility, wherein the alert cue distance defines a distance range of the user from the alert transmitter within which the user is to be alerted to the alert; and
a prompt unit configured to prompt the user for the alert depending on the location data and the alert prompt distance.
14. A vehicle, comprising: the apparatus of claim 6 and/or the apparatus of claim 13.
15. A non-transitory computer-readable storage medium storing a program, the program comprising instructions that, when executed by one or more processors, cause the one or more processors to perform the method of any of claims 1-5 or perform the method of any of claims 7-12.
CN202010750836.9A 2020-07-30 2020-07-30 Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle Pending CN114067583A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010750836.9A CN114067583A (en) 2020-07-30 2020-07-30 Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle

Publications (1)

Publication Number Publication Date
CN114067583A true CN114067583A (en) 2022-02-18

Family

ID=80227130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010750836.9A Pending CN114067583A (en) 2020-07-30 2020-07-30 Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle

Country Status (1)

Country Link
CN (1) CN114067583A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007050375A1 (en) * 2007-10-22 2009-04-23 Daimler Ag Weather information e.g. rain information, utilizing method for vehicle i.e. car, involves utilizing traffic patterns for searching target information of drivers, where patterns are utilized for providing suggestions to drivers
KR20170037761A (en) * 2015-09-25 2017-04-05 박희성 Apparatus and system for alerting vehicle's accident conditions using wireless communications
CN109561394A (en) * 2018-11-16 2019-04-02 维沃移动通信有限公司 A kind of warning message broadcasting method and terminal
CN110349404A (en) * 2018-04-03 2019-10-18 百度(美国)有限责任公司 Operate method, system and the machine readable media of automatic driving vehicle and warning service
CN110349422A (en) * 2019-08-19 2019-10-18 深圳成谷科技有限公司 A kind of method, device and equipment of road weather warning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Shi Yan et al.: "Mobility Management Technology in Wireless Ubiquitous Networks", 28 February 2017, Beijing University of Posts and Telecommunications Press, page 144 *
Shi Yuli et al.: "Atmospheric Visibility Estimation Method Based on Image Features", Journal of Nanjing University of Science and Technology, 30 October 2018 (2018-10-30), pages 1-3 *
Chen Wenbing et al.: "Visibility Calculation Method Based on Image Edges", Microcomputer Applications, 20 April 2009 (2009-04-20), pages 1-3 *

Similar Documents

Publication Publication Date Title
US11295143B2 (en) Information processing apparatus, information processing method, and program
US9786171B2 (en) Systems and methods for detecting and distributing hazard data by a vehicle
US20150307131A1 (en) Autonomous Driving in a Hazard Situation
US20190143967A1 (en) Method and system for collaborative sensing for updating dynamic map layers
US20130120158A1 (en) False Event Suppression for Collision Avoidance Systems
JP6834860B2 (en) Collision prevention device, collision prevention method, collision prevention program, recording medium
US10909848B2 (en) Driving assistance device
CN111319628A (en) Method and system for evaluating false threat detection
CN112534487B (en) Information processing apparatus, moving body, information processing method, and program
US11600076B2 (en) Detection of a hazardous situation in road traffic
US11062603B2 (en) Object detection device for vehicle and object detection system for vehicle
CN116312045A (en) Vehicle danger early warning method and device
CN110271554B (en) Driver assistance system and method for vehicle
EP2797027A1 (en) A vehicle driver alert arrangement, a vehicle and a method for alerting a vehicle driver
US20240157961A1 (en) Vehicle system and storage medium
CN210617998U (en) Blind area detection equipment for freight transport and passenger transport vehicles
KR20180130201A (en) Apparatus and method of support safe driving considering rear vehicle
CN114067583A (en) Alarm broadcasting method and device, alarm prompting method and device, medium and vehicle
KR20150042479A (en) Rear alarm device of depending on relative velocity using a rear sensor, and the method of thereof
CN115643547A (en) Whistling prompting method and device, vehicle and storage medium
JP2019212186A (en) Drive assist device
CN115366900A (en) Vehicle fault detection method and device, vehicle and storage medium
CN118215612A (en) Information processing method and device
JP2012103849A (en) Information provision device
JP2022056153A (en) Temporary stop detection device, temporary stop detection system, and temporary stop detection program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination