Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items.
In the related art, a fixed, unchanging alert distance is generally set: an alert sender broadcasts an alert with fixed coverage, and an alert receiver immediately prompts the user with the alert upon entering that coverage. Such a fixed alert distance does not take environmental visibility into account. Environmental visibility, however, is critical to driving safety. Visibility can differ greatly under different lighting or weather conditions, and the driver's sensitivity to and need for warning information differ accordingly. For example, when visibility is high (e.g., on sunny days and during daytime hours), the driver's need for alerts is relatively small, and prompting an alert prematurely may annoy the driver. When visibility is low (e.g., in fog or at dusk), the warning is more important for safe driving, and for safety reasons the driver should be presented with the warning information as early as possible. It may therefore be advantageous to treat environmental visibility as a factor for adjusting the alert distance.
Embodiments of the present disclosure provide an alert method that takes environmental visibility into account. When broadcasting an alarm, an alarm broadcast distance is determined according to the visibility in the current environment, and the alarm is broadcast only within that distance. When receiving an alarm, an alarm prompt distance is determined according to the visibility in the current environment, and the alarm is prompted to the user only when the user is closer to the alarm transmission position than the alarm prompt distance. The alarm range can thus be adaptively adjusted according to the visibility in the current environment. This not only increases the effectiveness of the warning but also reduces the chance that the driver is distracted by unnecessary warnings, thereby effectively improving driving safety.
In the context of the present disclosure, the term "alarm" may be understood to include a notification, an alert, a warning, or other type of signal or information having a reminder or alarm function.
FIG. 1 shows a flow diagram of a method 100 of broadcasting an alert based on visibility in an environment, according to an example embodiment. As shown in fig. 1, the method 100 for broadcasting an alert based on visibility in an environment includes:
step S110, acquiring alarm information to be broadcasted;
step S120, determining visibility in the environment;
step S130, determining an alarm broadcasting distance based on the visibility;
step S140, broadcasting the alarm information according to the alarm broadcasting distance.
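As an illustrative, non-limiting sketch, these four steps could be wired together as follows in Python; every name and placeholder value below is hypothetical, since the disclosure leaves the concrete implementation of the units (described later with reference to fig. 5) open:

```python
# A minimal runnable sketch of method 100; all names and the placeholder
# return values are hypothetical, not part of the disclosure.

def acquire_alarm_information() -> str:                 # step S110
    return "vehicle malfunction"                        # e.g., user input or sensor-generated

def determine_visibility_m() -> float:                  # step S120
    return 30.0                                         # e.g., from image analysis or a laser measurer

def alarm_broadcast_distance_m(visibility_m: float) -> float:   # step S130
    k, b = -3.0, 230.0                                  # hypothetical negative-correlation parameters
    return max(0.0, k * visibility_m + b)

def broadcast(alarm: str, range_m: float) -> None:      # step S140
    print(f"broadcasting {alarm!r} within {range_m:.0f} m")

if __name__ == "__main__":
    broadcast(acquire_alarm_information(),
              alarm_broadcast_distance_m(determine_visibility_m()))
```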
To perform the method 100, a means for broadcasting an alert based on visibility in the environment (not shown) may be used, which may for example comprise: a processor, and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method 100. In some embodiments, the processor and memory may be embodied by, for example, controller 2130 and its associated memory in fig. 11.
Fig. 2 is an exemplary block diagram of step S110. Fig. 3A-3D are exemplary pre-defined functional graphs of alert broadcast distance versus visibility. Fig. 4 is an exemplary flowchart of step S130. Fig. 5 is a block diagram of an apparatus 500 that may implement the method 100 according to an example embodiment. As shown in fig. 5, the apparatus 500 includes a first obtaining unit 510, a first determining unit 520, and a broadcasting unit 530.
For illustrative purposes, the method 100 will be further explained below with the aid of fig. 1 to 4.
In step S110, the first acquisition unit 510 acquires alarm information to be broadcast. As shown in fig. 2, acquiring the alarm information to be broadcast may include, for example, acquiring (S112), in response to an input by a user, an alarm input by the user as the alarm information to be broadcast. Acquiring the alarm information to be broadcast may further include generating (S114), in response to a sensor sensing signal indicating an emergency event, an alarm corresponding to the emergency event as the alarm information to be broadcast. In the latter case, generating the corresponding alarm information includes at least one selected from the group consisting of: generating (S116) an accident alarm as the alarm information to be broadcast in response to an analysis of an image captured by an image sensor indicating the occurrence of a traffic accident; generating (S117) a malfunction alarm as the alarm information to be broadcast in response to a sensor sensing signal indicating the occurrence of a vehicle malfunction; and generating (S118) a road condition alarm as the alarm information to be broadcast in response to an analysis of an image captured by an image sensor indicating the occurrence of a road surface anomaly.
According to some embodiments, the method 100 may be initiated by a user, for example, a user entering alert information to be broadcast on a broadcast alert device. Here, the alarm information may be any alarm that the user considers to be required to be received by other vehicles or devices in traffic, such as unstable vehicle conditions, sudden accidents in traffic, abnormal road conditions, and the like.
As an example, the "unstable vehicle condition" is, for example, a user finding that a vehicle abnormal sound (for example, a tire abnormal sound, a brake abnormal sound, a vehicle body abnormal sound, or the like), a vehicle function (for example, a brake function, a steering function, or the like) is abnormal, or the like while the vehicle is running. The "sudden accident" is, for example, a situation in which the user knows that another vehicle is unstable in the current traffic, a collision accident of multiple vehicles, a forward congestion, a vehicle traveling in the wrong direction, a vehicle parked in a peck, a pedestrian at a high speed, an animal crossing road, a road construction, an ambulance or a fire truck passing, a foreign object (for example, an animal, a lost cargo, or a vehicle component) affecting driving on a driving lane, or the like. The "road surface abnormality" is, for example, a case where a user finds out that a road surface is depressed, a road surface is collapsed, a road surface is raised, a road surface is abnormally slippery due to contaminants (e.g., oil, water, snow cover) or the like.
According to some embodiments, the method 100 may be triggered by a sensor sensing signal indicating an emergency event. The "emergency event" may be any abnormal traffic condition obtained by the processor analyzing the image acquired by the image sensor, such as the occurrence of a traffic accident, the occurrence of a vehicle fault, abnormal driving of a vehicle, the occurrence of an abnormal road condition, and the like. Here, the image sensor may be, for example, a stand-alone device (e.g., a camera, a video camera, or any other type of camera), or may be, for example, an image capture module included in various types of electronic apparatuses (e.g., a car navigation apparatus, a base station apparatus, or a roadside unit apparatus).
It can be seen that the alarm information to be broadcast may be not only an alarm input by the user based on his own judgment but also an alarm automatically generated according to various emergency events sensed by the sensors.
In step S120, the first determination unit 520 determines visibility in the environment. In some embodiments, this may be achieved by at least one selected from the group consisting of: acquiring image data of the environment collected by an image sensor and determining visibility in the environment based on the acquired image data; acquiring visibility data measured by a laser measurer as the visibility in the environment; obtaining visibility data from a weather content provider as the visibility in the environment; and obtaining visibility data input by a user as the visibility in the environment.
In the present disclosure, the "visibility" or "visibility in the environment" refers to the visible distance of an object in the environment, i.e., the maximum distance that an observer can recognize the object from the background when observing the object. Factors that influence visibility in the environment are, for example, target optical properties, background optical properties, natural lighting, atmospheric transparency, etc.
According to some embodiments, the visibility in the environment can be determined by means of laser measurement, thereby enabling the accuracy and reliability of the measured visibility to be ensured.
According to some embodiments, visibility in the environment may be provided by a weather content provider or determined by user input. In the case of user input, the user may estimate visibility, for example by directly visually observing and estimating the distance of himself from a target object in his surroundings, and input the estimated visibility as visibility in the surroundings. Alternatively, the user may also obtain the current environmental visibility in advance and input it as the visibility in the environment, for example, the user may first obtain the current visibility information by listening to a weather report, by querying through a mobile device (e.g., a mobile phone, a tablet computer, a smart watch, a bracelet, or the like) or a vehicle-mounted device (e.g., a vehicle-mounted entertainment device), and then input it as the visibility in the environment.
According to some embodiments, visibility in the environment may be determined from an analysis of environmental image data collected by an image sensor. The image sensor may, for example, be carried on a vehicle or on an infrastructure (e.g., a roadside unit device or a base station), or it may be carried on a separate alarm transmitter, which in turn may be mounted on a vehicle or on an infrastructure. In this case, acquiring the image data of the environment collected by the image sensor includes: acquiring image data of the environment a single time; or acquiring image data of the environment multiple times at predetermined time intervals. As an example, after step S110 is executed, the image sensor acquires image data of the current environment a single time. Here, at least one image (e.g., multiple images from different perspectives) near the alarm point may be collected for the visibility analysis of the environment. As another example, the image sensor acquires environment images multiple times at predetermined time intervals, such as every 10 seconds, every 30 seconds, every 2 minutes, every 10 minutes, or every 20 minutes. The time interval may, for example, be predefined and fixed. Alternatively, the time interval may be set variably, for example according to the current environmental data. As an example, when the alarm information in step S110 is generated based on a sensor sensing signal indicating a road surface abnormality, a larger time interval may be set (e.g., image data of the environment may be acquired every 20-30 minutes), considering that the road surface condition is unlikely to change in a short time. In contrast, when the alarm information in step S110 is generated based on a sensor sensing signal indicating the occurrence of a traffic accident, a smaller time interval may be set (e.g., image data of the environment may be acquired every 1-5 minutes), considering that a traffic accident is generally cleared as soon as possible in order to restore traffic flow.
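A small sketch of this variable-interval idea follows; the alarm-type strings and the fallback value are assumptions, while the interval magnitudes mirror the examples above:

```python
def image_sampling_interval_s(alarm_type: str) -> int:
    """Return how often (in seconds) to re-acquire environment images.

    Slowly changing conditions (a road surface anomaly) are sampled rarely;
    quickly changing ones (a traffic accident) are sampled often, mirroring
    the 20-30 minute and 1-5 minute examples above. The type names and the
    default interval are hypothetical.
    """
    intervals = {
        "road_surface_anomaly": 25 * 60,   # ~20-30 minutes
        "traffic_accident": 3 * 60,        # ~1-5 minutes
    }
    return intervals.get(alarm_type, 30)   # default: every 30 seconds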
Further, as some embodiments, the determining visibility in the environment based on the acquired image data comprises: determining a target object in the image; extracting edge features of the target object; and determining the visibility based on the edge features. In an example, the edge features may include one or more of: gradient at the edge, contrast at the edge, and brightness at the edge.
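One way such edge features could feed a visibility estimate is sketched below with NumPy; the region-of-interest input, the feature combination, and the linear calibration to meters are all assumptions, since the disclosure does not fix a concrete mapping:

```python
import numpy as np

def estimate_visibility_m(gray: np.ndarray, target_box: tuple) -> float:
    """Estimate visibility from edge features of a known target object.

    gray: 2-D grayscale image; target_box: (row0, row1, col0, col1) locating
    the target object. Weaker edge gradients and lower contrast suggest haze,
    i.e., lower visibility. The linear calibration below is a hypothetical
    placeholder, not a value from the disclosure.
    """
    r0, r1, c0, c1 = target_box
    patch = gray[r0:r1, c0:c1].astype(float)
    gy, gx = np.gradient(patch)                    # gradient at the edges
    edge_strength = np.hypot(gx, gy).mean()
    # Michelson-style contrast of the target patch against its darkest region.
    contrast = (patch.max() - patch.min()) / max(patch.max() + patch.min(), 1e-6)
    # Hypothetical calibration: stronger edges / higher contrast -> farther visibility.
    return float(50.0 + 400.0 * contrast + 2.0 * edge_strength)
```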
In step S130, the first determination unit 520 determines an alarm broadcast distance based on the visibility. In some embodiments, this may be achieved by one selected from the group consisting of: obtaining the alarm broadcast distance according to a predefined function of the alarm broadcast distance with respect to the visibility; and obtaining the alarm broadcast distance according to a predefined look-up table describing the correspondence between the visibility and the alarm broadcast distance.
Here, the "alarm broadcast distance" is defined as a distance range to which the alarm information is broadcast, i.e., the alarm information acquired in step S110 is broadcast only within the alarm broadcast distance. That is, the corresponding alarm receiving side (e.g., an alarm receiver mounted on another vehicle) can receive the alarm information only when the corresponding vehicle comes within the alarm broadcast distance.
In a first case of performing step S130, an alarm broadcast distance is obtained according to a predefined function of the alarm broadcast distance with respect to the visibility. An exemplary predefined functional relationship of alarm broadcast distance with respect to visibility is shown in fig. 3A-3D, but it should be understood that any predefined function capable of representing a relationship between alarm broadcast distance and visibility is contemplated within the scope of the present disclosure.
In some embodiments, the predefined function is, for example, a negative-correlation function, so that the alarm is advantageously broadcast over a shorter distance when the visibility is greater. As an example, when weather conditions are clearer (e.g., on sunny days), visibility in the environment is greater, and traffic participants only need to receive the warning information within a short distance from the alarm point (i.e., a short alarm broadcast distance); otherwise, too many warning indications may disturb driving and create a safety hazard. In contrast, when weather conditions are poorer (e.g., in fog), visibility in the environment is lower, and traffic participants need to obtain the warning information earlier, i.e., receive it when the vehicle is still farther from the alarm point, thereby increasing the effectiveness of the warning and improving driving safety.
Further, the negative-correlation function is, for example, a linear function, as shown in fig. 3A. This can simplify the calculation of the alarm broadcast distance, thereby increasing the calculation speed. As an example, the predefined function may be:
R_B = k · L_v + b    (Formula 1)

In Formula 1, R_B denotes the alarm broadcast distance, L_v denotes the visibility in the environment, k is a negative number, and b is a natural number. It should be understood that Formula 1 is only a simple example of the predefined function. Other linear negative-correlation functions, non-linear negative-correlation functions (such as those shown in figs. 3B, 3C, and 3D), positive-correlation functions, and any combination thereof are also conceivable as the predefined function, as long as they represent a functional relationship between the alarm broadcast distance and the visibility.
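As an illustrative, non-limiting sketch, the hypothetical parameters k = -3 and b = 230 happen to reproduce the values of Table 1 below (e.g., L_v = 30 m gives R_B = 140 m, and L_v = 60 m gives R_B = 50 m); a practical implementation might also clamp the result to a sensible range:

```python
def alarm_broadcast_distance_m(visibility_m: float,
                               k: float = -3.0, b: float = 230.0,
                               min_m: float = 0.0, max_m: float = 500.0) -> float:
    """Formula 1: R_B = k * L_v + b with k < 0 (negative correlation).

    k, b, and the clamping bounds are hypothetical example values; the
    disclosure only requires some functional relation between the alarm
    broadcast distance and the visibility.
    """
    return min(max_m, max(min_m, k * visibility_m + b))

print(alarm_broadcast_distance_m(30.0))  # -> 140.0
print(alarm_broadcast_distance_m(60.0))  # -> 50.0
```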
In the second case of performing step S130, the alarm broadcast distance is obtained according to a predefined look-up table describing the correspondence between the visibility and the alarm broadcast distance. The predefined look-up table may be pre-stored in a corresponding storage module and retrieved and used when performing step S130. Fig. 4 shows an exemplary flowchart of step S130 by means of a predefined look-up table; here, step S130 may further comprise the following steps:
step S131: obtaining a first value closest to the visibility from a first array of the predefined look-up table;
step S132: obtaining, based on the first value, a second value corresponding to the first value from a second array of the predefined look-up table; and
step S133: outputting the obtained second value as the alarm broadcast distance.
In steps S131 to S133, the first array is a set of reference visibility values (e.g., the second column of Table 1 below), and the second array is a set of reference alarm broadcast distances (e.g., the third column of Table 1 below).
Table 1: Exemplary predefined look-up table

| No. | First array (visibility) / m | Second array (alarm broadcast distance) / m |
| --- | --- | --- |
| 1 | 10 | 200 |
| 2 | 20 | 170 |
| 3 | 30 | 140 |
| 4 | 40 | 110 |
| 5 | 50 | 80 |
| 6 | 60 | 50 |
| ... | ... | ... |
As an example, Table 1 represents a predefined look-up table of the correspondence between visibility and the alarm broadcast distance. In Table 1, as the first array (visibility) increases, the second array (alarm broadcast distance) decreases, since it is advantageous for the alarm broadcast distance to vary inversely with visibility; however, this trend is not mandatory, and other variations of the alarm broadcast distance with respect to visibility are also contemplated. In addition, in Table 1 the values in the first array increase in steps of 10 meters and the values in the second array decrease in steps of 30 meters. It should be understood that these step sizes are merely exemplary and may be adjusted depending on the actual application (e.g., the computational power of the processor, the available memory, and the required accuracy).
Next, steps S131 to S133 are explained with reference to Table 1 and fig. 4. Illustratively, if it is determined in step S120 that the visibility in the environment is, for example, about 31 m, step S130 may proceed as follows. First, in step S131, the first value closest to the visibility, namely 30 m (row 3), is obtained from the first array (i.e., the second column) of Table 1. Next, in step S132, based on the obtained first value of 30 m, the corresponding second value of 140 m (in the same row) is looked up in Table 1. Finally, in step S133, the second value of 140 m is output as the alarm broadcast distance.
According to an embodiment, step S131 may be performed by an operation selected from the group consisting of: comparing the values in the first array of the predefined look-up table with the visibility one by one, and outputting a value as the first value as soon as its difference from the visibility is less than a threshold; and comparing all values in the first array of the predefined look-up table with the visibility, computing the difference between each value and the visibility to obtain the minimum difference, and outputting the value in the first array corresponding to the minimum difference as the first value.
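As a minimal, non-limiting sketch, steps S131-S133 and both nearest-value strategies just described could look as follows (the arrays copy Table 1; the function names and the 5 m threshold are illustrative assumptions):

```python
# Steps S131-S133 over Table 1.
VISIBILITY_M = [10, 20, 30, 40, 50, 60]        # first array (reference visibility)
BROADCAST_M = [200, 170, 140, 110, 80, 50]     # second array (reference broadcast distance)

def broadcast_distance_min_diff(visibility_m: float) -> float:
    """S131, second variant: compare all values and take the minimum difference."""
    i = min(range(len(VISIBILITY_M)),
            key=lambda j: abs(VISIBILITY_M[j] - visibility_m))   # S131
    return BROADCAST_M[i]                                        # S132 + S133

def broadcast_distance_threshold(visibility_m: float, threshold_m: float = 5.0) -> float:
    """S131, first variant: scan until a value within the threshold is found."""
    for v, d in zip(VISIBILITY_M, BROADCAST_M):
        if abs(v - visibility_m) < threshold_m:
            return d
    return broadcast_distance_min_diff(visibility_m)   # hypothetical fallback

print(broadcast_distance_min_diff(31))   # -> 140, as in the worked example above
```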
In step S140, the broadcasting unit 530 broadcasts the alarm information according to the alarm broadcast distance. In some embodiments, step S140 may include: broadcasting the alarm information for a predetermined period of time. Alternatively or additionally, step S140 may include: broadcasting the alarm information until an interrupt signal is acquired. The interrupt signal may, for example, comprise one selected from the group consisting of: a user input indicating an interrupt; and a sensing signal of an on-vehicle sensor indicating that the emergency event has been eliminated.
As an example of broadcasting the warning information for a predetermined period of time, after the alarm information (e.g., a vehicle failure alarm) is acquired in step S110 and the alarm broadcast distance (e.g., the 140 m determined above by means of Table 1) is determined in step S130, an alarm (i.e., a warning that the host vehicle has failed) is issued in step S140 for a predetermined period of time to other traffic participants (e.g., other vehicles) within the alarm broadcast distance (i.e., within a circle of radius 140 m centered on the alarm transmitter). Here, the predetermined period of time may be of any length, such as 5 minutes, 10 minutes, 30 minutes, or 2 hours. Advantageously, the predetermined period of time may be set or selected according to the type of the alarm: when the alarm concerns an event that is easier to eliminate (e.g., a vehicle malfunction alarm or a vehicle collision accident alarm), a shorter period of time may be set or selected, and the broadcast of the alarm ends automatically after the predetermined period of time has elapsed. In contrast, when the alarm concerns an event that is less easily eliminated (e.g., a sudden road surface depression alarm), a longer period of time may be set or selected, and the broadcast of the alarm ends after that predetermined period of time has elapsed.
As an example of broadcasting the alarm information until an interrupt signal is acquired, the alarm information is broadcast continuously in step S140 until a user-input interrupt signal or a sensing signal of an in-vehicle sensor indicating that the emergency event has been eliminated is acquired. Advantageously, when the alarm information acquired in step S110 was input by the user, step S140 may wait for the user's interrupt signal to end the broadcast. When the alarm information acquired in step S110 was generated based on a sensor sensing an emergency event, step S140 may wait for the sensor to further sense the elimination of the emergency event as the signal to end the broadcast.
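A non-limiting sketch of these two termination modes of step S140 follows; the `send` transport call and the `interrupted` check are hypothetical stand-ins supplied by the caller:

```python
import time
from typing import Callable, Optional

def broadcast_alarm(alarm: str,
                    duration_s: Optional[float],
                    interrupted: Callable[[], bool],
                    send: Callable[[str], None],
                    resend_every_s: float = 1.0) -> None:
    """Broadcast `alarm` until the predetermined period `duration_s` elapses
    (pass None to disable the timeout) or `interrupted()` reports an interrupt
    signal (a user input, or a sensor sensing that the emergency is cleared).
    `send` and `interrupted` are hypothetical callables, not disclosure APIs.
    """
    start = time.monotonic()
    while not interrupted():
        if duration_s is not None and time.monotonic() - start >= duration_s:
            break  # predetermined period ended: stop automatically
        send(alarm)
        time.sleep(resend_every_s)
```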
It should be understood that in the present disclosure, "broadcasting" includes, but is not limited to, transmitting a signal (here, the alarm information) by means of the communication device 2140 (fig. 11), for example via a wireless local area network or Bluetooth, so that a corresponding signal receiving side (e.g., an alarm receiving device) can receive the broadcast signal within the broadcast range. It will also be understood that in some embodiments, the distance range to which the alarm information is broadcast (i.e., the "alarm broadcast distance") may be set by adjusting the transmit power of the transmitter of the communication device 2140, although the disclosure is not so limited.
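As a rough, non-authoritative illustration of the power-to-range relation just mentioned, the free-space path loss model (FSPL in dB = 20·log10(d/km) + 20·log10(f/MHz) + 32.44) can be inverted to choose a transmit power for a desired broadcast distance; the receiver sensitivity and the 5.9 GHz V2X carrier below are assumptions, and a real deployment would use a terrain-aware propagation model with margins:

```python
import math

def tx_power_dbm_for_range(range_m: float, freq_mhz: float,
                           rx_sensitivity_dbm: float = -90.0) -> float:
    """Pick a transmit power so the signal just reaches `range_m`.

    Uses the free-space path loss model; the -90 dBm receiver sensitivity is
    a hypothetical example value, not a figure from the disclosure.
    """
    fspl_db = (20 * math.log10(range_m / 1000.0)
               + 20 * math.log10(freq_mhz) + 32.44)
    return rx_sensitivity_dbm + fspl_db

# e.g., a 140 m broadcast distance at 5,900 MHz (the 5.9 GHz V2X band):
print(round(tx_power_dbm_for_range(140.0, 5900.0), 1))  # -> about 0.8 dBm
```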
FIG. 6 shows a flowchart of a method 600 of prompting a user for an alert based on visibility in an environment, according to an example embodiment. As seen in fig. 6, method 600 may, for example, comprise:
step S610: acquiring alarm information, wherein the alarm information comprises position data of an alarm transmitter and the alarm;
step S620: determining visibility in the environment;
step S630: determining an alert prompt distance based on the visibility;
step S640: prompting a user for the alert depending on the location data and the alert prompt distance.
Method 600 may be implemented by means of a device (not shown) for prompting a user with an alert based on visibility in the environment, and the device may include: a processor, and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method 600. In some embodiments, the processor and memory may be embodied by, for example, controller 2130 and its associated memory in fig. 11.
Fig. 7 shows an exemplary block diagram of step S610. Fig. 8 is an exemplary flowchart of step S640. Fig. 9 is another exemplary flowchart of step S640. Fig. 10 is a block diagram of an apparatus 1000 that may implement the method 600 according to an example embodiment. As shown in fig. 10, the apparatus 1000 includes a second acquiring unit 1010, a second determining unit 1020, and a prompting unit 1030.
For illustrative purposes, the method 600 will be further explained below with the aid of fig. 6 to 10.
In step S610, the second acquisition unit 1010 acquires alarm information, which may include position data of an alarm transmitter and the alarm. In some embodiments, this may be achieved by at least one selected from the group consisting of: acquiring (S612) position data of the location of the alarm transmitter, and acquiring (S614) the alarm issued by the alarm transmitter. Here, acquiring the position data (e.g., coordinate values in a map) of the alarm transmitter is advantageous because the distance between the user on the alarm receiving side (i.e., the host vehicle) and the alarm point can be further obtained from it. In the present disclosure, the "alarm point" may be understood as the location (or vicinity) of the alarm transmitter, the location (or vicinity) of the emergency event sensed by a sensor, or the location (or vicinity) of the user who inputs the alarm information on the alarm transmitting side. As shown in fig. 7, acquiring (S614) the alarm issued by the alarm transmitter may further include acquiring (S616) an alarm entered on the alarm transmitter by a user on the alarm transmitting side, and acquiring (S617) a corresponding alarm generated by the alarm transmitter in response to a sensor sensing signal indicating an emergency event.
In step S620, the second determination unit 1020 determines visibility in the environment.
According to an embodiment, step S620 may comprise an operation selected from the group consisting of: acquiring image data of the environment collected by an image sensor and determining visibility in the environment based on the acquired image data; acquiring visibility data measured by a laser measurer as the visibility in the environment; obtaining visibility data from a weather content provider as the visibility in the environment; and acquiring visibility data input by a user (for example, on the alarm receiving side) as the visibility in the environment.
According to an embodiment, the acquiring of the image data of the environment collected by an image sensor comprises: acquiring image data of the surrounding environment a single time; or acquiring image data of the environment multiple times at predetermined time intervals.
According to an embodiment, said determining visibility in said environment based on said acquired image data comprises: determining a target object in the image; extracting edge features of the target object, wherein the edge features include at least one of the group consisting of: gradient at edge, contrast at edge and brightness at edge; and determining the visibility based on the edge features.
In step S630, the second determination unit 1020 determines an alert presentation distance based on the visibility.
According to an embodiment, step S630 comprises one selected from the group consisting of: obtaining the alert prompt distance according to a predefined function of the alert prompt distance with respect to the visibility; and obtaining the alert prompt distance according to a predefined look-up table describing the correspondence between the visibility and the alert prompt distance.
According to an embodiment, the predefined function comprises a negative-correlation function. The negative-correlation function may include a linear function or a non-linear function. As an example, the predefined function may be:
R_W = k · L_v + b    (Formula 2)

In Formula 2, R_W denotes the alert prompt distance, L_v denotes the visibility in the environment, k is a negative number, and b is a natural number. It should be understood that Formula 2 is only a simple example of the predefined function. Other linear negative-correlation functions, non-linear negative-correlation functions, positive-correlation functions, and any combination thereof are also conceivable as the predefined function, as long as they represent a functional relationship between the alert prompt distance and the visibility.
According to an embodiment, obtaining the alert prompt distance according to a predefined look-up table describing the correspondence between the visibility and the alert prompt distance comprises: obtaining a first value closest to the visibility from a first array of the predefined look-up table; obtaining, based on the first value, a second value corresponding to the first value from a second array of the predefined look-up table; and outputting the obtained second value as the alert prompt distance, wherein the first array is a set of reference visibility values and the second array is a set of reference alert prompt distances.
Table 2: Exemplary predefined look-up table

| No. | First array (visibility) / m | Second array (alert prompt distance) / m |
| --- | --- | --- |
| 1 | 10 | 300 |
| 2 | 25 | 280 |
| 3 | 40 | 260 |
| 4 | 55 | 240 |
| 5 | 70 | 220 |
| 6 | 85 | 200 |
| ... | ... | ... |
As an example, Table 2 represents a predefined look-up table of the correspondence between visibility and the alert prompt distance. In Table 2, as the first array (visibility) increases, the second array (alert prompt distance) decreases, since it is advantageous for the alert prompt distance to vary inversely with visibility; however, this trend is not mandatory, and other variations of the alert prompt distance with respect to visibility are also contemplated. In addition, in Table 2 the values in the first array increase in steps of 15 meters and the values in the second array decrease in steps of 20 meters. It should be understood that these step sizes are merely exemplary and may be adjusted depending on the actual application.
According to an embodiment, obtaining the first value closest to the visibility from the first array of the predefined look-up table comprises one selected from the group consisting of: selecting, as the first value, a value from the first array whose difference from the visibility is less than a threshold; and selecting, as the first value, the value in the first array having the minimum difference from the visibility.
It should be understood that the definition of the steps (e.g., steps S610, S620, S630) or terms (e.g., alarm, emergency event, visibility, predefined function, predefined lookup table, etc.) involved in the method 600 may refer to the corresponding description in the method 100, and will not be described herein again.
In step S640, the prompting unit 1030 prompts the alert to the user depending on the position data and the alert prompt distance.
Fig. 8 and 9 show flowcharts of different examples of step S640 in method 600. As seen in fig. 8, step S640 may include: determining whether a user is approaching the location of the alert transmitter indicated by the location data; in response to determining that a user is approaching the location of the alert transmitter, prompting the alert to the user; and in response to determining that the user is not approaching the location of the alert transmitter, not prompting the user for the alert.
Here, it is advantageous to consider whether the user is approaching the alarm point. Typically, an alarm transmitter broadcasts the associated alarm within an alarm broadcast range (e.g., a circular range centered at the alarm point with the alarm broadcast distance as its radius), i.e., all alarm receivers within the alarm broadcast range can receive the alarm emitted by the alarm transmitter. However, the alarm information is of little or even no use to the host vehicle when, for example, the vehicle has already driven past the alarm point (i.e., the alarm point lies behind on the vehicle's route), or when the alarm point is not on the vehicle's currently planned route. In these cases, it is particularly advantageous not to present the alarm information to the user, in order to avoid disturbing the driver.
As seen in fig. 9, step S640 may further include: determining whether a distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance; in response to determining that the distance between the location of the user and the location of the alert transmitter is less than the alert prompt distance, prompting the alert to the user; in response to determining that the distance between the location of the user and the location of the alert transmitter is not less than the alert prompt distance, not prompting the alert to the user.
Here, determining whether the distance between the user's position and the alert transmitter's position is less than the alert prompt distance can be understood as determining whether the user is within an alert prompt range centered on the alarm point with the alert prompt distance as its radius. When it has already been determined that the user is approaching the alarm point, it is advantageous to further determine whether the user is within the alert prompt range. Generally, an alert receiver mounted on a vehicle can receive an alert once the vehicle comes within the alert broadcast range, but immediately notifying the driver of the received alert at that moment is very likely unnecessary and may interfere with driving. Therefore, it is advantageous and safer to prompt the driver with the alert information only once the vehicle comes within the alert prompt range determined based on the visibility of the current environment.
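As a minimal, non-limiting sketch, the two checks of figs. 8 and 9 can be combined as follows; the haversine formula for the user-to-transmitter distance is standard, while the shrinking-distance test used as the "approaching" criterion is a hypothetical simplification of the fig. 8 check (a real system might consult the planned route):

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def should_prompt(user_pos: tuple, alarm_pos: tuple,
                  prev_distance_m: float, prompt_distance_m: float) -> bool:
    """Prompt only if the user is approaching the alarm point (distance is
    shrinking, a hypothetical proxy for the fig. 8 check) AND is already
    within the alert prompt range (the fig. 9 check)."""
    d = haversine_m(*user_pos, *alarm_pos)
    return d < prev_distance_m and d < prompt_distance_m
```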
Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results.
Additionally, while particular functionality is discussed above with reference to particular units, it should be noted that the functionality of the various units discussed herein can be divided among multiple units, and/or at least some of the functionality of multiple units can be combined into a single unit. The performance of an action by a particular unit discussed herein includes the particular unit itself performing the action, or alternatively the particular unit invoking or otherwise accessing another component or unit that performs the action (or that performs the action in conjunction with the particular unit).
More generally, various techniques may be described herein in the general context of software, hardware elements, or program modules. The various units described above with respect to figs. 5 and 10 may be implemented in hardware or in hardware combined with software and/or firmware. For example, the units may be implemented as computer program code/instructions configured to be executed by one or more processors and stored in a computer-readable storage medium. Alternatively, these units may be implemented as hardware logic/circuits. For example, in some embodiments, one or more of the first obtaining unit 510, the first determining unit 520, the broadcasting unit 530, the second obtaining unit 1010, the second determining unit 1020, and the prompting unit 1030 may be implemented together in a system on chip (SoC). The SoC may include an integrated circuit chip comprising one or more components of a processor (e.g., a Central Processing Unit (CPU), microcontroller, microprocessor, Digital Signal Processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry, and may optionally execute received program code and/or include embedded firmware to perform functions.
Yet another aspect of the present disclosure provides a vehicle comprising the apparatus 500 for broadcasting an alert based on visibility in an environment and/or the apparatus 1000 for prompting a user for an alert based on visibility in an environment. An example of such a vehicle will be described below in conjunction with fig. 11.
Yet another aspect of the present disclosure provides a non-transitory computer-readable storage medium storing a program, the program comprising instructions that, when executed by one or more processors, cause the one or more processors to perform the method 100 of broadcasting an alert based on visibility in an environment and/or the method 600 of prompting a user for an alert based on visibility in an environment. Examples of such non-transitory computer readable storage media are described below in conjunction with fig. 11.
Fig. 11 shows a schematic diagram of one application scenario 2100 in which the methods and functions described herein may be implemented.
Motor vehicle 2010 may include a sensor 2110 for sensing the surrounding environment. The sensor 2110 may include one or more of the following sensors: an ultrasonic sensor, a millimeter-wave radar, a lidar, a visual camera, and an infrared camera. Different sensors may provide different detection accuracies and ranges. Ultrasonic sensors can be arranged around the vehicle and, exploiting properties such as the strong directionality of ultrasound, used to measure the distance between the vehicle and an object outside it. The millimeter-wave radar may be installed at the front, rear, or other positions of the vehicle and uses the propagation characteristics of electromagnetic waves to measure the distance between the vehicle and an object outside it. The lidar may be mounted at the front, rear, or other positions of the vehicle for detecting object edges and shape information, enabling object identification and tracking. Owing to the Doppler effect, the radar apparatus can also measure speed changes of the vehicle and of moving objects. The cameras may be mounted at the front, rear, or other positions of the vehicle. The visual camera can capture conditions inside and outside the vehicle in real time and present them to the driver and/or passengers. In addition, by analyzing the pictures captured by the visual camera, information such as traffic light indications, intersection conditions, and the running states of other vehicles can be acquired. The infrared camera can capture objects under night-vision conditions.
Motor vehicle 2010 may also include an output device 2120. The output device 2120 includes, for example, a display, a speaker, and the like to present various outputs or instructions. Furthermore, the display may be implemented as a touch screen, so that input can also be detected in different ways. A graphical user interface may be presented on the touch screen to enable the user to access and control the corresponding controls.
Motor vehicle 2010 may also include one or more controllers 2130. The controller 2130 may include a processor, such as a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or another special-purpose processor, that communicates with various types of computer-readable storage devices or media. A computer-readable storage device or medium may include any non-transitory device that stores data, including but not limited to a magnetic disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, an optical disk or any other optical medium, read-only memory (ROM), random-access memory (RAM), cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer can read data, instructions, and/or code. Some of the data in the computer-readable storage device or medium represents executable instructions used by the controller 2130 to control the vehicle. The controller 2130 may include an autopilot system for automatically controlling various actuators in the vehicle. The autopilot system is configured, in response to inputs from a plurality of sensors 2110 or other input devices and without human intervention or with limited human intervention, to control the powertrain, steering system, braking system, and the like of the motor vehicle 2010 via a plurality of actuators, so as to control acceleration, steering, and braking, respectively. Part of the processing functions of the controller 2130 may be implemented by cloud computing; for example, some processing may be performed using an onboard processor while other processing may be performed using computing resources in the cloud.
Motor vehicle 2010 also includes a communication device 2140. The communication device 2140 includes a satellite positioning module capable of receiving satellite positioning signals from the satellites 2012 and generating coordinates based on these signals. The communication device 2140 also includes a module for communicating with the mobile communication network 2013, which may implement any suitable communication technology, such as current or evolving wireless communication technologies (e.g., 5G) like GSM/GPRS, CDMA, LTE, etc. The communication device 2140 may also have a Vehicle-to-Everything (V2X) module configured to enable, for example, Vehicle-to-Vehicle (V2V) communication with other vehicles 2011 and Vehicle-to-Infrastructure (V2I) communication with the outside world. In addition, the communication device 2140 may have a module configured to communicate with a user terminal 2014 (including but not limited to a smartphone, a tablet computer, or a wearable device such as a watch), for example via a wireless local area network using the IEEE 802.11 standards or via Bluetooth. With the communication device 2140, the motor vehicle 2010 can access, via a wireless communication system, an online server 2015 or a cloud server 2016 configured to provide the respective data processing, data storage, and data transmission services for the motor vehicle.
In addition, the motor vehicle 2010 includes a power train, a steering system, a brake system, and the like, which are not shown in fig. 11, for implementing a motor vehicle driving function.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative and exemplary and not restrictive; the present disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps not listed, the indefinite article "a" or "an" does not exclude a plurality, and the term "a plurality" means two or more. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.