Disclosure of Invention
The embodiment of the application provides a blind area monitoring system and a blind area monitoring method, which can help a target object located in a visual blind area to know whether it is in the driver's visual blind area, regardless of the driving state of the vehicle, thereby avoiding accidents.
The embodiment of the application provides a blind area monitoring system, which may include:
a monitoring unit and a warning unit;
The monitoring unit is configured to acquire, in real time, a target object entering a blind area within a preset range around the vehicle body, and to determine the distance between the target object and the vehicle body;
and the warning unit is configured to initiate visual warning information to the target object if the distance between the target object and the vehicle body is less than or equal to a preset distance threshold, wherein the display position of the warning information is within the sight range of the target object.
In an exemplary embodiment of the application, the warning unit comprises a warning lamp control module and a warning lamp;
and the warning lamp control module is configured to control the warning lamp to project preset image information as the warning information, so as to warn the target object, if the distance between the target object and the vehicle body is less than or equal to the distance threshold.
In an exemplary embodiment of the present application, the image information is displayed on the ground within the line of sight of the target object, and the image information includes at least one of light color, text, numbers, graphic codes, and animation.
In an exemplary embodiment of the present application, the blind area monitoring system further includes a processing unit;
The processing unit is configured to predict a moving direction of the target object according to a moving image of the target object, wherein the moving image is acquired by the monitoring unit.
In an exemplary embodiment of the present application, the processing unit predicts a moving direction of the target object according to at least one frame of the moving image, and/or the processing unit predicts a moving direction of the target object according to direction indication information in the moving image.
In an exemplary embodiment of the present application, the monitoring unit includes a ranging device for determining a distance between the target object and a vehicle body, and a photographing device for acquiring a moving image of the target object, or
The monitoring unit includes a detecting device for determining a distance between the target object and a vehicle body and acquiring a moving image of the target object.
In an exemplary embodiment of the present application, the detection device includes a camera disposed at a vehicle body position corresponding to at least one of the blind areas.
In an exemplary embodiment of the present application, the ranging apparatus includes a ranging sensor or a radar, the photographing apparatus includes a camera, and the ranging apparatus and the photographing apparatus are disposed at least one vehicle body position corresponding to the blind area.
In an exemplary embodiment of the present application, the position of the monitoring unit or the position of the warning unit includes at least one of a head position, a tail position, and positions on both sides of the vehicle body.
In an exemplary embodiment of the present application, the warning unit is further configured to display different image information according to a distance between the target object and the vehicle body.
In an exemplary embodiment of the application, the warning unit further comprises an audible alarm arranged to emit a warning sound when the distance between the target object and the vehicle body is less than or equal to the distance threshold.
In an exemplary embodiment of the application, the audible alarm emits different types and/or different decibels of warning sounds according to the distance between the target object and the vehicle body.
In an exemplary embodiment of the present application, the system further includes a display unit for displaying a moving image of the target object in the vehicle.
The embodiment of the application also provides a blind area monitoring method, which may include the following steps:
acquiring, in real time, a target object entering a blind area within a preset range around a vehicle body, and determining the distance between the target object and the vehicle body; and
if the distance between the target object and the vehicle body is less than or equal to a preset distance threshold, initiating visual warning information to the target object, wherein the display position of the warning information is within the sight range of the target object.
Compared with the related art, the blind area monitoring system provided by the embodiment of the application includes a monitoring unit and a warning unit. The monitoring unit is configured to acquire, in real time, target objects entering a blind area within a preset range around the vehicle body and to determine the distance between each target object and the vehicle body; the warning unit is configured to initiate visual warning information to a target object if the distance between the target object and the vehicle body is less than or equal to a preset distance threshold, wherein the display position of the warning information is within the sight range of the target object. According to this embodiment, regardless of the driving state of the vehicle, a target object in a visual blind area can be helped to know whether it is in the driver's visual blind area, and accidents are avoided.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. Other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
Detailed Description
The present application has been described in terms of several embodiments, but the description is illustrative and not restrictive, and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the described embodiments. Although many possible combinations of features are shown in the drawings and discussed in the detailed description, many other combinations of the disclosed features are possible. Any feature or element of any embodiment may be used in combination with or in place of any other feature or element of any other embodiment unless specifically limited.
The present application includes and contemplates combinations of features and elements known to those of ordinary skill in the art. The disclosed embodiments, features and elements of the present application may also be combined with any conventional features or elements to form a unique inventive arrangement as defined by the claims. Any feature or element of any embodiment may also be combined with features or elements from other inventive arrangements to form another unique inventive arrangement as defined in the claims. It is therefore to be understood that any of the features shown and/or discussed in the present application may be implemented alone or in any suitable combination. Accordingly, the embodiments are not to be restricted except in light of the attached claims and their equivalents. Further, various modifications and changes may be made within the scope of the appended claims.
Furthermore, in describing representative embodiments, the specification may have presented the method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. Other sequences of steps are possible as will be appreciated by those of ordinary skill in the art. Accordingly, the particular order of the steps set forth in the specification should not be construed as limitations on the claims. Furthermore, the claims directed to the method and/or process should not be limited to the performance of their steps in the order written, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the embodiments of the present application.
The embodiment of the application provides a blind area monitoring system which, as shown in Fig. 1, may include a monitoring unit 1 and a warning unit 2;
the monitoring unit 1 is configured to acquire, in real time, a target object entering a blind area within a preset range around the vehicle body, and to determine the distance between the target object and the vehicle body;
and the warning unit 2 is configured to initiate visual warning information to the target object if the distance between the target object and the vehicle body is less than or equal to a preset distance threshold, wherein the display position of the warning information is within the sight range of the target object.
In the exemplary embodiment of the present application, the preset range around the vehicle body may be determined according to the effective monitoring area of the monitoring unit 1. The blind area refers to an area around the vehicle that the driver cannot see, for example, an area in front of the vehicle that is outside the driver's view (e.g., below the line of sight), an area behind the vehicle that is outside the driver's view (e.g., below the line of sight), and an area on a side of the vehicle that is not visible in the rear-view mirrors. Preferably, the target object entering the blind area within the preset range around the vehicle body may be acquired only while the vehicle is running.
In an exemplary embodiment of the present application, the distance between the target object and the vehicle body may be the distance between the target object and the geometric center of the vehicle body, or the distance between the target object and the monitoring unit. The preset distance threshold is related to the distance between the target object and the vehicle body; it may be determined empirically, or calculated from motion parameters of the vehicle such as the running speed and the reaction time. Within the range corresponding to the distance threshold, the target object is at risk of scraping against or colliding with the vehicle body.
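As an illustration of the second option, a minimal sketch follows that derives a distance threshold from the running speed and the reaction time; the function name, the braking-distance term, and all numeric defaults are assumptions added for illustration and are not values specified in the application.

```python
def distance_threshold(speed_mps: float,
                       reaction_time_s: float = 1.0,
                       deceleration_mps2: float = 6.0,
                       margin_m: float = 0.5) -> float:
    """Illustrative distance threshold: the distance covered during the
    driver's reaction time, plus the braking distance at the assumed
    deceleration, plus a fixed safety margin (all defaults are example
    assumptions, not figures from the application)."""
    reaction_distance = speed_mps * reaction_time_s
    braking_distance = speed_mps ** 2 / (2.0 * deceleration_mps2)
    return reaction_distance + braking_distance + margin_m


# Example: at 5 m/s (18 km/h) the threshold is 5*1.0 + 25/12 + 0.5 ≈ 7.58 m.
print(round(distance_threshold(5.0), 2))
```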
In the exemplary embodiment of the application, since the warning information is visual, its display position is within the sight range of the target object, so that the target object can observe the warning information. Preferably, the warning information may be initiated toward the direction in which the target object is located, so that the target object can receive it. For example, if the warning unit 2 is located at the rear of the vehicle and the target object is at the right rear of the vehicle, the warning unit initiates the warning information toward the right rear; if the target object is at the left rear of the vehicle, the warning unit 2 initiates the warning information toward the left rear. Further, when a plurality of warning units 2 are arranged on the vehicle body, if the target object is determined to be on the left side of the vehicle, the warning unit 2 arranged on the left side is activated to initiate the warning information to the target object; if the target object is determined to be on the rear side of the vehicle, the warning unit 2 arranged on the rear side is activated. This improves the pertinence of the warning information and the warning effect on the target object.
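The direction-dependent activation described above can be pictured with a short sketch that maps the target's position in a body-fixed frame to the side whose warning unit should project toward it; the coordinate convention, sector boundaries, and names are illustrative assumptions, not details specified in the application.

```python
import math

# Hypothetical mapping from bearing sectors (degrees, 0 = straight ahead,
# measured clockwise) to the warning unit that should project toward the target.
SECTORS = [
    ((315, 360), "front"), ((0, 45), "front"),
    ((45, 135), "right"),
    ((135, 225), "rear"),
    ((225, 315), "left"),
]

def select_warning_unit(target_x: float, target_y: float) -> str:
    """Return the side of the vehicle whose warning unit should be activated.
    (x, y) is the target position in a body frame with +y ahead and +x to the
    right of the vehicle; the frame and the sector split are illustrative."""
    bearing = math.degrees(math.atan2(target_x, target_y)) % 360
    for (lo, hi), unit in SECTORS:
        if lo <= bearing < hi:
            return unit
    return "front"

print(select_warning_unit(3.0, 0.5))  # target just off the right side -> "right"
```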
It should be noted that the target object in the present application may be a pedestrian, a motor vehicle, a non-motor vehicle, an animal, or another obstacle, and the target object may be in a moving state or a stationary state.
In the exemplary embodiment of the application, the monitoring unit 1 acquires, in real time, target objects entering the blind area within the preset range around the vehicle body, so that a target object in a visual blind area can be acquired in time regardless of the driving state of the vehicle; the warning unit 2 initiates visual warning information to the target object according to the distance between the target object and the vehicle body, thereby helping the target object in the visual blind area to know whether it is in the driver's visual blind area and avoiding accidents.
In an exemplary embodiment of the present application, the monitoring unit 1 may include a ranging device and a photographing device. The distance measuring device is used for determining the distance between the target object and the vehicle body, and the shooting device is used for acquiring the moving image of the target object.
In an exemplary embodiment of the present application, the ranging apparatus may include a ranging sensor or a radar, the photographing apparatus may include a camera, and the ranging apparatus and the photographing apparatus are disposed at least one vehicle body position corresponding to the blind area.
In an exemplary embodiment of the present application, the monitoring unit 1 may alternatively include only a detecting device, which determines the distance between the target object and the vehicle body and at the same time acquires the moving image of the target object.
Preferably, the detection device may include a camera, which is disposed at the vehicle body position corresponding to at least one blind area and captures the moving image of the target object in the corresponding blind area. In this embodiment, if the camera has a ranging function, ranging and photographing can both be completed by the camera alone.
Further, the detection device may be a monocular camera or a binocular camera, so that the type of the target object, such as a pedestrian, a non-motor vehicle, or another obstacle, can be determined through a visual algorithm such as deep learning. In addition, the ranging function can be realized directly with a monocular or binocular camera: calibration is first performed to determine the relationship between the physical distance from the camera position to a reference point and the number of pixels in the image; a target object is then detected in the moving image, and the distance between the target object and the vehicle body is determined using the calibration information. Depth information may further be added to improve the ranging accuracy.
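A minimal sketch of this calibration-based approach follows, assuming the ground contact point of a detected target is located by its pixel row and converted to a distance by interpolating a pre-recorded calibration table; the table values and function names are illustrative assumptions.

```python
import bisect

# Hypothetical calibration table: (pixel row of a ground point in the image,
# physical distance in metres from the camera). Rows increase toward the
# bottom of the image, i.e. toward points closer to the vehicle.
CALIBRATION = [(300, 10.0), (360, 6.0), (420, 4.0), (480, 2.5), (540, 1.5)]

def pixel_row_to_distance(row: int) -> float:
    """Linearly interpolate the calibration table to estimate the distance
    of a ground contact point seen at the given pixel row."""
    rows = [r for r, _ in CALIBRATION]
    if row <= rows[0]:
        return CALIBRATION[0][1]
    if row >= rows[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_right(rows, row)
    (r0, d0), (r1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (row - r0) / (r1 - r0)
    return d0 + t * (d1 - d0)

# A target whose feet/tyres appear at image row 450 is roughly 3.25 m away.
print(round(pixel_row_to_distance(450), 2))
```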
In an exemplary embodiment of the present application, the position of the monitoring unit 1 or the position of the warning unit 2 may include at least one of a head position, a tail position, and positions on both sides of the vehicle body.
In the exemplary embodiment of the present application, these setting positions may enable the monitoring unit 1 to better detect the target object, and enable the warning unit 2 to more prominently warn the detected target object, thereby improving the monitoring efficiency and the monitoring effect of the blind area.
In an exemplary embodiment of the present application, the radar may include, for example, a first radar located in front of the vehicle body, a second radar located behind the vehicle body, a third radar located on a first side of the vehicle body, and a fourth radar located on a second side of the vehicle body.
In an exemplary embodiment of the present application, the monitoring unit 1 may include a plurality of sets of ranging devices and photographing devices, where each ranging device is arranged in correspondence with a photographing device; for example, at least one set of ranging device and photographing device is arranged at the vehicle body position corresponding to each blind area around the vehicle body. The ranging device may be a ranging sensor or a radar, and the photographing device may include a camera.
In an exemplary embodiment of the present application, the plurality of sets of ranging apparatuses and photographing apparatuses may include a first set of ranging apparatuses and photographing apparatuses located in front of a vehicle body, a second set of ranging apparatuses and photographing apparatuses located in rear of the vehicle body, a third set of ranging apparatuses and photographing apparatuses located at a first side of the vehicle body, and a fourth set of ranging apparatuses and photographing apparatuses located at a second side of the vehicle body.
In the exemplary embodiment of the present application, in practice the ranging sensors included in the monitoring unit 1 may detect all targets within their range, and the sensing ranges of the ranging sensors may overlap. It therefore needs to be considered how to determine the distance and issue a warning when there are multiple target objects within the detection range of a ranging sensor, or when the same target object is detected by multiple sensors.
In view of the above, the monitoring unit 1 detects all target objects in the monitoring area including the blind area, and the distance from each target object to the vehicle body is then obtained by a preset algorithm. The geometric relationship between the monitoring unit 1 and the vehicle body can be determined through pre-calibration, so that the distance between a target object and the vehicle body can be obtained by conversion from the distance between the target object and the monitoring unit. When there are multiple target objects, whether there is a collision risk and whether a warning is needed can be determined from the distance between the nearest target object and the vehicle body: a warning is issued as long as the nearest target is at risk of collision.
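A minimal sketch of this handling follows, assuming each detection carries a pre-calibrated sensor-to-body offset and that the decision is made on the nearest converted distance; the data structure and names are illustrative, not part of the application.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor_offset_m: float   # pre-calibrated distance from the sensor to the body surface
    range_m: float           # measured distance from the sensor to the target

def distance_to_body(det: Detection) -> float:
    """Convert a sensor-relative range to a distance from the vehicle body,
    using the pre-calibrated geometric offset of the sensor."""
    return max(det.range_m - det.sensor_offset_m, 0.0)

def should_warn(detections: list[Detection], threshold_m: float) -> bool:
    """Warn as soon as the nearest detected target is within the threshold,
    even if several overlapping sensors report (possibly the same) targets."""
    if not detections:
        return False
    nearest = min(distance_to_body(d) for d in detections)
    return nearest <= threshold_m

# Two overlapping sensors see targets; the nearest one (1.8 m) triggers a warning.
dets = [Detection(0.2, 2.0), Detection(0.3, 4.1)]
print(should_warn(dets, threshold_m=2.5))  # True
```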
In an exemplary embodiment of the present application, the warning unit 2 may include a warning light control module 21 and a warning light.
In an exemplary embodiment of the present application, the warning light is used to warn the detected target object in the blind area. The warning lamp may be a dedicated lamp, or it may be realized through a projection lamp already on the vehicle, which reduces cost and saves installation space on the vehicle.
In an exemplary embodiment of the present application, the warning light control module 21 may be configured to control the warning light to project preset image information as the warning information so as to warn the target object if the distance between the target object and the vehicle body is less than or equal to the distance threshold.
In the exemplary embodiment of the application, a target object that is too close to the vehicle body can be detected in time and a warning can be issued to it in time, so that the target object can be warned away from the vehicle body before danger occurs, effectively preventing traffic accidents.
Preferably, the warning lamp in this embodiment is a projection lamp, specifically an automobile projection lamp, also called a ground lamp. The pattern projected by the automobile projection lamp not only serves for locating the vehicle and illuminating the road surface in front of it, but can also realize further functions such as signal indication and information interaction; therefore, the image information in the present application can be projected onto the ground by the projection lamp.
In an exemplary embodiment of the present application, the image information is displayed on the ground within the line of sight of the target object. Specifically, the image information may include at least one of light color, text, numbers, graphic codes, and animation. The light color may include light and/or light images (for example, colored light images flashing at a set frequency and/or with a specific display effect); such light images can alert the target object at the first moment, especially in scenes with dim light. The text may be warning words in various languages. The numbers may be the actual distance between the target object and the vehicle body, or the distance threshold that should be kept from the vehicle body. The graphic codes may include, but are not limited to, warning pictures with preset patterns, geometric figures, and/or logos (icons) carrying warning information displayed to the target object, for example pictures such as traffic signs that instruct the target object to stop, keep away, or not approach; displaying graphic codes can increase the visual impact, improve the warning effect, and realize timely warning of specific persons (such as pedestrians). It can be appreciated that the image information also includes various combinations of the above light colors, text, numbers, graphic codes, and animations.
In an exemplary embodiment of the present application, the warning unit 2 may include a plurality of warning lamps, and at least one warning lamp is disposed at a vehicle body position corresponding to each blind area of the vehicle body periphery.
In an exemplary embodiment of the present application, the blind zone monitoring system may further include a processing unit 3.
In an exemplary embodiment of the present application, when the processing unit 3 detects that the distance between the target object (e.g., a person or another vehicle) and the vehicle body is less than or equal to the distance threshold, it sends a control command to the warning light control module 21, and the warning light control module 21 controls the warning light to turn on according to the control command, so as to issue a warning to the target object.
In an exemplary embodiment of the present application, the plurality of warning lamps may include a first warning lamp located in front of the vehicle body, a second warning lamp located behind the vehicle body, a third warning lamp located at a first side of the vehicle body, and a fourth warning lamp located at a second side of the vehicle body.
In an exemplary embodiment of the present application, the warning unit 2 is further configured to display different image information according to the distance between the target object and the vehicle body.
In an exemplary embodiment of the present application, the distance between the target object and the vehicle body may be divided into different distance levels in advance, and the images displayed by the warning unit may be divided into different image levels, where the different distance levels may correspond to the corresponding image levels.
In an exemplary embodiment of the present application, for example, when the distance is farther, the distance level is lower and the image level is lower; only low-frequency light flickering may be performed. When the distance decreases, the distance level increases and the image level increases; high-frequency light flickering may be performed, and a text pattern such as "mind the distance" may be added. When the distance decreases further, the distance level and the image level increase further; high-frequency light flickering may be performed together with a text pattern such as "an accident is about to occur", and a collision simulation animation may be displayed at the same time to increase the visual impact.
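A minimal sketch of such a tiered mapping follows; the level boundaries (expressed as fractions of the distance threshold), the flash frequencies, and the text strings are assumptions chosen for illustration only.

```python
def visual_warning(distance_m: float, threshold_m: float) -> dict:
    """Map the target distance to an illustrative visual warning level.
    The fractions of the threshold used as level boundaries are assumptions."""
    if distance_m > threshold_m:
        return {"active": False}
    if distance_m > 0.6 * threshold_m:          # far: low image level
        return {"active": True, "flash_hz": 1, "text": None, "animation": None}
    if distance_m > 0.3 * threshold_m:          # closer: higher image level
        return {"active": True, "flash_hz": 4, "text": "MIND THE DISTANCE",
                "animation": None}
    return {"active": True, "flash_hz": 8,      # very close: highest image level
            "text": "DANGER - MOVE AWAY", "animation": "collision_simulation"}

print(visual_warning(1.0, threshold_m=5.0)["text"])  # "DANGER - MOVE AWAY"
```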
In an exemplary embodiment of the present application, the system may further include a display unit 4, which may be configured to display the moving image of the target object inside the vehicle, so that the driver can learn of a target object in the blind area in time.
In the exemplary embodiment of the application, the display of the moving image can intuitively remind the driver of the presence of a target object in the blind area. The driver can predict the next movement of the target object in time according to its motion, and can stop or steer the vehicle in time when danger is predicted, which provides further protection against traffic accidents and further improves driving safety.
In the exemplary embodiment of the present application, the warning light control module 21 may be disposed in the processing unit 3, may be disposed separately, may be disposed integrally with the display unit 4, may be disposed integrally with the monitoring unit 1, or may be disposed integrally with the warning unit 2.
In an exemplary embodiment of the present application, Fig. 2 is a connection schematic diagram in which the processing unit 3, the monitoring unit 1, the display unit 4, and the warning light control module 21 are each arranged separately; here the warning unit 2 includes a front projection lamp, a left projection lamp, a right projection lamp, and a rear projection lamp, and the monitoring unit 1 includes a plurality of front cameras (or radars). Fig. 3 is a connection schematic diagram in which the warning light control module 21 is arranged in the processing unit 3. Fig. 4 is a connection schematic diagram in which the warning light control module 21 is arranged in the monitoring unit 1; in this case, each set of front cameras (or radars) may correspond to one warning light control module 21. Fig. 5 is a connection schematic diagram in which the warning light control module 21 is arranged in the display unit 4. The front projection lamp may refer to a projection lamp disposed in front of the vehicle body (e.g., at the vehicle head), the rear projection lamp may refer to a projection lamp disposed behind the vehicle body (e.g., at the vehicle tail), the left projection lamp may refer to a projection lamp disposed on a first side of the vehicle body, and the right projection lamp may refer to a projection lamp disposed on a second side of the vehicle body.
In an exemplary embodiment of the present application, the alarm unit 2 may further include an audible alarm;
The audible alarm may be configured to emit a warning sound when the processing unit detects that the distance between the target object and the vehicle body is less than or equal to a distance threshold.
In the exemplary embodiment of the application, adding the audible alarm to the warning lamp allows a target object around the vehicle body to learn that the area where it is located is a blind area of the driver, and the urgency and gradation of the alarm sound make the degree of danger clearer.
In an exemplary embodiment of the application the processing unit 3 may be further arranged to control the audible alarm to emit different types and/or different decibels of warning sounds depending on the distance of the target object from the vehicle body.
In an exemplary embodiment of the present application, the warning sound emitted by the warning unit 2 may be divided into different sound levels, and the different distance levels may correspond to the respective sound levels.
In an exemplary embodiment of the present application, for example, when the distance is farther, the distance level is lower and the sound level is lower; only low-speed, low-decibel sound playing may be performed. When the distance decreases, the distance level increases and the sound level increases; high-speed, high-decibel sound playing may be performed, and a voice prompt such as "mind the vehicle distance" may be added. When the distance decreases further, the distance level and the sound level increase further; sound playing at an even higher speed and a higher decibel level may be performed together with a voice prompt such as "an accident is about to occur", raising the attention of persons in the blind area.
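A companion sketch for the audible alarm follows; the level boundaries, beep rates, decibel values, and prompt texts are again illustrative assumptions, not values taken from the application.

```python
def audible_warning(distance_m: float, threshold_m: float) -> dict:
    """Illustrative mapping from distance to alarm sound type and loudness;
    the boundaries, decibel values, and prompts are assumptions."""
    if distance_m > threshold_m:
        return {"active": False}
    if distance_m > 0.6 * threshold_m:          # far: slow, quiet beeping
        return {"active": True, "beep_rate_hz": 1, "volume_db": 60, "prompt": None}
    if distance_m > 0.3 * threshold_m:          # closer: faster, louder, with a prompt
        return {"active": True, "beep_rate_hz": 3, "volume_db": 75,
                "prompt": "mind the vehicle distance"}
    return {"active": True, "beep_rate_hz": 6, "volume_db": 90,
            "prompt": "an accident is about to occur"}

print(audible_warning(2.0, threshold_m=5.0)["volume_db"])  # 75
```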
In an exemplary embodiment of the application, the processing unit may be connected with each radar, the display unit, the monitoring unit 1, and the warning unit 2 (e.g., a vehicle lamp control module) via data lines. After the blind area monitoring system is powered on, the processing unit obtains the image signal from the photographing device and the distance signal from the ranging device. Based on the image signal and the distance signal, the processing unit calculates the current distance between the vehicle body and a target object (such as an automobile, a motorcycle, a bicycle, or a pedestrian) in the visual blind area outside the vehicle, routes the image captured by the photographing device on the side of the vehicle body where the target object is located to a display unit that the driver can watch, and triggers the warning unit 2 on that side of the vehicle body.
In the exemplary embodiments of the present application, several specific embodiments of the inventive arrangements are presented below:
When the distance between a target object in front of the vehicle and the vehicle body is less than or equal to the set distance threshold, the processing unit accesses the video of the front camera of the vehicle, displays the image in front of the vehicle in real time on the display unit, and transmits a signal to the warning lamp control module to turn on the front warning lamp;
when the distance between a target object on the left side (i.e., the first side) of the vehicle and the vehicle body is less than or equal to the set distance threshold, the processing unit accesses the video of the left camera of the vehicle, displays the left image in real time on the display unit, and transmits a signal to the warning lamp control module to turn on the left warning lamp;
when the distance between a target object on the right side (i.e., the second side) of the vehicle and the vehicle body is less than or equal to the set distance threshold, the processing unit accesses the video of the right camera of the vehicle, displays the right image in real time on the display unit, and transmits a signal to the warning lamp control module to turn on the right warning lamp;
when the distance between a target object behind the vehicle and the vehicle body is less than or equal to the set distance threshold, the processing unit accesses the video of the rear camera of the vehicle, displays the image behind the vehicle in real time on the display unit, and outputs a signal to the warning lamp control module to turn on the rear warning lamp. The four cases follow one shared pattern, which is sketched below.
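The sketch below illustrates that shared per-zone pattern with hypothetical stand-ins for the display unit and the warning lamp control module; none of the class or method names come from the application.

```python
ZONES = ("front", "left", "right", "rear")

class StubDisplay:
    """Stand-in for the in-vehicle display unit (assumed interface)."""
    def show_camera(self, zone: str) -> None:
        print(f"display: showing {zone} camera video")

class StubLampControl:
    """Stand-in for the warning lamp control module (assumed interface)."""
    def turn_on(self, zone: str) -> None:
        print(f"lamp control: {zone} warning lamp ON")
    def turn_off(self, zone: str) -> None:
        print(f"lamp control: {zone} warning lamp OFF")

def handle_zone_measurement(zone: str, distance_m: float, threshold_m: float,
                            display: StubDisplay, lamps: StubLampControl) -> None:
    """Route the matching camera video to the display and switch the matching
    warning lamp on whenever the target in this zone is within the threshold."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    if distance_m <= threshold_m:
        display.show_camera(zone)
        lamps.turn_on(zone)
    else:
        lamps.turn_off(zone)

# A target 1.2 m off the left side triggers the left camera view and left lamp.
handle_zone_measurement("left", 1.2, 2.0, StubDisplay(), StubLampControl())
```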
In an exemplary embodiment of the present application, the processing unit may be further configured to predict the moving direction of the target object according to the moving image of the target object, and to control the braking system of the vehicle to brake when the moving direction is predicted to point toward the vehicle body, wherein the moving image may be acquired by the monitoring unit 1.
In the exemplary embodiment of the application, in order to avoid a situation where the driver fails to make an accurate and timely judgment about an impending traffic accident, a function capable of pre-judging the movement track of a target object around the vehicle body may be added to the processing unit, and the vehicle may be controlled to brake and stop automatically when it is determined that the next moving direction of the target object may point toward the vehicle body.
In an exemplary embodiment of the present application, the processing unit may predict the moving direction of the target object according to at least one frame of the moving image, and/or the processing unit may predict the moving direction of the target object according to direction indication information in the moving image.
In an exemplary embodiment of the present application, the processing unit pre-judging the moving direction of the target object according to the moving image may include the following (a sketch of this type-based dispatch is given after the list):
Judging the type of the target object;
when the target object is a vehicle, acquiring steering wheel images from a plurality of frames of the moving image, detecting the rotation direction of the steering wheel from the plurality of frames of steering wheel images, and pre-judging the moving direction of the vehicle according to the rotation direction of the steering wheel;
when the target object is a pedestrian, acquiring a toe image from the moving image, detecting the pointing direction of the toes, and pre-judging the moving direction of the pedestrian according to the pointing direction of the toes.
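A minimal dispatch sketch follows; the two recognition helpers are placeholders that stand in for the steering wheel and toe direction models, which the application only describes abstractly, so their names and constant return values are purely illustrative.

```python
from typing import Sequence

def predict_moving_direction(target_type: str, frames: Sequence) -> str:
    """Illustrative dispatch for the type-based pre-judgement described above."""
    if target_type == "vehicle":
        return steering_wheel_rotation(frames)       # e.g. "left", "right", "straight"
    if target_type == "pedestrian":
        return toe_pointing_direction(frames[-1])    # direction of the latest frame's toes
    return "unknown"

def steering_wheel_rotation(frames: Sequence) -> str:
    """Placeholder: compare steering wheel poses across several frames."""
    return "straight"

def toe_pointing_direction(frame) -> str:
    """Placeholder: detect the toe direction in a single frame."""
    return "toward_vehicle"

print(predict_moving_direction("pedestrian", [object()]))  # "toward_vehicle"
```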
In an exemplary embodiment of the present application, the monitoring unit 1 may include an external camera, or may include an in-vehicle camera that covers the driver and the steering wheel; the camera covering the steering wheel can monitor the movement of the steering wheel in real time and determine the steering direction of the steering wheel from multiple frames of images.
In the exemplary embodiment of the present application, the method of acquiring the moving direction of the vehicle is not limited to the steering wheel image; the monitoring unit 1 may also be connected to a vehicle body data unit through a CAN (Controller Area Network) bus or another communication line, or the monitoring unit 1 may include a gyroscope or an inertial navigation chip and acquire the moving direction through the gyroscope or the inertial navigation chip.
In an exemplary embodiment of the present application, the steering wheel image and the toe image may also be acquired from the moving image by preset recognition models. For example, one or more frames of the acquired moving image may be input into preset recognition models, which may include a steering wheel image recognition model and a toe image recognition model that are built and trained in advance; the steering wheel image is recognized by the steering wheel image recognition model, and the toe image is recognized by the toe image recognition model. Both models may have a direction recognition function, so that the rotation direction of the steering wheel and the pointing direction of the toes can be determined.
In the exemplary embodiment of the application, regardless of the driving state of the vehicle, the distance between each target object and the vehicle body can be obtained in time through devices such as the camera and the radar. According to the distance and the preset distance threshold, the display function can be opened automatically on the display unit, the camera video image in the corresponding direction can be displayed on the driver's screen in real time, and the warning lamp in the corresponding direction can be turned on through the output signal of the lamp control module, thereby helping a target object (such as an automobile, a motorcycle, a bicycle, or a pedestrian) in an external visual blind area to know whether it is in the driver's blind area and avoiding accidents.
In the embodiment of the application, the scheme of this embodiment needs neither manual intervention nor triggering by actual operation actions such as the turn signal, thereby realizing automated and intelligent blind area detection and reminding and greatly improving driving safety.
The embodiment of the application also provides a blind area monitoring method, which is based on the blind area monitoring system described above; as shown in Fig. 6, it may include steps S101-S102:
S101, acquiring, in real time, a target object entering a blind area within a preset range around a vehicle body, and determining the distance between the target object and the vehicle body;
S102, if the distance between the target object and the vehicle body is less than or equal to a preset distance threshold, initiating visual warning information to the target object, wherein the display position of the warning information is within the sight range of the target object.
In the exemplary embodiment of the application, through steps S101 and S102, a target object entering the blind area within the preset range around the vehicle body can be acquired in real time, so that a target object in a visual blind area can be acquired in time regardless of the driving state of the vehicle, and visual warning information is initiated to the target object according to the distance between the target object and the vehicle body, thereby helping the target object in the visual blind area to know whether it is in the driver's visual blind area and avoiding accidents.
In the exemplary embodiment of the present application, any embodiment of the foregoing blind area monitoring system may be applied to this blind area monitoring method embodiment, and details are not repeated here.
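The two steps can also be pictured as a single monitoring loop; the `monitor` and `warner` objects below are assumed interfaces standing in for the monitoring unit and the warning unit, not APIs defined by the application.

```python
import time

def monitoring_loop(monitor, warner, threshold_m: float, period_s: float = 0.1) -> None:
    """Minimal loop for steps S101-S102: repeatedly acquire targets in the
    blind areas (S101) and warn any target within the threshold (S102).
    Runs until interrupted; `monitor.detect_targets()` and `warner.warn()`
    are assumed interfaces, not real APIs."""
    while True:
        for target in monitor.detect_targets():       # S101: targets with .distance_m
            if target.distance_m <= threshold_m:      # S102: within the threshold
                warner.warn(target)                   # visual warning toward the target
        time.sleep(period_s)
```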
Those of ordinary skill in the art will appreciate that all or some of the steps, systems, functional modules/units in the apparatus, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components, for example, one physical component may have a plurality of functions, or one function or step may be cooperatively performed by several physical components. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.