Disclosure of Invention
The invention aims to provide an accident recording method which helps clarify how a traffic accident occurred.
In order to solve the above technical problem, the accident recording method of the present invention is implemented by an accident recording system suitable for being disposed on a vehicle. The accident recording system comprises an activity sensing unit, an environment sensing unit and a processing unit. The accident recording method comprises the following steps: the processing unit determines whether the activity of the vehicle meets a collision condition according to activity sensing data generated by the activity sensing unit; when the determination result is affirmative, the processing unit identifies, from environment sensing data generated by the environment sensing unit, a target object that is related to the collision condition and is presented by the environment sensing data, and generates and outputs image data at least according to the environment sensing data, wherein the image data at least presents the relative positional relationship between the vehicle and the target object within a critical period, and the critical period includes the time point at which the processing unit determines that the collision condition is met.
Preferably, the image data is a dynamic image presented in a top view, and the image data at least dynamically presents the movement of the vehicle and the target object during the critical period.
Preferably, the processing unit further calculates a distance parameter according to the environment sensing data, the distance parameter indicates the change in distance between the vehicle and the target object within the critical period, and the image data dynamically presents the distance parameter.
Preferably, the processing unit further obtains a first speed parameter corresponding to the vehicle, and calculates a second speed parameter corresponding to the target object according to the environment sensing data, wherein the first speed parameter and the second speed parameter respectively indicate at least the moving speeds of the vehicle and the target object before the collision condition is met, and the image data further presents the first speed parameter and the second speed parameter.
Preferably, the environment sensing unit includes a shooting module, the environment sensing data includes dynamic image data captured by the shooting module, and in the case that the processing unit determines that the dynamic image data presents a traffic signal lamp, the image data generated by the processing unit also presents a signal icon corresponding to the traffic signal lamp, and the signal icon presents the signal state of the traffic signal lamp within the critical period.
Preferably, the accident recording system stores an electronic map, a first icon corresponding to the vehicle, and a second icon corresponding to the target object, wherein the image data presents, with the first icon and the second icon, the relative positional relationship between the vehicle and the target object within the critical period, and the image data also presents, with a critical portion of the electronic map, the positions of the vehicle and the target object within the critical period.
Preferably, when the processing unit determines that the collision condition is met and the target object is another vehicle, the processing unit further determines at least one of a first turn signal on/off state corresponding to the vehicle and a second turn signal on/off state corresponding to the other vehicle, wherein the first turn signal on/off state indicates whether the turn signal of the vehicle is on within the critical period, the second turn signal on/off state indicates whether the turn signal of the other vehicle is on within the critical period, and the image data further indicates the at least one of the first turn signal on/off state and the second turn signal on/off state.
Another object of the present invention is to provide an accident recording system capable of implementing the accident recording method.
In order to solve the above other technical problem, the accident recording system of the present invention comprises an activity sensing unit, an environment sensing unit, and a processing unit electrically connected with the activity sensing unit and the environment sensing unit. The processing unit determines whether the activity of the vehicle meets a collision condition according to activity sensing data generated by the activity sensing unit; when the determination result is affirmative, the processing unit identifies, from environment sensing data generated by the environment sensing unit, a target object that is related to the collision condition and is presented by the environment sensing data, and generates and outputs image data at least according to the environment sensing data, wherein the image data at least presents the relative positional relationship between the vehicle and the target object within a critical period, and the critical period includes the time point at which the processing unit determines that the collision condition is met.
Preferably, the image data is a dynamic image presented in a top view, and the image data at least dynamically presents the movement of the vehicle and the target object during the critical period.
Preferably, the processing unit further calculates a distance parameter according to the environment sensing data, the distance parameter indicates the change in distance between the vehicle and the target object within the critical period, and the image data dynamically presents the distance parameter.
Preferably, the processing unit further obtains a first speed parameter corresponding to the vehicle, and calculates a second speed parameter corresponding to the target object according to the environment sensing data, wherein the first speed parameter and the second speed parameter respectively indicate at least the moving speeds of the vehicle and the target object before the collision condition is met, and the image data further presents the first speed parameter and the second speed parameter.
Preferably, the environment sensing unit includes a shooting module, the environment sensing data includes dynamic image data captured by the shooting module, and in the case that the processing unit determines that the dynamic image data presents a traffic signal lamp, the image data generated by the processing unit also presents a signal icon corresponding to the traffic signal lamp, and the signal icon presents the signal state of the traffic signal lamp within the critical period.
Preferably, the accident recording system stores an electronic map, a first icon corresponding to the vehicle, and a second icon corresponding to the target object, wherein the image data presents, with the first icon and the second icon, the relative positional relationship between the vehicle and the target object within the critical period, and the image data also presents, with a critical portion of the electronic map, the positions of the vehicle and the target object within the critical period.
Preferably, when the processing unit determines that the collision condition is met and the target object is another vehicle, the processing unit further determines at least one of a first turn signal on/off state corresponding to the vehicle and a second turn signal on/off state corresponding to the other vehicle, wherein the first turn signal on/off state indicates whether the turn signal of the vehicle is on within the critical period, the second turn signal on/off state indicates whether the turn signal of the other vehicle is on within the critical period, and the image data further indicates the at least one of the first turn signal on/off state and the second turn signal on/off state.
Compared with the prior art, when determining that the collision condition is met, the accident recording system can identify the target object related to the collision condition from the environment sensing data, and then generate and output image data capable of showing the relative positional relationship between the vehicle and the target object within the critical period. Therefore, when the vehicle is involved in a collision accident, the accident recording system can promptly reproduce the occurrence process of the collision accident with the image data for reference by the parties involved and investigators, thereby effectively assisting in clarifying accident details and genuinely remedying the inconvenience of the prior art.
Detailed Description
Before the present invention is described in detail, it should be noted that in the following description, like elements are denoted by the same reference numerals. Unless otherwise defined, the term "electrical connection" as used herein refers broadly both to a "wired electrical connection" in which a plurality of electronic devices/apparatuses/elements are connected to each other through conductive materials, and to a "wireless electrical connection" in which one-way/two-way wireless signal transmission is performed through wireless communication technology. In addition, unless otherwise specified, the term "electrically connected" as used herein covers both a "direct electrical connection" formed by directly connecting a plurality of electronic devices/apparatuses/elements to each other, and an "indirect electrical connection" formed by indirectly connecting a plurality of electronic devices/apparatuses/elements to each other through other electronic devices/apparatuses/elements.
Referring to fig. 1, a first embodiment of the accident recording system 1 of the present invention is suitable for being disposed on a vehicle 5. In the application of the present embodiment, the vehicle 5 may be, for example, a small passenger car; however, in other applications, the vehicle 5 may be, for example, a motorcycle, a truck or a large passenger car, and is not limited to the small passenger car of the present embodiment.
In this embodiment, the accident recording system 1 includes, for example, a storage unit 11, a positioning unit 12, an activity sensing unit 13, an environment sensing unit 14, and a processing unit 15 electrically connected to the storage unit 11, the positioning unit 12, the activity sensing unit 13 and the environment sensing unit 14.
In the present embodiment, the storage unit 11 stores, for example, an electronic map and a plurality of icons, wherein the electronic map is implemented as a two-dimensional planar map presented in a top view, and each icon is, for example, a two-dimensional image presenting an object (e.g., a vehicle). Also, in this embodiment the storage unit 11 is implemented, for example, as a memory module for storing digital data. However, in other embodiments, the storage unit 11 may be implemented as a conventional hard disk, a solid state disk, or another type of computer-readable recording medium, or a combination of different types of computer-readable recording media.
The positioning unit 12 is implemented, for example, as a satellite positioning module based on satellite positioning technology, and the positioning unit 12 is able to determine its own current position in real time, for example, by receiving satellite signals. More specifically, in the present embodiment, the satellite signals may, for example, come from the Global Positioning System (GPS); that is, the positioning unit 12 is, for example, a GPS satellite positioning module in the present embodiment. However, in other embodiments, the satellite signals may, for example, come from other systems collectively referred to as the Global Navigation Satellite System (GNSS) that use satellite signals to implement real-time positioning, such as the BeiDou Navigation Satellite System (BDS), the Galileo positioning system (Galileo), or the GLONASS system, and thus the implementation of the positioning unit 12 is not limited to this embodiment.
The activity sensing unit 13 is implemented, for example, as a six-axis motion sensor including an acceleration sensor (accelerometer) and a gyroscope, and the activity sensing unit 13 is fixedly disposed on the vehicle 5 so that it can be used to sense the speed, direction and angular variation of the displacement and rotation of the vehicle 5. It should be noted that the acceleration sensor may also be referred to as an accelerometer or a gravity sensor (G-Sensor), and in other embodiments the activity sensing unit 13 may, for example, include only one of the acceleration sensor and the gyroscope, and is not limited to this embodiment.
The environment sensing unit 14 includes, for example, a shooting module 141 and a radar module 142. In more detail, in the present embodiment, the shooting module 141 has, for example, a plurality of shooting lenses used for capturing video around the vehicle 5. On the other hand, the radar module 142 has, for example, a plurality of millimeter wave radars for transmitting millimeter waves toward the surroundings of the vehicle 5, thereby scanning objects around the vehicle 5 and sensing the positions and activities of the objects. It should be noted that, in other embodiments, the environment sensing unit 14 may include only one of the shooting module 141 and the radar module 142, and each of the millimeter wave radars may also be replaced by, for example, a nanometer wave radar, an ultrasonic radar, an infrared radar, or a lidar, and is not limited to this embodiment.
In the present embodiment, the processing unit 15 may, for example, include a central processing unit disposed in the vehicle 5, and the processing unit 15 is, for example, electrically connected to a driving computer (not shown) of the vehicle 5 and capable of communicating with the driving computer so as to obtain various operating parameters (such as driving speed) of the vehicle 5 from the driving computer, but is not limited thereto.
It should be noted that the accident recording system 1 of the present embodiment can, for example, be manufactured and sold separately and installed on the vehicle 5 after the vehicle 5 leaves the factory. However, in other embodiments, the accident recording system 1 may be built into the vehicle 5 before the vehicle 5 leaves the factory, and the processing unit 15 may, for example, be implemented as the driving computer of the vehicle 5 or a part thereof, so the practical implementation of the accident recording system 1 is not limited to this embodiment.
Referring to fig. 1 and 2 together, the following exemplarily describes in detail how the accident recording system 1 of the present embodiment implements an accident recording method during traveling of the vehicle 5, and for convenience of description, the vehicle 5 provided with the accident recording system 1 is referred to herein as a first vehicle in the application of the present embodiment.
First, in step S1, the processing unit 15, for example, continuously receives positioning data generated by the positioning unit 12, activity sensing data generated by the activity sensing unit 13, and environment sensing data generated by the environment sensing unit 14.
In more detail, in the present embodiment, the positioning data is, for example, the result of the real-time positioning performed by the positioning unit 12, and indicates the current position of the first vehicle, which corresponds to a position in the electronic map. On the other hand, the activity sensing data is, for example, the result of the activity sensing unit 13 sensing its own activity in six degrees of freedom, and correspondingly indicates the activity pattern of the first vehicle. In another aspect, the environment sensing data includes, for example, dynamic image data generated by the video shooting performed by the shooting module 141, and scan result data generated by the scanning performed by the radar module 142 through the millimeter wave radars. The positioning data, the activity sensing data and the environment sensing data are, for example, updated in real time.
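For illustration only, the three continuously updated data streams described above may be modeled as in the following minimal Python sketch; all type names and fields are hypothetical and are not part of the claimed embodiment.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PositioningSample:
    """One real-time positioning fix from the positioning unit 12 (hypothetical fields)."""
    t: float            # timestamp in seconds
    lat: float          # latitude in degrees
    lon: float          # longitude in degrees

@dataclass
class ActivitySample:
    """One six-degree-of-freedom reading from the activity sensing unit 13."""
    t: float
    accel: Tuple[float, float, float]   # linear acceleration (m/s^2) on x, y, z
    gyro: Tuple[float, float, float]    # angular rate (rad/s) about x, y, z

@dataclass
class EnvironmentSample:
    """One environment snapshot: a video frame plus a radar scan."""
    t: float
    frame: bytes                                      # encoded frame from the shooting module 141
    radar_points: List[Tuple[float, float, float]]    # (range m, azimuth rad, radial speed m/s)
```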
In the case that the processing unit 15 continuously receives the positioning data, the activity sensing data and the environment sensing data, the flow proceeds to step S2.
In step S2, the processing unit 15 determines whether the activity of the first vehicle meets a collision condition according to the activity sensing data. More specifically, the collision condition represents that the first vehicle collides with or is impacted by an object, and the processing unit 15 may determine whether the collision condition is met according to the acceleration change indicated by the activity sensing data, for example. It should be noted that the manner in which the processing unit 15 determines whether the collision condition is met may be achieved by any other conventional technique, and is not limited to this embodiment.
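As an illustrative sketch of one conventional way to implement the determination of step S2 (the embodiment expressly leaves the technique open), the processing unit could flag the collision condition when the change in acceleration between consecutive activity samples exceeds a threshold; the threshold value below is an assumption.

```python
import math

ACCEL_CHANGE_THRESHOLD = 50.0  # m/s^2 change between consecutive samples; illustrative value only

def collision_condition_met(prev: "ActivitySample", curr: "ActivitySample") -> bool:
    """Return True when the magnitude of the acceleration change between two
    consecutive activity samples exceeds the threshold, i.e. the activity of
    the first vehicle meets the (illustrative) collision condition."""
    delta = [c - p for c, p in zip(curr.accel, prev.accel)]
    return math.sqrt(sum(d * d for d in delta)) > ACCEL_CHANGE_THRESHOLD
```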
If the processing unit 15 determines that the activity of the first vehicle does not meet the collision condition, the flow returns to step S2, for example after a predetermined period elapses; if the processing unit 15 determines that the activity of the first vehicle meets the collision condition, the flow proceeds to step S3.
In step S3, once the processing unit 15 determines that the collision condition is met, the processing unit 15 identifies, from the environment sensing data, a target object that is related to the collision condition and is presented by the environment sensing data. In more detail, in the present embodiment, the processing unit 15 may, for example, use an image recognition technology to identify, according to the dynamic image data of the environment sensing data, the target object that collides with the first vehicle (thereby causing the collision condition to be met). The target object may be, for example, another vehicle in motion, and for convenience of explanation, the other vehicle involved in meeting the collision condition is referred to herein as a second vehicle in the application of the present embodiment.
It should be noted that, in practical applications, the target object is not limited to other vehicles, but may be another obstacle (such as a utility pole, a tree, or a wall). In other embodiments, the processing unit 15 may recognize the target object according to both the dynamic image data and the scan result data, or according to the scan result data only, so the manner in which the processing unit 15 recognizes the target object is not limited to this embodiment.
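As a hedged sketch of the selection logic in step S3, assuming an upstream recognizer (image recognition and/or radar clustering, not shown) has already produced a list of tracked objects with their distances at impact time, the target object could simply be taken as the nearest such object; the `TrackedObject` type and the nearest-object rule are illustrative assumptions, not the claimed identification technique.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackedObject:
    """One object reported by an assumed upstream recognizer (hypothetical type)."""
    object_id: int
    category: str     # e.g. "vehicle", "utility pole", "tree", "wall"
    range_m: float    # distance from the first vehicle when the collision condition is met

def identify_target(objects: List[TrackedObject]) -> TrackedObject:
    """Select the target object related to the collision condition as the
    object nearest to the first vehicle at impact time (illustrative rule)."""
    return min(objects, key=lambda o: o.range_m)
```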
After the processing unit 15 recognizes the target object (the second vehicle in the present embodiment) from the environment sensing data, the flow proceeds to step S4.
Referring to fig. 3, in step S4, the processing unit 15 obtains a first speed parameter P1 corresponding to the first vehicle, calculates a second speed parameter P2 corresponding to the second vehicle according to the environment sensing data, and calculates a distance parameter P3 corresponding to the first vehicle and the second vehicle, wherein the first speed parameter P1, the second speed parameter P2 and the distance parameter P3 correspond, for example, to a critical period associated with the time point at which the collision condition is met.
In this embodiment, the critical period may, for example, be a period that begins before the collision condition changes from "not met" to "met" and ends when the first vehicle stops moving. More specifically, the starting time point of the critical period may be, for example, X seconds before the processing unit 15 determines that the collision condition is met, and X may be any value. On the other hand, the end time point of the critical period may be, for example, the time point at which the first vehicle completely stops moving after the collision condition is met (i.e., after the collision occurs). However, in other embodiments, the critical period may, for example, be implemented as the period from X seconds before the collision condition is met to Y seconds after the collision condition is met, and Y may be any value. Generally, the critical period may be implemented in any manner so long as it includes the time point at which the processing unit 15 determines that the collision condition is met; thus, the implementation of the critical period is not limited to this embodiment.
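The critical period described above can be expressed as a simple time window, as in the sketch below; the choice of X, the stop-speed threshold, and the fallback rule are all illustrative assumptions.

```python
def critical_period(impact_t, activity, x_seconds=5.0, stop_speed=0.1):
    """Return (start, end) of the critical period: from X seconds before the
    collision condition is met until the first vehicle stops moving.

    `activity` is a chronological list of (t, speed_m_s) samples; the X value
    and the stop-speed threshold are illustrative assumptions.
    """
    start = impact_t - x_seconds
    for t, speed in activity:
        if t > impact_t and speed < stop_speed:
            return (start, t)          # the first vehicle has come to a stop
    # Fallback: the vehicle never stopped within the recorded samples.
    end = activity[-1][0] if activity else impact_t
    return (start, end)
```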
In more detail, the first speed parameter P1 may, for example, indicate the moving speed of the first vehicle at a plurality of time points within the critical period; that is, the value of the first speed parameter P1, for example, changes dynamically in this example. The processing unit 15 may obtain the first speed parameter P1 from the driving computer of the first vehicle, but is not limited thereto. In other embodiments, the first speed parameter P1 may, for example, indicate the moving speed of the first vehicle M seconds before the collision condition is met, where M may be any value; in other words, the value of the first speed parameter P1 is not necessarily dynamic, and any implementation of the first speed parameter P1 is acceptable as long as it indicates the moving speed of the first vehicle before the collision condition is met.
On the other hand, the second speed parameter P2 may, for example, indicate the moving speed of the second vehicle at a plurality of time points within the critical period; that is, the value of the second speed parameter P2 also, for example, changes dynamically in this example. The processing unit 15 may calculate the relative speed between the second vehicle and the first vehicle according to at least one of the scan result data and the dynamic image data, and thereby calculate the second speed parameter P2 corresponding to the second vehicle. However, in other embodiments, the second speed parameter P2 may, for example, indicate the moving speed of the second vehicle N seconds before the collision condition is met, where N may be any value; in other words, the value of the second speed parameter P2 is not necessarily dynamic, and any implementation of the second speed parameter P2 is acceptable as long as it indicates the moving speed of the second vehicle before the collision condition is met.
In another aspect, the distance parameter P3 indicates, for example, the distance between the first vehicle and the second vehicle at a plurality of time points within the critical period; that is, the value of the distance parameter P3 also changes dynamically in this example, but is not limited thereto.
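A minimal sketch of how P1, P2 and P3 could be derived is given below, assuming the radar module 142 reports the target's range and radial speed and the driving computer reports the first vehicle's own speed; treating the two speeds as collinear is a deliberate simplification of the full vector computation.

```python
def speed_and_distance_parameters(samples):
    """Compute illustrative P1/P2/P3 time series.

    `samples` is a list of (t, own_speed_m_s, target_range_m, target_radial_speed_m_s),
    where the own speed comes from the driving computer and the target values
    come from the radar module 142. Estimating the second vehicle's speed as
    own speed plus radial (relative) speed assumes collinear motion.
    """
    p1 = [(t, v_own) for t, v_own, _, _ in samples]           # first vehicle speed
    p2 = [(t, v_own + v_rel) for t, v_own, _, v_rel in samples]  # second vehicle speed estimate
    p3 = [(t, rng) for t, _, rng, _ in samples]                # inter-vehicle distance
    return {"P1": p1, "P2": p2, "P3": p3}
```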
After the processing unit 15 obtains the first speed parameter P1, the second speed parameter P2 and the distance parameter P3, the flow proceeds to step S5.
In step S5, the processing unit 15 generates image data D (exemplarily shown in fig. 3). In the present embodiment, the image data D is, for example, a dynamic image composed of a plurality of static images as shown in fig. 3; more specifically, the image data D may be implemented as an animated PNG (APNG) file or a GIF file, but is not limited thereto. The image data D is generated by the processing unit 15 according to the electronic map, the positioning data, a first icon M1 corresponding to the first vehicle among the icons, a second icon M2 corresponding to the second vehicle among the icons, the first speed parameter P1, the second speed parameter P2, and the distance parameter P3, but is not limited thereto.
In more detail, in the present embodiment, the processing unit 15 generates the image data D by, for example, determining a critical portion of the electronic map according to the positioning data, the critical portion corresponding to the position of the first vehicle in the real environment when the collision condition is met. The processing unit 15 also determines, for example, the corresponding position and moving manner in which the first icon M1 corresponding to the first vehicle is to be displayed on the critical portion according to the part of the positioning data corresponding to the critical period, and calculates the relative positional relationship between the second vehicle and the first vehicle within the critical period according to at least one of the scan result data and the dynamic image data, so as to determine the corresponding position and moving manner in which the second icon M2 corresponding to the second vehicle is to be displayed on the critical portion. Thus, with the critical portion of the electronic map, the first icon M1 and the second icon M2, the image data D dynamically presents the positions of the first vehicle and the second vehicle in the real environment during the critical period, as well as the movement process and relative positional relationship of the first vehicle and the second vehicle during the critical period, for example as shown in fig. 3.
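The placement of the icons M1 and M2 on the critical portion of the electronic map can be sketched as a coordinate conversion, as below; the flat-earth approximation, the pixel scale, and the data layout are illustrative assumptions, and the actual raster drawing is left to any image library.

```python
import math

METERS_PER_DEG_LAT = 111_320.0  # flat-earth approximation; illustrative constant

def to_map_pixels(lat, lon, origin_lat, origin_lon, pixels_per_meter=2.0):
    """Convert a latitude/longitude fix into top-view pixel coordinates on the
    critical portion of the electronic map, whose top-left corner is the origin."""
    dy = (origin_lat - lat) * METERS_PER_DEG_LAT
    dx = (lon - origin_lon) * METERS_PER_DEG_LAT * math.cos(math.radians(origin_lat))
    return (dx * pixels_per_meter, dy * pixels_per_meter)

def compose_frames(track1, track2, origin):
    """Yield one placement per time step: where icon M1 (first vehicle) and
    icon M2 (second vehicle) should be drawn on the critical portion.
    `track1`/`track2` are chronological lists of (t, lat, lon); `origin` is
    the (lat, lon) of the critical portion's top-left corner."""
    for (t, la1, lo1), (_, la2, lo2) in zip(track1, track2):
        yield {"t": t,
               "M1": to_map_pixels(la1, lo1, *origin),
               "M2": to_map_pixels(la2, lo2, *origin)}
```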
Further, in the present embodiment, the image data D dynamically shows the numerical changes of the first speed parameter P1, the second speed parameter P2 and the distance parameter P3 during the critical period, for example as shown in fig. 3, so that, compared to prior-art dashcam footage and accident scene diagrams drawn after the fact, the image data D can additionally show the speed changes of the first vehicle and the second vehicle and the change in distance between them during the critical period. It should be noted that the image data D of the present embodiment can display not only the "result" of the collision accident between the first vehicle and the second vehicle, but also the "process" of the collision accident, so the image data D can reveal more detailed information about the collision accident than an accident scene diagram drawn after the fact, thereby helping to clarify the process and cause of the collision accident.
After the processing unit 15 generates the image data D, the flow proceeds to step S6.
In step S6, the processing unit 15 outputs the image data D. In this embodiment, the processing unit 15 may, for example, wirelessly transmit the image data D to a user device (e.g. via a network or Bluetooth) for the user device to display the image data D, where the user device may be, for example, a smart phone or a tablet computer, but is not limited thereto. In other examples, the processing unit 15 may output the image data D by, for example, sending the image data D to an email address preset by a user, and thus the manner in which the processing unit 15 outputs the image data D is not limited to this embodiment.
After the processing unit 15 outputs the image data D, the accident recording method of the present embodiment ends.
The above is an exemplary description of how the accident recording system 1 of the present embodiment implements the accident recording method.
It should be noted that, in other embodiments, the image data D may be, for example, a still image (e.g., a JPEG file), and in such embodiments the image data D preferably shows the relative positional relationship between the first vehicle and the second vehicle at the time point when the processing unit 15 determines that the collision condition is met. On the other hand, in other embodiments, the storage unit 11 may not store an electronic map; the processing unit 15 may then, for example, use an image recognition technology to identify the driving environment around the first vehicle from the dynamic image data of the received environment sensing data, and generate the image data D according to the identified driving environment instead of according to the electronic map.
It should be understood that steps S1 to S6 and the flowchart of fig. 2 of the present embodiment only illustrate one possible implementation of the accident recording method. Even if steps S1 to S6 are combined, split or reordered, the result still belongs to the possible implementations of the accident recording method as long as the same effects as the present embodiment can be achieved; therefore, steps S1 to S6 and the flowchart of fig. 2 are not intended to limit the possible implementation scope of the accident recording system and method of the present invention.
The above is a description of the first embodiment of the accident recording system 1 of the present invention.
The invention also provides a second embodiment of the accident recording system 1 and the second embodiment differs from the first embodiment in the accident recording method implemented by the accident recording system 1.
Specifically, in step S3 of the accident recording method of the second embodiment, the processing unit 15 further determines whether any traffic signal lamp (i.e. traffic light) is presented in the dynamic image data of the environment sensing data. In the case that the processing unit 15 determines that the dynamic image data presents one or more traffic signal lamps, and the processing unit 15 is able to identify the signal state (i.e. red, yellow or green) of each traffic signal lamp according to the dynamic image data, the processing unit 15 generates the image data D in step S5 further according to one or more signal icons, among the icons, corresponding to the traffic signal lamp(s). Therefore, in the second embodiment, the image data D further presents the signal icons respectively corresponding to the traffic signal lamps, and each signal icon dynamically presents, for example, the signal state of the corresponding traffic signal lamp within the critical period, thereby helping to further clarify the cause and process of the collision accident between the first vehicle and the second vehicle. In a further embodiment, the processing unit 15 may also determine whether any road marking (such as a mesh line, a channelization line, or a no-stopping line) is presented in the dynamic image data of the environment sensing data, and if the determination result is affirmative, add a corresponding marking graphic to the image data D. It should be noted that, in the second embodiment, for each traffic signal lamp presented by the dynamic image data, the processing unit 15 can determine, for example, the relative position and distance between the traffic signal lamp and the vehicle 5 in the real environment according to the position of the traffic signal lamp in the dynamic image, the shooting direction of the shooting module 141 relative to the vehicle 5, and the related activity sensing data measured by the activity sensing unit 13. The processing unit 15 can then determine the corresponding position of the traffic signal lamp in the electronic map according to that relative position and distance, or further according to that relative position and distance together with the positioning data (i.e. the positioning result of the positioning unit 12), and thereby determine the corresponding position at which the signal icon corresponding to the traffic signal lamp should be presented in the image data D, so that the second embodiment of the accident recording system 1 can ensure that each signal icon reflects the correct position of the corresponding traffic signal lamp in the real environment.
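The per-frame signal state that each signal icon presents could, for example, be derived by classifying the dominant color of the detected traffic-light region, as in the following sketch; the color thresholds are illustrative assumptions standing in for whatever recognition technique is actually employed.

```python
def classify_signal_state(mean_rgb):
    """Map the mean RGB color of a detected traffic-light region to a signal
    state; the thresholds are illustrative assumptions."""
    r, g, b = mean_rgb
    if r > 150 and g > 150:
        return "yellow"
    if r > 150:
        return "red"
    if g > 150:
        return "green"
    return "unknown"

def signal_states_in_period(frames, start, end):
    """`frames` is a list of (t, mean_rgb) for one traffic signal lamp; return
    the per-timestamp signal states within the critical period [start, end]."""
    return [(t, classify_signal_state(rgb)) for t, rgb in frames if start <= t <= end]
```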
The above is a description of the second embodiment of the accident recording system 1 of the present invention.
The invention also provides a third embodiment of the accident recording system 1, and the third embodiment is different from the first embodiment in the accident recording method implemented by the accident recording system 1.
Specifically, in step S5 of the accident recording method according to the third embodiment, when the target object is the second vehicle, the processing unit 15 identifies the type of the second vehicle (e.g. bicycle, motorcycle, passenger car, truck, bus, etc.) according to the dynamic image data of the environment sensing data, and selects, from the icons, a second icon M2 that depicts the type of the second vehicle, so that the image data D can more faithfully represent the process and result of the collision accident between the first vehicle and the second vehicle.
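The icon selection of the third embodiment reduces to a lookup from the recognized vehicle type to a stored icon, as sketched below; the type labels and file names are hypothetical.

```python
ICON_BY_VEHICLE_TYPE = {
    "bicycle": "icon_bicycle.png",
    "motorcycle": "icon_motorcycle.png",
    "passenger_car": "icon_passenger_car.png",
    "truck": "icon_truck.png",
    "bus": "icon_bus.png",
}

def select_second_icon(vehicle_type):
    """Return the stored icon matching the recognized type of the second
    vehicle, falling back to a generic vehicle icon."""
    return ICON_BY_VEHICLE_TYPE.get(vehicle_type, "icon_generic_vehicle.png")
```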
The above is a description of a third embodiment of the accident recording system 1 according to the present invention.
The invention also provides a fourth embodiment of the accident recording system 1, and the fourth embodiment differs from the first embodiment in the accident recording method implemented by the accident recording system 1.
Specifically, in step S3 of the accident recording method according to the fourth embodiment, in the case that the processing unit 15 determines that the collision condition is met and the target object is the second vehicle, the processing unit 15 also determines, for example, a first turn signal on/off state corresponding to the first vehicle and a second turn signal on/off state corresponding to the second vehicle. The first turn signal on/off state indicates whether the turn signal of the first vehicle is turned on within the critical period, and the processing unit 15 determines the first turn signal on/off state, for example, according to the operating parameters from the driving computer. On the other hand, the second turn signal on/off state indicates whether the turn signal of the second vehicle is turned on within the critical period, and the processing unit 15 performs image recognition on the dynamic image data of the environment sensing data to determine whether at least one side of the head and/or tail of the second vehicle has a regularly blinking light source, and thereby determines the second turn signal on/off state. In step S5 of the fourth embodiment, the processing unit 15 further generates the image data D according to the first turn signal on/off state and the second turn signal on/off state; in other words, in the fourth embodiment the image data D further indicates the first turn signal on/off state and the second turn signal on/off state. Therefore, if either of the first vehicle and the second vehicle collides with the other while turning, the indicated first and second turn signal on/off states help clarify whether the driving behavior of each driver was proper. It should be noted that, in a similar embodiment, the processing unit 15 may determine only one of the first turn signal on/off state and the second turn signal on/off state, and the image data D may correspondingly indicate only that one of the two states.
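Detecting a "regularly blinking light source" could, for example, amount to counting on/off transitions of the lamp region's brightness and checking that the implied blink rate lies in the typical turn-signal band (roughly 1 to 2 Hz); every threshold in the sketch below is an illustrative assumption.

```python
def turn_signal_on(brightness, fps=30.0, level=0.6, min_hz=1.0, max_hz=2.5):
    """`brightness` is a per-frame normalized brightness (0..1) of the lamp
    region at one corner of the second vehicle's head or tail. A turn signal
    is reported as on when the on/off toggle rate lies in the typical range."""
    states = [b > level for b in brightness]
    toggles = sum(1 for a, b in zip(states, states[1:]) if a != b)
    duration_s = max(len(brightness) - 1, 1) / fps
    blink_hz = toggles / (2.0 * duration_s)  # two toggles per blink cycle
    return min_hz <= blink_hz <= max_hz
```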
It should be noted that the technical means of the foregoing first to fourth embodiments can be combined with one another.
In summary, by implementing the accident recording method, the accident recording system 1 can, upon determining that the collision condition is met, identify from the environment sensing data the target object that causes the collision condition to be met, and then generate and output the image data D representing the relative positional relationship between the vehicle 5 and the target object during the critical period. Thereby, when the vehicle 5 is involved in a collision accident, the accident recording system 1 can promptly reproduce the occurrence process of the collision accident with the image data D for reference by the parties involved and investigators, effectively assisting in clarifying accident details, so that the object of the invention is truly achieved.
The foregoing is merely illustrative of the present invention and is not intended to limit it; any variation or substitution readily conceivable by a person skilled in the art shall fall within the scope of the present invention. Therefore, the protection scope of the invention is defined by the appended claims.