Encounter Risk Evaluation with a Forerunner UAV
"> Figure 1
<p>Forerunner demonstration setup with ambulance car (EGV), UAV and other vehicle.</p> "> Figure 2
<p>DJI M600 with the onboard unit installed.</p> "> Figure 3
<p>Block scheme of the EGV onboard hardware.</p> "> Figure 4
<p>Onboard hardware system of the EGV. 1: MikroTik router, 2: RTK GNSS electronics, 3: NI CAN interface, 4: 4S LiPo battery, 5: GNSS receiver antenna, 6: USB connection to laptop.</p> "> Figure 5
<p>Block scheme of the drone onboard hardware.</p> "> Figure 6
<p>Onboard hardware system of the aerial vehicle. 1: RTK GNSS electronics, 2: Basler camera, 3: Gimbal, 4: WiFi antennas, 5: Nvidia Jetson Xavier NX, 6: 4S LiPo battery.</p> "> Figure 7
<p>Map of ZalaZONE Proving Ground with the Smart City module (source: <a href="https://avlzalazone.com/testing-and-track/" target="_blank">https://avlzalazone.com/testing-and-track/</a>, accessed on 2 March 2023).</p> "> Figure 8
<p>An encounter scenario in ZalaZONE Smart City with EGV and other vehicle.</p> "> Figure 9
<p>Onboard software architecture and data flow of the aerial vehicle.</p> "> Figure 10
<p>Inference differences between Yolo_RL (<b>left</b>) and fine tuned (<b>right</b>) network. Note the missed car at the STOP sign in the left part.</p> "> Figure 11
<p>Position difference between back-projected and RTK EGV North positions.</p> "> Figure 12
<p>An encounter in the feasibility demonstration.</p> "> Figure 13
<p>Zoom of back-projected and RTK GNSS logged EGV positions.</p> "> Figure 14
<p>Definition of Encoder Event (EE).</p> "> Figure 15
<p>Velocity estimation with 1st order differentiation.</p> "> Figure 16
<p>EGV RTK GNSS velocity and estimated acceleration.</p> "> Figure 17
<p>EGV velocity estimation with SDKF position smoothing and line fitting.</p> "> Figure 18
<p>EGV acceleration estimation with two state Kalman filter.</p> "> Figure 19
<p>EGV estimated and measured position.</p> "> Figure 20
<p>EGV estimated and RTK GNSS measured velocity.</p> "> Figure 21
<p>EGV estimated and RTK GNSS estimated acceleration.</p> "> Figure 22
<p>Standard deviations of state estimates.</p> "> Figure 23
<p>Special relations of EGV and the other object.</p> "> Figure 24
<p>STD transients.</p> "> Figure 25
<p>Car braking times from different velocities.</p> "> Figure 26
<p>Data for circle overlap measure calculation.</p> "> Figure 27
<p>Relation of EGV stopped position and other vehicle trajectory.</p> "> Figure 28
<p>Short vehicle trajectories in flight 3 encounter 3 (magenta and red colors show the sections with valid danger notification).</p> "> Figure 29
<p>Short vehicle trajectories in flight 4 encounter 3 (magenta and red colors show the sections with valid danger notification).</p> "> Figure 30
<p>Decision with the basic method for the 2nd encounter of flight 2. Red and magenta colors show the sections of danger notification, while the red circle with cross is the emergency and the blue circle with cross is the firm braking stopped position of the EGV, respectively.</p> "> Figure 31
<p>Second encounters of flight 2 (<b>left</b>) and 3 (<b>right</b>) with improved decision. Red and magenta colors show the sections of danger notification, while the red circle with cross is the emergency and the blue circle with cross is the firm braking predicted stopped position of the EGV, respectively.</p> "> Figure 32
<p>Motion prediction for steady vehicles (blue EGV, red other vehicle).</p> "> Figure 33
<p>Motion prediction for moving vehicles (blue EGV, red other vehicle).</p> "> Figure 34
<p>Decisions for steady objects (<b>left</b>: the improved method, * means no decision, circle means object behind EGV, × means object in front of EGV; <b>right</b>: the basic method).</p> "> Figure 35
<p>Decisions for steady then moving objects (<b>left</b>: the improved method, * means no decision, circle means object behind EGV, × means object in front of EGV; <b>right</b>: the basic method).</p> "> Figure 36
<p>Decisions for 1st encounter (<b>left</b>: the improved method, * means no decision, circle means object behind EGV; <b>right</b>: the basic method).</p> "> Figure 37
<p>Decisions for 2nd encounter (<b>left</b>: the improved method, * means no decision, circle means object behind EGV, red and magenta × means danger of collision; <b>right</b>: the basic method, red and magenta means danger of collision while the red circle with cross is the emergency and the blue circle with cross is the firm braking predicted stopped position of the EGV, respectively).</p> "> Figure 38
<p>Decisions for 3rd encounter (<b>left</b>: the improved method, * means no decision, circle means object behind EGV, red and magenta × means danger of collision; <b>right</b>: the basic method, red and magenta means danger of collision while the red circle with cross is the emergency and the blue circle with cross is the firm braking predicted stopped position of the EGV, respectively).</p> "> Figure 39
<p>Decisions for EGV following the other vehicle (<b>left</b>: the improved method, × means special case having other vehicle in front of EGV moving away; <b>right</b>: the basic method).</p> "> Figure 40
<p>Histogram of DNT values for 2nd (<b>left</b>) and 3rd (<b>right</b>) encounters with basic and improved methods).</p> "> Figure 41
<p>Histogram of minimum distances for 2nd (<b>left</b>) and 3rd (<b>right</b>) encounters with basic and improved methods).</p> ">
Abstract
1. Introduction
2. System Hardware Structure and the Demonstration Scenarios
2.1. EGV Onboard Unit
2.2. Aerial Vehicle Onboard Unit
2.3. The Demonstration Scenarios
3. System Software Structure
3.1. Control Software
3.2. Camera Software
3.3. Detect Software
3.4. Speed of Onboard Operation
4. Training of Object Detection
5. Object Tracking in the Demonstrations
5.1. Projection of Bounding Box Centers to North-East-Down Coordinates
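The projection in this subsection maps detected bounding box centers into North-East-Down (NED) coordinates. As a hedged sketch of how such a back-projection can work, the following assumes a pinhole camera and flat ground; the intrinsics, pose values, and function name are illustrative assumptions, not the system's actual calibration:

```python
import numpy as np

def pixel_to_ned(u, v, K, R_ned_cam, t_ned, ground_d=0.0):
    """Back-project pixel (u, v) onto the ground plane D = ground_d (NED).

    K         -- 3x3 camera intrinsic matrix [px]
    R_ned_cam -- 3x3 rotation taking camera-frame vectors to NED
    t_ned     -- camera position in NED [m], D negative above ground
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray, camera frame
    ray_ned = R_ned_cam @ ray_cam                       # same ray expressed in NED
    s = (ground_d - t_ned[2]) / ray_ned[2]              # scale the ray to hit the ground
    return t_ned + s * ray_ned

# Illustrative setup: camera 50 m above ground, looking straight down,
# camera x-axis aligned with North (so the rotation is the identity).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 512.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                        # camera x -> N, y -> E, z -> D
t = np.array([0.0, 0.0, -50.0])
p = pixel_to_ned(740, 512, K, R, t)  # 100 px offset at f = 1000 px, 50 m -> 5 m North
```

A real pipeline would also undistort the pixel first and build `R_ned_cam` from the UAV attitude and gimbal angles.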
5.2. EGV Tracking and Management of Other Tracks
6. Risk Evaluation in the Encounters
6.1. Basic Decision in the Feasibility Demonstration (September 2022)
6.2. Improvements in the Decision Method
6.2.1. First Order Differentiation
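As a minimal illustration of first order differentiation (the variable names and sample values are ours, not the paper's), differencing logged positions gives a velocity estimate at the price of amplifying position noise:

```python
import numpy as np

def diff_velocity(positions, dt):
    """First order (backward) difference velocity estimate.

    positions -- sampled 1-D positions [m]
    dt        -- sampling time [s]
    Differentiation amplifies position noise, which is why smoothing
    and Kalman filtering variants are considered in the next subsections.
    """
    return np.diff(positions) / dt

v = diff_velocity(np.array([0.0, 0.5, 1.2, 2.1]), 0.1)  # -> [5.0, 7.0, 9.0] m/s
```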
6.2.2. Filtering and Polynomial Fit
6.2.3. Kalman Filtering Only
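A two-state Kalman filter (as referenced for acceleration estimation in the figures) can be sketched generically as a filter over a value and its derivative; the process model, noise intensities, and tuning below are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def kf_two_state(z, dt, q=1.0, r=0.25):
    """Two-state Kalman filter with state x = [value, derivative].

    z -- noisy measurements of the value (e.g. position -> velocity,
         or velocity -> acceleration estimation)
    q -- process noise intensity, r -- measurement noise variance
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-derivative model
    H = np.array([[1.0, 0.0]])                 # only the value is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])        # white-noise acceleration Q
    R = np.array([[r]])
    x = np.array([z[0], 0.0])
    P = np.eye(2)
    est = []
    for zk in z[1:]:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        Kg = P @ H.T @ np.linalg.inv(S)
        x = x + (Kg @ (np.array([zk]) - H @ x)).ravel()
        P = (np.eye(2) - Kg @ H) @ P
        est.append(x.copy())
    return np.array(est)

# A noise-free ramp of slope 5 should drive the derivative estimate to 5
t = np.arange(0.0, 10.0, 0.1)
est = kf_two_state(5.0 * t, 0.1)
```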
6.2.4. Motion Prediction and Risk Evaluation
- SC2: The object is in front of the EGV and comes toward it (head-on collision possibility), so the driver should see it. This means the object and its moving direction are in range relative to the EGV moving direction and the object moves toward the EGV.
- SC3: The object is in front of the EGV and goes in the same direction, so the driver should see it. This means the object and its moving direction are in range relative to the EGV moving direction and the object moves in the same direction.
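Special cases such as SC2 and SC3 above amount to checking where the object lies relative to the EGV heading and which way it moves; a hedged 2-D sketch (the forward-cone half-angle and the labels are our illustrative choices, not the tuned decision logic):

```python
import numpy as np

def classify_relation(p_egv, v_egv, p_obj, v_obj, half_cone_deg=60.0):
    """Classify the other object relative to the EGV (SC1-SC3 style).

    Returns 'behind' (SC1-like), 'head-on' (SC2-like) or
    'same-direction' (SC3-like). The cone half-angle is an
    illustrative assumption.
    """
    heading = v_egv / np.linalg.norm(v_egv)
    rel = p_obj - p_egv
    # Is the object inside the forward cone of the EGV?
    cos_pos = rel @ heading / np.linalg.norm(rel)
    if cos_pos < np.cos(np.radians(half_cone_deg)):
        return 'behind'
    # In front: does it move with or against the EGV heading?
    return 'same-direction' if v_obj @ heading > 0 else 'head-on'

# Object 30 m ahead, driving toward the EGV:
case = classify_relation(np.array([0.0, 0.0]), np.array([10.0, 0.0]),
                         np.array([30.0, 2.0]), np.array([-8.0, 0.0]))
# -> 'head-on'
```

With `p_obj` behind the EGV the same check returns 'behind', covering the SC1-style case.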
7. Parameterized Basic and Improved Decision Methods for Tuning
7.1. Parameterized Basic Decision Method
7.2. Parameterized Improved Decision Method
7.3. Evaluation of the Situation after Decision
8. Parameter Tuning of the Methods
- 1st encounter: The other vehicle comes slowly from the right and stops well in time. This should be classified safe by the system.
- 2nd encounter: The other vehicle comes faster from the right and stops with emergency braking. This should be classified safe.
- 3rd encounter: The other vehicle comes fast from the right and does not stop; the EGV should stop. This should be classified dangerous.
8.1. Tuning Results for The Basic Decision Method
- Decide safe situation for 1st encounter.
- Decide dangerous situation for 3rd encounter.
- Give a safe stopping distance at least with emergency braking for the 3rd encounter.
8.2. Tuning Results for the Improved Decision Method
9. Detailed Evaluation with the Selected Parameters
- No decision: blue and cyan ×.
- Danger of collision: red and magenta ×.
- No decision: blue and cyan star.
- Danger of collision: red and magenta ×.
- SC1 special mode, object behind EGV: blue and cyan circle.
- SC2 special mode, object in front of EGV moving toward it: blue and cyan plus.
- SC3 special mode, object in front of EGV moving away from it: blue and cyan ×.
10. Conclusions
Supplementary Materials
Author Contributions
Funding
Conflicts of Interest
References
Classes | Precision | Recall | F1 | mAP@0.5 | mAP@0.5:0.95 |
---|---|---|---|---|---|
All | 0.938 | 0.826 | 0.878 | 0.878 | 0.581 |
Cars | 0.941 | 0.979 | 0.959 | 0.977 | 0.827 |
Bicycles | 0.938 | 0.861 | 0.897 | 0.918 | 0.54 |
Pedestrians | 0.934 | 0.637 | 0.757 | 0.739 | 0.377 |
Classes | Precision | Recall | F1 | mAP@0.5 | mAP@0.5:0.95 |
---|---|---|---|---|---|
Cars | 0.941 | 0.94 | 0.94 | 0.95 | 0.775 |
Direction | Flight 1 | Flight 2 | Flight 3 | Flight 4 | Average |
---|---|---|---|---|---|
North | 0.058 | 0.26 | 0.315 | 0.0662 | 0.1748 |
East | 1.43 | 0.93 | 0.333 | 0.2 | 0.7233 |
Braking Type | 20 km/h | 30 km/h | 40 km/h | 50 km/h |
---|---|---|---|---|
Firm | 1.64 | 1.88 | 2.6 | 2.8 |
Emergency | 0.98 | 1.28 | 1.58 | 1.8 |
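Braking-time tables like this one can feed the predicted stopped positions used by the decision methods; a minimal sketch, assuming a linear velocity ramp-down during braking and simple interpolation of the measured times (the paper's actual prediction may differ):

```python
import numpy as np

# Measured firm-braking times [s] at the tabulated speeds [km/h] (from the table)
speeds_kmh = np.array([20.0, 30.0, 40.0, 50.0])
firm_times = np.array([1.64, 1.88, 2.6, 2.8])

def stopping_distance(v_kmh):
    """Predict the firm-braking stopping distance [m] from speed v_kmh.

    Assumes velocity decreases linearly to zero during the braking
    time (distance = v * t / 2); the linear interpolation of the
    table is our illustrative choice.
    """
    t_brake = np.interp(v_kmh, speeds_kmh, firm_times)
    v_ms = v_kmh / 3.6
    return v_ms * t_brake / 2.0

d = stopping_distance(30.0)  # -> 30/3.6 * 1.88 / 2 ~ 7.83 m
```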
Parameter | MIN | Step | MAX |
---|---|---|---|
W [-] | 3 | 1 | 9 |
[s] | 1 | 0.5 | 2.5 |
[m] | 2 | 2 | 6 |
[-] | 2 | 2 | 8 |
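Parameter tables with MIN/Step/MAX columns like the one above suggest an exhaustive grid sweep; a sketch with placeholder parameter names and a placeholder cost (a real tuner would replay the logged encounters and apply the safe/dangerous decision criteria of Section 8):

```python
import itertools
import numpy as np

# (MIN, Step, MAX) ranges in the style of the table above;
# all names except W are placeholders for the stripped symbols.
ranges = {
    'W':  (3, 1, 9),
    'p2': (1.0, 0.5, 2.5),
    'p3': (2.0, 2.0, 6.0),
    'p4': (2, 2, 8),
}

def grid(lo, step, hi):
    """Inclusive range lo, lo+step, ..., hi."""
    return np.arange(lo, hi + step / 2, step)

def evaluate(params):
    """Placeholder cost; a real tuner scores decisions and stopping
    margins over the recorded encounters."""
    return sum(params.values())

best = min(
    (dict(zip(ranges, combo))
     for combo in itertools.product(*(grid(*r) for r in ranges.values()))),
    key=evaluate,
)
# Grid sizes here: 7 * 4 * 3 * 4 = 336 combinations
```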
Notation | Decision Braking Style | Evaluation Braking Style |
---|---|---|
FF | Firm | Firm |
EE | Emergency | Emergency |
FE | Firm | Emergency |
Parameter | MIN | Step | MAX |
---|---|---|---|
[-] | 1 | 1 | 3 |
[-] | 1 | 1 | 3 |
[s] | −0.2 | 0.1 | 0.2 |
[-] | 0 | 0.1 | 0.5 |
FF | - | - | - |
EE | - | - | - |
FE | - | - | - |
W [-] | [s] | [m] | [-] |
---|---|---|---|
4 | 1.5 | 6 | 8 |
4 | 2 | 2 | 6 |
4 | 2 | 2 | 8 |
Valid Combination | S1 | S2 | S3 | S4 | S5 | S6 |
---|---|---|---|---|---|---|
Flight 2/3rd Enc. | ||||||
1 | 1 | 2 | 2 | 3 | 3 | |
2 | 3 | 2 | 3 | 2 | 3 | |
[m] | 4.83 | 5.45 | 4.83 | 5.68 | 5.28 | 5.92 |
DNT () [s] | 2.3 | 3.5 | 2.3 | 3.7 | 2.7 | 3.8 |
Flight 1/3rd Enc. | ||||||
Case | S1 | S2 | S3 | S4 | S5 | S6 |
[m] | 1.46 | 1.46 | 1.46 | 1.46 | 1.46 | 1.46 |
DNT () [s] | 2.2 | 2.6 | 2.4 | 2.6 | 2.4 | 2.7 |
Flight 1/1st Enc. | ||||||
Case | S1 | – | S3 | – | – | – |
Flight 2/1st Enc. | ||||||
Case | – | – | – | – | – | – |
THS | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 |
[m] | 4.83 | 4.83 | 4.83 | 3.64 | 0 | 0 |
DNT () [s] | 2.3 | 2.3 | 2.3 | 1.5 | 0 | 0 |
[s] | −0.2 | −0.1 | 0 | 0.1 | 0.2 |
[m] | 3.64 | 4.83 | 4.83 | 5.28 | 5.18 |
DNT () [s] | 1.5 | 2.3 | 2.3 | 2.7 | 2.9 |
Flight/Encounter | Expected | Decision | Real MIN. Distance [m] | Firm MIN. Distance [m] | Emergency MIN. Distance [m] | DNT () [s] |
---|---|---|---|---|---|---|
Flight 1/1st enc. | S | S | 13.3 | - | - | - |
Flight 1/2nd enc. | S | D | 6.45 | 2.47 | 3.5 | 0.4 |
Flight 1/3rd enc. | D | D | 6.93 | −1.15 | 1.67 | 0 |
Flight 2/1st enc. | S | S | 11.17 | - | - | - |
Flight 2/2nd enc. | S | D | 10.1 | 8.6 | 8 | 0.2 |
Flight 2/3rd enc. | D | D | 9.15 | 8.24 | 8.81 | 0.2 |
Flight 3/1st enc. | S | S | 21.38 | - | - | - |
Flight 3/2nd enc. | S | D | 8.94 | 5.48 | 5.82 | 0.2 |
Flight 3/3rd enc. | D | D | 11 | 6 | 7 | 0.2 |
Flight 4/1st enc. | S | S | 16.3 | - | - | - |
Flight 4/2nd enc. | S | D | 7.9 | 5.6 | 6.65 | 0.4 |
Flight 4/3rd enc. | D | S | 12 | - | - | - |
Flight/Encounter | Expected | Decision | Real MIN. Distance [m] | Evaluation MIN. Distance [m] | DNT () [s] |
---|---|---|---|---|---|
Flight 1/1st enc. | S | S | 13.3 | - | - |
Flight 1/2nd enc. | S | D | 6.45 | 6.77 | 3.9 |
Flight 1/3rd enc. | D | D | 6.93 | −1.08 | 2.4 |
Flight 2/1st enc. | S | D | 11.17 | 3.9 | 1.0 |
Flight 2/2nd enc. | S | D | 10.1 | −3.2 | 3 |
Flight 2/3rd enc. | D | D | 9.15 | 3 | 2.3 |
Flight 3/1st enc. | S | S | 21.38 | - | - |
Flight 3/2nd enc. | S | D | 8.94 | −2.7 | 2.8 |
Flight 3/3rd enc. | D | D | 11 | 8 | 1.8 |
Flight 4/1st enc. | S | S | 16.3 | - | - |
Flight 4/2nd enc. | S | D | 7.9 | 6.42 | 3 |
Flight 4/3rd enc. | D | D | 12 | 6 | 1.0 |
Flight/Encounter | Expected | Decision | Real MIN. Distance [m] | Evaluation MIN. Distance [m] | DNT () [s] |
---|---|---|---|---|---|
Flight 1/1st enc. | S | S | 13.3 | - | - |
Flight 1/2nd enc. | S | D | 6.45 | 1.88 | 2.3 |
Flight 1/3rd enc. | D | D | 6.93 | 1.46 | 2.4 |
Flight 2/1st enc. | S | S | 11.17 | - | - |
Flight 2/2nd enc. | S | S | 10.1 | - | - |
Flight 2/3rd enc. | D | S | 9.15 | - | - |
Flight 3/1st enc. | S | S | 21.38 | - | - |
Flight 3/2nd enc. | S | S | 8.94 | - | - |
Flight 3/3rd enc. | D | D | 11 | 7 | 0.9 |
Flight 4/1st enc. | S | S | 16.3 | - | - |
Flight 4/2nd enc. | S | D | 7.9 | 5.62 | 1.8 |
Flight 4/3rd enc. | D | S | 12 | - | - |
Flight/Encounter | Expected | Decision | Real MIN. Distance [m] | Evaluation MIN. Distance [m] | DNT () [s] |
---|---|---|---|---|---|
Flight 1/1st enc. | S | S | 13.3 | - | - |
Flight 1/2nd enc. | S | D | 6.45 | 8.81 | 3.9 |
Flight 1/3rd enc. | D | D | 6.93 | 1.46 | 2.4 |
Flight 2/1st enc. | S | D | 11.17 | 6.49 | 1.0 |
Flight 2/2nd enc. | S | D | 10.1 | −1.11 | 3.0 |
Flight 2/3rd enc. | D | D | 9.15 | 4.83 | 2.3 |
Flight 3/1st enc. | S | S | 21.38 | - | - |
Flight 3/2nd enc. | S | D | 8.94 | −1 | 2.8 |
Flight 3/3rd enc. | D | D | 11 | 8.28 | 1.8 |
Flight 4/1st enc. | S | S | 16.3 | - | - |
Flight 4/2nd enc. | S | D | 7.9 | 7.55 | 3.0 |
Flight 4/3rd enc. | D | D | 12 | 7.37 | 1.0 |
Flight/Encounter | Expected | Decision | Real MIN. Distance [m] | Firm MIN. Distance [m] | Emergency MIN. Distance [m] | DNT (maxT) [s] |
---|---|---|---|---|---|---|
Flight 1/1st enc. | S | S | 13.3 | - | - | - |
Flight 1/2nd enc. | S | D | 6.45 | 6.63 | 7.66 | 1.9 |
Flight 1/3rd enc. | D | D | 6.93 | −1.15 | 1.67 | 2.1 |
Flight 2/1st enc. | S | D | 11.17 | 8.43 | 8.82 | 0.2 |
Flight 2/2nd enc. | S | D | 10.1 | 1.04 | 1.63 | 0.3 |
Flight 2/3rd enc. | D | D | 9.15 | 11.75 | 11.81 | 1.3 |
Flight 3/1st enc. | S | S | 21.38 | - | - | - |
Flight 3/2nd enc. | S | D | 8.94 | 5.43 | 5.76 | 1 |
Flight 3/3rd enc. | D | D | 11 | 1.75 | 3.29 | 1 |
Flight 4/1st enc. | S | S | 16.3 | - | - | - |
Flight 4/2nd enc. | S | D | 7.9 | 8.65 | 9.26 | 1.6 |
Flight 4/3rd enc. | D | D | 12 | 6.07 | 7.2 | 0.3 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Bauer, P.; Hiba, A.; Nagy, M.; Simonyi, E.; Kuna, G.I.; Kisari, Á.; Drotár, I.; Zarándy, Á. Encounter Risk Evaluation with a Forerunner UAV. Remote Sens. 2023, 15, 1512. https://doi.org/10.3390/rs15061512