FFireNet: Deep Learning Based Forest Fire Classification and Detection in Smart Cities
Figure 1. MobileNetV2 bottleneck residual blocks.
Figure 2. Working mechanism for forest fire detection.
Figure 3. Images from (a,b) Fire class and (c,d) No-Fire class.
Figure 4. (a) Resized original images of both classes. Data augmentations on (b) Fire images and (c) No-Fire images.
Figure 5. Training performance of the proposed approach on the forest fire dataset.
Figure 6. Confusion matrix of the proposed approach on the test forest fire dataset.
Figure 7. (a) ROC curve and (b) scaled representation.
Figure 8. (a) PR curve and (b) scaled representation.
Figure 9. True classifications: (a) positives and (b) negatives.
Figure 10. False positives (ground truth = No-Fire).
Figure 11. False negative (ground truth = Fire).
Abstract
1. Introduction
- Reviewing the literature on computer vision-based forest fire localization and classification methods in forest and wildland environments.
- Making use of a newly created dataset, this research further improves detection accuracy in classifying fire and no-fire images. Unlike previous wildfire research, which considers wildlands including bushes and farmlands, the dataset focuses specifically on forest settings.
- Proposing a CNN-based transfer learning approach named FFireNet for forest fire classification on the local dataset. The solution explores the MobileNetV2 model by exploiting the trained weights of the convolutional base and adding fully connected layers for learning complex features and classification.
- Evaluating the proposed FFireNet method and comparing it with other CNN models on the forest fire dataset using various performance metrics to validate the performance of the proposed approach.
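The transfer-learning design described above — a frozen MobileNetV2 convolutional base with new fully connected layers on top — can be sketched in Keras as follows. This is an illustrative sketch, not the authors' exact configuration: the 128-unit hidden layer and the 224×224 input size are assumptions, and `weights=None` is used here to avoid downloading the ImageNet weights the paper actually employs (pass `weights="imagenet"` to reproduce that).

```python
import tensorflow as tf

# Frozen convolutional base. The paper initializes from ImageNet weights;
# weights=None here only avoids the download in this sketch.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights=None, pooling="avg"
)
base.trainable = False  # keep the pre-trained features fixed

# New fully connected head for binary fire / no-fire classification.
# The 128-unit hidden layer is an assumed size, not taken from the paper.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # outputs P(fire)
])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```

Only the head's weights are updated during training; the bottleneck features learned on ImageNet are reused as-is.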
2. Literature Review
2.1. Localization
2.2. Classification
3. Forest Fire Detection System
3.1. Proposed Approach
3.2. MobileNetV2 Model
3.3. FFireNet
3.4. Activation Functions
3.5. Optimizer
3.6. Loss Function
4. Dataset
4.1. Pre-Processing on Dataset
4.2. Dataset Distribution
4.3. Augmentation of Data
5. Results and Discussion
5.1. Hyper-Parameter Selection
5.2. Performance Analysis of the Proposed Approach
5.2.1. Training Performance Analysis of Proposed Approach
5.2.2. Prediction Accuracy and Error Rate
5.2.3. True and False Rates
5.2.4. Precision and Recall
5.2.5. True Classifications
5.2.6. False Classifications
5.3. Performance Comparison with Other CNN Models
5.4. Performance Comparison with Other Research Works in the Literature
6. Conclusions
Author Contributions
Funding
Data Availability
Conflicts of Interest
References
- Zanchi, G.; Yu, L.; Akselsson, C.; Bishop, K.; Köhler, S.; Olofsson, J.; Belyazid, S. Simulation of water and chemical transport of chloride from the forest ecosystem to the stream. Environ. Model. Softw. 2021, 138, 104984. [Google Scholar] [CrossRef]
- Bo, M.; Mercalli, L.; Pognant, F.; Cat Berro, D.; Clerico, M. Urban air pollution, climate change and wildfires: The case study of an extended forest fire episode in northern Italy favoured by drought and warm weather conditions. In Proceedings of the 7th International Conference on Energy and Environment Research, Jawa Tengah, Indonesia, 14–17 September 2020; de Sá Caetano, N., Salvini, C., Giovannelli, A., Felgueiras, C., Eds.; Energy Reports. 2020; pp. 781–786. [Google Scholar]
- Vardoulakis, S.; Marks, G.; Abramson, M.J. Lessons learned from the Australian bushfires: Climate change, air pollution, and public health. JAMA Intern. Med. 2020, 180, 635–636. [Google Scholar] [CrossRef] [PubMed]
- How Different Tree Species Impact the Spread of Wildfire. Available online: https://open.alberta.ca/dataset/how-different-tree-species-impact-the-spread-of-wildfire (accessed on 20 August 2021).
- Maxouris, C. Here’s Just How Bad the Devastating Australian Fires Are–by the Numbers. Available online: https://edition.cnn.com/2020/01/06/us/australian-fires-by-the-numbers-trnd/index.html (accessed on 20 August 2021).
- State of California. Thomas Fire Incident Information. Available online: https://fire.ca.gov/incident/?incident=d28bc34e-73a8-454d-9e55-dea7bdd40bee (accessed on 20 August 2021).
- Rodrigues, M.; Gelabert, P.J.; Ameztegui, A.; Coll, L.; Vega-Garcia, C. Has COVID-19 halted winter-spring wildfires in the Mediterranean? Insights for wildfire science under a pandemic context. Sci. Total Environ. 2021, 765, 142793. [Google Scholar] [CrossRef] [PubMed]
- Kountouris, Y. Human activity, daylight saving time and wildfire occurrence. Sci. Total Environ. 2020, 727, 138044. [Google Scholar] [CrossRef] [PubMed]
- Kim, T.; Ramos, C.; Mohammed, S. Smart city and IoT. Future Gener. Comput. Syst. 2017, 76, 159–162. [Google Scholar] [CrossRef]
- Hsiao, Y.C.; Wu, M.H.; Li, S.C. Elevated Performance of the Smart City—A Case Study of the IoT by Innovation Mode. IEEE Trans. Eng. Manag. 2021, 68, 1461–1475. [Google Scholar] [CrossRef]
- Khan, A.I.; Al-Habsi, S. Machine Learning in Computer Vision. Procedia Comput. Sci. 2020, 167, 1444–1451. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
- Khan, A.; Khan, S.; Hassan, B.; Zheng, Z. CNN-Based Smoker Classification and Detection in Smart City Application. Sensors 2022, 22, 892. [Google Scholar] [CrossRef]
- Khan, S.; Teng, Y.; Cui, J. Pedestrian Traffic Lights Classification Using Transfer Learning in Smart City Application. In Proceedings of the IEEE 13th International Conference on Communication Software and Networks, Chongqing, China, 4–7 June 2021; pp. 352–356. [Google Scholar]
- Bu, F.; Gharajeh, M.S. Intelligent and vision-based fire detection systems: A survey. Image Vis. Comput. 2019, 91, 103803. [Google Scholar] [CrossRef]
- Bouguettaya, A.; Zarzour, H.; Taberkit, A.M.; Kechida, A. A review on early wildfire detection from unmanned aerial vehicles using deep learning-based computer vision algorithms. Signal Process. 2022, 190, 108309. [Google Scholar] [CrossRef]
- Zhang, Q.X.; Lin, G.H.; Zhang, Y.M.; Xu, G.; Wang, J.J. Wildland Forest Fire Smoke Detection Based on Faster R-CNN using Synthetic Smoke Images. Process Eng. 2019, 211, 441–446. [Google Scholar] [CrossRef]
- Hossain, F.M.A.; Zhang, Y.M.; Tonima, M.A. Forest fire flame and smoke detection from UAV-captured images using fire-specific color features and multi-color space local binary pattern. J. Unmanned Veh. Syst. 2020, 8, 285–309. [Google Scholar] [CrossRef]
- Jeong, M.; Park, M.; Nam, J.; Ko, B.C. Light-Weight Student LSTM for Real-Time Wildfire Smoke Detection. Sensors 2020, 20, 5508. [Google Scholar] [CrossRef] [PubMed]
- Srinivas, K.; Dua, M. Fog Computing and Deep CNN Based Efficient Approach to Early Forest Fire Detection with Unmanned Aerial Vehicles. In Proceedings of the International Conference on Inventive Computation Technologies (ICICIT), Tamil Nadu, India, 29–30 August 2019; Smys, S., Bestak, R., Rocha, A., Eds.; Springer Nature: Chennai, India, 2019; pp. 646–652. [Google Scholar]
- Alexandrov, D.; Pertseva, E.; Berman, I.; Pantiukhin, I.; Kapitonov, A. Analysis of Machine Learning Methods for Wildfire Security Monitoring with an Unmanned Aerial Vehicles. In Proceedings of the 24th Conference of Open Innovations Association FRUCT (Finnish-Russian University Cooperation in Telecommunications), Moscow, Russia, 8–12 April 2019; pp. 3–9. [Google Scholar]
- Zhang, Q.; Xu, J.; Xu, L.; Guo, H. Deep Convolutional Neural Networks for Forest Fire Detection. In Proceedings of the International Forum on Management, Education and Information Technology Application, Guangzhou, China, 30–31 January 2016; Kim, Y.H., Ed.; Atlantis Press: Amsterdam, The Netherlands, 2016; pp. 568–575. [Google Scholar]
- Jiao, Z.; Zhang, Y.; Mu, L.; Xin, J.; Jiao, S.; Liu, H.; Liu, D. A YOLOv3-based Learning Strategy for Real-time UAV-based Forest Fire Detection. In Proceedings of the Chinese Control and Decision Conference (CCDC), Hefei, China, 22–24 August 2020; IEEE: New York, NY, USA, 2020; pp. 4963–4967. [Google Scholar]
- Li, W.; Yu, Z. A Lightweight Convolutional Neural Network Flame Detection Algorithm. In Proceedings of the IEEE International Conference on Electronics Information and Emergency Communication (ICEIEC), Beijing, China, 18–20 June 2021; IEEE: New York, NY, USA, 2021; pp. 83–86. [Google Scholar]
- Lee, W.; Kim, S.; Lee, Y.T.; Lee, H.W.; Choi, M. Deep neural networks for wild fire detection with unmanned aerial vehicle. In Proceedings of the IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 5–10 January 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 252–253. [Google Scholar]
- Kaabi, R.; Sayadi, M.; Bouchouicha, M.; Fnaiech, F.; Moreau, E.; Ginoux, J.M. Early smoke detection of forest wildfire video using deep belief network. In Proceedings of the International Conference on Advanced Technologies for Signal and Image Processing (ATSIP), Sousse, Tunisia, 21–24 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6. [Google Scholar]
- Zhao, Y.; Ma, J.; Li, X.; Zhang, J. Saliency Detection and Deep Learning-Based Wildfire Identification in UAV Imagery. Sensors 2018, 18, 712. [Google Scholar] [CrossRef]
- Chen, Y.; Zhang, Y.; Xin, J.; Wang, G.; Mu, L.; Yi, Y.; Liu, H.; Liu, D. UAV Image-based Forest Fire Detection Approach Using Convolutional Neural Network. In Proceedings of the IEEE Conference on Industrial Electronics and Applications (ICIEA), Xi’an, China, 19–21 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 2118–2123. [Google Scholar]
- Cao, Y.; Yang, F.; Tang, Q.; Lu, X. An Attention Enhanced Bidirectional LSTM for Early Forest Fire Smoke Recognition. IEEE Access 2019, 7, 154732–154742. [Google Scholar] [CrossRef]
- Sousa, M.J.; Moutinho, A.; Almeida, M. Wildfire detection using transfer learning on augmented datasets. Expert Syst. Appl. 2019, 142, 112975. [Google Scholar] [CrossRef]
- Govil, K.; Welch, M.L.; Ball, J.T.; Pennypacker, C.R. Preliminary Results from a Wildfire Detection System Using Deep Learning on Remote Camera Images. Remote Sens. 2020, 12, 166. [Google Scholar] [CrossRef]
- Tang, Y.; Feng, H.; Chen, J.; Chen, Y. ForestResNet: A Deep Learning Algorithm for Forest Image Classification. J. Physics Conf. Ser. 2021, 2024, 012053. [Google Scholar] [CrossRef]
- Park, M.; Tran, D.Q.; Lee, S.; Park, S. Multilabel Image Classification with Deep Transfer Learning for Decision Support on Wildfire Response. Remote Sens. 2021, 13, 3985. [Google Scholar] [CrossRef]
- Sun, X.; Sun, L.; Huang, Y. Forest fire smoke recognition based on convolutional neural network. J. For. Res. 2020, 32, 1921–1927. [Google Scholar] [CrossRef]
- Khan, A.; Hassan, B.; Khan, S.; Ahmed, R.; Abuassba, A. DeepFire: A Novel Dataset and Deep Transfer Learning Benchmark for Forest Fire Detection. Mob. Inf. Syst. 2022, 2022, 1–14. [Google Scholar] [CrossRef]
- Mihalkova, L.; Mooney, R. Transfer learning from minimal target data by mapping across relational domains. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), Pasadena, CA, USA, 11–17 July 2009; AAAI: Palo Alto, CA, USA, 2009; pp. 1163–1168. [Google Scholar]
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L. MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; IEEE Computer Society, Conference Publishing Services: Los Alamitos, CA, USA, 2018; pp. 4510–4520. [Google Scholar]
- Deng, J.; Dong, W.; Socher, R.; Li, L.; Li, K.; Fei-Fei, L. ImageNet: A large-scale hierarchical image database. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 248–255. [Google Scholar]
- Forest Fire Dataset. Available online: https://www.kaggle.com/datasets/alik05/forest-fire-dataset (accessed on 15 April 2022).
- Zoph, B.; Vasudevan, V.; Shlens, J.; Le, Q.V. Learning transferable architectures for scalable image recognition. arXiv 2017, arXiv:1707.07012. [Google Scholar]
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; IEEE: Piscataway, NJ, USA; pp. 1800–1807. [Google Scholar]
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 2818–2826. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Identity Mappings in Deep Residual Networks. In Proceedings of the European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 11–14 October 2016; Springer International Publishing: New York, NY, USA; pp. 630–645. [Google Scholar]
| Research | Method | Dataset Type | Classes | Accuracy | Precision | Recall | AP | F1 |
|---|---|---|---|---|---|---|---|---|
| Alexandrov et al. [21] | HaaR | Image | smoke and forest background | 87.4% | 87.4% | 100% | - | - |
| | LBP | | | 81.3% | 81.3% | 100% | - | - |
| | YOLOv2 | | | 98.3% | 100% | 98.3% | - | - |
| | Faster R-CNN | | | 95.9% | 100% | 95.9% | - | - |
| | SSD | | | 81.1% | 88.4% | 90.7% | - | - |
| Zhang et al. [22] | SVM-Pool5 | Images through video | fire and non-fire | 95.6% | 76.2% | - | - | - |
| | CNN-Pool5 | | | 97.3% | 84.8% | - | - | - |
| Jiao et al. [23] | YOLOv3 | Image | fire | - | 84% | 78% | 78.9% | 81% |
| Li et al. [24] | Yolo-Edge | Image | forest fire and non-forest fire | - | 78.5% | 51.2% | 63.7% | 62% |
| Research | Method | Dataset Type | Classes | Accuracy | Precision | Recall | TNR | FPR | FNR | F1 |
|---|---|---|---|---|---|---|---|---|---|---|
| Lee et al. [25] | AlexNet | Image | fire and non-fire | 94.8% | - | - | - | - | - | - |
| | VGG13 | | | 86.2% | - | - | - | - | - | - |
| | Modified VGG13 | | | 96.2% | - | - | - | - | - | - |
| | GoogleNet | | | 99.0% | - | - | - | - | - | - |
| | Modified GoogleNet | | | 96.9% | - | - | - | - | - | - |
| Kaabi et al. [26] | DBN | Images through video | smoke and no smoke | 95% | - | - | - | - | - | - |
| Zhao et al. [27] | Fire_Net | Image | fire and no-fire | 98% | - | 98.8% | 97.2% | - | 0.12% | - |
| Chen et al. [28] | CNN-9 | Images through video | smoke and flame | 61% | - | 80% | - | 72% | - | - |
| | CNN-17 | | | 86% | - | 98% | - | 34% | - | - |
| Cao et al. [29] | Abi-LSTM | Images through video | smoke and non-smoke | 97.8% | - | 97.5% | 98% | - | - | - |
| Sousa et al. [30] | Inceptionv3 | Image | fire and not fire | 93.6% | 94.1% | 93.1% | 94.1% | - | - | - |
| Govil et al. [31] | Inceptionv3 | Image | smoke and non-smoke | 91% | - | - | - | - | - | 89% |
| Tang et al. [32] | ForestResNet | Image | normal, smoke and fire | 92% | - | - | - | - | - | - |
| Park et al. [33] | VGG16 | Image | wildfire, non-fire, flame, smoke, building and pedestrian | - | 96.4% | 95.6% | - | - | - | 96% |
| | ResNet50 | | | - | 96.6% | 94.9% | - | - | - | 95.6% |
| | Densenet121 | | | - | 99.1% | 97.8% | - | - | - | 98.5% |
| Sun et al. [34] | CNN_S | Image | fire and without fire | 94.1% | - | - | - | - | - | - |
| Khan et al. [35] | VGG19 | Image | fire and no-fire | 95.0% | 95.7% | 94.2% | 95.8% | 4.2% | 5.8% | 94.9% |
Dataset | Training | Validation | Testing | Total |
---|---|---|---|---|
Fire | 608 | 152 | 190 | 950 |
No-fire | 608 | 152 | 190 | 950 |
Total | 1216 | 304 | 380 | 1900 |
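The counts in the dataset-distribution table are consistent with a 64/16/20 train/validation/test split of each 950-image class (assuming the split is exactly these proportions, which the table's numbers support):

```python
# Per-class counts: 950 images per class, split 64% / 16% / 20%
# into training / validation / testing.
total_per_class = 950
train = round(total_per_class * 0.64)  # training images per class
val = round(total_per_class * 0.16)    # validation images per class
test = round(total_per_class * 0.20)   # testing images per class
```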
Augmentation | Value |
---|---|
Rotation | 1°~50° |
Scaling | 0.1~0.2 |
Shear Transformation | 0.1~0.2 |
Translation | 0.1~0.2 |
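One way to realize the augmentations in the table is Keras's `ImageDataGenerator`. The ranges below mirror the table's upper bounds (rotation up to 50°, scaling/shear/translation up to 0.2); whether the authors used this particular utility, and these exact settings, is an assumption for illustration.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augmentation ranges taken from the table's upper bounds (assumed pipeline).
datagen = ImageDataGenerator(
    rotation_range=50,       # rotation: 1°–50°
    zoom_range=0.2,          # scaling: 0.1–0.2
    shear_range=0.2,         # shear transformation: 0.1–0.2
    width_shift_range=0.2,   # horizontal translation: 0.1–0.2
    height_shift_range=0.2,  # vertical translation: 0.1–0.2
    rescale=1.0 / 255,       # normalize pixel values to [0, 1]
)
```

Each training batch drawn from such a generator is perturbed randomly within these ranges, which effectively enlarges the training set.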
Parameters | Value |
---|---|
Maximum Epochs | 50 |
Iterations/Epoch | 100 |
Validation Frequency | 100 |
Batch Size | 64 |
Learning Rate | 0.01 |
Loss Function | Binary cross entropy |
Optimizer | Stochastic Gradient Descent (SGD) |
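The loss and optimizer named in the table can be written out explicitly. A minimal NumPy sketch of binary cross-entropy and a plain SGD weight update (learning rate 0.01, as in the table):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy; predictions are clipped away from
    0 and 1 so the logarithms stay finite."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(-np.mean(
        y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)
    ))

def sgd_step(weights, grads, lr=0.01):
    """One plain stochastic gradient descent update: w <- w - lr * grad."""
    return weights - lr * grads

# A confident, correct prediction yields a small loss (-ln 0.9 ~= 0.105).
loss = binary_cross_entropy(np.array([1.0, 0.0]), np.array([0.9, 0.1]))
```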
Performance Metrics | Results
---|---|
True Positives (TP) | 189
True Negatives (TN) | 185
False Positives (FP) | 5
False Negatives (FN) | 1
Accuracy | 98.42%
Error Rate | 1.58%
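The accuracy and error rate follow directly from the confusion-matrix counts in the table (interpreted as TP = 189, TN = 185, FP = 5, FN = 1 on the 380-image test set):

```python
def accuracy_and_error(tp, tn, fp, fn):
    """Accuracy = (TP + TN) / total; the error rate is its complement."""
    total = tp + tn + fp + fn
    acc = (tp + tn) / total
    return acc, 1.0 - acc

# Counts from the test-set confusion matrix.
acc, err = accuracy_and_error(tp=189, tn=185, fp=5, fn=1)
```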
Performance Metrics | Results
---|---|
True Negative Rate (TNR) | 97.37%
True Positive Rate (TPR) | 99.47%
False Positive Rate (FPR) | 2.63%
False Negative Rate (FNR) | 0.53%
AUC | 0.9990
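The true and false rates above derive from the same confusion-matrix counts (TP = 189, TN = 185, FP = 5, FN = 1):

```python
def true_false_rates(tp, tn, fp, fn):
    """TPR (sensitivity), TNR (specificity), FPR, and FNR from counts."""
    tpr = tp / (tp + fn)  # fraction of Fire images correctly flagged
    tnr = tn / (tn + fp)  # fraction of No-Fire images correctly passed
    fpr = fp / (fp + tn)  # false alarms among No-Fire images
    fnr = fn / (fn + tp)  # missed fires among Fire images
    return tpr, tnr, fpr, fnr

tpr, tnr, fpr, fnr = true_false_rates(189, 185, 5, 1)
```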
Performance Metrics | Results
---|---|
Precision | 97.42%
Recall | 99.47%
False Discovery Rate (FDR) | 2.58%
F1 Score | 98.43%
Average Precision (AP) | 99.90%
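Precision, recall, and F1 likewise come straight from the confusion-matrix counts; note that computing F1 from the raw counts gives ≈98.44%, matching the reported value up to rounding:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and their harmonic mean (F1) from counts."""
    precision = tp / (tp + fp)  # correct fire alarms among all fire alarms
    recall = tp / (tp + fn)     # detected fires among all actual fires
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=189, fp=5, fn=1)
```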
| Methods | TP | TN | FP | FN | Accuracy | Error Rate | Precision | Recall | FDR | F1 Score |
|---|---|---|---|---|---|---|---|---|---|---|
| Proposed FFireNet | 189 | 185 | 5 | 1 | 98.42% | 1.58% | 97.42% | 99.47% | 2.58% | 98.43% |
| NASNetMobile [40] | 188 | 183 | 7 | 2 | 97.63% | 2.37% | 96.41% | 98.95% | 3.59% | 97.22% |
| InceptionV3 [42] | 187 | 179 | 11 | 3 | 96.32% | 3.68% | 94.44% | 98.42% | 5.56% | 96.39% |
| Xception [41] | 184 | 181 | 9 | 6 | 96.05% | 3.95% | 95.33% | 96.84% | 4.67% | 96.08% |
| ResNet152V2 [43] | 189 | 177 | 13 | 1 | 96.32% | 3.68% | 93.56% | 99.47% | 6.44% | 96.42% |
Methods | Accuracy | Precision | Recall | F1 Score |
---|---|---|---|---|
Proposed FFireNet | 98.42% | 97.42% | 99.47% | 98.43% |
Khan et al. [35] | 95.0% | 95.7% | 94.2% | 94.96% |
Sousa et al. [30] | 93.6% | 94.1% | 93.1% | - |
Govil et al. [31] | 91% | - | - | 89% |
Tang et al. [32] | 92% | - | - | - |
Park et al. [33] | - | 99.1% | 97.8% | 98.5% |
Sun et al. [34] | 94.1% | - | - | - |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Khan, S.; Khan, A. FFireNet: Deep Learning Based Forest Fire Classification and Detection in Smart Cities. Symmetry 2022, 14, 2155. https://doi.org/10.3390/sym14102155