Deep Learning Based Oil Palm Tree Detection and Counting for High-Resolution Remote Sensing Images
"> Figure 1
<p>The study area of this research in the south of Peninsular Malaysia. The <b>blue</b> rectangles show the two regions from which the manually interpreted samples are collected. The <b>red</b> squares show the three selected regions for evaluating the performance of our proposed method.</p> "> Figure 2
<p>The flowchart of our proposed method.</p> "> Figure 3
<p>The structure of the convolutional neural network (CNN).</p> "> Figure 4
<p>The sliding window technique.</p> "> Figure 5
<p>Sample merging.</p> "> Figure 6
<p>Detection image of each method for region 1 (extracted area). Each <b>red</b> circle denotes a detected palm tree. Each <b>green</b> square denotes a palm tree in ground truth that cannot be detected correctly. Each <b>blue</b> square denotes a background sample that is detected as a palm tree by mistake.</p> "> Figure 7
<p>Detection image of each method for region 2 (extracted area).</p> "> Figure 8
<p>Detection image of each method for region 3 (extracted area).</p> ">
Abstract
1. Introduction
2. Study Area and Datasets
3. Methods
3.1. Overview
3.2. CNN Training and Parameter Optimization
3.3. Label Prediction
3.4. Sample Merging
4. Results
4.1. Classification Accuracy and Parameter Optimization
4.2. Detection Results Evaluation
5. Discussion
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
Table 1. Detection results of the CNN-based method for the three regions.

Evaluation Index | Region 1 | Region 2 | Region 3 |
---|---|---|---|
Correctly detected palm trees | 1651 | 1607 | 1683 |
All detected objects | 1729 | 1695 | 1706 |
Palm trees in ground truth | 1709 | 1642 | 1702 |
Precision | 95.49% | 94.81% | 98.65% |
Recall | 96.61% | 97.87% | 98.88% |
Overall accuracy | 96.05% | 96.34% | 98.77% |
Table 2. Detection results of the ANN-based method for the three regions.

Evaluation Index | Region 1 | Region 2 | Region 3 |
---|---|---|---|
Correctly detected palm trees | 1648 | 1585 | 1679 |
All detected objects | 1800 | 1725 | 1718 |
Palm trees in ground truth | 1709 | 1642 | 1702 |
Precision | 91.56% | 91.88% | 97.73% |
Recall | 96.43% | 96.53% | 98.64% |
Overall accuracy | 94.00% | 94.21% | 98.19% |
Table 3. Detection results of the TMPL method for the three regions.

Evaluation Index | Region 1 | Region 2 | Region 3 |
---|---|---|---|
Correctly detected palm trees | 1429 | 1392 | 1608 |
All detected objects | 1532 | 1493 | 1684 |
Palm trees in ground truth | 1709 | 1642 | 1702 |
Precision | 93.28% | 93.24% | 95.49% |
Recall | 83.62% | 84.77% | 94.48% |
Overall accuracy | 88.45% | 89.01% | 94.99% |
Table 4. Detection results of the LMF method for the three regions.

Evaluation Index | Region 1 | Region 2 | Region 3 |
---|---|---|---|
Correctly detected palm trees | 1493 | 1397 | 1643 |
All detected objects | 1719 | 1675 | 1761 |
Palm trees in ground truth | 1709 | 1642 | 1689 |
Precision | 86.85% | 83.40% | 93.30% |
Recall | 87.36% | 85.08% | 97.28% |
Overall accuracy | 87.11% | 84.24% | 95.29% |
Table 5. The number of correctly detected palm trees for each method in the three regions.

Methods | Region 1 | Region 2 | Region 3 |
---|---|---|---|
CNN | 1651 | 1607 | 1683 |
ANN | 1648 | 1585 | 1679 |
TMPL | 1429 | 1392 | 1608 |
LMF | 1493 | 1397 | 1643 |
Table 6. Precision, recall, and overall accuracy (OA) of each method for the three regions.

Methods | Precision (R1) | Recall (R1) | OA (R1) | Precision (R2) | Recall (R2) | OA (R2) | Precision (R3) | Recall (R3) | OA (R3) |
---|---|---|---|---|---|---|---|---|---|
CNN | 95.49% | 96.61% | 96.05% | 94.81% | 97.87% | 96.34% | 98.65% | 98.88% | 98.77% |
ANN | 91.56% | 96.43% | 94.00% | 91.88% | 96.53% | 94.21% | 97.73% | 98.64% | 98.19% |
TMPL | 93.28% | 83.62% | 88.45% | 93.24% | 84.77% | 89.01% | 95.49% | 94.48% | 94.99% |
LMF | 86.85% | 87.36% | 87.11% | 83.40% | 85.08% | 84.24% | 93.30% | 97.28% | 95.29% |
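The percentages in the tables follow directly from the raw counts. One point worth making explicit: the tabulated "overall accuracy" matches the arithmetic mean of precision and recall in every row (an inference from the numbers; the tables themselves do not define it). A quick check against the CNN results for region 1:

```python
def detection_metrics(correct, detected, ground_truth):
    """Compute precision, recall, and overall accuracy from detection counts.
    Overall accuracy is taken as the mean of precision and recall,
    which reproduces every tabulated value."""
    precision = correct / detected
    recall = correct / ground_truth
    overall = (precision + recall) / 2
    return precision, recall, overall

# CNN, region 1 (Table 1): 1651 correct, 1729 detected, 1709 in ground truth
p, r, oa = detection_metrics(1651, 1729, 1709)
print(f"precision={p:.2%} recall={r:.2%} overall={oa:.2%}")
# precision=95.49% recall=96.61% overall=96.05%
```

Note that overall accuracy defined this way is close to, but not the same as, the F1 score (the harmonic mean of precision and recall).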
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).
Li, W.; Fu, H.; Yu, L.; Cracknell, A. Deep Learning Based Oil Palm Tree Detection and Counting for High-Resolution Remote Sensing Images. Remote Sens. 2017, 9, 22. https://doi.org/10.3390/rs9010022