Automated Identification of Crop Tree Crowns from UAV Multispectral Imagery by Means of Morphological Image Analysis
"> Figure 1
<p>Third-party aerial capture of the case study site shown to illustrate the study plot, highlighted in red.</p>
Figure 2
<p>Equipment used to capture the aerial imagery used in this paper.</p>
Figure 3
<p>Representative diagram of the methodology proposed for detecting and counting crop trees from multispectral aerial images.</p>
Figure 4
<p>Colour image generated from the information provided by the orthomosaics of the Blue, Red Edge and NIR spectral bands.</p>
Figure 5
<p>Representation of the computed DSM as the intensity image <math display="inline"><semantics> <mrow> <mi>G</mi> <msub> <mi>S</mi> <mrow> <mi>D</mi> <mi>S</mi> <mi>M</mi> </mrow> </msub> </mrow> </semantics></math>. Note, in the zoomed area highlighted by the red square, the grey-level differences between the pixel regions that apparently belong to olive trees and those of the surrounding ground. Since each pixel intensity value is assigned according to its elevation in the DSM, higher pixel values indicate higher altitudes above sea level. To facilitate visualisation, the image display range has been set between the minimum positive value of the DSM and its maximum.</p>
Figure 6
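The grey-level encoding described in this caption can be sketched as follows. This is a minimal illustration only: the function name `dsm_to_grayscale` and the 8-bit output range are our assumptions, not the paper's implementation.

```python
import numpy as np

def dsm_to_grayscale(dsm):
    """Map DSM elevations (metres) to an 8-bit intensity image.

    The display range runs from the minimum positive elevation to the
    maximum, so higher grey levels mean higher altitude; zero/no-data
    cells are clipped to black.
    """
    positive = dsm[dsm > 0]
    lo, hi = positive.min(), positive.max()
    scaled = (dsm - lo) / (hi - lo)                  # normalise to [0, 1]
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# Toy 3x3 DSM: ground near 120 m, a crown peak at 123 m, one no-data cell.
dsm = np.array([[120.0, 120.2, 0.0],
                [120.1, 123.0, 122.5],
                [120.0, 121.0, 120.3]])
gs = dsm_to_grayscale(dsm)
```

Scaling from the minimum positive value (rather than zero) keeps the few metres separating crowns from ground from being compressed into a narrow grey-level band.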
<p>Gap-filling illustration: (<b>a</b>) the same zoomed area of <math display="inline"><semantics> <mrow> <mi>G</mi> <msub> <mi>S</mi> <mrow> <mi>D</mi> <mi>S</mi> <mi>M</mi> </mrow> </msub> </mrow> </semantics></math> shown in <a href="#remotesensing-12-00748-f005" class="html-fig">Figure 5</a>; (<b>b</b>) result of applying the gap-filling operation to (<b>a</b>).</p>
Figure 7
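A gap filling of this kind is classically realised as grey-level hole filling via morphological reconstruction by erosion (cf. Soille's Morphological Image Analysis, cited in the references). The NumPy sketch below is illustrative; the function name and the 3×3 neighbourhood are our assumptions.

```python
import numpy as np

def fill_gaps(img):
    """Grey-level hole filling via morphological reconstruction by
    erosion: dark regions not connected to the image border are raised
    to the level of their surroundings."""
    seed = np.full_like(img, img.max())
    seed[0, :], seed[-1, :] = img[0, :], img[-1, :]
    seed[:, 0], seed[:, -1] = img[:, 0], img[:, -1]
    h, w = img.shape
    while True:
        # 3x3 grey-level erosion of the seed (edge-replicated padding)
        p = np.pad(seed, 1, mode='edge')
        eroded = np.min([p[i:i + h, j:j + w]
                         for i in range(3) for j in range(3)], axis=0)
        nxt = np.maximum(eroded, img)    # never drop below the original
        if np.array_equal(nxt, seed):
            return seed
        seed = nxt

# Bright plateau (50) with a darker pit (20): the pit is filled.
img = np.full((5, 5), 50, dtype=np.int32)
img[0, :] = img[-1, :] = img[:, 0] = img[:, -1] = 10  # darker border
img[2, 2] = 20                                        # the "gap"
filled = fill_gaps(img)
```

The iteration is monotone and bounded below by the input, so it always terminates.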
<p>(<b>a</b>) Greyscale image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>G</mi> <mi>S</mi> <mn>1</mn> </mrow> </msub> </mrow> </semantics></math> resulting from filling the gaps in the image shown in <a href="#remotesensing-12-00748-f005" class="html-fig">Figure 5</a>, <math display="inline"><semantics> <mrow> <mi>G</mi> <msub> <mi>S</mi> <mrow> <mi>D</mi> <mi>S</mi> <mi>M</mi> </mrow> </msub> </mrow> </semantics></math>; (<b>b</b>) background estimation of (<b>a</b>), <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <msub> <mi>E</mi> <mrow> <mi>D</mi> <mi>E</mi> <mi>F</mi> </mrow> </msub> </mrow> </msub> </mrow> </semantics></math>; (<b>c</b>) resulting image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>G</mi> <mi>S</mi> <mn>2</mn> </mrow> </msub> </mrow> </semantics></math> after subtracting (<b>b</b>) from (<b>a</b>).</p>
Figure 8
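One common way to realise background estimation followed by subtraction is a grey-level opening and a white top-hat. The sketch below is an assumption about the general technique, not the paper's exact procedure; the 3×3 element and the iteration count are illustrative, whereas a real run would size the structuring element to the expected crown diameter.

```python
import numpy as np

def _filter3(img, func):
    """Apply func over each pixel's 3x3 neighbourhood (edge padding)."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return func(np.array(stack), axis=0)

def subtract_background(img, iterations=2):
    """White top-hat: estimate the background by a grey-level opening
    (erosions then dilations) and subtract it, so only structures
    narrower than the opening survive."""
    bg = img.astype(np.int32)
    for _ in range(iterations):
        bg = _filter3(bg, np.min)       # erosion
    for _ in range(iterations):
        bg = _filter3(bg, np.max)       # dilation
    return np.clip(img.astype(np.int32) - bg, 0, None)

# Flat ground (10) with one bright pixel (200): only the peak remains.
scene = np.full((7, 7), 10, dtype=np.uint8)
scene[3, 3] = 200
residual = subtract_background(scene)
```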
<p>Image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <mi>I</mi> <mi>N</mi> <mn>1</mn> </mrow> </msub> </mrow> </semantics></math> resulting from the binarization of <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>G</mi> <mi>S</mi> <mn>3</mn> </mrow> </msub> </mrow> </semantics></math>, shown in <a href="#remotesensing-12-00748-f006" class="html-fig">Figure 6</a>c. Note, in the zoomed area within the red square, how potential plants have been accurately segmented from the background.</p>
Figure 9
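Binarization of a greyscale image like this is classically done with Otsu's threshold selection method (Otsu, 1979, which appears in the reference list). A self-contained sketch; whether the paper uses exactly this rule is an assumption on our part:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level maximising the between-class
    variance of the histogram (Otsu, 1979)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                      # class-0 probability
    mu = np.cumsum(prob * np.arange(256))        # cumulative mean
    mu_t = mu[-1]                                # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

# Bimodal toy image: dark half at 20, bright half at 200.
img = np.array([[20] * 4 + [200] * 4] * 8, dtype=np.uint8)
t = otsu_threshold(img)
binary = img > t        # foreground mask
```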
<p>(<b>a</b>) ROI mask image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>R</mi> <mi>O</mi> <mi>I</mi> </mrow> </msub> </mrow> </semantics></math>; (<b>b</b>) image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <mi>I</mi> <mi>N</mi> <mi>def</mi> </mrow> </msub> </mrow> </semantics></math> resulting from filtering the binary image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <mi>I</mi> <mi>N</mi> <mn>3</mn> </mrow> </msub> </mrow> </semantics></math> (visually very similar to <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <mi>I</mi> <mi>N</mi> <mn>3</mn> </mrow> </msub> </mrow> </semantics></math>, shown in <a href="#remotesensing-12-00748-f008" class="html-fig">Figure 8</a>), with image <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>R</mi> <mi>O</mi> <mi>I</mi> </mrow> </msub> </mrow> </semantics></math>.</p>
Figure 10
<p>(<b>a</b>) Sub-image of the study plot orthomosaic represented in <a href="#remotesensing-12-00748-f004" class="html-fig">Figure 4</a>, where a pair of trees with overlapping foliage can be observed; (<b>b</b>) sub-image of the binary image resulting from the performed segmentation, <math display="inline"><semantics> <mrow> <msub> <mi>I</mi> <mrow> <mi>B</mi> <mi>I</mi> <mi>N</mi> <mi>def</mi> </mrow> </msub> </mrow> </semantics></math>, corresponding to the area represented in (<b>a</b>). Note how the two olive trees share the same connected component.</p>
Figure 11
<p>(<b>a</b>) Sub-image of the study plot orthomosaic represented in <a href="#remotesensing-12-00748-f004" class="html-fig">Figure 4</a>; (<b>b</b>) sub-image of the binary image resulting from the performed segmentation, corresponding to the area represented in (<b>a</b>); (<b>c</b>) representation of the ellipses (in red) computed for each connected component in image (<b>b</b>), with their corresponding major (in blue) and minor (in green) axes.</p>
Figure 12
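The ellipse computed for each connected component can be characterised by the axes of the ellipse having the same normalised second-order central moments as the component (the convention used by `regionprops`-style implementations). A sketch under that assumption, omitting the per-pixel 1/12 correction some implementations add:

```python
import numpy as np

def ellipse_axes(component):
    """Axis lengths of the ellipse with the same normalised second-order
    central moments as the binary component."""
    ys, xs = np.nonzero(component)
    x = xs - xs.mean()
    y = ys - ys.mean()
    mxx, myy, mxy = (x * x).mean(), (y * y).mean(), (x * y).mean()
    common = np.sqrt((mxx - myy) ** 2 + 4.0 * mxy ** 2)
    major = 2.0 * np.sqrt(2.0 * (mxx + myy + common))
    minor = 2.0 * np.sqrt(2.0 * max(mxx + myy - common, 0.0))
    return major, minor

# A 9-pixel horizontal bar: strongly elongated, so major >> minor.
comp = np.zeros((5, 11), dtype=bool)
comp[2, 1:10] = True
major, minor = ellipse_axes(comp)
```

The major-to-minor ratio is what lets an aggregated component (two crowns merged) be told apart from a single, roughly circular crown.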
<p>Illustration of the process to estimate a representative location for aggregated trees, formulated in Equations (17)–(19). Examples for odd (<b>a</b>) and even (<b>b</b>) numbers of aggregated trees are given.</p>
Figure 13
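An illustrative reading of the spacing scheme this caption describes: representative points spread evenly along the fitted ellipse's major axis, with odd counts including the centroid and even counts straddling it. Equations (17)–(19) are not reproduced in this extraction, so the function and parameter names below are ours, not the paper's.

```python
import numpy as np

def aggregate_tree_locations(centroid, major_axis_len, orientation, n_trees):
    """Spread n_trees representative points evenly along the fitted
    ellipse's major axis, symmetric about the component centroid: odd
    counts include the centroid itself, even counts straddle it."""
    cx, cy = centroid
    step = major_axis_len / n_trees
    offsets = (np.arange(n_trees) - (n_trees - 1) / 2.0) * step
    dx, dy = np.cos(orientation), np.sin(orientation)
    return [(cx + o * dx, cy + o * dy) for o in offsets]

# Axis-aligned examples: three trees (centroid included) vs two trees.
pts3 = aggregate_tree_locations((0.0, 0.0), 6.0, 0.0, 3)
pts2 = aggregate_tree_locations((0.0, 0.0), 6.0, 0.0, 2)
```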
<p>Result of the estimation of the individual tree location points.</p>
Figure 14
<p>False positives detected during performance assessment: (<b>a</b>) case related to a car parked next to the study site; (<b>b</b>) case resulting from one tree being wrongly split into two different connected components because of its damaged condition; (<b>c</b>) case obtained after overestimating the number of trees contained in an aggregated connected component.</p>
Figure 15
<p>False negative resulting from a lack of information in the point cloud: (<b>a</b>) aerial sub-image showing the tree wrongly discarded by the algorithm; (<b>b</b>) 3D point cloud-based representation of the area shown in (<b>a</b>); (<b>c</b>) elevation information provided by the DSM, represented as a greyscale image, corresponding to the area shown in (<b>a</b>).</p>
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Case Site
2.2. Image Acquisition
2.2.1. Aerial Imaging Equipment
2.2.2. Flight Planning and Development
2.3. Image Analysis Methodology for Olive Tree Detection, Geolocation and Counting
2.3.1. Image Pre-Processing
2.3.2. Digital Surface Model (DSM)
2.3.3. Image Segmentation Algorithm
2.3.4. Image Analysis Algorithm for Tree Counting and for the Estimation of Tree Locations
2.4. Performance Evaluation of the Image Analysis Methodology
- Precision: it gives the hit ratio for the trees found by the algorithm, i.e., the fraction of detections corresponding to actual trees.
- Recall: it provides the ratio of actual trees found by the algorithm.
- F1-Score: it is the harmonic mean of the two metrics described above.
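With TP, FP and FN denoting true positives, false positives and false negatives, the three metrics take their standard forms. A sketch evaluated with the counts from the results table (3906 TP, 3 FP, 13 FN); the equations themselves are not reproduced in this extraction, so these are the textbook definitions matching the descriptions above:

```python
def precision(tp, fp):
    """Hit ratio among the algorithm's detections."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of actual trees found by the algorithm."""
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    p, r = precision(tp, fp), recall(tp, fn)
    return 2.0 * p * r / (p + r)

p = precision(3906, 3)     # ≈ 0.9992
r = recall(3906, 13)       # ≈ 0.9967
f = f1_score(3906, 3, 13)  # lies between recall and precision
```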
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
References
- Tilman, D.; Balzer, C.; Hill, J.; Befort, B.L. Global food demand and the sustainable intensification of agriculture. Proc. Natl. Acad. Sci. USA 2011, 108, 20260–20264. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef] [PubMed]
- Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114. [Google Scholar] [CrossRef] [Green Version]
- Aparna, P.; Ramachandra, H.; Harshita, M.P.; Harshitha, S.; Nandkishore, K.; Vinod, P.V. CNN Based Technique for Automatic Tree Counting Using Very High Resolution Data. In Proceedings of the 2018 International Conference on Design Innovations for 3Cs Compute Communicate Control (ICDI3C), Bangalore, India, 25–28 April 2018; IEEE: Bangalore, India, 2018; pp. 127–129. [Google Scholar]
- Malek, S.; Bazi, Y.; Alajlan, N.; AlHichri, H.; Melgani, F. Efficient Framework for Palm Tree Detection in UAV Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4692–4703. [Google Scholar] [CrossRef]
- Salamí, E.; Gallardo, A.; Skorobogatov, G.; Barrado, C. On-the-Fly Olive Trees Counting Using a UAS and Cloud Services. Remote Sens. 2019, 11, 316. [Google Scholar] [CrossRef] [Green Version]
- Shakoor, N.; Lee, S.; Mockler, T.C. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field. Curr. Opin. Plant Biol. 2017, 38, 184–192. [Google Scholar] [CrossRef]
- Furbank, R.T.; Tester, M. Phenomics – Technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644. [Google Scholar] [CrossRef]
- Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [Google Scholar] [CrossRef]
- Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
- Tripicchio, P.; Satler, M.; Dabisias, G.; Ruffaldi, E.; Avizzano, C.A. Towards Smart Farming and Sustainable Agriculture with Drones. In Proceedings of the 2015 International Conference on Intelligent Environments, Prague, Czech Republic, 15–17 July 2015; IEEE: Prague, Czech Republic, 2015; pp. 140–143. [Google Scholar]
- Hunt, E.R.; Daughtry, C.S.T. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376. [Google Scholar] [CrossRef] [Green Version]
- Peña, J.M.; Torres-Sánchez, J.; de Castro, A.I.; Kelly, M.; López-Granados, F. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PLoS ONE 2013, 8, e77151. [Google Scholar] [CrossRef] [Green Version]
- Floreano, D.; Wood, R.J. Science, technology and the future of small autonomous drones. Nature 2015, 521, 460–466. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Miserque Castillo, J.Z.; Laverde Diaz, R.; Rueda Guzmán, C.L. Development of an aerial counting system in oil palm plantations. IOP Conf. Ser. Mater. Sci. Eng. 2016, 138, 012007. [Google Scholar] [CrossRef] [Green Version]
- Primicerio, J.; Caruso, G.; Comba, L.; Crisci, A.; Gay, P.; Guidoni, S.; Genesio, L.; Ricauda Aimonino, D.; Vaccari, F.P. Individual plant definition and missing plant characterization in vineyards from high-resolution UAV imagery. Eur. J. Remote Sens. 2017, 50, 179–186. [Google Scholar] [CrossRef]
- Jiang, H.; Chen, S.; Li, D.; Wang, C.; Yang, J. Papaya Tree Detection with UAV Images Using a GPU-Accelerated Scale-Space Filtering Method. Remote Sens. 2017, 9, 721. [Google Scholar] [CrossRef] [Green Version]
- Koc-San, D.; Selim, S.; Aslan, N.; San, B.T. Automatic citrus tree extraction from UAV images and digital surface models using circular Hough transform. Comput. Electron. Agric. 2018, 150, 289–301. [Google Scholar] [CrossRef]
- Csillik, O.; Cherbini, J.; Johnson, R.; Lyons, A.; Kelly, M. Identification of Citrus Trees from Unmanned Aerial Vehicle Imagery Using Convolutional Neural Networks. Drones 2018, 2, 39. [Google Scholar] [CrossRef] [Green Version]
- Ampatzidis, Y.; Partel, V. UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410. [Google Scholar] [CrossRef] [Green Version]
- Selim, S.; Sonmez, N.K.; Coslu, M.; Onur, I. Semi-automatic Tree Detection from Images of Unmanned Aerial Vehicle Using Object-Based Image Analysis Method. J. Indian Soc. Remote Sens. 2019, 47, 193–200. [Google Scholar] [CrossRef]
- Kestur, R.; Angural, A.; Bashir, B.; Omkar, S.N.; Anand, G.; Meenavathi, M.B. Tree Crown Detection, Delineation and Counting in UAV Remote Sensed Images: A Neural Network Based Spectral–Spatial Method. J. Indian Soc. Remote Sens. 2018, 46, 991–1004. [Google Scholar] [CrossRef]
- Marques, P.; Pádua, L.; Adão, T.; Hruška, J.; Peres, E.; Sousa, A.; Sousa, J.J. UAV-Based Automatic Detection and Monitoring of Chestnut Trees. Remote Sens. 2019, 11, 855. [Google Scholar] [CrossRef] [Green Version]
- Díaz-Varela, R.; de la Rosa, R.; León, L.; Zarco-Tejada, P. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232. [Google Scholar] [CrossRef] [Green Version]
- Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10, e0130479. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2018, 19, 115–133. [Google Scholar] [CrossRef]
- Shepard, D. A two-dimensional interpolation function for irregularly-spaced data. In Proceedings of the 1968 23rd ACM National Conference, New York, NY, USA, 27–29 August 1968; ACM Press: New York, NY, USA, 1968; pp. 517–524. [Google Scholar]
- Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
- Jain, A.K. Fundamentals of Digital Image Processing; Prentice Hall: Englewood Cliffs, NJ, USA, 1989; ISBN 0133361659. [Google Scholar]
- Soille, P. Morphological Image Analysis: Principles and Applications; Springer-Verlag GmbH: Heidelberg, Germany, 2004; ISBN 9783662050880. [Google Scholar]
- Serra, J. Image Analysis and Mathematical Morphology, vol. I; Academic Press Inc.: Cambridge, MA, USA, 1982; ISBN 9780126372427. [Google Scholar]
Band Number | Band Name | Centre Wavelength (nm) | Bandwidth (nm) |
---|---|---|---|
1 | Blue | 475 | 20 |
2 | Green | 560 | 20 |
3 | Red | 668 | 10 |
4 | Near Infrared | 840 | 40 |
5 | Red Edge | 717 | 10 |
Actual Tree Population | Estimated Tree Population | True Positives (TP) | False Positives (FP) | False Negatives (FN) | Precision | Recall | F1-Score |
---|---|---|---|---|---|---|---|
3919 | 3909 | 3906 | 3 | 13 | 0.9992 | 0.9967 | 0.9975 |
Method | Actual Tree Population | Precision | Recall | F1-Score |
---|---|---|---|---|
Torres-Sánchez et al., 2015 [25] | 135 | – | 0.945–0.969 | – |
Torres-Sánchez et al., 2018 [26] | – | – | 0.970 | – |
Salamí et al., 2019 [6] | 332 | 0.9939 | 0.9909 | 0.9924 |
Malek et al. [5] | 617 | 0.9009 | 0.9440 | 0.9219 |
Csillik et al. [19] | 2912 | 0.9459 | 0.9794 | 0.9624 |
Ampatzidis and Partel [20] | 4931 | 0.9990 | 0.9970 | 0.9980 |
Selim et al. [21] | 105 | – | 0.8286 | – |
Marques et al. [23] | 1092 | 0.9944 | 0.9780 | 0.9861 |
This work | 3918 | 0.9992 | 0.9967 | 0.9975 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Sarabia, R.; Aquino, A.; Ponce, J.M.; López, G.; Andújar, J.M. Automated Identification of Crop Tree Crowns from UAV Multispectral Imagery by Means of Morphological Image Analysis. Remote Sens. 2020, 12, 748. https://doi.org/10.3390/rs12050748