Tree Crown Delineation Algorithm Based on a Convolutional Neural Network
"> Figure 1
<p>(<b>A</b>) The Brazilian territory. In green, the Atlantic Rainforest biome extension, and the red square marks the location of Santa Genebra Reserve; (<b>B</b>) True color composition of the WorldView-2 image acquired over the Santa Genebra Forest Reserve.</p> "> Figure 2
<p>(<b>A</b>) The multi-spectral image obtained from the composition of R, G, B spectral bands with spatial resolution of <math display="inline"><semantics> <mrow> <mn>2.0</mn> </mrow> </semantics></math> m; (<b>B</b>) The panchromatic image with spatial resolution of <math display="inline"><semantics> <mrow> <mn>0.5</mn> </mrow> </semantics></math> m; (<b>C</b>) The image obtained from pan-sharpening process by the LMVM algorithm.</p> "> Figure 3
<p>(<b>A</b>) The Santa Genebra Reserve. Both (<b>B</b>,<b>C</b>) Exhibit the same small region from the Santa Genebra Reserve; in (<b>C</b>), the red polygons belong to the training set and yellow polygons belong to the validation set.</p> "> Figure 4
<p>The Mask R-CNN architecture, which is composed of convolution layers, region proposal networks (RPNs), and fully connected networks (FCNs). The Faster R-CNN performs the region proposal selection; the region of interest (RoI) ALIGN sets up all RoIs to the same shape; the FCNs make the object labeling and the bound box; and the convolution layers perform the pixel determination of each object (object mask).</p> "> Figure 5
<p>Each step in the algorithm developed for creating synthetic images.</p> "> Figure 6
<p>(<b>A</b>) A synthetic forest image; (<b>B</b>) The same image with the delineation of each example tree crown.</p> "> Figure 7
<p>Evolution of training total loss values (blue line) and validation total loss values (red line) per epoch.</p> "> Figure 8
<p>The relation between the bounding box from the algorithm-delineated object and the manually delineated object (evaluation object) to obtain the <math display="inline"><semantics> <mrow> <mi>R</mi> <mi>e</mi> <mi>c</mi> <mi>a</mi> <mi>l</mi> <mi>l</mi> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mi>P</mi> <mi>r</mi> <mi>e</mi> <mi>c</mi> <mi>i</mi> <mi>s</mi> <mi>i</mi> <mi>o</mi> <mi>n</mi> </mrow> </semantics></math>, and <math display="inline"><semantics> <mrow> <mi>I</mi> <mi>o</mi> <mi>U</mi> </mrow> </semantics></math> values</p> "> Figure 9
<p>Number of tree canopies with segments intersecting them. The legend brings the mean crown area, in square meters, for each group. For example: the mean crown area for tree crowns intersected by 1 segment is 28 m<math display="inline"><semantics> <msup> <mrow/> <mn>2</mn> </msup> </semantics></math>.</p> "> Figure 10
<p>Relations between the evaluation crown area and the segmented crown area, both in pixels. The red dashed line represents the linear model.</p> "> Figure 11
<p>(<b>A</b>) The association between the validation crown area (in pixels) and the pixel deficit. The red dashed line represents a local polynomial to fit the graphic variables. The black dashed line is the approximation between pixel deficit and the crown area, and the coefficient of determination (<math display="inline"><semantics> <msup> <mi>R</mi> <mn>2</mn> </msup> </semantics></math>) from this approximation is equal to <math display="inline"><semantics> <mrow> <mo>−</mo> <mn>0.005</mn> </mrow> </semantics></math>; (<b>B</b>) The frequency of the right percentage of the tree crown area, or, in other words, the area accuracy, was estimated by the Mask R-CNN.</p> "> Figure 12
<p>(<b>A</b>) Evaluation of the association between crown area (in pixel) and the pixel excess. The black dashed line corresponds to the linear model fit and the red dashed line is the local polynomial fit; (<b>B</b>) The frequency of the right percentage of the tree crown area estimated by Mask R-CNN.</p> "> Figure 13
<p>Distribution of pixel deficit and excess.</p> "> Figure 14
<p>Result of the application of Mask R-CNN to perform the TCDD. (<b>A</b>) The test region from the Santa Genebra Reserve image; (<b>B</b>) The result of tree crown delineation in red; (<b>C</b>) The identification of each tree crown, each fill color represents an ID provided by Mask R-CNN.</p> "> Figure 15
<p>(<b>A</b>) Relation between the <math display="inline"><semantics> <mrow> <mi>I</mi> <mi>o</mi> <mi>U</mi> </mrow> </semantics></math> value and the area of each crown in the evaluation set; (<b>B</b>) Distribution of <math display="inline"><semantics> <mrow> <mi>I</mi> <mi>o</mi> <mi>U</mi> </mrow> </semantics></math> values.</p> "> Figure 16
<p>The <math display="inline"><semantics> <mrow> <mi>I</mi> <mi>o</mi> <mi>U</mi> </mrow> </semantics></math> result of some examples. (<b>A</b>) A region of Santa Genebra Reserve; (<b>B</b>) Evaluation set examples; (<b>C</b>) The Mask R-CNN result of tree crown delineation; (<b>D</b>) The IoU result; in other words, the overlap between the response bounding box and the evaluation bounding box.</p> ">
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Site
2.2. WorldView-2 Satellite Image
2.3. Individual Tree Crown Dataset
2.4. Instance Segmentation with Mask R-CNN
2.5. Synthetic Forest Images for Training
Algorithm 1. Algorithm for building a synthetic forest image to compose the training dataset.
- the background image on which the synthetic forest is created; it can be an image patch from a region of a WV-2 image or a black image. Its dimensions are determined by the user, but its channels must be R, G, and B;
- the set of manually delineated crowns; in the specific case of this research, a shapefile with the geometry of each manually delineated crown;
- a variable that controls the number of crowns in the synthetic image; the algorithm copies that number of crowns into it;
- a two-dimensional mask array that is polygonized to create the corresponding forest shapefile, with one geometry per crown;
- the algorithm checks whether the position selected for copying the crown into the synthetic image is free; in other words, it assesses whether placing the new tree crown there would cover most of an existing tree crown;
- the mask array is filled at the same selected position with the crown's values;
- the mask array is converted into a shapefile with the geometry of each tree within; and
- the algorithm returns the synthetic forest image and its shapefile.
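The steps above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: crowns are assumed to be given as (RGB patch, boolean footprint) pairs already rasterized from the shapefile, the "free position" test uses an assumed 50% coverage threshold, and the instance mask stands in for the array that the paper polygonizes into a shapefile:

```python
import numpy as np

def build_synthetic_forest(background, crowns, n_crowns,
                           max_cover=0.5, max_tries=100, seed=0):
    """Paste randomly chosen crown patches onto a background image,
    rejecting positions that would hide most of an already placed crown."""
    rng = np.random.default_rng(seed)
    h, w, _ = background.shape
    image = background.copy()
    mask = np.zeros((h, w), dtype=np.int32)  # 0 = background, k = crown id
    placed = 0
    for k in range(1, n_crowns + 1):
        patch, alpha = crowns[int(rng.integers(len(crowns)))]
        ph, pw = alpha.shape
        for _ in range(max_tries):
            y = int(rng.integers(0, h - ph + 1))
            x = int(rng.integers(0, w - pw + 1))
            window = mask[y:y + ph, x:x + pw]
            covered = window[alpha]  # crown ids under the new crown's footprint
            # position is "free" if no existing crown loses more than max_cover of its area
            free = all(
                cid == 0
                or (covered == cid).sum() <= max_cover * (mask == cid).sum()
                for cid in np.unique(covered)
            )
            if free:
                image[y:y + ph, x:x + pw][alpha] = patch[alpha]
                window[alpha] = k  # fill the mask array at the same position
                placed += 1
                break
    return image, mask, placed
```

In the paper's pipeline the resulting mask would then be polygonized (e.g., with GDAL) to produce the per-tree shapefile; that step is omitted here.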
2.6. Training the Mask R-CNN for TCDD
- class loss: how close the model is to predicting the correct class;
- bounding box loss: the distance between the ground-truth (validation) bounding box parameters (height and width) and the predicted bounding box parameters; in other words, how well the model locates objects within the image;
- mask loss: per-pixel misclassification, measured by comparing the ground-truth pixels with the predicted pixels; and
- total loss: the sum of the other losses.
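The total-loss curves plotted per epoch (Figure 7) are just the sums of these three components. A sketch with made-up illustrative loss values (not the paper's numbers), selecting the epoch with the lowest total:

```python
# Illustrative per-epoch component losses (fabricated for the example);
# the paper plots the training and validation totals of these three terms.
class_loss = [0.9, 0.5, 0.3]
bbox_loss = [1.2, 0.7, 0.4]
mask_loss = [0.8, 0.6, 0.5]

# total loss is the sum of the other losses, epoch by epoch
total_loss = [c + b + m for c, b, m in zip(class_loss, bbox_loss, mask_loss)]
best_epoch = min(range(len(total_loss)), key=total_loss.__getitem__)
```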
2.7. Independent Algorithm Assessment
3. Results
3.1. Detection Accuracy
3.2. Delineation Accuracy
4. Discussion
4.1. TCDD Detection Performance
4.2. TCDD Delineation Performance
4.3. Algorithm Requirements
4.3.1. Shade Effect—Limitations and How to Resolve Them
4.3.2. How to Deal with the Leaf Fall Effect
4.3.3. Algorithm Limitations
4.4. Algorithm’s Advance
4.5. Application Perspectives
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- FAO. Global Forest Resources Assessment 2010—Brazil Country Report; Technical Report; Food and Agriculture Organization of the United Nations: Rome, Italy, 2010.
- Malhi, Y.; Roberts, J.T.; Betts, R.A.; Killeen, T.J.; Li, W.; Nobre, C.A. Climate change, deforestation, and the fate of the Amazon. Science 2008, 319, 169–172.
- Phillips, O.L.; Brienen, R.J.W. Carbon uptake by mature Amazon forests has mitigated Amazon nations carbon emissions. Carbon Balance Manag. 2017, 12, 1–9.
- Achard, F.; Eva, H.D.; Stibig, H.J.; Mayaux, P.; Gallego, J.; Richards, T.; Malingreau, J.P. Determination of deforestation rates of the world’s humid tropical forests. Science 2002, 297, 999–1002.
- Mitchard, E.T.A.; Feldpausch, T.R.; Brienen, R.J.W.; Lopez-Gonzalez, G.; Monteagudo, A.; Baker, T.R.; Lewis, S.L.; Lloyd, J.; Quesada, C.A.; Gloor, M.; et al. Markedly divergent estimates of Amazon forest carbon density from ground plots and satellites. Glob. Ecol. Biogeogr. 2014, 23, 935–946.
- White, J.C.; Coops, N.C.; Wulder, M.A.; Vastaranta, M.; Hilker, T.; Tompalski, P. Remote Sensing Technologies for Enhancing Forest Inventories: A Review. Can. J. Remote Sens. 2016, 42, 619–641.
- De Lima, R.A.F.; Mori, D.P.; Pitta, G.; Melito, M.O.; Bello, C.; Magnago, L.F.; Zwiener, V.P.; Saraiva, D.D.; Marques, M.C.M.; de Oliveira, A.A.; et al. How much do we know about the endangered Atlantic Forest? Reviewing nearly 70 years of information on tree community surveys. Biodivers. Conserv. 2015, 24, 2135–2148.
- Jensen, J.R. Remote Sensing of the Environment: An Earth Resource Perspective, 2nd ed.; Prentice Hall: London, UK, 2007; p. 592.
- Franklin, S.E. Remote Sensing for Sustainable Forest Management, 1st ed.; Lewis Publishers: London, UK, 2001; p. 116.
- Instituto Nacional de Pesquisas Espaciais (INPE). Deforestation Estimates in the Brazilian Amazon; Technical Report; Instituto Nacional de Pesquisas Espaciais (INPE): São José dos Campos, Brazil, 2002.
- Hansen, M.C.; Potapov, P.V.; Moore, R.; Hancher, M.; Turubanova, S.A.; Tyukavina, A.; Thau, D.; Stehman, S.V.; Goetz, S.J.; Loveland, T.R.; et al. High-Resolution Global Maps of 21st-Century Forest Cover Change. Science 2013, 342, 850–853.
- Ke, Y.; Quackenbush, L.J. A comparison of three methods for automatic tree crown detection and delineation from high spatial resolution imagery. Int. J. Remote Sens. 2011, 32, 3625–3647.
- Wagner, F.H.; Ferreira, M.P.; Sanchez, A.; Hirye, M.C.; Zortea, M.; Gloor, E.; Phillips, O.L.; de S. Filho, C.R.; Shimabukuro, Y.E.; Aragão, L.E. Individual tree crown delineation in a highly diverse tropical forest using very high resolution satellite images. ISPRS J. Photogramm. Remote Sens. 2018, 145, 362–377.
- Palace, M.; Keller, M.; Asner, G.P.; Hagen, S.; Braswell, B. Amazon Forest Structure from IKONOS Satellite Data and the Automated Characterization of Forest Canopy Properties. Biotropica 2008, 40, 141–150.
- Singh, M.; Evans, D.; Tan, B.S.; Nin, C.S. Mapping and Characterizing Selected Canopy Tree Species at the Angkor World Heritage Site in Cambodia Using Aerial Data. PLoS ONE 2015, 10, 1–26.
- Slik, J.W.F.; Arroyo-Rodríguez, V.; Aiba, S.I.; Alvarez-Loayza, P.; Alves, L.F.; Ashton, P.; Balvanera, P.; Bastian, M.L.; Bellingham, P.J.; Van den Berg, E.; et al. An estimate of the number of tropical tree species. Proc. Natl. Acad. Sci. USA 2015, 112, 7472–7477.
- Clark, M.L.; Roberts, D.A.; Clark, D.B. Hyperspectral discrimination of tropical rain forest tree species at leaf to crown scales. Remote Sens. Environ. 2005, 96, 375–398.
- Cabello-Lebic, A. Tree Crown Delineation. In AusCover Good Practice Guidelines: A Technical Handbook Supporting Calibration and Validation Activities of Remotely Sensed Data Products, 1st ed.; TERN AusCover: Canberra, Australia, 2015; pp. 197–207.
- Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87.
- Ferreira, M.P.; Zortea, M.; Zanotta, D.C.; Shimabukuro, Y.E.; de Souza Filho, C.R. Mapping tree species in tropical seasonal semi-deciduous forests with hyperspectral and multispectral data. Remote Sens. Environ. 2016, 179, 66–78.
- Gougeon, F.A.; Leckie, D.G. Individual tree crown image analysis—A step towards precision forestry. In Proceedings of the First International Precision Forestry Symposium, Seattle, WA, USA, 17 June 2001; pp. 43–49.
- Verlic, A.; Duric, N.; Kokalj, Z.; Marsetic, A.; Simoncic, P.; Ostir, K. Tree Species Classification Using WorldView-2 Satellite Images and Laser Scanning Data in a Natural Urban Forest. Sumarski List 2014, 138, 477–488.
- Koch, B.; Heyder, U.; Weinacker, H. Detection of Individual Tree Crowns in Airborne Lidar Data. Photogramm. Eng. Remote Sens. 2006, 72, 357–363.
- Lee, J.; Cai, X.; Lellmann, J.; Dalponte, M.; Malhi, Y.; Butt, N.; Morecroft, M.; Schönlieb, C.; Coomes, D.A. Individual Tree Species Classification From Airborne Multisensor Imagery Using Robust PCA. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 2554–2567.
- Tochon, G.; Féret, J.; Valero, S.; Martin, R.; Knapp, D.; Salembier, P.; Chanussot, J.; Asner, G. On the use of binary partition trees for the tree crown segmentation of tropical rainforest hyperspectral images. Remote Sens. Environ. 2015, 159, 318–331.
- Ke, Y.; Quackenbush, L.J. A review of methods for automatic individual tree-crown detection and delineation from passive remote sensing. Int. J. Remote Sens. 2011, 32, 4725–4747.
- Walsworth, N.A.; King, D.J. Image modelling of forest changes associated with acid mine drainage. Aspen Bibliogr. 1999, 25, 567–580.
- Ozcan, A.H.; Hisar, D.; Sayar, Y.; Unsalan, C. Tree crown detection and delineation in satellite images using probabilistic voting. Remote Sens. Lett. 2017, 8, 761–770.
- Pouliot, D.A.; King, D.J.; Bell, F.W.; Pitt, D.G. Automated tree crown detection and delineation in high-resolution digital camera imagery of coniferous forest regeneration. Remote Sens. Environ. 2002, 82, 322–334.
- Pollock, R.J. The Automatic Recognition of Individual Trees in Aerial Images of Forests Based on a Synthetic Tree Crown Image Model. Ph.D. Thesis, University of British Columbia, Vancouver, BC, Canada, 1996.
- Erikson, M. Segmentation and Classification of Individual Tree Crowns in High Spatial Resolution Aerial Images. Ph.D. Thesis, Swedish University of Agricultural Sciences, Uppsala, Sweden, 2004.
- Culvenor, D.S. TIDA: An algorithm for the delineation of tree crowns in high spatial resolution remotely sensed imagery. Comput. Geosci. 2002, 28, 33–44.
- Li, Z.; Hayward, R.; Zhang, J.; Liu, Y.; Walker, R. Towards automatic tree crown detection and delineation in spectral feature space using PCNN and morphological reconstruction. In Proceedings of the 2009 16th IEEE International Conference on Image Processing (ICIP), Cairo, Egypt, 7–10 November 2009; pp. 1705–1708.
- Kestur, R.; Angural, A.; Bashir, B.; Omkar, S.N.; Anand, G.; Meenavathi, M.B. Tree Crown Detection, Delineation and Counting in UAV Remote Sensed Images: A Neural Network Based Spectral Spatial Method. J. Indian Soc. Remote Sens. 2018, 46, 991–1004.
- Lawrence, S.; Giles, C.L.; Tsoi, A.C.; Back, A.D. Face recognition: A convolutional neural-network approach. IEEE Trans. Neural Netw. 1997, 8, 98–113.
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 24–27 June 2014; pp. 580–587.
- Toshev, A.; Szegedy, C. DeepPose: Human Pose Estimation via Deep Neural Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Columbus, OH, USA, 24–27 June 2014.
- Wagner, F.H.; Sanchez, A.; Tarabalka, Y.; Lotte, R.G.; Ferreira, M.P.; Aidar, M.P.M.; Gloor, E.; Phillips, O.L.; Aragão, L.E.O.C. Using the U-net convolutional network to map forest types and disturbance in the Atlantic rainforest with very high resolution images. Remote Sens. Ecol. Conserv. 2019, 5, 360–375.
- Zhang, L.; Zhang, L.; Du, B. Deep Learning for Remote Sensing Data: A Technical Tutorial on the State of the Art. IEEE Geosci. Remote Sens. Mag. 2016, 4, 22–40.
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Region-Based Convolutional Networks for Accurate Object Detection and Segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 38, 142–158.
- Yanfei, L.; Yanfei, Z.; Feng, F.; Qiqi, Z.; Qianqing, Q. Scene Classification Based on a Deep Random-Scale Stretched Convolutional Neural Network. Remote Sens. 2018, 10, 444.
- Li, W.; Fu, H.; Yu, L.; Cracknell, A. Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sens. 2016, 9, 22.
- Kattenborn, T.; Eichel, J.; Fassnacht, F.E. Convolutional Neural Networks enable efficient, accurate and fine-grained segmentation of plant species and communities from high-resolution UAV imagery. Sci. Rep. 2019, 9, 1–9.
- Weinstein, B.G.; Marconi, S.; Bohlman, S.; Zare, A.; White, E. Individual tree-crown detection in RGB imagery using semi-supervised deep learning neural networks. Remote Sens. 2019, 11, 1309.
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R.B. Mask R-CNN. Cornell Univ. Comput. Res. Rep. 2017, 1, 1–12.
- Nur Omeroglu, A.; Kumbasar, N.; Argun Oral, E.; Ozbek, I.Y. Hangar Detection with Mask R-CNN Algorithm. In Proceedings of the 2019 27th Signal Processing and Communications Applications Conference (SIU), Sivas, Turkey, 24–26 April 2019; pp. 1–4.
- Qiao, Y.; Truman, M.; Sukkarieh, S. Cattle segmentation and contour extraction based on Mask R-CNN for precision livestock farming. Comput. Electron. Agric. 2019, 165, 1–9.
- Nie, S.; Jiang, Z.; Zhang, H.; Cai, B.; Yao, Y. Inshore Ship Detection Based on Mask R-CNN. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 693–696.
- Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444.
- Wu, H.; Prasad, S. Semi-Supervised Deep Learning Using Pseudo Labels for Hyperspectral Image Classification. IEEE Trans. Image Process. 2018, 27, 1259–1270.
- Guaratini, M.; Gomes, E.; Tamashiro, J.; Rodrigues, R. Composição florística da reserva municipal de Santa Genebra, Campinas, SP. Rev. Bras. Botânica 2008, 31, 323–337.
- Farah, F.; Rodrigues, R.; Santos, F.; Tamashiro, J.; Shepherd, G.; Siqueira, T.; Batista, J.; Manly, B. Forest destructuring as revealed by the temporal dynamics of fundamental species—Case study of Santa Genebra Forest in Brazil. Ecol. Indic. 2014, 37, 40–44.
- Béthune, S.; Muller, F.; Donnay, J. Fusion of Multispectral and Panchromatic Images by Local Mean and Variance Matching Filtering Techniques. In Proceedings of the 2nd International Conference Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images, Sophia Antipolis, France, 28–30 January 1998; pp. 1–6.
- Witharana, C.; Civco, D.L.; Meyer, T.H. Evaluation of pansharpening algorithms in support of earth observation based rapid-mapping workflows. Appl. Geogr. 2013, 37, 63–87.
- Bai, M.; Urtasun, R. Deep Watershed Transform for Instance Segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149.
- Abadi, M.; Agarwal, A.; Barham, P.; Brevdo, E.; Chen, Z.; Citro, C.; Corrado, G.S.; Davis, A.; Dean, J.; Devin, M.; et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Systems. Available online: https://www.tensorflow.org/ (accessed on 13 October 2019).
- Chollet, F. Keras. 2015. Available online: https://keras.io (accessed on 13 October 2019).
- Warmerdam, F. GDAL: Geospatial Data Abstraction Library. 2018. Available online: pypi.org/project/GDAL (accessed on 10 February 2019).
- Abdulla, W. Mask R-CNN for Object Detection and Instance Segmentation on Keras and TensorFlow. 2017. Available online: https://github.com/matterport/Mask_RCNN (accessed on 30 September 2019).
- Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46.
- Larsen, M.; Eriksson, M.; Descombes, X.; Perrin, G.; Brandtberg, T.; Gougeon, F.A. Comparison of six individual tree crown detection algorithms evaluated under varying forest conditions. Int. J. Remote Sens. 2011, 32.
- Gomes, M.F.; Maillard, P.; Deng, H. Individual tree crown detection in sub-meter satellite imagery using Marked Point Processes and a geometrical-optical model. Remote Sens. Environ. 2018, 211, 184–195.
- Silva, C.A.; Hudak, A.T.; Vierling, L.A.; Loudermilk, E.L.; O’Brien, J.J.; Hiers, J.K.; Jack, S.B.; Gonzalez-Benecke, C.; Lee, H.; Falkowski, M.J.; et al. Imputation of Individual Longleaf Pine (Pinus palustris Mill.) Tree Attributes from Field and LiDAR Data. Can. J. Remote Sens. 2016, 42, 554–573.
- Dalponte, M.; Orka, H.O.; Ene, L.T.; Gobakken, T.; Naesset, E. Tree crown delineation and tree species classification in boreal forests using hyperspectral and ALS data. Remote Sens. Environ. 2014, 140, 306–317.
- Ferreira, M.P.; Wagner, F.H.; Aragão, L.E.; Shimabukuro, Y.E.; de Souza Filho, C.R. Tree species classification in tropical forests using visible to shortwave infrared WorldView-3 images and texture analysis. ISPRS J. Photogramm. Remote Sens. 2019, 149, 119–131.
- Bastin, J.F.; Rutishauser, E.; Kellner, J.R.; Saatchi, S.; Pélissier, R.; Hérault, B.; Slik, F.; Bogaert, J.; De Cannière, C.; Marshall, A.R.; et al. Pan-tropical prediction of forest structure from the largest trees. Glob. Ecol. Biogeogr. 2018, 27, 1366–1383.
- Blanchard, E.; Birnbaum, P.; Ibanez, T.; Boutreux, T.; Antin, C.; Ploton, P.; Vincent, G.; Pouteau, R.; Vandrot, H.; Hequet, V.; et al. Contrasted allometries between stem diameter, crown area, and tree height in five tropical biogeographic areas. Trees 2016, 30, 1953–1968.
- Bilal, A.; Jourabloo, A.; Ye, M.; Liu, X.; Ren, L. Do Convolutional Neural Networks Learn Class Hierarchy? IEEE Trans. Vis. Comput. Graph. 2018, 24, 152–162.
- Dalagnol, R.; Phillips, O.L.; Gloor, E.; Galvão, L.S.; Wagner, F.H.; Locks, C.J.; Aragão, L.E.O.C. Quantifying Canopy Tree Loss and Gap Recovery in Tropical Forests under Low-Intensity Logging Using VHR Satellite Imagery and Airborne LiDAR. Remote Sens. 2019, 11, 817.
| Band Name | Band Wavelength (nm) | Band Spatial Resolution (m) |
|---|---|---|
| Panchromatic | 450 to 800 | 0.5 |
| Blue (B) | 450 to 510 | 2.0 |
| Green (G) | 510 to 580 | 2.0 |
| Red (R) | 630 to 690 | 2.0 |
| | Validation Set | Training Set |
|---|---|---|
| Minimum Area (m²) | | |
| Mean Area (m²) | | |
| Maximum Area (m²) | | |
| Hardware Component | Hardware Specification |
|---|---|
| Operational System | Windows 10 |
| CPU | Intel Core i7, 8th Gen. |
| GPU | Nvidia GeForce GTX 1070, 6 GB |
| RAM | 32 GB |
| Metric | Training | Validation |
|---|---|---|
| Class loss | | |
| Bounding box loss | | |
| Mask loss | | |
| Total loss | | |
| | | Ground Truth | |
|---|---|---|---|
| | | Crown | Non-crown |
| Segmented | Crown | 395 | 6 |
| | Non-crown | 33 | 555 |
| Number of Segments | Frequency |
|---|---|
| 1 | 355 |
| 2 | 33 |
| 3 | 4 |
| 4 | 1 |
| 5 | 2 |
| Metric | Average Value (All Tree Crowns Detected) | Average Value () |
|---|---|---|
| Score | | |
| Research | Algorithm | Detection Accuracy |
|---|---|---|
| Larsen et al. [63] | region growing | |
| Larsen et al. [63] | treetop technique | |
| Larsen et al. [63] | template matching | |
| Larsen et al. [63] | scale-space | |
| Larsen et al. [63] | Markov random fields | |
| Larsen et al. [63] | marked point process | |
| Wagner et al. [13] | edge detection and region growing | |
| Our research (PCD ≥ ) | CNN based | |
| Our research (PCD ≥ ) | CNN based | |
| Our research () | CNN based | |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
G. Braga, J.R.; Peripato, V.; Dalagnol, R.; P. Ferreira, M.; Tarabalka, Y.; O. C. Aragão, L.E.; F. de Campos Velho, H.; Shiguemori, E.H.; Wagner, F.H. Tree Crown Delineation Algorithm Based on a Convolutional Neural Network. Remote Sens. 2020, 12, 1288. https://doi.org/10.3390/rs12081288