Automatic Detection of Olive Tree Canopies for Groves with Thick Plant Cover on the Ground
<p><b>Figure 1.</b> Workflow of the method proposed in this paper. The arrows represent data transfers, and the nodes represent functions that are applied to those data. As exceptions, the initial node, <span class="html-italic">Multispectral Captures</span>, represents the data acquired by the UAV, and the final node, <span class="html-italic">Olive Tree Canopy Coordinates</span>, represents the optimal coordinates that the model is capable of predicting. Blocks with dotted outlines represent tasks that are performed beforehand.</p>
<p><b>Figure 2.</b> Capture 0, as provided by the multispectral camera: one image for each of the blue, green, red, near-infrared and red-edge bands (<b>left</b> to <b>right</b>), where the large amount of plant cover on the ground is clearly visible.</p>
<p><b>Figure 3.</b> Developed application for the manual labelling of the aerial multispectral images of olive groves. The image shown corresponds to capture 0. The regions are labelled red for the olive tree class, yellow for the shadow class, blue for the weed class and purple for the ground class. The background changes colour to indicate the class selected for labelling.</p>
<p><b>Figure 4.</b> Vegetation mask associated with capture 0. It is a black and white image in which the white pixels were predicted as belonging to the vegetation class by a one-class classification algorithm known as the local outlier factor.</p>
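The one-class classification step behind this mask can be sketched with scikit-learn's `LocalOutlierFactor` in novelty-detection mode: the classifier is fitted only on vegetation samples, and everything it does not recognise is flagged as an outlier. The band values below are synthetic stand-ins, not the paper's data.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# Hypothetical 5-band reflectance samples: vegetation pixels cluster around
# high NIR / low red values; soil pixels sit far from that cluster.
vegetation = rng.normal(loc=[0.05, 0.08, 0.04, 0.45, 0.25], scale=0.02, size=(200, 5))
soil = rng.normal(loc=[0.20, 0.25, 0.30, 0.35, 0.33], scale=0.02, size=(50, 5))

# One-class classifier: train only on the vegetation class, then flag
# anything that does not resemble it as an outlier (novelty detection).
lof = LocalOutlierFactor(n_neighbors=20, novelty=True)
lof.fit(vegetation)

pred = lof.predict(np.vstack([vegetation[:10], soil[:10]]))  # +1 inlier, -1 outlier
mask = pred == 1  # True -> vegetation (a white pixel in the mask)
```

Applied per pixel, this yields a binary vegetation mask like the one in the figure.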
<p><b>Figure 5.</b> Probability image associated with capture 0. It is a greyscale image in which the values of each pixel indicate the probability of belonging to the olive tree class. This probability is calculated by a decision tree that receives as input only the pixels classified as vegetation during training.</p>
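A minimal sketch of that probability image, assuming a scikit-learn decision tree trained on vegetation pixels only; the two sub-classes and their band values below are hypothetical, not the paper's data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Hypothetical band values for the two vegetation sub-classes.
olive = rng.normal([0.04, 0.07, 0.03, 0.50, 0.28], 0.02, size=(150, 5))
weed = rng.normal([0.06, 0.10, 0.05, 0.38, 0.22], 0.02, size=(150, 5))
X = np.vstack([olive, weed])
y = np.array([1] * 150 + [0] * 150)  # 1 = olive tree, 0 = other vegetation

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

# Per-pixel probability of the olive tree class -> greyscale probability image.
proba = tree.predict_proba(X)[:, 1]
```

Writing `proba` back into the pixel grid gives a greyscale image like the one in the figure.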
<p><b>Figure 6.</b> Images resulting from each stage of the <span class="html-italic">OTCM</span> block associated with capture 0. The resulting image of each stage is the one received as input by the next stage. (<b>a</b>) Clipping and normalisation. (<b>b</b>) Noise filtering. (<b>c</b>) Density computation. (<b>d</b>) Segmentation. (<b>e</b>) Centroid extraction.</p>
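The five OTCM stages named in the caption can be sketched with SciPy on a toy probability image; the specific filters, the toy image and the small parameter values (p, k, s) below are illustrative assumptions, not the paper's implementation or its optimal parameters.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# Hypothetical probability image with two bright blobs (tree canopies).
prob = rng.uniform(0, 0.3, size=(60, 60))
prob[10:20, 10:20] = 0.95
prob[35:50, 38:52] = 0.92

p, k, s = 0.9, 5, 10  # probability threshold, kernel size, segmentation threshold

# (a) Clipping and normalisation: keep pixels above the probability threshold.
clipped = (prob >= p).astype(float)
# (b) Noise filtering: a median filter removes isolated pixels.
filtered = ndimage.median_filter(clipped, size=3)
# (c) Density computation: local count of kept pixels over a k x k kernel.
density = ndimage.uniform_filter(filtered, size=k) * k * k
# (d) Segmentation: threshold the density map and label connected regions.
labels, n = ndimage.label(density >= s)
# (e) Centroid extraction: one (row, col) coordinate per labelled canopy.
centroids = ndimage.center_of_mass(density, labels, range(1, n + 1))
```

Each intermediate array corresponds to one panel of the figure, and `centroids` is the final list of canopy coordinates.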
<p><b>Figure 7.</b> Delaunay triangulation using the coordinates of the olive tree canopies from capture 0, obtained with the optimal combination of parameters calculated by the model, whose values are 0.9, 25 and 25 for the probability threshold, the kernel size and the segmentation threshold, respectively.</p>
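A triangulation like the one in this figure can be reproduced with SciPy; the four canopy coordinates below are hypothetical placeholders for the output of the OTCM block.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical canopy centroids (row, col); the real ones come from the OTCM block.
points = np.array([[14.5, 14.5], [42.0, 44.5], [15.0, 50.0], [45.0, 12.0]])

tri = Delaunay(points)
# Each row of tri.simplices holds the indices of the three canopies
# forming one triangle of the mesh.
```

Four points in convex position always yield exactly two Delaunay triangles, which is easy to verify on this toy input.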
<p><b>Figure 8.</b> Graphical representation associated with capture 0 in <math display="inline"><semantics> <msup> <mi mathvariant="double-struck">R</mi> <mn>3</mn> </msup> </semantics></math> of the discrete function <span class="html-italic">F</span>, assigning a constant value to one of the parameters in each case. The selected values are the optimal ones subsequently computed by the model. (<b>a</b>) <math display="inline"><semantics> <mrow> <mi>p</mi> <mo>=</mo> <mn>0.9</mn> </mrow> </semantics></math>. (<b>b</b>) <math display="inline"><semantics> <mrow> <mi>k</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>. (<b>c</b>) <math display="inline"><semantics> <mrow> <mi>s</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>. (<b>d</b>) <math display="inline"><semantics> <mrow> <mi>p</mi> <mo>=</mo> <mn>0.9</mn> </mrow> </semantics></math>. (<b>e</b>) <math display="inline"><semantics> <mrow> <mi>k</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>. (<b>f</b>) <math display="inline"><semantics> <mrow> <mi>s</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>.</p>
<p><b>Figure 9.</b> Graphical representation of the redefined function <span class="html-italic">F</span> associated with capture 0, fixing one of the parameters in each case. It is observed how the domain of the function changes when it is redefined. The black dot represents the minimum value of the function. (<b>a</b>) <math display="inline"><semantics> <mrow> <mi>p</mi> <mo>=</mo> <mn>0.9</mn> </mrow> </semantics></math>. (<b>b</b>) <math display="inline"><semantics> <mrow> <mi>k</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>. (<b>c</b>) <math display="inline"><semantics> <mrow> <mi>s</mi> <mo>=</mo> <mn>25</mn> </mrow> </semantics></math>.</p>
<p><b>Figure 10.</b> Ground truth associated with capture 0. It is a black and white image in which the white pixels have been manually labelled by a person in order to calculate the quality metrics of the model.</p>
<p><b>Figure 11.</b> Resulting images obtained and their comparison with the ground truth. Each green mark and contour set indicates a true positive, each red mark indicates a false positive and each red contour indicates a false negative. (<b>a</b>) Capture 0. (<b>b</b>) Capture 5. (<b>c</b>) Capture 8. (<b>d</b>) Capture 11. (<b>e</b>) Capture 17.</p>
Abstract
1. Introduction
2. Materials and Methods
2.1. Multispectral Captures
- A Micasense RedEdge-M multispectral camera, capable of capturing 12-bit images with a 1280 × 960 pixel resolution in five bands of the electromagnetic spectrum. These bands are blue (475 nm), green (560 nm), red (668 nm), near-infrared (840 nm) and red-edge (717 nm).
- A Micasense Downwelling Light Sensor (DLS) 1, a sunlight sensor capable of measuring the ambient light in each of the five bands of the Micasense RedEdge-M camera.
- A Ublox M8N global positioning system (GPS) receiver.
2.2. Radiometric Correction and Band Alignment (RCBA)
2.2.1. Raw Images to Radiance Images Conversion
2.2.2. Radiance Images to Reflectance Images Conversion
2.2.3. Band Alignment
2.3. Manual Labelling
2.4. Vegetation Classification (VC)
2.5. Olive Tree Canopy Estimation (OTCE)
2.6. Olive Tree Canopy Marking (OTCM)
3. Results and Discussion
- Positive: A set of connected pixels (a contour) marked as an olive tree canopy in the ground truth.
- Negative: A set of connected pixels not marked as an olive tree canopy in the ground truth.
- Predicted positive: Any pixel coordinate computed by the model as corresponding to an olive tree canopy.
- Predicted negative: Not applicable, since the model does not compute coordinates where there is no olive tree canopy.
- True positive: Any predicted positive whose coordinates lie within the bounds of some positive.
- True negative: Not applicable, since there are no predicted negatives.
- False positive: Any predicted positive whose coordinates do not lie within the bounds of any positive.
- False negative: Any positive for which no predicted positive coordinate lies within its bounds.
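From these definitions, the metrics reported in the results table follow directly. A minimal check against the capture 0 row of that table (P = 104, PP = 119, TP = 101, FN = 3, FP = 18):

```python
# Metric check against the capture 0 row of the results table.
P, PP, TP, FN, FP = 104, 119, 101, 3, 18

precision = TP / PP  # equivalently TP / (TP + FP), since PP = TP + FP
recall = TP / P      # for this capture P = TP + FN, so this equals TP / (TP + FN)
f_score = 2 * precision * recall / (precision + recall)

print(round(recall, 3), round(precision, 3), round(f_score, 3))  # 0.971 0.849 0.906
```

The three rounded values match the recall, precision and F score listed in the table for capture 0.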
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
BRDF | Bidirectional Reflectance Distribution Function
CNN | Convolutional Neural Network
CRP | Calibrated Reflectance Panel
DSM | Digital Surface Model
DD | Decimal Degrees
DLS | Downwelling Light Sensor
DN | Digital Numbers
DP | Data Preprocessing
EU | European Union
FAO | Food and Agriculture Organization of the United Nations
FN | False Negative
FP | False Positive
FPS | Frames Per Second
GPS | Global Positioning System
GSD | Ground Sample Distance
HCRF | Hemispherical–Conical Reflectance Factor
HDRF | Hemispherical–Directional Reflectance Factor
IFOV | Instantaneous Field of View
LiDAR | Laser Imaging Detection and Ranging
LM | Local Maximum
OCC | One-Class Classification
OTCD | Olive Tree Canopy Detection Model
OTCE | Olive Tree Canopy Estimation
OTCM | Olive Tree Canopy Marking
P | Positive
PN | Predicted Negative
PP | Predicted Positive
RCBA | Radiometric Correction and Band Alignment
RMSE | Root Mean Square Error
ROI | Region Of Interest
TN | True Negative
TP | True Positive
UAV | Unmanned Aerial Vehicle
VC | Vegetation Classification
References
- FAOSTAT. Available online: https://www.fao.org/faostat/en/#data/QCL (accessed on 8 February 2022).
- Economic Affairs & Promotion Unit–International Olive Council. Available online: https://www.internationaloliveoil.org/what-we-do/economic-affairs-promotion-unit/#figures (accessed on 8 February 2022).
- Marchal, P.; Gila, D.; García, J.; Ortega, J. Fuzzy Decision Support System for the Determination of the Set Points of Relevant Variables in the Virgin Olive Oil Elaboration Process. In Proceedings of the 2013 IEEE International Conference on Systems, Man, and Cybernetics, SMC, Manchester, UK, 13–16 October 2013; pp. 3489–3494.
- Bordons, C.; Núñez-Reyes, A. Model Based Predictive Control of an Olive Oil Mill. J. Food Eng. 2008, 84, 1–11.
- Furferi, R.; Carfagni, M.; Daou, M. Artificial Neural Network Software for Real-Time Estimation of Olive Oil Qualitative Parameters during Continuous Extraction. Comput. Electron. Agric. 2007, 55, 115–131.
- Fernández, J.; Alcon, F.; Diaz-Espejo, A.; Hernandez-Santana, V.; Cuevas, M. Water Use Indicators and Economic Analysis for On-Farm Irrigation Decision: A Case Study of a Super High Density Olive Tree Orchard. Agric. Water Manag. 2020, 237, 106074.
- Riveros-Burgos, C.; Ortega-Farías, S.; Morales-Salinas, L.; Fuentes-Peñailillo, F.; Tian, F. Assessment of the Clumped Model to Estimate Olive Orchard Evapotranspiration Using Meteorological Data and UAV-based Thermal Infrared Imagery. Irrig. Sci. 2021, 39, 63–80.
- Tovar, M.; Motilva, M.; Romero, M. Changes in the Phenolic Composition of Virgin Olive Oil from Young Trees (Olea Europaea L. Cv. Arbequina) Grown under Linear Irrigation Strategies. J. Agric. Food Chem. 2001, 49, 5502–5508.
- Angelopoulos, K.; Dichio, B.; Xiloyannis, C. Inhibition of Photosynthesis in Olive Trees (Olea europaea L.) during Water Stress and Rewatering. J. Exp. Bot. 1996, 47, 1093–1100.
- Brito, C.; Dinis, L.T.; Moutinho-Pereira, J.; Correia, C. Drought Stress Effects and Olive Tree Acclimation under a Changing Climate. Plants 2019, 8, 232.
- Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.; Ranieri, N.; Gadaleta, G.; et al. A Geostatistical Fusion Approach Using UAV Data for Probabilistic Estimation of Xylella Fastidiosa Subsp. Pauca Infection in Olive Trees. Sci. Total Environ. 2021, 752, 141814.
- Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.; Ranieri, N.; Gadaleta, G.; et al. Semi-Automatic Method for Early Detection of Xylella Fastidiosa in Olive Trees Using UAV Multispectral Imagery and Geostatistical-Discriminant Analysis. Remote Sens. 2021, 13, 14.
- Adamo, F.; Attivissimo, F.; Di Nisio, A.; Ragolia, M.A.; Scarpetta, M. A New Processing Method to Segment Olive Trees and Detect Xylella Fastidiosa in UAVs Multispectral Images. In Proceedings of the 2021 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Glasgow, UK, 17–20 May 2021; pp. 1–6.
- Blekos, K.; Tsakas, A.; Xouris, C.; Evdokidis, I.; Alexandropoulos, D.; Alexakos, C.; Katakis, S.; Makedonas, A.; Theoharatos, C.; Lalos, A. Analysis, Modeling and Multi-Spectral Sensing for the Predictive Management of Verticillium Wilt in Olive Groves. J. Sens. Actuator Netw. 2021, 10, 15.
- Noguera, M.; Aquino, A.; Ponce, J.; Cordeiro, A.; Silvestre, J.; Calderón, R.; Marcelo, M.; Pedro, J.; Andújar, J. Nutritional Status Assessment of Olive Crops by Means of the Analysis and Modelling of Multispectral Images Taken with UAVs. Biosyst. Eng. 2021, 211, 1–18.
- Cano Marchal, P.; Martínez Gila, D.; Illana Rico, S.; Gómez Ortega, J.; Gámez García, J. Assessment of the Nutritional State for Olive Trees Using UAVs. Lect. Notes Electr. Eng. 2021, 695, 284–292.
- Díaz-Varela, R.; de la Rosa, R.; León, L.; Zarco-Tejada, P. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232.
- Anifantis, A.; Camposeo, S.; Vivaldi, G.; Santoro, F.; Pascuzzi, S. Comparison of UAV Photogrammetry and 3D Modeling Techniques with Other Currently Used Methods for Estimation of the Tree Row Volume of a Super-High-Density Olive Orchard. Agriculture 2019, 9, 233.
- Wang, P.; Wang, L.; Leung, H.; Zhang, G. Super-Resolution Mapping Based on Spatial–Spectral Correlation for Spectral Imagery. IEEE Trans. Geosci. Remote Sens. 2021, 59, 2256–2268.
- Berardino, P.; Fornaro, G.; Lanari, R.; Sansosti, E. A New Algorithm for Surface Deformation Monitoring Based on Small Baseline Differential SAR Interferograms. IEEE Trans. Geosci. Remote Sens. 2002, 40, 2375–2383.
- Liu, J.; Wu, Y.; Gao, X.; Zhang, X. A Simple Method of Mapping Landslides Runout Zones Considering Kinematic Uncertainties. Remote Sens. 2022, 14, 668.
- Di Nisio, A.; Adamo, F.; Acciani, G.; Attivissimo, F. Fast Detection of Olive Trees Affected by Xylella Fastidiosa from UAVs Using Multispectral Imaging. Sensors 2020, 20, 4915.
- Ortega-Farías, S.; Ortega-Salazar, S.; Poblete, T.; Kilic, A.; Allen, R.; Poblete-Echeverría, C.; Ahumada-Orellana, L.; Zuñiga, M.; Sepúlveda, D. Estimation of Energy Balance Components over a Drip-Irrigated Olive Orchard Using Thermal and Multispectral Cameras Placed on a Helicopter-Based Unmanned Aerial Vehicle (UAV). Remote Sens. 2016, 8, 638.
- Modica, G.; Messina, G.; De Luca, G.; Fiozzo, V.; Praticò, S. Monitoring the Vegetation Vigor in Heterogeneous Citrus and Olive Orchards. A Multiscale Object-Based Approach to Extract Trees’ Crowns from UAV Multispectral Imagery. Comput. Electron. Agric. 2020, 175, 105500.
- Waleed, M.; Um, T.W.; Khan, A.; Khan, U. Automatic Detection System of Olive Trees Using Improved K-Means Algorithm. Remote Sens. 2020, 12, 760.
- Li, W.; Fu, H.; Yu, L.; Cracknell, A. Deep Learning Based Oil Palm Tree Detection and Counting for High-Resolution Remote Sensing Images. Remote Sens. 2017, 9, 22.
- Daliakopoulos, I.N.; Grillakis, E.G.; Koutroulis, A.G.; Tsanis, I.K. Tree Crown Detection on Multispectral VHR Satellite Imagery. Photogramm. Eng. Remote Sens. 2009, 75, 1201–1211.
- Wang, L.; Gong, P.; Biging, G.S. Individual Tree-Crown Delineation and Treetop Detection in High-Spatial-Resolution Aerial Imagery. Photogramm. Eng. Remote Sens. 2004, 70, 351–357.
- Erikson, M. Segmentation of Individual Tree Crowns in Colour Aerial Photographs Using Region Growing Supported by Fuzzy Rules. Can. J. For. Res. 2003, 33, 1557–1563.
- Weinstein, B.G.; Marconi, S.; Bohlman, S.; Zare, A.; White, E. Individual Tree-Crown Detection in RGB Imagery Using Semi-Supervised Deep Learning Neural Networks. Remote Sens. 2019, 11, 1309.
- Csillik, O.; Cherbini, J.; Johnson, R.; Lyons, A.; Kelly, M. Identification of Citrus Trees from Unmanned Aerial Vehicle Imagery Using Convolutional Neural Networks. Drones 2018, 2, 39.
- Xu, W.; Deng, S.; Liang, D.; Cheng, X. A Crown Morphology-Based Approach to Individual Tree Detection in Subtropical Mixed Broadleaf Urban Forests Using UAV LiDAR Data. Remote Sens. 2021, 13, 1278.
- Gao, M.; Yang, F.; Wei, H.; Liu, X. Individual Maize Location and Height Estimation in Field from UAV-Borne LiDAR and RGB Images. Remote Sens. 2022, 14, 2292.
- Koc-San, D.; Selim, S.; Aslan, N.; San, B.T. Automatic Citrus Tree Extraction from UAV Images and Digital Surface Models Using Circular Hough Transform. Comput. Electron. Agric. 2018, 150, 289–301.
- Mu, Y.; Fujii, Y.; Takata, D.; Zheng, B.; Noshita, K.; Honda, K.; Ninomiya, S.; Guo, W. Characterization of Peach Tree Crown by Using High-Resolution Images from an Unmanned Aerial Vehicle. Hortic. Res. 2018, 5, 74.
- Marques, P.; Pádua, L.; Adão, T.; Hruška, J.; Peres, E.; Sousa, A.; Sousa, J.J. UAV-Based Automatic Detection and Monitoring of Chestnut Trees. Remote Sens. 2019, 11, 855.
- Jiang, H.; Chen, S.; Li, D.; Wang, C.; Yang, J. Papaya Tree Detection with UAV Images Using a GPU-Accelerated Scale-Space Filtering Method. Remote Sens. 2017, 9, 721.
- Larsen, M.; Eriksson, M.; Descombes, X.; Perrin, G.; Brandtberg, T.; Gougeon, F.A. Comparison of Six Individual Tree Crown Detection Algorithms Evaluated under Varying Forest Conditions. Int. J. Remote Sens. 2011, 32, 5827–5852.
- Peña, J.; Torres-Sánchez, J.; de Castro, A.; Kelly, M.; López-Granados, F. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PLoS ONE 2013, 8, e77151.
- Sa, I.; Chen, Z.; Popovic, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. WeedNet: Dense Semantic Weed Classification Using Multispectral Images and MAV for Smart Farming. IEEE Robot. Autom. Lett. 2018, 3, 588–595.
- de Castro, A.; Torres-Sánchez, J.; Peña, J.; Jiménez-Brenes, F.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
- Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. A Semi-Supervised System for Weed Mapping in Sunflower Crops Using Unmanned Aerial Vehicles and a Crop Row Detection Method. Appl. Soft Comput. 2015, 37, 533–544.
- Dian Bah, M.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690.
- Nicodemus, F.E.; Richmond, J.; Hsia, J.; Ginsberg, I.; Limperis, T. Geometrical Considerations and Nomenclature for Reflectance; National Bureau of Standards: Gaithersburg, MD, USA, 1977.
- Evangelidis, G.D.; Psarakis, E.Z. Parametric Image Alignment Using Enhanced Correlation Coefficient Maximization. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 1858–1865.
- Li, J.; Meng, L.; Yang, B.; Tao, C.; Li, L.; Zhang, W. LabelRS: An Automated Toolbox to Make Deep Learning Samples from Remote Sensing Images. Remote Sens. 2021, 13, 2064.
- Novelty Detection with Local Outlier Factor (LOF). Available online: https://scikit-learn/stable/auto_examples/neighbors/plot_lof_novelty_detection.html (accessed on 8 February 2022).
- Abozeid, A.; Alanazi, R.; Elhadad, A.; Taloba, A.I.; Abd El-Aziz, R.M. A Large-Scale Dataset and Deep Learning Model for Detecting and Counting Olive Trees in Satellite Imagery. Comput. Intell. Neurosci. 2022, 2022, e1549842.
- Safonova, A.; Guirado, E.; Maglinets, Y.; Alcaraz-Segura, D.; Tabik, S. Olive Tree Biovolume from UAV Multi-Resolution Image Segmentation with Mask R-CNN. Sensors 2021, 21, 1617.
- Waleed, M.; Um, T.W.; Khan, A.; Ahmad, Z. An Automated Method for Detection and Enumeration of Olive Trees Through Remote Sensing. IEEE Access 2020, 8, 108592–108601.
- Salamí, E.; Gallardo, A.; Skorobogatov, G.; Barrado, C. On-the-Fly Olive Tree Counting Using a UAS and Cloud Services. Remote Sens. 2019, 11, 316.
- Khan, A.; Khan, U.; Waleed, M.; Khan, A.; Kamal, T.; Marwat, S.N.K.; Maqsood, M.; Aadil, F. Remote Sensing: An Automated Methodology for Olive Tree Detection and Counting in Satellite Images. IEEE Access 2018, 6, 77816–77828.
- Karydas, C.; Gewehr, S.; Iatrou, M.; Iatrou, G.; Mourelatos, S. Olive Plantation Mapping on a Sub-Tree Scale with Object-Based Image Analysis of Multispectral UAV Data; Operational Potential in Tree Stress Monitoring. J. Imaging 2017, 3, 57.
- Chemin, Y.H.; Beck, P.S.A. A Method to Count Olive Trees in Heterogenous Plantations from Aerial Photographs. Geoinformatics 2017, Preprint.
- Peters, J.; Van Coillie, F.; Westra, T.; De Wulf, R. Synergy of Very High Resolution Optical and Radar Data for Object-Based Olive Grove Mapping. Int. J. Geogr. Inf. Sci. 2011, 25, 971–989.
- Moreno-Garcia, J.; Linares, L.J.; Rodriguez-Benitez, L.; Solana-Cipres, C. Olive Trees Detection in Very High Resolution Images. In Proceedings of the Information Processing and Management of Uncertainty in Knowledge-Based Systems; Hüllermeier, E., Kruse, R., Hoffmann, F., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 21–29.
- González, J.; Galindo, C.; Arevalo, V.; Ambrosio, G. Applying Image Analysis and Probabilistic Techniques for Counting Olive Trees in High-Resolution Satellite Images. In Proceedings of the Advanced Concepts for Intelligent Vision Systems, Delft, The Netherlands, 28–31 August 2007; Blanc-Talon, J., Philips, W., Popescu, D., Scheunders, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 920–931.
ID | Campaign | Date and Time [ISO8601] | Latitude [DD] | Longitude [DD] | Altitude above Sea Level [m] | Altitude above Ground Level [m] | Ground Sample Distance [cm/px]
---|---|---|---|---|---|---|---
0 | 2 | 2018-10-04T11:00:00Z | 37.316932 | −3.354593 | 1408.694 | 188.774 | 13.109
1 | 2 | 2018-10-04T11:02:56Z | 37.316660 | −3.354487 | 1407.542 | 187.622 | 13.029
2 | 2 | 2018-10-04T11:11:48Z | 37.314839 | −3.357709 | 1406.342 | 186.422 | 12.946
3 | 3 | 2018-11-08T11:33:28Z | 37.316566 | −3.354403 | 1403.569 | 180.834 | 12.558
4 | 3 | 2018-11-08T11:33:32Z | 37.316642 | −3.354134 | 1403.604 | 180.869 | 12.560
5 | 3 | 2018-11-08T11:59:06Z | 37.314442 | −3.354405 | 1402.180 | 179.757 | 12.483
6 | 4 | 2018-11-29T09:37:17Z | 37.316852 | −3.354528 | 1402.846 | 182.051 | 12.642
7 | 4 | 2018-11-29T09:40:04Z | 37.316620 | −3.354438 | 1402.582 | 181.787 | 12.624
8 | 4 | 2018-11-29T10:18:13Z | 37.314431 | −3.354456 | 1401.603 | 181.069 | 12.574
9 | 5 | 2019-07-18T08:13:38Z | 37.316618 | −3.354430 | 1409.757 | 185.211 | 12.862
10 | 5 | 2019-07-18T08:13:44Z | 37.316764 | −3.353918 | 1409.754 | 185.208 | 12.862
11 | 5 | 2019-07-18T09:04:03Z | 37.314307 | −3.354476 | 1412.117 | 189.000 | 13.125
12 | 6 | 2019-10-03T08:16:23Z | 37.316612 | −3.354427 | 1408.163 | 183.848 | 12.767
13 | 6 | 2019-10-03T08:17:19Z | 37.316682 | −3.353719 | 1409.065 | 184.750 | 12.830
14 | 6 | 2019-10-03T09:16:04Z | 37.314328 | −3.354328 | 1408.841 | 183.622 | 12.752
15 | 7 | 2019-11-07T08:45:24Z | 37.316665 | −3.354254 | 1401.534 | 177.614 | 12.334
16 | 7 | 2019-11-07T08:46:20Z | 37.316640 | −3.353892 | 1400.133 | 176.213 | 12.237
17 | 7 | 2019-11-07T11:09:46Z | 37.314307 | −3.354477 | 1403.273 | 178.476 | 12.394
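The ground sample distance column is consistent with the usual pinhole relation GSD = flight height × pixel pitch / focal length. The sketch below assumes the nominal Micasense RedEdge-M optics (5.4 mm focal length, 3.75 µm pixel pitch); those sensor figures are not stated in the table and are an assumption here.

```python
# Assumed nominal Micasense RedEdge-M optics (not stated in the table).
FOCAL_LENGTH_MM = 5.4
PIXEL_PITCH_UM = 3.75

def gsd_cm_per_px(altitude_above_ground_m: float) -> float:
    """GSD = flight height x pixel pitch / focal length, converted to cm/px."""
    return (altitude_above_ground_m * 100) * (PIXEL_PITCH_UM * 1e-6) / (FOCAL_LENGTH_MM * 1e-3)

# Capture 0 flew 188.774 m above ground; the table lists 13.109 cm/px.
print(round(gsd_cm_per_px(188.774), 3))  # 13.109
```

The same relation reproduces the other rows as well, e.g. 180.834 m gives 12.558 cm/px for capture 3.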
[Table of radiometric correction parameters, their metadata tags and values; the table contents were not recovered from the source.]
ID | Fold | Positives (P) | Predicted Positives (PP) | True Positives (TP) | False Negatives (FN) | False Positives (FP) | Recall | Precision | F Score
---|---|---|---|---|---|---|---|---|---
2 | 0 | 154 | 166 | 134 | 21 | 32 | 0.870 | 0.807 | 0.838
1 | 0 | 125 | 141 | 118 | 7 | 23 | 0.944 | 0.837 | 0.887
3 | 0 | 112 | 115 | 108 | 4 | 7 | 0.964 | 0.939 | 0.952
7 | 0 | 106 | 98 | 94 | 13 | 4 | 0.887 | 0.959 | 0.922
6 | 1 | 100 | 134 | 96 | 4 | 38 | 0.960 | 0.716 | 0.821
16 | 1 | 100 | 123 | 76 | 26 | 47 | 0.760 | 0.618 | 0.682
5 | 1 | 238 | 247 | 193 | 45 | 54 | 0.811 | 0.781 | 0.796
13 | 1 | 103 | 120 | 51 | 52 | 69 | 0.495 | 0.425 | 0.457
10 | 2 | 115 | 121 | 110 | 5 | 11 | 0.957 | 0.909 | 0.932
12 | 2 | 111 | 136 | 90 | 23 | 46 | 0.811 | 0.662 | 0.729
15 | 2 | 103 | 148 | 67 | 57 | 81 | 0.650 | 0.453 | 0.534
0 | 2 | 104 | 119 | 101 | 3 | 18 | 0.971 | 0.849 | 0.906
8 | 3 | 226 | 238 | 31 | 195 | 207 | 0.137 | 0.130 | 0.134
11 | 3 | 270 | 285 | 263 | 7 | 22 | 0.974 | 0.923 | 0.948
4 | 3 | 102 | 119 | 58 | 44 | 61 | 0.569 | 0.487 | 0.525
14 | 4 | 263 | 264 | 231 | 32 | 33 | 0.878 | 0.875 | 0.877
17 | 4 | 225 | 236 | 149 | 80 | 87 | 0.662 | 0.631 | 0.646
9 | 4 | 123 | 121 | 107 | 16 | 14 | 0.870 | 0.884 | 0.877
Total | — | 2680 | 2931 | 2077 | 634 | 854 | 0.775 | 0.709 | 0.740
ID | Fold | VC Training Time (s) | OTCE Training Time (s) | VC Prediction Time (s) | OTCE Prediction Time (s) | OTCM Prediction Time (s) for 5700 Iterations
---|---|---|---|---|---|---
2 | 0 | 46.3 | 6.4 | 28.7 | 0.15 | 56.7
1 | 0 | 46.3 | 6.4 | 40.4 | 0.24 | 65.5
3 | 0 | 46.3 | 6.4 | 40.1 | 0.26 | 57.7
7 | 0 | 46.3 | 6.4 | 34.8 | 0.13 | 55.2
6 | 1 | 57.7 | 6.4 | 29.7 | 0.12 | 60.1
16 | 1 | 57.7 | 6.4 | 46.6 | 0.16 | 66.0
5 | 1 | 57.7 | 6.4 | 34.8 | 0.19 | 59.6
13 | 1 | 57.7 | 6.4 | 33.8 | 0.07 | 51.0
10 | 2 | 54.9 | 7.3 | 24.5 | 0.05 | 43.6
12 | 2 | 54.9 | 7.3 | 28.8 | 0.11 | 60.9
15 | 2 | 54.9 | 7.3 | 28.5 | 0.11 | 71.7
0 | 2 | 54.9 | 7.3 | 30.8 | 0.22 | 61.1
8 | 3 | 66.4 | 6.7 | 28.2 | 0.14 | 56.6
11 | 3 | 66.4 | 6.7 | 26.1 | 0.04 | 52.2
4 | 3 | 66.4 | 6.7 | 28.1 | 0.15 | 53.1
14 | 4 | 63.6 | 6.8 | 29.2 | 0.10 | 50.9
17 | 4 | 63.6 | 6.8 | 38.7 | 0.23 | 54.3
9 | 4 | 63.6 | 6.8 | 29.9 | 0.09 | 53.0
Mean | — | 57.0 | 6.7 | 32.3 | 0.14 | 57.2
Reference | Method | Publication Date | Dataset | Channels |
---|---|---|---|---|
- | Proposed | - | MicaSense RedEdge-M | Red, green and blue
[48] | Deep learning model (SwinTUnet) based on Unet-like networks | 15 January 2022 | Satellites Pro | Red, green and blue |
[49] | Orthophotos + Mask R-CNN | 25 February 2021 | Parrot Sequoia camera | Red, green, blue and near infrared |
[50] | Edge detection + circular Hough transform | 1 June 2020 | SIGPAC viewer | Red |
[25] | Laplacian of Gaussian + improved k-means clustering | 26 February 2020 | SIGPAC viewer | Red, green and blue |
[51] | Colour-based vs. stereo-vision-based segmentation | 5 February 2019 | DJI Phantom4 camera | Red, green and blue |
[52] | Multi-level thresholding + circular Hough transform | 4 December 2018 | SIGPAC viewer | Red, green and blue |
[53] | Radiometrically corrected orthophotos + object-based image analysis | 4 December 2017 | Modified multiSPEC 4C camera | Red, green, red-edge and near-infrared
[54] | Orthophotos + Thresholding + watershed analysis + microbiological cell counting algorithm | 27 October 2017 | Leica ADS40, ADS80, ADS100 and DMC III cameras | Red, green, blue and near-infrared |
[55] | Optical + radar data + object-based classification | 19 July 2011 | ADS40 Airborne Digital Sensor + RAMSES + TerraSAR-X satellite | Panchromatic, red, green, blue, near-infrared and X |
[56] | K-means clustering | 2 July 2010 | SIGPAC viewer | Red, green and blue
[57] | Reticular matching | 31 August 2007 | Quickbird | Panchromatic |
Reference | No. of Images | Accuracy | Precision | Recall | Omission Error Rate | Commission Error Rate | Estimation Error |
---|---|---|---|---|---|---|---|
- | 18 | N/A | 70.9% | 77.5% | 22.5% | N/A | 9.37% |
[48] | 230 | 98.3% | N/A | 98.8% | 1.2% | 0.97% | 0.94% |
[49] | 150 | N/A | 90.07–100% | 90.83–100% | 0–9.17% | N/A | N/A |
[50] | 60 | N/A | N/A | 96% | 4% | 1.2% | 1.27% |
[25] | 110 | 97.5% | N/A | 99% | 1% | 4% | 0.97% |
[51] | 10 | N/A | 99.84% | 97.61% | 2.39% | N/A | N/A |
[52] | N/A | 96% | N/A | 97% | 3% | 3% | 1.2% |
[53] | 315 | 93% | 91.6% | 95.9% | 4.1% | 10.57% | 4.76% |
[54] | 4 | N/A | N/A | N/A | N/A | N/A | 4–27% |
[55] | 3 | 76.8–90.5% | N/A | 16–66.7% | 33.3–84% | 0.6–8.4% | N/A |
[56] | N/A | N/A | N/A | 83.33% | 16.67% | 0% | |
[57] | 3 | N/A | N/A | 93% | 7% | 5% | 1.24% |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Illana Rico, S.; Martínez Gila, D.M.; Cano Marchal, P.; Gómez Ortega, J. Automatic Detection of Olive Tree Canopies for Groves with Thick Plant Cover on the Ground. Sensors 2022, 22, 6219. https://doi.org/10.3390/s22166219