Multispectral LiDAR Point Cloud Classification: A Two-Step Approach
"> Figure 1
<p>Photo of the realistic experimental scene.</p> "> Figure 2
<p>Point cloud of the experimental materials. The white wall points are almost white for high reflectance, which is why they are nearly invisible in this picture.</p> "> Figure 3
<p>Normalized spectral reflectance variation of seven targets on four wavelengths: (<b>a</b>) 556; (<b>b</b>) 670; (<b>c</b>) 700; and (<b>d</b>) 780 nm. The upper, bottom edge, and middle line of the box represent the 25th and 75th percentiles, and the median of the point cloud, respectively. The length of the dotted line is 1.5 times that of the box unless the end of dotted line contains all external points. Outliers are marked with crosses.</p> "> Figure 4
<p>Manually labeled multispectral light detection and ranging (LiDAR) point cloud in the seven materials: white wall, white paper box, cactus, ceramic flowerpot, healthy scindapsus leaves, withered scindapsus leaves and plastic foam, which are shown in blue, red, orange, yellow, green, brown and purple, respectively.</p> "> Figure 5
<p>Classification results based on raw spectral reflectance. Different colors represent the different results of targets. The representative color is the same as the training samples shown in <a href="#remotesensing-09-00373-f004" class="html-fig">Figure 4</a>.</p> "> Figure 6
<p>Classification of artificial (white wall, white paper box, ceramic flowerpot, and plastic foam) and vegetable (cactus and scindapsus leaves) targets on the basis of (<b>a</b>) raw spectral reflectance; and (<b>b</b>) five VIs. Red and green points represent the artificial and vegetable samples, respectively.</p> "> Figure 7
<p>Variation of the five types of vegetation indexes of healthy (red) and withered (blue) scindapsus leaves: Chlorophyll Absorption Reflectance Index 1, Normalized Difference Red Edge, Modified Triangular Vegetation Index 1, Green Normalized Difference Vegetation Index, and Gitelson. The upper, bottom edge, and middle line of the box represent the 25th and 75th percentiles, and the median of the VI value, respectively. The length of the dotted line is 1.5 times that of the box unless end of the dotted line contains the external point. Outliers are marked with red crosses.</p> "> Figure 8
<p>Healthy and withered scindapsus leaves classification results based on (<b>a</b>) VI and (<b>b</b>) spectral reflectance. Green and brown points indicate healthy and withered leaves, respectively. The result indicates that the VI is more sensitive to the growing condition of leaves, which makes it helpful for discriminating between healthy and withered leaves.</p> "> Figure 9
<p>Reclassification result of seven individual targets based on the k-NN algorithm with spatial information. Different colors represent different targets. The representative colors are the same as those of the training samples in <a href="#remotesensing-09-00373-f004" class="html-fig">Figure 4</a>.</p> "> Figure 10
<p>Colorful points represent the points whose class changed after reclassification based on the k-NN algorithm with spatial information. Gray, green, and red points represent the unchanged points, the correctly changed points, and the falsely changed points, respectively.</p> "> Figure 11
<p>Classification result of the seven individual targets based on Gong’s method [<a href="#B38-remotesensing-09-00373" class="html-bibr">38</a>]. Different colors represent different targets. The representative colors are the same as those of the training samples in <a href="#remotesensing-09-00373-f004" class="html-fig">Figure 4</a>.</p> ">
Abstract
1. Introduction
2. Materials
2.1. Equipment
2.2. Materials and Data
2.3. Variation of Raw Spectral Reflectance
3. Methods
3.1. Classification Based on Raw Spectral Reflectance
3.2. Classification Based on Vegetation Index (VI)
3.3. Reclassification Based on Neighborhood Spatial Information with k-Nearest Neighbors Algorithm
4. Results
4.1. Classification Based on Raw Spectral Reflectance
4.2. Classification Based on VI
4.3. Reclassification Based on Neighborhood Spatial Information with k-NN
5. Discussion
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A
Vegetation Index | MWCL Adapted Formula | Original Formula
---|---|---
Triangle Vegetation Index (TVI) [56] | 0.5 × [120 × (R780 − R556) − 200 × (R670 − R556)] | 0.5 × [120 × (NIR − GREEN) − 200 × (R − GREEN)]
Red-edge Triangular Vegetation Index (RTVI) [57] | [100 × (R780 − R700) − 10 × (R780 − R556)] × sqrt(R700/R670) | [100 × (R750 − R730) − 10 × (R750 − R550)] × sqrt(R700/R670)
Modified Chlorophyll Absorption in Reflectance Index (MCARI) [58] | [(R700 − R670) − 0.2 × (R700 − R556)] × (R700/R670) | [(R700 − R670) − 0.2 × (R700 − R550)] × (R700/R670)
Transformed Chlorophyll Absorption in Reflectance Index (TCARI) [59] | 3 × [(R700 − R670) − 0.2 × (R700 − R556) × (R700/R670)] | 3 × [(R700 − R670) − 0.2 × (R700 − R550) × (R700/R670)]
Red-edge Inflection Point (REIP) [60] | 700 + 40 × [(R670 + R780)/2 − R700]/(R780 − R700) | 700 + 40 × [(R670 + R780)/2 − R700]/(R740 − R700)
Normalized Difference Vegetation Index (NDVI) [61] | (R780 − R670)/(R780 + R670) | (NIR − R)/(NIR + R)
Soil-Adjusted Vegetation Index (SAVI) [62] | [(R780 − R670)/(R780 + R670 + 0.5)] × (1 + 0.5) | [(NIR − R)/(NIR + R + 0.5)] × (1 + 0.5)
Optimized Soil-Adjusted Vegetation Index (OSAVI) [63] | (R780 − R670)/(R780 + R670 + 0.16) | (NIR − R)/(NIR + R + 0.16)
Optimal Vegetation Index (VIopt) [64] | (1 + 0.45) × ((R780)^2 + 1)/(R670 + 0.45) | (1 + 0.45) × ((NIR)^2 + 1)/(R + 0.45)
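As a check on how the adapted formulas map onto the four MWCL channels, here is a minimal Python sketch of a few of the indices in the table above. The function names and the sample reflectance values are mine, chosen for illustration; they are not measurements from the paper.

```python
# Minimal sketch (not from the paper): a few MWCL-adapted indices from the
# table above, written as plain functions of the four channel reflectances
# R556, R670, R700, and R780. Sample values below are illustrative only.

def tvi(r556, r670, r780):
    """Triangle Vegetation Index (TVI), MWCL-adapted bands."""
    return 0.5 * (120 * (r780 - r556) - 200 * (r670 - r556))

def ndvi(r670, r780):
    """Normalized Difference Vegetation Index (NDVI)."""
    return (r780 - r670) / (r780 + r670)

def savi(r670, r780, soil_factor=0.5):
    """Soil-Adjusted Vegetation Index (SAVI) with soil factor L = 0.5."""
    return (r780 - r670) / (r780 + r670 + soil_factor) * (1 + soil_factor)

def reip(r670, r700, r780):
    """Red-edge Inflection Point (nm); note the adapted R780 denominator."""
    return 700 + 40 * ((r670 + r780) / 2 - r700) / (r780 - r700)

# Illustrative reflectances for a green-vegetation-like point
r556, r670, r700, r780 = 0.12, 0.06, 0.18, 0.50
print(round(ndvi(r670, r780), 3))        # 0.786: high, as expected for vegetation
print(round(reip(r670, r700, r780), 1))  # 712.5 nm
```

Because the MWCL lacks an R740 or R750 channel, the adapted REIP and RTVI substitute R780, which shifts their absolute values; the table's side-by-side columns make that substitution explicit.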
References
- Brekke, C.; Solberg, A.H.S. Oil spill detection by satellite remote sensing. Remote Sens. Environ. 2005, 95, 1–13.
- Ricchetti, E. Multispectral satellite image and ancillary data integration for geological classification. Photogramm. Eng. Remote Sens. 2000, 66, 429–435.
- Li, C.; Wang, J.; Wang, L.; Hu, L.; Gong, P. Comparison of classification algorithms and training sample sizes in urban land classification with Landsat Thematic Mapper imagery. Remote Sens. 2014, 6, 964.
- Wardlow, B.D.; Egbert, S.L.; Kastens, J.H. Analysis of time-series MODIS 250 m vegetation index data for crop classification in the U.S. Central Great Plains. Remote Sens. Environ. 2007, 108, 290–310.
- Yang, J.; Shi, S.; Gong, W.; Du, L.; Ma, Y.Y.; Zhu, B.; Song, S.L. Application of fluorescence spectrum to precisely inverse paddy rice nitrogen content. Plant Soil Environ. 2015, 61, 182–188.
- Koch, B.; Heyder, U.; Weinacker, H. Detection of individual tree crowns in airborne LiDAR data. Photogramm. Eng. Remote Sens. 2006, 72, 357–363.
- Immitzer, M.; Atzberger, C.; Koukal, T. Tree species classification with random forest using very high spatial resolution 8-band WorldView-2 satellite data. Remote Sens. 2012, 4, 2661.
- Wolter, P.T.; Mladenoff, D.J.; Host, G.E.; Crow, T.R. Improved forest classification in the northern Lake States using multi-temporal Landsat imagery. Photogramm. Eng. Remote Sens. 1995, 61, 1129–1144.
- Ramsey, E.; Rangoonwala, A.; Jones, C. Structural classification of marshes with polarimetric SAR highlighting the temporal mapping of marshes exposed to oil. Remote Sens. 2015, 7, 11295.
- Zhou, G.; Zhang, R.; Zhang, D. Manifold learning co-location decision tree for remotely sensed imagery classification. Remote Sens. 2016, 8, 855.
- Yang, J.; Gong, W.; Shi, S.; Du, L.; Sun, J.; Song, S.L. Estimation of nitrogen content based on fluorescence spectrum and principal component analysis in paddy rice. Plant Soil Environ. 2016, 62, 178–183.
- Serpico, S.B.; Bruzzone, L.; Roli, F. An experimental comparison of neural and statistical non-parametric algorithms for supervised classification of remote-sensing images. Pattern Recognit. Lett. 1996, 17, 1331–1341.
- Licciardi, G.; Pacifici, F.; Tuia, D.; Prasad, S.; West, T.; Giacco, F.; Thiel, C.; Inglada, J.; Christophe, E.; Chanussot, J.; Gamba, P. Decision fusion for the classification of hyperspectral data: Outcome of the 2008 GRS-S data fusion contest. IEEE Trans. Geosci. Remote Sens. 2009, 47, 3857–3865.
- Yang, J.; Gong, W.; Shi, S.; Du, L.; Sun, J.; Ma, Y.; Song, S. Accurate identification of nitrogen fertilizer application of paddy rice using laser-induced fluorescence combined with support vector machine. Plant Soil Environ. 2015, 61, 501–506.
- Wieland, M.; Liu, W.; Yamazaki, F. Learning change from synthetic aperture radar images: Performance evaluation of a support vector machine to detect earthquake- and tsunami-induced changes. Remote Sens. 2016, 8, 792.
- Chen, J.; Wang, C.; Wang, R. Using stacked generalization to combine SVMs in magnitude and shape feature spaces for classification of hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2009, 47, 2193–2205.
- Marconcini, M.; Camps-Valls, G.; Bruzzone, L. A composite semisupervised SVM for classification of hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2009, 6, 234–238.
- Chen, J.; Wang, C.; Wang, R. Fusion of SVMs in wavelet domain for hyperspectral data classification. In Proceedings of the 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guilin, China, 19–23 December 2009; pp. 1372–1375.
- Demir, B.; Erturk, S. Clustering-based extraction of border training patterns for accurate SVM classification of hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2009, 6, 840–844.
- Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259.
- Zhan, Q.; Molenaar, M.; Tempfli, K. Building extraction from laser data by reasoning on image segments in elevation slices. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 305–308.
- Brodu, N.; Lague, D. 3D terrestrial LiDAR data classification of complex natural scenes using a multi-scale dimensionality criterion: Applications in geomorphology. ISPRS J. Photogramm. Remote Sens. 2012, 68, 121–134.
- Vaughn, N.R.; Moskal, L.M.; Turnblom, E.C. Tree species detection accuracies using discrete point LiDAR and airborne waveform LiDAR. Remote Sens. 2012, 4, 377.
- Guo, L.; Chehata, N.; Mallet, C.; Boukir, S. Relevance of airborne LiDAR and multispectral image data for urban scene classification using random forests. ISPRS J. Photogramm. Remote Sens. 2011, 66, 56–66.
- García, M.; Riaño, D.; Chuvieco, E.; Salas, J.; Danson, F.M. Multispectral and LiDAR data fusion for fuel type mapping using support vector machine and decision rules. Remote Sens. Environ. 2011, 115, 1369–1379.
- Laible, S.; Khan, Y.N.; Bohlmann, K.; Zell, A. 3D LiDAR- and camera-based terrain classification under different lighting conditions. In Autonomous Mobile Systems; Springer: Berlin/Heidelberg, Germany, 2012; pp. 21–29.
- Zhang, J.; Lin, X. Advances in fusion of optical imagery and LiDAR point cloud applied to photogrammetry and remote sensing. Int. J. Image Data Fusion 2016, 8, 1–31.
- Wei, G.; Shalei, S.; Bo, Z.; Shuo, S.; Faquan, L.; Xuewu, C. Multi-wavelength canopy LiDAR for remote sensing of vegetation: Design and system performance. ISPRS J. Photogramm. Remote Sens. 2012, 69, 1–9.
- Niu, Z.; Xu, Z.; Sun, G.; Huang, W.; Wang, L.; Feng, M.; Li, W.; He, W.; Gao, S. Design of a new multispectral waveform LiDAR instrument to monitor vegetation. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1506–1510.
- Woodhouse, I.H.; Nichol, C.; Sinclair, P.; Jack, J.; Morsdorf, F.; Malthus, T.J.; Patenaude, G. A multispectral canopy LiDAR demonstrator project. IEEE Geosci. Remote Sens. Lett. 2011, 8, 839–843.
- Li, W.; Sun, G.; Niu, Z.; Gao, S.; Qiao, H. Estimation of leaf biochemical content using a novel hyperspectral full-waveform LiDAR system. Remote Sens. Lett. 2014, 5, 693–702.
- Nevalainen, O.; Hakala, T.; Suomalainen, J.; Mäkipää, R.; Peltoniemi, M.; Krooks, A.; Kaasalainen, S. Fast and nondestructive method for leaf level chlorophyll estimation using hyperspectral LiDAR. Agric. For. Meteorol. 2014, 198, 250–258.
- Du, L.; Gong, W.; Shi, S.; Yang, J.; Sun, J.; Zhu, B.; Song, S. Estimation of rice leaf nitrogen contents based on hyperspectral LiDAR. Int. J. Appl. Earth Obs. Geoinf. 2016, 44, 136–143.
- Wichmann, V.; Bremer, M.; Lindenberger, J.; Rutzinger, M.; Georges, C.; Petrini-Monteferri, F. Evaluating the potential of multispectral airborne LiDAR for topographic mapping and land cover classification. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, II-3/W5, 113–119.
- Zou, X.; Zhao, G.; Li, J.; Yang, Y.; Fang, Y. 3D land cover classification based on multispectral LiDAR point clouds. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 741–747.
- Ahokas, E.; Hyyppä, J.; Yu, X.; Liang, X.; Matikainen, L.; Karila, K.; Litkey, P.; Kukko, A.; Jaakkola, A.; Kaartinen, H. Towards automatic single-sensor mapping by multispectral airborne laser scanning. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B3, 155–162.
- Hartzell, P.; Glennie, C.; Biber, K.; Khan, S. Application of multispectral LiDAR to automated virtual outcrop geology. ISPRS J. Photogramm. Remote Sens. 2014, 88, 147–155.
- Gong, W.; Sun, J.; Shi, S.; Yang, J.; Du, L.; Zhu, B.; Song, S. Investigating the potential of using the spatial and spectral information of multispectral LiDAR for object classification. Sensors 2015, 15, 21989–22002.
- Vauhkonen, J.; Hakala, T.; Suomalainen, J.; Kaasalainen, S.; Nevalainen, O.; Vastaranta, M.; Holopainen, M.; Hyyppä, J. Classification of spruce and pine trees using active hyperspectral LiDAR. IEEE Geosci. Remote Sens. Lett. 2013, 10, 1138–1141.
- Tang, T.; Dai, L. Accuracy test of point-based and object-based urban building feature classification and extraction applying airborne LiDAR data. Geocarto Int. 2014, 29, 710–730.
- Antonarakis, A.S.; Richards, K.S.; Brasington, J. Object-based land cover classification using airborne LiDAR. Remote Sens. Environ. 2008, 112, 2988–2998.
- Wallace, A.; Nichol, C.; Woodhouse, I. Recovery of forest canopy parameters by inversion of multispectral LiDAR data. Remote Sens. 2012, 4, 509–531.
- Bo, Z.; Wei, G.; Shi, S.; Song, S. A multi-wavelength canopy LiDAR for vegetation monitoring: System implementation and laboratory-based tests. Procedia Environ. Sci. 2011, 10, 2775–2782.
- Biavati, G.; Donfrancesco, G.D.; Cairo, F.; Feist, D.G. Correction scheme for close-range LiDAR returns. Appl. Opt. 2011, 50, 5872–5882.
- Kao, D.L.; Kramer, M.G.; Love, A.L.; Dungan, J.L.; Pang, A.T. Visualizing distributions from multi-return LiDAR data to understand forest structure. Cartogr. J. 2005, 42, 35–47.
- Georgiev, G.T.; Butler, J.J. Long-term calibration monitoring of Spectralon diffusers BRDF in the air-ultraviolet. Appl. Opt. 2007, 46, 7892–7899.
- Chang, C.-C.; Lin, C.-J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2, 27.
- Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
- Kim, M.S.; Daughtry, C.; Chappelle, E.; McMurtrey, J.; Walthall, C. The use of high spectral resolution bands for estimating absorbed photosynthetically active radiation (APAR). In Proceedings of the 6th International Symposium on Physical Measurements and Signatures in Remote Sensing, Val d'Isère, France, 17–21 January 1994.
- Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the 5th International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000; pp. 16–19.
- Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
- Gitelson, A.A.; Buschmann, C.; Lichtenthaler, H.K. The chlorophyll fluorescence ratio F735/F700 as an accurate measure of the chlorophyll content in plants. Remote Sens. Environ. 1999, 69, 296–302.
- Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
- Altman, N.S. An introduction to kernel and nearest-neighbor nonparametric regression. Am. Stat. 1992, 46, 175–185.
- Shi, S.; Song, S.; Gong, W.; Du, L.; Zhu, B.; Huang, X. Improving backscatter intensity calibration for multispectral LiDAR. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1421–1425.
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172.
- Nicolas, T.; Philippe, V.; Huang, W.-J. New index for crop canopy fresh biomass estimation. Spectrosc. Spectr. Anal. 2010, 30, 512–517.
- Daughtry, C.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239.
- Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426.
- Guyot, G.; Baret, F.; Major, D. High spectral resolution: Determination of spectral shifts between the red and the near infrared. Int. Arch. Photogramm. Remote Sens. 1988, 11, 740–760.
- Rouse, J.W., Jr.; Haas, R.; Schell, J.; Deering, D. Monitoring Vegetation Systems in the Great Plains with ERTS. Available online: https://ntrs.nasa.gov/search.jsp?R=19740022614 (accessed on 20 September 2016).
- Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
- Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
- Reyniers, M.; Walvoort, D.J.J.; Baardemaaker, J.D. A linear model to predict with a multi-spectral radiometer the amount of nitrogen in winter wheat. Int. J. Remote Sens. 2006, 27, 4159–4178.
Vegetation Index | MWCL Adapted Formula | Original Formula
---|---|---
Chlorophyll Absorption Reflectance Index 1 (CARI1) [49] | (R700 − R670) − 0.2 × (R700 + R556) | (R700 − R670) − 0.2 × (R700 + R550)
Normalized Difference Red Edge (NDRE) [50] | (R780 − R700)/(R780 + R700) | (R790 − R720)/(R790 + R720)
Modified Triangular Vegetation Index 1 (MTVI1) [51] | 1.2 × [1.2 × (R780 − R556) − 2.5 × (R670 − R556)] | 1.2 × [1.2 × (R800 − R550) − 2.5 × (R670 − R550)]
Gitelson [52] | 1/R700 | 1/R700
Green Normalized Difference Vegetation Index (GNDVI) [53] | (R780 − R556)/(R780 + R556) | (NIR − GREEN)/(NIR + GREEN)
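The five indices in the table above can be computed per point directly from the four channel reflectances. The sketch below is mine, not the paper's code; it assumes reflectances are already calibrated, and the sample values are illustrative.

```python
# Sketch (assumption: inputs are calibrated channel reflectances) of the
# five MWCL-adapted vegetation indices listed in the table above.

def vegetation_indices(r556, r670, r700, r780):
    """Return the five vegetation indices for one point as a dict."""
    return {
        "CARI1": (r700 - r670) - 0.2 * (r700 + r556),
        "NDRE": (r780 - r700) / (r780 + r700),
        "MTVI1": 1.2 * (1.2 * (r780 - r556) - 2.5 * (r670 - r556)),
        "Gitelson": 1.0 / r700,
        "GNDVI": (r780 - r556) / (r780 + r556),
    }

# Illustrative healthy-leaf-like reflectances (not measured data)
vis = vegetation_indices(0.12, 0.06, 0.18, 0.50)
print(round(vis["NDRE"], 3))   # 0.471
print(round(vis["GNDVI"], 3))  # 0.613
```

In the paper these five indices replace raw reflectance as classifier features when separating healthy from withered scindapsus leaves (cf. Figures 7 and 8).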
Rows give the ground truth; columns give the predicted class.

Ground Truth | W Wall | W P Box | Cactus | C Pot | H Leaf | W Leaf | P Foam | Producer Accuracy
---|---|---|---|---|---|---|---|---
W Wall | 5486 | 350 | 9 | 74 | 106 | 4 | 130 | 0.8907
W P Box | 187 | 2784 | 73 | 150 | 134 | 35 | 200 | 0.7813
Cactus | 1 | 17 | 625 | 0 | 213 | 0 | 0 | 0.7301
C Pot | 3 | 69 | 0 | 1401 | 68 | 27 | 211 | 0.7875
H Leaf | 0 | 14 | 173 | 1 | 2728 | 4 | 0 | 0.9342
W Leaf | 0 | 6 | 2 | 111 | 139 | 175 | 0 | 0.4041
P Foam | 364 | 72 | 1 | 345 | 89 | 3 | 1477 | 0.6282
User accuracy | 0.9081 | 0.8405 | 0.7078 | 0.6729 | 0.7845 | 0.7056 | 0.7319 |
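The accuracy figures in the confusion matrix above follow the standard definitions: producer accuracy is the diagonal count divided by its ground-truth row total, and user accuracy is the diagonal count divided by its predicted-class column total. A short sketch, with the raw-reflectance matrix re-entered from the table above:

```python
# Producer/user/overall accuracy from a confusion matrix (rows = ground
# truth, columns = predicted class). The counts reproduce the
# raw-spectral-reflectance confusion matrix above.
matrix = [
    [5486, 350, 9, 74, 106, 4, 130],     # white wall
    [187, 2784, 73, 150, 134, 35, 200],  # white paper box
    [1, 17, 625, 0, 213, 0, 0],          # cactus
    [3, 69, 0, 1401, 68, 27, 211],       # ceramic pot
    [0, 14, 173, 1, 2728, 4, 0],         # healthy leaves
    [0, 6, 2, 111, 139, 175, 0],         # withered leaves
    [364, 72, 1, 345, 89, 3, 1477],      # plastic foam
]

def producer_accuracy(m, i):
    """Diagonal count over the ground-truth row total for class i."""
    return m[i][i] / sum(m[i])

def user_accuracy(m, j):
    """Diagonal count over the predicted-class column total for class j."""
    return m[j][j] / sum(row[j] for row in m)

def overall_accuracy(m):
    """Trace of the matrix over the total number of points."""
    total = sum(sum(row) for row in m)
    return sum(m[i][i] for i in range(len(m))) / total

print(round(producer_accuracy(matrix, 0), 4))     # 0.8907 (white wall)
print(round(user_accuracy(matrix, 0), 4))         # 0.9081 (white wall)
print(round(overall_accuracy(matrix) * 100, 3))   # 81.258, matching the summary table
```

The overall accuracy recovered from the matrix (81.258%) matches the "Seven individual targets, before k-NN" entry in the accuracy summary table, which is a useful consistency check on the transcribed counts.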
Rows give the ground truth; the left column group is the VI-based classifier, the right group the raw-spectral-reflectance classifier.

Ground Truth | VI: H Leaf | VI: W Leaf | VI: Producer Accuracy | Raw: H Leaf | Raw: W Leaf | Raw: Producer Accuracy
---|---|---|---|---|---|---
H Leaf | 2896 | 24 | 0.9918 | 2919 | 1 | 0.9997
W Leaf | 128 | 305 | 0.7044 | 333 | 100 | 0.2309
User accuracy | 0.9577 | 0.9271 | | 0.8976 | 0.9901 |
Classification Targets | Raw Reflectance: Before | Raw Reflectance: After k-NN | VI: Before | VI: After k-NN
---|---|---|---|---
Seven individual targets | 81.258% | 87.188% | - | -
Artificial and vegetation | 96.457% | 97.957% | 97.747% | 99.302%
Healthy and withered leaves | 90.039% | 88.309% | 95.556% | 97.197%
Withered leaves | 23.272% | 12.903% | 70.507% | 81.567%
Rows give the ground truth; columns give the predicted class.

Ground Truth | W Wall | W P Box | Cactus | C Pot | H Leaf | W Leaf | P Foam | Producer Accuracy
---|---|---|---|---|---|---|---|---
W Wall | 5790 | 316 | 0 | 5 | 44 | 1 | 3 | 0.9400
W P Box | 79 | 2963 | 20 | 112 | 60 | 31 | 291 | 0.8332
Cactus | 0 | 0 | 745 | 0 | 111 | 0 | 0 | 0.8703
C Pot | 0 | 0 | 0 | 1538 | 43 | 31 | 166 | 0.8650
H Leaf | 0 | 2 | 74 | 0 | 2830 | 0 | 0 | 0.9738
W Leaf | 0 | 0 | 0 | 98 | 113 | 222 | 0 | 0.5127
P Foam | 325 | 28 | 0 | 261 | 77 | 0 | 1659 | 0.7059
User accuracy | 0.9347 | 0.8954 | 0.8879 | 0.7636 | 0.8633 | 0.7789 | 0.7829 |
Rows give the ground truth; columns give the predicted class.

Ground Truth | W Wall | W P Box | Cactus | C Pot | H Leaf | W Leaf | P Foam | Producer Accuracy
---|---|---|---|---|---|---|---|---
W Wall | 5460 | 380 | 11 | 29 | 111 | 5 | 163 | 0.8865
W P Box | 155 | 2877 | 72 | 135 | 143 | 25 | 156 | 0.8074
Cactus | 3 | 3 | 555 | 0 | 295 | 0 | 0 | 0.6483
C Pot | 3 | 76 | 0 | 1499 | 68 | 20 | 113 | 0.8426
H Leaf | 1 | 15 | 144 | 1 | 2755 | 4 | 0 | 0.9434
W Leaf | 0 | 7 | 0 | 115 | 142 | 169 | 0 | 0.3903
P Foam | 387 | 12 | 1 | 217 | 89 | 6 | 1639 | 0.6971
User accuracy | 0.9086 | 0.8537 | 0.7088 | 0.7510 | 0.7646 | 0.7379 | 0.7914 |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, B.; Shi, S.; Gong, W.; Zhang, Q.; Yang, J.; Du, L.; Sun, J.; Zhang, Z.; Song, S. Multispectral LiDAR Point Cloud Classification: A Two-Step Approach. Remote Sens. 2017, 9, 373. https://doi.org/10.3390/rs9040373