A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery
"> Figure 1
<p>False building detection at yellow arrowed regions. (<b>a</b>) Photogrammetric Image; (<b>b</b>) Texture Index; (<b>c</b>) Texture mask.</p> "> Figure 2
<p>Different Pseudo-NDVI values for trees with (<b>a</b>) red; and (<b>b</b>) green leaves (To increase the readability of <a href="#sensors-16-01110-f002" class="html-fig">Figure 2</a>b, brightness level of the Pseudo-NDVI image is increased by 90%. The original image is mainly black and when used as input into the proposed method, it has resulted in lower performance).</p> "> Figure 3
<p>The flow diagram of the proposed method.</p> "> Figure 4
<p>(<b>a</b>) Ground (red colour) and Non-ground LiDAR (blue colour) points (<b>b</b>) Building edges (red colour).</p> "> Figure 5
<p>(<b>a</b>) An angular histogram of building edges, where the angular scale is defined in degree unit (<b>b</b>) Building edges and (<b>c</b>) Group of prominent building edges.</p> "> Figure 6
<p>(<b>a</b>) Grid cell properties, and (<b>b</b>) LiDAR points on transparent building.</p> "> Figure 7
<p>Grid overlaid on an image at (<b>a</b>) 81°; (<b>b</b>) 26°; and (<b>c</b>) 39°; Derived intensity image from LiDAR data at (<b>d</b>) 81°; (<b>e</b>) 26°; and (<b>f</b>) 39°.</p> "> Figure 8
<p>Gradient-based building mask at (<b>a</b>) 81°; (<b>b</b>) 26°; and (<b>c</b>) 39°; where constant gradient in <span class="html-italic">X</span> and <span class="html-italic">Y</span> axes are represent by white and gray colour.</p> "> Figure 9
<p>Results after applying variance and point density analyses: (<b>a</b>) LiDAR points at objects; (<b>b</b>) Removing the LiDAR points with high variance; (<b>c</b>) Examining the point density; and (<b>d</b>) Update the gradient-based building mask to finalize the result.</p> "> Figure 10
<p>(<b>a</b>) Regions for local colour matching analysis, where the buildings are indicated by red boundaries and their surrounding regions are indicated by blue boundaries; and (<b>b</b>) Unmatched building region.</p> "> Figure 11
<p>(<b>a</b>) Building region after applying the colour matching analysis; (<b>b</b>) Shadow region outlined in yellow colour; (<b>c</b>) Green straight lines around the building and (<b>d</b>) Building region in blue colour is extracted after applying the shadow analysis.</p> "> Figure 12
<p>CRCSI benchmark data set i.e., (<b>a</b>) Harvey Bay and (<b>b</b>) Atkinvale, where the reference buildings are indicated by cyan boundaries.</p> "> Figure 13
<p>ISPRS benchmark data set (<b>a</b>) VA01; (<b>b</b>) VA02; and (<b>c</b>) VA03; where the benchmark areas are indicated by green boundaries and reference buildings are indicated by cyan boundaries.</p> "> Figure 14
<p>Qualitative analysis of (<b>a</b>) Harvey Bay(HB) and (<b>b</b>) Atkinvale(AV), where the buildings are indicated by blue boundaries, transparent buildings are indicated by magenta boundaries, and false detected buildings are indicated by yellow boundaries.</p> "> Figure 15
<p>(<b>a</b>) Benchmark of VA01; (<b>b</b>) Building extraction after applying GBE; and (<b>c</b>) Result description by ISPRS, where pixels in yellow, blue and red colours are the true building extraction, missed building and false building extraction, respectively.</p> "> Figure 16
<p>(<b>a</b>) Benchmark of VA02; (<b>b</b>) Building extraction after applying GBE; and (<b>c</b>) Result description by ISPRS, where pixels in yellow, blue and red colours are the true building extraction, missed building and false building extraction, respectively.</p> "> Figure 17
<p>(<b>a</b>) Benchmark of VA03; (<b>b</b>) Building extraction after applying GBE; and (<b>c</b>) Result description by ISPRS, where pixels in yellow, blue and red colours are the true building extraction, missed building and false building extraction, respectively.</p> ">
Abstract
1. Introduction
2. Related Work
2.1. Limitations of Rule-Based Building Extraction Methods
2.2. Contributions of the Research
3. The Proposed Gradient-Based Building Extraction Method
3.1. Classification of LiDAR Points and Straight Lines
3.2. Finding Prominent Orientation of Buildings
3.3. Generating Grid and Height Intensity Image
3.4. Gradient Calculation
3.5. Refine the Building Mask Using LiDAR Points
3.6. Refining Building Using Imagery
3.6.1. Local Colour Matching Analysis
- Input the photogrammetric image and the updated gradient-based building mask.
- Extract the B candidate buildings from the updated gradient-based building mask.
- Extract the surrounding area A around each of the B buildings.
- Calculate the mean colour of area A for building i, where i ∈ B.
- Calculate the colour range threshold using Equation (1).
- Calculate the number of matched pixels of the building according to Equation (2).
- Eliminate the building if more than half of its pixels are matched.
- Eliminate the matched pixels of the building as well.
- Repeat Steps 4 to 8 until all the buildings are processed (a sketch of this procedure follows the list).
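To make these steps concrete, a minimal sketch is given below. It is not the authors' implementation: the function and parameter names (`local_colour_matching`, `ring_width`, `k`, `match_frac`) are assumptions introduced for illustration, and the mean ± k·std colour test only stands in for Equations (1) and (2), which are not reproduced in this outline.

```python
import numpy as np
from scipy import ndimage


def local_colour_matching(image, building_mask, ring_width=10, k=2.0, match_frac=0.5):
    """Sketch of the local colour matching analysis (Section 3.6.1).

    image         : H x W x 3 float array (photogrammetric image).
    building_mask : H x W bool array (updated gradient-based building mask).
    ring_width    : width in pixels of the surrounding area A (assumed value).
    k, match_frac : stand-ins for Equations (1) and (2); a pixel "matches" if
                    every channel lies within mean +/- k*std of the colours
                    found in its surrounding area A.
    """
    refined = building_mask.copy()
    labels, n = ndimage.label(building_mask)      # the B candidate buildings

    for i in range(1, n + 1):                     # Steps 4-8 for each building i
        region = labels == i
        # Surrounding area A: a ring of non-building pixels around building i.
        ring = ndimage.binary_dilation(region, iterations=ring_width) & ~building_mask
        if not ring.any():
            continue
        ring_pix = image[ring]
        mean, std = ring_pix.mean(axis=0), ring_pix.std(axis=0)

        # Colour range threshold (stand-in for Equation (1)).
        lo, hi = mean - k * std, mean + k * std

        # Number of matched pixels of the building (stand-in for Equation (2)).
        reg_pix = image[region]
        matched = np.all((reg_pix >= lo) & (reg_pix <= hi), axis=1)

        if matched.mean() > match_frac:
            refined[region] = False               # eliminate the whole building
        else:
            coords = np.argwhere(region)[matched]
            refined[coords[:, 0], coords[:, 1]] = False   # eliminate matched pixels only
    return refined
```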
3.6.2. Shadow Analysis
- Step 1 (Detect shadow region): The intensity (I) image of the input photogrammetric image is derived, where the range of I is [0, 1]. A low value of I indicates a shadow region; therefore, pixels whose I value is below the shadow intensity threshold (0.25) are grouped into shadow regions, which are shown in Figure 11b.
- Step 2 (Eliminate shadow region): Shorter trees and buildings may be covered by the shadows of taller buildings. Interestingly, shadowed buildings usually still contain some long straight lines (extracted during the initial steps by the Canny edge detector), whereas shadowed trees contain none. This is illustrated in Figure 11c, where the shadowed buildings have straight lines and the shadowed trees do not. Therefore, the straight lines are used to distinguish shadowed trees from shadowed buildings: the shadowed tree pixels are eliminated, and the shadowed building pixels are used to update the building regions. Finally, the building regions are masked into a refined mask. A sketch of both steps is shown below.
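A minimal sketch of both shadow-analysis steps follows. It is not the authors' implementation: the `lines_mask` input (pixels covered by the previously extracted long straight lines), the use of the mean of the RGB channels as the intensity component I, and the function name are assumptions; only the shadow intensity threshold of 0.25 comes from the parameter settings in Section 4.1.

```python
import numpy as np
from scipy import ndimage


def shadow_analysis(image, building_mask, lines_mask, t_shadow=0.25):
    """Sketch of the shadow analysis (Section 3.6.2).

    image         : H x W x 3 float array with values in [0, 1].
    building_mask : H x W bool array from the colour matching step.
    lines_mask    : H x W bool array marking pixels on the long straight lines
                    extracted earlier (assumed to be available at this stage).
    t_shadow      : shadow intensity threshold (0.25 in the parameter table).
    """
    # Step 1 (Detect shadow region): pixels with low intensity form shadow regions.
    intensity = image.mean(axis=2)                # stand-in for the I component
    shadow = intensity < t_shadow

    refined = building_mask.copy()
    labels, n = ndimage.label(shadow)
    for i in range(1, n + 1):
        region = labels == i
        # Step 2 (Eliminate shadow region): a shadowed region that contains long
        # straight lines is treated as a shadowed building and kept; a region
        # with no straight lines is treated as a shadowed tree and removed.
        if (region & lines_mask).any():
            refined |= region                     # update the building regions
        else:
            refined &= ~region                    # eliminate shadowed tree pixels
    return refined
```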
3.6.3. Morphological Filter
4. Experimental Setup
4.1. Parameter Settings
4.2. Benchmark Data Sets
4.3. Evaluation Systems
5. Experimental Results and Discussion
5.1. Performance Analysis of the Proposed Method
5.2. Comparison Analysis
5.3. Stability Analysis
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Lee, D.H.; Lee, K.M.; Lee, S.U. Fusion of lidar and imagery for reliable building extraction. Photogramm. Eng. Remote Sens. 2008, 74, 215–225. [Google Scholar] [CrossRef]
- Mayer, H. Automatic object extraction from aerial imagery—A survey focusing on buildings. Comput. Vis. Image Underst. 1999, 74, 138–149. [Google Scholar] [CrossRef]
- Chen, L.; Zhao, S.; Han, W.; Li, Y. Building detection in an urban area using lidar data and QuickBird imagery. Int. J. Remote Sens. 2012, 33, 5135–5148. [Google Scholar] [CrossRef]
- Noronha, S.; Nevatia, R. Detection and modeling of buildings from multiple aerial images. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 501–518. [Google Scholar] [CrossRef]
- Awrangjeb, M.; Ravanbakhsh, M.; Fraser, C.S. Automatic detection of residential buildings using LIDAR data and multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2010, 65, 457–467. [Google Scholar] [CrossRef]
- Nguyen, H.T.; Pearce, J.M.; Harrap, R.; Barber, G. The application of LiDAR to assessment of rooftop solar photovoltaic deployment potential in a municipal district unit. Sensors 2012, 12, 4534–4558. [Google Scholar] [CrossRef] [PubMed]
- Rottensteiner, F.; Trinder, J.; Clode, S.; Kubik, K. Using the Dempster–Shafer method for the fusion of LIDAR data and multi-spectral images for building detection. Inf. Fusion 2005, 6, 283–300. [Google Scholar] [CrossRef]
- Brunn, A.; Weidner, U. Extracting buildings from digital surface models. Int. Arch. Photogramm. Remote Sens. 1997, 32, 27–34. [Google Scholar]
- Abdullah, S.; Awrangjeb, M.; Lu, G. Lidar segmentation using suitable seed points for 3D building extraction. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 1, 1–8. [Google Scholar] [CrossRef]
- Sohn, G.; Dowman, I. Terrain surface reconstruction by the use of tetrahedron model with the MDL criterion. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 336–344. [Google Scholar]
- Rottensteiner, F. Automatic generation of high-quality building models from lidar data. IEEE Comput. Graph. Appl. 2003, 23, 42–50. [Google Scholar] [CrossRef]
- Vu, T.T.; Tokunaga, M. Filtering airborne laser scanner data. Photogramm. Eng. Remote Sens. 2004, 70, 1267–1274. [Google Scholar] [CrossRef]
- Rottensteiner, F.; Briese, C. A new method for building extraction in urban areas from high-resolution LIDAR data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 295–301. [Google Scholar]
- Kim, C.; Habib, A. Object-based integration of photogrammetric and LiDAR data for automated generation of complex polyhedral building models. Sensors 2009, 9, 5679–5701. [Google Scholar] [CrossRef] [PubMed]
- Awrangjeb, M.; Fraser, C.S. Automatic segmentation of raw LiDAR data for extraction of building roofs. Remote Sens. 2014, 6, 3716–3751. [Google Scholar] [CrossRef]
- Kim, J.; Muller, J. 3D reconstruction from very high resolution satellite stereo and its application to object identification. In Proceedings of the Joint International Symposium on “Geospatial Theory, Processing and Applications”, Ottawa, ON, Canada, 8–12 July 2002.
- Hermosilla, T.; Ruiz, L.A.; Recio, J.A.; Estornell, J. Evaluation of automatic building detection approaches combining high resolution images and LiDAR data. Remote Sens. 2011, 3, 1188–1210. [Google Scholar] [CrossRef]
- Vu, T.T.; Yamazaki, F.; Matsuoka, M. Multi-scale solution for building extraction from LiDAR and image data. Int. J. Appl. Earth Obs. Geoinf. 2009, 11, 281–289. [Google Scholar] [CrossRef]
- Cao, Y.; Wei, H.; Zhao, H.; Li, N. An effective approach for land-cover classification from airborne lidar fused with co-registered data. Int. J. Remote Sens. 2012, 33, 5927–5953. [Google Scholar] [CrossRef]
- Haala, N.; Brenner, C. Extraction of buildings and trees in urban environments. ISPRS J. Photogramm. Remote Sens. 1999, 54, 130–137. [Google Scholar] [CrossRef]
- Chen, L.C.; Teo, T.A.; Shao, Y.C.; Lai, Y.C.; Rau, J.Y. Fusion of LIDAR data and optical imagery for building modeling. Int. Arch. Photogramm. Remote Sens. 2004, 35, 732–737. [Google Scholar]
- Sohn, G.; Dowman, I. Data fusion of high-resolution satellite imagery and LiDAR data for automatic building extraction. ISPRS J. Photogramm. Remote Sens. 2007, 62, 43–63. [Google Scholar] [CrossRef]
- Grigillo, D.; Kanjir, U. Urban object extraction from digital surface model and digital aerial images. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 1, 215–220. [Google Scholar] [CrossRef]
- Awrangjeb, M.; Zhang, C.; Fraser, C.S. Automatic extraction of building roofs using LIDAR data and multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2013, 83, 1–18. [Google Scholar] [CrossRef]
- Gerke, M.; Xiao, J. Fusion of airborne laserscanning point clouds and images for supervised and unsupervised scene classification. ISPRS J. Photogramm. Remote Sens. 2014, 87, 78–92. [Google Scholar] [CrossRef]
- Cai, D.; Li, M.; Bao, Z.; Chen, Z.; Wei, W.; Zhang, H. Study on shadow detection method on high resolution remote sensing image based on HIS space transformation and NDVI index. In Proceedings of the 18th International Conference on Geoinformatics, Beijing, China, 18–20 June 2010; pp. 1–4.
- Blaschke, T.; Hay, G.J.; Weng, Q.; Resch, B. Collective sensing: Integrating geospatial technologies to understand urban systems—An overview. Remote Sens. 2011, 3, 1743–1776. [Google Scholar] [CrossRef]
- Shi, W.; Li, J. Shadow detection in color aerial images based on HSI space and color attenuation relationship. EURASIP J. Adv. Signal Process. 2012, 2012, 1–13. [Google Scholar] [CrossRef]
- Huang, X.; Zhang, L. Morphological building/shadow index for building extraction from high-resolution imagery over urban areas. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 161–172. [Google Scholar] [CrossRef]
- Siddiqui, F.U.; Teng, S.W.; Lu, G.; Awrangjeb, M. An improved building detection in complex sites using the LIDAR height variation and point density. In Proceedings of the 28th International Conference on Image and Vision Computing New Zealand (IVCNZ 2013), Wellington, New Zealand, 27–29 November 2013; pp. 471–476.
- Cramer, M. The DGPF-test on digital airborne camera evaluation–overview and test design. Photogramm. Fernerkund. Geoinf. 2010, 2010, 73–82. [Google Scholar] [CrossRef] [PubMed]
- MARS Software, Version 7.0; MERRICK and Company: Greenwood Village, CO, USA, 2011.
- Awrangjeb, M.; Zhang, C.; Fraser, C.S. Building detection in complex scenes through effective separation of buildings from trees. Photogramm. Eng. Remote Sens. 2012, 78, 729–745. [Google Scholar] [CrossRef]
- Dorninger, P.; Pfeifer, N. A comprehensive automated 3D approach for building extraction, reconstruction, and regularization from airborne laser scanning point clouds. Sensors 2008, 8, 7323–7343. [Google Scholar] [CrossRef]
- Sampath, A.; Shan, J. Segmentation and reconstruction of polyhedral building roofs from aerial lidar point clouds. IEEE Trans. Geosci. Remote Sens. 2010, 48, 1554–1567. [Google Scholar] [CrossRef]
- Rottensteiner, F.; Sohn, G.; Gerke, M.; Wegner, J.D. ISPRS Test Project on Urban Classification and 3D Building Reconstruction; Commission III-Photogrammetric Computer Vision and Image Analysis, Working Group III/4-3D Scene Analysis; ISPRS: Hannover, Germany, 2013; pp. 1–17. [Google Scholar]
- Barista Software, Version 2.1; CRCSI: Melbourne, Australia, 2010.
- Rutzinger, M.; Rottensteiner, F.; Pfeifer, N. A comparison of evaluation techniques for building extraction from airborne laser scanning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2009, 2, 11–20. [Google Scholar] [CrossRef]
- Awrangjeb, M.; Fraser, C.S. An automatic and threshold-free performance evaluation system for building extraction techniques from airborne LIDAR data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4184–4198. [Google Scholar] [CrossRef]
- Zarea, A.; Mohammadzadeh, A. Introducing a New Approach for Buildings Detection Using LiDAR Data and Aerial Image. Available online: http://www2.isprs.org/commissions/comm3/wg4/results.html (accessed on 19 April 2016).
- Ramiya, A.M.; Nidamanuri, R.R.; Krishnan, R. Automated Building Detection Using LiDAR Data and Digital Aerial Images by Supervoxels Method. Available online: http://www2.isprs.org/commissions/comm3/wg4/results.html (accessed on 19 April 2016).
- Chen, Q.; Sun, M. Brief Description of Building Detection from Airborne LiDAR Data Based on Extraction of Planar Structure. Available online: http://www2.isprs.org/commissions/comm3/wg4/results.html (accessed on 19 April 2016).
No. | Thresholds/Parameters | Values | Sources |
---|---|---|---|
1 | Height | 1 m | [9,15] |
2 | Straight line length | 3 m | [5,15,33] |
3 | Degree range | 11.25° | [15] |
4 | Grid/cell length | 2 | [15] |
5 | SE size | 1 | [15] |
6 | Height tolerance/Variance | 0.2 m | this paper |
7 | Point density in a grid | 0.5 | [30] |
8 | Building portion matched | 0.5 | this paper |
9 | Colour range | automatically set | this paper |
11 | Tree colour | automatically set | this paper |
12 | Shadow intensity | 0.25 | this paper |
Areas | |||||||||
---|---|---|---|---|---|---|---|---|---|
HB | 96.0 | 96.1 | 92.6 | 100 | 100 | 100 | 100 | 100 | 100 |
AV | 76.9 | 96.3 | 75.7 | 86.2 | 100 | 86.8 | 100 | 100 | 100 |
Average | 86.4 | 96.2 | 84.1 | 93.1 | 100 | 93.4 | 100 | 100 | 100 |
Areas | Pixel completeness (%) | Pixel correctness (%) | Pixel quality (%) | RMSE (m)
---|---|---|---|---
HB | 95.8 | 89.7 | 86.3 | 0.79 |
AV | 79 | 95.7 | 76.4 | 1.1 |
Average | 87.4 | 92.7 | 81.3 | 0.94 |
Areas | Obj. completeness (%) | Obj. correctness (%) | Obj. quality (%) | Obj. completeness >50 m² (%) | Obj. correctness >50 m² (%) | Obj. quality >50 m² (%) | 1:M | N:1 | N:M
---|---|---|---|---|---|---|---|---|---
VA01 | 86.5 | 96.9 | 84.1 | 100 | 100 | 100 | 0 | 7 | 0 |
VA02 | 85.7 | 100 | 85.7 | 100 | 100 | 100 | 0 | 2 | 0 |
VA03 | 76.8 | 95.7 | 74.2 | 97.4 | 100 | 97.4 | 0 | 6 | 0 |
Average | 83.0 | 97.5 | 81.3 | 99.1 | 100 | 99.1 | 0 | 5 | 0 |
Areas | Pixel completeness (%) | Pixel correctness (%) | Pixel quality (%) | RMSE (m)
---|---|---|---|---
VA01 | 93.5 | 86.0 | 81.1 | 1.1 |
VA02 | 97.2 | 84.3 | 82.3 | 1.0 |
VA03 | 93.7 | 81.3 | 77.1 | 1.0 |
Average | 94.8 | 83.8 | 80.1 | 1.03 |
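For reference when reading the pixel-based tables above: the tabulated quality values are consistent with the standard relation between completeness, correctness and quality used by such evaluation systems. The formula below is an assumed, illustrative restatement (the paper's own equations are not reproduced in this outline), with the Harvey Bay figures used as a worked check.

```latex
% Assumed standard pixel-based quality relation; Cm_p, Cr_p, Ql_p are illustrative symbols.
\[
  Ql_p \;=\; \frac{1}{\dfrac{1}{Cm_p} + \dfrac{1}{Cr_p} - 1},
  \qquad
  \text{e.g., HB: } \frac{1}{\dfrac{1}{0.958} + \dfrac{1}{0.897} - 1} \approx 0.863 = 86.3\,\%.
\]
```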
Methods | Published | Tested on Data Set |
---|---|---|
MON2 | [15] | CRCSI and ISPRS |
KNTUmod | [40] | ISPRS |
IIST2 | [41] | ISPRS |
WHUQC | [42] | ISPRS |
Areas | Methods | Obj. completeness (%) | Obj. correctness (%) | Obj. completeness >50 m² (%) | Obj. correctness >50 m² (%) | Pixel completeness (%) | Pixel correctness (%) | RMSE (m)
---|---|---|---|---|---|---|---|---
HB | GBE | 96.0 | 96.1 | 100 | 100 | 95.8 | 89.7 | 0.60 |
MON2 | 80 | 95.2 | 90.9 | 95.2 | 91.3 | 90.0 | 0.84 | |
AV | GBE | 77.0 | 96.3 | 86.2 | 100 | 78.4 | 95.7 | 0.90 |
MON2 | 67.2 | 100 | 81.1 | 100 | 87.2 | 94.9 | 0.66 |
Areas | Methods | Obj. completeness (%) | Obj. correctness (%) | Obj. completeness >50 m² (%) | Obj. correctness >50 m² (%) | Pixel completeness (%) | Pixel correctness (%) | RMSE (m)
---|---|---|---|---|---|---|---|---
VA01 | GBE | 86.5 | 96.9 | 100 | 100 | 93.5 | 86.0 | 1.01 |
MON2 | 83.8 | 96.9 | 100 | 100 | 92.7 | 88.7 | 1.11 | |
KNTUmod | 78.6 | 100 | 100 | 100 | 91.4 | 94.3 | 0.8 | |
IIST2 | 83.8 | 84.8 | 89.3 | 96.3 | 90.7 | 82.4 | 1.2 | |
WHUQC | 78.4 | 96.9 | 92.9 | 100 | 83.7 | 98.1 | 1.0 | |
VA02 | GBE | 85.7 | 100 | 100 | 100 | 97.2 | 84.3 | 1.07 |
MON2 | 85.7 | 84.6 | 100 | 100 | 91.5 | 91.0 | 0.8 | |
KNTUmod | 78.6 | 100 | 100 | 100 | 86.5 | 93.6 | 1.0 | |
IIST2 | 71.4 | 91.7 | 100 | 90.9 | 86.2 | 90.3 | 0.8 | |
WHUQC | 85.7 | 100 | 100 | 100 | 86.7 | 99.6 | 0.8 | |
VA03 | GBE | 76.8 | 95.7 | 97.4 | 100 | 93.9 | 81.3 | 1.03 |
MON2 | 78.6 | 97.8 | 97.4 | 100 | 93.9 | 86.3 | 0.89 | |
KNTUmod | 85.7 | 98.0 | 100 | 100 | 88.3 | 99.0 | 0.7 | |
IIST2 | 83.9 | 53.2 | 94.7 | 82.2 | 91.0 | 75.7 | 1.1 | |
WHUQC | 78.6 | 100 | 97.4 | 100 | 87.0 | 98.3 | 0.9 |
Areas | GBE threshold | GBE quality (%) | MON2 threshold | MON2 quality (%) | MON2 thresholds | MON2 quality (%)
---|---|---|---|---|---|---
VA02 | 40 | 82.250 | 40 | 79.970 | 0.45, 0.2 | 80.633 |
50 | 82.250 | 50 | 77.428 | 0.55, 0.3 | 80.268 | |
60 | 82.246 | 60 | 70.846 | 0.4, 0.15 | 79.391 | |
70 | 82.241 | 70 | 55.656 | 0.35, 0.1 | 79.527 | |
St.D | 0.004 | St.D | 10.912 | - | 0.594 | |
AV | 40 | 76.699 | 40 | 68.751 | 0.45, 0.2 | 65.345 |
50 | 76.407 | 50 | 71.186 | 0.55, 0.3 | 69.407 | |
60 | 76.474 | 60 | 64.197 | 0.4, 0.15 | 69.298 | |
70 | 76.878 | 70 | 47.740 | 0.35, 0.1 | 65.322 | |
St.D | 0.215 | St.D | 10.557 | - | 2.320 |
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).