Article

Evaluation of Orthomosics and Digital Surface Models Derived from Aerial Imagery for Crop Type Mapping

Mingquan Wu, Chenghai Yang, Xiaoyu Song, Wesley Clint Hoffmann, Wenjiang Huang, Zheng Niu, Changyao Wang and Wang Li
1 The State Key Laboratory of Remote Sensing Science, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, P.O. Box 9718, 20 Datun Road, Chaoyang, Beijing 100101, China
2 USDA-Agricultural Research Service, Aerial Application Technology Research Unit, 3103 F & B Road, College Station, TX 77845, USA
3 Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
4 Laboratory of Digital Earth Sciences, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, Beijing 100094, China
* Author to whom correspondence should be addressed.
Remote Sens. 2017, 9(3), 239; https://doi.org/10.3390/rs9030239
Submission received: 30 December 2016 / Revised: 19 February 2017 / Accepted: 2 March 2017 / Published: 4 March 2017
(This article belongs to the Special Issue Earth Observations for Precision Farming in China (EO4PFiC))

Abstract
Orthomosaics and digital surface models (DSM) derived from aerial imagery, acquired by consumer-grade cameras, have potential for crop type mapping. In this study, a novel method was proposed for extracting the crop height from DSM data and for evaluating the orthomosaics and crop height for the identification of crop types (mainly corn, cotton, and sorghum). The crop height was extracted by subtracting the DSM derived after the crops were harvested from the DSM derived during the growing season. Then, the crops were identified from four-band aerial imagery (blue, green, red, and near-infrared) and the crop height, using an object-based classification method and a maximum likelihood method. The results showed that the extracted crop height had a very high linear correlation with the field measured crop height, with an R-squared value of 0.98. For the object-based method, crops could be identified from the four-band airborne imagery and crop height with an overall accuracy of 97.50% and a kappa coefficient of 0.96, which were 2.52% and 0.04 higher than those without crop height, respectively. For the maximum likelihood method, crops could be mapped from the four-band airborne imagery and crop height with an overall accuracy of 78.52% and a kappa coefficient of 0.67, which were 2.63% and 0.04 higher than those without crop height, respectively.

Graphical Abstract">

Graphical Abstract

1. Introduction

Consumer-grade cameras have been increasingly used in airborne remote sensing, due to their low cost, compact size, compact data storage, and user-friendliness [1]. These cameras are usually equipped with either a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. Due to their light weight, these sensors can be easily mounted on unmanned aerial vehicles (UAVs). UAVs equipped with consumer-grade cameras have been widely used in forestry [2,3], precision agriculture, and crop monitoring [4,5,6,7]. George et al. monitored crop parameters using a UAV [4]. Hardin et al. detected and mapped small weed patches using a small remotely piloted vehicle (RPV) [8]. Berni et al. extracted the leaf area index, chlorophyll content, water stress, and canopy temperature using thermal and narrowband multispectral imagery acquired by sensors onboard a UAV [9]. Swain et al. estimated the yield and total biomass of a rice crop using a radio-controlled unmanned helicopter-based low-altitude remote sensing (LARS) platform [10]. Hunt et al. extracted the leaf area index using near-infrared (NIR)-green-blue digital photographs acquired by an unmanned aircraft [11].
Consumer-grade cameras can acquire high spatial resolution RGB images with a 12- or 14-bit data depth, using a Bayer color filter mosaic. The high spatial and radiometric resolutions make the images from these sensors comparable to those acquired by conventional multispectral imaging systems, which typically use three or four cameras to acquire 3-band or 4-band imagery. Therefore, consumer-grade cameras have also been widely used on manned aircraft [1,12,13,14]. Yang et al. designed an airborne multispectral imaging system based on two identical consumer-grade cameras for acquiring RGB and NIR images, and evaluated the system for crop condition assessment and pest detection [1]. Nguy-Robertson et al. demonstrated the potential of using consumer-grade digital cameras to derive quantitative information for vegetation analysis [15]. Song et al. compared mosaicking techniques for airborne images from consumer-grade cameras [16]. Westbrook et al. identified individual cotton plants using consumer-grade cameras [17]. Zhang et al. mapped crops using images acquired by consumer-grade cameras, with object-based and pixel-based methods [14].
Although consumer-grade cameras have many advantages, they also have some limitations. They can only acquire broadband images in the red, green, and blue bands, and in the NIR band with modifications. Because the spectral responses of many crops are very similar, it is difficult to distinguish crops with similar spectral reflectance in images acquired by consumer-grade cameras, due to their low spectral resolution [13]. High spectral resolution imagery or other auxiliary information, such as crop height, is needed for precision crop monitoring, estimation of biophysical parameters, and disease detection [18,19].
Two products are commonly extracted from overlapped aerial imagery acquired by consumer-grade cameras: orthomosaic imagery and the digital surface model (DSM). Although consumer-grade camera applications in agriculture have mainly focused on orthomosaics derived from aerial RGB and NIR imagery [8,11,12,16,17,20], DSM data have been used for estimating crop height and other growth parameters. Bendig et al. estimated the biomass of barley using crop surface models derived from UAV-based RGB imaging [21]. They also extracted the plant height from crop surface models for biomass monitoring in barley [22]. Grenzdörffer extracted the crop height from UAS point clouds [23]. Li et al. estimated the canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system [24]. Papadavid et al. mapped potato crop height and leaf area index through vegetation indices [25]. Verhoeven et al. extracted the plant height of a crop canopy from archived aerial photographs for interpretative mapping [26]. Height information measured by other methods has also been used in agricultural applications. Sharma et al. demonstrated that the yield estimation of corn could be improved by using corn height measured with an acoustic sensor [27]. Gao et al. extracted the height of maize using airborne full-waveform light detection and ranging (LiDAR) data [28]. Tilly et al. obtained accurate plant height measurements and biomass estimates with terrestrial laser scanning in paddy rice [29,30]. Zhang et al. designed a LiDAR-based crop height measurement system for the large perennial grass Miscanthus giganteus [31].
The accuracy of crop identification can be improved by using the elevation information extracted from DSM data, because different crops generally differ in height. However, more research is needed to evaluate DSMs derived from aerial imagery for crop identification. As aerial imagery from consumer-grade cameras becomes increasingly available, it is necessary to evaluate orthomosaics and DSMs derived from such imagery for crop identification. Thus, the objectives of this study were to (1) extract the crop height from imagery taken during the growing season and after harvest; and (2) evaluate the orthomosaics and crop height derived from aerial imagery for crop type mapping.

2. Study Area

An 8 km × 10 km rectangular area covering the Texas A&M University Farm in College Station, TX, USA, was selected as the study area (Figure 1). The latitude of this area ranged from 30°28′37.68″N to 30°35′37.53″N, and the longitude from 96°30′31.88″W to 96°22′11.84″W. The study area was generally flat and primarily consisted of cropland. The main land cover types included crops (cotton, corn, winter wheat, grain sorghum, watermelon, and sugar cane), grass, trees, water, and residential areas.

3. Data and Preprocessing

3.1. Airborne Imagery

Four-band (red, green, blue, and NIR) airborne imagery acquired on 30 June and 21 October 2016 was used in this study. The images, with a 7360 × 4912 pixel array, were acquired with two Nikon D810 digital CMOS cameras with AF Nikkor 20 mm f/1.8G fixed focal length lenses (Nikon Inc., Melville, NY, USA), onboard a Cessna 206 aircraft. One camera was used to acquire the RGB images, while the other was equipped with an 830-nm long-pass filter to acquire the NIR images. The images on 30 June were acquired at an altitude of approximately 3000 m above ground level (AGL) with a 0.8-m spatial resolution, while the images on 21 October were acquired at an altitude of 1500 m AGL with a 0.4-m spatial resolution. Images were acquired along evenly spaced flight lines with forward and lateral overlaps higher than 70% and 40%, respectively. The imagery acquired on 21 October 2016 was cloud-free, while 20% of the imaging area had cloud cover on 30 June 2016. Fifty white boards were evenly placed in the study area as ground control points (GCPs) (Figure 1b), and the coordinates of these GCPs were measured using a GPS receiver with an accuracy of 1 to 8 cm (Figure 1c).
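As a plausibility check on the stated resolutions, the ground sample distance (GSD) follows from the camera geometry. Assuming the D810's 35.9-mm sensor width across its 7360-pixel array (a pixel pitch of roughly 4.9 µm, a camera specification not given in the text):

$$\mathrm{GSD} = \frac{p \cdot H}{f} = \frac{4.9~\mu\mathrm{m} \times 3000~\mathrm{m}}{20~\mathrm{mm}} \approx 0.73~\mathrm{m}$$

at 3000 m AGL, and approximately 0.37 m at 1500 m AGL, consistent with the nominal 0.8-m and 0.4-m products.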
The acquired images were in Nikon's NEF raw format. All of the NEF images were converted to TIFF format using the ViewNX-i software provided with the cameras. Then, Pix4Dmapper software was used to mosaic the TIFF images into a 0.8-m image for 30 June and a 0.4-m image for 21 October, with 23 field measured GCPs (Table 1). The mosaicked images were then calibrated using the method described in Section 4.1. The calibrated image for 30 June was used to extract the crop height and land cover types. The calibrated image for 21 October was used to extract the elevation data after the crop harvest, using the method described in Section 4.2.

3.2. Field Data

A total of 68 GCPs were measured using a GPS receiver with an accuracy of 1 to 8 cm. Twenty-three field measured GCPs that could be clearly identified in the airborne imagery were used for the geometric correction during image mosaicking. Fifty-three GCPs were used in the Z-error removal, and 15 samples (the independent samples in Table 1) were randomly selected from the 23 GCPs for validation of the Z-error removal.
Four 8 m × 8 m calibration tarps with nominal reflectance values of 8%, 16%, 32%, and 48% were placed in the study area for the radiometric calibration of the airborne imagery (Figure 1a). The actual reflectance values of the four tarps were measured on the ground during image acquisition, using a FieldSpec HandHeld 2 spectroradiometer (Figure 1a). The FieldSpec HandHeld 2 is a portable field spectroradiometer that acquires spectral data from 325 to 1075 nm with a spectral resolution of 3 nm.
In addition to the GPS coordinates of the 68 GCPs and the reflectance data for the four tarps, the land cover types of 218 crop fields and other land areas over the study area were surveyed on 12 September 2016. These 218 fields/areas covered 90% of the study area. The land cover types were recorded on the airborne RGB mosaic image acquired on 30 June 2016, with the aid of Google Earth on an iPad for real-time positioning. The field measured land cover data were used for the selection of areas of interest (AOIs) and the validation of crop identification.
A total of 32 crop height measurements were made using a tape measure in 13 plots of corn, cotton, and grain sorghum. The latitude and longitude coordinates of these 32 points were measured using the same GPS receiver. These field measured crop height data were used for the validation of extracted crop height data.

4. Methods

4.1. Calibration of Airborne Imagery

Orthomosaics created by the Pix4Dmapper software were calibrated using linear regression models derived from the field measured reflectance values and mean digital numbers (DNs) of the four tarps, in three steps. First, the mean DNs of the four tarps in the orthomosaics were calculated using the Zonal Statistics as Table tool in ArcGIS 10.2. Then, linear models (one for each band) were built using linear regression analysis to relate the field measured reflectance values to the mean DNs of the four tarps. Finally, these models were used to convert the DNs in the orthomosaics to reflectance.
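The per-band empirical line step can be expressed compactly in code. The following is a minimal sketch for one band; the tarp reflectance values are the nominal fractions above, while the mean DNs are illustrative placeholders, not the values measured in this study:

```python
import numpy as np

# Nominal tarp reflectance values (fractions); the mean DNs below are
# illustrative placeholders, not the values extracted in this study.
tarp_reflectance = np.array([0.08, 0.16, 0.32, 0.48])
tarp_mean_dn = np.array([3200.0, 5900.0, 11500.0, 17200.0])

# Fit the per-band linear model: reflectance = gain * DN + offset.
gain, offset = np.polyfit(tarp_mean_dn, tarp_reflectance, 1)

def dn_to_reflectance(band_dn):
    """Apply the fitted calibration to a DN array for this band."""
    return gain * band_dn + offset
```

One model is fitted per band; the R2 of each fit (Section 5.1) indicates how well the four tarp points define the line.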

4.2. Extraction of Crop Height

The mosaicked images and DSMs generated from the aerial imagery had a 3D root mean squared (RMS) error of 0.391 m for 30 June and 0.421 m for 21 October. The 3D RMS error was the mean of the X, Y, and Z RMS errors. This accuracy made it possible to extract the crop height from the DSM data. A DSM represents the surface heights of the terrain, including features such as the tops of crop canopies and buildings. By contrast, a digital elevation model (DEM) is a 3D representation of a terrain's bare-ground elevation. By subtracting the DEM from the DSM, the crop height can be extracted. Three methods could be used to obtain DEM data. First, existing DEM data, such as those provided by the U.S. Geological Survey (USGS), could be used. However, most satellite-derived DEM data have spatial resolutions of 30 m or coarser, which are too coarse for crop height extraction. Second, DEMs can be measured by airborne laser scanning (ALS) or LiDAR [24,28,29]. Third, DEMs can be extracted from aerial imagery taken after crops have been harvested, since the DSM is equal to the DEM in crop areas after the crop harvest. In this study, the third method was used to determine the DEM in the crop areas. Thus, the crop height was calculated by subtracting the DEM derived from the 21 October 2016 imagery from the DSM derived from the 30 June 2016 imagery.
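In raster terms, the height extraction is a per-pixel difference of two co-registered surfaces. A minimal sketch with rasterio, assuming hypothetical file names and that both products have been resampled to a common grid (the negative-value clamp is our assumption, not stated in the text):

```python
import rasterio

# Hypothetical file names; both rasters are assumed co-registered on one grid.
with rasterio.open("dsm_20160630.tif") as growing, \
     rasterio.open("dsm_20161021.tif") as harvested:
    dsm = growing.read(1).astype("float32")
    dem = harvested.read(1).astype("float32")  # post-harvest DSM ~ bare-ground DEM
    profile = growing.profile

height = dsm - dem
height[height < 0] = 0.0  # clamp residual negative differences (our assumption)

profile.update(dtype="float32", count=1)
with rasterio.open("crop_height.tif", "w", **profile) as dst:
    dst.write(height, 1)
```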
Although the mosaicked image and DSM for 21 October had a 3D RMS error of 0.421 m, the Z errors of the 23 GCPs ranged from 0.018 m to 1.263 m. Thus, the errors in some areas were too large for crop height extraction. To minimize the influence of the Z errors, a Z-error removal method was proposed in this study. This method was carried out in three steps. First, the Z errors of the 53 GCPs were calculated by subtracting the actual Z values from the calculated Z values. Then, a Z-error raster image was generated in ArcGIS 10.2 using the Z errors of these 53 GCPs. This was accomplished by (1) creating a TIN from the 53 GCP points using the Create TIN tool, and (2) converting the TIN to a 0.4-m raster image using the TIN to Raster tool. Finally, the errors were removed by subtracting the 0.4-m Z-error raster image from the DSM data. After the errors were removed from the DSM data for 30 June and 21 October, the new DSM data were validated using the 15 independent points and then used to extract the crop height. The validation results are shown in Table 1. The RMS error was reduced from 0.601 m to 0.025 m.
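The TIN-based error surface can be approximated outside ArcGIS with linear interpolation over a Delaunay triangulation. A sketch of the idea, where gcp_xy, gcp_dz, grid_x, and grid_y are assumed inputs (GCP map coordinates, their Z errors, and the cell-center coordinate grids of the DSM):

```python
import numpy as np
from scipy.interpolate import griddata

def z_error_surface(gcp_xy, gcp_dz, grid_x, grid_y):
    """Interpolate GCP Z errors (computed minus surveyed Z) onto the DSM grid.

    method="linear" interpolates over a Delaunay triangulation of the GCPs,
    mirroring the TIN-to-raster step; cells outside the convex hull are
    filled with the nearest GCP error here, which is our assumption.
    """
    linear = griddata(gcp_xy, gcp_dz, (grid_x, grid_y), method="linear")
    nearest = griddata(gcp_xy, gcp_dz, (grid_x, grid_y), method="nearest")
    return np.where(np.isnan(linear), nearest, linear)

# corrected_dsm = dsm - z_error_surface(gcp_xy, gcp_dz, grid_x, grid_y)
```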

4.3. Crop Identification Using Mosaic Imagery and Crop Height

Numerous studies have shown that both unsupervised and supervised classification methods, such as minimum distance, Mahalanobis distance, maximum likelihood, spectral angle mapper (SAM), neural network, and support vector machine (SVM), can be used to identify crops from airborne imagery [17,32]. The object-based classification method, which is usually applied to high spatial resolution data, is well suited for crop identification from airborne imagery. Many object-based classification tools are currently available, including eCognition and the Feature Extraction tools in ENVI 5.2.
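For reference, the per-pixel maximum likelihood rule assigns each feature vector to the class whose multivariate Gaussian model gives the highest log-likelihood. The following is a minimal sketch assuming equal class priors; ENVI's exact implementation may differ:

```python
import numpy as np

class MaximumLikelihoodClassifier:
    """Per-class multivariate Gaussian discriminant with equal priors."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = []
        for c in self.classes_:
            Xc = X[y == c]
            mean = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False)
            self.params_.append((mean, np.linalg.inv(cov),
                                 np.linalg.slogdet(cov)[1]))
        return self

    def predict(self, X):
        scores = []
        for mean, inv_cov, log_det in self.params_:
            d = X - mean
            # Log-likelihood up to a constant: -0.5 * (log|S| + d' S^-1 d).
            maha = np.einsum("ij,jk,ik->i", d, inv_cov, d)
            scores.append(-0.5 * (log_det + maha))
        return self.classes_[np.argmax(scores, axis=0)]
```

Here X would be the n × 5 matrix of the four reflectance bands plus crop height, and y the class labels sampled from the areas of interest.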
To evaluate the ability of the orthomosaics and crop height to identify crops, a small area within the 8 km × 10 km rectangle was selected. This small area contained the main land cover types, had lower DSM errors, and was cloud-free in the images. The four-band airborne imagery and crop height for 30 June 2016 were used for image classification. An object-based classification method was applied to the five layers of image data using the example-based feature extraction in ENVI 5.2. This tool identified crops in two steps. The four spectral bands were first used to create objects by segment and merge operations. Specifically, the edge algorithm was selected to segment plots at a scale level of 50, the full lambda schedule algorithm was selected to merge plots at a merge level of 95, and the texture kernel size was set to three. Three parameters were then set in the example-based classification. First, training samples for each of the seven land cover types (corn, cotton, sorghum, grass, bare soil, road, and wheat) were selected in the small area. Second, three types of attributes were selected: spectral (mean, standard deviation, minimum, and maximum), texture (range, mean, variance, and entropy), and spatial (area, length, compactness, convexity, solidity, roundness, form factor, elongation, rectangular fit, main direction, major length, minor length, number of holes, and hole area/solid area). Finally, the SVM method was selected for classification. To evaluate the influence of crop height on crop classification for different classification methods, the maximum likelihood method was also applied to the four-band airborne imagery, with and without the crop height, over the same areas of interest.
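ENVI's example-based feature extraction is proprietary, but the object-based workflow (segment, compute per-object attributes, classify with an SVM) can be sketched with open-source stand-ins. In the sketch below, felzenszwalb replaces ENVI's edge segmentation, only the spectral attributes are computed (the texture and spatial attributes are omitted), and training_labels is an assumed input pairing segment IDs with surveyed classes:

```python
import numpy as np
from skimage.segmentation import felzenszwalb
from sklearn.svm import SVC

def classify_objects(layers, training_labels, scale=50):
    """Object-based classification sketch for a (rows, cols, 5) layer stack."""
    # Stand-in segmenter; not ENVI's edge / full lambda schedule algorithms.
    segments = felzenszwalb(layers, scale=scale, channel_axis=-1)
    ids = np.unique(segments)

    # Per-object spectral attributes (mean, std, min, max of each layer).
    feats = np.stack([
        np.concatenate([pix.mean(0), pix.std(0), pix.min(0), pix.max(0)])
        for pix in (layers[segments == i] for i in ids)
    ])

    # training_labels: {segment_id: class_name} drawn from surveyed fields.
    train_ids = np.array(sorted(training_labels))
    svm = SVC(kernel="rbf")
    svm.fit(feats[np.searchsorted(ids, train_ids)],
            [training_labels[i] for i in train_ids])
    return segments, dict(zip(ids, svm.predict(feats)))
```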

4.4. Validation

4.4.1. Validation of Crop Height

A total of 32 crop height measurements, made in 13 plots of corn, cotton, and grain sorghum, were used for the validation of the extracted crop height data. Linear regression analysis was performed to relate the image-extracted crop height to the field measured crop height. However, due to the scale effect, which will be discussed in Section 5.2, the validation was performed by plot rather than by point. The mean values of the extracted and measured crop heights for each plot were first calculated. Then, the mean measured crop height was related to the mean extracted crop height using linear regression analysis.
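A sketch of the plot-level comparison, with placeholder per-plot means rather than the study's field data:

```python
import numpy as np
from scipy.stats import linregress

# Placeholder per-plot mean heights (m); not the values measured in this study.
extracted = np.array([1.85, 1.70, 0.62, 0.55, 1.05])
measured = np.array([2.10, 1.95, 0.85, 0.80, 1.30])

fit = linregress(extracted, measured)
print(f"R^2 = {fit.rvalue ** 2:.2f}, "
      f"measured = {fit.slope:.2f} * extracted + {fit.intercept:.2f}")
```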

4.4.2. Validation of Crop Identification

The field surveyed land cover types in 46 plots were used to validate the results of crop identification using the confusion matrix method [33]. The overall accuracy and kappa coefficient were used to determine the overall classification accuracy. The producer’s accuracy and user’s accuracy for each land cover type were used to determine the classification accuracy of each land cover type.
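These metrics follow directly from the confusion matrix. A sketch assuming reference and predicted are label vectors sampled from the validation plots, with rows of the matrix as reference classes (the scikit-learn convention):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

def accuracy_report(reference, predicted, labels):
    """Overall accuracy, kappa, and per-class producer's/user's accuracies."""
    cm = confusion_matrix(reference, predicted, labels=labels)
    overall = np.trace(cm) / cm.sum()
    kappa = cohen_kappa_score(reference, predicted, labels=labels)
    producers = np.diag(cm) / cm.sum(axis=1)  # reference (row) totals
    users = np.diag(cm) / cm.sum(axis=0)      # predicted (column) totals
    return overall, kappa, producers, users
```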

5. Results

5.1. Airborne Imagery Calibration

Figure 2 shows the calibration models for the four bands. The linear models between the field measured reflectance values and the mean DNs of the four tarps had high R2 values, ranging from 0.93 for the green band to 0.99 for the NIR band. This result demonstrates that the method can successfully calibrate the orthomosaics created by the Pix4Dmapper software. However, the model parameters are sensitive to atmospheric conditions, so the models need to be rebuilt for different times and locations.

5.2. Crop Height Extraction

Figure 3 shows the results of linear regression analysis between the field measured and extracted crop heights in the 13 plots (three corn, two grain sorghum, and eight cotton plots). The extracted crop height had a very high linear correlation with the field measured crop height, with an R2 value of 0.98. However, all of the extracted crop height values were lower than the field measured values, which is a common observation in studies with high resolution imagery over closed canopies [34]. This deviation had two causes. First, the crop heights were measured in the field on 12 September 2016, while the imagery used for crop height extraction was acquired on 30 June 2016; the crops had grown a little taller between the two dates. Second, the image scale had some effect on crop height extraction. The extracted crop height layer was a raster image with a pixel size of 0.4 m. Since the crops were planted in rows with a row spacing of approximately 1 m, each pixel contained some bare soil exposure wherever the canopy cover was not full, and the partial canopy cover reduced the extracted crop height. For example, two of the cotton plots had a canopy cover lower than 60%, and the extracted crop heights in these two plots were 0.25 m and 0.15 m lower than the respective field measured values.

5.3. Crop Identification

Figure 4 shows the crop identification results from the example-based feature extraction in ENVI 5.2. Table 2 shows the accuracy results using the field measured land cover data. The results showed that the crops could be identified with an overall accuracy of 97.50% and a kappa coefficient of 0.96. The errors were mainly due to the misclassification of cotton, sorghum, and road.
For a comparison with the object-based method, Figure 5 shows the classification results based on the maximum likelihood. Table 3 shows the accuracy assessment results, using the same field measured land cover data. Compared with the object-based method, the maximum likelihood had an 18.98% lower overall accuracy and resulted in a large number of misclassified pixels in the fields.

6. Discussion

6.1. Crop Mapping without Crop Height

To show the effect of crop height on crop identification, the example-based feature extraction was applied to the four-band airborne imagery alone, with the same areas of interest. Figure 6 shows the classification results, and Table 4 presents the accuracy assessment results using the same field measured land cover data as in Section 5.3. From Figures 4 and 6 and Tables 2 and 4, it can be seen that the overall accuracy and kappa coefficient of the classification with crop height were only 2.52% and 0.04 higher, respectively, than those without crop height. However, compared with the classification map with crop height, the map without crop height showed obvious misclassifications in the areas highlighted by the red circles in Figure 6. This is because these areas had very similar spectral responses but differed in plant height. Although crop height did not appear to increase the overall classification accuracy significantly, it has the potential to improve classification results. Moreover, it is neither difficult nor expensive to extract the crop height using aerial imagery collected either before crop planting or after crop harvest. Compared with DEM generation from ALS, the data acquisition and processing costs of the proposed method are lower [24,35]. The crop height can also be extracted by the soil point method, which derives the crop height from the DSM using a DEM interpolated from soil points in the vegetated imagery [22,23]. Compared with the soil point method, the proposed method can be used in dense crop areas where little soil background is visible.
For a comparison with the object-based method, Figure 7 shows the classification results based on the maximum likelihood using the four-band airborne imagery alone. Table 5 shows the accuracy assessment results using the same field measured land cover data as in Section 5.3. From Figures 5 and 7 and Tables 3 and 5, it can be seen that the overall accuracy and kappa coefficient of the classification with crop height were only 2.63% and 0.04 higher, respectively, than those without crop height. Compared with the object-based method, the maximum likelihood had a roughly 19% lower overall accuracy, both with and without crop height.

6.2. Limitations

Although the proposed method can extract the crop height accurately and the crop height is useful for crop mapping, there are several disadvantages that limit the applications of this method.
First, many factors affect the crop height, including crop variety, growth period, and farmland management practices such as irrigation and fertilization. Due to these factors, the height of the same crop can vary from field to field. Even in the same field, where these factors are very similar, the crop height can exhibit some variability. Thus, if only the crop height is used for crop type mapping, the mapping accuracy will be low. For example, the overall accuracy of a maximum likelihood classification using only the crop height was 74.66% for the same area. The accuracy would be much lower over a large area, due to increased spatial variability. Therefore, the crop height needs to be used together with multispectral imagery to achieve improved classification results.
Second, the errors in the DSM can influence the accuracy of the surface height. In this study, although Pix4Dmapper generated the 0.4-m DSM with a 3D RMS error of 0.421 m from the aerial imagery acquired at 1500 m AGL, the Z errors for the 23 GCPs ranged from 0.018 m to 1.263 m. These errors were partly attributed to: (1) image acquisition quality, such as platform stability and image overlap; and (2) land surface conditions, such as land cover types and terrain. The distribution of the errors was not uniform. Since a manned aircraft was used in this study, the platform was more stable than a UAV, and the images were acquired with sufficient forward and lateral overlap. Large Z errors were found in the hills, due to terrain effects. The errors are generally too large for crop height extraction in the edge areas, where there is insufficient image overlap. Moreover, there are some outliers in the DSM product, and a high density of outliers makes the DSM unusable in some areas, such as water and dense vegetation areas. The actual DEM of water and soil can be obtained by filtering the point cloud. However, the actual DSM of a crop canopy cannot be accurately obtained by filtering the point cloud, because of the variability in the crop height. Higher resolution image data will reduce the number of outliers, because more 3D point cloud points can be derived from high resolution imagery. In this study, the DSM from the imagery taken at 1500 m had fewer outliers than that from the imagery taken at 3000 m. Previous studies have also shown that there are almost no outliers in DSM data derived from 0.2-m aerial imagery [24]. Therefore, it is recommended that the spatial resolution of the image data be finer than 0.2 m for accurate crop height extraction, when the spatial variability in the field is considered.

7. Conclusions

In this study, the orthomosaics and DSMs derived from aerial imagery acquired by a normal RGB and a modified NIR consumer-grade camera were evaluated for crop type mapping. The crop height was extracted from DSM data generated from the aerial imagery by Pix4Dmapper. Crops were identified from the mosaicked four-band imagery and the crop height using the maximum likelihood method and an object-based classification method (the example-based feature extraction in ENVI 5.2). Based on the results from this study, the following conclusions can be drawn:
(1)
Crop height can be extracted by subtracting the DEM from the DSM generated by Pix4Dmapper from aerial imagery. The extracted crop height had a very high linear correlation with the field measured crop height, with an R2 value of 0.98.
(2)
Crop height information is useful for crop identification. Crops can be identified from the four-band imagery and the crop height using an object-based classification method, with an overall accuracy and kappa coefficient of 97.50% and 0.96, respectively. The overall accuracy and kappa coefficient of the crop classification without crop height were 2.52% and 0.04 lower, respectively. With the maximum likelihood method, crops could be mapped from the four-band imagery and crop height with an overall accuracy of 78.52% and a kappa coefficient of 0.67, which were 2.63% and 0.04 higher than those without crop height, respectively.

Acknowledgments

This project was conducted as part of a visiting scholar research program, and the first author was financially supported by the China Scholarship Council (201504910261), the Youth Innovation Promotion Association CAS (2017089), the Hundred Talents Program CAS (Y6YR0700QM), the National Natural Science Foundation of China (41301390), the National Science and Technology Major Project (2014AA06A511), the Major State Basic Research Development Program of China (2013CB733405, 2010CB950603), and the National Science and Technology Major Project of China. The authors wish to thank Fred Gomez and Lee Denham of USDA-ARS in College Station, Texas, for collecting the airborne images and ground data for this study.

Author Contributions

Mingquan Wu designed and conducted the experiment, processed and analyzed the imagery, and wrote the manuscript. Chenghai Yang guided the study design, participated in data collection, advised in data analysis, and revised the manuscript. Xiaoyu Song, W. Clint Hoffmann, Wenjiang Huang, Zheng Niu, Changyao Wang, and Wang Li were involved in the process of the experiment, ground data collection, or manuscript revision. All authors reviewed and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Yang, C.; Westbrook, J.K.; Suh, C.P.-C.; Martin, D.E.; Hoffmann, W.C.; Lan, Y.; Fritz, B.K.; Goolsby, J.A. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing. Remote Sens. 2014, 6, 5257–5278.
2. Grenzdörffer, G.; Engel, A.; Teichert, B. The photogrammetric potential of low-cost UAVs in forestry and agriculture. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 31, 1207–1214.
3. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039.
4. George, E.A.; Tiwari, G.; Yadav, R.; Peters, E.; Sadana, S. UAV systems for parameter identification in agriculture. In Proceedings of the IEEE Global Humanitarian Technology Conference: South Asia Satellite (GHTC-SAS), Trivandrum, India, 23–24 August 2013; pp. 270–273.
5. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712.
6. Gómez-Candón, D.; De Castro, A.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56.
7. Torres-Sánchez, J.; Peña, J.; De Castro, A.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
8. Hardin, P.; Jackson, M.; Anderson, V.; Johnson, R. Detecting squarrose knapweed (Centaurea virgata Lam. Ssp. Squarrosa Gugl.) using a remotely piloted vehicle: A Utah case study. GISci. Remote Sens. 2007, 44, 203–219.
9. Berni, J.A.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738.
10. Swain, K.C.; Thomson, S.J.; Jayasuriya, H.P. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASABE 2010, 53, 21–27.
11. Hunt, E.R.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.
12. Yang, C. A high-resolution airborne four-camera imaging system for agricultural remote sensing. Comput. Electron. Agric. 2012, 88, 13–24.
13. Yang, C.; Everitt, J.H.; Davis, M.R.; Mao, C. A CCD camera-based hyperspectral imaging system for stationary and airborne applications. Geocarto Int. 2003, 18, 71–80.
14. Zhang, J.; Yang, C.; Song, H.; Hoffmann, W.C.; Zhang, D.; Zhang, G. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification. Remote Sens. 2016, 8, 257.
15. Nguy-Robertson, A.L.; Brinley Buckley, E.M.; Suyker, A.S.; Awada, T.N. Determining factors that impact the calibration of consumer-grade digital cameras used for vegetation analysis. Int. J. Remote Sens. 2016, 37, 3365–3383.
16. Song, H.; Yang, C.; Zhang, J.; Hoffmann, W.C.; He, D.; Thomasson, J.A. Comparison of mosaicking techniques for airborne images from consumer-grade cameras. J. Appl. Remote Sens. 2016, 10, 016030.
17. Westbrook, J.K.; Eyster, R.S.; Yang, C.; Suh, C.P.-C. Airborne multispectral identification of individual cotton plants using consumer-grade cameras. Remote Sens. Appl. Soc. Environ. 2016, 4, 37–43.
18. Goel, P.; Prasher, S.; Landry, J.A.; Patel, R.; Viau, A.; Miller, J. Estimation of crop biophysical parameters through airborne and field hyperspectral remote sensing. Trans. ASAE 2003, 46, 1235–1464.
19. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245.
20. Song, H.; Yang, C.; Zhang, J.; He, D.; Thomasson, J.A. Combining fuzzy set theory and nonlinear stretching enhancement for unsupervised classification of cotton root rot. J. Appl. Remote Sens. 2015, 9, 096013.
21. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412.
22. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
23. Grenzdörffer, G.J. Crop height determination with UAS point clouds. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 135.
24. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
25. Papadavid, G.; Hadjimitsis, D.; Toulios, L.; Michaelides, S. Mapping potato crop height and leaf area index through vegetation indices using remote sensing in Cyprus. J. Appl. Remote Sens. 2011, 5, 053526.
26. Verhoeven, G.; Vermeulen, F. Engaging with the canopy—Multi-dimensional vegetation mark visualisation using archived aerial images. Remote Sens. 2016, 8, 752.
27. Sharma, L.K.; Bu, H.; Franzen, D.W.; Denton, A. Use of corn height measured with an acoustic sensor improves yield estimation with ground based active optical sensors. Comput. Electron. Agric. 2016, 124, 254–262.
28. Gao, S.; Niu, Z.; Sun, G.; Zhao, D.; Jia, K.; Qiao, Y. Height extraction of maize using airborne full-waveform LIDAR data and a deconvolution algorithm. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1978–1982.
29. Tilly, N.; Hoffmeister, D.; Cao, Q.; Huang, S.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Multitemporal crop surface models: Accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice. J. Appl. Remote Sens. 2014, 8, 083671.
30. Tilly, N.; Hoffmeister, D.; Cao, Q.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Precise plant height monitoring and biomass estimation with terrestrial laser scanning in paddy rice. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences Conference, Antalya, Turkey, 11–13 November 2013; Volume 1113.
31. Zhang, L.; Grift, T.E. A LIDAR-based crop height measurement system for Miscanthus giganteus. Comput. Electron. Agric. 2012, 85, 70–76.
32. Yang, C.; Odvody, G.N.; Fernandez, C.J.; Landivar, J.A.; Minzenmayer, R.R.; Nichols, R.L. Evaluating unsupervised and supervised image classification methods for mapping cotton root rot. Precis. Agric. 2015, 16, 201–215.
33. Wu, M.; Zhang, X.; Huang, W.; Niu, Z.; Wang, C.; Li, W.; Hao, P. Reconstruction of daily 30 m data from HJ CCD, GF-1 WFV, Landsat, and MODIS data for crop monitoring. Remote Sens. 2015, 7, 16293–16314.
34. Bareth, G.; Bendig, J.; Tilly, N.; Hoffmeister, D.; Aasen, H.; Bolten, A. A comparison of UAV- and TLS-derived plant height for crop monitoring: Using polygon grids for the analysis of crop surface models (CSMs). Photogramm. Fernerkund. Geoinf. 2016, 2016, 85–94.
35. Li, W.; Niu, Z.; Chen, H.; Li, D. Characterizing canopy structural complexity for the estimation of maize LAI based on ALS data and UAV stereo images. Int. J. Remote Sens. 2016.
Figure 1. Location of the study area with (a) spectral measurement of calibration tarps; (b) ground control point (GCP) measurement with GPS; and (c) airborne RGB mosaic image acquired on 1 August 2016.
Figure 2. Calibration models between field measured reflectance values and mean digital numbers (DNs) of the four tarps for blue, green, red, and NIR bands.
Figure 3. Linear regression result between extracted and field measured crop height.
Figure 4. Crop classification results based on the object-based method, using four-band airborne imagery and crop height.
Figure 5. Crop classification map based on the maximum likelihood, using four-band airborne imagery with crop height.
Figure 6. Crop mapping results based on the object-based method, using four-band airborne imagery without crop height.
Figure 7. Crop classification map based on the maximum likelihood, using four-band airborne imagery without crop height.
Table 1. GCP errors before and after Z-error removal.

| GCP | Error X (m) | Error Y (m) | Error Z (m) | Z-Error after Removal (m) | Independent Sample 1 |
|---|---|---|---|---|---|
| 0 | −0.637 | −0.513 | −0.755 | 0.004 | Yes |
| 1 | −0.094 | 0.056 | 0.689 | 0.023 | No |
| 2 | 0.769 | 0.555 | 1.263 | −0.028 | Yes |
| 3 | 0.173 | 0.435 | 0.533 | 0.005 | No |
| 4 | 0.103 | 0.434 | 0.170 | 0.009 | Yes |
| 5 | −0.125 | 0.321 | 0.059 | −0.015 | Yes |
| 6 | −0.416 | 0.396 | 0.058 | 0.012 | No |
| 7 | −0.410 | −0.089 | −0.749 | −0.004 | Yes |
| 8 | 0.284 | −0.065 | 0.921 | −0.050 | No |
| 9 | −0.432 | −0.014 | 0.214 | 0.017 | Yes |
| 10 | −0.590 | −0.302 | −0.304 | −0.019 | No |
| 11 | 0.012 | 0.020 | 0.492 | 0.000 | No |
| 12 | 0.184 | 0.169 | 0.481 | 0.025 | Yes |
| 13 | 0.305 | 0.172 | −0.407 | −0.002 | Yes |
| 14 | 0.148 | −0.171 | −0.145 | −0.003 | Yes |
| 15 | 0.182 | −0.454 | −0.687 | 0.022 | Yes |
| 16 | −0.096 | −0.218 | −0.870 | −0.042 | No |
| 17 | 0.297 | −0.092 | 0.120 | 0.018 | Yes |
| 18 | −0.023 | −0.316 | −0.090 | 0.048 | Yes |
| 19 | −0.234 | −0.316 | −0.112 | −0.003 | Yes |
| 20 | −0.092 | −0.406 | 0.018 | −0.011 | Yes |
| 21 | 0.437 | 0.250 | 0.971 | −0.001 | Yes |
| 22 | 0.706 | 0.421 | 0.961 | −0.055 | No |
| Mean (m) | 0.019603 | 0.011848 | 0.123106 | 0.000459 | |
| Sigma (m) | 0.363778 | 0.313922 | 0.587969 | 0.024957 | |
| RMS Error (m) | 0.364306 | 0.314146 | 0.600719 | 0.024839 | |

1 Independent samples: points that were not used in the Z-error removal.
Table 2. Accuracy assessment results of crop classification based on the object-based method, using four-band airborne imagery and crop height.

| Class | Corn | Cotton | Grass | Sorghum | Bare Soil | Wheat | Road | Producer's Accuracy (%) | User's Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| Corn | 96.79 | 0 | 0 | 0 | 0 | 0 | 0 | 96.79 | 100 |
| Cotton | 0.59 | 98.6 | 11.03 | 2.63 | 0.33 | 0 | 18.38 | 98.6 | 98.46 |
| Grass | 0 | 0 | 88.97 | 0 | 0 | 0 | 0 | 88.97 | 100 |
| Sorghum | 2.63 | 0.17 | 0 | 97.37 | 0.16 | 0 | 0 | 97.37 | 94.2 |
| Bare soil | 0 | 0.06 | 0 | 0 | 80 | 0 | 1.18 | 80 | 90.44 |
| Wheat | 0 | 0 | 0 | 0 | 0 | 99.87 | 0 | 99.87 | 100 |
| Road | 0 | 1.18 | 0 | 0 | 19.51 | 0.13 | 80.44 | 80.44 | 56.92 |

Overall accuracy = 97.50%; Kappa coefficient = 0.96.
Table 3. Accuracy assessment results of crop classification based on the maximum likelihood, using four-band airborne imagery with crop height.

| Class | Corn | Cotton | Grass | Sorghum | Bare Soil | Wheat | Road | Producer's Accuracy (%) | User's Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| Corn | 92.07 | 8.31 | 4.79 | 0.59 | 0.14 | 0 | 0 | 92.07 | 74.06 |
| Cotton | 1.57 | 71.92 | 0.39 | 9.32 | 6.21 | 1.91 | 0.89 | 71.92 | 97.72 |
| Grass | 5.35 | 6.05 | 93.42 | 2.32 | 0 | 0 | 0 | 93.42 | 64.91 |
| Sorghum | 0.57 | 6.09 | 1.39 | 86.16 | 0.78 | 0 | 0.73 | 86.16 | 40.41 |
| Bare soil | 0.44 | 6.53 | 0 | 1.45 | 40.56 | 0.82 | 12.97 | 40.56 | 11.82 |
| Wheat | 0 | 0.5 | 0 | 0 | 9.86 | 97 | 0.41 | 97 | 87.11 |
| Road | 0 | 0.6 | 0 | 0.19 | 22.44 | 0.27 | 85.01 | 85.01 | 42.82 |

Overall accuracy = 78.52%; Kappa coefficient = 0.67.
Table 4. Accuracy assessment results of crop identification based on the object-based method, using four-band airborne imagery without crop height.

| Class | Corn | Cotton | Grass | Sorghum | Bare Soil | Wheat | Road | Producer's Accuracy (%) | User's Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| Corn | 96.41 | 1.19 | 0 | 1.61 | 0 | 0 | 0 | 96.41 | 94.96 |
| Cotton | 1.31 | 94.82 | 11.03 | 1.02 | 0.33 | 0 | 18.38 | 94.82 | 98.44 |
| Grass | 0 | 0 | 88.97 | 0 | 0 | 0 | 0 | 88.97 | 100 |
| Sorghum | 2.27 | 2.59 | 0 | 97.37 | 0.16 | 0 | 0 | 97.37 | 82.08 |
| Bare soil | 0 | 0.06 | 0 | 0 | 77.15 | 0 | 2.8 | 77.15 | 85.88 |
| Wheat | 0 | 0 | 0 | 0 | 0 | 99.87 | 0 | 99.87 | 100 |
| Road | 0 | 1.34 | 0 | 0 | 22.36 | 0.13 | 78.82 | 78.82 | 53.24 |

Overall accuracy = 94.98%; Kappa coefficient = 0.92.
Table 5. Accuracy assessment results of crop classification based on the maximum likelihood, using four-band airborne imagery without crop height.

| Class | Corn | Cotton | Grass | Sorghum | Bare Soil | Wheat | Road | Producer's Accuracy (%) | User's Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| Corn | 87.63 | 9.59 | 4.61 | 1.4 | 0.18 | 0 | 0 | 87.63 | 70.38 |
| Cotton | 2.1 | 69.24 | 0.36 | 9.32 | 3.97 | 1.97 | 0.73 | 69.24 | 97.5 |
| Grass | 7.02 | 7.46 | 92.1 | 2.75 | 0 | 0 | 0 | 92.1 | 59.36 |
| Sorghum | 2.84 | 6.83 | 2.93 | 84.9 | 0.78 | 0 | 0.65 | 84.9 | 34.89 |
| Bare soil | 0.4 | 5.83 | 0 | 1.46 | 49.61 | 0.75 | 13.37 | 49.61 | 15.45 |
| Wheat | 0 | 0.51 | 0 | 0 | 10.33 | 97.08 | 0.41 | 97.08 | 86.83 |
| Road | 0 | 0.54 | 0 | 0.19 | 15.14 | 0.19 | 84.85 | 84.85 | 48.43 |

Overall accuracy = 75.98%; Kappa coefficient = 0.63.
