Article

An Airborne Multispectral Imaging System Based on Two Consumer-Grade Cameras for Agricultural Remote Sensing

Chenghai Yang, John K. Westbrook, Charles P.-C. Suh, Daniel E. Martin, W. Clint Hoffmann, Yubin Lan, Bradley K. Fritz and John A. Goolsby

1 USDA-Agricultural Research Service, Aerial Application Technology Research Unit, 3103 F&B Road, College Station, TX 77845, USA
2 USDA-Agricultural Research Service, Insect Control and Cotton Disease Research Unit, 2771 F&B Road, College Station, TX 77845, USA
3 USDA-Agricultural Research Service, Tick and Biting Fly Research Unit, 22675 N. Moorefield Road, Edinburg, TX 78541, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2014, 6(6), 5257-5278; https://doi.org/10.3390/rs6065257
Submission received: 23 April 2014 / Revised: 29 May 2014 / Accepted: 29 May 2014 / Published: 6 June 2014


Abstract

This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One camera captures normal color images, while the other is modified to obtain near-infrared (NIR) images. The color camera is also equipped with a GPS receiver so that images can be geotagged. A remote control is used to trigger both cameras simultaneously. Images are stored in 14-bit RAW and 8-bit JPEG files on CompactFlash cards. A second-order transformation was used to align the color and NIR images and achieve subpixel registration in the four-band images. The imaging system was tested under various flight and land cover conditions, and optimal camera settings were determined for airborne image acquisition. Images were captured at altitudes of 305–3050 m (1000–10,000 ft), achieving pixel sizes of 0.1–1.0 m. Four practical application examples are presented to illustrate how the imaging system was used to estimate cotton canopy cover, detect cotton root rot, and map henbit and giant reed infestations. Preliminary analysis of example images has shown that this system has potential for crop condition assessment, pest detection, and other agricultural applications.

1. Introduction

With advances in electronic imaging technology, airborne imaging systems have evolved from simple, small-frame, analog video cameras in the 1980s to sophisticated, large-frame, high-resolution digital cameras today. These systems have been widely used as a versatile remote sensing tool for many applications due to their advantages over film-based aerial photography and satellite imagery. Among the advantages of airborne imaging systems are their relatively low cost, high spatial resolution, real-time/near-real-time availability of imagery for visual assessment and image processing, and ability to obtain data in narrow spectral bands in the visible to mid-infrared region of the spectrum [1,2]. Over the years, numerous commercial and custom-built multispectral imaging systems have been developed and used for diverse remote sensing applications, including rangeland and cropland assessment, precision agriculture, and pest management [3–5]. Most airborne multispectral imaging systems can provide 8- to 16-bit image data with submeter resolutions at 3–12 narrow spectral bands in the visible to near-infrared (NIR) regions of the electromagnetic spectrum [6–8]. Some airborne imaging systems, such as the Spectra-View 5WT medium format multispectral camera (Airborne Data Systems, Inc., Redwood Falls, MN, USA), can also capture mid-infrared and thermal infrared images.
Recent developments in high resolution (i.e., high spatial resolution) satellite sensors have significantly narrowed the gap in spatial resolution between traditional satellite and airborne imagery. The commercial availability of high resolution satellite sensors such as IKONOS, QuickBird, GeoEye-1, and WorldView-2 in recent years provides new opportunities for remote sensing applications that require high resolution image data. For example, GeoEye-1 provides four spectral bands at 1.65 m resolution and WorldView-2 offers eight spectral bands at 1.8 m resolution. Moreover, the high revisit frequency and fast turnaround time, combined with their relatively large areal coverage, make high resolution satellite sensors attractive for many applications, including precision agriculture [9,10]. Nevertheless, airborne multispectral imaging systems still offer some advantages, including the immediate availability of imagery for real-time assessment and the flexibility to change filters for desired wavelengths and bandwidths. More importantly, satellite imagery cannot always be acquired from a target area at specified time periods due to satellite orbits, competition with other customers for images at the same time, and weather conditions. Additionally, satellite imagery is still more expensive than airborne imagery for smaller areas. As unmanned aerial systems (UAS) become more available, they are being used as another versatile and cost-effective platform for airborne remote sensing, along with large numbers of manned aircraft, including agricultural aircraft [11–13].
Technological advances in sensor and imaging technologies have made consumer-grade digital cameras an attractive option for remote sensing due to their low cost, compact size, compact data storage, and user-friendliness. Consumer-grade digital cameras are fitted with either a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. These cameras employ a Bayer color filter mosaic to obtain true-color images using a single sensor [14,15]. A Bayer filter mosaic is a color filter array that arranges RGB color filters on the pixel array of a CCD or CMOS sensor. Since each pixel is filtered to record only one of the three primary colors, various demosaicing algorithms are used to interpolate a complete set of red, green, and blue values for each pixel. Although pixel interpolation lowers the effective spatial resolution of the band images, there is no need to align the three band images. Consequently, consumer-grade digital color cameras have been increasingly used by researchers for agricultural applications [16–19].
The use of images with visible and NIR bands is very common in remote sensing, especially for vegetation monitoring. Many vegetation indices, such as the normalized difference vegetation index (NDVI), require spectral information in the NIR and red bands, even though the three visible bands may be sufficient for some applications. Since most consumer-grade cameras only provide the three broad visible bands, NIR filtering techniques can be used to convert a color camera to a NIR camera. Most digital cameras are fitted with filters to block UV and infrared light. Therefore, it is possible to obtain NIR images by replacing the blocking filter with a long-pass infrared filter on a standard CCD or CMOS sensor. Studies have been conducted on the use of NIR-converted digital cameras for monitoring plant conditions, and results from these studies support their use as simple and affordable tools for plant stress detection and growth monitoring [20,21].
The Aerial Application Technology Research Unit at the U.S. Department of Agriculture, Agricultural Research Service’s Southern Plains Agricultural Research Center in College Station, Texas has devoted considerable effort to the development and evaluation of airborne imaging systems for agricultural applications. Currently, we have a suite of airborne multispectral and hyperspectral imaging systems, which have been used for research and practical applications in precision agriculture and pest management [22–25]. One of the multispectral imaging systems [8] consists of four high resolution CCD digital cameras, which are sensitive in the 400 to 1000 nm spectral range and provide 2048 × 2048 active pixels with 12-bit data depth. The four cameras are respectively equipped with blue (430–470 nm), green (530–570 nm), red (630–670 nm), and NIR (810–850 nm) bandpass interference filters to capture four-band images. Like most multi-camera systems, each camera is equipped with a different bandpass filter. This approach has the advantage that each camera can be individually adjusted for optimum focus and aperture settings, but the disadvantage that the images from all the bands have to be properly aligned. With multiple optical systems, it is very difficult to achieve this alignment optically or mechanically, so a software-based registration procedure is commonly used to align the band images. Another logistical problem associated with these imaging systems is that all the cameras have to be connected to a desktop PC via image-grabbing cards for system control and data storage. They also require an external power supply for the cameras, PC, and display monitor. Consequently, these systems are heavy and bulky, making them difficult to install in and remove from aircraft.
For both research and practical applications, low-cost and user-friendly imaging systems are needed. These systems should be able to capture geotagged images at varying altitudes and normal airplane speeds. The objectives of this study were to: (1) develop a high resolution imaging system using two identical consumer-grade cameras with a NIR conversion on one of the cameras; (2) integrate a GPS receiver with the imaging system to acquire geotagged images; and (3) demonstrate the potential use of the system for estimating cotton canopy cover, detecting cotton root rot, and mapping two invasive weeds, henbit and giant reed.

2. System Design and Description

2.1. Selection and Configuration of System Components

This system was designed to meet the requirements for both research and practical applications. The required major characteristics of the system were as follows: (1) one camera to capture true-color images; (2) a second identical camera that can be modified to capture NIR images; (3) a pixel array of at least 20 megapixels to obtain high resolution images; (4) at least 12-bit pixel depth to allow a wide range of digital count values for diverse ground targets; (5) a focal length that allows pixel sizes of 0.1 to 1 m at flight heights of 305 to 3050 m (1000 to 10,000 ft); (6) short exposure time (fast shutter speed) to minimize image blurring at low flight height and fast flight speed; (7) a GPS receiver integrated with the color camera to geotag images; and (8) a wireless device to trigger the two cameras simultaneously.
To meet the above criteria, two Canon EOS 5D Mark II cameras and two Canon EF 20 mm f/2.8 USM lenses (Canon USA Inc., Lake Success, NY, USA) were selected (Figure 1). Table 1 presents the major specifications of the camera. The EOS 5D Mark II is a high-performance, 14-bit digital SLR camera with a full-frame (36 mm × 24 mm) CMOS sensor. The image sensor contains 21 megapixels (5616 × 3744 pixels) at fine resolution, though images can also be recorded at two coarser resolutions. The lenses have a fixed focal length of 20 mm and an aperture range of f/2.8 to f/22.
One camera was used to collect true-color images, and the other was converted to a NIR camera by replacing the NIR blocking filter fitted in front of the CMOS sensor. This filter blocks NIR wavelengths so that true-color images can be obtained; NIR images can therefore be captured if it is replaced with a filter that transmits NIR light. Accordingly, the NIR blocking filter was replaced by a 720-nm long-pass filter (Life Pixel, Mukilteo, WA, USA). The Bayer filter mosaic, on the other hand, is fused to the sensor substrate and cannot be removed. Therefore, the transmission profiles of the Bayer color channels remain when the NIR blocking filter is replaced. All three channels in the converted camera record only NIR radiation, but the red channel has the highest sensitivity [20]. As a result, the image recorded in the red channel was used as the NIR image.
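As an illustration, the sketch below shows how the NIR band would be pulled out of a demosaiced frame from the converted camera; the filename is a placeholder, not from the original study.

```python
# A minimal sketch of extracting the NIR band from a frame taken with the
# 720-nm-converted camera. All three Bayer channels record NIR light, but the
# red channel is the most sensitive, so it alone is kept as the NIR band.
import imageio.v2 as imageio

rgb_nir = imageio.imread("nir_camera_frame.jpg")  # H x W x 3 array from the converted camera
nir_band = rgb_nir[:, :, 0]                       # channel 0 (red) serves as the NIR band
```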
A Garmin eTrex Vista HC series GPS receiver (Garmin International, Inc., Olathe, KS, USA) was attached to the color camera via a WFT-E4 IIA wireless file transmitter (Canon USA Inc., Lake Success, NY, USA) so that images could be geotagged. The two cameras were attached to an aluminum frame rack next to each other. A Wireless RF Pro Remote Control (Hähnel Industries Ltd, Bandon, Co. Cork, Ireland) was attached to each camera so that both cameras could be triggered simultaneously with one transmitter for image acquisition. Images from each camera were stored in 14-bit RAW (CR2 format) and 8-bit JPEG files on a CompactFlash card. Figure 2 shows the configuration of the two-camera imaging system installed over a camera port in a Cessna 206 aircraft. The mechanical alignment ensured that both cameras had a similar field of view. Color and NIR images were viewed on the cameras’ 3-in LCD monitors for targeting particular imaging areas. An external video monitor could also be connected to the cameras for larger, clearer views and for real-time assessment. The cost of the whole system was about $6000: each camera with its lens cost about $2500, and the other components, including the NIR conversion, were under $1000. The total cost was only one-third that of the four-camera system mentioned above [8].

2.2. Parameter Settings of Cameras

In remote sensing applications, cameras are generally used to take nadir images high above ground under sunny conditions. The two cameras were set to manual mode and the lens focus set to infinity (∞). Major parameters such as exposure time (shutter speed), aperture, and ISO speed were determined to obtain high quality images. Since low ISO speeds produce less image noise, the ISO speed for both cameras was set at 200, the lowest ISO speed available in manual mode. Based on a low flight height of 305 m (1000 ft) and a typical flight speed of 200 km/h (124 mph), the exposure time was set at 1/500 s (2 ms) for both cameras to minimize image blurring. The aperture was set at f/10 for the color camera and f/14 for the NIR camera so that the histograms of all four band images from diverse target areas were well spread within the dynamic range without saturation. Image recording quality was set to RAW+JPEG to record 21-megapixel (5616 × 3744) images. The file size was approximately 25.8 MB for the RAW image and 6.1 MB for the JPEG image. A 32-GB CF card was used for each camera and could store about 1000 pairs of images. The information display menu was set to show the histograms and GPS information on the LCD for each image right after it was captured. All other camera parameters were set to their defaults. Generally, camera settings should remain the same for the whole imaging season so that images taken at different times can be compared, despite the fact that light intensity changes over time. However, if images have to be taken under overcast conditions, the aperture opening can be increased to avoid dark images.

2.3. Image Alignment

One inherent problem associated with any multi-camera imaging system is misalignment among the images obtained from the different cameras. Therefore, the cameras were mounted close to each other and aligned before each imaging mission to minimize initial misalignment. Although this mechanical alignment ensures that the two cameras capture images with maximum overlapping areas, it is not sufficient to allow data extraction and image analysis directly from the unaligned images. Therefore, first- and second-order polynomial transformation models were used to align the individual images more accurately. A first-order transformation is a linear transformation of the following form:
$$
\begin{cases}
x_o = a_1 + a_2 x_i + a_3 y_i \\
y_o = b_1 + b_2 x_i + b_3 y_i,
\end{cases}
$$
where $x_i$ and $y_i$ are the coordinates in a raw or source image (input); $x_o$ and $y_o$ are the registered or aligned coordinates (output); $a_1$, $a_2$, and $a_3$ are the coefficients of the transformation in the x direction; and $b_1$, $b_2$, and $b_3$ are the coefficients of the transformation in the y direction.
A linear transformation will shift, scale, and rotate the source image to align it with a reference image. For the color and NIR images, the color image can be selected as the reference image and the NIR image registered to it. To determine the transformation coefficients, at least three reference points are required. In practice, more points (e.g., 8–10) are used to increase the accuracy of the transformation, and the least squares method is then used to estimate the coefficients.
Generally, the first-order transformation is sufficient for image-to-image alignment. However, if the RMS errors exceed the error tolerance for a particular application, a second-order transformation can be used. A second-order transformation has the following form:
$$
\begin{cases}
x_o = a_1 + a_2 x_i + a_3 y_i + a_4 x_i^2 + a_5 x_i y_i + a_6 y_i^2 \\
y_o = b_1 + b_2 x_i + b_3 y_i + b_4 x_i^2 + b_5 x_i y_i + b_6 y_i^2.
\end{cases}
$$
Three second-order terms are added to each equation compared with the first-order transformation and at least six reference points are required to calculate the transformation coefficients.
To evaluate the accuracy of a transformation and determine the appropriate order of the transformation, root mean square (RMS) errors are used. For n reference points, the x RMS error, Rx, the y RMS error, Ry, and the total RMS error, Rt, are calculated as follows:
$$
R_x = \sqrt{\frac{1}{n}\sum_{j=1}^{n}\left(x_{rj} - x_{ij}\right)^2},\qquad
R_y = \sqrt{\frac{1}{n}\sum_{j=1}^{n}\left(y_{rj} - y_{ij}\right)^2},\qquad
R_t = \sqrt{R_x^2 + R_y^2},
$$
where $x_{ij}$ and $y_{ij}$ are the input coordinates for the jth point, and $x_{rj}$ and $y_{rj}$ are the retransformed coordinates for the jth point in the input image coordinate system. RMS errors indicate how good the alignment is between the source image and the reference image: the smaller the RMS errors, the better the alignment. An RMS error of 2 indicates that a retransformed pixel can be 2 pixels away from the input pixel.
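To make the fitting step concrete, the sketch below estimates the second-order coefficients from matched control points by least squares and reports the RMS residuals of the fit. The function name and the assumption that control points are supplied as (n, 2) arrays are illustrative; the original study performed this step in ERDAS Imagine.

```python
# A minimal sketch of fitting the second-order transformation by least squares.
# src and ref hold matching (x, y) control-point coordinates in the source
# (NIR) and reference (color) images; at least six points are required.
import numpy as np

def fit_second_order(src, ref):
    x, y = src[:, 0], src[:, 1]
    # Design matrix: [1, x, y, x^2, x*y, y^2] for each control point
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    a, *_ = np.linalg.lstsq(A, ref[:, 0], rcond=None)   # x-direction coefficients
    b, *_ = np.linalg.lstsq(A, ref[:, 1], rcond=None)   # y-direction coefficients
    rx = np.sqrt(np.mean((A @ a - ref[:, 0]) ** 2))     # x RMS residual
    ry = np.sqrt(np.mean((A @ b - ref[:, 1]) ** 2))     # y RMS residual
    return a, b, (rx, ry, np.hypot(rx, ry))             # coefficients and RMS errors
```

Note that these residuals are computed in the reference image's coordinate system, whereas the definition above reports retransformed errors in the input coordinate system; for near-identity transformations such as these the magnitudes are comparable.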
After the coordinate transformation, pixels are moved to new locations, and values need to be assigned to them. A resampling procedure is therefore needed to assign pixel values in the aligned image. Although several resampling methods are available, the nearest neighbor algorithm is most appropriate for this purpose because it simply assigns the value of the closest pixel in the input image to the new pixel and does not change the original data values. Most image processing software packages can perform image alignment; in this study, ERDAS Imagine 2013 (Intergraph Corporation, Madison, AL, USA) was used.
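The sketch below illustrates nearest-neighbor resampling under the fitted transformation. It assumes the polynomial has been fitted in the reverse direction (reference to source, e.g., fit_second_order(ref, src) from the previous sketch) so that each output pixel can look up its source pixel directly, which is the usual way warping is implemented.

```python
# A sketch of nearest-neighbor resampling: every output (aligned) pixel is
# mapped back to source coordinates with the reverse polynomial, rounded to
# the nearest source pixel, and copied without changing the original values.
import numpy as np

def warp_nearest(src_img, a, b, out_shape):
    ys, xs = np.mgrid[0:out_shape[0], 0:out_shape[1]]
    x, y = xs.ravel().astype(float), ys.ravel().astype(float)
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    sx = np.rint(A @ a).astype(int)                     # nearest source column
    sy = np.rint(A @ b).astype(int)                     # nearest source row
    inside = (sx >= 0) & (sx < src_img.shape[1]) & (sy >= 0) & (sy < src_img.shape[0])
    out = np.zeros(out_shape, dtype=src_img.dtype)
    out.reshape(-1)[inside] = src_img[sy[inside], sx[inside]]
    return out
```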

2.4. Ground Coverage and Pixel Size

The ground coverage of the cameras was determined by the following formulas:
$$
\begin{cases}
G_L = \dfrac{L}{F} H = 1.8H \\[4pt]
G_W = \dfrac{W}{F} H = 1.2H,
\end{cases}
$$
where $G_L$ is the length of the ground coverage (m), $G_W$ is the width of the ground coverage (m), $L$ is the length of the sensor (36 mm), $W$ is the width of the sensor (24 mm), $F$ is the focal length of the lens (20 mm), and $H$ is the flight height above ground level (AGL) (m).
The pixel size of the fine resolution image (5616 × 3744 pixels) was determined by the following formula:
$$
P = \frac{1.8}{5616} H = 3.205 \times 10^{-4} H,
$$
where $P$ is the ground pixel size (m). Table 2 gives the ground coverage and pixel size of the multispectral imaging system at commonly-used flight heights from 305 to 3048 m (1000 to 10,000 ft). When flight height increases from 305 m (1000 ft) to 3048 m (10,000 ft), pixel size increases from 0.1 to 1.0 m (0.3 to 3.2 ft) and ground coverage increases from 549 m × 366 m (1800 ft × 1200 ft) to 5486 m × 3658 m (18,000 ft × 12,000 ft). Essentially, for every 1000 ft of change in flight height, pixel size changes by 0.1 m. Since flight height is normally adjusted in 152 m (500 ft) or 305 m (1000 ft) increments in the U.S., the table can be used as a quick reference to determine the appropriate flight height from pixel size or ground coverage requirements.
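These formulas are easy to wrap in a small helper for mission planning; the sketch below (function name illustrative) reproduces the numbers in Table 2 for the 36 × 24 mm sensor, 20 mm lens, and 5616-pixel-wide array used here.

```python
# A small helper reproducing the ground coverage and pixel size formulas.
def ground_footprint(height_m, sensor_l=36.0, sensor_w=24.0, focal=20.0, pixels=5616):
    gl = sensor_l / focal * height_m        # ground length (m): (L/F) * H = 1.8 H
    gw = sensor_w / focal * height_m        # ground width  (m): (W/F) * H = 1.2 H
    return gl, gw, gl / pixels              # pixel size (m): 1.8 H / 5616

print(ground_footprint(305))                # -> (549.0, 366.0, ~0.098) at 1000 ft AGL
```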

3. Image Acquisition, Visualization and Preprocessing

The two-camera multispectral imaging system was installed in a Cessna 206 single-engine aircraft via a camera port for airborne image acquisition testing (Figure 2). Airborne images were taken under sunny conditions, at altitudes of 305 m (1000 ft) to 3050 m (10,000 ft) with a ground speed of approximately 200 km/h (124 mph), over diverse target areas with a wide range of reflectance. The camera parameters (i.e., exposure time and aperture) were adjusted over multiple flights to arrive at settings that would be appropriate for different applications without further changes.
Both the RAW and JPEG images stored on the CF cards can be readily viewed with the image software provided with the camera, while the JPEG images can be viewed with most image viewers commonly installed on a computer, such as Windows Photo Viewer. The camera-provided software can also be used to correct the vignetting effect in the RAW image and to export it to TIFF and other formats for further processing. Vignetting is a reduction in image brightness at the periphery compared with the image center. Images from this Canon camera and lens combination have minimal vignetting when taken at higher altitudes (>610 m). Nevertheless, this effect should be corrected before the RAW image is exported from the software.
Since the GPS information was included in the metadata of the RAW and JPEG images, Picasa 3.9 (Google Inc., Mountain View, CA, USA) was used to create a KMZ file from all the JPEG images so that they could be viewed in Google Earth 7.1 (Google Inc., Mountain View, CA, USA). Figure 3 shows some geotagged images plotted in Google Earth. These images were acquired at 610 m (2000 ft) AGL near Hargill, Texas on 14 May 2012. From the plot, the user can quickly identify individual images for particular fields and study sites, or a group of continuous images that can be stitched together to cover a larger geographic area for quick assessment of crop conditions.
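As a hedged alternative to Picasa's workflow, the sketch below reads each JPEG's EXIF GPS tags with Pillow and writes a bare-bones KML file of placemarks that Google Earth can open; the folder name is a placeholder.

```python
# Plotting geotagged JPEGs: read EXIF GPS with Pillow and write a minimal KML
# of placemarks. DMS values become decimal degrees, with negative signs for
# the southern and western hemispheres.
import glob
from PIL import Image
from PIL.ExifTags import GPSTAGS

def to_decimal(dms, ref):
    deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    return -deg if ref in ("S", "W") else deg

placemarks = []
for path in sorted(glob.glob("flight_images/*.jpg")):   # placeholder folder
    exif = Image.open(path)._getexif() or {}
    gps = {GPSTAGS.get(k, k): v for k, v in exif.get(34853, {}).items()}  # 34853 = GPSInfo
    if "GPSLatitude" in gps:
        lat = to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"])
        lon = to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"])
        placemarks.append(f"<Placemark><name>{path}</name>"
                          f"<Point><coordinates>{lon},{lat}</coordinates></Point></Placemark>")

with open("flight_images.kml", "w") as f:
    f.write('<?xml version="1.0"?><kml xmlns="http://www.opengis.net/kml/2.2">'
            "<Document>" + "".join(placemarks) + "</Document></kml>")
```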
Figure 4 shows a normal color image and a color-infrared (CIR) image for an area with diverse ground cover types, including citrus orchards, crops, brush, forage land, and residential areas. The color image is the original true-color image captured by the color camera, while the CIR image was created by stacking the aligned NIR image from the NIR camera with the red and green bands of the color image. The color and NIR images were acquired at 305 m (1000 ft) AGL on 17 May 2012 near Hargill, Texas. The different color shades along the edges of the CIR image are due to the misalignment between the NIR and color images. These edge areas do not overlap with the color image and should not be used for image analysis. Figure 5 shows the histograms of the four bands for the images shown in Figure 4. The histograms are well spread within the dynamic range without saturation.
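The CIR composite follows the standard band assignment (NIR displayed as red, red as green, green as blue); a minimal sketch, assuming nir_aligned and color_img are already-loaded arrays:

```python
# Building the color-infrared (CIR) composite from the aligned NIR band and
# the color image's red and green bands (standard CIR band assignment).
import numpy as np

cir = np.dstack([nir_aligned,          # NIR   -> displayed as red
                 color_img[:, :, 0],   # red   -> displayed as green
                 color_img[:, :, 1]])  # green -> displayed as blue
```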
Table 3 gives the RMS errors for registering the NIR image to the normal color image shown in Figure 4 using the first- and second-order transformations based on 4 to 10 control points. The alignment errors for two other pairs of images acquired at 610 and 1219 m (2000 and 4000 ft) AGL are also given in Table 3. These two pairs of images contained the imaging area shown in Figure 4, but covered 4 and 16 times as large an area since they were captured two and four times, respectively, as high above ground as the images shown in Figure 4.
The RMS errors are much higher for the first-order transformation (5.1–7.9 pixels) than for the second-order transformation (0.1–0.7 pixels) at all three flight heights, indicating that the second-order transformation is more appropriate for aligning the color and NIR images. The results in Table 3 also indicate that the number of control points does not greatly affect the RMS errors, although the errors tend to stabilize as the number of control points increases. It appears that nine control points evenly distributed within each image are sufficient for image-to-image alignment based on the second-order transformation. It should be noted that an RMS error of 0.5 pixels translates to ground errors of 0.05, 0.10, and 0.20 m for the 305-, 610-, and 1219-m AGL images, respectively, as their respective pixel sizes are 0.1, 0.2, and 0.4 m. These errors are well within acceptable limits for most agricultural applications.
For some applications, color images alone may be sufficient, so there is no need to align the color and NIR images; for others, both are required. Moreover, the aligned image may need to be georeferenced or rectified to a coordinate system (e.g., the universal transverse Mercator (UTM)) using a set of GPS ground control points collected from the imaging area, or using a georeferenced image or map of the imaging area. For temporal studies and some other applications, the image needs to be radiometrically calibrated to reflectance by empirical line correction using at least two calibration tarpaulins with low and high reflectance values. Four tarpaulins with nominal reflectance values of 4%, 16%, 32%, and 48% are usually placed within the imaging area for most of our studies. The raw digital counts can then be converted to reflectance values based on the regression equations between the actual reflectance measured from the tarpaulins and the digital counts extracted from the tarpaulins on each band image. Consumer-grade cameras can be used for scientific data acquisition if the image maintains a linear relationship to scene radiance [26], or if camera parameters (i.e., shutter speed, f-stop, and ISO sensitivity) are used in conjunction with a camera calibration procedure to estimate scene radiance [19].
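A minimal sketch of the empirical line correction for one band, assuming mean digital counts have already been extracted from the four tarpaulins (the count values below are made up for illustration):

```python
# Empirical line correction: regress tarp reflectance on tarp digital counts,
# then apply the fitted line to convert a whole band image to reflectance.
import numpy as np

tarp_reflectance = np.array([0.04, 0.16, 0.32, 0.48])        # nominal tarp values
tarp_counts = np.array([2300.0, 5900.0, 10600.0, 15200.0])   # hypothetical band means

gain, offset = np.polyfit(tarp_counts, tarp_reflectance, 1)  # linear fit
band_reflectance = gain * band_counts + offset               # band_counts: raw band array
```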

4. Application Examples

To illustrate the potential use of the multispectral imaging system for agricultural applications, this section presents four practical examples of the use of the system to estimate cotton canopy cover, detect cotton root rot, and map the distribution of two invasive weeds, henbit and giant reed. The two cameras were set to the optimized settings, and images were captured at predetermined altitudes and then preprocessed using the procedures described above. Table 4 presents the geographic coordinates, flight heights, and RMS errors for aligning the color and NIR images acquired from the four study sites. The total RMS errors range from 4.3 to 6.7 pixels for the first-order transformation and from 0.4 to 0.6 pixels for the second-order transformation; therefore, the second-order transformation was used for image alignment. These examples represent some of our ongoing studies using this imaging system.

4.1. Estimating Cotton Canopy Cover

Accurate and timely detection of volunteer and regrowth cotton plants is important for advancing boll weevil eradication in south Texas and reducing the risk of reinfestation. One of the objectives was to evaluate whether the imaging system could be used to estimate cotton plant width or canopy cover at early growing stages. Figure 6 shows the normal color and CIR composite images acquired from a cotton field at 305 m (1000 ft) AGL near Hargill, Texas on 17 May 2012. The crop was predominantly at the third-grown square stage. Cotton rows are difficult to see in the full-size image at this figure size, so a close-up area of 400 × 400 pixels (40 m × 40 m) was extracted for illustration and image processing.
To estimate cotton canopy cover, the four-band image was classified into two spectral classes using ISODATA (Iterative Self-Organizing Data Analysis) unsupervised classification in ERDAS Imagine [27]. One spectral class represents cotton plants and the other bare soil. Figure 7 shows the classification map for the small area based on the ISODATA classification. Average canopy cover was estimated to be 43.7%, equivalent to a row canopy width of 42 cm, which is very similar to the measured plant width (38 cm) based on ten random measurements in the field. The imaging system is being evaluated for identifying volunteer and regrowth cotton at lower altitudes of 305–610 m (1000–2000 ft) AGL and for distinguishing cotton fields at higher altitudes of 1524–3048 m (5000–10,000 ft) AGL in the early growing season.
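As a hedged illustration of this step (ERDAS ISODATA is proprietary, so k-means, a close relative, stands in here), the sketch below splits the four-band pixels into two spectral classes and takes the class with the higher NIR mean as cotton; four_band is assumed to be the 400 × 400 × 4 subset.

```python
# Two-class unsupervised classification of the four-band subset and canopy
# cover estimation. k-means is used as a stand-in for ERDAS ISODATA.
import numpy as np
from sklearn.cluster import KMeans

pixels = four_band.reshape(-1, 4)                  # (160000, 4) pixel spectra
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)

nir = pixels[:, 3]                                 # band 4 = NIR
cotton = int(nir[labels == 1].mean() > nir[labels == 0].mean())  # vegetation is bright in NIR
canopy_cover = 100.0 * np.mean(labels == cotton)
print(f"Estimated canopy cover: {canopy_cover:.1f}%")
```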

4.2. Mapping Cotton Root Rot

Cotton root rot, caused by the soilborne fungus Phymatotrichopsis omnivora, is a major disease affecting cotton production in the southwestern and south central U.S. Generally, only portions of a field are infected, so it is important to delineate the infected areas and understand the seasonal spread of the disease within fields so that variable rate technology can be used to apply fungicide only to the infected areas for more effective and economical control. Aerial imagery has proven to be an accurate and effective means of mapping root rot infection within fields [28].
Figure 8 shows the color and CIR images from a cotton root rot-infected area near San Angelo, Texas on 18 September 2012. The images were acquired at 1524 m (5000 ft) AGL with a ground pixel size of 0.5 m. The cotton fields in the images were severely infected with cotton root rot. In the color image, healthy cotton plants have a dark green color, whereas infected plants have a grayish color similar to bare soil. In the CIR image, healthy plants have a reddish-magenta tone, while infected plants have a cyan or greenish color. In fact, the images were taken two weeks before harvest, and most of the infected plants were dead or severely stressed. The images can be used to quantify infected areas within the fields for the management and control of the disease, and temporal images acquired within and across growing seasons can be used to monitor its progression. Both unsupervised and supervised classification methods have been used to delineate root rot-infected areas within fields in our previous studies [28]. The two-camera imaging system is being used to monitor the seasonal progression of cotton root rot within fungicide-treated and nontreated cotton fields in central and south Texas and to evaluate the performance and efficacy of different fungicide control methods.

4.3. Mapping Henbit

Henbit (Lamium amplexicaule L.), also known as henbit deadnettle, is a winter annual weed that grows either sprawling or upright to a height of about 15 to 38 cm (6 to 15 in). Henbit is considered an invasive weed in field crops and lawns in much of Texas and may displace desirable vegetation if not properly managed. Figure 9 shows the normal color and CIR images acquired at 457 m (1500 ft) AGL from a fallow field infested with henbit near College Station, Texas on 13 February 2013. Since only portions of the field were infested, it is more appropriate to apply herbicide over only infested areas instead of the entire field.
Although the field is small and was infested only with henbit, there was great variability in both soil color and weed intensity. To accurately delineate the infested areas within the field, five classes of ground cover were identified: dense weed, sparse weed, weed on dark soil, dark soil, and light soil. For supervised classification, training pixels were selected from each cover type on the four-band image to generate class signatures. A field boundary was also defined to exclude the areas outside the boundary from image classification. Although many supervised classification methods could have been used, the commonly-used maximum likelihood classifier was selected. The maximum likelihood classifier calculates the probability that a given pixel belongs to each class and assigns the pixel to the class with the highest probability [29].
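A minimal sketch of such a Gaussian maximum likelihood classifier follows, assuming each training set is an (n_i, 4) array of pixel spectra for one of the five classes; this is a generic implementation, not the ERDAS routine used in the study.

```python
# Gaussian maximum likelihood classification: fit a multivariate normal to each
# class's training pixels, then assign every pixel to the class with the
# highest log-likelihood.
import numpy as np
from scipy.stats import multivariate_normal

def max_likelihood_classify(image, training_sets):
    """image: (H, W, bands) array; training_sets: list of (n_i, bands) arrays."""
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    scores = np.column_stack([
        multivariate_normal(t.mean(axis=0), np.cov(t, rowvar=False)).logpdf(pixels)
        for t in training_sets])
    return scores.argmax(axis=1).reshape(image.shape[:2])   # class index map
```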
Figure 10 presents the five-class classification map and the merged two-zone map. A visual comparison of the classification maps with the color and CIR images indicates that the five cover types were generally well separated on the classification map. The two-zone map was used as a prescription map for site-specific aerial glyphosate application. Based on the two-zone map, approximately 21% of the field was infested with henbit. More work is being conducted on the use of this imaging system for mapping weed infestations and crop growth conditions for site-specific aerial chemical application.

4.4. Mapping Giant Reed

Giant reed (Arundo donax L.) is a bamboo-like perennial grass that grows 3 to 10 m tall and spreads from horizontal rootstocks below the soil to form large colonies. It typically grows in riparian areas and floodplains and can be found on wet stream banks, gravel bars, or dry banks away from permanent water. Giant reed is an invasive weed throughout the southern half of the United States and northern Mexico with the densest stands growing along the coastal rivers of southern California and the Rio Grande in Texas. Remote sensing has been successfully used to quantify its distribution and estimate total infested area along the Texas-Mexico portion of the Rio Grande for the control of this invasive weed [30].
Figure 11 shows the normal color and CIR images from a giant reed-infested area along the Rio Grande near Del Rio, Texas. The images were taken at an altitude of 2438 m (8000 ft) AGL on 15 November 2013. Giant reed shows a lime green tone in the normal color image and a bright red color in the color-infrared image. Several supervised classification methods have been used to distinguish giant reed from other plant species and cover types in our previous studies [30]. Compared with the four-camera system used previously [8], the two-camera system covers a ground area of 5.5 km by 3.7 km (3.4 miles by 2.3 miles) at 3048 m (10,000 ft) AGL, which is 5.4 times the area the four-camera system covers at the same altitude. This greatly increases efficiency for large-area and regional mapping. The new imaging system is also being used to evaluate the effect of biological control agents on this invasive weed.

5. Error and Uncertainty Analysis of the Imaging System

Many factors can affect the performance and accuracy of the consumer-grade camera-based imaging system. First, the cameras employ a Bayer filter to capture the three band images with one CMOS sensor. On average, two-thirds of the pixel values are interpolated from the one-third of pixel values actually measured [14]. This process introduces some error and a smoothing effect on the image, especially in areas with rapid color transitions. Although the NIR-converted camera only captures NIR light, the Bayer filter remains, and only the NIR values captured by the red channel are used due to its higher sensitivity [20]. In contrast, a conventional four-camera system designates one camera for each band to capture non-interpolated digital values for each pixel. Although several studies have shown that images from single true color cameras and/or NIR-converted cameras can be used for agricultural and plant science applications, little research has been conducted to evaluate the effect of the Bayer filter on image quality and how this effect translates to the estimation of biophysical parameters (e.g., crop yield) and image classification results (e.g., differentiating crop from weed species). Therefore, research is needed to compare these two types of imaging systems using images captured simultaneously with both.
Like any multi-camera imaging system, the two-camera system is subject to misalignment between the two cameras, which introduces another source of error. To minimize the mismatch of the imaging areas between the two cameras, the camera mount was manufactured precisely and extra care was taken to make sure that the cameras were mounted parallel and in the same plane (see Figure 2) before each aerial mission. Under ideal conditions, the two images would differ by a simple shift equal to the center distance between the two cameras. In reality, due to mounting errors as well as manufacturing differences between the two sets of cameras and lenses, a first-order or second-order transformation is needed to align the two images, as demonstrated in Section 2.3. This mathematical alignment method resulted in subpixel alignment for the examples presented in this paper.
Another error or uncertainty related to image misalignment is the reduction of the effective imaging area shared by the two cameras. The non-overlapped areas between the NIR and normal color images can be seen in all the example CIR images above. Table 5 presents the overlapped area, in number of pixels and as a percentage, and the mismatch error as a percentage for the images in the four examples. The overlapped area was estimated from the four corner coordinates of the overlapped imaging area for each four-band image. The overlapped area ranged from 96.86% for the cotton canopy cover study to 98.54% for the cotton root rot study. Thus, the mismatch error, or non-overlapped area between the NIR and normal color images, ranged from 1.46% to 3.14% for the four examples. These are relatively small errors considering that each image covers a significantly larger area than each study area. In practice, if images need to be taken over a single field or study site, the imaging area should be centered on the field with sufficient surrounding area included. If images need to be taken over a large area, there should be sufficient overlap (30%–40%) between consecutive images in the same flight line and in neighboring flight lines to ensure complete coverage for regional mapping.
Once the NIR and normal color images are aligned and combined into a four-band image, the multispectral image can be rectified to a coordinate system and converted to reflectance like any other image. All other errors related to image processing and classification analysis then apply. Georeferencing error depends largely on the reference image or GPS ground control points used: the positional accuracy of the rectified image will be no better than that of the reference image or the GPS receiver. For example, if an image with 0.1-m pixel size is georeferenced with a GPS receiver with 1-m accuracy, the georeferenced image will have a positional error greater than 1 m. Therefore, a more accurate reference image or more accurate GPS data should be used to georeference the image. Image analysis can take different forms depending on the particular application: an image can be used to estimate crop biophysical parameters such as yield, or to detect a crop disease or a weed. The four examples presented in this paper illustrate some potential applications of the system in agriculture. These results are preliminary, and more thorough analysis and accuracy assessment for each study are needed to evaluate the performance of the imaging system.

6. Conclusions

An airborne multispectral imaging system based on a consumer-grade color camera and a NIR conversion of a second identical color camera was designed, assembled, and evaluated in this study. The system can acquire 14-bit normal color and NIR images with 5616 × 3744 pixels. Images can be captured at altitudes of 305–3050 m (1000–10,000 ft) to achieve pixel sizes from 0.1 to 1.0 m and ground coverage from 549 m × 366 m (1800 ft × 1200 ft) to 5486 m × 3658 m (18,000 ft × 12,000 ft). The major camera parameters, including ISO speed, exposure time and aperture, were optimized to acquire images under diverse ground cover types without the need to change settings for different flight missions across the whole season. The optimal ISO speed and exposure time were 200 and 1/500 s, respectively, for both cameras and the optimal aperture was f/10 for the normal color camera and f/14 for the NIR camera.
The second-order transformation was found to be effective for aligning the normal color and NIR images. The RMS errors between the two images acquired over a site with diverse ground cover types at three different altitudes, and for the four example studies, were all within one pixel (0.4–0.7 pixels) based on seven to nine reference points with the second-order transformation. The overlapped area between the two images ranged from 96.9% to 98.5% across the four example studies. Compared with conventional four-camera systems, the two-camera system is less expensive (about $6000) and can be easily assembled from off-the-shelf components. The methods and techniques presented in this study can be used by end users to develop similar imaging systems, as consumer-grade cameras are becoming a more affordable and attractive remote sensing tool.
Airborne testing and evaluations of the system over a two-year period showed that the two-camera imaging system performed reliably. Geotagged images from the system can be readily viewed in common image viewers and in Google Earth. Images acquired for four ongoing studies have demonstrated that the imaging system has the potential for monitoring crop growing conditions, detecting crop diseases, and mapping invasive weeds on cropland and wetland ecosystems. More research is needed to compare the quality of images between the consumer-grade two-camera system and conventional four-camera systems and to evaluate the imaging system for other remote sensing applications.

Acknowledgments

The author wishes to thank Fred Gomez and Lee Denham of USDA-ARS in College Station, Texas for test-flying the system and acquiring the example images for this study.

Conflicts of Interest

The authors declare no conflict of interest.
Author Contributions

Chenghai Yang designed, assembled, tested and evaluated the imaging system, analyzed the sample images and wrote the manuscript. The coauthors participated in the evaluation of the system for different applications. John Westbrook and Charles Suh were involved in the evaluation of the system for estimating cotton canopy cover. Daniel Martin, Clint Hoffmann, Yubin Lan and Bradley Fritz were involved in the evaluation of the system for mapping henbit infestations for aerial application. John Goolsby was involved in the evaluation of the system for mapping giant reed infestations.

Disclaimer

Mention of trade names or commercial products in this publication is solely for the purpose of providing specific information and does not imply recommendation or endorsement by the U.S. Department of Agriculture.

References

  1. Everitt, J.H.; Escobar, D.E.; Alaniz, M.A.; Davis, M.R. Using multispectral video imagery for detecting soil surface conditions. Photogramm. Eng. Remote Sens 1989, 55, 467–471. [Google Scholar]
  2. Mausel, P.W.; Everitt, J.H.; Escobar, D.E.; King, D.J. Airborne videography: Current status and future perspectives. Photogramm. Eng. Remote Sens 1992, 58, 1189–1195. [Google Scholar]
  3. Pearson, R.; Mao, C.; Grace, J. Real-time airborne agricultural monitoring. Remote Sens. Environ 1994, 49, 304–310. [Google Scholar]
  4. Pinter, P.J., Jr.; Hatfield, J.L.; Schepers, J.S.; Barnes, E.M.; Moran, M.S.; Daughtry, C.S.T.; Upchurch, D.R. Remote sensing for crop management. Photogramm. Eng. Remote Sens 2003, 69, 647–664. [Google Scholar]
  5. Backoulou, G.F.; Elliott, N.C.; Giles, K.; Phoofolo, M.; Catana, V. Development of a method using multispectral imagery and spatial pattern metrics to quantify stress to wheat fields caused by Diuraphis noxia. Comput. Electron. Agric 2011, 75, 64–70. [Google Scholar]
  6. Escobar, D.E.; Everitt, J.H.; Noriega, J.R.; Cavazos, I.; Davis, M.R. A twelve-band airborne digital video imaging system (ADVIS). Remote Sens. Environ 1998, 66, 122–128. [Google Scholar]
  7. Gorsevski, P.V.; Gessler, P.E. The design and the development of a hyperspectral and multispectral airborne mapping system. ISPRS J. Photogramm. Remote Sens 2009, 64, 184–192. [Google Scholar]
  8. Yang, C. A high resolution airborne four-camera imaging system for agricultural applications. Comput. Electron. Agric 2012, 88, 13–24. [Google Scholar]
  9. Bausch, W.C.; Khosla, R. QuickBird satellite versus ground-based multi-spectral data for estimating nitrogen status of irrigated maize. Precis. Agric 2010, 11, 274–290. [Google Scholar]
  10. McCarthy, M.J.; Halls, J.N. Habitat mapping and change assessment of coastal environments: An examination of WorldView-2, QuickBird, and IKONOS satellite imagery and airborne LiDAR for mapping barrier island habitats. ISPRS Int. J. Geo-Inf 2014, 3, 297–325. [Google Scholar]
  11. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens 2010, 2, 290–305. [Google Scholar]
  12. Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral remote sensing from unmanned aircraft: Image processing workflows and applications for rangeland environments. Remote Sens 2011, 3, 2529–2551. [Google Scholar]
  13. Watts, A.C.; Ambrosia, V.G.; Hinkley, E.A. Unmanned aircraft systems in remote sensing and scientific research: Classification and considerations of use. Remote Sens 2012, 4, 1671–1692. [Google Scholar]
  14. Bayer, B.E. Color Imaging Array. US Patent 3,971,065, 20 July 1976. [Google Scholar]
  15. Hirakawa, K.; Wolfe, P.J. Spatio-Spectral Sampling and Color Filter Array Design. In Single-Sensor Imaging: Methods and Applications for Digital Cameras; Lukac, R., Ed.; CRC Press: Boca Raton, FL, USA, 2008; pp. 137–151. [Google Scholar]
  16. Levin, N.; Ben-Dor, E.; Singer, A. A digital camera as a tool to measure colour indices and related properties of sandy soils in semi-arid environments. Int. J. Remote Sens 2005, 26, 5475–5492. [Google Scholar]
  17. Lebourgeois, V.; Bégué, A.; Labbé, S.; Mallavan, B.; Prévot, L.; Roux, B. Can commercial digital cameras be used as multispectral sensors?—A crop monitoring test. Sensors 2008, 8, 7300–7322. [Google Scholar]
  18. Hardin, P.J.; Jensen, R.R. Small-scale unmanned aerial vehicles in environmental remote sensing: Challenges and opportunities. GISci. Remote Sens 2011, 48, 99–111. [Google Scholar]
  19. Sakamoto, T.; Gitelson, A.A.; Nguy-Robertson, A.L.; Arkebauer, T.J.; Wardlow, B.D.; Suyker, A.E.; Verma, S.B. An alternative method using digital cameras for continuous monitoring of crop status. Agric. For. Meteorol 2012, 154–155, 113–126. [Google Scholar]
  20. Nijland, W.; de Jong, R.; de Jong, S.M.; Wulder, M.A.; Bater, C.W.; Coops, N.C. Monitoring plant condition and phenology using infrared sensitive consumer grade digital cameras. Agric. For. Meteorol 2014, 184, 98–106. [Google Scholar]
  21. Rabatel, G.; Gorretta, N.; Labbé, N. Getting simultaneous red and near-infrared band data from a single digital camera for plant monitoring applications: Theoretical and practical study. Biosyst. Eng 2014, 117, 2–14. [Google Scholar] [Green Version]
  22. Yang, C.; Everitt, J.H.; Bradford, J.M. Airborne hyperspectral imagery and yield monitor data for estimating grain sorghum yield variability. Trans. ASAE 2004, 47, 915–924. [Google Scholar]
  23. Yang, C.; Everitt, J.H.; Bradford, J.M. Airborne hyperspectral imagery and linear spectral unmixing for mapping variation in crop yield. Precis. Agric 2007, 8, 279–296. [Google Scholar]
  24. Yang, C.; Goolsby, J.A.; Everitt, J.H.; Du, Q. Applying six classifiers to airborne hyperspectral imagery for detecting giant reed. Geocarto Int 2012, 27, 413–424. [Google Scholar]
  25. Kumar, A.; Lee, W.S.; Ehsani, M.R.; Albrigo, L.G.; Yang, C.; Mangan, R.L. Citrus greening disease detection using aerial hyperspectral and multispectral imaging techniques. J. Appl. Remote Sens 2012, 6, 063542. [Google Scholar]
  26. Akkaynak, D.; Treibitz, T.; Xiao, B.; Gürkan, U.A.; Allen, J.J.; Demirci, U.; Hanlon, R.T. Use of commercial off-the-shelf digital cameras for scientific data acquisition and scene-specific color calibration. J. Opt. Soc. Am 2014, 31, 312–321. [Google Scholar]
  27. Intergraph Corporation, ERDAS Field Guide; Intergraph Corporation: Huntsville, AL, USA, 2013.
  28. Yang, C.; Odvody, G.N.; Fernandez, C.J.; Landivar, J.A.; Minzenmayer, R.R.; Nichols, R.L.; Thomasson, J.A. Monitoring cotton root rot progression within a growing season using airborne multispectral imagery. J. Cotton Sci 2014, 18, 85–93. [Google Scholar]
  29. Richards, J.A. Remote Sensing Digital Image Analysis; Springer-Verlag: Berlin, Germany, 1999. [Google Scholar]
  30. Yang, C.; Everitt, J.H.; Goolsby, J.A. Using aerial photography for mapping giant reed infestations along the Texas-Mexico portion of the Rio Grande. Invasive Plant Sci. Manag 2011, 4, 402–410. [Google Scholar]
Figure 1. A two-camera multispectral imaging system mounted on an aluminum rack.
Figure 2. A two-camera imaging system installed over a camera port in a Cessna 206 aircraft.
Figure 3. Geotagged images plotted in Google Earth. The images were acquired at 610 m (2000 ft) above ground level near Hargill, Texas on 14 May 2012.
Figure 4. Normal color (left) and color-infrared (right) images acquired at 305 m (1000 ft) above ground level on 17 May 2012 from an area with diverse ground cover types in South Texas.
Figure 5. Histograms of the four bands for the images shown in Figure 4.
Figure 6. Normal color (left) and color-infrared (right) images acquired at 305 m (1000 ft) above ground on 17 May 2012 from cotton fields near Hargill, Texas. The images for a close-up area with a 40 m by 40 m square extracted from the full-size image are also shown.
Figure 7. Classification map of a four-band image for a 40 m × 40 m area within a cotton field based on unsupervised classification. Red represents cotton plants, while white depicts soil background.
Figure 8. Normal color (left) and color-infrared (right) images acquired at 1524 m (5000 ft) AGL from a cotton root rot-infected area near San Angelo, Texas on 18 September 2012.
Figure 9. Normal color (left) and color-infrared (right) images acquired at 457 m (1500 ft) above ground from a fallow field infested with henbit near College Station, Texas on 13 February 2013.
Figure 10. Five-zone classification map and merged two-zone prescription map derived from the images shown in Figure 9 based on maximum likelihood classification.
Figure 11. Normal color (left) and color-infrared (right) images acquired at 2438 m (8000 ft) above ground level on 15 November 2013 from a giant reed-infested area along the Rio Grande near Del Rio, Texas. Giant reed shows a lime green tone on the normal color image and a bright red color on the color-infrared image.
Table 1. Major specifications of a Canon EOS 5D Mark II camera.

Characteristic | Description
Camera type | Digital, single-lens reflex
Sensor type | CMOS
Sensing area | 36 × 24 mm
Pixel array | 2784 × 1856, 3861 × 2574 or 5616 × 3744
Image type | RAW (14-bit) + JPEG
Image size | 10.8 + 2.1 MB, 14.8 + 3.6 MB or 25.8 + 6.1 MB
Shooting speed | Max 3.9 shots/s
Display | 3-in TFT color LCD
Recording media | Type I or II CF card
ISO speed | 100–6400
Shutter speed | 1/8000 to 30 s
Dimensions | 152 × 113.5 × 75 mm
Weight | 810 g
Operating temperature | 0–40 °C
Table 2. Ground coverage and pixel size of a two-camera imaging system at different flight heights above ground level.

Flight Height (m) | Flight Height (ft) | Ground Coverage (m × m) | Ground Coverage (ft × ft) | Pixel Size (m) * | Pixel Size (ft) *
305 | 1000 | 549 × 366 | 1800 × 1200 | 0.10 | 0.32
610 | 2000 | 1097 × 732 | 3600 × 2400 | 0.20 | 0.64
914 | 3000 | 1646 × 1097 | 5400 × 3600 | 0.29 | 0.96
1219 | 4000 | 2195 × 1463 | 7200 × 4800 | 0.39 | 1.28
1524 | 5000 | 2743 × 1829 | 9000 × 6000 | 0.49 | 1.60
1829 | 6000 | 3292 × 2195 | 10,800 × 7200 | 0.59 | 1.92
2134 | 7000 | 3840 × 2560 | 12,600 × 8400 | 0.68 | 2.24
2438 | 8000 | 4389 × 2926 | 14,400 × 9600 | 0.78 | 2.56
2743 | 9000 | 4938 × 3292 | 16,200 × 10,800 | 0.88 | 2.88
3048 | 10,000 | 5486 × 3658 | 18,000 × 12,000 | 0.98 | 3.21

Note: * Pixel array set at 5616 × 3744 pixels.
Table 3. Root mean square errors in pixels for registering near-infrared band images to normal color images acquired at 305, 610 and 1219 m (1000, 2000 and 4000 ft) above ground level (AGL) near Hargill, Texas using the first- and second-order transformations based on 4 to 10 control points.

Flight Height AGL | No. of Points | First-Order x | First-Order y | First-Order Total | Second-Order x | Second-Order y | Second-Order Total
305 m (1000 ft) | 4 | 4.6 | 2.3 | 5.1 | – | – | –
305 m (1000 ft) | 5 | 5.3 | 2.5 | 5.9 | – | – | –
305 m (1000 ft) | 6 | 4.9 | 2.5 | 5.5 | – | – | –
305 m (1000 ft) | 7 | 4.9 | 2.4 | 5.4 | 0.3 | 0.2 | 0.4
305 m (1000 ft) | 8 | 4.6 | 2.3 | 5.1 | 0.5 | 0.4 | 0.6
305 m (1000 ft) | 9 | 4.4 | 2.2 | 4.9 | 0.5 | 0.4 | 0.6
305 m (1000 ft) | 10 | 4.2 | 2.1 | 4.7 | 0.5 | 0.4 | 0.6
610 m (2000 ft) | 4 | 3.9 | 6.5 | 7.6 | – | – | –
610 m (2000 ft) | 5 | 5.3 | 6.0 | 7.9 | – | – | –
610 m (2000 ft) | 6 | 4.9 | 6.0 | 7.8 | – | – | –
610 m (2000 ft) | 7 | 4.6 | 5.6 | 7.2 | 0.1 | 0.4 | 0.4
610 m (2000 ft) | 8 | 5.9 | 5.2 | 7.9 | 0.3 | 0.4 | 0.5
610 m (2000 ft) | 9 | 5.7 | 5.1 | 7.6 | 0.5 | 0.4 | 0.7
610 m (2000 ft) | 10 | 5.6 | 4.8 | 7.4 | 0.5 | 0.4 | 0.7
1219 m (4000 ft) | 4 | 5.8 | 3.1 | 6.6 | – | – | –
1219 m (4000 ft) | 5 | 5.6 | 2.9 | 6.3 | – | – | –
1219 m (4000 ft) | 6 | 5.2 | 2.8 | 5.9 | – | – | –
1219 m (4000 ft) | 7 | 4.9 | 2.8 | 5.6 | 0.3 | 0.4 | 0.5
1219 m (4000 ft) | 8 | 4.6 | 2.7 | 5.4 | 0.5 | 0.4 | 0.6
1219 m (4000 ft) | 9 | 4.6 | 2.8 | 5.4 | 0.5 | 0.3 | 0.6
1219 m (4000 ft) | 10 | 4.4 | 2.7 | 5.1 | 0.5 | 0.4 | 0.6
Table 4. Geographic coordinates, flight heights above ground level, and root mean square errors in pixels for registering near-infrared band images to normal color images acquired from four different study sites using the first- and second-order transformations based on nine control points.

Application Example | Longitude (°) | Latitude (°) | Flight Height (m) | Flight Height (ft) | First-Order x | First-Order y | First-Order Total | Second-Order x | Second-Order y | Second-Order Total
Estimating cotton width | −98.0474 | 26.4257 | 305 | 1000 | 5.5 | 3.0 | 6.3 | 0.3 | 0.5 | 0.6
Mapping root rot | −100.2948 | 31.3793 | 1524 | 5000 | 3.8 | 3.0 | 4.9 | 0.2 | 0.3 | 0.4
Mapping henbit | −96.4317 | 30.5369 | 457 | 1500 | 4.9 | 4.5 | 6.7 | 0.3 | 0.4 | 0.5
Mapping giant reed | −100.7716 | 29.1869 | 2438 | 8000 | 3.0 | 3.1 | 4.3 | 0.3 | 0.4 | 0.5
Table 5. Overlapped area and error for aligned near-infrared band images and normal color images for four different study sites.

Application Example | Overlapped Area (pixels) | Overlapped Area (%) | Error (%)
Estimating cotton width | 20,365,805 | 96.86 | 3.14
Mapping root rot | 20,718,507 | 98.54 | 1.46
Mapping henbit | 20,374,554 | 96.90 | 3.10
Mapping giant reed | 20,509,004 | 97.54 | 2.46

Notes: Overlapped Area (%) = overlap pixels/(5616 × 3744) × 100; Error (%) = 100 − Overlapped Area (%).
