Search Results (19)

Search Parameters:
Keywords = HyspIRI

40 pages, 3131 KiB  
Review
Recent Advances of Hyperspectral Imaging Technology and Applications in Agriculture
by Bing Lu, Phuong D. Dao, Jiangui Liu, Yuhong He and Jiali Shang
Remote Sens. 2020, 12(16), 2659; https://doi.org/10.3390/rs12162659 - 18 Aug 2020
Cited by 480 | Viewed by 34749
Abstract
Remote sensing is a useful tool for monitoring spatio-temporal variations of crop morphological and physiological status and supporting practices in precision farming. In comparison with multispectral imaging, hyperspectral imaging is a more advanced technique that is capable of acquiring a detailed spectral response of target features. Due to limited accessibility outside of the scientific community, hyperspectral images have not been widely used in precision agriculture. In recent years, different mini-sized and low-cost airborne hyperspectral sensors (e.g., Headwall Micro-Hyperspec, Cubert UHD 185-Firefly) have been developed, and advanced spaceborne hyperspectral sensors have also been or will be launched (e.g., PRISMA, DESIS, EnMAP, HyspIRI). Hyperspectral imaging is becoming more widely available to agricultural applications. Meanwhile, the acquisition, processing, and analysis of hyperspectral imagery still remain a challenging research topic (e.g., large data volume, high data dimensionality, and complex information analysis). It is hence beneficial to conduct a thorough and in-depth review of the hyperspectral imaging technology (e.g., different platforms and sensors), methods available for processing and analyzing hyperspectral information, and recent advances of hyperspectral imaging in agricultural applications. Publications over the past 30 years in hyperspectral imaging technology and applications in agriculture were thus reviewed. The imaging platforms and sensors, together with analytic methods used in the literature, were discussed. Performances of hyperspectral imaging for different applications (e.g., crop biophysical and biochemical properties’ mapping, soil characteristics, and crop classification) were also evaluated. This review is intended to assist agricultural researchers and practitioners to better understand the strengths and limitations of hyperspectral imaging to agricultural applications and promote the adoption of this valuable technology. Recommendations for future hyperspectral imaging research for precision agriculture are also presented. Full article
(This article belongs to the Special Issue Hyperspectral Remote Sensing of Agriculture and Vegetation)
Figures:
Figure 1: The number of publications that utilized hyperspectral imaging for agriculture applications (by May 2020).
Figure 2: Number of publications that used different hyperspectral imaging platforms over time.
Figure 3: Hyperspectral UAV systems used in previous agricultural studies; figures reproduced with permission from MDPI [119, 120, 121] and SPIE [122].
Figure 4: Close-range imaging platforms used in previous studies; figures reproduced with permission from ASPRS [139], SPIE [148], Elsevier [138, 149], and Springer Nature [144].
14 pages, 2026 KiB  
Article
The Utility of the Upcoming HyspIRI’s Simulated Spectral Settings in Detecting Maize Gray Leafy Spot in Relation to Sentinel-2 MSI, VENµS, and Landsat 8 OLI Sensors
by Mbulisi Sibanda, Onisimo Mutanga, Timothy Dube, John Odindi and Paramu L. Mafongoya
Agronomy 2019, 9(12), 846; https://doi.org/10.3390/agronomy9120846 - 4 Dec 2019
Cited by 7 | Viewed by 2770
Abstract
Considering the high maize yield losses caused by incidences of disease, as well as incomprehensive monitoring initiatives in crop farming, there is a need for spatially explicit, cost-effective, and consistent approaches for monitoring, as well as for forecasting, food-crop diseases such as maize Gray Leaf Spot. Such approaches are valuable in reducing the associated economic losses while fostering food security. In this study, we sought to investigate the utility of the forthcoming HyspIRI sensor in detecting disease progression of maize Gray Leaf Spot infestation in relation to the Sentinel-2 MSI, VENµS, and Landsat 8 OLI spectral configurations simulated using proximally sensed data. Healthy, intermediate, and severe categories of maize crop infection by the Gray Leaf Spot disease were discriminated using the partial least squares–discriminant analysis (PLS-DA) algorithm. Comparatively, the results show that HyspIRI’s simulated spectral settings performed slightly better than those of the Sentinel-2 MSI, VENµS, and Landsat 8 OLI sensors. HyspIRI exhibited an overall accuracy of 0.98, compared with 0.95, 0.93, and 0.89 for the Sentinel-2 MSI, VENµS, and Landsat 8 OLI sensors, respectively. Furthermore, the results showed that the visible, red-edge, and NIR regions covered by all four sensors were the most influential spectral regions for discriminating the different maize Gray Leaf Spot infections. These findings underscore the potential value of the upcoming hyperspectral HyspIRI sensor in precision agriculture and in forecasting crop-disease epidemics, which is necessary to ensure food security. Full article
(This article belongs to the Special Issue Remote Sensing of Agricultural Monitoring)
Figures:
Figure 1: The general location of the research farm in Pietermaritzburg, KwaZulu-Natal Province, South Africa.
Figure 2: Maize leaves that are (a) healthy, (b, c) intermediately infected, and (d) severely infected by Gray Leaf Spot disease.
Figure 3: Differences in mean reflectance of healthy, intermediately infected, and severely infected maize crops based on (a) HyspIRI, (b) Sentinel-2 MSI, (c) VENµS, and (d) Landsat OLI spectral settings.
Figure 4: PLS-DA classification accuracies of healthy, intermediately infected, and severely infected maize crops for the four spectral settings (OA, UA, and PA refer to overall, user, and producer accuracies, respectively).
Figure 5: Accuracies of healthy, intermediately infected, and severely infected maize crops for the four spectral settings, based on Pontius and Millones's omission, commission, and agreement allocations.
Figure 6: Variable importance in projection scores for the spectral bands of each sensor; black lines mark the optimal score threshold determined from the literature [34].
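The abstract above describes resampling proximally sensed spectra to each sensor's band configuration and then separating infection classes with PLS-DA. The snippet below is a minimal, hedged sketch of that two-step idea; it is not the authors' code, and the band centers, bandwidths, and synthetic spectra are placeholder assumptions.

```python
# Minimal sketch: resample 1 nm field spectra to broad sensor bands with
# Gaussian spectral response functions, then classify with PLS-DA.
# Band centres/widths and the synthetic spectra below are placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

wl = np.arange(400, 2501)                      # 1 nm field-spectrometer grid
rng = np.random.default_rng(0)
X_field = rng.random((90, wl.size))            # 90 synthetic maize spectra
y = np.repeat([0, 1, 2], 30)                   # healthy / intermediate / severe

def resample(spectra, centers, fwhms):
    """Convolve spectra with Gaussian spectral response functions."""
    bands = []
    for c, f in zip(centers, fwhms):
        sigma = f / 2.355
        srf = np.exp(-0.5 * ((wl - c) / sigma) ** 2)
        bands.append(spectra @ (srf / srf.sum()))
    return np.column_stack(bands)

centers = [490, 560, 665, 705, 740, 783, 842, 865, 1610, 2190]  # illustrative
fwhms   = [65, 35, 30, 15, 15, 20, 115, 20, 90, 180]
X = resample(X_field, centers, fwhms)

# PLS-DA: regress one-hot class membership on the bands, predict by argmax.
Y = np.eye(3)[y]
pls = PLSRegression(n_components=3).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)
print("training overall accuracy:", (pred == y).mean())
```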
5 pages, 1226 KiB  
Proceeding Paper
Experimental Tests on TIR Multispectral Images for Temperature-Emissivity Separation by Using the MaxEnTES Algorithm
by Vanni Nardino, Gabriele Amato, Donatella Guzzi, Cinzia Lastri and Valentina Raimondi
Proceedings 2019, 27(1), 10; https://doi.org/10.3390/proceedings2019027010 - 18 Sep 2019
Viewed by 998
Abstract
Satellite images in the TIR are relevant for several Earth Observation applications. The retrieval of temperature and emissivity from the emitted radiance, however, requires the use of suitable algorithms, such as the MaxEnTES that uses the concept of maximum entropy to solve the Temperature-Emissivity Separation problem. Here we discuss the performance of MaxEnTES when applied to TIR images with a limited number of channels, specifically simulated HyspIRI multispectral images and real multispectral images by ASTER. The results were respectively compared with the original temperatures used for the simulations and with the temperatures obtained by using the ASTER TES algorithm. Full article
Figures:
Figure 1: HyspIRI-like simulated image: (a) thematic map with 19 classes; (b) simulated ground-truth temperature image.
Figure 2: ASTER acquisition over the Fucino area, Abruzzo, Italy: (a) ASTER A09 product, ground-emitted radiance at 10.657 µm; (b) ASTER A08 product, temperature image.
Figure 3: MaxEnTES applied to simulated and real data: (a) histogram of the differences between MaxEnTES-retrieved temperatures and those used for the simulation; (b) scatterplot of ASTER A08 temperatures versus MaxEnTES-retrieved temperatures.
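Any temperature-emissivity separation (TES) scheme, including MaxEnTES, inverts the same forward model: each channel's emitted radiance equals emissivity times the Planck blackbody radiance at the surface temperature, leaving N+1 unknowns for N channels. The sketch below shows only this forward model, not the maximum-entropy retrieval itself; the channel wavelengths and emissivities are illustrative assumptions.

```python
# Sketch of the TIR forward model that any TES algorithm must invert:
# channel radiance = emissivity * Planck blackbody radiance at temperature T.
# (Atmospheric terms are omitted; this is not the MaxEnTES retrieval itself.)
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_um, temp_k):
    """Spectral radiance of a blackbody, W m^-2 sr^-1 um^-1."""
    lam = wavelength_um * 1e-6
    rad = (2 * H * C**2) / (lam**5 * (np.exp(H * C / (lam * KB * temp_k)) - 1.0))
    return rad * 1e-6  # per metre -> per micrometre

# Five TIR channels with illustrative (assumed) centres and emissivities.
channels_um = np.array([8.3, 8.65, 9.1, 10.6, 11.3])
emissivity = np.array([0.95, 0.96, 0.97, 0.98, 0.97])
surface_temp = 300.0  # K

emitted = emissivity * planck_radiance(channels_um, surface_temp)
print(emitted)
# A TES algorithm receives `emitted` (plus atmospheric effects) and must recover
# both `emissivity` and `surface_temp`: N equations, N+1 unknowns, which is why
# MaxEnTES adds a maximum-entropy constraint.
```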
23 pages, 2835 KiB  
Article
Comparison of Methods for Modeling Fractional Cover Using Simulated Satellite Hyperspectral Imager Spectra
by Philip E. Dennison, Yi Qi, Susan K. Meerdink, Raymond F. Kokaly, David R. Thompson, Craig S. T. Daughtry, Miguel Quemada, Dar A. Roberts, Paul D. Gader, Erin B. Wetherley, Izaya Numata and Keely L. Roth
Remote Sens. 2019, 11(18), 2072; https://doi.org/10.3390/rs11182072 - 4 Sep 2019
Cited by 33 | Viewed by 4978
Abstract
Remotely sensed data can be used to model the fractional cover of green vegetation (GV), non-photosynthetic vegetation (NPV), and soil in natural and agricultural ecosystems. NPV and soil cover are difficult to estimate accurately since absorption by lignin, cellulose, and other organic molecules cannot be resolved by broadband multispectral data. A new generation of satellite hyperspectral imagers will provide contiguous narrowband coverage, enabling new, more accurate, and potentially global fractional cover products. We used six field spectroscopy datasets collected in prior experiments from sites with partial crop, grass, shrub, and low-stature resprouting tree cover to simulate satellite hyperspectral data, including sensor noise and atmospheric correction artifacts. The combined dataset was used to compare hyperspectral index-based and spectroscopic methods for estimating GV, NPV, and soil fractional cover. GV fractional cover was estimated most accurately. NPV and soil fractions were more difficult to estimate, with spectroscopic methods like partial least squares (PLS) regression, spectral feature analysis (SFA), and multiple endmember spectral mixture analysis (MESMA) typically outperforming hyperspectral indices. Using an independent validation dataset, the lowest root mean squared error (RMSE) values were 0.115 for GV using either normalized difference vegetation index (NDVI) or SFA, 0.164 for NPV using PLS, and 0.126 for soil using PLS. PLS also had the lowest RMSE averaged across all three cover types. This work highlights the need for more extensive and diverse fine spatial scale measurements of fractional cover, to improve methodologies for estimating cover in preparation for future hyperspectral global monitoring missions. Full article
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
Figures:
Figure 1: A single 1 nm oversampled field spectrum processed to a simulated 10 nm bandwidth VSWIR (visible, near-infrared, and shortwave-infrared) spectrum; bands near atmospheric water vapor absorption features are excluded.
Figure 2: Ternary plots of fractional cover for the (a) training and (b) validation spectral libraries.
Figure 3: Actual vs. predicted green vegetation (GV) fractional cover for the validation library for (a) NDVI, (b) EVI, (c) NDII, (d) SFA, (e) MESMA, and (f) PLS.
Figure 4: Actual vs. predicted non-photosynthetic vegetation (NPV) fractional cover for the validation library for (a) CAI, (b) LCA, (c) hSINDRI, (d) SFA, (e) MESMA, and (f) PLS.
Figure 5: Actual vs. predicted soil fractional cover for the validation library for (a) 1 − (GV_NDVI + NPV_CAI), (b) 1 − (GV_SFA + NPV_SFA), (c) MESMA, and (d) PLS.
Figure 6: Partial least squares regression coefficients for the GV, NPV, and soil fraction models.
Figure 7: Comparison of a low NPV cover spectrum (SF_032) and a moderate NPV cover spectrum (SF_101) with similar CAI values (0.00743 and 0.00795, respectively); squares mark the bands used to calculate CAI.
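MESMA and the other spectroscopic methods compared above build on linear spectral mixing: a pixel spectrum is modeled as a fraction-weighted sum of endmember spectra. The sketch below shows simple fully constrained unmixing with one endmember per cover class; it is only a stand-in for MESMA (which iterates over many endmember combinations), and the endmember spectra and mixed pixel are synthetic assumptions.

```python
# Simple linear spectral unmixing of a pixel into GV / NPV / soil fractions.
# Fractions are constrained to be non-negative and (softly) to sum to one.
# Endmember spectra are synthetic placeholders, not library spectra.
import numpy as np
from scipy.optimize import nnls

n_bands = 200
rng = np.random.default_rng(1)
E = np.abs(rng.normal(0.3, 0.1, size=(n_bands, 3)))    # columns: GV, NPV, soil endmembers
true_frac = np.array([0.5, 0.3, 0.2])
pixel = E @ true_frac + rng.normal(0, 0.002, n_bands)   # mixed pixel with noise

# Sum-to-one constraint: append a heavily weighted row of ones.
w = 100.0
A = np.vstack([E, w * np.ones((1, 3))])
b = np.concatenate([pixel, [w]])
fractions, _ = nnls(A, b)

print("estimated GV, NPV, soil fractions:", np.round(fractions, 3))
```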
15 pages, 939 KiB  
Article
Discrimination of Tomato Plants (Solanum lycopersicum) Grown under Anaerobic Baffled Reactor Effluent, Nitrified Urine Concentrates and Commercial Hydroponic Fertilizer Regimes Using Simulated Sensor Spectral Settings
by Mbulisi Sibanda, Onisimo Mutanga, Lembe S. Magwaza, Timothy Dube, Shirly T. Magwaza, Alfred O. Odindo, Asanda Mditshwa and Paramu L. Mafongoya
Agronomy 2019, 9(7), 373; https://doi.org/10.3390/agronomy9070373 - 11 Jul 2019
Cited by 3 | Viewed by 3998
Abstract
We assess the discriminative strength of three different satellite spectral settings (HyspIRI, the forthcoming Landsat 9, and Sentinel-2 MSI) in mapping tomato (Solanum lycopersicum Linnaeus) plants grown under a hydroponic system, using human-excreta derived materials (HEDM), namely anaerobic baffled reactor (ABR) effluent and nitrified urine concentrate (NUC), and commercial hydroponic fertilizer mix (CHFM) as the main sources of nutrients. Simulated spectral settings of HyspIRI, Landsat 9, and Sentinel-2 MSI were resampled from spectrometric proximally sensed data. Discriminant analysis (DA) was applied to discriminate tomatoes grown under these different nutrient sources. Results showed that the simulated spectral settings of the HyspIRI sensor better discriminated tomatoes grown under different fertilizer regimes than the Landsat 9 OLI and Sentinel-2 MSI spectral configurations. Using the DA algorithm, HyspIRI exhibited a high overall accuracy (OA) of 0.99 and a kappa statistic of 0.99, whereas Landsat 9 OLI and Sentinel-2 MSI exhibited OAs of 0.94 and 0.95 and kappa statistics of 0.79 and 0.85, respectively. Simulated HyspIRI wavebands 710, 720, 690, 840, 1370, and 2110 nm; Sentinel-2 MSI bands 7 (783 nm), 6 (740 nm), 5 (705 nm), and 8a (865 nm); and Landsat bands 5 (865 nm), 6 (1610 nm), 7 (2200 nm), and 8 (590 nm), in order of importance, were selected as the most suitable bands for discriminating tomatoes grown under the different fertilizer regimes. Overall, the performance of the simulated HyspIRI, Landsat 9 OLI-2, and Sentinel-2 MSI spectral bands seems to open new opportunities for crop monitoring. Full article
(This article belongs to the Special Issue Remote Sensing Applications for Agriculture and Crop Modelling)
Figures:
Figure 1: Collected spectra (a) and mean spectral signatures of tomato crops grown under ABR effluent, NUC, and CHFM treatments based on (b) HyspIRI, (c) Sentinel-2 MSI, and (d) Landsat 9 OLI-2 spectral settings.
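A minimal sketch of the discriminant-analysis step described above, using synthetic stand-ins for the resampled reflectance bands; this is not the authors' implementation, and the class means, band count, and cross-validation scheme are placeholder assumptions.

```python
# Minimal sketch: linear discriminant analysis separating tomato spectra grown
# under ABR effluent, NUC, and CHFM treatments. Data are synthetic placeholders
# standing in for resampled HyspIRI-like reflectance bands.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_class, n_bands = 40, 30
means = [0.30, 0.35, 0.40]                       # slight class separation (assumed)
X = np.vstack([rng.normal(m, 0.05, (n_per_class, n_bands)) for m in means])
y = np.repeat(["ABR", "NUC", "CHFM"], n_per_class)

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated overall accuracy: {acc:.2f}")
```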
29 pages, 4254 KiB  
Article
Accuracies Achieved in Classifying Five Leading World Crop Types and their Growth Stages Using Optimal Earth Observing-1 Hyperion Hyperspectral Narrowbands on Google Earth Engine
by Itiya Aneece and Prasad Thenkabail
Remote Sens. 2018, 10(12), 2027; https://doi.org/10.3390/rs10122027 - 13 Dec 2018
Cited by 42 | Viewed by 9711
Abstract
As the global population increases, we face increasing demand for food and nutrition. Remote sensing can help monitor food availability to assess global food security rapidly and accurately enough to inform decision-making. However, advances in remote sensing technology are still often limited to multispectral broadband sensors. Although these sensors have many applications, they can be limited in studying agricultural crop characteristics such as differentiating crop types and their growth stages with a high degree of accuracy and detail. In contrast, hyperspectral data contain continuous narrowbands that provide data in terms of spectral signatures rather than a few data points along the spectrum, and hence can help advance the study of crop characteristics. To better understand and advance this idea, we conducted a detailed study of five leading world crops (corn, soybean, winter wheat, rice, and cotton) that occupy 75% and 54% of principal crop areas in the United States and the world respectively. The study was conducted in seven agroecological zones of the United States using 99 Earth Observing-1 (EO-1) Hyperion hyperspectral images from 2008–2015 at 30 m resolution. The authors first developed a first-of-its-kind comprehensive Hyperion-derived Hyperspectral Imaging Spectral Library of Agricultural crops (HISA) of these crops in the US based on USDA Cropland Data Layer (CDL) reference data. Principal Component Analysis was used to eliminate redundant bands by using factor loadings to determine which bands most influenced the first few principal components. This resulted in the establishment of 30 optimal hyperspectral narrowbands (OHNBs) for the study of agricultural crops. The rest of the 242 Hyperion HNBs were redundant, uncalibrated, or noisy. Crop types and crop growth stages were classified using linear discriminant analysis (LDA) and support vector machines (SVM) in the Google Earth Engine cloud computing platform using the 30 optimal HNBs (OHNBs). The best overall accuracies were between 75% to 95% in classifying crop types and their growth stages, which were achieved using 15–20 HNBs in the majority of cases. However, in complex cases (e.g., 4 or more crops in a Hyperion image) 25–30 HNBs were required to achieve optimal accuracies. Beyond 25–30 bands, accuracies asymptote. This research makes a significant contribution towards understanding modeling, mapping, and monitoring agricultural crops using data from upcoming hyperspectral satellites, such as NASA’s Surface Biology and Geology mission (formerly HyspIRI mission) and the recently launched HysIS (Indian Hyperspectral Imaging Satellite, 55 bands over 400–950 nm in VNIR and 165 bands over 900–2500 nm in SWIR), and contributions in advancing the building of a novel, first-of-its-kind global hyperspectral imaging spectral-library of agricultural crops (GHISA: www.usgs.gov/WGSC/GHISA). Full article
Figures:
Figure 1: Study areas throughout the US in various agroecological zones (AEZs), as defined by the Food and Agriculture Organization (FAO) [59].
Figure 2: Hyperion hyperspectral datacubes: (a) EO-1 Hyperion image over Ponca City, Oklahoma, 2 September 2010, false color composite of 844, 569, and 529 nm; (b) locator map; (c) datacube of the 198 bands available in Google Earth Engine; (d) datacube of the 30 optimal hyperspectral narrowbands for studying globally dominant agricultural crops.
Figure 3: The Hyperspectral Imaging Spectral library of Agricultural crops (HISA) of the US illustrated for five crops in selected AEZs and growth stages; HISA is part of the Global Hyperspectral Imaging Spectral-Library of Agricultural-crops (GHISA, www.usgs.gov/WGSC/GHISA).
Figure 4: HISA illustrated for one crop in two growth stages in AEZ 7 and for 4–6 growth stages of the other crops in AEZ 9.
Figure 5: Spectral matching of late-growth-stage corn spectra from EO-1 Hyperion data within AEZ 10 and across AEZs 9 and 10.
Figure 6: The 20 best optimal hyperspectral narrowbands (OHNBs) illustrated with spectral profiles of crops in the critical growth stage (rice excluded because no critical-stage rice spectra were present in the selected images).
Figures 7–11: Number of hyperspectral narrowbands versus LDA classification accuracies of crop types for the images in AEZs 5, 6, 7, 9, and 10.
Figure 12: Number of hyperspectral narrowbands versus classification accuracies for the growth stages of corn, cotton, rice, soybean, and winter wheat.
Figures 13–14: SVM crop type classification results in Google Earth Engine for AEZ 9 Image 1 using 15 and 30 hyperspectral narrowbands, compared against USDA CDL reference data.
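The band-reduction step described above uses principal component factor loadings to decide which narrowbands carry most of the information. A hedged sketch of that idea follows; the synthetic spectra, number of components, and choice of 30 retained bands are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch of the band-reduction idea: run PCA on Hyperion-like spectra and use
# the absolute factor loadings on the leading components to rank bands.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
X = rng.random((500, 198))          # 500 synthetic pixel spectra x 198 calibrated bands

pca = PCA(n_components=5).fit(X)
# pca.components_ has shape (n_components, n_bands); weight the loadings by the
# variance each component explains, then rank bands by their summed influence.
influence = np.abs(pca.components_).T @ pca.explained_variance_ratio_
optimal_bands = np.argsort(influence)[::-1][:30]   # keep 30 "optimal" narrowbands
print("selected band indices:", np.sort(optimal_bands))
```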
32 pages, 500 KiB  
Review
Survey of Hyperspectral Earth Observation Applications from Space in the Sentinel-2 Context
by Julie Transon, Raphaël D’Andrimont, Alexandre Maugnard and Pierre Defourny
Remote Sens. 2018, 10(2), 157; https://doi.org/10.3390/rs10020157 - 23 Jan 2018
Cited by 212 | Viewed by 16247
Abstract
In the last few decades, researchers have developed a plethora of hyperspectral Earth Observation (EO) remote sensing techniques, analysis and applications. While hyperspectral exploratory sensors are demonstrating their potential, Sentinel-2 multispectral satellite remote sensing is now providing free, open, global and systematic high resolution visible and infrared imagery at a short revisit time. Its recent launch suggests potential synergies between multi- and hyper-spectral data. This study, therefore, reviews 20 years of research and applications in satellite hyperspectral remote sensing through the analysis of Earth observation hyperspectral sensors’ publications that cover the Sentinel-2 spectrum range: Hyperion, TianGong-1, PRISMA, HISUI, EnMAP, Shalom, HyspIRI and HypXIM. More specifically, this study (i) brings face to face past and future hyperspectral sensors’ applications with Sentinel-2’s and (ii) analyzes the applications’ requirements in terms of spatial and temporal resolutions. Eight main application topics were analyzed including vegetation, agriculture, soil, geology, urban, land use, water resources and disaster. Medium spatial resolution, long revisit time and low signal-to-noise ratio in the short-wave infrared of some hyperspectral sensors were highlighted as major limitations for some applications compared to the Sentinel-2 system. However, these constraints mainly concerned past hyperspectral sensors, while they will probably be overcome by forthcoming instruments. Therefore, this study is putting forward the compatibility of hyperspectral sensors and Sentinel-2 systems for resolution enhancement techniques in order to increase the panel of hyperspectral uses. Full article
(This article belongs to the Special Issue Spatial Enhancement of Hyperspectral Data and Applications)
Figures:
Figure 1: Lifetime successions of the Sentinel-2 MultiSpectral Imager (MSI) sensors and the main past and planned spaceborne hyperspectral EO sensors.
Figure 2: Evolution of the number of publications for each selected Earth observation hyperspectral sensor and for the S2 sensors between 1999 and 2016 (total Scopus results in brackets).
Figure 3: Distribution of the research publications using hyperspectral and S2 imagery by main application between 1999 and 2016.
Figure 4: Useful wavelengths retrieved from the hyperspectral applications review and from Segl et al. [24], compared to S2 center wavelengths and bandwidths.
Figure 5: Optimal (non-redundant) bands for retrieving vegetation and agricultural crop biophysical parameters according to Thenkabail et al. [16] and Miglani et al. [25], compared to S2 center wavelengths and bandwidths.
6510 KiB  
Article
Challenges in Methane Column Retrievals from AVIRIS-NG Imagery over Spectrally Cluttered Surfaces: A Sensitivity Analysis
by Minwei Zhang, Ira Leifer and Chuanmin Hu
Remote Sens. 2017, 9(8), 835; https://doi.org/10.3390/rs9080835 - 12 Aug 2017
Cited by 7 | Viewed by 6238
Abstract
A comparison between efforts to detect methane anomalies by a simple band ratio approach from the Airborne Visual Infrared Imaging Spectrometer-Classic (AVIRIS-C) data for the Kern Front oil field, Central California, and the Coal Oil Point marine hydrocarbon seep field, offshore southern California, was conducted. The detection succeeded for the marine source and failed for the terrestrial source, despite these sources being of comparable strength. Scene differences were investigated in higher spectral and spatial resolution collected by the AVIRIS-C successor instrument, AVIRIS Next Generation (AVIRIS-NG), by a sensitivity study. Sensitivity to factors including water vapor, aerosol, planetary boundary layer (PBL) structure, illumination and viewing angle, and surface albedo clutter were explored. The study used the residual radiance method, with sensitivity derived from MODTRAN (MODerate resolution atmospheric correction TRANsmission) simulations of column methane (XCH4). Simulations used the spectral specifications and geometries of AVIRIS-NG and were based on a uniform or an in situ vertical CH4 profile, which was measured concurrent with the AVIRIS-NG data. Small but significant sensitivity was found for PBL structure and water vapor; however, highly non-linear, extremely strong sensitivity was found for surface albedo error. For example, a 10% decrease in the surface albedo corresponded to a 300% XCH4 increase over background XCH4 to compensate for the total signal, less so for stronger plumes. This strong non-linear sensitivity resulted from the high percentage of surface-reflected radiance in the airborne at-sensor total radiance. Coarse spectral resolution and feedback from interferents like water vapor underlay this sensitivity. Imaging spectrometry like AVIRIS and the Hyperspectral InfraRed Imager (HyspIRI) candidate satellite mission, have the advantages of contextual spatial information and greater at-sensor total radiance. However, they also face challenges due to their relatively broad spectral resolution compared to trace gas specific orbital sensors, e.g., the Greenhouse gases Observing SATellite (GOSAT), which is especially applicable to trace gas retrievals over scenes with high spectral albedo variability. Results of the sensitivity analysis are applicable for the residual radiance method and CH4 profiles used in the analysis, but they illustrate potential significant challenges in CH4 retrievals using other approaches. Full article
(This article belongs to the Special Issue Remote Sensing of Greenhouse Gases)
Figures:
Figure 1: True color AVIRIS-C imagery from 19 June 2008 and 6 June 2013 and the corresponding at-sensor reflectance band ratio σ = ρt(2298)/ρt(2058), with the plume structure outlined.
Figure 2: True color AVIRIS-NG image of the Kern Front oil field near Bakersfield, central California (4 September 2014), marking the three pixels used for sensitivity analysis: asphalt road (Ar), brown sandy loam (Bsl), and reddish brown sandy loam (Rbl).
Figure 3: In situ CH4 and wind data collected by CIRPAS over the Kern Front and Kern River oil fields, and the derived in situ and uniform CH4 profiles with identical column-averaged CH4 within a 2.0 km boundary layer.
Figure 4: MODTRAN transmittance spectra for H2O and CH4 over 1600–2500 nm and 2239–2299 nm for the AVIRIS-NG sensor at 2.4 km altitude.
Figure 5: Surface albedo spectra (500–2500 nm and 2139–2299 nm) for the three scene elements Ar, Bsl, and Rbl (ENVI spectral library).
Figure 6: Relative error in at-sensor radiance and relative XCH4 sensitivity as functions of surface albedo error for the three land cover types and for different plume strengths.
Figure 7: Inverse XCH4 versus the slope of the sensitivity curves near zero albedo error for plume strength cases k = 0.25, 1, 2.5, and 10.
Figure 8: Relative error introduced by assuming a uniform 2.0 km planetary boundary layer rather than the observed CH4 profile with the same XCH4.
Figure 9: Column retrieval error as a function of solar zenith angle, viewing zenith angle, and relative sun-sensor azimuth angle.
Figure 10: Column retrieval error versus albedo error for different aerosol types and visibilities.
Figure 11: Simulated at-sensor radiance for the true and spectrally flat surface albedos of the three scene elements.
Figure 12: Retrieval sensitivity resulting from a 9% overestimation of water vapor for the three pixel types and for different plume strengths.
Figure 13: At-sensor radiance derived by averaging simulations over 160 × 160 subpixels versus radiance simulated from the average albedo of a GOSAT pixel.
Figure 14: XCH4 underestimation versus the fraction of a GOSAT pixel covered by CH4 plume anomalies of 0.05 to 1 ppm.
Figure 15: CH4, CO2, and wind observations for 4 September 2014 from airborne (CIRPAS) and surface mobile (AMOG surveyor) in situ measurements.
Figure 16: Downwind CH4 vertical profile and transect curtains of CH4, CH4 anomaly, plane-normal wind, and CH4 flux for the Kern Front oil field transect.
Figure 17: Scene histogram of surface albedo at 2239 nm from FLAASH atmospheric correction of the AVIRIS-NG data, with the values of the Ar, Bsl, and Rbl pixels identified.
Figure 18: Retrieval sensitivity resulting from an underestimation of 0.02 in surface albedo for the three pixels and for different plume strengths.
Figure 19: Retrieval sensitivity resulting from a combined 0.02 underestimation of surface albedo and 9% overestimation of column water vapor.
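Figure 1 of this paper uses a simple at-sensor reflectance band ratio, σ = ρt(2298)/ρt(2058), to look for methane absorption. A minimal sketch of computing that ratio over a reflectance cube is given below; the cube, band grid, and nearest-band selection are placeholder assumptions.

```python
# Sketch of the simple band-ratio detection shown in Figure 1:
# sigma = rho_t(2298 nm) / rho_t(2058 nm), computed per pixel.
import numpy as np

wavelengths = np.linspace(380, 2510, 224)            # AVIRIS-C-like band centres (assumed)
rng = np.random.default_rng(4)
cube = rng.random((100, 100, wavelengths.size))      # rows x cols x bands, synthetic at-sensor reflectance

b2298 = np.argmin(np.abs(wavelengths - 2298))        # band nearest 2298 nm (CH4 absorption)
b2058 = np.argmin(np.abs(wavelengths - 2058))        # band nearest 2058 nm (reference)
sigma = cube[..., b2298] / cube[..., b2058]

# Lower ratios can indicate CH4 absorption near 2298 nm, but as the paper shows,
# surface albedo clutter can mimic or mask such anomalies over land.
print(sigma.shape, float(sigma.mean()))
```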
20385 KiB  
Article
One-Dimensional Convolutional Neural Network Land-Cover Classification of Multi-Seasonal Hyperspectral Imagery in the San Francisco Bay Area, California
by Daniel Guidici and Matthew L. Clark
Remote Sens. 2017, 9(6), 629; https://doi.org/10.3390/rs9060629 - 20 Jun 2017
Cited by 88 | Viewed by 11721
Abstract
In this study, a 1-D Convolutional Neural Network (CNN) architecture was developed, trained and utilized to classify single (summer) and three seasons (spring, summer, fall) of hyperspectral imagery over the San Francisco Bay Area, California for the year 2015. For comparison, the Random Forests (RF) and Support Vector Machine (SVM) classifiers were trained and tested with the same data. In order to support space-based hyperspectral applications, all analyses were performed with simulated Hyperspectral Infrared Imager (HyspIRI) imagery. Three-season data improved classifier overall accuracy by 2.0% (SVM), 1.9% (CNN) to 3.5% (RF) over single-season data. The three-season CNN provided an overall classification accuracy of 89.9%, which was comparable to overall accuracy of 89.5% for SVM. Both three-season CNN and SVM outperformed RF by over 7% overall accuracy. Analysis and visualization of the inner products for the CNN provided insight to distinctive features within the spectral-temporal domain. A method for CNN kernel tuning was presented to assess the importance of learned features. We concluded that CNN is a promising candidate for hyperspectral remote sensing applications because of the high classification accuracy and interpretability of its inner products. Full article
(This article belongs to the Special Issue Learning to Understand Remote Sensing Images)
Figures:
Figure 1: Study area overview with an AVIRIS-C RGB mosaic of 12 flight runs (11 June 2015), reference data points, and the multi-seasonal image extent with water and cloud mask applied.
Figure 2: Twenty-five randomly selected three-season reflectance spectra from each of the twelve LCCS classes (558 bands in spring-summer-fall sequence; 186 bands per season after bad-band removal).
Figure 3: Training and testing reference data distributions.
Figure 4: Overview of the machine learning classification and accuracy assessment methodology.
Figure 5: Convolutional Neural Network (CNN) flow diagram.
Figure 6: Detailed CNN architecture.
Figure 7: Example convolutional feature map for an Annual Crop three-season spectrum, with spring, summer, and fall bands shaded and bad-data gaps displayed.
Figure 8: Classified land-cover maps for (A) Support Vector Machine, (B) Random Forests, and (C) Convolutional Neural Networks, with (D) a natural color mosaic from June 2015; white areas were not classified.
Figure 9: Kernel importance matrix: percent change in producer accuracy when zeroing a kernel from the CNN.
Figure 10: Convolutional feature maps for each class for the three-season CNN, averaged over 75 spectra per class.
Figure 11: Feature importance for the three-season Random Forests classifier, with a nominal ENT spectral profile for context.
Figure 12: Mini-batch training accuracy curve.
Figure 13: Testing accuracy curve.
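A minimal sketch of a 1-D CNN over a concatenated three-season spectrum, in the spirit of the architecture described above; the layer sizes, kernel width, and class count are illustrative assumptions rather than the authors' exact configuration, and the example assumes PyTorch is available.

```python
# Minimal 1-D CNN over a 558-value spectrum (three seasons x 186 bands).
import torch
import torch.nn as nn

n_bands, n_classes = 558, 12

model = nn.Sequential(
    nn.Conv1d(in_channels=1, out_channels=16, kernel_size=11, padding=5),  # learn spectral kernels
    nn.ReLU(),
    nn.MaxPool1d(kernel_size=2),
    nn.Flatten(),
    nn.Linear(16 * (n_bands // 2), 128),
    nn.ReLU(),
    nn.Linear(128, n_classes),
)

x = torch.randn(8, 1, n_bands)       # batch of 8 spectra, 1 input channel each
logits = model(x)
print(logits.shape)                  # torch.Size([8, 12])

# Training would minimise nn.CrossEntropyLoss() over labelled spectra; the
# convolution of learned kernels with input spectra gives feature maps like
# those the paper visualises.
```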
2883 KiB  
Article
Hyperspectral Remote Sensing for Detecting Soil Salinization Using ProSpecTIR-VS Aerial Imagery and Sensor Simulation
by Odílio Coimbra da Rocha Neto, Adunias Dos Santos Teixeira, Raimundo Alípio de Oliveira Leão, Luis Clenio Jario Moreira and Lênio Soares Galvão
Remote Sens. 2017, 9(1), 42; https://doi.org/10.3390/rs9010042 - 6 Jan 2017
Cited by 42 | Viewed by 6927
Abstract
Soil salinization due to irrigation affects agricultural productivity in the semi-arid region of Brazil. In this study, the performance of four computational models to estimate electrical conductivity (EC) (soil salinization) was evaluated using laboratory reflectance spectroscopy. To investigate the influence of bandwidth and band positioning on the EC estimates, we simulated the spectral resolution of two hyperspectral sensors (airborne ProSpecTIR-VS and orbital Hyperspectral Infrared Imager (HyspIRI)) and three multispectral instruments (RapidEye/REIS, High Resolution Geometric (HRG)/SPOT-5, and Operational Land Imager (OLI)/Landsat-8)). Principal component analysis (PCA) and the first-order derivative analysis were applied to the data to generate metrics associated with soil brightness and spectral features, respectively. The three sets of data (reflectance, PCA, and derivative) were tested as input variable for Extreme Learning Machine (ELM), Ordinary Least Square regression (OLS), Partial Least Squares Regression (PLSR), and Multilayer Perceptron (MLP). Finally, the laboratory models were inverted to a ProSpecTIR-VS image (400–2500 nm) acquired with 1-m spatial resolution in the northeast of Brazil. The objective was to estimate EC over exposed soils detected using the Normalized Difference Vegetation Index (NDVI). The results showed that the predictive ability of the linear models and ELM was better than that of the MLP, as indicated by higher values of the coefficient of determination (R2) and ratio of the performance to deviation (RPD), and lower values of the root mean square error (RMSE). Metrics associated with soil brightness (reflectance and PCA scores) were more efficient in detecting changes in the EC produced by soil salinization than metrics related to spectral features (derivative). When applied to the image, the PLSR model with reflectance had an RMSE of 1.22 dS·m−1 and an RPD of 2.21, and was more suitable for detecting salinization (10–20 dS·m−1) in exposed soils (NDVI < 0.30) than the other models. For all computational models, lower values of RMSE and higher values of RPD were observed for the narrowband-simulated sensors compared to the broadband-simulated instruments. The soil EC estimates improved from the RapidEye to the HRG and OLI spectral resolutions, showing the importance of shortwave intervals (SWIR-1 and SWIR-2) in detecting soil salinization when the reflectance of selected bands is used in data modelling. Full article
Figures:
Figure 1: Location of the study area in the Morada Nova Irrigation District, State of Ceará, northeast Brazil, with the soil sample collection sites.
Figure 2: Methodology used in the generation and validation of the linear and nonlinear computational models.
Figure 3: Laboratory spectra of saline soils with increasing values of electrical conductivity (EC).
Figure 4: Relationship of EC to (a) reflectance of the ProSpecTIR-VS band located at 1642 nm and (b) the first principal component (PC1).
Figure 5: Measured versus PLSR-estimated EC using laboratory reflectance for the simulated 5 nm ProSpecTIR-VS bands centered at 395, 1642, and 1717 nm.
Figure 6: Measured versus PLSR-estimated EC for 32 exposed-soil pixels (NDVI < 0.30) using the three ProSpecTIR-VS bands (395, 1642, and 1717 nm).
Figure 7: ProSpecTIR-VS false-color composite and EC estimated for exposed soils in the hyperspectral image, based on the laboratory-calibrated PLSR model.
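The laboratory modeling step above fits cross-validated regression models mapping reflectance to EC and reports RMSE, R2, and RPD. The sketch below illustrates that evaluation loop for a PLSR model with synthetic spectra and EC values; the band count and number of PLS components are placeholder assumptions.

```python
# Sketch: cross-validated PLSR model mapping soil reflectance to electrical
# conductivity (EC), evaluated with RMSE, R2, and RPD. Data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(5)
n_samples, n_bands = 60, 357                      # e.g., 5 nm bands over 400-2500 nm (assumed)
X = rng.random((n_samples, n_bands))
ec = 2 + 15 * X[:, 200] + rng.normal(0, 1, n_samples)    # dS/m, synthetic

pls = PLSRegression(n_components=5)
ec_pred = cross_val_predict(pls, X, ec, cv=n_samples).ravel()   # leave-one-out CV

rmse = mean_squared_error(ec, ec_pred) ** 0.5
r2 = r2_score(ec, ec_pred)
rpd = ec.std(ddof=1) / rmse                       # ratio of performance to deviation
print(f"RMSE = {rmse:.2f} dS/m, R2 = {r2:.2f}, RPD = {rpd:.2f}")
```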
4747 KiB  
Article
Regionalization of Uncovered Agricultural Soils Based on Organic Carbon and Soil Texture Estimations
by Martin Kanning, Bastian Siegmann and Thomas Jarmer
Remote Sens. 2016, 8(11), 927; https://doi.org/10.3390/rs8110927 - 8 Nov 2016
Cited by 29 | Viewed by 6670
Abstract
The determination of soil texture and organic carbon across agricultural areas provides important information to derive soil condition. Precise digital soil maps can help to till agricultural fields with more accuracy, greater cost-efficiency and better environmental protection. In the present study, the laboratory analysis of sand, silt, clay and soil organic carbon (SOC) content was combined with hyperspectral image data to estimate the distribution of soil texture and SOC across an agricultural area. The aim was to identify regions with similar soil properties and derive uniform soil regions based on this information. Soil parameter data and corresponding laboratory spectra were used to calibrate cross-validated (leave-one-out) partial least squares regression (PLSR) models, resulting in robust models for sand (R2 = 0.77, root-mean-square error (RMSE) = 5.37) and SOC (R2 = 0.89, RMSE = 0.27), as well as moderate models for silt (R2 = 0.62, RMSE = 5.46) and clay (R2 = 0.53, RMSE = 2.39). The regression models were applied to Airborne Imaging Spectrometer for Applications DUAL (aisaDUAL) hyperspectral image data to spatially estimate the concentration of these parameters. Afterwards, a decision tree, based on the Food and Agriculture Organization (FAO) soil texture classification scheme, was developed to determine the soil texture for each pixel of the hyperspectral airborne data. These soil texture regions were further refined with the spatial SOC estimations. The developed method is useful to identify spatial regions with similar soil properties, which can provide a vital information source for an adapted treatment of agricultural fields in terms of the necessary amount of fertilizers or water. The approach can also be adapted to wider regions with a larger sample size to create detailed digital soil maps (DSMs). Further, the presented method should be applied to future hyperspectral satellite missions like the Environmental Mapping and Analysis Program (EnMAP) and the Hyperspectral Infrared Imager (HyspIRI) to cover larger areas in shorter time intervals. Updated DSMs on a regular basis could particularly support precision farming aspects. Full article
(This article belongs to the Special Issue Remote Sensing in Precision Agriculture)
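The leave-one-out cross-validated PLSR calibration and the FAO-based texture assignment described in the abstract above can be sketched as follows; the synthetic spectra, component count, and the simplified texture rules are assumptions standing in for the paper's data and its FAO decision tree, not the authors' implementation.

```python
# Hedged sketch: LOO cross-validated PLSR for one soil property (e.g., SOC) plus a toy
# texture-class assignment from predicted sand/silt/clay fractions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
spectra = rng.random((40, 200))          # lab spectra (samples x bands), illustrative
soc = rng.uniform(0.5, 3.0, 40)          # measured SOC (%), illustrative

pls = PLSRegression(n_components=6)
soc_cv = cross_val_predict(pls, spectra, soc, cv=LeaveOneOut()).ravel()
rmse = np.sqrt(np.mean((soc - soc_cv) ** 2))
r2 = np.corrcoef(soc, soc_cv)[0, 1] ** 2
print(f"LOO-CV: R2 = {r2:.2f}, RMSE = {rmse:.2f}")

def texture_class(sand, silt, clay):
    """Toy decision rules standing in for the FAO texture triangle (percent fractions)."""
    if clay >= 35:
        return "clay"
    if sand >= 70:
        return "sandy loam"
    if silt >= 50:
        return "silt loam"
    return "loam"

print(texture_class(sand=35, silt=40, clay=25))   # e.g., per-pixel assignment
```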
Show Figures

Figure 1
<p>Location of the study site in Germany and Saxony-Anhalt (<b>left</b>); and the two investigated fields (red polygons) (<b>right</b>).</p>
Full article ">Figure 2
<p>Scatterplots of leave-one-out cross-validated partial least squares regression (PLSR) models for sand, silt, clay and soil organic carbon (SOC) with the regression line (continuous) and the 1:1 line (dashed).</p>
Full article ">Figure 3
<p>Partial least squares regression (PLSR) factor loadings for sand, silt, clay and soil organic carbon (SOC) models.</p>
Full article ">Figure 4
<p>Spatial predictions of sand, silt, clay and SOC for the investigated and neighboring uncovered fields within the study site based on PLSR.</p>
Full article ">Figure 5
<p>Cumulated sand, silt, and clay predictions and their deviation from 100%: (<b>a</b>) for all pixels in the image; (<b>b</b>) the histogram of grain size composition for pixels only from the observed fields A and B; and (<b>c</b>) the histogram of grain size composition for all pixels.</p>
Full article ">Figure 6
<p>Regionalization results based on decision tree classification of (<b>a</b>) soil texture and (<b>b</b>) soil texture subdivided by SOC (SiL: silt loam; L: loam; SL: sandy loam; SiCL: silty clay loam).</p>
Full article ">
8032 KiB  
Article
Towards an Improved LAI Collection Protocol via Simulated and Field-Based PAR Sensing
by Wei Yao, David Kelbe, Martin Van Leeuwen, Paul Romanczyk and Jan Van Aardt
Sensors 2016, 16(7), 1092; https://doi.org/10.3390/s16071092 - 14 Jul 2016
Cited by 9 | Viewed by 6711
Abstract
In support of NASA’s next-generation spectrometer—the Hyperspectral Infrared Imager (HyspIRI)—we are working towards assessing sub-pixel vegetation structure from imaging spectroscopy data. Of particular interest is Leaf Area Index (LAI), which is an informative, yet notoriously challenging parameter to efficiently measure in situ. While [...] Read more.
In support of NASA’s next-generation spectrometer—the Hyperspectral Infrared Imager (HyspIRI)—we are working towards assessing sub-pixel vegetation structure from imaging spectroscopy data. Of particular interest is Leaf Area Index (LAI), which is an informative, yet notoriously challenging parameter to efficiently measure in situ. While photosynthetically-active radiation (PAR) sensors have been validated for measuring crop LAI, there is limited literature on the efficacy of PAR-based LAI measurement in the forest environment. This study (i) validates PAR-based LAI measurement in forest environments, and (ii) proposes a suitable collection protocol, which balances efficiency with measurement variation, e.g., due to sun flecks and various-sized canopy gaps. A synthetic PAR sensor model was developed in the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and used to validate LAI measurement based on first principles and explicitly-known leaf geometry. Simulated collection parameters were adjusted to empirically identify optimal collection protocols. These collection protocols were then validated in the field by correlating PAR-based LAI measurement to the normalized difference vegetation index (NDVI) extracted from the “classic” Airborne Visible Infrared Imaging Spectrometer (AVIRIS-C) data (R2 was 0.61). The results indicate that our proposed collection protocol is suitable for measuring the LAI of sparse forest (LAI < 3–5 m2/m2). Full article
(This article belongs to the Section Remote Sensors)
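For readers unfamiliar with PAR-based LAI retrieval, the sketch below inverts canopy PAR transmission with a simple Beer-Lambert model. The AccuPAR LP-80 applies a more elaborate formulation (accounting for beam fraction, solar zenith angle, and leaf angle distribution), so the extinction coefficient and readings here are purely illustrative assumptions.

```python
# Hedged sketch: Beer-Lambert LAI estimate, LAI = -ln(PAR_below / PAR_above) / k.
import numpy as np

def lai_from_par(par_above, par_below, k=0.5):
    """LAI from one above-canopy reading and an array of below-canopy readings."""
    tau = np.clip(np.asarray(par_below, dtype=float) / float(par_above), 1e-6, 1.0)
    return -np.log(tau) / k

# e.g., 15 below-canopy readings along one transect, one above-canopy reference
par_above = 1800.0                                   # umol m-2 s-1, illustrative
par_below = np.array([420, 610, 150, 880, 300, 520, 240, 700,
                      180, 560, 330, 410, 260, 640, 500], dtype=float)
print(f"Transect mean LAI ~ {lai_from_par(par_above, par_below).mean():.2f} m2/m2")
```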
Show Figures

Graphical abstract
Full article ">Figure 1
<p>The study contains two parts: the simulation and the field-based approach. Each step in the simulation environment is repeated in the field-based approach to ensure that the simulations and models are correct.</p>
Full article ">Figure 2
<p>(<b>a</b>) an ash tree located within the Rochester Institute of Technology (RIT) campus and (<b>b</b>) its 3D model.</p>
Full article ">Figure 3
<p>(<b>a</b>) plot #116 in the National Ecological Observatory Network’s (NEON) D17 Domain and (<b>b</b>) the virtual scene.</p>
Full article ">Figure 4
<p>The angles mapped to the detector cells are not constant.</p>
Full article ">Figure 5
<p>Flatten the hemisphere (<b>a</b>) into a plane (<b>b</b>).</p>
Full article ">Figure 6
<p>Convert (<b>a</b>) <span class="html-italic">θ</span> and <span class="html-italic">ϕ</span> in the spherical coordinate system to (<b>b</b>) <span class="html-italic">α</span><sub>X</sub> and <span class="html-italic">α</span><sub>Y</sub> in the <span class="html-italic">X</span>/<span class="html-italic">Y</span> angle system.</p>
Full article ">Figure 7
<p>Capturing the sun disk with different spatial resolutions. (<b>a</b>) the Sun is mapped to one pixel; (<b>b</b>) the Sun is mapped to two pixels; (<b>c</b>) the same Sun position as in the above figure on a higher resolution detector array; and (<b>d</b>) the same Sun position as in the above figure on a higher resolution detector array.</p>
Full article ">Figure 8
<p>Using the AccuPAR LP-80 to collect below-canopy photosynthetically-active radiation (PAR) readings along quadrantal directions.</p>
Full article ">Figure 9
<p>The simulated PAR and measured above-canopy PAR on San Joaquin Experimental Range (SJER).</p>
Full article ">Figure 10
<p>An example of the original and horizontally mirrored PAR: (<b>a</b>) simulated PAR and (<b>b</b>) measured PAR.</p>
Full article ">Figure 11
<p>The simulated and measured below-canopy PAR for the RIT Ash Tree: (<b>a</b>) along the east–west direction; (<b>b</b>) along the north–south direction.</p>
Full article ">Figure 12
<p>The leaf area calculated from the simulation results for the model tree.</p>
Full article ">Figure 13
<p>Leaf area index (LAI) estimates derived from simulated AccuPAR readings against the normalized difference vegetation index (NDVI) extracted from simulated Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) images. (<b>a</b>) 45 PAR readings were simulated along three transects in each 15 m × 15 m square (R<sup>2</sup> = 0.92, RMSE = 0.33); (<b>b</b>) 30 PAR readings were simulated along two transects in each 15 m × 15 m square (R<sup>2</sup> = 0.77, RMSE = 0.66); and (<b>c</b>) 15 PAR readings were simulated along one transect in each 15 m × 15 m square (R<sup>2</sup> = 0.66, RMSE = 1.24).</p>
Full article ">Figure 14
<p>NDVI extracted from a real AVIRIS-C image was used to verify the in situ forest LAI, where R<sup>2</sup> = 0.61 and RMSE = 0.34.</p>
Full article ">
11242 KiB  
Article
Utilizing HyspIRI Prototype Data for Geological Exploration Applications: A Southern California Case Study
by Wendy M. Calvin and Elizabeth L. Pace
Geosciences 2016, 6(1), 11; https://doi.org/10.3390/geosciences6010011 - 24 Feb 2016
Cited by 14 | Viewed by 5183
Abstract
The purpose of this study was to demonstrate the value of the proposed Hyperspectral Infrared Imager (HyspIRI) instrument for geological mapping applications. HyspIRI-like data were collected as part of the HyspIRI airborne campaign that covered large regions of California, USA, over multiple seasons. [...] Read more.
The purpose of this study was to demonstrate the value of the proposed Hyperspectral Infrared Imager (HyspIRI) instrument for geological mapping applications. HyspIRI-like data were collected as part of the HyspIRI airborne campaign that covered large regions of California, USA, over multiple seasons. This work focused on a Southern California area, which encompasses Imperial Valley, the Salton Sea, the Orocopia Mountains, the Chocolate Mountains, and a variety of interesting geological phenomena including fumarole fields and sand dunes. We have mapped hydrothermal alteration, lithology and thermal anomalies, demonstrating the value of this type of data for future geologic exploration activities. We believe HyspIRI will be an important instrument for exploration geologists as data may be quickly manipulated and used for remote mapping of hydrothermal alteration minerals, lithology and temperature anomalies. Full article
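The decorrelation stretch used in this study's figures (e.g., AVIRIS bands near 2.16, 2.21, and 2.24 µm displayed as RGB) can be sketched generically as a PCA-based whitening of three display bands. The implementation and the synthetic band cube below are assumptions for illustration, not the authors' processing chain.

```python
# Hedged sketch of a decorrelation stretch for a three-band RGB display product.
import numpy as np

def decorrelation_stretch(bands):
    """bands: rows x cols x 3 float array -> stretched copy scaled to 0..1 per channel."""
    x = bands.reshape(-1, 3).astype(float)
    mean = x.mean(axis=0)
    cov = np.cov(x - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Equalize variance along the principal axes, then rotate back so hues stay interpretable.
    transform = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    y = (x - mean) @ transform.T
    y = (y - y.min(axis=0)) / (np.ptp(y, axis=0) + 1e-12)
    return y.reshape(bands.shape)

# Stand-in for three real SWIR bands stacked as an RGB composite
rgb = decorrelation_stretch(np.random.rand(200, 200, 3))
```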
Show Figures

Graphical abstract
Full article ">Figure 1
<p>Imperial Valley, California, USA, study area. A true color image derived from 18-m AVIRIS data is shown overlain on a topographic shaded relief image. Known geothermal resource areas or power plants are indicated by initials: HMS—Hot Mineral Spa; SS—Salton Sea; B—Brawley; EM—East Mesa; H—Heber. The inset on upper right map shows the location of the study area.</p>
Full article ">Figure 2
<p>Orocopia Mountains and Chocolate Mountains geologic map showing major lithologic units from Jennings <span class="html-italic">et al.</span> [<a href="#B20-geosciences-06-00011" class="html-bibr">20</a>].</p>
Full article ">Figure 3
<p>Illustration of a simple color decorrelation stretch band combination that rapidly separates common alteration minerals into distinct color fields. Red-Green-Blue channels are chosen as illustrated in the figure, and resulting mineral color associations are described in the text. Spectral standards are from the USGS spectral library [<a href="#B39-geosciences-06-00011" class="html-bibr">39</a>].</p>
Full article ">Figure 4
<p>Decorrelation stretch of AVIRIS bands at 2.16, 2.21, and 2.24 µm displayed as RGB, respectively. The same scene is shown at two different spatial resolutions: (<b>a</b>) 18 m, the resolution of original AVIRIS data; and (<b>b</b>) 30 m, the resolution of the proposed HyspIRI VSWIR data. The black box shows the location of <a href="#geosciences-06-00011-f005" class="html-fig">Figure 5</a>. Blues, magentas, and yellows highlight hydrothermal or other alteration mineralogy, as described in the text.</p>
Full article ">Figure 5
<p>The Orocopia and Chocolate Mountains area. A true color image derived from 18 m AVIRIS data is shown overlain on a shaded relief image. Remotely mapped minerals are shown in colors: green—epidote; magenta—muscovite; blue—kaolinite. The location of this figure is shown in <a href="#geosciences-06-00011-f004" class="html-fig">Figure 4</a>.</p>
Full article ">Figure 6
<p>Spectra of mapped units (solid lines) compared with library mineral standards from the USGS spectral library [<a href="#B39-geosciences-06-00011" class="html-bibr">39</a>] (dashed lines). Each scene spectrum represents an average of more than several hundred pixels.</p>
Full article ">Figure 7
<p>The Orocopia Mountains and the Chocolate Mountains. Decorrelation stretch of MASTER bands at 10.63, 9.03, and 8.60 µm displayed as RGB, respectively. Geologic map units from <a href="#geosciences-06-00011-f002" class="html-fig">Figure 2</a> are outlined by black lines. The location outline is shown in <a href="#geosciences-06-00011-f004" class="html-fig">Figure 4</a>.</p>
Full article ">Figure 8
<p>Emissivity spectral shapes (<b>a</b>) and spectral locations mapped (<b>b</b>) using a matched filter approach. Colors in the map correspond to the same color in the spectral plot. Blue, green, and purple spectral shapes occur in the same regions and identify the schist of the Orocopia Mountains and several basalt outcrops. Yellow and coral spectral shapes both map the Algodones dune field; only yellow is shown for simplicity. The area shown is identified in <a href="#geosciences-06-00011-f004" class="html-fig">Figure 4</a> and is the same as in <a href="#geosciences-06-00011-f005" class="html-fig">Figure 5</a> and <a href="#geosciences-06-00011-f007" class="html-fig">Figure 7</a>; for units, compare with <a href="#geosciences-06-00011-f002" class="html-fig">Figure 2</a> and <a href="#geosciences-06-00011-f007" class="html-fig">Figure 7</a>.</p>
Full article ">Figure 9
<p>(<b>a</b>) Imperial Valley, California study area. A true color image derived from AVIRIS data is shown overlain on a shaded relief image; the box shows the location of (<b>b</b>). (<b>b</b>) Land surface temperature data for the southern end of the Salton Sea below 100 m elevation; the box shows the location of (<b>c</b>). Known geothermal areas are indicated by initials: HMS—Hot Mineral Spa; SS—Salton Sea. (<b>c</b>) Land surface temperature data for the Salton Sea geothermal area. Circles show geothermal power plants and squares show geothermal features located by [<a href="#B27-geosciences-06-00011" class="html-bibr">27</a>]. Fumarole fields are indicated by initials: W—Wister; MI—Mullet Island; DS—Davis Schrimpf; RI—Red Island.</p>
Full article ">
1281 KiB  
Technical Note
Monitoring the Impacts of Severe Drought on Southern California Chaparral Species using Hyperspectral and Thermal Infrared Imagery
by Austin R. Coates, Philip E. Dennison, Dar A. Roberts and Keely L. Roth
Remote Sens. 2015, 7(11), 14276-14291; https://doi.org/10.3390/rs71114276 - 28 Oct 2015
Cited by 40 | Viewed by 6908
Abstract
Airborne hyperspectral and thermal infrared imagery acquired in 2013 and 2014, the second and third years of a severe drought in California, were used to assess drought impacts on dominant plant species. A relative green vegetation fraction (RGVF) calculated from 2013–2014 Airborne Visible [...] Read more.
Airborne hyperspectral and thermal infrared imagery acquired in 2013 and 2014, the second and third years of a severe drought in California, were used to assess drought impacts on dominant plant species. A relative green vegetation fraction (RGVF) calculated from 2013–2014 Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data using linear spectral unmixing revealed seasonal and multi-year changes relative to a pre-drought 2011 reference AVIRIS image. Deeply rooted tree species and tree species found in mesic areas showed the least change in RGVF. Coastal sage scrub species demonstrated the highest seasonal variability, as well as a longer-term decline in RGVF. Ceanothus species were apparently least well-adapted to long-term drought among chaparral species, showing persistent declines in RGVF over 2013 and 2014. Declining RGVF was associated with higher land surface temperature retrieved from MODIS-ASTER Airborne Simulator (MASTER) data. Combined collection of hyperspectral and thermal infrared imagery may offer new opportunities for mapping and monitoring drought impacts on ecosystems. Full article
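The relative green vegetation fraction described above rests on linear spectral unmixing. The sketch below shows a generic non-negative least-squares unmixing of one pixel into green vegetation (GV), non-photosynthetic vegetation, and soil endmembers; the synthetic spectra and solver choice are illustrative assumptions, not the study's exact unmixing setup or endmember library.

```python
# Hedged sketch: per-pixel GV fraction from linear unmixing with non-negative abundances.
import numpy as np
from scipy.optimize import nnls

def gv_fraction(pixel, endmembers):
    """pixel: (n_bands,); endmembers: (n_bands, n_endmembers), column 0 = GV."""
    fractions, _ = nnls(endmembers, pixel)       # non-negative abundance estimates
    total = fractions.sum()
    return fractions[0] / total if total > 0 else 0.0

n_bands = 50
rng = np.random.default_rng(2)
E = np.column_stack([rng.random(n_bands),        # GV endmember (illustrative)
                     rng.random(n_bands),        # non-photosynthetic vegetation
                     rng.random(n_bands)])       # soil
pixel = 0.6 * E[:, 0] + 0.3 * E[:, 2] + 0.01 * rng.random(n_bands)
print(f"GV fraction ~ {gv_fraction(pixel, E):.2f}")
```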
Show Figures

Graphical abstract
Full article ">Figure 1
<p>The study region covering the Santa Barbara Coast and portions of the Santa Ynez Mountains and Santa Ynez Valley, California. An AVIRIS image acquired in July 2011 is shown in a 1651 nm (red), 831 nm (green), 658 nm (blue) false color composite. Reference polygons dominated by plant species are marked by yellow dots.</p>
Full article ">Figure 2
<p>Precipitation and soil moisture measured at Sedgwick Ranch, California. (<b>a</b>) Monthly precipitation; (<b>b</b>) percent of each month with a volumetric moisture content (VMC) greater than 0.08 at a depth of 15 cm. The 0.08 VMC threshold was empirically determined to be just above the wilting point in this soil type; and (<b>c</b>) percent of each month with a VMC greater than 0.08 at a depth of 46 cm.</p>
Full article ">Figure 3
<p>An NDVI threshold of 0.3 was empirically determined using spectra extracted from species reference polygons (GV) and from areas with senesced grassland and soil cover (non-GV). Non-GV cover was identified based on appearance in high resolution orthoimagery and manual inspection of spectral shape in the July 2011 AVIRIS image. 95% of GV-type spectra were above the 0.3 NDVI threshold, while only 12% of non-GV-type spectra were above the same threshold.</p>
Full article ">Figure 4
<p>Relative GV fraction for dates in 2013 and 2014. White areas indicate pixels with an NDVI less than 0.3 in the July 2011 data. An instrument malfunction resulted in missing data for the white strip shown in the 2014-06-04 image. Fire scars for the Gap, Jesusita, and Tea fires are indicated by the letters G, J, and T, respectively. The location of the Sedgwick Ranch station recording precipitation and soil moisture data shown in <a href="#remotesensing-07-14276-f002" class="html-fig">Figure 2</a> is indicated by the letter S.</p>
Full article ">Figure 5
<p>Relative GV fraction for dates in 2013. The areas shown are subsets of the study area shown in <a href="#remotesensing-07-14276-f004" class="html-fig">Figure 4</a>: (<b>a</b>) Agriculture in the Santa Ynez valley; (<b>b</b>) chaparral (top-right of subset) and coastal sage scrub and grasslands (bottom-left of subset); (<b>c</b>) the fire scar of the 2008 Gap fire (top of subset) and agriculture and grasslands (bottom of subset). White areas indicate pixels with an NDVI less than 0.3 in the July 2011 data.</p>
Full article ">Figure 6
<p>Relative GV fraction distributions for individual species. Species codes are listed in <a href="#remotesensing-07-14276-t001" class="html-table">Table 1</a>. Density is the kernel density estimate calculated using the “density” function in R statistical software (<a href="http://www.R-project.org/" target="_blank">http://www.R-project.org/</a>).</p>
Full article ">Figure 7
<p>Land surface temperature <span class="html-italic">versus</span> relative GV fraction for pixels in species polygons. Species codes are listed in <a href="#remotesensing-07-14276-t001" class="html-table">Table 1</a>.</p>
Full article ">
53063 KiB  
Article
Developing in situ Non-Destructive Estimates of Crop Biomass to Address Issues of Scale in Remote Sensing
by Michael Marshall and Prasad Thenkabail
Remote Sens. 2015, 7(1), 808-835; https://doi.org/10.3390/rs70100808 - 14 Jan 2015
Cited by 78 | Viewed by 8875
Abstract
Ground-based estimates of aboveground wet (fresh) biomass (AWB) are an important input for crop growth models. In this study, we developed empirical equations of AWB for rice, maize, cotton, and alfalfa, by combining several in situ non-spectral and spectral predictors. The non-spectral predictors [...] Read more.
Ground-based estimates of aboveground wet (fresh) biomass (AWB) are an important input for crop growth models. In this study, we developed empirical equations of AWB for rice, maize, cotton, and alfalfa, by combining several in situ non-spectral and spectral predictors. The non-spectral predictors included: crop height (H), fraction of absorbed photosynthetically active radiation (FAPAR), leaf area index (LAI), and fraction of vegetation cover (FVC). The spectral predictors included 196 hyperspectral narrowbands (HNBs) from 350 to 2500 nm. The models for rice, maize, cotton, and alfalfa included H and HNBs in the near infrared (NIR); H, FAPAR, and HNBs in the NIR; H and HNBs in the visible and NIR; and FVC and HNBs in the visible; respectively. In each case, the non-spectral predictors were the most important, while the HNBs were additional statistically significant predictors that explained less variance. The final models selected for validation yielded an R2 of 0.84, 0.59, 0.91, and 0.86 for rice, maize, cotton, and alfalfa, which, when compared to models using HNBs alone from a previous study using the same spectral data, explained an additional 12%, 29%, 14%, and 6% of AWB variance. These integrated models will be used in an upcoming study to extrapolate AWB over 60 × 60 m transects to evaluate spaceborne multispectral broad bands and hyperspectral narrowbands. Full article
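To make the idea of combining non-spectral and spectral predictors concrete, the sketch below fits an ordinary least-squares model of AWB from crop height and a single NIR narrowband on synthetic data; the variable ranges, band choice, and coefficients are illustrative assumptions only, not values from the study.

```python
# Hedged sketch: integrated AWB model with one non-spectral predictor (height, H) and
# one hyperspectral narrowband, evaluated on a held-out validation split.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 120
height = rng.uniform(20, 120, n)                  # crop height, cm (illustrative)
nir_band = rng.uniform(0.2, 0.6, n)               # e.g., an ~870 nm narrowband reflectance
awb = 0.9 * height + 400 * nir_band + rng.normal(0, 15, n)   # synthetic AWB (g/m2)

X = np.column_stack([height, nir_band])
X_tr, X_te, y_tr, y_te = train_test_split(X, awb, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print(f"Validation R2 = {model.score(X_te, y_te):.2f}")
```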
Show Figures

Graphical abstract
Full article ">Figure 1
<p>The spatial extent of non-spectroradiometric, spectroradiometric, and aboveground wet biomass (AWB) samples taken in the Central Valley of California in 2011 and 2012. More than 500 samples were taken for alfalfa, cotton, maize, and rice. The 60 × 60 m transects where AWB and non-destructive predictors were taken (▪) are overlaid with the National Agricultural Statistics Service Cropland Data Layer for 2012 [<a href="#B40-remotesensing-07-00808" class="html-bibr">40</a>].</p>
Full article ">Figure 2
<p>Non-spectroradiometric and spectroradiometric measurements used to predict aboveground wet biomass: (<b>A</b>) height (cm) using a measuring stick or telescoping rod; (<b>B</b>) fraction of vegetation cover derived from red-green-blue photos (in this case the excess green minus excess red index); (<b>C</b>) gap fraction and leaf area index derived from an AccuPAR LP-80 ceptometer; and (<b>D</b>) hyperspectral narrowbands retrieved from an Analytical Spectral Devices Field Spec Pro 3 and a white reflectance panel mounted to a tripod.</p>
Full article ">Figure 3
<p>Pearson correlation between aboveground wet biomass and candidate non-spectral and untransformed spectral predictors for (<b>A</b>) rice, (<b>B</b>) alfalfa, (<b>C</b>) cotton, and (<b>D</b>) maize. The dashed line shows the threshold used to include predictors in the model-building phase. LAI is the leaf area index, β is the gap fraction, EXG is the excess greenness index, EXGR is the excess green minus excess red index, NDI is the normalized difference index, meanG is the mean chromatic greenness index, and medianG is the median chromatic greenness index. The hyperspectral narrowbands are in nm.</p>
Figure 3 Cont.">
Full article ">Figure 4
<p>Pearson correlation between aboveground wet biomass and candidate non-spectral and 1st order derivative transformed spectral predictors for (<b>A</b>) rice, (<b>B</b>) alfalfa, (<b>C</b>) cotton, and (<b>D</b>) maize. The dashed line shows the threshold used to include predictors in the model-building phase. LAI is the leaf area index, β is the gap fraction, EXG is the excess greenness index, EXGR is the excess green minus excess red index, NDI is the normalized difference index, meanG is the mean chromatic greenness index, and medianG is the median chromatic greenness index. The hyperspectral narrowbands are in nm.</p>
Figure 4 Cont.">
Full article ">Figure 5
<p>Scatterplots of observed (validation subset) <span class="html-italic">versus</span> predicted aboveground wet biomass using the multiple-band vegetation indices shaded in gray from <a href="#remotesensing-07-00808-t002" class="html-table">Table 2</a> for rice (<b>A</b>); alfalfa (<b>B</b>); cotton (<b>C</b>); and maize (<b>D</b>). The diagonal line represents a 1:1 relationship. With the exception of alfalfa, which is built on 1st order derivative transformed spectra, equations are built on the untransformed spectra.</p>
Full article ">Figure 6
<p>Scatterplots of observed (validation subset) <span class="html-italic">versus</span> predicted aboveground wet biomass using the two-band vegetation indices shaded in gray from <a href="#remotesensing-07-00808-t003" class="html-table">Table 3</a> for rice (<b>A</b>); alfalfa (<b>B</b>); cotton (<b>C</b>); and maize (<b>D</b>). The diagonal line represents a 1:1 relationship. With the exception of alfalfa, which is built on 1st order derivative transformed spectra, equations are built on untransformed spectra.</p>
Figure 6 Cont.">
Full article ">