Multi-Sensor and Multi-Data Integration in Remote Sensing

A special issue of Remote Sensing (ISSN 2072-4292).

Deadline for manuscript submissions: closed (31 August 2016) | Viewed by 172,241

Special Issue Editors


Prof. Naser El-Sheimy
E-Mail Website
Guest Editor
Department of Geomatics Engineering, The University of Calgary, Calgary, AB T2N 1N4, Canada
Interests: intelligent and autonomous systems; navigation & positioning technologies; satellite technologies; multi-sensor systems; wireless positioning; vehicles & transportation systems; driverless cars; technology development; applications
Special Issues, Collections and Topics in MDPI journals

Dr. Zahra Lari
E-Mail Website
Guest Editor
Mobile Multi-Sensor Systems Research Group, Department of Geomatics Engineering, The University of Calgary, 2500 University Drive NW, Calgary, AB, Canada
Interests: photogrammetry; laser scanning; unmanned aerial vehicles; 3D modelling and reconstruction

Dr. Adel Moussa
E-Mail Website
Guest Editor
Mobile Multi-Sensor Systems Research Group, Department of Geomatics Engineering, The University of Calgary, 2500 University Drive NW, Calgary, AB, Canada
Interests: photogrammetry; remote sensing; computer vision; laser scanning; sensor integration; unmanned aerial vehicles

Special Issue Information

Dear Colleagues,
With advances in sensor technology and the increasing quantity of multi-sensor, multi-temporal, and multi-resolution data from different sources, data integration has become a valuable tool in remote sensing applications. The main objective of multi-sensor data integration is to synergistically combine sensory data from disparate sources, with different characteristics, resolutions, and quality, in order to provide more reliable, accurate, and useful information for diverse mapping, modelling, and monitoring applications. The integration of these disparate data therefore contributes to robust interpretation of the observed objects and scenes and provides the basis for effective planning and decision-making. Over the past few decades, multi-sensor data integration has received tremendous attention, and different approaches and techniques have been presented with the aim of improving integration quality and exploring more application areas. However, with the emergence of new sensors and a broader diversity of intended applications, new methodologies and best practices for multi-sensor data integration need to be developed, and their advantages and limitations need to be actively shared by the scientific research community.

This Special Issue invites submissions on the latest advances in multi-sensor data integration for remote sensing. Contributions should focus on reviewing current progress, highlighting the latest methodologies proposed to meet the needs of multi-sensor data processing, and pointing out the strategies to be considered to meet the requirements of potential applications.

The topics of interest include (but are not limited to):

  • Multi-sensor, multi-temporal, multi-resolution data integration
  • Fusion of heterogeneous sensor information
  • Heterogeneous data processing
  • Performance: measures and evaluation
  • Applications to urban studies, 3D reconstruction and modelling, environmental monitoring, etc.

Prof. Naser El-Sheimy
Dr. Zahra Lari
Dr. Adel Moussa
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • multi-sensor systems
  • multi-temporal data
  • computer vision
  • laser scanning
  • sensor integration
  • unmanned aerial vehicles
  • navigation and imaging technologies
  • photogrammetry and remote sensing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (21 papers)


Research

14018 KiB  
Article
A New Approach for Realistic 3D Reconstruction of Planar Surfaces from Laser Scanning Data and Imagery Collected Onboard Modern Low-Cost Aerial Mapping Systems
by Zahra Lari, Naser El-Sheimy and Ayman Habib
Remote Sens. 2017, 9(3), 212; https://doi.org/10.3390/rs9030212 - 25 Feb 2017
Cited by 4 | Viewed by 7743
Abstract
Over the past few years, accurate 3D surface reconstruction using remotely-sensed data has been recognized as a prerequisite for different mapping, modelling, and monitoring applications. To fulfill the needs of these applications, the necessary data are generally collected using various digital imaging systems. Among them, laser scanners have been acknowledged as a fast, accurate, and flexible technology for the acquisition of high-density 3D spatial data. Despite their quick accessibility, the 3D data acquired by these systems do not provide semantic information about the nature of the scanned surfaces. Hence, reliable processing techniques must be employed to extract the information required for 3D surface reconstruction. Moreover, the information extracted from laser scanning data cannot be effectively utilized on its own due to the lack of descriptive detail. In order to provide a more realistic and accurate perception of scenes scanned by laser scanning systems, a new approach for the 3D reconstruction of planar surfaces is introduced in this paper. This approach aims to improve the interpretability of the planar surfaces extracted from laser scanning data using spectral information from overlapping imagery collected onboard modern low-cost aerial mapping systems, which are widely adopted nowadays. In this approach, the planar surfaces scanned by laser scanning systems are initially extracted through a novel segmentation procedure and then textured using the acquired overlapping imagery. The implemented texturing technique, which is intended to overcome the computational inefficiency of previously-developed 3D reconstruction techniques, is performed in three steps. In the first step, the visibility of the extracted planar surfaces within the collected images is investigated and a list of appropriate images for texturing each surface is established. Successively, an occlusion detection procedure is carried out to identify the occluded parts of these surfaces in the field of view of the captured images. In the second step, the visible/non-occluded parts of the planar surfaces are decomposed into segments that will be textured using individual images. Finally, a rendering procedure is accomplished to texture these parts using the available images. Experimental results from overlapping laser scanning data and imagery collected onboard aerial mapping systems verify the feasibility of the proposed approach for efficient, realistic 3D surface reconstruction. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
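The first step of the texturing pipeline described in the abstract, deciding whether each extracted planar surface is fully, partially, or not visible in a given image, can be sketched with a pinhole projection test on the surface's boundary points. This is a minimal illustration, not the authors' implementation; the camera model, image size, and the full/partial/none classification rule are assumptions:

```python
import numpy as np

def project_points(K, R, t, pts_3d):
    """Project Nx3 world points into pixel coordinates with a pinhole
    camera: K is the 3x3 intrinsic matrix, (R, t) the world-to-camera
    rotation and translation. Returns an (N, 2) array of (u, v)."""
    cam = R @ pts_3d.T + t.reshape(3, 1)   # world -> camera frame
    uv = K @ cam                           # camera -> homogeneous pixels
    return (uv[:2] / uv[2]).T

def surface_visibility(K, R, t, boundary_3d, width, height):
    """Classify a planar surface as 'full', 'partial', or 'none' visible
    in an image, from how many boundary points project inside it."""
    uv = project_points(K, R, t, boundary_3d)
    inside = ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
              (uv[:, 1] >= 0) & (uv[:, 1] < height))
    if inside.all():
        return "full"
    return "partial" if inside.any() else "none"
```

A fully visible surface could then be textured from a single appropriate image, while partially visible ones would proceed to the occlusion-detection and decomposition steps.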
Show Figures

Graphical abstract and Figures 1–36: outline of the proposed 3D scene reconstruction approach; planar feature detection, local point density estimation, attribute definition, and clustering in the parameter domain; visibility and occlusion analysis of planar surfaces within the captured images; decomposition and rendering of visible segments; transformations between texture, object, and screen spaces; the three test datasets (Calgary, AB and Burnaby, BC, Canada) with planar surface extraction, quality control, and realistic 3D reconstruction results, including a comparison with a traditional point-based technique.
3520 KiB  
Article
Multivariate Spatial Data Fusion for Very Large Remote Sensing Datasets
by Hai Nguyen, Noel Cressie and Amy Braverman
Remote Sens. 2017, 9(2), 142; https://doi.org/10.3390/rs9020142 - 9 Feb 2017
Cited by 21 | Viewed by 6718
Abstract
Global maps of total-column carbon dioxide (CO2) mole fraction (in units of parts per million) are important tools for climate research since they provide insights into the spatial distribution of carbon intake and emissions as well as their seasonal and annual evolutions. Currently, two main remote sensing instruments for total-column CO2 are the Orbiting Carbon Observatory-2 (OCO-2) and the Greenhouse gases Observing SATellite (GOSAT), both of which produce estimates of CO2 concentration, called profiles, at 20 different pressure levels. Operationally, each profile estimate is then convolved into a single estimate of column-averaged CO2 using a linear pressure weighting function. This total-column CO2 is then used for subsequent analyses such as Level 3 map generation and colocation for validation. In principle, total-column CO2 in these applications may be more efficiently estimated by making optimal estimates of the vector-valued CO2 profiles and applying the pressure weighting function afterwards. These estimates will be more efficient if there is multivariate dependence between CO2 values in the profile. In this article, we describe a methodology that uses a modified Spatial Random Effects model to account for the multivariate nature of the data fusion of OCO-2 and GOSAT. We show that multivariate fusion of the profiles has improved mean squared error relative to scalar fusion of the column-averaged CO2 values from OCO-2 and GOSAT. The computations scale linearly with the number of data points, making it suitable for the typically massive remote sensing datasets. Furthermore, the methodology properly accounts for differences in instrument footprint, measurement-error characteristics, and data coverages. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
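The operational convolution the abstract mentions is a linear map: the column-averaged CO2 is h^T x for a pressure weighting vector h, so its variance under a profile covariance Sigma is h^T Sigma h, which is exactly where multivariate dependence between levels enters. A minimal numpy sketch, with uniform weights standing in for the operational pressure weighting function (an assumption for illustration):

```python
import numpy as np

def pressure_weights(levels):
    """Uniform placeholder for the linear pressure weighting vector h
    (the operational h depends on the pressure-level spacing; uniform
    weights are an illustrative assumption). Weights sum to 1."""
    return np.full(levels, 1.0 / levels)

def column_average(profile, h):
    """Convolve a CO2 profile (ppm at each pressure level) into a single
    column-averaged value: x_col = h^T x."""
    return float(h @ profile)

def column_variance(cov, h):
    """Variance of the column average given the profile covariance:
    Var(h^T x) = h^T Sigma h. Off-diagonal terms of Sigma change this
    value, which is why fusing full profiles can beat fusing the scalar
    column averages."""
    return float(h @ cov @ h)
```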
Show Figures

Graphical abstract and Figures 1–4: locations of the 14 TCCON comparison sites; retrieved column-averaged CO2 from OCO-2 and ACOS (GOSAT) on 3 March 2015; boxplots of "predicted minus TCCON" column-averaged CO2 for SSDF and MSDF; boxplots of MSDF predictions versus colocated OCO-2 retrieved profiles at Darwin across the 20 height levels.
6899 KiB  
Article
Modeling the Effects of the Urban Built-Up Environment on Plant Phenology Using Fused Satellite Data
by Norman Gervais, Alexander Buyantuev and Feng Gao
Remote Sens. 2017, 9(1), 99; https://doi.org/10.3390/rs9010099 - 23 Jan 2017
Cited by 9 | Viewed by 9068
Abstract
Understanding the effects that the Urban Heat Island (UHI) has on plant phenology is important in predicting ecological impacts of expanding cities and the impacts of the projected global warming. However, the underlying methods to monitor phenological events often limit this understanding. Generally, one can either have a small sample of in situ measurements or use satellite data to observe large areas of land surface phenology (LSP). In the latter, a tradeoff exists among platforms with some allowing better temporal resolution to pick up discrete events and others possessing the spatial resolution appropriate for observing heterogeneous landscapes, such as urban areas. To overcome these limitations, we applied the Spatial and Temporal Adaptive Reflectance Model (STARFM) to fuse Landsat surface reflectance and MODIS nadir BRDF-adjusted reflectance (NBAR) data with three separate selection conditions for input data across two versions of the software. From the fused images, we derived a time-series of high temporal and high spatial resolution synthetic Normalized Difference Vegetation Index (NDVI) imagery to identify the dates of the start of the growing season (SOS), end of the season (EOS), and the length of the season (LOS). The results were compared between the urban and exurban developed areas within the vicinity of Ogden, UT and across all three data scenarios. The results generally show an earlier urban SOS, later urban EOS, and longer urban LOS, with variation across the results suggesting that phenological parameters are sensitive to input changes. Although there was strong evidence that STARFM has the potential to produce images capable of capturing the UHI effect on phenology, we recommend that future work refine the proposed methods and compare the results against ground events. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
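From the fused synthetic imagery, the phenology metrics reduce to computing NDVI and thresholding the resulting time series. The sketch below uses a simple amplitude-fraction threshold rule, an illustrative assumption rather than the paper's exact extraction method:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def season_dates(days, ndvi_series, frac=0.5):
    """Estimate start of season (SOS), end of season (EOS), and length of
    season (LOS) as the first/last day the NDVI time series exceeds a
    threshold set at `frac` of its seasonal amplitude."""
    s = np.asarray(ndvi_series, dtype=float)
    thresh = s.min() + frac * (s.max() - s.min())
    above = np.where(s >= thresh)[0]          # indices of above-threshold samples
    sos, eos = days[above[0]], days[above[-1]]
    return sos, eos, eos - sos
```

Applied per pixel to urban and exurban areas, a rule of this kind yields the SOS/EOS/LOS differences the study compares.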
Show Figures

Graphical abstract and Figures 1–6: location map of the study area with mean impervious surface and urban/exurban classification by municipality; Python processing workflows for preparing MODIS and Landsat inputs to STARFM and for matching MODIS scenes to MODIS/Landsat base pairs; temporal distribution of the Landsat base-pair scenes; an example synthetic NDVI image; bar charts of the average number of days the urban SOS is earlier, EOS is later, and LOS is longer under the three selection criteria (with 99.9% confidence intervals).
5507 KiB  
Article
An Integrated GNSS/INS/LiDAR-SLAM Positioning Method for Highly Accurate Forest Stem Mapping
by Chuang Qian, Hui Liu, Jian Tang, Yuwei Chen, Harri Kaartinen, Antero Kukko, Lingli Zhu, Xinlian Liang, Liang Chen and Juha Hyyppä
Remote Sens. 2017, 9(1), 3; https://doi.org/10.3390/rs9010003 - 23 Dec 2016
Cited by 119 | Viewed by 14547
Abstract
Forest mapping, one of the main components of performing a forest inventory, is an important driving force in the development of laser scanning. Mobile laser scanning (MLS), in which laser scanners are installed on moving platforms, has been studied as a convenient measurement method for forest mapping in the past several years. Positioning and attitude accuracies are important for forest mapping using MLS systems. Inertial Navigation Systems (INSs) and Global Navigation Satellite Systems (GNSSs) are typical and popular positioning and attitude sensors used in MLS systems. In forest environments, because of the loss of signal due to occlusion and severe multipath effects, the positioning accuracy of GNSS is severely degraded, and even that of GNSS/INS decreases considerably. Light Detection and Ranging (LiDAR)-based Simultaneous Localization and Mapping (SLAM) can achieve higher positioning accuracy in environments containing many features and is commonly implemented in GNSS-denied indoor environments. Forests differ from indoor environments in that the GNSS signal is available to some extent. Although the positioning accuracy of GNSS/INS is reduced, estimates of heading angle and velocity can remain highly accurate even with fewer satellites. GNSS/INS and the LiDAR-based SLAM technique can therefore be effectively integrated to form a sustainable, highly accurate positioning and mapping solution for use in forests without additional hardware costs. In this study, information such as heading angles and velocities extracted from a GNSS/INS is utilized to improve the positioning accuracy of the SLAM solution, and two information-aided SLAM methods are proposed. First, a heading angle-aided SLAM (H-aided SLAM) method is proposed that supplies the heading angle from GNSS/INS to SLAM. Field test results show that the horizontal positioning accuracy over an entire trajectory of 800 m is 0.13 m, a significant improvement (70%) over a traditional GNSS/INS. Second, a more complex information-aided SLAM solution that utilizes both heading angle and velocity information simultaneously (HV-aided SLAM) is investigated. Experimental results show that the horizontal positioning accuracy can reach a level of six centimetres with the HV-aided SLAM, a significant improvement (86%). Thus, a more accurate forest map is obtained by the proposed integrated method. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
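The core of the heading-aided idea, feeding the GNSS/INS heading into the SLAM estimate, can be illustrated with an inverse-variance fusion of two heading estimates. This is a stand-in sketch, not the paper's IMLE-SLAM update; the weighting scheme and the variance inputs are assumptions:

```python
import math

def fuse_heading(psi_slam, var_slam, psi_gi, var_gi):
    """Inverse-variance fusion of a SLAM heading estimate with the
    GNSS/INS heading (radians). The innovation is wrapped to (-pi, pi]
    so headings just either side of +/-pi fuse correctly."""
    # Wrapped difference between the GNSS/INS and SLAM headings.
    d = math.atan2(math.sin(psi_gi - psi_slam), math.cos(psi_gi - psi_slam))
    w = var_slam / (var_slam + var_gi)   # weight on the GNSS/INS correction
    fused = psi_slam + w * d
    # Re-wrap the fused heading to the principal interval.
    return math.atan2(math.sin(fused), math.cos(fused))
```

With equal variances this reduces to the circular midpoint of the two headings; a more trusted GNSS/INS heading (smaller var_gi) pulls the fused estimate toward it.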
Show Figures

Graphical abstract and Figures 1–11: velocity STD (IE) and number of visible GPS+GLONASS satellites in the forest; the ATV with the NovAtel SPAN and FARO Focus3D X330 POS devices; workflow integrating GNSS/INS and IMLE-SLAM; vehicle route and reference trees (background orthoimage: NLS 2012); schematic for evaluating positioning accuracy using reference trees; trajectories, stem positions, position errors, and heading-angle differences for the GNSS/INS, IMLE-SLAM, H-aided, and HV-aided IMLE-SLAM solutions over the entire test and in dense forest.
22770 KiB  
Article
Exploiting TERRA-AQUA MODIS Relationship in the Reflective Solar Bands for Aerosol Retrieval
by Xingwang Fan and Yuanbo Liu
Remote Sens. 2016, 8(12), 996; https://doi.org/10.3390/rs8120996 - 3 Dec 2016
Cited by 1 | Viewed by 5448
Abstract
Satellite remote sensing has been providing aerosol data with ever-increasing accuracy, representative of the MODerate-resolution Imaging Spectroradiometer (MODIS) Dark Target (DT) and Deep Blue (DB) aerosol retrievals. These retrievals are generally performed over spectrally dark objects and therefore may struggle over bright surfaces. This study proposed an analytical TERRA-AQUA MODIS relationship in the reflective solar bands for aerosol retrieval. For the relationship development, the bidirectional reflectance distribution function (BRDF) effects were adjusted using reflectance ratios in the MODIS 2.13 μm band and the path radiance was approximated as an analytical function of aerosol optical thickness (AOT) and scattering phase function. Comparisons with MODIS observation data, MODIS AOT data, and sun photometer measurements demonstrate the validity of the proposed relationship for aerosol retrieval. The synergetic TERRA-AQUA MODIS retrievals are highly correlated with the ground measured AOT at TERRA MODIS overpass time (R2 = 0.617; RMSE = 0.043) and AQUA overpass time (R2 = 0.737; RMSE = 0.036). Compared to our retrievals, both the MODIS DT and DB retrievals are subject to severe underestimation. Sensitivity analyses reveal that the proposed method may perform better over non-vegetated than vegetated surfaces, which can offer a complement to MODIS operational algorithms. In an analytical form, the proposed method also has advantages in computational efficiency, and therefore can be employed for fine-scale (relative to operational 10 km MODIS product) MODIS aerosol retrieval. Overall, this study provides insight into aerosol retrievals and other applications regarding TERRA-AQUA MODIS data. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
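Two pieces of the method described above lend themselves to short formula sketches: the BRDF adjustment via the 2.13 μm band ratio, and an analytical path-reflectance term in AOT. The textbook single-scattering form below stands in for the paper's analytical path-radiance model, and both functions are illustrative assumptions rather than the authors' exact formulation:

```python
def brdf_adjusted_reflectance(rho_terra_red, rho_terra_swir, rho_aqua_swir):
    """Transfer a TERRA red-band surface reflectance to AQUA viewing
    geometry by scaling with the 2.13 um (SWIR) band ratio, assuming
    BRDF effects are spectrally stable between the SWIR and the
    visible bands (a simplifying assumption used here for illustration)."""
    f_brdf = rho_aqua_swir / rho_terra_swir  # BRDF adjustment factor
    return rho_terra_red * f_brdf

def path_reflectance(aot, phase, mu_s, mu_v, ssa=1.0):
    """Single-scattering approximation of atmospheric path reflectance:
    rho_path = ssa * aot * P(Theta) / (4 * mu_s * mu_v), where mu_s and
    mu_v are the cosines of the solar and view zenith angles, P the
    scattering phase function, and ssa the single-scattering albedo."""
    return ssa * aot * phase / (4.0 * mu_s * mu_v)
```

In an inversion of this kind, the observed TERRA-AQUA reflectance difference constrains AOT once the surface term has been ratio-adjusted.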
Show Figures

Figures 1–12: reflectance-ratio performance at the mid-infrared band for bog and forest land cover types; path radiance comparisons between the 6S model and the analytical model in the red and near-infrared bands; TOA reflectance and 550 nm AOT comparisons among TERRA MODIS, AQUA MODIS, and the proposed retrievals; validation against sun photometer measurements at the TERRA and AQUA overpass times and against the MODIS DT and DB products; sensitivity of the retrieval to VZA difference, dual-view RAA difference, land surface reflectance, and the BRDF adjustment factor; TERRA/AQUA band ratios in the 0.645, 0.859, and 2.13 μm bands; and time series (1 January–30 September 2016) of aerosol type from the TERRA and AQUA MODIS aerosol products.
7903 KiB  
Article
Interpolation of GPS and Geological Data Using InSAR Deformation Maps: Method and Application to Land Subsidence in the Alto Guadalentín Aquifer (SE Spain)
by Marta Béjar-Pizarro, Carolina Guardiola-Albert, Ramón P. García-Cárdenas, Gerardo Herrera, Anna Barra, Antonio López Molina, Serena Tessitore, Alejandra Staller, José A. Ortega-Becerril and Ramón P. García-García
Remote Sens. 2016, 8(11), 965; https://doi.org/10.3390/rs8110965 - 23 Nov 2016
Cited by 48 | Viewed by 8569
Abstract
Land subsidence resulting from groundwater extraction is a global phenomenon adversely affecting many regions. Understanding the governing processes and mitigating the associated hazards require knowing the spatial distribution of the implicated factors (piezometric levels, lithology, ground deformation), which are usually known only at discrete locations. Here, we propose a methodology based on the Kriging with External Drift (KED) approach to interpolate sparse point measurements of variables influencing land subsidence using high-density InSAR measurements. In our study area, the Alto Guadalentín basin, SE Spain, these variables are GPS vertical velocities and the thickness of compressible soils. First, we estimate InSAR and GPS rates of subsidence covering the periods 2003–2010 and 2004–2013, respectively. Then, we apply the KED method to the discrete variables. The resulting continuous GPS velocity map shows maximum subsidence rates of 13 cm/year in the center of the basin, in agreement with previous studies. The compressible deposits thickness map is significantly improved. We also test the coherence of Sentinel-1 data in the study region and evaluate the applicability of this methodology to the new satellite, which will improve the monitoring of aquifer-related subsidence and the mapping of the variables governing this phenomenon. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
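Full KED requires a geostatistical solver. As a rough, illustrative stand-in for the approach described above, the sketch below regresses the sparse point values (e.g., GPS velocities) on the dense InSAR drift variable and spreads the regression residuals by inverse-distance weighting; the IDW step replaces the kriging of residuals, and all names and parameters are hypothetical:

```python
import numpy as np

def ked_like_interpolation(xy_obs, z_obs, drift_obs, xy_grid, drift_grid, power=2.0):
    """Approximate Kriging with External Drift: fit a linear drift model to the
    sparse observations, then interpolate the residuals by inverse-distance
    weighting (a simplification of the kriging step)."""
    # 1. Linear drift model z ~ a + b * drift, fitted at the observation points
    A = np.column_stack([np.ones(len(drift_obs)), drift_obs])
    coef, *_ = np.linalg.lstsq(A, z_obs, rcond=None)
    resid = z_obs - A @ coef
    # 2. Inverse-distance weighting of the residuals onto the target grid
    d = np.linalg.norm(np.asarray(xy_grid)[:, None, :] - np.asarray(xy_obs)[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    resid_grid = (w @ resid) / w.sum(axis=1)
    # 3. Add the drift prediction back
    return coef[0] + coef[1] * np.asarray(drift_grid) + resid_grid
```

When the sparse values follow the drift exactly, the prediction reduces to the drift itself, which is the intended behavior of the external-drift term.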
Graphical abstract
Figure 1
<p>(<b>a</b>) Location map. The black-dashed rectangles labelled A and B outline the regions used for the interpolation of GPS vertical velocities and the thickness of compressible soils, respectively. The thickness of compressible deposits, formed by clay and silt layers, is shown as colored circles at 18 boreholes (estimated by [<a href="#B28-remotesensing-08-00965" class="html-bibr">28</a>]). The black line delimits the Alto Guadalentín aquifer. The SRTM-90 Digital Elevation Model was used to generate the background topography. Names for the main cities in the study area are indicated. (<b>b</b>) Location of the 39 points that we measured with GPS between 28 January and 1 March 2013. Each site is identified with a number from 1–39 (<a href="#app1-remotesensing-08-00965" class="html-app">Supplementary Table S1</a>). The height of these points had been previously measured by the Spanish National Geographic Institute (IGNE): red diamonds were measured in August–September 2004; yellow diamonds were measured in March 2005; and blue diamonds were measured in March 2009. (<b>c</b>) NW-SE geological cross-section of the Alto Guadalentín basin, from [<a href="#B28-remotesensing-08-00965" class="html-bibr">28</a>]. The location of the two permanent GPS stations, LOR1 and LORC, is indicated by the black stars in (<b>a</b>,<b>c</b>).</p>
Figure 2
<p>(<b>a</b>) Deformation map from ENVISAT data for the time period 2003–2010 (Box A in <a href="#remotesensing-08-00965-f001" class="html-fig">Figure 1</a>a). Negative deformation (red regions) indicates land subsidence. Black circles show the GPS location, and blue arrows indicate GPS vertical velocities. GPS velocities are obtained by comparing the reference vertical positions provided by the IGNE, which varies in the network between August–September 2004 and March 2009 (see <a href="#remotesensing-08-00965-f001" class="html-fig">Figure 1</a>b), and the new vertical positions measured in 2013 (see <a href="#app1-remotesensing-08-00965" class="html-app">Table S1</a>). The large blue arrow at the top represents the scale of GPS velocities. (<b>b</b>–<b>d</b>) Cross-sections showing vertical deformation rates from InSAR (red dots) and GPS data (blue circles) measured along the three dashed black lines in (<b>a</b>). (<b>e</b>) LOS time series for the point indicated by the black star in (<b>a</b>).</p>
Figure 3
<p>Correlation diagrams: (<b>a</b>) comparison between deformation rates estimated from InSAR and GPS data. Note that InSAR data cover the period 2003–2010, while GPS data cover three different periods (2004–2013, 2005–2013, 2009–2013) depending on the position of the GPS point (see <a href="#sec3dot2-remotesensing-08-00965" class="html-sec">Section 3.2</a> and <a href="#remotesensing-08-00965-f001" class="html-fig">Figure 1</a>b). (<b>b</b>) Comparison between the InSAR-derived deformation rate and compressible deposit thickness.</p>
Figure 4
<p>(<b>a</b>) Interpolated GPS-velocity map for period 2004–2013. Dashed black lines indicate cross-sections in (<b>b</b>–<b>d</b>). (<b>b</b>–<b>d</b>) Cross-sections showing vertical ground deformation rates from InSAR (red dots), GPS measurements (blue dots) and GPS interpolations (green dots).</p>
Figure 5
<p>Cross-validation procedure for the GPS deformation rates’ interpolation. To characterize the ability of the interpolation to re-estimate (Z*) the data values (Z), errors are standardized by the predicted standard deviation ((Z* – Z)/S*). We define the interval [−2.5; 2.5] to focus on the 1% extreme values of a normal distribution. (<b>a</b>) Histogram of the standardized error. The minimum value, maximum value, mean value and variance are indicated. (<b>b</b>) Standardized error versus interpolated GPS rates in the 39 GPS locations.</p>
Figure 6
<p>(<b>a</b>) Map of compressible soil thickness from this study (Box B in <a href="#remotesensing-08-00965-f001" class="html-fig">Figure 1</a>a). (<b>b</b>) Compressible soil thickness map from [<a href="#B28-remotesensing-08-00965" class="html-bibr">28</a>]. Circles indicate the location of the 18 boreholes where the thickness of the compressible deposit was measured. Map of residuals between maps (<b>a</b>,<b>b</b>) is shown in <a href="#app1-remotesensing-08-00965" class="html-app">Supplementary Figure S2</a>.</p>
Figure 7
<p>Cross-validation procedure for the compressible deposit thickness interpolation. (<b>a</b>) Histogram of the standardized error. The minimum value, maximum value, mean value and variance are indicated. (<b>b</b>) Standardized error versus the interpolated thickness of the compressible deposit in the 18 borehole locations.</p>
Figure 8
<p>Cross-sections showing ground deformation rates from ENVISAT data (red dots), ALOS data (gray dots), COSMO-SkyMed (CSK) data (green dots) and GPS measurements (blue circles). The temporal interval of each InSAR dataset is indicated in (<b>a</b>). GPS velocities were obtained by comparing the reference vertical positions provided by the IGNE, which varies in the network between August–September 2004 and March 2009 (see <a href="#remotesensing-08-00965-f001" class="html-fig">Figure 1</a>b), and the new vertical positions measured in 2013. The location of the cross-sections is indicated in <a href="#remotesensing-08-00965-f002" class="html-fig">Figure 2</a>a: (<b>a</b>) shows ground deformation rates across A–B, (<b>b</b>) shows ground deformation rates across C–D and (<b>c</b>) shows ground deformation rates across E–F.</p>
8288 KiB  
Article
Using Landsat, MODIS, and a Biophysical Model to Evaluate LST in Urban Centers
by Mohammad Z. Al-Hamdan, Dale A. Quattrochi, Lahouari Bounoua, Asia Lachir and Ping Zhang
Remote Sens. 2016, 8(11), 952; https://doi.org/10.3390/rs8110952 - 16 Nov 2016
Cited by 13 | Viewed by 8371
Abstract
In this paper, we assessed and compared land surface temperature (LST) in urban centers using data from Landsat, MODIS, and the Simple Biosphere model (SiB2). We also evaluated the sensitivity of the model’s LST to different land cover types, fractions (percentages), and emissivities against reference points derived from Landsat thermal data. This was demonstrated in three climatologically and morphologically different cities: Atlanta, GA; New York, NY; and Washington, DC. Our results showed that in these cities SiB2 was sensitive to both the emissivity and the land cover type and fraction, but much more sensitive to the latter. The practical implications of these results are significant, since they imply that the SiB2 model can be used to run different scenarios for evaluating urban heat island (UHI) mitigation strategies. This study also showed that using detailed emissivities per land cover type and fractions from Landsat-derived data caused the model results to converge towards the Landsat-derived LST for most of the studied cases, and that SiB2 LSTs are closer in magnitude to Landsat-derived LSTs than MODIS-derived LSTs. It is important to emphasize, however, that both Landsat and MODIS LSTs are not direct observations and, as such, do not represent ground truth. More studies will be needed to compare these results to in situ LST data and provide further validation. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
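The inter-comparison above reduces, for each pair of co-registered LST products, to a root mean square difference (RMSD) over valid pixels, as in the paper's Figure 8. A minimal sketch (array names illustrative):

```python
import numpy as np

def rmsd(lst_a, lst_b):
    """Root mean square difference between two co-registered LST arrays (°C),
    ignoring pixels that are NaN in either product."""
    a, b = np.asarray(lst_a, dtype=float), np.asarray(lst_b, dtype=float)
    mask = np.isfinite(a) & np.isfinite(b)
    return float(np.sqrt(np.mean((a[mask] - b[mask]) ** 2)))
```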
Graphical abstract
Figure 1
<p>The spatial domains of the study areas (5 km × 5 km cells) and NLCD-2001 LC within those areas.</p>
Figure 2
<p>An example of Landsat-derived LST for Atlanta on 1 October 2001 and inputs needed for that derivation.</p>
Figure 3
<p>An example of Landsat-derived LST for New York on 14 April 2001 and inputs needed for that derivation.</p>
Figure 4
<p>An example of Landsat-derived LST for Washington, DC on 26 August 2001 and inputs needed for that derivation (the extracted block represents the 5 km × 5 km cell where analyses were performed).</p>
Figure 5
<p>Differences in daily composite of canopy temperature computed with Landcover type-dependent emissivity and constant emissivity. Results are averaged during winter (December, January, and February) and summer (June, July and August).</p>
Figure 6
<p>Time series of LST results from all SiB2 model scenarios, Landsat, and MODIS for Atlanta, Washington, DC, and New York.</p>
Figure 7
<p>Landsat LST versus MODIS LST or SiB2 LST for Atlanta, Washington, DC, and New York.</p>
Figure 8
<p>Root mean square difference (RMSD) between Landsat thermal data (TM, ETM<sup>+</sup>, and TM and ETM<sup>+</sup> combined), each SiB2 model scenario, and MODIS.</p>
Figure 9
<p>Comparison of nighttime SiB2 simulated canopy temperature for each model scenario and MODIS nighttime LST for the 5 km × 5 km cell in Washington, DC.</p>
Figure 10
<p>Landsat TM- and ETM<sup>+</sup>-derived land surface temperatures and SiB2 canopy temperature for each land cover type present in the 5 km × 5 km cell in Washington, DC.</p>
Figure 11
<p>Comparison of SiB2 canopy temperature (<b>blue</b>) to Landsat TM (<b>a</b>–<b>c</b>) and ETM<sup>+</sup> (<b>d</b>–<b>f</b>) LST (<b>red</b>) for each land cover type in the 5 km × 5 km cell in Washington, DC.</p>
Figure 12
<p>SUHI amplitude computed using SiB2 Canopy temperature, Landsat TM- and Landsat ETM<sup>+</sup>-derived LST over the 5 km × 5 km cell in Washington, DC.</p>
8339 KiB  
Article
The Use of C-/X-Band Time-Gapped SAR Data and Geotechnical Models for the Study of Shanghai’s Ocean-Reclaimed Lands through the SBAS-DInSAR Technique
by Antonio Pepe, Manuela Bonano, Qing Zhao, Tianliang Yang and Hanmei Wang
Remote Sens. 2016, 8(11), 911; https://doi.org/10.3390/rs8110911 - 2 Nov 2016
Cited by 66 | Viewed by 6695
Abstract
In this work, we investigate the temporal evolution of ground deformation affecting the ocean-reclaimed lands of the Shanghai (China) megacity, from 2007 to 2016, by applying the Differential Synthetic Aperture Radar Interferometry (DInSAR) technique known as the Small BAseline Subset (SBAS) algorithm. For the analysis, we exploited two sets of non-time-overlapping synthetic aperture radar (SAR) data, acquired from 2007 to 2010 by the ASAR/ENVISAT (C-band) instrument, and from 2014 to 2016 by the X-band COSMO-SkyMed (CSK) sensors. The long time gap (about three years) between the available C- and X-band datasets made the generation of a single, continuous displacement time-series more difficult. Nonetheless, this problem was successfully solved by exploiting time-dependent geotechnical models, which describe the expected temporal evolution of the deformation affecting Shanghai’s ocean-reclaimed platforms. The combined ENVISAT/CSK (vertical) deformation time-series were analyzed to gain insight into the future evolution of displacement signals within the investigated area. As an outcome, we find that ocean-reclaimed lands in Shanghai experienced, between 2007 and 2016, average cumulative (vertical) displacements of up to 25 centimeters. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
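Bridging a multi-year gap between two time-series by fitting a time-dependent model can be illustrated with a toy exponential consolidation law, d(t) = a·(1 − e^(−t/τ)): once the model is fitted to the gapped samples, it interpolates across the missing years. This is a generic stand-in, not the paper's actual geotechnical model; the function and parameter names are hypothetical:

```python
import numpy as np

def fit_consolidation(t, d, taus=np.linspace(0.5, 20.0, 400)):
    """Fit d(t) = a * (1 - exp(-t / tau)) to gapped displacement samples by
    scanning tau and solving for the amplitude a in closed form (least squares)."""
    t, d = np.asarray(t, dtype=float), np.asarray(d, dtype=float)
    best = None
    for tau in taus:
        g = 1.0 - np.exp(-t / tau)       # model basis for this tau
        a = (g @ d) / (g @ g)            # closed-form amplitude
        sse = np.sum((d - a * g) ** 2)   # residual sum of squares
        if best is None or sse < best[2]:
            best = (a, tau, sse)
    return best[0], best[1]
```

With samples on either side of a gap (here, years 0.5–3 and 6–9), the fitted curve provides the continuous displacement history in between.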
Graphical abstract
Figure 1
<p>Landsat 8 image collected on 3 August 2015, which shows the city of Shanghai and its reclaimed coastal areas (since the 1980s) in Nanhui New City. Ocean-reclaimed areas are represented in pink and the black line identifies the Dachang–Zhoupu fault.</p>
Figure 2
<p>Distribution of the available ASAR/ENVISAT (<b>a</b>); and COSMO-SkyMed CSK (<b>b</b>) Synthetic Aperture Radar SAR data in the temporal/perpendicular baseline domain; SAR data are represented by <b>red</b> (<b>a</b>); and <b>black</b> (<b>b</b>) diamonds, whereas connecting arcs are representative of the selected, small baseline interferometric data pairs.</p>
Figure 3
<p>Four selected DInSAR interferograms relevant to the Shanghai area, achieved by exploiting ENVISAT/ASAR acquisitions over ascending orbits (<b>a</b>,<b>b</b>); and CSK SAR data over descending orbits (<b>c</b>,<b>d</b>): (<b>a</b>) 29 October 2007–3 December 2007, ENVISAT interferogram, with a perpendicular baseline = 64 m, and temporal baseline = 35 days; (<b>b</b>) 2 March 2009–20 July 2009, ENVISAT interferogram, with a perpendicular baseline = 5 m, and a temporal baseline = 140 days; (<b>c</b>) 8 January 2014–9 February 2014, CSK interferogram, with a perpendicular baseline = 70 m, and a temporal baseline = 32 days; and (<b>d</b>) 27 January 2015–2 October 2015, CSK interferogram, with a perpendicular baseline = 45 m, and a temporal baseline = 248 days.</p>
Figure 4
<p>LOS SBAS-DInSAR average displacement velocity maps, retrieved using the 2007–2010 ASAR/ENVISAT (<b>a</b>); and the 2014–2016 CSK (<b>b</b>) datasets, and shown on a common geocoded grid. The <b>white</b> rectangles represent the reclaimed area of Lingang New City, where the following comparative analysis has been performed. The black stars identify the locations of the spatial reference point.</p>
Figure 5
<p>(<b>a</b>) Geocoded map of the mean (vertical) deformation velocity during the 2007–2016 time-interval over Lingang New City, as retrieved by combining ENVISAT and CSK LOS displacement time-series; and (<b>b</b>) map of the root mean square error (RMSE) values of the difference between the computed best-fit models and the combined ASAR/ENVISAT and CSK (vertical) deformation time-series. The plots of the ENVISAT/CSK combined (vertical) deformation time-series, calculated in correspondence with the four pixels, labeled as P<sub>1</sub>, P<sub>2</sub>, P<sub>3</sub>, and P<sub>4</sub>, are also shown. Combined time-series are plotted in <b>black</b> (ASAR/ENVISAT) and <b>red</b> (CSK) triangles, whereas the corresponding best-fit models are plotted by continuous <b>black</b> lines.</p>
Figure 6
<p>Map of the entire cumulative displacements on 18 March 2016, as obtained by the combination of ASAR/ENVISAT and the CSK time-series.</p>
19403 KiB  
Article
Towards Slow-Moving Landslide Monitoring by Integrating Multi-Sensor InSAR Time Series Datasets: The Zhouqu Case Study, China
by Qian Sun, Jun Hu, Lei Zhang and Xiaoli Ding
Remote Sens. 2016, 8(11), 908; https://doi.org/10.3390/rs8110908 - 2 Nov 2016
Cited by 63 | Viewed by 7498
Abstract
Although the past few decades have witnessed great development of Synthetic Aperture Radar Interferometry (InSAR) technology in the monitoring of landslides, such applications are limited by geometric distortions and the ambiguity of 1D Line-Of-Sight (LOS) measurements, both fundamental weaknesses of InSAR. Integration of multi-sensor InSAR datasets has recently shown great potential to overcome these two limits. In this study, 16 ascending images from the Advanced Land Observing Satellite (ALOS) and 18 descending images from the Environmental Satellite (ENVISAT) have been integrated to characterize and detect the slow-moving landslides in Zhouqu, China between 2008 and 2010. Geometric distortions are first mapped using the imaging geometric parameters of the SAR data and public Digital Elevation Model (DEM) data of Zhouqu, which allows determination of the most appropriate data assembly for a particular slope. Subsequently, deformation rates along the respective LOS directions of the ALOS ascending and ENVISAT descending tracks are estimated by conducting InSAR time-series analysis with a Temporarily Coherent Point (TCP)-InSAR algorithm. As indicated by the geometric distortion results, 3D deformation rates of the Xieliupo slope on the east bank of the Pai-lung River are finally reconstructed by jointly exploiting the LOS deformation rates from the cross-heading datasets under the surface-parallel flow assumption. The synergistic results of the ALOS and ENVISAT datasets provide a more comprehensive understanding and monitoring of the slow-moving landslides in Zhouqu. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
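Combining two cross-heading LOS rates with the surface-parallel flow assumption amounts to solving a small linear system: two LOS projection equations plus the constraint that the velocity is perpendicular to the terrain normal. A minimal sketch (unit vectors in east/north/up; the specific vectors used below are illustrative, not the paper's geometry):

```python
import numpy as np

def solve_3d_velocity(los_asc, rate_asc, los_desc, rate_desc, normal):
    """Recover the 3D deformation velocity from ascending and descending LOS
    rates under the surface-parallel flow assumption (velocity · normal = 0)."""
    A = np.array([los_asc, los_desc, normal], dtype=float)
    b = np.array([rate_asc, rate_desc, 0.0])
    return np.linalg.solve(A, b)
```

The system is well-posed only when the slope is visible in both geometries and the three row vectors are linearly independent, which is why the geometric-distortion mapping step precedes the decomposition.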
Graphical abstract
Figure 1
<p>Shaded-relief map of Zhouqu. The red box reveals the scope of the study area. Black real and dashed boxes indicate the coverages of the Advanced Land Observing Satellite (ALOS) ascending and Environmental Satellite (ENVISAT) descending datasets, respectively. The white line represents the Pai-lung River.</p>
Figure 2
<p>Geometric distortion effects in the Synthetic Aperture Radar (SAR) image.</p>
Figure 3
<p>(<b>a</b>,<b>b</b>) local incidence angle θ and the azimuth angle α for the used ALOS ascending data; (<b>c</b>,<b>d</b>) local incidence angle θ and the azimuth angle α for the used ENVISAT descending data.</p>
Figure 4
<p>Local terrain slope angle χ (<b>a</b>) and local terrain aspect angle γ (<b>b</b>) for the study area.</p>
Figure 5
<p>Geometric distortions in the ALOS ascending (<b>a</b>) and ENVISAT descending (<b>b</b>) images acquired over Zhouqu. E: Resolution-Enhancing; F: Foreshortening; S: Shadow; L: Layover.</p>
Figure 6
<p>Integrating result of the geometric distortions of the ALOS ascending and ENVISAT descending data. B: appropriate for both data; A: appropriate for ALOS data; E: appropriate for ENVISAT data; N: appropriate for neither data.</p>
Figure 7
<p>Spatial-temporal baselines of the generated ALOS ascending (<b>a</b>) and ENVISAT descending (<b>b</b>) interferograms. Circles and lines represent SAR images and interferometric pairs, respectively.</p>
Figure 8
<p>LOS deformation rate fields over the study area estimated from ALOS ascending (<b>a</b>) and ENVISAT descending (<b>b</b>) datasets, respectively, which are superimposed on Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM)-derived topography. AZ and LOS indicate the azimuth and the LOS directions, respectively. Dashed boxes indicate the regions affected by the landslides.</p>
Figure 9
<p>(<b>a</b>,<b>b</b>) locations of temporarily coherent points (TCPs) in Xieliupo region derived from ALOS ascending and ENVISAT descending results, respectively, which are superimposed on the corresponding geometric distortions. E: Resolution-Enhancing; F: Foreshortening; S: Shadow; L: Layover; (<b>c</b>,<b>d</b>) LOS deformation rates at the Xieliupo region estimated from ALOS ascending and ENVISAT descending datasets, respectively, which are superimposed on the optical image. AZ and LOS indicate the azimuth and the LOS directions, respectively.</p>
Figure 10
<p>3D deformation rates of the Xieliupo landslide, where the color and the arrow represent the vertical and horizontal deformation velocities, respectively. Black dashed line outlines the scope of the Xieliupo landslide [<a href="#B38-remotesensing-08-00908" class="html-bibr">38</a>]. Red line marks the location of Pingding-Huama fault [<a href="#B38-remotesensing-08-00908" class="html-bibr">38</a>].</p>
14402 KiB  
Article
Integrating Data of ASTER and Landsat-8 OLI (AO) for Hydrothermal Alteration Mineral Mapping in Duolong Porphyry Cu-Au Deposit, Tibetan Plateau, China
by Tingbin Zhang, Guihua Yi, Hongmei Li, Ziyi Wang, Juxing Tang, Kanghui Zhong, Yubin Li, Qin Wang and Xiaojuan Bie
Remote Sens. 2016, 8(11), 890; https://doi.org/10.3390/rs8110890 - 28 Oct 2016
Cited by 83 | Viewed by 10440
Abstract
One of the most important characteristics of porphyry copper deposits (PCDs) is the type and distribution pattern of alteration zones, which can be used for screening and recognizing these deposits. Hydrothermal alteration minerals with diagnostic spectral absorption properties in the visible and near-infrared (VNIR) through the shortwave infrared (SWIR) regions can be identified with multispectral and hyperspectral remote sensing data. Six Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) bands in the SWIR have been shown to be effective in mapping Al-OH, Fe-OH, and Mg-OH group minerals, while the five VNIR bands of the Landsat-8 (L8) Operational Land Imager (OLI) are useful for discriminating ferric iron alteration minerals. In areas lacking complete hyperspectral coverage, an opportunity exists to integrate ASTER and L8-OLI (AO) data so that each compensates for the other’s shortcomings in mineral mapping. This study examines the potential of AO data for mineral mapping in an arid area of the Duolong porphyry Cu-Au deposit (Tibetan Plateau, China) using spectral analysis techniques. The results support the following conclusions: (1) the combination of ASTER and L8-OLI data (AO) carries more mineral information than either sensor alone; (2) the phyllic, argillic, and propylitic alteration zones of the Duolong PCD are mapped using ASTER SWIR bands, and iron-bearing mineral information is best mapped using AO VNIR bands; and (3) the integrated multispectral AO data can compensate for the ASTER VNIR bands in iron-bearing mineral mapping in arid and semi-arid areas. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
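The relative band depth (RBD) indices cited in the figure captions, e.g., (B5 + B7)/B6 for Al-OH minerals, are simple band arithmetic followed by percentile thresholding. A sketch under those assumptions (the 94.6% threshold is taken from the captions; array and function names are illustrative):

```python
import numpy as np

def relative_band_depth(b5, b7, b6, percentile=94.6):
    """Al-OH relative band depth (B5 + B7) / B6 on ASTER SWIR reflectance arrays,
    thresholded at a percentile to flag likely altered pixels."""
    rbd = (np.asarray(b5, dtype=float) + np.asarray(b7, dtype=float)) / np.asarray(b6, dtype=float)
    return rbd >= np.percentile(rbd, percentile)
```

A pixel with a deep B6 absorption (low B6 relative to its shoulders B5 and B7) produces a high ratio and is flagged.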
Graphical abstract
Figure 1
<p>(<b>a</b>) Geographical map; (<b>b</b>) sketch tectonic map and generalized geologic map; (<b>c</b>) the Duolong PCD (Modified from [<a href="#B55-remotesensing-08-00890" class="html-bibr">55</a>]).</p>
Figure 2
<p>ASTER and L8-OLI pre-processing steps and AO integrating processing flowchart.</p>
Figure 3
<p>(<b>a</b>) The histogram of spectral offset values between O3 and A1, O4 and A2; (<b>b</b>) Two-dimensional scatterplot and linear fitting of O3 and A1 (Skipping 200 pixels to display); (<b>c</b>) Two-dimensional scatterplot and linear fitting of O4 and A2 (Skipping 200 pixels to display); (<b>d</b>) Mean linear fitting in OLI VNIR bands and ASTER VNIR bands.</p>
Figure 4
<p>The OLI false color composite image (band 7, 5, 2 in RGB) of the Duolong area, Tibetan Plateau showing the field sample locations.</p>
Figure 5
<p>Laboratory and field samples of mineral reflectance spectra in study area. The spectra of Res. are resampled to AO bandpasses. (<b>a</b>) F3: field sample number 3; Muscovite GDS 113 Ruby coming from USGS-SL; (<b>b</b>) F12: field sample number 12; Kaolinite CM9 coming from USGS-SL; (<b>c</b>) F8: field sample number 8; Limonite HS41.3 coming from USGS-SL; (<b>d</b>) F4: field sample number 4; Chlorite SMR-13.a coming from USGS-SL.</p>
Figure 6
<p>The MNF images and eigenvalues (Eig) of AO and ASTER data. (<b>a</b>) The MNF images of AO data; (<b>b</b>) The MNF images of ASTER data; (<b>c</b>) The eigenvalues of AO MNF and ASTER MNF images.</p>
Figure 7
<p>The abundance of Al-OH group minerals of Duolong PCD. (<b>a</b>) Al-OH<sub>1</sub>, ASTER RBD (B5 + B7)/B6 (Threshold 94.6%); (<b>b</b>) Al-OH<sub>2</sub>, ASTER SWIR bands MTMF (Infeasibility &lt; 5); (<b>c</b>) Al-OH<sub>3</sub>, ASTER SWIR bands MTMF (Infeasibility &lt; 5); (<b>d</b>) False color composite image of Al-OH composition in ASTER BRs B5/B6 (R), B7/B6 (G), B7/B5 (B) (Threshold 97.5%). The green dotted line shows the altered zone outline from field investigations of Duolong PCD. The white areas indicate background values.</p>
Figure 8
<p>The field photographs of the study areas. (<b>a</b>) regional view of propylitic zone of Duobuza PCD in site 2; (<b>b</b>) regional view of phyllic zone of Duobuza PCD in site 1; (<b>c</b>) ferruginization in propylitic zone of Duobuza PCD in site 4; (<b>d</b>) ferruginization and kaolinization in phyllic zone of Bolong PCD in site 7; (<b>e</b>) ferruginization of cryptoexplosion breccia in phyllic zone, Bolong PCD, site 10; (<b>f</b>) granodiorite-porphyry outcrop in site 9. For the geographical location of the sites see <a href="#remotesensing-08-00890-f004" class="html-fig">Figure 4</a>.</p>
Figure 9
<p>The abundance of Mg-OH group minerals of Duolong PCD. (<b>a</b>) Mg-OH<sub>1</sub>, ASTER RBD (B6 + B9)/B8 (Threshold 98.6%); (<b>b</b>) Mg-OH<sub>2</sub>, ASTER SWIR bands MTMF (Infeasibility &lt; 5). The green dotted line shows the altered zone outline from field investigations of Duolong PCD. The white areas indicate background values.</p>
Figure 10
<p>The abundance of ferric iron and ferrous iron group minerals of Duolong PCD. (<b>a</b>) FI<sub>1</sub>, ASTER RB B2/B1 (Threshold 90.8%); (<b>b</b>) FI<sub>1</sub>, AO RB B4/B1 (Threshold 98.5%); (<b>c</b>) FI<sub>1</sub>, ASTER RB B4/B3 (Threshold 95.5%); (<b>d</b>) FI<sub>1</sub>, AO RB B7/B6 (Threshold 97.8%); (<b>e</b>) FI<sub>2</sub>, AO VNIR bands MTMF (Infeasibility &lt; 5); (<b>f</b>) FI<sub>3</sub>, ASTER RB B5/B4 (Threshold 94.9%). The green dotted line shows the altered zone outline from field investigations of Duolong PCD. The white areas indicate background values.</p>
Figure 11
<p>Simplified alteration map illustrates the distribution of hydrothermal alteration zones in the Duolong PCD, the phyllic, argillic and propylitic zones are produced using MTMF of ASTER SWIR data, the gossans are produced using BR B4/B1 and MTMF of AO VNIR data.</p>
2550 KiB  
Article
Integration of Aerial Thermal Imagery, LiDAR Data and Ground Surveys for Surface Temperature Mapping in Urban Environments
by Emanuele Mandanici, Paolo Conte and Valentina A. Girelli
Remote Sens. 2016, 8(10), 880; https://doi.org/10.3390/rs8100880 - 23 Oct 2016
Cited by 17 | Viewed by 8158
Abstract
A single-band surface temperature retrieval method is proposed, aiming at achieving better accuracy by integrating aerial thermal images with LiDAR data and ground surveys. LiDAR data allow the generation of a high-resolution digital surface model and a detailed modeling of the Sky-View Factor (SVF). Ground surveys of surface temperature and emissivity, in turn, are used to estimate the atmospheric parameters involved in the model (through a bounded least-squares adjustment) and for a first assessment of the accuracy of the results. The RMS of the difference between the surface temperatures computed from the model and those measured at the check sites ranges between 0.8 °C and 1.0 °C, depending on the algorithm used to calculate the SVF. Results are in general better than those obtained without considering the SVF and prove the effectiveness of integrating different data sources. The proposed approach has the advantage of avoiding explicit modeling of the atmospheric conditions, which is often difficult to achieve with the desired accuracy; on the other hand, it is highly dependent on the accuracy of the data measured on the ground. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Graphical abstract
Full article ">Figure 1
<p>Ortho-mosaic of the thermal images acquired over the test area in Bologna and technical cartography (white lines). The red polygon delimits the area in which SVF was mapped.</p>
Full article ">Figure 2
<p>Comparison between temperature data registered by the thermal camera and by the thermocouple located at the TC02 site, during the time of the flight. The thermal camera observed an area of approximately 6 m<sup>2</sup>.</p>
Full article ">Figure 3
<p>Portion of the SVF map obtained from the digital surface model derived from LiDAR data. The map is produced by the ENVI SVF Tool. Red circles indicate the locations of the ground surveys.</p>
Full article ">Figure 4
<p>Determination of the maximum elevation angle of the obstacles as a function of the azimuthal direction (red line) for the site labeled “GAZZ”. The 2D view of the panoramic scan is colored in grey levels according to the amplitudes of the registered echoes.</p>
Full article ">Figure 5
<p>Histograms of the SVF values obtained from the four tested algorithms (<b>left</b>) and differences in retrieved surface temperature (<b>right</b>) produced by the use of these algorithms versus ignoring the SVF in the correction process.</p>
Full article ">Figure 6
<p>Map of the SVF difference between the spherical and planar solutions produced by SkyHelios.</p>
Full article ">Figure 7
<p>Average surface temperature of the building roofs in the test area.</p>
Article
Incorporating Diversity into Self-Learning for Synergetic Classification of Hyperspectral and Panchromatic Images
by Xiaochen Lu, Junping Zhang, Tong Li and Ye Zhang
Remote Sens. 2016, 8(10), 804; https://doi.org/10.3390/rs8100804 - 29 Sep 2016
Cited by 15 | Viewed by 4743
Abstract
Derived from semi-supervised learning and active learning approaches, self-learning (SL) was recently developed for the synergetic classification of hyperspectral (HS) and panchromatic (PAN) images. Combining image segmentation and active learning techniques, SL aims to select and label the informative unlabeled samples automatically, thereby improving the classification accuracy under small-sample conditions. This paper presents an improved synergetic classification scheme based on the concept of self-learning for HS and PAN images. The investigated scheme considers three basic rules, namely the identity rule, the uncertainty rule, and the diversity rule. By integrating the diversity of samples into the SL scheme, a more stable classifier is trained using fewer samples. Experiments on three synthetic and real HS and PAN images reveal that the diversity criterion can avoid the problem of biased sampling and has a certain advantage over the original self-learning approach. Full article
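The uncertainty and diversity rules can be sketched as two-stage sample selection: rank unlabeled samples by closeness to the SVM hyperplane, then greedily keep candidates that are far from those already selected. A minimal numpy illustration (function and variable names are ours; the paper's similarity- and cluster-based diversity rules are more elaborate):

```python
import numpy as np

def select_informative(X, decision_values, n_select, pool_size=3):
    """Pick n_select samples: uncertain first, then spread out (diversity)."""
    # Uncertainty rule: samples closest to the SVM hyperplane (|f(x)| small).
    pool = np.argsort(np.abs(decision_values))[:pool_size]
    selected = [pool[0]]                 # most uncertain sample
    remaining = list(pool[1:])
    while len(selected) < n_select and remaining:
        # Diversity rule: pick the candidate farthest (in feature space)
        # from everything already selected (greedy max-min distance).
        dists = [min(np.linalg.norm(X[i] - X[j]) for j in selected)
                 for i in remaining]
        best = remaining[int(np.argmax(dists))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy features and signed distances to a hypothetical SVM hyperplane:
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [10.0, 10.0]])
scores = np.array([0.01, 0.02, 0.05, 3.0])
picked = select_informative(X, scores, n_select=2)
print(sorted(int(i) for i in picked))
```

Note how the near-duplicate of the first sample is skipped in favor of a more distant uncertain sample, which is exactly the bias the diversity rule is meant to avoid.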
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Graphical abstract
Full article ">Figure 1
<p>Experimental data sets. (<b>a</b>) The AVIRIS HS image over San Diego; (<b>b</b>) The ground truth image of San Diego; (<b>c</b>) The AVIRIS HS image over Indian Pines region; (<b>d</b>) The ground truth image of Indian Pines region; (<b>e</b>) The University of Houston campus HS image; (<b>f</b>) The PAN image; (<b>g</b>) The ground truth images of the University of Houston campus.</p>
Full article ">Figure 2
<p>The main framework of the presented scheme.</p>
Full article ">Figure 3
<p>SVM hyperplanes and selected unknown samples. (<b>a</b>) The hyperplane obtained by the training set; (<b>b</b>) The retrained hyperplane after first iteration of AL. Four candidates have been selected; (<b>c</b>,<b>d</b>) The retrained hyperplane after first iteration using uncertainty rule and diversity rule. (<b>c</b>) Represents the similarity-based diversity rule, and (<b>d</b>) represents the cluster-based diversity rule. (The dot-dash lines in (<b>b</b>–<b>d</b>) denote the hyperplane in (<b>a</b>).)</p>
Full article ">Figure 4
<p>Overall accuracies (as a function of AL iteration) obtained for the data set of San Diego. (<b>a</b>) 5 labeled samples, MBT; (<b>b</b>) 10 labeled samples, MBT; (<b>c</b>) 15 labeled samples, MBT; (<b>d</b>) 5 labeled samples, MMS; (<b>e</b>) 10 labeled samples, MMS; (<b>f</b>) 15 labeled samples, MMS. The three figures in each line show the OAs using different amounts of training samples, i.e., 5 labeled samples per class with 40 selected samples per iteration, 10 labeled samples per class with 64 selected samples per iteration, 15 labeled samples per class with 80 selected samples per iteration.</p>
Full article ">Figure 5
<p>Overall accuracies (as a function of AL iteration) obtained for the data set of the Indian Pines region. (<b>a</b>) 5 labeled samples, MBT; (<b>b</b>) 10 labeled samples, MBT; (<b>c</b>) 15 labeled samples, MBT; (<b>d</b>) 5 labeled samples, MMS; (<b>e</b>) 10 labeled samples, MMS; (<b>f</b>) 15 labeled samples, MMS. The three figures in each line show the OAs using different amounts of training samples, i.e., 5 labeled samples per class with 27 selected samples per iteration, 10 labeled samples per class with 36 selected samples per iteration, 15 labeled samples per class with 45 selected samples per iteration.</p>
Full article ">Figure 6
<p>Overall accuracies (as a function of AL iteration) obtained for the data set of the University of Houston. (<b>a</b>) 5 labeled samples, MBT; (<b>b</b>) 10 labeled samples, MBT; (<b>c</b>) 15 labeled samples, MBT; (<b>d</b>) 5 labeled samples, MMS; (<b>e</b>) 10 labeled samples, MMS; (<b>f</b>) 15 labeled samples, MMS. The three figures in each line show the OAs using different amounts of training samples, i.e., 5 labeled samples per class with 36 selected samples per iteration, 10 labeled samples per class with 60 selected samples per iteration, 15 labeled samples per class with 72 selected samples per iteration.</p>
Article
Automated Ortho-Rectification of UAV-Based Hyperspectral Data over an Agricultural Field Using Frame RGB Imagery
by Ayman Habib, Youkyung Han, Weifeng Xiong, Fangning He, Zhou Zhang and Melba Crawford
Remote Sens. 2016, 8(10), 796; https://doi.org/10.3390/rs8100796 - 24 Sep 2016
Cited by 53 | Viewed by 10133
Abstract
Low-cost Unmanned Airborne Vehicles (UAVs) equipped with consumer-grade imaging systems have emerged as a potential remote sensing platform that could satisfy the needs of a wide range of civilian applications. Among these applications, UAV-based agricultural mapping and monitoring have attracted significant attention from both the research and professional communities. The interest in UAV-based remote sensing for agricultural management is motivated by the need to maximize crop yield. Remote sensing-based crop yield prediction and estimation are primarily based on imaging systems with different spectral coverage and resolution (e.g., RGB and hyperspectral imaging systems). Due to the data volume, RGB imaging is based on frame cameras, while hyperspectral sensors are primarily push-broom scanners. To cope with the limited endurance and payload constraints of low-cost UAVs, the agricultural research and professional communities have to rely on consumer-grade and light-weight sensors. However, the geometric fidelity of information derived from push-broom hyperspectral scanners is quite sensitive to the position and orientation established through a direct geo-referencing unit onboard the imaging platform (i.e., an integrated Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS)). This paper presents an automated framework for the integration of frame RGB images, push-broom hyperspectral scanner data, and consumer-grade GNSS/INS navigation data for accurate geometric rectification of the hyperspectral scenes. The approach relies on utilizing the navigation data, together with a modified Speeded-Up Robust Feature (SURF) detector and descriptor, to automate the identification of conjugate features in the RGB and hyperspectral imagery. The SURF modification takes into consideration the available direct geo-referencing information to improve the reliability of the matching procedure in the presence of repetitive texture within a mechanized agricultural field. Identified features are then used to improve the geometric fidelity of the previously ortho-rectified hyperspectral data. Experimental results from two real datasets show that the geometric rectification of the hyperspectral data was improved by almost one order of magnitude. Full article
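The effect of the geo-referencing constraint on matching can be illustrated with a toy example: restricting the descriptor search to a spatial radius predicted from the navigation data prevents the match from locking onto a repetitive-texture twin far from the true location. A hedged numpy sketch (the radius and feature values are invented; the paper also constrains the scale domain):

```python
import numpy as np

def match_constrained(pos_t, desc_t, pos_s, desc_s, radius):
    """Match a template feature to the search image: best descriptor
    among candidates within `radius` of the geo-predicted position."""
    d_spatial = np.linalg.norm(pos_s - pos_t, axis=1)
    candidates = np.where(d_spatial <= radius)[0]
    if candidates.size == 0:
        return None
    d_desc = np.linalg.norm(desc_s[candidates] - desc_t, axis=1)
    return int(candidates[np.argmin(d_desc)])

# Template feature: position predicted from navigation data, plus descriptor.
pos_t, desc_t = np.array([0.0, 0.0]), np.array([1.0, 0.0])
# Search features: index 0 is the true match nearby; index 1 is a
# repetitive-texture twin far away whose descriptor is even more similar.
pos_s = np.array([[1.0, 0.0], [50.0, 0.0]])
desc_s = np.array([[0.9, 0.1], [1.0, 0.0]])

unconstrained = match_constrained(pos_t, desc_t, pos_s, desc_s, radius=1e9)
constrained = match_constrained(pos_t, desc_t, pos_s, desc_s, radius=5.0)
print(unconstrained, constrained)
```

Without the spatial constraint the descriptor alone picks the distant twin; with it, the nearby true correspondence wins.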
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Figure 1
<p>Flowchart of the proposed methodology for the automated registration of RGB-based and partially-rectified hyperspectral orthophotos.</p>
Full article ">Figure 2
<p>Portion of a partially-rectified hyperspectral orthophoto that shows a repetitive pattern within an agricultural field (RGB bands are displayed).</p>
Full article ">Figure 3
<p>Extracted features over an agricultural field at different scales: (<b>a</b>) fine-scale features; and (<b>b</b>) coarse-scale features.</p>
Full article ">Figure 4
<p>Search space constraints in the spatial and scale domains. The radius and the height of the cylinder represent the extent of the search space in the spatial and scale domains, respectively, for a feature in the template image (i.e., marked white circle).</p>
Full article ">Figure 5
<p>Comparison of the matching performance for acquired imagery over an agricultural field: (<b>a</b>) SURF and (<b>b</b>) modified SURF strategies.</p>
Full article ">Figure 6
<p>Conceptual basis of the implemented procedure for describing the geometric transformations between neighboring partially-rectified hyperspectral orthophotos.</p>
Full article ">Figure 7
<p>Imaging geometry of a nadir-looking vertical push-broom hyperspectral scanner.</p>
Full article ">Figure 8
<p>Sensors and UAVs utilized for dataset acquisition: (<b>a</b>) hyperspectral push-broom scanner mounted on (<b>b</b>) a fixed-wing UAV and (<b>c</b>) RGB frame camera mounted on (<b>d</b>) a quad copter UAV.</p>
Full article ">Figure 9
<p>Sample RGB image over the test field with a close-up of one of the targets.</p>
Full article ">Figure 10
<p>RGB-based orthophoto of the test field derived from the captured frame images on 25 July 2015.</p>
Full article ">Figure 11
<p>Mosaicked partially-rectified orthophoto of hyperspectral data over the test field captured on 21 August 2015 (only the RGB bands are displayed) showing heading errors and misalignment between neighboring flight lines.</p>
Full article ">Figure 12
<p>Mosaicked refined hyperspectral orthophoto through the proposed approach: (<b>a</b>) acquired data on 4 August 2015 and (<b>b</b>) acquired data on 21 August 2015.</p>
Full article ">Figure 13
<p>Subareas of the test field for visual inspection of the registration result: (<b>a</b>) partially-rectified hyperspectral orthophoto and (<b>b</b>) refined hyperspectral orthophoto generated by the proposed approach.</p>
Full article ">Figure 13 Cont.
<p>Subareas of the test field for visual inspection of the registration result: (<b>a</b>) partially-rectified hyperspectral orthophoto and (<b>b</b>) refined hyperspectral orthophoto generated by the proposed approach.</p>
Full article ">Figure 14
<p>Close-up of chessboard orthophoto mosaics generated from the RGB-based orthophoto and refined hyperspectral orthophoto through the proposed approach: (<b>a</b>) acquired hyperspectral data on 4 August 2015 and (<b>b</b>) acquired hyperspectral data on 21 August 2015.</p>
Article
Exploratory Analysis of Dengue Fever Niche Variables within the Río Magdalena Watershed
by Austin Stanforth, Max J. Moreno-Madriñán and Jeffrey Ashby
Remote Sens. 2016, 8(9), 770; https://doi.org/10.3390/rs8090770 - 19 Sep 2016
Cited by 12 | Viewed by 7561
Abstract
Previous research on Dengue Fever has involved laboratory tests or study areas with less diverse temperature and elevation ranges than are found in Colombia; therefore, preliminary research was needed to identify location-specific attributes of Dengue Fever transmission. Environmental variables derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Tropical Rainfall Measuring Mission (TRMM) satellites were combined with population variables and statistically compared against reported cases of Dengue Fever in the Río Magdalena watershed, Colombia. Three factor analysis models were investigated to analyze variable patterns: a population model, a population density model, and an empirical Bayesian estimation model. Results identified varying levels of Dengue Fever transmission risk and environmental characteristics that support, and extend, the research literature. Multiple temperature metrics, elevation, and vegetation composition were among the more contributory variables found to identify potential future outbreak locations. Full article
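The factor analysis step reduces correlated environmental variables to component loadings. A minimal numpy sketch in that spirit, using plain PCA on the correlation matrix of toy data as a stand-in for the paper's three models (data values are invented):

```python
import numpy as np

# Toy matrix: columns are variables (e.g., two correlated temperature
# metrics and one unrelated variable); rows are observations.
X = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, -1.0],
              [3.0, 6.0, -1.0],
              [4.0, 8.0, 1.0]])

# Standardize each variable, then eigen-decompose the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = (Z.T @ Z) / X.shape[0]
eigval, eigvec = np.linalg.eigh(R)                 # ascending order
order = np.argsort(eigval)[::-1]                   # strongest component first
scale = np.sqrt(np.clip(eigval[order], 0.0, None)) # guard tiny negatives
loadings = eigvec[:, order] * scale                # component loadings
print(np.round(np.abs(loadings[:, 0]), 2))         # loadings on component 1
```

The two collinear variables load strongly on the first component while the unrelated variable does not, which is the pattern a loadings table (such as the one in Figure 3 of this paper) makes visible.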
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Graphical abstract
Full article ">Figure 1
<p>Study site identification of the Magdalena Watershed in Colombia, demonstrating its geographical location, elevation diversity, and presence of larger populated urban environments.</p>
Full article ">Figure 2
<p>Graphic depicting stages of analysis.</p>
Full article ">Figure 3
<p>Model output of principal component loadings. The strongest component loading was bolded for each variable.</p>
Full article ">Figure 4
<p>Population density and EBE model graphical comparisons to the reported cases per 10,000 population.</p>
Article
Enhanced Compositional Mapping through Integrated Full-Range Spectral Analysis
by Meryl L. McDowell and Fred A. Kruse
Remote Sens. 2016, 8(9), 757; https://doi.org/10.3390/rs8090757 - 15 Sep 2016
Cited by 10 | Viewed by 7086
Abstract
We developed a method to enhance compositional mapping from spectral remote sensing through the integration of visible to near infrared (VNIR, ~0.4–1 µm), shortwave infrared (SWIR, ~1–2.5 µm), and longwave infrared (LWIR, ~8–13 µm) data. Spectral information from the individual ranges was first analyzed independently and then the resulting compositional information in the form of image endmembers and apparent abundances was integrated using ISODATA cluster analysis. Independent VNIR, SWIR, and LWIR analyses of a study area near Mountain Pass, California identified image endmembers representing vegetation, manmade materials (e.g., metal, plastic), specific minerals (e.g., calcite, dolomite, hematite, muscovite, gypsum), and general lithology (e.g., sulfate-bearing, carbonate-bearing, and silica-rich units). Integration of these endmembers and their abundances produced a final full-range classification map incorporating much of the variation from all three spectral ranges. The integrated map and its 54 classes provide additional compositional information that is not evident in the VNIR, SWIR, or LWIR data alone, which allows for more complete and accurate compositional mapping. A supplemental examination of hyperspectral LWIR data and comparison with the multispectral LWIR data used in the integration illustrates its potential to further improve this approach. Full article
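The integration step described above stacks the per-range endmember abundances into one vector per pixel and clusters them. ISODATA additionally splits and merges clusters; the illustration below uses plain k-means with fixed initial centroids as a simplified, deterministic stand-in (data values are invented):

```python
import numpy as np

def kmeans(X, init_idx, n_iter=10):
    """Plain Lloyd's k-means with fixed initial centroids (deterministic)."""
    centroids = X[list(init_idx)].astype(float).copy()
    for _ in range(n_iter):
        # Assign each pixel's stacked abundance vector to nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centroids)):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    return labels

# Stacked (VNIR | SWIR | LWIR) abundance vectors for six toy "pixels":
X = np.array([[0.9, 0.1, 0.8, 0.1], [1.0, 0.0, 0.9, 0.2], [0.8, 0.2, 0.7, 0.1],
              [0.1, 0.9, 0.1, 0.9], [0.0, 1.0, 0.2, 0.8], [0.2, 0.8, 0.1, 1.0]])
labels = kmeans(X, init_idx=(0, 3))
print(labels)
```

Pixels whose abundance patterns agree across all three spectral ranges fall into the same full-range class, which is the behavior behind the 54-class integrated map.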
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Graphical abstract
Full article ">Figure 1
<p>The region near Mountain Pass, California. The basemap image is from the ArcGIS Online World Imagery basemap by Esri. The solid black outline gives the approximate extent of the study area covered by the spectral imagery. Significant terrain features are labeled. Note that the Solar Facility location is marked but not visible in the image, which was acquired pre-construction.</p>
Full article ">Figure 2
<p>Simplified surface geologic map indicating general lithology and notable features, based on the surficial and geologic maps of Schmidt and McMackin [<a href="#B29-remotesensing-08-00757" class="html-bibr">29</a>], Miller et al. in [<a href="#B31-remotesensing-08-00757" class="html-bibr">31</a>] and Hewett [<a href="#B32-remotesensing-08-00757" class="html-bibr">32</a>]. The solid black outline gives the approximate extent of the study area covered by the spectral imagery. Uncolored surfaces within the study area represent Quaternary alluvium deposits.</p>
Full article ">Figure 3
<p>Methods flow chart summarizing the process used in the independent spectral analysis of each wavelength range and the subsequent integration and classification of the resulting endmember abundances. The rounded boxes indicate data products, and the rectangular boxes indicate procedures.</p>
Full article ">Figure 4
<p>Example of matched filter (MF) score vs. Infeasibility score scatterplot illustrating the selection of an infeasibility threshold at the edge of the data background distribution separating feasible and infeasible MF scores. The class shown here is shortwave infrared Class 2.</p>
Full article ">Figure 5
<p>Selected visible to near infrared (VNIR) image endmember spectra and example reference spectra. <sup>1</sup> USGS Digital Spectral Library [<a href="#B61-remotesensing-08-00757" class="html-bibr">61</a>]; <sup>2</sup> ASTER Spectral Library [<a href="#B62-remotesensing-08-00757" class="html-bibr">62</a>].</p>
Full article ">Figure 6
<p>False color composite images of selected visible to near infrared (VNIR) endmember classes: (<b>a</b>) 6/15/33 in red/green/blue; (<b>b</b>) 12/32/33 in red/green/blue. Pixels with approximate abundance values less than 40% for the selected endmembers are masked in gray.</p>
Full article ">Figure 7
<p>Selected shortwave infrared (SWIR) image endmember spectra and reference spectra of: (<b>a</b>) mineral phases; (<b>b</b>) vegetation; (<b>c</b>) manmade materials. <sup>1</sup> USGS Digital Spectral Library [<a href="#B61-remotesensing-08-00757" class="html-bibr">61</a>]; <sup>2</sup> ASTER Spectral Library [<a href="#B62-remotesensing-08-00757" class="html-bibr">62</a>]; <sup>3</sup> Turner et al. [<a href="#B63-remotesensing-08-00757" class="html-bibr">63</a>]. * spectrum has been vertically offset to facilitate comparison and plotting.</p>
Full article ">Figure 8
<p>False color composite images of selected shortwave infrared (SWIR) endmember classes: (<b>a</b>) 3/6/5 in red/green/blue; (<b>b</b>) 2/7/8 in red/green/blue. Pixels with approximate abundance values less than 40% for the selected endmembers are masked in gray.</p>
Full article ">Figure 9
<p>Selected longwave infrared (LWIR) image endmember spectra modeled predominantly in areas of: (<b>a</b>) natural terrain; (<b>b</b>) manmade surfaces, and example reference spectra convolved to the MODIS/ASTER Airborne Simulator (MASTER) bandpasses. <sup>1</sup> ASTER Spectral Library [<a href="#B62-remotesensing-08-00757" class="html-bibr">62</a>]. * spectrum’s vertical scale has been decreased to facilitate comparison and plotting.</p>
Full article ">Figure 10
<p>False color composite images of selected longwave infrared (LWIR) endmember classes: (<b>a</b>) 3/10/28 in red/green/blue; (<b>b</b>) 11/17/20 in red/green/blue. Pixels with approximate abundance values less than 40% for the selected endmembers are masked in gray.</p>
Full article ">Figure 11
<p>Classification map produced from integration of the visible to near infrared (VNIR), shortwave infrared (SWIR), and longwave infrared (LWIR) image endmembers. The map subsets shown in subsequent figures are outlined in black.</p>
Full article ">Figure 12
<p>Mean endmember apparent abundance values for selected classes produced by integration of the visible to near infrared (VNIR), shortwave infrared (SWIR), and longwave infrared (LWIR) image endmembers.</p>
Full article ">Figure 13
<p>Examination of individual wavelength range results incorporated in the full-range characterization of a siliciclastic rock unit: (<b>a</b>) subset of the integrated classification map with outlines of general surface units and regions of interest corresponding to subsequent plots; (<b>b</b>) graph of mean endmember abundance values for selected regions of interest, offset for clarity; (<b>c</b>) abundance distribution images for notable endmembers; (<b>d</b>) mean visible to near infrared (VNIR) and shortwave infrared (SWIR) spectra of selected regions of interest; (<b>e</b>) mean longwave infrared (LWIR) spectra of selected regions of interest.</p>
Full article ">Figure 14
<p>Examination of individual wavelength range results incorporated in the full-range characterization of a quartz-rich sandstone unit: (<b>a</b>) subset of the integrated classification map with outlines of general surface units and regions of interest corresponding to subsequent plots; (<b>b</b>) graph of mean endmember abundance values for selected regions of interest, offset for clarity; (<b>c</b>) abundance distribution images for notable endmembers; (<b>d</b>) mean visible to near infrared (VNIR) and shortwave infrared (SWIR) spectra of selected regions of interest; (<b>e</b>) mean longwave infrared (LWIR) spectra of selected regions of interest.</p>
Full article ">Figure 15
<p>Examination of individual wavelength range results incorporated in the full-range characterization of a carbonate unit: (<b>a</b>) subset of the integrated classification map with outlines of general surface units and regions of interest corresponding to subsequent plots; (<b>b</b>) graph of mean endmember abundance values for selected regions of interest, offset for clarity; (<b>c</b>) abundance distribution images for notable endmembers; (<b>d</b>) mean visible to near infrared (VNIR) and shortwave infrared (SWIR) spectra of selected regions of interest; (<b>e</b>) mean longwave infrared (LWIR) spectra of selected regions of interest.</p>
Full article ">Figure 16
<p>False color composite mosaic of the Mako emissivity data with bands 8.59/11.32/12.10 µm displayed in red/green/blue. The white boxes outline regions of interest discussed in the text and <a href="#remotesensing-08-00757-f017" class="html-fig">Figure 17</a>. The black boxes outline the data subsets shown in <a href="#remotesensing-08-00757-f018" class="html-fig">Figure 18</a> and <a href="#remotesensing-08-00757-f019" class="html-fig">Figure 19</a>.</p>
Full article ">Figure 17
<p>Mean emissivity spectra of regions of interest (ROI) M1-M7 from the Mako data at: (<b>a</b>) Mako native spectral resolution; (<b>b</b>) convolved to the spectral response functions of the MODIS/ASTER Airborne Simulator (MASTER) longwave infrared bands. See <a href="#remotesensing-08-00757-f016" class="html-fig">Figure 16</a> for ROI locations; gray dashed lines indicate band positions displayed as red (R), green (G), and blue (B) in <a href="#remotesensing-08-00757-f016" class="html-fig">Figure 16</a>.</p>
Full article ">Figure 18
<p>Detailed examination of a subset of the Mako emissivity data: (<b>a</b>) outlines of selected regions of interest (ROIs) on the false color composite image with bands 8.59/11.32/12.10 µm displayed as red/green/blue; (<b>b</b>) mean emissivity spectra of the ROIs; (<b>c</b>) mean emissivity spectra of the ROIs convolved to the spectral response functions of the MODIS/ASTER Airborne Simulator (MASTER) longwave infrared bands; (<b>d</b>) ROI outlines on Mako data resampled to the pixel size of our MASTER data set; (<b>e</b>) mean emissivity spectrum of each ROI plotted with the spectra of the new resampled pixels to which it contributes.</p>
Full article ">Figure 19
<p>Detailed examination of a subset of the Mako emissivity data: (<b>a</b>) outlines of selected regions of interest (ROIs) on the false color composite image with bands 8.59/11.32/12.10 µm displayed as red/green/blue; (<b>b</b>) mean emissivity spectra of the ROIs; (<b>c</b>) mean emissivity spectra of the ROIs convolved to the spectral response functions of the MODIS/ASTER Airborne Simulator (MASTER) longwave infrared bands; (<b>d</b>) ROI outlines on Mako data resampled to the pixel size of our MASTER data set; (<b>e</b>) mean emissivity spectrum of each ROI plotted with the spectra of the new resampled pixels to which it contributes.</p>
Article
An Image Matching Algorithm Integrating Global SRTM and Image Segmentation for Multi-Source Satellite Imagery
by Xiao Ling, Yongjun Zhang, Jinxin Xiong, Xu Huang and Zhipeng Chen
Remote Sens. 2016, 8(8), 672; https://doi.org/10.3390/rs8080672 - 19 Aug 2016
Cited by 29 | Viewed by 8101
Abstract
This paper presents a novel image matching method for multi-source satellite images, which integrates global Shuttle Radar Topography Mission (SRTM) data and image segmentation to achieve robust and numerous correspondences. The method first generates epipolar lines as a geometric constraint, assisted by global SRTM data, after which seed points are selected and matched. To produce more reliable matching results, a region segmentation-based matching propagation is proposed, whereby region segments are extracted by image segmentation and used as a spatial constraint. Moreover, a similarity measure integrating Distance, Angle and Normalized Cross-Correlation (DANCC), which considers both geometric and radiometric similarity, is introduced to find the optimal correspondences. Experiments using typical satellite images acquired from Resources Satellite-3 (ZY-3), Mapping Satellite-1, SPOT-5 and Google Earth demonstrate that the proposed method is able to produce reliable and accurate matching results. Full article
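One plausible way to combine the three DANCC ingredients is a weighted sum of the radiometric NCC and exponential penalties on the distance and angle deviations from the local disparity model. The weights and penalty forms below are illustrative, not the paper's exact formulation:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two image patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def dancc(patch_l, patch_r, d_dist, d_angle, w=(0.5, 0.25, 0.25)):
    """Similarity combining radiometric NCC with geometric penalties on
    the deviation of distance and angle from the local disparity model."""
    geom_d = np.exp(-abs(d_dist))    # 1 when the distance is consistent
    geom_a = np.exp(-abs(d_angle))   # 1 when the angle is consistent
    return w[0] * ncc(patch_l, patch_r) + w[1] * geom_d + w[2] * geom_a

p = np.array([[1.0, 2.0], [3.0, 4.0]])
perfect = dancc(p, p, d_dist=0.0, d_angle=0.0)   # identical, consistent
shifted = dancc(p, p, d_dist=2.0, d_angle=0.5)   # identical, inconsistent
print(round(perfect, 3), round(shifted, 3))
```

A geometrically inconsistent candidate scores lower even when its patch is radiometrically identical, which is how such a measure suppresses mismatches in repetitive terrain.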
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
Graphical abstract
Full article ">Figure 1
<p>Examples of multi-source satellite images for image matching: (<b>a</b>) nadir-viewing image of the ZY-3 satellite; (<b>b</b>) forward-viewing image of Mapping Satellite-1; (<b>c</b>) backward-viewing image of the ZY-3 satellite; (<b>d</b>) panchromatic image of the SPOT-5 satellite; (<b>e</b>) Google Earth image under the geographic coordinate system; (<b>f</b>) nadir-viewing image of Mapping Satellite-1.</p>
Full article ">Figure 2
<p>The framework of the proposed matching algorithm. DANCC, Distance, Angle and Normalized Cross-Correlation.</p>
Full article ">Figure 3
<p>Examples of the distribution maps generated by the Local Contrast (LC) operator: (<b>a</b>) the satellite image of Test 1; (<b>b</b>) the distribution map of Test 1; (<b>c</b>) the satellite image of Test 2; (<b>d</b>) the distribution map of Test 2.</p>
Full article ">Figure 4
<p>The principle of the iterative elevation determination.</p>
Full article ">Figure 5
<p>The principle of local geometric and radiometric distortion rectification.</p>
Full article ">Figure 6
<p>Epipolar lines generated by GC<math display="inline"> <semantics> <msup> <mrow/> <mn>3</mn> </msup> </semantics> </math> and the developed method: (<b>a</b>) a checkpoint on the test image; (<b>b</b>) zoomed view of epipolar lines; the red one is generated by the developed method, and the blue one is generated by GC<math display="inline"> <semantics> <msup> <mrow/> <mn>3</mn> </msup> </semantics> </math>.</p>
Full article ">Figure 7
<p>The matching result produced by the developed method; the correspondences are marked by red rectangles, and the outliers are marked by green circles.</p>
Full article ">Figure 8
<p>The results of the image segmentation and region merging process: (<b>a</b>) the initial segmentation; (<b>b</b>) the segmentation after the merging process; the false segments are removed, and the segmentation quality is clearly improved.</p>
Full article ">Figure 9
<p>Candidate prediction in a region. The red point is a seed point; the remaining points marked in yellow are predicted on the right image. The points marked in black are the predicted points.</p>
Full article ">Figure 10
<p>Illustration of the matching propagation. The root node marked in red and two starting nodes marked in green determine the local disparity <math display="inline"> <semantics> <mrow> <mi mathvariant="sans-serif">Δ</mi> <mi>λ</mi> </mrow> </semantics> </math> and <math display="inline"> <semantics> <mrow> <mi mathvariant="sans-serif">Δ</mi> <mi>θ</mi> </mrow> </semantics> </math>, then the circular searching area can be estimated.</p>
Full article ">Figure 11
<p>Illustration of the sample point description for DANCC.</p>
Full article ">Figure 12
<p>Matching performance comparison with different methods: (<b>a</b>) zoomed view of a small part of the left image; the root node is marked in red; a starting node is marked in green; and the remaining point is marked in yellow; (<b>b</b>) zoomed view of a small part of right image and the correspondence determined by DANCC is marked in yellow; (<b>c</b>) the distribution of similarity values in the searching area using the method presented by Xiong [<a href="#B17-remotesensing-08-00672" class="html-bibr">17</a>,<a href="#B18-remotesensing-08-00672" class="html-bibr">18</a>]; (<b>d</b>) the distribution of similarity values in the searching area using NCC; (<b>e</b>) the distribution of similarity values in the searching area using DANCC.</p>
Full article ">Figure 13
<p>Experimental results using different weight values: (<b>a</b>) the statistics of the correct matching rate using an image pair (<a href="#remotesensing-08-00672-f001" class="html-fig">Figure 1</a>a,b); (<b>b</b>) the statistics of the correct matching rate using an image pair (<a href="#remotesensing-08-00672-f001" class="html-fig">Figure 1</a>c,d); (<b>c</b>) the statistics of the correct matching rate using an image pair (<a href="#remotesensing-08-00672-f001" class="html-fig">Figure 1</a>e,f).</p>
Full article ">Figure 14
<p>Comparison of the matching results produced by the three methods: (<b>a</b>) the matching result produced by TAACC; (<b>b</b>) the matching result produced by GC<math display="inline"> <semantics> <msup> <mrow/> <mn>3</mn> </msup> </semantics> </math>; (<b>c</b>) the matching result produced by the proposed method.</p>
Full article ">Figure 15
<p>The initial corresponding triangulations produced by TAACC. The false triangle constraints cause mismatches as shown in <a href="#remotesensing-08-00672-f014" class="html-fig">Figure 14</a>a.</p>
Full article ">Figure 16
<p>Comparison of the matching results produced by the three methods: (<b>a</b>) the matching result produced by TAACC; (<b>b</b>) the matching result produced by GC<math display="inline"> <semantics> <msup> <mrow/> <mn>3</mn> </msup> </semantics> </math>; (<b>c</b>) the matching result produced by the proposed method.</p>
Full article ">Figure 17
<p>The matching result produced by the proposed method. The zoomed views of a local region, which is marked with a rectangle in each image, are provided to show the matching details.</p>
Full article ">Figure 18
<p>The elevation differences at the checkpoints: the blue line is obtained from the reference DEM using the checkpoints’ geographic coordinates from Google Earth, and the red one from the reference DEM via the iterative approach described in <a href="#sec2dot2-remotesensing-08-00672" class="html-sec">Section 2.2</a> using their image coordinates in the Mapping Satellite-1 image.</p>
Full article ">
3850 KiB  
Article
Application of Helmert Variance Component Based Adaptive Kalman Filter in Multi-GNSS PPP/INS Tightly Coupled Integration
by Zhouzheng Gao, Wenbin Shen, Hongping Zhang, Maorong Ge and Xiaoji Niu
Remote Sens. 2016, 8(7), 553; https://doi.org/10.3390/rs8070553 - 29 Jun 2016
Cited by 40 | Viewed by 7566
Abstract
The integration of the Global Positioning System (GPS) and the Inertial Navigation System (INS) based on Real-time Kinematic (RTK) and Single Point Positioning (SPP) technology has been applied as a powerful approach to kinematic positioning and attitude determination. However, the accuracy of the RTK- and SPP-based GPS/INS integration modes degrades visibly with increasing user-to-base-station distance and deteriorating pseudo-range quality. In order to overcome these weaknesses, the tightly coupled integration of GPS Precise Point Positioning (PPP) and INS was proposed recently. Because of the rapid development of the multi-constellation Global Navigation Satellite System (multi-GNSS), we introduce multi-GNSS observations into the tightly coupled integration of PPP and INS in this paper. Meanwhile, in order to weaken the impact of low-quality GNSS observations and an inaccurate state model on the performance of the multi-GNSS PPP/INS tightly coupled integration, a Helmert variance component estimation based adaptive Kalman filter is employed in the algorithm implementation. Finally, a set of vehicle-borne GPS + BeiDou + GLONASS and Micro-Electro-Mechanical-Systems (MEMS) INS data is analyzed to evaluate the performance of the algorithm. The statistics indicate that the performance of the multi-GNSS PPP/INS tightly coupled integration can be enhanced significantly in terms of both position accuracy and convergence time. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
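The HVCE-based adaptive weighting can be illustrated with a minimal sketch (a simplified, hypothetical implementation, not the authors' code): for each observation group, such as GPS, BeiDou, and GLONASS pseudo-ranges, a Helmert-style variance component is estimated from the group's weighted residuals and redundancy, and the group weights are rescaled until the components agree. All names are illustrative.

```python
import numpy as np

def helmert_reweight(residuals_by_group, redundancy_by_group, n_iter=10, tol=1e-6):
    """Simplified Helmert variance component estimation.

    For each observation group, estimate a unit variance
    sigma^2 = v' v / r (weighted residual quadratic form over the group's
    redundancy) and rescale the group weight until all components agree.
    Returns a multiplicative weight factor per group.
    """
    groups = list(residuals_by_group)
    scale = {g: 1.0 for g in groups}  # current weight factor per group
    for _ in range(n_iter):
        sigma2 = {}
        for g in groups:
            v = residuals_by_group[g] * np.sqrt(scale[g])  # weighted residuals
            sigma2[g] = float(v @ v) / redundancy_by_group[g]
        ref = sigma2[groups[0]]  # normalize against the first group
        updates = {g: ref / sigma2[g] for g in groups}
        scale = {g: scale[g] * updates[g] for g in groups}
        if all(abs(u - 1.0) < tol for u in updates.values()):
            break
    return scale
```

In a tightly coupled filter, the returned factors would rescale each group's measurement noise covariance before the Kalman update, downweighting constellations whose residuals are inconsistent with their assumed accuracy.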
Graphical abstract
Full article ">Figure 1
<p>Implementation of the Helmert variance component estimation (HVCE) based adaptive Kalman filter for the multi-GNSS PPP/INS tightly coupled integration.</p>
Full article ">Figure 2
<p>Velocities (<b>top</b>) and attitudes of the land vehicle experiment on 19 June 2013 in Wuhan, China (the <b>middle</b> subfigure shows roll and pitch components, and the <b>bottom</b> subfigure shows the heading direction); the purple box shows the enlarged velocity time series of the Up direction in the top subfigure.</p>
Full article ">Figure 3
<p>Sky-plot of available GPS (G) satellites (<b>Blue</b>), BeiDou (B) satellites (<b>Red</b>), and GLONASS (R) satellites (<b>Green</b>) of land vehicle experiment on 19 June 2013 in Wuhan, China, respectively; the lower-left subfigure is the trajectory of this test.</p>
Full article ">Figure 4
<p>Availability of GPS (defined as PRN &lt; 35), BeiDou (defined as 35 &lt; PRN &lt; 70), and GLONASS (defined as 70 &lt; PRN &lt; 105) of the land vehicle experiment on 19 June 2013 in Wuhan, China; the red dashed box shows a GNSS partial outage caused by the user’s observing environment.</p>
Full article ">Figure 5
<p>Available satellite numbers (<b>top</b>) of GPS, G + R, G + B, and G + B + R, and the corresponding PDOP values (<b>bottom</b>) of the vehicle-borne experiment.</p>
Full article ">Figure 6
<p>Position offsets time series in the navigation frame (North-East-Up) calculated by differencing the reference values and the PPP solutions using the single- (GPS) and the multi-GNSS (G + R, G + B, and G + R + B) data.</p>
Full article ">Figure 7
<p>RMS of position offsets from PPP mode (<b>top</b>), the PPP/INS tightly coupled integration mode without (<b>middle</b>) and with (<b>bottom</b>) the HVCE based adaptive scheme, using GPS, G + R, G + B, and G + R + B GNSS data.</p>
Full article ">Figure 8
<p>Position offsets time series in navigation frame (North-East-Up) calculated by making differences between the references and the solutions of the Kalman filter based PPP/INS tightly coupled integration using the single- (GPS) and the multi-GNSS (G + R, G + B, and G + R + B) data.</p>
Full article ">Figure 9
<p>Position offsets time series in the navigation frame (North-East-Up) calculated by differencing the reference values and the solutions of the PPP/INS tightly coupled integration based on the Helmert variance component adaptive Kalman filter, using the single- (GPS) and the multi-GNSS (G + R, G + B, and G + R + B) data.</p>
Full article ">Figure 10
<p>RMS of velocity offsets from PPP mode (<b>top</b>), the PPP/INS tightly coupled integration mode without (<b>middle</b>) and with (<b>bottom</b>) the HVCE based adaptive scheme, using GPS, G + R, G + B, and G + R + B GNSS data.</p>
Full article ">Figure 11
<p>RMS of attitude offsets from the PPP/INS tightly coupled integration mode without (<b>top</b>) and with the HVCE based adaptive scheme (<b>bottom</b>) using GPS, G + R, G + B, and G + R + B GNSS data.</p>
Full article ">Figure 12
<p>Weight of GPS, BeiDou, and GLONASS calculated by the PPP/INS tightly coupled integration using G + R + B, G + R, G + B data by the HVCE based adaptive Kalman filter; here, “GNSS (X)” means the weight of “GNSS” calculated in the “X” PPP/INS integration mode.</p>
Full article ">Figure 13
<p>RMS of attitude offsets from the PPP/INS tightly coupled integration mode without (<b>top</b>) and with the HVCE based adaptive scheme (<b>bottom</b>) using GPS, G + R, G + B, and G + R + B GNSS data: GPS (PRN &lt; 35), BeiDou (35 &lt; PRN &lt; 70), and GLONASS (70 &lt; PRN &lt; 105).</p>
Full article ">Figure 14
<p>Position offsets time series of the Kalman filter based PPP/INS tightly coupled integration using the multi-GNSS (G + R, G + B, and G + R + B) data in the GNSS outage simulation test.</p>
Full article ">Figure 15
<p>Position offsets time series of GNSS outage simulation test of the PPP/INS tightly coupled integration using the HVCE based adaptive Kalman filter with the multi-GNSS (G + R, G + B, and G + R + B) data.</p>
Full article ">
7479 KiB  
Article
Generation of Land Cover Maps through the Fusion of Aerial Images and Airborne LiDAR Data in Urban Areas
by Yongmin Kim
Remote Sens. 2016, 8(6), 521; https://doi.org/10.3390/rs8060521 - 22 Jun 2016
Cited by 7 | Viewed by 6376
Abstract
Satellite images and aerial images with high spatial resolution have improved visual interpretation capabilities. The use of high-resolution images has rapidly grown and has been extended to various fields, such as military surveillance, disaster monitoring, and cartography. However, many problems have been encountered in which a single object has a variety of spectral properties and different objects have similar spectral characteristics in terms of land cover. These problems are quite noticeable, especially for building objects in urban environments. In the land cover classification process, they directly decrease the classification accuracy by causing misclassification within single objects as well as between objects. This study proposes a method of increasing the accuracy of land cover classification by addressing the problem of misclassifying building objects through the output-level fusion of aerial images and airborne Light Detection and Ranging (LiDAR) data. The new method consists of the following three steps: (1) generation of the segmented image via a process that performs adaptive dynamic range linear stretching and modified seeded region growing algorithms; (2) extraction of building information from airborne LiDAR data using a planar filter and binary supervised classification; and (3) generation of a land cover map using the output-level fusion of the two results and object-based classification. The new method was tested at four experimental sites against the Min-Max method and the SSI-nDSM method, followed by a visual assessment and a quantitative accuracy assessment through comparison with reference data. In the accuracy assessment, the new method exhibits various advantages, including reduced noise and more precise classification results. Additionally, the new method improved the overall accuracy by more than 5% over the comparative evaluation methods, and the overall and building accuracies showed similar patterns of highs and lows. Thus, the new method is judged to have successfully solved, through an output-level fusion technique, the classification inaccuracies that high-resolution images of urban environments often produce. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
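Step (3), the output-level fusion, can be reduced to a simple per-pixel rule (an illustrative sketch, not the paper's full object-based procedure; the class codes are hypothetical): the LiDAR-derived building mask overrides the image-based label for the building class.

```python
import numpy as np

BUILDING = 4      # hypothetical class code for "building"
UNCLASSIFIED = 0  # placeholder for pixels to be re-labelled later

def output_level_fusion(spectral_labels, building_mask):
    """Output-level fusion of an image-based classification with a
    LiDAR-derived building mask: trust the mask for the building class
    and the spectral classification everywhere else."""
    fused = spectral_labels.copy()
    fused[building_mask] = BUILDING  # the mask wins inside building areas
    # demote spectral "building" pixels outside the mask (likely misclassified)
    fused[(~building_mask) & (spectral_labels == BUILDING)] = UNCLASSIFIED
    return fused
```

The point of fusing at the output level is that each sensor is only trusted for the class it discriminates best: LiDAR height/planarity for buildings, spectral data for the remaining classes.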
Figure 1
<p>Flowchart of classification generation using the new method.</p>
Full article ">Figure 2
<p>Image segmentation: (<b>a</b>) original image; (<b>b</b>) radiometric-enhanced image; and (<b>c</b>) segmented image (red lines are segment boundaries).</p>
Full article ">Figure 3
<p>Building information extracted from airborne LiDAR: (<b>a</b>) DTM; (<b>b</b>) building information (white areas).</p>
Full article ">Figure 4
<p>Results of output-level fusion: (<b>a</b>) building area and (<b>b</b>) non-building area.</p>
Full article ">Figure 5
<p>Aerial images of experimental sites: (<b>a</b>) site 1; (<b>b</b>) site 2; (<b>c</b>) site 3; and (<b>d</b>) site 4.</p>
Full article ">Figure 6
<p>Final results for site 1: (<b>a</b>) aerial image; (<b>b</b>) Max-Min method; (<b>c</b>) SSI, nDSM method; and (<b>d</b>) the new method.</p>
Full article ">Figure 7
<p>Results for site 2: (<b>a</b>) aerial image; (<b>b</b>) Max-Min method; (<b>c</b>) SSI, nDSM method; and (<b>d</b>) the new method.</p>
Full article ">Figure 8
<p>Final results for site 3: (<b>a</b>) aerial image; (<b>b</b>) Max-Min method; (<b>c</b>) SSI, nDSM method; and (<b>d</b>) the new method.</p>
Full article ">Figure 9
<p>Results for site 4: (<b>a</b>) aerial image; (<b>b</b>) Max-Min method; (<b>c</b>) SSI, nDSM method; and (<b>d</b>) the new method.</p>
Full article ">Figure 10
<p>Overall accuracy of three classification results.</p>
Full article ">Figure 11
<p>Producer’s and user’s accuracies for the building class: (<b>a</b>) site 1; (<b>b</b>) site 2; (<b>c</b>) site 3; and (<b>d</b>) site 4.</p>
Full article ">
5662 KiB  
Article
Merging Alternate Remotely-Sensed Soil Moisture Retrievals Using a Non-Static Model Combination Approach
by Seokhyeon Kim, Robert M. Parinussa, Yi Y. Liu, Fiona M. Johnson and Ashish Sharma
Remote Sens. 2016, 8(6), 518; https://doi.org/10.3390/rs8060518 - 21 Jun 2016
Cited by 15 | Viewed by 6108
Abstract
Soil moisture is an important variable in the coupled hydrologic and climate system. In recent years, microwave-based soil moisture products have been shown to be a viable alternative to in situ measurements. A popular way to measure the performance of soil moisture products is to calculate the temporal correlation coefficient (R) against in situ measurements or other appropriate reference datasets. In this study, an existing linear combination method for improving R was modified to allow for a non-static, or nonstationary, model combination as the basis for improving remotely-sensed surface soil moisture. Previous research had noted that two soil moisture products retrieved using the Japan Aerospace Exploration Agency (JAXA) and Land Parameter Retrieval Model (LPRM) algorithms from the same Advanced Microwave Scanning Radiometer 2 (AMSR2) sensor are spatially complementary in terms of R against a suitable reference over a fixed period. Accordingly, a linear combination was proposed to maximize R using a set of spatially-varying, but temporally-fixed, weights. Even though this approach showed promising results, there was room for further improvement, in particular through non-static or dynamic weights that account for the time-varying nature of the combination being approximated. The dynamic weighting was achieved by using a moving window: the optimal weighting factors were determined for the data lying within the window and then used to dynamically combine the two parent products. A number of different window sizes were investigated. We show improved performance for the dynamically-combined product over the static linear combination. Generally, shorter time windows outperform the static approach, and a 60-day window is suggested as the optimum. Results were validated against in situ measurements collected from 124 stations on different continents. The mean R of the dynamically-combined products was found to be 0.57 and 0.62 for the cases using the European Centre for Medium-Range Weather Forecasts Reanalysis-Interim (ERA-Interim) and Modern-Era Retrospective Analysis for Research and Applications Land (MERRA-Land) reanalysis products as the reference, respectively, outperforming the statically-combined products (0.55 and 0.54). Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
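The moving-window combination can be sketched as follows (an illustrative grid-search version; the paper's optimal-weight derivation may differ): at each time step, the weight w in [0, 1] that maximizes the correlation of w*x1 + (1 - w)*x2 with the reference inside a centered window is found and applied to the current sample.

```python
import numpy as np

def dynamic_combine(x1, x2, ref, window=60):
    """Dynamically combine two parent products by choosing, per time step,
    the convex weight that maximizes the in-window correlation (R) with
    the reference series."""
    n = len(ref)
    half = window // 2
    candidate_w = np.linspace(0.0, 1.0, 101)
    combined = np.full(n, np.nan)
    for t in range(n):
        lo, hi = max(0, t - half), min(n, t + half + 1)  # centered window
        a, b, r = x1[lo:hi], x2[lo:hi], ref[lo:hi]
        best_w, best_r = 0.5, -np.inf
        for w in candidate_w:
            c = w * a + (1.0 - w) * b
            if np.std(c) == 0.0 or np.std(r) == 0.0:
                continue  # correlation undefined for a constant series
            rho = np.corrcoef(c, r)[0, 1]
            if rho > best_r:
                best_r, best_w = rho, w
        combined[t] = best_w * x1[t] + (1.0 - best_w) * x2[t]
    return combined
```

Shrinking the window lets the weights track seasonally changing retrieval skill, at the cost of noisier weight estimates, which is the trade-off behind the 60-day optimum reported above.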
Graphical abstract
Full article ">Figure 1
<p>Locations of 124 ground stations from 10 networks used for the comparison with combined products.</p>
Full article ">Figure 2
<p>Schematic diagram for dynamic linear combination. T denotes the period defined by the window (<span class="html-italic">i.e</span>., T = (t − N/2):(t + N/2)). Therefore, a bold symbol that has T as its subscript means a vector in the period T, and a non-bold symbol with t as the subscript represents a value at the point in time <span class="html-italic">t</span>.</p>
Full article ">Figure 3
<p>Results from experiments that use ERA-Interim as the reference for various window sizes, N60, N90 and N120. Each panel shows the R between the reference and (<b>a</b>) JAXA; (<b>b</b>) LPRM; (<b>c</b>) static; (<b>d</b>) N60; (<b>e</b>) N90 and (<b>f</b>) N120; the more bluish colours in the maps indicate higher R against the reference; the overall performance for the various scenarios is summarized in the boxplot (<b>g</b>).</p>
Full article ">Figure 4
<p>Comparison between combined soil moisture products. For ERA-Interim as the reference, (<b>a</b>) The differences in R between the static and N60 products against the reference (<span class="html-italic">i.e</span>., R of N60 minus R of static) and (<b>b</b>) the mean weights that were used for the dynamic combination using the reference over the two-year study period; (<b>c</b>) and (<b>d</b>) show corresponding results with (<b>a</b>) and (<b>b</b>) when using MERRA-Land as the reference.</p>
Full article ">Figure 5
<p>Results from the simulation experiment. (<b>a</b>) The x-axis indicates Euclidean distances (ξ) calculated by Equation (7), representing the qualities of the parent products, and the y-axis, <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>d</mi> <mi>y</mi> <mi>n</mi> </mrow> </msub> </mrow> </semantics> </math> or <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>s</mi> <mi>t</mi> <mi>a</mi> </mrow> </msub> </mrow> </semantics> </math>. The two dashed lines show the linear regressions of all results from the dynamic and static combinations, respectively; (<b>b</b>) The x-axis indicates N sizes, and the y-axis the differences between <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>d</mi> <mi>y</mi> <mi>n</mi> </mrow> </msub> </mrow> </semantics> </math> and <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>s</mi> <mi>t</mi> <mi>a</mi> </mrow> </msub> </mrow> </semantics> </math> (<span class="html-italic">i.e</span>., <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>d</mi> <mi>y</mi> <mi>n</mi> </mrow> </msub> </mrow> </semantics> </math> − <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>s</mi> <mi>t</mi> <mi>a</mi> </mrow> </msub> </mrow> </semantics> </math>).</p>
Full article ">Figure 6
<p>(<b>a</b>) Box plots showing combination performances against <span class="html-italic">in situ</span> measurements with the N60 and the two references. The labels on the x-axis indicate parent or statically-/dynamically-combined products with the references, and the y-axis R between the product and the <span class="html-italic">in situ</span> measurements. The value in each box is the mean of R. Comparison against <span class="html-italic">in situ</span> measurements from the ISMN for dynamically combined products using the N60 and (<b>b</b>) ERA-Interim and (<b>c</b>) MERRA-Land as the reference, respectively. The x-axis presents R between a dynamic product and the <span class="html-italic">in situ</span> measurements from a station, the y-axis R between a static product and the <span class="html-italic">in situ</span> measurements.</p>
Full article ">Figure 7
<p>Dynamic and static combination results using MERRA-Land as the reference at (<b>a</b>) Sandy Ridge station in Soil Climate Analysis Network and (<b>b</b>) Sandstone-6-W station in U.S. Climate Reference Network. Each panel shows static/dynamic weights (<b>top</b>), as well as time series of statically- and dynamically-combined soil moisture products (<b>bottom</b>).</p>
Full article ">Figure 8
<p>Combination performances with the quality of parent products and reference against <span class="html-italic">in situ</span> measurements. (<b>a</b>) ERA-Interim; (<b>b</b>) MERRA-Land. The x-axis for each panel presents the Euclidean distances (ξ) calculated by Equation (8), and the y-axis <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>i</mi> <mi>n</mi> <mi>s</mi> <mi>i</mi> <mi>t</mi> <mi>u</mi> <mo>−</mo> <mi>s</mi> <mi>t</mi> <mi>a</mi> </mrow> </msub> </mrow> </semantics> </math> or <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>i</mi> <mi>n</mi> <mi>s</mi> <mi>i</mi> <mi>t</mi> <mi>u</mi> <mo>−</mo> <mi>d</mi> <mi>y</mi> <mi>n</mi> </mrow> </msub> </mrow> </semantics> </math>. Linear regression lines represent the tendencies of both cases.</p>
Full article ">Figure 9
<p>Combination performances with reference quality against <span class="html-italic">in situ</span> measurements. The x-axis presents R between <span class="html-italic">in situ</span> measurements and the references (<math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>i</mi> <mi>n</mi> <mi>s</mi> <mi>i</mi> <mi>t</mi> <mi>u</mi> <mo>−</mo> <mi>r</mi> <mi>e</mi> <mi>f</mi> </mrow> </msub> </mrow> </semantics> </math>), and the y-axis R between <span class="html-italic">in situ</span> measurements and statically-/dynamically-combined products (<math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>i</mi> <mi>n</mi> <mi>s</mi> <mi>i</mi> <mi>t</mi> <mi>u</mi> <mo>−</mo> <mi>s</mi> <mi>t</mi> <mi>a</mi> </mrow> </msub> </mrow> </semantics> </math> and <math display="inline"> <semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>i</mi> <mi>n</mi> <mi>s</mi> <mi>i</mi> <mi>t</mi> <mi>u</mi> <mo>−</mo> <mi>d</mi> <mi>y</mi> <mi>n</mi> </mrow> </msub> </mrow> </semantics> </math>). Linear regression lines are added to represent the average tendencies of both cases.</p>
Full article ">
14161 KiB  
Article
Bayesian Method for Building Frequent Landsat-Like NDVI Datasets by Integrating MODIS and Landsat NDVI
by Limin Liao, Jinling Song, Jindi Wang, Zhiqiang Xiao and Jian Wang
Remote Sens. 2016, 8(6), 452; https://doi.org/10.3390/rs8060452 - 27 May 2016
Cited by 77 | Viewed by 8221
Abstract
Studies related to vegetation dynamics in heterogeneous landscapes often require Normalized Difference Vegetation Index (NDVI) datasets with both high spatial resolution and frequent coverage, which cannot be satisfied by a single sensor due to technical limitations. In this study, we propose a new method called NDVI-Bayesian Spatiotemporal Fusion Model (NDVI-BSFM) for accurately and effectively building frequent high spatial resolution Landsat-like NDVI datasets by integrating Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat NDVI. Experimental comparisons with the results obtained using other popular methods (i.e., the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM), the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM), and the Flexible Spatiotemporal DAta Fusion (FSDAF) method) showed that our proposed method has the following advantages: (1) it can obtain more accurate estimates; (2) it can retain more spatial detail; (3) its prediction accuracy is less dependent on the quality of the MODIS NDVI on the specific prediction date; and (4) it produces smoother NDVI time series profiles. All of these advantages demonstrate the strengths and the robustness of the proposed NDVI-BSFM in providing reliable high spatial and temporal resolution NDVI datasets to support other land surface process studies. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
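At its core, a Bayesian spatiotemporal fusion of this kind updates a prior NDVI estimate (e.g. from a MODIS-derived background time series) with a new high-resolution observation. A minimal per-pixel Gaussian sketch (not NDVI-BSFM itself, whose model is richer) weights the prior and the observation by their inverse variances:

```python
def bayesian_fuse(prior_ndvi, prior_var, obs_ndvi, obs_var):
    """Gaussian Bayesian update: the posterior NDVI is the inverse-variance
    weighted mean of the prior estimate and the new observation, and the
    posterior variance shrinks accordingly."""
    w_prior = 1.0 / prior_var
    w_obs = 1.0 / obs_var
    post = (w_prior * prior_ndvi + w_obs * obs_ndvi) / (w_prior + w_obs)
    post_var = 1.0 / (w_prior + w_obs)
    return post, post_var
```

With equal variances the posterior is the plain average; a noisier observation (larger obs_var) pulls the result toward the prior, which is why a Bayesian fusion degrades gracefully when the MODIS NDVI on the prediction date is of poor quality.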
Graphical abstract
Full article ">Figure 1
<p>Three-dimensional plot of the correlation coefficients between the overlapped MODIS and Landsat NDVI at different locations.</p>
Full article ">Figure 2
<p>Flowchart illustrating the process employed to obtain prior MODIS NDVI time series for each land-cover type.</p>
Full article ">Figure 3
<p>Schematic diagram showing the multi-year average NDVI time series and the prior MODIS NDVI time series for farmland.</p>
Full article ">Figure 4
<p>Flowchart illustrating the NDVI-Bayesian Spatiotemporal Fusion Model (NDVI-BSFM).</p>
Full article ">Figure 5
<p>Schematic diagram showing the initial NDVI curve, the predicted point and observations.</p>
Full article ">Figure 6
<p>Location of the Huailai area and its corresponding Landsat 8/OLI image (NIR-red-green composite).</p>
Full article ">Figure 7
<p>Input paired NDVI images of the Huailai site: Landsat NDVI images acquired on: DOY = 94 (<b>a</b>); DOY = 206 (<b>b</b>); and DOY = 286 (<b>c</b>); and their corresponding MODIS NDVI images (<b>d</b>–<b>f</b>).</p>
Full article ">Figure 8
<p>The comparison of results predicted by three methods (NDVI-BSFM, STARFM and ESTARFM) under the condition of good MODIS NDVI: (<b>a</b>) original Landsat NDVI image on DOY = 238; (<b>b</b>) NDVI image on DOY = 238 predicted by NDVI-BSFM; (<b>c</b>) NDVI image predicted by STARFM; (<b>d</b>) NDVI image predicted by ESTARFM; (<b>e</b>) MODIS NDVI image on DOY = 241; (<b>f</b>) the absolute error distribution diagram of NDVI-BSFM; (<b>g</b>) the absolute error distribution diagram of STARFM; (<b>h</b>) the absolute error distribution diagram of ESTARFM.</p>
Full article ">Figure 9
<p>The comparison of results predicted by three methods (NDVI-BSFM, STARFM and ESTARFM) under the condition of poor MODIS NDVI: (<b>a</b>) original Landsat NDVI image on DOY = 217; (<b>b</b>) NDVI image on DOY = 217 predicted by NDVI-BSFM; (<b>c</b>) NDVI image predicted by STARFM; (<b>d</b>) NDVI image predicted by ESTARFM; (<b>e</b>) MODIS NDVI image on DOY = 217; (<b>f</b>) the absolute error distribution diagram of NDVI-BSFM; (<b>g</b>) the absolute error distribution diagram of STARFM; (<b>h</b>) the absolute error distribution diagram of ESTARFM.</p>
Full article ">Figure 10
<p>Landsat-like NDVI time series for the study area built using NDVI-BSFM for every eight-day period.</p>
Full article ">Figure 11
<p>The accuracy of the predicted NDVI obtained using STARFM and NDVI-BSFM: (<b>a</b>) AAD; (<b>b</b>) AARD; (<b>c</b>) <span class="html-italic">r</span> and (<b>d</b>) RMSE.</p>
Full article ">Figure 12
<p>Validation of the temporal NDVI profiles produced using the two methods for four measured Landsat pixels.</p>
Full article ">Figure 13
<p>Location of the Yunnan area and its corresponding Landsat 8/OLI image (NIR-red-green composite).</p>
Full article ">Figure 14
<p>Input Landsat NDVI image on the paired date (DOY = 104) (<b>a</b>); and its corresponding MODIS NDVI image (<b>b</b>); Actual Landsat NDVI image on the prediction date (DOY = 56) (<b>c</b>); and its corresponding MODIS NDVI image (<b>d</b>); NDVI image on DOY = 56 predicted: by NDVI-BSFM (<b>e</b>); by FSDAF (<b>f</b>); and by STARFM (<b>g</b>).</p>
Full article ">
13126 KiB  
Article
A Comparative Study of Cross-Product NDVI Dynamics in the Kilimanjaro Region—A Matter of Sensor, Degradation Calibration, and Significance
by Florian Detsch, Insa Otte, Tim Appelhans and Thomas Nauss
Remote Sens. 2016, 8(2), 159; https://doi.org/10.3390/rs8020159 - 19 Feb 2016
Cited by 27 | Viewed by 8829
Abstract
While satellite-based monitoring of vegetation activity at the earth’s surface is of vital importance for many eco-climatological applications, the degree of agreement among certain sensors and products providing estimates of the Normalized Difference Vegetation Index (NDVI) has been found to vary considerably. In order to assess the extent of such differences in highly heterogeneous terrain, we analyze and compare intra-annual seasonal fluctuations and long-term monotonic trends (2003–2012) in the Kilimanjaro region, Tanzania. The considered NDVI datasets include the Moderate Resolution Imaging Spectroradiometer (MODIS) products from Terra and Aqua, Collections 5 and 6, and the 3rd Generation Global Inventory Modeling and Mapping Studies (GIMMS) product. The degree of agreement in seasonal fluctuations is assessed by calculating a pairwise Index of Association (IOA<sub>s</sub>), whereas long-term trends are derived from the trend-free pre-whitened Mann–Kendall test. On the seasonal scale, the two Terra-MODIS products (and, accordingly, the two Aqua-MODIS products) are best associated with each other, indicating that the seasonal signal remained largely unaffected by the new Collection 6 calibration approach. On the long-term scale, we find that the negative impacts of band ageing on Terra-MODIS NDVI have been accounted for in Collection 6, which now distinctly outweighs Aqua-MODIS in terms of greening trends. GIMMS NDVI, by contrast, fails to capture the small-scale seasonal and trend patterns characteristic of the highly fragmented landscape, likely owing to its coarse spatial resolution. As a short digression, we also demonstrate that the amount of false discoveries in the determined trend fraction is distinctly higher for p < 0.05 (52.6%) than for p < 0.001 (2.2%), which should point the way for any future studies focusing on the reliable deduction of long-term monotonic trends. Full article
(This article belongs to the Special Issue Multi-Sensor and Multi-Data Integration in Remote Sensing)
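The trend statistic behind these maps can be sketched with a plain Mann–Kendall test (the trend-free pre-whitening and tie corrections used in the paper are omitted here): the S statistic sums the signs of all pairwise differences, Kendall's τ normalizes it, and a normal approximation yields a two-sided p-value.

```python
import math

def mann_kendall(x):
    """Plain Mann-Kendall trend test (no pre-whitening, no tie correction):
    returns Kendall's tau and a two-sided normal-approximation p-value."""
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])  # sign of each pairwise difference
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    tau = 2.0 * s / (n * (n - 1))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # continuity-corrected z statistic
    z = 0.0 if s == 0 else (s - math.copysign(1.0, s)) / math.sqrt(var_s)
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return tau, p
```

Pixels are then classed as "significant" (p < 0.05) or "conclusive" (p < 0.001); as the abstract notes, the stricter threshold drastically reduces the false discovery rate among detected trends.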
Graphical abstract
Full article ">Figure 1
<p>(<b>a</b>) Spatial coverage of the 1/12-degree GIMMS grid (white dashed) including consecutive pixel numbering superimposed upon an aerial image of the study area (Map data: Google, TerraMetrics [<a href="#B47-remotesensing-08-00159" class="html-bibr">47</a>]); (<b>b</b>) spatial pattern of IOA<sub>s</sub> between NDVI<sub>3g</sub> and MODIS CMG-based NDVI<sub>Aqua-C5</sub> (<a href="#sec2dot2dot1-remotesensing-08-00159" class="html-sec">Section 2.2.1</a>); (<b>c</b>) raw time series (2003–2012) of NDVI<sub>3g</sub> (purple) and MODIS CMG-based NDVI<sub>Aqua-C5</sub> (orange) including 10%–90% quantile range calculated from all 250-m NDVI<sub>Aqua-C5</sub> pixels falling inside each 8-km GIMMS pixel (light orange). The underlying coordinate reference system (CRS) in (<b>a</b>) and (<b>b</b>) is EPSG:4326 and the digital elevation model (black solid) is based on toposheets at scale 1:50000 digitized by Ong’injo <span class="html-italic">et al.</span> [<a href="#B48-remotesensing-08-00159" class="html-bibr">48</a>].</p>
Full article ">Figure 2
<p>(<b>a</b>) Bing Maps aerial image of the study area (downloaded via OpenStreetMap [<a href="#B49-remotesensing-08-00159" class="html-bibr">49</a>]); spatial patterns of “significant” values of τ (2003–2012) for (<b>b</b>) NDVI<sub>3g</sub>; (<b>c</b>) NDVI<sub>Terra-C5</sub>; (<b>d</b>) NDVI<sub>Aqua-C5</sub>; (<b>e</b>) NDVI<sub>Terra-C6</sub>; and (<b>f</b>) NDVI<sub>Aqua-C6</sub>. Also included are the 1-d Kernel density estimates for the 250-m MODIS products originating from Collections 5 (connecting <b>c</b> and <b>d</b>) and 6 (connecting <b>e</b> and <b>f</b>). The underlying CRS and additional data are the same as in <a href="#remotesensing-08-00159-f001" class="html-fig">Figure 1</a>.</p>
Full article ">Figure 3
<p>MD<sub>τ</sub> calculated from “significant” values of τ (<math display="inline"> <mrow> <mi>p</mi> <mo>&lt;</mo> <mn>0.05</mn> </mrow> </math>) for each pair of MODIS NDVI products.</p>
Full article ">Figure 4
<p>Spatial patterns of “conclusive” values of τ (2003–2012) for (<b>a</b>) NDVI<sub>Terra-C5</sub>; (<b>b</b>) NDVI<sub>Aqua-C5</sub>; (<b>c</b>) NDVI<sub>Terra-C6</sub>; and (<b>d</b>) NDVI<sub>Aqua-C6</sub>. Also included are the 1-d Kernel density estimates for the 250-m MODIS products originating from Collections 5 (connecting <b>a</b> and <b>b</b>) and 6 (connecting <b>c</b> and <b>d</b>). The underlying CRS and additional data are the same as in <a href="#remotesensing-08-00159-f001" class="html-fig">Figure 1</a>.</p>
Full article ">Figure 5
<p>MD<sub>τ</sub> calculated from “conclusive” values of τ (<math display="inline"> <mrow> <mi>p</mi> <mo>&lt;</mo> <mn>0.001</mn> </mrow> </math>) for each pair of MODIS NDVI products.</p>
Full article ">Figure 6
<p>FDR depending on the <span class="html-italic">Power</span> of the applied test and the trend prevalence (<span class="html-italic">P</span>) under real conditions for (<b>a</b>) “significant” trends (<math display="inline"> <mrow> <mi>p</mi> <mo>&lt;</mo> <mn>0.05</mn> </mrow> </math>); and (<b>b</b>) “conclusive” trends (<math display="inline"> <mrow> <mi>p</mi> <mo>&lt;</mo> <mn>0.001</mn> </mrow> </math>).</p>
Full article ">
Back to TopTop