Remote Sens., Volume 14, Issue 9 (May-1 2022) – 335 articles

Cover Story: Blocky landscapes on Mars, known as ‘chaos terrain’, develop in topographic lows through debated formation mechanisms. Several hypotheses explain chaos terrain development, but no single hypothesis accounts for all of its occurrences. In this study, we focus on Galilaei crater, a paleolake with chaos terrain concentrated along the crater’s interior walls. The blocks that define the chaos terrain are associated with parts of the crater wall that are relatively less steep than others, and polygonal cracks on the crater floor reflect the evaporation of ancient standing water. We propose that the observed morphologies can be explained by subaqueous mass flow, analogous to submarine shelf slides on Earth. We discuss Earth analogs and the implications of these observations for Mars. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
20 pages, 1247 KiB  
Article
Wavelength Extension of the Optimized Asymmetric-Order Vegetation Isoline Equation to Cover the Range from Visible to Near-Infrared
by Munenori Miura, Kenta Obata and Hiroki Yoshioka
Remote Sens. 2022, 14(9), 2289; https://doi.org/10.3390/rs14092289 - 9 May 2022
Viewed by 1894
Abstract
Vegetation isoline equations describe analytical relationships between two reflectances of different wavelengths. Their applications range from retrievals of biophysical parameters to the derivation of the inter-sensor relationships of spectral vegetation indexes. Among the three variants of vegetation isoline equations introduced thus far, the optimized asymmetric-order vegetation isoline equation is the newest and is known to be the most accurate. This accuracy assessment, however, has been performed only for the wavelength pair of red and near-infrared (NIR) bands fixed at ∼655 nm and ∼865 nm, respectively. The objective of this study is to extend the assessment beyond this wavelength limitation. An accuracy assessment was therefore performed over a wider range of wavelengths, from 400 to 1200 nm. The optimized asymmetric-order vegetation isoline equation was confirmed to demonstrate the highest accuracy among the three isolines for all the investigated wavelength pairs. The second-best equation, the asymmetric-order isoline equation, which does not include an optimization factor, was not superior to the least-accurate equation (i.e., the first-order isoline equation) in some cases. This tendency was prominent when the reflectances of the two wavelengths were similar. By contrast, the optimized asymmetric-order vegetation isoline showed stable performance throughout this study. A single factor introduced into the optimized asymmetric-order isoline equation was concluded to effectively reduce errors in the isoline for all the wavelength combinations examined in this study.
(This article belongs to the Section Biogeosciences Remote Sensing)
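As a reading aid, the following minimal sketch illustrates the role of the single optimization factor described in the abstract: a scalar k_opt scaling a higher-order correction term is chosen to minimize the mean isoline error for one wavelength pair. The functional form, coefficients, and data are invented placeholders, not the isoline equations of the paper.

```python
import numpy as np

# Hypothetical "true" reflectance pairs for one wavelength pair, generated from
# a toy relation; not the radiative-transfer data used in the paper.
rho2 = np.linspace(0.05, 0.5, 50)               # reflectance at lambda_2
rho1_true = 0.04 + 0.6 * rho2 + 0.8 * rho2**2   # reflectance at lambda_1

# A first-order isoline (slope and offset only) plus a higher-order correction
# term whose weight k is the single optimization factor.
slope, offset = 0.6, 0.04
correction = rho2**2

def mean_epsilon(k):
    """Mean absolute isoline error when the correction term is scaled by k."""
    rho1_pred = offset + slope * rho2 + k * correction
    return np.mean(np.abs(rho1_pred - rho1_true))

# Choose k_opt by grid search, as one could do per wavelength pair.
ks = np.linspace(0.0, 2.0, 401)
k_opt = ks[np.argmin([mean_epsilon(k) for k in ks])]
print(f"k_opt = {k_opt:.3f}, mean epsilon = {mean_epsilon(k_opt):.6f}")
```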
Figures
Figure 1. Errors in the first-order isoline equation (upper panels, (a,d)), asymmetric-order isoline equation (middle panels, (b,e)), and optimized asymmetric-order isoline equation (bottom panels, (c,f)) for red and NIR wavelength pairs as a function of the LAI and soil reflectance. The FVC is 1.0 for the left panels (a–c) and 0.5 for the right panels (d–f).
Figure 2. k_opt as a function of λ1 and λ2.
Figure 3. (a) The value of k_opt as a function of λ1 for four λ2 wavelengths (810, 860, 910, and 940 nm). (b) Enlarged view of (a) in the range from 400 to 700 nm.
Figure 4. The value of k_opt as a function of λ2 for four λ1 wavelengths ((a) 470, (b) 510, (c) 640, and (d) 860 nm).
Figure 5. Simulated TOC reflectance spectra for various LAI values with (a) bright soil and (b) dark soil.
Figure 6. Mean epsilon for the (a) first-order, (b) asymmetric-order, and (c) optimized asymmetric-order isoline equations as a function of λ1 and λ2.
Figure 7. Comparisons of the mean epsilon among the first-order, asymmetric-order, and optimized asymmetric-order isolines as a function of λ1 at four fixed values of λ2: (a) 810 nm, (b) 860 nm, (c) 910 nm, and (d) 940 nm.
Figure 8. Comparisons of the mean epsilon among the first-order, asymmetric-order, and optimized asymmetric-order isolines as a function of λ2 at four fixed values of λ1: (a) 470 nm, (b) 510 nm, (c) 640 nm, and (d) 860 nm.
Figure 9. Scatter plot of true spectra and the three variants of isoline equations in the reflectance subspace. The wavelength λ2 was fixed at 860 nm, whereas λ1 was selected from (a) 690 nm, (b) 710 nm, (c) 730 nm, and (d) 850 nm (λ2 minus 10 nm).
Figure 10. Plot of the error in the first-order vegetation isoline equation (blue line), the overcorrection term of the asymmetric-order vegetation isoline equation (red line), and the adjusted overcorrection term with k_opt (green crosses). The wavelength pair λ1 and λ2 is the same as in Figure 9: λ2 was fixed at 860 nm, whereas λ1 was selected from (a) 690 nm, (b) 710 nm, (c) 730 nm, and (d) 850 nm (λ2 minus 10 nm).
18 pages, 6047 KiB  
Article
UAV-Borne Imagery Can Supplement Airborne Lidar in the Precise Description of Dynamically Changing Shrubland Woody Vegetation
by Tomáš Klouček, Petr Klápště, Jana Marešová and Jan Komárek
Remote Sens. 2022, 14(9), 2287; https://doi.org/10.3390/rs14092287 - 9 May 2022
Cited by 4 | Viewed by 2807
Abstract
Airborne laser scanning (ALS) is increasingly used for detailed vegetation structure mapping; however, there are many local-scale applications for which it is economically ineffective or unfeasible from the temporal perspective. Unmanned aerial vehicles (UAVs) or airborne imagery (AImg) appear to be promising alternatives, but only a few studies have examined this assumption outside economically exploited areas (forests, orchards, etc.). The main aim of this study was to compare the usability of normalized digital surface models (nDSMs) photogrammetrically derived from UAV-borne and airborne imagery to those derived from low-density (1–2 pts/m2) and high-density (ca. 20 pts/m2) ALS scanning for the precise local-scale modelling of woody vegetation structures (the number and height of trees/shrubs) across six dynamically changing shrubland sites. The success of the detection of woody plant tops was initially almost 100% for the UAV-based models; however, deeper analysis revealed that this was because omission and commission errors were approximately equal, and the real accuracy was approximately 70% for the UAV-based models compared to 95.8% for the high-density ALS model. The percentage mean absolute errors (%MAE) of shrub/tree heights derived from UAV data ranged between 12.2 and 23.7%, and the AImg height accuracy was relatively lower (%MAE: 21.4–47.4%). Combining UAV-borne or AImg-based digital surface models (DSMs) with ALS-based digital terrain models (DTMs) significantly improved the nDSM height accuracy (%MAE: 9.4–13.5% and 12.2–25.0%, respectively) but did not significantly improve the detection of the number of individual shrubs/trees. The height accuracy and detection success using low- or high-density ALS did not differ. We therefore conclude that UAV-borne imagery has the potential to replace custom ALS in specific local-scale applications, especially at dynamically changing sites where repeated ALS is costly, and that combining such data with (albeit outdated and sparse) ALS-based digital terrain models can further improve the results.
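For readers unfamiliar with the %MAE metric quoted above, the sketch below shows one common definition, the mean absolute height error normalized by the mean reference height; the authors' exact normalization may differ, and the sample heights are invented.

```python
import numpy as np

def percent_mae(estimated_heights, reference_heights):
    """%MAE of model-derived heights against reference tops:
    MAE / mean reference height * 100. One plausible definition;
    the paper may normalize differently."""
    est = np.asarray(estimated_heights, dtype=float)
    ref = np.asarray(reference_heights, dtype=float)
    return 100.0 * np.mean(np.abs(est - ref)) / np.mean(ref)

# Invented example: UAV-derived nDSM heights vs. reference shrub tops (m).
uav = [2.1, 3.4, 1.8, 4.2, 2.9]
ref = [2.4, 3.1, 2.0, 4.8, 3.3]
print(f"%MAE = {percent_mae(uav, ref):.1f}")
```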
Figures
Figure 1. Study site overview.
Figure 2. A broader view of the study area (bottom); the senseFly eBee equipped with a consumer-grade DSC-WX220 digital compact camera (top).
Figure 3. Description of the study workflow and the preparation of the individual nDSMs.
Figure 4. Examples of the individual woody plant detections for the seven created nDSMs and the reference orthomosaic created by visual interpretation (top left corner) on a sample of Site 5. White dots represent individual detected shrubs and trees. The white frame represents a subset of study Site 5.
Figure A1. Box plots displaying quartile characteristics (median, Q25, and Q75) of the absolute nDSM heights for each study location. For each location, 100 randomly selected woody plant tops were used. Black dots represent outliers.
Figure A2. Box plots displaying quartile characteristics (median, Q25, and Q75) of nDSM height differences for the individual study locations. For each location, 100 randomly selected woody plant tops were used. Black dots represent outliers.
Figure A3. Box plots of summary quartile characteristics (median, Q25, and Q75) of nDSM height differences from the ALS_HD reference for all study locations. For every nDSM, 600 randomly selected woody plant tops were used. Black dots represent outliers.
20 pages, 63893 KiB  
Article
Integration and Comparison Methods for Multitemporal Image-Based 2D Annotations in Linked 3D Building Documentation
by Jakob Taraben and Guido Morgenthal
Remote Sens. 2022, 14(9), 2286; https://doi.org/10.3390/rs14092286 - 9 May 2022
Cited by 4 | Viewed by 2286
Abstract
Data acquisition systems and methods to capture high-resolution images or reconstruct 3D point clouds of existing structures are an effective way to document their as-is condition. These methods enable a detailed analysis of building surfaces, providing precise 3D representations. However, for condition assessment and documentation, damages are mainly annotated in 2D representations, such as images, orthophotos, or technical drawings, which do not allow for the application of a 3D workflow or automated comparisons of multitemporal datasets. Existing software for building heritage data management and analysis offers a wide range of annotation and evaluation functions but lacks integrated post-processing methods and systematic workflows. This article presents novel methods developed to facilitate such automated 3D workflows and validates them on a small historic church building in Thuringia, Germany. Post-processing steps using photogrammetric 3D reconstruction data along with imagery were implemented, which show the possibilities of integrating 2D annotations into 3D documentation. Further, the application of voxel-based methods to the dataset enables the evaluation of geometrical changes of multitemporal annotations in different states and their assignment to elements of scans or building models. The proposed workflow also highlights the potential of these methods for condition assessment and for planning restoration work, as well as the possibility of representing the analysis results in standardised building model formats.
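The voxel-based comparison of multitemporal annotations can be pictured with a minimal sketch: two annotation states are quantized into voxel sets of edge length d_v and compared by set operations to yield an overlap and a percentage damage growth. The octree assignment, the neighbourhood radius r_v, and the data structures of the paper are omitted; all names and inputs here are illustrative.

```python
import numpy as np

def voxelize(points, d_v):
    """Quantize 3D annotation points (N x 3 array, metres) into a set of voxel
    indices at edge length d_v; a stand-in for the paper's octree voxelization."""
    idx = np.floor(np.asarray(points, dtype=float) / d_v).astype(int)
    return set(map(tuple, idx))

def compare_states(points_t1, points_t2, d_v=0.005):
    """Overlap and percentage damage growth between two annotation states."""
    v1, v2 = voxelize(points_t1, d_v), voxelize(points_t2, d_v)
    overlap = len(v1 & v2)
    growth = 100.0 * len(v2 - v1) / max(len(v1), 1)
    return overlap, growth

# Invented toy annotations: a damage patch that grows between t1 and t2.
t1 = np.random.default_rng(0).uniform(0, 0.05, size=(200, 3))
t2 = np.vstack([t1, t1 + np.array([0.05, 0.0, 0.0])])
overlap, growth = compare_states(t1, t2)
print(f"overlapping voxels: {overlap}, damage growth: {growth:.1f}%")
```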
Figures
Graphical abstract
Figure 1. The captured point clouds of the Wehrkirche Döblitz (a) from outside and (b) of the main hall and the sanctuary at ground level, and (c) the reconstructed orthophoto of the wall painting.
Figure 2. Flowchart of the proposed process pipeline to integrate 2D annotations into 3D models and to perform an automated damage comparison (data entities in white boxes, processing steps without frames).
Figure 3. Segmented building parts of the main hall with manually modelled building elements: (a) initial point cloud of the interior, (b) modelled facets for the floor, walls, and ceiling, (c) exploded drawing of the point cloud segments including the numbering of the walls, and (d) subset of Wall 01 with the extracted image set as camera orientations from the inside.
Figure 4. Examples of annotated damages: (a) areas of missing plaster (red) and craquelure (purple), (b) a crack polyline (light blue) and a small missing plaster detail as a polygon (red), and (c) the synthetic multitemporal labels for the case study in t1 (blue), t2 (yellow), and t3 (red) for a missing plaster area.
Figure 5. Exemplary mappings of 2D annotations of missing plaster (red) and craquelure (green) from the case study dataset onto the triangle mesh surface: (a) an image-based central projection showing the estimated position of the camera p, the focal length f as a vector to the image centre, and the orientation of the local coordinate system, and (b) an orthogonal projection from an orthophoto patch that was registered to the 3D scene.
Figure 6. Three-dimensional view of (a) the segmented walls with assigned numbering 01–04 and (b–d) the mapped 3D annotations as coloured decay mappings in the states (b) t1, (c) t2, and (d) t3.
Figure 7. Three-dimensional views with wall elements and two exemplary 3D annotations from the case study dataset showing the voxel percentage to non-corresponding building elements: assigned voxels (green) and unassigned voxels (red) for octree-based or defined voxel sizes with d_min = 10 cm.
Figure 8. Voxel-based assignment and comparison of the exemplary damage from t1 (green) and t2 (red) in the dataset of Wall 02 using the parameters d_v = 0.5 cm and r_v = 1, resulting in an identified overlap (yellow) and a damage growth of 72.3%.
Figure 9. Schema of the linking between the different data entities in the case study to achieve a comprehensive condition history on the inspection timeline.
Figure 10. Voxelised (d_v = 5 cm) and assigned damages of each annotated state for the wall elements (d_min = 20 cm, p_min = 25%) in the case study dataset: Wall 01 (yellow), Wall 02 (blue), Wall 03 (green), and Wall 04 (orange).
Figure 11. Back-projection of a crack annotation on Wall 01 to images containing the damage.
Figure 12. Results for a missing plaster annotation applying the voxel-based assignment and comparison of (left of each pair) t1 and t2 and (right of each pair) t2 and t3, showing the effects of a too small definition of r_v and the percentage growth p+ computed from different d_v.
Figure 13. Percentage of overlapping voxels (yellow, d_v = 0.5 cm, r_v = 1) for two nearby damage annotations (blue) that are not assigned, whereas a bounding box or distance check would result in a valid assignment.
Figure 14. Different combinations of d_v and r_v (all leading to u_max = 17.3 cm) for an annotated (a) missing plaster surface and (b) crack polyline.
Figure 15. Three-dimensional view of the voxelised and compared full annotation dataset (comparison of t2 and t3) in two variants with (a) a rough computation using d_v = 5 cm and r_v = 1 and (b) a more detailed computation using d_v = 0.5 cm and r_v = 4 (red: new damaged regions; yellow: regions already identified in t2; green: repaired or not re-identified regions).
19 pages, 13367 KiB  
Technical Note
An Algebraic Comparison of Synthetic Aperture Interferometry and Digital Beam Forming in Imaging Radiometry
by Eric Anterrieu, Pierre Lafuma and Nicolas Jeannin
Remote Sens. 2022, 14(9), 2285; https://doi.org/10.3390/rs14092285 - 9 May 2022
Cited by 5 | Viewed by 2173
Abstract
Digital beam forming (DBF) and synthetic aperture interferometry (SAI) are signal processing techniques that mix the signals collected by an antenna array to obtain high-resolution images with the aid of a computer. This note aims to compare these two approaches from an algebraic perspective, with illustrations from simulations conducted at microwave frequencies within the frame of the Soil Moisture and Ocean Salinity (SMOS) mission. Although the two techniques use the same signals and share the same goal, there are several differences that deserve attention. From the algebraic point of view, this is the case for the singular value distributions of the respective modeling matrices, which are both rank-deficient but do not have the same sensitivity to the diversity of the array's elementary antenna radiation patterns. As a consequence of this difference, the level and the angular signature of the reconstruction floor error are significantly lower with the DBF paradigm than with the SAI one.
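A toy 1-D simulation makes the algebraic kinship of the two paradigms concrete: SAI correlates every antenna pair into complex visibilities V(b_pq), while DBF phases and sums the same signals per steering direction ξ′ before integrating the power into antenna temperatures T_A(ξ′). The array size, spacing, and single-source scene below are arbitrary stand-ins, not the SMOS configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 16, 4096                 # antennas, time samples (toy values)
pos = 0.5 * np.arange(N)        # element positions in wavelengths
xi0 = 0.2                       # direction sine of a single point source

# Narrowband analytic signals received by the array from direction xi0.
src = rng.normal(size=M) + 1j * rng.normal(size=M)
signals = np.exp(2j * np.pi * pos[:, None] * xi0) * src[None, :]

# SAI: every antenna pair is correlated and integrated -> complex visibilities.
V = signals @ signals.conj().T / M              # V[p, q] plays the role of V(b_pq)
print("example visibility V(b_01):", V[0, 1])

# DBF: phase controls steer the array beam to xi', then the signals are
# combined and integrated -> antenna temperatures T_A(xi').
xi = np.linspace(-1, 1, 401)
weights = np.exp(-2j * np.pi * np.outer(xi, pos))
T_A = np.mean(np.abs(weights @ signals) ** 2, axis=1)
print("beam peak steered near xi0:", xi[np.argmax(T_A)])
```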
Figures
Figure 1. Illustration of the principle of the two paradigms. In both cases, all the elementary antennas point in the same direction, depicted here by the main beam of the elementary power patterns (in blue). To keep the illustration simple without losing anything of the comparison, amplifiers, local oscillators, and band-pass filters are omitted. (a) In SAI, for every pair of elementary antennas A_p and A_q, the radio signals are transmitted to a complex correlation unit to be combined in-phase/in-quadrature and integrated in order to provide the real and imaginary parts of the complex visibility sample V for the baseline vector b_pq. Referring to the integral in (1), the contribution from direction ξ (in red) to V(b_pq) depends on the delay between the time a wave train (dashed red) arrives at antenna A_p and its arrival at antenna A_q. (b) In DBF, a phase control φ_p(ξ′) is applied to the radio signals so that the beam of the antenna array A is steered to the direction ξ′, here illustrated by the main beam of the array power pattern (in red). The signals thus obtained are combined all together and integrated to provide the antenna temperature T_A(ξ′) from this pointing direction. Although the elementary power patterns have a wide beam, the phased combination leads to an array power pattern with a much narrower beam so that, referring to the integral in (4), the directions ξ bringing the most important contributions to T_A(ξ′) are those around ξ′. When the phase control is applied by computational means, several beams can be steered simultaneously in various directions, provided the processing capacity is adequate to apply the different sets of phase shifts corresponding to every direction.
Figure 2. Impact of the sparsity of antenna arrays on the array factor with (a) the Y-shaped array of SMOS populated with 69 elementary antennas (orange) and a dense hexagonal array with the same dimensions but filled with 1387 elementary antennas (blue). As a consequence of the sparsity of the SMOS antenna array, the corresponding array factor (b) exhibits side lobes as high as −7.2 dB in the 6 tails of |AF(ξ,0)|² and a main beam as wide as 0.0355 rad at half-power. On the contrary, the array factor of the dense hexagonal array (c) exhibits side lobes as low as −19 dB in the 6 tails of |AF(ξ,0)|² and a main beam as narrow as 0.0299 rad at half-power. For ease of comparison, the array factors are normalized here and the same color scale is used.
Figure 3. Comparison between (a) the average power pattern ⟨|F_p(ξ)|²⟩ of the 69 SMOS elementary antennas and two examples of the array power pattern |F(ξ,ξ′)|² of that Y-shaped array: (b) when the antenna array points into the boresight direction ξ′ = (0,0) and (c) when it points into the direction ξ′ = (0.5,−0.25). The main beam of ⟨|F_p(ξ)|²⟩ is as wide as 1.135 rad (or 65°), as emphasized by the dashed circle, whereas that of |F(ξ,0)|² is as narrow as 0.0355 rad (or 2°). As a consequence of the attenuation caused by the element power patterns, the peak value in (c) is about −3.2 dB below that in (b). The field of view of SMOS is subject to aliasing because of the spacing between the elementary antennas: the hexagonal synthesized field of view and its six neighbors are drawn in red, and its extension from side to side is equal to 1.32 rad, as illustrated in (c) by the angular Euclidean distance between the main beam at (0.5,−0.25) and its alias at (−0.64,0.41). For ease of comparison, the power patterns are normalized here and the same color scale is used.
Figure 4. Distributions of the singular values of the modeling matrices G in X polarization for (a) the SAI paradigm and (b) the DBF one. Whatever the approach, two groups of singular values separated by a well-determined gap are observed. In every case, the first group is composed of the 2791 largest singular values. However, in the SMOS case (red, 69 different antenna patterns), this gap is narrower than in the ideal case (green, same voltage pattern for each antenna), except for the DBF, which is less sensitive to the disparity of the elementary patterns. The same behavior is observed in Y polarization.
Figure 5. Brightness temperature distributions of a typical scene over the ocean: (a) T^H(ξ) and (b) T^V(ξ) at ground level, (c) T^X(ξ) and (d) T^Y(ξ) at instrument level.
Figure 6. Retrieved brightness temperature maps T_r at instrument level in X and Y polarizations corresponding to (a,b) the inversion of complex visibilities with the SAI paradigm and (c,d) the inversion of antenna temperatures with the DBF one. The color scale is the same for the four maps.
Figure 7. Floor error maps ΔT_r at instrument level in X and Y polarizations corresponding to (a,b) the inversion of complex visibilities with the SAI paradigm and (c,d) the inversion of antenna temperatures with the DBF one. Biases and standard deviations are given in the AF-FOV (dashed blue) and in the EAF-FOV (dashed brown). The color scale is the same for the four maps.
Figure 8. Floor error maps ΔT_r at ground level in H and V polarizations corresponding to (a,b) the inversion of complex visibilities with the SAI paradigm and (c,d) the inversion of antenna temperatures with the DBF one. Biases and standard deviations are given in the AF-FOV (dashed blue) and in the EAF-FOV (dashed brown). The color scale is the same for the four maps.
Figure 9. Variations of the retrieved temperatures T_r^H (red dots) and T_r^V (blue dots), as well as those of the temperatures T^H (red line) and T^V (blue line) of the scene, with the ground incidence angle i (a) for the SAI paradigm and (b) for the DBF one. Biases and standard deviations are those in the EAF-FOV of Figure 8.
Figure 10. Variations of the RMSE in the floor error maps ΔT_r^H (red) and ΔT_r^V (blue) with the ground incidence angle i (a) for the SAI paradigm and (b) for the DBF one. Dashed lines indicate the level of the RMSE over the EAF-FOV taken from Table 1. Scaling is the same for both graphs.
Figure 11. Examples of random Gaussian noise added (a) to the complex visibilities V_pq or (b) to the antenna temperatures T_A(ξ′) in X polarization, and (c) the corresponding standard deviation map σ_r^X(ξ) of 1000 error maps ΔT_r^X(ξ) thus obtained in the synthesized field of view with either paradigm (the average map is almost equal to 0 as the Gaussian noises are zero-centered and the modeling operators are linear). Here, the standard deviation of ΔV_pq is set to 0.1 K, that of ΔT_A(ξ′) is equal to 0.14 K, and the radiometric sensitivity in the boresight direction ξ = (0,0) of σ_r^X(ξ) is equal to 2.55 K. The same behavior is observed in Y polarization.
Figure 12. Close views of (a) the antenna temperature map T_A(ξ′) and (c) the retrieved brightness temperature map T_r(ξ) when the observed scene T(ξ) is reduced to a single hot spot in the boresight direction ξ = (0,0) (for comparison purposes with the same color scale, both maps are normalized so that their peak value is equal to 1), with the amplitudes of their Fourier transforms (b) |T̂_A(u)| and (d) |T̂_r(u)| (also compared with the same logarithmic color scale).
Figure 13. Cuts along the ξ2 axis of the retrieved brightness temperature maps T_r(ξ) (dashed lines) with the SAI (green) and DBF (purple) paradigms when the scene under observation T(ξ) is reduced to a single hot spot in the boresight direction ξ = (0,0). In both cases, the FWHM of the normalized response is about 0.0278 rad. Also shown on the graph is the same cut of the antenna temperature map T_A(ξ′) (solid line); the FWHM of the main beam is here about 0.0355 rad.
Figure 14. Comparison between (a) a typical scene over Spain in X polarization at a resolution better than 0.01 rad, (b) the corresponding antenna temperature map simulated for the Y-shaped array of SMOS using DBF with 69 elementary antennas at the resolution of 0.0355 rad, (c) the brightness temperature distribution retrieved from the inversion of (b) at the resolution of 0.0278 rad, and (d) another antenna temperature map simulated for a hexagonal array using DBF with 1387 elementary antennas at the resolution of 0.0299 rad. The same behavior is observed in Y polarization.
23 pages, 8412 KiB  
Article
A Novel Ultra-High Resolution Imaging Algorithm Based on the Accurate High-Order 2-D Spectrum for Space-Borne SAR
by Tao He, Lei Cui, Pengbo Wang, Yanan Guo and Lei Zhuang
Remote Sens. 2022, 14(9), 2284; https://doi.org/10.3390/rs14092284 - 9 May 2022
Viewed by 2165
Abstract
Ultra-high spatial resolution, which can bring more detail to ground observation, is a constant pursuit of modern space-borne synthetic aperture radar. However, exact imaging in this case has always been a complex technical problem due to the complicated imaging geometry and signal structure. To meet the strict requirements of those applications, a novel ultra-high resolution imaging algorithm based on an accurate high-order 2-D spectrum is presented in this paper. The range model of the defective spectrum, which requires only the first two Doppler parameters, is replaced by a polynomial range model whose coefficients are derived from the relative motion between the radar and the targets. The new spectrum is then calculated through the Lagrange inversion formula. On this basis, the novel imaging algorithm is elaborated in detail as follows: the range high-order term of the spectrum is compensated completely, and the space variance of the range chirp rate is eliminated by the cubic phase term. Two range cell migration correction steps are applied in this algorithm, before and after the range compression: one is the traditional linear chirp scaling method, and the other is an interpolation to correct the quadratic range cell migration introduced by the range chirp rate equalization. The simulation results illustrate that the proposed algorithm can handle exact imaging processing with a resolution of 0.25 m in both azimuth and range over a 2 km × 6 km scene, which validates the feasibility of the proposed algorithm.
(This article belongs to the Topic Advances in Environmental Remote Sensing)
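The idea of the polynomial range model can be sketched numerically: the slant-range history implied by the relative radar–target motion is fitted with a polynomial in azimuth time, and its coefficients then feed the spectrum derivation. The straight-line geometry, orbit height, velocity, and polynomial order below are illustrative assumptions, not the paper's simulation parameters.

```python
import numpy as np

# Toy straight-line geometry: platform at 700 km altitude moving at 7600 m/s,
# target at the origin; both values are placeholders.
t = np.linspace(-1.0, 1.0, 401)                       # azimuth time (s)
sat = np.stack([7600.0 * t,                           # along-track position (m)
                np.zeros_like(t),
                np.full_like(t, 700e3)], axis=1)      # height (m)
R = np.linalg.norm(sat, axis=1)                       # slant-range history (m)

# Fit a 4th-order polynomial range model R(t) ~ c0 + c1*t + ... + c4*t^4.
coeffs = np.polyfit(t, R, deg=4)
R_fit = np.polyval(coeffs, t)
print(f"max model error: {np.max(np.abs(R_fit - R)) * 1e3:.3f} mm")
```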
Figures
Graphical abstract
Figure 1. Detailed implementation of the proposed algorithm.
Figure 2. Time–frequency diagram of the sliding spotlight SAR.
Figure 3. The 2-D spectrum of the echo with and without the two-step processing applied: (a) the original spectrum; (b) the unwrapped spectrum.
Figure 4. The range distortion caused by the high-order phase term.
Figure 5. (a) The original spectrum and the wrapped spectrum due to the linear term. (b) The original focused result and the false target formed by wrapping.
Figure 6. (a) The real part of the chirp signal before and after the chirp rate equalization. (b) The range compression result focused by the reference chirp rate before and after the chirp rate equalization.
Figure 7. Diagram of the quadratic RCM.
Figure 8. (a) The phase error introduced by the truncated range vector. (b) Detailed view of the circled region.
Figure 9. (a) The phase error introduced by the truncated range model. (b) Detailed view of the circled region.
Figure 10. (a) PSLR of the compressed point targets. (b) IRW of the compressed point targets.
Figure 11. The ground scene of the simulation.
Figure 12. Sketch map of the range compression results: (a) compressed signal with a mismatched chirp rate; (b) with only the chirp rate equalization employed, the quadratic RCM appears; (c) quadratic RCM corrected by means of interpolation.
Figure 13. Interpolated results of (a) PT3, (b) PT5, and (c) PT7 focused by the proposed algorithm.
Figure 14. Interpolated results of (a) PT3, (b) PT5, and (c) PT7 focused by the conventional CSA.
Figure 15. Interpolated results of (a) PT3, (b) PT5, and (c) PT7 focused by the back-projection algorithm.
Figure 16. The (a) range and (b) azimuth profiles of PT5 focused by the proposed algorithm, the BPA, and the conventional CSA.
22 pages, 6763 KiB  
Article
A Framework for Survey Planning Using Portable Unmanned Aerial Vehicles (pUAVs) in Coastal Hydro-Environment
by Ha Linh Trinh, Hieu Trung Kieu, Hui Ying Pak, Dawn Sok Cheng Pang, Angel Anisa Cokro and Adrian Wing-Keung Law
Remote Sens. 2022, 14(9), 2283; https://doi.org/10.3390/rs14092283 - 9 May 2022
Cited by 7 | Viewed by 3232
Abstract
Recently, remote sensing using survey-grade UAVs has been gaining tremendous momentum in applications for the coastal hydro-environment. UAV-based remote sensing provides high spatial and temporal resolutions and flexible operational availability compared to other means, such as satellite imagery or point-based in situ measurements. As strict requirements and government regulations are imposed for every UAV survey, detailed survey planning is essential to ensure safe operations and seamless coordination with other activities. This study established a comprehensive framework for the planning of efficient UAV deployments in coastal areas, based on recent on-site survey experiences with a portable unmanned aerial vehicle (pUAV) carrying a heavyweight spectral sensor. The framework was classified into three main categories: (i) pre-survey considerations (i.e., administrative preparation and UAV airframe details); (ii) execution strategies (i.e., parameters and contingency planning); and (iii) environmental effects (i.e., weather and marine conditions). The implementation and verification of the framework were performed using a UAV–airborne spectral sensing exercise for water quality monitoring in Singapore. The challenges encountered and the mitigation practices developed from the actual field experiences were integrated into the framework to advance the ease of UAV deployment for coastal monitoring and improve the acquisition process of high-quality remote sensing images.
(This article belongs to the Topic Autonomy for Enabling the Next Generation of UAVs)
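One execution-strategy parameter in such a framework is the spacing of parallel flight lines, which follows from the standard photogrammetric footprint/overlap arithmetic sketched below; the sensor and mission numbers are invented and are not those of the deployment described in the paper.

```python
import math

def n_flight_lines(aoi_width_m, altitude_m, sensor_width_mm, focal_mm,
                   side_overlap=0.6):
    """Parallel flight lines needed to cover an AOI of a given cross-track width.
    Standard footprint/overlap sketch with illustrative parameters."""
    swath = altitude_m * sensor_width_mm / focal_mm    # ground swath per line (m)
    spacing = swath * (1.0 - side_overlap)             # line-to-line distance (m)
    return math.ceil(aoi_width_m / spacing)

# Invented example: 600 m wide coastal strip, 120 m altitude, 10 mm wide sensor,
# 16 mm lens, 60% side overlap -> 20 flight lines.
print(n_flight_lines(600, 120, 10, 16, 0.6))
```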
Figures
Figure 1. The survey area is located in the inner basin of the Singapore Strait and is indicated by the red box.
Figure 2. The UAV–airborne imager system (a) that was applied in the study consisted of the DJI Matrice M600 Pro with a Ronin MX gimbal (b), a D-RTK GNSS system (c), and a BaySpec OCI-F hyperspectral camera (d) with visible and near-infrared (VNIR) spectrum capabilities.
Figure 3. Structure of the UAV survey framework for the coastal environment.
Figure 4. Correlation between the total take-off weight and maximum endurance of a DJI M600 Pro (source: www.dji.com, accessed on 16 February 2022).
Figure 5. Example of a pUAV flight pattern: the take-off/landing point is indicated by the blue dot; the flight path to the starting point is indicated by the blue arrow; the path of the UAV returning to the take-off/landing point is indicated by the red arrow.
Figure 6. Raw images of the survey area during the same UAV flight on 3 May 2021: (a) without and (b) with cloud cover.
Figure 7. (a) Poorly acquired data during a UAV flight at 04:48 p.m. on 10 November 2021 due to shady conditions and a sudden thunderstorm. (b) The stitched images could not be aligned and some data are missing due to the reduction in camera shutter speed caused by the poor illumination conditions.
Figure 8. Sun reflectance on water bodies at different times: (a) without sun glint at 10:58:10 a.m. and (b) with sun glint at 11:34:17 a.m. The sediment plume can be seen in the flight area while barge ships/vessels/boats are masked from the stitched image (white areas).
Figure 9. TSS reflectance during UAV flights (a) with noise caused by the appearance of a barge (shown in the red box) and (b) with the noise removed.
Figure 10. Variance of reflectance for the different groups of TSS concentrations during (a) good weather conditions and (b) poor weather conditions due to cloud contamination and rain.
Figure 11. Variance of reflectance of the survey flights during UAV flights (a) without and (b) with a high sun glint effect for different groups of TSS concentrations.
Figure A1. Actual photos taken during the survey on 9 December 2021 under (a) good weather conditions and (b) bad weather conditions due to an approaching thunderstorm and rain.
Figure A2. (a) The flight at 4:28 p.m. on 13 January 2022 recorded the highest TSS concentrations, of up to 122 mg/L, because the sediment plumes were captured near a discharge source, i.e., a dumping barge; (b) the reflectance of TSS concentrations varied significantly when the TSS was under 10 mg/L.
20 pages, 14921 KiB  
Article
INDMF Based Regularity Calculation Method and Its Application in the Recognition of Typical Loess Landforms
by Sheng Jiang, Xiaoli Huang and Ling Jiang
Remote Sens. 2022, 14(9), 2282; https://doi.org/10.3390/rs14092282 - 9 May 2022
Cited by 1 | Viewed by 1834
Abstract
The topographical morphology of the loess landforms on the Loess Plateau exhibits remarkable textural features at different spatial scales. However, existing topographic texture analysis studies of the Loess Plateau are usually dominated by statistical characteristics and miss structural characteristics. At the same time, there is a lack of regularity calculation methods for DEM-based digital terrain analysis. Taking the Loess Plateau as the study area, a regularity calculation method based on the improved normalized distance matching function (INDMF) is proposed and applied to the classification of loess landforms. The INDMF regularity calculation method includes two key steps. Step 1 calculates the INDMF sequence values and the peak and valley values for the terrain data. Step 2 identifies the significant peaks and valleys, constructs the significant peak and valley sequences, and then obtains the regularity as a normalised ratio. The experimental results show that the proposed method has good anti-interference ability and can effectively extract the regularity of the main landform unit. Compared with previous methods, adding the structural feature (i.e., INDMF regularity) can effectively distinguish loess hills from loess ridges in the hilly and gully region. For loess hills and loess ridges, the recognition rates of the proposed method are 84.62% and 92.86%, respectively. Combined with existing topographic characteristics, the proposed INDMF regularity is a topographic structure feature extraction method that can effectively discriminate between loess hill and loess ridge areas on the Loess Plateau.
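The two steps can be caricatured in one dimension: a normalized distance matching function is computed over lags, and the relation between its peaks and valleys is condensed into a regularity score, with deep valleys (repeating relief) scoring high. This is a loose 1-D sketch of the idea only; the paper's INDMF, significance tests, and ratio definition operate on 2-D DEMs and differ in detail.

```python
import numpy as np

def ndmf(profile, max_lag):
    """Normalized distance matching function of a 1-D elevation profile:
    mean absolute elevation difference at each lag, scaled to [0, 1]."""
    vals = np.array([np.mean(np.abs(profile[lag:] - profile[:-lag]))
                     for lag in range(1, max_lag + 1)])
    return vals / vals.max()

def regularity(seq):
    """Toy regularity score: periodic terrain produces deep valleys after the
    first major peak of the NDMF sequence (score near 1); irregular terrain
    keeps the sequence flat (score near 0). A hypothetical simplification."""
    interior = seq[np.argmax(seq):]          # sequence after the first major peak
    return (seq.max() - interior.min()) / seq.max()

x = np.linspace(0, 20 * np.pi, 2000)
periodic = np.sin(x)                                # ridge-like, repeating relief
noisy = np.random.default_rng(3).normal(size=2000)  # irregular relief
print(regularity(ndmf(periodic, 400)), regularity(ndmf(noisy, 400)))
```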
Figure 1: DEM visualisation of typical landforms of the Loess Plateau. (a) the early stage of loess tableland; (b) the late stage of loess tableland; (c) loess ridge area; (d) loess hill area.
Figure 2: Study area and sample data.
Figure 3: Flow chart of the INDMF regularity calculation method.
Figure 4: Flow chart of typical landform recognition on the Loess Plateau combined with INDMF regularity.
Figure 5: Flow chart of the three-level recognition rules. (a) the first-level recognition rule; (b) the second-level recognition rule for loess gully and hilly areas; (c) the second-level recognition rule for loess tableland; (d) the third-level recognition rule.
Figure 6: INDMF regularity for loess hill areas. (a) sample 1; (b) positive and negative terrains of sample 1; (c) INDMF of sample 1; (d) sample 2; (e) positive and negative terrains of sample 2; (f) INDMF of sample 2; (g) sample 3; (h) positive and negative terrains of sample 3; (i) INDMF of sample 3.
Figure 7: INDMF regularity for loess ridge areas, with panels (a)-(i) as in Figure 6.
Figure 8: Schematic of the positive and negative topography of the Loess Plateau [55]. (a) loess hilly and gully area; (b) loess tableland area.
Figure 9: DEM-based hillshade map of a small watershed in the Loess Plateau of China. (a) terraced field; (b) gully bottom area.
13 pages, 34286 KiB  
Communication
Augmentation-Based Methodology for Enhancement of Trees Map Detalization on a Large Scale
by Svetlana Illarionova, Dmitrii Shadrin, Vladimir Ignatiev, Sergey Shayakhmetov, Alexey Trekin and Ivan Oseledets
Remote Sens. 2022, 14(9), 2281; https://doi.org/10.3390/rs14092281 - 9 May 2022
Cited by 14 | Viewed by 2446
Abstract
Remote sensing tasks play a very important role in the domain of sensing and measuring, and can be very specific. Advances in computer vision techniques allow various information to be extracted from remote sensing satellite imagery. This information is crucial for quantitative and qualitative assessments in monitoring forest clearing in the protected areas of power lines, as well as for environmental analysis, in particular carbon footprint assessment, which is a highly relevant task. Solving these problems requires precise segmentation of the forest mask. Although forest mask extraction from satellite data has been considered previously, no open-access applications provide a highly detailed forest mask. Detailed forest masks are usually obtained using unmanned aerial vehicles (UAVs), which impose particular limitations such as cost and inapplicability to vast territories. In this study, we propose a novel neural network-based approach for creating highly detailed forest masks. We implement an object-based augmentation technique for a minimum amount of labeled high-detail data. Using these augmented data, we fine-tune models that were trained on a large forest dataset with less precisely labeled masks. The algorithm is tested on multiple territories in Russia. The F1-score for small details (such as individual trees) improved to 0.929 compared with the baseline score of 0.856. The developed model is available in a SaaS platform and allows a detailed and precise forest mask to be created easily, which can then be used to solve various applied problems. Full article
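The object-based augmentation step the abstract describes can be sketched as follows. Everything here (array shapes, the shadow offset, the paste rule) is an illustrative assumption rather than the authors' implementation.

```python
import numpy as np

def object_based_augment(rng, background, tree_img, tree_mask, n_objects=10):
    """Minimal sketch of object-based augmentation (OBA) for tree segmentation.

    Assumed inputs (not from the paper): `background` is an HxWx3 uint8 image;
    `tree_img`/`tree_mask` are a small RGB crop of a tree and its binary mask.
    Trees are pasted at random positions; a darkened copy shifted by a few
    pixels imitates the cast shadows mentioned by the authors.
    """
    h, w, _ = background.shape
    th, tw, _ = tree_img.shape
    out = background.copy()
    label = np.zeros((h, w), dtype=np.uint8)
    m = tree_mask.astype(bool)
    for _ in range(n_objects):
        y = int(rng.integers(0, h - th))
        x = int(rng.integers(0, w - tw))
        # Paste a crude shadow first (shifted, darkened), then the tree itself.
        sy, sx = y + 3, x + 3
        if sy + th <= h and sx + tw <= w:
            shadow = (out[sy:sy + th, sx:sx + tw] * 0.6).astype(np.uint8)
            out[sy:sy + th, sx:sx + tw][m] = shadow[m]
        out[y:y + th, x:x + tw][m] = tree_img[m]
        label[y:y + th, x:x + tw][m] = 1
    return out, label

# Toy usage with synthetic arrays in place of real imagery.
rng = np.random.default_rng(42)
bg = rng.integers(0, 255, (512, 512, 3), dtype=np.uint8)
tree = rng.integers(0, 255, (32, 32, 3), dtype=np.uint8)
mask = np.ones((32, 32), dtype=np.uint8)
image, gt = object_based_augment(rng, bg, tree, mask)
```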
Figure 1: Proposed pipeline for CNN model training.
Figure 2: Examples of original and generated samples and tree masks. In the generated samples, new and varied backgrounds were used to achieve greater diversity and to combine tree images and masks from different areas. Artificially added shadows provide more realistic images associated with the semantic segmentation masks.
Figure 3: Raw images (left) and predictions (right) for different territories: (a) Baoting Li and Miao Autonomous County, Hainan, China, 18°29′24.0″N 109°35′24.0″E; (b) Zelenodolsky District, Republic of Tatarstan, Russia, 55°55′48.0″N 48°44′24.0″E; (c) Republic of Dagestan, Russia, 43°01′09.1″N 47°19′28.2″E.
Figure 4: Forest segmentation results for Republic of Tatarstan test territories: (a) input image; (b) ground truth; (c) small dataset fine-tuned without OBA; (d) small dataset fine-tuned with OBA.
Figure 5: Input image from a new region outside the training site, Zelenodolsky District, Republic of Tatarstan, Russia, 55°55′48.0″N 48°44′24.0″E (composite orthophotomap provided by Mapbox, acquisition date: 20 March 2022) (a); Open Street Map (acquisition date: 20 March 2022) (b); forest segmentation results of the final CNN model fine-tuned with OBA (c).
Figure 6: Input image from a new region outside the training site, Wickwar, England, 51°36′26.7″N, 2°23′17.1″W (composite orthophotomap provided by Google, acquisition date: 30 April 2022) (a); Open Street Map (acquisition date: 30 April 2022) (b); forest segmentation results of the final CNN model fine-tuned with OBA (c).
21 pages, 5105 KiB  
Article
A Continuous Change Tracker Model for Remote Sensing Time Series Reconstruction
by Yangjian Zhang, Li Wang, Yuanhuizi He, Ni Huang, Wang Li, Shiguang Xu, Quan Zhou, Wanjuan Song, Wensheng Duan, Xiaoyue Wang, Shakir Muhammad, Biswajit Nath, Luying Zhu, Feng Tang, Huilin Du, Lei Wang and Zheng Niu
Remote Sens. 2022, 14(9), 2280; https://doi.org/10.3390/rs14092280 - 9 May 2022
Cited by 1 | Viewed by 2455
Abstract
It is hard for current time series reconstruction methods to balance high-precision reconstruction with an explainable model mechanism. The goal of this paper is to improve reconstruction accuracy with a well-explained time series model, providing a new solution for high-precision time series reconstruction and related applications. We therefore developed a function-based model, the Continuous Change Tracker Model (CCTM), which achieves high precision in time series reconstruction by tracking the time series variation rate. To test the reconstruction effects, the model was applied to four types of datasets: normalized difference vegetation index (NDVI), gross primary productivity (GPP), leaf area index (LAI), and MODIS surface reflectance (MSR). Several new observations are as follows. First, the CCTM model is well explained and based on the second-order derivative theorem: it divides the yearly time series into four variation types, namely uniform, decelerated, accelerated, and short-periodical variations, and each variation type is represented by a designed function. Second, the CCTM model provides much better reconstruction results than the Harmonic model on the NDVI, GPP, MSR, and LAI datasets for seasonal segment reconstruction, and the combined use of the Savitzky–Golay filter and the CCTM model is better than combinations of the Savitzky–Golay filter with other models. Third, the Harmonic model has the best trend-fitting ability on the yearly time series dataset, with the highest R-squared and the lowest RMSE among the four function-fitting models; with seasonal piecewise fitting, however, all four models achieve high accuracy, and the CCTM performs best. Fourth, the CCTM model can also be applied to time series image compression; two compression patterns, with 24 and 6 coefficients, respectively, are proposed, and the daily MSR dataset can achieve a compression ratio of 15 using the 6-coefficient method. Finally, the CCTM model also has the potential to be applied to change detection, trend analysis, and the extraction of phenology and seasonal characteristics. Full article
(This article belongs to the Section Environmental Remote Sensing)
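Conceptually, the seasonal piecewise fitting the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: the four candidate segment functions and their parameterizations are assumptions based on the variation types named above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate segment functions in the spirit of the CCTM variation types
# (names and forms are illustrative assumptions, not the paper's definitions):
# uniform (linear), decelerated (log), accelerated (exp), short-periodical (cosine).
def f_lin(t, a, b):        return a * t + b
def f_log(t, a, b):        return a * np.log(t + 1.0) + b
def f_exp(t, a, b, c):     return a * np.exp(b * t) + c
def f_cos(t, a, w, p, c):  return a * np.cos(w * t + p) + c

def fit_segment(t, y):
    """Fit each candidate to one seasonal segment and keep the best (by RMSE)."""
    best = (None, np.inf)
    for f, p0 in [(f_lin, [0, 0]), (f_log, [1, 0]),
                  (f_exp, [1, 0.01, 0]), (f_cos, [1, 2 * np.pi / len(t), 0, 0])]:
        try:
            popt, _ = curve_fit(f, t, y, p0=p0, maxfev=5000)
            rmse = np.sqrt(np.mean((f(t, *popt) - y) ** 2))
            if rmse < best[1]:
                best = ((f, popt), rmse)
        except RuntimeError:
            continue  # this candidate failed to converge; try the next one
    return best

# Toy NDVI-like year split into three seasonal segments.
t = np.arange(365, dtype=float)
ndvi = (0.3 + 0.4 * np.sin(np.pi * t / 365)
        + 0.02 * np.random.default_rng(1).standard_normal(365))
for seg in np.array_split(np.arange(365), 3):
    (f, popt), rmse = fit_segment(t[seg] - t[seg][0], ndvi[seg])
    print(f.__name__, "RMSE=%.4f" % rmse)
```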
Graphical abstract

Figure 1: Flowchart of time series reconstruction.
Figure 2: Distribution of sampling points and stable land cover types. Thirteen stable land cover types from 2001 to 2005 are extracted from the MCD12Q1.006 product. For the nine vegetation land cover types, 100 sampling points are obtained for each of the ENF, EBF, DBF, and MF datasets and 200 sampling points for each of the shrubland, savanna, grassland, and cropland datasets.
Figure 3: Gap-filling method. Solid time series points represent known values, and hollow points represent missing values.
Figure 4: Trend-fill method.
Figure 5: Fitting RMSE distribution on different land cover types among different models. The panels show the fitting RMSE probability density distributions of the different models; the frequency is the number of time series points in a certain RMSE range, and different land cover types may have different RMSE categories.
Figure 6: R² distribution on different land cover types among different models. The panels show the fitting R² probability density distributions of the different models; the frequency is the number of time series points in a certain R² range.
Figure 7: Fitting curves of the different models with no seasonal division. The panels show yearly time series fitted by the CCTM, Harmonic, Exp, and Ln models.
Figure 8: Reconstruction RMSE distribution of NDVI time series data with the smoothing process. The frequency is the percentage of time series points in a certain RMSE range.
Figure 9: R² probability distribution of reconstructed NDVI time series data after smoothing for different land cover types.
Figure 10: RMSE of reconstructed MSR time series data after smoothing for different land cover types.
Figure 11: R² probability distribution of reconstructed MSR time series data after smoothing for different land cover types.
48 pages, 16390 KiB  
Review
Remote Sensing of Geomorphodiversity Linked to Biodiversity—Part III: Traits, Processes and Remote Sensing Characteristics
by Angela Lausch, Michael E. Schaepman, Andrew K. Skidmore, Eusebiu Catana, Lutz Bannehr, Olaf Bastian, Erik Borg, Jan Bumberger, Peter Dietrich, Cornelia Glässer, Jorg M. Hacker, Rene Höfer, Thomas Jagdhuber, Sven Jany, András Jung, Arnon Karnieli, Reinhard Klenke, Toralf Kirsten, Uta Ködel, Wolfgang Kresse, Ulf Mallast, Carsten Montzka, Markus Möller, Hannes Mollenhauer, Marion Pause, Minhaz Rahman, Franziska Schrodt, Christiane Schmullius, Claudia Schütze, Peter Selsam, Ralf-Uwe Syrbe, Sina Truckenbrodt, Michael Vohland, Martin Volk, Thilo Wellmann, Steffen Zacharias and Roland Baatz
Remote Sens. 2022, 14(9), 2279; https://doi.org/10.3390/rs14092279 - 9 May 2022
Cited by 20 | Viewed by 5881
Abstract
Remote sensing (RS) enables cost-effective, extensive, continuous and standardized monitoring of the traits and trait variations of geomorphology and its processes, from the local to the continental scale. To implement and better understand RS techniques and the spectral indicators derived from them for monitoring geomorphology, this paper presents a new perspective for defining and recording five characteristics of geomorphodiversity with RS, namely: geomorphic genesis diversity, geomorphic trait diversity, geomorphic structural diversity, geomorphic taxonomic diversity, and geomorphic functional diversity. In this respect, geomorphic trait diversity is the cornerstone and is essential for recording the other four characteristics using RS technologies. All five characteristics are discussed in detail and reinforced with numerous examples from various RS technologies. Methods for classifying the five characteristics of geomorphodiversity using RS, as well as the constraints on monitoring the diversity of geomorphology with RS, are discussed. RS-aided techniques that can be used for monitoring geomorphodiversity in regimes with changing land-use intensity are presented. Further, new approaches to geomorphic traits that enable the monitoring of geomorphodiversity through the valorisation of RS data from multiple missions are discussed, as is the ecosystem integrity approach. Likewise, the approach of recording the five characteristics of geomorphodiversity with RS is discussed, along with existing approaches for recording spectral geomorphic traits/trait variations and indicators and for assessing geomorphodiversity. It is shown that the literature contains no comparable approach for defining and recording the five characteristics of geomorphodiversity using only RS data. Finally, the importance of the digitization process and the use of data science for research in the field of geomorphology in the 21st century is elucidated and discussed. Full article
Figure 1: Curves of developing geodiversity and biodiversity, after Gray et al. [6]; reprinted with permission from Gray et al., 2022, Elsevier, license no. 5225220111723.
Figure 2: The mapped sub-topics and the relationships among the sub-topics of the research articles found using the search terms "geomorphology" and "remote sensing". Two (left), five (center), and ten (right) topics were identified at the different levels, with shrinking proportions, respectively. The height of the topic bars corresponds to the topic proportion in the collection of articles. The bands' width is proportional to the number of documents affiliated with the inter-related topics. Colors are only meant to distinguish between topics.
Figure 3: The five characteristics of geomorphodiversity exist on all the spatial, temporal, and directional scales of geomorphic organization (modified after Lausch et al. [7]).
Figure 4: Different air- and spaceborne remote-sensing platforms for assessing geomorphodiversity, geomorphic traits, and their changes and disturbances: (a) unmanned aerial vehicle (UAV or drone); (b) microlight, gravity-controlled; (c) gyrocopter or microlight helicopter; (d, top) ECO-Dimona; (d, bottom) Cessna; and (e) satellite (from Lausch et al. [47]).
Figure 5: In situ and remote sensing (RS) approaches, common links, and the constraints of RS for monitoring the five characteristics of geomorphodiversity. Geomorphological traits are the crucial link between in situ and RS monitoring approaches (from Lausch et al. [7]).
Figure 6: Examples of spectral traits of geomorphodiversity, monitored by RS technologies.
Figure 7: (a) Dune sand, (b) sand and gravel, (c) gravel, (d) crushed stone, and (e) rubble; the characteristics and spatial-temporal distribution of geomorphic traits are important factors for monitoring geomorphodiversity.
Figure 8: Changes in geomorphic traits lead to geomorphic trait variations. These changes can also lead to changes in vegetation traits, as geo- and biodiversity interact with each other directly or indirectly. This example shows changes in geomorphology caused by underground open-cast mining with subsequent landslides. (a) Woody plants at the edge of a watercourse; (b) overthrow of woody plants following the collapse of underground mining tunnels and subsequent landslides; (c) digital elevation model (DEM) recorded by airborne laser scanning (ALS) three months before the event (2017/09); (d) DEM recorded by ALS after the event (2017/12); (e) difference model showing the geomorphological changes caused by the event.
Figure 9: For discrimination and successful monitoring, in addition to the characteristics and distribution of geomorphic traits and their changes, the spatial characteristics of the deployed RS sensors are of major importance. The importance of spatial resolution is illustrated through a comparison of DEMs of a post-mining potash tailings pile, Teutschenthal-Bahnhof, near Halle, Germany (see also Schwefel et al. [94]): (a) LiDAR (DEM 1), 1 m; (b) photo of the post-mining landscape with the 95 m high potash tailings pile; (c) SRTM (DEM 90), 90 m; (d) ASTER (DEM 30), 30 m; (e) DEM generated from height information of the land surveying office LVermGeo (DEM 10), 10 m; (f) SAR (DEM 5), 5 m; (g) LiDAR (DEM 1), 1 m (from Lausch et al. [7]).
Figure 10: All five characteristics of geomorphodiversity can be recorded using RS technologies, illustrated here by examples: (I) geomorphic trait diversity, (a) AVIRIS hyperspectral RS data used to classify the mineral distribution and geomorphic traits in the Cuprite area, Nevada (from Clark et al. [96]); (II) geomorphic genesis diversity, (b) photo of the characteristic relief forms created by exogenous and endogenous genesis processes, (c) TIR image of part of the Siberian Trap supervolcano; (III) geomorphic structural diversity, (d) dune pattern mapping derived with RS (from Shumack et al. [97]); (IV) geomorphic taxonomic diversity, (e) classification of different mountain types using RS (from Farmakis-Serebryakova et al. [98]); (V) geomorphic functional diversity, (f-1) processes of geogenesis and river degradation lead to changes in morphometric river features, (f-2) the morphometric changes can be recorded using RS data, reprinted with permission from Ventura et al. [99], 2021, Elsevier, license number 4856041399548; (g) the integration and combination of all five features form the basis of the geo-spectranometric approach and lead to the 'spectral fingerprint of geomorphology and geomorphodiversity'. All features and individual figures are explained in detail in the following chapters.
Figure 11: (1) Geogenic exogenous and endogenous processes, such as a volcanic eruption (a), lead to characteristic geogenic geomorphic traits. (2) Geomorphic traits can be mineralogical, structural, taxonomic, and functional traits, such as (b,c) different minerals, (d) different structural forms and form complexes, and (e) different rock types. Consequently, processes such as volcanic eruptions lead to (3) geomorphic trait variations and the shaping of terrain and mountains. Geomorphic traits and trait variations produce a morphologically specific (4) spectral response, which can be detected using different RS technologies. (f) Laboratory spectral features of different hydrothermal minerals and mapping results from RS analysis are crucial for mineral classification; (c) VNIR-AVIRIS hyperspectral data from the Cuprite area (Nevada) show results of lithography classification using hyperspectral RS data; (g) representation of a volcanic eruption of the Siberian Trap using RS; (h) distribution of volcanic deposits obtained by differencing pre-eruptive and post-eruptive DEMs derived from ASTER and PlanetScope RS data, location: Nabro volcano. Image sources: (a,b) from Peyghambari and Zhang [108]; (c,f) from Clark et al. [96]; (d) from Shumack et al. [97]; (e) from Farmakis-Serebryakova et al. [98]; (h) reprinted with permission from Calvari et al. [109], 2022, Elsevier, license number 5303110916041.
Figure 12: Geomorphic structural diversity, monitored with remote sensing technology. Examples of dune-field landscape patterns in remotely sensed data: (a) barchan dunes, (b) linear dunes, (c) dome dunes, (d) reticulate dunes, and (e) longitudinal dunes (reprinted with permission from Zeng et al. [117], 2021, Elsevier, license no. 5176660806321).
Figure 13: Geomorphic taxonomic diversity using RS. The most suitable relief shading methods per landform based on the survey results: the clear sky model method for (a) block mountains, (b) folded mountains, and (c) mountains formed by erosion processes; cluster shading for (d) drumlins, (e) plateaus, and (f) V-shaped valleys; (g) clear sky model with custom illumination (here, S) for U-shaped valleys; (h) standard hill-shading with custom illumination (here, W) for alluvial fans; and (i) aspect shading for glaciers (from Farmakis-Serebryakova et al. [98]).
Figure 14: Monitoring the status of and changes to geomorphic characteristics with RS. (a) Different processes during geogenesis lead to the formation of specific morphometric fluvial traits, the meanders; (b) the entire river system is characterized by these meanders. (2) Processes/drivers such as land-use intensity, river regulation, or barrages lead to (c) changes in structural and functional fluvial traits (fluvial trait variations) (d,e). These fluvial trait variations lead to spectral responses in the remote sensing signal (d). Example of monitoring temporal changes of fluvial traits: vertical displacement rate of the river system from 2006-2010 with remote sensing technologies (LiDAR) (d,e). Reprinted with permission from Ventura et al. [99], 2021, Elsevier, license no. 4856041399548.
Figure 15: By comparing evolutionary and anthropogenically induced geomorphic structures and structural forms, conclusions can be drawn about the degree of anthropogenic overprinting. The anthropogenic fingerprint on geomorphology can be derived using RS technologies. "Images illustrating the detection of the anthropogenic topographic signature of terraced landforms in the Alpine context (Trento Province, central Italian Alps, Italy). Images of natural (a-d) and terraced landscapes (e-h) are derived from aerial photographs (a,e); shaded relief maps are derived from 2 m LiDAR digital terrain models (DTMs) (b,f); slope maps (c,g); and maps of the slope local length of auto-correlation (SLLAC) (d,h). The LiDAR dataset is offered as a free download by the Autonomous Province of Trento (Alps). Natural landscapes show maps of SLLAC with randomly distributed elements and highly noisy backgrounds, whereas the construction of terraces yields a clear topographic signature that results in more regular SLLAC maps, with ordered elongated elements that follow the terrace benches" (from Braun et al. [28]).
Figure 16: A selection of fluvial geomorphic processes/changes, caused by natural processes or by declining and increasing human land-use intensity, with images recorded by Landsat optical remote-sensing missions. The changes lead to different traits and taxonomic changes and act on different time scales. Image sources: Landsat satellite images, https://earthobservatory.nasa.gov/; (a,b) land-use intensity low, https://earthobservatory.nasa.gov/images/89266/a-shape-shifting-river-in-bolivia; (c,d) land-use intensity increasing, https://earthobservatory.nasa.gov/world-of-change/YellowRiver/show-all; (e,f) land-use intensity increasing, https://earthobservatory.nasa.gov/images/91083/reshaping-the-xingu-river; (g,h) land-use intensity declining, https://earthobservatory.nasa.gov/world-of-change/WaxLake (all accessed on 5 January 2022).
Figure 17: (a) Schematic diagram of space-for-time substitution, from Huang et al. [126]; (b) evolution patterns of mountain types derived with RS, from Farmakis-Serebryakova and Hurni [98]; (c) examples of anthropogenic surface patterns derived with airborne LiDAR, from Tarolli et al. [17].
Figure 18: Linking in situ/field-monitoring approaches with different RS platforms (high-frequency spectral wireless sensor networks, ICOS towers, drones, and air- and spaceborne sensors) for the comprehensive monitoring of the traits of geomorphology, geomorphodiversity, and biodiversity and their interactions, as well as improving the calibration and validation of remote sensing data (modified after Lausch et al. [139]).
Figure 19: Keywords and their co-occurrence in a bibliographic search in the Web of Science with the topics "geomorphodiversity", "geomorphological diversity", or "landform diversity"; date of analysis April 2022, finding 653 papers and around 100 keywords.
21 pages, 9410 KiB  
Article
Estimates of Hyperspectral Surface and Underwater UV Planar and Scalar Irradiances from OMI Measurements and Radiative Transfer Computations
by Alexander Vasilkov, Nickolay Krotkov, David Haffner, Zachary Fasnacht and Joanna Joiner
Remote Sens. 2022, 14(9), 2278; https://doi.org/10.3390/rs14092278 - 9 May 2022
Cited by 2 | Viewed by 2713
Abstract
Quantitative assessment of UV effects on aquatic ecosystems requires an estimate of the in-water hyperspectral radiation field. Solar UV radiation in ocean waters is estimated on a global scale by combining extraterrestrial solar irradiance from the Total and Spectral Solar Irradiance Sensor (TSIS-1), satellite estimates of cloud/surface reflectivity, ozone from the Ozone Monitoring Instrument (OMI), and in-water chlorophyll concentration from the Moderate Resolution Imaging Spectroradiometer (MODIS) with radiative transfer computations in the ocean-atmosphere system. A comparison of collocated OMI-derived surface irradiance with Marine Optical Buoy (MOBY) measurements shows good agreement, within 5%, for different seasons. To estimate scalar irradiance at the ocean surface and in water, we propose scaling the planar irradiance, calculated from satellite observations, on the basis of Hydrolight computations. Hydrolight calculations show that the diffuse attenuation coefficients of scalar and planar irradiance with depth are quite close to each other; as a result, the differences between the planar and scalar penetration depths are small and do not exceed a couple of meters. The dominant factor defining the UV penetration depths is chlorophyll concentration. Other constituents in water absorb in addition to chlorophyll; their absorption can be related to that of chlorophyll in Case I waters using an inherent optical properties (IOP) model. Other input parameters are less significant. The DNA damage penetration depths vary from a few meters in productive waters to about 30-35 m in the clearest waters. A machine learning approach (an artificial neural network, NN) was developed based on the full physical algorithm for computational efficiency. The NN shows very good performance in predicting the penetration depths (within 2%). Full article
(This article belongs to the Topic Advances in Environmental Remote Sensing)
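For intuition about penetration depths, the following minimal sketch computes the depth at which a biologically weighted dose rate falls to an assumed 10% of its surface value under a depth-constant diffuse attenuation coefficient. The paper derives the attenuation from Hydrolight runs, so treat the exponential profile, the 10% level, and the toy spectra below as assumptions.

```python
import numpy as np

def penetration_depth(E0, Kd, wavelengths, action_spectrum, level=0.1):
    """Hedged sketch: depth where the biologically weighted dose rate drops
    to `level` of its surface value, assuming E(z) = E(0) * exp(-Kd * z)
    independently in each spectral band (well-mixed water, constant Kd)."""
    z = np.linspace(0.0, 60.0, 2401)                       # depth grid [m]
    Ez = E0[None, :] * np.exp(-np.outer(z, Kd))            # spectral irradiance
    dose = np.trapz(Ez * action_spectrum[None, :], wavelengths, axis=1)
    frac = dose / dose[0]                                  # monotone decay in z
    return float(np.interp(level, frac[::-1], z[::-1]))    # invert the decay

# Toy inputs: 290-400 nm, stronger attenuation and DNA weighting in the UV-B.
wl = np.linspace(290, 400, 111)
E0 = np.ones_like(wl)                                      # flat surface spectrum
Kd = 0.5 * np.exp(-(wl - 290) / 60.0) + 0.05               # 1/m, falls with wl
A = np.exp(-(wl - 290) / 15.0)                             # DNA-damage-like weights
print("penetration depth ~ %.1f m" % penetration_depth(E0, Kd, wl, A))
```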
Figure 1: Extraterrestrial spectral irradiance at different spectral resolutions shown for (a) the entire UV spectral range and (b) the OCI spectral range shown in the light blue cutout in (a). The color bar represents the boxcar smoothing width in nanometers.
Figure 2: (a) A comparison of the smoothed surface irradiance (E_s) measurements from MOBY (solid line) and OMI (symbols) using the Solar Ultraviolet Spectral Irradiance Monitor (plus signs) and TSIS (asterisks) extraterrestrial solar irradiance spectra for a selected OMI pixel. (b) The mean difference and standard deviation between the OMI-derived surface irradiance and the MOBY-measured irradiance.
Figure 3: Spectral planar (solid lines) and scalar (dashed lines) irradiances. Left: absolute values of the irradiances at two depths, 2 m (red lines) and 20 m (blue lines). Right: planar and scalar irradiances normalized to their values at 400 nm. The black line shows the DNA damage action spectrum.
Figure 4: DNA-weighted planar (solid lines) and scalar (dashed lines) irradiances at two depths: 2 m (red lines) and 20 m (blue lines).
Figure 5: Diffuse attenuation coefficients K_d (red lines) and K_o (blue lines) as a function of (a) wavelength; (b) depth for two SZAs of 15° (solid lines) and 60° (dashed lines); (c) chlorophyll concentration for two wavelengths, 300 nm (solid lines) and 380 nm (dashed lines); and (d) SZA for two chlorophyll concentrations, 0.1 mg/m³ (solid lines) and 1.0 mg/m³ (dashed lines).
Figure 6: Maps of the main input parameters: (a) solar zenith angle; (b) chlorophyll concentration in mg/m³; (c) ozone amount in DU; and (d) effective cloud optical depth at 360 nm.
Figure 7: Maps of the surface erythemal dose rates: (a) calculated using the scalar irradiance; (b) calculated using the planar irradiance; (c) ratio of the scalar erythemal dose rate to the planar dose rate.
Figure 8: Maps of the penetration depths for the DNA damage dose rates: (a) calculated using the planar irradiance; (b) calculated using the scalar irradiance; (c) the difference between the planar and scalar penetration depths. The color bars represent the penetration depths in meters.
Figure 9: Comparisons of Hydrolight penetration depth simulations and the neural network estimate for 7 January 2005. The left panel shows comparisons for orbits included in neural network training; the right panel shows orbits on this date not included in the training.
Figure 10: Map of planar penetration depth on 7 January 2005. The top panel shows the planar penetration depth from Hydrolight, the middle panel shows the planar penetration depth from the neural network, and the bottom panel shows the percent difference of Hydrolight versus the neural network estimate. The color bars represent the penetration depths in meters and the percent difference.
Figure A1: The particulate-matter absorption coefficient. Left: a_ph as a function of wavelength for different chlorophyll concentrations (1: Chl = 0.55 mg/m³; 2: Chl = 0.13 mg/m³; 3: Chl = 0.03 mg/m³). Right: a_ph as a function of chlorophyll concentration for different wavelengths (1: 300 nm; 2: 340 nm; 3: 380 nm).
Figure A2: The scaling factor f as a function of different variables: (a) wavelength for different SZAs; (b) SZA at 300 nm for different ozone amounts; (c) chlorophyll concentration for different wavelengths; (d) ozone amount for different wavelengths.
26 pages, 9706 KiB  
Article
Identification of Coupling Relationship between Ecosystem Services and Urbanization for Supporting Ecological Management: A Case Study on Areas along the Yellow River of Henan Province
by Hejie Wei, Dong Xue, Junchang Huang, Mengxue Liu and Ling Li
Remote Sens. 2022, 14(9), 2277; https://doi.org/10.3390/rs14092277 - 9 May 2022
Cited by 25 | Viewed by 3210
Abstract
Urbanization has an important effect on ecosystem services (ESs), and identifying the relationship between urbanization and ESs can provide a decision-making reference for regional ecological protection and management. Taking the areas along the Yellow River of Henan Province (AYRHP) as the research area, this study establishes a coupling system of ESs and urbanization to reveal the coupling relationship between the two. ESs are estimated using the Carnegie–Ames–Stanford approach, the revised universal soil loss equation, and the Integrated Valuation of Ecosystem Services and Trade-offs (InVEST) model. The urbanization level is evaluated from three dimensions: population, economy, and land. The coupling coordination relationship between the various ESs and urbanization in the AYRHP from 2000 to 2018 is quantified at the county scale using the coupling coordination degree (CCD) model. The lead–lag relationship between ESs and urbanization is identified using the relative development degree model, and ecological management zoning is conducted. Results show that over the study period, net primary production (NPP), soil conservation, and food production increased, whereas water yield decreased. Population, economic, and land urbanization levels all rose, and the comprehensive urbanization level increased by 51.63%. The total CCD between NPP, food production, and water yield and comprehensive urbanization indicates basic or moderate coordination, whereas that between soil conservation and comprehensive urbanization indicates moderate maladjustment. Over the study period, the coupling coordination between NPP and food production and comprehensive urbanization increased, that between water yield and comprehensive urbanization fluctuated, and that between soil conservation and comprehensive urbanization decreased. The relative development degree results for 2018 show that food production, water yield, and soil conservation lag behind the urbanization level in most counties along the Yellow River of Henan Province. On the basis of the lead–lag relationship between different ESs and the urbanization level, the AYRHP is divided into an ecological reconstruction area, an ecological and agricultural improvement area, and an ecological conservation area. The CCD and relative development degree models can be used to evaluate the coordination relationship between ESs and urbanization, providing scientific support for regional ES management. Full article
(This article belongs to the Special Issue Remote Sensing Applications in Urban Ecosystem Services)
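The CCD and relative development degree models follow a standard formulation in this literature. A minimal sketch, assuming equal weights a = b = 0.5 and inputs already normalized to [0, 1] (the paper's exact weights and normalization are not given in the abstract), looks like this:

```python
import numpy as np

def coupling_coordination(es, urb, a=0.5, b=0.5):
    """Coupling coordination degree (CCD) in its common form.

    es, urb: normalized indices in [0, 1] for the ecosystem service level
    and the comprehensive urbanization level, per county.
    """
    C = 2.0 * np.sqrt(es * urb) / (es + urb + 1e-12)   # coupling degree
    T = a * es + b * urb                                # comprehensive index
    D = np.sqrt(C * T)                                  # coordination degree
    return D

def relative_development(es, urb):
    """Relative development degree; a common convention (assumed here) reads
    <0.8 as ES lagging, 0.8-1.2 as synchronous, >1.2 as ES leading."""
    return es / (urb + 1e-12)

# Toy per-county values.
es = np.array([0.62, 0.35, 0.80])
urb = np.array([0.55, 0.70, 0.40])
print(coupling_coordination(es, urb))    # e.g., moderate vs. basic coordination
print(relative_development(es, urb))     # lead-lag diagnosis per county
```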
Graphical abstract

Figure 1: Location of the research area.
Figure 2: The framework of coupling ecosystem services and urbanization.
Figure 3: Spatiotemporal distribution patterns of NPP in the AYRHP at grid and county scales.
Figure 4: Spatiotemporal distribution patterns of the soil conservation amount in the AYRHP at grid and county scales.
Figure 5: Spatiotemporal distribution patterns of water yield in the AYRHP at raster and county scales.
Figure 6: Spatiotemporal distribution patterns of food production in the AYRHP at raster and county scales.
Figure 7: Spatiotemporal distribution patterns of population, economic, and land urbanization in the AYRHP at the county scale.
Figure 8: Spatiotemporal distribution patterns of comprehensive urbanization in the AYRHP at the county scale.
Figure 9: Temporal change in the coupling coordination between ESs and comprehensive urbanization in the AYRHP.
Figure 10: Spatiotemporal change in the coupling coordination between ESs and comprehensive urbanization in the AYRHP.
Figure 11: Spatiotemporal change in the relative development of ESs and comprehensive urbanization in the AYRHP at the county scale.
Figure 12: Ecological management zoning at the county level in the AYRHP.
19 pages, 9695 KiB  
Article
A Context Feature Enhancement Network for Building Extraction from High-Resolution Remote Sensing Imagery
by Jinzhi Chen, Dejun Zhang, Yiqi Wu, Yilin Chen and Xiaohu Yan
Remote Sens. 2022, 14(9), 2276; https://doi.org/10.3390/rs14092276 - 9 May 2022
Cited by 28 | Viewed by 5204
Abstract
The complexity and diversity of buildings make it challenging to extract low-level and high-level features with strong representational power using deep neural networks in building extraction tasks. Meanwhile, deep neural network-based methods have many parameters, which consume substantial memory and time during training and testing. We propose a novel fully convolutional neural network, the Context Feature Enhancement Network (CFENet), to address these issues. CFENet comprises three modules: the spatial fusion module, the focus enhancement module, and the feature decoder module. First, the spatial fusion module aggregates the spatial information of low-level features to obtain building outline and edge information. Second, the focus enhancement module fully aggregates the semantic information of high-level features to filter information on building-related attribute categories. Finally, the feature decoder module decodes the output of the two preceding modules to segment the buildings more accurately. In a series of experiments on the WHU Building Dataset and the Massachusetts Building Dataset, CFENet achieves a better balance of efficiency and accuracy than the four methods we compared against, performing best on all five evaluation metrics: PA, PC, F1, IoU, and FWIoU. This indicates that CFENet can effectively enhance and fuse the low-level and high-level features of buildings, improving building extraction accuracy. Full article
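For reference, the five metrics listed above can be computed from a confusion matrix as in the following minimal sketch. These are the standard definitions; reading "PC" as per-class precision is our assumption, since the abstract does not spell it out.

```python
import numpy as np

def segmentation_metrics(pred, gt, n_classes=2):
    """Standard segmentation metrics from a confusion matrix.
    pred, gt: integer label maps of identical shape."""
    cm = np.bincount(n_classes * gt.ravel() + pred.ravel(),
                     minlength=n_classes ** 2).reshape(n_classes, n_classes)
    tp = np.diag(cm).astype(float)
    pa = tp.sum() / cm.sum()                      # pixel accuracy
    precision = tp / cm.sum(axis=0)               # per-class precision ("PC", assumed)
    recall = tp / cm.sum(axis=1)
    f1 = np.nanmean(2 * precision * recall / (precision + recall))
    iou = tp / (cm.sum(axis=1) + cm.sum(axis=0) - tp)
    freq = cm.sum(axis=1) / cm.sum()
    fwiou = (freq * iou).sum()                    # frequency-weighted IoU
    return dict(PA=pa, PC=np.nanmean(precision), F1=f1,
                IoU=np.nanmean(iou), FWIoU=fwiou)

# Toy usage: a noisy copy of the ground truth as the "prediction".
rng = np.random.default_rng(0)
gt = (rng.random((256, 256)) > 0.7).astype(int)
pred = gt ^ (rng.random((256, 256)) > 0.95).astype(int)
print(segmentation_metrics(pred, gt))
```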
Figure 1: CFENet adopts ResNet-101 as the backbone and consists of three novel modules: the Spatial Fusion Module (SFM), the Focus Enhancement Module (FEM), and the Feature Decoder Module (FDM). The input image size is 512 × 512 pixels, and the outputs of the four stages of ResNet-101 are used as Features 1-4. Features 1 and 2 are fed into the SFM, Features 3 and 4 into the FEM, and the FDM produces the final segmentation result. (a) Overall structure of CFENet; (b) structure diagrams of the SFM, FEM, and FDM.
Figure 2: The architecture of the Location Block.
Figure 3: Examples from the WHU Building Dataset. The first row shows the original images; the second row shows the ground truth.
Figure 4: An example from the Massachusetts Building Dataset. (a) Original image; (b) ground truth.
Figure 5: Results of different methods on buildings of small size and complex shape. (a,b) Original images; (c) ground truth; (d) results of CFENet; (e-h) results of U-Net, Deeplabv3+, PSPNet, and HRNet, respectively. Green, red, and blue pixels represent true positives, false positives, and false negatives, respectively. The red boxes mark areas where the contrast between methods is most obvious.
Figure 6: Results of different methods on large buildings with complex textures and colors. (a) Original image; (b) ground truth; (c) results of CFENet; (d-g) results of U-Net, Deeplabv3+, PSPNet, and HRNet, respectively. Colors and red boxes as in Figure 5.
Figure 7: Results of different methods on easily confused buildings. (a-c) Original input images; (d) ground truth; (e-i) results of our method and the other four methods, respectively. Colors and red boxes as in Figure 5.
Figure 8: Extraction results of different methods for different types of buildings on the Massachusetts Building Dataset. (a) Original input image; (b-f) results of our method and the other four methods, respectively. Colors as in Figure 5.
Figure 9: Visualization results of the ablation experiments. (a) Original image; (b) output of BaseNet (ResNet-101); (c) output of BaseNet + SFM; (d) output of BaseNet + SFM + FEM; (e) output of CFENet (BaseNet + SFM + FEM + FDM); (f) ground truth. The boxes mark areas where the contrast is most obvious.
Figure 10: Comparison of the experimental results and model sizes of different methods. Line graphs with different colors represent different evaluation metrics; the histogram represents the model sizes of the different methods.
Figure 11: Heatmaps of CFENet's feature maps: the outputs of the spatial fusion module (SFM), the focus enhancement module (FEM), and the feature decoder module (FDM).
16 pages, 7429 KiB  
Article
Research on Generalized RQD of Rock Mass Based on 3D Slope Model Established by Digital Close-Range Photogrammetry
by Qing Ding, Fengyan Wang, Jianping Chen, Mingchang Wang and Xuqing Zhang
Remote Sens. 2022, 14(9), 2275; https://doi.org/10.3390/rs14092275 - 9 May 2022
Cited by 6 | Viewed by 2247
Abstract
The traditional method of obtaining the rock quality designation (RQD) cannot fully reflect the anisotropy of a rock mass and thus cannot accurately reflect its quality. In methods that calculate RQD from a three-dimensional network simulation of discontinuities, the limited number of samples and the low accuracy of discontinuity data obtained by manual contact measurement introduce deviations into the simulated network, which affect the calculated result. Taking a typical slope in the Dongsheng quarry in Changchun City as an example, this study obtained the discontinuity data of the slope by digital close-range photogrammetry, which greatly enlarged the sample of discontinuity data and improved its quality. Accounting for the heterogeneity of the rock mass, the optimum threshold of discontinuity spacing was determined for surveying lines laid parallel to the different coordinate axes to calculate the generalized RQD, and the influence of measurement blank areas on the slope, caused by vegetation coverage or gravel accumulation, was eliminated. The true generalized RQD of the rock mass after eliminating the influence of blank areas was thus obtained. Experiments showed that, after eliminating the influence of blank areas, the generalized RQD of the slope rock mass more accurately represented the overall quality of the rock mass, offering a new approach to the quality evaluation of engineering rock masses. Full article
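As background, a generalized RQD along a single survey line can be sketched as below. The treatment of blank intervals is our reading of the abstract, and the exact bookkeeping in the paper may differ; classic RQD corresponds to a spacing threshold of 0.1 m.

```python
import numpy as np

def generalized_rqd(intersections, line_length, threshold, blanks=()):
    """Sketch of a generalized RQD along one survey line.

    intersections: sorted positions (m) where discontinuities cut the line.
    threshold: spacing threshold t (m); classic RQD uses t = 0.1 m.
    blanks: (start, end) intervals hidden by vegetation/debris, excluded from
    both the intact length and the effective line length (assumed rule).
    """
    cuts = np.concatenate(([0.0], np.asarray(intersections, float), [line_length]))
    segments = list(zip(cuts[:-1], cuts[1:]))

    def visible_length(a, b):
        length = b - a
        for s, e in blanks:                      # subtract overlap with blanks
            length -= max(0.0, min(b, e) - max(a, s))
        return length

    # Sum the visible parts of intact segments whose full spacing exceeds t.
    intact = sum(visible_length(a, b) for a, b in segments if (b - a) >= threshold)
    effective = visible_length(0.0, line_length)
    return 100.0 * intact / effective

# Toy usage: six discontinuity hits on a 3 m line with one 0.5 m blank area.
xs = [0.35, 0.42, 1.10, 1.18, 2.05, 2.60]
print(generalized_rqd(xs, 3.0, threshold=0.10, blanks=[(1.3, 1.8)]))  # ~94%
```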
Figure 1: Geographical location and real scene of the research slope.
Figure 2: (a) Principle of digital close-range photogrammetry; (b) schematic diagram of photography perpendicular to the slope.
Figure 3: (a) Pole diagram of the orientations of discontinuities; (b) spatial expression of the traces of discontinuities.
Figure 4: Three-dimensional representation of a discontinuity.
Figure 5: Determining the RQD of a single surveying line while eliminating the influence of measurement blank areas.
Figure 6: Three-dimensional disk model of discontinuities with ellipsoids.
Figure 7: Schematic diagram of surveying lines laid parallel to the coordinate axes.
Figure 8: Relationship between RQD and the distance between surveying lines.
Figure 9: RQD values on effective surveying lines parallel to the Y-axis under different thresholds.
Figure 10: RQD frequency distribution histogram of surveying lines parallel to the Y-axis under different thresholds.
Figure 11: Variation in the RQD mean and standard deviation with threshold t on surveying lines parallel to the Y-axis.
Figure 12: Relationship between the RQD standard deviation and the discontinuity spacing threshold.
13 pages, 2998 KiB  
Communication
Adaptive Subspace Signal Detection in Structured Interference Plus Compound Gaussian Sea Clutter
by Zeyu Wang, Jun Liu, Yachao Li, Hongmeng Chen and Mugen Peng
Remote Sens. 2022, 14(9), 2274; https://doi.org/10.3390/rs14092274 - 8 May 2022
Cited by 6 | Viewed by 2016
Abstract
This paper discusses the problem of detecting subspace signals in structured interference plus compound Gaussian sea clutter with persymmetric structure. The sea clutter is represented by a compound Gaussian process whose texture obeys the inverse Gaussian distribution. The structured interference lies in a known subspace that is independent of the target signal subspace. By resorting to the two-step generalized likelihood ratio test, two-step Rao, and two-step Wald design criteria, three adaptive subspace signal detectors are proposed. Moreover, the constant false-alarm rate property of the proposed detectors is proved. Experimental results based on IPIX real sea clutter data and simulated data show that the proposed detectors outperform their counterparts. Full article
(This article belongs to the Special Issue Target Detection and Information Extraction in Radar Images)
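The clutter model named in the abstract (compound Gaussian with an inverse-Gaussian texture and a persymmetric speckle covariance) is straightforward to simulate. The sketch below uses an exponentially correlated Toeplitz covariance, which is persymmetric by construction, and illustrative texture parameters; both choices are assumptions on our part.

```python
import numpy as np

def compound_gaussian_clutter(rng, n_pulses, n_samples, mu=1.0, lam=2.0, rho=0.9):
    """Simulate compound Gaussian clutter c_k = sqrt(tau_k) * g_k.

    g_k ~ CN(0, Sigma) is the speckle; tau_k is an inverse-Gaussian texture
    (numpy's Wald distribution). Sigma is exponentially correlated Toeplitz,
    hence persymmetric about its anti-diagonal. mu, lam, rho are illustrative
    values, not parameters from the paper.
    """
    idx = np.arange(n_pulses)
    sigma = rho ** np.abs(idx[:, None] - idx[None, :])   # speckle covariance
    L = np.linalg.cholesky(sigma)
    g = (rng.standard_normal((n_pulses, n_samples)) +
         1j * rng.standard_normal((n_pulses, n_samples))) / np.sqrt(2.0)
    tau = rng.wald(mu, lam, size=n_samples)              # inverse-Gaussian texture
    return np.sqrt(tau)[None, :] * (L @ g)

# Toy usage: 8 pulses, 1000 range cells; the texture makes the tails heavier
# than a Gaussian with the same average power.
rng = np.random.default_rng(7)
clutter = compound_gaussian_clutter(rng, n_pulses=8, n_samples=1000)
print(clutter.shape, np.var(clutter))
```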
Show Figures
Graphical abstract
Figure 1: False-alarm probabilities versus correlation coefficients for K = 2N.
Figure 2: Detection performance of the detectors: (a) K = N; (b) K = 4N.
Figure 3: ROC of the detectors for K = 2N and INR = 10 dB: (a) SCR = −5 dB; (b) SCR = 5 dB.
Figure 4: Detection performance of the detectors for K = 2N: (a) SCR = −5 dB; (b) SCR = 5 dB.
Figure 5: Contours of constant Pd for the proposed detectors: (a) K = N; (b) K = 4N.
Figure 6: Clutter amplitude and amplitude probability density function for VV polarization, 16th range cell, dataset 85: (a) clutter amplitude; (b) amplitude probability density function.
Figure 7: Detection performance of the detectors: (a) K = N; (b) K = 2N.
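As a rough illustration of the two-step design logic behind detectors like these, the sketch below estimates the clutter covariance from secondary range cells and plugs it into a generic matched-subspace GLRT statistic. It is an assumed, simplified stand-in: the paper's detectors additionally exploit persymmetry, the inverse Gaussian texture, and the structured-interference subspace, none of which are modeled here.

```python
import numpy as np

def sample_covariance(secondary):
    # secondary: (N, K) complex array of K clutter-only training cells
    K = secondary.shape[1]
    return secondary @ secondary.conj().T / K

def subspace_glrt(x, H, R):
    # x: (N,) test cell, H: (N, p) known signal subspace, R: (N, N) covariance
    L = np.linalg.cholesky(R)
    xw = np.linalg.solve(L, x)                      # whiten the test data
    Hw = np.linalg.solve(L, H)                      # whiten the signal subspace
    coef, *_ = np.linalg.lstsq(Hw, xw, rcond=None)  # LS fit within the subspace
    proj = Hw @ coef                                # projection onto the subspace
    num = np.vdot(proj, proj).real
    den = np.vdot(xw, xw).real - num
    return num / den                                # compare with a CFAR threshold

rng = np.random.default_rng(0)
N, K, p = 8, 16, 2
sec = rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))
H = rng.standard_normal((N, p)) + 1j * rng.standard_normal((N, p))
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
print(subspace_glrt(x, H, sample_covariance(sec)))
```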
17 pages, 31734 KiB  
Technical Note
Adaptive Kalman Filter for Real-Time Precise Orbit Determination of Low Earth Orbit Satellites Based on Pseudorange and Epoch-Differenced Carrier-Phase Measurements
by Min Li, Tianhe Xu, Yali Shi, Kai Wei, Xianming Fei and Dixing Wang
Remote Sens. 2022, 14(9), 2273; https://doi.org/10.3390/rs14092273 - 8 May 2022
Cited by 6 | Viewed by 3619
Abstract
Real-time precise orbit determination (POD) of low Earth orbiters (LEOs) is crucial for orbit maintenance as well as autonomous operation of space missions. The Global Positioning System (GPS) has become the dominant technique for real-time POD of LEOs. However, the observation conditions of near-Earth space are more challenging than those on the ground, and real-time POD accuracy can be seriously degraded when the observation environment suffers from strong space weather events, e.g., a heavy solar storm. In this study, we propose a reliable adaptive Kalman filter based on pseudorange and epoch-differenced carrier-phase measurements. This approach uses the epoch-differenced carrier phase to eliminate the ambiguities and thus significantly reduces the number of unknown parameters. Real calculations demonstrate that four to five observed GPS satellites are sufficient to solve for reliable position parameters. Furthermore, with accurate pseudorange and epoch-differenced carrier-phase-based reference orbits, orbital dynamic disturbances can be detected precisely and reliably with an adaptive Kalman filter. Analyses of Swarm-A POD show that sub-meter level real-time orbit solutions can be obtained when the observation conditions are good. For poor observation conditions, such as those of the GRACE-A satellite on 8 September 2017, when fewer than five GPS satellites were observed for 14% of the observation time, 1–2 m orbital accuracy can still be achieved with the proposed approach. Full article
(This article belongs to the Special Issue Precision Orbit Determination of Satellites)
Show Figures
Figure 1: Diagram of comparison of orbit discrepancies detected with different types of observations (left: pseudorange; right: pseudorange and epoch-differenced carrier phase).
Figure 2: Swarm-A real-time POD results on a normal day for Scheme 1.
Figure 3: Swarm-A real-time POD results on a normal day for Schemes 2 (left) and 3 (right).
Figure 4: Swarm-A real-time POD results on a normal day for Schemes 4 (left) and 5 (right).
Figure 5: Swarm-A real-time POD results of Scheme 1.
Figure 6: Swarm-A real-time POD results of Schemes 2 (left) and 3 (right).
Figure 7: Swarm-A real-time POD results of Schemes 4 (left) and 5 (right).
Figure 8: GRACE-A real-time POD results of Scheme 1.
Figure 9: GRACE-A real-time POD results of Schemes 2 (left) and 3 (right).
Figure 10: GRACE-A real-time POD results of Schemes 4 (left) and 5 (right).
Figure 11: The adaptive factors for Swarm-A and GRACE-A real-time POD solutions from Schemes 3 and 5 on the day of a solar storm. The two upper subplots are for Swarm-A and the two lower subplots for GRACE-A; the left subplots are from Scheme 3 (pseudorange-only) and the right from Scheme 5 (pseudorange and epoch-differenced carrier-phase-based).
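The adaptive part of such a filter can be illustrated with an innovation-based adaptive factor that inflates the predicted state covariance when the dynamics appear disturbed. The sketch below is a generic single filter step under assumed linear models F and H and a simple threshold c0; the paper's actual adaptive factor and its epoch-differenced phase observation model are more elaborate.

```python
import numpy as np

def adaptive_kf_step(x, P, z, F, H, Q, R, c0=1.5):
    x_pred = F @ x                                 # state prediction
    P_pred = F @ P @ F.T + Q
    v = z - H @ x_pred                             # innovation vector
    Pv = H @ P_pred @ H.T + R                      # predicted innovation covariance
    # normalized innovation statistic: large values flag dynamic disturbances
    stat = float(v @ np.linalg.solve(Pv, v)) / len(v)
    alpha = 1.0 if stat <= c0 else c0 / stat       # adaptive factor in (0, 1]
    P_pred = P_pred / alpha                        # inflate the prediction when
                                                   # the dynamics look disturbed
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ v
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```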
17 pages, 4326 KiB  
Article
Local Persistent Ionospheric Positive Responses to the Geomagnetic Storm in August 2018 Using BDS-GEO Satellites over Low-Latitude Regions in Eastern Hemisphere
by Jun Tang, Xin Gao, Dengpan Yang, Zhengyu Zhong, Xingliang Huo and Xuequn Wu
Remote Sens. 2022, 14(9), 2272; https://doi.org/10.3390/rs14092272 - 8 May 2022
Cited by 9 | Viewed by 2365
Abstract
We present the ionospheric disturbance responses over low-latitude regions by using total electron content (TEC) from Geostationary Earth Orbit (GEO) satellites of the BeiDou Navigation Satellite System (BDS), ionosonde data, and Swarm satellite data during the geomagnetic storm in August 2018. The results show that a prominent TEC enhancement over low-latitude regions is observed during the main phase of the storm. There is a persistent TEC increase lasting about 1–2 days and a moderately positive disturbance response during the recovery phase on 27–28 August, which differs from the general behavior of ionospheric TEC in previous storms. We also find that this phenomenon is a local-area disturbance of the ionosphere unique to the recovery phase of this storm. Enhanced foF2 and hmF2 of the ionospheric F2 layer are observed by the SANYA and LEARMONTH ionosonde stations during the recovery phase. The electron density from the Swarm satellites shows a strong equatorial ionization anomaly (EIA) crest over the low-latitude area during the main phase of the storm, simultaneous with the uplift of the ionospheric F2 layer seen by the SANYA ionosonde. Meanwhile, the thermospheric O/N2 ratio shows a local increase on 27–28 August over low-latitude regions. From these results, this study suggests that the uplift of the F-layer height and the enhanced O/N2 ratio are possibly the main factors causing the local-area positive disturbance responses during the recovery phase of the storm in August 2018. Full article
Show Figures
Figure 1: Blue triangles represent locations of MGEX stations, and the green squares represent ionosonde stations. The red stars represent projections of BeiDou Navigation Satellite System Geostationary Earth Orbit (BDS-GEO) satellites on the ground.
Figure 2: Temporal variations of solar wind velocity, IMF-Bz component, and Dst and Kp indices during 23–29 August 2018. The yellow shadow shows the period of the storm main phase.
Figure 3: Diurnal variations of Geostationary Earth Orbit (GEO) total electron content (TEC) and the corresponding upper and lower bounds calculated from the global ionospheric map (GIM) model over different stations during 23–29 August 2018. The red and purple areas represent positive and negative signatures, respectively. The yellow vertical bar is the duration of the main phase. VTEC is vertical TEC, and TECU is TEC units.
Figure 4: The foF2 and hmF2 (red lines) and corresponding averaged quiet values (blue lines) at the DARWIN and GUAM stations. The green and purple lines represent DfoF2 and DhmF2, respectively. The yellow bars represent the duration of the main phase.
Figure 5: Diurnal variations of GEO TEC and the corresponding disturbance bounds from four stations during 23–29 August 2018. The bottom panel shows the variation of solar wind velocity (blue line) and IMF-Bz (red line). The yellow bar is the disturbance duration during the recovery phase.
Figure 6: The foF2 and hmF2 (red lines) and corresponding averaged quiet values (blue lines) at the SANYA and LEARMONTH stations. The green and purple lines represent DfoF2 and DhmF2, respectively. The yellow bars represent the disturbance duration during the recovery phase.
Figure 7: The electron density Ne from Swarm satellites A and C on 25–27 August 2018.
Figure 8: Variation of the O/N2 ratio from TIMED/GUVI during 24–29 August 2018.
Figure 9: Variation of the interplanetary electric field during 24–28 August 2018.
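The positive and negative "signatures" in Figures 3 and 5 come from comparing storm-time TEC against quiet-time bounds. A minimal sketch of that bookkeeping is given below, assuming bounds of the quiet-day mean ± 2σ per UT hour; the paper derives its bounds from the GIM model, and its exact rule may differ.

```python
import numpy as np

def disturbance_bounds(quiet_tec, k=2.0):
    # quiet_tec: (n_quiet_days, 24) array of hourly TEC (TECU) on quiet days
    mean = quiet_tec.mean(axis=0)
    std = quiet_tec.std(axis=0, ddof=1)
    return mean - k * std, mean + k * std

def classify_disturbance(storm_tec, lower, upper):
    # +1 positive disturbance, -1 negative disturbance, 0 within quiet bounds
    return np.where(storm_tec > upper, 1, np.where(storm_tec < lower, -1, 0))
```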
20 pages, 3364 KiB  
Article
Combining Different Transformations of Ground Hyperspectral Data with Unmanned Aerial Vehicle (UAV) Images for Anthocyanin Estimation in Tree Peony Leaves
by Lili Luo, Qinrui Chang, Yifan Gao, Danyao Jiang and Fenling Li
Remote Sens. 2022, 14(9), 2271; https://doi.org/10.3390/rs14092271 - 8 May 2022
Cited by 18 | Viewed by 3378
Abstract
To explore rapid anthocyanin (Anth) detection technology based on remote sensing (RS) in tree peony leaves, we considered 30 species of tree peonies located in Shaanxi Province, China. We used an SVC HR-1024i portable field spectrometer and a mini unmanned aerial vehicle (UAV)-borne RS system to obtain hyperspectral (HS) reflectance and images of canopy leaves. First, we performed principal component analysis (PCA), first-order differential (FD), and continuum removal (CR) transformations on the original ground-based spectra; commonly used spectral parameters were implemented to estimate Anth content using multiple stepwise regression (MSR), partial least squares (PLS), back-propagation neural network (BPNN), and random forest (RF) models. The spectral transformations highlighted the characteristics of the spectral curves and improved the relationship between spectral reflectance and Anth, and the RF model based on the FD spectrum showed the best estimation accuracy (R2c = 0.91; R2v = 0.51). Then, the RGB (red-green-blue) gray vegetation index (VI) and texture parameters were constructed from the UAV images, and an Anth estimation model was built using the UAV parameters. Finally, the UAV imagery was fused with the ground spectral data, and a multisource RS model of Anth estimation was constructed, based on PCA + UAV, FD + UAV, and CR + UAV, using the MSR, PLS, BPNN, and RF methods. The RF model based on FD + UAV showed the best modeling and verification performance (R2c = 0.93; R2v = 0.76); compared with the FD-RF model, R2c increased only slightly, but R2v increased greatly, from 0.51 to 0.76, indicating improved modeling and testing accuracy. The optimal spectral transformation for Anth estimation in tree peony leaves was thus obtained, and a high-precision multisource RS model for Anth was constructed. Our results can inform the selection of ground-based HS transformations in future plant Anth estimation and serve as a theoretical basis for plant growth monitoring based on ground and UAV multisource RS. Full article
Show Figures
Graphical abstract
Figure 1: Location of the study area and distribution of samples.
Figure 2: Spectrometer and tree peony samples.
Figure 3: Unmanned aerial vehicle (UAV) image mosaic and information extraction.
Figure 4: Characteristics of the original spectrum, first-order differential (FD) spectrum, and continuum removal (CR) spectrum of tree peony leaves.
Figure 5: Principal component analysis (PCA) results of tree peony leaf spectra.
Figure 6: Coefficients of determination (R2) contour maps of the ratio vegetation index (RVI), differential vegetation index (DVI), normalized vegetation index (NDVI), and soil-adjusted vegetation index (SAVI).
Figure 7: Distribution of measured and predicted Anth in the random forest (RF) models: (a,b) calibration and test sets of the RF model based on principal component analysis with UAV data (PCA + UAV); (c,d) calibration and test sets of the RF model based on first-order differential with UAV data (FD + UAV); (e,f) calibration and test sets of the RF model based on continuum removal with UAV data (CR + UAV).
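For readers unfamiliar with the two ground-spectra transforms, a minimal sketch of first-order differentiation (FD) and continuum removal (CR) on a sampled reflectance spectrum follows; the variable names and the upper-convex-hull construction are illustrative, not taken from the paper.

```python
import numpy as np

def first_derivative(wl, refl):
    # FD spectrum: d(reflectance)/d(wavelength), with wl in nm
    return np.gradient(np.asarray(refl, float), np.asarray(wl, float))

def continuum_removal(wl, refl):
    # CR spectrum: reflectance divided by its upper convex hull
    wl = np.asarray(wl, float)
    refl = np.asarray(refl, float)
    hull = [0]
    for i in range(1, len(wl)):
        # pop hull points that fall on or below the chord to point i
        while len(hull) >= 2:
            x1, y1 = wl[hull[-2]], refl[hull[-2]]
            x2, y2 = wl[hull[-1]], refl[hull[-1]]
            if (x2 - x1) * (refl[i] - y1) >= (wl[i] - x1) * (y2 - y1):
                hull.pop()
            else:
                break
        hull.append(i)
    continuum = np.interp(wl, wl[hull], refl[hull])
    return refl / continuum
```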
23 pages, 7852 KiB  
Article
An Integrated Model of Summer and Winter for Chlorophyll-a Retrieval in the Pearl River Estuary Based on Hyperspectral Data
by Haitao Li, Xuetong Xie, Xiankun Yang, Bowen Cao and Xuening Xia
Remote Sens. 2022, 14(9), 2270; https://doi.org/10.3390/rs14092270 - 8 May 2022
Cited by 7 | Viewed by 2683
Abstract
Chlorophyll-a (Chla) is an important parameter for water quality. For remote sensing-based methods of measuring Chla, in-situ hyperspectral data are crucial for building retrieval models. In the Pearl River Estuary, we used 61 groups of in-situ hyperspectral data and corresponding Chla concentrations collected in July and December 2020 to build a Chla retrieval model that takes the two different seasons and the turbidity of the water into consideration. The following results were obtained. (1) Based on pre-processing techniques for hyperspectral data, the first derivative at 680 nm was shown to be the optimal band for the estimation of Chla in the Pearl River Estuary, with R2 > 0.8 and a MAPE of 26.03%. (2) To overcome the spectral resolution problem in satellite image retrieval, based on the simulated reflectance from the Sentinel-2 satellite and the shape of the discrete spectral curve, we constructed a multispectral model using the slope difference index method, which reached an R2 of 0.78 and a MAPE of 35.21% and can integrate the summer and winter data. (3) The slope difference method applied to the Sentinel-2 image shows better performance than the red-NIR ratio method. Therefore, the method proposed in this paper is practicable for Chla monitoring of coastal waters based on both in-situ data and images. Full article
(This article belongs to the Special Issue Hyperspectral Remote Sensing Technology in Water Quality Evaluation)
Show Figures
Graphical abstract
Figure 1: Location and spatial distribution of the sampling sites.
Figure 2: The technical roadmap of this study. It illustrates the critical parts, such as data preparation, data analysis, and results, and shows the necessary workflow for those parts.
Figure 3: The in-situ Rrs (by Formula (2)) and normalized Rrs: Rrs for July (a) and December (c), and their normalized counterparts (b,d).
Figure 4: Comparison of the correlation coefficients between Chla concentration and normalized reflectance in July, December, and both.
Figure 5: The derived models based on correlated individual bands in (a) July at 496 nm, (b) December at 489 nm, and (c) both at 673 nm.
Figure 6: The quadratic models of normalized Rrs(673) for July: (a) the original value (Chla) model and (b) the reciprocal Chla (1/Chla) model.
Figure 7: Results of band correlation analysis. All the possibilities of (a) correlation coefficients (dotted lines for negative correlation) and (b) R2 from the two-band ratio.
Figure 8: Normalized ratio models for the two monthly datasets and their correlation with Chla concentration: (a) normalized Rrs(683)/Rrs(677), (b) Rrs(640)/Rrs(667), and (c) normalized Rrs(705)/Rrs(665).
Figure 9: Sampling data collected in July and December based on first-derivative pre-processing.
Figure 10: The heatmap of correlation coefficients for July, December, and both.
Figure 11: Model of the first derivative at (a) 680 nm and (b) the same figure transposed from the x- to the y-axis.
Figure 12: Comparison of resampled reflectance (the average) based on the datasets collected in July and December.
Figure 13: Comparison of models using k1, k2, and k1 − k2: (a) using only k1 as the independent variable; (b) using only k2; (c) k1 − k2, using the slope difference index method.
Figure 14: (a) Subtraction and (b) multiplication model of the discrete spectral derivative method.
Figure 15: Relationship between actual Chla and Chla estimated by the model using bands of (a) 673, (b) 705/665, (c) 489, (d) 680, (e) 560~665~705, (f) 665–705, and (g) 705 × 783.
Figure 16: Models of different data and methods. Ratio method with (a) L2A image reflectance, (b) simulated Rrs, and (c) C2RCC image reflectance; slope difference method with (d) L2A, (e) simulated Rrs, and (f) C2RCC image.
Figure 17: Chla distribution simulated from different methods based on L2A and C2RCC image products: (a) L2A images with the band ratio method, (b) L2A images with simulated Rrs and the band ratio method, (c) C2RCC images with the band ratio method, (d) C2RCC images with simulated Rrs and the band ratio method, (e) L2A images with the slope difference method, (f) L2A images with simulated Rrs and the slope difference method, (g) C2RCC images with the slope difference method, (h) C2RCC images with simulated Rrs and the slope difference method.
Figure 18: Statistical relationship between actual Chla and estimated Chla from images. The subplot sequence is the same as in Figure 17.
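The slope difference index itself is simple to state. Below is a sketch under the assumption that the two slopes k1 and k2 are taken across the Sentinel-2 bands at 560, 665, and 705 nm, consistent with the 560~665~705 band set in Figure 15; the paper's exact band choice and regression are not reproduced here.

```python
def slope_difference_index(r560, r665, r705):
    # slopes of the two reflectance segments on either side of 665 nm
    k1 = (r665 - r560) / (665.0 - 560.0)
    k2 = (r705 - r665) / (705.0 - 665.0)
    return k1 - k2

# Chla would then be regressed against this index, e.g. chla = a * sdi + b,
# with a and b fitted on the in-situ samples.
```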
16 pages, 4688 KiB  
Article
A Neural Network Method for Retrieving Sea Surface Wind Speed for C-Band SAR
by Peng Yu, Wenxiang Xu, Xiaojing Zhong, Johnny A. Johannessen, Xiao-Hai Yan, Xupu Geng, Yuanrong He and Wenfang Lu
Remote Sens. 2022, 14(9), 2269; https://doi.org/10.3390/rs14092269 - 8 May 2022
Cited by 10 | Viewed by 2985
Abstract
Based on the Ocean Projection and Extension neural Network (OPEN) method, a novel approach is proposed to retrieve sea surface wind speed for C-band synthetic aperture radar (SAR). In order to prove the methodology with a robust dataset, five-year normalized radar cross section (NRCS) measurements from the advanced scatterometer (ASCAT), a well-known side-looking radar sensor, are used to train the model. In situ wind data from direct buoy observations, instead of reanalysis wind data or model results, are used as the ground truth in the OPEN model. The model is applied to retrieve sea surface winds from two independent data sets, ASCAT and Sentinel-1 SAR data, and has been well-validated using buoy measurements from the National Oceanic and Atmospheric Administration (NOAA) and China Meteorological Administration (CMA), and the ASCAT coastal wind product. The comparison between the OPEN model and four C-band model (CMOD) versions (CMOD4, CMOD-IFR2, CMOD5.N, and CMOD7) further indicates the good performance of the proposed model for C-band SAR sensors. It is anticipated that the use of high-resolution SAR data together with the new wind speed retrieval method can provide continuous and accurate ocean wind products in the future. Full article
Show Figures
Figure 1: Locations of the NDBC buoys used for training the OPEN model in this study. The yellow marks indicate the locations of NDBC buoy data.
Figure 2: Locations of the coastal buoys for the validation of SAR winds using the proposed model. The yellow rectangular box shows the locations of collocated SAR scenes, and the yellow marks indicate the locations of buoy data.
Figure 3: Histogram of collocated in situ (a) wind speeds and (b) wind directions from NDBC buoys.
Figure 4: Flowchart of the structure of the OPEN model.
Figure 5: Comparison of OPEN model results and label wind data from buoy measurements.
Figure 6: Comparisons of collocated NDBC buoy winds with retrieved wind speeds from the test data using (a) the OPEN model, (b) CMOD4, (c) CMOD5.N, (d) CMOD-IFR2, and (e) CMOD7.
Figure 7: Comparisons of collocated ASCAT coastal wind products with retrieved wind speeds from the test data using (a) the OPEN model, (b) CMOD4, (c) CMOD5.N, (d) CMOD-IFR2, and (e) CMOD7.
Figure 8: Comparisons of collocated coastal buoy winds with Sentinel-1 SAR winds using (a) OPEN, (b) CMOD4, (c) CMOD5.N, (d) CMOD-IFR2, and (e) CMOD7.
Figure 9: (a) Sentinel-1 C-band SAR image in VV polarization acquired on the ascending pass at 10:02 UTC on 12 January 2020, shown as greyscale values (in dB); (b) wind speeds retrieved by the OPEN model, with the color bar denoting wind speed (m/s).
Figure 10: Comparisons of retrieved winds using the OPEN model divided into (a) coastal and (b) offshore buoy wind speeds.
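In the same spirit, a wind-speed retrieval network can be sketched as a small MLP mapping radar observables to wind speed. The inputs, architecture, and the placeholder labels below are assumptions for illustration only; they do not reproduce the OPEN model, which is trained on collocated NRCS and buoy winds.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(-25, -5, n),    # NRCS sigma0 (dB)
    rng.uniform(20, 45, n),     # incidence angle (deg)
    rng.uniform(0, 360, n),     # relative wind direction (deg)
])
y = rng.uniform(2, 20, n)       # wind speed labels (m/s) -- placeholders, not buoy data

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(X, y)
print(model.predict(X[:3]))     # retrieved wind speeds (m/s)
```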
22 pages, 4953 KiB  
Article
Urbanization Level in Chinese Counties: Imbalance Pattern and Driving Force
by Baifa Zhang, Jing Zhang and Changhong Miao
Remote Sens. 2022, 14(9), 2268; https://doi.org/10.3390/rs14092268 - 8 May 2022
Cited by 33 | Viewed by 4130
Abstract
Urbanization level is a key indicator for socioeconomic development and policy making, but the measurement data and methods need further discussion due to the limitations of single indices and the availability and accuracy of statistical data. China is urbanizing rapidly, but the urbanization level at the county scale remains poorly quantified due to its complexity and the lack of unified and effective measurement indicators. In this paper, we propose a new urbanization index to measure the Chinese urbanization level at the county scale by integrating population, land, and economic factors; by fusing remote sensing data and traditional demographic data, we investigated the multi-dimensional unbalanced development patterns and the driving mechanism from 1995 to 2015. Results indicate that the average comprehensive urbanization level at the Chinese county scale increased from 31.06% in 1995 to 45.23% in 2015, and that urbanization levels based on the permanent population may overestimate China's urbanization process. There were significant but different spatial and temporal dynamic patterns in population, land, and economic urbanization levels as well as in the comprehensive urbanization level. The comprehensive urbanization level is high in the south-east and low in the north-west, divided by the "Hu Line". Registered-population urbanization is high along the northern border and in the eastern coastal areas, a pattern that strengthened over time. Economic urbanization based on nighttime lighting data is high in the east and low in the west. Land urbanization based on remote sensing data is high in the south and low in the north. The registered-population urbanization level is lower than the economic and land urbanization levels. County urbanization was driven by large population size, a reasonable industrial structure, and strong government capacity; urbanization levels of 38% and 59% can be regarded as the key nodes of the urbanization process. When the urbanization rate is lower than 38%, the secondary industry plays a strong role in powering urbanization; when the urbanization rate is between 38% and 59%, the promotional effect of the tertiary industry is more obvious, and that of the secondary industry gradually weakens. When the urbanization rate exceeds 59%, the tertiary industry becomes the major driver. Full article
(This article belongs to the Special Issue Remote Sensing of Night-Time Light)
Show Figures
Figure 1: Spatial distribution of population, economic, and land urbanization rates in China from 1995 to 2015.
Figure 2: Spatial distribution of the comprehensive urbanization rate of China's counties from 1995 to 2015.
Figure 3: Comparison between permanent resident population urbanization and comprehensive urbanization.
Figure 4: Number of counties with permanent population urbanization and comprehensive urbanization at different urbanization rates.
19 pages, 32130 KiB  
Article
Optical Turbulence Profile in Marine Environment with Artificial Neural Network Model
by Cuicui Bi, Chun Qing, Pengfei Wu, Xiaomei Jin, Qing Liu, Xianmei Qian, Wenyue Zhu and Ningquan Weng
Remote Sens. 2022, 14(9), 2267; https://doi.org/10.3390/rs14092267 - 8 May 2022
Cited by 12 | Viewed by 2283
Abstract
Optical turbulence strongly affects different types of optoelectronic and adaptive optics systems. Systematic direct measurements of optical turbulence profiles [Cn2(h)] are lacking for many climates and seasons, particularly in marine environments, because deploying instrumentation there is impractical and expensive. Here, a backpropagation neural network optimized by a genetic algorithm (GA-BP) is developed to estimate atmospheric turbulence profiles in marine environments; it is validated against corresponding Cn2(h) profile datasets from a field campaign of balloon-borne microthermal measurements at the Haikou marine environment site. Overall, the trend and magnitude of the GA-BP model and the measurements agree. The Cn2(h) profiles from the GA-BP model are generally superior to those obtained by the BP and physically based (HMNSP99) models. Several statistical operators were used to quantify the GA-BP model's performance in reconstructing optical turbulence profiles in marine environments. The characterization of the vertical distributions of optical turbulence profiles and the main integral parameters derived from the Cn2(h) profiles are presented. The median Fried parameter, isoplanatic angle, and coherence time are 9.94 cm, 0.69, and 2.85 ms, respectively, providing independent optical turbulence parameters for adaptive optics systems. The proposed approach exhibits potential for ground-based optical applications in marine environments. Full article
(This article belongs to the Section Atmospheric Remote Sensing)
Show Figures
Figure 1: Topological structure diagram of the GA-BP neural network model.
Figure 2: Mean relative error of the training and validation for different numbers of hidden neurons.
Figure 3: Algorithm flowchart of the GA-BP neural network model.
Figure 4: Training process of the GA-BP neural network model.
Figure 5: Topographical distribution map of the Haikou marine environment site. The black point represents the Haikou radiosonde station.
Figure 6: (a) Payload of the balloon-borne micro-thermometer measurement system. (b) Balloon-borne micro-thermometer measurement. (c) Balloon.
Figure 7: (a–i) Comparison of the Cn2(h) profiles (left) between the models (GA-BP, BP, and HMNSP99) and the measurement, and the L0(h) profiles (right) from the HMNSP99 model and the measurement, in the morning.
Figure 8: (a–l) Comparison of the Cn2(h) profiles (left) between the models (GA-BP, BP, and HMNSP99) and the measurement, and the L0(h) profiles (right) from the HMNSP99 model and the measurement, at night.
Figure 9: Statistical results of the Cn2(h) profiles for the GA-BP, BP, and HMNSP99 models in the morning. (a) BIAS. (b) RMSE. (c) Average Cn2(h) profiles.
Figure 10: Statistical results of the Cn2(h) profiles for the GA-BP, BP, and HMNSP99 models at night. (a) BIAS. (b) RMSE. (c) Average Cn2(h) profiles.
Figure 11: Comparison of the integrated parameters between the GA-BP, BP, and HMNSP99 models. (a) Fried parameter. (b) Isoplanatic angle. (c) Coherence time.
Figure 12: Box plot of the integrated parameters for the GA-BP, BP, and HMNSP99 models. (a) Fried parameter. (b) Isoplanatic angle. (c) Coherence time.
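The GA-BP idea can be illustrated with a toy example: a genetic algorithm searches for good weights of a small one-hidden-layer network, which backpropagation would then fine-tune. The network size, GA settings, fitness function, and synthetic target below are assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 3))            # e.g. meteorological predictors
y = np.sin(X).sum(axis=1)                   # placeholder target (stand-in for log10 Cn2)

N_IN, N_HID = 3, 8
N_W = N_IN * N_HID + N_HID + N_HID + 1      # weights + biases of the small net

def forward(w, X):
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = w[N_IN * N_HID:N_IN * N_HID + N_HID]
    W2 = w[-N_HID - 1:-1]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):                              # negative MSE (higher is better)
    return -np.mean((forward(w, X) - y) ** 2)

pop = rng.normal(0, 0.5, (40, N_W))
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-20:]]              # selection
    kids = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(N_W) < 0.5, a, b)    # uniform crossover
        child += rng.normal(0, 0.05, N_W) * (rng.random(N_W) < 0.1)  # mutation
        kids.append(child)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(w) for w in pop])]
# `best` would now seed standard backpropagation training (the "BP" stage).
```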
20 pages, 12409 KiB  
Article
Hyperspectral Image Classification via Deep Structure Dictionary Learning
by Wenzheng Wang, Yuqi Han, Chenwei Deng and Zhen Li
Remote Sens. 2022, 14(9), 2266; https://doi.org/10.3390/rs14092266 - 8 May 2022
Cited by 17 | Viewed by 2751
Abstract
The construction of diverse dictionaries for sparse representation in hyperspectral image (HSI) classification has been a hot topic over the past few years. However, compared with convolutional neural network (CNN) models, dictionary-based models cannot extract deeper spectral information, which reduces their performance in HSI classification. Moreover, dictionary-based methods have low discriminative capability, which leads to less accurate classification. To solve these problems, we propose a deep learning-based structure dictionary for HSI classification in this paper. The core ideas are threefold: (1) To extract the abundant spectral information, we incorporate deep residual neural networks in dictionary learning and represent input signals in the deep feature domain. (2) To enhance the discriminative ability of the proposed model, we optimize the structure of the dictionary and design a sharing constraint in terms of sub-dictionaries, so that the general and specific features of HSI samples can be learned separately. (3) To further enhance classification performance, we design two kinds of loss functions: a coding loss and a discriminating loss. The coding loss is used to realize group sparsity of the coding coefficients, so that within-class spectral samples can be represented intensively and effectively. The Fisher discriminating loss is used to enforce large between-class scatter on the sparse representation coefficients. Extensive tests on hyperspectral datasets prove the developed method to be effective and to outperform other existing methods. Full article
(This article belongs to the Special Issue Signal Processing Theory and Methods in Remote Sensing)
Show Figures
Figure 1: Workflow of the proposed feature extraction model.
Figure 2: Network architectures for our encoder: (a) a block of residual networks, (b) main structure of CNNs.
Figure 3: Overview of the built dictionary of the developed model. Shared constraints are used to describe the common features of all classes of HSI samples.
Figure 4: The loss function value of training samples and the classification accuracy of the developed model versus the number of epochs.
Figure 5: The classification OA under different numbers of atoms for each sub-dictionary.
Figure 6: The classification OA under different regularization parameters λ1 and λ2.
Figure 7: The confusion matrix of the developed model on the Center of Pavia dataset.
Figure 8: Classification maps of the Center of Pavia dataset with methods in comparison: (a) pseudo-color image; (b) FDDL; (c) DPL; (d) ResNet; (e) RNN; (f) CNN; (g) ours; (h) ground truth. The yellow and red rectangles correspond to building and water areas.
Figure 9: Confusion matrix of the developed model on the Botswana dataset.
Figure 10: Classification map for the Botswana dataset with methods in comparison: (a) pseudo-color image; (b) FDDL; (c) DPL; (d) ResNet; (e) RNN; (f) CNN; (g) ours; (h) ground truth. The red and yellow rectangles represent mountain and grassland areas.
Figure 11: Confusion matrix of the developed model on the Houston 2013 dataset.
Figure 12: Classification map for the Houston 2013 dataset with approaches in comparison: (a) pseudo-color image; (b) FDDL; (c) DPL; (d) ResNet; (e) RNN; (f) CNN; (g) ours; (h) ground truth. The red and yellow rectangles represent the parking lot and building areas.
Figure 13: The confusion matrix of our model on the Houston 2018 dataset.
Figure 14: Classification maps of the Houston 2018 dataset with compared methods: (a) pseudo-color image; (b) FDDL; (c) DPL; (d) ResNet; (e) RNN; (f) CNN; (g) ours; (h) ground truth. The yellow and red rectangles correspond to grassland and building areas.
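The two losses named in the abstract can be sketched directly on a matrix of sparse codes. The group-sparsity form (an l2,1 norm per class) and the Fisher term (within-class minus between-class scatter) below are illustrative shapes, not the paper's exact formulation or weighting.

```python
import numpy as np

def coding_loss(codes, labels):
    # group sparsity: sum over classes of the l2,1 norm of that class's codes
    loss = 0.0
    for c in np.unique(labels):
        block = codes[labels == c]           # (n_c, n_atoms) codes of class c
        loss += np.sum(np.linalg.norm(block, axis=0))
    return loss

def fisher_loss(codes, labels):
    # small within-class scatter, large between-class scatter
    mu = codes.mean(axis=0)
    within, between = 0.0, 0.0
    for c in np.unique(labels):
        block = codes[labels == c]
        mu_c = block.mean(axis=0)
        within += np.sum((block - mu_c) ** 2)
        between += len(block) * np.sum((mu_c - mu) ** 2)
    return within - between
```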
27 pages, 4381 KiB  
Article
One-Shot Dense Network with Polarized Attention for Hyperspectral Image Classification
by Haizhu Pan, Moqi Liu, Haimiao Ge and Liguo Wang
Remote Sens. 2022, 14(9), 2265; https://doi.org/10.3390/rs14092265 - 8 May 2022
Cited by 14 | Viewed by 2338
Abstract
In recent years, hyperspectral image (HSI) classification has become a hot research direction in remote sensing image processing. Benefiting from the development of deep learning, convolutional neural networks (CNNs) have shown extraordinary achievements in HSI classification, and numerous methods combining CNNs and attention mechanisms (AMs) have been proposed. However, to fully mine the features of HSI, some previous methods apply dense connections to enhance feature transfer between convolution layers. Although dense connections allow these methods to fully extract features from a few training samples, they decrease model efficiency and increase computational cost. Furthermore, to balance model performance against complexity, the AMs in these methods compress a large number of channels or spatial resolutions during the training process, discarding a large amount of useful information. To tackle these issues, in this article a novel one-shot dense network with polarized attention, OSDN, is proposed for HSI classification. More precisely, since HSI contains rich spectral and spatial information, the OSDN has two independent branches to extract spectral and spatial features, respectively. Similarly, the polarized AMs contain two components: channel-only AMs and spatial-only AMs. Both polarized AMs use a specially designed filtering method to reduce the complexity of the model while maintaining high internal resolution in both the channel and spatial dimensions. To verify the effectiveness and lightness of OSDN, extensive experiments were carried out on five benchmark HSI datasets: Pavia University (PU), Kennedy Space Center (KSC), Botswana (BS), Houston 2013 (HS), and Salinas Valley (SV). Experimental results consistently show that the OSDN greatly reduces computational cost and parameters while maintaining high accuracy with few training samples. Full article
(This article belongs to the Special Issue Recent Advances in Processing Mixed Pixels for Hyperspectral Image)
Show Figures
Figure 1: Illustration of the 3-D convolution operation.
Figure 2: Illustration of the residual block in ResNet.
Figure 3: Illustration of the dense block in DenseNet.
Figure 4: Details of the channel-only polarized attention mechanism in our network.
Figure 5: Details of the spatial-only polarized attention mechanism in our network.
Figure 6: The structure of the proposed network.
Figure 7: Full-factor classification maps for the PU dataset. (a) Ground truth. (b) SVM. (c) HYSN. (d) SSRN. (e) FDSS. (f) DBMA. (g) DBDA. (h) PCIA. (i) SSGC. (j) OSDN. (k) False-color image.
Figure 8: Full-factor classification maps for the KSC dataset. (a) Ground truth. (b) SVM. (c) HYSN. (d) SSRN. (e) FDSS. (f) DBMA. (g) DBDA. (h) PCIA. (i) SSGC. (j) OSDN. (k) False-color image.
Figure 9: Full-factor classification maps for the BS dataset. (a) Ground truth. (b) SVM. (c) HYSN. (d) SSRN. (e) FDSS. (f) DBMA. (g) DBDA. (h) PCIA. (i) SSGC. (j) OSDN. (k) False-color image.
Figure 10: Full-factor classification maps for the HS dataset. (a) Ground truth. (b) SVM. (c) HYSN. (d) SSRN. (e) FDSS. (f) DBMA. (g) DBDA. (h) PCIA. (i) SSGC. (j) OSDN. (k) False-color image.
Figure 11: Full-factor classification maps for the SA dataset. (a) Ground truth. (b) SVM. (c) HYSN. (d) SSRN. (e) FDSS. (f) DBMA. (g) DBDA. (h) PCIA. (i) SSGC. (j) OSDN. (k) False-color image.
Figure 12: Comparison of OA using different spatial window sizes for the five datasets.
Figure 13: Comparison of OA using different training sample proportions for the five datasets: (a) PU, (b) KSC, (c) BS, (d) HS, and (e) SA.
Figure 14: Classification results of the different methods on the five datasets. (a) OA. (b) AA. (c) Kappa.
Figure 15: Different dense blocks. (a) Dense block. (b) Weak dense block. (c) One-shot dense block.
Figure 16: OA (%) of OSDN with different attention models on the five datasets.
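A simplified channel-only polarized attention block can be written in a few lines of PyTorch, following the general polarized self-attention recipe: halve the channels internally, form a softmax-normalized spatial summary, and restore full channel resolution through a sigmoid gate. Layer sizes and details here are illustrative, not the paper's exact design.

```python
import torch
import torch.nn as nn

class ChannelPolarizedAttention(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.wv = nn.Conv2d(channels, channels // 2, kernel_size=1)  # value path
        self.wq = nn.Conv2d(channels, 1, kernel_size=1)              # query path
        self.up = nn.Conv2d(channels // 2, channels, kernel_size=1)  # restore C
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        b, c, h, w = x.shape
        v = self.wv(x).reshape(b, c // 2, h * w)           # (B, C/2, HW)
        q = self.softmax(self.wq(x).reshape(b, 1, h * w))  # spatial weights
        z = torch.matmul(v, q.transpose(1, 2))             # (B, C/2, 1) summary
        gate = torch.sigmoid(self.up(z.unsqueeze(-1)))     # (B, C, 1, 1) gate
        return x * gate                                    # reweight channels

x = torch.randn(2, 32, 9, 9)                   # e.g. HSI patch features
print(ChannelPolarizedAttention(32)(x).shape)  # torch.Size([2, 32, 9, 9])
```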
17 pages, 8064 KiB  
Article
Experimental Study on the Exploration of Camera Scanning Reflective Fourier Ptychography Technology for Far-Field Imaging
by Mingyang Yang, Xuewu Fan, Yuming Wang and Hui Zhao
Remote Sens. 2022, 14(9), 2264; https://doi.org/10.3390/rs14092264 - 8 May 2022
Cited by 3 | Viewed by 2717
Abstract
Fourier ptychography imaging is a powerful phase retrieval method that can be used to realize super-resolution. In this study, we establish a mathematical model of long-distance camera scanning based on reflective Fourier ptychography imaging. To guarantee effective recovery of a high-resolution image in the experiment, we analyze the influence of laser coherence in different modes and the surface properties of diverse materials for diffuse targets. Based on this analysis, we choose a single-mode fiber laser as the illumination source and metal materials with high diffuse reflectivity as the experimental targets to ensure the validity of the experimental results. We then emulate camera scanning with a single camera attached to an X-Y translation stage, and an experimental system with a working distance of 3310 mm is used as an example to image a fifty-cent coin. We also perform speckle analysis for rough targets and calculate the average speckle size using a normalized autocorrelation function at different positions. The method of calculating the average speckle size for everyday objects provides a basis for subsequent research on image quality evaluation; meanwhile, the coherence of the light field and the highly reflective targets used in this experiment point to application directions for the further development of the technique, such as computer vision, surveillance, and remote sensing. Full article
Show Figures
Graphical abstract
Figure 1: The geometric model of the reflective Fourier ptychographic optical imaging system.
Figure 2: The 2D light field intensity distribution image of the single-mode fiber laser.
Figure 3: The real image of the light beam pattern emitted by an actual single-mode fiber laser.
Figure 4: (a) The simulation image of the LP11 mode light field intensity distribution; (b) the simulation image of the sum of the LP11 and LP12 mode light field intensity distributions.
Figure 5: The real image of the light beam pattern emitted by an actual multi-mode fiber laser.
Figure 6: The sketch of the reflective Fourier ptychography imaging setup. The single-mode fiber infrared semiconductor laser with a wavelength of 976 nm emits coherent light to illuminate the target. The aperture acts as a low-pass filter of the Fourier transform, and the information undergoes an inverse Fourier transform. The camera sensor receives the intensity measurements at different positions.
Figure 7: The sketch of specular and diffuse reflections on the surface of an object.
Figure 8: The low-resolution reconstructed images of non-metallic materials: (a) a one-yuan RMB bill; (b) a resolution target made of pottery and porcelain.
Figure 9: The high-resolution reconstructed images of metallic materials: (a) a one-yuan coin with a nickel-plated steel core; (b) a fifty-cent coin with a copper-plated steel-core alloy; (c) a ten-cent coin of aluminum alloy.
Figure 10: The low-resolution image array with 81 images and the central image of a fifty-cent coin.
Figure 11: The reconstructed high-resolution results of a fifty-cent coin in this experiment: (a) the reconstructed high-resolution intensity image; (b) the reconstructed phase image; (c) the synthetic Fourier spectrum image of the reconstructed high-resolution image.
Figure 12: The average speckle sizes of the LR central image and the HR image at three random positions.
Figure 13: The plot of the resolution enhancement ratio at ten random positions of the LR central image and the HR image above.
Figure 14: The low-resolution images acquired by the laboratory macroscopic far-field Fourier ptychography setup with a scanning pupil of 10 mm: (a) the low-resolution image array with 81 images; (b) the 41st (center) low-resolution input image.
Figure 15: The reconstructed high-resolution results of the USAF target in this experiment: (a) the reconstructed high-resolution intensity image; (b) the reconstructed phase image; (c) the synthetic Fourier spectrum image of the reconstructed high-resolution USAF image.
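The average-speckle-size estimate via the normalized autocorrelation can be sketched as below: the autocorrelation of the intensity image is computed through the Wiener-Khinchin theorem, and the speckle size is read off as the width of its central peak. The half-maximum criterion and pixel units are assumptions; the paper's exact estimator may differ.

```python
import numpy as np

def average_speckle_size(img):
    i = img.astype(float) - img.mean()
    # autocorrelation via the Wiener-Khinchin theorem
    ac = np.fft.ifft2(np.abs(np.fft.fft2(i)) ** 2).real
    ac = np.fft.fftshift(ac) / ac.max()          # normalized, peak at center
    cy, cx = np.array(ac.shape) // 2
    row = ac[cy]                                 # horizontal cut through the peak
    right = np.argmax(row[cx:] < 0.5)            # first point below half maximum
    return 2 * right                             # FWHM in pixels ~ speckle size

# synthetic speckle: low-pass filtered complex white noise, intensity detected
rng = np.random.default_rng(0)
fx, fy = np.meshgrid(np.fft.fftfreq(256), np.fft.fftfreq(256))
pupil = np.hypot(fx, fy) < 0.05                  # small "aperture" in frequency
field = np.fft.ifft2(np.fft.fft2(rng.standard_normal((256, 256))) * pupil)
print(average_speckle_size(np.abs(field) ** 2))
```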
23 pages, 7370 KiB  
Article
Remote Sensing Mapping of Build-Up Land with Noisy Label via Fault-Tolerant Learning
by Gang Xu, Yongjun Fang, Min Deng, Geng Sun and Jie Chen
Remote Sens. 2022, 14(9), 2263; https://doi.org/10.3390/rs14092263 - 8 May 2022
Cited by 2 | Viewed by 2753
Abstract
China's urbanization has dramatically accelerated in recent decades, and land for urban build-up has changed not only in large cities but also in small counties. Land cover mapping is one of the fundamental tasks in the field of remote sensing and has received great attention. However, most current mapping requires significant manual effort for labeling or classification, so it is of great practical value to use existing low-resolution label data for the classification of higher-resolution images. In this regard, this work proposes a method based on noisy-label learning for fine-grained mapping of urban build-up land in a county in central China. Specifically, this work produces a build-up land map with a resolution of 10 m based on a land cover map with a resolution of 30 m. Experimental results show that the accuracy of the results is improved by 5.5% compared with that of the baseline method, indicating that the time required to produce a fine land cover map can be significantly reduced using existing coarse-grained data. Full article
Show Figures
Figure 1: (a) Location of Taoyuan County. (b) Taoyuan County Sentinel-2 10 m resolution RGB image. (c) Sentinel-1 C-band false-color composite (red: VV; green: VH; blue: VV/VH). (d) Taoyuan County DEM.
Figure 2: (a) Taoyuan 30 m land cover map. (b) Taoyuan 10 m land cover map.
Figure 3: Research method flowchart.
Figure 4: Build-up land cover mapping results. (a) Remote sensing imagery of small settlements. (b) The 30 m land cover map for small settlements. (c) Mapping results for small settlements. (d) The 10 m resolution build-up land map of Taoyuan County. (e) Remote sensing imagery of large settlements. (f) The 30 m land cover map for large settlements. (g) Mapping results for large settlements.
Figure 5: The confusion matrices (CMs) for the different methods. (a) Baseline CM. (b) Our CM.
Figure 6: (a) The 10 m resolution build-up land map of Xinhua County. (b) The 10 m resolution build-up land map of Anhua County. (c) The 10 m resolution build-up land map of Taojiang County.
Figure 7: Comparison of mapping results. First row: Sentinel-2 L2A image; second row: GlobeLand30 30 m land cover; third row: our results. First column: urban area; second column: forest area; third column: farmland area.
Figure 8: Confusion matrix results. First row: Taojiang; second row: Xinhua; third row: Anhua. First column: baseline results; second column: our method's results.
Figure 9: The build-up land cover mapping result in Changsha County. (a) The 10 m resolution build-up land cover map of Changsha County. (b) Mapping results for a factory. (c) Mapping results for small settlements. (d) The 30 m land cover map for the factory. (e) The 30 m land cover map for small settlements. (f) Remote sensing imagery of the factory. (g) Remote sensing imagery of small settlements.
Figure 10: The CMs for the different methods in Changsha County. (a) Baseline CM. (b) Our CM.
Figure 11: The 2021 Taoyuan County mapping results. (a) The 2021 10 m build-up land cover map of Taoyuan. (b) Remote sensing imagery of a newly built-up area in 2021. (c) Remote sensing imagery of the same area in 2020. (d) Mapping results for the newly built-up area in 2021. (e) Mapping results for the same area in 2020.
Figure 12: Confusion matrices: (a) 2021 Taoyuan County with our method; (b) 2020 Taoyuan County with our method.
Figure 13: Overall classification accuracy using single-band images.
Figure 14: Ground truth labels. First row: ground truth labels on Google Maps; second row: ground truth labels on Sentinel-2 L2A RGB.
25 pages, 6287 KiB  
Article
A Machine Learning Approach to Waterbody Segmentation in Thermal Infrared Imagery in Support of Tactical Wildfire Mapping
by Jacqueline A. Oliver, Frédérique C. Pivot, Qing Tan, Alan S. Cantin, Martin J. Wooster and Joshua M. Johnston
Remote Sens. 2022, 14(9), 2262; https://doi.org/10.3390/rs14092262 - 8 May 2022
Cited by 4 | Viewed by 2383
Abstract
Wildfire research is working toward near real-time tactical wildfire mapping through the application of computer vision techniques to airborne thermal infrared (IR) imagery. One issue hindering automation is the potential for waterbodies to be marked as areas of combustion due to their relative warmth in nighttime thermal imagery. Segmentation and masking of waterbodies could help resolve this issue, but the reliance on data captured exclusively in the thermal IR and the presence of real areas of combustion in some of the images introduces unique challenges. This study explores the use of the random forest (RF) classifier for the segmentation of waterbodies in thermal IR images containing a heterogenous wildfire. Features for classification are generated through the application of contextual and textural filters, as well as normalization techniques. The classifier’s outputs are compared against static GIS-based data on waterbody extent as well as the outputs of two unsupervised segmentation techniques, based on entropy and variance, respectively. Our results show that the RF classifier achieves very high balanced accuracy (>98.6%) for thermal imagery with and without wildfire pixels, with an overall F1 score of 0.98. The RF method surpassed the accuracy of all others tested, even with heterogenous training sets as small as 20 images. In addition to assisting automation of wildfire mapping, the efficiency and accuracy of this approach to segmentation can facilitate the creation of larger training data sets, which are necessary for invoking more complex deep learning approaches. Full article
(This article belongs to the Section Remote Sensing Image Processing)
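As a rough illustration of the texture-feature pipeline described in the abstract, the sketch below trains a random forest on local entropy and variance features. The array names (`ir`, `water`) and the three-feature set are hypothetical stand-ins, not the authors' 91-feature configuration.

```python
# A minimal sketch (not the authors' implementation): segment water in a
# thermal IR frame with a random forest trained on simple texture features.
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.filters.rank import entropy
from skimage.morphology import disk
from skimage.util import img_as_ubyte
from sklearn.ensemble import RandomForestClassifier

def texture_features(ir, win=9):
    """Per-pixel features: raw value, local entropy, local variance."""
    scaled = (ir - ir.min()) / (np.ptp(ir) + 1e-9)       # rank filters need uint8
    ent = entropy(img_as_ubyte(scaled), disk(win // 2))  # local Shannon entropy
    mean = uniform_filter(ir.astype(float), win)
    var = uniform_filter(ir.astype(float) ** 2, win) - mean ** 2
    return np.stack([ir.ravel(), ent.ravel(), var.ravel()], axis=1)

# Hypothetical inputs: a thermal array `ir` and a hand-labeled water mask.
ir = np.random.rand(128, 128).astype(np.float32)
water = ir < 0.3

X, y = texture_features(ir), water.ravel()
clf = RandomForestClassifier(n_estimators=100, n_jobs=-1).fit(X, y)
pred_mask = clf.predict(X).reshape(ir.shape)             # per-pixel water map
```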
Show Figures

Figure 1. Thermal imagery study area in Woodland Caribou Provincial Park, near Red Lake, Ontario. The map displays the wildfires that were active at the time of image collection, on 12 August 2018, based on the National Burned Area Composite (NBAC) of Canada [35].

Figure 2. (A) Original raster image, with no-data values masked out for figure visibility; (B) original raster with the delineated waterbody boundary (blue) and existing CanVec boundary (pink) overlaid; (C) reference raster generated from the original raster and waterbody boundary; (D) CanVec raster generated from the original raster image, adjusted to reflect the no-data border in the original.

Figure 3. Metrics for the CanVec baseline data.

Figure 4. (A) Metrics for the binary entropy unsupervised technique; (B) metrics for the binary variance unsupervised technique.

Figure 5. Metrics for texture-feature-trained random forest classifiers, by feature number and training set size.

Figure 6. Typical CanVec segmentation errors. (A) Oversimplification of waterbody edges; (B) directional skewing along shorelines; (C) oversimplification and directional skewing errors typically intensify along narrow waterbodies.

Figure 7. Sample segmentation for binary entropy (BE) and binary variance (BV). (A) BE more accurately captures shorelines at the expense of additional false positives, including within areas of active fire; (B) BV includes lines of false positives where there are visible temperature shifts in water; (C) both methods are prone to false-positive patches, particularly near narrow waterbodies.

Figure 8. Sample of significant features for an IR image containing fire pixels. The original and normalized images reflect the suppressed detail that can occur when IR images containing active wildfire are processed by standard image processing libraries. As described in Appendix A, the max-normalization technique can help preserve this detail.

Figure 9. Sample of significant features for an IR image without fire pixels.

Figure 10. Sample segmentation results using the 91-feature RF classifier trained with 50 images. (A) Overall strong accuracy, even in the presence of flaming combustion close to the shoreline, without the directional skewing common in static GIS data output; (B) noise and larger false-positive blobs occurred in many images; (C) weaker performance near narrower waterbodies.

Figure 11. Comparative distribution of classifier accuracy with respect to waterbody area for all 517 waterbodies, binned by size class, for (a) RF, (b) BE, and (c) BV.
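">
The balanced accuracy and F1 figures quoted in the abstract can be computed for any predicted mask with standard tooling; a minimal sketch, assuming hypothetical boolean arrays `truth` and `pred_mask` of identical shape:

```python
# Minimal sketch: evaluate a predicted water mask against reference labels
# using the two metrics quoted in the abstract (balanced accuracy and F1).
import numpy as np
from sklearn.metrics import balanced_accuracy_score, f1_score

truth = np.random.rand(128, 128) < 0.3    # stand-in reference water mask
pred_mask = truth.copy()                  # stand-in classifier prediction

bal_acc = balanced_accuracy_score(truth.ravel(), pred_mask.ravel())
f1 = f1_score(truth.ravel(), pred_mask.ravel())
print(f"balanced accuracy={bal_acc:.4f}, F1={f1:.4f}")
```

Balanced accuracy averages the recall of the water and non-water classes, which matters here because water pixels are typically a small minority of each frame.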
21 pages, 14617 KiB  
Article
Detection of Flood Extent Using Sentinel-1A/B Synthetic Aperture Radar: An Application for Hurricane Harvey, Houston, TX
by Kristy F. Tiampo, Lingcao Huang, Conor Simmons, Clay Woods and Margaret T. Glasscoe
Remote Sens. 2022, 14(9), 2261; https://doi.org/10.3390/rs14092261 - 8 May 2022
Cited by 11 | Viewed by 3358
Abstract
The increasing number of flood events combined with coastal urbanization has contributed to significant economic losses and damage to buildings and infrastructure. Development of higher-resolution SAR flood mapping that accurately identifies flood features at all scales could be incorporated into operational flood forecasting tools, improving response and resilience to large flood events. Here, we present a comparison of several methods for characterizing flood inundation using a combination of synthetic aperture radar (SAR) remote sensing data and machine learning methods. We implement two applications with SAR GRD data: an amplitude thresholding technique applied, for the first time, to Sentinel-1A/B SAR data, and a machine learning technique, DeepLabv3+. We also apply DeepLabv3+ to a false color RGB characterization of dual-polarization SAR data. Analyses at 10 m pixel spacing are performed for the major flood event associated with Hurricane Harvey and the associated inundation in Houston, TX, in August of 2017. We compare these results with high-resolution aerial optical images over this time period, acquired by the NOAA Remote Sensing Division, and with NDWI produced from Sentinel-2 images, also at 10 m pixel spacing. Statistical testing suggests that the amplitude thresholding technique is the most effective, although the machine learning analysis is successful at reproducing the inundation shape and extent. These results demonstrate the effectiveness of flood inundation mapping at unprecedented resolutions and its potential for use in operational emergency hazard response to large flood events. Full article
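A generic version of amplitude thresholding is easy to sketch. The paper's exact threshold-selection procedure is not given in this listing, so Otsu's method stands in for it, and `sigma0` is a hypothetical calibrated backscatter array:

```python
# Minimal sketch: flag flood water in a SAR GRD scene by thresholding
# dB-scaled backscatter. Smooth open water is a specular reflector and so
# appears dark; Otsu's method is a stand-in for the paper's procedure.
import numpy as np
from skimage.filters import threshold_otsu

sigma0 = np.random.gamma(2.0, 0.05, (512, 512))  # stand-in calibrated power
db = 10.0 * np.log10(sigma0 + 1e-10)             # convert to decibels
water = db < threshold_otsu(db)                  # dark pixels -> candidate water
# A permanent-water layer (e.g., the GWM used in the paper) would then be
# masked out, leaving only temporary flood inundation.
```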
(This article belongs to the Special Issue Remote Sensing for Near-Real-Time Disaster Monitoring)
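The Sentinel-2 NDWI reference layer is straightforward to compute; a minimal sketch, assuming the common McFeeters formulation NDWI = (Green − NIR)/(Green + NIR) with the 10 m B3 and B8 bands, and with `b3` and `b8` as hypothetical reflectance arrays:

```python
# Minimal sketch: NDWI from Sentinel-2 green (B3) and NIR (B8) reflectance;
# positive values generally indicate open water.
import numpy as np

b3 = np.random.rand(512, 512).astype(np.float32)  # stand-in green band
b8 = np.random.rand(512, 512).astype(np.float32)  # stand-in NIR band

ndwi = (b3 - b8) / (b3 + b8 + 1e-9)               # small epsilon avoids 0/0
water = ndwi > 0.0                                # common, tunable cutoff
```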
Show Figures

Figure 1. Map of the study area. Red squares outline the two ascending SAR images used in this analysis, acquired 29 August 2017. The northern SAR image is Path 34, Frame 95, and the southern SAR scene is Path 34, Frame 90. The Sentinel-2 scenes are outlined in green; the large swath was acquired on 30 August 2017, and the smaller scene on 1 September 2017.

Figure 2. NOAA Remote Sensing Division airborne digital optical imagery of the Houston area acquired between 27 August and 3 September 2017, in response to Hurricane Harvey. The approximate GSD for each pixel ranges between 35 and 50 cm (https://storms.ngs.noaa.gov/storms/harvey/download/metadata.html (accessed on 1 February 2021)). Red squares outline three areas selected for comparison with the results of the flood detection and DeepLabv3+ analyses.

Figure 3. (a) GWM (water pixels in blue) (https://global-surface-water.appspot.com/download (accessed on 17 May 2021)); (b) MODIS-derived flood inundation, 250 m pixel spacing, for 28 August–4 September 2017, courtesy of the DFO [41], with the GWM removed to characterize temporary water only.

Figure 4. NDWI derived from Sentinel-2 images acquired on 30 August and 1 September 2017. Water is shown in blue; pixel spacing is 10 m.

Figure 5. (a) Sentinel-1 SAR image, 29 August 2017, 10 m pixel spacing; (b) false color RGB image, where VH is blue, VV is green, and VH/VV is red, 10 m pixel spacing.

Figure 6. (a) SAR GRD image of Houston, TX, 29 August 2017, with the training region and polygons outlined in red; (b) training polygons from (a); (c) grids for manually drawing flooded areas; and (d) enlarged training polygons marked by the orange box in (b).

Figure 7. Results from the thresholding analysis of SAR GRD data, 29 August 2017, where flood waters are detected using the thresholding method described in the text. Water is shown in blue, pixel spacing is 10 m, and permanent water bodies are removed using the GWM.

Figure 8. Results from the application of DeepLabv3+ to the SAR GRD data, 29 August 2017, shown in Figure 6a. Water pixels are shown in blue, pixel spacing is 10 m, and permanent water bodies are removed using the GWM.

Figure 9. Results from the DeepLabv3+ ML analysis applied to the false color RGB data of Figure 5b. Water pixels are shown in blue, pixel spacing is 10 m, and permanent water bodies are removed using the GWM.

Figure 10. Water pixels identified by (a) MODIS data, courtesy of the DFO [41], 250 m pixel spacing; (b) NDWI analysis of Sentinel-2 data, 10 m pixel spacing; (c) DeepLabv3+ analysis of SAR GRD data, 29 August 2017, 10 m pixel spacing; (d) thresholding analysis of the same SAR GRD data, 10 m pixel spacing; and (e) ML analysis of the RGB classification of the SAR GRD data, 10 m pixel spacing. The GWM is removed from all results except the NDWI (b). Water pixels are shown in blue in all subfigures.

Figure 11. (a) NOAA Remote Sensing Division airborne digital optical imagery of the Houston area acquired between 27 August and 3 September 2017, subregion 1, as shown in Figure 2; water pixels identified by (b) MODIS data, 250 m pixel spacing, courtesy of the DFO [41]; (c) NDWI analysis of Sentinel-2 data, 10 m pixel spacing; (d) DeepLabv3+ analysis of SAR GRD data, 29 August 2017, 10 m pixel spacing; (e) thresholding analysis of the same SAR GRD data, 10 m pixel spacing; and (f) classification analysis of the SAR GRD data, 10 m pixel spacing. The GWM is not removed from analyses (b) through (f). Water pixels are shown in orange.

Figure 12. As in Figure 11, but for subregion 2.

Figure 13. As in Figure 11, but for subregion 3.

Figure 14. Confusion matrices comparing the DeepLabv3+, thresholding, and false color RGB ML analyses, each against the Sentinel-2 NDWI analysis. In each matrix, the top left box shows true positives (tp), the top right false positives (fp), the bottom left false negatives (fn), and the bottom right true negatives (tn).
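">
Figure 5b's dual-polarization false color composite amounts to stretching each component and stacking an RGB cube. The channel assignment below mirrors the caption as corrected here and should be read as an assumption, as should the percentile stretch:

```python
# Minimal sketch: build a false color RGB from dual-pol SAR backscatter.
# Channel assignment (R=VH/VV, G=VV, B=VH) is an assumed reading of the
# caption, not a confirmed detail of the paper.
import numpy as np

def stretch(x, lo=2, hi=98):
    """Percentile stretch to [0, 1] for display."""
    a, b = np.percentile(x, [lo, hi])
    return np.clip((x - a) / (b - a + 1e-9), 0.0, 1.0)

vv = np.random.gamma(2.0, 0.05, (256, 256))   # stand-in VV power
vh = np.random.gamma(2.0, 0.02, (256, 256))   # stand-in VH power

rgb = np.dstack([stretch(vh / (vv + 1e-9)),   # R: VH/VV ratio
                 stretch(10 * np.log10(vv)),  # G: VV (dB)
                 stretch(10 * np.log10(vh))]) # B: VH (dB)
```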
21 pages, 5627 KiB  
Article
Comparative Study of the 60 GHz and 118 GHz Oxygen Absorption Bands for Sounding Sea Surface Barometric Pressure
by Qiurui He, Jiaoyang Li, Zhenzhan Wang and Lanjie Zhang
Remote Sens. 2022, 14(9), 2260; https://doi.org/10.3390/rs14092260 - 8 May 2022
Cited by 1 | Viewed by 2525
Abstract
The 60 GHz and 118 GHz oxygen absorption bands are prominent in the passive microwave remote sensing of atmospheric temperature and can also be used for sounding sea surface barometric pressure (SSP). Microwave Temperature Sounder II (MWTS-II) has 13 channels in the 60 GHz band, and the Microwave Humidity and Temperature Sounder (MWHTS) has 8 channels in the 118 GHz band. Both are carried on the Fengyun-3C (FY-3C) and Fengyun-3D (FY-3D) satellites, which provide measurements for comparing the retrieval accuracies of SSP using the 60 GHz and 118 GHz bands. In this study, based on the weighting functions for MWHTS and MWTS-II, channel combinations representing the 60 GHz and 118 GHz bands are established, and the retrieval accuracies of SSP from these two channel combinations are compared in different weather conditions. The experimental results show that the retrieval accuracy of SSP at 60 GHz is higher than that at 118 GHz in clear, cloudy, and rainy sky conditions. In addition, retrieval experiments of SSP from the full MWTS-II and MWHTS channel sets are also carried out, and their results show that the retrieval accuracy of SSP from MWTS-II is higher. This comparative study of the 60 GHz and 118 GHz bands for sounding SSP supports the theoretical study of microwave remote sensing of SSP with practical measurements and contributes to understanding the performance of the two bands in atmospheric sounding. Full article
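Figure 6 presents the DNN only schematically, so the sketch below is a stand-in: it regresses SSP from a vector of channel brightness temperatures with a small multilayer perceptron. The layer sizes and synthetic training data are assumptions, not the authors' configuration.

```python
# Minimal sketch: regress sea surface pressure (SSP, hPa) from microwave
# brightness temperatures of a channel combination. Sizes and data are
# stand-ins for the paper's MWHTS / MWTS-II training sets.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_channels = 5000, 13                     # e.g., 13 MWTS-II channels
tb = 200 + 80 * rng.random((n_samples, n_channels))  # stand-in brightness temps (K)
ssp = 1013 + 15 * rng.standard_normal(n_samples)     # stand-in SSP labels (hPa)

X = StandardScaler().fit_transform(tb)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, ssp)
rmse = np.sqrt(np.mean((model.predict(X) - ssp) ** 2))  # training RMSE only
```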
Show Figures

Figure 1. Weighting functions for MWHTS and MWTS-II calculated from the U.S. standard atmospheric profile. (a) MWHTS. (b) MWTS-II.

Figure 2. The calculation accuracies of RTTOV for the simulated brightness temperatures versus the vertical integral of cloud liquid water. (a) MWHTS. (b) MWTS-II.

Figure 3. Schematic of the data preprocessing procedure.

Figure 4. The average transmissivity for each channel of MWHTS and MWTS-II in clear, cloudy, and rainy sky conditions.

Figure 5. The correlation coefficients between the observations of channels with zero surface transmissivity and those with non-zero surface transmissivity. (a) MWHTS. (b) MWTS-II.

Figure 6. The structure and major configuration of the DNN.

Figure 7. SSP retrieval results in the clear sky condition. (a) The 60 GHz channel combination. (b) The 118 GHz channel combination.

Figure 8. SSP retrieval results in the cloudy sky condition. (a) The 60 GHz channel combination. (b) The 118 GHz channel combination.

Figure 9. SSP retrieval results in the rainy sky condition. (a) The 60 GHz channel combination. (b) The 118 GHz channel combination.

Figure 10. SSP retrieval results from the 60 GHz extended channel combination in different weather conditions. (a) Clear sky. (b) Cloudy sky. (c) Rainy sky.

Figure 11. SSP retrieval results in the clear sky condition. (a) MWTS-II. (b) MWHTS.

Figure 12. SSP retrieval results in the cloudy sky condition. (a) MWTS-II. (b) MWHTS.

Figure 13. SSP retrieval results in the rainy sky condition. (a) MWTS-II. (b) MWHTS.
20 pages, 8426 KiB  
Article
Effect of Shadow Pixels on Evapotranspiration Inversion of a Vineyard: High-Resolution UAV-Based and Ground-Based Remote Sensing Measurements
by Saihong Lu, Junjie Xuan, Tong Zhang, Xueer Bai, Fei Tian and Samuel Ortega-Farias
Remote Sens. 2022, 14(9), 2259; https://doi.org/10.3390/rs14092259 - 7 May 2022
Cited by 13 | Viewed by 2546
Abstract
With the proliferation of precision agriculture, the problem that shadow pixels pose for estimating evapotranspiration (ET) and its components from remote sensing data should not be neglected. To accurately detect shaded soil and leaf pixels and quantify the implications of shadow pixels for ET inversion, a two-year field-scale observation was carried out during the growing seasons of a pinot noir vineyard. Based on high-resolution remote sensing sensors covering visible light, thermal infrared, and multispectral bands, supervised classification was applied to detect shadow pixels. Then, we combined the normalized difference vegetation index with the three-temperature model to quantify the proportions of plant transpiration (T) and soil evaporation (E) in the vineyard ecosystem. Finally, evaluated against an eddy covariance system, we clarified the implications of shadow pixels for ET estimation and the spatiotemporal patterns of ET in the vineyard when shadow pixels were accounted for. Results indicated that the shadow detection process significantly improved the reliability of the assessment of ET and its components. (1) Shaded soil pixels misled the land cover classification, with the mean canopy cover estimated while ignoring shadows being 1.68–1.70 times that obtained after shaded-area removal; the estimation accuracy of ET improved by 4.59–6.82% after considering the effect of shaded soil pixels, and by a further 0.28–0.89% after multispectral correction. (2) There was a 2 °C canopy temperature discrepancy between sunlit and shaded leaves, and the estimation accuracy of T improved by 1.38–7.16% after considering the effect of shaded canopy pixels. (3) ET was spatially heterogeneous across the vineyard, with E and T fluxes of 238.05 and 208.79 W·m−2, respectively; the diurnal variation followed a single-peak curve, with a mean of 0.26 mm/h. Our findings provide a better understanding of the influence of shadow pixels on ET estimation using remote sensing techniques. Full article
(This article belongs to the Special Issue Remote Sensing for Eco-Hydro-Environment)
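The three-temperature (3T) model referenced in the abstract is commonly written with dry-soil and non-transpiring (imitation) reference surfaces, which matches the imitation canopy and dry soil shown in Figure 2. A minimal sketch of that common formulation, with illustrative input values, follows; it is not presented as the authors' exact implementation.

```python
# Minimal sketch of the 3T model in its commonly cited form: reference
# (non-evaporating / non-transpiring) surfaces calibrate the sensible-heat
# term. All example values are illustrative, in W m^-2 and degrees C.

def soil_evaporation_3t(rn_s, g, rn_sd, g_d, t_s, t_sd, t_a):
    """Latent heat flux of soil evaporation (W m^-2).

    rn_s, g    : net radiation and soil heat flux of the target soil pixel
    rn_sd, g_d : same quantities for the dry (non-evaporating) reference soil
    t_s, t_sd  : surface temperatures of target and reference soil
    t_a        : air temperature
    """
    return rn_s - g - (rn_sd - g_d) * (t_s - t_a) / (t_sd - t_a)

def canopy_transpiration_3t(rn_c, rn_cp, t_c, t_cp, t_a):
    """Latent heat flux of transpiration (W m^-2), using an imitation
    (non-transpiring) reference canopy with temperature t_cp."""
    return rn_c - rn_cp * (t_c - t_a) / (t_cp - t_a)

# Example with illustrative numbers:
le_soil = soil_evaporation_3t(rn_s=300, g=50, rn_sd=320, g_d=60,
                              t_s=35, t_sd=45, t_a=28)   # ~143 W m^-2
le_canopy = canopy_transpiration_3t(rn_c=400, rn_cp=420,
                                    t_c=30, t_cp=36, t_a=28)  # ~295 W m^-2
```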
Show Figures

Figure 1. Location of the experimental site.

Figure 2. Observations of the experimental plot based on UAV and ground remote sensing: (a1)–(a5) visible-light, (b1)–(b3) thermal infrared, and (c1) multispectral images of the vineyard. (a4) and (a5) are RGB images of the imitation canopy and imitation dry soil (see Section 2.3.1).

Figure 3. Diurnal variations of air temperature (Ta) and net radiation (Rn) on the investigation days.

Figure 4. Schematic diagram of the data processing for ET estimation.

Figure 5. Airborne (a) visible-light and (b) thermal images obtained over the study vineyard at 13:00 on 6 June 2019. The marked black rectangles were set up for statistics.

Figure 6. Maps of (a) visible light, (b) land surface temperature (LST), and (c) their frequency histograms for the vineyard at different observation heights: (1) 120 m; (2) 80 m; (3) 40 m; and (4) 2 m. The 120 m data were taken at 13:00 on 12 July 2019; the 80 m and 40 m samples were acquired at 13:00 and 13:30 on 16 July 2019, respectively; the ground data were acquired at 13:14 on 16 July 2019.

Figure 7. Example of the spatial variability of the four pixel classifications: (a,b) distributions without and with the shaded soil considered, based on supervised classification; (c,d) distributions without and with the shaded soil considered, based on supervised classification and the normalized difference vegetation index (NDVI).

Figure 8. Temporal pattern histogram of vegetation coverage. f1 and f2 represent the vegetation coverage without and with the shaded soil considered, based on supervised classification; f3 and f4 represent the vegetation coverage without and with the shaded soil considered, based on supervised classification and NDVI.

Figure 9. Spatial distribution of estimated instantaneous transpiration and evaporation rates over the vineyard, using UAV imagery and meteorological data from 16 July 2019.

Figure 10. Illustration of the consequences of pixel binning for thermal images of the grapevines. (a,c) RGB images; (b,d) thermal images taken with a Fluke TiX620 camera; (e) corresponding frequency histograms of sunlit and shaded leaves. The images were taken at 10:30 on 13 August 2020.

Figure 11. (a) Diurnal variations of the grapevine transpiration rate; (b) basic statistics of the transpiration rate for sunlit leaves, shaded leaves, and all leaves.