Next Issue: Volume 12, May-2
Previous Issue: Volume 12, April-2
Remote Sens., Volume 12, Issue 9 (May-1 2020) – 180 articles

Cover Story (view full-size image): Tropical forests store globally significant amounts of carbon as aboveground biomass (AGB). Light detection and ranging (LiDAR) and optical wavelength data are increasingly being used to map tree height and to estimate AGB. In the tropics, cloud cover is prevalent, so several years of satellite data may need to be considered. This may reduce mapping accuracy because of seasonal and interannual changes in forest reflectance. The sensitivity of tree height and AGB estimates derived using airborne LiDAR data with respect to the season of Landsat acquisition was examined. The quantitative results suggest that a single cloud-free Landsat-8 image may be sufficient for dominant canopy height and AGB mapping, but that using Landsat imagery from different seasons is preferred for improving Congo Basin forest inventories. View this paper.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
16 pages, 4769 KiB  
Article
Observation of Turbulent Mixing Characteristics in the Typical Daytime Cloud-Topped Boundary Layer over Hong Kong in 2019
by Tao Huang, Steve Hung-lam Yim, Yuanjian Yang, Olivia Shuk-ming Lee, David Hok-yin Lam, Jack Chin-ho Cheng and Jianping Guo
Remote Sens. 2020, 12(9), 1533; https://doi.org/10.3390/rs12091533 - 11 May 2020
Cited by 20 | Viewed by 4459
Abstract
Turbulent mixing is critical in affecting urban climate and air pollution. Nevertheless, our understanding of it, especially in a cloud-topped boundary layer (CTBL), remains limited. High-temporal-resolution observations provide sufficient information on vertical velocity profiles, which is essential for turbulence studies in the atmospheric boundary layer (ABL). We conducted Doppler Light Detection and Ranging (LiDAR) measurements in 2019 using the 3-Dimensional Real-time Atmospheric Monitoring System (3DREAMS) to reveal the characteristics of typical daytime turbulent mixing processes in the CTBL over Hong Kong. We assessed the contribution of cloud radiative cooling to turbulent mixing and determined the altitudinal dependence of the contributions of surface heating and vertical wind shear. Our results show that more downdrafts and updrafts were observed in spring and autumn, positively associated with seasonal cloud fraction. These results reveal that cloud radiative cooling was the main source of downdrafts, which was also confirmed by our detailed case study of vertical velocity. Compared to winter and autumn, cloud base heights were lower in spring and summer. Cloud radiative cooling contributed ~32% to turbulent mixing even near the surface, although its contribution was relatively weak compared to surface heating and vertical wind shear. Surface heating and vertical wind shear together contributed ~45% of turbulent mixing near the surface, but wind shear can affect turbulence up to ~1100 m whereas surface heating reaches only ~450 m. Although more research is needed to further understand these processes, our findings provide useful references for local weather forecasting and air quality studies.
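The turbulence statistics behind these results (the variance and skewness profiles shown in Figure 7 below) are bulk moments of the Doppler velocity time series per range gate. A minimal sketch of how such moments can be computed from high-rate lidar data follows; the `turbulence_moments` helper, the 1 Hz sampling, and the synthetic input are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import skew

def turbulence_moments(w, dt=1.0, window_s=60.0):
    """Per-window variance and skewness of vertical velocity.

    w        : 2D array (time, range_gate) of Doppler lidar vertical velocity (m/s)
    dt       : sampling interval in seconds (assumed 1 Hz here)
    window_s : averaging window in seconds (60 s matches the 1-min
               interval used for the variance/skewness profiles)
    """
    n = int(window_s / dt)                     # samples per window
    nwin = w.shape[0] // n
    w = w[: nwin * n].reshape(nwin, n, -1)     # (window, sample, gate)
    var = np.nanvar(w, axis=1)                 # sigma_w^2 profile per window
    skw = skew(w, axis=1, nan_policy="omit")   # s_w profile per window
    return var, skw

# Example with synthetic data: 1 Hz sampling, 1 h, 100 range gates
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(3600, 100))
var, skw = turbulence_moments(w)
print(var.shape, skw.shape)  # (60, 100) each: 60 one-minute windows
```

Negative skewness aloft is the usual signature of cloud-top-cooling-driven downdrafts, which is why these two moments carry most of the diagnostic weight in this kind of study.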
Graphical abstract

Figure 1. Location (red dot) of the LiDAR at King's Park (LiDAR KP). The figure was built on a map obtained from Google Maps, https://www.google.com/maps.
Figure 2. Vertical profiles of water vapor mixing ratio r (blue cross), virtual potential temperature θ_v (red line), and attenuated backscatter coefficient (black dot) at 08:00 Hong Kong Time (HKT) (UTC 00) on 16 October. The range of atmospheric boundary layer height (ABLH) is marked with orange dashed lines.
Figure 3. Diurnal variation of meteorological parameters of (a) cloud fraction, (b) temperature, (c) solar radiation, and (d) relative humidity (RH) from ground-level observation data at King's Park Meteorological Station (KPMS) on cloudy days in 2019. Sunrise and sunset times were extracted from the Hong Kong Observatory website (https://www.hko.gov.hk/tc/gts/astron2019/almanac2019_index.htm). A t-test for the differences between seasons was conducted; all seasonal differences pass a 95% test except the solar radiation difference between winter and spring (p < 0.2) and the RH difference between spring and summer (p < 0.3).
Figure 4. Diurnal variation of wind profiles of horizontal wind speed (left), horizontal wind direction (middle), and vertical velocity (right) on cloudy days in 2019. The rows from top to bottom represent winter, spring, summer, and autumn, respectively.
Figure 5. Probability distribution function of cloud base height within the atmospheric boundary layer during 07:00 to 17:00 HKT on cloudy days in 2019. The curves were built from data at a 10-min resolution.
Figure 6. Vertical velocity profiles from 07:00 to 17:00 HKT on (a) 1 Feb, (b) 1 May, (c) 22 Aug, and (d) 23 Nov. The 10-min cloud base height is marked as black dots in each profile. Positive vertical velocity indicates updraft.
Figure 7. Vertical profile of vertical velocity variance σ_w² (left panel) and skewness s_w (right panel). Both are computed at a 1-min time interval; for visualization, s_w has been smoothed with a 10-min moving average.
Figure 8. Contribution of (a) cloud radiative cooling and (b) surface heating and vertical wind shear to turbulent mixing within the daytime cloud-topped boundary layer (CTBL) in all cases. The y axis is altitude (z) normalized by cloud base height (h).
Figure 9. Vertical profile of the averaged daytime vertical wind shear between different heights for (a) 1 Feb, (b) 1 May, (c) 22 Aug, and (d) 23 Nov. Shading denotes the intensity of wind shear for at least two consecutive layers, where the x axis is the top layer height and the y axis is the bottom layer height. The colormap is on a log scale.
Figure 10. Correlation between turbulent mixing and the two factors: (a) surface heating due to solar radiation and (b) vertical wind shear. Solid dots represent data with high statistical significance (p < 0.05).
25 pages, 32044 KiB  
Article
On a Flood-Producing Coastal Mesoscale Convective Storm Associated with the Kor’easterlies: Multi-Data Analyses Using Remotely-Sensed and In-Situ Observations and Storm-Scale Model Simulations
by Seon Ki Park and Sojung Park
Remote Sens. 2020, 12(9), 1532; https://doi.org/10.3390/rs12091532 - 11 May 2020
Cited by 7 | Viewed by 4512
Abstract
A flood-producing heavy rainfall event occurred in the mountainous coastal region of northeast South Korea on 5–6 August 2018, subsequent to extreme heat waves, through a quasi-stationary mesoscale convective system (MCS). We analyzed the storm environment via a multi-data approach using high-resolution (1-km) simulations from the Weather Research and Forecasting (WRF) model and in situ/satellite/radar observations. The brightness temperature from the Advanced Himawari Imager water vapor band and the composite radar reflectivity were used to identify characteristics of the MCS and the associated precipitation. The following factors affected this back-building MCS: low-level convergence by the Korea easterlies (Kor'easterlies), carrying moist air into the coast; strong vertical wind shear, making the updraft tilted and sustained; coastal fronts and back-building convection bands, formed through interactions among the Kor'easterlies, cold pool outflows, and orography; mid-level advection of cold air and positive relative vorticity, enhancing vertical convection and potential instability; and vigorous updrafts releasing potential instability. The pre-storm synoptic environment provided favorable conditions for storm development, such as high moisture and temperature over the coastal area and adjacent sea, and enhancement of the Kor'easterlies by expansion of a surface high-pressure system. Upper-level north-northwesterly winds prompted the MCS to propagate south-southeastward along the coastline.
(This article belongs to the Special Issue Weather Forecasting and Modeling Using Satellite Data)
Graphical abstract

Figure 1. (a) Computational domains with triple nesting (d01, d02, and d03) and (b) topography of d03 (shaded) with the height range in m above sea level (asl). In (b), black boxes mark areas for area-averaged analyses, representing the upwind tips of tapering clouds near Sokcho (A1) and Bukgangneung (A2), whereas black dots depict weather stations: Sokcho (S; 38.25°N, 128.56°E; 18.06 m asl), Bukgangneung (B; 37.80°N, 128.86°E; 78.9 m asl), Gangneung (G; 37.75°N, 128.89°E; 26.04 m asl), and Daegwallyeong (D; 37.68°N, 128.86°E; 772.57 m asl). Topography data are from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM), available at https://earthdata.nasa.gov/.
Figure 2. Accumulated rainfall (mm) from the Automatic Weather Systems (AWS) observations, including the Automated Synoptic Observing Systems (ASOS), the composite radar-AWS rain-rate (RAR) data, and the model results (WRF) for (a) 12 h from 12:00 UTC 5 to 00:00 UTC 6 and (b) 27 h from 12:00 UTC 5 to 15:00 UTC 6 August 2018. Black dots depict station locations (see Figure 1b). Computations were made on WRF domain d03.
Figure 3. Surface chart (top) and upper-level charts at 850 hPa (middle left), 700 hPa (middle right), and 500 hPa (bottom) at 12:00 UTC 5 August 2018. At the surface, thick colored lines represent the 1008 hPa isobars—blue at the current time and other colors at given previous times (e.g., 050000 in violet means 00:00 UTC 5 August 2018). The colored ⊗ symbols mark the centers of low pressure systems at previous times. Green hatched lines are isodrosotherms of 20 °C. Stations with fog, precipitation, and thunderstorm are marked by yellow, green, and red circles, respectively. At the upper levels, blue solid lines are isohypses and red dashed lines isotherms; areas with green dots indicate wet areas with T − T_d < 4 °C. Modified from Korea Meteorological Administration (KMA) weather charts.
Figure 4. Skew T-log P diagrams of Bukgangneung at 00:00 UTC 5, 12:00 UTC 5, and 00:00 UTC 6 August 2018.
Figure 5. Time series of meteorological variables from Sokcho (top), Gangneung (middle), and Daegwallyeong (bottom) from 00:00 KST 5 to 00:00 KST 7 August 2018 (KST = UTC + 9) for mean sea-level pressure (P; pink), relative humidity (H; dark blue), temperature (T; red), rainfall (R; blue), and 10-min average wind speed (S; dark green) and direction (W; orange). Areas shaded cyan, pink, and blue mark the precipitation period and the 15-min and 1 h accumulated rainfall amounts, respectively. Dashed boxes indicate periods of abrupt changes in H and T. From the KMA ASOS data.
Figure 6. Temporal variation of brightness temperature (T_B, shaded in gray; K) from the Advanced Himawari Imager (AHI) water vapor band 8. Colored contours indicate T_B = 210 K (red), 215 K (yellow), 220 K (green), and 230 K (blue). The white rectangle (left panel) marks the sub-domain for enlarged images (right panels) from 08:50 UTC to 10:50 UTC 5 August 2018. Green dots depict station locations (see Figure 1b).
Figure 7. Temporal variation of brightness temperature (T_B, shaded in gray; K) from the AHI water vapor band 8 and composite radar reflectivities (Z, shaded; dBZ). Black solid lines indicate T_B = 220 K. In terms of reflectivity, total precipitation is divided into stratiform (Z < 35 dBZ; shaded blue to green) and convective (Z ≥ 35 dBZ; shaded yellow to pink) components. The white rectangle (left panel) marks the subdomain for enlarged images (right panels) from 15:00 to 22:00 UTC 5 August 2018. Black dots depict station locations (see Figure 1b).
Figure 8. Hourly variation of lightning from 15:00 UTC 5 to 07:00 UTC 6 August 2018. Each panel shows lightning that occurred during the last 3 h, colored per half hour. The white rectangle (left panel) marks the sub-domain for enlarged images (right panels). From the KMA.
Figure 9. Hourly variation, from 15:00 UTC 5 to 14:00 UTC 6 August 2018, of the 12 h accumulated rainfall (AR; mm) ending at the specified time in each panel, over the enlarged domain in Figure 8 (white rectangle). Numbers indicate the maximum 12 h AR with stations in parentheses: S for Sokcho, H for Hyeonnae (~36 km north-northwest of S), SA for Seorak-dong (~20 km west-southwest of S), G for Gangneung, and O for Okgye (~20 km south-southeast of G). From the KMA.
Figure 10. Analyses of storm environment at 12:00 UTC (left panels) and 18:00 UTC (right panels) 5 August 2018. Upper panels: wet-bulb potential temperature (θ_w, contoured and shaded in blue; K), horizontal convergence (−δ, shaded; 10⁻⁵ s⁻¹), and horizontal wind vectors (m s⁻¹) at 950 hPa over domain d02 (terrain higher than the 950 hPa level is hatched). Lower panels: convective available potential energy (CAPE, shaded in blue; J kg⁻¹) and surface wind vectors (m s⁻¹) over domain d03.
Figure 11. Time–pressure diagrams, averaged over A1 (a,c) and A2 (b,d) (see Figure 1b) from 00:00 UTC 5 to 11:00 UTC 6 August 2018: (a,b) equivalent potential temperature (θ_e, shaded; K), water vapor mixing ratio (q_v, contoured every 2 g kg⁻¹), potential instability (thick dashed contours every −5 K km⁻¹), and horizontal wind vectors (m s⁻¹); (c,d) horizontal convergence (−δ, shaded; 10⁻⁵ s⁻¹), vertical component of relative vorticity (contoured every 5 × 10⁻⁵ s⁻¹; solid for positive and dashed for negative values) with yellow crosses at the centers of positive values, and vertical wind vectors (m s⁻¹). All fields are computed on domain d03.
Figure 12. Water vapor images from the three AHI water vapor channels—6.2 μm (top panels), 6.9 μm (middle panels), and 7.3 μm (bottom panels)—at 09:00 UTC (left panels) and 12:00 UTC (right panels) 5 August 2018. Blue arrows indicate convective cells described in the text, and the red dashed circle depicts the mid-level dry intrusion. From the KMA/National Meteorological Satellite Center.
Figure 13. Potential temperature (θ, shaded; K) at the lowest model level, hourly accumulated rainfall greater than 20 mm (hatched with closed contour), vertical wind speed at 500 m greater than 0.5 m s⁻¹ (shaded in purple), and horizontal wind vectors at 10 m (m s⁻¹) at (a) 15:00 UTC, (b) 18:00 UTC, (c) 20:00 UTC, and (d) 22:00 UTC 5 August 2018, over domain d03.
Figure 14. Vertical cross sections of the mixing ratios of rain water (q_r, shaded; g kg⁻¹) and cloud water (q_c, contoured every 0.2 g kg⁻¹), potential temperature (θ, red contours every 1 K), and wind vectors along the axis A–B in Figure 13 below 15 km (top and middle panels) and 2 km (bottom panels) at the given times. Thick red lines (bottom panels) depict θ = 300 K for 16:00 UTC 5 and θ = 299 K for 00:00 UTC 6, respectively. Computations are made on domain d03.
Figure 15. Same as in (a) Figure 10, (b) Figure 13, and (c) Figure 7, respectively, but for 03:00 UTC 6 August 2018.
Figure 16. Schematic diagram of the quasi-stationary MCS representing the triggering and back-building processes for the coastal flooding event in the Yeongdong region of South Korea.
10 pages, 902 KiB  
Letter
Lens-Loaded Coded Aperture with Increased Information Capacity for Computational Microwave Imaging
by Okan Yurduseven, Muhammad Ali Babar Abbasi, Thomas Fromenteze and Vincent Fusco
Remote Sens. 2020, 12(9), 1531; https://doi.org/10.3390/rs12091531 - 11 May 2020
Cited by 21 | Viewed by 3101
Abstract
Computational imaging using coded apertures offers all-electronic operation with substantially reduced hardware complexity for data acquisition. At the core of this technique is the single-pixel coded aperture modality, which produces spatio-temporally varying, quasi-random bases to encode the back-scattered radar data, replacing the pixel-by-pixel raster scanning required by conventional imaging techniques. For a frequency-diverse computational imaging radar, the coded aperture is of significant importance, governing key imaging metrics such as the orthogonality of the information encoded from the scene as the frequency is swept, and hence the conditioning of the imaging problem, directly impacting the fidelity of the reconstructed images. In this paper, we present dielectric lens loading of coded apertures as an effective way to increase the information coding capacity of frequency-diverse antennas for computational imaging problems. We show that by lens loading the coded aperture for the presented imaging problem, the number of effective measurement modes can be increased by 32% while the conditioning of the imaging problem is improved by a factor greater than two.
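The conditioning argument rests on the singular value decomposition (SVD) of the sensing matrix: the more singular values that stay above the noise floor, the more effective measurement modes the aperture provides (cf. Figure 6 below). A minimal sketch of that mode count, assuming a generic measurement matrix H; the random matrix here is a stand-in for the simulated aperture fields, not the paper's data.

```python
import numpy as np

def effective_modes(H, snr_db=20.0):
    """Count measurement modes whose singular values stay above the SNR floor.

    H      : measurement matrix (frequency modes x scene pixels) of a
             frequency-diverse aperture
    snr_db : system SNR level used as the threshold (20 dB, as in the
             paper's mode-counting discussion)
    """
    s = np.linalg.svd(H, compute_uv=False)
    s_db = 20.0 * np.log10(s / s.max())        # normalised SVD spectrum in dB
    return int(np.sum(s_db > -snr_db)), s_db

# Example: a random 201-mode system as a placeholder for simulated fields
rng = np.random.default_rng(1)
H = rng.standard_normal((201, 1600))
n_eff, spectrum = effective_modes(H)
print(f"{n_eff} of 201 modes above the 20 dB threshold")
```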
Graphical abstract

Figure 1. Depiction of the studied imaging scenario: (a) frequency-diverse computational imaging of an L-shaped phantom with lens; (b) without lens. Dimensions: L₁ = 5 cm, W₁ = 5 cm, L₂ = 0.4 m, W₂ = 0.4 m, D = 0.4 m, h = 0.15 m, a = 1.5 cm. Aperture coordinates are r(x, y, z) and scene coordinates are r′(x′, y′, z′).
Figure 2. Simulated reflection coefficient pattern of the lens-loaded frequency-diverse metasurface antenna.
Figure 3. Characterised near-field electric field data at adjacent frequencies: (a) 9.99 GHz, (b) 10 GHz, (c) 10.01 GHz. Colorbar in dB.
Figure 4. Reconstructed images of an L-shaped gun phantom: (a) reconstructed microwave image without the lens-loaded metasurface antenna, (b) reconstructed image with the lens-loaded metasurface antenna, (c) reflectivity distribution of the imaged scene (ground truth), (d) reconstructed image of the gun phantom in the yz-plane (depth), (e) reflectivity distribution comparison between the ground truth and the reconstructed image along the y = 0 m slice. Colorbar in dB scale.
Figure 5. Normalised spatial frequency domain (k-space) representation of the coded aperture radiated fields: (a) 9 GHz, (b) 11 GHz, (c) overall k-space pattern formed by incoherent superposition (summing intensity data) of the k-transformed coded aperture radiated fields for 201 frequency points.
Figure 6. Comparison of the SVD patterns with and without the loaded lens: (a) SVD spectrum for the entire 201 measurement modes, with the system SNR level highlighted as a reference threshold for the effective number of measurement modes; (b) zoomed-in SVD window with significant singular value components remaining above the 20 dB SNR level. The selected region exhibits a rather linear decay spectrum, with the decay slopes labelled K₁ (with lens) and K₂ (without lens); K₁ = 0.8 K₂ within the highlighted window.
20 pages, 5800 KiB  
Article
Toward a Standardized Encoding of Remote Sensing Geo-Positioning Sensor Models
by Meng Jin, Yuqi Bai, Emmanuel Devys and Liping Di
Remote Sens. 2020, 12(9), 1530; https://doi.org/10.3390/rs12091530 - 11 May 2020
Cited by 3 | Viewed by 3315
Abstract
Geolocation information is an important feature of remote sensing image data that is captured through a variety of passive or active observation sensors, such as push-broom electro-optical sensors, synthetic aperture radar (SAR), light detection and ranging (LIDAR), and sound navigation and ranging (SONAR). As a fundamental processing step to locate an image, geo-positioning is used to determine the ground coordinates of an object from image coordinates. A variety of sensor models have been created to describe the geo-positioning process. In particular, the Open Geospatial Consortium (OGC) has defined the Sensor Model Language (SensorML) specification in its Sensor Web Enablement (SWE) initiative to describe sensors, including the geo-positioning process; it is realized using syntax from the extensible markup language (XML). In addition, two standards defined by the International Organization for Standardization (ISO), ISO 19130-1 and ISO 19130-2, introduced a physical sensor model, a true replacement model, and a correspondence model for the geo-positioning process. However, a standardized encoding for geo-positioning sensor models is still missing in the remote sensing community, so the interoperability of remote sensing data between application systems cannot be ensured. In this paper, a standardized encoding of remote sensing geo-positioning sensor models is introduced. It is semantically based on ISO 19130-1 and ISO 19130-2, and syntactically based on OGC SensorML. It defines a cross mapping of the sensor models defined in ISO 19130-1 and ISO 19130-2 to SensorML, and then proposes a detailed encoding method to finalize the XML schema (an XML schema here is the structure that defines an XML document), which will become a profile of OGC SensorML. It seamlessly unifies the sensor models defined in ISO 19130-1, ISO 19130-2, and OGC SensorML. By enabling a standardized description of the sensor models used to produce remote sensing data, this standard is very promising for promoting data interoperability, mobility, and integration in the remote sensing domain.
(This article belongs to the Section Remote Sensing Image Processing)
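To make the encoding idea concrete, the sketch below serialises a toy SD_Sensor element (cf. Figure 6 below). The namespace URI and element names are illustrative assumptions loosely following the paper's sml19130 naming; the real schema is defined by the standard itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace URI; the actual sml19130 profile defines its own.
NS = "http://example.org/sml19130"
ET.register_namespace("sml19130", NS)

# Build a minimal sensor-description instance document.
sensor = ET.Element(f"{{{NS}}}SD_Sensor")
ident = ET.SubElement(sensor, f"{{{NS}}}identifier")
ident.text = "frame-camera-001"
model = ET.SubElement(sensor, f"{{{NS}}}sensorModelType")
model.text = "physical"  # physical vs. true replacement vs. correspondence

print(ET.tostring(sensor, encoding="unicode"))
```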
Graphical abstract

Figure 1. An overview of the UML packages defined in two important sensor model related standards.
Figure 2. The structure of a simple process example implemented in SensorML to describe a frame sensor.
Figure 3. The mapping result when an ISO 19130-1 or ISO 19130-2 sensor model component matches a SensorML class.
Figure 4. The mapping result when a SensorML class cannot meet the semantic and constraint requirements of an ISO 19130-1 or ISO 19130-2 sensor model component.
Figure 5. The mapping result when SensorML has no class similar to the ISO 19130-1 and ISO 19130-2 sensor model components.
Figure 6. An example XML Schema of SD_Sensor under the namespace sml19130.
Figure 7. Mapping ISO 19130-1 and ISO 19130-2 classes to SensorML by four class categories (i. non-physical/atomic, ii. physical/atomic, iii. non-physical/composite, and iv. physical/composite): (a) SensorML classes in the four categories; (b) the corresponding ISO 19130-1 and ISO 19130-2 classes in the four categories.
Figure 8. Resulting specification structure overview of ISO 19130-1.
Figure 9. Resulting specification structure overview of ISO 19130-2.
Figure 10. The location of the example image near Poza Rica in Mexico.
Figure 11. The components of the SE_SensorModel instance in a real scenario (left) and the corresponding schema elements in structured form (right).
23 pages, 6851 KiB  
Article
An Integrated Solution for 3D Heritage Modeling Based on Videogrammetry and V-SLAM Technology
by Pedro Ortiz-Coder and Alonso Sánchez-Ríos
Remote Sens. 2020, 12(9), 1529; https://doi.org/10.3390/rs12091529 - 11 May 2020
Cited by 14 | Viewed by 4203
Abstract
This paper presents an approach for 3D reconstruction of heritage scenes using a videogrammetry-based device. The system, based on two video cameras with different characteristics, uses a combination of visual simultaneous localization and mapping (V-SLAM) and photogrammetry technologies. V-SLAM, together with a series of filtering algorithms, is used for the optimal selection of images and to guarantee that the user does not lose tracking during data acquisition in real time. The different photogrammetrically adapted tools in this device, and for this type of handheld capture, are explained. An evaluation of the device is carried out, including comparisons with the Faro Focus3D X 330 laser scanner, through three case studies in which multiple aspects are analyzed. We demonstrate that the proposed videogrammetric system captures data 17 times faster than the laser scanner and that its post-processing is fully automatic, although it takes more time than that of the laser scanner. The accuracies of both systems and the generated textures are very similar. Our evaluation demonstrates the possibility of considering the proposed system as a new professional-quality measurement instrument.
(This article belongs to the Special Issue Photogrammetry and Image Analysis in Remote Sensing)
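The point-cloud cleaning step reported in this work combines statistical outlier removal (SOR) and radius outlier removal (ROR), as illustrated in Figure 4 below. A minimal sketch of that two-stage filter using the Open3D library follows; the file path and all parameter values are placeholders, not the paper's settings.

```python
import open3d as o3d

# Load a point cloud (path is a placeholder) and apply the two filters
# in the order used in the paper: SOR first, then ROR.
pcd = o3d.io.read_point_cloud("column.ply")

# SOR: drop points whose mean neighbour distance deviates too much
# from the global average (assumed 20 neighbours, 2-sigma cutoff).
pcd_sor, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# ROR: drop points with too few neighbours inside a fixed radius,
# which sharpens contours such as the column cross section.
pcd_ror, _ = pcd_sor.remove_radius_outlier(nb_points=16, radius=0.05)

o3d.io.write_point_cloud("column_filtered.ply", pcd_ror)
```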
Graphical abstract

Figure 1. General procedure overview.
Figure 2. Initial prototype configuration P-1 (top) and the actual prototype P-2 (bottom), 14 cm long and lighter, with an extendable pole anchor point (red arrow) and a ball joint system (yellow arrow) to facilitate data capture.
Figure 3. Break area filter workflow.
Figure 4. Effect of statistical outlier removal (SOR) and radius outlier removal (ROR) filters on a point cloud of the surface of a Roman column: (a) point cloud with outliers in the areas marked with yellow dashed lines; (b) point cloud after applying the SOR filter, with section AA marked with a red line; (c) point cloud of section AA after applying the SOR filter; (d) point cloud of section AA after applying the SOR and ROR filters, with a better-defined contour of the column section.
Figure 5. Work areas in the "Casa del Mitreo": (a) working area 1: pond and peristilium (left); (b) working area 2: underground rooms (center); (c) working area 3: rooms with mosaic floors (right).
Figure 6. Total station (a) and target model (b) used to coordinate control points; Faro Focus3D X 330 laser scanner recording data in working areas 1 (c) and 3 (d); data capture with the proposed system in working area 2 (e,f).
Figure 7. Trajectory followed (red lines) during data acquisition in the three work zones: working area 1 (a); working area 2 (b); working area 3 (c).
Figure 8. Comparison of the point clouds resulting from the LS Faro Focus3D X 330 (a,c,e) and the proposed system (b,d,f) for working area 1 (a,b), working area 2 (c,d), and working area 3 (e,f). Visualization was carried out in Meshlab [52] with a homogeneous normal calculation for both systems, to eliminate textures and favor a balanced geometric comparison.
Figure 9. Distributions of angular and linear magnitudes in samples M1 and M2.
Figure 10. Overlapping cross sections generated in both point clouds, obtained using a horizontal plane 0.50 m above the ground. The section of the cloud captured by the laser scanner appears in red, and that of the proposed system in green: work zone 1 (a) at scale 1/235; work zone 3 (b) at scale 1/180.
Figure 11. Detail of the point clouds of work zones 2 (a) and 3 (b) obtained with both systems. The two upper images were obtained with the proposed system; those obtained with the Focus3D X 330 are shown below.
20 pages, 3724 KiB  
Article
Novel Semi-Supervised Hyperspectral Image Classification Based on a Superpixel Graph and Discrete Potential Method
by Yifei Zhao, Fenzhen Su and Fengqin Yan
Remote Sens. 2020, 12(9), 1528; https://doi.org/10.3390/rs12091528 - 11 May 2020
Cited by 18 | Viewed by 3236
Abstract
Hyperspectral image (HSI) classification plays an important role in the automatic interpretation of remotely sensed data. However, it is a non-trivial task to classify an HSI accurately and rapidly, due to its large data volume and massive noise points. To address this problem, a novel semi-supervised, superpixel-level classification method for HSIs was proposed based on a graph and discrete potential (SSC-GDP). The key idea of the proposed scheme is the construction of a weighted connectivity graph and the division of that weighted graph. Based on the superpixel segmentation, a weighted connectivity graph is constructed using the weighted connections between each superpixel and its spatial neighbors. The generated graph is then divided into different communities/sub-graphs using a discrete potential and the improved semi-supervised Wu–Huberman (ISWH) algorithm. Each community in the weighted connectivity graph represents a class in the HSI. The local connection strategy, together with the linear complexity of the ISWH algorithm, ensures the fast implementation of the suggested SSC-GDP method. To prove the effectiveness of the proposed spectral–spatial method, two public benchmarks, Indian Pines and Salinas, were utilized to test the performance of our proposal. The comparative test results confirmed that the proposed method was superior to several other state-of-the-art methods.
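The local connection strategy means the graph only links superpixels that share a spatial border, which keeps the edge count (and hence the ISWH cost) linear in the number of superpixels. A minimal sketch of that graph construction follows; the `superpixel_graph` helper and the Gaussian spectral weighting are common choices assumed for illustration, and may differ from the paper's exact weighting.

```python
import numpy as np
import networkx as nx

def superpixel_graph(labels, spectra, sigma=1.0):
    """Weighted connectivity graph over spatially adjacent superpixels.

    labels  : 2D array of superpixel ids per pixel (e.g., from ERS)
    spectra : (n_superpixels, n_bands) mean spectrum per superpixel
    """
    G = nx.Graph()
    G.add_nodes_from(range(spectra.shape[0]))
    # Adjacent superpixels share a horizontal or vertical pixel border.
    pairs = set()
    pairs |= {tuple(sorted(p)) for p in
              zip(labels[:, :-1].ravel(), labels[:, 1:].ravel()) if p[0] != p[1]}
    pairs |= {tuple(sorted(p)) for p in
              zip(labels[:-1, :].ravel(), labels[1:, :].ravel()) if p[0] != p[1]}
    # Gaussian similarity of mean spectra as the edge weight.
    for i, j in pairs:
        d2 = np.sum((spectra[i] - spectra[j]) ** 2)
        G.add_edge(i, j, weight=np.exp(-d2 / (2.0 * sigma**2)))
    return G
```

Community division (the discrete-potential/ISWH step) then operates on this sparse graph rather than on the full pixel grid, which is where the speed advantage comes from.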
Graphical abstract

Figure 1. The proposed classification framework. The base image is generated using the first principal component from the PCA method. For clarity, the weights of each edge are omitted in the constructed graph and community structure. ERS: entropy rate superpixel segmentation; HSI: hyperspectral image; ISWH: improved semi-supervised Wu–Huberman algorithm; PCA: principal component analysis.
Figure 2. The process of discovering community structures using the ISWH method. The digit in each circle represents the potential of the node; a red digit indicates a labeled node with known potential. (a) A network of nine nodes with two communities. (b,c) Potential of each node after five successive updates, obtained by assigning different initial values to the two labeled nodes.
Figure 3. Indian Pines dataset: (a) false-color composite image and (b) reference image.
Figure 4. Salinas dataset: (a) false-color composite image and (b) reference image.
Figure 5. Classification maps of the Indian Pines dataset produced using (a) EPF, (b) IFRF, (c) LORSAL, (d) SMLR-S, (e) SCMK, (f) SuperPCA, (g) SVM-SD, and (h) the proposed method.
Figure 6. Classification maps of the Salinas dataset produced using (a) EPF, (b) IFRF, (c) LORSAL, (d) SMLR-S, (e) SCMK, (f) SuperPCA, (g) SVM-SD, and (h) the proposed method.
Figure 7. Effect of the number of superpixels on classification accuracy: (a) the Indian Pines dataset and (b) the Salinas dataset.
Figure 8. Comparison results of eight classifiers on the (a) Indian Pines and (b) Salinas datasets.
Figure 9. Variation of the classification results on the two hyperspectral datasets with the number of updates.
14 pages, 2363 KiB  
Letter
Mapping Fragmented Impervious Surface Areas Overlooked by Global Land-Cover Products in the Liping County, Guizhou Province, China
by Jing Zhao and Narumasa Tsutsumida
Remote Sens. 2020, 12(9), 1527; https://doi.org/10.3390/rs12091527 - 11 May 2020
Cited by 5 | Viewed by 3304
Abstract
Imperviousness is an important indicator for monitoring urbanization and environmental changes, and is evaluated widely in urban areas but not in rural areas. An accurate impervious surface area (ISA) map of rural areas is essential for environmental conservation and sustainable rural development. Global land-cover products such as MODIS MCD12Q1, ESA CCI-LC, and Global Urban Land are common resources for environmental practitioners to collect land-cover information, including ISAs. However, global products tend to focus on large ISA agglomerations and may not identify fragmented ISA extents in less populated regions. Land-use planners and practitioners have to map ISAs themselves when such spatially explicit information is difficult to obtain from local governments, yet a common and consistent approach for rural ISA mapping is yet to be established. A case study of the Liping County, a typical rural region in southwest China, was undertaken with the objectives of assessing the global land-cover products in the context of rural ISA mapping and proposing a simple and feasible approach for the mapping. This approach was developed using Landsat 8 imagery and a random forests classifier. An appropriate number of training samples was distributed to towns or villages across all townships in the study area for classification. The results demonstrate that the global land-cover products identified major ISA agglomerations, specifically at the county seat, but overlooked the other fragmented ISAs over the study area. In contrast, the map created using the developed approach inferred ISAs across all townships with an overall accuracy of 91%. A large number of training samples, together with geographic information on towns or villages, is the key to identifying and mapping ISAs in rural areas.
(This article belongs to the Special Issue Recent Advances in Satellite Derived Global Land Product Validation)
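The classifier tuning reported in Figure 4 below sweeps the two standard random forests parameters, ntree and mtry, against out-of-bag (OOB) error. A minimal sketch of that workflow with scikit-learn, where ntree and mtry map to n_estimators and max_features; the synthetic features, labels, and parameter values are assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# X: (n_samples, n_bands) Landsat 8 band values sampled from training
# polygons; y: 1 for impervious surface, 0 otherwise. Synthetic here.
rng = np.random.default_rng(0)
X = rng.random((5000, 7))
y = (X[:, 0] + 0.2 * rng.standard_normal(5000) > 0.5).astype(int)

rf = RandomForestClassifier(n_estimators=500,   # ntree
                            max_features=3,     # mtry
                            oob_score=True, random_state=0)
rf.fit(X, y)
print(f"OOB error: {1 - rf.oob_score_:.3f}")        # cf. Figure 4
print("band importance:", np.round(rf.feature_importances_, 3))  # cf. Figure 6
```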
Graphical abstract

Figure 1. Location of the Liping County. Numbers indicate the number of villages and/or towns in each township (390 villages and 33 towns in total). Elevation data from [36].
Figure 2. A false-color composite for the creation of training samples (a) at the county seat and (b) at a village.
Figure 3. Spatial distribution of training sample polygons.
Figure 4. Out-of-bag (OOB) error of the random forests classifier with different values of the mtry and ntree parameters.
Figure 5. ISA maps of the Liping County for 2015 from (a) Moderate Resolution Imaging Spectroradiometer Land Cover Type/Dynamics products (MODIS MCD12Q1), (b) European Space Agency Climate Change Initiative-Land Cover (ESA CCI-LC), (c) Global Urban Land (GUL), and (d) the random forests classifier.
Figure 6. Variable importance of the random forests classifier by (a) mean decrease accuracy and (b) mean decrease Gini.
21 pages, 8793 KiB  
Article
Secular Changes in Atmospheric Turbidity over Iraq and a Possible Link to Military Activity
by Alexandra Chudnovsky and Alexander Kostinski
Remote Sens. 2020, 12(9), 1526; https://doi.org/10.3390/rs12091526 - 11 May 2020
Cited by 7 | Viewed by 4603
Abstract
We examine satellite-derived aerosol optical depth (AOD) data during the period 2000–2018 over the Middle East to evaluate the contribution of anthropogenic pollution. We focus on Iraq, where US troops were present for nearly nine years. We begin with a plausibility argument linking anthropogenic influence and the AOD signature. We then calculate the percent change in AOD every two years. To pinpoint the causes of changes in AOD on a spatial basis, we distinguish between synoptically "calm" periods and those with vigorous synoptic activity. This was done using high-resolution 10 km AOD retrievals from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor (Terra satellite). We found spatiotemporal variability in the intensity of the AOD and its standard deviation along the dust-storm corridor during three studied periods: before Operation Iraqi Freedom (OIF) (1 March 2000–19 March 2003); during OIF (20 March 2003–1 September 2010) and Operation New Dawn (OND; 1 September 2010–18 December 2011); and after the US troops' withdrawal (19 December 2011–31 December 2018). Pixels of military camps and bases, major roads, and areas of conflict, and their corresponding AOD values, were selected to study possible effects. We found that winter, with its higher frequency of synoptically "calm" days compared to spring and summer, was the best season to quantitatively estimate the impact of these ground-based sources. Surprisingly, an anthropogenic impact on the AOD signature was also visible during vigorous synoptic activity. Meteorological conditions that favor detection of these effects from space imagery are discussed; the effects are more salient than in surrounding regions with similar meteorological conditions, exceeding what would be expected from synoptic variations alone.
Graphical abstract

Figure 1. Aerosol optical depth (AOD) spatial and temporal variability over Iraq, generated using the open-source Giovanni engine (https://giovanni.gsfc.nasa.gov/giovanni/doc/UsersManualworkingdocument.docx.html). (A) Averaged AOD on a pixel basis over 1 March 2000–19 March 2003 (before US troop and coalition force entrance), 20 March 2003–31 December 2011 (presence of US and coalition forces), and 1 January 2012–31 December 2018 (post-war). (B) Monthly averaged AOD values over Iraq during 2000–2018; US troop entrance was generally followed by an increase in AOD, whereas the converse is true upon troop withdrawal.
Figure 2. Comparison of monthly average AOD values between Iraq and adjacent countries. Left: smoothed average AOD; right: monthly average AOD. Iraq, Kuwait (A), and Saudi Arabia (B) show a similar increasing trend; the amplitude is larger for Kuwait, and the amplitude for Iraq is larger than that for Saudi Arabia. However, the decrease in AOD between 2012 and 2018 is largest for Iraq. The Iraq AOD pattern differs from those of Jordan and Syria (A,C). The data were generated using the open-source Giovanni engine at 1° spatial resolution (https://giovanni.gsfc.nasa.gov/giovanni/doc/UsersManualworkingdocument.docx.html).
Figure 3. Map of military/anthropogenic sources in Iraq (based on Table 1). Black spots result from oil refineries and a possible military activity signature, as observed in the MODIS visible reflectance data (MOD021 product). Locations of major military conflict areas are as reported by the media, and air base locations were downloaded. Note that the spatial direction is along the main wind direction: west–northwest.
Figure 4. Average percent change in AOD for different time frames. Here we assume that a positive change for all countries reflects the synoptic impact, whereas an increase for only one country reflects local conditions. Calm periods are circled in red and indicated by red arrows; synoptic impact years are indicated by black arrows.
Figure 5. Average percent change for selected time periods: "in war" and "post-war". Case 1: large synoptic impact during military presence. Case 2: small synoptic impact during the same period, for comparison. Case 3: "post-war" reference showing a general decrease in AOD.
Figure 6. Averaged seasonal AOD (Panel A) and AOD standard deviation (stdev) (Panel B) on a pixel basis over three time periods: 1 March 2000–19 March 2003 (before entrance of US troops and coalition forces), 20 March 2003–31 December 2011 (US and coalition forces present), and 1 January 2012–31 December 2018 ("post-war" period).
Figure 7. Standard deviation (stdev) of AOD during the three time periods (see legend in Figure 6). All pixels with stdev above 0.15 are highlighted in red; all pixels were counted and are shown.
Figure 8. Percent change in AOD (top row) and AOD stdev (bottom row) in winter between Periods 1 and 2 (left) and Periods 2 and 3 (right). Blue pixels are military sites. Reddish colors indicate increase, whereas black and grey indicate decrease. Period 1: 1 March 2000–19 March 2003 (before OIF); Period 2: 20 March 2003–31 December 2011 (presence of US forces); Period 3: 1 January 2012–31 December 2018 (post-war).
Figure 9. Average stdev of AOD in summer for different periods. An increase in AOD stdev appears in all areas indicated by red arrows (zoom-in on 2006–2011); these are areas with high anthropogenic activity. The stdev of AOD during 2006–2011 is also much larger than during the other reference periods.
Figure 10. Balad military air base site and PM10 sampling locations during 2004–2009, based on publicly available data [60]. (A) Box plot of mg PM10 per m³ confirms the increase in PM10 and distinct temporal pattern variability. (B) Spatial variability between different locations.
23 pages, 8581 KiB  
Article
An Improved Cloud Detection Method for GF-4 Imagery
by Ming Lu, Feng Li, Bangcheng Zhan, He Li, Xue Yang, Xiaotian Lu and Huachao Xiao
Remote Sens. 2020, 12(9), 1525; https://doi.org/10.3390/rs12091525 - 11 May 2020
Cited by 10 | Viewed by 3208
Abstract
Clouds are significant barriers to the application of optical remote sensing images. Accurate cloud detection can help to remove contaminated pixels and improve image quality. Many cloud detection methods have been developed; however, traditional methods rely heavily on either thermal infrared bands or clear-sky reference images. When traditional cloud detection methods are applied to Gaofen 4 (GF-4) imagery, it is very difficult to separate objects with spectra similar to clouds, such as ice, snow, and bright sand. In this paper, we propose a new method, named Real-Time-Difference (RTD), to detect clouds using a pair of images obtained by the GF-4 satellite. The RTD method has four main steps: (1) data preprocessing, including transforming digital number (DN) values to Top of Atmosphere (TOA) reflectance, and orthographic and geometric correction; (2) the computation of a series of cloud indexes for a single image to highlight clouds; (3) the calculation of the difference between a pair of real-time images in order to detect moving clouds; and (4) confirming clouds and background by analyzing their physical and dynamic features. The RTD method was validated at three sites located in the Hainan, Liaoning, and Xinjiang areas of China, and the results were compared with those of a popular classifier, the Support Vector Machine (SVM). RTD outperformed SVM: for the Hainan, Liaoning, and Xinjiang areas, respectively, its overall accuracy reached 95.9%, 94.1%, and 93.9%, and its Kappa coefficient reached 0.92, 0.88, and 0.88. In the future, we expect RTD to be developed into an important means of rapid cloud detection for images from geostationary orbit satellites.
(This article belongs to the Special Issue Quality Improvement of Remote Sensing Images)
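The moving-cloud test at the core of RTD (step 3) is simple enough to sketch. Below is a minimal illustration in Python, assuming two co-registered TOA reflectance arrays from time-adjacent GF-4 acquisitions; the function name and both thresholds are placeholders, not the paper's calibrated values:

```python
import numpy as np

def rtd_moved_cloud_mask(toa_t1, toa_t2, diff_thresh=0.10, bright_thresh=0.35):
    """Flag bright pixels that change between two time-adjacent frames.

    Clouds drift between GF-4 acquisitions while ice, snow, and bright
    sand stay put, so a large temporal difference in an otherwise
    cloud-like (bright) pixel marks a cloud candidate.
    """
    diff = np.abs(toa_t2 - toa_t1)                        # temporal change signal
    bright = np.maximum(toa_t1, toa_t2) > bright_thresh   # cloud-like brightness
    return bright & (diff > diff_thresh)                  # moving and bright

```

Static bright surfaces fail the difference test, which is why the pairwise design can separate them from clouds without a thermal band.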
Figure 1. The locations of the test images of the Hainan, Liaoning, and Xinjiang areas in China.
Figure 2. Flowchart for the Real-Time-Difference (RTD) cloud-screening method. GSWO: Global Surface Water Occurrence.
Figure 3. False-color GF-4 image of the Hainan study area with the NIR-R-G composition (left) and a difference map of the time-adjacent images in the same area (right).
Figure 4. Classification of clouds and background in the Hainan study area using the RTD (left) and Support Vector Machine (SVM; right) methods.
Figure 5. The 2000 random points generated for the Hainan study area that were used for the accuracy assessment. Red points indicate clouds and green points indicate background.
Figure 6. False-color GF-4 image of the Liaoning area using the NIR-R-G composition (left) and a difference map of two time-adjacent blue band images of the same area (right).
Figure 7. Classification of clouds and background in the Liaoning area using the RTD method (left) and SVM method (right).
Figure 8. The 2000 points obtained using stratified random sampling from the Liaoning study area that were used for the accuracy assessment. Red points indicate clouds and green points indicate background.
Figure 9. False-color GF-4 image of the Xinjiang area using the NIR-R-G composition (left) and a difference map of two time-adjacent blue band images of the same area (right).
Figure 10. Classification of clouds and background in the Xinjiang area using the RTD method (left) and the SVM method (right).
Figure 11. The 2000 points obtained using stratified random sampling from the Xinjiang study area that were used for the accuracy assessment. Red points indicate clouds and green points indicate background.
Figure 12. The results of the spectral test of the RTD method in Xinjiang.
Figure 13. A part of the original GF-4 imagery (a) and the cloud-detection results of SVM with different kernel types, namely linear (b), polynomial (c), radial basis (d), and sigmoid (e).
Figure 14. The SVM results using the radial basis kernel type with penalty parameters of 20 (a), 40 (b), 60 (c), 80 (d), and 100 (e).
Figure 15. Example of the MIR band of the GF-4 imagery for the Liaoning Sea study area.
20 pages, 7474 KiB  
Article
Impact of Aerosol Vertical Distribution on Aerosol Optical Depth Retrieval from Passive Satellite Sensors
by Chong Li, Jing Li, Oleg Dubovik, Zhao-Cheng Zeng and Yuk L. Yung
Remote Sens. 2020, 12(9), 1524; https://doi.org/10.3390/rs12091524 - 11 May 2020
Cited by 27 | Viewed by 4721
Abstract
When retrieving Aerosol Optical Depth (AOD) from passive satellite sensors, the vertical distribution of aerosols usually needs to be assumed, potentially causing uncertainties in the retrievals. In this study, we use the Moderate Resolution Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) [...] Read more.
When retrieving Aerosol Optical Depth (AOD) from passive satellite sensors, the vertical distribution of aerosols usually needs to be assumed, potentially causing uncertainties in the retrievals. In this study, we use the Moderate Resolution Imaging Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) sensors as examples to investigate the impact of aerosol vertical distribution on AOD retrievals. A series of sensitivity experiments was conducted using radiative transfer models with different aerosol profiles and surface conditions. Assuming an AOD of 0.2, we found that the AOD retrieval error is most sensitive to the vertical distribution of absorbing aerosols; a −1 km error in aerosol scale height can lead to a ~30% AOD retrieval error. Moreover, for this aerosol type, ignoring the existence of the boundary layer can further result in a ~10% AOD retrieval error. The differences in the vertical distribution of scattering and absorbing aerosols within the same column may also cause −15% (scattering aerosols above absorbing aerosols) to 15% (scattering aerosols below absorbing aerosols) errors. Surface reflectance also plays an important role in the AOD retrieval error, with higher errors over brighter surfaces in general. The physical mechanism associated with the AOD retrieval errors is also discussed. Finally, by replacing the default exponential profile with the aerosol vertical profile observed by a micro-pulse lidar at the Beijing-PKU site in the VIIRS retrieval algorithm, the retrieved AOD shows a much better agreement with surface observations, with the correlation coefficient increased from 0.63 to 0.83 and the bias decreased from 0.15 to 0.03. Our study highlights the importance of the aerosol vertical profile assumption in satellite AOD retrievals, and indicates that considering more realistic profiles can help reduce the uncertainties. Full article
(This article belongs to the Special Issue Advances of Remote Sensing Inversion)
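The sensitivity experiments vary the assumed scale height of an exponentially decreasing profile (see Figure 1 below). Here is a minimal sketch of such a profile, normalized so that it integrates to the column AOD; the function and height grid are illustrative and not part of the 6SV model itself:

```python
import numpy as np

def exponential_extinction_profile(aod, scale_height_km, z_km):
    """Extinction sigma(z) = (AOD / H) * exp(-z / H), which integrates
    to the column AOD over an infinitely deep atmosphere."""
    return (aod / scale_height_km) * np.exp(-z_km / scale_height_km)

z = np.linspace(0.0, 5.0, 51)           # heights up to 5 km, as plotted in Figure 1
for h in (1.0, 1.5, 2.0, 2.5, 3.0):     # scale heights varied in the experiments
    sigma = exponential_extinction_profile(0.2, h, z)  # AOD = 0.2, as assumed above
    # np.trapz(sigma, z) recovers ~0.2, less the small tail truncated above 5 km
```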
Figure 1. Examples of exponentially decreasing aerosol profiles for different scale heights, varying from 1.0 to 3.0 km with an interval of 0.5 km. Only profiles below 5 km are shown in this figure.
Figure 2. Spectral dependence of single scattering albedo (a) and asymmetry factor (b) for three aerosol models embedded in the 6SV radiative transfer model.
Figure 3. (a) The relative errors of AOD (Aerosol Optical Depth) at 550 nm with the errors of scale height for three different aerosol models. (b) The change in AOD relative error with AOD value. The true scale height is assumed to be 2 km.
Figure 4. The relative errors of AOD at 550 nm with the errors of scale height under four surface albedos (0.02, 0.05, 0.1, 0.2). The fine absorbing aerosol model is used in this experiment.
Figure 5. Example of aerosol extinction profiles observed by MPL (micro-pulse lidar) at the Beijing-PKU site. Only profiles below 5 km are shown in this figure.
Figure 6. Two types of aerosol vertical distributions: an exponentially decreasing profile starting from the surface (blue line), and one with a 1-km boundary layer near the surface (red line).
Figure 7. (a) The relative errors of AOD at 550 nm by ignoring the boundary layer structure, under four surface albedos (0.02, 0.05, 0.1, 0.2). (b) The change in AOD relative errors with the AOD value. The fine absorbing aerosol model is used in this experiment. (c) The expanded part of (b).
Figure 8. Examples of aerosol layering from CALIPSO aerosol subtype products (ND: not defined; CM: clean marine; D: dust; PC: polluted continental; CC: clean continental; PD: polluted dust; S: smoke; O: other). The red boxes indicate smoke above pollution aerosols.
Figure 9. Assumed aerosol profile structures for the layered structure experiment, with two different aerosol types (sulfate and soot) located at different altitudes.
Figure 10. Spectral dependence of single scattering albedo (a) and asymmetry factor (b) for soot and sulfate (50% humidity) obtained from the OPAC database.
Figure 11. Relative errors of AOD at 550 nm by ignoring the different vertical distributions of absorbing and scattering aerosols (Case 1: soot lying between 0–1.0 km and sulfate lying between 3.0–4.0 km; Case 2: the reversed structure of Case 1; Case 3: a mixed exponential profile).
Figure 12. Spectral dependence of single scattering albedo (a) and asymmetry factor (b) for reconstructed seasonal aerosol models at the Beijing-PKU site.
Figure 13. Scatterplots of AOD retrieved using the default profile (exponentially decaying with scale height = 2 km) and the MPL-retrieved aerosol extinction profile.
Figure 14. Comparison of the mean bias (upper panel), RMSE (middle panel) and the correlation (bottom panel) between the retrieval using the exponential profile (blue) and the MPL-retrieved aerosol profile (orange) for the four seasons.
Figure 15. Histograms of the SSA (single scattering albedo) values at the Beijing-PKU site during 2016–2019.
Figure 16. Scatterplots of AOD retrieved using the default profile (exponentially decaying with scale height = 2 km) and the CALIPSO aerosol extinction profile.
Figure 17. Comparison of the mean bias (upper panel), RMSE (middle panel) and the correlation (bottom panel) between the retrieval using the exponential profile (blue) and the CALIPSO aerosol profile (orange) for the four seasons.
Figure 18. Geographical boundaries of the seven regions defined for the aerosol parameter analysis. AERONET sites are indicated by the red dots.
Figure 19. Probability density function of AERONET AOD for the four seasons over the seven regions.
Figure 20. Probability density function of AERONET SSA values for the four seasons over the seven regions.
20 pages, 2125 KiB  
Article
Mapping Floristic Patterns of Trees in Peruvian Amazonia Using Remote Sensing and Machine Learning
by Pablo Pérez Chaves, Gabriela Zuquim, Kalle Ruokolainen, Jasper Van doninck, Risto Kalliola, Elvira Gómez Rivero and Hanna Tuomisto
Remote Sens. 2020, 12(9), 1523; https://doi.org/10.3390/rs12091523 - 10 May 2020
Cited by 7 | Viewed by 8747
Abstract
Recognition of the spatial variation in tree species composition is a necessary precondition for wise management and conservation of forests. In the Peruvian Amazonia, this goal is not yet achieved mostly because adequate species inventory data has been lacking. The recently started Peruvian [...] Read more.
Recognition of the spatial variation in tree species composition is a necessary precondition for wise management and conservation of forests. In the Peruvian Amazonia, this goal is not yet achieved mostly because adequate species inventory data has been lacking. The recently started Peruvian national forest inventory (INFFS) is expected to change the situation. Here, we analyzed genus-level variation, summarized through non-metric multidimensional scaling (NMDS), in a set of 157 INFFS inventory plots in lowland to low mountain rain forests (<2000 m above sea level) using Landsat satellite imagery and climatic, edaphic, and elevation data as predictor variables. Genus-level floristic patterns have earlier been found to be indicative of species-level patterns. In correlation tests, the floristic variation of tree genera was most strongly related to Landsat variables and secondly to climatic variables. We used random forest regression, under varying criteria of feature selection and cross-validation, to predict the floristic composition on the basis of Landsat and environmental data. The best model explained >60% of the variation along NMDS axes 1 and 2 and 40% of the variation along NMDS axis 3. We used this model to predict the three NMDS dimensions at a 450-m resolution over all of the Peruvian Amazonia and classified the pixels into 10 floristic classes using k-means classification. An indicator analysis identified statistically significant indicator genera for 8 out of the 10 classes. The results are congruent with earlier studies, suggesting that the approach is robust and can be applied to other tropical regions, which is useful for reducing research gaps and for identifying suitable areas for conservation. Full article
(This article belongs to the Special Issue Remote Sensing for Biodiversity Mapping and Monitoring)
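The prediction-and-classification chain described above (random forest regression of NMDS scores, per-pixel prediction, then k-means into floristic classes) can be sketched compactly. The snippet below uses synthetic stand-ins for the plot and pixel data; the array shapes, hyperparameters, and seeds are illustrative assumptions, not the authors' settings:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X_plots = rng.random((157, 12))   # Landsat + environmental predictors per inventory plot
nmds = rng.random((157, 3))       # NMDS axes 1-3 from the floristic ordination

# One random forest per NMDS axis, each axis being a separate response.
models = [
    RandomForestRegressor(n_estimators=500, random_state=0).fit(X_plots, nmds[:, i])
    for i in range(3)
]

X_pixels = rng.random((10_000, 12))   # predictor stack for 450-m map pixels
scores = np.column_stack([m.predict(X_pixels) for m in models])

# Cluster the predicted ordination scores into 10 floristic classes.
classes = KMeans(n_clusters=10, random_state=0, n_init=10).fit_predict(scores)
```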
Figure 1. Geographical characteristics of the study area in Peruvian Amazonia. (a) Landsat TM/ETM+ composite (bands 5, 4, and 3 assigned to red, green, and blue, respectively) of Peruvian Amazonia with the sampling setup of the inventory plots and the five latitudinally defined blocks used in cross-validation of the models; (b) mean annual temperature; (c) annual precipitation; (d) elevation; and (e) cation exchange capacity (CEC) in the study area. (f) Each inventory plot in hydromorphic (16 plots) or mountain forests (28 plots) consisted of 10 round subunits of 0.05 ha separated by 30 m, and (g) in lowland forests (113 plots) of 7 rectangular subunits of 0.1 ha (20 m × 50 m) separated by 75 m. The circle with a radius of 450 m in (f) and (g) depicts the buffer we used for extracting Landsat and elevation data. Panels (f) and (g) are schematics, not drawn to scale.
Figure 2. Floristic patterns across Peruvian Amazonia. (a–c) Non-metric multidimensional scaling (NMDS) ordination based on floristic dissimilarities between 157 plots (presence–absence data of trees, extended Sørensen dissimilarity). The three NMDS axes together capture 76% of the variation in the original compositional dissimilarity matrix with a stress of 0.19. (d–f) Relationship between each floristic NMDS axis and the variable with the strongest correlation with that axis (Bio6 = minimum temperature of the coldest month, Bio15 = precipitation seasonality). Regression lines were calculated for all data together (continuous line) or after subsetting the plots by ecozone (dashed lines). (g–i) Map of the 157 tree inventory plots indicating the value of the corresponding best predictor (x axis from d, e, and f, respectively). In all panels, colors indicate the different ecozones: green for lowland forests (LF), orange for montane forests (MF), and purple for hydromorphic forests (HF). The full names and abbreviations of all variables are listed in Table S1.
Figure 3. Spatial patterns in the community composition of tree genera across Peruvian Amazonia as represented by the predicted NMDS ordination scores. Predicted values were obtained with random forest regressions using a selection of Landsat and environmental variables (Table 1). The variable selection was based on forward feature selection, and cross-validation was done using either 10 folds of randomly selected calibration data (a–d) or 5 spatially constrained folds (e–h). Panels a–c and e–g show the predicted values for a single NMDS axis, and panels d and h show the predicted values for all three NMDS axes assigned to red, green, and blue, respectively. Non-forest areas and areas above 2000-m elevation were masked out and are shown in white.
Figure 4. K-means clustering of Peruvian Amazonia into 10 classes based on the floristic ordination scores (NMDS axes 1–3) predicted using Landsat and environmental data with random forest regression. Non-forest areas and areas above 2000-m elevation were masked out and are shown in white (a). The letters on top of the maps refer to the Pastaza fan (Pas), mountain forests following the Andes (And), hydromorphic forests or "aguajales" (Agu), seasonally inundated forests (Inu), and tierra firme forests (Tf-N for Northern (b) and Tf-S for Southern (c)).
15 pages, 2385 KiB  
Article
Satellite-Based Observations Reveal Effects of Weather Variation on Rice Phenology
by Hongfei Wang, Aniruddha Ghosh, Bruce A. Linquist and Robert J. Hijmans
Remote Sens. 2020, 12(9), 1522; https://doi.org/10.3390/rs12091522 - 10 May 2020
Cited by 18 | Viewed by 3878
Abstract
Obtaining detailed data on the spatio-temporal variation in crop phenology is critical to increasing our understanding of agro-ecosystem function, such as their response to weather variation and climate change. It is challenging to collect such data over large areas through field observations. The [...] Read more.
Obtaining detailed data on the spatio-temporal variation in crop phenology is critical to increasing our understanding of agro-ecosystem function, such as responses to weather variation and climate change. It is challenging to collect such data over large areas through field observations. The use of satellite remote sensing data has made phenology data collection easier, although the quality and the utility of such data to understand agro-ecosystem function have not been widely studied. Here, we evaluated satellite data-based estimates of rice phenological stages in California, USA by comparing them with survey data and with predictions by a temperature-driven phenology model. We then used the satellite data-based estimates to quantify the crop phenological response to changes in weather. We used time-series of MODIS satellite data and PhenoRice, a rule-based rice phenology detection algorithm, to determine annual planting, heading and harvest dates of paddy rice in California between 2002 and 2017. At the state level, our satellite-based estimates of rice phenology were very similar to the official survey data, particularly for planting and harvest dates (RMSE = 3.8–4.0 days). Satellite-based observations were also similar to predictions by the DD10 temperature-driven phenology model. We analyzed how the timing of these phenological stages varied with concurrent temperature and precipitation over this 16-year time period. We found that planting was earlier in warm springs (−1.4 days °C⁻¹ for mean temperature between mid-April and mid-May) and later in wet years (5.3 days (100 mm)⁻¹ for total precipitation from March to April). Higher mean temperature during the pre-heading period of the growing season advanced heading by 2.9 days °C⁻¹ and shortened the duration from planting to heading by 1.9 days °C⁻¹. The entire growing season was reduced by 3.2 days °C⁻¹ because of the increased temperature during the rice season. Our findings confirm that satellite data can be an effective way to estimate variations in rice phenology and can provide critical information that can be used to improve understanding of agricultural responses to weather variation. Full article
(This article belongs to the Special Issue Remote and Proximal Assessment of Plant Traits)
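The weather-response slopes quoted above come from regressing satellite-derived phenology dates against concurrent weather. Here is a minimal sketch with synthetic stand-in data, tuned only to mimic the reported −1.4 days °C⁻¹ planting sensitivity:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2002, 2018)                        # the 16-year study period
spring_t = 18.0 + rng.normal(0.0, 1.2, years.size)   # mean temp, mid-April to mid-May (deg C)
planting_doy = (135.0 - 1.4 * (spring_t - spring_t.mean())
                + rng.normal(0.0, 2.0, years.size))  # day of year, synthetic

slope, intercept = np.polyfit(spring_t, planting_doy, 1)
print(f"planting-date sensitivity: {slope:.1f} days per deg C")  # close to -1.4
```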
Figure 1. Proportion of years with rice cultivation detected with the satellite data-based estimate for the counties with the most rice-producing area in California between 2002 and 2017.
Figure 2. Satellite data-based estimate of rice area versus the area reported by the United States Department of Agriculture for California from 2002 to 2017 at the (a) state level and (b) county level. Each dot represents a year. Black lines are regression lines and red dashed lines are y = x.
Figure 3. Satellite data-based estimates versus USDA-reported (a) planting, (b) heading, and (c) harvest time expressed as day of year (doy; the number of days since 31 December of the previous year) for rice in California. Black lines are regression lines, and red dashed lines are y = x.
Figure 4. Average (a) planting, (b) heading, and (c) harvest dates, and the length of the (d) pre-heading period, (e) post-heading period, and (f) whole growing season of rice in the Sacramento Valley of California, estimated with MODIS data and PhenoRice between 2002 and 2017.
Figure 5. The relationship between (a) satellite-observed planting date and pre-season precipitation, (b) satellite-observed planting date and the average mean temperature of the pre-season, (c) satellite-observed heading date and the average mean temperature of the pre-heading season, and (d) satellite-observed harvest date and the average mean temperature of the post-heading season.
Figure 6. The relationship between (a) the length of the pre-heading period derived from satellite observations and the average mean temperature of the pre-heading season, (b) the post-heading season length derived from satellite observations and the average mean temperature of the post-heading season, and (c) the total growing season length derived from satellite observations and the average mean temperature of the rice growing season, for rice in California.
17 pages, 14815 KiB  
Article
Sea Fog Detection Based on Normalized Difference Snow Index Using Advanced Himawari Imager Observations
by Han-Sol Ryu and Sungwook Hong
Remote Sens. 2020, 12(9), 1521; https://doi.org/10.3390/rs12091521 - 10 May 2020
Cited by 18 | Viewed by 8823
Abstract
Many previous studies have attempted to distinguish fog from clouds using low-orbit and geostationary satellite observations from visible (VIS) to longwave infrared (LWIR) bands. However, clouds and fog have often been misidentified because of their similar spectral features. Recently, advanced meteorological geostationary satellites [...] Read more.
Many previous studies have attempted to distinguish fog from clouds using low-orbit and geostationary satellite observations from visible (VIS) to longwave infrared (LWIR) bands. However, clouds and fog have often been misidentified because of their similar spectral features. Recently, advanced meteorological geostationary satellites with improved spectral, spatial, and temporal resolutions, including Himawari-8/9, GOES-16/17, and GeoKompsat-2A, have become operational. Accordingly, this study presents an improved algorithm for detecting daytime sea fog using one VIS and one near-infrared (NIR) band of the Advanced Himawari Imager (AHI) of the Himawari-8 satellite. We propose a regression-based relationship for sea fog detection using a combination of the Normalized Difference Snow Index (NDSI) and reflectance at the green band of the AHI. Several case studies, including various foggy and cloudy weather conditions in the Yellow Sea for three years (2017–2019), have been performed. The results of our algorithm showed a successful detection of sea fog without any cloud mask information. The pixel-level comparison results with the sea fog detection based on the shortwave infrared (SWIR) band (3.9 μm) and the brightness temperature difference between SWIR and LWIR bands of the AHI showed high statistical scores for probability of detection (POD), post agreement (PAG), critical success index (CSI), and Heidke skill score (HSS). Consequently, the proposed algorithms for daytime sea fog detection can be effective in daytime, particularly twilight, conditions, for many satellites equipped with VIS and NIR bands. Full article
(This article belongs to the Section Ocean Remote Sensing)
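The detection test itself is compact: compute the two-band index, predict its value from the green-band reflectance via the fitted regression, and flag pixels that depart from the prediction. The sketch below assumes the index takes the usual normalized-difference form and uses placeholder coefficients and a placeholder threshold rather than the published fit:

```python
import numpy as np

def ndsi_vis_nir(vis, nir):
    """NDSI-style index from one VIS and one NIR reflectance; the
    (vis - nir) / (vis + nir) form is this sketch's assumption."""
    return (vis - nir) / (vis + nir + 1e-9)   # epsilon guards against 0/0

def sea_fog_mask(green, nir, coeffs=(-0.5, 0.8), thresh=0.1):
    """Flag pixels whose observed NDSI exceeds the regression-predicted
    NDSI_cal(green reflectance) by more than a threshold."""
    ndsi_obs = ndsi_vis_nir(green, nir)
    a, b = coeffs                  # linear stand-in for the fitted relation
    ndsi_cal = a + b * green       # prediction from green-band reflectance
    return (ndsi_obs - ndsi_cal) > thresh
```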
Figure 1. Study area of the Yellow Sea. The Red-Green-Blue (RGB) image observed from the Advanced Himawari Imager (AHI) sensor shows sea fog and clouds on 14 March 2018, 00:30 UTC (09:30 Korean Standard Time (KST)).
Figure 2. Example of NDSI_obs on 14 March 2018, 00:30 UTC (09:30 KST).
Figure 3. Case study examples. (a) AHI true-color images and (b) distributions of sea fog and sea surface pixels versus all pixels in the Yellow Sea in (a). Pixels in the land area were excluded.
Figure 4. (a) Area selection of sea fog, clouds, and sea surface in the NDSI_obs data. (b) Distributions of sea fog, cloud, and sea surface pixels, and all pixels in the Yellow Sea, in the NDSI_diff–NDSI_obs plane.
Figure 5. Examples of (a) brightness temperature (TB) at AHI 3.9 μm and (b) brightness temperature differences (BTD) between AHI 3.9 μm and AHI 11.2 μm, (c) histogram of BTD and the threshold value determined using the Otsu method, and (d) the sea fog area after applying the threshold value.
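The Otsu step in panel (c) selects the BTD threshold that maximizes the between-class variance of the histogram. A minimal NumPy version follows (the bin count is an assumption):

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Return the histogram threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)              # class-0 probability up to each bin
    m = np.cumsum(p * centers)     # cumulative mean
    mt = m[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(sigma_b2)]
```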
Figure 6. Flow chart of the proposed fog detection algorithm.
Figure 7. (a) NDSI_obs − NDSI_cal and (b) results for sea fog detection using the proposed algorithm.
Figure 8. Quantitative comparison of the proposed sea fog detection algorithm and the AHI BTD sea fog algorithm. POD, PAG, CSI, and HSS values were 0.954, 0.887, 0.851, and 0.911, respectively.
Figure 9. Results of the proposed sea fog detection algorithm for foggy and partly cloudy weather cases on (a) 24 February 2019, at 02:00 UTC, (b) 1 March 2019, at 01:30 UTC, and (c) 26 March 2019, at 02:00 UTC. Green and cyan indicate sea fog pixels detected only by the AHI 3.9 μm BTD algorithm and only by the proposed algorithm, respectively; purple indicates pixels detected as sea fog by both algorithms.
Figure 10. Sea fog and cloud mixtures. Results of the proposed sea fog detection algorithm on (a) 9 April 2018, at 04:00 UTC, (b) 6 June 2018, at 00:30 UTC, and (c) 4 June 2019, at 01:30 UTC. Green and cyan indicate sea fog pixels detected only by the AHI 3.9 μm BTD algorithm and only by the proposed algorithm, respectively; purple indicates pixels detected as sea fog by both algorithms.
Figure 11. Temporal variation of sea fog on 26 March 2019, from 00:00 UTC to 04:00 UTC at 30-min intervals: RGB image, brightness temperature at 3.9 μm, NDSI_diff, sea fog detection results, and the statistical comparison between the two sea fog algorithms.
Figure 12. Case of sea fog in twilight conditions (27 May 2018, at 22:30 UTC): (a) RGB image, (b) brightness temperature at AHI 3.9 μm, (c) the proposed sea fog algorithm, and (d) the results of the two sea fog algorithms.
Figure 13. Temporal variation of sea fog from 13 March 2018, at 22:30 UTC to 14 March 2018, at 03:00 UTC at 30-min intervals: RGB image, brightness temperature at 3.9 μm, NDSI_obs − NDSI_cal, sea fog detection results, and the statistical comparison between the two sea fog algorithms.
Figure 14. Cases with no SWIR band observation available. Results for sea fog detection using the proposed algorithm on (a) 11 March 2017, at 01:20 UTC, (b) 30 April 2017, at 22:30 UTC, and (c) 1 June 2017, at 02:20 UTC.
31 pages, 8300 KiB  
Article
Infrared Small Target Detection via Non-Convex Tensor Rank Surrogate Joint Local Contrast Energy
by Xuewei Guan, Landan Zhang, Suqi Huang and Zhenming Peng
Remote Sens. 2020, 12(9), 1520; https://doi.org/10.3390/rs12091520 - 9 May 2020
Cited by 58 | Viewed by 4180
Abstract
Small target detection is a crucial technique that restricts the performance of many infrared imaging systems. In this paper, a novel detection model of infrared small target via non-convex tensor rank surrogate joint local contrast energy (NTRS) is proposed. To improve the latest [...] Read more.
Small target detection is a crucial technique that restricts the performance of many infrared imaging systems. In this paper, a novel detection model for infrared small targets via a non-convex tensor rank surrogate joint local contrast energy (NTRS) is proposed. To improve the latest infrared patch-tensor (IPT) model, a non-convex tensor rank surrogate merging the tensor nuclear norm (TNN) and the Laplace function is utilized for the low-rank background patch-tensor constraint; it has the useful property of adaptively allocating a weight to every singular value and can better approximate the ℓ0-norm. Considering that the local prior map can be treated as a saliency map, we introduce a local contrast energy feature into the IPT detection framework to weight the target tensor, which can efficiently suppress the background and preserve the target simultaneously. Besides, to remove structured edges more thoroughly, we suggest an additional structured sparse regularization term using the ℓ1,1,2-norm of the third-order tensor. To solve the proposed model, an efficient optimization scheme based on the alternating direction method of multipliers (ADMM) with fast computation of the tensor singular value decomposition is designed. Finally, an adaptive threshold is utilized to extract real targets from the reconstructed target image. A series of experimental results shows that the proposed method has robust detection performance and outperforms the other advanced methods. Full article
(This article belongs to the Special Issue Computer Vision and Machine Learning Application on Earth Observation)
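The non-convex low-rank step can be pictured as a weighted singular value shrinkage applied slice-wise in the Fourier domain of the t-SVD, with weights taken from the gradient of the Laplace surrogate so that large singular values (true background structure) are shrunk less than small ones. The sketch below is one such sub-step under stated assumptions (placeholder tau and eps), not the full NTRS ADMM solver:

```python
import numpy as np

def laplace_weighted_tsvd_shrink(X, tau=1.0, eps=0.1):
    """Weighted singular value thresholding of each Fourier-domain
    frontal slice of a third-order tensor X (the t-SVD setting)."""
    Xf = np.fft.fft(X, axis=2)                  # t-SVD operates slice-wise here
    for k in range(Xf.shape[2]):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        w = np.exp(-s / eps) / eps              # gradient of 1 - exp(-s/eps)
        s_new = np.maximum(s - tau * w, 0.0)    # adaptive soft-thresholding
        Xf[:, :, k] = (U * s_new) @ Vh
    return np.real(np.fft.ifft(Xf, axis=2))
```

Because the weight w decays as a singular value grows, dominant singular values pass nearly untouched, which is how the surrogate approximates the ℓ0-norm better than the uniform shrinkage implied by the plain TNN.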
Figure 1. T-SVD illustration of a third-order tensor X ∈ ℝ^(m1×m2×m3). U, S, and V are third-order tensors of size m1×m1×m3, m1×m2×m3, and m2×m2×m3, respectively; S is also an f-diagonal tensor.
Figure 2. Tensor construction. On the left, local patch-images are obtained by a sliding window (red arrow); on the right, these patches are stacked into a third-order tensor. m3 is the number of patches.
Figure 3. Comparison of the contributions of the ℓ1-norm, the Laplace function, and the ℓ0-norm to the rank with respect to a varying singular value.
Figure 4. Construction of the local contrast element: (a) a sub-block in the whole image; and (b) the diffusion structure of the sub-block. B^k_(x,y) is the set of all neighborhood pixels at a distance k from the center pixel (x,y), indicated with dotted lines of different colors.
Figure 5. (a) Raw infrared image; (b) local prior weight map of PSTNN, where some strong edge residuals are left in the red circle; and (c) the proposed local prior weight map.
Figure 6. The flowchart of the proposed infrared small target detection method. First, the local prior weight map is generated using the local contrast energy feature. Second, the original patch-tensor and the local prior weight patch-tensor are constructed. Third, the low-rank background tensor B, the structured sparse edge tensor S, and the sparse target tensor T are separated by the proposed model. Then, the target image is reconstructed and small targets are extracted by an adaptive threshold segmentation algorithm.
Figure 7. The 20 original infrared images of different scenes used in the experiment. A red square indicates a real target in the area.
Figure 8. Corresponding 3D mesh views of the 20 different scenes in Figure 7. The images in (a–t) correspond to Figure 7a–t, respectively. The x-axis and y-axis represent the column and row coordinates of the pixels of the original image, respectively, while the z-axis represents the gray value of the pixels.
Figure 9. An infrared small target and its local adjacent area. The yellow rectangle denotes the small target area and the red rectangle denotes its local adjacent area. On the right is an enlarged view of the red rectangle.
Figure 10. Detection performance of the proposed model under different parameters for the six tested sequences: (a1–a6) ROC curves under different patch sizes on Sequences 1–6; (b1–b6) ROC curves under different sliding steps on Sequences 1–6; (c1–c6) ROC curves under different penalty factors on Sequences 1–6; and (d1–d6) ROC curves under different compromising parameters on Sequences 1–6.
Figure 11. Twenty reconstructed target images produced by the proposed model. Red squares indicate the detected true targets. (a–t) The processed results corresponding to the original images in Figure 7a–t, respectively. All true targets were accurately detected without any false alarms.
Figure 12. 3D mesh views of the reconstructed target images in Figure 11. The images in (a–t) correspond to Figure 11a–t, respectively. In each subfigure, the x-axis and y-axis represent the column and row coordinates of the pixels of the original image, respectively, while the z-axis represents the gray value of the pixels.
Figure 13. Twenty target images processed by the PSTNN model. The images in (a–t) correspond to the original images in Figure 7a–t, respectively. Red squares denote the detected targets. Yellow rectangles and squares denote residual clutter that may cause false alarms; their enlarged views are also shown to facilitate observation.
Figure 14. Anti-noise experimental results for the scene in Figure 7o using the proposed model: (a1–e1) image samples with added white Gaussian noise with standard deviations of 4, 8, 12, 16, and 20; (a2–e2) 3D mesh views of the image samples; (a3–e3) the reconstructed target images; and (a4–e4) 3D mesh views of the reconstructed target images. Red squares denote the detected true targets. The yellow square denotes residual clutter in the reconstructed target image. In (a2–e2, a4–e4), the horizontal plane represents the coordinate location of the image pixels, and the height dimension represents the gray value of the pixels.
Figure 15. Anti-noise experimental results for the scene in Figure 7q using the proposed model: (a1–e1) image samples with added white Gaussian noise with standard deviations of 4, 8, 12, 16, and 20; (a2–e2) 3D mesh views of the image samples; (a3–e3) the reconstructed target images; and (a4–e4) 3D mesh views of the reconstructed target images. Red squares denote the detected true targets. In (a2–e2, a4–e4), the horizontal plane represents the coordinate location of the image pixels, and the height dimension represents the gray value of the pixels.
Figure 16. ROC curves of detection results under different methods on the six tested sequences: (a) Sequence 1; (b) Sequence 2; (c) Sequence 3; (d) Sequence 4; (e) Sequence 5; and (f) Sequence 6. The blue dash-dot line, green solid line, cyan dashed line, black dotted line, magenta dash-dot line, yellow dashed line, and red solid line correspond to Top-hat, LCM, MPCM, IPI, RIPT, PSTNN, and our method, respectively.
21 pages, 5557 KiB  
Article
Canopy Height Estimation Using Sentinel Series Images through Machine Learning Models in a Mangrove Forest
by Sujit Madhab Ghosh, Mukunda Dev Behera and Somnath Paramanik
Remote Sens. 2020, 12(9), 1519; https://doi.org/10.3390/rs12091519 - 9 May 2020
Cited by 55 | Viewed by 8713
Abstract
Canopy height serves as a good indicator of forest carbon content. Remote sensing-based direct estimations of canopy height are usually based on Light Detection and Ranging (LiDAR) or Synthetic Aperture Radar (SAR) interferometric data. LiDAR data is scarcely available for the Indian tropics, [...] Read more.
Canopy height serves as a good indicator of forest carbon content. Remote sensing-based direct estimations of canopy height are usually based on Light Detection and Ranging (LiDAR) or Synthetic Aperture Radar (SAR) interferometric data. LiDAR data are scarcely available for the Indian tropics, while interferometric SAR data from commercial satellites are costly. High temporal decorrelation makes freely available Sentinel-1 interferometric data mostly unsuitable for tropical forests. Alternatively, other remote sensing and biophysical parameters have shown good correlation with forest canopy height. The study objective was to establish and validate a methodology by which forest canopy height can be estimated from SAR and optical remote sensing data using machine learning models, i.e., Random Forest (RF) and Symbolic Regression (SR). Here, we analysed the potential of Sentinel-1 interferometric coherence and Sentinel-2 biophysical parameters to propose a new method for estimating canopy height in the study site of the Bhitarkanika wildlife sanctuary, which has mangrove forests. The results showed that interferometric coherence and the biophysical variables (Leaf Area Index (LAI) and Fraction of Vegetation Cover (FVC)) have reasonable correlation with canopy height. The RF model showed a Root Mean Squared Error (RMSE) of 1.57 m and an R2 value of 0.60 between observed and predicted canopy heights, whereas the SR model through genetic programming demonstrated better values, with an RMSE of 1.48 m and an R2 of 0.62. The SR also established an interpretable model, which is not possible with most other machine learning algorithms. The FVC was found to be an essential variable for predicting forest canopy height. The canopy height maps correlated with ICESat-2 estimated canopy height, albeit modestly. The study demonstrated the effectiveness of Sentinel series data and the machine learning models in predicting canopy height. Therefore, in the absence of commercial and rare data sources, the methodology demonstrated here offers a plausible alternative for forest canopy height estimation. Full article
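The RF branch of the workflow reduces to a regression on three predictors. Here is a hedged sketch with synthetic data; the feature names mirror the paper, while the data, hyperparameters, and out-of-bag evaluation are illustrative choices:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
X = rng.random((120, 3))                                  # columns: coherence, LAI, FVC
height_m = 2.0 + 10.0 * X[:, 2] + rng.normal(0, 1, 120)   # synthetic canopy heights (m)

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, height_m)

oob_rmse = np.sqrt(mean_squared_error(height_m, rf.oob_prediction_))
print(dict(zip(["coherence", "LAI", "FVC"], rf.feature_importances_)))
print(f"out-of-bag RMSE: {oob_rmse:.2f} m")
```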
Figure 1. Location of the Bhitarkanika Wildlife Sanctuary (Odisha state) on the eastern coast of India. The false color composite image of BWS is shown with the boundary demarcated, over which the plot positions with measured canopy heights are overlaid as green markers.
Figure 2. Methodology flow diagram depicting the steps adopted for generating canopy height maps and comparison of model outputs.
Figure 3. Distribution of field-measured canopy heights displayed against their frequency of occurrence.
Figure 4. (a) Coherence, (b) Fraction of Vegetation Cover (FVC), and (c) Leaf Area Index (LAI) images, and (d) Digital Elevation Model (DEM), with a vegetated and a non-vegetated patch shown under red and yellow ellipses, respectively.
Figure 5. Frequency distribution of the input images for the canopy height model: (a) coherence; (b) FVC; (c) LAI; (d) DEM.
Figure 6. Progress of Symbolic Regression (SR) over time for canopy height model establishment, including the relationship assumed for the regression.
Figure 7. Correlation plot of canopy heights between field measurements and SR model-based predictions.
Figure 8. (a) Variable importance of the RF model for canopy height estimation; (b) correlation plot of canopy height between field measurements and RF model-based predictions.
Figure 9. (a,b) Canopy height map using the SR model and its histogram distribution; (c,d) canopy height map using the RF model and its histogram distribution; (e,f) canopy height difference map derived from the two models' estimates (SR − RF) and its histogram distribution.
Figure 10. (a) ICESat-2 footprints shown over the BWS on the canopy height map from the SR prediction, and (b) frequency distribution of canopy height pixels from ICESat-2 data.
Figure 11. Correlation plots of canopy height values between ICESat-2 footprints and (a) RF model and (b) SR model predictions.
17 pages, 9856 KiB  
Article
Nonlinear Relationship Between the Yield of Solar-Induced Chlorophyll Fluorescence and Photosynthetic Efficiency in Senescent Crops
by Leizhen Liu, Wenhui Zhao, Qiu Shen, Jianjun Wu, Yanguo Teng, Jianhua Yang, Xinyi Han and Feng Tian
Remote Sens. 2020, 12(9), 1518; https://doi.org/10.3390/rs12091518 - 9 May 2020
Cited by 15 | Viewed by 3249
Abstract
It has been demonstrated that solar-induced chlorophyll fluorescence (SIF) is linearly related to the primary production of photosynthesis (GPP) in various ecosystems. However, it is unknown whether such linear relationships have been established in senescent crops. SIF and GPP can be expressed as [...] Read more.
It has been demonstrated that solar-induced chlorophyll fluorescence (SIF) is linearly related to gross primary production (GPP) in various ecosystems. However, it is unknown whether such a linear relationship holds in senescent crops. SIF and GPP can be expressed as the products of absorbed photosynthetically active radiation (APAR) with the SIF yield and the photosystem II (PSII) operating efficiency, respectively. Thus, the relationship between SIF and GPP can be represented by the relationship between the SIF yield and the PSII operating efficiency when the APAR has the same value. Therefore, we analyzed the relationship between the SIF yield and the PSII operating efficiency to address the abovementioned question. Here, diurnal measurements of the canopy SIF (760 nm, F760) of soybean and sweet potato were manually measured and used to calculate the SIF yield. The PSII operating efficiency was calculated from measurements of the chlorophyll fluorescence at the leaf level using the FluorImager chlorophyll fluorescence imaging system. Meanwhile, field measurements of the gas exchange and other physiological parameters were also performed using commercial-grade devices. The results showed that the SIF yield was not linearly related to the PSII operating efficiency at the diurnal scale, reflecting a nonlinear relationship between SIF and GPP. This nonlinear relationship mainly resulted from the heterogeneity and diurnal dynamics of the PSII operating efficiency and from the intrinsic diurnal changes in the maximum efficiency of PSII photochemistry and the proportion of open PSII centers. Intensifying respiration was another factor that complicated the response of photosynthesis to variation in environmental conditions and negatively impacted the relationship between the SIF yield and the PSII operating efficiency. The nonlinear relationship between the SIF yield and PSII efficiency might yield errors in the estimation of GPP from SIF measurements of senescent crops. Full article
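The yield decomposition framing the study, SIF = APAR × Fy760 on the fluorescence side and Fq′/Fm′ = (Fm′ − F′)/Fm′ on the photochemistry side, can be made concrete in a few lines. The numbers below are illustrative diurnal samples, not measurements from the experiment:

```python
import numpy as np

# Illustrative diurnal samples (arbitrary but plausible magnitudes).
apar = np.array([200.0, 600.0, 1000.0, 1200.0, 900.0, 400.0])  # absorbed PAR
f760 = np.array([0.4, 1.1, 1.7, 2.0, 1.5, 0.7])                # canopy SIF at 760 nm

fy760 = f760 / apar                # SIF yield, since SIF = APAR * Fy760

# PSII operating efficiency from pulse-amplitude fluorometry: (Fm' - F') / Fm'.
fm_prime = np.array([1500.0, 1400.0, 1300.0, 1250.0, 1350.0, 1450.0])
f_prime = np.array([900.0, 950.0, 1000.0, 1020.0, 980.0, 930.0])
phi_psii = (fm_prime - f_prime) / fm_prime

# A linear SIF-GPP relation would require these two yields to covary tightly
# at equal APAR; the paper finds that this breaks down in senescent crops.
print(np.corrcoef(fy760, phi_psii)[0, 1])
```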
Figure 1. The relationship between the received photosynthetically active radiation (PAR) and the accumulated irradiance from 761 to 763 nm. All measurements were collected for wheat from 20 April to 19 May 2017.
Figure 2. Comparison of canopy pigment contents between soybean and sweet potato. Differences between soybean and sweet potato were significant at the P < 0.05 level (one-way ANOVA, Tukey test). Error bars show the standard error of the mean.
Figure 3. Diurnal patterns of F760, AF760, and simultaneously measured PAR in (a) soybean and (b) sweet potato.
Figure 4. Diurnal patterns of (a,b) Fy760 accompanied by the absorbed photosynthetically active radiation (APAR), and (c,d) PSII operating efficiency (Fq′/Fm′) accompanied by NPQ, in soybean (upper panels) and sweet potato (lower panels). The APAR values in the left column were calculated based on Equation (2).
Figure 5. Relationship of the PSII operating efficiency (Fq′/Fm′) with (a) the solar-induced chlorophyll fluorescence (SIF) yield (Fy760) and (b) non-photochemical quenching (NPQ = Fm/Fm′ − 1), based on the diurnal measurements of soybean (black) and sweet potato (red).
Figure 6. Diurnal patterns of the gas exchange measurements: (a,b) the net photosynthetic rate (A) accompanied by photosynthetically active radiation (PAR); (c,d) intercellular carbon dioxide concentration (Ci) accompanied by stomatal conductance (gsw). The upper panels show the soybean results, and the lower panels show the sweet potato results. The PAR in the left column was measured synchronously with the gas exchange measurements using the Li-6800. Significant differences (P < 0.05) between mean values are marked by different letters above the symbols; means sharing a letter do not differ significantly. For example, in panel (a), the labels "a" and "ab" on the first and second measurements of A indicate an insignificant difference, while "a" and "bc" on the first and third indicate a significant difference.
Figure 7. Diurnal patterns of Fy760 accompanied by the photosynthesis parameters and gas exchange measurements. The upper panels show the soybean results, and the lower panels show the sweet potato results. Images of Fq′/Fm′ (PSII operating efficiency), Fv′/Fm′ (PSII maximum efficiency), and Fq′/Fv′ (PSII efficiency factor) are from soybean and sweet potato.
Figure 8. Diurnal patterns of intercellular carbon dioxide concentration (Ci) accompanied by air temperature and short-wave radiation for (a) soybean and (b) sweet potato. Air temperature and short-wave radiation were collected at one-hour intervals at a meteorological station ~300 m from the experimental site. Significant differences (P < 0.05) between mean values are marked by different letters above the symbols; means sharing a letter do not differ significantly. For example, in panel (a), the labels "ab" and "bc" on the second and third measurements of Ci indicate an insignificant difference, while "ab" and "c" on the second and fourth indicate a significant difference.
31 pages, 5907 KiB  
Article
Short-Term Ecogeomorphic Evolution of a Fluvial Delta from Hindcasting Intertidal Marsh-Top Elevations (HIME)
by Brittany C. Smith, Kevan B. Moffett and David Mohrig
Remote Sens. 2020, 12(9), 1517; https://doi.org/10.3390/rs12091517 - 9 May 2020
Viewed by 2948
Abstract
Understanding how delta islands grow and change at contemporary, interannual timescales remains a key scientific goal and societal need, but the high-resolution, high frequency morphodynamic data that would be most useful for this are as yet logistically prohibitive. The recorded water levels needed [...] Read more.
Understanding how delta islands grow and change at contemporary, interannual timescales remains a key scientific goal and societal need, but the high-resolution, high-frequency morphodynamic data that would be most useful for this are as yet logistically prohibitive. The recorded water levels needed for relative elevation analysis are also often lacking. This paper presents a new approach for hindcasting intertidal marsh-top elevations (HIME) to resolve ecogeomorphic change, even in a young, rapidly changing fluvial delta setting, at sub-decadal temporal resolution and at the spatial resolution of widely available optical remote sensing imagery (e.g., 30 m Landsat). The HIME method first calculates: (i) the probability of land exposure in a set of historical imagery from a user-defined discrete timespan (e.g., months or years); (ii) the probability of water level non-exceedance from water level records, which need not be complete nor coincident with the imagery; and (iii) the systematic variation in local mean water level with distance along the primary hydraulic gradient. The HIME method then combines these inputs to estimate a marsh-top elevation map for each historical timespan of interest. The method was developed, validated, applied, and the results analyzed to investigate the time-lapse evolution of the Wax Lake Delta in Louisiana, USA, every three years, over two decades (1993–2013). The hindcast maps of delta island extents and elevations evidenced ecogeomorphic system self-organization around four stable attractors, or elevation platforms, at about −0.3 m (subtidal), 0.2 m, 0.4 m, and 0.9 m (supratidal) NAVD88. The HIME results also yielded a time series of net subaerial sediment accumulation, and specific locations and magnitudes of gains and losses, at scales from 30 m to delta-wide (~100 km²) and 6 to 21 years. Average subaerial net sediment accumulation at the Wax Lake Delta (WLD) was estimated as 0.6 cm/yr during the study period. Finally, multiple linear regression models were successfully trained on the HIME elevation maps to model evolving delta island morphologies based on simple geometric factors, such as distance down-delta and position on a delta island; the models also successfully reproduced an average delta topset slope of 1.4 cm. Overall, this study's development and application of the HIME method added detailed insights into recent, transient ecogeomorphological change at the WLD, and demonstrated the potential of the new approach for accurately reconstructing past intertidal topographies and dynamic change. Full article
(This article belongs to the Special Issue Remote Sensing of Estuarine, Lagoon and Delta Environments)
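The combination at the heart of HIME inverts the water-level distribution at each pixel's exposure probability: a pixel is exposed exactly when the water level sits below its marsh-top elevation, so its exposure frequency equals the non-exceedance probability at that elevation. A minimal sketch follows, using an empirical quantile as a stand-in for the fitted non-exceedance distribution of Step (ii) and omitting the along-gradient water-level adjustment of Step (iii):

```python
import numpy as np

def hime_elevation(exposure_prob, water_levels):
    """Hindcast a pixel's marsh-top elevation as the water level not
    exceeded with probability equal to the pixel's exposure probability."""
    return np.quantile(water_levels, exposure_prob)

# Water levels (m NAVD88) recorded during one timeframe, and a pixel seen
# exposed in 70% of that timeframe's usable images.
levels = np.array([-0.2, 0.0, 0.1, 0.2, 0.3, 0.35, 0.4, 0.5, 0.67, 0.9])
print(hime_elevation(0.7, levels))   # the hindcast marsh-top elevation (~0.4 m)
```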
Figure 1. Study area. (a) The Wax Lake Delta (WLD, red square) is located at the terminus of the Wax Lake Outlet (blue line), which diverts water from the Atchafalaya River (purple line) in the U.S. state of Louisiana. The Atchafalaya is a distributary channel for the Mississippi (green line) and Red (red line) Rivers. (b) Image of the WLD from 28-Sep-2010 by Landsat [88], for water level y = 0.35 m (NAVD88) at the Camp Island gage (yellow dot) [93]. (c) Image of the WLD from 24-Apr-2011 by Landsat [88], for water level y = 0.67 m (NAVD88) at the Camp Island gage. The WLD Pintail Island test case area is outlined in white.
Full article ">Figure 2
<p>Schematic of Hindcasting Intertidal Marsh-top Elevation (HIME) method: Step 1, calculate a time series of raster exposure probability models from remote sensing images; Step 2, fit a water level cumulative probability of non-exceedance distribution function to available water level data; Step 3, combine the results of Step 1 and Step 2 to produce a time series of estimated marsh-top elevation models.</p>
Full article ">Figure 3
<p>Example geometric position maps. Distance of each marsh pixel in the delta to (<b>a</b>) the delta apex, (<b>b</b>) its island’s apex, (<b>c</b>) its island’s nearest channel edge, (<b>d</b>) its island’s central axis (i.e., midline, starting ¼ of the way down the island from the island apex). Units for each variable are meters. Examples shown for the 2011–2013 timeframe.</p>
Full article ">Figure 4
<p>Historical marsh-top elevation models for Pintail Island, resulting from the HIME method applied to RGB Landsat imagery from 1999–2010.</p>
Full article ">Figure 5
<p>Validation of HIME exposure probabilities for Pintail Island. (<b>a</b>) LIDAR 2009 exposure probability map at 1 m pixel resolution. (<b>b</b>) 2008–2010 HIME exposure probability model at 30 m pixel resolution, masked to LIDAR extent. (<b>c</b>) Comparison of LIDAR-derived (X-axis) and spatially co-located HIME (Y-axis) exposure probabilities. Blue points are medians and blue horizontal bars are the interquartile ranges of the LIDAR 2009-derived exposure probability pixels that were co-located with the pixels of each of the <span class="html-italic">n</span> HIME exposure probability categories. Solid black line is linear regression, dashed red line is 1:1 line.</p>
Full article ">Figure 6
<p>(<b>a</b>) Map of HIME marsh-top elevations for WLD in the 2011–2013 timeframe. Maps for the other six timeframes are in the <a href="#app1-remotesensing-12-01517" class="html-app">Supplementary Materials, Figure S3</a>. (<b>b</b>) Time series of probability distributions of WLD HIME elevations (in 0.05 m bins) from 1993–1995 (left) to 2011–2013 (right).</p>
Full article ">Figure 7
<p>(<b>a</b>) Map of net marsh top elevation change over the study period, from 1993–1995 to 2011–2013. (<b>b</b>) Distribution of probabilities of net HIME marsh top elevation change over the study period. (<b>c</b>) Maps of net estimated marsh top elevation change between consecutive timeframes; net aggradation in green-yellow-red, net erosion in light blue-dark blue.</p>
Full article ">Figure 8
<p>Probability distributions of WLD elevations (m, in 0.05 m bins) in each 1 km-band of distance from the delta apex (color-coded), and change within each band over time (panels, left to right).</p>
Full article ">Figure 9
<p>Progression over time of WLD average (<b>a</b>) HIME marsh top elevation, (<b>b</b>) subaerial extent, and (<b>c</b>) relative subaerial sediment volume. Time series of WLD land area extent and envelopes of maximum and minimum interannual land areas, from regression model and data by Allen et al. (2012) [<a href="#B90-remotesensing-12-01517" class="html-bibr">90</a>], are superimposed as gray dashed lines on (<b>b</b>).</p>
Full article ">Figure 10
<p>(<b>a</b>) Net change in WLD subaerial sediment volume over two decades, from the initial (1993–1995) to final (2011–2013) HIME timeframes of study, as experienced by discrete marsh top elevation zones (X-axis calculated as bins of <span class="html-italic">z</span> ± 0.05 m width); the marsh elevation zones (X-axis) were assessed for either the initial (elev<sub>i</sub>) or final (elev<sub>f</sub>) timeframe. Net sediment accumulation at a given elevation is shaded, net sediment volume loss is hatched. (<b>b</b>) Elevation histograms of occurrence of six key vegetation species/groups at WLD, as developed from a 400-point random sample of 30-m pixels on Pintail Island by Carle, Sasser, and Roberts (2015) [<a href="#B32-remotesensing-12-01517" class="html-bibr">32</a>]. Adapted from Carle, Sasser, and Roberts (2015). Reproduced with permission from the Coastal Education and Research Foundation, Inc, Seattle, WA, USA. (<b>c</b>) Boxplots of net HIME elevation change, as experienced by WLD pixels beginning within discrete elevation zones (<span class="html-italic">z<sub>i</sub></span> ± 0.05 m) at the initial timeframe. (<b>d</b>) Boxplots of net HIME elevation change, as experienced by WLD pixels ending within discrete elevation zones at the final timeframe. Boxplots illustrate median (central points, connected by line), interquartile range (IQR: 25th to 75th percentiles as box bottom and top), 1.5 × IQR whisker lengths, and outliers (dots); boxplots are based on varying numbers of points per elevation zone. All elevations in meters NAVD88.</p>
Full article ">
20 pages, 3882 KiB  
Article
Retrieval of Secchi Disk Depth in Turbid Lakes from GOCI Based on a New Semi-Analytical Algorithm
by Shuai Zeng, Shaohua Lei, Yunmei Li, Heng Lyu, Jiafeng Xu, Xianzhang Dong, Rui Wang, Ziqian Yang and Jianchao Li
Remote Sens. 2020, 12(9), 1516; https://doi.org/10.3390/rs12091516 - 9 May 2020
Cited by 26 | Viewed by 3459
Abstract
The accurate remote estimation of the Secchi disk depth (ZSD) in turbid waters is essential for monitoring the ecological environment of lakes. Using the field measured ZSD and the remote sensing reflectance (Rrs(λ)) data, a new semi-analytical algorithm (denoted [...] Read more.
The accurate remote estimation of the Secchi disk depth (ZSD) in turbid waters is essential for monitoring the ecological environment of lakes. Using the field measured ZSD and the remote sensing reflectance (Rrs(λ)) data, a new semi-analytical algorithm (denoted as ZSDZ) for retrieving ZSD was developed from Rrs(λ) and applied to Geostationary Ocean Color Imager (GOCI) images of extremely turbid waters. Our results are as follows: (1) ZSDZ performs well in estimating ZSD in turbid water bodies (0.15 m < ZSD < 2.5 m); validated against field data collected in four turbid inland lakes, it achieves a determination coefficient (R2) of 0.89, a mean absolute square percentage error (MAPE) of 22.39%, and a root mean square error (RMSE) of 0.24 m. (2) ZSDZ improved the retrieval accuracy of ZSD in turbid waters and outperformed existing semi-analytical schemes. (3) The developed algorithm was applied to GOCI data to map the hourly variation of ZSD in turbid inland waters; the GOCI-derived results reveal a significant spatiotemporal variation in our study region, driven largely by wind forcing. This study provides a new approach for estimating water transparency in turbid waters, offering important support for the management of inland waters. Full article
(This article belongs to the Special Issue Remote Sensing of Aquatic Ecosystem Health and Processes)
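For orientation, semi-analytical Secchi depth schemes of this family tie transparency to the minimum diffuse attenuation coefficient in the visible domain (Kd(555) for GOCI here). The sketch below uses the Lee et al. (2015) formulation as a stand-in; the ZSDZ coefficients and its Kd(745)-to-Kd(555) pathway are specific to the paper and not reproduced.

```python
# Hedged sketch of a semi-analytical Secchi depth estimate in the style of
# Lee et al. (2015): ZSD = ln(|0.14 - Rrs| / 0.013) / (2.5 * Kd_min).
# Coefficients are Lee et al.'s, not the ZSDZ values fitted in this paper.
import numpy as np

def secchi_depth(kd_min, rrs_at_min):
    # kd_min: minimum visible-band Kd (1/m); rrs_at_min: Rrs at that band (1/sr)
    return np.log(np.abs(0.14 - rrs_at_min) / 0.013) / (2.5 * kd_min)
```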
Show Figures

Graphical abstract
Figure 1. The location of Lake Hongze (B), Lake Dongting (C), Lake Taihu (D), and Lake Erhai (E) in China (A). The symbols represent the sampling date in year-month format.

Figure 2. Comparison between measured and model-derived values of Kd(745) (A). The in-situ relationship between Kd(745) and Kd(555) (B).

Figure 3. Average Rrs (A) and Kd (B) in Lake Hongze (HZ), Lake Taihu (TH), Lake Dongting (DT), and Lake Erhai (EH), respectively. Kd(555) is the minimum Kd value within the visible domain for GOCI (B).

Figure 4. Validation of Kd(555) (A) and ZSD (B) between the measured and predicted values: applicability verification of the new algorithm in Lake Dongting.

Figure 5. The field-measured reflectance corresponding to the Geostationary Ocean Color Imager (GOCI) bands for Lake Taihu (A) and Lake Hongze (B). The performance of the new algorithm based on synchronized images (C).

Figure 6. Hourly variations of GOCI-derived ZSD based on the ZSDZ model in Lake Taihu (A) and Lake Hongze (B). The retrieval results of ZSD in the eastern part of Lake Taihu are not shown due to the dense distribution of aquatic vegetation (see Figure 1).

Figure 7. The relationship between measured ZSD and measured water quality parameters in Lakes Hongze and Taihu: (A) total suspended matter (TSM), (B) inorganic suspended matter (ISM), (C) organic suspended matter (OSM), and (D) Chla.

Figure 8. Comparison of the measured and derived ZSD of the four models: (A) ZSDM14, (B) ZSDR17, and (C) ZSDV6 were provided by previous studies; (D) ZSDZ is provided by this study.

Figure 9. Variation of the mean absolute square percentage error (MAPE) for ZSD, derived from the errors introduced by atmospheric correction.

Figure 10. The relationship between ZSD and wind speed (A) and scatterplot of different levels of wind speed and ZSD (B).
21 pages, 9160 KiB  
Article
A Deep Learning Model for Automatic Plastic Mapping Using Unmanned Aerial Vehicle (UAV) Data
by Gordana Jakovljevic, Miro Govedarica and Flor Alvarez-Taboada
Remote Sens. 2020, 12(9), 1515; https://doi.org/10.3390/rs12091515 - 9 May 2020
Cited by 60 | Viewed by 7184
Abstract
Although plastic pollution is one of the most noteworthy environmental issues nowadays, there is still a knowledge gap in terms of monitoring the spatial distribution of plastics, which is needed to prevent its negative effects and to plan mitigation actions. Unmanned Aerial Vehicles [...] Read more.
Although plastic pollution is one of the most noteworthy environmental issues nowadays, there is still a knowledge gap in terms of monitoring the spatial distribution of plastics, which is needed to prevent its negative effects and to plan mitigation actions. Unmanned Aerial Vehicles (UAVs) can provide suitable data for mapping floating plastic, but most of the methods require visual interpretation and manual labeling. The main goals of this paper are to determine the suitability of deep learning algorithms for automatic floating plastic extraction from UAV orthophotos, to test the possibility of differentiating plastic types, and to explore the relationship between spatial resolution and detectable plastic size, in order to define a methodology for UAV surveys to map floating plastic. Two study areas and three datasets were used to train and validate the models. An end-to-end semantic segmentation algorithm based on the U-Net architecture (ResUNet50) provided the highest accuracy in mapping different plastic materials (F1-score: Oriented Polystyrene (OPS): 0.86; Nylon: 0.88; Polyethylene terephthalate (PET): 0.92; plastic (in general): 0.78), showing its ability to identify plastic types. Classification accuracy decreased as spatial resolution coarsened, performing best on 4 mm resolution images for all kinds of plastic. The model provided reliable estimates of the area and volume of the plastics, which is crucial information for a cleaning campaign. Full article
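The per-class scores quoted above follow the standard F1 definition; a minimal sketch over labeled masks (illustrative names) is:

```python
# Minimal sketch of the per-class F1 score used to report segmentation
# accuracy (e.g., PET: 0.92), assuming y_true and y_pred are integer-labeled
# masks of equal shape and cls is a class id. Names are illustrative.
import numpy as np

def f1_score(y_true, y_pred, cls):
    tp = np.sum((y_pred == cls) & (y_true == cls))  # true positives
    fp = np.sum((y_pred == cls) & (y_true != cls))  # false positives
    fn = np.sum((y_pred != cls) & (y_true == cls))  # false negatives
    return 2 * tp / (2 * tp + fp + fn)              # = 2PR / (P + R)
```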
Show Figures

Graphical abstract
Figure 1. Study areas: Lake Balkana (left) and Crna Rijeka River (right). EPSG:3857.

Figure 2. Targets used in the study area located in Lake Balkana: (a) frame with metal construction for the underwater survey, (b) frame for the water-surface survey, (c) nylon rope, (d) plastic bottles.

Figure 3. Workflow used in this study, where “B*” and “CR**” correspond to the Balkana and Crna Rijeka datasets, respectively. UAV = Unmanned Aerial Vehicle; SfM = Structure from Motion.

Figure 4. Building blocks of (a) ResNet, (b) Inception-ResNet v2, (c) Xception, and (d) ResNeXt (C = 32); (e) architecture of ResUNet50/ResUNext50. ReLu is Rectified Linear Unit, BN is Batch Normalization, and CONV is convolution.

Figure 5. Ground truth data and results of the classification using the four tested models for detecting different plastic materials, located underwater (a) and overwater (b–d) (Dataset 1). OPS is Oriented Polystyrene and PET is Polyethylene terephthalate.

Figure 6. Spectral signatures of water, PET and OPS.

Figure 7. Ground truth data and results of the classification using the ResUNet50 algorithm for visual comparison, at different spatial resolutions and for different plastic materials, located underwater (a) and overwater (b–d) (Dataset 2).

Figure 8. Differences in the extent of the detected area covered by plastic (using the classification of the 4 mm orthophoto as a reference value).

Figure 9. Relationship between the spatial resolution (cm/pixel) and the area covered by an image gathered by the DJI Mavic Pro camera (grid mission with an 80% overlap).

Figure 10. Visual comparison between the orthophoto, true data (ground truth) and classification results for the five different scenarios: (a) group of plastics, (b) single plastic items, (c) plastic in shallow waters, (d) training data errors (orange lines), which were misclassified by the operator and correctly classified by the algorithm, and (e) plastic on the ground.

Figure 11. Proposed flight planning methodology to obtain accurate datasets for algorithm calibration.
33 pages, 4749 KiB  
Article
An UAV and Satellite Multispectral Data Approach to Monitor Water Quality in Small Reservoirs
by Carmen Cillero Castro, Jose Antonio Domínguez Gómez, Jordi Delgado Martín, Boris Alejandro Hinojo Sánchez, Jose Luis Cereijo Arango, Federico Andrés Cheda Tuya and Ramon Díaz-Varela
Remote Sens. 2020, 12(9), 1514; https://doi.org/10.3390/rs12091514 - 9 May 2020
Cited by 75 | Viewed by 8974
Abstract
A multi-sensor and multi-scale monitoring tool for the spatially explicit and periodic monitoring of eutrophication in a small drinking water reservoir is presented. The tool was built with freely available satellite and in situ data combined with Unmanned Aerial Vehicle (UAV)-based technology. The [...] Read more.
A multi-sensor and multi-scale monitoring tool for the spatially explicit and periodic monitoring of eutrophication in a small drinking water reservoir is presented. The tool was built with freely available satellite and in situ data combined with Unmanned Aerial Vehicle (UAV)-based technology. The goal is to evaluate the performance of a multi-platform approach for trophic state monitoring, with images obtained by the MultiSpectral Sensors on board the Sentinel 2 (S2A and S2B) and Landsat 8 (L8) satellites and a UAV. We assessed the performance of three different sensors (MultiSpectral Instrument (MSI), Operational Land Imager (OLI) and Rededge Micasense) for retrieving the pigment chlorophyll-a (chl-a) as a quantitative descriptor of phytoplankton biomass and trophic level. The study was conducted in a waterbody affected by cyanobacterial blooms, one of the most important eutrophication-derived risks for human health. Different empirical models and band indices were evaluated. Spectral band combinations using red and near-infrared (NIR) bands were the most suitable for retrieving chl-a concentration (especially the two-band algorithm (2BDA), the Surface Algal Bloom Index (SABI) and the three-band algorithm (3BDA)), even though blue and green bands were useful for classifying UAV images into two chl-a ranges. The results show a moderately good agreement among the three sensors at different spatial resolutions (10 m, 30 m and 8 cm), indicating a high potential for the development of a multi-platform and multi-sensor approach for the eutrophication monitoring of small reservoirs. Full article
(This article belongs to the Special Issue She Maps)
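For reference, the band combinations named in the abstract have common literature forms; the sketch below uses those generic forms (the paper's per-sensor band pairings and fitted regression coefficients may differ).

```python
# Hedged sketch of the named spectral band combinations in their common
# literature forms: 2BDA as a NIR/red ratio and SABI after Alawadi (2010).
# The paper's per-sensor band choices and fitted coefficients may differ.
def two_band_algorithm(nir, red):      # 2BDA
    return nir / red

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def sabi(nir, red, blue, green):       # Surface Algal Bloom Index
    return (nir - red) / (blue + green)
```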
Show Figures

Graphical abstract
Figure 1. Location of the study site in Europe (A), in the NW of the Iberian Peninsula (B), and aerial picture of the reservoir with in situ sampling points both from the periodic monitoring [28] and from the two Unmanned Aerial Vehicle (UAV) flights (same sampling points for 2017 and 2018) (C).

Figure 2. Graphical representation of spatial resolution and bandwidths of the three multispectral sensors used. Notation: S (Sentinel 2, MSI), L (Landsat 8, OLI) and R (Rededge Micasense), followed by the band number in each sensor.

Figure 3. Workflow for image processing and data analysis.

Figure 4. Best performing models (models using the SBC: (a) 2BDA; (b) NDVI; (c) SABI and (d) Kab_1) for retrieving chl-a from Landsat 8 OLI data. The color code of the points reflects the time difference between the in situ data and the satellite image acquisition date: white: 0 days; light grey: 1 day; dark grey: 2 days; and black: 3 days.

Figure 5. Validation scatter plots for the best performing models for retrieving chl-a using Landsat 8 OLI data. Models using the SBC: (a) 2BDA; (b) NDVI; (c) SABI and (d) Kab_1. The 1:1 line is also shown.

Figure 6. Best performing models for retrieving chl-a from Sentinel 2 MSI data (models using the SBC: (a) 2BDA_2; (b) 3BDA_2; (c) Kab_1 and (d) B3B2). The color code of the points reflects the time difference between the in situ data and the satellite image acquisition date: white: 0 days; dark grey: 2 days; and black: 3 days (there are no match-ups with 1 day difference).

Figure 7. Validation scatter plots for the best performing models for retrieving chl-a from Sentinel 2 MSI data. Models using the SBC: (a) 2BDA_2; (b) 3BDA_2; (c) Kab_1 and (d) B3B2. The 1:1 line is also shown.

Figure 8. Best performing models for retrieving chl-a from Rededge Micasense on board the UAV. 3BDA_MOD in (a) a low chl-a condition, which corresponds to the 2017 flight data, and (b,c) a high chl-a condition, which corresponds to the 2018 flight data. The results correspond to calculations made with data included in 0.5, 1.0 and 1.5 m buffers, shown in this order from top to bottom.

Figure 9. Best performing models for retrieving chl-a from Rededge Micasense on board the UAV when the data of both flights are combined. Models applying the Spectral Band Combinations (a) B3B1; (b) GB1 and (c) G/B are shown.

Figure 10. Application of the B3B1 algorithm to the images corresponding to the 2017 flight (low chl-a) (a) and the 2018 flight (high chl-a) (b). The legend indicates chl-a in µg/L.

Figure 11. Box-whiskers plot comparing the spectral signatures of the Sentinel 2 MSI and Rededge Micasense sensors for the images acquired on 10/02/2018. Panel (a) shows the results for the outer pixels in the reservoir, panel (b) for the central pixels in the reservoir, and panel (c) for the entire reservoir.

Figure 12. Graphical results of the combined monitoring approach for chl-a in September and October 2017 and 2018. Thematic chl-a (µg/L) maps for Landsat 8 OLI, Sentinel 2 (A/B) MSI and UAV Rededge Micasense flights. The date column shows the satellite overpass. The in situ column shows the results of the periodic monitoring in the reservoir on the nearest date available (BC and BP sampling points in Figure 1) in the same color code (bottom color bar).
25 pages, 17599 KiB  
Article
Estimating Stem Volume in Eucalyptus Plantations Using Airborne LiDAR: A Comparison of Area- and Individual Tree-Based Approaches
by Rodrigo Vieira Leite, Cibele Hummel do Amaral, Raul de Paula Pires, Carlos Alberto Silva, Carlos Pedro Boechat Soares, Renata Paulo Macedo, Antonilmar Araújo Lopes da Silva, Eben North Broadbent, Midhun Mohan and Hélio Garcia Leite
Remote Sens. 2020, 12(9), 1513; https://doi.org/10.3390/rs12091513 - 9 May 2020
Cited by 26 | Viewed by 5809
Abstract
Forest plantations are globally important for the economy and are significant for carbon sequestration. Properly managing plantations requires accurate information about stand timber stocks. In this study, we used the area-based (ABA) and individual tree detection (ITD) approaches for estimating stem volume in [...] Read more.
Forest plantations are globally important for the economy and are significant for carbon sequestration. Properly managing plantations requires accurate information about stand timber stocks. In this study, we used the area-based (ABA) and individual tree detection (ITD) approaches for estimating stem volume in fast-growing Eucalyptus spp forest plantations. Herein, we propose a new method to improve ITD in dense-canopy homogeneous forests and assess the effects of stand age, slope and scan angle on ITD accuracy. Field and Light Detection and Ranging (LiDAR) data were collected in Eucalyptus urophylla x Eucalyptus grandis even-aged forest stands located in the mountainous region of the Rio Doce Valley, southeastern Brazil. We tested five methods to estimate volume from LiDAR-derived metrics using the ABA: Artificial Neural Network (ANN), Random Forest (RF), Support Vector Machine (SVM), and linear and Gompertz models. LiDAR-derived canopy metrics were selected using the Recursive Feature Elimination algorithm and Spearman’s correlation, for the nonparametric and parametric methods, respectively. For the ITD, we tested three detection methods: two local maxima filters and the watershed method. All methods were tested with the addition of our proposed procedure of Tree Buffer Exclusion (TBE), resulting in 35 possibilities for treetop detection. Stem volume for this approach was estimated using the Schumacher and Hall model. Estimated volumes in both the ABA and ITD approaches were compared to the field-observed values using the F-test. Overall, the ABA with ANN was found to be better for stand volume estimation (r_yŷ = 0.95 and RMSE = 14.4%). Although the ITD results showed similar precision (r_yŷ = 0.94 and RMSE = 16.4%) to the ABA, they underestimated stem volume in younger stands and in gently sloping terrain (<25%). Stem volume maps also differed between the approaches; the ITD represented stand variability better. In addition, we discuss the importance of LiDAR metrics as input variables for stem volume estimation methods and possible issues related to ABA and ITD performance. Full article
(This article belongs to the Special Issue 3D Point Clouds in Forest Remote Sensing)
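In the ITD branch, per-tree stem volume comes from the Schumacher and Hall model named in the abstract; a sketch of its standard form, with placeholder coefficients rather than the values fitted in the paper, is:

```python
# Sketch of the Schumacher & Hall stem-volume model, v = b0 * dbh^b1 * h^b2.
# The coefficients below are placeholders for illustration, not fitted values.
def schumacher_hall_volume(dbh_cm, height_m, b0=4.0e-5, b1=1.9, b2=1.1):
    return b0 * dbh_cm ** b1 * height_m ** b2  # stem volume (m^3)
```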
Show Figures

Graphical abstract
Figure 1. Map of the study area showing stand and plot locations over the mountainous region of the Rio Doce Valley, Minas Gerais, Brazil.

Figure 2. Flowchart representing the steps of individual tree detection and the Tree Buffer Exclusion (TBE). CHM stands for Canopy Height Model. Hi = elevation of point i; Hmax = maximum height; n = total number of detected points.

Figure 3. Flowchart showing the steps to estimate and assess stem volume for the whole stand using the area-based (ABA) and individual tree detection (ITD) approaches.

Figure 4. Correlation matrix with Pearson’s correlation values amongst LiDAR metrics. The figure highlights the metrics selected for the next step of feature selection. Correlation chart built with the R package corrplot [70].

Figure 5. LiDAR and field-measured mean height (m) for the sample plots. Charts (a–c) are the results for the stand age classes Young (<4 years), Intermediate (4 to 6 years) and Mature (>6 years). Charts (d–f) are the results for the stand slope classes varying from <25% to >45%, and (g–i) are the results for the LiDAR data acquisition scan-angle classes 0–7°, 7–23° and 23–30°, respectively. p-values are for the Graybill [68] F-test.

Figure 6. Estimated vs. observed stem volume for the whole stand. Charts (a–e) are the estimations with the area-based approach using the methods Artificial Neural Network (ANN), Random Forest (RF), Support Vector Machine (SVM) and the Gompertz and linear models. Chart (f) is the volume obtained with the individual tree detection (ITD) approach. p-values are for the Graybill [68] F-test.

Figure 7. Stem volume map for young stands using Artificial Neural Network (ANN) and Gompertz, in the area- and individual tree (ITD) based approaches. Box and violin plots represent the distribution of volume estimates across the stands and methods.

Figure 8. Stem volume map for intermediate stands using Artificial Neural Network (ANN) and Gompertz, in the area- and individual tree (ITD) based approaches. Box and violin plots represent the distribution of volume estimates across the stands and methods.

Figure 9. Stem volume map for mature stands using Artificial Neural Network (ANN) and Gompertz, in the area- and individual tree (ITD) based approaches. Box and violin plots represent the distribution of volume estimates across the stands and methods.

Figure 10. Sample plot transects representing metrics selected as input features for the estimation methods. Colored in blue is the metric height value in the point cloud. Charts (a–d) represent the height at the 10th (H.P10), 50th (H.P50), mode (Hmod) and 95th (H.P95) percentiles, respectively. Subscript numbers 1, 2 and 3 represent different sample plots with uniform, intermediate and stratified canopy layers, respectively.
15 pages, 5858 KiB  
Article
Rapid Determination of Soil Class Based on Visible-Near Infrared, Mid-Infrared Spectroscopy and Data Fusion
by Hanyi Xu, Dongyun Xu, Songchao Chen, Wanzhu Ma and Zhou Shi
Remote Sens. 2020, 12(9), 1512; https://doi.org/10.3390/rs12091512 - 9 May 2020
Cited by 26 | Viewed by 4438
Abstract
Wise soil management requires detailed soil information, but conventional soil class mapping at a rather coarse spatial resolution cannot meet the demand for precision agriculture. With the advantages of non-destructiveness, rapidity, cost-efficiency, and labor savings, the spectroscopic technique has proved its high potential [...] Read more.
Wise soil management requires detailed soil information, but conventional soil class mapping at a rather coarse spatial resolution cannot meet the demand for precision agriculture. With the advantages of non-destructiveness, rapidity, cost-efficiency, and labor savings, the spectroscopic technique has proved its high potential for success in soil classification. Previous studies mainly focused on predicting soil classes using a single sensor. In this study, we attempted to compare the predictive ability of visible near infrared (vis-NIR) spectra, mid-infrared (MIR) spectra, and their fused spectra for soil classification. A total of 146 soil profiles were collected from Zhejiang, China, and the soil properties and spectra were measured by their genetic horizons. Along with easy-to-measure auxiliary soil information (soil organic matter, soil texture, color and pH), four spectral datasets, including vis-NIR, MIR, their simple combination (vis-NIR-MIR), and outer product analysis (OPA) fused spectra, were used for soil classification using a multiple objectives mixed support vector machine model. The independent validation results showed that the classification model using MIR (accuracy of 64.5%) was slightly better than that using vis-NIR (accuracy of 64.2%). The predictive model built on vis-NIR-MIR did not improve the classification accuracy, having the lowest accuracy of 61.1%, which likely resulted from an over-fitting problem. The model based on OPA fused spectra performed best, with an accuracy of 68.4%. Our results demonstrate the potential of fusing vis-NIR and MIR with OPA to improve prediction ability for soil classification. Full article
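Outer product analysis, as commonly defined, fuses the two spectral domains by taking, for each sample, the outer product of its vis-NIR and MIR spectra and flattening the result into one feature vector; a minimal sketch (illustrative array names):

```python
# Minimal sketch of outer product analysis (OPA) fusion as commonly defined:
# each sample's fused features are the flattened outer product of its vis-NIR
# and MIR spectra. Shapes: visnir (n, p), mir (n, q) -> fused (n, p * q).
import numpy as np

def opa_fuse(visnir, mir):
    return np.einsum('ip,iq->ipq', visnir, mir).reshape(visnir.shape[0], -1)
```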
Show Figures

Graphical abstract
Figure 1. Location of the study area and the sampling sites of soil profiles.

Figure 2. Workflow of data fusion by outer product analysis (OPA) (revised after [27]).

Figure 3. Workflow of this study.

Figure 4. Violin plots of soil properties across soil orders: (a) soil organic matter, (b) sand, (c) silt, and (d) clay. The numbers inside each violin are the median values for each soil order.

Figure 5. Confusion matrix of soil classification based on vis-NIR spectra. Inside the square brackets are the lower and upper limits of the 90% confidence intervals; outside are mean values. N represents the number of samples allocated to each soil order by the model.

Figure 6. Confusion matrix of soil classification based on MIR spectra.

Figure 7. Confusion matrix of soil classification based on the spectral combination.

Figure 8. Confusion matrix of soil classification based on outer product analysis.
13 pages, 3843 KiB  
Letter
Can InSAR Coherence and Closure Phase Be Used to Estimate Soil Moisture Changes?
by Yusuf Eshqi Molan and Zhong Lu
Remote Sens. 2020, 12(9), 1511; https://doi.org/10.3390/rs12091511 - 9 May 2020
Cited by 10 | Viewed by 3781
Abstract
We studied the influence of the statistical properties of soil moisture changes on the Interferometric Synthetic Aperture Radar (InSAR) coherence and closure phase to determine whether the InSAR coherence and closure phase can be used to estimate soil moisture changes. We generated semi-synthetic [...] Read more.
We studied the influence of the statistical properties of soil moisture changes on the Interferometric Synthetic Aperture Radar (InSAR) coherence and closure phase to determine whether the InSAR coherence and closure phase can be used to estimate soil moisture changes. We generated semi-synthetic multi-looked interferograms by pairing n real single-looked pixels of an observed SAR image with n synthetic single-looked pixels. The synthetic SAR data are generated from the real SAR data by applying soil moisture changes with a pre-defined mean and standard deviation. Our results show that the diversity of soil moisture changes within the multi-look window gives rise to decorrelation, a multi-looked phase artifact, and a non-zero phase triplet. The decorrelation and closure phase increase as the diversity of soil moisture changes grows. We also showed that non-soil moisture changes can lead to larger decorrelations and closure phases. Furthermore, the diversity of phase changes, decorrelation, and closure phases are correlated with land cover type. We conclude that the closure phase and coherence are independent of the magnitude of soil moisture changes and are inappropriate tools for estimating soil moisture changes. Coherence, however, can be used as a proxy for soil moisture changes if the diversity and magnitude of soil moisture changes within a multi-looked pixel are strongly correlated. Full article
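The closure phase examined here is the standard triplet quantity: the argument of the product of the three multi-looked interferograms, which vanishes for a distributed, unchanging scene. A minimal sketch (illustrative helper names, not the authors' code):

```python
# Minimal sketch of the multi-looked closure phase of a SAR triplet, assuming
# s1, s2, s3 are co-registered complex single-look images; the n x n boxcar
# below stands in for the paper's multi-look window. Names are illustrative.
import numpy as np

def multilook(img, n=5):
    # crude n x n boxcar multi-look (block averaging)
    r, c = img.shape[0] // n * n, img.shape[1] // n * n
    return img[:r, :c].reshape(r // n, n, c // n, n).mean(axis=(1, 3))

def closure_phase(s1, s2, s3, n=5):
    i12 = multilook(s1 * np.conj(s2), n)
    i23 = multilook(s2 * np.conj(s3), n)
    i31 = multilook(s3 * np.conj(s1), n)
    return np.angle(i12 * i23 * i31)  # zero when multi-looking adds no artifact
```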
Show Figures

Graphical abstract
Figure 1. Intensity (a) and phase (b) changes of the soil. r_p is the radius of the larger grains (main scatterers) and f_s is the fractional volume of larger grains.

Figure 2. The phase (a) and coherence (b) changes due to changes in the average and standard deviation of soil moisture changes.

Figure 3. The phase (a) and coherence (b) changes due to changes in the average and standard deviation of soil moisture changes and a zero-mean random non-soil moisture change.

Figure 4. Closure phase (in radians) due to different combinations of soil moisture changes.

Figure 5. Closure phase (in radians) due to different combinations of soil moisture changes and a zero-mean random non-soil moisture change.

Figure 6. (a) Closure phase (in radians), (b) root mean square (RMS_s) of phase changes (in radians), and (c) average decorrelation over the study area in Idaho. (d) The red box on the optical image outlines the study area.

Figure 7. (a) Closure phase (in radians), (b) RMS_s of phase changes (in radians), and (c) average decorrelation over the study area in Oregon. (d) The red box on the optical image outlines the study area.

Figure 8. The correlation coefficient image over the study area in Idaho.
16 pages, 2120 KiB  
Article
Timing of Landsat Overpasses Effectively Captures Flow Conditions of Large Rivers
by George H. Allen, Xiao Yang, John Gardner, Joel Holliman, Cédric H. David and Matthew Ross
Remote Sens. 2020, 12(9), 1510; https://doi.org/10.3390/rs12091510 - 9 May 2020
Cited by 24 | Viewed by 5694
Abstract
Satellites provide a temporally discontinuous record of hydrological conditions along Earth’s rivers (e.g., river width, height, water quality). The degree to which archived satellite data effectively capture the overall population of river flow frequency is unknown. Here, we use the entire archives of [...] Read more.
Satellites provide a temporally discontinuous record of hydrological conditions along Earth’s rivers (e.g., river width, height, water quality). The degree to which archived satellite data effectively capture the overall population of river flow frequency is unknown. Here, we use the entire archives of Landsat 5, 7, and 8 to determine when a cloud-free image is available over the United States Geological Survey (USGS) river gauges located on Landsat-observable rivers. We compare the flow frequency distribution derived from the daily gauge record to the flow frequency distribution derived from ideally sampling gauged discharge based on the timing of cloud-free Landsat overpasses. Examining the patterns of flow frequency across multiple gauges, we find that there is no statistically significant difference between the flow frequency distribution associated with observations contained within the Landsat archive and the flow frequency distribution derived from the daily gauge data (α = 0.05), except for hydrological extremes like maximum and minimum flow. At individual gauges, we find that Landsat observations span a wide range of hydrological conditions (97% of total flow variability observed at 90% of the study gauges), but the degree to which the Landsat sample can represent the flow frequency distribution varies from location to location and depends on sample size. The results of this study indicate that the Landsat archive is, on average, representative of the temporal frequencies of hydrological conditions present along Earth’s large rivers, with broad utility for hydrological, ecological, and biogeochemical evaluations of river systems. Full article
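The distributional comparison described above is a two-sample test between the daily record and the overpass-sampled record; a minimal sketch using the Kolmogorov-Smirnov test (the study also reports Mann-Whitney-Wilcoxon p-values):

```python
# Minimal sketch of the flow-frequency comparison: daily gauge flows vs.
# flows on cloud-free Landsat overpass dates, tested with a two-sample
# Kolmogorov-Smirnov test at alpha = 0.05. Variable names are illustrative.
from scipy.stats import ks_2samp

def landsat_sample_is_representative(daily_flows, overpass_flows, alpha=0.05):
    stat, p_value = ks_2samp(daily_flows, overpass_flows)
    # Failing to reject means no significant difference between distributions.
    return p_value > alpha, stat, p_value
```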
Show Figures

Graphical abstract
Figure 1. Map showing the United States Geological Survey (USGS) gauges used in this study and Landsat 5, 7, and 8 data availability based on the number of images with less than 30% cloud cover. Regions affected by cloud cover and low sun angle tend to have lower data availability. Rivers equal to or wider than Landsat’s 30 m resolution at mean discharge are shown as blue lines [28].

Figure 2. Hydrographs, cumulative density functions (CDFs), and a percentile plot from two example gauge stations. For visual clarity, hydrographs show a 5-year subset of the full record (1984–2019), while CDFs correspond to the full record length. Gaps in the hydrographs correspond to United States Geological Survey (USGS) flow observations that are estimated, provisional, and/or frozen. (a) Hydrograph of USGS Gauge 06225500 at Wind River near Crowheart, Wyoming, and (b) the corresponding flow frequency CDF (KSD statistic = 0.09, p-value < 0.001). (c) USGS Gauge 15129000 at Alsek River near Yakutat, Alaska, and (d) the corresponding flow frequency CDF (KSD statistic = 0.17, p-value = 0.055). (e) Fiftieth percentile (median) flow calculated from the full gauge record (x-axis) vs. from the Landsat sample (y-axis), with the two example gauge stations represented as colored circles. Gray circles represent median flows from other gauge stations.

Figure 3. Individual gauge analysis, whereby each point represents metrics at a single gauge station. (a) Empirical CDF of the proportion of flow percentiles of the gauge record that the Landsat sample represents at each gauge. (b) Cloud occurrence correlates negatively with the ability of Landsat to capture flow frequency, as represented by the KSD statistic. (c) Watershed area weakly correlates with the KSD statistic. (d) The Richards–Baker Flashiness Index does not correlate with the KSD statistic.

Figure 4. Full gauge record flow percentiles vs. Landsat-sampled flow percentiles for each USGS gauge studied (N = 1134). The black line represents the 1:1 line and the red line represents the Theil–Sen median estimator best fit (equation shown in red). R²: coefficient of determination; rBias: relative bias; rRMSE: relative root mean square error; KSD: Kolmogorov–Smirnov D-statistic; MWW p: Mann–Whitney–Wilcoxon p-value.

Figure 5. Landsat’s representation of flow conditions with increasing observation duration. (a) Variations in R² value with increasing observation duration. (b) Relative metrics (rBias, rMAE, rRMSE) comparing gauge daily records and Landsat-sampled gauge records. (c) Absolute error metrics (Bias, MAE, RMSE; unit: m³/s) comparing gauge daily records and Landsat-sampled gauge records. Median statistical values are plotted for each percentile.
43 pages, 21441 KiB  
Article
Forest Inventory with Long Range and High-Speed Personal Laser Scanning (PLS) and Simultaneous Localization and Mapping (SLAM) Technology
by Christoph Gollob, Tim Ritter and Arne Nothdurft
Remote Sens. 2020, 12(9), 1509; https://doi.org/10.3390/rs12091509 - 9 May 2020
Cited by 87 | Viewed by 9199
Abstract
The use of new and modern sensors in forest inventory has become increasingly efficient. Nevertheless, the majority of forest inventory data are still collected manually, as part of field surveys. The reason for this is the sometimes time-consuming and incomplete data acquisition with [...] Read more.
The use of new and modern sensors in forest inventory has become increasingly efficient. Nevertheless, the majority of forest inventory data are still collected manually, as part of field surveys. The reason for this is the sometimes time-consuming and incomplete data acquisition with static terrestrial laser scanning (TLS). The use of personal laser scanning (PLS) can reduce these disadvantages. In this study, we assess a new personal laser scanner and compare it with a TLS approach for the estimation of tree position and diameter in a wide range of forest types and structures. Traditionally collected forest inventory data are used as reference. A new density-based algorithm for position finding and diameter estimation is developed, and several methods for diameter fitting are compared. For circular sample plots with a maximum radius of 20 m and a lower diameter at breast height (dbh) threshold of 5 cm, tree mapping showed a detection rate of 96% for PLS and 78.5% for TLS. Using plot radii of 20 m, 15 m, and 10 m, as well as a lower dbh threshold of 10 cm, the respective detection rates for PLS were 98.76%, 98.95%, and 99.48%, while those for TLS were considerably lower (86.32%, 93.81%, and 98.35%, respectively), especially for larger sample plots. The root mean square error (RMSE) of the best dbh measurement was 2.32 cm (12.01%) for PLS and 2.55 cm (13.19%) for TLS. The smallest biases for PLS and TLS were 0.21 cm (1.09%) and −0.74 cm (−3.83%), respectively. Data acquisition with PLS took approximately 10.96 min per sample plot, 4.7 times faster than with TLS. We conclude that the proposed PLS method is capable of efficient data capture and can detect the largest number of trees with sufficient dbh accuracy. Full article
(This article belongs to the Special Issue 3D Point Clouds in Forest Remote Sensing)
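The dbh accuracy figures quoted above follow the usual RMSE and bias definitions; a minimal sketch for paired arrays of estimated and field-measured dbh (illustrative names):

```python
# Minimal sketch of the dbh accuracy metrics reported above: RMSE, relative
# RMSE (%), bias, and relative bias (%), for paired dbh arrays in cm.
import numpy as np

def dbh_accuracy(dbh_est, dbh_ref):
    err = dbh_est - dbh_ref
    rmse = np.sqrt(np.mean(err ** 2))
    bias = np.mean(err)
    mean_ref = np.mean(dbh_ref)
    return rmse, 100 * rmse / mean_ref, bias, 100 * bias / mean_ref
```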
Show Figures

Graphical abstract
Figure 1. (a) ZEB HORIZON personal laser scanning (PLS) in operation; (b) walking path for PLS data acquisition (starting and ending in the plot center) and sample plot.

Figure 2. (a) Cross-section of a single tree for PLS and terrestrial laser scanning (TLS); (b) transformed points with cyclic cubic spline for PLS and TLS; (c) back-transformed points with cyclic cubic spline for PLS and TLS.

Figure 3. Cylindrical and spherical objects for evaluation of point cloud quality and diameter fitting.

Figure 4. Distribution of detection rates for PLS/TLS, lower dbh thresholds, and plot radii. Black squares represent average detection rates over all sample plots.

Figure 5. Distribution of commission errors for PLS/TLS, lower dbh thresholds, and plot radii. Black squares represent average commission errors over all sample plots.

Figure 6. Distribution of overall accuracies for PLS/TLS, lower dbh thresholds, and plot radii. Black squares represent average overall accuracies over all sample plots.

Figure 7. Average overall accuracy achieved with PLS and TLS, evaluated separately for lower dbh thresholds of 5 cm, 10 cm, and 15 cm and different sample plot radii r. Solid black lines indicate overall accuracy averaged over 20 (PLS) and 17 (TLS) sample plots, respectively. The gray shaded area indicates the 95% prediction interval.

Figure 8. dbh deviation, δ_dbh, for different methods of dbh estimation (ell, circ2, circ, tegam, and gam) over all 20 sample plots for PLS. The dashed gray line is the zero-reference. Left panel: deviation between dbh estimates from PLS and dbh measured in the field; the solid red line is a locally weighted scatterplot smoothing (LOESS) fit with a span of 3/4. Central panel: deviation between dbh estimates from PLS and dbh measured in the field for the two most common tree species; blue dots indicate spruce (Picea abies) and green dots indicate beech (Fagus sylvatica). Right panel: LOESS fit with a span of 3/4 for the data from the central panel; the blue lines indicate spruce and the green lines indicate beech, and the shaded blue and green areas indicate 95% confidence intervals of the corresponding LOESS fits.

Figure 9. dbh deviation, δ_dbh, for different methods of dbh estimation (ell, circ2, circ, tegam, and gam) over 17 sample plots for TLS. Panels as in Figure 8, with TLS in place of PLS.

Figure 10. Bias from simulating noisy cross-sections depending on reference diameter, standard deviation, and dbh fitting method. Mean bias of d_gam (red curves) and d_circ2 (black curves), with ranges (min–max) as dashed curves. The light-gray dotted line is the zero-reference.

Figure 11. Root mean square error (RMSE) of a cross-validated linear model. The sample size represents the number of measured reference diameters. The solid black line indicates the average RMSE; the corresponding dashed lines indicate the min/max values of the RMSE.

Figure 12. dbh deviation, δ_dbh, of values corrected with RMSE and bias over all sample plots for PLS. The solid red line is a LOESS fit with a span of 3/4. The light-gray dashed line is the zero-reference.

Figure A1. Panoramic photos of three typical sample plots: (a) plot 3; (b) plot 8; and (c) plot 6.

Figure A2. dbh deviation, δ_dbh, for different methods of dbh estimation (ell, circ2, circ, tegam, and gam) over all 20 sample plots for PLS, as in Figure 8 but separated by tree species. The colored lines in the right panel indicate different tree species, with shaded areas showing 95% confidence intervals of the corresponding LOESS fits.

Figure A3. Overview of four object cross-sections including the fitted diameters, deviations, standard deviations of the residuals, and average intensities. The solid red and black circles indicate the diameters d_gam and d_circ2, respectively; the solid green circle indicates the diameter of the reference circle.

Figure A4. Overview of four further object cross-sections; symbols as in Figure A3.

Figure A5. Examples of diameter deviation δ for different methods of dbh estimation for simulated noisy cross-sections. Normally distributed random errors with a standard deviation of 2.45 cm were added to the distances (radii in a closed circle) of 1000 polar coordinates per tree to simulate noisy cross-sections. The solid red and black circles indicate the diameters d_gam and d_circ2, respectively; the solid green circle indicates the diameter of the reference circle.
17 pages, 5424 KiB  
Article
The Retrieval of Total Precipitable Water over Global Land Based on FY-3D/MWRI Data
by Baolong Du, Dabin Ji, Jiancheng Shi, Yongqian Wang, Tianjie Lei, Peng Zhang and Husi Letu
Remote Sens. 2020, 12(9), 1508; https://doi.org/10.3390/rs12091508 - 9 May 2020
Cited by 12 | Viewed by 3210
Abstract
Total precipitable water (TPW) is a key factor in the global water cycle and climate change. Knowledge of TPW characteristics across spatial and temporal scales could help us to better understand our changing environment. Currently, many algorithms are available to retrieve [...] Read more.
Total precipitable water (TPW) is a key factor in the global water cycle and climate change. Knowledge of TPW characteristics across spatial and temporal scales could help us to better understand our changing environment. Currently, many algorithms are available to retrieve TPW from optical and microwave sensors. However, there are still no TPW data available over land from FY-3D MWRI, which was launched by China in 2017, even though a TPW product over land is a key element for the retrieval of many ecological environment parameters. In this paper, an improved algorithm was developed to retrieve TPW over land from the brightness temperature of FY-3D MWRI. The major improvement is that surface emissivity, a key parameter for the retrieval of TPW in all-weather conditions, was estimated with an algorithm adapted to the characteristics of FY-3D MWRI. The improvement includes two aspects: the selection of appropriate ancillary data for estimating the surface emissivity parameter Δε18.7/Δε23.8 in clear-sky conditions, and an improved Δε18.7/Δε23.8 estimation function for cloudy conditions that accounts for the band configuration of FY-3D MWRI. Finally, the retrieved TPW was validated using TPW observations from the SuomiNet GPS network and globally distributed Radiosonde Observations (RAOB). According to the validation, TPW retrieved using observations from FY-3D MWRI and ancillary data from Aqua MODIS had the best quality: the root mean square error (RMSE) and correlation coefficient between the retrieved TPW and observed TPW from RAOB were 5.47 mm and 0.94, respectively. Full article
(This article belongs to the Section Remote Sensing in Geology, Geomorphology and Hydrology)
Show Figures
Figure 1. Spatial distribution of the Digital Elevation Model from the level 1 brightness temperature product of the FY-3D Microwave Radiation Imager in 2018. Ground elevation increases from 0 to 6.5 km as the color bar changes from blue to red.

Figure 2. Distribution map of total precipitable water (TPW) observation sites from the global SuomiNet GPS network.

Figure 3. Scatter diagram of land surface temperature (LST) from globally distributed radiosonde observations versus (a) Aqua MODIS LST and (b) FY-3D MERSI LST, respectively, in 2018. The dashed lines are diagonals and the solid lines are regression lines of the two compared datasets.

Figure 4. Scatter diagram of RAOB TPW versus (a) Aqua MODIS TPW and (b) FY-3D MERSI TPW in 2018. The dashed lines are diagonals, and the solid lines are regression lines of the two compared datasets.

Figure 5. Scatter diagram between the estimated Δε18.7/Δε23.8 and the original Δε18.7/Δε23.8 in clear-sky conditions.

Figure 6. Flowchart of the total precipitable water retrieval used for this study.

Figure 7. Validation of retrieved TPW datasets using RAOB observations. (a) FY-3D MWRI TPW retrieved using Method A and (b) FY-3D MWRI TPW retrieved using Method B.

Figure 8. Comparison between TPW retrieved from FY-3D MWRI using Method A and TPW derived from the global SuomiNet GPS network in the year 2018.

Figure 9. Comparison of daily global quarter-degree gridded TPW over land between FY-3D MWRI and Aqua MODIS. (a) TPW retrieved from FY-3D MWRI in descending orbit on 9 July 2018, (b) TPW derived from Aqua MODIS in descending orbit on 9 July 2018, (c) TPW retrieved from FY-3D MWRI in ascending orbit on 6 December 2018, and (d) TPW derived from Aqua MODIS in ascending orbit on 6 December 2018.

Figure 10. Validation of monthly average TPW using TPW observations from RAOB. (a) Validation of monthly average TPW from FY-3D MWRI and (b) validation of monthly average TPW from Aqua MODIS.

Figure 11. Spatial comparison of monthly average TPW between FY-3D MWRI and Aqua MODIS. The monthly average TPW values from FY-3D MWRI are shown for July 2018 (a) and December 2018 (c); (b,d) show the monthly average TPW from Aqua MODIS for July 2018 and December 2018, respectively.
19 pages, 5735 KiB  
Article
Studying the Applicability of X-Band SAR Data to the Network-Scale Mapping of Pavement Roughness on US Roads
by Franz J. Meyer, Olaniyi A. Ajadi and Edward J. Hoppe
Remote Sens. 2020, 12(9), 1507; https://doi.org/10.3390/rs12091507 - 9 May 2020
Cited by 17 | Viewed by 5551
Abstract
The traveling public judges the quality of a road mostly by its roughness and/or ride quality. Hence, mapping, monitoring, and maintaining adequate pavement smoothness is of high importance to State Departments of Transportation in the US. Current methods rely mostly on in situ [...] Read more.
The traveling public judges the quality of a road mostly by its roughness and/or ride quality. Hence, mapping, monitoring, and maintaining adequate pavement smoothness is of high importance to State Departments of Transportation in the US. Current methods rely mostly on in situ measurements and are, therefore, time consuming and costly when applied at the network scale. This paper studies the applicability of satellite radar remote sensing data, specifically high-resolution Synthetic Aperture Radar (SAR) data acquired at X-band, to the network-wide mapping of pavement roughness of roads in the US. Based on a comparison of high-resolution X-band Cosmo-SkyMed images with road roughness data in the form of International Roughness Index (IRI) measurements, we found that X-band radar brightness generally increases as pavement roughness worsens. Based on these findings, we developed and inverted a model to distinguish well-maintained road segments from segments in need of repair. Over test sites in Augusta County, VA, we found that our classification scheme reaches an overall accuracy of 92.6%. This study illustrates the capacity of X-band SAR for pavement roughness mapping and suggests that incorporating SAR into DOT operations could be beneficial. Full article
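The classification described above amounts to inverting a fitted brightness-to-IRI model and thresholding the predicted roughness; a minimal sketch with a placeholder model form (the paper's fitted model and coefficients are not reproduced here):

```python
# Minimal sketch of the road-condition classification: invert a fitted
# radar-brightness -> IRI model and flag segments whose predicted IRI
# exceeds a threshold (220 inches/mile, per the paper's figures). The
# linear form and coefficients a, b are placeholders, not the fitted model.
import numpy as np

def classify_segments(gamma0_db, a, b, iri_thresh=220.0):
    iri_pred = a + b * gamma0_db   # placeholder forward model
    return iri_pred > iri_thresh   # True = segment likely in need of repair
```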
Figure 1">
Figure 1: Map of the main area of interest of this study. Our research focuses on a section of the southern half of Augusta County, VA, and encompasses the city of Staunton and its vicinity. The bounding box of the Synthetic Aperture Radar (SAR) frame used in this study is shown as a bold black rectangle.
Figure 2: International Roughness Index (IRI) roughness scale and its relationship to road surface conditions.
Figure 3: (a) Statistical distribution of IRI values for interstates (black) and secondary roads (blue) in our study area; (b) statistical distribution of radar brightness values for the same road types.
Figure 4: Workflow for IRI vs. radar brightness forward model development.
Figure 5: IRI vs. SAR γ⁰_T scatter plot for IRI measurement locations from the training set.
Figure 6: (a) Signal model describing the relationship between γ⁰_T,0.1mile and IRI; measurement locations belonging to the training set are used. (b) Type-I (blue) and type-II (red) errors of road surface classification from SAR for a detection threshold of IRI_thresh = 220 inches/mile.
Figure 7: Distribution of good and bad road segments according to ground-truth IRI samples (a) and as classified using SAR amplitude data (b). The black rectangle indicates an area affected by misclassifications, discussed further in Section 7.3 and Figure 9b.
Figure 8: Errors in SAR-derived IRI estimates as a function of γ⁰_T,0.1mile according to Equation (3) (red line). The dependence of IRI on γ⁰_T,0.1mile as described by Equation (1) is also shown (gray line).
Figure 9: (a) Examples of locations affected by type-II errors (red markers) and their relationship to nearby tree stands (green-shaded regions); most type-II errors are found in the direct vicinity of tree stands. (b) Distribution of type-I errors along Glebe School Road, Augusta County, VA. The presence of type-I errors (bad road segments misclassified as good) in a series of consecutive 0.1-mile road segments suggests road maintenance between the collection of the IRI and SAR data samples. The area marked by a black rectangle is analyzed further in Figure 10.
Figure 10: Comparison of the pavement conditions on Glebe School Road in October 2013 and September 2015. Significant rutting, which increases road surface roughness, can be seen in the October 2013 image (left panel). This rutting appears largely mitigated in the September 2015 data set (right panel) due to the emplacement of new pavement on the western side of the road. Image data provided by Google Earth. The location of these images is indicated by the rectangle in Figure 9b.
16 pages, 6020 KiB  
Letter
Models and Theoretical Analysis of SoOp Circular Polarization Bistatic Scattering for Random Rough Surface
by Xuerui Wu and Shuanggen Jin
Remote Sens. 2020, 12(9), 1506; https://doi.org/10.3390/rs12091506 - 9 May 2020
Cited by 12 | Viewed by 2878
Abstract
Soil moisture is an important factor affecting the global climate and environment, and it can be monitored by microwave remote sensing day and night, in all weather conditions. However, existing monostatic radars and microwave radiometers have their own limitations in monitoring soil moisture at shallow depths. The emerging technique of signals-of-opportunity (SoOp) remote sensing provides a new method for soil moisture monitoring, but so far it has been explored mainly from an experimental perspective, and its scattering mechanism is not yet clear. In this paper, starting from traditional surface scattering models, we employed the polarization synthesis method, coordinate transformation, and the Mueller matrix to develop bistatic radar circular polarization models suitable for SoOp remote sensing. Using these models as a tool, the bistatic scattering at five different circular polarizations (LR, HR, VR, +45°R, and −45°R) is simulated and analyzed versus observation frequency, soil moisture, scattering zenith angle, and scattering azimuth angle. The results show that the developed models can determine the optimal combination of polarization and observation angle. This systematic analysis of the scattering characteristics of random rough surfaces provides important guidance for the design of spaceborne payloads, the analysis of experimental data, and the development of inversion algorithms for more effective SoOp remote sensing. Full article
(This article belongs to the Special Issue Earth Observation in Support of Sustainable Soils Development)
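The polarization synthesis step the authors build on can be illustrated compactly: given a bistatic scattering matrix in the linear (H, V) basis, the power received for any transmit/receive antenna state follows from projecting the scattered field onto the receive polarization vector. The sketch below uses a made-up scattering matrix and one common circular-polarization sign convention; it illustrates the general synthesis technique, not the paper's specific Mueller-matrix formulation, and the naming of "XR" states as receive-X with a right-hand-circular transmitter is an assumption.

```python
import numpy as np

# Hypothetical linear-basis bistatic scattering matrix [[S_hh, S_hv], [S_vh, S_vv]]
# for one surface realization; real values would come from a rough-surface model.
S = np.array([[0.62 + 0.05j, 0.03 - 0.01j],
              [0.02 + 0.02j, 0.48 - 0.07j]])

# Unit Jones vectors in the (h, v) basis. Circular-polarization sign
# conventions vary between texts; the R/L definitions here are one choice.
pols = {
    "H":   np.array([1.0, 0.0]),
    "V":   np.array([0.0, 1.0]),
    "+45": np.array([1.0, 1.0]) / np.sqrt(2),
    "-45": np.array([1.0, -1.0]) / np.sqrt(2),
    "R":   np.array([1.0, -1.0j]) / np.sqrt(2),
    "L":   np.array([1.0, 1.0j]) / np.sqrt(2),
}

def synthesized_power(rx, tx="R"):
    """Polarization synthesis: relative power received with antenna state rx
    when the transmitter (e.g., a GNSS satellite) radiates state tx; constant
    geometric factors of the cross section are omitted."""
    return abs(np.conj(pols[rx]) @ (S @ pols[tx])) ** 2

for rx in ("L", "H", "V", "+45", "-45"):
    print(f"{rx}R: {synthesized_power(rx):.4f}")
```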
Figure 1">
Figure 1: Geometry of the Forward Scatter Alignment (FSA) coordinate system.
Figure 2: Variation of the bistatic radar scattering cross section (BRCS) with frequency for five different polarizations.
Figure 3: Soil moisture effects on BRCS at five different polarizations, with θ_i = 30°, θ_s = 5°, φ_s = 120°. Subfigures (a-e) represent the polarizations LR, VR, HR, +45°R, and −45°R, respectively.
Figure 4: Soil moisture effects on BRCS at five different polarizations, with θ_i = 30°, θ_s = 30°, φ_s = 0°. Subfigures (a-e) represent the polarizations LR, VR, HR, +45°R, and −45°R, respectively.
Figure 5: Scattering zenith angle effects on BRCS at five different polarizations, with θ_i = 30°, θ_s = 30°, φ_s = 0°. Subfigures (a-e) represent the polarizations LR, VR, HR, +45°R, and −45°R, respectively.
Figure 6: Scattering zenith angle effects on BRCS at five different polarizations, with θ_i = 30°, φ_s = 0°. Subfigures (a-e) represent the polarizations LR, VR, HR, +45°R, and −45°R, respectively.
Figure 7: Scattering azimuth angle effects on BRCS at five different polarizations.
Figure 8: BRCS for different soil moisture contents (a-c) and BRCS differences for different soil moisture differences (d-f) at LR polarization.
Figure 9: BRCS for different soil moisture contents (a-c) and BRCS differences for different soil moisture differences (d-f) at VR polarization.
Figure 10: BRCS for different soil moisture contents (a-c) and BRCS differences for different soil moisture differences (d-f) at HR polarization.
Figure 11: BRCS for different soil moisture contents (a-c) and BRCS differences for different soil moisture differences (d-f) at +45°R polarization.
Figure 12: BRCS for different soil moisture contents (a-c) and BRCS differences for different soil moisture differences (d-f) at −45°R polarization.
20 pages, 7011 KiB  
Article
Mapping Seasonal Tree Canopy Cover and Leaf Area Using Worldview-2/3 Satellite Imagery: A Megacity-Scale Case Study in Tokyo Urban Area
by Yutaka Kokubu, Seiichi Hara and Akira Tani
Remote Sens. 2020, 12(9), 1505; https://doi.org/10.3390/rs12091505 - 9 May 2020
Cited by 13 | Viewed by 5080
Abstract
This study presents a methodology for developing a high-resolution (2 m) urban tree canopy leaf area inventory for different tree phenological seasons, and a subsequent application of the methodology to a 625 km² urban area in Tokyo. Satellite remote sensing has the advantage of imaging large areas simultaneously; however, accurately mapping tree canopy cover and leaf area is still difficult in a highly heterogeneous urban landscape. WorldView-2/3 satellite imagery at the individual tree level (2 m resolution) was used to map urban trees with a simple pixel-based classification method. Comparison of our mapping results with tree canopy cover derived from aerial photography shows that the error margin is within an acceptable range: 5.5% at the 3.0 km² small-district level, 5.0% at the 60.9 km² municipality level, and 1.2% at the 625 km² city level. Furthermore, we investigated the relationship between the satellite data (vegetation index) and in situ tree-measurement data (leaf area index) to develop a simple model that maps tree leaf area directly from the WorldView-2/3 imagery. The estimated total leaf area in the Tokyo urban area in the leaf-on season (633 km²) was twice that of the leaf-off season (319 km²). Our results also showed that the estimated total leaf area in the Tokyo urban area was 1.9–6.2 times higher than results based on moderate-resolution (30 m) satellite imagery. Full article
(This article belongs to the Section Urban Remote Sensing)
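The mapping chain reduces to three steps: compute a vegetation index per 2 m pixel, classify tree pixels, and convert the index to leaf area with a fitted model. The sketch below is a minimal illustration using synthetic reflectance tiles, a single NDVI threshold, and placeholder linear LAI-NDVI coefficients; the paper's seasonal models and multi-class scheme (trees, grass, shadowed vegetation) are not reproduced.

```python
import numpy as np

# Hypothetical red and near-infrared reflectance tiles (2 m pixels); in the
# paper these would come from atmospherically corrected WorldView-2/3 bands.
rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.30, size=(500, 500))
nir = rng.uniform(0.05, 0.60, size=(500, 500))

ndvi = (nir - red) / (nir + red + 1e-9)

# Pixel-based classification: a simple NDVI threshold stands in for the
# paper's fuller classification scheme.
tree_mask = ndvi > 0.5

# Illustrative linear LAI-NDVI model; the paper fits its own seasonal model
# from field-measured LAI, so these coefficients are placeholders.
lai = np.where(tree_mask, 6.0 * ndvi - 1.5, 0.0).clip(min=0.0)

pixel_area_m2 = 2.0 * 2.0  # 2 m resolution
total_leaf_area_km2 = (lai * pixel_area_m2).sum() / 1e6
canopy_cover_pct = 100.0 * tree_mask.mean()
print(f"tree canopy cover: {canopy_cover_pct:.1f}%")
print(f"total leaf area:   {total_leaf_area_km2:.3f} km^2")
```

Summing LAI times pixel area over the scene is what turns a per-pixel index map into the city-scale leaf area totals reported in the abstract.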
Figure 1">
Figure 1: (a) Study area superimposed on the population density map of Tokyo according to the 2015 National Population Census. The area enclosed by black lines, with high population density, is our study area of 23 municipalities (Tokyo special wards). (b) Location and name of each municipality (special ward). The maps were generated using MATLAB R2018b.
Figure 2: WorldView-2/3 imagery used in this study for (a) the leaf-on season (upper panels) and (b) the leaf-off season (lower panels). The yellow areas in (a) and (b) mark the sites where leaf area index (LAI; m² m⁻²) field measurements were conducted in each season (A1-A3 and B1-B3); the green plots show the LAI field measurement points. Aerial photograph inspection (A2 and B2) and field investigation (A3 and B3) were conducted to assess the accuracy of the tree mapping results from the WorldView-2/3 imagery. Satellite imagery from DigitalGlobe Products. (a, A1, b, and B1-B3) WorldView-2 © 2020 DigitalGlobe, Inc., a Maxar company. (a, A2, and B3) WorldView-3 © 2020 DigitalGlobe, Inc., a Maxar company.
Figure 3: Flowchart of the pixel-based image classification procedure for tree canopy cover extraction.
Figure 4: Mapping results of vegetation in the test area shown in Figure 2 (A2 and B2) for the leaf-on season (upper panels: a1-a3) and leaf-off season (lower panels: b1-b3). The areas enclosed by yellow lines are where the comparisons were performed. (a1) Aerial photograph taken in May 2012 (leaf-on season). (a2) Tree canopy cover (purple) and grass (green) based on the Tokyo Green and Water Coverage ratio data (Tokyo GWC-ratio data) derived from aerial photograph interpretation. (b1) Aerial photograph taken in March 2013 (leaf-off season). (b2) Manually delineated tree canopy cover (red) and grass (green). (a3, b3) Tree canopy cover (red), grass (dense = green; sparse = light blue), and vegetation in shadow (blue) from the WorldView-2/3 imagery. Satellite imagery from DigitalGlobe Products. (a2, a3) WorldView-3 © 2020 DigitalGlobe, Inc., a Maxar company. (b3) WorldView-2 © 2020 DigitalGlobe, Inc., a Maxar company. (a1, b1, b2) Aerial photograph © NTT GEOSPACE CORPORATION.
Figure 5: Mapping results of vegetation in the test area shown in Figure 2 (A3 and B3). (a1) Map of tree canopy (purple) and grass (green) based on the Tokyo GWC-ratio data. (b1) Vegetation cover manually delineated from field observations. (a2, b2) Vegetation cover estimated from the WorldView-2/3 imagery. The areas enclosed by yellow lines show where the comparisons were performed. Satellite imagery from DigitalGlobe Products. (a1, a2) WorldView-3 © 2020 DigitalGlobe, Inc., a Maxar company. (b1, b2) WorldView-2 © 2020 DigitalGlobe, Inc., a Maxar company.
Figure 6: Image classification performance for tree canopy cover estimation. The plots show the ratio of tree canopy cover (%) estimated for various areas (two test areas, each of the 23 municipalities, and all Tokyo special wards) compared with the respective verification data derived from the field survey and aerial photographs.
Figure 7: Performance of the LAI-NDVI (Normalized Difference Vegetation Index) approximate model developed for both the leaf-on and leaf-off seasons. (a1, b1) Scatter plots of NDVI vs. observed LAI (m² m⁻²); the line shows the LAI-NDVI approximate model derived from the maximum NDVI and observed LAI data. (a2, b2) Estimated LAI vs. observed LAI, with the y = 1.00x line fitted. (c) Box plots comparing observed and estimated LAI for each species.
Figure 8: Example mapping results of tree canopy leaf area in the test area shown in Figure 2 (A1 and B1). (a) Vegetation cover according to the Tokyo GWC-ratio data based on aerial photographs. Estimated maps of tree LAI (m² m⁻²) for the leaf-on season (b) and leaf-off season (c). Satellite imagery from DigitalGlobe Products. (a-c) WorldView-2 © 2020 DigitalGlobe, Inc., a Maxar company.
16 pages, 3501 KiB  
Article
A Novel Method Based on Backscattering for Discriminating Summer Blooms of the Raphidophyte (Chattonella spp.) and the Diatom (Skeletonema spp.) Using MODIS Images in Ariake Sea, Japan
by Chi Feng, Joji Ishizaka, Katsuya Saitoh, Takayuki Mine and Hirokazu Yamashita
Remote Sens. 2020, 12(9), 1504; https://doi.org/10.3390/rs12091504 - 8 May 2020
Cited by 12 | Viewed by 3629
Abstract
The raphidophyte Chattonella spp. and the diatom Skeletonema spp. are the dominant harmful algal species of summer blooms in Ariake Sea, Japan. A new bio-optical algorithm based on backscattering features has been developed to differentiate harmful raphidophyte blooms from diatom blooms using MODIS imagery. Bloom waters were first discriminated from other water types based on the distinct spectral shape of the remote sensing reflectance, R_rs(λ), derived from MODIS; specifically, bloom waters were identified by a positive value of the spectral shape SS(645), which arises from the R_rs(λ) shoulder at 645 nm in bloom waters. Next, the higher cell-specific backscattering coefficient of the raphidophyte Chattonella spp., estimated from MODIS data with the quasi-analytical algorithm (QAA), was used to discriminate it from blooms of the diatom Skeletonema spp. A new index, b_bp-index(555), was calculated from a semi-analytical bio-optical model to discriminate the two algal groups. This index, combined with a supplemental Red Band Ratio (RBR) index, effectively differentiated the two bloom types. The method was validated using MODIS satellite data coincident with confirmed bloom observations from local field surveys, which showed that the newly developed backscattering-based method successfully discriminates the raphidophyte Chattonella spp. from the diatom Skeletonema spp. and thus provides reliable information on the spatial distribution of harmful blooms in Ariake Sea. Full article
(This article belongs to the Special Issue Coastal Waters Monitoring Using Remote Sensing Technology)
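The first stage of the method, flagging bloom water by the sign of the spectral shape at 645 nm, can be sketched directly: SS(645) measures the departure of R_rs(645) from the straight line drawn between two neighbouring bands. In the sketch below, the band pair (555, 667 nm) and the example spectra are assumptions for illustration, and the second stage (splitting raphidophyte from diatom blooms via b_bp-index(555) and RBR) is not reproduced.

```python
import numpy as np

def spectral_shape(rrs, wl, lo, hi):
    """Spectral shape SS(wl): departure of Rrs(wl) from the straight line
    between two neighbouring bands (lo, hi). A positive SS(645) flags the
    reflectance shoulder that marks bloom water in this method."""
    chord = rrs[lo] + (rrs[hi] - rrs[lo]) * (wl - lo) / (hi - lo)
    return rrs[wl] - chord

# Hypothetical Rrs spectra (sr^-1) for one bloom and one non-bloom pixel;
# real values would come from MODIS level-2 R_rs at the matching bands.
bloom     = {555: 0.0120, 645: 0.0095, 667: 0.0060}
non_bloom = {555: 0.0080, 645: 0.0045, 667: 0.0038}

for name, px in (("bloom", bloom), ("non-bloom", non_bloom)):
    ss = spectral_shape(px, 645, 555, 667)
    print(f"{name}: SS(645) = {ss:+.5f} -> {'bloom' if ss > 0 else 'other'}")
```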
Figure 1: Map of Ariake Sea, Japan. Solid points show the field sampling locations in 2011-2014. The gray circle and square indicate the positions of data taken by the Saga Fisheries Promotion Center and the Kumamoto Fisheries Research Center, respectively.
Figure 2: MODIS R_rs(λ) spectra (N = 35) from the match-up method (see Section 2.3). Blue dashed line: clear water; green solid line: mixed waters; gray dotted line: turbid water; red solid and dashed lines: raphidophyte and diatom bloom waters, respectively. The gray lines indicate the locations of the MODIS bands.
Figure 3: Scatter plot of MODIS R_rs(555) and SS(645) generated from the MODIS match-up pairs.
Figure 4: (a-c) Standard Chl a images from MODIS level-2 for Ariake Sea: non-bloom (a), raphidophyte bloom (b), and diatom bloom (c), as confirmed by bloom reports from Japan Red Tide Online. (d-f) Water types derived using MODIS R_rs(λ) and our newly developed method. (g-i) Scatter plots of R_rs(555) and SS(645) derived from the MODIS R_rs(λ) extracted from the scenes shown in (a-c). Only areas with positive R_rs(λ) are shown.
Figure 5: Inherent optical properties of diatom and raphidophyte bloom waters: (a) Chl a-specific absorption of bloom water, a_bloom(λ), normalized to in situ Chl a concentration; (b) cell-specific backscattering of suspended particles, b_bp(λ), normalized to cellular abundance. The a_bloom(λ) and b_bp(λ) were derived from the MODIS R_rs(λ) match-up results using QAA V6 (N = 4 for raphidophyte blooms; N = 8 for diatom blooms). The solid and dotted lines represent raphidophyte and diatom blooms, respectively.
Figure 6: (a) Scatter plot of in situ Chl a and a_bloom(443) for the raphidophyte and diatom blooms; (b) scatter plot of cellular abundance and b_bp(555). Triangles and squares depict raphidophyte (N = 4) and diatom (N = 8) blooms, respectively. Linear regression lines were fitted to log-transformed data. The data are from the MODIS match-up pairs for 2011-2014; a_bloom(443) and b_bp(555) were derived from MODIS R_rs(λ) by QAA V6.
Figure 7: (a) Scatter plot of RBR and b_bp-index(555) using MODIS R_rs(λ) coincident with field data for 2011-2014. (b) Scatter plot of RBR and b_bp-index(555) derived from the points flagged as bloom in Figure 4b,c. The solid line in (b) represents the function expressed by Equation (12), separating raphidophyte blooms from diatom blooms.
Figure 8: (a-d) MODIS Chl a images from standard level-2 products showing bloom distributions confirmed by reports from Japan Red Tide Online. Only pixels positively flagged as bloom waters are shown in color. The red circles and squares indicate the locations of raphidophyte and diatom blooms, respectively. (e) Scatter plot of b_bp-index(555) and RBR derived from the bloom pixels in (a-d), indicating distinct algal groups; red triangles and squares indicate raphidophyte and diatom blooms, respectively.
Figure 9: Multispectral method for identifying raphidophyte blooms, diatom blooms, and non-bloom waters.