Search Results (5,725)

Search Parameters:
Keywords = temporal resolution

8 pages, 4150 KiB  
Article
Whole-Head Noninvasive Brain Signal Measurement System with High Temporal and Spatial Resolution Using Static Magnetic Field Bias to the Brain
by Osamu Hiwaki
Bioengineering 2024, 11(9), 917; https://doi.org/10.3390/bioengineering11090917 - 13 Sep 2024
Abstract
Noninvasive brain signal measurement techniques are crucial for understanding human brain function and brain–machine interface applications. Conventionally, noninvasive brain signal measurement techniques, such as electroencephalography, magnetoencephalography, functional magnetic resonance imaging, and near-infrared spectroscopy, have been developed. However, currently, there is no practical noninvasive technique to measure brain function with high temporal and spatial resolution using one instrument. We developed a novel noninvasive brain signal measurement technique with high temporal and spatial resolution by biasing a static magnetic field emitted from a coil on the head to the brain. In this study, we applied this technique to develop a groundbreaking system for noninvasive whole-head brain function measurement with high spatiotemporal resolution across the entire head. We validated this system by measuring movement-related brain signals evoked by a right index finger extension movement and demonstrated that the proposed system can measure the dynamic activity of brain regions involved in finger movement with high spatiotemporal accuracy over the whole brain.
(This article belongs to the Special Issue Neuroimaging Techniques for Wearable Devices in Bioengineering)
Figures

Figure 1. Measurement of neural signals in the cerebral cortex using a static magnetic field. (1) A static magnetic field generated by a coil on the scalp passes through the region of the cerebral cortex below the coil. (2) The static magnetic field fluctuates according to neural electromagnetic activity in the cerebral cortex through which it passes. (3) A magnetic sensor at the top of the coil measures neural activity in the cerebral cortex as a fluctuation in the magnetic field.
Figure 2. (a) Noninvasive whole-head system for measuring brain signals under static magnetic field bias. A total of 159 pairs of magnetic sensors and coils were placed on the scalp with a neoprene cap. (b) Each magnetic sensor was connected to a coil using a plastic nut.
Figure 3. Grand averages of movement-related signals evoked by extension of the right index finger, measured in three participants using the developed whole-head MBP system. Signals for all 159 channels across the scalp surface are presented as if looking down on the scalp surface with the nose (anterior) at the top.
Figure 4. (Left) Movement-related signals of channels (A) and (B) in Figure 3. Gray lines indicate the standard deviation at each sampling time. (Right) Topographies of signal distributions across the scalp at selected time points: −1500 ms, −1000 ms, 30 ms, and 140 ms. Maps are presented as if looking down on the scalp surface with the nose (anterior) at the top. Positive and negative signals are shown in red and blue, respectively.
16 pages, 13238 KiB  
Article
Transfer of Periodic Phenomena in Multiphase Capillary Flows to a Quasi-Stationary Observation Using U-Net
by Bastian Oldach, Philipp Wintermeyer and Norbert Kockmann
Computers 2024, 13(9), 230; https://doi.org/10.3390/computers13090230 - 13 Sep 2024
Viewed by 77
Abstract
Miniaturization promotes the efficiency and exploration domain in scientific fields such as computer science, engineering, medicine, and biotechnology. In particular, the field of microfluidics is a flourishing technology, which deals with the manipulation of small volumes of liquid. Dispersed droplets or bubbles in a second immiscible liquid are of great interest for screening applications or chemical and biochemical reactions. However, since very small dimensions are characterized by phenomena that differ from those at macroscopic scales, a deep understanding of physics is crucial for effective device design. Due to the small volumes in miniaturized systems, common measurement techniques are not applicable, as they exceed the dimensions of the device many times over. Hence, image analysis is commonly chosen as a method to understand ongoing phenomena. Artificial Intelligence is now the state of the art for recognizing patterns in images or analyzing datasets that are too large for humans to handle. X-ray-based computed tomography adds a third dimension to images, which results in more information, but ultimately, also in more complex image analysis. In this work, we present the application of the U-Net neural network to extract certain states during droplet formation in a capillary, which forms a constantly repeated process that is captured on tens of thousands of CT images. The experimental setup features a co-flow arrangement based on 3D-printed capillaries with two different cross-sections, each with an inner diameter or edge length of 1.6 mm. For droplet formation, water was dispersed in silicone oil. The classification into different droplet states allows for 3D reconstruction and a time-resolved 3D analysis of the present phenomena. The original U-Net was modified to process input images of a size of 688 × 432 pixels, while the encoder and decoder paths together feature 23 convolutional layers. The U-Net consists of four max pooling layers and four upsampling layers. The training was performed on 90% and validated on 10% of a dataset containing 492 images showing different states of droplet formation. A mean Intersection over Union of 0.732 was achieved after training for 50 epochs, which is considered a good performance. The presented U-Net needs 120 ms per image to process 60,000 images to categorize emerging droplets into 24 states at 905 angles. Once the model is trained sufficiently, it provides accurate segmentation for various flow conditions. The selected images are used for 3D reconstruction, enabling the 2D and 3D quantification of emerging droplets in capillaries that feature circular and square cross-sections. By applying this method, a temporal resolution of 25–40 ms was achieved. Droplets emerging in capillaries with a square cross-section become bigger under the same flow conditions in comparison to capillaries with a circular cross-section. The presented methodology is promising for other periodic phenomena in different scientific disciplines that focus on imaging techniques.
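For readers who want to experiment with a comparable network, the following is a minimal sketch of a U-Net matching the dimensions quoted in the abstract: a 688 × 432 input, four max-pooling and four upsampling stages, and 23 convolutional layers in total. The filter widths, number of output classes, and training configuration are illustrative assumptions, not details taken from the paper.

```python
# A minimal U-Net sketch (assumed hyperparameters; not the authors' exact model).
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU, as in the classic U-Net blocks.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(432, 688, 1), n_classes=2, base_filters=16):
    inputs = layers.Input(input_shape)
    skips, x = [], inputs
    # Encoder: 4 blocks of (2 convs + max pooling) -> 8 convolutions.
    for i in range(4):
        x = conv_block(x, base_filters * 2 ** i)
        skips.append(x)
        x = layers.MaxPooling2D(2)(x)
    # Bottleneck: 2 convolutions.
    x = conv_block(x, base_filters * 16)
    # Decoder: 4 blocks of (2x2 up-convolution + skip concat + 2 convs) -> 12 convolutions.
    for i in reversed(range(4)):
        x = layers.UpSampling2D(2)(x)
        x = layers.Conv2D(base_filters * 2 ** i, 2, padding="same", activation="relu")(x)
        x = layers.concatenate([x, skips[i]])
        x = conv_block(x, base_filters * 2 ** i)
    # Final 1x1 convolution -> 23 convolutional layers in total.
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(x)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

The 688 × 432 input is divisible by 16, so the four pooling/upsampling stages round-trip cleanly and the skip connections align; a per-class IoU can be computed after an argmax over the softmax output.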
Figures

Figure 1. Principle sketch of how a periodic process like repeated slug formation can be classified according to the state of droplet formation. The left side shows a regular slug flow, which is typical for multiphase flows in capillaries, and the droplet formation mechanism with the steps from l1 to l5. On the right side, the repeated slug flow formation is temporally resolved for the steps l1 to l5 to obtain a series of stationary states that enable 3D analysis.
Figure 2. (a) The µ-CT used for the experiments and its surrounding peripherals. (b) A close-up view of the specimen chamber with an installed capillary under investigation.
Figure 3. (a) The U-Net architecture used for this work, with an input image and the classification of the image as the output. (b) An example CT image of an emerging droplet, the human-labeled ground truth, and the output of the U-Net trained for 50 epochs. (c) A sketch of the ARM. The datasets are fed to the U-Net and are classified according to the defined states. Projection images are acquired at each angular position from 0° to 227° in 0.25° increments. The U-Net is applied to select one image for each angular position that captures a desired droplet state. The selected projection images are then used to reconstruct a 3D volume for image analysis.
Figure 4. The training accuracy (dotted gray line) and validation accuracy (solid black line) for 50 epochs can be tracked on the left Y-axis. The corresponding IoU (red markers) is given for 1, 5, 10, 20, 30, and 50 epochs of training.
Figure 5. (a) The 2D droplet contours tracked over the 24 steps provided by the U-Net classification, obtained by plotting droplet radii over the droplet length. The diagram emphasizes the droplet evolution starting at the filling stage (dotted light-gray lines), over the necking stage (dashed dark-gray line), until the droplet detaches (solid black line), in the circular capillary (top) with an inner diameter d_c,i of 1.6 mm and in the square capillary (bottom) with d_h = 1.6 mm. (b) The reconstructed 3D representation of the different droplet states in the circular (top) and square (bottom) capillary for the filling stage (left), necking stage (middle), and the detached droplet (right) for a constant Weber number We.
Figure A1. Comparison of the input image, the ground truth, and the U-Net output for 1, 5, 10, 20, 30, and 50 epochs of training.
23 pages, 4848 KiB  
Article
Summer Chukchi Sea Near-Surface Salinity Variability in Satellite Observations and Ocean Models
by Semyon A. Grodsky, Nicolas Reul and Douglas Vandemark
Remote Sens. 2024, 16(18), 3397; https://doi.org/10.3390/rs16183397 - 12 Sep 2024
Viewed by 209
Abstract
The Chukchi Sea is an open estuary in the southwestern Arctic. Its near-surface salinities are higher than those of the surrounding open Arctic waters due to the key inflow of saltier and warmer Pacific waters through the Bering Strait. This salinity distribution may suggest that interannual changes in the Bering Strait mass transport are the sole and dominant factor shaping the salinity distribution in the downstream Chukchi Sea. Using satellite sea surface salinity (SSS) retrievals and altimetry-based estimates of the Bering Strait transport, the relationship between the Strait transport and Chukchi Sea SSS distributions is analyzed from 2010 onward, focusing on the ice-free summer to fall period. A comparison of five different satellite SSS products shows that anomalous SSS spatially averaged over the Chukchi Sea during the ice-free period is consistent among them. Observed interannual temporal change in satellite SSS is confirmed by comparison with collocated ship-based thermosalinograph transect datasets. Bering Strait transport variability is known to be driven by the local meridional wind stress and by the Pacific-to-Arctic sea level gradient (pressure head). This pressure head, in turn, is related to an Arctic Oscillation-like atmospheric mean sea level pattern over the high-latitude Arctic, which governs anomalous zonal winds over the Chukchi Sea and affects its sea level through Ekman dynamics. Satellite SSS anomalies averaged over the Chukchi Sea show a positive correlation with preceding months’ Strait transport anomalies. This correlation is confirmed using two longer (>40-year), separate ocean data assimilation models with either higher (0.1°) or lower (0.25°) spatial resolution. The relationship between the Strait transport and Chukchi Sea SSS anomalies is generally stronger in the low-resolution model. The area of SSS response correlated with the Strait transport is located along the northern coast of the Chukotka Peninsula in the Siberian Coastal Current and adjacent zones. The correlation between wind patterns governing Bering Strait variability and Siberian Coastal Current variability is driven by coastal sea level adjustments to changing winds, in turn driving the Strait transport. Due to the Chukotka coastline configuration, both zonal and meridional wind components contribute.
(This article belongs to the Special Issue Application of Remote Sensing in Coastline Monitoring)
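The Figure 4 caption below quotes a no-intercept regression, SSSA = 0.11 · vA, with a 95% confidence interval on the coefficient. The short sketch below shows one way such a slope and interval could be computed; the arrays are synthetic stand-ins, not the study's data, and the estimator choice is an assumption.

```python
# Regression through the origin with a 95% CI on the slope (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
vA = rng.normal(0.0, 1.0, 14)                 # stand-in Strait velocity anomalies
sssa = 0.11 * vA + rng.normal(0.0, 0.2, 14)   # stand-in Chukchi Sea SSS anomalies

b = np.sum(vA * sssa) / np.sum(vA ** 2)       # least-squares slope, SSSA = b * vA
resid = sssa - b * vA
dof = len(vA) - 1
se_b = np.sqrt(np.sum(resid ** 2) / dof / np.sum(vA ** 2))
t = stats.t.ppf(0.975, dof)
ci = (b - t * se_b, b + t * se_b)
r2 = 1.0 - np.sum(resid ** 2) / np.sum((sssa - sssa.mean()) ** 2)  # explained variance
print(f"slope={b:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f}), explained variance={r2:.2f}")
```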
Figures

Figure 1. September climatological SSS and standard deviation of monthly SSS from (a,b) satellite SMAP data (2015–2024), (c,d) the high-resolution RARE1 (1980–2021) ocean reanalysis, and (e,f) the low-resolution SODA (1980–2015) ocean reanalysis. The Anadyr Current (1), Alaska Coastal Current (2), and Siberian Coastal Current (3) are sketched in (e). The Chukchi Sea (180–200°E, 66–73°N) and Bering Strait (190–192.5°E, 65–66.5°N) box areas are shown in (c).
Figure 2. September–October SSS anomaly (SSSA) from (a) four different satellite salinity products averaged over the Chukchi Sea box and (b) TSG transects averaged over the (65.5°–70°N, 170°–167°W) domain. Bars for different satellite products are shifted in (a) to avoid overlapping. X-ticks correspond to January. See Section 2 for a description of the satellite datasets. For each satellite dataset, the seasonal cycle of SSS is calculated based on the SMAP period since 2015.
Figure 3. September (a–i) SMAP SSS and (j–r) ancillary SST from the Canada Meteorological Center (CMC) included in SMAP version 6.0. (s–zz) May–August de-trended multi-satellite sea level anomaly (SSHA).
Figure 4. September SSS anomaly (SSSA) averaged over the Chukchi Sea box versus the May–August along-channel geostrophic velocity component (vA) averaged over the Bering Strait box from the AVISO all-satellite altimeter analysis. Each symbol represents the mean of up to five satellite datasets shown in Figure 2a, while vertical bars represent their STD. See Figure 1c for the locations of the two boxes. Symbol colors correspond to years. The linear regression (solid), SSSA = 0.11 · vA, with a 95% confidence interval of 0.03–0.19 on the regression coefficient, explains ~40% of the SSSA variance.
Figure 5. (a) Spatial and (b) temporal parts of the leading EOF of May–August monthly de-trended SSH anomalies (SSHA) from satellite altimetry (1993–2023, ~53% of explained variance). The EOF is computed for grid points with at least half of the data ice-free. (c) SSHA averaged over the Chukchi Sea box (black in (a), the same as in Figure 1c).
Figure 6. Time regression of May–August anomalies of (a) Chukchi Sea SSHA and (b) Bering Strait northward wind (V10A) with the atmospheric mean sea level pressure anomaly (MSLPA) elsewhere. (c,d) Time series of May–August mean SSHA and V10A. Values in (a,b) show the MSLPA (mbar) corresponding to one STD of SSHA and V10A, respectively. Panel (c) is the same as in Figure 5c.
Figure 7. Monthly (May–August) Bering Strait geostrophic velocity anomaly (vA) from satellite altimetry vs. (a) Bering Strait meridional wind anomaly (V10A) and (c) Chukchi Sea surface height anomaly (SSHA). Velocity anomaly components (b) due to wind (vAw) and (d) due to Chukchi Sea SSHA (vAh), calculated by subtracting the signals linearly correlated with SSHA and V10A, respectively.
Figure 8. Chukchi Sea SSS anomaly temporally regressed on Bering Strait salinity transport anomaly components due to (a) the pressure head (v'_h · ΔS), (b) meridional winds over the Strait (v'_w · ΔS), and (c) salinity variations in the Strait (v · S'). Magnitudes correspond to one standard deviation of the respective Bering Strait forcing factor. Inlays show vertical profiles of the salinity response in the red (Chukotka) and blue (eastern Chukchi Sea) boxes shown in (c). Eastern box vertical profiles are not shown in (a,b) due to negligible response magnitudes. Data are from the 1980–2021 RARE1 ocean reanalysis. Points with fewer than 20 examples of September ice-free data are blanked. The geographic grid is drawn with 10° and 5° intervals in longitude and latitude, respectively, starting from 170°E, 60°N.
Figure 9. The same as Figure 8 but for the lower-resolution SODA3 reanalysis.
Figure A1. (a) Seasonal cycle of the total (v) and geostrophic (vg) velocity at the surface averaged across the Bering Strait along 65.75°N. (b) Scatter diagram of the Bering Strait volume transport anomaly with total (vA) and geostrophic (vgA) monthly anomalies. Because both velocity anomalies are linearly correlated with Bering Strait (BS) transport anomalies, they are also mutually correlated, vgA = 0.63 · vA.
Figure A2. Regression of the (a) winter (November–February) and (b) summer (May–August) AO index with ERA5 monthly MSLPA. MSLP values correspond to one standard deviation of the AO index during winter and summer months. Note the difference in color scale limits between (a,b). (c) Scatter diagram of the summer sea level anomaly averaged over the Chukchi Sea box (Figure 1c) and the AO index.
Figure A3. Time regression of May–August Bering Strait salinity flux anomaly components, (a) v'ΔS and (b) vS', on concurrent de-trended SSH anomalies. Magnitudes correspond to one standard deviation of the respective salinity flux component.
Figure A4. The same as Figure 8 but for the ORAS5 reanalysis based on the NEMO ocean model.
Figure A5. Time regression of the May–August Bering Strait salinity flux anomaly component, vS', on the concurrent atmospheric mean sea level pressure anomaly. Magnitude corresponds to one STD of vS'.
29 pages, 6780 KiB  
Article
Phenological and Biophysical Mediterranean Orchard Assessment Using Ground-Based Methods and Sentinel 2 Data
by Pierre Rouault, Dominique Courault, Guillaume Pouget, Fabrice Flamain, Papa-Khaly Diop, Véronique Desfonds, Claude Doussan, André Chanzy, Marta Debolini, Matthew McCabe and Raul Lopez-Lozano
Remote Sens. 2024, 16(18), 3393; https://doi.org/10.3390/rs16183393 - 12 Sep 2024
Viewed by 262
Abstract
A range of remote sensing platforms provide high spatial and temporal resolution insights which are useful for monitoring vegetation growth. Very few studies have focused on fruit orchards, largely due to the inherent complexity of their structure. Fruit trees are mixed with inter-rows that can be grassed or non-grassed, and there are no standard protocols for ground measurements suitable for the range of crops. The assessment of biophysical variables (BVs) for fruit orchards from optical satellites remains a significant challenge. The objectives of this study are as follows: (1) to address the challenges of extracting and better interpreting biophysical variables from optical data by proposing new ground measurements protocols tailored to various orchards with differing inter-row management practices, (2) to quantify the impact of the inter-row at the Sentinel pixel scale, and (3) to evaluate the potential of Sentinel 2 data on BVs for orchard development monitoring and the detection of key phenological stages, such as the flowering and fruit set stages. Several orchards in two pedo-climatic zones in southeast France were monitored for three years: four apricot and nectarine orchards under different management systems and nine cherry orchards with differing tree densities and inter-row surfaces. We provide the first comparison of three established ground-based methods of assessing BVs in orchards: (1) hemispherical photographs, (2) a ceptometer, and (3) the Viticanopy smartphone app. The major phenological stages, from budburst to fruit growth, were also determined by in situ annotations on the same fields monitored using Viticanopy. In parallel, Sentinel 2 images from the two study sites were processed using a Biophysical Variable Neural Network (BVNET) model to extract the main BVs, including the leaf area index (LAI), fraction of absorbed photosynthetically active radiation (FAPAR), and fraction of green vegetation cover (FCOVER). The temporal dynamics of the normalised FAPAR were analysed, enabling the detection of the fruit set stage. A new aggregative model was applied to data from hemispherical photographs taken under trees and within inter-rows, enabling us to quantify the impact of the inter-row at the Sentinel 2 pixel scale. The resulting value compared to BVs computed from Sentinel 2 gave statistically significant correlations (0.57 for FCOVER and 0.45 for FAPAR, with respective RMSE values of 0.12 and 0.11). Viticanopy appears promising for assessing the PAI (plant area index) and FCOVER for orchards with grassed inter-rows, showing significant correlations with the Sentinel 2 LAI (R2 of 0.72, RMSE 0.41) and FCOVER (R2 0.66 and RMSE 0.08). Overall, our results suggest that Sentinel 2 imagery can support orchard monitoring via indicators of development and inter-row management, offering data that are useful to quantify production and enhance resource management.
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
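The Figure 7 and Figure 8 captions below express the pixel-scale cover as a tree component plus an inter-row grass component. A plausible reading of that split is FCOVER_c = FCOVER_t + (1 − FCOVER_t) · FCOVER_g; the exact form of the paper's Equation (1) is not reproduced here, and the function below is only an illustrative sketch under that assumption.

```python
# Sketch of a tree + inter-row aggregation of cover fractions (assumed form).
def aggregate_fcover(fcover_tree: float, fcover_grass: float) -> dict:
    """Combine tree cover and inter-row grass cover into a pixel-scale FCOVER."""
    fcover_pixel = fcover_tree + (1.0 - fcover_tree) * fcover_grass
    return {
        "fcover_pixel": fcover_pixel,
        "tree_share_percent": 100.0 * fcover_tree / fcover_pixel,
        "interrow_share_percent": 100.0 * (1.0 - fcover_tree) * fcover_grass / fcover_pixel,
    }

# Example: 40% tree cover over a grassed inter-row with 60% green cover.
print(aggregate_fcover(0.40, 0.60))
```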
Figures

Figure 1. Schematic of the three approaches used to monitor orchard development at different spatial scales throughout the year (from tree level for phenological observations to watershed level using Sentinel 2 data).
Figure 2. (a) Locations of the monitored orchards in the Ouvèze–Ventoux watershed (green points at right) and in the La Crau area (yellow points at left); (b) pictures of two cherry orchards (13 September and 22 July 2022): top, a non-grassed orchard drip-irrigated by two rows of drippers, and bottom, a grassed orchard drip-irrigated in summer; (c) pictures of two orchards in La Crau (top, a nectarine tree in spring, 22 March 2023, and bottom, in summer, 26 June 2022).
Figure 3. (a) Main steps in processing the hemispherical photographs. (b) The three methods of data acquisition around the central tree. (c) Protocol used with hemispherical photographs. (d) Protocol used with the Viticanopy application, with three trees monitored in the four directions (blue arrows). (e) Protocols used with the ceptometer: P1 measured in the shadow of the trees (blue) and P2 in the inter-rows (black).
Figure 4. Protocol for the monitoring of the phenological stages of cherry trees. (a) Phenology of cherry trees according to BBCH; (b) at plot scale, three trees (in red) monitored by observations (BBCH scale) in an orchard; (c) at tree scale, two locations selected to classify the flowering stage in the tree; and (d) flowering stage of a cherry tree in April 2022.
Figure 5. Comparison of temporal profiles of the interpolated Sentinel 2 LAI (black line) and the PAI obtained from the ceptometer (blue line, P2 protocol) and Viticanopy (green line) for three orchards: (a) 3099 (cherry, grassed, Ouvèze), (b) 183 (cherry, non-grassed, Ouvèze), and (c) 4 (nectarine, La Crau) at the beginning of 2023.
Figure 6. Comparison between Sentinel 2 LAI and PAI from (a) ceptometer measurements taken at all orchards of the two areas (La Crau and Ouvèze), (b) Viticanopy measurements at all orchards, and (c) Viticanopy measurements excluding two non-grassed orchards (183, 259). The black line represents the optimal 1:1 correlation; the red line represents the results of the linear regression.
Figure 7. (a) (Top graphs) Proportion of the tree component (orange, 100·FCOVER_t/FCOVER_c, see Equation (1)) and of the inter-row component (green, 100·((1−FCOVER_t)·FCOVER_g)/FCOVER_c) computed from hemispherical photographs used to estimate FCOVER for two dates, 22 March 2022 (DOY 81) and 21 June 2022 (DOY 172), for all the monitored fields. (b) (Bottom graphs) For two plots (left, field 183.2; right, field 3099.1), temporal variations in the proportion of tree and inter-row components for the different observation dates in 2022.
Figure 8. (a) Averaged percentage of grass contribution to FAPAR computed from hemispherical photographs according to Equation (1) for all grassed orchard plots in 2022. Examples of Sentinel 2 FAPAR dynamics (black lines) for plots at (b) non-grassed site 183 and (c) grassed site 1418. Initial values of FAPAR, as computed from BVNET, are shown in black. The green line represents the adjusted FAPAR after subtracting the grass contribution (percentage obtained from hemispherical photographs); it corresponds to the FAPAR of the trees only. The percentage of grass contribution is in red.
Figure 9. Correlation between (a) FCOVER obtained from hemispherical photographs (from Equation (1)) for all orchards of the two studied areas and FCOVER from Sentinel 2 computed with BVNET; (b) FAPAR from hemispherical photographs and FAPAR from Sentinel 2 for all orchards and the three years; (c) FCOVER from Viticanopy and from Sentinel 2 for all orchards of the two areas, except 183 and 259; (d) FCOVER from upward-aimed hemispherical photographs and from Viticanopy for all plots.
Figure 10. (a) LAI temporal profiles obtained from BVNET applied to Sentinel 2 data averaged at plot and field scales (field 3099) for the year 2022 and (b) soil water stock (in mm, in blue) computed at 0–50 cm using capacitive sensors (described in Section 2.1), with rainfall recorded at the Carpentras station (see Supplementary Part S1 and Table S1).
Figure 11. Time series of FCOVER (mean value at field scale) for the cherry trees in field 3099 in the Ouvèze area from 2016 to 2023.
Figure 12. Sentinel 2 FAPAR evolution in 2022 for two cherry tree fields, with the date of the flowering observation (in green) and the date of the fruit set observation (in red) for (a) plot 183 (non-grassed cherry trees) and (b) plot 3099 (grassed cherry trees).
Figure 13. Variability in the dates of the phenological stages of a cherry tree orchard (plot 3099) observed in 2022.
Figure 14. (a) Normalised FAPAR computed for all observed cherry trees relative to observation dates for BBCH stages in the Ouvèze area in 2021 for five plots. (b) Map of dates distinguishing between flowering and fruit set stages for 2021, obtained by thresholding FAPAR images.
26 pages, 29764 KiB  
Article
Mapping Fruit-Tree Plantation Using Sentinel-1/2 Time Series Images with Multi-Index Entropy Weighting Dynamic Time Warping Method
by Weimeng Xu, Zhenhong Li, Hate Lin, Guowen Shao, Fa Zhao, Han Wang, Jinpeng Cheng, Lei Lei, Riqiang Chen, Shaoyu Han and Hao Yang
Remote Sens. 2024, 16(18), 3390; https://doi.org/10.3390/rs16183390 - 12 Sep 2024
Viewed by 203
Abstract
Plantation distribution information is of great significance to the government’s macro-control, optimization of planting layout, and realization of efficient agricultural production. Existing studies primarily relied on high spatiotemporal resolution remote sensing data to address same-spectrum, different-object classification by extracting phenological information from temporal imagery. However, the classification of orchards or artificial forests, where the spectral and textural features are similar and the phenological characteristics are alike, still presents a substantial challenge. To address this challenge, we innovatively proposed a multi-index entropy weighting DTW method (ETW-DTW), building upon the traditional DTW method with single-feature inputs. In contrast to previous DTW classification approaches, this method introduces multi-band information and utilizes entropy weighting to increase the inter-class distances. This allowed for accurate classification of orchard categories, even in scenarios where the spectral textures were similar and the phenology was alike. We also investigated the impact of fusing optical and Synthetic Aperture Radar (SAR) data on the classification accuracy. By combining Sentinel-1 and Sentinel-2 time series imagery, we validated the enhanced classification effectiveness with the inclusion of SAR data. The experimental results demonstrated a noticeable improvement in orchard classification accuracy under conditions of similar spectral characteristics and phenological patterns, providing comprehensive information for orchard mapping. Additionally, we further explored the improvement in results based on two different parcel-based classification strategies compared to pixel-based classification methods. By comparing the classification results, we found that the parcel-based averaging method has advantages in clearly defining orchard boundaries and reducing noise interference. In conclusion, the introduction of the ETW-DTW method is of significant practical importance in addressing the challenge of same-spectrum, different-object classification. The obtained orchard distribution can provide valuable information for the government to optimize the planting structure and layout and regulate the macroeconomic benefits of the fruit industry.
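To make the entropy-weighted, multi-index idea concrete, the sketch below combines per-index DTW distances to each class reference curve with per-class, per-index weights and assigns the label with the smallest weighted distance. A plain DTW stands in for the paper's time-weighted DTW, and all reference curves and weights are illustrative assumptions rather than values from the study.

```python
# Sketch of an entropy-weighted multi-index DTW classifier (assumed weights/curves).
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic dynamic-time-warping distance between two 1-D series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify_pixel(pixel_series: dict, references: dict, weights: dict) -> str:
    """pixel_series: {index: series}; references: {class: {index: series}};
    weights: {class: {index: weight}}, e.g. read from an entropy-weight matrix."""
    scores = {}
    for cls, ref in references.items():
        scores[cls] = sum(weights[cls][idx] * dtw_distance(pixel_series[idx], ref[idx])
                          for idx in pixel_series)
    return min(scores, key=scores.get)

# Tiny illustrative example with two indices and two classes.
t = np.linspace(0, 1, 12)
refs = {"apple": {"NDVI": 0.2 + 0.6 * t, "VVVH": 0.5 - 0.2 * t},
        "jujube": {"NDVI": 0.2 + 0.3 * t, "VVVH": 0.5 + 0.2 * t}}
w = {"apple": {"NDVI": 0.7, "VVVH": 0.3}, "jujube": {"NDVI": 0.4, "VVVH": 0.6}}
pixel = {"NDVI": 0.25 + 0.55 * t, "VVVH": 0.5 - 0.15 * t}
print(classify_pixel(pixel, refs, w))  # expected: "apple"
```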
Figures

Figure 1. (a) Topography of the study area; (b) the location of the study area in Shanxi Province, highlighted by a black triangle; (c) a typical fruit plantation landscape in Miaoshang county via Google Earth (Google Earth, Image © 2020 DigitalGlobe).
Figure 2. Cropping calendars of the major crops in the study area.
Figure 3. Temporal coverage of the Sentinel-1 and Sentinel-2 time series data used in this study. (a) The cloud cover of Sentinel-2 time series scenes of 2019 and 2020; (b) the data collection of Sentinel-1 and Sentinel-2; (c) the number of high-quality observations of Sentinel-2.
Figure 4. The workflow of the ETW-DTW approach for orchard classification.
Figure 5. The workflow of minimum intra-class distance classification. The symbol * denotes the element-wise multiplication of corresponding elements in each matrix.
Figure 6. Example of the HANTS-fitted and the original NDVI time series in 2020. The red points represent the original values of the NDVI temporal profile, and the HANTS-fitted curve is illustrated by the blue line.
Figure 7. The order and change law of index materiality. (a) The correlation of each index in jujube samples; (b) the correlation of each index in persimmon samples; (c) the correlation of each index in apple samples; (d) the correlation of each index in peach samples; (e) the correlation of each index in corn samples; (f) the importance of each index in each period.
Figure 8. Differences in the timing curves of each index or band from November 2019 to April 2020. (a) NDVI timing reference curve for each category; (b) MNDWI timing reference curve for each category; (c) NIR timing reference curve for each category; (d) SWIR timing reference curve for each category; (e) VV/VH timing reference curve for each category. The buffer band is the standard deviation of each timing curve.
Figure 9. Entropy weight matrix. The horizontal axis represents the various indices used in the classification, while the vertical axis represents the categories to be classified. Colors range from purple to yellow, indicating the magnitude of the weights: darker colors correspond to larger weights, and lighter colors to smaller weights. For instance, when classifying with the apple category as the standard curve, the pixels/plots to be classified need to calculate TW-DTW distances with the respective NDVI, NIR, SWIR, MNDWI, and VV/VH temporal curves of apples. The obtained distances are then multiplied by the corresponding weights of each index to derive the final ETW-DTW distance.
Figure 10. Spatial distribution map of various crops at plot scale: (a) distribution map of the ETW-DTW method; (b) results with the NDVI timing curve as input; (c) results with the SWIR timing curve as input; (d) results with the VV/VH timing curve as input; (e) results with the MNDWI timing curve as input; (f) results with the NIR timing curve as input.
Figure 11. Distribution extraction results at pixel scale and plot scale with ETW-DTW. (a) Classification results of pixel-scale ETW-DTW; (b) classification results of P1-based ETW-DTW; (c) classification results of P2-based ETW-DTW.
Figure 12. Comparison of local magnification results at pixel scale and plot scale: the first row (a1–d1) displays the Google imagery of the four areas, the second row (a2–d2) the P1-based classification results, the third row (a3–d3) the pixel-scale classification results, and the fourth row (a4–d4) the P2-based classification results.
Figure 13. Entropy weight matrix of optical indices. The horizontal axis represents the various indices (NDVI, NIR, SWIR, and MNDWI) of optical imagery. The vertical axis represents the categories to be classified (apples, peaches, persimmons, and jujubes). Colors range from purple to yellow, indicating the magnitude of the weights: darker colors correspond to larger weights, and lighter colors to smaller weights.
24 pages, 372 KiB  
Review
How Immersed Are You? State of the Art of the Neurophysiological Characterization of Embodiment in Mixed Reality for Out-of-the-Lab Applications
by Vincenzo Ronca, Alessia Ricci, Rossella Capotorto, Luciano Di Donato, Daniela Freda, Marco Pirozzi, Eduardo Palermo, Luca Mattioli, Giuseppe Di Gironimo, Domenico Coccorese, Sara Buonocore, Francesca Massa, Daniele Germano, Gianluca Di Flumeri, Gianluca Borghini, Fabio Babiloni and Pietro Aricò
Appl. Sci. 2024, 14(18), 8192; https://doi.org/10.3390/app14188192 - 12 Sep 2024
Viewed by 223
Abstract
Mixed Reality (MR) environments hold immense potential for inducing a sense of embodiment, where users feel like their bodies are present within the virtual space. This subjective experience has been traditionally assessed using subjective reports and behavioral measures. However, neurophysiological approaches offer unique advantages in objectively characterizing embodiment. This review article explores the current state of the art in utilizing neurophysiological techniques, particularly Electroencephalography (EEG), Photoplethysmography (PPG), and Electrodermal activity (EDA), to investigate the neural and autonomic correlates of embodiment in MR for out-of-the-lab applications. More specifically, it examines how EEG, with its high temporal resolution, together with PPG and EDA, can capture transient brain activity associated with specific aspects of embodiment, such as visuomotor synchrony, visual feedback of a virtual body, and manipulations of virtual body parts. The potential of such neurophysiological signals to differentiate between subjective experiences of embodiment is discussed, with particular regard to identifying the neural and autonomic markers of early embodiment formation during MR exposure in real settings. Finally, the strengths and limitations of the neurophysiological approach in the context of MR embodiment research are discussed, in order to achieve a more comprehensive understanding of this multifaceted phenomenon.
27 pages, 4362 KiB  
Article
Himawari-8 Sea Surface Temperature Products from the Australian Bureau of Meteorology
by Pallavi Govekar, Christopher Griffin, Owen Embury, Jonathan Mittaz, Helen Mary Beggs and Christopher J. Merchant
Remote Sens. 2024, 16(18), 3381; https://doi.org/10.3390/rs16183381 - 11 Sep 2024
Viewed by 294
Abstract
As a contribution to the Integrated Marine Observing System (IMOS), the Bureau of Meteorology introduces new reprocessed Himawari-8 satellite-derived Sea Surface Temperature (SST) products. A Radiative Transfer Model and a Bayesian cloud clearing method are used to retrieve SSTs every 10 min from the geostationary satellite Himawari-8. An empirical Sensor Specific Error Statistics (SSES) model, introduced herein, is applied to calculate bias and standard deviation for the retrieved SSTs. The SST retrieval and compositing methods, along with validation results, are discussed. The monthly statistics for comparisons of Himawari-8 Level 2 Product (L2P) skin SST against in situ SST Quality Monitor (iQuam) datasets, adjusted for thermal stratification, showed a mean bias of −0.2/−0.1 K and a standard deviation of 0.4–0.7 K for daytime/night-time after bias correction, where satellite zenith angles were less than 60° and the quality level was greater than 2. For ease of use, these native resolution SST data have been composited using a method introduced herein that retains retrieved measurements, to hourly, 4-hourly and daily SST products, and projected onto the rectangular IMOS 0.02 degree grid. On average, 4-hourly products cover ≈10% more of the IMOS domain, while one-night composites cover ≈25% more of the IMOS domain than a typical 1 h composite. All available Himawari-8 data have been reprocessed for the September 2015–December 2022 period. The 10 min temporal resolution of the newly developed Himawari-8 SST data enables a daily composite with enhanced spatial coverage, effectively filling in SST gaps caused by transient cloud occlusion. Anticipated benefits of the new Himawari-8 products include enhanced data quality for applications like IMOS OceanCurrent and investigations into marine thermal stress, marine heatwaves, and ocean upwelling in near-coastal regions.
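The sketch below illustrates the two ingredients the abstract describes, per-pixel SSES bias correction and the compositing of several 10-minute scenes onto one grid, using a simple "highest quality level, then most recent" selection rule. That rule, and the toy arrays, are assumptions for illustration and not the Bureau's exact compositing algorithm.

```python
# Sketch of bias correction + quality-based compositing of 10-minute SST scenes.
import numpy as np

def composite_scenes(sst, sses_bias, quality):
    """sst, sses_bias, quality: arrays of shape (n_scenes, ny, nx), time-ordered.
    Returns a bias-corrected composite (ny, nx) with NaN where nothing is valid."""
    corrected = sst - sses_bias
    valid = np.isfinite(corrected)
    # Rank observations by quality level, breaking ties by scene (time) index.
    scene_idx = np.arange(sst.shape[0])[:, None, None]
    rank = np.where(valid, quality * sst.shape[0] + scene_idx, -np.inf)
    best = np.argmax(rank, axis=0)
    composite = np.take_along_axis(corrected, best[None, ...], axis=0)[0]
    composite[~valid.any(axis=0)] = np.nan
    return composite

# Three toy 10-minute scenes over a 2x2 grid (NaN = cloud-affected pixel).
sst = np.array([[[290.1, np.nan], [291.0, 289.5]],
                [[290.3, 290.8], [np.nan, 289.7]],
                [[np.nan, 290.6], [291.2, 289.6]]])
bias = np.full_like(sst, -0.2)
ql = np.array([[[5, 0], [3, 4]], [[4, 5], [0, 5]], [[0, 4], [5, 3]]])
print(composite_scenes(sst, bias, ql))
```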
Figures

Figure 1. (a) SST, (b) probability of a pixel being clear, (c) sensitivity, and (d) assigned quality levels for one random L2P on 15 December 2020, 20:00:00 UTC, for all quality levels.
Figure 2. (a) sses_bias, (b) sses_standard_deviation, and (c) sses_count for one random L2P on 15 December 2020, 20:00:00 UTC.
Figure 3. Full disk spatial coverage Himawari-8 L2P validation against drifting buoys and tropical moorings, September 2015–December 2022, showing the impact of bias correction on day (top) and night (bottom) retrievals, ΔSST. Left-hand panels show variables before bias correction, and right-hand panels after bias correction has been applied to the SST values. The grey region indicates pixels with no data.
Figure 4. Same as Figure 3, for σΔSST.
Figure 5. Same as Figure 3, for σzSST.
Figure 6. Annual and diurnal performance of full disk Himawari-8 L2P validation against drifting buoys and tropical moorings for September 2015–December 2022 for the (a) northern and (b) southern part of the disk. The colour indicates the mean bias when bias-corrected SST is compared with drifting buoys and tropical moorings.
Figure 7. Same as Figure 6. Here, the colour indicates the standard deviation when bias-corrected SST is compared with drifting buoys and tropical moorings for the (a) northern and (b) southern part of the disk.
Figure 8. Same as Figure 6. Here, the colour indicates z_sses when bias-corrected SST is compared with drifting buoys and tropical moorings for the (a) northern and (b) southern part of the disk.
Figure 9. Full disk Himawari-8 L2P skin SST validation against drifting buoys and tropical moorings, 30-day running statistics, September 2015–December 2022: (a) uncorrected mean, (b) bias-corrected mean, (c) uncorrected standard deviation, and (d) bias-corrected standard deviation, for QL ≥ 3.
Figure 10. Composite SSTs for (a) 1 h, (b) 4 h and (c) 1 night on the IMOS domain for 15 December 2020.
Figure 11. Monthly statistics for validation of 1-hour L3C skin SST against drifting buoys and tropical moorings for September 2015–December 2022 on the IMOS domain: uncorrected (a) mean and (c) standard deviation; bias-corrected (b) mean and (d) standard deviation, for QL ≥ 3. Daytime validations are shown in blue, and night-time validations in orange.
Figure 12. Same as Figure 11, for L3C 4-hour SSTs.
Figure 13. Monthly statistics for 1-day night L3C skin SST validation against drifting buoys and tropical moorings for September 2015–December 2022 on the IMOS domain: (a) mean and (b) standard deviation, for QL ≥ 3. The brown line denotes uncorrected data, whereas the cyan line corresponds to bias-corrected data.
Figure 14. SST data coverage from the (a) MultiSensor and (b) GeoPolar MultiSensor L3S SST products on the IMOS domain for 15 December 2020.
24 pages, 1824 KiB  
Article
Challenges Facing the Use of Remote Sensing Technologies in the Construction Industry: A Review
by Abdulmohsen S. Almohsen
Buildings 2024, 14(9), 2861; https://doi.org/10.3390/buildings14092861 - 10 Sep 2024
Viewed by 433
Abstract
Remote sensing is essential in construction management, providing valuable information and insights throughout the project lifecycle. Due to the rapid advancement of remote sensing technologies, their use has been increasingly adopted in the architecture, engineering, and construction industries. This review paper aims to advance the understanding, knowledge base, and practical implementation of remote sensing technologies in the construction industry. It may help support the development of robust methodologies, address challenges, and pave the way for the effective integration of remote sensing into construction management processes. This paper presents the results of a comprehensive literature review, focusing on the challenges faced in using remote sensing technologies in construction management. One hundred and seventeen papers were collected from eight relevant journals, indexed in Web of Science, and then categorized by challenge type. The results of 44 exemplary studies were reported across the three types of remote sensing platforms (satellite, airborne, and ground-based remote sensing). The paper provides construction professionals with a deeper understanding of remote sensing technologies and their applications in construction management. The challenges of using remote sensing in construction were collected and classified into eleven categories. According to the number of collected documents, the critical challenges were shadow, spatial resolution, and temporal resolution issues. The findings emphasize the use of unmanned airborne systems (UASs) and satellite remote sensing, which have become increasingly common and valuable for tasks such as preconstruction planning, progress tracking, safety monitoring, and environmental management. This knowledge allows for informed decision-making regarding integrating remote sensing into construction projects, leading to more efficient and practical project planning, design, and execution.
Figures

Figure 1. (a) Active remote sensing; (b) passive remote sensing.
Figure 2. Steps of using remote sensing in the construction industry.
Figure 3. Flowchart of the results.
Figure 4. Percentages of the remote sensing platforms utilized in construction applications.
Figure 5. Sensor types of remote sensing.
Figure 6. Types of challenges of utilizing remote sensing in construction management.
Figure 7. Percentage distribution of challenge types.
19 pages, 12898 KiB  
Article
The Reconstruction of FY-4A and FY-4B Cloudless Top-of-Atmosphere Radiation and Full-Coverage Particulate Matter Products Reveals the Influence of Meteorological Factors in Pollution Events
by Zhihao Song, Lin Zhao, Qia Ye, Yuxiang Ren, Ruming Chen and Bin Chen
Remote Sens. 2024, 16(18), 3363; https://doi.org/10.3390/rs16183363 - 10 Sep 2024
Viewed by 214
Abstract
By utilizing top-of-atmosphere radiation (TOAR) data from China’s new generation of geostationary satellites (FY-4A and FY-4B) along with interpretable machine learning models, near-surface particulate matter concentrations in China were estimated, achieving hourly temporal resolution, 4 km spatial resolution, and 100% spatial coverage. First, the cloudless TOAR data were matched and modeled with the solar radiation products from the ERA5 dataset to construct and estimate a fully covered TOAR dataset under assumed clear-sky conditions, which increased coverage from 20–30% to 100%. Subsequently, this dataset was applied to estimate particulate matter. The analysis demonstrated that the fully covered TOAR dataset (R2 = 0.83) performed better than the original cloudless dataset (R2 = 0.76). Additionally, using feature importance scores and SHAP values, the impact of meteorological factors and air mass trajectories on the increase in PM10 and PM2.5 during dust events was investigated. The analysis of haze events indicated that the main meteorological factors driving changes in particulate matter included air pressure, temperature, and boundary layer height. The particulate matter concentration products obtained using fully covered TOAR data exhibit high coverage and high spatiotemporal resolution. Combined with data-driven interpretable machine learning, they can effectively reveal the influencing factors of particulate matter in China.
(This article belongs to the Section Atmospheric Remote Sensing)
Show Figures

Figure 1: Study area. The region covered in this study includes the entire territory of China. The green dots represent air quality monitoring stations.
Figure 2: Performance of the TOAR data estimation model. The dark dotted line represents the error line, the light dotted line represents the 1:1 line, and the solid red line represents the linear regression fitting line.
Figure 3: Full coverage TOAR: particulate matter estimation model based on sample cross-validation results. The dark dotted line represents the error line, the light dotted line represents the 1:1 line, and the solid red line represents the linear regression fitting line.
Figure 4: Full coverage TOAR: particulate matter estimation model based on spatial validation results. The dark dotted line represents the error line, the light dotted line represents the 1:1 line, and the solid red line represents the linear regression fitting line.
Figure 5: The annual average distribution of the particulate matter estimation results.
Figure 6: Spatial distribution of PM10 and PM2.5 concentrations during the development of the dust storm event. (Left) Distribution of PM10 concentrations. (Right) Distribution of PM2.5 concentrations.
Figure 7: Interpretation of the dust transport model for ΔPM10 and ΔPM2.5. The solid red line represents the linear regression fitting line.
Figure 8: SHAP importance scores of variables impacting ΔPM10 and ΔPM2.5 during dust storm processes. The variables shown in the figure include the following: lat: latitude, lon: longitude, height: air mass height, pressures: the air pressure at the height of the air mass, TM: temperatures, SP: surface pressures, WS: wind speeds, RH: relative humidities, BLH: boundary layer heights, SOR: surface solar radiation, LUCC: land use and land cover, HEIGHT: altitude, and RK: population density.
Figure 9: Distribution of PM2.5 and PM10 concentrations during the development of the haze event. Left: Distribution of PM10 concentrations. Right: Distribution of PM2.5 concentrations.
Figure 10: SHAP importance scores of various variables affecting ΔPM10 and ΔPM2.5 during haze weather.
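A minimal Python sketch of the SHAP-based interpretation step described in the abstract above, assuming a random forest stands in for the paper's unspecified model and using synthetic data with predictor names borrowed from the caption of Figure 8; it only illustrates how per-feature SHAP importance scores are obtained, not the authors' actual FY-4A/FY-4B pipeline.

import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-ins for the meteorological predictors (names follow Figure 8)
# and for PM2.5; the real study uses FY-4A/FY-4B TOAR and ERA5 data.
rng = np.random.default_rng(0)
features = ["TOAR", "TM", "SP", "WS", "RH", "BLH", "SOR"]
X = rng.normal(size=(500, len(features)))
y = 30 + 10 * X[:, 0] - 8 * X[:, 5] + 3 * X[:, 3] * X[:, 4] + rng.normal(scale=2, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer yields per-sample SHAP values; their mean absolute value is a
# global importance score comparable to the bar charts in Figures 8 and 10.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(features, importance), key=lambda t: -t[1]):
    print(f"{name}: {score:.2f}")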
20 pages, 13462 KiB  
Article
Extraction of Garlic in the North China Plain Using Multi-Feature Combinations from Active and Passive Time Series Data
by Chuang Peng, Binglong Gao, Wei Wang, Wenji Zhu, Yongqi Chen and Chao Dong
Appl. Sci. 2024, 14(18), 8141; https://doi.org/10.3390/app14188141 - 10 Sep 2024
Viewed by 420
Abstract
Garlic constitutes a significant small-scale agricultural commodity in China. A key factor influencing garlic prices is the planted area, which can be accurately and efficiently determined using remote sensing technology. However, the spectral characteristics of garlic and winter wheat are easily confused, and [...] Read more.
Garlic constitutes a significant small-scale agricultural commodity in China. A key factor influencing garlic prices is the planted area, which can be accurately and efficiently determined using remote sensing technology. However, the spectral characteristics of garlic and winter wheat are easily confused, and the widespread intercropping of these crops in the study area exacerbates this issue, leading to significant challenges in remote sensing image analysis. Additionally, remote sensing data are often affected by weather conditions, spatial resolution, and revisit frequency, which can result in delayed and inaccurate area extraction. In this study, historical data were utilized to restore Sentinel-2 remote sensing images, aimed at mitigating cloud and rain interference. Feature combinations were devised, incorporating two vegetation indices into a comprehensive time series, along with Sentinel-1 synthetic aperture radar (SAR) time series and other temporal datasets. Multiple classification combinations were employed to extract garlic within the study area, and the accuracy of the classification results was systematically analyzed. First, we used passive satellite imagery to extract winter crops (garlic, winter wheat, and others) with high accuracy. Second, we identified garlic by applying various combinations of time series features derived from both active and passive remote sensing data. Third, we evaluated the classification outcomes of various feature combinations to generate an optimal garlic cultivation distribution map for each region. Fourth, we developed a garlic fragmentation index to assess the impact of landscape fragmentation on garlic extraction accuracy. The findings reveal that: (1) Better results in garlic extraction can be achieved using active–passive time series remote sensing. The performance of the classification model can be further enhanced by incorporating short-wave infrared bands or spliced time series data into the classification features. (2) Examination of garlic cultivation fragmentation using the garlic fragmentation index aids in elucidating variations in accuracy across the study area’s six counties. (3) Comparative analysis with validation samples demonstrated superior garlic extraction outcomes from the six primary garlic-producing counties of the North China Plain in 2021, achieving an overall precision exceeding 90%. This study offers a practical exploration of target crop identification using multi-source remote sensing data in mixed cropping areas. The methodology presented here demonstrates the potential for efficient, cost-effective, and accurate garlic classification, which is crucial for improving garlic production management and optimizing agricultural practices. Moreover, this approach holds promise for broader applications, such as nationwide garlic mapping. Full article
(This article belongs to the Special Issue Intelligent Computing and Remote Sensing—2nd Edition)
Show Figures

Figure 1: Diagram of garlic fertility period.
Figure 2: Overview of the study area.
Figure 3: Workflow of garlic extraction based on active–passive remote sensing time series data.
Figure 4: The time series of Sentinel-2 NDVI for garlic and winter wheat. The curves in the figure represent the average NDVI values derived from the samples, while the upper and lower boundaries indicate the standard deviation.
Figure 5: Time series of Sentinel-1 curves for garlic and winter wheat. (a) Time series data on the ratio of vertical–vertical (VV) and vertical–horizontal (VH) polarization in garlic and winter wheat. (b) Time series data of VV and VH polarization for garlic and winter wheat. The curves in the figure represent the average values derived from the samples.
Figure 6: Winter crop distribution maps. This figure illustrates the winter vegetation classification results for each county within the study area. Green indicates areas of winter vegetation, while gray represents other land cover types.
Figure 7: Garlic distribution maps. This figure illustrates the garlic classification results for each county within the study area. Blue indicates areas of garlic, while gray represents other land cover types.
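A hedged Python sketch of the classification step outlined in the abstract above: per-pixel NDVI and Sentinel-1 VV/VH time series are stacked into one feature vector and fed to a random forest. The array shapes, synthetic values, and two-class labels are illustrative assumptions, not the paper's data.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_pixels, n_dates = 2000, 24                     # 24 assumed acquisition dates over the season
ndvi = rng.uniform(0.1, 0.9, (n_pixels, n_dates))
vv = rng.normal(-12, 2, (n_pixels, n_dates))     # SAR backscatter (dB), synthetic
vh = rng.normal(-18, 2, (n_pixels, n_dates))

X = np.hstack([ndvi, vv, vh])                    # one stacked time-series feature vector per pixel
y = rng.integers(0, 2, n_pixels)                 # 0 = winter wheat, 1 = garlic (field-sample labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
clf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))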
17 pages, 34922 KiB  
Article
Coastal Sea Ice Concentration Derived from Marine Radar Images: A Case Study from Utqiaġvik, Alaska
by Felix St-Denis, L. Bruno Tremblay, Andrew R. Mahoney and Kitrea Pacifica L. M. Takata-Glushkoff
Remote Sens. 2024, 16(18), 3357; https://doi.org/10.3390/rs16183357 - 10 Sep 2024
Viewed by 346
Abstract
We apply the Canny edge algorithm to imagery from the Utqiaġvik coastal sea ice radar system (CSIRS) to identify regions of open water and sea ice and quantify ice concentration. The radar-derived sea ice concentration (SIC) is compared against the (closest to the [...] Read more.
We apply the Canny edge algorithm to imagery from the Utqiaġvik coastal sea ice radar system (CSIRS) to identify regions of open water and sea ice and quantify ice concentration. The radar-derived sea ice concentration (SIC) is compared against the (closest to the radar field of view) 25 km resolution NSIDC Climate Data Record (CDR) and the 1 km merged MODIS-AMSR2 sea ice concentrations within the ∼11 km field of view for the year 2022–2023, when improved image contrast was first implemented. The algorithm was first optimized using sea ice concentration from 14 different images and 10 ice analysts (140 analyses in total) covering a range of ice conditions with landfast ice, drifting ice, and open water. The algorithm is also validated quantitatively against high-resolution MODIS-Terra in the visible range. Results show a correlation coefficient and mean bias error between the optimized algorithm, the CDR and MODIS-AMSR2 daily SIC of 0.18 and 0.54, and ∼−1.0 and 0.7%, respectively, with an averaged inter-analyst error of ±3%. In general, the CDR captures the melt period correctly and overestimates the SIC during the winter and freeze-up period, while the merged MODIS-AMSR2 better captures the punctual break-out events in winter, including those during the freeze-up events (reduction in SIC). Remnant issues with the detection algorithm include the false detection of sea ice in the presence of fog or precipitation (up to 20%), quantified from the summer reconstruction with known open water conditions. The proposed technique allows for the derivation of the SIC from CSIRS data at spatial and temporal scales that coincide with those at which coastal community members interact with sea ice. Moreover, by measuring the SIC in nearshore waters adjacent to the shoreline, we can quantify the effect of land contamination that detracts from the usefulness of satellite-derived SIC for coastal communities. Full article
(This article belongs to the Section Remote Sensing Image Processing)
Show Figures

Graphical abstract

Figure 1: Map of the study site and location of the CSIRS. The black circle marks the coastal radar range and the black star highlights the location of the coastal radar.
Figure 2: Flow chart of the floe edge detection algorithm.
Figure 3: Images from 11 March 2022, taken after each algorithm step: (a) initial image, (b) image with land removed, (c) the output of the Canny edge algorithm, (d) the contours, in red, found from the detected edges, and (e) the final sea ice contour.
Figure 4: Images analyzed by the analysts.
Figure 5: CDR (a) and merged MODIS-AMSR2 (b) grid cells used for the comparison with the marine radar. Utqiaġvik and the radar range are marked with a black star and circle.
Figure 6: (a) Minimum RMSE (red) and the corresponding best 1:1 line fit (blue) for each kernel, (b) scatter plot of SIC derived from the radar images with the optimal set of parameters and SIC from the 10 analysts (colorbar), including the best-line fit (green line), (c) the analyst standard deviation (STD) for each analyzed frame, and (d) histogram of the departure of the best 1:1 line fit when removing the analyzed frames one by one. Note that the blue axis in (a) does not start at 0. Each of the horizontal lines represents a different image in (b). The inter-analyst averaged error of 0.026 is represented by the dashed line in (c).
Figure 7: (a) Nearly synchronous marine radar image, including the sea ice edge (red) from the detection algorithm, at 15:09 local time (AKDT). (b) MODIS Terra image at 15:09 AKDT. (c) Time series of the Pearson correlation coefficient (r) between all coarse-grained (1 km) 4 min images (360 in total) of CSIRS SIC and the merged MODIS-AMSR2 SIC for 14 April 2022. The red shading corresponds to the 95% confidence interval. (d) Scatter plot of the CSIRS SIC against the merged MODIS-AMSR2 for the time of maximum correlation at 6:00 AKDT. The best line fit is given by y = 0.43x + 0.28 and the red shading corresponds to the RMSE of 0.12.
Figure 8: Daily (a), 7-day running mean (b), and 31-day running mean (c) time series of the SIC derived from the radar (blue), the CDR (green), and the merged MODIS-AMSR2 (red) for 2022 to 2023 as a function of the Julian days starting 1 January 2022. The holes in the time series represent the non-availability of the data.
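A rough Python/OpenCV sketch of the edge-based concentration estimate described above: apply the Canny detector to a radar-like grayscale image, close the edges into contours, fill them to obtain an ice mask, and report the ice-covered fraction of the radar footprint. The synthetic image, thresholds, and kernel size are assumptions, not the analyst-optimized parameters from the paper.

import numpy as np
import cv2

rng = np.random.default_rng(2)
img = rng.uniform(0, 60, (512, 512)).astype(np.uint8)    # synthetic open-water backscatter
cv2.circle(img, (180, 256), 120, 200, -1)                # one bright "ice floe"
img = cv2.GaussianBlur(img, (5, 5), 0)

edges = cv2.Canny(img, 50, 150)
edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))     # close small gaps along the floe edge
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

ice_mask = np.zeros_like(img)
cv2.drawContours(ice_mask, contours, -1, 255, -1)        # fill closed contours as ice

fov = np.zeros_like(img)
cv2.circle(fov, (256, 256), 250, 255, -1)                # circular radar field of view
sic = (ice_mask[fov > 0] > 0).mean()
print(f"radar-derived SIC: {sic:.2f}")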
24 pages, 15733 KiB  
Article
Evolution Patterns and Dominant Factors of Soil Salinization in the Yellow River Delta Based on Long-Time-Series and Similar Phenological-Fusion Images
by Bing Guo, Mei Xu and Rui Zhang
Remote Sens. 2024, 16(17), 3332; https://doi.org/10.3390/rs16173332 - 8 Sep 2024
Viewed by 408
Abstract
Previous studies were mostly conducted based on sparse time series and different phenological images, which often ignored the dramatic changes in salinization evolution throughout the year. Based on Landsat and moderate-resolution-imaging spectroradiometer (MODIS) images from 2000 to 2020, this study applied the Enhanced [...] Read more.
Previous studies were mostly conducted based on sparse time series and different phenological images, which often ignored the dramatic changes in salinization evolution throughout the year. Based on Landsat and moderate-resolution-imaging spectroradiometer (MODIS) images from 2000 to 2020, this study applied the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM) algorithm to obtain similar phenological images for the month of April for the past 20 years. Based on the random forest algorithm, the surface parameters of the salinization were optimized, and the feature space index models were constructed. Combined with the measured ground data, the optimal monitoring index model of salinization was determined, and then the spatiotemporal evolution patterns of salinization and its driving mechanisms in the Yellow River Delta were revealed. The main conclusions were as follows: (1) The derived long-time-series and similar phenological-fusion images enable us to reveal the patterns of change in the dramatic salinization in the year that we examined using the ESTARFM algorithm. (2) The NDSI-TGDVI feature space salinization monitoring index model based on point-to-point mode had the highest accuracy of 0.92. (3) From 2000 to 2020, the soil salinization in the Yellow River Delta showed an aggravating trend. The average value of salinization during the past 20 years was 0.65, which is categorized as severe salinization. The degree of salinization gradually decreased from the northeastern coastal area to the southwestern inland area. (4) The dominant factors affecting soil salinization in different historical periods varied. The research results could provide support for decision-making regarding the precise prevention and control of salinization in the Yellow River Delta. Full article
Show Figures

Figure 1: The location of the study area and the distribution of the observed samples from the field. (a) the location of the study area in China; (b) the location of the study area in Shandong Province; (c) the location of the study area.
Figure 2: The flowchart of the entire paper.
Figure 3: Principle of feature space method: (a) TGDVI-NDSI; (b) EDVI-Albedo.
Figure 4: Types of interactive dominant factors affecting salinization changes.
Figure 5: Comparison details of ESTARFM fusion results: (a) coarse-resolution image; (b) fusion image; (c) actual image; (d) enlarged area from coarse-resolution image; (e) enlarged area from fine-resolution image; and (f) enlarged area from actual-resolution image.
Figure 6: ESTARFM fusion results and 2D scatter plot of actual images.
Figure 7: Optimization of salinization characterization parameters in the Yellow River Delta.
Figure 8: Characteristic space salinization monitoring indicator models: (a) GNDVI-NDSI; (b) NDSI-EDVI; (c) NDSI-RVI; (d) NDSI-TGDVI; (e) SI2-Albedo; (f) WI-Albedo; (g) WI-SI2; (h) EDVI-Albedo; (i) GNDVI-Albedo; (j) NDSI-Albedo; (k) TGDVI-Albedo; and (l) TGDVI-SI2.
Figure 9: Construction of NDSI-TGDVI feature space salinization monitoring model based on point-to-point mode.
Figure 10: Construction of EDVI-Albedo feature space salinization monitoring model based on point-to-line mode.
Figure 11: Temporal variations in average salinization monitoring model from 2000 to 2020.
Figure 12: Spatial distribution of salinization at different levels: (a) 2000; (b) 2005; (c) 2010; (d) 2015; and (e) 2020.
Figure 13: Migration trajectory of salinization gravity center in the Yellow River Delta at different timescales.
Figure 14: Q values of different driving factors in different years.
Figure 15: The dominant interactive factors of soil salinization in the Yellow River Delta. (a) 2000; (b) 2005; (c) 2010; (d) 2015; (e) 2020.
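An illustrative Python sketch of an NDSI-TGDVI feature-space index of the kind the abstract describes. The band combinations for the two indices, the assumed Sentinel-2 band centres, and the use of a Euclidean distance from the vegetated, salt-free corner of the feature space are all assumptions made for illustration; the paper's point-to-point model may differ in detail.

import numpy as np

def ndsi(red, nir):
    # Normalized difference salinity index (band choice assumed).
    return (red - nir) / (red + nir + 1e-6)

def tgdvi(nir, red, green, l_nir=0.842, l_red=0.665, l_green=0.560):
    # Three-band gradient difference vegetation index (band centres in micrometres, assumed Sentinel-2).
    return (nir - red) / (l_nir - l_red) - (red - green) / (l_red - l_green)

rng = np.random.default_rng(3)
green = rng.uniform(0.05, 0.20, (100, 100))
red = rng.uniform(0.05, 0.30, (100, 100))
nir = rng.uniform(0.10, 0.50, (100, 100))

x = ndsi(red, nir)
v = tgdvi(nir, red, green)
v = (v - v.min()) / (v.max() - v.min())              # rescale TGDVI to 0-1 for the feature space

# Distance from (NDSI = 0, TGDVI = 1), i.e. dense vegetation with no salt signal:
# larger distances indicate stronger salinization.
salinization_index = np.sqrt(x ** 2 + (1.0 - v) ** 2)
print("mean salinization index:", float(salinization_index.mean()))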
19 pages, 8921 KiB  
Article
A Method for Cropland Layer Extraction in Complex Scenes Integrating Edge Features and Semantic Segmentation
by Yihang Lu, Lin Li, Wen Dong, Yizhen Zheng, Xin Zhang, Jinzhong Zhang, Tao Wu and Meiling Liu
Agriculture 2024, 14(9), 1553; https://doi.org/10.3390/agriculture14091553 - 8 Sep 2024
Viewed by 384
Abstract
Cultivated land is crucial for food production and security. In complex environments like mountainous regions, the fragmented nature of the cultivated land complicates rapid and accurate information acquisition. Deep learning has become essential for extracting cultivated land but faces challenges such as edge [...] Read more.
Cultivated land is crucial for food production and security. In complex environments like mountainous regions, the fragmented nature of the cultivated land complicates rapid and accurate information acquisition. Deep learning has become essential for extracting cultivated land but faces challenges such as edge detail loss and limited adaptability. This study introduces a novel approach that combines geographical zonal stratification with the temporal characteristics of medium-resolution remote sensing images for identifying cultivated land. The methodology involves geographically zoning and stratifying the study area, and then integrating semantic segmentation and edge detection to analyze remote sensing images and generate initial extraction results. These results are refined through post-processing with medium-resolution imagery classification to produce a detailed map of the cultivated land distribution. The method achieved an overall extraction accuracy of 95.07% in Tongnan District, with specific accuracies of 92.49% for flat cultivated land, 96.18% for terraced cultivated land, 93.80% for sloping cultivated land, and 78.83% for forest intercrop land. The results indicate that, compared to traditional methods, this approach is faster and more accurate, reducing both false positives and omissions. This paper presents a new methodological framework for large-scale cropland mapping in complex scenarios, offering valuable insights for subsequent cropland extraction in challenging environments. Full article
(This article belongs to the Special Issue Applications of Remote Sensing in Agricultural Soil and Crop Mapping)
Show Figures

Figure 1: The location and topography of the study area.
Figure 2: Typical sample diagram: land cover sample points (a), edge detection samples (b,b1,c,c1), semantic segmentation samples (d,d1,e,e1).
Figure 3: Diagram of zoning and layering: plain area (A); mountainous area (B); forest–grass area (C); flat cultivated land (a); terraced cultivated land (b1); sloping cultivated land (b2); forest intercrop land (c).
Figure 4: Technology roadmap.
Figure 5: Overall distribution mapping: the distribution characteristics of terraced cultivated land (A), the distribution characteristics of forest intercrop land (B), the distribution characteristics of flat cultivated land (C), and the distribution characteristics of sloping cultivated land (D).
Figure 6: Comparison of different models.
Figure 7: A comparison of the results of the partitioned and layered extraction method with those of the non-partitioned and direct extraction method.
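A simplified Python/OpenCV sketch of how an edge map can refine a semantic-segmentation mask, in the spirit of the integration described above: detected boundaries are carved out of the cropland mask so adjacent parcels remain separated. The synthetic image, the thresholded stand-in for the segmentation output, and the Canny thresholds are assumptions; the paper trains deep networks for both branches.

import numpy as np
import cv2

rng = np.random.default_rng(4)
image = rng.uniform(80, 120, (256, 256)).astype(np.uint8)
cv2.rectangle(image, (20, 20), (120, 200), 180, -1)       # two adjacent "fields"
cv2.rectangle(image, (124, 20), (230, 200), 160, -1)

seg_mask = (image > 140).astype(np.uint8)                 # stand-in for the segmentation output (1 = cropland)

edges = cv2.Canny(image, 30, 90)                          # edge branch: field boundaries
edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))

refined = seg_mask.copy()
refined[edges > 0] = 0                                    # carve detected boundaries out of the mask
n_labels, _ = cv2.connectedComponents(refined)
print("parcels after edge refinement:", n_labels - 1)     # minus the background label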
20 pages, 9581 KiB  
Article
Simulation and Spatio-Temporal Analysis of Soil Erosion in the Source Region of the Yellow River Using Machine Learning Method
by Jinxi Su, Rong Tang and Huilong Lin
Land 2024, 13(9), 1456; https://doi.org/10.3390/land13091456 - 7 Sep 2024
Viewed by 483
Abstract
The source region of the Yellow River (SRYR), known as the “Chinese Water Tower”, is currently grappling with severe soil erosion, which jeopardizes the sustainability of its alpine grasslands. Large-scale soil erosion monitoring poses a significant challenge, complicating global efforts to study soil [...] Read more.
The source region of the Yellow River (SRYR), known as the “Chinese Water Tower”, is currently grappling with severe soil erosion, which jeopardizes the sustainability of its alpine grasslands. Large-scale soil erosion monitoring poses a significant challenge, complicating global efforts to study soil erosion and land cover changes. Moreover, conventional methods for assessing soil erosion do not adequately address the variety of erosion types present in the SRYR. Given these challenges, the objective of this study was to develop a suitable assessment and prediction model for soil erosion tailored to the SRYR’s needs. By leveraging soil erosion data measured by ¹³⁷Cs at 521 locations and employing the random forest (RF) algorithm, a new soil erosion model was formulated. Key findings include the following: (1) The RF soil erosion model significantly outperformed the revised universal soil loss equation (RUSLE) model and the revised wind erosion equation (RWEQ) model, achieving an R² of 0.52 and an RMSE of 5.88. (2) The RF model indicated that from 2001 to 2020, the SRYR experienced an average annual soil erosion modulus (SEM) of 19.32 t·ha⁻¹·y⁻¹, with an annual total erosion in the SRYR of 225.18 × 10⁶ t·y⁻¹. Spatial analysis revealed that 78.64% of the region suffered low erosion, with erosion intensity declining from northwest to southeast. (3) The annual SEM in the SRYR demonstrated a downward trend from 2001 to 2020, with 83.43% of the study area showing improvement. Based on these findings, measures for soil erosion prevention and control in the SRYR were proposed. Future studies should refine the temporal analysis to better understand the influence of extreme climate events on soil erosion, while leveraging high-resolution data to enhance model accuracy. Insights into the drivers of soil erosion in the SRYR will support more effective policy development. Full article
Show Figures

Figure 1: The layout of land cover types and sampling sites from 2015-2020 across the SRYR.
Figure 2: Model results and performance of RUSLE, RWEQ, and RF models: Results of the RUSLE, RWEQ, and RF models (a–c); Performance of the RUSLE, RWEQ, and RF models (d–f); Classification of soil erosion for each model (g).
Figure 3: Spatial distribution of SEM in SRYR in 2001 (a), 2005 (b), 2010 (c), 2015 (d), 2020 (e). Temporal changes of SEM in SRYR between 2001 and 2020 (f).
Figure 4: Temporal changes in SEM across each county in the SRYR from 2001 to 2020: Maduo (a); Maqin (b); Qumalai (c); Dari (d); Maqu (e); Gande (f); Chindu (g); Xinghai (h); Ruoergai (i); Hongyuan (j); Jiuzhi (k); Tongde (l); Henan (m); Zeku (n); Aba (o). Average soil erosion modulus for each county from 2001 to 2020 (p).
Figure 5: Temporal changes in SEM across main land cover types in the SRYR from 2001 to 2020: Alpine steppe (a); Alpine meadow (b); Mountain meadow (c); Cropland (d); Wetland (e); Barren (f). Average soil erosion modulus for land cover types mentioned above from 2001 to 2020 (g).
Figure 6: Temporal trends for SEM in the SRYR between 2001 and 2020.
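A minimal Python sketch of fitting and scoring a random-forest erosion model the way the abstract describes (R² and RMSE against ¹³⁷Cs-derived erosion moduli). The covariate names and the synthetic samples are placeholders for the 521 measured sites and the paper's actual predictors.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(5)
n = 521
X = np.column_stack([
    rng.uniform(250, 600, n),   # precipitation (mm), assumed covariate
    rng.uniform(0, 25, n),      # slope (degrees), assumed covariate
    rng.uniform(0.1, 0.8, n),   # vegetation cover fraction, assumed covariate
    rng.uniform(2, 8, n),       # wind speed (m/s), assumed covariate
])
y = 1.2 * X[:, 1] - 20 * X[:, 2] + 2.5 * X[:, 3] + 0.01 * X[:, 0] + rng.normal(0, 5, n)  # synthetic SEM

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=5)
rf = RandomForestRegressor(n_estimators=500, random_state=5).fit(X_tr, y_tr)
pred = rf.predict(X_te)
print("R2:", round(r2_score(y_te, pred), 2))
print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2))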
17 pages, 5654 KiB  
Article
A Short-Term Power Prediction Method for Photovoltaics Based on Similar Day Clustering and Spatio-Temporal Feature Extraction
by Xu Huang, Leying Wang, Leijiao Ge, Luyang Hou, Tianshuo Du, Yiwen Zheng and Yanbo Chen
Electronics 2024, 13(17), 3536; https://doi.org/10.3390/electronics13173536 - 6 Sep 2024
Viewed by 246
Abstract
Accurate PV power prediction is crucial for enhancing grid planning, optimizing dispatch operations, and advancing management strategies. In pursuit of this objective, this study proposes a short-term distributed PV power prediction method that incorporates temporal and spatial feature extraction as well as similar [...] Read more.
Accurate PV power prediction is crucial for enhancing grid planning, optimizing dispatch operations, and advancing management strategies. In pursuit of this objective, this study proposes a short-term distributed PV power prediction method that incorporates temporal and spatial feature extraction as well as similar day analysis. Firstly, to address the poor adaptability of traditional clustering methods to time-series data, the K-shape clustering algorithm is employed to categorize the time series into different weather types. Secondly, to overcome the challenges posed by varying time resolutions in similar day analysis, a novel method based on Dynamic Time Warping (DTW) is proposed. This method calculates the similarity between the target days and the candidate historical days, considering both the time of day and the day of the week. Subsequently, a PV power generation prediction model based on a convolutional neural network–long short-term memory (CNN-LSTM) network is developed to enhance prediction accuracy. To tackle the difficulty of manual hyperparameter tuning, the chaos reverse sparrow search algorithm (CRSSA) is introduced. Finally, a case study is conducted on the measured data of a distributed photovoltaic power station in a certain region of China. In terms of RMSE and MAPE, the proposed prediction model and solving algorithm reduced the relative error by more than 1% compared with other prediction models, verifying the effectiveness of the proposed method. Full article
(This article belongs to the Special Issue Advances in Enhancing Energy and Power System Stability and Control)
Show Figures

Figure 1: DTW schematic diagram.
Figure 2: LSTM structure diagram.
Figure 3: CNN-LSTM structure diagram.
Figure 4: CRSSA-CNN-LSTM forecast flow chart.
Figure 5: Contour coefficient values for various cluster numbers.
Figure 6: Different weather patterns for cluster partitioning.
Figure 7: Comparison of different clustering methods.
Figure 8: Model 1 typical day forecast results for each model.
Figure 9: Model 2 typical day forecast results for each model.
Figure 10: Model 3 typical day forecast results for each model.
Figure 11: Convergence curves of different optimization algorithms.
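A hedged Python sketch of the DTW-based similar-day selection step described above: a dynamic time warping distance is computed between the target day's curve and each candidate historical day, and the closest day is kept as a training sample. The synthetic irradiance-like curves are placeholders, and the paper's additional weighting by time of day and day of week is omitted here.

import numpy as np

def dtw_distance(a, b):
    # Classic dynamic-programming DTW with absolute-difference cost.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(6)
hours = np.arange(24)
target_day = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)            # clear-sky-like daily profile
candidates = [np.clip(np.sin((hours - 6) / 12 * np.pi) + rng.normal(0, s, 24), 0, None)
              for s in (0.05, 0.1, 0.3, 0.6)]                              # increasingly cloudy days

distances = [dtw_distance(target_day, c) for c in candidates]
best = int(np.argmin(distances))
print("most similar historical day:", best, "DTW distance:", round(distances[best], 3))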