Search Results (6,182)

Search Parameters:
Keywords = infrared images

25 pages, 9319 KiB  
Article
Blind Separation of Skin Chromophores from Multispectral Dermatological Images
by Mustapha Zokay and Hicham Saylani
Diagnostics 2024, 14(20), 2288; https://doi.org/10.3390/diagnostics14202288 - 14 Oct 2024
Abstract
Background/Objectives: Based on Blind Source Separation and the use of multispectral imaging, the new approach we propose in this paper aims to improve the estimation of the concentrations of the main skin chromophores (melanin, oxyhemoglobin and deoxyhemoglobin), while considering shading as a fully-fledged source. Methods: In this paper, we demonstrate that the use of the Infra-Red spectral band, in addition to the traditional RGB spectral bands of dermatological images, allows us to model the image provided by each spectral band as a mixture of the concentrations of the three chromophores in addition to that of the shading, which are estimated through four steps using Blind Source Separation. Results: We studied the performance of our new method on a database of real multispectral dermatological images of melanoma by proposing a new quantitative performances measurement criterion based on mutual information. We then validated these performances on a database of multispectral dermatological images that we simulated using our own new protocol. Conclusions: All the results obtained demonstrated the effectiveness of our new approach for estimating the concentrations of the skin chromophores from a multispectral dermatological image, compared to traditional approaches that consist of using only the RGB image by neglecting shading. Full article
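The blind-source-separation idea in this abstract can be illustrated with a generic non-negative matrix factorization (NMF): each band image is modeled as a non-negative mixture of latent concentration maps. This is a minimal sketch on synthetic data, not the authors' BCSnmf-Irgb method; the mixing matrix, rank, and update rule below are illustrative assumptions.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Basic NMF via Lee-Seung multiplicative updates: V ~= W @ H, all non-negative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    eps = 1e-9
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy data: 4 band images (R, G, B, IR), each a non-negative mixture of
# 4 latent maps (melanin, oxyhemoglobin, deoxyhemoglobin, shading).
rng = np.random.default_rng(1)
true_H = rng.random((4, 500))   # latent maps, flattened to 500 pixels
true_W = rng.random((4, 4))     # per-band mixing (absorption) coefficients
V = true_W @ true_H             # observed multispectral image, one row per band

W, H = nmf(V, k=4)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because NMF is only identifiable up to permutation and scaling, the recovered rows of H would still need to be matched to chromophores, for example against known absorption spectra.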
Figure 1. Chromophore absorption coefficient [25,39] and the 4 spectral bands of multispectral imaging.
Figure 2. Flowchart illustrating the main steps of our method BCSnmf-Irgb.
Figure 3. Processed dermatological image: (a) Image with specular reflection, (b) Image without specular reflection.
Figure 4. Contributions of chromophores and shading estimated by each of the three methods: (a) BCS-rgb, (b) BCS-Irgb and (c) BCSnmf-Irgb.
Figure 5. Example of dermatological images containing hair.
Figure 6. (a) Simulated contributions of the three chromophores and shading, (b) Resulting RGB dermatological image.
Figure 7. Contributions of the chromophores and shading estimated by each of the three methods: (a) BCS-rgb, (b) BCS-Irgb and (c) BCSnmf-Irgb.
22 pages, 6158 KiB  
Article
Spatial and Temporal Change Analysis of Urban Built-Up Area via Nighttime Lighting Data—A Case Study with Yunnan and Guizhou Provinces
by Qian Jing, Armando Marino, Yongjie Ji, Han Zhao, Guoran Huang and Lu Wang
Land 2024, 13(10), 1677; https://doi.org/10.3390/land13101677 - 14 Oct 2024
Abstract
As urbanization accelerates, characteristics of urban spatial expansion play a significant role in the future utilization of land resources, the protection of the ecological environment, and the coordinated development of population and land. In this study, Yunnan and Guizhou provinces were selected as the study area, and the 2013–2021 National Polar-Orbiting Partnership’s Visible Infrared Imaging Radiometer Suite (NPP-VIIRS) nighttime light (NTL) data were utilized for spatial and temporal change analysis of urban built-up areas. Firstly, the built-up areas in Yunnan and Guizhou provinces were extracted through ENUI (Enhanced Nighttime Lighting Urban Index) indices, and then the urban expansion speed and urban center of gravity migration were constructed and used to explore and analyze the spatial and temporal change and expansion characteristics of built-up areas in Yunnan and Guizhou provinces. The results showed the following. (1) Due to the complementarity between data types, such as NTL, EVI, NDBI, and NDWI, ENUI has better performance in expressing urban characteristics. (2) Influenced by national and local policies, such as “One Belt, One Road”, transportation infrastructure construction, geographic location, the historical background, and other factors, the urban expansion rate of Yunnan and Guizhou provinces in general showed a continuous advancement from 2013 to 2021, and there were three years in which the expansion rate was positive. (3) The center of gravity migration distance of most cities in Guizhou Province shows a trend of increasing and then decreasing, while the center of gravity migration distance in Yunnan Province shows a trend of continuous decrease in general. From the perspective of migration direction, Guizhou Province has the largest number of migrations to the northeast, while Yunnan Province has the largest number of migrations to the southeast. 
(4) Influenced by policy, economy, population, geography, and other factors, urban compactness in Yunnan and Guizhou provinces continued to grow from 2013 to 2021. The results of this study can help us better understand urbanization in western China, reveal the urban expansion patterns and spatial characteristics of Yunnan and Guizhou provinces, and provide valuable references for development planning and policymaking in Yunnan and Guizhou provinces. Full article
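The index-based extraction step can be sketched as a composite of normalized layers. The actual ENUI formula is not reproduced here; the combination rule below (night-time light and NDBI raise the urban score, EVI and NDWI lower it) is a hypothetical stand-in with made-up pixel values.

```python
import numpy as np

def minmax(x):
    """Rescale an array to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

def composite_urban_index(ntl, ndbi, evi, ndwi):
    """Hypothetical ENUI-style composite: the urban signal rises with
    night-time light (NTL) and the built-up index (NDBI), and falls with
    vegetation (EVI) and water (NDWI) indices."""
    return minmax(ntl) * (minmax(ndbi) + (1 - minmax(evi)) + (1 - minmax(ndwi))) / 3.0

# Toy 1-D 'scene': pixel 0 is bright/built-up, pixel 2 is vegetated/dark.
ntl  = np.array([60.0, 10.0, 1.0])
ndbi = np.array([0.4, 0.0, -0.3])
evi  = np.array([0.1, 0.4, 0.8])
ndwi = np.array([-0.2, 0.0, 0.3])
idx = composite_urban_index(ntl, ndbi, evi, ndwi)   # built-up pixels score highest
```

A built-up mask would then be obtained by thresholding `idx`, with the threshold chosen against reference urban boundaries.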
Figure 1. Location map of the research area. (a) Location of Yunnan and Guizhou Provinces in China; (b) Yunnan Province and Guizhou Province digital elevation model (DEM).
Figure 2. Workflow chart.
Figure 3. Distribution characteristics of typical cities in Yunnan Province from 2013 to 2021.
Figure 4. Distribution characteristics of typical cities in Guizhou Province from 2013 to 2021.
Figure 5. (a) Map of built-up areas in Yunnan Province; (b) Map of built-up areas in Guizhou Province.
Figure 6. Map of urban expansion rate in Yunnan and Guizhou provinces from 2013 to 2021. (a) Yunnan Province; (b) Guizhou Province.
Figure 7. Migration map of urban centers of gravity in Yunnan and Guizhou provinces. (a) Yunnan Province; (b) Guizhou Province.
Figure 8. Gray correlation diagram of Yunnan and Guizhou provinces.
33 pages, 2746 KiB  
Review
Progression in Near-Infrared Fluorescence Imaging Technology for Lung Cancer Management
by Xinglong Chen, Yuning Li, Jialin Su, Lemeng Zhang and Hongwen Liu
Biosensors 2024, 14(10), 501; https://doi.org/10.3390/bios14100501 - 14 Oct 2024
Abstract
Lung cancer is a major threat to human health and a leading cause of death. Accurate localization of tumors in vivo is crucial for subsequent treatment. In recent years, fluorescent imaging technology has become a focal point in tumor diagnosis and treatment due to its high sensitivity, strong selectivity, non-invasiveness, and multifunctionality. Molecular probes-based fluorescent imaging not only enables real-time in vivo imaging through fluorescence signals but also integrates therapeutic functions, drug screening, and efficacy monitoring to facilitate comprehensive diagnosis and treatment. Among them, near-infrared (NIR) fluorescence imaging is particularly prominent due to its improved in vivo imaging effect. This trend toward multifunctionality is a significant aspect of the future advancement of fluorescent imaging technology. In the past years, great progress has been made in the field of NIR fluorescence imaging for lung cancer management, as well as the emergence of new problems and challenges. This paper generally summarizes the application of NIR fluorescence imaging technology in these areas in the past five years, including the design, detection principles, and clinical applications, with the aim of advancing more efficient NIR fluorescence imaging technologies to enhance the accuracy of tumor diagnosis and treatment. Full article
(This article belongs to the Special Issue Probes for Biosensing and Bioimaging)
24 pages, 1129 KiB  
Article
Infrared Image Generation Based on Visual State Space and Contrastive Learning
by Bing Li, Decao Ma, Fang He, Zhili Zhang, Daqiao Zhang and Shaopeng Li
Remote Sens. 2024, 16(20), 3817; https://doi.org/10.3390/rs16203817 - 14 Oct 2024
Abstract
The preparation of infrared reference images is of great significance for improving the accuracy and precision of infrared imaging guidance. However, collecting infrared data on-site is difficult and time-consuming. Fortunately, the infrared images can be obtained from the corresponding visible-light images to enrich the infrared data. To this end, this present work proposes an image translation algorithm that converts visible-light images to infrared images. This algorithm, named V2IGAN, is founded on the visual state space attention module and multi-scale feature contrastive learning loss. Firstly, we introduce a visual state space attention module designed to sharpen the generative network’s focus on critical regions within visible-light images. This enhancement not only improves feature extraction but also bolsters the generator’s capacity to accurately model features, ultimately enhancing the quality of generated images. Furthermore, the method incorporates a multi-scale feature contrastive learning loss function, which serves to bolster the robustness of the model and refine the detail of the generated images. Experimental results show that the V2IGAN method outperforms existing typical infrared image generation techniques in both subjective visual assessments and objective metric evaluations. This suggests that the V2IGAN method is adept at enhancing the feature representation in images, refining the details of the generated infrared images, and yielding reliable, high-quality results. Full article
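The multi-scale feature contrastive loss can be sketched with a PatchNCE-style InfoNCE objective: a patch feature from the generated image should match the feature at the same location in the source image and differ from features at other locations. This is a schematic NumPy version on random features, not the V2IGAN implementation; the temperature and feature shapes are assumptions.

```python
import numpy as np

def info_nce(query, positive, negatives, tau=0.07):
    """InfoNCE loss for one query patch: pull toward its positive (same
    spatial location in the other domain), push away from negative patches."""
    q = query / np.linalg.norm(query)
    pos = positive / np.linalg.norm(positive)
    negs = negatives / np.linalg.norm(negatives, axis=1, keepdims=True)
    logits = np.concatenate([[q @ pos], negs @ q]) / tau
    logits -= logits.max()                       # numerical stability
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

def multiscale_patch_nce(feats_a, feats_b, tau=0.07):
    """Average InfoNCE over several feature scales; feats_* are lists of
    (num_patches, dim) arrays from matching layers of the two encoders."""
    losses = []
    for fa, fb in zip(feats_a, feats_b):
        for i in range(len(fa)):
            negs = np.delete(fb, i, axis=0)      # other locations are negatives
            losses.append(info_nce(fa[i], fb[i], negs, tau))
    return float(np.mean(losses))

rng = np.random.default_rng(0)
scales = [rng.standard_normal((8, 16)) for _ in range(3)]
# Perfectly aligned features give a much lower loss than unrelated ones.
aligned = multiscale_patch_nce(scales, scales)
random_ = multiscale_patch_nce(scales, [rng.standard_normal((8, 16)) for _ in range(3)])
```

Minimizing such a loss during training pushes the generator to preserve per-location content while translating the visible-light style into infrared.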
Figure 1. Visible-to-Infrared Image Translation Process.
Figure 2. The framework flowchart of the V2IGAN algorithm.
Figure 3. The generator structure of V2IGAN.
Figure 4. The structure of the VSS block.
Figure 5. SS2D schematic diagram.
Figure 6. Contrastive learning.
Figure 7. Image samples from the three datasets.
Figure 8. Examples of infrared images generated by different methods on the FLIR dataset.
Figure 9. Examples of infrared images generated by different methods on the AVIID dataset.
Figure 10. Examples of infrared images generated by different methods on the IRVI dataset.
Figure 11. Ablation experiment results on the AVIID dataset.
Figure 12. t-SNE visualization comparison on the FLIR dataset.
Figure 13. t-SNE visualization comparison on the AVIID dataset.
Figure 14. t-SNE visualization comparison on the IRVI dataset.
25 pages, 6736 KiB  
Article
LFIR-YOLO: Lightweight Model for Infrared Vehicle and Pedestrian Detection
by Quan Wang, Fengyuan Liu, Yi Cao, Farhan Ullah and Muxiong Zhou
Sensors 2024, 24(20), 6609; https://doi.org/10.3390/s24206609 - 14 Oct 2024
Abstract
The complexity of urban road scenes at night and the inadequacy of visible light imaging in such conditions pose significant challenges. To address the issues of insufficient color information, texture detail, and low spatial resolution in infrared imagery, we propose an enhanced infrared detection model called LFIR-YOLO, which is built upon the YOLOv8 architecture. The primary goal is to improve the accuracy of infrared target detection in nighttime traffic scenarios while meeting practical deployment requirements. First, to address challenges such as limited contrast and occlusion noise in infrared images, the C2f module in the high-level backbone network is augmented with a Dilation-wise Residual (DWR) module, incorporating multi-scale infrared contextual information to enhance feature extraction capabilities. Secondly, at the neck of the network, a Content-guided Attention (CGA) mechanism is applied to fuse features and re-modulate both initial and advanced features, catering to the low signal-to-noise ratio and sparse detail features characteristic of infrared images. Third, a shared convolution strategy is employed in the detection head, replacing the decoupled head strategy and utilizing shared Detail Enhancement Convolution (DEConv) and Group Norm (GN) operations to achieve lightweight yet precise improvements. Finally, loss functions, PIoU v2 and Adaptive Threshold Focal Loss (ATFL), are integrated into the model to better decouple infrared targets from the background and to enhance convergence speed. The experimental results on the FLIR and multispectral datasets show that the proposed LFIR-YOLO model achieves an improvement in detection accuracy of 4.3% and 2.6%, respectively, compared to the YOLOv8 model. Furthermore, the model demonstrates a reduction in parameters and computational complexity by 15.5% and 34%, respectively, enhancing its suitability for real-time deployment on resource-constrained edge devices. Full article
(This article belongs to the Section Sensing and Imaging)
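Of the components above, Group Norm (GN) is easy to show concretely: it normalizes within channel groups rather than across the batch, which keeps behavior stable at the small batch sizes typical of edge deployment. A minimal NumPy sketch, omitting the learnable scale and shift parameters:

```python
import numpy as np

def group_norm(x, groups, eps=1e-5):
    """Group Normalization over a (N, C, H, W) tensor: zero-mean, unit-variance
    within each channel group, independently per sample."""
    n, c, h, w = x.shape
    g = x.reshape(n, groups, c // groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, c, h, w)

# 2 images, 8 channels split into 4 groups of 2, 4x4 spatial resolution.
x = np.random.default_rng(0).standard_normal((2, 8, 4, 4))
y = group_norm(x, groups=4)
```

Because the statistics never mix samples, the same code path works for batch size 1 at inference, unlike batch normalization.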
Figure 1. LFIR-YOLO model structure diagram.
Figure 2. Dilation-wise Residual module.
Figure 3. Content-guided Attention module.
Figure 4. Content-guided Attention Fusion module.
Figure 5. Lightweight Shared Detail-enhanced Convolution Detection Head structure diagram.
Figure 6. Details of DEConv.
Figure 7. Ablation experiment comparison chart for mAP@0.5 and box_loss.
Figure 8. Random infrared example Group 1. (a) A multi-object detection scene with vehicles at varying distances. (b) A dynamic blur detection scene where the vehicle in the foreground is in motion. (c) A low-contrast outdoor urban scene focused on detecting distant pedestrians.
Figure 9. Random infrared example Group 2.
Figure 10. Computational complexity of the model.
Figure 11. (a) FLIR image detection results for multi-scale target scene. (b) FLIR image detection results for occlusion scene.
Figure 12. (a) Multispectral image detection results for infrared low-contrast scene. (b) Multispectral image detection results for false detection case.
Figure 13. Representative scenarios for dynamic object detection. The scenarios include a regular traffic road environment (a), a pedestrian walkway environment under very low light at night (b), and a high-speed road environment with strong light conditions (c).
Figure 14. Regular traffic road environment.
Figure 15. Pedestrian walkway environment under very low light at night.
Figure 16. High-speed road environment with strong lighting.
17 pages, 5155 KiB  
Article
Developing a New Method to Rapidly Map Eucalyptus Distribution in Subtropical Regions Using Sentinel-2 Imagery
by Chunxian Tang, Xiandie Jiang, Guiying Li and Dengsheng Lu
Forests 2024, 15(10), 1799; https://doi.org/10.3390/f15101799 - 13 Oct 2024
Abstract
Eucalyptus plantations with fast growth and short rotation play an important role in improving economic conditions for local farmers and governments. It is necessary to map and update eucalyptus distribution in a timely manner, but to date, there is a lack of suitable approaches for quickly mapping its spatial distribution in a large area. This research aims to develop a uniform procedure to map eucalyptus distribution at a regional scale using Sentinel-2 imagery on the Google Earth Engine (GEE) platform. Different seasonal Sentinel-2 images were first examined, and key vegetation indices from the selected seasonal images were identified using random forest and Pearson correlation analysis. The selected key vegetation indices were then normalized and summed to produce new indices for mapping eucalyptus distribution based on the calculated best cutoff values using the ROC (Receiver Operating Characteristic) curve. The uniform procedure was tested in both experimental and test sites and then applied to the entire Fujian Province. The results indicated that the best season to distinguish eucalyptus forests from other forest types was winter. The composite indices for eucalyptus–coniferous forest separation (CIEC) and for eucalyptus–broadleaf forest separation (CIEB), which were synthesized from the enhanced vegetation index (EVI), plant senescing reflectance index (PSRI), shortwave infrared water stress index (SIWSI), and MERIS terrestrial chlorophyll index (MTCI), can effectively differentiate eucalyptus from other forest types. The proposed procedure with the best cutoff values (0.58 for CIEC and 1.29 for CIEB) achieved accuracies of above 90% in all study sites. The eucalyptus classification accuracies in Fujian Province, with a producer’s accuracy of 91%, user’s accuracy of 97%, and overall accuracy of 94%, demonstrate the strong robustness and transferability of this proposed procedure.
This research provided a new insight into quickly mapping eucalyptus distribution in subtropical regions. However, more research is still needed to explore the robustness and transferability of this proposed method in tropical regions or in other subtropical regions with different environmental conditions. Full article
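The normalize-sum-threshold recipe in this abstract can be sketched directly. The code below is an illustration on synthetic samples, not the authors' CIEC/CIEB indices: it min-max normalizes two hypothetical vegetation indices, sums them into a composite score, and picks the cutoff that maximizes Youden's J, the usual "best cutoff" read off an ROC curve.

```python
import numpy as np

def best_cutoff(scores, labels):
    """Threshold maximizing Youden's J = TPR - FPR over the ROC sweep."""
    P, N = (labels == 1).sum(), (labels == 0).sum()
    best_t, best_j = scores.min(), -1.0
    for t in np.unique(scores):
        pred = scores >= t
        tpr = (pred & (labels == 1)).sum() / P
        fpr = (pred & (labels == 0)).sum() / N
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t

def composite_index(index_layers):
    """Min-max normalize each vegetation index and sum them."""
    norm = [(x - x.min()) / (x.max() - x.min() + 1e-12) for x in index_layers]
    return np.sum(norm, axis=0)

# Toy samples: eucalyptus pixels score higher on both hypothetical indices.
rng = np.random.default_rng(0)
euc   = np.stack([rng.normal(0.7, 0.05, 50), rng.normal(0.6, 0.05, 50)])
other = np.stack([rng.normal(0.3, 0.05, 50), rng.normal(0.2, 0.05, 50)])
scores = composite_index(np.concatenate([euc, other], axis=1))
labels = np.array([1] * 50 + [0] * 50)
t = best_cutoff(scores, labels)
acc = ((scores >= t) == labels).mean()
```

In practice the cutoff would be fitted on labeled samples from one site and then transferred to new regions, which is exactly the robustness the paper evaluates.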
Figure 1. The locations of two typical sites in Fujian Province (a): the experimental site was located in Minhou and Minqing Counties with sparse eucalyptus distribution (b), and the test site was located in Yunxiao County with extensive eucalyptus distribution (c). Both (b,c) were false color composites based on Sentinel-2A imagery.
Figure 2. Framework of designing a uniform procedure to map eucalyptus distribution at a regional scale.
Figure 3. The strategy of extracting eucalyptus from other tree species: (a) masking out non-eucalyptus using the selected vegetation indices; (b) development of new indices; (c) determination of thresholds for separating eucalyptus from other forests.
Figure 4. Spectral curves of eucalyptus and other forest types in different seasons. (a–d) Spectral curves of eucalyptus and other forest types in spring, summer, autumn, and winter, respectively. The spring image was acquired on 8 April 2022; the summer one on 22 July 2022; the autumn one on 25 September 2022; and the winter one on 22 December 2022. The grey semitransparent boxes represent bands with significant differences in reflectance values.
Figure 5. The importance of potential indices in the experimental site.
Figure 6. The correlation coefficients between vegetation indices: (a) eucalyptus and coniferous forests; (b) eucalyptus and broadleaf forests. Note: the p-level for the coefficients between each index was less than 0.01.
Figure 7. ROC curves and AUC values under the combinations with different numbers of vegetation indices for differentiating eucalyptus from coniferous forests (a) and from broadleaf forests (b).
Figure 8. Spatial distribution of eucalyptus in the experimental site (a) and test site (b); (c,d) represent the local distribution of eucalyptus in the experimental site. The green color represents eucalyptus plantations.
Figure 9. Spatial distribution of eucalyptus coverage (percent) in Fujian Province (a); the percent values in the legend represent the proportion of eucalyptus within a 1 km × 1 km grid; (b–d) represent different proportions of eucalyptus plantations.
18 pages, 7445 KiB  
Article
Unveiling the Potential of CuO and Cu2O Nanoparticles against Novel Copper-Resistant Pseudomonas Strains: An In-Depth Comparison
by Olesia Havryliuk, Garima Rathee, Jeniffer Blair, Vira Hovorukha, Oleksandr Tashyrev, Jordi Morató, Leonardo M. Pérez and Tzanko Tzanov
Nanomaterials 2024, 14(20), 1644; https://doi.org/10.3390/nano14201644 - 13 Oct 2024
Abstract
Four novel Pseudomonas strains with record resistance to copper (Cu2+) previously isolated from ecologically diverse samples (P. lactis UKR1, P. panacis UKR2, P. veronii UKR3, and P. veronii UKR4) were tested against sonochemically synthesised copper-oxide (I) (Cu2O) and copper-oxide (II) (CuO) nanoparticles (NPs). Nanomaterials characterisation by X-ray diffractometry (XRD), X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR), and High-Resolution Transmission Electron Microscopy (HRTEM) confirmed the synthesis of CuO and Cu2O NPs. CuO NPs exhibited better performance in inhibiting bacterial growth due to their heightened capacity to induce oxidative stress. The greater stability and geometrical shape of CuO NPs were disclosed as important features associated with bacterial cell toxicity. SEM and TEM images confirmed that both NPs caused membrane disruption, altered cell morphology, and pronounced membrane vesiculation, a distinctive feature of bacteria dealing with stressor factors. Finally, Cu2O and CuO NPs effectively decreased the biofilm-forming ability of the Cu2+-resistant UKR strains as well as degraded pre-established biofilm, matching NPs’ antimicrobial performance. Despite the similarities in the mechanisms of action revealed by both NPs, distinctive behaviours were also detected for the different species of wild-type Pseudomonas analysed. In summary, these findings underscore the efficacy of nanotechnology-driven strategies for combating metal tolerance in bacteria. Full article
(This article belongs to the Special Issue Antimicrobial and Antioxidant Activity of Nanoparticles)
Figure 1. XRD spectra of CuO (a) and Cu2O NPs (b), FTIR spectra of CuO (c) and Cu2O NPs (d), and UV-vis spectra of CuO (e) and Cu2O NPs (f).
Figure 2. (a) TEM image, (b) HRTEM image, (c) EDX spectrum, and (d) SAED of CuO NPs.
Figure 3. (a) TEM image, (b) HRTEM image, (c) EDX spectrum, and (d) SAED of Cu2O NPs.
Figure 4. Effect of CuO and Cu2O NPs on the growth of the copper-resistant (a) P. lactis UKR1, (b) P. panacis UKR2, (c) P. veronii UKR3, and (d) P. veronii UKR4 strains.
Figure 5. Reactive oxygen species (ROS) generation in copper-resistant (a) P. lactis UKR1, (b) P. panacis UKR2, (c) P. veronii UKR3, and (d) P. veronii UKR4 treated with Cu2O or CuO NPs. Different letters represent statistically significant differences (p < 0.05) between NP type and concentration for each strain; e.g., “a” is different from “b”.
Figure 6. Representative SEM micrographs of untreated P. lactis UKR1 cells (a,b), 100 mg/L CuO NP-treated cells (c,d), and 100 mg/L Cu2O NP-treated cells (e,f). NP-treated cells show straightforward evidence of membrane injury, cytoplasmic leakage, and cell morphology alteration. White arrows indicate vesicle formation on the bacterial surface.
Figure 7. Representative TEM micrographs of untreated P. lactis UKR1 cells (a,b), 100 mg/L Cu2O NP-treated cells (c,d), and 100 mg/L CuO NP-treated cells (e,f). A considerable number of nanoparticles attached to the bacterial cells’ surface (black regular forms) can be observed in Cu2O- and CuO-treated bacteria. The scale bar represents 0.5 µm.
Figure 8. Quantification of biofilm formation (a) and bacterial biofilm remaining (b) after 48-h treatment in the absence (control) and presence of 50 mg/L and 100 mg/L Cu2O or CuO NPs. Error bars indicate standard deviations (S.D.). Different letters represent statistically significant differences (p < 0.05) between NP type and concentration for each strain; e.g., “b” is different from “c” but not from “bc”.
25 pages, 27745 KiB  
Article
Infrared and Visible Image Fusion via Sparse Representation and Guided Filtering in Laplacian Pyramid Domain
by Liangliang Li, Yan Shi, Ming Lv, Zhenhong Jia, Minqin Liu, Xiaobin Zhao, Xueyu Zhang and Hongbing Ma
Remote Sens. 2024, 16(20), 3804; https://doi.org/10.3390/rs16203804 - 13 Oct 2024
Abstract
The fusion of infrared and visible images together can fully leverage the respective advantages of each, providing a more comprehensive and richer set of information. This is applicable in various fields such as military surveillance, night navigation, environmental monitoring, etc. In this paper, a novel infrared and visible image fusion method based on sparse representation and guided filtering in Laplacian pyramid (LP) domain is introduced. The source images are decomposed into low- and high-frequency bands by the LP, respectively. Sparse representation has achieved significant effectiveness in image fusion, and it is used to process the low-frequency band; the guided filtering has excellent edge-preserving effects and can effectively maintain the spatial continuity of the high-frequency band. Therefore, guided filtering combined with the weighted sum of eight-neighborhood-based modified Laplacian (WSEML) is used to process high-frequency bands. Finally, the inverse LP transform is used to reconstruct the fused image. We conducted simulation experiments on the publicly available TNO dataset to validate the superiority of our proposed algorithm in fusing infrared and visible images. Our algorithm preserves both the thermal radiation characteristics of the infrared image and the detailed features of the visible image. Full article
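The decompose-fuse-reconstruct flow can be sketched with a toy Laplacian pyramid. This is not the paper's method: the sparse-representation rule for the low-frequency band is replaced by a plain average, and the guided-filtering/WSEML rule for the high-frequency bands by a max-absolute choice, purely to show the pyramid mechanics.

```python
import numpy as np

def downsample(img):
    """2x2 block mean (stand-in for blur-and-subsample)."""
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def upsample(img):
    """Nearest-neighbour 2x upsampling."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def lp_decompose(img, levels):
    """Laplacian pyramid: band-pass residuals plus the coarsest low-pass band."""
    bands, cur = [], img
    for _ in range(levels):
        small = downsample(cur)
        bands.append(cur - upsample(small))   # high-frequency residual
        cur = small
    bands.append(cur)                         # low-frequency base
    return bands

def lp_reconstruct(bands):
    cur = bands[-1]
    for high in reversed(bands[:-1]):
        cur = upsample(cur) + high            # inverse LP transform, exact
    return cur

def fuse(ir, vis, levels=2):
    """Toy fusion rule: max-absolute coefficient in each high-frequency band,
    average in the low-frequency band."""
    a, b = lp_decompose(ir, levels), lp_decompose(vis, levels)
    fused = [np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(a[:-1], b[:-1])]
    fused.append((a[-1] + b[-1]) / 2)
    return lp_reconstruct(fused)

img = np.random.default_rng(0).random((8, 8))
fused_img = fuse(img, img)   # fusing an image with itself returns it exactly
```

The residual-plus-upsample construction makes the pyramid perfectly invertible by definition, which is why the fidelity of the fused result depends entirely on the per-band fusion rules.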
<p>Laplacian pyramid. (<b>a</b>) Three-level Laplacian pyramid decomposition diagram; (<b>b</b>) Three-level Laplacian reconstruction diagram.</p>
Full article ">Figure 2
<p>The structure of the proposed method.</p>
Full article ">Figure 3
<p>Examples from the TNO dataset.</p>
Full article ">Figure 4
<p>Fusion results of different decomposition levels in LP. (<b>a</b>) 1 level; (<b>b</b>) 2 level; (<b>c</b>) 3 level; (<b>d</b>) 4 level; (<b>e</b>) 5 level; (<b>f</b>) 6 level.</p>
Full article ">Figure 5
<p>Results on Data 1. (<b>a</b>) ICA; (<b>b</b>) ADKLT; (<b>c</b>) MFSD; (<b>d</b>) MDLatLRR; (<b>e</b>) PMGI; (<b>f</b>) RFNNest; (<b>g</b>) EgeFusion; (<b>h</b>) LEDIF; (<b>i</b>) Proposed.</p>
Full article ">Figure 6
<p>Results on Data 2. (<b>a</b>) ICA; (<b>b</b>) ADKLT; (<b>c</b>) MFSD; (<b>d</b>) MDLatLRR; (<b>e</b>) PMGI; (<b>f</b>) RFNNest; (<b>g</b>) EgeFusion; (<b>h</b>) LEDIF; (<b>i</b>) Proposed.</p>
Figure 7
<p>Results on Data 3. (<b>a</b>) ICA; (<b>b</b>) ADKLT; (<b>c</b>) MFSD; (<b>d</b>) MDLatLRR; (<b>e</b>) PMGI; (<b>f</b>) RFNNest; (<b>g</b>) EgeFusion; (<b>h</b>) LEDIF; (<b>i</b>) Proposed.</p>
Figure 8
<p>Results on Data 4. (<b>a</b>) ICA; (<b>b</b>) ADKLT; (<b>c</b>) MFSD; (<b>d</b>) MDLatLRR; (<b>e</b>) PMGI; (<b>f</b>) RFNNest; (<b>g</b>) EgeFusion; (<b>h</b>) LEDIF; (<b>i</b>) Proposed.</p>
Figure 9
<p>Objective performance of different methods on the TNO dataset.</p>
Figure 10
<p>Results on Lytro-01. (<b>a</b>) Near focus; (<b>b</b>) Far focus; (<b>c</b>) ICA; (<b>d</b>) FusionDN; (<b>e</b>) PMGI; (<b>f</b>) U2Fusion; (<b>g</b>) LEGFF; (<b>h</b>) ZMFF; (<b>i</b>) EgeFusion; (<b>j</b>) LEDIF; (<b>k</b>) Proposed.</p>
Figure 11
<p>Objective performance of different methods on the Lytro dataset.</p>
Figure 12
<p>Objective performance of different methods on the MFI-WHU dataset.</p>
28 pages, 7076 KiB  
Article
Coupling Image-Fusion Techniques with Machine Learning to Enhance Dynamic Monitoring of Nitrogen Content in Winter Wheat from UAV Multi-Source
by Xinwei Li, Xiangxiang Su, Jun Li, Sumera Anwar, Xueqing Zhu, Qiang Ma, Wenhui Wang and Jikai Liu
Agriculture 2024, 14(10), 1797; https://doi.org/10.3390/agriculture14101797 - 12 Oct 2024
Viewed by 364
Abstract
Plant nitrogen concentration (PNC) is a key indicator reflecting the growth and development status of plants. The timely and accurate monitoring of plant PNC is of great significance for the refined management of crop nutrition in the field. The rapidly developing sensor technology [...] Read more.
Plant nitrogen concentration (PNC) is a key indicator reflecting the growth and development status of plants. The timely and accurate monitoring of plant PNC is of great significance for the refined management of crop nutrition in the field. Rapidly developing sensor technology provides a powerful means for monitoring crop PNC. Although RGB images have rich spatial information, they lack the spectral information of the red-edge and near-infrared bands, which are more sensitive to vegetation. Conversely, multispectral images offer superior spectral resolution but typically lag behind RGB images in spatial detail. Therefore, the purpose of this study is to improve the accuracy and efficiency of crop PNC monitoring by combining the advantages of RGB and multispectral images through image-fusion technology. This study was based on the booting, heading, and early-filling stages of winter wheat, synchronously acquiring UAV RGB and MS data, using the Gram–Schmidt (GS) and principal component (PC) image-fusion methods to generate fused images, and evaluating them with multiple image-quality indicators. Subsequently, models for predicting wheat PNC were constructed using machine learning algorithms such as RF, GPR, and XGB. The results show that the RGB_B1 image contains richer image information and more image details than the other bands. The GS image-fusion method is superior to the PC method, and the performance of fusing high-resolution RGB_B1 band images with MS images using the GS method is optimal. After image fusion, the correlation between vegetation indices (VIs) and wheat PNC was enhanced to varying degrees across growth periods, significantly improving the responsiveness of the spectral information to wheat PNC. To comprehensively assess the potential of fused images in estimating wheat PNC, this study fully compared the performance of PNC models before and after fusion using machine learning algorithms such as Random Forest (RF), Gaussian Process Regression (GPR), and eXtreme Gradient Boosting (XGB). The results show that the model established from the fused images has high stability and accuracy within a single growth period, across multiple growth periods, and under different varieties and nitrogen treatments, making it significantly better than the MS-image model. The most significant enhancements were during the booting to early-filling stages, particularly with the RF algorithm, which achieved an 18.8% increase in R2, a 26.5% increase in RPD, and a 19.7% decrease in RMSE. This study provides an effective technical means for the dynamic monitoring of crop nutritional status and strong technical support for the precise management of crop nutrition. Full article
(This article belongs to the Section Digital Agriculture)
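The idea behind component-substitution fusion such as GS pan-sharpening — inject the spatial detail of the high-resolution band into each multispectral band in proportion to how strongly that band covaries with a synthetic low-resolution intensity — can be sketched as follows. This is a simplified, hypothetical illustration (mean-based intensity, covariance-based gains), not the GS implementation used in the study:

```python
import numpy as np

def gs_like_fusion(ms, pan):
    """Component-substitution fusion sketch.

    ms  : (bands, H, W) multispectral stack, already resampled to the pan grid
    pan : (H, W) high-spatial-resolution band (e.g., a sharp RGB band)
    """
    ms = np.asarray(ms, dtype=float)
    pan = np.asarray(pan, dtype=float)
    intensity = ms.mean(axis=0)        # synthetic low-resolution intensity
    detail = pan - intensity           # spatial detail to inject
    fused = np.empty_like(ms)
    for k, band in enumerate(ms):
        # injection gain: covariance of the band with the intensity component
        c = np.cov(band.ravel(), intensity.ravel())
        gain = c[0, 1] / (c[1, 1] + 1e-12)
        fused[k] = band + gain * detail
    return fused
```

A useful sanity check on any component-substitution scheme is that when the high-resolution band carries no extra detail (pan equals the synthetic intensity), the multispectral bands pass through unchanged.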
Show Figures

Figure 1
<p>Chuzhou City, Anhui Province (<b>A</b>) and experiment design (<b>B</b>), with (<b>C</b>–<b>E</b>) representing the booting stage, heading stage, and early-filling stage, respectively.</p>
Figure 2
<p>Image-fusion process flowchart. (<b>A</b>) represents the PC image-fusion method; (<b>B</b>) represents the GS image-fusion method.</p>
Figure 3
<p>Flow chart in this study.</p>
Figure 4
<p>Comprehensive strategy for model construction.</p>
Figure 5
<p>Information entropy of RGB band images of wheat canopy. B1, B2, and B3 correspond to the red, green, and blue bands of the RGB imagery, respectively.</p>
Figure 6
<p>Correlation change of band information and wheat PNC before and after image fusion.</p>
Figure 7
<p>Correlation change of VIs and wheat PNC before and after image fusion.</p>
Figure 8
<p>Compared to the MS model, the improvements of the fusion image model from the <span class="html-italic">R</span><sup>2</sup> perspective.</p>
Figure 9
<p>Compared to the MS model, the improvements of the fusion image model from the <span class="html-italic">RMSE</span> and <span class="html-italic">RPD</span> perspective.</p>
Figure 10
<p>Estimating PNC across different varieties. V1, V2, and V3 represent Huaimai 44, Yannong 999, and Ningmai 13, respectively.</p>
Figure 11
<p>Estimating PNC across different nitrogen treatments. N0, N1, N2, and N3 represent 0 kg/ha, 100 kg/ha, 200 kg/ha, and 300 kg/ha, respectively.</p>
Figure 12
<p>The importance and interaction of variables within the model. Vint represents variable interactions, and Vimp represents variable importance.</p>
Figure 13
<p>Winter wheat PNC spatiotemporal distribution map. (<b>a</b>) represents the measured PNC, and (<b>b</b>) represents the predicted PNC.</p>
Figure 14
<p>The RF algorithm’s Vint and Vimp from 100 cycles of sampling for both fusion and MS images during the heading stage. Vint represents variable interactions, and Vimp represents variable importance.</p>
Figure 15
<p>Difference treatment on the correlation (|r|) between feature variables in the fusion image and MS image.</p>
27 pages, 14919 KiB  
Article
Marine Microplastic Classification by Hyperspectral Imaging: Case Studies from the Mediterranean Sea, the Strait of Gibraltar, the Western Atlantic Ocean and the Bay of Biscay
by Roberta Palmieri, Silvia Serranti, Giuseppe Capobianco, Andres Cózar, Elisa Martí and Giuseppe Bonifazi
Appl. Sci. 2024, 14(20), 9310; https://doi.org/10.3390/app14209310 (registering DOI) - 12 Oct 2024
Viewed by 351
Abstract
In this work, a comprehensive characterization of microplastic samples collected from unique geographical locations, including the Mediterranean Sea, Strait of Gibraltar, Western Atlantic Ocean and Bay of Biscay utilizing advanced hyperspectral imaging (HSI) techniques working in the short-wave infrared range (1000–2500 nm) is [...] Read more.
In this work, a comprehensive characterization of microplastic samples collected from distinct geographical locations, including the Mediterranean Sea, the Strait of Gibraltar, the Western Atlantic Ocean and the Bay of Biscay, is presented, utilizing advanced hyperspectral imaging (HSI) techniques working in the short-wave infrared range (1000–2500 nm). In more detail, an ad hoc hierarchical classification approach was developed and applied to optimize the identification of polymers. Morphological and morphometrical attributes of microplastic particles were simultaneously measured by digital image processing. Results showed that the collected microplastics are mainly composed, in decreasing order of abundance, of polyethylene (PE), polypropylene (PP), polystyrene (PS) and expanded polystyrene (EPS), in agreement with the literature data on marine microplastics. The investigated microplastics belong to the fragment (86.8%), line (9.2%) and film (4.0%) categories. Rigid (thick-walled) fragments were found at all sampling sites, while film-type microplastics and lines were absent in some samples from the Mediterranean Sea and the Western Atlantic Ocean. Rigid fragments and lines are mainly made of PE, whereas PP is the most common polymer in the film category. The average Feret diameter of microplastic fragments decreases from EPS (3–4 mm) to PE (2–3 mm) and PP (1–2 mm). The adopted strategies illustrate that the HSI-based approach enables the classification of the polymers constituting microplastic particles and, at the same time, allows them to be measured and classified by shape. Such multiple characterization of microplastic samples at the individual level is proposed as a useful tool to explore the environmental selection of microplastic features (i.e., composition, category, size, shape) and to advance the understanding of the role of weathering, hydrodynamics and other phenomena in their transport and fragmentation. Full article
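Pixel spectra from an HSI cube are normally pre-processed before building a PLS-DA model; standard normal variate (SNV) scaling is one common choice for SWIR reflectance data because it removes multiplicative scatter differences between particles. The sketch below is illustrative only — the paper's exact pre-processing chain is not restated here:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row)
    by its own mean and standard deviation."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std
```

After SNV, every spectrum has zero mean and unit variance, so the downstream classifier responds to spectral shape (absorption features) rather than overall brightness.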
Show Figures

Figure 1
<p>Sampling sites in the Western Atlantic Ocean, Bay of Biscay, Strait of Gibraltar and Mediterranean Sea.</p>
Figure 2
<p>Dendrogram showing the hierarchical PLS-DA model built to classify the four different polymers constituting microplastic particles: PE, PP, EPS, PS and not identified (NI).</p>
Figure 3
<p>Composition of the training dataset of microplastics. Polyethylene (PE), polypropylene (PP), polystyrene (PS) and expanded polystyrene (EPS).</p>
Figure 4
<p>Average raw reflectance spectra in the SWIR range (1000–2500 nm) of the reference microplastic particles acquired by HSI device and used as training set.</p>
Figure 5
<p>Average pre-processed reflectance spectra of the different polymers (<b>a</b>) and PCA score plot (PC1–PC2–PC5) (<b>b</b>) related to Rule 1.</p>
Figure 6
<p>Average pre-processed reflectance spectra of the different polymers (<b>a</b>) and PCA score plot (PC1–PC2–PC5) (<b>b</b>) related to Rule 2.</p>
Figure 7
<p>Source digital images and corresponding false color prediction maps obtained from the HSI-hierarchical model of some of the examined microplastic samples from the Mediterranean Sea and the Strait of Gibraltar obtained by the application of the hierarchical PLS-DA model.</p>
Figure 8
<p>Source digital images and corresponding false color prediction maps obtained from the HSI-hierarchical model of some of the examined microplastic samples from the Western Atlantic Ocean (E4 Sermiento) and the Bay of Biscay (ETO 16 and NST 41) were obtained following the application of the hierarchical PLS-DA model.</p>
Figure 9
<p>Percentages of different identified polymers in the investigated marine microplastic samples collected in different areas. NI: not identified.</p>
Figure 10
<p>Abundance of microplastic categories in each analyzed sample.</p>
Figure 11
<p>Percentage of polymer types in each analyzed marine microplastic category such as fragments, films and lines.</p>
Figure 12
<p>Percentage abundance of polymer types at each sampling site, categorized by fragments, films and lines.</p>
Figure 13
<p>Maximum Feret diameter frequency distribution (in number) for PE, PP, EPS, PS.</p>
Figure 14
<p>Area frequency distribution (in number) for PE, PP, EPS and PS fragments.</p>
Figure 15
<p>Perimeter frequency distribution (in number) for PE, PP, EPS and PS fragments.</p>
Figure 16
<p>Circularity frequency distribution (in number) for PE, PP, EPS and PS fragments.</p>
Figure 17
<p>Example of microplastic items, made of different polymers (i.e., PP, PE and EPS), showing different circularity values.</p>
Figure A1
<p>Digital images and corresponding predicted images obtained after the application of hierarchical PLS-DA model of the microplastic samples coming from the Strait of Gibraltar and Mediterranean Sea: (<b>a</b>) A02DS, (<b>b</b>) A06NS, (<b>c</b>) A27NS and (<b>d</b>) A34NS.</p>
Figure A2
<p>Digital images and corresponding predicted images obtained after the application of hierarchical PLS-DA model of the microplastic samples coming from the Western Atlantic Ocean: E4-Sarmiento.</p>
Figure A3
<p>Digital images and corresponding predicted images obtained after the application of hierarchical PLS-DA model of the microplastic samples coming from Bay of Biscay: ETO 16.</p>
Figure A4
<p>Digital images and corresponding predicted images obtained after the application of hierarchical PLS-DA model of the microplastic samples coming from Bay of Biscay: NST41.</p>
20 pages, 3947 KiB  
Article
Modeling of Biologically Effective Daily Radiant Exposures over Europe from Space Using SEVIRI Measurements and MERRA-2 Reanalysis
by Agnieszka Czerwińska and Janusz Krzyścin
Remote Sens. 2024, 16(20), 3797; https://doi.org/10.3390/rs16203797 - 12 Oct 2024
Viewed by 210
Abstract
Ultraviolet solar radiation at the Earth’s surface significantly impacts both human health and ecosystems. A biologically effective daily radiant exposure (BEDRE) model is proposed for various biological processes with an analytical formula for its action spectrum. The following processes are considered: erythema formation, [...] Read more.
Ultraviolet solar radiation at the Earth’s surface significantly impacts both human health and ecosystems. A biologically effective daily radiant exposure (BEDRE) model is proposed for various biological processes with an analytical formula for its action spectrum. The following processes are considered: erythema formation, previtamin D3 synthesis, psoriasis clearance, and inactivation of SARS-CoV-2 virions. The BEDRE model is constructed by multiplying the synthetic BEDRE value under cloudless conditions by a cloud modification factor (CMF) parameterizing the attenuation of radiation via clouds. The CMF is an empirical function of the solar zenith angle (SZA) at midday and the daily clearness index from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) measurements on board the second-generation Meteosat satellites. Total column ozone, from MERRA-2 reanalysis, is used in calculations of clear-sky BEDRE values. The proposed model was trained and validated using data from several European ground-based spectrophotometers and biometers for the periods 2014–2023 and 2004–2013, respectively. The model provides reliable estimates of BEDRE for all biological processes considered. Under snow-free conditions and SZA < 45° at midday, bias and standard deviation of observation-model differences are approximately ±5% and 15%, respectively. The BEDRE model can be used as an initial validation tool for ground-based UV data. Full article
(This article belongs to the Section Environmental Remote Sensing)
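The model's multiplicative structure — a clear-sky radiant exposure attenuated by a cloud modification factor driven by the daily clearness index — can be illustrated with a toy sketch. The linear CMF form and its coefficients below are placeholders; the paper's empirical CMF is fitted to ground-based observations and also depends on the noon SZA:

```python
import numpy as np

def cloud_modification_factor(clearness_index, a=1.0, b=0.9):
    """Toy CMF: ~a on a fully clear day (clearness index ~1), decreasing
    as clouds thicken. Coefficients a, b are illustrative placeholders,
    not the fitted values from the paper."""
    k = np.clip(clearness_index, 0.0, 1.0)
    return np.clip(a - b * (1.0 - k), 0.0, 1.0)

def bedre_all_sky(bedre_clear_sky, clearness_index):
    """All-sky biologically effective daily radiant exposure:
    clear-sky value scaled by the cloud modification factor."""
    return bedre_clear_sky * cloud_modification_factor(clearness_index)
```

The key property being illustrated is that the CMF is bounded in [0, 1] and recovers the clear-sky exposure when the satellite-derived clearness index indicates a cloudless day.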
Show Figures

Figure 1
<p>The location of the UV measuring stations shown in <a href="#remotesensing-16-03797-t001" class="html-table">Table 1</a> (created with Google My Maps: Map data 2024).</p>
Figure 2
<p>Normalized action spectra for the specific biological effects: erythema appearance (black), photosynthesis of previtamin D<sub>3</sub> in human skin (blue), psoriasis clearance (green), and inactivation of SARS-CoV-2 virions (red).</p>
Figure 3
<p>Scatter plot of UBE model against measured daily erythemal radiant exposure at Belsk for all-sky conditions for different ranges of noon SZA: (<b>a</b>) SZA &lt; 45°; (<b>b</b>) SZA ≥ 45° and SZA &lt; 60°; (<b>c</b>) SZA ≥ 60°. The dotted line is the 1–1 agreement line. The solid curve represents smoothed values from the LOWESS filter [<a href="#B43-remotesensing-16-03797" class="html-bibr">43</a>].</p>
Figure 4
<p>Scatter plot RE<sub>BIOL</sub>(D) from the UBE model with <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">α</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">β</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math> when SZA<sub>N</sub> &lt; 45° versus corresponding values from spectral measurements at Belsk for the period 2011–2023: (<b>a</b>) for VITD, (<b>b</b>) for PSOR, and (<b>c</b>) for SARS.</p>
Figure A1
<p>Scatter plot of the modeled (UBE model with <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">α</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">β</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math>) versus the measured daily radiant exposure for all-sky conditions and different ranges of SZA at noon: (<b>a</b>) Reading for SZA<sub>N</sub> &lt; 45°; (<b>b</b>) Reading for SZA<sub>N</sub> ≥ 45° and SZA<sub>N</sub> &lt; 60°; (<b>c</b>) Reading for SZA ≥ 60°; (<b>d</b>) Vienna for SZA<sub>N</sub> &lt;45°; (<b>e</b>) Vienna for SZA<sub>N</sub> ≥ 45° and SZA<sub>N</sub> &lt; 60°; (<b>f</b>) Vienna for SZA ≥ 60°. The dotted line is the 1–1 agreement line. The solid curve represents smoothed values from the Lowess filter [<a href="#B43-remotesensing-16-03797" class="html-bibr">43</a>].</p>
Figure A2
<p>Scatter plot of the modeled (UBE model with <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">α</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">β</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math>) versus measured daily erythemal radiant exposure for different ranges of noon SZA: (SZA &lt; 45°; SZA ≥ 45° and SZA &lt; 60°; and SZA ≥ 60°: (<b>a</b>–<b>c</b>) Diekirch (Luxembourg); (<b>d</b>–<b>f</b>) Uccle (Belgium); (<b>g</b>–<b>i</b>) Davos (Switzerland); (<b>j</b>–<b>l</b>) Chisinau (Moldavia). As these stations were not used in UBE training, all available daily data in the period 2004–2023 have been used. The dotted line is the 1–1 perfect agreement line. The solid curve represents smoothed values from the Lowess filter [<a href="#B43-remotesensing-16-03797" class="html-bibr">43</a>].</p>
Figure A3
<p>Scatter plot of the modeled (UBE model with <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">α</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <msub> <mrow> <mover accent="true"> <mrow> <mi mathvariant="sans-serif">β</mi> </mrow> <mo stretchy="false">^</mo> </mover> </mrow> <mrow> <mi mathvariant="normal">E</mi> <mi mathvariant="normal">R</mi> <mi mathvariant="normal">Y</mi> <mi mathvariant="normal">T</mi> </mrow> </msub> </mrow> </semantics></math>) versus the measured daily radiant exposure for all-sky conditions and SZA at noon less than 45°: (<b>a</b>) VITD for Reading; (<b>b</b>) PSOR for Reading; (<b>c</b>) SARS for Reading; (<b>d</b>) VITD for Uccle; (<b>e</b>) PSOR for Uccle; (<b>f</b>) SARS for Uccle. The dotted line is the 1–1 agreement line. The solid curve represents smoothed values from the Lowess filter [<a href="#B43-remotesensing-16-03797" class="html-bibr">43</a>].</p>
15 pages, 15095 KiB  
Article
Galaxy Classification Using EWGC
by Yunyan Nie, Zhiren Pan, Jianwei Zhou, Bo Qiu, A-Li Luo, Chong Luo and Xiaodong Luan
Universe 2024, 10(10), 394; https://doi.org/10.3390/universe10100394 - 12 Oct 2024
Viewed by 159
Abstract
The Enhanced Wide-field Galaxy Classification Network (EWGC) is a novel architecture designed to classify spiral and elliptical galaxies using Wide-field Infrared Survey Explorer (WISE) images. The EWGC achieves an impressive classification accuracy of 90.02%, significantly outperforming the previously developed WGC network and underscoring [...] Read more.
The Enhanced Wide-field Galaxy Classification Network (EWGC) is a novel architecture designed to classify spiral and elliptical galaxies using Wide-field Infrared Survey Explorer (WISE) images. The EWGC achieves an impressive classification accuracy of 90.02%, significantly outperforming the previously developed WGC network and underscoring its superior performance in galaxy morphology classification. Remarkably, the network demonstrates a consistent accuracy of 90.02% when processing both multi-target and single-target images. Such robustness indicates the EWGC’s versatility and potential for various applications in galaxy classification tasks. Full article
(This article belongs to the Section Astroinformatics and Astrostatistics)
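Composing three infrared bands into a false-color image, as done for the WISE W1/W2/W3 cutouts with astropy's make_lupton_rgb, rests on an arcsinh intensity stretch that compresses dynamic range while preserving band ratios. A bare-bones NumPy version of that idea follows (the stretch and Q values are illustrative, not those used for the figures):

```python
import numpy as np

def arcsinh_rgb(r, g, b, stretch=5.0, q=8.0):
    """Lupton-style composite: scale all three bands by the same
    arcsinh-stretched total intensity, then clip to [0, 1]."""
    r, g, b = (np.asarray(c, dtype=float) for c in (r, g, b))
    i = (r + g + b) / 3.0
    # one shared scale factor per pixel keeps band (color) ratios intact
    factor = np.arcsinh(q * i / stretch) / (q * np.maximum(i, 1e-12))
    rgb = np.stack([r * factor, g * factor, b * factor], axis=-1)
    return np.clip(rgb, 0.0, 1.0)
```

Because every band at a pixel is multiplied by the same factor, bright galaxy cores are compressed without washing out the color information that the classifier relies on.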
Show Figures

Figure 1
<p>Examples of synthesized samples using make_lupton_rgb function. (<b>a</b>) Images of the WISE <span class="html-italic">W</span>1, <span class="html-italic">W</span>2, and <span class="html-italic">W</span>3 bands from left to right. (<b>b</b>) Image composition.</p>
Figure 2
<p>Example of image preprocessing. (<b>a</b>) Original image; (<b>b</b>) enhancement; (<b>c</b>) pixel map obtained by subtracting (<b>a</b>) from (<b>b</b>) for the composite map. (<b>d</b>) Pixel map of each channel map obtained by subtracting (<b>a</b>) from (<b>b</b>). From left to right, the first is the <span class="html-italic">W</span>1 channel, the second is the <span class="html-italic">W</span>2 channel, and the third is the <span class="html-italic">W</span>3 channel.</p>
Figure 3
<p>Diagram of EWGC: C is the number of channels, H is the height of the input image, W is the width of the input image, N is the number of input images, and M is the number of modules.</p>
Figure 4
<p>Sampling-based dynamic upsampling and module designs in Dysample. The input feature, upsampled feature, generated offset, and original grid are denoted by <span class="html-italic">X</span>, <span class="html-italic">X</span>′, <span class="html-italic">O</span>, and g, respectively. (<b>a</b>) The sampling set is generated by the sampling point generator, with which the input feature is re-sampled by the grid sample function. In generator (<b>b</b>), the sampling set is the sum of the generated offset and the original grid position. A version of the “static scope factor” is used, where the offset is generated with a linear layer.</p>
Figure 5
<p>Initial sampling positions and offset scopes. The points and the colored masks represent the initial sampling positions and the offset scopes, respectively. Considering sampling four points (s = 2), (<b>a</b>) in the case of nearest initialization, the four offsets share the same initial position but ignore position relation; in bilinear initialization (<b>b</b>), it separates the initial positions such that they distribute evenly. Without offset modulation (<b>b</b>), the offset scope would typically overlap, so in (<b>c</b>) it locally constrains the offset scope to reduce the overlap.</p>
Figure 6
<p>Diagram of EWGC_mag network. In line with the concept of multimodal feature fusion, the magnitude information from <span class="html-italic">W</span>1 to <span class="html-italic">W</span>3 is concatenated to the end of the flattened feature extracted using EWGC. A fully connected layer processes the fused multimodal information to perform the classification operation.</p>
Figure 7
<p>Accuracy of different model training and validation sets.</p>
17 pages, 4833 KiB  
Article
Fabrication and Properties of Hydrogel Dressings Based on Genipin Crosslinked Chondroitin Sulfate and Chitosan
by Ling Wang, Xiaoyue Ding, Xiaorui He, Ning Tian, Peng Ding, Wei Guo, Oseweuba Valentine Okoro, Yanfang Sun, Guohua Jiang, Zhenzhong Liu, Armin Shavandi and Lei Nie
Polymers 2024, 16(20), 2876; https://doi.org/10.3390/polym16202876 - 11 Oct 2024
Viewed by 268
Abstract
Multifunctional hydrogel dressings remain highly sought after for the promotion of skin wound regeneration. In the present study, multifunctional CHS-DA/HACC (CH) hydrogels with an interpenetrated network were constructed using hydroxypropyl trimethyl ammonium chloride modified chitosan (HACC) and dopamine-modified chondroitin sulfate (CHS-DA), using genipin [...] Read more.
Multifunctional hydrogel dressings remain highly sought after for the promotion of skin wound regeneration. In the present study, multifunctional CHS-DA/HACC (CH) hydrogels with an interpenetrating network were constructed from hydroxypropyl trimethyl ammonium chloride-modified chitosan (HACC) and dopamine-modified chondroitin sulfate (CHS-DA), using genipin as the crosslinker. The synthesis of HACC and CHS-DA was confirmed using Fourier transform infrared (FT-IR) analysis and 1H nuclear magnetic resonance (1H NMR) spectroscopy. The prepared CH hydrogels exhibited a microstructure with a network of interconnected pores. Furthermore, rheological testing demonstrated that the CH hydrogels exhibit strong mechanical properties, stability, and injectability. Further characterization showed that the CH hydrogels possess favorable self-healing and self-adhesion properties. An increasing HACC ratio was also positively correlated with the antibacterial activity of the CH hydrogels, as evidenced by their activity against Escherichia coli and Staphylococcus aureus. Additionally, Cell Counting Kit-8 (CCK-8) tests, fluorescence images, and a cell scratch assay demonstrated that the CH hydrogels have good biocompatibility and promote cell migration. These multifunctional interpenetrating-network hydrogels were shown to have good antibacterial and antioxidant properties, stable storage and loss moduli, injectability, self-healing behavior, and biocompatibility, highlighting their potential as dressings in wound healing applications. Full article
(This article belongs to the Special Issue Bioactive and Biomedical Hydrogel Dressings for Wound Healing)
Figure 1
<p>(<b>a</b>) Chondroitin sulfate (CHS), DA, and CHS-DA FTIR spectra. (<b>b</b>) <sup>1</sup>H NMR spectra comparing CHS with the synthesized CHS-DA. (<b>c</b>) Chitosan and HACC FT-IR spectra. (<b>d</b>) The spectra (<sup>1</sup>H NMR) of chitosan and the produced HACC.</p>
Figure 2">
Figure 2
<p>Characterization of the synthesized CH hydrogels: (<b>a</b>) FT-IR spectra of CH hydrogels. (<b>b</b>) Stress–strain profiles of CH hydrogels. (<b>c</b>) SEM images of CH hydrogels at various magnifications. (<b>d</b>) Pore size distribution of CH hydrogels was calculated from SEM images using ImageJ software (1.54g version). (<b>e</b>) Equilibrium swelling ratio of CH hydrogels in PBS. * <span class="html-italic">p</span> &lt; 0.05 and ** <span class="html-italic">p</span> &lt; 0.01.</p>
Figure 3">
Figure 3
<p>Rheological analysis of the CH hydrogels: the storage modulus (G′) and loss modulus (G″) were measured as functions of frequency (<b>a</b>–<b>c</b>) and time (<b>d</b>–<b>f</b>). In addition, the frequency was kept at 1 Hz during G′ and G″ testing over time. Panels (<b>g</b>–<b>i</b>) show the variations in viscosity of the CH hydrogels with shear rate, and viscosity could reflect a fluid’s resistance to a change in shape or movement. The inset images illustrate that the CH hydrogels can be injected using a syringe.</p>
Figure 4">
Figure 4
<p>(<b>a</b>) A three-step cyclic strain test was conducted using a rheometer to assess the self-healing capabilities of the CH hydrogels. (<b>b</b>) The self-healing properties of the CH hydrogels were further examined through macroscopic observation, where the hydrogels were cut into two pieces and rejoined, and their contact morphology was documented. (<b>c</b>) Images showing the adhesion of CH hydrogels to various substrates, including glass, human skin, rubber, plastic, wood, and metal.</p>
Figure 5">
Figure 5
<p>(<b>a</b>) Images showing the inhibition zones created by CH hydrogels against <span class="html-italic">E. coli</span> and <span class="html-italic">S. aureus</span> after 8 h, demonstrating their antibacterial properties. Note that the left and right images represent repeated experiments. (<b>b</b>,<b>c</b>) The radius of the inhibition zones for CH hydrogels against (<b>b</b>) <span class="html-italic">S. aureus</span> and (<b>c</b>) <span class="html-italic">E. coli</span>, calculated based on the observed zones. (<b>d</b>) DPPH radical scavenging rate of CH hydrogels within 30 min, highlighting their antioxidant activity. * <span class="html-italic">p</span> &lt; 0.05 and *** <span class="html-italic">p</span> &lt; 0.001.</p>
Figure 6">
Figure 6
<p>Biocompatibility and cell migration of CH hydrogels. (<b>a</b>) Fluorescent microscopy images of NIH-3T3 cells cultured with CH hydrogel extracts for 1, 3, and 5 days, respectively. Cells were stained with Calcein-AM/PI; scale bars: 200 μm. (<b>b</b>) Hydrogel cytocompatibilities were assessed with NIH-3T3 cells at various time points using the CCK-8 assay. (<b>c</b>) Scratch assay results for NIH-3T3 cells after 8 h of incubation, with blue dotted lines representing the width of the scratch gap. Scale bar: 100 μm. (<b>d</b>) The migration rate of NIH-3T3 cells based on scratch assay results. * <span class="html-italic">p</span> &lt; 0.05, ** <span class="html-italic">p</span> &lt; 0.01, and *** <span class="html-italic">p</span> &lt; 0.001.</p>
Scheme 1">
Scheme 1
<p>The schematic representation of CH hydrogel preparation: (<b>a</b>) Synthesis of dopamine-modified chondroitin sulfate (CHS-DA), (<b>b</b>) Production of hydroxypropyl trimethyl ammonium chloride-modified chitosan (HACC), and (<b>c</b>) Fabrication of CH hydrogel with an interpenetrated network structure via genipin crosslinking polymers of HACC and CHS-DA.</p>
">
25 pages, 13668 KiB  
Article
Predicting Rock Hardness and Abrasivity Using Hyperspectral Imaging Data and Random Forest Regressor Model
by Saleh Ghadernejad and Kamran Esmaeili
Remote Sens. 2024, 16(20), 3778; https://doi.org/10.3390/rs16203778 - 11 Oct 2024
Abstract
This study aimed to develop predictive models for rock hardness and abrasivity based on hyperspectral imaging data, providing valuable information without interrupting mining processes. The data collection stage first involved scanning 159 rock samples collected from 6 different blasted rock piles using visible and near-infrared (VNIR) and short-wave infrared (SWIR) sensors. The hardness and abrasivity of the samples were then determined through Leeb rebound hardness (LRH) and Cerchar abrasivity index (CAI) tests, respectively. Data preprocessing involved radiometric correction, background removal, and stacking of the VNIR and SWIR images. An integrated approach based on K-means clustering and the band ratio concept was employed for feature extraction, resulting in 28 band-ratio-based features. Afterward, the random forest regressor (RFR) algorithm was employed to develop separate predictive models for rock hardness and abrasivity. The performance assessment showed that the developed models can estimate the rock hardness and abrasivity of unseen data with R2 scores of 0.74 and 0.79, respectively, with the most influential features located mainly within the SWIR region. The results indicate that integrating hyperspectral data with the RFR technique has strong potential for practical and efficient rock hardness and abrasivity characterization during mining.
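The two building blocks of the pipeline described above — band-ratio features taken from cluster-mean spectra, and R²-based model evaluation — can be sketched as follows. This is a hedged illustration with synthetic numbers, not the authors' code; the wavelength choices and helper names are assumptions:

```python
import numpy as np


def band_ratio(spectrum, wavelengths, num_wl, den_wl):
    """Band-ratio feature: reflectance at one wavelength divided by
    reflectance at another, taken at the nearest sampled bands."""
    num = spectrum[np.argmin(np.abs(wavelengths - num_wl))]
    den = spectrum[np.argmin(np.abs(wavelengths - den_wl))]
    return num / den


def r2_score(y_true, y_pred):
    """Coefficient of determination, as used to assess the regression models."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot


# Synthetic mean spectral curve (e.g., one K-means cluster centroid);
# wavelengths in nm, chosen from the SWIR region for illustration only.
wavelengths = np.array([1000.0, 1400.0, 1900.0, 2200.0])
spectrum = np.array([0.40, 0.32, 0.20, 0.28])
feature = band_ratio(spectrum, wavelengths, 2200.0, 1900.0)  # 0.28 / 0.20
```

In practice each of the 28 features would be such a ratio computed per sample, fed to a random forest regressor (e.g., scikit-learn's `RandomForestRegressor`), and scored with R² on held-out samples.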
(This article belongs to the Section Remote Sensing in Geology, Geomorphology and Hydrology)
Figure 1
<p>The rock hardness and rock abrasivity footprints on the entire mine-to-mill process.</p>
Figure 2">
Figure 2
<p>Workflow of developing predictive models for rock hardness and abrasivity using hyperspectral data.</p>
Figure 3">
Figure 3
<p>Data collection steps: (<b>a</b>) hyperspectral imaging system, (<b>b</b>) LRH test, (<b>c</b>) CAI test.</p>
Figure 4">
Figure 4
<p>The schematic illustration of (<b>a</b>) the hyperspectral scanning using HySpex VS-620 and (<b>b</b>) the search algorithm for stacking VNIR and SWIR images.</p>
Figure 5">
Figure 5
<p>The results of the preprocessing of hyperspectral data.</p>
Figure 6">
Figure 6
<p>Boxplots of (<b>a</b>) mean HLD value and (<b>b</b>) mean CAI value of different sampling locations.</p>
Figure 7">
Figure 7
<p>Distribution of (<b>a</b>) mean HLD value and (<b>b</b>) mean CAI value with respect to the considered thresholds.</p>
Figure 8">
Figure 8
<p>(<b>a</b>) The ratio of HLD classes within the training dataset, (<b>b</b>) the ratio of HLD classes within the testing dataset, (<b>c</b>) the ratio of CAI classes within the training dataset, and (<b>d</b>) the ratio of CAI classes within the testing dataset.</p>
Figure 9">
Figure 9
<p>True color illustration of the mosaic images (training dataset: 127 rock samples).</p>
Figure 10">
Figure 10
<p>Visualization of the elbow method used for determining the optimum number of clusters.</p>
Figure 11">
Figure 11
<p>The mean spectral curves of the K-means analysis with 7 clusters (training dataset).</p>
Figure 12">
Figure 12
<p>The process of feature extraction on the dominant spectral curves obtained from the K-means clustering analysis.</p>
Figure 13">
Figure 13
<p>Pairwise correlation matrix of different band ratios along with HLD and CAI values on the training dataset (red boxes highlight the potential relationship between band ratios, HLD, and CAI values).</p>
Figure 14">
Figure 14
<p>The distribution of the last three selected band ratios, color-coded based on the HLD and CAI classes.</p>
Figure 15">
Figure 15
<p>The relationship between HLD and CAI values [<a href="#B53-remotesensing-16-03778" class="html-bibr">53</a>].</p>
Figure 16">
Figure 16
<p>The structure of the RFR algorithm.</p>
Figure 17">
Figure 17
<p>The performance evaluation of the developed models: (<b>a</b>) 1:1 plot for the HLD model on the training dataset, (<b>b</b>) 1:1 plot for the HLD model on the testing dataset, (<b>c</b>) residual plot for the HLD model on the testing dataset, (<b>d</b>) 1:1 plot for the CAI model on the training dataset, (<b>e</b>) 1:1 plot for the CAI model on the testing dataset, (<b>f</b>) residual plot for the CAI model on the testing dataset.</p>
Figure 18">
Figure 18
<p>SHAP feature importance analysis in (<b>a</b>) the HLD model and (<b>b</b>) the CAI model.</p>
Figure 19">
Figure 19
<p>Comparison of <span class="html-italic">R</span><sup>2</sup> values for different predictive models (VNIR, SWIR, VNIR-SWIR) for HLD and CAI.</p>
Figure 20">
Figure 20
<p>The results of the resampling approach on sample GS1-17: (<b>a</b>) original image, (<b>b</b>) resampling using a window size of 1, (<b>c</b>) resampling using a window size of 2, (<b>d</b>) resampling using a window size of 3, and (<b>e</b>) resampling using a window size of 4.</p>
Figure 21">
Figure 21
<p>The spectral curve comparison between the original image and resampled ones for the specified spot in <a href="#remotesensing-16-03778-f020" class="html-fig">Figure 20</a> for sample GS1-17.</p>
Figure 22">
Figure 22
<p>The performance comparison of the developed models tested using the original and resampled data for (<b>a</b>) HLD and (<b>b</b>) CAI.</p>
">
21 pages, 5672 KiB  
Article
Hydrogen Bond Integration in Potato Microstructure: Effects of Water Removal, Thermal Treatment, and Cooking Techniques
by Iman Dankar, Amira Haddarah, Montserrat Pujolà and Francesc Sepulcre
Polysaccharides 2024, 5(4), 609-629; https://doi.org/10.3390/polysaccharides5040039 - 11 Oct 2024
Abstract
Fourier-transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM) were used to study the effects of heat treatments and of water removal by freeze-drying after different time intervals (6, 12, 24, 48, and 72 h) on the molecular structure of potato tubers. SEM images show structural differences between raw (RP), microwaved (MP), and boiled potato (BP). MP showed a cracked structure. BP was able to re-associate into a granule-like structure after 6 h of freeze-drying, whereas RP had dried granules within a porous matrix after 24 h of freeze-drying. These results are consistent with the moisture content and FTIR results for MP and BP, which showed dried spectra after 6 h of freeze-drying and largely coincided with the RP results after 24 h of freeze-drying. Additionally, three types of hydrogen bonds between water and starch were characterized, and a prevalence of water very weakly bound to starch was detected. The relative crystallinity (RC) was increased by thermal treatment, with microwaving recording the highest value. A comparison of the FTIR and XRD results indicated that the freeze-drying treatment overcomes the heat effects to generate an integral starch molecule.
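Relative crystallinity from an XRD pattern is commonly computed as the crystalline peak area over the total diffraction area after subtracting the amorphous halo. A minimal sketch of that convention follows; the flat-halo baseline and the single synthetic peak are illustrative assumptions, not the article's actual baseline model:

```python
import numpy as np


def trapezoid_area(x, y):
    """Trapezoidal integration (written out to avoid depending on the
    np.trapz vs. np.trapezoid naming across NumPy versions)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.sum((y[1:] + y[:-1]) * 0.5 * np.diff(x)))


def relative_crystallinity(two_theta, intensity, amorphous_halo):
    """RC (%) = (total diffraction area - amorphous halo area) / total area * 100."""
    total = trapezoid_area(two_theta, intensity)
    amorphous = trapezoid_area(two_theta, amorphous_halo)
    return 100.0 * (total - amorphous) / total


two_theta = np.linspace(10.0, 30.0, 201)            # diffraction angle, degrees
halo = np.full_like(two_theta, 1.0)                 # assumed flat amorphous background
pattern = halo + np.exp(-((two_theta - 17.0) / 0.5) ** 2)  # one crystalline peak
rc = relative_crystallinity(two_theta, pattern, halo)
```

With real data the halo is usually estimated by fitting a smooth curve under the peaks rather than assumed flat, and RC is then compared across treatments as in the abstract.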
(This article belongs to the Special Issue Latest Research on Polysaccharides: Structure and Applications)
Figure 1
<p>Moisture content % of MP, RP, and BP as a function of lyophilization time (h).</p>
Figure 2">
Figure 2
<p>SEM micrographs of RP, MP, and BP at t<sub>0</sub> (initially before freeze-drying) and after 6, 12, 24, 48, and 72 h of freeze-drying respectively. Arrows correspond to the cavities induced inside the structure of potato due to freeze-drying.</p>
Figure 2 Cont.">
Figure 3">
Figure 3
<p>FTIR spectra for raw potato (RP), boiled potato (BP) and microwaved potato (MP) initially before freeze-drying (<b>a</b>) and after 6 h of freeze-drying (<b>b</b>).</p>
Figure 4">
Figure 4
<p>FTIR spectra for raw potato (RP), boiled potato (BP), and microwaved potato (MP) after 12 h (<b>a</b>) and 24 h (<b>b</b>) of freeze-drying.</p>
Figure 5">
Figure 5
<p>FTIR spectra for raw potato (RP), boiled potato (BP), and microwaved potato (MP) after 48 h (<b>a</b>) and 72 h (<b>b</b>) of freeze-drying.</p>
Figure 6">
Figure 6
<p>Curve fitting analysis of the un-lyophilized raw potato (RP 0 h sample). The figure displays the experimental FTIR signal (----), the simulated profile (-·-·-) and the three resolved components centered at 3618 cm<sup>−1</sup>, 3414 cm<sup>−1</sup> and 3180 cm<sup>−1</sup>.</p>
Figure 7">
Figure 7
<p>Wavenumber shift of the 3 peaks corresponding to the OH vibration in the 3700–3000 cm<sup>−1</sup> region (<b>a</b>) and that of the peak centered at 1640 cm<sup>−1</sup> that corresponds to the OH bending vibration (<b>b</b>), as a function of the freeze-drying time (h).</p>
Figure 8">
Figure 8
<p>XRD patterns of RP0, BP0, MP0 (initially before freeze-drying), and RPF (after lyophilization) (<b>a</b>), and XRD patterns of RP, BP, and MP after 6 h (<b>b</b>), 12 h (<b>c</b>), 24 h (<b>d</b>), 48 h (<b>e</b>) and 72 h (<b>f</b>) of freeze-drying respectively.</p>
Figure 9">
Figure 9
<p>Relative % of crystallinity of MP, RP and BP as a function of lyophilization time (h).</p>
">