AgriEngineering, Volume 6, Issue 1 (March 2024) – 49 articles

Cover Story (view full-size image): The integration of agricultural robots in precision farming plays a pivotal role in tackling the pressing demands of minimizing energy usage, enhancing productivity, and maximizing crop yield to meet the needs of an expanding global population and depleting non-renewable resources. Evaluating the energy expenditure is vital when assessing agricultural machinery systems. Through the reduction of fuel consumption, operational costs can be curtailed while simultaneously minimizing the overall environmental footprint left by these machines. Accurately calculating fuel usage empowers farmers to make well-informed decisions about their farming operations, resulting in more sustainable and productive methods. In this study, the ASABE model was applied to predict the fuel consumption of the studied robot. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
11 pages, 3971 KiB  
Article
Evaluation of a System to Assess Herbicide Movement in Straw under Dry and Wet Conditions
by Izabela Thais dos Santos, Ivana Paula Ferraz Santos de Brito, Ana Karollyna Alves de Matos, Valesca Pinheiro de Miranda, Guilherme Constantino Meirelles, Priscila Oliveira de Abreu, Ricardo Alcántara-de la Cruz, Edivaldo D. Velini and Caio A. Carbonari
AgriEngineering 2024, 6(1), 858-868; https://doi.org/10.3390/agriengineering6010049 - 19 Mar 2024
Cited by 1 | Viewed by 1178
Abstract
Straw from no-till cropping systems, in addition to increasing the soil organic matter content, may also impede the movement of applied herbicides into the soil and, thus, alter the behavior and fate of these compounds in the environment. Rain or irrigation before or after an herbicide treatment can either help or hinder its movement through the straw, influencing weed control. Our objective was to develop a system for herbicide application and rain simulation, enabling the evaluation of the movement of various herbicides in either dry or wet straw under different rainfall volumes (25, 50, 75, and 100 mm). The amounts of the applied herbicides that moved through the straw were collected and measured using a liquid chromatograph with a tandem mass spectrometry system (LC-MS/MS). Measurements obtained with the developed system showed a high herbicide treatment uniformity across all replications. The movement of the active ingredients through the straw varied as a function of the applied herbicide, ranging from 17% to 99%. In wet straw, the collected herbicide remained constant from 50 to 100 mm of simulated rainfall. For the wet straw, the decreasing percentages of herbicide movement through the straw to the soil were sulfentrazone (99%), atrazine and diuron (91% each), hexazinone (84%), fomesafen (80.4%), indaziflam (79%), glyphosate (63%), haloxyfop-p-methyl (45%), and S-metolachlor (27%). On the dry straw, the decreasing percentages of herbicide movement were fomesafen (88%), sulfentrazone (74%), atrazine (69.4%), hexazinone (69%), diuron (68.4%), glyphosate (48%), indaziflam (34.4%), S-metolachlor (22%), and haloxyfop-p-methyl (18%). Overall, herbicide movement was higher in wet straw (with a previous 25 mm simulated rainfall layer) than in dry straw. Some herbicides, like haloxyfop-p-methyl and indaziflam, exhibited over 50% higher movement in wet straw than in dry straw after 100 mm of simulated rain. The developed system can be adapted for various uses, serving as a valuable tool to evaluate the behavior of hazardous substances in different agricultural and environmental scenarios. Full article
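
The figures for this article fit the extracted-herbicide data with the Mitscherlich model. As a hedged illustration of that fitting step (not the authors' code; the exact parameterization and all data values below are assumptions), a Mitscherlich-type saturation curve can be fitted with SciPy:

```python
# Minimal sketch: fit a Mitscherlich-type saturation curve of herbicide
# movement vs. simulated rainfall. Parameterization and data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(rain_mm, a, b):
    """Asymptotic response: a = maximum extractable amount, b = rate constant."""
    return a * (1.0 - np.exp(-b * rain_mm))

rain = np.array([25.0, 50.0, 75.0, 100.0])   # simulated rainfall (mm)
moved = np.array([60.0, 85.0, 95.0, 98.0])   # % of applied herbicide recovered (hypothetical)

popt, pcov = curve_fit(mitscherlich, rain, moved, p0=[100.0, 0.05])
a_hat, b_hat = popt
print(f"max extractable = {a_hat:.1f}%, rate = {b_hat:.3f} per mm")
```
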
Show Figures

Figure 1. (A) Scheme of the system developed for herbicide application and rain simulation. Sequence of the equipment structures: (B) side view of the syringe compression system; (C) top view of the rain application and simulation table; and (D) front view of the rain application and simulation table. Sequence of herbicide application or rain simulation: (E) polypropylene capsule with straw treated with a repeat pipette; (F) Falcon collection tube with capsule when applying herbicides or simulating rain; (G) herbicide solution in movement to be collected (white arrow); and (H) collection of the herbicide solution for chromatographic analysis.
Figure 2. Data adjusted using the Mitscherlich model for maximum amounts of herbicides extracted in 10 t·ha⁻¹ of wet sugarcane straw (25 mm of simulated rainfall before treatment) after different rainfall simulations (mm).
Figure 3. Data adjusted using the Mitscherlich model for maximum extracted amounts of herbicides in 10 t·ha⁻¹ of dry sugarcane straw after different rainfall simulations (mm).
17 pages, 7466 KiB  
Article
A Performance Comparison of CNN Models for Bean Phenology Classification Using Transfer Learning Techniques
by Teodoro Ibarra-Pérez, Ramón Jaramillo-Martínez, Hans C. Correa-Aguado, Christophe Ndjatchi, Ma. del Rosario Martínez-Blanco, Héctor A. Guerrero-Osuna, Flabio D. Mirelez-Delgado, José I. Casas-Flores, Rafael Reveles-Martínez and Umanel A. Hernández-González
AgriEngineering 2024, 6(1), 841-857; https://doi.org/10.3390/agriengineering6010048 - 18 Mar 2024
Cited by 5 | Viewed by 1372
Abstract
The early and precise identification of the different phenological stages of the bean (Phaseolus vulgaris L.) allows for the determination of critical and timely moments for the implementation of certain agricultural activities that contribute significantly to the yield and quality of the harvest, as well as the actions necessary to prevent and control possible damage caused by pests and diseases. Typically, phenological identification is conducted by the farmer, which can lead to important findings being overlooked during the phenological development of the plant and, in turn, to the appearance of pests and diseases. In recent years, deep learning (DL) methods have been used to analyze crop behavior and minimize risk in agricultural decision making. One of the most widely used DL methods in image processing is the convolutional neural network (CNN), owing to its high capacity for learning relevant features and recognizing objects in images. In this article, a transfer learning approach and a data augmentation method were applied. A station equipped with RGB cameras was used to gather images during the complete phenological cycle of the bean. The information gathered was used to create a dataset to evaluate the performance of each of the four proposed network models: AlexNet, VGG19, SqueezeNet, and GoogleNet. The metrics used were accuracy, precision, sensitivity, specificity, and F1-score. The best architecture in validation was GoogleNet, which obtained 96.71% accuracy, 96.81% precision, 95.77% sensitivity, 98.73% specificity, and a 96.25% F1-score. Full article
(This article belongs to the Special Issue Application of Artificial Neural Network in Agriculture)
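
As a rough sketch of how the five reported metrics follow from a multi-class confusion matrix (macro-averaged, one-vs-rest; the 4×4 matrix below is invented for illustration and is not the paper's data):

```python
# Derive accuracy, precision, sensitivity, specificity, and F1-score
# from a multi-class confusion matrix (rows = true, cols = predicted).
import numpy as np

cm = np.array([[50, 2, 1, 0],
               [3, 45, 2, 1],
               [0, 2, 48, 2],
               [1, 0, 1, 52]])  # hypothetical counts for four phenological classes

tp = np.diag(cm).astype(float)
fp = cm.sum(axis=0) - tp
fn = cm.sum(axis=1) - tp
tn = cm.sum() - (tp + fp + fn)

accuracy    = tp.sum() / cm.sum()
precision   = np.mean(tp / (tp + fp))
sensitivity = np.mean(tp / (tp + fn))          # a.k.a. recall
specificity = np.mean(tn / (tn + fp))
f1          = np.mean(2 * tp / (2 * tp + fp + fn))
print(accuracy, precision, sensitivity, specificity, f1)
```
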
Show Figures

Figure 1. The architecture of the convolutional network.
Figure 2. Block diagram of the concept of transfer learning.
Figure 3. Diagram of the proposed methodology.
Figure 4. Installation of the GSM camera station in the open field for the capture of images: (a) camera station for the capture of images; (b) schematic diagram for the acquisition of images.
Figure 5. Descriptive stages of the phenology of the bean: (a) vegetative phase in primary leaves, first and third trifoliate leaves; (b) reproductive phase in prefloration and floration; (c) reproductive stage in the formation and filling of pods; (d) reproductive phase in maturation.
Figure 6. Images for training and tests per class.
Figure 7. Data augmentation in an image of the phenology of the bean: (a) original image without augmentation; (b) image after rotation; (c) image after translation; (d) image after reflection; (e) image after scaling.
Figure 8. Accuracy of models during training and validation.
Figure 9. Summary of model performance: (a) AlexNet model; (b) VGG19 model; (c) SqueezeNet model; (d) GoogleNet model.
Figure 10. Confusion matrix of CNN models: (a) AlexNet architecture; (b) VGG19 architecture; (c) SqueezeNet architecture; (d) GoogleNet architecture.
18 pages, 16463 KiB  
Article
An Effective and Affordable Internet of Things (IoT) Scale System to Measure Crop Water Use
by José O. Payero
AgriEngineering 2024, 6(1), 823-840; https://doi.org/10.3390/agriengineering6010047 - 13 Mar 2024
Cited by 4 | Viewed by 1468
Abstract
Scales are widely used in many agricultural applications, ranging from weighing crops at harvest to determine crop yields to regularly weighing animals to determine growth rate. In agricultural research applications, there is a long history of measuring crop water use (evapotranspiration [ET]) using a particular type of scale called weighing lysimeters. Typically, weighing lysimeters require very accurate data logging systems that tend to be expensive. Recent developments in open-source technologies, such as micro-controllers and Internet of Things (IoT) platforms, have created opportunities for developing effective and affordable ways to monitor crop water use and transmit the data to the Internet in near real-time. Therefore, this study aimed to create an affordable Internet of Things (IoT) scale system to measure crop ET. A scale system to monitor crop ET was developed using an Arduino-compatible microcontroller with cell phone communication, electronic load cells, an Inter-Integrated Circuit (I2C) multiplexer, and analog-to-digital converters (ADCs). The system was powered by a LiPo battery, charged by a small (6 W) solar panel. The IoT scale system was programmed to collect data from the load cells at regular time intervals and send the data to the ThingSpeak IoT platform. The system performed successfully during indoor and outdoor experiments conducted in 2023 at the Clemson University Edisto Research and Education Center, Blackville, SC. Calibrations relating the measured output of the scale load cells to changes in mass resulted in excellent linear relationships during the indoor (r2 = 1.0) and outdoor experiments (r2 = 0.9994). The results of the outdoor experiments showed that the IoT scale system could accurately measure changes in lysimeter mass during several months (Feb to Jun) without failure in data collection or transmission. The changes in lysimeter mass measured during that period reflected the same trend as concurrent soil moisture data measured at a nearby weather station. The changes in lysimeter mass measured with the IoT scale system during the outdoor experiment were accurate enough to derive daily and hourly crop ET and even detect what appeared to be dew formation during the morning hours. The IoT scale system can be built using open-source, off-the-shelf electronic components which can be purchased online and easily replaced or substituted. The system can also be developed at a fraction of the cost of data logging, communication, and visualization systems typically used for lysimeter and scale applications. Full article
(This article belongs to the Topic Current Research on Intelligent Equipment for Agriculture)
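
The core conversion behind the derived ET values is simple: a 1 kg water loss over 1 m² of lysimeter surface equals a 1 mm layer of water. A minimal sketch of that step (the surface area and mass readings below are placeholders, not the paper's data):

```python
# Turn ten-minute lysimeter mass readings into an hourly ET rate:
# ET_mm = -delta_mass_kg / area_m2, since 1 kg over 1 m^2 is a 1 mm layer.
import numpy as np

area_m2 = 0.25                                  # hypothetical lysimeter surface area
mass_kg = np.array([80.00, 79.98, 79.95, 79.93, 79.90, 79.88, 79.85])  # one reading / 10 min

et_mm_per_step = -np.diff(mass_kg) / area_m2    # ET depth per 10-minute interval
et_mm_per_hour = et_mm_per_step * 6.0           # six ten-minute steps per hour
print(et_mm_per_hour)
```
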
Show Figures

Graphical abstract
Figure 1. General diagram of the IoT scale system, which includes (A) an electronic scale, (B) a datalogger with cell phone communication, (C) an IoT Cloud server, and (D) user data access and visualization.
Figure 2. Diagram showing the layout of the different components of the IoT scale system.
Figure 3. Electronic components for IoT scale system.
Figure 4. Connections of the different electronic components of the IoT scale system.
Figure 5. View of field installation of the IoT scale system.
Figure 6. Experimental setup for the indoor and outdoor experiments. (A) Flowerpot with four load cells underneath. (B) Soil container with four load cells underneath. (C) Soil container with weeds.
Figure 7. Calibration during the outdoor experiment (24 February 2023).
Figure 8. Time series (A) and regression (B) results of load cell calibration conducted on 15 January 2023 (y = −0.01505x − 0.216; r² = 1.0).
Figure 9. Time series (A) and regression (B) results of lysimeter load cell calibration conducted in the field on 24 February 2023 (y = −0.01785x − 0.01384; r² = 0.9994).
Figure 10. Daily (A) precipitation, (B) maximum (Max) and minimum (Min) air temperature, (C) solar radiation, (D) relative humidity, and (E) soil moisture at three soil depths (5, 10, and 20 cm) from February to June 2023 at the study site.
Figure 11. Data from one of the load cells as shown in ThingSpeak between 22 May and 19 June 2023.
Figure 12. Lysimeter field data measured every ten minutes from February to June 2023.
Figure 13. Daily and cumulative evapotranspiration (ET) derived from lysimeter measurements from February to June 2023. The bars represent daily ET, and the solid line represents cumulative ET.
Figure 14. Lysimeter mass (A,C) measured every ten minutes during 2–3 June and 8–10 June 2023, and calculated hourly evapotranspiration rate (B,D). The vertical lines indicate the end of each day.
20 pages, 10201 KiB  
Article
Robotic Multi-Boll Cotton Harvester System Integration and Performance Evaluation
by Shekhar Thapa, Glen C. Rains, Wesley M. Porter, Guoyu Lu, Xianqiao Wang, Canicius Mwitta and Simerjeet S. Virk
AgriEngineering 2024, 6(1), 803-822; https://doi.org/10.3390/agriengineering6010046 - 13 Mar 2024
Viewed by 1502
Abstract
Several studies on robotic cotton harvesters have designed their end-effectors and harvesting algorithms around the approach of harvesting a single cotton boll at a time. These robotic harvesting systems often have slow harvesting times per boll due to limited computational speed and the extended time taken by actuators to approach and retract when picking individual bolls. This study modified the design of the previous version of the end-effector with the aim of improving the picking ratio and the picking time per boll. A pullback reel was designed and fabricated to pull the cotton plants backward while the rover harvested and moved down the row. Additionally, a YOLOv4 cotton detection model and a hierarchical agglomerative clustering algorithm were implemented to detect cotton bolls and cluster them. A harvesting algorithm was then developed to harvest the cotton bolls in clusters. The modified end-effector, pullback reel, vacuum conveying system, cotton detection model, clustering algorithm, and straight-line path planning algorithm were integrated into a small red rover, and both lab and field tests were conducted. In lab tests, the robot achieved a picking ratio of 57.1% with an average picking time of 2.5 s per boll. In field tests, the picking ratio was 56.0%, and picking took an average of 3.0 s per boll. Although there was no improvement in the lab setting over the previous design, the robot's field performance was significantly better, with a 16% higher picking ratio and a 46% reduction in picking time per boll compared to the previous end-effector version tested in 2022. Full article
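
An illustrative sketch of the height-based clustering step described in the abstract, grouping detected bolls with hierarchical agglomerative clustering (the boll coordinates, linkage method, and 10 cm cut-off are assumptions, not the paper's settings):

```python
# Cluster detected cotton bolls by height with agglomerative clustering,
# so that nearby bolls can be harvested as one cluster.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

boll_heights_m = np.array([[0.31], [0.33], [0.52], [0.55], [0.56], [0.80]])

Z = linkage(boll_heights_m, method="average")          # agglomerative merge tree
labels = fcluster(Z, t=0.10, criterion="distance")     # cut the tree at 10 cm spacing
print(labels)  # bolls sharing a label belong to the same harvest cluster
```
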
Show Figures

Figure 1. The robotic platform "Small Red Rover" illustrating its sensor and actuator components.
Figure 2. (a) IMUs, (b) rotary encoder, and (c) Zed2 stereo-camera and Zed ROS coordinate frame.
Figure 3. Error distributions of individual IMUs and the fused IMU.
Figure 4. Zed2 camera coordinate frame to rover frame of reference.
Figure 5. Contextual block diagram of the cotton harvesting rover.
Figure 6. (a) End-effector CAD model and (b) fabricated end-effector model.
Figure 7. (a) CAD model of the pullback reel, (b) assembly view of the pullback reel, stopper board, and end-effector, (c) back view of the fabricated assembly, driven by a geared DC motor, and (d) front view of the pullback reel assembly.
Figure 8. (a) Cotton detection by the YOLOv4 model with a Zed2 stereo-camera. Green rectangles represent bounding boxes around the detected cotton bolls, and the box area reflects the boll size. Numeric values represent the confidence of the detected cotton boll. (b) Distribution of detected cotton bolls from (a) with height-based clustering. Same-colored circles indicate bolls in the same cluster, and red crosses represent the centroidal heights of each cluster.
Figure 9. Frame masking to detect cotton bolls at the desired location in a row.
Figure 10. Flow chart of the cotton harvesting rover.
Figure 11. (a) Lab test setup, and (b) field test setup.
Figure 12. (a) Boxplot with t-test (p-value = 0.85) for the picking ratio between the field and lab tests in 2023. (b) Boxplot with t-test (p-value = 0.29) for the picking time per boll between the field and lab tests in 2023. (c) Boxplot with t-test (p-value = 0.0015) for the picking ratio between the field tests in 2022 and 2023. (d) Boxplot with t-test (p-value = 0.028) for the picking time per boll between the field tests in 2022 and 2023.
Figure 13. (a) Stopper board used in the lab test, (b) inclined sheet metal added to the stopper board for the field testing, (c) harvested cotton with foreign materials, (d) seeded cotton with some bur and branches blocking the vacuum canister inlet, and (e) cotton bolls and some branches blocking the vacuum hose.
17 pages, 29958 KiB  
Article
Optimizing Crop Yield Estimation through Geospatial Technology: A Comparative Analysis of a Semi-Physical Model, Crop Simulation, and Machine Learning Algorithms
by Murali Krishna Gumma, Ramavenkata Mahesh Nukala, Pranay Panjala, Pavan Kumar Bellam, Snigdha Gajjala, Sunil Kumar Dubey, Vinay Kumar Sehgal, Ismail Mohammed and Kumara Charyulu Deevi
AgriEngineering 2024, 6(1), 786-802; https://doi.org/10.3390/agriengineering6010045 - 11 Mar 2024
Cited by 2 | Viewed by 2505
Abstract
This study underscores the critical importance of accurate crop yield information for national food security and export considerations, with a specific focus on wheat yield estimation at the Gram Panchayat (GP) level in Bareilly district, Uttar Pradesh, using technologies such as machine learning algorithms (ML), the Decision Support System for Agrotechnology Transfer (DSSAT) crop model and semi-physical models (SPMs). The research integrates Sentinel-2 time-series data and ground data to generate comprehensive crop type maps. These maps offer insights into spatial variations in crop extent, growth stages and the leaf area index (LAI), serving as essential components for precise yield assessment. The classification of crops employed spectral matching techniques (SMTs) on Sentinel-2 time-series data, complemented by field surveys and ground data on crop management. The strategic identification of crop-cutting experiment (CCE) locations, based on a combination of crop type maps, soil data and weather parameters, further enhanced the precision of the study. A systematic comparison of three major crop yield estimation models revealed distinctive gaps in each approach. Machine learning models exhibit effectiveness in homogenous areas with similar cultivars, while the accuracy of a semi-physical model depends upon the resolution of the utilized data. The DSSAT model is effective in predicting yields at specific locations but faces difficulties when trying to extend these predictions to cover a larger study area. This research provides valuable insights for policymakers by providing near-real-time, high-resolution crop yield estimates at the local level, facilitating informed decision making in attaining food security. Full article
(This article belongs to the Section Remote Sensing in Agriculture)
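
The agreement behind Figure 10, between crop-cutting-experiment (CCE) yields and each model's predictions, reduces to a standard R²/RMSE comparison. A hedged sketch with invented yield values:

```python
# Compare a model's predicted yields against observed CCE yields.
import numpy as np

cce = np.array([3.2, 3.8, 4.1, 3.5, 4.4])      # observed wheat yield, t/ha (hypothetical)
pred = np.array([3.0, 3.9, 4.3, 3.4, 4.2])     # one model's predictions, t/ha (hypothetical)

ss_res = np.sum((cce - pred) ** 2)
ss_tot = np.sum((cce - cce.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
rmse = np.sqrt(np.mean((cce - pred) ** 2))
print(f"R^2 = {r2:.3f}, RMSE = {rmse:.3f} t/ha")
```
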
Show Figures

Figure 1. Ground points and CCEs across the study area.
Figure 2. Methodology adopted for crop type mapping.
Figure 3. Methodology for yield estimation using the DSSAT and its integration with remote sensing.
Figure 4. Methodology for estimating yield through an SPM.
Figure 5. Spatial distribution of wheat and other LULCs.
Figure 6. Spatial distribution of yield at GP level using ML algorithm.
Figure 7. Spatial distribution of yield estimation using DSSAT crop simulation model.
Figure 8. Spatial distribution of yield using semi-physical approach.
Figure 9. Chart showing the yield variations using different models.
Figure 10. Correlation between the CCE data and the model yield.
13 pages, 4034 KiB  
Article
Design of an Internet of Things (IoT)-Based Photosynthetically Active Radiation (PAR) Monitoring System
by Younsuk Dong and Hunter Hansen
AgriEngineering 2024, 6(1), 773-785; https://doi.org/10.3390/agriengineering6010044 - 8 Mar 2024
Cited by 1 | Viewed by 1354
Abstract
Photosynthetically Active Radiation (PAR) is an important parameter in the plant photosynthesis process, relating to plant growth, crop water use, and leaf gas exchange. Previously, many researchers have utilized commercially available sensors to monitor PAR, but their high cost has limited the ability of researchers, agricultural professionals, and farmers to use and expand PAR monitoring on agricultural lands. Thus, this paper focuses on designing an affordable Internet of Things (IoT)-based PAR sensor monitoring system, including 3D-printed waterproof enclosures for the sensors, a performance evaluation of multiple light sensors, the solar powering configuration, the cloud setup, and a cost analysis. Three sensors, the VTB8440BH photodiode, the SI 1145, and the LI-190R, were evaluated. The 3D-printed waterproof enclosures were designed for the photodiode and the SI 1145. A Particle Boron was used for recording and sending the sensor data to the IoT web server. Both the photodiode and the SI 1145 were compared to the LI-190R, which is the industry standard. In the calibration process, the R2 values of the photodiode and the SI 1145 against the LI-190R were 0.609 and 0.961, respectively. Field validation data show that the SI 1145 had a strong correlation with the LI-190R, whereas the photodiode's correlation was weaker. In conclusion, the study successfully developed an affordable and reliable IoT-based PAR sensor monitoring system, including 3D-printed housing, hardware, programming, and an IoT website. The SI 1145 with a glass filter is a low-cost alternative sensor for monitoring PAR and has the advantage of being connectable to IoT microcontrollers. Full article
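
A minimal sketch of the calibration step, mapping raw low-cost sensor readings onto LI-190R PAR values with a least-squares line, which is how an R² such as the reported 0.961 would be obtained (all readings below are hypothetical, not the paper's data):

```python
# Linear cross-calibration of an SI 1145 against an LI-190R reference.
import numpy as np

si1145_raw = np.array([120, 340, 610, 890, 1150], dtype=float)     # hypothetical counts
li190r_par = np.array([150, 420, 760, 1110, 1430], dtype=float)    # µmol m^-2 s^-1

slope, intercept = np.polyfit(si1145_raw, li190r_par, deg=1)
fitted = slope * si1145_raw + intercept
r2 = 1 - np.sum((li190r_par - fitted) ** 2) / np.sum((li190r_par - li190r_par.mean()) ** 2)
print(f"PAR ≈ {slope:.3f} * raw + {intercept:.1f}  (R^2 = {r2:.3f})")
```
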
Show Figures

Figure 1. Light penetration of the UV AR IR glass-cut filter.
Figure 2. Diagram of the layout of the PTFE sheet and glass filter for the photodiode and SI 1145.
Figure 3. Enclosure design. (A) Outside of the enclosure. (B) Bottom plate of the enclosure. (C) Inside of the enclosure for the photodiode. (D) Inside of the enclosure for SI 1145. (E) Cross-section of enclosure for the photodiode. (F) Cross-section of enclosure for the SI 1145.
Figure 4. 3D-printed housing for SI 1145 and photodiode.
Figure 5. System diagram of the IoT-based PAR sensor monitoring system.
Figure 6. Screenshot of the IoT web server that displays data and shares with users.
Figure 7. Flow of the microcontroller program.
Figure 8. Comparison of photodiode with LI-190R from the calibration process.
Figure 9. Comparison of SI 1145 with LI-190R from the calibration process.
Figure 10. Comparison of LI-190R, SI 1145, and photodiode in outdoor environment conditions.
19 pages, 3376 KiB  
Article
Estimating Fuel Consumption of an Agricultural Robot by Applying Machine Learning Techniques during Seeding Operation
by Mahdi Vahdanjoo, René Gislum and Claus Aage Grøn Sørensen
AgriEngineering 2024, 6(1), 754-772; https://doi.org/10.3390/agriengineering6010043 - 7 Mar 2024
Viewed by 1086
Abstract
The integration of agricultural robots in precision farming plays a pivotal role in tackling the pressing demands of minimizing energy usage, enhancing productivity, and maximizing crop yield to meet the needs of an expanding global population and depleting non-renewable resources. Evaluating energy expenditure is vital when assessing agricultural machinery systems: reducing fuel consumption curtails operational costs while minimizing the overall environmental footprint left by these machines. Accurately calculating fuel usage empowers farmers to make well-informed decisions about their farming operations, resulting in more sustainable and productive methods. In this study, the ASABE model was applied to predict the fuel consumption of the studied robot. Results show that the ASABE model can predict the robot's fuel consumption with an average error of 27.5%. Moreover, different machine-learning techniques were applied to develop an effective and novel model for estimating the fuel consumption of an agricultural robot. The proposed GPR (Gaussian process regression) model considers four operational features of the studied robot: total operational time, total traveled distance, automatic working distance, and automatic turning distance. The GPR model with four features and hyperparameter optimization showed the best performance (R-squared validation = 0.93, R-squared test = 1.00) among the models. Furthermore, three other ML methods (gradient boosting, random forest, and XGBoost) were considered in this study and compared with the developed GPR model; the GPR model outperformed them. Moreover, one-way ANOVA test results revealed that the predicted values from the GPR model and the observations do not have significantly different means. The results of the sensitivity analysis show that the traveled distance and the total time have a significant correlation with the fuel consumption of the studied robot. Full article
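
An illustrative sketch (not the study's actual pipeline) of a Gaussian process regression model on the four operational features named in the abstract; the kernel choice and all training values are fabricated placeholders:

```python
# GPR on four operational features: total time, total distance,
# automatic working distance, automatic turning distance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# columns: total time (h), total distance (km), auto working (km), auto turning (km)
X = np.array([[1.0, 4.2, 3.6, 0.4],
              [1.5, 6.1, 5.2, 0.6],
              [2.0, 8.3, 7.1, 0.9],
              [2.6, 10.4, 9.0, 1.1]])
y = np.array([2.1, 3.0, 4.2, 5.3])            # fuel consumption (L), hypothetical

kernel = ConstantKernel(1.0) * RBF(length_scale=np.ones(4))   # anisotropic RBF
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mean, std = gpr.predict(np.array([[1.8, 7.5, 6.4, 0.8]]), return_std=True)
print(f"predicted fuel: {mean[0]:.2f} ± {std[0]:.2f} L")
```
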
Show Figures

Figure 1. The steps to resolve a supervised learning problem.
Figure 2. Studied agricultural robot (version 150D). (a) Top 2D view of the robot; numbers 1 to 4 represent the safety bumper, central boom, three-point hitch, and emergency stop, respectively. (b) 3D view of the studied robot.
Figure 3. Plotted coordinates of the robot based on different task (time/distance) elements. Green represents manual driving; blue is the automatic working mode; and white is the automatic non-working mode of the robot.
Figure 4. Data cleaning process for the fuel consumption parameter.
Figure 5. Comparison of the predicted values for the fuel consumption of the robot based on the ASABE model with the measured values.
Figure 6. The response plot and predicted vs. actual plot for the ensemble model. In the left plot, the blue points are true values, the orange points are predicted values, and the orange lines are errors. In the right plot, the blue points are observations, and the black line is the perfect prediction.
Figure 7. Feature ranking based on the MRMR, F-Test, and RReliefF algorithms.
Figure 8. The response plot and actual vs. predicted values for the GPR model with three features.
Figure 9. The response plot and actual vs. predicted values for the GPR model with two features.
Figure 10. The response plot and actual vs. predicted values for the GPR model with one feature.
Figure 11. The response plot and actual vs. predicted values for the GPR model with four features applying hyperparameter optimization.
Figure 12. Minimum MSE plot for the GPR predictive model.
Figure 13. The results of the sensitivity analysis for the second model.
30 pages, 9985 KiB  
Article
Soqia: A Responsive Web Geographic Information System Solution for Dynamic Spatio-Temporal Monitoring of Soil Water Status in Arboriculture
by Lahoucine Ennatiqi, Mourad Bouziani, Reda Yaagoubi and Lahcen Kenny
AgriEngineering 2024, 6(1), 724-753; https://doi.org/10.3390/agriengineering6010042 - 7 Mar 2024
Cited by 2 | Viewed by 1339
Abstract
The optimization of irrigation in arboriculture holds crucial importance for effectively managing water resources in arid regions. This work introduces the development and implementation of an innovative solution named 'Soqia', a responsive WEB-GIS application designed for real-time monitoring of the water status in arboriculture. The solution integrates meteorological data, remote sensing data, and ground sensor data for precise irrigation management at the agricultural plot level. A range of features was considered in the development of this WEB-GIS solution, from visualizing vegetation indices to accessing current weather data, thereby contributing to more efficient irrigation management. Compared to other existing applications, 'Soqia' provides users with the current amount of water to irrigate, as well as an estimated amount for the next 8 days. Additionally, it offers spatio-temporal tracking of vegetation indices provided as maps and graphs. The importance of the Soqia solution at the national level is justified by the scarcity of water resources due to increasingly frequent and intense drought seasons in recent years, with low rainfall recorded in all of the country's agricultural areas. The implemented prototype is a first step toward future innovative tools aimed at improving water management in regions facing water challenges, and it illustrates the potential of WEB-GIS-based precision irrigation systems for the rational use of water in agriculture in general and arboriculture in particular. Full article
(This article belongs to the Section Agricultural Irrigation Systems)
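
Soqia reports an evapotranspiration estimate and a recommended irrigation amount. The abstract does not state which formula the service uses, so purely as a hedged stand-in, here is the Hargreaves–Samani reference-ET equation, which needs only temperatures and extraterrestrial radiation Ra:

```python
# Hargreaves–Samani (1985) reference evapotranspiration, mm/day.
# Shown only as one common simple option; the paper's actual method may differ.
def eto_hargreaves(t_mean_c: float, t_max_c: float, t_min_c: float, ra_mm_day: float) -> float:
    """ETo = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)."""
    return 0.0023 * ra_mm_day * (t_mean_c + 17.8) * (t_max_c - t_min_c) ** 0.5

# Example: a warm, dry arboriculture day (all values are placeholders).
print(f"ETo ≈ {eto_hargreaves(26.0, 34.0, 18.0, 15.0):.2f} mm/day")
```
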
Show Figures

Figure 1. Technical architecture of the application.
Figure 2. Use case diagram.
Figure 3. Location and boundaries of the study area.
Figure 4. Illustration of the devices installed on the farm.
Figure 5. Class diagram of our databases (the symbol '1..*' represents a multiplicity where there is at least one instance of the associated class, but there may be several).
Figure 6. Overlaying geospatial data on an OpenLayers map.
Figure 7. Key stages of implementation.
Figure 8. Main steps for creating vegetation index maps and graphs.
Figure 9. Steps for visualizing data stored in TimescaleDB (sensor example no. 01).
Figure 10. (a) Access page; (b) home page.
Figure 11. Interface for vegetation index classification maps (green: high value; yellow: medium value; red: low value).
Figure 12. Vegetation index time series.
Figure 13. Dashboard screen for sensor 1. The red box at the top left indicates that clicking on the graph allows users to view the exact time of recording and the corresponding values for soil temperature and moisture.
Figure 14. Weather conditions consultation.
Figure 15. Page 1 of service screen 4 (weather forecast: temperature and humidity).
Figure 16. Page 2 of service screen 4 (weather forecast: wind speed, ground pressure, and precipitation).
Figure 17. Page 3 of service screen 4 (weather forecast: solar radiation).
Figure 18. Service screen 5 (evapotranspiration estimation).
Figure 19. Farm component mapping interface.
Figure A1. Access page.
Figure A2. Home page.
Figure A3. Weather consultation.
Figure A4. (a) Interface for vegetation index classification maps; (b) index graph interface.
Figure A5. Dashboard screen (sensor 1).
Figure A6. Page 1 of the weather estimation service.
Figure A7. Page 2 of the weather estimation service.
Figure A8. Page 3 of the weather estimation service.
Figure A9. Evapotranspiration estimation.
26 pages, 5861 KiB  
Article
Improved Collision Avoidance Algorithm of Autonomous Rice Transplanter Based on Virtual Goal Point
by Jinyang Li, Miao Zhang, Meiqing Li and Deqiang Ge
AgriEngineering 2024, 6(1), 698-723; https://doi.org/10.3390/agriengineering6010041 - 7 Mar 2024
Viewed by 858
Abstract
To ensure the operational safety and efficiency of an autonomous rice transplanter, a path planning method for obstacle avoidance based on an improved artificial potential field is proposed. Firstly, obstacles are classified as circular or elliptic according to the difference between the length and width of the obstacle, as well as the angle between the vehicle's forward direction and the length direction of the obstacle. Secondly, improved repulsive fields for the circular and elliptic models are developed. To escape the local minima and goal inaccessibility of the traditional artificial potential field, as well as to meet agronomic requirements and vehicle kinematics constraints, an adaptive strategy for setting and adjusting virtual goal points is proposed according to the relative azimuth between the obstacle and the vehicle. A path smoothing method based on B-spline interpolation is presented. Finally, the intelligent obstacle avoidance algorithm is designed, and a path evaluation rule is given to obtain the low-cost, collision-free, smooth, and shortest obstacle avoidance path. To verify the effectiveness of the proposed algorithm, simulation and field experiments were conducted; their results demonstrate that the proposed improved collision avoidance algorithm is highly effective and realizable. Full article
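
For context, a compact sketch of the classical artificial-potential-field step that this paper improves upon: attraction to the goal plus repulsion from a circular obstacle inside an influence radius. Gains, radii, and coordinates are illustrative; the paper's elliptic repulsive field and virtual goal points are not reproduced here:

```python
# Classical APF resultant force: F = F_att + F_rep, with
# F_rep = k_rep * (1/d - 1/d0) / d^2 * grad(d) inside the influence zone.
import numpy as np

def apf_step(pos, goal, obstacle, r_influence=3.0, k_att=1.0, k_rep=2.0):
    """Return the resultant force vector acting on the vehicle at `pos`."""
    f_att = k_att * (goal - pos)                  # attractive component
    diff = pos - obstacle
    d = np.linalg.norm(diff)
    f_rep = np.zeros(2)
    if d < r_influence:                           # repulsion only inside influence zone
        f_rep = k_rep * (1.0 / d - 1.0 / r_influence) / d**2 * (diff / d)
    return f_att + f_rep

pos = np.array([0.0, 0.0]); goal = np.array([10.0, 10.0]); obs = np.array([4.0, 4.5])
print(apf_step(pos, goal, obs))
```
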
Show Figures

Figure 1. Flow chart of determining the obstacle type.
Figure 2. Development of the elliptic obstacle model.
Figure 3. Schematic diagram of the repulsive force field of a circular obstacle.
Figure 4. Comparison of the improved attractive force field and the traditional attractive force field.
Figure 5. Diagram of relative azimuth between obstacle and vehicle. (a) Elliptic obstacle located on the left of the operating path; (b) elliptic obstacle located in the operating path; (c) elliptic obstacle located on the right of the operating path; (d) circular obstacle located on the left of the operating path; (e) circular obstacle located in the operating path; and (f) circular obstacle located on the right of the operating path.
Figure 6. Schematic diagram of setting and adjusting virtual target points. (a) Elliptic obstacle; (b) circular obstacle.
Figure 7. Obstacle avoidance path comparison before and after third-order B-spline curve path smoothing. Black dots: trajectory before smoothing. Blue line: trajectory after smoothing.
Figure 8. Adopted experimental device in the field tests.
Figure 9. Flow chart of the improved obstacle avoidance algorithm.
Figure 10. Comparison of the obstacle avoidance path under the action of the elliptic and circular repulsive force fields. (a) Circular repulsive force field; (b) elliptic repulsive force field.
Figure 11. Comparison of the obstacle avoidance path for two obstacle models with different shapes when the obstacle is situated at the left of the working path. (a) Elliptic obstacle; (b) circular obstacle.
Figure 12. Comparison of obstacle avoidance paths with and without a virtual goal point when the elliptic obstacle is located in the operating path. (a) No virtual goal point; (b) with virtual goal point.
Figure 13. Comparison of obstacle avoidance paths with and without a virtual goal point when the circular obstacle is located in the operating path. (a) No virtual goal point; (b) with virtual goal point.
Figure 14. Comparison of obstacle avoidance paths with and without a virtual goal point when the elliptic obstacle is located on the right of the operating path. (a) No virtual goal point; (b) with virtual goal point.
Figure 15. Comparison of obstacle avoidance paths with and without a virtual goal point when the circular obstacle is located on the right of the operating path. (a) No virtual goal point; (b) with virtual goal point.
Figure 16. Comparison of obstacle avoidance paths for circular obstacles under different virtual goal points. (a) Obstacle avoidance paths at six different virtual goal points when the obstacle coordinates are X0 = (5.4, 8.0); (b) variation of path curvature in (a); (c) obstacle avoidance paths at six different virtual goal points when the obstacle coordinates are X0 = (6.4, 8.0); and (d) variation of path curvature in (c).
Figure 17. Comparison of obstacle avoidance paths for elliptic obstacles under different virtual goal points. (a) Obstacle avoidance paths at six different virtual goal points when the obstacle coordinates are X0 = (3.3, 5.6); (b) variation of path curvature in (a); (c) obstacle avoidance paths at six different virtual goal points when the obstacle coordinates are X0 = (4.3, 5.6); and (d) variation of path curvature in (c).
Figure 18. Obstacle avoidance results when the obstacle is located on the left of the operating path. (a) Practical obstacle avoidance path; (b) curvature of the practical obstacle avoidance path.
Figure 19. Obstacle avoidance results when the obstacle is located in the operating path. (a) Planned obstacle avoidance paths for different virtual goal points; (b) practical obstacle avoidance path; (c) curvature of the practical obstacle avoidance path; and (d) distance from the rice transplanter to the outer contour of the obstacle.
20 pages, 5071 KiB  
Article
Design, Integration, and Experiment of Transplanting Robot for Early Plug Tray Seedling in a Plant Factory
by Wei Liu, Minya Xu and Huanyu Jiang
AgriEngineering 2024, 6(1), 678-697; https://doi.org/10.3390/agriengineering6010040 - 6 Mar 2024
Viewed by 1381
Abstract
In plant factories relying on artificial light sources, energy consumption stands out as a significant cost factor. Implementing early seedling removal and replacement operations has the potential to enhance the yield per unit area and per unit of energy consumed. Nevertheless, conventional transplanting machines are limited to handling older seedlings with well-established roots. This study addresses these constraints by introducing a transplanting workstation based on the UR5 industrial robot, tailored to early plug tray seedlings in plant factories. A diagonal oblique-insertion end effector was employed, ensuring stable grasping even in loose substrate conditions. Robotic vision technology was utilized for the recognition of non-germinated holes and inferior seedlings. The integrated robotic system seamlessly managed the entire process of removing and replanting the plug tray seedlings. The experimental findings revealed that the diagonal oblique-insertion end effector achieved a cleaning rate exceeding 65% for substrates with a moisture content exceeding 70%. Moreover, the threshold-segmentation-based method for identifying empty holes and inferior seedlings demonstrated a recognition accuracy surpassing 97.68%. The success rate for removal and replanting in the transplanting process reached an impressive 95%. This transplanting robot system serves as a reference for the transplantation of early seedlings in loose substrate in plant factories, with significant implications for improving yield in plant factory settings. Full article
(This article belongs to the Topic Current Research on Intelligent Equipment for Agriculture)
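
A hedged sketch of the threshold-segmentation idea used to flag empty holes and inferior seedlings: binarize a greenness image with Otsu's method, then count foreground pixels per tray cell against a minimum-area cutoff. The image path, tray grid, and 200-pixel cutoff are assumptions for illustration:

```python
# Otsu thresholding plus per-cell pixel counting over a plug tray grid.
import cv2
import numpy as np

gray = cv2.imread("tray.png", cv2.IMREAD_GRAYSCALE)   # placeholder path; greenness/ExG image
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

rows, cols = 4, 8                                     # hypothetical tray layout
h, w = binary.shape
for r in range(rows):
    for c in range(cols):
        cell = binary[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols]
        leaf_pixels = int(np.count_nonzero(cell))
        status = "healthy" if leaf_pixels > 200 else "empty/inferior"
        print(f"cell ({r},{c}): {leaf_pixels} px -> {status}")
```
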
Show Figures

Figure 1. The robotic plug tray seedling transplanting system developed in this study: (a) the safe region of the UR5 robot workspace (areas marked by red arrows indicate the high-force region of the robot); (b) relative installation positions of the robot, camera, end effector, and plug trays; (c) the cylinder control circuit diagram; (d) pneumatic control schematic diagram; (e) structure design; and (f) prototype, including the RGB camera, robot manipulator, gripper, control system, and so on.
Figure 2. (a) The dimensions of the hole; two types of gripper: (b) shovel-type, (c) fork-type; and (d) structure of the diagonal oblique-insertion-type end effector.
Figure 3. Working principle diagram of the end effector. Grasping: (a) approaching, (b) lowering, (c) insertion, (d) lifting. Planting: (e) approaching, (f) lowering, (g) releasing, (h) lifting.
Figure 4. Motion of the gripper: (a) upper and lower stop points of the gripper; (b) motion analysis of the gripper.
Figure 5. Method of the grasping test.
Figure 6. Image processing flowchart.
Figure 7. Image preprocessing: (a) original image, (b) background removal, (c) original image with grid, (d) corrected image with grid.
Figure 8. Grayscale processing: (a) processed gray image, (b) processed ExG image, (c) gray histogram, (d) ExG histogram.
Figure 9. Seedling leaf segmentation: (a) adaptive threshold segmentation, (b) Otsu's threshold segmentation.
Figure 10. Pixel value analysis and result output: (a) bubble scatter chart of the pixel value for each cell; (b) result output and display (red boxes indicate cells identified as bad seedlings, while yellow boxes indicate healthy seedlings).
Figure 11. (a) Positioning method; (b) positioning accuracy measurement.
Figure 12. Result of the grasping test.
Figure 13. Growth status of the transplanted group and control group: (a) the day of transplant; (b) 12 days after transplant; (c) box plot of the two groups' seedling heights.
21 pages, 14596 KiB  
Article
Integrated Route-Planning System for Agricultural Robots
by Gavriela Asiminari, Vasileios Moysiadis, Dimitrios Kateris, Patrizia Busato, Caicong Wu, Charisios Achillas, Claus Grøn Sørensen, Simon Pearson and Dionysis Bochtis
AgriEngineering 2024, 6(1), 657-677; https://doi.org/10.3390/agriengineering6010039 - 5 Mar 2024
Cited by 1 | Viewed by 1546
Abstract
Within the transition from precision agriculture (a task-specific approach) to smart farming (a system-specific approach), there is a need to build and evaluate robotic systems that are part of an overall integrated system under a continuous two-way connection and interaction. This paper presents an initial step in creating an integrated system for agri-robotics, enabling two-way communication between an unmanned ground vehicle (UGV) and a farm management information system (FMIS) within the general scope of smart farming implementation. In this initial step, the primary task of route planning for agricultural vehicles, a prerequisite for the execution of any field operation, was selected as a use-case for building and evaluating this integration. The developed system involves advanced route-planning algorithms within the cloud-based FMIS, a comprehensive algorithmic package compatible with agricultural vehicles utilizing the Robot Operating System (ROS), and a communicational and computational unit (CCU) interconnecting the FMIS algorithms, the corresponding user interface, and the vehicles. Its analytical module provides valuable information about the UGVs' performance metrics, specifically indicators of working distance, non-working distance, overlapped area, and field-traversal efficiency. The system was demonstrated via the implementation of two robotic vehicles in route-execution tasks in various operational configurations, field features, and cropping systems (open field, row crops, orchards). The case studies showed that field-traversal efficiency varied between 79.2% and 93%, while implementing the optimal route-planning functionality of the system improved field efficiency by up to 9.5%. The demonstrated results indicate that users can obtain better control over field operations by making alterations to ensure optimum field performance, while maintaining complete supervision of the operation. Full article
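
A minimal sketch of the kind of indicator the analytical module reports. The definition below, field-traversal efficiency as working distance over total distance, is an assumption of what FTE denotes here, and the distances are placeholders:

```python
# Field-traversal efficiency (FTE) from recorded route distances.
working_m = 1860.0        # distance travelled while operating on the tracks
non_working_m = 240.0     # headland turns and repositioning

fte = working_m / (working_m + non_working_m)
print(f"FTE = {fte:.1%}")  # e.g. 88.6%, within the paper's reported 79.2-93% range
```
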
Show Figures

Figure 1. System architecture and flow between the FMIS and the UGV.
Figure 2. Representation of the route-planning module.
Figure 3. Headland passes (red) and inner boundary (blue) after (a) automatic reduction and (b) automatic increase.
Figure 4. (a) AB pattern, (b) SF pattern, (c) BL pattern, and (d) an instance of the B pattern. Numbers "0" and "1" refer to the starting and ending points of the route, respectively.
Figure 5. Representation of different turn types in a robot simulation: (a) Ω-turn, (b) T-turn, (c) omni-directional turn.
Figure 6. The first implemented UGV (Husky) for the system demonstration.
Figure 7. The second implemented UGV (Thorvald) for the system demonstration.
Figure 8. The generated (a) and recorded (c) paths for field A (case: working width equal to 4.5 m, minimum turning radius equal to 6 m, Ω-turn type, AB fieldwork pattern, and driving direction parallel to the longest field edge), and the generated (b) and recorded (d) paths for field B (case: non-convex field shape, working width equal to 4.5 m, minimum turning radius equal to 6 m, T-turn type, AB fieldwork pattern, and driving direction parallel to the longest edge).
Figure 9. 3D bar chart showing the FTE value for all 36 use-cases for field A.
Figure 10. 3D bar chart showing the FTE value for all 36 use-cases for field B.
Figure 11. UGV following the adjusted tracks that were created in a cotton field.
Figure 12. Generated tracks for cotton cultivation with working widths of (a) 0.9 m, (b) 2.7 m, and (c) 5.4 m.
Figure 13. Husky following a track in the middle of the row.
Figure 14. The fieldwork tracks of the experiment performed in a tree orchard. (a) Routes performed in the middle of the row; (b) routes dedicated to one side of the tree row.
Figure 15. Results as presented in the user interface.
12 pages, 3825 KiB  
Article
Sweet Pepper Leaf Area Estimation Using Semantic 3D Point Clouds Based on Semantic Segmentation Neural Network
by Truong Thi Huong Giang and Young-Jae Ryoo
AgriEngineering 2024, 6(1), 645-656; https://doi.org/10.3390/agriengineering6010038 - 4 Mar 2024
Viewed by 998
Abstract
In the field of agriculture, measuring the leaf area is crucial for crop management. Various techniques exist for this measurement, ranging from direct to indirect approaches and from destructive to non-destructive techniques. The non-destructive approach is favored because it preserves the plant's integrity. Among these, several methods estimate leaf areas from leaf dimensions, such as width and length, based on models that account for the unique shapes of leaves. Although this approach does not damage plants, it is labor-intensive, requiring manual measurements of leaf dimensions. In contrast, some indirect non-destructive techniques leveraging convolutional neural networks can predict leaf areas more swiftly and autonomously. In this paper, we propose a new direct method using 3D point clouds constructed from semantic RGB-D (Red Green Blue and Depth) images generated by a semantic segmentation neural network. The key idea is that the leaf area is quantified by the count of points depicting the leaves. This method demonstrates high accuracy, with an R2 value of 0.98 and an RMSE (Root Mean Square Error) of 3.05 cm². Here, the neural network's role is to segregate leaves from other plant parts so that the leaf area represented by the point clouds can be measured accurately, rather than to predict the total leaf area of the plant. This method is direct, precise, and non-invasive to sweet pepper plants, offering easy leaf area calculation. It can be run on laptops for manual use or integrated into robots for automated periodic leaf area assessments, and it holds promise for advancing our understanding of plant responses to environmental changes. We verified the method's reliability and superior performance through experiments on individual leaves and whole plants. Full article
(This article belongs to the Special Issue Implementation of Artificial Intelligence in Agriculture)
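As a rough illustration of the paper's central idea, leaf area as the number of leaf-labelled points times a per-point surface footprint, here is a minimal Python sketch; the class id and the calibration constant are assumptions for illustration, not the authors' code.

```python
import numpy as np

def leaf_area_cm2(labels: np.ndarray, point_area_cm2: float) -> float:
    """Estimate leaf area from a semantically labelled point cloud.

    labels: per-point class ids from the segmentation network
            (assumption: 1 = leaf, other ids = stem/fruit/background).
    point_area_cm2: leaf surface represented by a single point, a
            calibration constant fixed by camera depth and resolution.
    """
    LEAF_ID = 1  # assumed class id
    return float(np.count_nonzero(labels == LEAF_ID)) * point_area_cm2

# Toy run: 10,000 points, ~38% labelled as leaf, 0.01 cm^2 per point.
rng = np.random.default_rng(0)
labels = rng.choice([0, 1, 2], size=10_000, p=[0.50, 0.38, 0.12])
print(f"estimated leaf area: {leaf_area_cm2(labels, 0.01):.1f} cm^2")
```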
Figures:
Figure 1: Semantic segmentation of sweet pepper plant parts.
Figure 2: 3D point clouds. (a) RGB point cloud, and (b) semantic 3D point cloud.
Figure 3: Example of augmentation to generate the training dataset.
Figure 4: Steps to estimate leaf area. The red and green lines are the Ox- and Oy-axes of the 3D coordinates.
Figure 5: Experimental setup of leaf area estimation using two RGB-D cameras.
Figure 6: Process to estimate the whole leaf area. The red and green lines are the Ox- and Oy-axes of the 3D coordinates.
Figure 7: Three samples for single-leaf area estimation experiments. The red and green lines are the Ox- and Oy-axes of the 3D coordinates. (1)–(3) are the first three samples: (1) is a 2 cm² rectangular leaf, (2) is a flat leaf, and (3) is a curved leaf.
Figure 8: Regressed leaf areas.
Figure 9: The visual result of creating a semantic 3D point cloud from two RGB-D cameras. Images (a–c) are the front, left, and back viewpoints, respectively.
25 pages, 4361 KiB  
Article
Two-Stage Ensemble Deep Learning Model for Precise Leaf Abnormality Detection in Centella asiatica
by Budsaba Buakum, Monika Kosacka-Olejnik, Rapeepan Pitakaso, Thanatkij Srichok, Surajet Khonjun, Peerawat Luesak, Natthapong Nanthasamroeng and Sarayut Gonwirat
AgriEngineering 2024, 6(1), 620-644; https://doi.org/10.3390/agriengineering6010037 - 4 Mar 2024
Cited by 1 | Viewed by 1204
Abstract
Leaf abnormalities pose a significant threat to agricultural productivity, particularly in medicinal plants such as Centella asiatica (Linn.) Urban (CAU), where they can severely impact both the yield and the quality of leaf-derived substances. In this study, we focus on the early detection of such leaf diseases in CAU, a critical intervention for minimizing crop damage and ensuring plant health. We propose a novel parallel-Variable Neighborhood Strategy Adaptive Search (parallel-VaNSAS) ensemble deep learning method specifically designed for this purpose. Our approach is distinguished by a two-stage ensemble model, which combines the strengths of advanced image segmentation and Convolutional Neural Networks (CNNs) to detect leaf diseases with high accuracy and efficiency. In the first stage, we employ U-net, Mask-R-CNN, and DeepNetV3++ for the precise image segmentation of leaf abnormalities. This step is crucial for accurately identifying diseased regions, thereby facilitating a focused and effective analysis in the subsequent stage. The second stage utilizes ShuffleNetV2, SqueezeNetV2, and MobileNetV3, which are robust CNN architectures, to classify the segmented images into different categories of leaf diseases. This two-stage methodology significantly improves the quality of disease detection over traditional methods. By employing a combination of ensemble segmentation and diverse CNN models, we achieve a comprehensive and nuanced analysis of leaf diseases. Our model’s efficacy is further enhanced through the integration of four decision fusion strategies: unweighted average (UWA), differential evolution (DE), particle swarm optimization (PSO), and Variable Neighborhood Strategy Adaptive Search (VaNSAS). Through extensive evaluations of the ABL-1 and ABL-2 datasets, which include a total of 14,860 images encompassing eight types of leaf abnormalities, our model demonstrates its superiority. The ensemble segmentation method outperforms single-method approaches by 7.34%, and our heterogeneous ensemble model excels by 8.43% and 14.59% compared to the homogeneous ensemble and single models, respectively. Additionally, image augmentation contributes to a 5.37% improvement in model performance, and the VaNSAS strategy enhances solution quality significantly over other decision fusion methods. Overall, our novel parallel-VaNSAS ensemble deep learning method represents a significant advancement in the detection of leaf diseases in CAU, promising a more effective approach to maintaining crop health and productivity. Full article
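The unweighted average (UWA) fusion baseline mentioned in the abstract can be sketched in a few lines of Python; the array shapes, class count, and example weights below are illustrative assumptions, and the paper's DE/PSO/VaNSAS strategies would supply optimized weights instead.

```python
import numpy as np

def fuse_predictions(probs, weights=None):
    """Decision fusion over per-model class probabilities.

    probs: array of shape (n_models, n_samples, n_classes), e.g. softmax
           outputs of the classification-stage CNNs.
    weights: per-model weights; None gives the unweighted average (UWA).
    Returns the fused class prediction per sample.
    """
    probs = np.asarray(probs)
    if weights is None:
        weights = np.full(probs.shape[0], 1.0 / probs.shape[0])
    fused = np.tensordot(weights, probs, axes=1)  # (n_samples, n_classes)
    return fused.argmax(axis=1)

# Toy check: 3 models, 4 samples, 8 abnormality classes.
rng = np.random.default_rng(1)
p = rng.dirichlet(np.ones(8), size=(3, 4))
print(fuse_predictions(p))                   # UWA fusion
print(fuse_predictions(p, [0.5, 0.3, 0.2]))  # weights a metaheuristic might return
```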
Figures:
Figure 1: Ensemble image segmentation techniques.
Figure 2: Image augmentation example images: (a) random cropping, (b) random scaling, and (c) random flipping.
Figure 3: Framework of the proposed model.
Figure 4: Average accuracy using the different model entities proposed.
Figure 5: Confusion matrix of the proposed model.
Figure 6: Heatmap of the leaf abnormality for all classes.
23 pages, 4218 KiB  
Article
Mats Made from Recycled Tyre Rubber and Polyurethane for Improving Growth Performance in Buffalo Farms
by Antonio Masiello, Maria Rosa di Cicco, Antonio Spagnuolo, Carmela Vetromile, Giuseppe De Santo, Guido Costanzo, Antonio Marotta, Florindo De Cristofaro and Carmine Lubritto
AgriEngineering 2024, 6(1), 597-619; https://doi.org/10.3390/agriengineering6010036 - 4 Mar 2024
Viewed by 1289
Abstract
This study focuses on anti-trauma mats designed for buffaloes’ comfort, using as raw materials rubber powder from end-of-life tyres (ELTs) and an isocyanate-based polyurethane resin binder. The first part of the study focused on mat formulation. Whilst it was possible to select a unique combination of raw materials and design features, it was necessary to investigate the relationship between three critical parameters affecting mat consistency and therefore buffalo comfort: binder quantity, mat thickness, and desired final mat density (bulk). In order to quantitatively assess the variation in hardness, various combinations were investigated within well-defined ranges based on the relevant literature. The results obtained from nine selected combinations indicate that increases in the three critical parameters do not induce a real phase transition in the final product consistency, although the hardness suggests an increasing trend. The mats consistently exhibited a moderately soft/hard consistency, offering environmental benefits in terms of increased rubber usage and potentially reduced chemical binder, depending on the desired thickness. The selected mixture showed excellent resistance to heavy chemical loads, suggesting reliability for frequent cleaning operations. The second part of the study involved field trials of the mats with calves. This involved monitoring their weight gain and appetite levels over a 90-day period. The results showed excellent growth performance compared to uncoated grids (i.e., weight gain was approximately 20% higher at the end of the observation period); this was similar to that achieved with the use of straw bedding. However, compared to straw bedding, the mats (i) exhibit long-term durability, with no signs of wear from washing or trampling over the months of the trial, (ii) allow for quick and efficient cleaning, and (iii) enable companies to save on labour, material (straw), and waste disposal costs, while maintaining (or even improving) the same welfare levels associated with the use of straw. Full article
(This article belongs to the Section Livestock Farming Technology)
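The growth comparison reported above reduces to the relative weight gain per flooring group; a minimal sketch with invented numbers (not the trial data) shows the calculation behind statements like "approximately 20% higher than uncoated grids".

```python
import numpy as np

def mean_gain(w_start, w_end):
    """Mean weight gain per group over the observation period (kg/head)."""
    return float(np.mean(np.asarray(w_end) - np.asarray(w_start)))

# Invented start/end weights (kg) for three flooring groups.
mat   = mean_gain([39, 41, 40], [112, 118, 115])
straw = mean_gain([40, 40, 41], [113, 116, 114])
grid  = mean_gain([40, 39, 41], [101,  99, 103])

print(f"mat vs uncoated grid: {100 * (mat / grid - 1):+.1f}% gain")
print(f"mat vs straw bedding: {100 * (mat / straw - 1):+.1f}% gain")
```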
Figures:
Figure 1: Moulding Press 60T (Salvadori®): rendering provided by the manufacturer (left) and the machine installed at the T-Cycle company (right).
Figure 2: Photos of the Shore measurement procedure on a sample mat (a) and the grid pattern for the measurement of Shore A hardness (b).
Figure 3: Detail images of the samples used to perform the chemical tests.
Figure 4: Hardness level distribution for each sample (the grid is reported in Figure 2).
Figure 5: Shore A hardness levels for each prototype through the four combination tests conducted. The samples are arranged on the graph for each test in ascending order of their respective variables: binder fraction for Test I; thickness for Test II; final mat density for Tests III and IV.
Figure 6: Samples submerged in the different reagents during the chemical stress tests.
Figure 7: Average weight trend of the calves with the five different floorings.
Figure 8: Temporal trend of the liquid and solid rations consumed by the 15 subjects during the day, over an observation period of 90 days from the 3rd day after birth. Data are presented as daily means for each of the five flooring types.
23 pages, 16208 KiB  
Article
Improving the Estimation of Rice Crop Damage from Flooding Events Using Open-Source Satellite Data and UAV Image Data
by Vicente Ballaran, Jr., Miho Ohara, Mohamed Rasmy, Koki Homma, Kentaro Aida and Kohei Hosonuma
AgriEngineering 2024, 6(1), 574-596; https://doi.org/10.3390/agriengineering6010035 - 4 Mar 2024
Cited by 1 | Viewed by 1356
Abstract
Having an additional tool for swiftly determining the extent of flood damage to crops with confidence is beneficial. This study focuses on estimating rice crop damage caused by flooding in Candaba, Pampanga, using open-source satellite data. By analyzing the correlation between Normalized Difference Vegetation Index (NDVI) measurements from unmanned aerial vehicles (UAVs) and Sentinel-2 (S2) satellite data, a cost-effective and time-efficient alternative for agricultural monitoring is explored. This study comprises two stages: establishing a correlation between clear sky observations and NDVI measurements, and employing a combination of S2 NDVI and Synthetic Aperture Radar (SAR) NDVI to estimate crop damage. The integration of SAR and optical satellite data overcomes cloud cover challenges during typhoon events. The accuracy of standing crop estimation reached up to 99.2%, while crop damage estimation reached up to 99.7%. UAVs equipped with multispectral cameras prove effective for small-scale monitoring, while satellite imagery offers a valuable alternative for larger areas. The strong correlation between UAV and satellite-derived NDVI measurements highlights the significance of open-source satellite data in accurately estimating rice crop damage, providing a swift and reliable tool for assessing flood damage in agricultural monitoring. Full article
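Both stages of the study rest on NDVI and on a linear UAV-to-satellite calibration; the Python sketch below shows both computations on invented values, not the study's data.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)

# Invented paired plot-mean NDVI values (UAV vs Sentinel-2).
uav_ndvi = np.array([0.21, 0.35, 0.48, 0.62, 0.71, 0.80])
s2_ndvi  = np.array([0.19, 0.33, 0.50, 0.60, 0.73, 0.78])

r = np.corrcoef(uav_ndvi, s2_ndvi)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(uav_ndvi, s2_ndvi, 1)  # UAV -> S2 calibration line
print(f"r = {r:.3f}; S2 NDVI ~ {slope:.2f} * UAV NDVI + {intercept:.2f}")
```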
Figures:
Figure 1: Location of the 16 sites for the UAV experiment in the Philippines.
Figure 2: Location of the three farm lots used as study sites in Japan.
Figure 3: Study framework showing two stages, 1 and 2, with stage 2 comprising three sub-stages: (A) open-source data acquisition; (B) estimation using S2 data; (C) estimation using fusion of S2 and S1 data.
Figure 4: P4 Multispectral camera-mounted UAV used in the study.
Figure 5: Rice crop damage in terms of affected area (in hectares) in barangays of Candaba, Pampanga.
Figure 6: Map of Barangay Pangclara.
Figure 7: Size of farm areas (in hectares) planted with rice on specific dates in Brgy. Pangclara.
Figure 8: Graph of UAV versus Sentinel-2 NDVI measurements for Obayashi, Ibaraki, Japan.
Figure 9: Graph of UAV versus Sentinel-2 NDVI measurements for Candaba, Pampanga.
Figure 10: Unsupervised clustering method in Barangay Pangclara, Candaba, Pampanga, showing (a) the actual RGB image and (b) the unsupervised clusters.
Figure 11: Flood visualizations of Candaba Municipality and Barangay Pangclara for the Typhoon Quinta flood event, showing (a1) before the typhoon at Candaba; (a2) during the typhoon at Candaba; (a3) before the typhoon at Brgy. Pangclara; and (a4) during the typhoon at Brgy. Pangclara.
Figure 12: Comparison between the recorded damage data and the estimated damage values.
Figure 13: Summary table of the comparison between recorded damage data and estimated damage values for the six flood events.
Figure 14: S2 RGB images (top) to determine cloud presence, and NDVI measurements from UAV and S2 (bottom).
Figure 15: Drone-captured vs. satellite-derived NDVI at Barangay Magumbali, Candaba, Pampanga.
Figure 16: Flood visualization from S1 SAR under Typhoon Karding.
Figure 17: Cases where the ‘after’ flood scenario from S1 SAR is equated with the net flood; blue shows inundated areas.
Figure 18: CHIRPS 2022 rainfall data in Barangay Pangclara under Typhoon Florita, showing continuous rainfall (red circle) prior to the typhoon event.
19 pages, 3913 KiB  
Article
Morning Glory Flower Detection in Aerial Images Using Semi-Supervised Segmentation with Gaussian Mixture Models
by Sruthi Keerthi Valicharla, Jinge Wang, Xin Li, Srikanth Gururajan, Roghaiyeh Karimzadeh and Yong-Lak Park
AgriEngineering 2024, 6(1), 555-573; https://doi.org/10.3390/agriengineering6010034 - 1 Mar 2024
Cited by 1 | Viewed by 1249
Abstract
The invasive morning glory, Ipomoea purpurea (Convolvulaceae), poses a mounting challenge in vineyards by hindering grape harvest and as a secondary host of disease pathogens, necessitating advanced detection and control strategies. This study introduces a novel automated image analysis framework using aerial images obtained from a small fixed-wing unmanned aircraft system (UAS) and an RGB camera for the large-scale detection of I. purpurea flowers. This study aimed to assess the sampling fidelity of aerial detection in comparison with the actual infestation measured by ground validation surveys. The UAS was systematically operated over 16 vineyard plots infested with I. purpurea and another 16 plots without I. purpurea infestation. We used a semi-supervised segmentation model incorporating a Gaussian Mixture Model (GMM) with the Expectation-Maximization algorithm to detect and count I. purpurea flowers. The flower detectability of the GMM was compared with that of conventional K-means methods. The results of this study showed that the GMM detected the presence of I. purpurea flowers in all 16 infested plots with 0% for both type I and type II errors, while the K-means method had 0% and 6.3% for type I and type II errors, respectively. The GMM and K-means methods detected 76% and 65% of the flowers, respectively. These results underscore the effectiveness of the GMM-based segmentation model in accurately detecting and quantifying I. purpurea flowers compared with a conventional approach. This study demonstrated the efficiency of a fixed-wing UAS coupled with automated image analysis for I. purpurea flower detection in vineyards, achieving success without relying on data-driven deep-learning models. Full article
(This article belongs to the Special Issue Smart Pest Monitoring Technology)
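A minimal version of the unsupervised module, a Gaussian Mixture Model fitted by Expectation-Maximization on HSV pixels with the flower component picked by a heuristic, can be written with scikit-learn; the cluster-selection rule and all pixel values below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def flower_mask(hsv_pixels, n_components=3):
    """Fit a GMM by Expectation-Maximization on HSV pixel values and
    return a boolean mask for the assumed flower component.

    Heuristic (an assumption, not the paper's rule): the purple
    I. purpurea flowers sit in the cluster with the highest mean hue.
    """
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(hsv_pixels)
    labels = gmm.predict(hsv_pixels)
    flower_component = int(np.argmax(gmm.means_[:, 0]))
    return labels == flower_component

# Toy image: two background clusters plus a small high-hue "flower" cluster.
rng = np.random.default_rng(2)
background = rng.normal([0.25, 0.60, 0.50], 0.03, size=(500, 3))
flowers    = rng.normal([0.80, 0.70, 0.80], 0.02, size=(40, 3))
mask = flower_mask(np.vstack([background, flowers]))
print(f"{int(mask.sum())} candidate flower pixels")
```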
Figures:
Figure 1: Study site in a vineyard located in Arvin, CA, USA. The area of vineyard blocks surveyed for this study was ca. 65.5 ha (1560 m by 420 m). (a) GPS ground track of the UAS flight and (b) a ground view of I. purpurea infestation. The red dot on the map indicates the location of the study site, orange lines in (a) indicate the flight path of the UAS, and red arrows in (b) indicate I. purpurea flowers.
Figure 2: The fixed-wing UAS model (MiG-27 Foamy) used in this study. See Table 1 for detailed specifications of the UAS.
Figure 3: Overall enhanced Gaussian Mixture Model (GMM) framework for the detection and counting of I. purpurea flowers in aerial images, with unsupervised and supervised modules.
Figure 4: Overview of the Image-Driven Feature Extraction Block (supervised module).
Figure 5: Gaussian Mixture Model (GMM; unsupervised module) architecture, with the 2-D flower pixel array from the supervised module and the HSV image as inputs and a binary flower mask indicating the presence of I. purpurea as output.
Figure 6: Illustration of the enhanced GMM framework applied to an RGB image: (a) the input (original RGB image); (b) the predicted binary mask (output of the enhanced GMM highlighting I. purpurea flower regions); and (c) the segmented output (the binary mask applied to the original RGB image), emphasizing the identified flower locations and providing a total count estimate. White dots in (b) and red dots in (c) indicate I. purpurea flowers detected by the enhanced GMM.
Figure 7: Comparison between the conventional K-means and the enhanced GMM methods for detecting I. purpurea flowers using a line plot (a) and regression (b). The dotted line in (b) indicates a perfect match between ground and aerial survey results (i.e., y = x).
Figure 8: Spider plot comparison of bias metrics (Bias Ratio, Scaled Mean Bias Residual, and Fractional Bias) between K-means and the enhanced GMM for I. purpurea flower detection and counting.
16 pages, 13132 KiB  
Article
A Multiple Criteria Decision-Making Method Generated by the Space Colonization Algorithm for Automated Pruning Strategies of Trees
by Gang Zhao and Dian Wang
AgriEngineering 2024, 6(1), 539-554; https://doi.org/10.3390/agriengineering6010033 - 26 Feb 2024
Viewed by 1120
Abstract
The rise of mechanical automation in orchards has sparked research interest in developing robots capable of autonomous tree pruning operations. To achieve accurate pruning outcomes, these robots require robust perception systems that can reconstruct three-dimensional tree characteristics and execute appropriate pruning strategies. Three-dimensional modeling plays a crucial role in enabling accurate pruning outcomes. This paper introduces a specialized tree modeling approach using the space colonization algorithm (SCA) tailored for pruning. The proposed method extends SCA to operate in three-dimensional space, generating comprehensive cherry tree models. The resulting models are exported as normalized point cloud data, serving as the input dataset. Multiple criteria decision analysis is utilized to guide pruning decisions, incorporating various factors such as tree species, tree life cycle stages, and pruning strategies during real-world implementation. The pruning task is transformed into a point cloud neural network segmentation task, identifying the trunks and branches to be pruned. This approach reduces the data acquisition time and labor costs during development. Meanwhile, pruning training in a virtual environment is an application of digital twin technology, which makes it possible to combine the meta-universe with the automated pruning of fruit trees. Experimental results demonstrate superior performance compared to other pruning systems. The overall accuracy is 85%, with mean accuracy and mean Intersection over Union (IoU) values of 0.83 and 0.75. Trunks and branches are successfully segmented with class accuracies of 0.89 and 0.81, respectively, and Intersection over Union (IoU) metrics of 0.79 and 0.72. Compared to using the open-source synthetic tree dataset, this dataset yields 80% of the overall accuracy under the same conditions, which is an improvement of 6%. Full article
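For orientation, one iteration of the space colonization algorithm can be sketched as follows; the parameter values (influence radius, kill distance, step length) and the attractor cloud are illustrative, not those used for the cherry tree models.

```python
import numpy as np

def sca_step(nodes, attractors, influence=2.0, kill=0.5, step=0.3):
    """One space colonization iteration in 3-D.

    Every attraction point pulls its nearest skeleton node (if within the
    influence radius); each pulled node grows a child along the mean pull
    direction; attraction points closer than the kill distance to any node
    are removed. All parameter values here are illustrative.
    """
    if len(attractors) == 0:
        return nodes, attractors
    pulls = {}
    for a in attractors:
        d = np.linalg.norm(nodes - a, axis=1)
        i = int(d.argmin())
        if d[i] < influence:
            pulls.setdefault(i, []).append((a - nodes[i]) / d[i])
    children = []
    for i, dirs in pulls.items():
        v = np.mean(dirs, axis=0)
        children.append(nodes[i] + step * v / np.linalg.norm(v))
    if children:
        nodes = np.vstack([nodes, children])
    keep = np.array([np.linalg.norm(nodes - a, axis=1).min() >= kill
                     for a in attractors])
    return nodes, attractors[keep]

rng = np.random.default_rng(3)
nodes = np.zeros((1, 3))                               # trunk base
attractors = rng.uniform(-2, 2, (200, 3)) + [0, 0, 3]  # crown envelope above the base
for _ in range(30):
    nodes, attractors = sca_step(nodes, attractors)
print(f"{len(nodes)} skeleton nodes, {len(attractors)} attractors remaining")
```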
Figures:
Figure 1: Framework of the tree modeling and pruning system.
Figure 2: (a) The initial skeleton and the set of points to be searched; (b) calculating the growth vector; (c) the angle constraint; (d) estimating the growth direction; (e) generating a new skeleton; (f) deleting the invalid points to be searched; (g) results of the first iteration.
Figure 3: The automatically generated three-dimensional tree models.
Figure 4: The architecture of PointNet++.
Figure 5: The multiple criteria decision-making method for pruning a dormant cherry tree.
Figure 6: Example of labeled point clouds of the tree.
Figure 7: Workflow of the tree modeling and pruning system.
Figure 8: (a) Excessive bending; (b) incorrect iteration quantities; and (c) unrealistic branch competition.
Figure 9: (a) Learning accuracy curve; (b) minimum batch loss curve of PointNet++.
Figure 10: (a) Estimated performance over the entire test dataset; (b) comparison of performance measures for individual classes.
Figure 11: (a) Pictures of cherry trees in Beijing; (b) virtual cherry tree models generated by SCA; (c) virtual cherry tree point cloud models; and (d) synthetic cherry models.
13 pages, 4400 KiB  
Article
Glyphosate Pattern Recognition Using Microwave-Interdigitated Sensors and Principal Component Analysis
by Carlos R. Santillán-Rodríguez, Renee Joselin Sáenz-Hernández, Cristina Grijalva-Castillo, Eutiquio Barrientos-Juarez, José Trinidad Elizalde-Galindo and José Matutes-Aquino
AgriEngineering 2024, 6(1), 526-538; https://doi.org/10.3390/agriengineering6010032 - 23 Feb 2024
Viewed by 822
Abstract
Glyphosate is an herbicide used worldwide with harmful health effects, and efforts are currently being made to develop sensors capable of detecting its presence. In this work, an array of four interdigitated microwave sensors was used together with the multivariate statistical technique of principal component analysis, which allowed a well-defined pattern to be found that characterized waters for agricultural use extracted from the Bustillos lagoon. The variability due to differences between the samples was explained by the first principal component, amounting to 86.3% of the total variance, while the variability attributed to the measurements and sensors was explained through the second principal component, amounting to 13.2% of the total variance. The time evolution of measurements showed a clustering of data points as time passed, which was related to microwave–sample interaction, varied with the fluctuating dynamical structure of each sample, and tended to have a stable mean value. Full article
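The variance decomposition quoted above (86.3% on the first principal component, 13.2% on the second) is the standard PCA output; a minimal sketch on synthetic sensor features illustrates it, including the correlation-matrix versus covariance-matrix choice the paper compares.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = water samples, columns = reflection features from the sensor array
# (synthetic values for illustration, not the measured data).
rng = np.random.default_rng(4)
X = rng.normal(size=(30, 8))
X[:, 1] = 2.5 * X[:, 0] + 0.1 * X[:, 1]  # inject correlated structure

# Standardizing first gives the correlation-matrix PCA; fitting on the raw
# data instead gives the covariance-matrix PCA the paper also examines.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(X_std)
scores = pca.transform(X_std)            # coordinates for the component plot
print("variance explained (%):", np.round(100 * pca.explained_variance_ratio_, 1))
print("first sample scores:", np.round(scores[0], 2))
```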
Figures:
Figure 1: (a) Electric field force lines of the parallel plate capacitor; (b) curved electric field force lines with an acute angle between the plates; (c) electric field force lines when both plates are in the same plane; and (d) equivalent circuit of the capacitive interdigitated sensor. The electric field force lines were simulated using COMSOL Multiphysics version 6.2.
Figure 2: Aerial photograph of the Bustillos lagoon indicating the points from which the water samples were extracted.
Figure 3: The design scheme of the 3F interdigitated sensor.
Figure 4: (a) Microwave-interdigitated sensor connected to the vector network analyzer using a 50 Ω coaxial cable; (b) enlarged photo of the microwave-interdigitated sensor submerged in a beaker with water and connected to the coaxial cable through an SMA connector.
Figure 5: Component plot showing the scores of the Bustillos lagoon samples, DIW, DTW, and GLY, with an insert within the image (indicated by an arrow) representing a magnification of a subset of glyphosate data.
Figure 6: Plots of the first two principal components calculated using (a) the correlation matrix and (b) the covariance matrix, respectively.
Figure 7: R coefficient variation over measurement time for samples (DIW, GLY, S2) measured with sensors 3F (a–c), 6F (d–f), 9F (g–i), and 12F (j–l).
Figure 8: Reflected amplitude coefficient of the Bustillos lagoon samples, deionized water, and distilled water as a function of the reflected amplitude coefficient of commercial GLY, using the measurement time from 1 min to 50 min as a parameter. Each graph corresponds to one of the (a) 3F, (b) 6F, (c) 9F, and (d) 12F interdigitated sensors used in the array.
Figure 9: Component plots using arrays with only three sensors: (a) (3F, 6F, 9F); (b) (3F, 6F, 12F); (c) (3F, 9F, 12F); and (d) (6F, 9F, 12F).
17 pages, 4001 KiB  
Article
UAV-Based Classification of Intercropped Forage Cactus: A Comparison of RGB and Multispectral Sample Spaces Using Machine Learning in an Irrigated Area
by Oto Barbosa de Andrade, Abelardo Antônio de Assunção Montenegro, Moisés Alves da Silva Neto, Lizandra de Barros de Sousa, Thayná Alice Brito Almeida, João Luis Mendes Pedroso de Lima, Ailton Alves de Carvalho, Marcos Vinícius da Silva, Victor Wanderley Costa de Medeiros, Rodrigo Gabriel Ferreira Soares, Thieres George Freire da Silva and Bárbara Pinto Vilar
AgriEngineering 2024, 6(1), 509-525; https://doi.org/10.3390/agriengineering6010031 - 23 Feb 2024
Viewed by 1287
Abstract
Precision agriculture requires accurate methods for classifying crops and soil cover in agricultural production areas. The study aims to evaluate three machine learning-based classifiers to identify intercropped forage cactus cultivation in irrigated areas using Unmanned Aerial Vehicles (UAV). It conducted a comparative analysis between multispectral and visible Red-Green-Blue (RGB) sampling, followed by the efficiency analysis of Gaussian Mixture Model (GMM), K-Nearest Neighbors (KNN), and Random Forest (RF) algorithms. The classification targets included exposed soil, mulching soil cover, developed and undeveloped forage cactus, moringa, and gliricidia in the Brazilian semiarid. The results indicated that the KNN and RF algorithms outperformed other methods, showing no significant differences according to the kappa index for both Multispectral and RGB sample spaces. In contrast, the GMM showed lower performance, with kappa index values of 0.82 and 0.78, compared to RF 0.86 and 0.82, and KNN 0.86 and 0.82. The KNN and RF algorithms performed well, with individual accuracy rates above 85% for both sample spaces. Overall, the KNN algorithm demonstrated superiority for the RGB sample space, whereas the RF algorithm excelled for the multispectral sample space. Even with the better performance of multispectral images, machine learning algorithms applied to RGB samples produced promising results for crop classification. Full article
(This article belongs to the Section Remote Sensing in Agriculture)
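A compact scikit-learn sketch of the KNN/RF comparison scored with the kappa index follows; the features, labels, and hyperparameters are stand-ins, not the study's training data or tuning.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Illustrative pixel samples: columns = bands (RGB or multispectral);
# labels = {0: soil, 1: mulch, 2: cactus, 3: moringa, 4: gliricidia}.
rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 5))
y = rng.integers(0, 5, size=1000)
X += y[:, None] * 0.8  # make the toy classes separable

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("KNN", KNeighborsClassifier(5)),
                  ("RF", RandomForestClassifier(200, random_state=0))]:
    kappa = cohen_kappa_score(yte, clf.fit(Xtr, ytr).predict(Xte))
    print(f"{name}: kappa = {kappa:.2f}")
```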
Figures:
Figure 1: Location of the study area (a), climatic classification (b), hypsometry (c), experimental plot (d), and images of the intercropped system (e). Aw: tropical savanna climate with a dry winter season; BWh: hot arid climate; BSh: hot semiarid tropical climate with a defined dry season.
Figure 2: Climate information of the study area for the period from 31 December 2022 to 1 February 2023. Parnamirim, Pernambuco State, Brazil. PET: potential crop evapotranspiration; CET: crop evapotranspiration.
Figure 3: Mean biometric data and standard deviation of forage cactus in the irrigated area.
Figure 4: Samples used for the classification of the plots in the irrigated area.
Figure 5: Classification by the Gaussian Mixture Model (A), K-Nearest Neighbors (B), and Random Forest (C) algorithms with the multispectral sample space.
Figure 6: Classification by the Gaussian Mixture Model (A), K-Nearest Neighbors (B), and Random Forest (C) algorithms with the RGB sample space.
Figure 7: Box plots of the individual precision (top), recall (middle), and F1 score (bottom) results of each algorithm in the RGB and multispectral sample spaces.
18 pages, 12795 KiB  
Article
Maize Crop Detection through Geo-Object-Oriented Analysis Using Orbital Multi-Sensors on the Google Earth Engine Platform
by Ismael Cavalcante Maciel Junior, Rivanildo Dallacort, Cácio Luiz Boechat, Paulo Eduardo Teodoro, Larissa Pereira Ribeiro Teodoro, Fernando Saragosa Rossi, José Francisco de Oliveira-Júnior, João Lucas Della-Silva, Fabio Henrique Rojo Baio, Mendelson Lima and Carlos Antonio da Silva Junior
AgriEngineering 2024, 6(1), 491-508; https://doi.org/10.3390/agriengineering6010030 - 22 Feb 2024
Viewed by 1301
Abstract
Mato Grosso state is the biggest maize producer in Brazil, with the predominance of cultivation concentrated in the second harvest. Due to the need to obtain more accurate and efficient data, agricultural intelligence is adapting and embracing new technologies such as the use of satellites for remote sensing and geographic information systems. In this respect, this study aimed to map the second-harvest maize cultivation areas at Canarana-MT in the crop year 2019/2020 by using geographic object-based image analysis (GEOBIA) with different spatial, spectral, and temporal resolutions. MSI/Sentinel-2, OLI/Landsat-8, MODIS-Terra and MODIS-Aqua, and PlanetScope imagery were used in this assessment. The maize crop mapping was based on the cartographic basis from IBGE (Brazilian Institute of Geography and Statistics) and the Google Earth Engine (GEE), followed by the steps of image filtering (gray-level co-occurrence matrix, GLCM), vegetation index calculation, segmentation by simple non-iterative clustering (SNIC), principal component (PC) analysis, and classification by the random forest (RF) algorithm, and finally by confusion matrix analysis, kappa, overall accuracy (OA), and validation statistics. Satisfactory results were found: with OA from 86.41% to 88.65% and kappa from 81.26% to 84.61% among the imagery systems considered, the GEOBIA technique, combined with SNIC segmentation, GLCM spectral and texture feature discrimination, and the RF classifier, produced a map of the maize crop in the study area that demonstrates improved performance of automated multispectral image classification processes. Full article
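The reported OA and kappa values follow directly from the confusion matrices; the helper below reproduces the standard computation on an invented three-class matrix, not the study's data.

```python
import numpy as np

def oa_and_kappa(cm):
    """Overall accuracy and the kappa index from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                # observed agreement = OA
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return po, (po - pe) / (1 - pe)

# Invented three-class matrix (e.g., maize / other crop / non-crop).
cm = [[120,  8,   2],
      [ 10, 95,   5],
      [  3,  6, 151]]
oa, kappa = oa_and_kappa(cm)
print(f"OA = {oa:.2%}, kappa = {kappa:.2%}")
```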
Figures:
Figure 1: Flowchart of the object-oriented classification methodology.
Figure 2: Location of the study area in Canarana municipality, Mato Grosso state, presented using the normalized difference vegetation index (NDVI).
Figure 3: Land-use and land-cover sample locations at Canarana-MT.
Figure 4: PC analysis mosaicking for (A) OLI/Landsat-8, (B) MODIS Terra, (C) Planet NICFI, and (D) MSI/Sentinel-2.
Figure 5: Accuracy test with different numbers of decision trees in the random forest classification process for each imagery system considered: (A) OLI/Landsat-8, (B) MODIS Terra, (C) Planet NICFI, and (D) MSI/Sentinel-2.
Figure 6: Land-use and land-cover classification based on GEOBIA and random forest for each considered sensor: (A) OLI/Landsat-8, (B) MODIS, (C) Planet NICFI, and (D) MSI/Sentinel-2.
Figure 7: Clips of the classified second-crop maize areas: (A) OLI/Landsat-8, (B) MODIS, (C) Planet NICFI, and (D) MSI/Sentinel-2.
Figure 8: Confusion matrix for OLI/Landsat-8 imagery.
Figure 9: Confusion matrix for MODIS imagery.
Figure 10: Confusion matrix for Planet NICFI imagery.
Figure 11: Confusion matrix for MSI/Sentinel-2 imagery.
12 pages, 845 KiB  
Article
Assessment of a Low-Cost Hydrogen Sensor for Detection and Monitoring of Biohydrogen Production during Sugarcane Straw/Vinasse Co-Digestion
by Andrés Barrera, David Gómez-Ríos and Howard Ramírez-Malule
AgriEngineering 2024, 6(1), 479-490; https://doi.org/10.3390/agriengineering6010029 - 22 Feb 2024
Viewed by 948
Abstract
In this work, hydrogen production from the co-digestion of sugarcane straw and sugarcane vinasse in the dark fermentation (DF) process was monitored using a cost-effective hydrogen detection system. This system included a sensor of the MQ-8 series, an Arduino Leonardo board, and a computer. For the DF, different concentrations of sugarcane vinasse and volumetric ratios of vinasse/hemicellulose hydrolysate were used together with a thermally pretreated inoculum, while the hydrogen detection system stored the hydrogen concentration data during the fermentation time. The results showed that a higher concentration of vinasse led to higher inhibitors for the DF, resulting in a longer lag phase. Additionally, the hydrogen detection system proved to be a useful tool in monitoring the DF, showcasing a rapid response time and providing reliable information about the period of adaptation of the inoculum to the substrate. The measurement system was assessed using the error metrics SE, RMSE, and MBE, whose values ranged from 0.6% to 5.0%. The CV (1.0–8.0%) and SD (0.79–5.62 ppm) confirmed the sensor’s robustness, while the ANOVA at the 5% significance level affirmed the repeatability of measurements with this instrument. The RMSE values supported the accuracy of the sensor for online measurements (6.08–14.78 ppm). The adoption of this straightforward and affordable method sped up the analysis of hydrogen in secluded regions without incurring the expenses associated with traditional measuring instruments, while offering a promising solution for biomass valorization and contributing to the advancement of rural green energy initiatives in remote areas. Full article
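The error metrics used to assess the sensor (SE, RMSE, MBE, SD, CV) are straightforward to compute from paired readings; the reference and measured values below are invented for illustration.

```python
import numpy as np

def sensor_metrics(reference, measured):
    """Error metrics for low-cost H2 sensor readings against a reference (ppm)."""
    reference, measured = np.asarray(reference, float), np.asarray(measured, float)
    err = measured - reference
    sd = measured.std(ddof=1)
    return {
        "MBE":  float(err.mean()),                 # mean bias error
        "RMSE": float(np.sqrt((err ** 2).mean())), # root mean square error
        "SE":   float(sd / np.sqrt(len(measured))),# standard error of the mean
        "SD":   float(sd),                         # repeatability spread
        "CV%":  float(100 * sd / measured.mean()), # coefficient of variation
    }

# Invented repeated readings at one reference concentration.
ref  = np.full(10, 100.0)
meas = np.array([98.5, 101.2, 99.8, 102.3, 97.9, 100.6, 99.1, 101.8, 98.8, 100.2])
print(sensor_metrics(ref, meas))
```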
Figures:
Figure 1: Experimental setup for DF and hydrogen detection.
Figure 2: Hydrogen detection rate from a thermally pretreated inoculum fed with a substrate composed of sugarcane vinasse and HH: (a) volumetric ratio of 3:1 (sugarcane vinasse/HH) and sugarcane vinasse diluted to 15% of the original; (b) volumetric ratio of 3:1 (sugarcane vinasse/HH) and sugarcane vinasse diluted to 5% of the original; (c) volumetric ratio of 1:1 (sugarcane vinasse/HH) and sugarcane vinasse diluted to 10% of the original; (d) volumetric ratio of 1:3 (sugarcane vinasse/HH) and sugarcane vinasse diluted to 5% of the original; (e) volumetric ratio of 1:3 (sugarcane vinasse/HH) and sugarcane vinasse diluted to 15% of the original.
24 pages, 2720 KiB  
Review
Peculiarities of Unmanned Aerial Vehicle Use in Crop Production in Russia: A Review
by Marina Zvezdina, Yuliya Shokova and Sergey Lazarenko
AgriEngineering 2024, 6(1), 455-478; https://doi.org/10.3390/agriengineering6010028 - 21 Feb 2024
Viewed by 1538
Abstract
This review article examines the potential for intensifying Russian crop production through digital transformation, particularly through the use of unmanned aerial vehicles (UAVs). (1) The importance of this topic is driven by declining food security in some parts of the world and the Russian government’s goal to increase grain exports by 2050. (2) Comparisons of agricultural technologies suggest that the use of UAVs for crop treatment with agrochemicals is economically effective in certain cases. (3) Specifically, UAV treatment is advantageous for plots with irregular shapes, larger than 2 ha, and containing between 9 and 19% infertile land. It is also important to choose appropriate flight parameters for the UAV, such as speed and altitude, as well as the type of on-board sprayer and agrochemical. In case of insufficient funds or expertise, it is recommended to hire specialized companies. (4) The listed peculiarities of Russian crop production led to assumptions about the regions where the use of UAVs for agrochemical treatment of crops would be economically effective. Full article
Figures:
Figure 1: Evaluation results of food security indicators in different regions of the world [52].
Figure 2: Dynamics of changes in the structure of agricultural production by type of farm, % of total [49].
Figure 3: Regional structure of agricultural producers by type of farm as % of total [49].
Figure 4: Regions of Russia where UAV flights are allowed or can be allowed.
Figure 5: Influence of the 1 ha plant protection agent (PPA) treatment cost on the critical value of the land use coefficient (LUC): (a) general dependence; (b) explanation of the critical point concept.
Figure 6: Influence of technical characteristics and agrochemical application rates on the cost of crop treatment with UAVs: (a) influence of UAV operational flight speed and agrochemical application rate at a rut length of the cultivated field of 1 km; (b) influence of rut length and agrochemical application rate at an operational flight speed of 60 km/h [92].
Figure 7: Results of the SWOT analysis of unmanned aerial vehicle (UAV) application in Russian crop production for field treatment with agrochemicals.
17 pages, 4132 KiB  
Article
Advanced Farming Strategies Using NASA POWER Data in Peanut-Producing Regions without Surface Meteorological Stations
by Thiago Orlando Costa Barboza, Marcelo Araújo Junqueira Ferraz, Cristiane Pilon, George Vellidis, Taynara Tuany Borges Valeriano and Adão Felipe dos Santos
AgriEngineering 2024, 6(1), 438-454; https://doi.org/10.3390/agriengineering6010027 - 20 Feb 2024
Viewed by 1170
Abstract
Understanding the impact of climate on peanut growth is crucial, given the importance of temperature in peanut to accumulate Growing Degree Days (GDD). Therefore, our study aimed to compare data sourced from the NASA POWER platform with information from surface weather stations to identify underlying climate variables associated with peanut maturity (PMI). Second, we sought to devise alternative methods for calculating GDD in peanut fields without nearby weather stations. We utilized four peanut production fields in the state of Georgia, USA, using the cultivar Georgia-06G. Weather data from surface stations located near peanut fields were obtained from the University of Georgia’s weather stations. Corresponding data from the NASA POWER platform were downloaded by inputting the geographic coordinates of the weather stations. The climate variables included maximum and minimum temperatures, average temperature, solar radiation, surface pressure, relative humidity, and wind speed. We evaluated the platforms using Pearson correlation (r) analysis (p < 0.05), linear regression analysis, assessing coefficient of determination (R2), root mean square error (RMSE), and Willmott index (d), as well as principal component analysis. Among the climate variables, maximum and minimum temperatures, average temperature, and solar radiation showed the highest R2 values, along with low RMSE values. Conversely, wind speed and relative humidity exhibited lower correlation values with errors higher than those of the other variables. The grid size from the NASA POWER platform contributed to low model adjustments since the grid’s extension is kilometric and can overlap areas. Despite this limitation, NASA POWER proves to be a potential tool for PMI monitoring. It should be especially helpful for growers who do not have surface weather stations near their farms. Full article
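A sketch of the workflow the article motivates, pulling daily temperatures from the NASA POWER point API and accumulating GDD, is shown below. The endpoint and field names follow the public API documentation at power.larc.nasa.gov, and the 13.3 °C base temperature is a commonly cited peanut value; both are assumptions to verify before use.

```python
import requests

def nasa_power_gdd(lat, lon, start, end, t_base=13.3):
    """Accumulate growing degree days (degC) from NASA POWER daily 2-m temperatures.

    start/end: 'YYYYMMDD'. t_base = 13.3 degC (~56 degF), a common peanut
    base temperature; adjust to the GDD definition used locally.
    """
    url = ("https://power.larc.nasa.gov/api/temporal/daily/point"
           f"?parameters=T2M_MAX,T2M_MIN&community=AG"
           f"&longitude={lon}&latitude={lat}&start={start}&end={end}&format=JSON")
    p = requests.get(url, timeout=60).json()["properties"]["parameter"]
    gdd = 0.0
    for day, tmax in p["T2M_MAX"].items():
        tmin = p["T2M_MIN"][day]
        gdd += max(0.0, (tmax + tmin) / 2 - t_base)
    return gdd

# e.g., a Tift County, GA growing season (coordinates illustrative)
print(nasa_power_gdd(31.45, -83.51, "20180501", "20180920"))
```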
Figures:
Figure 1: Peanut production fields across various counties in Georgia, USA. (A) Magnolia (Ducker); (B) Blaelock (Coffee); (C) Docia (Tift); (D) Grand Canyon (Berrien). The red dots represent the location of each field in Georgia, and the color in each field represents the buffer delimiting the regions (polygons) where peanut samples were collected to evaluate pod maturity.
Figure 2: Evaluation steps for maturity classification and comparison between the NASA POWER platform and surface weather stations.
Figure 3: Correlation analysis for multiple variables for the locations Berrien (A), Coffee (B), Dougherty (C), and Tift (D), and the general model (E) for the two years (2018 and 2019).
Figure 4: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for surface pressure (PS). (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 5: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for wind speed (WS). (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 6: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for solar radiation (Qg). (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 7: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for maximum temperature. (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 8: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for minimum temperature. (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 9: Linear regression analysis between NASA POWER (NP) and weather stations (WS) and metrics to evaluate the performance: determination coefficient (R²), root mean square error (RMSE), and Willmott performance index (d), for mean temperature. (a) General model; (b) model for the Coffee region; (c) model for the Tift region; (d) model for the Berrien region; and (e) model for the Dougherty region.
Figure 10: Principal component analysis (PCA) for each region and the general model. (A) General model; (B) Coffee region; (C) Tift region; (D) Berrien region; (E) Dougherty region. PS: surface pressure; WS: wind speed; UR: relative humidity; QG: solar radiation; Tmax: maximum temperature; Tmin: minimum temperature; Tmean: average temperature.
15 pages, 3171 KiB  
Article
Modelling the Yield and Estimating the Energy Properties of Miscanthus x Giganteus in Different Harvest Periods
by Ivan Brandić, Neven Voća, Josip Leto and Nikola Bilandžija
AgriEngineering 2024, 6(1), 423-437; https://doi.org/10.3390/agriengineering6010026 - 19 Feb 2024
Viewed by 960
Abstract
This research aims to use artificial neural networks (ANNs) to estimate the yield and energy characteristics of Miscanthus x giganteus (MxG), considering factors such as year of cultivation, location, and harvest time. In the study, which was conducted over three years in two different geographical areas, ANN regression models were used to estimate the lower heating value (LHV) and yield of MxG. The models showed high predictive accuracy, achieving R2 values of 0.85 for LHV and 0.95 for yield, with corresponding RMSEs of 0.13 and 2.22. A significant correlation affecting yield was found between plant height and number of shoots. In addition, a sensitivity analysis of the ANN models showed the influence of both categorical and continuous input variables on the predictions. These results highlight the role of MxG as a sustainable biomass energy source and provide insights for optimizing biomass production, influencing energy policy, and contributing to advances in renewable energy and global energy sustainability efforts. Full article
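An ANN regression of this kind can be prototyped with scikit-learn's MLPRegressor; the feature table, network size, and toy target below are assumptions for illustration, not the study's data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Illustrative features: [year, site (0/1), harvest period, plant height, shoot count].
rng = np.random.default_rng(6)
X = np.column_stack([rng.integers(0, 3, 300), rng.integers(0, 2, 300),
                     rng.integers(0, 3, 300), rng.normal(2.8, 0.4, 300),
                     rng.normal(40, 8, 300)])
y = 5 + 4 * X[:, 3] + 0.15 * X[:, 4] + rng.normal(0, 1.5, 300)  # toy yield (t/ha)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                 random_state=0))
pred = ann.fit(Xtr, ytr).predict(Xte)
print(f"R2 = {r2_score(yte, pred):.2f}, "
      f"RMSE = {mean_squared_error(yte, pred) ** 0.5:.2f}")
```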
Figures:
Figure 1: Architecture of the ANN model for estimating (a) LHV and (b) yield of MxG.
Figure 2: Recorded temperatures and precipitation for the duration of the study during different harvest periods at the (a) Bistra and (b) Sljeme sites.
Figure 3: Heatmap of the correlation coefficients between the ultimate analysis data and the energy values of MxG.
Figure 4: Heatmap of the correlation coefficients between the investigated measured properties of MxG.
Figure 5: Scatterplot of the overlap between actual and predicted data for the estimation of the LHV of MxG biomass, with the data split into training, testing, and validation sets.
Figure 6: Scatterplot of the overlap between actual and predicted data for biomass yield estimation of MxG, with the data split into training, testing, and validation sets.
Figure 7: Sensitivity analysis of the relative importance of the continuous and categorical input variables of the ANN model for the LHV output.
Figure 8: Sensitivity analysis of the relative importance of the continuous and categorical input variables of the ANN model for the yield output.
14 pages, 2070 KiB  
Technical Note
Theoretical Study of the Motion of a Cut Sugar Beet Tops Particle along the Inner Surface of the Conveying and Unloading System of a Topping Machine
by Simone Pascuzzi, Volodymyr Bulgakov, Ivan Holovach, Semjons Ivanovs, Aivars Aboltins, Yevhen Ihnatiev, Adolfs Rucins, Oleksandra Trokhaniak and Francesco Paciolla
AgriEngineering 2024, 6(1), 409-422; https://doi.org/10.3390/agriengineering6010025 - 15 Feb 2024
Cited by 5 | Viewed by 1044
Abstract
One of the most delicate operations in the sugar beet harvesting process is removing the tops from the heads of the root crops without any mechanical damages. The aim of this study is to improve the design of the conveying and unloading system of the sugar beet topper machine. In this paper, a mathematical model of the motion of a cut beet tops particle M, along the conveying and unloading system, has been developed to support the evaluation of kinematic and design parameters, depending on the rotational speed of the thrower blade, the air flow speed, the required ejection speed of particle M, and the position of the trailer that moves alongside the harvester. It has been established that increasing the speed Va of the top particle M, which has left the end of the blade of the thrower, leads to an increase in the arc coordinate S(t) of its movement along the cylindrical section of the casing. Within the range of the speed change from 4 m·s–1 to 8 m·s–1, the value of the arc coordinate S(t) increases by 1.4 times during time t = 0.006 s. Moreover, a rapid decrease in speed V is observed with an increase in the length x of the discharge chute. Full article
(This article belongs to the Section Agricultural Mechanization and Machinery)
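To make the qualitative result tangible, the speed V falling rapidly with chute length x, here is a generic point-mass integration with Coulomb friction and quadratic air drag on an inclined straight chute. It is a stand-in for, not a reproduction of, the paper's equations of motion, and all coefficients are illustrative.

```python
import numpy as np

def chute_speed_profile(v0=7.0, length=1.0, alpha=np.deg2rad(35),
                        mu=0.6, k_over_m=0.15, g=9.81, dt=1e-4):
    """Speed of a tops particle moving up a straight inclined chute.

    Point-mass model: deceleration from the gravity component along the
    chute, Coulomb friction (mu), and quadratic air drag (k_over_m, 1/m).
    """
    x, v = 0.0, v0
    xs, vs = [0.0], [v0]
    while x < length and v > 0:
        a = -g * (np.sin(alpha) + mu * np.cos(alpha)) - k_over_m * v * v
        v += a * dt
        x += v * dt
        xs.append(x); vs.append(v)
    return np.array(xs), np.array(vs)

xs, vs = chute_speed_profile()
print(f"V drops from {vs[0]:.1f} to {vs[-1]:.1f} m/s over {xs[-1]:.2f} m of chute")
```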
Figures:
Figure 1. Front-mounted beet topper machine: (a) overall view: I—tractor; II—beet topper implement; (b) machine main scheme: 1—drive working bodies; 2—cylindrical section of the unloading system; 3—discharge chute; 4—frame; 5—pneumatic feeler wheel; 6—rotary cutting device.
Figure 2. Force diagram of the motion of particle M after it has left the blade of the thrower in the cylindrical section of the conveying and unloading system: aerodynamic force F̄n (blue line), weight force Ḡ (green line), normal reaction force N̄1 (red line), friction force F̄tr1 (orange line).
Figure 3. Force diagram of the motion of particle M along the straight-line section DL of the discharge chute: aerodynamic force F̄n (blue line), weight force Ḡ (green line), normal reaction force N̄1 (red line), friction force F̄tr1 (orange line).
Figure 4. Dependence of the arc coordinate S(t) of particle M on time t during its motion along the cylindrical section of the conveying and unloading system after leaving the thrower blade, at the following speeds: 1—Va = 4 m·s⁻¹; 2—Va = 5 m·s⁻¹; 3—Va = 6 m·s⁻¹; 4—Va = 7 m·s⁻¹; 5—Va = 8 m·s⁻¹.
Figure 5. Dependence of the speed V of particle M on time t along the straight-line section DL of the discharge chute at a given initial speed of V1 = 7 m·s⁻¹.
Figure 6. Dependence of the speed V of particle M on the length x of the straight-line section DL of the discharge chute, with an initial speed of V1 = 7 m·s⁻¹.
13 pages, 1046 KiB  
Article
Carbon and Nitrogen Stocks in Topsoil under Different Land Use/Land Cover Types in the Southeast of Spain
by Abderraouf Benslama, Ignacio Gómez Lucas, Manuel M. Jordan Vidal, María Belén Almendro-Candel and Jose Navarro-Pedreño
AgriEngineering 2024, 6(1), 396-408; https://doi.org/10.3390/agriengineering6010024 - 12 Feb 2024
Cited by 2 | Viewed by 1322
Abstract
Land use plays a crucial role in the stock of soil organic carbon (SOC) and soil nitrogen (SN). The aim of this study was to assess and characterize the effects of various soil management practices on the physicochemical properties of soil in a Mediterranean region in southeastern Spain. Texture, soil moisture, bulk density, pH, electrical conductivity, equivalent CaCO3 (%), soil organic matter and carbon, and Kjeldahl nitrogen were determined for the surface topsoil (0–5 cm, 180 samples) under three types of land cover: cropland, grassland, and urban soil. The main soil textures were silt, silt loam, and sandy loam, with low soil moisture percentages in all samples and lower bulk density values in the cropland and grassland areas. The pH was alkaline, and both the electrical conductivity and the equivalent calcium carbonate content were moderate to high. Organic matter estimated using the LOI (loss on ignition) and WB (Walkley–Black) methods varied in the order cropland > grassland > urban soil. The results obtained for SOC and SN indicate that cropland presented the highest stocks, followed by grassland and urban soil. The C/N ratio was close to 10 in cropland and grassland, indicating that organic matter readily undergoes decomposition at these sites. Our results emphasize the importance of evaluating the effects and identifying the impacts of different soil management techniques, and further research is needed to better understand the potential to improve soil organic carbon and nitrogen storage in semiarid regions. Full article
Figures:
Figure 1. Location of the sampling areas of this study.
Figure 2. Sampling areas for cropland soil (A), grassland soil (B), and urban soil (C).
21 pages, 13276 KiB  
Article
Enhanced Deep Learning Architecture for Rapid and Accurate Tomato Plant Disease Diagnosis
by Shahab Ul Islam, Shahab Zaib, Giampaolo Ferraioli, Vito Pascazio, Gilda Schirinzi and Ghassan Husnain
AgriEngineering 2024, 6(1), 375-395; https://doi.org/10.3390/agriengineering6010023 - 12 Feb 2024
Cited by 3 | Viewed by 1502
Abstract
Deep neural networks have demonstrated outstanding performance in agricultural production, one of the most important sectors because it directly affects the economy and social life of any society. Plant disease identification is a major challenge in agricultural production and calls for fast, accurate detection techniques; with recent advances in deep learning, robust and accurate systems can be developed. This research investigated the use of deep learning for accurate and fast tomato plant disease identification. We used individual and merged datasets of tomato plants covering 10 classes (diseased and healthy plants). The main aim of this work was to evaluate the accuracy of existing convolutional neural network models, such as the Visual Geometry Group (VGG) network, ResNet, and DenseNet, on tomato plant disease detection, and then to design a custom deep neural network model that gives the best accuracy for the tomato plant. We trained and tested our models on datasets containing over 18,000 and 25,000 images with 10 classes, and achieved over 99% accuracy with our custom model. This high accuracy was achieved with less training time and lower computational cost compared to the other CNNs. This research demonstrates the potential of deep learning for efficient and accurate tomato plant disease detection, which can benefit farmers and contribute to improved agricultural production. The custom model's efficient performance makes it promising for practical implementation in real-world agricultural settings. Full article
(This article belongs to the Special Issue Application of Artificial Neural Network in Agriculture)
Figures:
Figure 1. Methodology.
Figure 2. Model convergence.
Figure 3. VGG architecture.
Figure 4. Custom model architecture.
Figure 5. VGG individual and merged data performance.
Figure 6. VGG individual and merged data results.
Figure 7. Custom model individual and merged data performance.
Figure 8. Custom model individual and merged data results.
Figure 9. ResNet individual and merged data performance.
Figure 10. ResNet individual and merged data results.
Figure 11. DenseNet121 individual and merged data performance.
Figure 12. DenseNet121 individual and merged data results.
Figure 13. DenseNet169 individual and merged data performance.
Figure 14. DenseNet169 individual and merged data results.
Figure 15. DenseNet201 individual and merged data performance.
Figure 16. DenseNet201 individual and merged data results.
Figure 17. Comparison chart of individual data.
Figure 18. Comparison chart of merged data.
14 pages, 2843 KiB  
Article
AI-Based Prediction of Carrot Yield and Quality on Tropical Agriculture
by Yara Karine de Lima Silva, Carlos Eduardo Angeli Furlani and Tatiana Fernanda Canata
AgriEngineering 2024, 6(1), 361-374; https://doi.org/10.3390/agriengineering6010022 - 9 Feb 2024
Cited by 2 | Viewed by 1715
Abstract
The adoption of artificial intelligence tools can improve production efficiency in the agroindustry. Our objective was to perform predictive modeling of carrot yield and quality. The crop was grown in two commercial areas during the summer season in Brazil. Root samples were taken at 200 points on a 30 × 30 m sampling grid at 82 and 116 days after sowing in both areas. The total fresh biomass, aerial part biomass, and root biometry were quantified prior to crop harvesting to measure yield. Root quality was assessed in the laboratory on sub-samples of three carrots, measuring the concentration of total soluble solids (°Brix) and firmness. Vegetation indices were extracted from satellite imagery. The most important variables for the predictive models were selected by principal component analysis and submitted to the Artificial Neural Network (ANN), Random Forest (RF), and Multiple Linear Regression (MLR) algorithms. The SAVI and NDVI indices stood out as predictors of crop yield, and the ANN results (R² = 0.68) were superior to those of the RF (R² = 0.67) and MLR (R² = 0.61) models. Carrot quality could not be modeled by the predictive models in this study; however, it should be explored in future research, including other crop variables. Full article
(This article belongs to the Special Issue Implementation of Artificial Intelligence in Agriculture)
Figures:
Figure 1. Experimental sites 1 (A) and 2 (B) and their respective areas of data collection.
Figure 2. Meteorological data from both experimental sites over the crop season. Source: NASA Power, 2022–2023 (https://power.larc.nasa.gov/data-access-viewer/), accessed on 8 January 2024.
Figure 3. Manual data collection of the carrots.
Figure 4. Flowchart of the experimental process and data processing.
Figure 5. Comparison of the total crop mass data in relation to the normal distribution.
Figure 6. Principal component analysis at 82 and 116 DAS. MT—total mass; MA—air mass; MR—root mass; C—root length; D—root diameter; B—°Brix; F—firmness; E1—experimental site 1; E2—experimental site 2.
Figure 7. Performance of the predictive models by method: ANN (A); RF (B); MLR (C).
17 pages, 19010 KiB  
Article
An Improved Detection Method for Crop & Fruit Leaf Disease under Real-Field Conditions
by Serosh Karim Noon, Muhammad Amjad, Muhammad Ali Qureshi, Abdul Mannan and Tehreem Awan
AgriEngineering 2024, 6(1), 344-360; https://doi.org/10.3390/agriengineering6010021 - 9 Feb 2024
Cited by 1 | Viewed by 1429
Abstract
Deep learning-based tools have been used in agriculture for the automatic detection of plant leaf diseases for many years. However, optimizing their use against the specific background of the agricultural field, in the presence of other leaves and the soil, is still an open challenge. This work presents a deep learning model based on YOLOv6s that incorporates (1) the Gaussian error linear unit (GELU) in the backbone, (2) efficient channel attention (ECA) in the basic RepBlock, and (3) the SCYLLA intersection over union (SIoU) loss function to improve the detection accuracy of the base model under real-field background conditions. Experiments were carried out on a self-collected dataset containing 3305 real-field images of cotton, wheat, and mango (healthy and diseased) leaves. The results show that the proposed model outperformed many state-of-the-art and recent models, including the base YOLOv6s, in terms of detection accuracy, and that this improvement was achieved without any significant increase in computational cost. The proposed model is therefore an effective technique for detecting plant leaf diseases in real-field conditions without an increased computational burden. Full article
Figures:
Figure 1. Sample images taken from the dataset showing difficult field conditions: (a) similar background; (b) shadow interference; (c) varying light and complex background; (d) variability of disease symptoms; (e) multiple objects in varying light.
Figure 2. The structure of the RepEA block with efficient channel attention embedded in the Rep block of YOLOv6.
Figure 3. The GELU function used in place of ReLU.
Figure 4. Proposed model for crop and fruit leaf disease detection.
Figure 5. Self-collected dataset: (a) wheat healthy; (b) wheat brown rust; (c) wheat yellow rust; (d) wheat stem rust; (e) wheat smut; (f) mango healthy; (g) mango anthracnose; (h) mango nutrient deficient; (i) cotton healthy; (j) cotton curl.
Figure 6. Visualizing the distribution of images in each class.
Figure 7. Data augmentation steps: (a) original image; (b) flip vertical; (c) flip horizontal; (d) brightness −25%; (e) brightness +25%; (f) rotate 25%.
Figure 8. IoU and classification loss curves during proposed model training.
Figure 9. Comparison of mAP@50% for the proposed model and the default YOLOv6 model.
Figure 10. Confusion matrix of the proposed model on the test dataset.
Figure 11. Precision–recall curve of all 10 classes at an IoU threshold of 0.5.
Figure 12. Detection results on the test dataset: (a,c,e,g) results of the default YOLOv6 model; (b,d,f,h) results of the improved YOLOv6 model.
14 pages, 3314 KiB  
Article
Hyperspectral Response of the Soybean Crop as a Function of Target Spot (Corynespora cassiicola) Using Machine Learning to Classify Severity Levels
by José Donizete de Queiroz Otone, Gustavo de Faria Theodoro, Dthenifer Cordeiro Santana, Larissa Pereira Ribeiro Teodoro, Job Teixeira de Oliveira, Izabela Cristina de Oliveira, Carlos Antonio da Silva Junior, Paulo Eduardo Teodoro and Fabio Henrique Rojo Baio
AgriEngineering 2024, 6(1), 330-343; https://doi.org/10.3390/agriengineering6010020 - 7 Feb 2024
Cited by 4 | Viewed by 1425
Abstract
Plants respond to biotic and abiotic pressures by changing their biophysical and biochemical characteristics, such as reducing their biomass and developing chlorosis, which can be readily identified using remote-sensing techniques applied to the VIS/NIR/SWIR spectral range. Production efficiency is fundamental for farmers, but diseases such as target spot continue to harm soybean yield. Remote sensing, especially hyperspectral sensing, can detect these diseases but has disadvantages such as cost and complexity, which favors the use of more economical UAV-based sensing. The objectives of this study were: (i) to identify the most appropriate input variable (bands, vegetation indices, or all reflectance ranges) for the metrics assessed in machine learning models; (ii) to verify whether there is a statistical difference in the response of the NDVI (normalized difference vegetation index), grain weight, and yield under different severity levels; and (iii) to identify whether the spectral bands and vegetation indices are related to the levels of target spot severity, grain weight, and yield. The field experiment was carried out in the 2022/23 crop season and involved different fungicide treatments to obtain different levels of disease severity. A spectroradiometer and UAV (unmanned aerial vehicle) imagery were used to collect spectral data from the leaves. The data were subjected to machine learning analysis using different algorithms. The LR (logistic regression) and SVM (support vector machine) algorithms performed best in classifying target spot severity levels when spectral data were used. Multivariate canonical analysis showed that healthy leaves stood out at specific wavelengths, while diseased leaves showed different spectral patterns. Hyperspectral sensors enabled the acquisition of detailed information for disease detection. Our findings reveal that remote sensing, especially using hyperspectral sensors and machine learning techniques, can be effective in the early detection and monitoring of target spot in the soybean crop, enabling fast decision-making for the control and prevention of yield losses. Full article
Figures:
Figure 1. Location of the study area (A), equipment used for hyperspectral (B) and multispectral (C) imagery, and diagram of the map of target spot severity obtained from the NDVI index (D).
Figure 2. Average rainfall and temperature conditions during the experiment.
Figure 3. Healthy leaves (A), leaves with 25% target spot severity (B), and leaves with 50% target spot severity (C).
Figure 4. Spectral signature for each level of target spot severity in soybean.
Figure 5. Canonical analysis relating spectral bands from a hyperspectral sensor (A) and vegetation indices from a multispectral sensor (B) to the levels of target spot severity, grain yield, and grain weight.
Figure 6. Boxplots for the accuracy metrics correct classification percentage (CC), Kappa, and F-score, considering the machine learning models and different inputs tested for classifying target spot severities in soybean (100 samples per severity). SB: spectral bands; VIs: vegetation indices; ALL: all reflectance ranges provided by the hyperspectral sensor. Averages followed by the same uppercase letters for the different inputs and the same lowercase letters for the different ML algorithms do not differ by the Scott–Knott test at 5% probability.