
Sensors in Agriculture

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (31 October 2017) | Viewed by 333578

Printed Edition Available!
A printed edition of this Special Issue is available.

Special Issue Editor


Prof. Dr. Dimitrios Moshou
Guest Editor
Head of Agricultural Engineering Laboratory, Faculty of Agriculture, Aristotle University of Thessaloniki (A.U.Th.), P.O. 275, 54124 Thessaloniki, Greece
Interests: remote sensing; multiscale fusion; robotic agriculture; sensor networks; robotics; development of cognitive abilities; fusion of global and local cognition

Special Issue Information

Dear Colleagues,

Agriculture requires technical solutions that increase production while decreasing environmental impact, through reduced application of agrochemicals and wider use of environmentally friendly management practices; a further benefit is lower production costs. Sensor technology provides the tools to achieve these goals. The explosive technological advances of recent years have greatly eased their attainment by removing many barriers to implementation, including reservations expressed by farmers themselves. Precision agriculture is an emerging area in which sensor-based technologies play an important role.

Farmers, researchers, and equipment manufacturers are joining efforts to find efficient solutions, improvements in production, and reductions in cost. This Special Issue aims to bring together recent research and development concerning novel sensors and their applications in agriculture. Sensors in agriculture are driven by the requirements of farmers and the farming operations that need to be addressed. We invite papers on sensor development for a wide range of agricultural tasks, including, but not limited to, recent research and developments in the following areas:

  • Optical sensors: hyperspectral, multispectral, fluorescence and thermal sensing
  • Sensors for crop health status determination
  • Sensors for crop phenotyping, germination, emergence and determination of the different growth stages of crops
  • Sensors for detection of microorganisms and pest management
  • Airborne sensors (UAVs)
  • Multisensor systems and sensor fusion
  • Non-destructive soil sensing
  • Yield estimation and prediction
  • Detection and identification of crops and weeds
  • Sensors for detection of fruits
  • Sensors for fruit quality determination
  • Sensors for weed control
  • Volatile compound detection, and electronic noses and tongues
  • Sensors for positioning, navigation, and obstacle detection
  • Sensor networks in agriculture, wearable sensors, and the Internet of Things
  • Low-energy, disposable, and energy-harvesting sensors in agriculture

Prof. Dr. Dimitrios Moshou
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • agricultural sensors
  • sensor information acquisition
  • sensor information processing
  • sensor-based decision making
  • sensors in agriculture
  • precision agriculture
  • sensors in agricultural production
  • technologies
  • sensor applications
  • processing of sensed data

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (40 papers)


Research


20 pages, 6150 KiB  
Article
A Framework to Design the Computational Load Distribution of Wireless Sensor Networks in Power Consumption Constrained Environments
by David Sánchez-Álvarez, Marino Linaje and Francisco-Javier Rodríguez-Pérez
Sensors 2018, 18(4), 954; https://doi.org/10.3390/s18040954 - 23 Mar 2018
Cited by 9 | Viewed by 4335
Abstract
In this paper, we present a work based on the computational load distribution among the homogeneous nodes and the Hub/Sink of Wireless Sensor Networks (WSNs). The main contribution of the paper is an early decision support framework that helps WSN designers take decisions about computational load distribution for those WSNs where power consumption is a key issue (by "framework" we mean a support tool for making decisions, in which executive judgment can be included along with the WSN designer's set of mathematical tools; this work shows the need to include load distribution as an integral component of the WSN system when making early decisions regarding energy consumption). The framework takes advantage of the idea that balancing the computational load between sensor nodes and the Hub/Sink can improve the energy consumption of the whole WSN, or at least of its battery-powered nodes. The approach is not trivial: it takes into account related issues such as the required data distribution, and the connectivity and availability of nodes and Hub/Sink given their connectivity features and duty cycle. For a practical demonstration, the proposed framework is applied to an agriculture case study, a sector very relevant in our region. In this kind of rural context, distances, the low margins imposed by vegetable selling prices, and the lack of continuous power supplies can make a sensing solution viable or inviable for farmers. The proposed framework systematizes and facilitates the complex calculations WSN designers must perform, taking into account the most relevant variables regarding power consumption and avoiding full, partial, or prototype implementations and measurements of the different candidate computational load distributions for a specific WSN.
(This article belongs to the Special Issue Sensors in Agriculture)
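The trade-off the framework explores, how shifting computation between battery-powered nodes and the Hub/Sink changes node energy, can be sketched with a toy energy model. All parameters below are hypothetical illustrations, not values from the paper:

```python
def node_energy(load_fraction, samples,
                e_proc_per_sample=0.5e-3,   # J to process one sample locally (hypothetical)
                e_tx_raw=2.0e-3,            # J to transmit one raw, unprocessed sample (hypothetical)
                e_tx_result=0.4e-3):        # J to transmit one locally computed result (hypothetical)
    """Energy (J) a battery-powered node spends when it processes
    `load_fraction` of its samples locally and forwards the rest raw
    to the Hub/Sink."""
    local = load_fraction * samples
    remote = (1.0 - load_fraction) * samples
    return local * (e_proc_per_sample + e_tx_result) + remote * e_tx_raw

# Sweep the load split to find the node-side optimum for 1000 samples.
best = min((f / 10 for f in range(11)), key=lambda f: node_energy(f, 1000))
```

With these made-up costs, transmitting a compact result is cheaper than transmitting the raw sample, so the sweep favors full local processing; with other radios or CPUs the optimum can flip, which is precisely the decision the framework is meant to support.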
Figure 1. Energy consumption of the network.
Figure 2. Energy consumption produced by the transmission.
Figure 3. Energy consumption as a function of processing load distribution.
Figure 4. Real consumption of the data transmission.
Figure 5. The farm where the coverage and signal-power measurements were taken. The box contains the sensor node and batteries, as well as hardware to check relevant parameters.
16 pages, 3225 KiB  
Article
Using a Mobile Device “App” and Proximal Remote Sensing Technologies to Assess Soil Cover Fractions on Agricultural Fields
by Ahmed Laamrani, Renato Pardo Lara, Aaron A. Berg, Dave Branson and Pamela Joosse
Sensors 2018, 18(3), 708; https://doi.org/10.3390/s18030708 - 27 Feb 2018
Cited by 18 | Viewed by 5432
Abstract
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor-intensive, time-consuming, and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones and tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (hereafter the "app" method) was compared against two point-counting approaches: an established digital photograph-grid method and a new automated residue-counting script developed in MATLAB at the University of Guelph. Both the photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script, and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both the photograph-grid and script methods (R2 = 0.86 and 0.84, respectively). This study found that the app underestimates the residue coverage by −6.3% and −10.8% when compared to the photograph-grid and script methods, respectively. With regard to residue type, soybean has a slightly lower bias than corn (−5.3% vs. −7.4%). For photos with residue <30%, the app-derived residue measurements are within ±5% difference (bias) of both the photograph-grid- and script-derived residue measurements. These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point-counting methods, which are more time-consuming.
(This article belongs to the Special Issue Sensors in Agriculture)
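The photograph-grid and script methods both count residue under 100 uniformly spaced grid points; the point-count estimate can be sketched on a binary residue mask as follows (a generic sketch on synthetic data, not the authors' MATLAB script):

```python
def grid_point_cover(mask, n=10):
    """Estimate percent cover by sampling an n-by-n grid of uniformly
    spaced points over a 2-D boolean residue mask (True = residue)."""
    rows = len(mask)
    cols = len(mask[0])
    hits = 0
    for i in range(n):
        for j in range(n):
            r = (2 * i + 1) * rows // (2 * n)   # grid-cell centre row
            c = (2 * j + 1) * cols // (2 * n)   # grid-cell centre column
            hits += 1 if mask[r][c] else 0
    return 100.0 * hits / (n * n)

# Synthetic 100x100 image: left half residue, right half bare soil.
mask = [[col < 50 for col in range(100)] for _ in range(100)]
cover = grid_point_cover(mask)   # ~50% cover
```

In practice the mask would come from thresholding or classifying the photograph; the 10 × 10 grid mirrors the 100 intersections of the photograph-grid method described above.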
Figure 1. Study area within the Lake Erie watershed, southern Ontario, Canada. The 18 plots used in this study were located south of the city of London.
Figure 2. Examples of vertical photographs taken over soybean (a) and corn (c) fields. The photograph-grid method used superimposed, uniformly spaced 10 by 10 digital grid intersections (b). The app-estimated crop residue is displayed in (d) as a thematic classified map using a two-color palette legend (residue class in red vs. no-residue class in gray). (c,d) show an example of the app method input and output. This example shows an evident underestimation of the app-derived crop residue compared to that from the photograph-grid method (62% vs. 88%, respectively).
Figure 3. Relationships between residue cover percent as determined by photograph-grid and app: (a) residue cover (corn = red dots, soybean = blue dots); (b) residue levels (low = blue, medium = red, high = green). Related statistics are shown in Table 2.
Figure 4. Relationships between residue cover percent as determined by script and app: (a) residue cover (corn = red dots, soybean = blue dots); (b) residue levels (low < 30%, medium = 30–60%, high > 60%). Related statistics (R²) are shown in Table 3.
Figure 5. Relationships between residue cover percent as determined by photograph-grid and app with 95% CI: (a) OLS log-linear regression; (b) OLS logit-linear regression; (c) generalized Poisson regression; (d) Beta regression with logit link.
Figure 6. Relationships between residue cover percent as determined by script and app with 95% CI: (a) OLS log-linear regression; (b) OLS logit-linear regression; (c) generalized Poisson regression; (d) Beta regression with logit link.
18 pages, 4540 KiB  
Article
Comparative Study of the Detection of Chromium Content in Rice Leaves by 532 nm and 1064 nm Laser-Induced Breakdown Spectroscopy
by Jiyu Peng, Fei Liu, Tingting Shen, Lanhan Ye, Wenwen Kong, Wei Wang, Xiaodan Liu and Yong He
Sensors 2018, 18(2), 621; https://doi.org/10.3390/s18020621 - 18 Feb 2018
Cited by 29 | Viewed by 4770
Abstract
Fast detection of toxic metals in crops is important for monitoring pollution and ensuring food safety. In this study, laser-induced breakdown spectroscopy (LIBS) was used to detect the chromium content in rice leaves. We investigated the influence of laser wavelength (532 nm and 1064 nm excitation), along with variations of delay time, pulse energy, and lens-to-sample distance (LTSD), on the signal (sensitivity and stability) and plasma features (temperature and electron density). With the optimized experimental parameters, univariate analysis was used for quantifying the chromium content, and several preprocessing methods (including background normalization, area normalization, multiplicative scatter correction (MSC) transformation, and standard normal variate (SNV) transformation) were used to further improve the analytical performance. The results indicated that 532 nm excitation showed better sensitivity than 1064 nm excitation, with a detection limit around two times lower. However, the prediction accuracy for both excitation wavelengths was similar. The best result, with a correlation coefficient of 0.9849, root-mean-square error of 3.89 mg/kg, and detection limit of 2.72 mg/kg, was obtained using the SNV-transformed signal (Cr I 425.43 nm) induced by 532 nm excitation. The results indicate the promising capability of LIBS for toxic metal detection in plant materials.
(This article belongs to the Special Issue Sensors in Agriculture)
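The SNV transform that gave the best calibration here is simple: each spectrum is centred on its own mean and scaled by its own standard deviation, which suppresses multiplicative baseline effects. A generic sketch (not the authors' code):

```python
import math

def snv(spectrum):
    """Standard normal variate transform: centre each spectrum on its
    own mean and scale by its own (sample) standard deviation."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    var = sum((x - mean) ** 2 for x in spectrum) / (n - 1)
    return [(x - mean) / math.sqrt(var) for x in spectrum]

# Toy 5-point "spectrum"; after SNV it has zero mean and unit variance.
spec = [10.0, 12.0, 14.0, 16.0, 18.0]
z = snv(spec)
```

Because the scaling is per spectrum, SNV needs no reference spectrum, unlike MSC, which regresses each spectrum against a mean spectrum of the set.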
Graphical abstract
Figure 1. Averaged spectra of a sample from the extra group (60 μM chromium stress) in the range of 240–880 nm. Experimental parameters (energy, LTSD, delay time, gate width) for 532 nm and 1064 nm excitation were 90 mJ, 98 mm, 4 μs, and 16 μs, respectively.
Figure 2. (a) Spectral intensity in the range of 424–428 nm with different delay times (solid line: 532 nm excitation; dashed line: 1064 nm excitation); (b) comparison of SBR, SNR, and RSD of the Cr I 425.43 nm emission with different delay times under 532 nm and 1064 nm excitation; (c) time-resolved analysis of electron density and temperature under 532 nm and 1064 nm excitation.
Figure 3. (a) Spectral intensity in the range of 424–428 nm with different laser energies (solid line: 532 nm excitation; dashed line: 1064 nm excitation); (b) comparison of SBR, SNR, and RSD of the Cr I 425.43 nm emission with different laser energies under 532 nm and 1064 nm excitation; (c) electron density and temperature with different laser energies under 532 nm and 1064 nm excitation.
Figure 4. Morphologies of craters with different pulse energies under 532 nm and 1064 nm excitation: (a) 532 nm, 20 mJ; (b) 1064 nm, 20 mJ; (d) 532 nm, 130 mJ; (e) 1064 nm, 130 mJ; colors in (a,b,d,e) indicate crater depth. Plots of crater statistics (depth and ablated volume): (c) 20 mJ; (f) 130 mJ. Crater morphologies were measured using digital microscopy (VHX-6000, Keyence, Osaka, Japan).
Figure 5. (a) Spectral intensity in the range of 424–428 nm with different LTSDs (solid line: 532 nm excitation; dashed line: 1064 nm excitation); (b) comparison of SBR, SNR, and RSD of the Cr I 425.43 nm emission with different LTSDs under 532 nm and 1064 nm excitation; (c) electron density and temperature with different LTSDs under 532 nm and 1064 nm excitation.
Figure 6. Morphologies of craters with different LTSDs under 532 nm and 1064 nm excitation: (a) 532 nm, LTSD = 93 mm; (b) 1064 nm, 93 mm; (d) 532 nm, 97 mm; (e) 1064 nm, 97 mm; (g) 532 nm, 102 mm; (h) 1064 nm, 102 mm; colors in (a,b,d,e,g,h) indicate crater depth. Plots of crater statistics (depth and ablated volume): (c) 93 mm; (f) 97 mm; (i) 102 mm.
Figure 7. Spectra of two representative samples (chromium contents of 6.99 mg/kg and 56.03 mg/kg) in the spectral range of 424–428 nm after different pretreatments: (a) raw; (b) background normalization; (c) area normalization; (d) MSC transformation; (e) SNV transformation.
Figure 8. Relationship between reference values and LIBS-measured values calibrated with: (a) raw signal, 532 nm excitation; (b) MSC-transformed signal, 532 nm; (c) SNV-transformed signal, 532 nm; (d) raw signal, 1064 nm; (e) MSC-transformed signal, 1064 nm; (f) SNV-transformed signal, 1064 nm. (Rc/Rp: R in the calibration/prediction set; RMSEC/RMSEP: RMSE in the calibration/prediction set.)
11 pages, 3506 KiB  
Article
Two Solutions of Soil Moisture Sensing with RFID for Landslide Monitoring
by Sérgio Francisco Pichorim, Nathan J. Gomes and John C. Batchelor
Sensors 2018, 18(2), 452; https://doi.org/10.3390/s18020452 - 3 Feb 2018
Cited by 51 | Viewed by 9864
Abstract
Two solutions for UHF RFID tags for soil moisture sensing were designed and are described in this paper. In the first, two conventional tags (standard transponders) are employed: one, placed close to the soil surface, is the sensor tag, while the other, separated from the soil, is the reference for system calibration. Using transmission power ramps, the tags' turn-on power levels are measured and correlated with soil condition (dry or wet). In the second solution, the SL900A chip, which supports up to two external sensors and an internal temperature sensor, is used. An interdigital capacitive sensor was connected to the transponder chip and used for soil moisture measurement. In a novel design for a UHF RFID tag, the sensor is placed below the soil surface, while the transponder and antenna are above the soil to improve communication. Both solutions are evaluated practically, and the results show that the presence of water in soil can be remotely detected, allowing their application in landslide monitoring.
(This article belongs to the Special Issue Sensors in Agriculture)
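The first solution classifies the soil from the difference between the turn-on power of the sensor tag (near the soil) and the reference tag (away from it): wet soil detunes and absorbs near the sensor tag, raising the transmit power needed to activate it. A schematic decision rule, with a hypothetical threshold rather than the paper's calibration values:

```python
def soil_state(sensor_turn_on_dbm, reference_turn_on_dbm, threshold_db=3.0):
    """Classify soil moisture from UHF RFID turn-on power levels.

    The reference tag cancels out reader-distance and propagation effects;
    only the excess power the sensor tag needs is attributed to soil water.
    `threshold_db` is a hypothetical calibration margin, not a value from
    the paper.
    """
    delta = sensor_turn_on_dbm - reference_turn_on_dbm
    return "wet" if delta > threshold_db else "dry"

# Example: the sensor tag needs 6 dB more power than the reference tag.
state = soil_state(18.0, 12.0)
```

A real deployment would set the threshold from power-ramp measurements over known dry and wet soil, as in the paper's calibration curves.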
Figure 1. System using two adhesive UHF RFID tags for soil moisture monitoring. The sensor and reference tags are at distances H and A, respectively, from the soil; the reading distance is R.
Figure 2. Transmission power needed to turn on the tag (for H = 26 mm) as a function of frequency. Solid line: wet soil; dashed line: dry soil. The cursor marks 868 MHz.
Figure 3. Transmission power needed to turn on the tag (at 868 MHz) as a function of the distance H between tag and soil. Diamonds (solid line): wet soil; dots (dashed line): dry soil.
Figure 4. Prototype of the soil moisture sensor using two adhesive UHF RFID tags. The sensor tag is placed 12 mm above the soil; the reference tag is 100 mm from the soil.
Figure 5. Transmission power needed to turn on the tags (at 868 MHz) as a function of the reading distance R between tags and reader. Solid lines: sensor tag; dashed lines: reference tag. Diamonds: dry soil; squares: wet soil.
Figure 6. Block diagram of the external sensor interface of the SL900A. For a capacitive sensor (C sensor), a reference capacitor (C ref) must be inserted at the Ext1 pin.
Figure 7. Design of the soil stick sensor using the SL900A UHF RFID chip and an interdigital capacitive humidity sensor (C sensor). A neck of N = 30 mm was defined. The arrow shape allows the tag to be stuck into the soil; the whole sensor area must be buried. All dimensions are in millimeters (mm).
Figure 8. Example of the soil stick humidity sensor in a real situation.
Figure 9. Sensor capacitance (solid dots and line) as a function of soil moisture h (weight ratio). Dashed line and hollow dots: values obtained from the GUI of the UHF RFID reader. Curves are parabolic approximations.
20 pages, 10383 KiB  
Article
Specim IQ: Evaluation of a New, Miniaturized Handheld Hyperspectral Camera and Its Application for Plant Phenotyping and Disease Detection
by Jan Behmann, Kelvin Acebron, Dzhaner Emin, Simon Bennertz, Shizue Matsubara, Stefan Thomas, David Bohnenkamp, Matheus T. Kuska, Jouni Jussila, Harri Salo, Anne-Katrin Mahlein and Uwe Rascher
Sensors 2018, 18(2), 441; https://doi.org/10.3390/s18020441 - 2 Feb 2018
Cited by 155 | Viewed by 20904
Abstract
Hyperspectral imaging sensors are promising tools for monitoring crop plants or vegetation in different environments. Information on the physiology, architecture, or biochemistry of plants can be assessed non-invasively and on different scales. For instance, hyperspectral sensors are implemented for stress detection in plant phenotyping processes or in precision agriculture. To date, a variety of non-imaging and imaging hyperspectral sensors is available, but the measuring process and handling of most of these sensors are rather complex. Thus, in recent years demand has arisen for sensors that are easy to operate. The present study introduces the novel hyperspectral camera Specim IQ from Specim (Oulu, Finland). The Specim IQ is a handheld push-broom system with an integrated operating system and controls. Basic data handling and data analysis processes, such as pre-processing and classification routines, are implemented within the camera software. This study provides an introduction to the measurement pipeline of the Specim IQ as well as a radiometric performance comparison with a well-established hyperspectral imager. Case studies are presented for the detection of powdery mildew on barley at the canopy scale and for the spectral characterization of Arabidopsis thaliana mutants grown under stressed and non-stressed conditions.
(This article belongs to the Special Issue Sensors in Agriculture)
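One of the classification routines applied in this study's case studies is the Spectral Angle Mapper (SAM), which assigns each pixel to the reference spectrum with the smallest angle between spectra treated as vectors. A minimal sketch with made-up four-band spectra (the band values are illustrative, not from the paper):

```python
import math

def spectral_angle(a, b):
    """Spectral angle (radians) between a pixel spectrum and a reference."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp to guard against floating-point overshoot outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def sam_classify(pixel, references):
    """Assign the pixel to the reference class with the smallest angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))

refs = {"healthy": [0.05, 0.08, 0.45, 0.50],   # hypothetical 4-band spectra
        "symptom": [0.20, 0.25, 0.30, 0.32]}
label = sam_classify([0.06, 0.09, 0.40, 0.47], refs)
```

Because the angle ignores vector magnitude, SAM is insensitive to overall brightness differences, which is why a handful of reference spectra (two per class in the paper's example) can suffice.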
Figure 1. Vector visualization of the Specim IQ (Specim Ltd., Oulu, Finland) with annotations and dimensions (left) and RGB renderings (right).
Figure 2. The standard workflow of the Specim IQ hyperspectral camera.
Figure 3. Mean spectra (including standard deviation) of green paper (A) and purple polyethylene (B), as representatives of the observed reference objects of different colors in the indoor setting (C).
Figure 4. Comparison of reflectance spectra from the outdoor experiment (A–D): green paper, dark yellow paper, purple polyethylene, blue polyethylene. Apart from the increased NIR reflectance observed by the Specim IQ, all spectra show a high level of congruence.
Figure 5. RGB visualization of a reflectance test image showing the line pattern. The highlighted part of (A) is shown zoomed in (B). On the white reference the line pattern is visible, whereas on the plants it is mainly masked by natural variability.
Figure 6. Differences between non-stress-acclimated (NSA) and stress-acclimated (SA) Arabidopsis wildtype (Col-0) and NPQ-deficient mutants (npq1 and npq4), as shown by computed spectral ratios. Left panels: false-colour images of selected ROIs (A), NDVI (C), REIP (E), and PRI (G) computed from spectral information captured by the Specim IQ camera. Right panels: means ± standard errors of reflectance values (B), NDVI (D), REIP (F), and PRI (H) from three individual plants randomly distributed in the imaging frame. Different letters indicate significant differences based on LSD (α = 0.05).
Figure 7. Classification of powdery mildew using the Spectral Angle Mapper (SAM) and a Support Vector Machine (SVM). SAM detection is based on two reference spectra for "symptoms" and two for "healthy tissue"; the SVM prediction is based on 15 training samples per class. The white reference panel appears on the left side of the image.
Figure 8. Evaluation of images of inoculated barley plants by an SVM classification model (green: healthy; orange: symptom; gray: background). Cultivar (cv.) Milford shows significantly more affected pixels, whereas cv. Tocada shows only a few symptoms in the measured part of the canopy. The percentage of affected pixels is given for inoculated (inoc.) and healthy control (cont.) plants.
22 pages, 5822 KiB  
Article
Research on the Effects of Drying Temperature on Nitrogen Detection of Different Soil Types by Near Infrared Sensors
by Pengcheng Nie, Tao Dong, Yong He and Shupei Xiao
Sensors 2018, 18(2), 391; https://doi.org/10.3390/s18020391 - 29 Jan 2018
Cited by 16 | Viewed by 4900
Abstract
Soil is a complicated system whose components and mechanisms are complex and difficult to fully characterize and comprehend. Nitrogen is a key parameter supporting plant growth and development, and the material basis of plant growth. An accurate grasp of soil nitrogen information is the premise of scientific fertilization in precision agriculture, where near infrared sensors are widely used for rapid detection of soil nutrients. However, soil texture, soil moisture content, and drying temperature all affect soil nitrogen detection by near infrared sensors. To investigate the effects of drying temperature on nitrogen detection in black soil, loess, and calcium soil, the three soils were measured with near infrared sensors after 25 °C placement (ambient temperature), 50 °C drying (medium temperature), 80 °C drying (medium-high temperature), and 95 °C drying (high temperature). The successive projections algorithm based on multiple linear regression (SPA-MLR), partial least squares (PLS), and competitive adaptive reweighted squares (CARS) were used to model and analyze the spectral information of the different soil types. Predictive ability was assessed using the prediction correlation coefficient (RP), the root-mean-square error of prediction (RMSEP), and the residual predictive deviation (RPD). The results showed that loess (RP = 0.9721, RMSEP = 0.067 g/kg, RPD = 4.34) and calcium soil (RP = 0.9588, RMSEP = 0.094 g/kg, RPD = 3.89) yielded the best prediction accuracy after 95 °C drying, while for black soil the best results (RP = 0.9486, RMSEP = 0.22 g/kg, RPD = 2.82) were obtained after 80 °C drying. In conclusion, drying temperature has a clear influence on the detection of soil nitrogen by near infrared sensors, and selecting a suitable drying temperature for each soil type is of great significance for enhancing detection accuracy.
(This article belongs to the Special Issue Sensors in Agriculture)
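The figures of merit reported above follow standard definitions: RMSEP is the root-mean-square error on the prediction set, and RPD is the standard deviation of the reference values divided by RMSEP. A generic sketch with hypothetical data (not the paper's measurements):

```python
import math

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def rpd(y_true, y_pred):
    """Residual predictive deviation: sample standard deviation of the
    reference values divided by RMSEP. Values above ~3, as reported for
    loess and calcium soil here, indicate strong quantitative models."""
    n = len(y_true)
    mean = sum(y_true) / n
    sd = math.sqrt(sum((t - mean) ** 2 for t in y_true) / (n - 1))
    return sd / rmsep(y_true, y_pred)

# Hypothetical reference vs. predicted nitrogen contents (g/kg).
y_true = [1.0, 2.0, 3.0, 4.0, 5.0]
y_pred = [1.1, 1.9, 3.2, 3.8, 5.1]
```

Because RPD scales the error by the spread of the reference data, it lets models built on soils with different nitrogen ranges (e.g., black soil vs. loess) be compared on a common footing.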
Figure 1
<p>Near infrared (NIR) spectrum soil detection platform.</p>
Figure 2
<p>Near infrared spectra of three kinds of soils: (<b>A</b>) 50 °C drying; (<b>B</b>) 80 °C drying; (<b>C</b>) 95 °C drying; (<b>D</b>) 25 °C placement. (<b>a</b>,<b>d</b>,<b>g</b>,<b>j</b>) are the black soil average spectra at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively; (<b>b</b>,<b>e</b>,<b>h</b>,<b>k</b>) are the loess average spectra at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively; (<b>c</b>,<b>f</b>,<b>i</b>,<b>l</b>) are the calcium soil average spectra at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively.</p>
Figure 3
<p>The wavelength number of loess, calcium and black soil selected by SPA: (<b>A</b>) 50 °C drying; (<b>B</b>) 80 °C drying; (<b>C</b>) 95 °C drying; (<b>D</b>) 25 °C placement. (<b>a</b>,<b>d</b>,<b>g</b>,<b>j</b>) are the loess wavelength number at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively; (<b>b</b>,<b>e</b>,<b>h</b>,<b>k</b>) are the calcium wavelength number at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively; (<b>c</b>,<b>f</b>,<b>i</b>,<b>l</b>) are the black soil wavelength number at 50 °C, 80 °C, 95 °C drying and 25 °C placement, respectively.</p>
Figure 4
<p>SPA-MLR algorithm prediction results: (<b>A</b>) black soil; (<b>B</b>) loess; (<b>C</b>) calcium soil.</p>
Figure 5
<p>The prediction effect by PLS: (<b>A</b>) black soil; (<b>B</b>) loess; (<b>C</b>) calcium soil.</p>
Figure 6
<p>The variable selection process by competitive adaptive reweighted squares (CARS): (<b>a</b>) 50 °C drying; (<b>b</b>) 80 °C drying; (<b>c</b>) 95 °C drying; (<b>d</b>) 25 °C placement.</p>
Figure 7
<p>CARS prediction results: (<b>A</b>) black soil; (<b>B</b>) loess; (<b>C</b>) calcium soil.</p>
Figure 8
<p>The prediction results of three kinds of soils at different drying temperatures based on three algorithms: (<b>A</b>) 50 °C drying; (<b>B</b>) 80 °C drying; (<b>C</b>) 95 °C drying; (<b>D</b>) 25 °C placement.</p>
Figure 9
<p>Prediction results of three kinds of soil based on models built at different temperatures: (<b>A</b>) black soil; (<b>B</b>) loess; (<b>C</b>) calcium soil.</p>
13 pages, 6840 KiB  
Article
Evaluation of Apple Maturity with Two Types of Dielectric Probes
by Marcin Kafarski, Andrzej Wilczek, Agnieszka Szypłowska, Arkadiusz Lewandowski, Piotr Pieczywek, Grzegorz Janik and Wojciech Skierucha
Sensors 2018, 18(1), 121; https://doi.org/10.3390/s18010121 - 4 Jan 2018
Cited by 20 | Viewed by 4223
Abstract
The observed dielectric spectrum of ripe apples in the last period of shelf-life was analyzed using a multipole dielectric relaxation model, which assumes three active relaxation processes: a primary α-process (water relaxation) and two secondary processes caused by solid-water-ion interactions, α’ (bound water relaxation) and β’ (the Maxwell-Wagner effect). The performance of two designs of dielectric probe was compared: a classical coaxial open-ended probe (OE probe) and an open-ended probe with a central conductor prolonged in the form of an antenna (OE-A probe). The OE-A probe increases the measurement volume and consequently extends the range of applications to other materials, such as granulated agricultural products, soils, or liquid suspensions. However, its measurement frequency range is limited compared to the OE probe because, above 1.5 GHz, the probe with the antenna generates higher propagation modes and the applied calibrations and calculations are no longer sufficient. It was shown that data from measurements using the OE-A probe gave slightly stronger correlations with the apples’ quality parameters than data from the typical OE probe. Additionally, we compared twelve multipole fitting models with different combinations of poles (eight three-pole and four two-pole models). The best fit was obtained using a two-pole model for data collected with the OE-A probe and a three-pole model for the OE probe, using only Cole-Cole poles in both cases. Full article
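The multipole models referred to above are built from Cole-Cole poles plus a low-frequency conductivity term. A minimal sketch of such a model follows (the function name, the water-like example parameters and the exact sign convention are our assumptions; the paper's fitted parameter values are not reproduced here):

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)

def cole_cole(f_hz, eps_inf, poles, sigma):
    """Complex relative permittivity of a multipole Cole-Cole model.

    poles: list of (delta_eps, tau_s, alpha) tuples, one per relaxation;
    sigma: low-frequency conductivity in S/m.
    """
    w = 2.0 * math.pi * f_hz
    eps = complex(eps_inf, 0.0)
    for d_eps, tau, alpha in poles:
        eps += d_eps / (1.0 + (1j * w * tau) ** (1.0 - alpha))  # Cole-Cole pole
    eps -= 1j * sigma / (w * EPS0)  # conductivity loss term
    return eps

# Illustrative, water-like parameters (NOT the paper's fitted values):
example = cole_cole(1e9, 5.0, [(73.0, 8.3e-12, 0.02)], 0.05)
```

Fitting then amounts to choosing eps_inf, the pole parameters and sigma so that this model best matches the measured spectrum.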
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Tested coaxial probes and the elements of the measurement setup: (<b>a</b>) a coax OE probe (left) and a coax OE-A probe (right); (<b>b</b>) respective dimensions of the probes (in mm); and (<b>c</b>) experimental setup.</p>
Figure 2
<p>EMPro FEM simulations of electrical field penetration depth in distilled water for the frequency of 1.5 GHz.</p>
Figure 3
<p>The scatter of data of recorded <math display="inline"> <semantics> <mrow> <msubsup> <mi>ε</mi> <mi>r</mi> <mo>*</mo> </msubsup> </mrow> </semantics> </math> spectra for the measurement series S10 collected with the OE and the OE-A probes, and the results of the applied elimination of outliers: (<b>a</b>,<b>b</b>) for apple 7 in series S10; (<b>c</b>,<b>d</b>) for all apples in series S10.</p>
Figure 4
<p>Results of the best and the worst data fitting of dielectric spectra of <math display="inline"> <semantics> <mrow> <msubsup> <mi>ε</mi> <mi>r</mi> <mo>*</mo> </msubsup> </mrow> </semantics> </math> collected with the OE probe for series S10, apple No. 7, for: (<b>a</b>) two-pole and (<b>b</b>) three-pole models.</p>
Figure 5
<p>Results of the best and the worst data fitting of dielectric spectra of <math display="inline"> <semantics> <mrow> <msubsup> <mi>ε</mi> <mi>r</mi> <mo>*</mo> </msubsup> </mrow> </semantics> </math> collected with the OE-A probe for series S10, apple No. 7, for: (<b>a</b>) two-pole and (<b>b</b>) three-pole models.</p>
Figure 6
<p>Correlation between the parameters of the CC-CC-CC multipole relaxation model and the shelf-life of the tested apples determined by the OE probe (assuming the influence of electrical conductivity <span class="html-italic">σ</span> (mS/m) and three dielectric dispersion effects from Maxwell-Wagner—<span class="html-italic">MW</span>, bound water—<span class="html-italic">bw</span> and free water—<span class="html-italic">fw</span>).</p>
Figure 7
<p>Correlation between the parameters of the multipole model and the shelf-life of the tested apples determined by the OE-A probe (assuming the influence of electrical conductivity <span class="html-italic">σ</span> (mS/m) and two dielectric dispersion effects from Maxwell-Wagner—<span class="html-italic">MW</span> and bound water—<span class="html-italic">bw</span>).</p>
Figure 8
<p>Mean values of firmness (left), acoustic events (AE) (middle), and soluble solids content (SSC) (right) in apples in each test during the 21-day shelf-life.</p>
Figure 9
<p>Relation between firmness of the tested apples and selected multipole model parameters (low-frequency electrical conductivity <span class="html-italic">σ</span> (mS/m)—left, relaxation frequency of the bound water effect <span class="html-italic">bw</span>—middle and relaxation frequency of the Maxwell-Wagner effect <span class="html-italic">MW</span>—right) for <math display="inline"> <semantics> <mrow> <msubsup> <mi>ε</mi> <mi>r</mi> <mo>*</mo> </msubsup> </mrow> </semantics> </math> measured with the OE-A dielectric probe.</p>
16 pages, 52677 KiB  
Article
Application of Near Infrared Reflectance Spectroscopy for Rapid and Non-Destructive Discrimination of Hulled Barley, Naked Barley, and Wheat Contaminated with Fusarium
by Jongguk Lim, Giyoung Kim, Changyeun Mo, Kyoungmin Oh, Geonseob Kim, Hyeonheui Ham, Seongmin Kim and Moon S. Kim
Sensors 2018, 18(1), 113; https://doi.org/10.3390/s18010113 - 2 Jan 2018
Cited by 26 | Viewed by 6321
Abstract
Fusarium is a common fungal disease in grains that reduces the yield of barley and wheat. In this study, a near infrared reflectance spectroscopic technique was used with a statistical prediction model to rapidly and non-destructively discriminate grain samples contaminated with Fusarium. Reflectance spectra were acquired from hulled barley, naked barley, and wheat samples contaminated with Fusarium using near infrared reflectance (NIR) spectroscopy over the wavelength range of 1175–2170 nm. After measurement, the samples were cultured in a medium to identify the contaminated ones. A partial least squares discriminant analysis (PLS-DA) prediction model was developed using the acquired reflectance spectra and the culture results. The correct classification rate (CCR) for Fusarium in the hulled barley, naked barley, and wheat models developed using raw spectra was 98% or higher. The discrimination accuracy improved further when second- and third-order derivative pretreatments were applied. Grains contaminated with Fusarium could thus be rapidly discriminated using spectroscopy and a PLS-DA discrimination model, verifying the potential of this non-destructive discrimination method. Full article
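As an illustration of the two processing steps named in this abstract, derivative pretreatment and classification-rate scoring can be sketched as below (simple finite differencing stands in for the actual derivative pretreatment used with the PLS-DA model, and both helper names are hypothetical):

```python
def derivative_pretreatment(spectrum, order=2):
    """Repeated finite differencing of a spectrum, a simple stand-in for
    the second- and third-order derivative pretreatments applied before
    building the PLS-DA model."""
    out = list(spectrum)
    for _ in range(order):
        out = [b - a for a, b in zip(out, out[1:])]  # adjacent differences
    return out

def correct_classification_rate(labels_true, labels_pred):
    """Fraction of samples whose predicted class matches the culture result (CCR)."""
    hits = sum(t == p for t, p in zip(labels_true, labels_pred))
    return hits / len(labels_true)
```

In practice a smoothing derivative (e.g. Savitzky-Golay) is usually preferred over raw differencing because it suppresses noise amplification.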
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Geographic locations and numbers of hulled barley, naked barley, and wheat used in the experiments from each location. Sample groups marked with an asterisk (*) are control groups.</p>
Figure 2
<p>A schematic diagram of the near infrared reflectance spectroscopy (NIRS) measurement system for the discrimination of <span class="html-italic">Fusarium</span> contamination of cereals.</p>
Figure 3
<p>Bifurcated fiber-optic probe and light illumination in near infrared reflectance (NIR) spectrum acquisition for hulled barley, naked barley, and wheat.</p>
Figure 4
<p>Culture results for hulled barley (GB3-C) in the control group, showing that the <span class="html-italic">Fusarium</span> spore was not observed.</p>
Figure 5
<p>Culture results for hulled barley (GN9) in the experimental group, showing that the <span class="html-italic">Fusarium</span> spore was observed. The hulled barley samples that generated spores were marked by red squares.</p>
Figure 6
<p>Culture results for naked barley (GG2-C) in the control group, showing that the <span class="html-italic">Fusarium</span> spore was observed. The naked barley samples that generated spores were marked by red squares.</p>
Figure 7
<p>Culture results for naked barley (GN4) in the experimental group, showing that the <span class="html-italic">Fusarium</span> spore was observed. The naked barley samples that generated spores were marked by red squares.</p>
Figure 8
<p>Culture results for wheat (GW1) in the control group, showing that the <span class="html-italic">Fusarium</span> spore was not observed.</p>
Figure 9
<p>Culture results for wheat (GN2) in the control group, showing that the <span class="html-italic">Fusarium</span> spore was observed. The wheat samples that generated spores were marked by red squares.</p>
Figure 10
<p>All of the NIR reflectance spectra of hulled barley samples not contaminated with <span class="html-italic">Fusarium</span> (NCF) (<b>a</b>) and contaminated with <span class="html-italic">Fusarium</span> (CF) (<b>b</b>).</p>
Figure 11
<p>All NIR reflectance spectra of naked barley samples NCF (<b>a</b>) and CF (<b>b</b>).</p>
Figure 12
<p>All NIR reflectance spectra of wheat samples NCF (<b>a</b>) and CF (<b>b</b>).</p>
Figure 13
<p>Validation results for <span class="html-italic">Fusarium</span> discrimination of the PLS-DA model developed using raw reflectance spectra (<b>a</b>) and the third-order derivative pretreatment (<b>b</b>) obtained from hulled barley.</p>
Figure 14
<p>Validation results for <span class="html-italic">Fusarium</span> discrimination of the PLS-DA model developed using raw reflectance spectra (<b>a</b>) and the third-order derivative pretreatment (<b>b</b>) obtained from naked barley.</p>
Figure 15
<p>Validation results for the <span class="html-italic">Fusarium</span> discrimination of the PLS-DA model developed using raw reflectance spectra (<b>a</b>) and the second-order derivative pretreatment (<b>b</b>) obtained from wheat.</p>
27 pages, 16687 KiB  
Article
Combination of Multi-Agent Systems and Wireless Sensor Networks for the Monitoring of Cattle
by Alberto L. Barriuso, Gabriel Villarrubia González, Juan F. De Paz, Álvaro Lozano and Javier Bajo
Sensors 2018, 18(1), 108; https://doi.org/10.3390/s18010108 - 2 Jan 2018
Cited by 59 | Viewed by 11380
Abstract
Precision breeding techniques have been widely used to optimize expenses and increase livestock yields. Nevertheless, the joint use of heterogeneous sensors and artificial intelligence techniques for the simultaneous analysis or detection of the different problems that cattle may present has not been addressed. This study arises from the need for a technological tool that overcomes this limitation of the state of the art. As a novelty, this work presents a multi-agent architecture based on virtual organizations which allows a new embedded agent model to be deployed on computationally limited autonomous sensors, making use of the Platform for Automatic coNstruction of orGanizations of intElligent Agents (PANGEA). To validate the proposed platform, different studies were performed in which parameters specific to each animal were studied, such as physical activity, temperature, estrus cycle state and the moment at which the animal goes into labor. In addition, a set of applications that allow farmers to remotely monitor the livestock was developed. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Virtual organization of agents overview.</p>
Figure 2
<p>(<b>A</b>) Battery percentage consumed per hour and (<b>B</b>) messages sent per hour (results grouped by protocols and networking technologies).</p>
Figure 3
<p>Interaction performed between agents.</p>
Figure 4
<p>Signal behavior of the antennas used, Ti AM-V5 G-Ti (5 GHz): (<b>a</b>) horizontal azimuth and (<b>b</b>) vertical azimuth.</p>
Figure 5
<p>Altimetry and quality of the radio link.</p>
Figure 6
<p>Network architecture diagram.</p>
Figure 7
<p>Infrastructure of the proposed system.</p>
Figure 8
<p>(<b>A</b>) GPS sensor; (<b>B</b>) solar collar for GPS sensor; (<b>C</b>) food sensor (ultrasonic sensor and solar battery); (<b>D</b>) motion sensor; and (<b>E</b>) vaginal thermometer.</p>
Figure 9
<p>(<b>A</b>) Feeder; and (<b>B</b>) measuring system.</p>
Figure 10
<p>Aerial view of the farm, virtual enclosures.</p>
Figure 11
<p>Steps in the detection of labor.</p>
Figure 12
<p>Activity index of a cow.</p>
Figure 13
<p>Cow average temperature, grouped by breeding gender.</p>
Figure 14
<p>Cow average temperature, grouped by breeding gender.</p>
Figure 15
<p>Television application interface.</p>
Figure 16
<p>Web application interface.</p>
Figure 17
<p>(<b>A</b>) The application menu; (<b>B</b>) a heat alert for a particular animal, in which you can see the animal’s name, id, the date and a description; and (<b>C</b>) the location of the animals.</p>
Figure A1
<p>Design of the Vaginal Thermometer.</p>
Figure A2
<p>Electric Collar Scheme.</p>
4268 KiB  
Article
A Compound Sensor for Simultaneous Measurement of Packing Density and Moisture Content of Silage
by Delun Meng, Fanjia Meng, Wei Sun and Shuang Deng
Sensors 2018, 18(1), 73; https://doi.org/10.3390/s18010073 - 28 Dec 2017
Cited by 2 | Viewed by 4285
Abstract
Packing density and moisture content are important factors in investigating ensiling quality. Low packing density is a major cause of sugar content loss, and moisture content plays a determinant role in biomass degradation. To comprehensively evaluate ensiling quality, this study focused on developing a compound sensor in which moisture electrodes and strain gauges were embedded into an ASABE Standard small cone for the simultaneous measurement of the penetration resistance (PR) and moisture content (MC) of silage. To evaluate the performance of the designed sensor and the underlying theoretical analysis, calibration and validation tests were conducted. The determination coefficients were 0.996 and 0.992 for the PR calibration and 0.934 for the MC calibration. The validation indicated that this measurement technique can determine the packing density and moisture content of silage simultaneously while eliminating the influence of friction between the penetration shaft and the silage. In this study, we not only designed a compound sensor but also provided an alternative way to investigate ensiling quality, which should be useful for further silage research. Full article
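The calibration step described above amounts to fitting linear characteristics and reporting their determination coefficients. A generic ordinary-least-squares sketch (the function name is assumed; this is not the authors' code):

```python
def linear_fit(x, y):
    """Fit y = a*x + b by ordinary least squares and return (a, b, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot   # coefficient of determination
    return a, b, r2
```

Applied to, say, strain-gauge output versus applied force, the returned R^2 plays the role of the 0.996 and 0.992 determination coefficients quoted for the PR calibration.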
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Diagram of the compound sensor.</p>
Figure 2
<p>The probe of the compound sensor. (<b>a</b>) A photo of the probe; (<b>b</b>) Detailed structure of the probe.</p>
Figure 3
<p>Diagram of the Wheatstone bridge circuit.</p>
Figure 4
<p>Schematic diagram of the compound sensor for moisture content measurement.</p>
Figure 5
<p>A photo of the force-testing machine.</p>
Figure 6
<p>The force calibration results.</p>
Figure 7
<p>The moisture content calibration results.</p>
Figure 8
<p>Diagram of the validation system.</p>
Figure 9
<p>Dynamic measurement results of the compound sensor for Cylinder 1 to Cylinder 4.</p>
Figure 10
<p>Average value of penetration resistance of silages with two different moisture contents.</p>
Figure 11
<p>Average value of penetration resistance and volumetric moisture content of silages with the same dry matter density.</p>
3247 KiB  
Article
Multisensor Capacitance Probes for Simultaneously Monitoring Rice Field Soil-Water-Crop-Ambient Conditions
by James Brinkhoff, John Hornbuckle and Thomas Dowling
Sensors 2018, 18(1), 53; https://doi.org/10.3390/s18010053 - 26 Dec 2017
Cited by 9 | Viewed by 9135
Abstract
Multisensor capacitance probes (MCPs) have traditionally been used for soil moisture monitoring and irrigation scheduling. This paper presents a new application of these probes: the simultaneous monitoring of ponded water level, soil moisture, and temperature profile, conditions which are particularly important for rice crops in temperate growing regions and for rice grown with prolonged periods of drying. WiFi-based loggers are used to concurrently collect the data from the MCPs and from ultrasonic distance sensors (which give an independent reading of water depth). Models are fit to MCP water depth versus volumetric water content (VWC) characteristics from laboratory measurements, probe-to-probe variability is assessed, and the methodology is verified using measurements from a rice field throughout a growing season. The root-mean-squared error of the water depth calculated from MCP VWC over the rice growing season was 6.6 mm. The MCPs are used to simultaneously monitor ponded water depth, soil moisture content once the ponded water is drained, and temperatures in the root, water, crop and ambient zones. The insulating effect of ponded water against cold-temperature events is demonstrated at low and high water levels. The developed approach offers the advantage of capturing the full soil-plant-atmosphere continuum with a single robust sensor. Full article
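To make the depth-from-VWC idea concrete: a vertical stack of sensors at known spacing can be converted to a water depth by counting fully submerged sensors and interpolating the partially covered one. This is only an illustrative simplification (the sensor spacing, saturation threshold and linear interpolation are our assumptions); the paper instead fits linear and tangent models to measured depth-VWC characteristics:

```python
def water_depth_from_vwc(vwc_norm, sensor_spacing_mm=100.0):
    """Estimate ponded water depth (mm) from a stack of normalized VWC
    readings, ordered bottom sensor first. Fully submerged sensors read
    ~1.0; the first partially covered sensor is interpolated linearly."""
    depth = 0.0
    for v in vwc_norm:
        if v >= 0.99:                                  # sensor fully under water
            depth += sensor_spacing_mm
        else:
            depth += max(v, 0.0) * sensor_spacing_mm   # partial coverage
            break
    return depth
```

For example, two saturated sensors plus one reading 0.5 would give 250 mm at a 100 mm spacing.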
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Diagram of a multiple-sensor capacitive soil moisture probe. The probe contains <math display="inline"> <semantics> <mrow> <mi>N</mi> <mo>−</mo> <mn>1</mn> </mrow> </semantics> </math> sensors, <span class="html-italic">d</span> is the total water depth in mm, <math display="inline"> <semantics> <msub> <mi>d</mi> <mi>n</mi> </msub> </semantics> </math> is the water depth relative to the bottom of the <span class="html-italic">n</span>-th sensor and <math display="inline"> <semantics> <msub> <mi>n</mi> <mi>s</mi> </msub> </semantics> </math> is the sensor sitting above soil level. Each of the sensors returns volumetric water content (VWC) (%), temperature (°C) and conductivity (dS/m) data.</p>
Figure 2
<p>Photograph of a WiField data logger installed in a rice field with ultrasonic water level sensor, temperature sensors and MCP attached.</p>
Figure 3
<p>Water depth vs probe normalized VWC (<math display="inline"> <semantics> <msub> <mi>θ</mi> <msub> <mi>n</mi> <mi>norm</mi> </msub> </msub> </semantics> </math>). Measured points from each sensor of the MCP are shown as red points, and the mean over all these sensors as a solid red line. Other lines indicate the water depth calculated from <math display="inline"> <semantics> <msub> <mi>θ</mi> <msub> <mi>n</mi> <mi>norm</mi> </msub> </msub> </semantics> </math> using various models.</p>
Figure 4
<p>Water depth using the three models processing the characterized MCP’s VWC data, with actual water depth measured by the ultrasonic sensor on the x-axis. The bottom graph shows the error between the actual (ultrasonic) and modeled (MCP) depth.</p>
Figure 5
<p>The error between actual (using the ultrasonic sensor) and calculated (using the VWC data from the MCP processed using the tangent model) water depth from 3 additional independent MCPs. Probes 1 and 2 are 12-sensor probes, and probe 3 is an 8-sensor probe.</p>
Figure 6
<p>Measurements from a rice field over a growing season. The top graph shows the water depth from the ultrasonic sensor and from the MCP probe using the linear and tangent models. The middle graph shows the difference between the ultrasonic depth and depths calculated from the MCP readings. The bottom graph shows the soil moisture.</p>
Figure 7
<p>Regression plot of the modeled vs actual water depths from the rice field. The <math display="inline"> <semantics> <msup> <mi>R</mi> <mn>2</mn> </msup> </semantics> </math> of the linear model is 0.985, and that of the tangent model is 0.991.</p>
Figure 8
<p>The temperature profile measured by the MCP during cold temperature events (20 February 2017 and 12 April 2017 6AM). The measured water depth is indicated with blue shading.</p>
8694 KiB  
Article
A Low-Cost Approach to Automatically Obtain Accurate 3D Models of Woody Crops
by José M. Bengochea-Guevara, Dionisio Andújar, Francisco L. Sanchez-Sardana, Karla Cantuña and Angela Ribeiro
Sensors 2018, 18(1), 30; https://doi.org/10.3390/s18010030 - 24 Dec 2017
Cited by 12 | Viewed by 4556
Abstract
Crop monitoring is an essential practice within the field of precision agriculture since it is based on observing, measuring and properly responding to inter- and intra-field variability. In particular, “on ground crop inspection” potentially allows early detection of certain crop problems or precision treatment to be carried out simultaneously with pest detection. “On ground monitoring” is also of great interest for woody crops. This paper explores the development of a low-cost crop monitoring system that can automatically create accurate 3D models (clouds of coloured points) of woody crop rows. The system consists of a mobile platform that allows the easy acquisition of information in the field at an average speed of 3 km/h. The platform, among others, integrates an RGB-D sensor that provides RGB information as well as an array with the distances to the objects closest to the sensor. The RGB-D information plus the geographical positions of relevant points, such as the starting and the ending points of the row, allow the generation of a 3D reconstruction of a woody crop row in which all the points of the cloud have a geographical location as well as the RGB colour values. The proposed approach for the automatic 3D reconstruction is not limited by the size of the sampled space and includes a method for the removal of the drift that appears in the reconstruction of large crop rows. Full article
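The drift-removal method mentioned above splits the long row reconstruction into sections and rotates each section back into alignment with a reference axis. The geometric core of such a correction, rotating a section's points about a pivot, can be sketched as follows (2D only; the function name and interface are our assumptions, not the authors' implementation):

```python
import math

def rotate_section(points, pivot, angle_rad):
    """Rotate a section of the point cloud about a pivot in the XY plane,
    as in a per-section drift-correction step (sketch, 2D only)."""
    px, py = pivot
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    out = []
    for x, y in points:
        dx, dy = x - px, y - py          # coordinates relative to the pivot
        out.append((px + dx * cos_a - dy * sin_a,
                    py + dx * sin_a + dy * cos_a))
    return out
```

In a full pipeline, the rotation angle for each section would be chosen so that the section's axis aligns with the reference axis of the row (cf. the per-section angles reported in Figure 18).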
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Field platform and on-board equipment.</p>
Figure 2
<p>3D reconstruction of a woody crop row in which the drift problem is observed.</p>
Figure 3
<p>Section of the 3D reconstruction of a vineyard row: before filtering (<b>a</b>); and after filtering (<b>c</b>). The removed points are marked in fluorescent green in image (<b>b</b>).</p>
Figure 4
<p>Part of the model line of the 3D reconstruction of <a href="#sensors-18-00030-f002" class="html-fig">Figure 2</a>.</p>
Figure 5
<p>Yellow circles represent the endpoints of the tree row added to the model line of <a href="#sensors-18-00030-f004" class="html-fig">Figure 4</a>.</p>
Figure 6
<p>Part of the 3D reconstruction shown in <a href="#sensors-18-00030-f002" class="html-fig">Figure 2</a> split into sections.</p>
Figure 7
<p>The coordinate system defined in yellow.</p>
Figure 8
<p>The X-axis of the section (Xsection) is represented in red, and the X-axis of the reference (Xreference) is represented in yellow.</p>
Figure 9
<p>3D reconstruction of a tree row with drift (top) and the result after the drift correction procedure is applied (bottom).</p>
Figure 10
<p>Detail of the sampling in a vineyard using the field platform.</p>
Figure 11
<p>(<b>a</b>,<b>c</b>) RGB images supplied by the Kinect v2 sensor; and (<b>b</b>,<b>d</b>) examples of the depth images supplied by the Kinect v2 sensor at the same time as images (<b>a</b>,<b>c</b>) were obtained.</p>
Figure 12
<p>3D reconstruction of a row of vines.</p>
Figure 13
<p>Details of a 3D reconstruction of a vine that shows the triangular mesh obtained from a point cloud.</p>
Figure 14
<p>Condition of the vineyards used in the experiments conducted in: (<b>a</b>) February 2016; (<b>b</b>) May 2016; and (<b>c</b>) July 2016.</p>
Figure 15
<p>The same view of the 3D reconstruction of a vineyard with the information acquired in: (<b>a</b>) February 2016; (<b>b</b>) May 2016; and (<b>c</b>) July 2016.</p>
Figure 16
<p>Examples of 3D reconstructions that exhibit drift. Sampling performed in: (<b>a</b>) February 2016; (<b>b</b>) May 2016; and (<b>c</b>) July 2016.</p>
Figure 17
<p>(<b>a</b>–<b>c</b>) 3D reconstruction of vineyards in <a href="#sensors-18-00030-f016" class="html-fig">Figure 16</a> after the drift has been removed.</p>
Figure 18
<p>Rotation angles calculated by the proposed approach to correct each section on the vineyard rows of: (<b>a</b>) 85 m length; and (<b>b</b>) 105 m length. Sampling performed during February, May and July 2016.</p>
19384 KiB  
Article
Estimation of the Botanical Composition of Clover-Grass Leys from RGB Images Using Data Simulation and Fully Convolutional Neural Networks
by Søren Skovsen, Mads Dyrmann, Anders Krogh Mortensen, Kim Arild Steen, Ole Green, Jørgen Eriksen, René Gislum, Rasmus Nyholm Jørgensen and Henrik Karstoft
Sensors 2017, 17(12), 2930; https://doi.org/10.3390/s17122930 - 17 Dec 2017
Cited by 37 | Viewed by 8781
Abstract
Optimal fertilization of clover-grass fields relies on knowledge of the clover and grass fractions. This study shows how this knowledge can be obtained by automatically analyzing images collected in the fields. A fully convolutional neural network was trained to create a pixel-wise classification of clover, grass, and weeds in red, green, and blue (RGB) images of clover-grass mixtures. The estimated clover fractions of the dry matter from the images were found to be highly correlated with the real clover fractions of the dry matter, making this a cheap and non-destructive way of monitoring clover-grass fields. The network was trained solely on simulated top-down images of clover-grass fields, which enables it to distinguish clover, grass, and weed pixels in real images. The use of simulated images for training reduces the manual labor to a few hours, compared to more than 3000 h when all the real images are annotated for training. The network was tested on images with varied clover/grass ratios and achieved an overall pixel classification accuracy of 83.4%, while estimating the dry matter clover fraction with a standard deviation of 7.8%. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
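The pixel-wise classification described in the abstract yields a "visual" clover fraction by counting class pixels in the segmented image. A minimal sketch of that counting step (the class codes and the toy label image are hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical class codes for a pixel-wise segmentation map
GRASS, CLOVER, WEED, UNKNOWN = 0, 1, 2, 3

def visual_clover_fraction(label_img: np.ndarray) -> float:
    """Fraction of identified vegetation pixels classified as clover;
    unidentified pixels are excluded from the denominator."""
    clover = np.count_nonzero(label_img == CLOVER)
    grass = np.count_nonzero(label_img == GRASS)
    weed = np.count_nonzero(label_img == WEED)
    total = clover + grass + weed
    return clover / total if total else 0.0

# Toy 2x3 label image: 2 clover pixels out of 5 identified ones
labels = np.array([[CLOVER, GRASS, WEED],
                   [CLOVER, GRASS, UNKNOWN]])
frac = visual_clover_fraction(labels)  # 0.4
```

In the paper, this visual fraction is then related to the dry-matter clover fraction through a linear regression fitted on harvested samples.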
Show Figures
Figure 1
<p>Example of the image analysis on a real image. Each pixel in the image is automatically analyzed and classified as either grass (blue), clover (red), weeds (yellow), or unidentified (black overlay). (<b>a</b>) Example input image; (<b>b</b>) Automatically analyzed image.</p>
Full article ">Figure 2
<p>Overview of the variation in the 179 sample pairs sorted by the age of the swards. (<b>a</b>) Clover content in harvested material; (<b>b</b>) The total dry matter yield of harvested samples. The year axis is discrete, but points are spread out to avoid occlusion of samples.</p>
Full article ">Figure 3
<p>Depiction of the camera setup used for the sample acquisition and visualization of the gamma correction used to reduce the impact of shadow regions. (<b>a</b>) The pushcart with the camera system mounted on top; (<b>b</b>) Sample of acquired image; (<b>c</b>) Crop of an original captured image to illustrate the level of details; (<b>d</b>) Gamma-corrected version of (<b>c</b>) with a more uniform lighting and increased visibility.</p>
Full article ">Figure 3 Cont.
<p>Depiction of the camera setup used for the sample acquisition and visualization of the gamma correction used to reduce the impact of shadow regions. (<b>a</b>) The pushcart with the camera system mounted on top; (<b>b</b>) Sample of acquired image; (<b>c</b>) Crop of an original captured image to illustrate the level of details; (<b>d</b>) Gamma-corrected version of (<b>c</b>) with a more uniform lighting and increased visibility.</p>
Full article ">Figure 4
<p>Visualization of annotated image samples. Top row shows four samples with different clover contents. Bottom row shows crops of the images from the top row to illustrate the image resolution. The black pixels surrounding the image samples mark the border of the harvested samples. When the orientation of the camera does not align perfectly with the harvested patch, the image sample appears to be rotated. (<b>a1</b>) 5% clover; (<b>b1</b>) 20% clover; (<b>c1</b>) 50% clover; (<b>d1</b>) 70% clover; (<b>a2</b>) Annotated image patch of (a1); (<b>b2</b>) Annotated image patch of (b1); (<b>c2</b>) Annotated image patch of (c1); (<b>d2</b>) Annotated image patch of (d1).</p>
Full article ">Figure 4 Cont.
<p>Visualization of annotated image samples. Top row shows four samples with different clover contents. Bottom row shows crops of the images from the top row to illustrate the image resolution. The black pixels surrounding the image samples mark the border of the harvested samples. When the orientation of the camera does not align perfectly with the harvested patch, the image sample appears to be rotated. (<b>a1</b>) 5% clover; (<b>b1</b>) 20% clover; (<b>c1</b>) 50% clover; (<b>d1</b>) 70% clover; (<b>a2</b>) Annotated image patch of (a1); (<b>b2</b>) Annotated image patch of (b1); (<b>c2</b>) Annotated image patch of (c1); (<b>d2</b>) Annotated image patch of (d1).</p>
Full article ">Figure 5
<p>Samples of single plants that are used for simulating images. (<b>a</b>) clover; (<b>b</b>) dandelion; (<b>c</b>) thistle; and (<b>d</b>) grasses.</p>
Full article ">Figure 6
<p>Simulated training data pair. (<b>a</b>) Simulated red, green, and blue (RGB) image; (<b>b</b>) Corresponding label image. Grass, clover and soil pixels in the RGB image are denoted by blue, red and black pixels in the label image, respectively.</p>
Full article ">Figure 7
<p>The fully convolutional network with a output stride of 8 (FCN-8s) architecture. The network consists of 15 convolutional layers and 5 max pooling layers. The outputs from pooling layers three and four were routed through deconvolution layers to help restore smaller details in the segmented images. (This figure is redrawn from Long et al. [<a href="#B17-sensors-17-02930" class="html-bibr">17</a>]).</p>
Full article ">Figure 8
<p>Comparison between four ground truth labels of the hand-annotated images, our implementation of the approach by Bonesmo et al. [<a href="#B11-sensors-17-02930" class="html-bibr">11</a>], and the two instances of convolutional neural network (CNN)-driven segmentation. The areas in blue represent grass, those in red represent clover, those in yellow represent weeds, those in black represent soil, and those in white represent areas that could not be distinguished in the manual annotation.</p>
Full article ">Figure 8 Cont.
<p>Comparison between four ground truth labels of the hand-annotated images, our implementation of the approach by Bonesmo et al. [<a href="#B11-sensors-17-02930" class="html-bibr">11</a>], and the two instances of convolutional neural network (CNN)-driven segmentation. The areas in blue represent grass, those in red represent clover, those in yellow represent weeds, those in black represent soil, and those in white represent areas that could not be distinguished in the manual annotation.</p>
Full article ">Figure 9
<p>Visualization of CNN-driven semantic segmentation of highlighted sample pairs from <a href="#sensors-17-02930-f010" class="html-fig">Figure 10</a>. The first column shows the evaluated image. The second column shows the thresholded identification of grass (blue), clover (red), and weeds (yellow) in the image. (<b>a1</b>) Low clover fraction; (<b>a2</b>) Semantic segmentation of (a1); (<b>b1</b>) Medium clover fraction; (<b>b2</b>) Semantic segmentation of (b1); (<b>c1</b>) High clover fraction; (<b>c2</b>) Semantic segmentation of (c1).</p>
Full article ">Figure 10
<p>Comparison and linear regression of the visual clover ratio determined by the CNN and the actual clover ratio in the harvested dry matter at the primary field site. The linear relationship between the visual and dry matter fraction of clover in the sample pairs is clear. The annotations <span class="html-italic">a</span>, <span class="html-italic">b</span>, and <span class="html-italic">c</span> refer to the three image samples shown in <a href="#sensors-17-02930-f009" class="html-fig">Figure 9</a>.</p>
Full article ">Figure 11
<p>Comparison of the visual clover ratio and the dry matter clover ratio of the validation field site. The linear relationship is less well defined, with a noticeable offset, possibly due to a less complete segmentation of clovers in the image samples.</p>
Full article ">
2500 KiB  
Article
In Vivo Non-Destructive Monitoring of Capsicum Annuum Seed Growth with Diverse NaCl Concentrations Using Optical Detection Technique
by Naresh Kumar Ravichandran, Ruchire Eranga Wijesinghe, Muhammad Faizan Shirazi, Jeehyun Kim, Hee-Young Jung, Mansik Jeon and Seung-Yeol Lee
Sensors 2017, 17(12), 2887; https://doi.org/10.3390/s17122887 - 12 Dec 2017
Cited by 7 | Viewed by 4387
Abstract
We demonstrate that optical coherence tomography (OCT) is a plausible optical tool for in vivo detection of plant seeds and their morphological changes during growth. To investigate the direct impact of salt stress on seed germination, the experiment was conducted using Capsicum annuum seeds that were treated with different molar concentrations of NaCl. To determine the optimal concentration for seed growth, the seeds were monitored for nine consecutive days. In vivo two-dimensional OCT images of the treated seeds were obtained and compared with the images of seeds that were grown using sterile distilled water. The obtained results confirm the feasibility of using OCT for the proposed application. Normalized depth profile analysis was utilized to support the conclusions. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
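The normalized depth profile analysis mentioned in the abstract amounts to averaging the OCT intensity across the lateral axis of a B-scan and rescaling the resulting depth profile. A minimal sketch under that reading (the toy B-scan is illustrative, not measured data):

```python
import numpy as np

def normalized_depth_profile(bscan: np.ndarray) -> np.ndarray:
    """Average a 2D OCT B-scan (rows = depth, columns = lateral position)
    across the lateral axis and min-max normalize the depth profile."""
    profile = bscan.mean(axis=1)
    lo, hi = profile.min(), profile.max()
    if hi == lo:  # flat profile: avoid division by zero
        return np.zeros_like(profile)
    return (profile - lo) / (hi - lo)

# Toy B-scan whose intensity decays with depth
bscan = np.array([[4.0, 4.0],
                  [2.0, 2.0],
                  [0.0, 0.0]])
prof = normalized_depth_profile(bscan)  # [1.0, 0.5, 0.0]
```

Comparing such profiles across treatment groups (SDW vs. NaCl concentrations) then reduces to comparing curves on a common [0, 1] scale.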
Show Figures
Figure 1
<p>Schematic of the SS-OCT setup. (<b>A</b>) Schematic of the SS-OCT system. (<b>B</b>) Photograph of a sample <span class="html-italic">Capsicum annuum</span> seed. The dashed line represents the position of the OCT scanning beam.</p>
Full article ">Figure 2
<p>A 2D SS-OCT image and a histological image. (<b>A</b>) A 2D-SSOCT image of a seed primed with sterile distilled water for one day. (<b>B</b>) An enlarged histology image of the same seed that was used in (<b>A</b>). C: Cotyledon; I: Inner seed coat; N: Non-micropylar endosperm; T: Testa. Dashed red circle shows the enclosed cotyledon regions.</p>
Full article ">Figure 3
<p>Comparative growth analysis by 2D SS-OCT of <span class="html-italic">Capsicum annuum</span> seeds primed with different salt solutions. The images were taken on consecutive days for seeds that were soaked in sterile distilled water (SDW) (<b>A1</b>–<b>A5</b>), 0.1 M NaCl solution (<b>B1</b>–<b>B5</b>), 0.2 M NaCl solution (<b>C1</b>–<b>C5</b>), 0.3 M NaCl solution (<b>D1</b>–<b>D5</b>), and 0.4 M NaCl solution (<b>E1</b>–<b>E5</b>). C: Cotyledon; EM: embryo; I: Inner seed coat; T: Testa. The scale bar of 400 µm applies to all 2D images.</p>
Full article ">Figure 4
<p>Statistical analysis of 2D OCT images using depth profile analysis. (<b>A</b>) The recorded average seed weight fluctuations during monitoring period. (<b>B</b>,<b>C</b>) Representative 2D-OCT images marked with embryo region in a seed. (<b>D</b>) The measured average embryo thickness using depth profile analysis of all seed groups during the monitoring period.</p>
Full article ">Figure 5
<p>SS-OCT images acquired on Day 3 after priming the <span class="html-italic">Capsicum annuum</span> seeds with different solutions. The bottom figure is the respective averaged and normalized depth profile analysis plot (as discussed in <a href="#sec2dot4-sensors-17-02887" class="html-sec">Section 2.4</a>) for SDW, 0.1 M NaCl, 0.2 M NaCl, 0.3 M NaCl, and 0.4 M NaCl, which are correspondingly shown with colors of red, blue, green, black, and magenta plots.</p>
Full article ">Figure 6
<p>SS-OCT images on Day 9 after priming the <span class="html-italic">Capsicum annuum</span> seeds with different solutions. Bottom figure is the respective averaged and normalized depth profile analysis plot (as discussed in <a href="#sec2dot4-sensors-17-02887" class="html-sec">Section 2.4</a>) for SDW, 0.1 M NaCl, 0.2 M NaCl, 0.3 M NaCl, and 0.4 M NaCl, which are correspondingly shown with colors of red, blue, green, black, and magenta plots.</p>
Full article ">
5294 KiB  
Article
Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring
by Yingchi Mao, Hai Qi, Ping Ping and Xiaofang Li
Sensors 2017, 17(12), 2806; https://doi.org/10.3390/s17122806 - 4 Dec 2017
Cited by 7 | Viewed by 4625
Abstract
Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection and warning of contamination events to prevent pollution from spreading is one of the most important issues when pollution occurs. In order to comprehensively reduce the event detection deviation, a spatial–temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. The first part is that M-STED adopts a Rule K algorithm to select backbone nodes that form a connected dominating set (CDS) and forward the sensed data of multiple water parameters. The second part is to determine the state of each backbone node with back propagation neural network models and sequential Bayesian analysis in the current timestamp. The third part is to establish a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and trace the “outlier” node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED, while the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with event detection with a single water parameter algorithm, S-STED. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
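The sequential Bayesian analysis in the second part can be pictured as a recursive update of the event probability from per-timestamp "outlier" observations. A minimal sketch of that recursion; the likelihood values are illustrative placeholders, not the paper's fitted parameters:

```python
def bayes_update(prior: float, outlier: bool,
                 p_out_event: float = 0.9,
                 p_out_normal: float = 0.1) -> float:
    """One sequential Bayes step: posterior probability of a
    contamination event given whether the current fused reading
    was flagged as an outlier."""
    like_event = p_out_event if outlier else 1.0 - p_out_event
    like_normal = p_out_normal if outlier else 1.0 - p_out_normal
    evidence = like_event * prior + like_normal * (1.0 - prior)
    return like_event * prior / evidence

p = 0.05  # small prior probability of an event
for flag in (True, True, True):  # three consecutive outlier timestamps
    p = bayes_update(p, flag)
# a short run of outliers drives the event probability close to 1
```

A single noisy spike barely moves the posterior, while a sustained sequence of outliers quickly pushes it past a detection threshold, which is the usual rationale for sequential analysis over per-sample thresholding.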
Show Figures
Figure 1
<p>The topology of a water supply network.</p>
Full article ">Figure 2
<p>Procedure in the off-line phase.</p>
Full article ">Figure 3
<p>Procedure in the on-line phase.</p>
Full article ">Figure 4
<p>BP (Back Propagation) neural network structure of free chlorine.</p>
Full article ">Figure 5
<p>Event probability of single water quality parameter. (<b>a</b>) free Chlorine; (<b>b</b>) EC; (<b>c</b>) pH; (<b>d</b>) Temperature; (<b>e</b>) TOC; (<b>f</b>) turbidity.</p>
Full article ">Figure 6
<p>Event probability of multiple water quality parameters.</p>
Full article ">Figure 7
<p>State transitions of the fused values of six water quality parameters at the moment <math display="inline"> <semantics> <mrow> <msub> <mi>τ</mi> <mn>1</mn> </msub> </mrow> </semantics> </math>.</p>
Full article ">Figure 8
<p>Time series of multivariate water quality parameters. (<b>a</b>) Free Chlorine; (<b>b</b>) EC; (<b>c</b>) pH; (<b>d</b>) Temperature; (<b>e</b>) TOC; (<b>f</b>) turbidity.</p>
Full article ">Figure 8 Cont.
<p>Time series of multivariate water quality parameters. (<b>a</b>) Free Chlorine; (<b>b</b>) EC; (<b>c</b>) pH; (<b>d</b>) Temperature; (<b>e</b>) TOC; (<b>f</b>) turbidity.</p>
Full article ">Figure 9
<p>Comparison of detection rates between M-STED and S-STED.</p>
Full article ">Figure 10
<p>Comparison of false alarm rates between M-STED and S-STED.</p>
Full article ">Figure 11
<p>Comparison of ROC (receiver operating curve) between M-STED and S-STED.</p>
Full article ">Figure 12
<p>The average delay time in the different node densities.</p>
Full article ">Figure 13
<p>Scalability with an increased number of nodes.</p>
Full article ">
5223 KiB  
Article
A Combined Approach of Sensor Data Fusion and Multivariate Geostatistics for Delineation of Homogeneous Zones in an Agricultural Field
by Annamaria Castrignanò, Gabriele Buttafuoco, Ruggiero Quarto, Carolina Vitti, Giuliano Langella, Fabio Terribile and Accursio Venezia
Sensors 2017, 17(12), 2794; https://doi.org/10.3390/s17122794 - 3 Dec 2017
Cited by 52 | Viewed by 5988
Abstract
To assess spatial variability at the very fine scale required by Precision Agriculture, different proximal and remote sensors have been used. They provide large amounts and different types of data which need to be combined. An integrated approach, using multivariate geostatistical data-fusion techniques and multi-source geophysical sensor data to determine simple summary scale-dependent indices, is described here. These indices can be used to delineate management zones to be submitted to differential management. Such a data fusion approach with geophysical sensors was applied in the soil of an agronomic field cropped with tomato. The synthetic regionalized factors that were determined contributed to splitting the 3D edaphic environment into two main horizontal structures with different hydraulic properties and to disclosing two main horizons in the 0–1.0-m depth, with a discontinuity probably occurring between 0.40 m and 0.70 m. Comparing this partition with the soil properties measured with a shallow sampling, it was possible to verify the coherence in the topsoil between the dielectric properties and other properties more directly related to agronomic management. These results confirm the advantages of using proximal sensing as a preliminary step in the application of site-specific management. Combining disparate spatial data (data fusion) is by no means a trivial problem, and novel and powerful methods need to be developed. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
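The multivariate geostatistical workflow behind this paper starts from experimental variograms of the sensor variables (cf. Figure 2 of the article). A minimal sketch of computing an empirical semivariogram, the building block that the coregionalization model is then fitted to; the transect data here are toy values, not the field measurements:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol=0.5):
    """Empirical semivariogram gamma(h) = 0.5 * mean((z_i - z_j)^2)
    over point pairs whose separation distance falls within tol of
    each lag h. A basic sketch, not the paper's fitted model."""
    coords = np.atleast_2d(np.asarray(coords, dtype=float))
    values = np.asarray(values, dtype=float)
    # pairwise distances between sample locations
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)  # count each pair once
    dist = d[iu]
    sqdiff = (values[iu[0]] - values[iu[1]]) ** 2
    return np.array([0.5 * sqdiff[np.abs(dist - h) <= tol].mean()
                     for h in lags])

# Toy 1D transect with a linear trend: variance grows with lag
coords = [[0.0], [1.0], [2.0], [3.0]]
z = [0.0, 1.0, 2.0, 3.0]
gamma = empirical_semivariogram(coords, z, lags=[1.0, 2.0], tol=0.1)
# gamma[0] = 0.5, gamma[1] = 2.0
```

Fitting a linear model of coregionalization to the full set of auto- and cross-variograms is what then yields the scale-dependent regionalized factors used to delineate the zones.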
Show Figures
Figure 1
<p>Locations of the experimental field (<b>a</b>) and geophysical surveys (<b>b</b>). The dotted lines are the 24 transects where the geophysical surveys were carried out.</p>
Full article ">Figure 2
<p>Examples of point (<b>a</b>) and regularized (<b>b</b>) auto- and cross-variograms of the Gaussian variables of EC<sub>a(- Hz)</sub> (apparent electrical conductivities measured by the GEM Profiler), EC<sub>a H</sub> and EC<sub>a V</sub> (apparent electrical conductivities measured by the EM38DD in horizontal and vertical polarization) and GPR signal amplitude at different depth slices. The experimental values are the plotted points and the solid lines are the model of coregionalization. The dash-dotted lines are the hulls of perfect correlation and the dashed lines are the experimental variances.</p>
Full article ">Figure 3
<p>Maps of estimates of EC<sub>a</sub> (mS·m<sup>−1</sup>) at the selected frequencies (<b>a–f</b>). Color scale uses iso-frequency classes.</p>
Full article ">Figure 4
<p>Maps of estimates of EC<sub>a</sub> (mS·m<sup>−1</sup>) in the (<b>a</b>) horizontal and (<b>b</b>) vertical orientations. Color scale uses iso-frequency classes.</p>
Full article ">Figure 5
<p>Maps of horizontal sections of estimated GPR signal amplitude envelope. Only a selection of the horizontal sections down to 1 m depth (<b>a–l</b>) is reported. Color scale uses iso-frequency classes.</p>
Full article ">Figure 6
<p>Maps of the first (<b>a</b>) and second (<b>b</b>) regionalized factors from sensor data fusion.</p>
Full article ">Figure 7
<p>Map of the 33.8-scale factor of soil properties.</p>
Full article ">Figure 8
<p>Cokriged maps of total carbonate (<b>a</b>) and coarse sand (<b>b</b>).</p>
Full article ">Figure 9
<p>Cokriged maps of wilting point (<b>a</b>) and field capacity (<b>b</b>).</p>
Full article ">
3091 KiB  
Article
Development of Spectral Disease Indices for ‘Flavescence Dorée’ Grapevine Disease Identification
by Hania AL-Saddik, Jean-Claude Simon and Frederic Cointault
Sensors 2017, 17(12), 2772; https://doi.org/10.3390/s17122772 - 29 Nov 2017
Cited by 53 | Viewed by 7247
Abstract
Spectral measurements are employed in many precision agriculture applications, due to their ability to monitor the vegetation’s health state. Spectral vegetation indices are one of the main techniques currently used in remote sensing activities, since they are related to biophysical and biochemical crop variables. Moreover, they have been evaluated in some studies as potentially beneficial for detecting or differentiating crop diseases. Flavescence Dorée (FD) is an infectious, incurable disease of the grapevine that can produce severe yield losses and, hence, compromise the stability of the vineyards. The aim of this study was to develop specific spectral disease indices (SDIs) for the detection of FD disease in grapevines. Spectral signatures of healthy and diseased grapevine leaves were measured with a non-imaging spectro-radiometer at two infection severity levels. The most discriminating wavelengths were selected by a genetic algorithm (GA) feature selection tool, and the SDIs were designed by exhaustively testing all possible combinations of the chosen wavelengths. The best weighted combination of a single wavelength and a normalized difference was chosen to create each index. The SDIs were tested for their ability to differentiate healthy from diseased vine leaves and compared to a common set of spectral vegetation indices (SVIs). It was demonstrated that using vegetation indices was, in general, better than using the complete spectral data, and that SDIs specifically designed for FD performed better than traditional SVIs in most cases. The precision of the classification is higher than 90%. This study demonstrates that SDIs have the potential to improve disease detection, identification and monitoring in precision agriculture applications. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
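The index form described in the abstract, a weighted combination of a single-wavelength reflectance and a normalized difference of two other wavelengths, can be sketched directly. The wavelengths, weights, and toy reflectances below are placeholders, not the published FD-specific values:

```python
def spectral_disease_index(refl: dict, lam_single: float,
                           lam_a: float, lam_b: float,
                           w1: float = 0.5, w2: float = 0.5) -> float:
    """SDI = w1 * R(lam_single) + w2 * ND(lam_a, lam_b), where
    ND is the normalized difference of two reflectances."""
    nd = (refl[lam_a] - refl[lam_b]) / (refl[lam_a] + refl[lam_b])
    return w1 * refl[lam_single] + w2 * nd

# Toy reflectance spectrum keyed by wavelength in nm (illustrative only)
refl = {700: 0.30, 550: 0.20, 680: 0.10}
sdi = spectral_disease_index(refl, lam_single=700, lam_a=550, lam_b=680)
```

In the study, the GA picks the candidate wavelengths, and all (single wavelength, wavelength pair, weight) combinations are scored exhaustively; the combination separating healthy from infected leaves best becomes the SDI.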
Show Figures
Figure 1
<p>Some symptoms of FD on leaves: a red discoloration on a red grapevine variety (<b>a</b>) and a yellow discoloration on a white grapevine variety (<b>b</b>); rolling of the leaves can also be observed.</p>
Full article ">Figure 2
<p>Vineyard distribution and S. titanus presence in France (modified from [<a href="#B24-sensors-17-02772" class="html-bibr">24</a>]).</p>
Full article ">Figure 3
<p>Locations of the measurements on a sample leaf.</p>
Full article ">Figure 4
<p>Systematical approach and development of SDIs from hyperspectral reflectance data.</p>
Full article ">Figure 5
<p>Noise removal and resolution reduction of spectral data.</p>
Full article ">Figure 6
<p>GA-Based Feature Selection.</p>
Full article ">Figure 7
<p>Example of feature averaging.</p>
Full article ">Figure 8
<p>Configuration of analyzed data, infested against healthy groups for a binary classification.</p>
Full article ">Figure 9
<p>Reflectance of infected (red) and healthy (green) leaves for Marselan (<b>a</b>) and Chardonnay grapevines (<b>b</b>).</p>
Full article ">
4728 KiB  
Article
Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment
by Wojciech Wojnowski, Tomasz Majchrzak, Tomasz Dymerski, Jacek Gębicki and Jacek Namieśnik
Sensors 2017, 17(12), 2715; https://doi.org/10.3390/s17122715 - 24 Nov 2017
Cited by 120 | Viewed by 10039
Abstract
The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
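The classification step reported in the abstract (SVM on the responses of a chemical sensor array) can be sketched with scikit-learn, assuming it is available; the synthetic "fresh" vs. "spoiled" readings below stand in for the real electrochemical sensor data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy responses of an 8-sensor array for two sample classes
fresh = rng.normal(0.2, 0.05, size=(30, 8))
spoiled = rng.normal(0.8, 0.05, size=(30, 8))
X = np.vstack([fresh, spoiled])
y = np.array([0] * 30 + [1] * 30)

# Standardize each sensor channel, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
acc = clf.score(X, y)  # training accuracy on well-separated toy data
```

In practice the feature vectors would be derived from the sensor response curves (e.g., steady-state or peak values per sensor), and accuracy would be assessed with held-out samples rather than on the training set as in this toy sketch.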
Show Figures
Figure 1
<p>Pneumatic assembly of the electronic nose prototype.</p>
Full article ">Figure 2
<p>The view of an electronic nose prototype based on the electrochemical sensor array.</p>
Full article ">Figure 3
<p>(<b>a</b>) Exploded view drawing of the sensor module; (<b>b</b>) flow velocity magnitude along the sensors’ surface without the perforated plate and (<b>c</b>) with the perforated plate.</p>
Full article ">Figure 4
<p>Radar plots for fish, poultry, and pork samples at different time of incubation.</p>
Full article ">Figure 5
<p>Principal component analysis (PCA) of poultry meat stored at 4 °C over a period of five days.</p>
Full article ">Figure 6
<p>Confusion matrices for Support Vector Machine SVM classification of (<b>a</b>) poultry meat based on shelf-life; (<b>b</b>) rapeseed oil based on the degree of thermal degradation; (<b>c</b>) olive oil based on the admixture of adulterations (purple: classified correctly; pink: misclassified).</p>
Full article ">Figure 7
<p>Cluster analysis of variables (sensors) during poultry shelf-life evaluation.</p>
Full article ">Figure 8
<p>Sensor response signals for poultry meat samples after five days of storage with and without circulation mode.</p>
Full article ">Figure 9
<p>3D projection of PCA of rapeseed oil samples incubated at different temperatures.</p>
Full article ">Figure 10
<p>Loadings for the first four principal components in an analysis of rapeseed oil’s thermal degradation.</p>
Full article ">
4070 KiB  
Article
Estimating Crop Area at County Level on the North China Plain with an Indirect Sampling of Segments and an Adapted Regression Estimator
by Qinghan Dong, Jia Liu, Limin Wang, Zhongxin Chen and Javier Gallego
Sensors 2017, 17(11), 2638; https://doi.org/10.3390/s17112638 - 16 Nov 2017
Cited by 4 | Viewed by 3918
Abstract
Image classifications, including sub-pixel analysis, are often used to estimate crop acreage directly. However, this type of assessment often leads to a biased estimation, because commission and omission errors generally do not compensate for each other. Regression estimators combine remote sensing information with more accurate ground data on a field sample, and can result in more accurate and cost-effective assessments of crop acreage. In this pilot study, which aims to produce crop statistics in Guoyang County, the area frame sampling approach is adapted to a strip-like cropping pattern on the North China Plain. Remote sensing information is also used to perform a stratification in which non-agricultural areas are excluded from the ground survey. In order to compute crop statistics, 202 ground points in the agriculture stratum were surveyed. Image classification was included as an auxiliary variable in the subsequent analysis to obtain a regression estimator. The results of this pilot study showed that the integration of remote sensing information as an auxiliary variable can improve the accuracy of estimation by reducing the variance of the estimates, as well as the cost effectiveness of an operational application at the county level in the region. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
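The regression estimator at the core of this approach corrects the mean of the ground-surveyed crop proportions using the image classification as an auxiliary variable: ŷ = mean(y) + b·(X̄ − mean(x)), with b the least-squares slope of surveyed on classified proportions and X̄ the classified mean over the whole area. A minimal sketch with toy numbers (not the Guoyang survey data):

```python
import numpy as np

def regression_estimator(y_sample, x_sample, x_pop_mean):
    """Classical regression estimator for an area mean:
    ground-survey mean corrected by the classified auxiliary variable."""
    y = np.asarray(y_sample, dtype=float)
    x = np.asarray(x_sample, dtype=float)
    b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)  # slope of y on x
    return y.mean() + b * (x_pop_mean - x.mean())

# Toy example: the classification slightly over-estimates the crop share
y = [0.40, 0.50, 0.60]   # surveyed crop proportions in sampled segments
x = [0.45, 0.55, 0.65]   # classified proportions for the same segments
est = regression_estimator(y, x, x_pop_mean=0.50)  # 0.45
```

The variance reduction relative to the plain sample mean grows with the correlation between surveyed and classified proportions, which is why a reasonably accurate classification improves cost effectiveness even though it is biased on its own.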
Show Figures
Figure 1
<p>Guoyang County (<b>a</b>) is located in the north of Anhui province, and on the south edge of the North China Plain (<b>b</b>).</p>
Full article ">Figure 1 Cont.
<p>Guoyang County (<b>a</b>) is located in the north of Anhui province, and on the south edge of the North China Plain (<b>b</b>).</p>
Full article ">Figure 2
<p>A grid of 0.01° was overlaid on the Google Earth imagery (<b>a</b>). There are 2074 grid points within the border of Guoyang County. The sub-sampling of the agricultural stratum was carried out in a systematic way. Two hundred and two grid points were thus selected for the field survey to assess the crop proportions at these sample/grid points (<b>b</b>).</p>
Full article ">Figure 3
<p>The crop proportion of a particular grid/sample point was assessed by expanding the point to a segment according to the physical boundaries of the field plot harboring the grid/sample point. The proportion of each crop was assessed by global positioning system (GPS) measuring on one border of the plot, perpendicular to the field strips (<b>a</b>). The expanded segment can be located on the edge of the agricultural stratum (<b>b</b>).</p>
Full article ">Figure 4
<p>Classification using two scenes of RapidEye imagery.</p>
Full article ">Figure 5
<p>Regressions of crop proportions for the soybean (<b>a</b>) and maize (<b>b</b>) derived from 202 surveyed segments against those derived from the classification for the same segments.</p>
Full article ">
6463 KiB  
Article
FieldSAFE: Dataset for Obstacle Detection in Agriculture
by Mikkel Fly Kragh, Peter Christiansen, Morten Stigaard Laursen, Morten Larsen, Kim Arild Steen, Ole Green, Henrik Karstoft and Rasmus Nyholm Jørgensen
Sensors 2017, 17(11), 2579; https://doi.org/10.3390/s17112579 - 9 Nov 2017
Cited by 64 | Viewed by 11649
Abstract
In this paper, we present a multi-modal dataset for obstacle detection in agriculture. The dataset comprises approximately 2 h of raw sensor data from a tractor-mounted sensor system in a grass mowing scenario in Denmark, October 2016. Sensing modalities include stereo camera, thermal camera, web camera, 360° camera, LiDAR and radar, while precise localization is available from fused IMU and GNSS. Both static and moving obstacles are present, including humans, mannequin dolls, rocks, barrels, buildings, vehicles and vegetation. All obstacles have ground truth object labels and geographic coordinates. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures
Figure 1
<p>Recording platform surrounded by static and moving obstacles. Multiple drone views record the exact position of obstacles, while the recording platform records local sensor data.</p>
Full article ">Figure 2
<p>Recording platform.</p>
Full article ">Figure 3
<p>Example frames from the FieldSAFE dataset. (<b>a</b>) Left stereo image; (<b>b</b>) stereo pointcloud; (<b>c</b>) 360<math display="inline"> <semantics> <msup> <mrow/> <mo>∘</mo> </msup> </semantics> </math> camera image (cropped); (<b>d</b>) web camera image; (<b>e</b>) thermal camera image (cropped); (<b>f</b>) LiDAR point cloud (cropped and colored by height); (<b>g</b>) radar detections overlaid on LiDAR point cloud (black). Green and red circles denote detections from mid- and long-range modes, respectively.</p>
Full article ">Figure 4
<p>Sensor registration. “Hand” denotes a manual measurement by hand, whereas “calibrated” indicates that an automated calibration procedure was used to estimate the extrinsic parameters.</p>
Full article ">Figure 5
<p>Colored and labeled orthophotos. (<b>a</b>) Orthophoto with tractor tracks overlaid. Black tracks include only static obstacles, whereas red and white tracks also have moving obstacles. Currently, red tracks have no ground truth for moving obstacles annotated. (<b>b</b>) Labeled orthophoto.</p>
Full article ">Figure 6
<p>Examples of static obstacles.</p>
Full article ">Figure 7
<p>Examples of moving obstacles (from the stereo camera) and their paths (black) overlaid on the tractor path (grey).</p>
Full article ">
Article
A Real-Time Smooth Weighted Data Fusion Algorithm for Greenhouse Sensing Based on Wireless Sensor Networks
by Tengyue Zou, Yuanxia Wang, Mengyi Wang and Shouying Lin
Sensors 2017, 17(11), 2555; https://doi.org/10.3390/s17112555 - 6 Nov 2017
Cited by 11 | Viewed by 5483
Abstract
Wireless sensor networks are widely used to acquire environmental parameters to support agricultural production. However, data variation and noise caused by actuators often produce complex measurement conditions. These factors can lead to nonconformity in reporting samples from different nodes and cause errors when making a final decision. Data fusion is well suited to reduce the influence of actuator-based noise and improve automation accuracy. A key step is to identify the sensor nodes disturbed by actuator noise and reduce their degree of participation in the data fusion results. A smoothing value is introduced and a searching method based on Prim’s algorithm is designed to help obtain stable sensing data. A voting mechanism with dynamic weights is then proposed to obtain the data fusion result. The dynamic weighting process can sharply reduce the influence of actuator noise in data fusion and gradually condition the data to normal levels over time. To shorten the data fusion time in large networks, an acceleration method with prediction is also presented to reduce the data collection time. A real-time system is implemented on STMicroelectronics STM32F103 and NORDIC nRF24L01 platforms and the experimental results verify the improvement provided by these new algorithms.
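The dynamic-weighting step described in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the authors' algorithm: the node names, the 0.1 cut factor and the recovery rate are invented for the example, and the paper's smoothing value and Prim's-algorithm search are omitted.

```python
def fuse(readings, weights):
    """Weighted average of node readings; weights need not be normalized."""
    total = sum(weights.values())
    return sum(readings[n] * w for n, w in weights.items()) / total

def update_weights(weights, disturbed, cut=0.1, recovery=0.5):
    """Sharply cut the weight of actuator-disturbed nodes; let the rest
    recover toward the normal weight of 1.0 over successive rounds."""
    new = {}
    for node, w in weights.items():
        if node in disturbed:
            new[node] = w * cut                    # reduce participation
        else:
            new[node] = w + recovery * (1.0 - w)   # gradual return to normal
    return new

weights = {"n1": 1.0, "n2": 1.0, "n3": 1.0}
readings = {"n1": 25.0, "n2": 25.2, "n3": 32.0}    # n3 sits beside a heater
weights = update_weights(weights, disturbed={"n3"})
fused = fuse(readings, weights)                    # n3 barely moves the result
```

After one round the disturbed node contributes about 5% of the fused value instead of a third, which is the "sharply reduce, then recondition" behaviour the abstract describes.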
(This article belongs to the Special Issue Sensors in Agriculture)
Graphical abstract

Figure 1: Illustration of a sensor network deployed in a greenhouse.
Figure 2: Illustration of an adjacent graph for a sensor network.
Figure 3: Example of the weight of a sensor node influenced by the actuator.
Figure 4: Procedure of the acceleration mechanism.
Figure 5: (a) Average accuracy of the methods in the simulation; (b) average time cost of the methods in the simulation.
Figure 6: (a) Average energy consumption of each sensor node over an hour; (b) average accuracy and time cost for various adjustment coefficients ε.
Figure 7: (a) The wireless sensor node used in the experiments; (b) deployment of the sensor nodes for the experiments.
Figure 8: (a) Exterior of the experimental greenhouse; (b) interior structure of the experimental greenhouse.
Figure 9: (a) Average accuracy of each algorithm in the experiments; (b) average time cost of each algorithm in the experiments.
Article
Fast Detection of Striped Stem-Borer (Chilo suppressalis Walker) Infested Rice Seedling Based on Visible/Near-Infrared Hyperspectral Imaging System
by Yangyang Fan, Tao Wang, Zhengjun Qiu, Jiyu Peng, Chu Zhang and Yong He
Sensors 2017, 17(11), 2470; https://doi.org/10.3390/s17112470 - 27 Oct 2017
Cited by 33 | Viewed by 7486
Abstract
Striped stem-borer (SSB) infestation is one of the most serious sources of damage to rice growth. A rapid and non-destructive method of early SSB detection is essential for rice-growth protection. In this study, hyperspectral imaging combined with chemometrics was used to detect early SSB infestation in rice and identify the degree of infestation (DI). Visible/near-infrared hyperspectral images (spectral range 380–1030 nm) were taken of healthy rice plants and of rice plants infested by SSB for 2, 4, 6, 8 and 10 days. A total of 17 characteristic wavelengths were selected from the spectral data extracted from the hyperspectral images by the successive projection algorithm (SPA). Principal component analysis (PCA) was applied to the hyperspectral images, and 16 textural features based on the gray-level co-occurrence matrix (GLCM) were extracted from the first two principal component (PC) images. A back-propagation neural network (BPNN) was used to establish infestation-degree evaluation models based on full spectra, characteristic wavelengths, textural features and feature fusion, respectively. The BPNN models based on a fusion of characteristic wavelengths and textural features achieved the best performance, with classification accuracies of the calibration and prediction sets over 95%. Accuracy was satisfactory for each infestation degree, although it was slightly lower for rice samples infested for 2 days. Overall, this study indicated the feasibility of hyperspectral imaging techniques for detecting early SSB infestation and identifying the degree of infestation.
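The GLCM textural features used above can be computed as in this minimal sketch. Plain NumPy, not the authors' code: the toy 4-level image, the single displacement and the three statistics shown are illustrative assumptions (the paper extracts 16 features from the first two PC images).

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel displacement."""
    P = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[image[y, x], image[y + dy, x + dx]] += 1
    return P / P.sum()

def glcm_features(P):
    """Three classic Haralick-style statistics of a co-occurrence matrix."""
    i, j = np.indices(P.shape)
    return {
        "contrast": float(((i - j) ** 2 * P).sum()),
        "energy": float((P ** 2).sum()),
        "homogeneity": float((P / (1.0 + (i - j) ** 2)).sum()),
    }

# toy 4-level image standing in for one principal-component image
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
feats = glcm_features(glcm(img))
```

In practice a library routine (e.g., scikit-image's co-occurrence functions) over several displacements and angles would replace these loops.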
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: Samples of six degrees of infestation: (a) DI0; (b) DI1; (c) DI2; (d) DI3; (e) DI4; (f) DI5.
Figure 2: Average spectral curves of samples at the six degrees of infestation.
Figure 3: Score scatter plots of samples at the six degrees of infestation.
Figure 4: Selection of characteristic wavelengths by the successive projection algorithm (SPA): (a) number of characteristic wavelengths with the minimum root mean square error (RMSE); (b) distribution of the characteristic wavelengths over the full band.
Figure 5: Texture feature images and PC images of a rice sample.
Article
Comparison between Random Forests, Artificial Neural Networks and Gradient Boosted Machines Methods of On-Line Vis-NIR Spectroscopy Measurements of Soil Total Nitrogen and Total Carbon
by Said Nawar and Abdul M. Mouazen
Sensors 2017, 17(10), 2428; https://doi.org/10.3390/s17102428 - 24 Oct 2017
Cited by 136 | Viewed by 8087
Abstract
Accurate and detailed spatial soil information about within-field variability is essential for variable-rate applications of farm resources. Soil total nitrogen (TN) and total carbon (TC) are important fertility parameters that can be measured with on-line (mobile) visible and near infrared (vis-NIR) spectroscopy. This study compares the performance of local farm-scale calibrations with calibrations based on spiking selected local samples from both fields into a European dataset for TN and TC estimation, using three modelling techniques: gradient boosted machines (GBM), artificial neural networks (ANNs) and random forests (RF). The on-line measurements were carried out with a mobile, fiber-type vis-NIR spectrophotometer (305–2200 nm) (AgroSpec from tec5, Germany), and soil spectra were recorded in diffuse reflectance mode from two fields in the UK. After spectra pre-processing, the datasets were divided into calibration (75%) and prediction (25%) sets, and calibration models for TN and TC were developed using GBM, ANN and RF with leave-one-out cross-validation. Cross-validation showed that spiking local samples from a field into the European dataset, combined with RF, resulted in the highest coefficients of determination (R2) of 0.97 and 0.98, the lowest root mean square errors (RMSE) of 0.01% and 0.10%, and the highest residual prediction deviations (RPD) of 5.58 and 7.54, for TN and TC, respectively. Laboratory and on-line predictions generally followed the same trend as cross-validation in one field, where the spiked European dataset-based RF calibration models outperformed the corresponding GBM and ANN models; in the second field, ANN replaced RF as the best-performing method. The local field calibrations, however, provided lower R2 and RPD in most cases. Therefore, from a cost-effectiveness point of view, it is recommended to adopt the spiked European dataset-based RF/ANN calibration models for prediction of TN and TC under on-line measurement conditions.
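The RPD values quoted above follow from a simple definition: the standard deviation of the reference measurements divided by the RMSE of prediction. A minimal sketch with invented TN-like numbers; the sample-standard-deviation form used here is one common convention, and the authors' exact formula may differ.

```python
import math

def rmse(obs, pred):
    """Root mean square error between reference and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def rpd(obs, pred):
    """Residual prediction deviation: SD of the reference values over the RMSE.
    RPD > 2 is commonly read as a usable calibration, > 3 as a good one."""
    mean = sum(obs) / len(obs)
    sd = math.sqrt(sum((o - mean) ** 2 for o in obs) / (len(obs) - 1))
    return sd / rmse(obs, pred)

# invented laboratory (obs) vs. model (pred) values, e.g. TN in %
obs = [1.0, 2.0, 3.0, 4.0]
pred = [1.1, 1.9, 3.2, 3.8]
score = rpd(obs, pred)
```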
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: The on-line visible and near infrared (vis-NIR) spectroscopy sensor developed by Mouazen [27].
Figure 2: Overview flowchart listing the main steps of the multivariate analysis methods: (a) random forests (RF) regression; (b) gradient boosted machines (GBM); and (c) artificial neural networks (ANNs).
Figure 3: Histograms, box plots and descriptive statistics of (a) soil total nitrogen (TN) and (b) total carbon (TC) for the Hessleskew and Hagg fields and the European dataset.
Figure 4: Scatter plots of vis-NIR-predicted versus laboratory-analysed TN in the Hessleskew field in cross-validation (a), lab prediction (b) and on-line prediction (c), using the local dataset (A) and the spiked European dataset (B), comparing GBM, ANN and RF models.
Figure 5: Scatter plots of vis-NIR-predicted versus laboratory-analysed TN in the Hagg field in cross-validation (a), lab prediction (b) and on-line prediction (c), using the local dataset (A) and the spiked European dataset (B), comparing GBM, ANN and RF models.
Figure 6: Scatter plots of vis-NIR-predicted versus laboratory-analysed TC in the Hessleskew field in cross-validation (a), lab prediction (b) and on-line prediction (c), using the local dataset (A) and the spiked European dataset (B), comparing GBM, ANN and RF models.
Figure 7: Scatter plots of vis-NIR-predicted versus laboratory-analysed TC in the Hagg field in cross-validation (a), lab prediction (b) and on-line prediction (c), using the local dataset (A) and the spiked European dataset (B), comparing GBM, ANN and RF models.
Figure 8: Comparison of residual prediction deviation (RPD) values for (A) TN and (B) TC predictions obtained with (a) GBM, (b) ANN and (c) RF analyses in cross-validation (Cal), laboratory prediction (Lab) and on-line prediction (Online). Results were generated with local field datasets of 122 and 149 samples for the Hessleskew and Hagg fields, respectively, and a spiked European dataset (528 samples).
Figure 9: Comparison of root mean square error (RMSE) values for (A) TN and (B) TC predictions obtained with (a) GBM, (b) ANN and (c) RF analyses in cross-validation (Cal), laboratory prediction (Lab) and on-line prediction (Online). Results were generated with local field datasets of 122 and 149 samples for the Hessleskew and Hagg fields, respectively, and a spiked European dataset (528 samples).
Article
Evaluation of Sensible Heat Flux and Evapotranspiration Estimates Using a Surface Layer Scintillometer and a Large Weighing Lysimeter
by Jerry E. Moorhead, Gary W. Marek, Paul D. Colaizzi, Prasanna H. Gowda, Steven R. Evett, David K. Brauer, Thomas H. Marek and Dana O. Porter
Sensors 2017, 17(10), 2350; https://doi.org/10.3390/s17102350 - 14 Oct 2017
Cited by 31 | Viewed by 5137
Abstract
Accurate estimates of actual crop evapotranspiration (ET) are important for optimal irrigation water management, especially in arid and semi-arid regions. Common ET sensing methods include Bowen ratio, eddy covariance (EC), and scintillometers. Large weighing lysimeters are considered the ultimate standard for measurement of ET; however, they are expensive to install and maintain. Although EC systems and scintillometers are less costly and relatively portable, EC has known energy balance closure discrepancies. Previous scintillometer studies used EC for ground-truthing, but no studies considered weighing lysimeters. In this study, a Surface Layer Scintillometer (SLS) was evaluated for accuracy in determining ET, as well as sensible and latent heat fluxes, against a large weighing lysimeter in Bushland, TX. The SLS was installed over irrigated grain sorghum (Sorghum bicolor (L.) Moench) for the period 29 July–17 August 2015 and over grain corn (Zea mays L.) for the period 23 June–2 October 2016. Results showed poor correlation for sensible heat flux but much better correlation for ET, with r2 values of 0.83 and 0.87 for hourly and daily ET, respectively. The accuracy of the SLS was comparable to other ET sensing instruments, with an RMSE of 0.13 mm·h−1 (31%) for hourly ET; summing hourly values to a daily time step reduced the ET error to 14% (0.75 mm·d−1). This level of accuracy indicates that the SLS has potential for some water management applications. As few studies have evaluated the SLS for ET estimation, or in combination with lysimetric data, further evaluations would be beneficial to establish the applicability of the SLS in water resources management.
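The drop in relative error from hourly to daily time steps reported above arises because random hour-to-hour scatter partly cancels in the daily sum. A toy sketch with invented ET values, not the study's data:

```python
import math

def relative_rmse(obs, pred):
    """RMSE expressed as a fraction of the mean observed value."""
    err = math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))
    return err / (sum(obs) / len(obs))

# hypothetical hourly ET (mm/h) over a short daytime window
lys = [0.2, 0.4, 0.6, 0.7, 0.6, 0.4, 0.2]   # "lysimeter" reference
sls = [0.3, 0.3, 0.7, 0.6, 0.7, 0.3, 0.3]   # "scintillometer" estimate

hourly_err = relative_rmse(lys, sls)                 # scatter at hourly scale
daily_err = abs(sum(sls) - sum(lys)) / sum(lys)      # scatter after daily summing
```

With these invented numbers the hourly relative error is about 23% while the daily total disagrees by only about 3%, mirroring the 31% vs. 14% pattern in the abstract.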
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: The study location, with the large weighing lysimeters at the centers of four square fields at the USDA-ARS Conservation and Production Research Laboratory, Bushland, TX. The research weather pen is adjacent to the east side of the east lysimeter fields. The dashed black line in the NE field illustrates the path of the scintillometer radiation.
Figure 2: Soil sensor placement within the lysimeter, showing the relative placement of soil heat flux (SHF) plates, model TDR315 soil water content sensors, and subsurface drip irrigation lines. The TDR315 sensors give accurate soil temperature readings, eliminating the need for thermocouples where they are used.
Figure 3: Crop height for 2015 (a) and 2016 (b), LAI for 2015 (c) and 2016 (d), and daily lysimeter ET for the 2015 (e) and 2016 (f) study periods.
Figure 4: Average daily energy balance components on the lysimeter for (a) 2015 and (b) 2016.
Figure 5: Average daily energy balance components for the SLS for (a) 2015 and (b) 2016.
Figure 6: Comparison of hourly HdT and Hsun from the SLS to lysimeter H for the combined 2015–2016 data, based on lysimeter data regressed to (a) HdT and (b) Hsun.
Figure 7: Temperature difference (the reading of the lower air temperature sensor minus that of the higher sensor), used as the indicator of flux direction.
Figure 8: Comparison of hourly ET from the SLS and lysimeter using (a) combined 2015–2016 data; (b) 2016 only; (c) 2015 only.
Figure 9: Daily ETref for the same DOY in 2015 and 2016, used as an indicator of differences in weather parameters.
Figure 10: Comparison of daytime hourly ET from the SLS to lysimeter data using (a) combined 2015–2016 data; (b) 2016 only; (c) 2015 only.
Figure 11: Comparison of summed daily ET from the SLS to summed hourly lysimeter data using (a) combined 2015–2016 data; (b) 2016 only; (c) 2015 only.
Figure 12: Comparison of summed daily ET from the SLS to daily (mass change from midnight to midnight) lysimeter data using (a) combined 2015–2016 data; (b) 2016 only; (c) 2015 only.
Article
Evaluating Oilseed Biofuel Production Feasibility in California’s San Joaquin Valley Using Geophysical and Remote Sensing Techniques
by Dennis L. Corwin, Kevin Yemoto, Wes Clary, Gary Banuelos, Todd H. Skaggs, Scott M. Lesch and Elia Scudiero
Sensors 2017, 17(10), 2343; https://doi.org/10.3390/s17102343 - 14 Oct 2017
Cited by 3 | Viewed by 4638
Abstract
Though more costly than petroleum-based fuels and a minor component of overall military fuel sources, biofuels are nonetheless strategically valuable to the military because of intentional reliance on multiple, reliable, secure fuel sources. Significant reduction in oilseed biofuel cost occurs when grown on marginally productive saline-sodic soils plentiful in California’s San Joaquin Valley (SJV). The objective is to evaluate the feasibility of oilseed production on marginal soils in the SJV to support a 115 ML yr−1 biofuel conversion facility. The feasibility evaluation involves: (1) development of an Ida Gold mustard oilseed yield model for marginal soils; (2) identification of marginally productive soils; (3) development of a spatial database of edaphic factors influencing oilseed yield and (4) performance of Monte Carlo simulations showing potential biofuel production on marginally productive SJV soils. The model indicates oilseed yield is related to boron, salinity, leaching fraction, and water content at field capacity. Monte Carlo simulations for the entire SJV fit a shifted gamma probability density function: Q = 68.986 + gamma (6.134,5.285), where Q is biofuel production in ML yr−1. The shifted gamma cumulative density function indicates a 0.15–0.17 probability of meeting the target biofuel-production level of 115 ML yr−1, making adequate biofuel production unlikely.
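The quoted probability can be reproduced from the fitted distribution: P(Q ≥ 115) = P(X ≥ 115 − 68.986) for X ~ Gamma(shape 6.134, scale 5.285). A self-contained numerical sketch using trapezoidal integration of the gamma pdf; in practice a library survival function (e.g., a gamma `sf` routine) would be used instead.

```python
import math

def gamma_sf(x, shape, scale, steps=20000, upper=400.0):
    """P(X > x) for a gamma distribution via trapezoidal integration of the pdf."""
    norm = math.gamma(shape) * scale ** shape
    def pdf(t):
        return t ** (shape - 1) * math.exp(-t / scale) / norm
    h = (upper - x) / steps
    s = 0.5 * (pdf(x) + pdf(upper))
    for k in range(1, steps):
        s += pdf(x + k * h)
    return s * h

# Q = 68.986 + Gamma(6.134, 5.285); chance of reaching the 115 ML/yr target
p = gamma_sf(115.0 - 68.986, shape=6.134, scale=5.285)
```

The integral comes out near the 0.15–0.17 range reported in the abstract, i.e., the target production level is unlikely to be met.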
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: Flow chart of the feasibility evaluation of oilseed production on marginal soils in the San Joaquin Valley, showing the four steps and the flow of information.
Figure 2: Map showing the location of the 16.2-ha field in California's Merced County.
Figure 3: Maps of the apparent soil electrical conductivity (ECa) surveys taken with electromagnetic induction in the horizontal (EMh) and vertical (EMv) coil configurations, combining the entire (full) 16.2-ha field and southeast (SE) corner surveys. Soil core sample sites are indicated by the clear and filled-in circles: clear circles are sites selected from the full-field ECa survey, and filled-in circles are from the SE corner ECa survey.
Figure 4: Map of Ida Gold mustard oilseed yield. Oilseed yield sample sites (1 m² sample area) are indicated by the clear and filled-in circles: clear circles are sites selected from the full-field ECa survey, and filled-in circles are from the SE corner ECa survey.
Figure 5: Salt tolerance models for Ida Gold mustard oilseed: (a) two-piece linear salt tolerance model (thick dashed line) of Maas and Hoffman [18] and (b) quadratic salt tolerance model (solid line). The thin dashed line indicates the salinity threshold of the two-piece linear salt tolerance model.
Figure 6: Boron tolerance models for Ida Gold mustard oilseed: (a) three-piece linear B tolerance model (thick dashed line) and (b) quadratic B tolerance model (solid line). Thin dashed lines indicate the B deficiency and toxicity thresholds.
Figure 7: Map of salt-affected soils for the west side of California's San Joaquin Valley (WSJV) estimated from the regional-scale soil salinity model of Scudiero et al. [5,29].
Figure 8: Biofuel production: (a) histogram of Monte Carlo simulations and associated probability density function (PDF) and (b) cumulative density function (CDF) for Ida Gold mustard oilseed from San Joaquin Valley salt-affected soils (i.e., soils with salinity > 4 dS m−1), based on 10,000 Monte Carlo simulations.
Article
Assessing White Wine Viscosity Variation Using Polarized Laser Speckle: A Promising Alternative to Wine Sensory Analysis
by Christelle Abou Nader, Hadi Loutfi, Fabrice Pellen, Bernard Le Jeune, Guy Le Brun, Roger Lteif and Marie Abboud
Sensors 2017, 17(10), 2340; https://doi.org/10.3390/s17102340 - 13 Oct 2017
Cited by 5 | Viewed by 5165
Abstract
In this paper, we report measurements of wine viscosity, correlated to polarized laser speckle results. Experiments were performed on white wine samples produced with a single grape variety. Effects of the wine making cellar, the grape variety, and the vintage on wine Brix degree, alcohol content, viscosity, and speckle parameters are considered. We show that speckle parameters, namely, spatial contrast and speckle decorrelation time, as well as the inertia moment extracted from the temporal history speckle pattern, are mainly affected by the alcohol and sugar content and hence the wine viscosity. Principal component analysis revealed a high correlation between laser speckle results on the one hand and viscosity and Brix degree values on the other. As speckle analysis proved to be an efficient method of measuring the variation of the viscosity of white mono-variety wine, one can therefore consider it as an alternative method to wine sensory analysis.
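The spatial speckle contrast used above is simply the standard deviation of the intensity over its mean for one frame. A minimal sketch with invented pixel values; the real analysis works on full camera frames and also extracts the decorrelation time and inertia moment, omitted here.

```python
import math

def speckle_contrast(intensity):
    """Spatial speckle contrast C = std(I) / mean(I) over one frame."""
    n = len(intensity)
    mean = sum(intensity) / n
    var = sum((i - mean) ** 2 for i in intensity) / n
    return math.sqrt(var) / mean

# hypothetical pixel intensities from a single polarized speckle frame
frame = [120, 80, 200, 40, 160, 100, 140, 60]
C = speckle_contrast(frame)
```

Scatterer motion during the exposure blurs the speckle grains, so a less viscous (faster-moving) sample yields a lower C for the same exposure time.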
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: Schematic view of the speckle experiment setup; λ/4 is a quarter-wave plate.
Figure 2: (a) Temporal correlation curve C(τ) and (b) the temporal history of the speckle pattern (THSP) image for a 1.7 mL white wine sample with 0.3 mL of added microspheres of 0.22 µm diameter.
Figure 3: Sketch of an Ostwald viscometer showing the upper and the lower marks.
Figure 4: Variation of the speckle spatial contrast C as a function of white wine viscosity. Standard deviations are approximately ±0.001 mPa·s for the viscosity and ±0.002 for the contrast C. Error bars are omitted for clarity.
Figure 5: Variation of the speckle decorrelation time τc as a function of white wine viscosity. Symbols correspond to experimental values and lines to linear fits. Standard deviations are approximately ±0.001 mPa·s for the viscosity and ±0.006 ms for the decorrelation time τc. Error bars are omitted for clarity.
Figure 6: Pearson correlation circle representing the variables projected on the first principal component F1 and the second principal component F2 for all wine samples.
Article
Application of Multilayer Perceptron with Automatic Relevance Determination on Weed Mapping Using UAV Multispectral Imagery
by Afroditi A. Tamouridou, Thomas K. Alexandridis, Xanthoula E. Pantazi, Anastasia L. Lagopodi, Javid Kashefi, Dimitris Kasampalis, Georgios Kontouris and Dimitrios Moshou
Sensors 2017, 17(10), 2307; https://doi.org/10.3390/s17102307 - 11 Oct 2017
Cited by 26 | Viewed by 4814
Abstract
Remote sensing techniques are routinely used in plant species discrimination and weed mapping. In the presented work, successful Silybum marianum detection and mapping using multilayer neural networks is demonstrated. A multispectral camera (green-red-near infrared) attached to a fixed-wing unmanned aerial vehicle (UAV) was utilized for the acquisition of high-resolution images (0.1 m resolution). The Multilayer Perceptron with Automatic Relevance Determination (MLP-ARD) was used to identify S. marianum among other vegetation, mostly Avena sterilis L. The green, red and near-infrared (NIR) spectral bands and a texture layer derived from local variance were used as inputs. MLP-ARD identified S. marianum with an accuracy of 99.54%. The study lasted one year, so the results are specific to that season; nevertheless, the accuracy achieved shows the potential of S. marianum mapping with MLP-ARD on multispectral UAV imagery.
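The texture layer from local variance mentioned above can be sketched as a sliding-window variance filter. A naive NumPy illustration on an invented single-band toy raster; the window size and edge handling are assumptions, and production code would use a vectorized filter rather than Python loops.

```python
import numpy as np

def local_variance(band, radius=1):
    """Per-pixel variance in a (2r+1)x(2r+1) window; edges use the valid part."""
    h, w = band.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            win = band[max(0, y - radius):y + radius + 1,
                       max(0, x - radius):x + radius + 1]
            out[y, x] = win.var()
    return out

# toy reflectance band: a homogeneous patch next to a brighter patch
band = np.array([[10, 10, 10, 50],
                 [10, 10, 10, 50],
                 [10, 10, 10, 50]], dtype=float)
tex = local_variance(band)
# homogeneous interior -> zero variance; the 10/50 boundary lights up
```

The resulting layer highlights texture boundaries, which is the extra discriminative cue fed to the network alongside the three spectral bands.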
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1: Orientation map and field-surveyed locations in the study area.
Figure 2: Vegetation species showing spectral reflectance similar to that of Silybum marianum.
Figure 3: Hinton diagram of the trained MLP-ARD. The vertical axis corresponds to the input features, while the horizontal axis shows the hidden neurons.
Figure 4: The alpha hyperparameters for the four input features (1 = texture, 2 = NIR, 3 = red, 4 = green).
Figure 5: S. marianum weed map based on the MLP-ARD prediction. Green is S. marianum and yellow is other vegetation.
Article
Classification of Fusarium-Infected Korean Hulled Barley Using Near-Infrared Reflectance Spectroscopy and Partial Least Squares Discriminant Analysis
by Jongguk Lim, Giyoung Kim, Changyeun Mo, Kyoungmin Oh, Hyeonchae Yoo, Hyeonheui Ham and Moon S. Kim
Sensors 2017, 17(10), 2258; https://doi.org/10.3390/s17102258 - 30 Sep 2017
Cited by 13 | Viewed by 4675
Abstract
The purpose of this study is to use near-infrared reflectance (NIR) spectroscopy equipment to nondestructively and rapidly discriminate Fusarium-infected hulled barley. Both normal hulled barley and Fusarium-infected hulled barley were scanned by using a NIR spectrometer with a wavelength range of 1175 to 2170 nm. Multiple mathematical pretreatments were applied to the reflectance spectra obtained for Fusarium discrimination and the multivariate analysis method of partial least squares discriminant analysis (PLS-DA) was used for discriminant prediction. The PLS-DA prediction model developed by applying the second-order derivative pretreatment to the reflectance spectra obtained from the side of hulled barley without crease achieved 100% accuracy in discriminating the normal hulled barley and the Fusarium-infected hulled barley. These results demonstrated the feasibility of rapid discrimination of the Fusarium-infected hulled barley by combining multivariate analysis with the NIR spectroscopic technique, which is utilized as a nondestructive detection method.
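The second-order derivative pretreatment used above suppresses additive and linear baseline effects in reflectance spectra. A minimal finite-difference sketch with invented spectra; the study may use a smoothed derivative such as Savitzky–Golay, of which this is the simplest form.

```python
def second_derivative(spectrum):
    """Central-difference second derivative; removes constant and linear baselines."""
    return [spectrum[i - 1] - 2 * spectrum[i] + spectrum[i + 1]
            for i in range(1, len(spectrum) - 1)]

# two hypothetical reflectance spectra with the same absorption shape,
# one shifted by an added linear baseline (e.g., a scattering offset)
base = [0.50, 0.48, 0.40, 0.30, 0.40, 0.48, 0.50]
shifted = [r + 0.10 + 0.01 * i for i, r in enumerate(base)]

d2a = second_derivative(base)
d2b = second_derivative(shifted)
# the two derivatives coincide: the baseline is gone
```

Because the derivative of both samples is identical while their raw spectra differ, a classifier such as PLS-DA trained on derivative spectra responds to the absorption features rather than to sample-to-sample baseline shifts.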
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>(<b>a</b>) Schematic diagram and (<b>b</b>) photo of the NIR measurement system for the discrimination of hulled barley infected with <span class="html-italic">Fusarium</span>.</p>
Figure 2
<p>(<b>a</b>) Location and (<b>b</b>) configuration of kernels on Petri dish to culture <span class="html-italic">Fusarium</span> of hulled barley.</p>
Figure 3
<p>Denotation of (<b>a</b>) front side and (<b>b</b>) back side of hulled barley by the presence of the crease.</p>
Figure 4
<p>Probe and light illumination in NIR reflectance spectrum acquisition for hulled barley.</p>
Figure 5
<p>Culture results for hulled barley (JN151) in the control group, showing that <span class="html-italic">Fusarium</span> was not observed.</p>
Figure 6
<p>Culture results for hulled barley (GN121, JB021, JB061, JB094) in the experimental group in which <span class="html-italic">Fusarium</span> was observed.</p>
Figure 7
<p>NIR reflectance spectra obtained from hulled barley samples (515 kernels) in both the control and experimental groups.</p>
Figure 8
<p>Total NIR reflectance spectra obtained from normal hulled barley samples (127 kernels) in the control group.</p>
Figure 9
<p>Total NIR reflectance spectra obtained from <span class="html-italic">Fusarium</span>-infected hulled barley samples (298 kernels) in the experimental group.</p>
Figure 10
<p>Average NIR reflectance spectra obtained from (<b>a</b>) the front side of the hulled barley with crease; (<b>b</b>) the back side of the hulled barley without crease; and (<b>c</b>) both front and back sides of the hulled barley.</p>
Figure 11
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of the PLS-DA model developed using raw reflectance spectra obtained from the front side and back side of the hulled barley.</p>
Figure 12
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of PLS-DA model developed by applying the second-order derivative pretreatment to the reflectance spectra obtained from the front side and back side of hulled barley.</p>
Figure 13
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of PLS-DA model developed using the raw reflectance spectra obtained from the front side of hulled barley.</p>
Figure 14
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of PLS-DA model developed by applying the third-order derivative pretreatment to the reflectance spectra obtained from the front side of hulled barley.</p>
Figure 15
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of PLS-DA model developed by using the raw reflectance spectra obtained from the back side of hulled barley.</p>
Figure 16
<p><span class="html-italic">Fusarium</span> discrimination calibration (<b>a</b>) and validation (<b>b</b>) results of the PLS-DA model developed by using the raw reflectance spectra obtained from the back side of the hulled barley.</p>
Article
Hyperspectral Imaging Analysis for the Classification of Soil Types and the Determination of Soil Total Nitrogen
by Shengyao Jia, Hongyang Li, Yanjie Wang, Renyuan Tong and Qing Li
Sensors 2017, 17(10), 2252; https://doi.org/10.3390/s17102252 - 30 Sep 2017
Cited by 47 | Viewed by 7594
Abstract
Soil is an important environment for crop growth. Quick and accurate access to soil nutrient content information is a prerequisite for scientific fertilization. In this work, hyperspectral imaging (HSI) technology was applied for the classification of soil types and the measurement of soil total nitrogen (TN) content. A total of 183 soil samples collected from Shangyu City (People’s Republic of China) were scanned by a near-infrared hyperspectral imaging system with a wavelength range of 874–1734 nm. The soil samples belonged to three major soil types typical of this area: paddy soil, red soil and seashore saline soil. The successive projections algorithm (SPA) method was utilized to select effective wavelengths from the full spectrum. Texture features (energy, contrast, homogeneity and entropy) were extracted from the gray-scale images at the effective wavelengths. The support vector machine (SVM) and partial least squares regression (PLSR) methods were used to establish classification and prediction models, respectively. The results showed that, by using the combined data set of effective wavelengths and texture features for modelling, an optimal correct classification rate of 91.8% could be achieved. The soil samples were first classified, and then local models were established for soil TN according to soil type, which achieved better prediction results than the general models. The overall results indicated that hyperspectral imaging technology could be used for soil type classification and soil TN determination, and that data fusion combining spectral and image texture information showed advantages for the classification of soil types. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
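The four texture descriptors named in the abstract (energy, contrast, homogeneity and entropy) are standard statistics of a gray-level co-occurrence matrix (GLCM). A small NumPy sketch, assuming the band image has already been quantized to a few integer gray levels:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray-level co-occurrence matrix for one pixel offset.
    `img` must already be quantized to integer values in [0, levels)."""
    g = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            g[img[i, j], img[i + dy, j + dx]] += 1
    return g / g.sum()

def texture_features(p):
    """Energy, contrast, homogeneity and entropy of a normalized GLCM."""
    i, j = np.indices(p.shape)
    energy = np.sum(p ** 2)
    contrast = np.sum(p * (i - j) ** 2)
    homogeneity = np.sum(p / (1.0 + (i - j) ** 2))
    nz = p[p > 0]
    entropy = -np.sum(nz * np.log2(nz))
    return energy, contrast, homogeneity, entropy
```

A perfectly uniform patch gives energy 1, contrast 0, homogeneity 1 and entropy 0; in the paper these statistics, computed at the SPA-selected wavelengths, are combined with the spectra before classification.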
Figure 1
<p>Schematic diagram of the hyperspectral imaging system. This system can obtain images in the spectral region of 874–1734 nm.</p>
Figure 2
<p>Main steps of this work.</p>
Figure 3
<p>RGB images of paddy soil (<b>a</b>), red soil (<b>b</b>) and seashore saline soil (<b>c</b>) samples.</p>
Figure 4
<p>(<b>a</b>) The average spectrum of each soil type in the wavelength range of 975–1645 nm; (<b>b</b>) Grouping of 183 soil samples based on Fisher’s LDA using the first four principal components of the full spectrum matrix as input.</p>
Figure 5
<p>RMSECV curves with the number of variables selected by SPA for soil type classification (<b>a</b>). The reference data in SPA were the category values. The selected variables (shown as dots) corresponding to raw spectra are presented in (<b>b</b>).</p>
Figure 6
<p>The mean values of (<b>a</b>) Energy, (<b>b</b>) Contrast, (<b>c</b>) Homogeneity and (<b>d</b>) Entropy of different soil types at the effective wavelengths.</p>
Figure 7
<p>RMSECV curves with the number of variables selected by SPA for (<b>a</b>) paddy soil, (<b>b</b>) red soil, (<b>c</b>) seashore saline soil and (<b>d</b>) general models. The reference data in SPA was the soil TN content.</p>
Figure 8
<p>The variables selected by SPA (shown in circle markers) corresponding to raw spectra for paddy soil, red soil, seashore saline soil and general models are presented in (<b>a</b>–<b>d</b>), respectively.</p>
Article
A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition
by Alvaro Fuentes, Sook Yoon, Sang Cheol Kim and Dong Sun Park
Sensors 2017, 17(9), 2022; https://doi.org/10.3390/s17092022 - 4 Sep 2017
Cited by 1029 | Viewed by 47547
Abstract
Plant diseases and pests are a major challenge in the agriculture sector. Accurate and fast detection of diseases and pests in plants could help to develop early treatment techniques while substantially reducing economic losses. Recent developments in Deep Neural Networks have allowed researchers to drastically improve the accuracy of object detection and recognition systems. In this paper, we present a deep-learning-based approach to detect diseases and pests in tomato plants using images captured in-place by camera devices with various resolutions. Our goal is to find the most suitable deep-learning architecture for our task. Therefore, we consider three main families of detectors: Faster Region-based Convolutional Neural Network (Faster R-CNN), Region-based Fully Convolutional Network (R-FCN), and Single Shot Multibox Detector (SSD), which for the purpose of this work are called “deep learning meta-architectures”. We combine each of these meta-architectures with “deep feature extractors” such as VGG net and Residual Network (ResNet). We demonstrate the performance of deep meta-architectures and feature extractors, and additionally propose a method for local and global class annotation and data augmentation to increase the accuracy and reduce the number of false positives during training. We train and test our systems end-to-end on our large Tomato Diseases and Pests Dataset, which contains challenging images with diseases and pests, including several inter- and intra-class variations, such as infection status and location in the plant. Experimental results show that our proposed system can effectively recognize nine different types of diseases and pests, with the ability to deal with complex scenarios from a plant’s surrounding area. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
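Detectors of this kind output a class label plus a bounding box, and a prediction is usually counted as correct only if its box overlaps the annotation sufficiently. A minimal sketch of that intersection-over-union (IoU) test; the 0.5 threshold is the common convention, assumed here rather than taken from the paper:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def is_true_positive(pred_cls, pred_box, gt_cls, gt_box, thr=0.5):
    """A detection counts only when class (what) and location (where) both match."""
    return pred_cls == gt_cls and iou(pred_box, gt_box) >= thr
```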
Figure 1
<p>Recognition and localization of plant diseases and pests: problem formulation. Our system aims to detect both class (what) and location (where) of the affected areas in the image.</p>
Figure 2
<p>Flow chart of the deep meta-architecture approach used in this work. Our system proposes to treat a deep meta-architecture as an open system on which different feature extractors can be adapted to perform on our task. The system is trained and tested end-to-end using images captured in-place. The outputs are the class and localization of the infected area in the image.</p>
Figure 3
<p>A representation of diseases and pests that affect tomato plants. (<b>a</b>) Gray mold, (<b>b</b>) Canker, (<b>c</b>) Leaf mold, (<b>d</b>) Plague, (<b>e</b>) Leaf miner, (<b>f</b>) Whitefly, (<b>g</b>) Low temperature, (<b>h</b>) Nutritional excess or deficiency, (<b>i</b>) Powdery mildew. The images are collected under different variations and environmental conditions. The patterns help to distinguish some proper characteristics of each disease and pest.</p>
Figure 4
<p>System overview of the proposed deep-learning-based approach for plant diseases and pest recognition. Our deep meta-architecture approach consists of several steps that use input images as a source of information, and provide detection results in terms of class and location of the infected area of the plant in the image.</p>
Figure 5
<p>Training loss curve of our proposed approach. The comparison includes results with and without data augmentation. Our network efficiently learns the data while achieving a lower error rate at about one hundred thousand iterations.</p>
Figure 6
<p>Detection results of diseases and pests that affect tomato plants with Faster R-CNN and a VGG-16 detector. From left to right: the input image, annotated image, and predicted results. (<b>a</b>) Gray mold; (<b>b</b>) Canker; (<b>c</b>) Leaf mold; (<b>d</b>) Plague; (<b>e</b>) Leaf miner; (<b>f</b>) Whitefly; (<b>g</b>) Low temperature; (<b>h</b>) Nutritional excess or deficiency; (<b>i</b>) Powdery mildew.</p>
Figure 7
<p>Deep feature maps visualization of diseases and pest (<b>a</b>) Canker; (<b>b</b>) Gray mold; (<b>c</b>) Leaf mold; (<b>d</b>) Low temperature; (<b>e</b>) Miner; (<b>f</b>) Nutritional excess; (<b>g</b>) Plague; (<b>h</b>) Powdery mildew; (<b>i</b>) Whitefly. Each feature map illustrates how our neural network system interprets a disease in the context after being classified by a SoftMax function.</p>
Figure 8
<p>Detection results of inter- and intra-class variation of diseases and pests in the images. (<b>a</b>) Two classes affecting the same sample (powdery mildew and pest); (<b>b</b>) Three classes in the same sample (Gray mold, low temperature, and miners); (<b>c</b>) Leaf mold affecting the back side of the leaf; (<b>d</b>) Leaf mold affecting the front side of the leaf; (<b>e</b>) Gray mold in the early stage; (<b>f</b>) Gray mold in the last stage; (<b>g</b>) Plague can be also detected on other parts of the plant, such as fruits or stem; (<b>h</b>) Plague affecting the tomato production.</p>
Figure 9
<p>Confusion matrix of the Tomato Diseases and Pests detection results (including background, which is a transversal class containing healthy parts of the plants and surrounding areas, such as part of the greenhouse).</p>
Figure 10
<p>A representation of failure cases. (<b>a</b>) The intra-class variation makes the recognition harder and results in a low recognition rate. (<b>b</b>) Misdetected class due to confusion at earlier infection status (e.g., the real class is leaf mold, but the system recognizes it as canker).</p>
Article
Novelty Detection Classifiers in Weed Mapping: Silybum marianum Detection on UAV Multispectral Images
by Thomas K. Alexandridis, Afroditi Alexandra Tamouridou, Xanthoula Eirini Pantazi, Anastasia L. Lagopodi, Javid Kashefi, Georgios Ovakoglou, Vassilios Polychronos and Dimitrios Moshou
Sensors 2017, 17(9), 2007; https://doi.org/10.3390/s17092007 - 1 Sep 2017
Cited by 36 | Viewed by 6130
Abstract
In the present study, the detection and mapping of Silybum marianum (L.) Gaertn. weed using novelty detection classifiers is reported. A multispectral camera (green-red-NIR) on board a fixed-wing unmanned aerial vehicle (UAV) was employed for obtaining high-resolution images. Four novelty detection classifiers were used to identify S. marianum among other vegetation in a field. The classifiers were One Class Support Vector Machine (OC-SVM), One Class Self-Organizing Maps (OC-SOM), Autoencoders and One Class Principal Component Analysis (OC-PCA). The three spectral bands and texture were used as input features to the novelty detection classifiers. S. marianum identification using OC-SVM reached an overall accuracy of 96%. The results show the feasibility of effective S. marianum mapping by means of novelty detection classifiers acting on multispectral UAV imagery. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
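Novelty detection trains on the target class only (here S. marianum) and flags anything dissimilar. Of the four classifiers listed, OC-PCA is the simplest to sketch: keep the principal subspace of the target data and reject samples whose reconstruction error exceeds a quantile threshold. The dimensions, quantile, and synthetic data below are illustrative, not those of the study.

```python
import numpy as np

class OneClassPCA:
    """Novelty detector: small PCA reconstruction error -> target class."""
    def __init__(self, n_components=2, quantile=0.99):
        self.k, self.q = n_components, quantile

    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        _, _, vt = np.linalg.svd(X - self.mean_, full_matrices=False)
        self.components_ = vt[:self.k]
        self.threshold_ = np.quantile(self._error(X), self.q)
        return self

    def _error(self, X):
        Xc = X - self.mean_
        recon = Xc @ self.components_.T @ self.components_
        return np.linalg.norm(Xc - recon, axis=1)

    def predict(self, X):
        """True -> target class, False -> novelty (other vegetation)."""
        return self._error(X) <= self.threshold_

# Target samples lie close to a 2-D subspace of a 3-D feature space
# (e.g., three spectral bands); off-subspace samples are novelties.
rng = np.random.default_rng(1)
target = np.column_stack([rng.normal(0, 1, 200),
                          rng.normal(0, 1, 200),
                          rng.normal(0, 0.01, 200)])
model = OneClassPCA().fit(target)
```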
Figure 1
<p>Study area, UAV orthomosaic, focus area, and field surveyed locations.</p>
Figure 2
<p>Spectra corresponding to each vegetation type.</p>
Figure 3
<p>The influence of the RBF spreading parameter on the behaviour of the one class SVM [<a href="#B20-sensors-17-02007" class="html-bibr">20</a>].</p>
Figure 4
<p>Target dataset regions, Voronoi polygons and the threshold perimeter for OC-SOM. Target data defined by the threshold lie inside the grey border line.</p>
Figure 5
<p><span class="html-italic">Silybum marianum</span> coverage based on the predictions of the four novelty classifiers in the study area (<b>a</b>) and in the focus area (<b>b</b>). <span class="html-italic">S. marianum</span> is shown in green and other vegetation in yellow.</p>
Article
Discrimination of Transgenic Maize Kernel Using NIR Hyperspectral Imaging and Multivariate Data Analysis
by Xuping Feng, Yiying Zhao, Chu Zhang, Peng Cheng and Yong He
Sensors 2017, 17(8), 1894; https://doi.org/10.3390/s17081894 - 17 Aug 2017
Cited by 59 | Viewed by 5834
Abstract
There are possible environmental risks related to gene flow from genetically engineered organisms. It is important to find accurate, fast, and inexpensive methods to detect and monitor the presence of genetically modified (GM) organisms in crops and derived crop products. In the present study, GM maize kernels containing both cry1Ab/cry2Aj-G10evo proteins and their non-GM parents were examined by using hyperspectral imaging in the near-infrared (NIR) range (874.41–1733.91 nm) combined with chemometric data analysis. The hypercube data were analyzed by applying principal component analysis (PCA) for exploratory purposes, and support vector machine (SVM) and partial least squares discriminant analysis (PLS–DA) were used to build discriminant models separating the GM maize kernels from their non-GM counterparts. The results indicate that clear differences between GM and non-GM maize kernels can be easily visualized with the nondestructive determination method developed in this study, and excellent classification could be achieved, with calibration and prediction accuracy of almost 100%. This study also demonstrates that SVM and PLS–DA models can obtain good performance with 54 wavelengths selected by the competitive adaptive reweighted sampling (CARS) method, making the classification processing more rapid for online application. Finally, GM maize kernels were visually identified on the prediction maps by predicting the features of each pixel on individual hyperspectral images. It was concluded that hyperspectral imaging together with chemometric data analysis is a promising technique to identify GM maize kernels, since it overcomes some disadvantages of traditional analytical methods, such as complex and monotonous sampling. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
Figure 1
<p>Structure of the plant expression vector containing coding regions of the <span class="html-italic">cry1Ab/cry2Aj-G10evo</span> genes. LB is left border; RB is the right border; poly A is a terminator; PEPC is a terminator; 35S is a promoter; Ubi is a promoter; EPSPS denotes the herbicide-resistant genes; BT denotes the insect-resistant genes; EPSPS and BT are marked with a red triangle.</p>
Figure 2
<p>Flowchart of image processing and data analysis for discrimination of genetically-modified (GM) maize kernels. CARS: competitive adaptive reweighted sampling; PLS-DA: partial least squares discriminant analysis; ROI: region of interest; SVM: support vector machine.</p>
Figure 3
<p>The profiles for raw spectra (<b>A</b>) and average spectra reflectance (<b>B</b>) from the near-infrared (NIR) multispectral images of GM and non-GM maize kernels.</p>
Figure 4
<p>Three-dimensional principal component analysis (PCA) scores scatter plot of the first three principle components (PCs) for the GM and non-GM maize kernels with raw spectra (<b>A</b>) after pre-processing and (<b>B</b>) main peaks of PC3 loadings that are indicative of the differences between GM and non-GM maize kernels.</p>
Figure 5
<p>Score images of the first three principal components (<b>PC1</b>–<b>PC3</b>) of the images of maize kernels.</p>
Figure 6
<p>Selected optimal wavelengths by competitive adaptive reweighted sampling (CARS). (<b>A</b>) The number of variables as the function of iterations; (<b>B</b>) ten-fold root mean squared errors (RMSECV) values; and (<b>C</b>) regression coefficients of each variable with the number of sampling runs. Each line denotes the path of a variable.</p>
Figure 7
<p>The distribution of optimal wavelengths selected by CARS.</p>
Figure 8
<p>Visualization of independent maize kernels by the SVM classification process using optimal wavelengths in the hyperspectral images. Green denotes non-GM maize kernels, and red identifies GM maize kernels.</p>
Communication
A True-Color Sensor and Suitable Evaluation Algorithm for Plant Recognition
by Oliver Schmittmann and Peter Schulze Lammers
Sensors 2017, 17(8), 1823; https://doi.org/10.3390/s17081823 - 8 Aug 2017
Cited by 26 | Viewed by 8071
Abstract
Plant-specific herbicide application requires sensor systems for plant recognition and differentiation. A literature review reveals a lack of sensor systems capable of recognizing small weeds in early stages of development (in the two- or four-leaf stage) and crop plants, of making spraying decisions in real time, and that are, in addition, inexpensive and ready for practical use in sprayers. The system described in this work is based on freely cascadable and programmable true-color sensors for real-time recognition and identification of individual weed and crop plants. This type of sensor is suitable for the site-specific application of herbicides on municipal areas and on farmland with and without crops. Initially, databases with the reflection properties of plants and of natural and artificial backgrounds were created. Crop and weed plants are recognized by the use of mathematical algorithms and decision models based on these data, which include the characteristic color spectrum as well as the reflectance characteristics of unvegetated areas and areas with organic material. The CIE-Lab color space was chosen for color matching because it contains information not only about coloration (a- and b-channels), but also about luminance (L-channel), thus increasing accuracy. Four decision-making algorithms based on different parameters are explained: (i) color similarity (ΔE); (ii) color similarity split into ΔL, Δa and Δb; (iii) a virtual channel ‘d’; and (iv) the statistical distribution of the differences in reflection between backgrounds and plants. Afterwards, the detection success of the recognition system is described, and the minimum weed/plant coverage of the measuring spot is calculated by a mathematical model. Plants with a size of 1–5% of the spot can be recognized, and weeds in the two-leaf stage can be identified with a measuring spot size of 5 cm. By choosing a suitable decision model in advance, the detection quality can be increased; depending on the characteristics of the background, different models are suitable. Finally, the results of field trials on municipal areas (with models of plants), winter wheat fields (with artificial plants) and grassland (with dock) are shown. In each experimental variant, objects and weeds could be recognized. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
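Decision model (i) in the abstract reduces to a Euclidean distance in CIE-Lab (the CIE76 ΔE) compared against a background-specific threshold. A minimal sketch; the Lab values and the threshold in the example are made up, not measured values from the paper:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIE-Lab."""
    return math.dist(lab1, lab2)

def is_plant(spot_lab, background_lab, threshold):
    """Flag a measuring spot as vegetation when its colour deviates
    from the stored background reference by more than the threshold."""
    return delta_e(spot_lab, background_lab) > threshold

# Example: a green spot measured against gray paving stones.
green, paving = (45.0, -40.0, 40.0), (55.0, 0.0, 2.0)
```

Model (ii) applies separate thresholds to the ΔL, Δa and Δb components instead of the combined distance, which lets the threshold be tuned per channel for difficult backgrounds.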
Figure 1
<p>Nineteen-diode hexagon sensor IC with three segment photodiodes for color detection.</p>
Figure 2
<p>Design of the sensor-lens unit prepared for use in the experiments. Components: 1. True-color sensor with PC interface (RS232), power supply, switching outputs and two optical fibers (emitter and receiver); 2. Lens (double concave, diameter 5 cm, angle 22.5°, connector for optical fiber); 3. Space for valve and nozzle.</p>
Figure 3
<p>Representation of backgrounds. (<b>a</b>) Gray paving stones; (<b>b</b>) Stone steps; (<b>c</b>) Asphalt; (<b>d</b>) Fine chippings; (<b>e</b>) Gravel; (<b>f</b>) Red paving stones; (<b>g</b>) Sandy path; (<b>h</b>) Arable land with shadows; (<b>i</b>) Grassland; (<b>j</b>) Grassland with dew.</p>
Figure 4
<p>Stationary carrier for dynamic sensor tests.</p>
Figure 5
<p>Concept (<b>Left</b>) and prototype (<b>Right</b>) of the mobile field carrier with spraying device.</p>
Figure 6
<p>Reflection of paved ground with different colored cards, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.</p>
Figure 7
<p>Reflection of a winter wheat field with different colored cards, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.</p>
Figure 8
<p>Reflection of grassland with dock, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.</p>
Figure 9
<p>Signal distribution of backgrounds for Channels L, a and b. (<b>a</b>) Asphalt; (<b>b</b>) Fine chippings; (<b>c</b>) Red paving stones; (<b>d</b>) Stone steps; (<b>e</b>) Gray paving stones; (<b>f</b>) Gravel; (<b>g</b>) Sandy path; (<b>h</b>) Arable land; (<b>i</b>) Grazing land; (<b>j</b>) Grassland with dew.</p>
Article
Combining Multi-Agent Systems and Wireless Sensor Networks for Monitoring Crop Irrigation
by Gabriel Villarrubia, Juan F. De Paz, Daniel H. De La Iglesia and Javier Bajo
Sensors 2017, 17(8), 1775; https://doi.org/10.3390/s17081775 - 2 Aug 2017
Cited by 82 | Viewed by 10634
Abstract
Monitoring mechanisms that ensure efficient crop growth are essential on many farms, especially in certain areas of the planet where water is scarce. Most farmers must assume the high cost of the required equipment in order to be able to streamline natural resources on their farms. Considering that many farmers cannot afford to install this equipment, it is necessary to look for more effective solutions that would be cheaper to implement. The objective of this study is to build virtual organizations of agents that can communicate with each other while monitoring crops. A low-cost sensor architecture allows farmers to monitor and optimize the growth of their crops by streamlining the amount of resources the crops need at every moment. Since the hardware has limited processing and communication capabilities, our approach uses the PANGEA architecture to overcome this limitation. Specifically, we design a system that is capable of collecting heterogeneous information from its environment, using sensors for temperature, solar radiation, humidity, pH, moisture and wind. A major outcome of our approach is that our solution is able to merge heterogeneous data from sensors and produce a response adapted to the context. In order to validate the proposed system, we present a case study in which farmers are provided with a tool that allows them to monitor the condition of their crops on a TV screen using a low-cost device. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
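The fuzzy-logic step that turns sensor readings into an irrigation decision can be sketched with triangular membership functions and weighted-average defuzzification. The membership ranges, rules, and output minutes below are invented for illustration; the paper's actual rule base is provided by a human expert:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_minutes(temp_c, moisture_pct):
    """Two-rule Mamdani-style sketch with weighted-average defuzzification."""
    hot = tri(temp_c, 20.0, 35.0, 50.0)       # rule: "temperature is hot"
    dry = tri(moisture_pct, -1.0, 0.0, 40.0)  # rule: "soil is dry"
    weights = [hot, dry, 1.0]                 # 1.0 -> always-on baseline rule
    minutes = [30.0, 45.0, 5.0]
    return sum(w * m for w, m in zip(weights, minutes)) / sum(weights)
```

A cool, wet reading keeps only the baseline output, while hot and dry readings pull the watering time toward the longer rules; this mirrors the abstract's goal of streamlining water use to what the crop needs at every moment.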
Figure 1
<p>Agent organization flow chart.</p>
Figure 2
<p>Architecture of the sensor network.</p>
Figure 3
<p>OpenGarden Gateway Node functionality board [<a href="#B42-sensors-17-01775" class="html-bibr">42</a>].</p>
Figure 4
<p>OpenGarden Slave Node functionality board [<a href="#B42-sensors-17-01775" class="html-bibr">42</a>].</p>
Figure 5
<p>Look and feel of the devices.</p>
Figure 6
<p>Process workflow: the main loop introduces a time delay in the whole process. Once the sensors are calibrated and measures obtained, the moisture level is checked. If the humidity crosses a given threshold, data is sent to the server. Otherwise, the input values of the sensors are sent to the fuzzy logic algorithm and the irrigation process is triggered.</p>
Figure 7
<p>Rules provided by a human expert.</p>
Figure 8
<p>Temperature inference rule.</p>
Figure 9
<p>Solar radiation inference rule.</p>
Figure 10
<p>Soil inference rule.</p>
Figure 11
<p>Irrigation time estimate.</p>
Figure 12
<p>Inputs/output.</p>
Figure 13
<p>Output simulation.</p>
Figure 14
<p>Relation between temperature and moisture.</p>
Figure 15
<p>Relation between solar radiation and moisture.</p>
Figure 16
<p>Scheme of the platform.</p>
Figure 17
<p>Message structure.</p>
Figure 18
<p>Remote control platform of the irrigation system.</p>
Figure 19
<p>Smart TV application that shows the weather forecast.</p>
Figure 20
<p>User interface screen capture with information from the different sensors displayed on the TV platform.</p>
Figure 21
<p>Relationship between temperature and solar radiation at different times of day.</p>
Figure 22
<p>Relationship between the watering time and temperature.</p>
Figure 23
<p>Relationship between moisture and watering time.</p>
Figure 24
<p>Total water consumption in the system using the fuzzy logic system.</p>
Figure 25
<p>Commercial device used for comparison with the conducted case study.</p>
Article
Phenoliner: A New Field Phenotyping Platform for Grapevine Research
by Anna Kicherer, Katja Herzog, Nele Bendel, Hans-Christian Klück, Andreas Backhaus, Markus Wieland, Johann Christian Rose, Lasse Klingbeil, Thomas Läbe, Christian Hohl, Willi Petry, Heiner Kuhlmann, Udo Seiffert and Reinhard Töpfer
Sensors 2017, 17(7), 1625; https://doi.org/10.3390/s17071625 - 14 Jul 2017
Cited by 45 | Viewed by 9073
Abstract
In grapevine research the acquisition of phenotypic data is largely restricted to the field due to the plant’s perennial nature and size. The methodologies used to assess morphological traits and phenology are mainly limited to visual scoring. Some measurements for biotic and abiotic stress, as well as for quality assessments, are done by invasive measures. The newly evolving sensor technologies provide the opportunity to perform non-destructive evaluations of phenotypic traits using different field phenotyping platforms. One of the biggest technical challenges for field phenotyping of grapevines is the varying light conditions and the background. In the present study the Phenoliner is presented, which represents a novel type of robust field phenotyping platform. The vehicle is based on a grape harvester following the concept of a moveable tunnel. The tunnel is equipped with different sensor systems (an RGB and NIR camera system, a hyperspectral camera, RTK-GPS and an orientation sensor) and an artificial broadband light source. It is independent of external light conditions, and in combination with an artificial background, the Phenoliner enables the standardised acquisition of high-quality, geo-referenced sensor data. Full article
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>Overview of the Phenoliner. (<b>a</b>) Phenoliner construction plan. (<b>b</b>) Scheme of the sensor layout in the tunnel (marked red above); right: Sensor A, consisting of RGB cameras 1–3 and 5, and a NIR camera 4; left: Sensor B, consisting of two hyperspectral cameras; (<b>c</b>) Phenoliner in the vine row.</p>
Figure 2
<p>Tasks within IGG (Institute of Geodesy and Geoinformation) Geotagger 2.0. GNSS: Global Navigation Satellite System; Plant ID: Plant identification; EXIF: Exchangeable image file format.</p>
Figure 3
<p>(<b>a</b>) Terrestrial laser scan of the Phenoliner to measure the lever arm between the GPS antenna and the camera system. (<b>b</b>) Evaluation measurement to test the georeferencing accuracy of the system. The coordinates of the black and white targets at the poles are known and can be compared with the target positions seen in the images.</p>
Figure 4
<p>Experimental setup of POIs at the positions of vine stems. (<b>a</b>) Image selected as the one closest to a POI, with the deviation e as the error of the selection process. (<b>b</b>) Distribution of deviations from 10 runs at different speeds.</p>
Figure 5
<p>(<b>a</b>) Basic principle of the system setup for 3D reconstruction of the full vine row. The Phenoliner's tunnel is driven over the vine rows. The MCS is oriented parallel to the rows, capturing images automatically while in motion. (<b>b</b>) Reconstructed point cloud of black (left side) and green (right side) grape varieties. The upper images show the rows from a frontal view and the lower images show grapes from a profile view. Single berries and the elevations of their spherical geometry are clearly distinguishable.</p>
Figure 6
<p>(<b>a</b>) Rectified image acquired with Sensor A (camera (5), only the overlapping part of the stereo pair shown). (<b>b</b>) Depth map calculated with the stereo pair from camera (3) and camera (5). The brightness indicates the distance to the cameras (white for near points, dark gray for far points). Black pixels indicate positions with no depth, which can be assumed to be background. (<b>c</b>) First result of a classification test with manually set thresholds. The RGB image from (<b>a</b>) and the depth map from (<b>b</b>) are used as input. Classes are: blue for “grapes”, green for “canopy”, brown for “cane”, black for “background”.</p>
Figure 7
<p>Pre-processing of a hyperspectral image recording: (<b>a</b>) channel image at 1100 nm, (<b>b</b>) clustering of spectral data, colour indicates pixel groups, (<b>c</b>) classification of foreground (leaves) vs. background, (<b>d</b>) image after removal of dark areas and marking of the vine position from GPS.</p>
Figure 8
<p>Normalized reflectance spectra for the VNIR 400–1000 nm (<b>a</b>) and SWIR 1000–2500 nm (<b>b</b>) ranges, imaged at 2 p.m. and pooled over the west and east sides.</p>
Figure 9
<p>Classification accuracy for the differentiation of sprayed vs. non-sprayed leaves measured from (<b>a</b>) west and east; (<b>b</b>) only the west and (<b>c</b>) only the east side. Across the results, the spectral reflectance data of the VIS-NIR range seem to be the more robust predictor of spray status. The machine learning and LDA approaches are compared.</p>
Figure 10
<p>Relevance profile of the visual-near infrared range based on RBF performance at different recording times.</p>
Figure 11
<p>High-intensity reflections result in wrong point positions at object borders (left image). At berry arches, they may corrupt the spherical geometry of the berries during reconstruction.</p>
2479 KiB  
Article
Leaf Area Index Estimation Using Chinese GF-1 Wide Field View Data in an Agriculture Region
by Xiangqin Wei, Xingfa Gu, Qingyan Meng, Tao Yu, Xiang Zhou, Zheng Wei, Kun Jia and Chunmei Wang
Sensors 2017, 17(7), 1593; https://doi.org/10.3390/s17071593 - 8 Jul 2017
Cited by 13 | Viewed by 4872
Abstract
Leaf area index (LAI) is an important vegetation parameter that characterizes leaf density and canopy structure, and plays an important role in global change studies, land surface process simulation and agriculture monitoring. The wide field view (WFV) sensor on board the Chinese GF-1 satellite can acquire multi-spectral data with decametric spatial resolution, high temporal resolution and wide coverage, which are valuable data sources for the dynamic monitoring of LAI. Therefore, an automatic LAI estimation algorithm for GF-1 WFV data was developed based on a radiative transfer model, and its estimation accuracy was assessed in an agricultural region with maize as the dominant crop type. The radiative transfer model was first used to simulate the physical relationship between canopy reflectance and LAI under different soil and vegetation conditions, forming a training sample dataset. Then, neural networks (NNs) were trained on this dataset to develop the LAI estimation algorithm. The green, red and near-infrared band reflectances of GF-1 WFV data were used as the input variables of the NNs, and the corresponding LAI was the output variable. Validation against field LAI measurements in the agricultural region indicated that the algorithm achieved satisfactory results (R2 = 0.818, RMSE = 0.50). In addition, the developed LAI estimation algorithm has the potential to operationally generate LAI datasets from GF-1 WFV land surface reflectance data, which could provide high spatial and temporal resolution LAI data for agriculture, ecosystem and environmental management research.
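The core of this approach, a back-propagation network mapping three band reflectances to LAI, can be sketched with a one-hidden-layer network trained by gradient descent. The training data below are synthetic stand-ins (an arbitrary linear rule, not PROSAIL simulations), and the layer sizes and learning rate are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the PROSAIL-simulated training set: green, red, NIR reflectance
# mapped to LAI by an arbitrary synthetic rule (NOT the real radiative transfer).
X = rng.uniform(0.02, 0.6, size=(500, 3))
y = (2.0 * X[:, 2] - 1.5 * X[:, 1] + 0.5 * X[:, 0] + 1.0).reshape(-1, 1)

# One-hidden-layer back-propagation network, in the spirit of the paper's BPNN.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # linear output = LAI estimate

losses, lr = [], 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Back-propagate the mean-squared-error gradient through both layers
    gW2 = h.T @ err / len(X); gb2 = err.mean(0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With real PROSAIL simulations as training pairs, the same structure produces a per-pixel LAI estimator that needs only the three band reflectances at prediction time.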
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>The square region in the left image shows the geo-location of the Shenzhou study area in Hebei Province; the right image is the GF-1 WFV data acquired on 15 August 2014. Generally, in the right image the red patches are farmland and the blue patches are non-vegetated regions, such as residential areas and roads. The green triangles indicate the field survey locations.</p>
Figure 2
<p>Flowchart of the leaf area index (LAI) estimation algorithm for GF-1 WFV data.</p>
Figure 3
<p>The 13 soil reflectances used to represent the possible range of spectral shapes for the PROSAIL model.</p>
Figure 4
<p>The architecture of the back propagation neural networks (BPNNs) used for LAI estimation from GF-1 WFV reflectance data.</p>
Figure 5
<p>GF-1 WFV land surface reflectance data ((<b>a</b>) 27 June 2014, (<b>b</b>) 18 July 2014, (<b>c</b>) 15 August 2014, (<b>d</b>) 24 August 2014, and (<b>e</b>) 18 September 2014) and the corresponding LAI estimates ((<b>f</b>) 27 June 2014, (<b>g</b>) 18 July 2014, (<b>h</b>) 15 August 2014, (<b>i</b>) 24 August 2014, and (<b>j</b>) 18 September 2014).</p>
Figure 6
<p>Scatter plots of field survey LAI versus LAI predicted from GF-1 WFV data.</p>
8673 KiB  
Article
Optical Sensing to Determine Tomato Plant Spacing for Precise Agrochemical Application: Two Scenarios
by Jorge Martínez-Guanter, Miguel Garrido-Izard, Constantino Valero, David C. Slaughter and Manuel Pérez-Ruiz
Sensors 2017, 17(5), 1096; https://doi.org/10.3390/s17051096 - 11 May 2017
Cited by 13 | Viewed by 6405
Abstract
The feasibility of automated individual crop plant care in vegetable crop fields has increased, resulting in improved efficiency and economic benefits. A systems-based approach is a key feature in the engineering design of mechanization that incorporates precision sensing techniques. The objective of this study was to design new sensing capabilities to measure crop plant spacing under different test conditions (California, USA and Andalucía, Spain). Three different types of optical sensors were used: an optical light-beam sensor (880 nm), a Light Detection and Ranging (LiDAR) sensor (905 nm), and an RGB camera. Field trials were conducted on newly transplanted tomato plants, using an encoder as a local reference system. Test results achieved 98% detection accuracy using the light-beam sensors, while 96% plant detection accuracy was achieved in the best replication using LiDAR. These results can inform decision-making by machinery manufacturers regarding the use of these sensors, and could advance physical and chemical weed control in row crops, allowing significant reductions in, or even the elimination of, hand-weeding tasks.
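The basic processing step behind beam-based plant spacing, grouping consecutive beam interruptions along the encoder axis into plant detections, can be sketched as follows. The gap threshold and the averaging rule are hypothetical simplifications, not the paper's exact algorithm:

```python
def plants_from_beam_signal(positions_cm, beam_blocked, gap_cm=10.0):
    """Group consecutive beam interruptions into plant detections and
    return each plant's centre position along the row.
    positions_cm: encoder distance at each sample; beam_blocked: bools.
    gap_cm is a hypothetical minimum clear gap separating two plants."""
    plants, current = [], []
    for pos, blocked in zip(positions_cm, beam_blocked):
        if blocked:
            # A long clear stretch since the last blocked sample closes a plant
            if current and pos - current[-1] > gap_cm:
                plants.append(sum(current) / len(current))
                current = []
            current.append(pos)
    if current:
        plants.append(sum(current) / len(current))
    return plants

# Two simulated plants roughly 30 cm apart on the encoder axis
positions = [0, 1, 2, 3, 30, 31, 32, 60]
blocked = [True, True, False, False, True, True, True, False]
centres = plants_from_beam_signal(positions, blocked)
spacings = [b - a for a, b in zip(centres, centres[1:])]
print(centres, spacings)  # [0.5, 31.0] [30.5]
```

The same grouping logic applies whether the detections come from light-beam interruptions or from clustered LiDAR returns; only the per-sample detection rule changes.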
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>Details of the sensors on the laboratory platform (vertical LiDAR and light-beam sensors) for the detection and structure of the modular 3D plant.</p>
Figure 2
<p>(<b>a</b>) Light-beam sensors mounted on the experimental platform designed for field trials at UC Davis, California; (<b>b</b>) Progressive monitoring flowchart using light-beam sensors.</p>
Figure 3
<p>(<b>a</b>) Structure housing the sensors mounted on the tractor (left) and detail of the LiDAR and Kinect setup (right) for field trials at the University of Seville; (<b>b</b>) Progressive monitoring flowchart using LiDAR and Kinect sensors.</p>
Figure 4
<p>(<b>a</b>) Raw image captured by the Kinect sensor with the ROI indicated; (<b>b</b>) ROI with white balance and saturation adjustment; (<b>c</b>) Resulting OBIA image with isolated plant pixels.</p>
Figure 5
<p>(<b>a</b>) Stem detection of the 11 artificial plants by the lower pair of light-beam sensors. (<b>b</b>) Detection of the aerial part (leaves and branches) of the 11 artificial plants by the centre pair of light-beam sensors.</p>
Figure 6
<p>(<b>a</b>) Positions of the stems of the 11 plants in the laboratory test; (<b>b</b>) Average plant position for a distance of 11 mm.</p>
Figure 7
<p>Histogram of measured distances between 3D plant stems while adjusting the sensors on the detection platform in the laboratory.</p>
Figure 8
<p>Point cloud representation of artificial 3D plants obtained using a lateral-scanning LiDAR sensor during laboratory tests.</p>
Figure 9
<p>(<b>a</b>) Detections and positions of the 41 tomato plants in the first field test with IR sensors; and (<b>b</b>) detail of the positions of potential plants.</p>
Figure 10
<p>Detection results on three tested lines. Green dotted lines represent the cluster centres and black dotted lines the real plant intervals obtained from the Kinect image (location ± Std.). (<b>a</b>) 19 stick detections using LiDAR; (<b>b</b>) 51 plants detected during Test 1 in row 1.</p>
Figure 11
<p>Plant location method based on the intersection of the stem line and ground line. (<b>a</b>) Intersection point between the average aerial-part line of the plant (green line) and the average ground line (red line); (<b>b</b>) Intersection point between the stick aerial-part line (green line) and the ground line (red line). The histograms of the aerial-part points are shown at the bottom.</p>
Figure 12
<p>Plant and stick location results obtained by the three different methods: centre of the cluster; lowest point; and ground-plant intersection. (<b>a</b>) Stick row data using LiDAR; (<b>b</b>) Tomato plants detected during Test 1 on row 1; (<b>c</b>) Detail of the aerial point cloud of tomato plants generated by the LiDAR during Test 1 on row 1. Each plant location method is marked with a different coloured dotted line.</p>
3648 KiB  
Article
Spectroscopic Diagnosis of Arsenic Contamination in Agricultural Soils
by Tiezhu Shi, Huizeng Liu, Yiyun Chen, Teng Fei, Junjie Wang and Guofeng Wu
Sensors 2017, 17(5), 1036; https://doi.org/10.3390/s17051036 - 4 May 2017
Cited by 24 | Viewed by 4766
Abstract
This study investigated the abilities of pre-processing, feature selection and machine-learning methods for the spectroscopic diagnosis of soil arsenic contamination. The spectral data were pre-processed using Savitzky-Golay smoothing, first and second derivatives, multiplicative scatter correction, standard normal variate, and mean centering. Principal component analysis (PCA) and the RELIEF algorithm were used to extract spectral features. Machine-learning methods, including random forests (RF), artificial neural networks (ANN), and radial basis function- and linear function-based support vector machines (RBF- and LF-SVM), were employed to establish diagnosis models. The model accuracies were evaluated and compared using overall accuracies (OAs). The statistical significance of the difference between models was evaluated using McNemar’s test (Z value). The results showed that the OAs varied with the different combinations of pre-processing, feature selection, and classification methods. Feature selection methods could improve the modeling efficiencies and diagnosis accuracies, and RELIEF often outperformed PCA. The optimal models established by RF (OA = 86%), ANN (OA = 89%), RBF- (OA = 89%) and LF-SVM (OA = 87%) showed no statistically significant differences in diagnosis accuracy (Z < 1.96, p > 0.05). These results indicate that it is feasible to diagnose soil arsenic contamination using reflectance spectroscopy, and that an appropriate combination of multivariate methods is important to improve diagnosis accuracy.
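The model comparison above rests on McNemar's test, which only needs the two disagreement counts between a pair of classifiers. A minimal sketch with the usual continuity correction (the counts below are hypothetical, not the study's data):

```python
from math import sqrt

def mcnemar_z(n01, n10):
    """McNemar's test with continuity correction.
    n01: samples classifier A got right and classifier B got wrong;
    n10: the reverse. |Z| < 1.96 means the two classifiers' accuracies
    do not differ significantly at the 5% level."""
    return (abs(n01 - n10) - 1) / sqrt(n01 + n10)

# Hypothetical disagreement counts for two diagnosis models
z = mcnemar_z(n01=7, n10=5)
print(round(z, 3))  # 0.289 -> well below 1.96, no significant difference
```

Because the test conditions on the samples where the two models disagree, it is well suited to comparing classifiers evaluated on the same validation set, as done here.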
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>Study areas (<b>a</b>) and spatial distribution of soil samples in Yixing (<b>b</b>) and Zhongxiang (<b>c</b>).</p>
Figure 2
<p>The reflectance spectra and the first three principal components (PC1, PC2 and PC3) for the contaminated and uncontaminated soil samples: (<b>a</b>) original reflectance spectra, (<b>b</b>) mean centering spectra, (<b>c</b>) standard normal variate spectra, (<b>d</b>) multiplicative scatter correction spectra, (<b>e</b>) first derivative spectra, and (<b>f</b>) second derivative spectra.</p>
Figure 3
<p>RELIEF weights and the selected spectral features for the original reflectance spectra (<b>a</b>), mean centering spectra (<b>b</b>), standard normal variate spectra (<b>c</b>), multiplicative scatter correction spectra (<b>d</b>), first derivative spectra (<b>e</b>), and second derivative spectra (<b>f</b>). The threshold of the RELIEF weight was set to 0 (horizontal dashed lines).</p>
Figure 4
<p>Mean decrease Gini values for the RELIEF-selected spectral features.</p>
Figure 5
<p>Values of samples predicted using: (<b>a</b>) second derivative spectra (second), RELIEF and random forests; (<b>b</b>) first derivative spectra (first), principal component analysis and an artificial neural network; (<b>c</b>) second, RELIEF and a radial basis function-based support vector machine (SVM); and (<b>d</b>) first, RELIEF and a linear function-based SVM. A value of 1 indicates contaminated and 0 uncontaminated. The correctly-diagnosed and misdiagnosed samples are displayed in the figures.</p>
12358 KiB  
Article
A Wireless Sensor Network for Growth Environment Measurement and Multi-Band Optical Sensing to Diagnose Tree Vigor
by Shinichi Kameoka, Shuhei Isoda, Atsushi Hashimoto, Ryoei Ito, Satoru Miyamoto, Genki Wada, Naoki Watanabe, Takashi Yamakami, Ken Suzuki and Takaharu Kameoka
Sensors 2017, 17(5), 966; https://doi.org/10.3390/s17050966 - 27 Apr 2017
Cited by 28 | Viewed by 6731
Abstract
We have been developing a guidance system that helps farmers cultivate crops using various phenological indices. As the sensing part of this system, we deployed a new Wireless Sensor Network (WSN). This system uses a 920 MHz radio based on the Wireless Smart Utility Network standard, which enables long-range wireless communication. In addition, the data acquired by the WSN were standardized for advanced web-service interoperability. Using these standardized data, we can create a web service that offers various phenological indices as secondary information to farmers in the field. We also established a field management system using thermal imaging, fluorescence and X-ray fluorescence methods, which enable nondestructive, chemical-free, simple, and rapid measurement of fruits and trees. Thermal images provide information about the transpiration of plants. The fluorescence sensor gives information such as the nitrate balance index (NBI), which reflects the nitrate balance inside the leaf, as well as chlorophyll, flavonol and anthocyanin contents. These methods allow one to quickly check the health of trees and find ways to improve the vigor of weak ones. Furthermore, the X-ray fluorescence sensor may allow quantification of the loss of minerals necessary for fruit growth.
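Once both leaves have been mapped onto a common template (as in Figures 17 and 18), comparing their thermal images reduces to per-pixel array arithmetic. The 3x3 grids and temperature values below are invented for illustration; real thermal images are far denser and would first need the template mapping described in the paper:

```python
import numpy as np

# Hypothetical thermal images (in degrees C) of a strong and a weak leaf,
# already mapped onto the same 3x3 template grid.
strong = np.array([[24.1, 24.3, 24.0],
                   [24.2, 24.4, 24.1],
                   [24.0, 24.2, 24.3]])
weak = np.array([[26.0, 26.2, 25.9],
                 [26.1, 26.3, 26.0],
                 [25.8, 26.1, 26.2]])

diff = weak - strong  # per-pixel temperature difference, as in Figure 18c
print(f"mean difference: {diff.mean():.2f} C, max: {diff.max():.2f} C")
```

A consistently warmer leaf is the kind of signal the authors relate to reduced transpiration in weak-vigor trees; the threshold at which a difference indicates weak vigor would have to be calibrated per orchard.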
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>Aluminum as the control point in the field experiment: (<b>a</b>) Tree with good vigor; (<b>b</b>) Tree with poor vigor; (<b>c</b>) Usage of the aluminum tape.</p>
Figure 2
<p>Cultivation condition of the orange trees.</p>
Figure 3
<p>Sampled leaves used in this study.</p>
Figure 4
<p>Sampled leaves: (<b>a</b>) Leaves of a strong tree; (<b>b</b>) Leaves of a weak tree.</p>
Figure 5
<p>Fluorescence sensor and the measured areas of the leaf: (<b>a</b>) Fluorescence sensor used in this study; (<b>b</b>) Measurement areas on each leaf.</p>
Figure 6
<p>Special measurement station for leaves: (<b>a</b>) Titanium coin used instead of iron; (<b>b</b>) Special measurement station for the leaves; (<b>c</b>) hXRF analyzer used in this study.</p>
Figure 7
<p>Structure of the WSN in the Tomi-no-oka vineyard: (<b>a</b>) Installation points of the sensors; (<b>b</b>) System configuration of the WSN.</p>
Figure 8
<p>Information about the SPPNet protocol used in this study.</p>
Figure 9
<p>The structure of the WSN used in this study.</p>
Figure 10
<p>Design of agricultural IoT as a service for farmers.</p>
Figure 11
<p>ER chart of the database structure.</p>
Figure 12
<p>Primary indices displayed as a list in the web service.</p>
Figure 13
<p>Data verification of solar duration: (<b>a</b>) Data acquired in January; (<b>b</b>) Data acquired in July.</p>
Figure 14
<p>Secondary index displayed in the web application. Since the application is written in Japanese, English annotations are given in red letters.</p>
Figure 15
<p>Line graph shown in the web application.</p>
Figure 16
<p>Transformation of leaves: (<b>a</b>) Transformed data; (<b>b</b>) Average of the bundle; (<b>c</b>) Inverse transformation.</p>
Figure 17
<p>Mapping of the leaf to the template: (<b>a</b>) Sample of the mapping; (<b>b</b>) Application of the mapping method to a real leaf.</p>
Figure 18
<p>Thermal images of the leaves: (<b>a</b>) Strong leaf; (<b>b</b>) Weak leaf; (<b>c</b>) Temperature difference between the strong leaf and the weak one.</p>
Figure 19
<p>Comparison of the temperature distribution between a weak leaf and a strong one.</p>
Figure 20
<p>Comparison of each index for various leaves: (<b>a</b>) Result for chlorophyll; (<b>b</b>) Result for flavonol; (<b>c</b>) Result for anthocyanin; (<b>d</b>) Result for NBI.</p>
Figure 21
<p>Correlation between ANTH and NBI: (<b>a</b>) Leaves of strong vigor; (<b>b</b>) Leaves of weak vigor.</p>
Figure 22
<p>Normalised X-ray fluorescence spectroscopic data of a strong leaf.</p>

Review


18 pages, 1560 KiB  
Review
Plant Pest Detection Using an Artificial Nose System: A Review
by Shaoqing Cui, Peter Ling, Heping Zhu and Harold M. Keener
Sensors 2018, 18(2), 378; https://doi.org/10.3390/s18020378 - 28 Jan 2018
Cited by 146 | Viewed by 18272
Abstract
This paper reviews artificial intelligent noses (or electronic noses) as a fast and noninvasive approach for the diagnosis of insects and diseases that attack vegetables and fruit trees. The particular focus is on bacterial, fungal, and viral infections, and insect damage. Volatile organic compounds (VOCs) emitted from plants, which provide functional information about the plant’s growth, defense, and health status, open the possibility of noninvasive monitoring of plant status. Electronic noses comprise a sensor array, signal conditioning circuitry, and pattern recognition algorithms. Compared with traditional gas chromatography–mass spectrometry (GC-MS) techniques, electronic noses are noninvasive and can be a rapid, cost-effective option for several applications. However, the use of electronic noses for plant pest diagnosis is still in its early stages, and there are challenges regarding sensor performance, sampling and detection in open areas, and scaling up measurements. This review introduces each element of an electronic nose system, especially the commonly used sensors and pattern recognition methods, along with their advantages and limitations. It includes a comprehensive comparison and summary of applications, possible challenges, and potential improvements of electronic nose systems for different plant pest diagnoses.
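The pattern-recognition stage of an E-nose typically starts with an unsupervised projection of the sensor-array responses, most commonly PCA. A minimal sketch follows; the four-sensor response matrix and the "healthy"/"infected" grouping are invented for illustration:

```python
import numpy as np

def pca_scores(X, k=2):
    """Project sensor-array responses onto their first k principal
    components via SVD. Rows = samples, columns = gas sensors."""
    Xc = X - X.mean(axis=0)                # centre each sensor channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                   # scores in the reduced space

# Hypothetical 4-sensor responses: two 'healthy' and two 'infected' samples
X = np.array([[1.0, 0.9, 0.2, 0.1],
              [1.1, 1.0, 0.1, 0.2],
              [0.2, 0.1, 1.0, 0.9],
              [0.1, 0.2, 0.9, 1.1]])
scores = pca_scores(X)
print(scores.shape)  # (4, 2)
```

In practice the PCA scores feed a classifier (LDA, ANN, SVM, etc.), which is where the supervised methods surveyed in the review come in; here the first component alone already separates the two invented groups.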
(This article belongs to the Special Issue Sensors in Agriculture)
Show Figures

Figure 1
<p>An E-nose system based on a QCM sensor array. MFC, mass flow control; DAQ, data acquisition.</p>
Figure 2
<p>Illustration of the VOC collection system for infected plants.</p>
Figure 3
<p>Typical pattern recognition methods applied in E-nose systems.</p>
Figure 4
<p>Applications of E-noses in plant disease detection.</p>