Analysis of the Functionality of the Feed Chain in Olive Pitting, Slicing and Stuffing Machines by IoT, Computer Vision and Neural Network Diagnosis
Figure 1. Example of the use of a region of interest (ROI) on an olive in a normal position. The red point is the origin of the pixel coordinate system. The green rectangle encloses the pocket (its walls and depth), bounded by the previous and the next pocket of the feed chain. The red rectangle is the ROI.
Figure 2. Example of the use of the ROI on an olive in a “boat” position, between [80°, 90°] and [−80°, −90°].
Figure 3. Selected ROI.
Figure 4. Punch needles in an olive pitting machine.
Figure 5. Magnetic sensor used to detect the passage of the pockets in the chain.
Figure 6. Electronic circuit of the external trigger and LED.
Figure 7. Electronic circuit of the PC with the CM1K chip.
Figure 8. Qt Creator application for IoT management of olive pitting, slicing and stuffing (DRR) machines.
Figure 9. The system implemented on a DRR machine.
Figure 10. PC-CM1K communication system.
Figure 11. IoT control system.
Figure 12. Confusion matrix using 10 × 10 pixel images.
Figure 13. Matrix of the results obtained using MATLAB (11 × 11 pixel resolution).
Figure 14. Matrix of the results obtained using MATLAB (16 × 16 pixel resolution).
Figure 15. Graphical user interface (GUI) created in MATLAB to analyze data from the DRR machines.
Figure 16. Settings dialogue box of the MATLAB GUI.
Figure 17. GUI of “boat” and double olives for DRR machine diagnosis.
Figure 18. Multiple accumulated values (boats, empty, doubles and small parts).
Figure 19. Instantaneous values for boats, doubles, small parts and empty.
Figure 20. Instantaneous values of speed and production.
Abstract
1. Introduction
2. Materials and Methods
2.1. Neural Network
2.2. Operation of the CM1K Chip
2.3. The MATLAB Neural Network Toolbox
2.3.1. Preliminary Tests: Maximum Resolution Available
- Establishing a region of interest (ROI) on the image.
- Testing different resolutions that still allow identification of the image at the lowest processing cost (a MATLAB sketch of this step follows the list).
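A minimal MATLAB sketch of these two steps, assuming an illustrative frame file name, ROI rectangle and set of candidate resolutions (none of these values come from the paper):

```matlab
% Crop the ROI from a captured frame and test several candidate resolutions.
% File name, ROI rectangle and resolutions below are illustrative only.
img  = imread('pocket_frame.png');        % hypothetical frame from the camera
gray = rgb2gray(img);                     % work on intensity values
roi  = [120 80 200 200];                  % assumed ROI as [x y width height] in pixels
crop = imcrop(gray, roi);                 % keep only the pocket region

for n = [10 11 16 32]                     % candidate resolutions (n x n pixels)
    small = imresize(crop, [n n]);        % downscale the ROI
    vec   = double(small(:))';            % flatten into a feature vector of n*n values
    fprintf('%2d x %2d -> %d-component vector\n', n, n, numel(vec));
end
```

The flattened vector length (n*n components) is what determines whether the pattern fits within the neuron vector size of the chips discussed later.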
2.3.2. Preliminary Tests: Minimum Resolution
- [0°, 10°] and [0°, −10°] (Normal)
- [10°, 20°] and [−10°, −20°] (Intermediate)
- [20°, 30°] and [−20°, −30°] (Intermediate)
- [30°, 40°] and [−30°, −40°] (Intermediate)
- [40°, 50°] and [−40°, −50°] (Intermediate)
- [50°, 60°] and [−50°, −60°] (Intermediate)
- [60°, 70°] and [−60°, −70°] (Intermediate)
- [70°, 80°] and [−70°, −80°] (Intermediate)
- [80°, 90°] and [−80°, −90°] (Boat); see the classification sketch after this list.
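A minimal MATLAB sketch of how an estimated olive angle maps onto these classes; the function name and the handling of the 80° boundary are assumptions for illustration only:

```matlab
% Hypothetical helper mapping an estimated olive angle (in degrees) to the
% position classes listed above; the intervals are symmetric in sign.
function label = classifyAngle(angleDeg)
    a = abs(angleDeg);
    if a <= 10
        label = "Normal";
    elseif a < 80
        label = "Intermediate";
    elseif a <= 90
        label = "Boat";
    else
        label = "Unknown";   % outside the expected [-90°, 90°] range
    end
end
```

For instance, classifyAngle(-85) returns "Boat", while classifyAngle(5) returns "Normal".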
2.4. Hardware Used in Image Capture
- Obtain images for deferred analysis with MATLAB and with the Intel Curie and CM1K neural chips, in order to evaluate the operation of the latter.
- Characterize, through IoT, the real-time operation of the pitting machines of an olive factory, which reach speeds of up to 2500 olives/min (the per-pocket time budget is sketched below).
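At the stated maximum of 2500 olives/min, the time available per pocket is easy to bound; a quick check in MATLAB (only the 2500 olives/min figure comes from the text, the rest is arithmetic):

```matlab
% Time budget per pocket at the maximum stated chain speed.
olivesPerMin = 2500;                       % maximum speed reported for the machines
pocketsPerSec = olivesPerMin / 60;         % ~41.7 pockets per second
msPerPocket   = 1000 / pocketsPerSec;      % ~24 ms to capture and classify each pocket
fprintf('%.1f pockets/s, %.1f ms per pocket\n', pocketsPerSec, msPerPocket);
```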
2.5. Application for IoT Management of DRR Machines
2.6. Neural Chips Used
- First, a complete and thorough classification is run on both chips, in which olives are classified according to the angle intervals described in Section 2.3.2.
- Later, an intermediate position is introduced to simplify the range of angles, and only the following positions are considered: normal (intervals [0°, −10°] and [0°, 10°]), “boat” ([−80°, −90°] and [80°, 90°]), intermediate (from 10° to 80° and from −10° to −80°) and empty pocket.
- Finally, the classification is simplified even further, using only three of the four pocket cases: normal, “boat” and empty (the category mapping is sketched below).
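A minimal MATLAB sketch of the two simplification steps, with purely illustrative class labels (the chips themselves store categories as numeric identifiers, and sign is ignored here because the intervals are symmetric):

```matlab
% Collapse the fine angle intervals into four pocket cases, then keep three.
fine = ["0-10","10-20","20-30","30-40","40-50","50-60","60-70","70-80","80-90","empty"];
four = ["normal", repmat("intermediate", 1, 7), "boat", "empty"];

toFour = containers.Map(cellstr(fine), cellstr(four));   % step 1: 10 labels -> 4 cases
threeCases = setdiff(unique(four), "intermediate");      % step 2: drop the intermediate case

disp(toFour('40-50'))    % -> intermediate
disp(threeCases)         % -> "boat"  "empty"  "normal"
```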
2.7. CM1K-PC Hardware Communication for Real Time Classification
- (1) The INTEL CURIE [58], which internally incorporates a limited version of the CM1K chip (128 neurons with a 128-byte vector) but can also be used just for part of the communication with the PC; at present it is no longer produced.
- (2) A TEENSY 4.0 [59], which includes an ARM Cortex-M7 (NXP iMXRT1062) running at 600 MHz (a PC-side serial test is sketched below).
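On the PC side, either board can be exercised from MATLAB over a serial link; a minimal sketch, where the port name, baud rate and framing (send a pixel vector, receive one category byte) are assumptions rather than the paper's exact protocol:

```matlab
% Send one feature vector to the classification board and read the category back.
% Port name, baud rate and framing are illustrative assumptions.
s = serialport("COM3", 115200);

vec = uint8(randi([0 255], 1, 256));    % stand-in for a 16 x 16 pixel ROI vector
write(s, vec, "uint8");                 % transmit the vector to the board

category = read(s, 1, "uint8");         % board replies with the recognized category
fprintf('Recognized category: %d\n', category);

clear s                                 % releases and closes the serial port
```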
2.8. IoT System to Control the DRR Machine and Data Analysis
3. Results
3.1. Results Obtained Using a MATLAB Neural Network
3.2. Results Obtained Using Neuromorphic Chips
3.3. Analysis of the Results of the Operation of the DRR Machine
- Speed: Indicated in pockets per minute.
- Production: Real value in olives per minute (excluding empty pockets).
- “Boat” olives, normal, double olives, empty, small pieces, anomalies: Percentage of the total production.
- Accumulated values: Data summed over the selected period (a sketch of these calculations follows the list).
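A sketch of how these quantities can be derived from per-pocket classification counts over a one-minute window; the counts below are illustrative, not measured data:

```matlab
% Derive the displayed metrics from pocket counts in a one-minute window.
counts = struct('normal', 2100, 'boat', 55, 'double', 30, ...
                'empty', 80, 'small', 15, 'anomaly', 5);   % illustrative counts

pockets    = sum(cell2mat(struct2cell(counts)));   % pockets seen in the window
speed      = pockets;                              % pockets per minute
production = pockets - counts.empty;               % olives per minute (no empty pockets)

pctBoat   = 100 * counts.boat   / production;      % percentages of total production
pctDouble = 100 * counts.double / production;

fprintf('Speed %d pockets/min, production %d olives/min, boats %.1f%%, doubles %.1f%%\n', ...
        speed, production, pctBoat, pctDouble);
```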
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Siles Santos, F.J. New technologies in table olive processing. Grasas Aceites 1999, 50, 131–140. [Google Scholar] [CrossRef]
- Madueño, A.; Lineros, M.; Madueño, J. System and Procedure Based on a Synchronism Sensor for the Detection of Malfunctions in Olive Pitting Machines and Filling Machines, Quantification and Optimization of Performance, Signaling, Monitoring and Remote Control. ES2529816A2. Available online: https://patents.google.com/patent/ES2529816A2/en (accessed on 2 February 2020).
- Tang, Y.; Li, L.; Wang, C.; Chen, M.; Feng, W.; Zou, X.; Huang, K. Real-time detection of surface deformation and strain in recycled aggregate concrete-filled steel tubular columns via four-ocular vision. Robot. Comput. Integr. Manuf. 2019, 59, 36–46. [Google Scholar] [CrossRef]
- Chen, M.; Tan, Y.; Zou, X.; Huang, K.; Li, L.; He, Y. High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm. Opt. Lasers Eng. 2019, 122, 170–183. [Google Scholar] [CrossRef]
- Nie, M.; Zhao, Q.; Xu, Y.; Shen, T. Machine Vision-based Apple External Quality Grading. In Proceedings of the Chinese Control and Decision Conference, Nanchang, China, 3–5 June 2019. [Google Scholar]
- Lucas, A.; Madueño, A.; De Jódar, M.; Molina, J.; Ruiz, A. Characterization of the percentage of poorly positioned olives in pitting, rolling and filling machines for table olives (DRR). In Proceedings of the X Congresso Ibérico de Agroengenharia, Huesca, Spain, 3–6 September 2019. [Google Scholar]
- Lin, G.; Tang, Y.; Zou, X.; Li, J.; Xiong, J. In-field citrus detection and localisation based on RGB-D image analysis. Biosyst. Eng. 2019, 186, 34–44. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Fang, Y. Color-, depth-, and shape-based 3D fruit detection. Precis. Agric. 2020, 21, 1–17. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Li, J. Guava Detection and Pose Estimation Using a Low-Cost RGB-D Sensor in the Field. Sensors 2019, 19, 428. [Google Scholar] [CrossRef] [Green Version]
- Yang, F. Classification of apple surface features using machine vision and neural networks. Comput. Electron. Agric. 1993, 9, 1–12. [Google Scholar] [CrossRef]
- Nagata, M.; Bato, P.; Mitaria, M.; Cao, Q.; Kitahara, T. Study on Sorting System for Strawberry Using Machine Vision (Part 1). Jap. Soc. Agric. Mach. 2000, 62, 100–110. [Google Scholar]
- Behroozi, N.; Tavakoli, T.; Ghassemian, H.; Hadi, M.; Banakar, A. Applied machine vision and artificial neural network for modeling and controlling of the grape drying process. Comput. Electron. Agric. 2013, 98, 205–213. [Google Scholar] [CrossRef]
- Gatica, G.; Best, S.; Ceroni, J.; Lefranc, G. Olive Fruits Recognition Using Neural Networks. Procedia Comput. Sci. 2013, 17, 412–419. [Google Scholar] [CrossRef] [Green Version]
- Mancuso, S.; Nicese, F.P. Identifying Olive (Olea europaea) Cultivars Using Artificial Neural Networks. Am. Soc. Hortic. Sci. 1999, 124, 527–531. [Google Scholar] [CrossRef] [Green Version]
- Sun, D. Computer Vision Technology for Food Quality Evaluation, 2nd ed.; Academic Press: Waltham, MA, USA, 2016; pp. 273–350. [Google Scholar]
- Diaz, R. Computer Vision Technology for Food Quality Evaluation, 2nd ed.; Academic Press: Waltham, MA, USA, 2016; pp. 351–367. [Google Scholar]
- Bottle Inspection. General Visions. 2013. Available online: https://www.general-vision.com/appnotes/AN_BottleInspection.pdf (accessed on 2 February 2020).
- Menendez, A.; Paillet, G. Fish Inspection System Using a Parallel Neural Network Chip and the Image Knowledge Builder Application. AI Mag. 2008, 29, 21. [Google Scholar]
- Liu, Y.; Wei, D.; Zhang, N. Vehicle-license-plate recognition based on neural networks. In Proceedings of the IEEE International Conference on Information and Automation, Shenzhen, China, 6–8 June 2011. [Google Scholar]
- Sardar, S.; Tewari, G.; Babu, K.A. A hardware/software co-design model for face recognition using Cognimem Neural Network chip. In Proceedings of the IEEE International Conference on Image Information Processing, Shimla, India, 3–5 November 2011. [Google Scholar]
- Davies, M.; Srinivasa, N.; Lin, T.; Chinya, G.; Cao, Y.; Choday, S.; Dimou, G.; Joshi, P.; Imam, N.; Jain, S.; et al. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning. IEEE Micro 2018, 38, 82–99. [Google Scholar] [CrossRef]
- Moran, S.; Gaonkar, B.; Whitehead, W.; Wolk, A.; Macyszyn, L.S.; Iyer, S. Deep learning for medical image segmentation—Using the IBM TrueNorth neurosynaptic system. In Proceedings of the SPIE Medical Imaging, Houston, TX, USA, 6 March 2018. [Google Scholar]
- Moradi, S.; Qiao, N.; Stefanini, F.; Indiveri, G. A scalable multi-core architecture with heterogeneous memory structures for dynamic neuromorphic asynchronous processors (dynaps). IEEE Trans. Biomed. Circuits Syst. 2018, 12, 106–122. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Frenkel, C.; Lefebvre, M.; Legat, J.; Bol, D. A 0.086-mm² 12.7-pJ/SOP 64k-Synapse 256-Neuron Online-Learning Digital Spiking Neuromorphic Processor in 28-nm CMOS. IEEE Trans. Biomed. Circuits. Syst. 2019, 13, 145–158. [Google Scholar]
- Fried, L. Making machine learning arduino compatible: A gaming handheld that runs neural networks-[Resources_Hands On]. IEEE Spectr. 2019, 56, 14–15. [Google Scholar] [CrossRef]
- Lobachev, I.; Maleryk, R.; Antoschuk, S.; Filiahin, D.; Lobachev, M. Integration of neural networks into smart sensor networks. In Proceedings of the IEEE Xplore, Kiev, Ukraine, 24–27 May 2018. [Google Scholar]
- Mittal, S. A Survey on optimized implementation of deep learning models on the NVIDIA Jetson platform. J. Syst. Arch. 2019, 97, 428–442. [Google Scholar] [CrossRef]
- Kim, J. New Neuromorphic AI NM500 and Its ADAS Application. In AETA-2018 Recent Advances in Electrical Engineering and Related Sciences: Theory and Application; Lecture Notes in Electrical Engineering; Zelinka, I., Brandstetter, P., Trong Dao, T., Hoang Duy, V., Kim, S., Eds.; Springer: Cham, Switzerland, 2019; Volume 554, pp. 3–12. [Google Scholar]
- CogniPat SDK for Matlab. General Visions. 2018. Available online: https://www.general-vision.com/download/cp_sdk_ml/ (accessed on 2 February 2020).
- NeuroMem USB Dongle. General Visions. 2019. Available online: https://www.general-vision.com/hardware/usbdongle/ (accessed on 2 February 2020).
- López Riquelme, J.A.; Soto, F.; Suardíaz, J.; Sánchez, P.; Iborra, A.; Vera, J.A. Wireless Sensor Networks for precision horticulture in Southern Spain. Comput. Electron. Agric. 2009, 68, 25–35. [Google Scholar] [CrossRef]
- Garcia, L.; Parra, L.; Jimenez, J.M.; Lloret, J.; Lorenz, P. IoT-Based Smart Irrigation Systems: An Overview on the Recent Trends on Sensors and IoT Systems for Irrigation in Precision Agriculture. Sensors 2020, 20, 1042. [Google Scholar] [CrossRef] [Green Version]
- Urbano, O.; Perles, A.; Pedraza, C.; Rubio-Arraez, S.; Castelló, M.L.; Ortola, M.D.; Mercado, R. Cost-Effective Implementation of a Temperature Traceability System Based on Smart RFID Tags and IoT Services. Sensors 2020, 20, 1163. [Google Scholar] [CrossRef] [Green Version]
- Escolar Díaz, S.; Carretero Pérez, J.; Calderón Mateos, A.; Marinescu, M.C.; Bergua Guerra, B. A novel methodology for the monitoring of the agricultural production process based on wireless sensor networks. Comput. Electron. Agric. 2011, 76, 252–265. [Google Scholar] [CrossRef]
- Automated-olive-chain. The internet of Food & Farm. 2020. Available online: https://www.iof2020.eu/trials/fruits/automated-olive-chain (accessed on 20 December 2019).
- De Jódar, M.; Madueño, A.; Lucas, A.; Molina, J.M.; Canales, A.R.; Madueño, J.M.; Justicia, M.; Baena, M. Deep learning in olive pitting machines by computer vision. Comput. Electron. Agric. 2020, 171, 105304. [Google Scholar] [CrossRef]
- Hecht-Nielsen, R. Theory of the Backpropagation Neural Network. In Proceedings of the International 1989 Joint Conference on Neural Networks, Washington, DC, USA, 18–22 June 1989. [Google Scholar]
- Google Coral Edge TPU. Google LLC. 2020. Available online: https://coral.ai/docs/accelerator/datasheet/ (accessed on 15 November 2019).
- Intel® Movidius™ Neural Computer Stick 2. Intel Corporation. 2020. Available online: https://www.intel.es/content/www/es/es/design/products-and-solutions/boards-kits-and-modules/movidius-neural-compute-stick-2/technical-library.html?grouping=rdc%20Content%20Types&sort=title:asc (accessed on 10 December 2019).
- Nvidia-Jetson-Nano. Nvidia Corporation. 2020. Available online: https://www.nvidia.com/es-es/autonomous-machines/embedded-systems/jetson-nano/ (accessed on 15 December 2019).
- TM TestNeurons SimpleScript. General Visions. Available online: http://www.general-vision.com/documentation/TM_TestNeurons_SimpleScript.pdf (accessed on 2 February 2020).
- TM NeuroMem Technology Reference Guide. General Visions. 2019. Available online: https://www.general-vision.com/documentation/TM_NeuroMem_Technology_Reference_Guide.pdf (accessed on 2 February 2020).
- TM_CM1K_Hardware_Manual. General Visions. 2017. Available online: https://www.general-vision.com/documentation/TM_CM1K_Hardware_Manual.pdf (accessed on 2 February 2020).
- Halgamuge, S.; Poechmueller, W.; Glesner, M. An Alternative Approach for Generation of Membership Functions and Fuzzy Rules Based on Radial and Cubic Basis Function Networks. Int. J. Approx. Reason. 1995, 12, 279–298. [Google Scholar] [CrossRef] [Green Version]
- DS_CM1K. General Visions. 2014. Available online: https://www.general-vision.com/datasheet/DS_CM1K.pdf (accessed on 2 February 2020).
- Neural-network. The MathWorks, Inc. 1994–2017. Available online: https://es.mathworks.com/solutions/deep-learning/convolutional-neural-network.html?s_tid=srchtitle (accessed on 2 February 2020).
- Train Autoencoder. The MathWorks, Inc. 1994–2017. Available online: http://es.mathworks.com/help/nnet/ref/trainautoencoder.html (accessed on 2 February 2020).
- Train Stacked Autoencoders for Image Classification. The MathWorks Inc. 1994–2019. Available online: https://es.mathworks.com/help/deeplearning/examples/train-stacked-autoencoders-for-image-classification.html (accessed on 2 February 2020).
- Image Set Repository. Available online: https://github.com/Torras86/Olive-image-set (accessed on 18 December 2019).
- IDS Imaging Development Systems GmbH. 2016. Available online: https://es.ids-imaging.com/store/ui-1220se.html (accessed on 2 February 2020).
- QT Creator. The Qt Company. 2020. Available online: https://doc.qt.io/ (accessed on 18 December 2019).
- Dropbox. Dropbox Inc. 2020. Available online: https://www.dropbox.com/developers/documentation (accessed on 2 February 2020).
- Braincard. General Visions. 2017. Available online: https://www.general-vision.com/documentation/TM_BrainCard.pdf (accessed on 2 February 2020).
- NM500 Chip. General Visions. 2019. Available online: https://www.general-vision.com/documentation/TM_NeuroShield_GettingStarted.pdf (accessed on 2 February 2020).
- I2C (Inter Integrated Circuit), Philips Semiconductors, 1982. Available online: https://en.wikipedia.org/wiki/I%C2%B2C (accessed on 2 February 2020).
- FT232RL USB UART IC. Future Technology Devices International Limited. 2018. Available online: https://www.ftdichip.com/Support/Documents/DataSheets/ICs/DS_FT232R.pdf (accessed on 20 December 2019).
- USB to Serial Chip CH340. SparkFun Electronics. 2015. Available online: https://cdn.sparkfun.com/datasheets/Dev/Arduino/Other/CH340DS1.PDF (accessed on 19 December 2019).
- Intel Curie Module. Intel Corporation. 2020. Available online: https://ark.intel.com/content/www/es/es/ark/products/96282/intel-curie-module-intel-quark-se-soc.html (accessed on 21 December 2019).
- Teensy 4.0 USB Development Board. PJRC Electronics Projects Components Available Worldwide. Available online: https://www.pjrc.com/teensy/ (accessed on 20 December 2019).
- Google Remote Desktop. Google LLC. 2020. Available online: https://support.google.com/chrome/answer/1649523?co=GENIE.Platform%3DDesktop&hl=es (accessed on 20 December 2019).
| ANGLE [0°, 10°] | ANGLE [0°, −10°] | ANGLE [10°, 20°] | ANGLE [−10°, −20°] |
|---|---|---|---|
| ANGLE [20°, 30°] | ANGLE [−20°, −30°] | ANGLE [30°, 40°] | ANGLE [−30°, −40°] |
| ANGLE [40°, 50°] | ANGLE [−40°, −50°] | ANGLE [50°, 60°] | ANGLE [−50°, −60°] |
| ANGLE [60°, 70°] | ANGLE [−60°, −70°] | ANGLE [70°, 80°] | ANGLE [−70°, −80°] |
| ANGLE [80°, 90°] or [−80°, −90°] | | | |
| | CH340G | CURIE | TEENSY |
|---|---|---|---|
| 1 BYTE | 366.48 | 258.08 | 152.26 |
| 258 BYTES | 22.41 | 5.04 | 0.78 |
| Board | Vector | Repeatability | Error, Degrees (96 neurons) | Error, Intermediate Position (24 neurons) | Error, Simple (4 neurons) |
|---|---|---|---|---|---|
| Intel Curie | 121 | 1 | 19% | 7% | 6% |
| | | 2 | 15% | 11% | 9% |
| | | 3 | 7% | 7% | 4% |
| | | 4 | 11% | 10% | 6% |
| | | 5 | 13% | 9% | 6% |
| | | 6 | 14% | 10% | 6% |
| | | 7 | 8% | 6% | 3% |
| | | 8 | 7% | 9% | 2% |
| | | 9 | 12% | 10% | 3% |
| | | 10 | 7% | 7% | 3% |
| | | Totals | 11.30% | 8.47% | 4.80% |
| Board | Vector | Repeatability | Error, Degrees (66 neurons) | Error, Intermediate Position (24 neurons) | Error, Simple (4 neurons) |
|---|---|---|---|---|---|
| Braincard (CM1K) | 256 | 1 | 8% | 3% | 1% |
| | | 2 | 9% | 4% | 0% |
| | | 3 | 10% | 4% | 0% |
| | | 4 | 12% | 3% | 2% |
| | | 5 | 13% | 4% | 1% |
| | | 6 | 13% | 5% | 1% |
| | | 7 | 15% | 2% | 0% |
| | | 8 | 16% | 5% | 1% |
| | | 9 | 16% | 8% | 1% |
| | | 10 | 13% | 2% | 0% |
| | | Totals | 12.43% | 4.00% | 0.63% |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lucas Pascual, A.; Madueño Luna, A.; de Jódar Lázaro, M.; Molina Martínez, J.M.; Ruiz Canales, A.; Madueño Luna, J.M.; Justicia Segovia, M. Analysis of the Functionality of the Feed Chain in Olive Pitting, Slicing and Stuffing Machines by IoT, Computer Vision and Neural Network Diagnosis. Sensors 2020, 20, 1541. https://doi.org/10.3390/s20051541